Nov 29 05:35:54 localhost kernel: Linux version 5.14.0-642.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-68.el9) #1 SMP PREEMPT_DYNAMIC Thu Nov 20 14:15:03 UTC 2025
Nov 29 05:35:54 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Nov 29 05:35:54 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=b277050f-8ace-464d-abb6-4c46d4c45253 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 29 05:35:54 localhost kernel: BIOS-provided physical RAM map:
Nov 29 05:35:54 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Nov 29 05:35:54 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Nov 29 05:35:54 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Nov 29 05:35:54 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Nov 29 05:35:54 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Nov 29 05:35:54 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Nov 29 05:35:54 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Nov 29 05:35:54 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Nov 29 05:35:54 localhost kernel: NX (Execute Disable) protection: active
Nov 29 05:35:54 localhost kernel: APIC: Static calls initialized
Nov 29 05:35:54 localhost kernel: SMBIOS 2.8 present.
Nov 29 05:35:54 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Nov 29 05:35:54 localhost kernel: Hypervisor detected: KVM
Nov 29 05:35:54 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Nov 29 05:35:54 localhost kernel: kvm-clock: using sched offset of 3300021411 cycles
Nov 29 05:35:54 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Nov 29 05:35:54 localhost kernel: tsc: Detected 2800.000 MHz processor
Nov 29 05:35:54 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Nov 29 05:35:54 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Nov 29 05:35:54 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Nov 29 05:35:54 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Nov 29 05:35:54 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Nov 29 05:35:54 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Nov 29 05:35:54 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Nov 29 05:35:54 localhost kernel: Using GB pages for direct mapping
Nov 29 05:35:54 localhost kernel: RAMDISK: [mem 0x2d83a000-0x32c14fff]
Nov 29 05:35:54 localhost kernel: ACPI: Early table checksum verification disabled
Nov 29 05:35:54 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Nov 29 05:35:54 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 29 05:35:54 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 29 05:35:54 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 29 05:35:54 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Nov 29 05:35:54 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 29 05:35:54 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 29 05:35:54 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Nov 29 05:35:54 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Nov 29 05:35:54 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Nov 29 05:35:54 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Nov 29 05:35:54 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Nov 29 05:35:54 localhost kernel: No NUMA configuration found
Nov 29 05:35:54 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Nov 29 05:35:54 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd3000-0x23fffdfff]
Nov 29 05:35:54 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Nov 29 05:35:54 localhost kernel: Zone ranges:
Nov 29 05:35:54 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Nov 29 05:35:54 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Nov 29 05:35:54 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Nov 29 05:35:54 localhost kernel:   Device   empty
Nov 29 05:35:54 localhost kernel: Movable zone start for each node
Nov 29 05:35:54 localhost kernel: Early memory node ranges
Nov 29 05:35:54 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Nov 29 05:35:54 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Nov 29 05:35:54 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Nov 29 05:35:54 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Nov 29 05:35:54 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Nov 29 05:35:54 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Nov 29 05:35:54 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Nov 29 05:35:54 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Nov 29 05:35:54 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Nov 29 05:35:54 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Nov 29 05:35:54 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Nov 29 05:35:54 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Nov 29 05:35:54 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Nov 29 05:35:54 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Nov 29 05:35:54 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Nov 29 05:35:54 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Nov 29 05:35:54 localhost kernel: TSC deadline timer available
Nov 29 05:35:54 localhost kernel: CPU topo: Max. logical packages:   8
Nov 29 05:35:54 localhost kernel: CPU topo: Max. logical dies:       8
Nov 29 05:35:54 localhost kernel: CPU topo: Max. dies per package:   1
Nov 29 05:35:54 localhost kernel: CPU topo: Max. threads per core:   1
Nov 29 05:35:54 localhost kernel: CPU topo: Num. cores per package:     1
Nov 29 05:35:54 localhost kernel: CPU topo: Num. threads per package:   1
Nov 29 05:35:54 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Nov 29 05:35:54 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Nov 29 05:35:54 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Nov 29 05:35:54 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Nov 29 05:35:54 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Nov 29 05:35:54 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Nov 29 05:35:54 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Nov 29 05:35:54 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Nov 29 05:35:54 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Nov 29 05:35:54 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Nov 29 05:35:54 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Nov 29 05:35:54 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Nov 29 05:35:54 localhost kernel: Booting paravirtualized kernel on KVM
Nov 29 05:35:54 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Nov 29 05:35:54 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Nov 29 05:35:54 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Nov 29 05:35:54 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Nov 29 05:35:54 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Nov 29 05:35:54 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Nov 29 05:35:54 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=b277050f-8ace-464d-abb6-4c46d4c45253 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 29 05:35:54 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64", will be passed to user space.
Nov 29 05:35:54 localhost kernel: random: crng init done
Nov 29 05:35:54 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Nov 29 05:35:54 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Nov 29 05:35:54 localhost kernel: Fallback order for Node 0: 0 
Nov 29 05:35:54 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Nov 29 05:35:54 localhost kernel: Policy zone: Normal
Nov 29 05:35:54 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Nov 29 05:35:54 localhost kernel: software IO TLB: area num 8.
Nov 29 05:35:54 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Nov 29 05:35:54 localhost kernel: ftrace: allocating 49313 entries in 193 pages
Nov 29 05:35:54 localhost kernel: ftrace: allocated 193 pages with 3 groups
Nov 29 05:35:54 localhost kernel: Dynamic Preempt: voluntary
Nov 29 05:35:54 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Nov 29 05:35:54 localhost kernel: rcu:         RCU event tracing is enabled.
Nov 29 05:35:54 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Nov 29 05:35:54 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Nov 29 05:35:54 localhost kernel:         Rude variant of Tasks RCU enabled.
Nov 29 05:35:54 localhost kernel:         Tracing variant of Tasks RCU enabled.
Nov 29 05:35:54 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Nov 29 05:35:54 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Nov 29 05:35:54 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 29 05:35:54 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 29 05:35:54 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 29 05:35:54 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Nov 29 05:35:54 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Nov 29 05:35:54 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Nov 29 05:35:54 localhost kernel: Console: colour VGA+ 80x25
Nov 29 05:35:54 localhost kernel: printk: console [ttyS0] enabled
Nov 29 05:35:54 localhost kernel: ACPI: Core revision 20230331
Nov 29 05:35:54 localhost kernel: APIC: Switch to symmetric I/O mode setup
Nov 29 05:35:54 localhost kernel: x2apic enabled
Nov 29 05:35:54 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Nov 29 05:35:54 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Nov 29 05:35:54 localhost kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Nov 29 05:35:54 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Nov 29 05:35:54 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Nov 29 05:35:54 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Nov 29 05:35:54 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Nov 29 05:35:54 localhost kernel: Spectre V2 : Mitigation: Retpolines
Nov 29 05:35:54 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Nov 29 05:35:54 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Nov 29 05:35:54 localhost kernel: RETBleed: Mitigation: untrained return thunk
Nov 29 05:35:54 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Nov 29 05:35:54 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Nov 29 05:35:54 localhost kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Nov 29 05:35:54 localhost kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Nov 29 05:35:54 localhost kernel: x86/bugs: return thunk changed
Nov 29 05:35:54 localhost kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Nov 29 05:35:54 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Nov 29 05:35:54 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Nov 29 05:35:54 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Nov 29 05:35:54 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Nov 29 05:35:54 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Nov 29 05:35:54 localhost kernel: Freeing SMP alternatives memory: 40K
Nov 29 05:35:54 localhost kernel: pid_max: default: 32768 minimum: 301
Nov 29 05:35:54 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Nov 29 05:35:54 localhost kernel: landlock: Up and running.
Nov 29 05:35:54 localhost kernel: Yama: becoming mindful.
Nov 29 05:35:54 localhost kernel: SELinux:  Initializing.
Nov 29 05:35:54 localhost kernel: LSM support for eBPF active
Nov 29 05:35:54 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Nov 29 05:35:54 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Nov 29 05:35:54 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Nov 29 05:35:54 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Nov 29 05:35:54 localhost kernel: ... version:                0
Nov 29 05:35:54 localhost kernel: ... bit width:              48
Nov 29 05:35:54 localhost kernel: ... generic registers:      6
Nov 29 05:35:54 localhost kernel: ... value mask:             0000ffffffffffff
Nov 29 05:35:54 localhost kernel: ... max period:             00007fffffffffff
Nov 29 05:35:54 localhost kernel: ... fixed-purpose events:   0
Nov 29 05:35:54 localhost kernel: ... event mask:             000000000000003f
Nov 29 05:35:54 localhost kernel: signal: max sigframe size: 1776
Nov 29 05:35:54 localhost kernel: rcu: Hierarchical SRCU implementation.
Nov 29 05:35:54 localhost kernel: rcu:         Max phase no-delay instances is 400.
Nov 29 05:35:54 localhost kernel: smp: Bringing up secondary CPUs ...
Nov 29 05:35:54 localhost kernel: smpboot: x86: Booting SMP configuration:
Nov 29 05:35:54 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Nov 29 05:35:54 localhost kernel: smp: Brought up 1 node, 8 CPUs
Nov 29 05:35:54 localhost kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Nov 29 05:35:54 localhost kernel: node 0 deferred pages initialised in 9ms
Nov 29 05:35:54 localhost kernel: Memory: 7765624K/8388068K available (16384K kernel code, 5787K rwdata, 13900K rodata, 4192K init, 7172K bss, 616276K reserved, 0K cma-reserved)
Nov 29 05:35:54 localhost kernel: devtmpfs: initialized
Nov 29 05:35:54 localhost kernel: x86/mm: Memory block size: 128MB
Nov 29 05:35:54 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Nov 29 05:35:54 localhost kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Nov 29 05:35:54 localhost kernel: pinctrl core: initialized pinctrl subsystem
Nov 29 05:35:54 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Nov 29 05:35:54 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Nov 29 05:35:54 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Nov 29 05:35:54 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Nov 29 05:35:54 localhost kernel: audit: initializing netlink subsys (disabled)
Nov 29 05:35:54 localhost kernel: audit: type=2000 audit(1764394551.698:1): state=initialized audit_enabled=0 res=1
Nov 29 05:35:54 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Nov 29 05:35:54 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Nov 29 05:35:54 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Nov 29 05:35:54 localhost kernel: cpuidle: using governor menu
Nov 29 05:35:54 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Nov 29 05:35:54 localhost kernel: PCI: Using configuration type 1 for base access
Nov 29 05:35:54 localhost kernel: PCI: Using configuration type 1 for extended access
Nov 29 05:35:54 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Nov 29 05:35:54 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Nov 29 05:35:54 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Nov 29 05:35:54 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Nov 29 05:35:54 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Nov 29 05:35:54 localhost kernel: Demotion targets for Node 0: null
Nov 29 05:35:54 localhost kernel: cryptd: max_cpu_qlen set to 1000
Nov 29 05:35:54 localhost kernel: ACPI: Added _OSI(Module Device)
Nov 29 05:35:54 localhost kernel: ACPI: Added _OSI(Processor Device)
Nov 29 05:35:54 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Nov 29 05:35:54 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Nov 29 05:35:54 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Nov 29 05:35:54 localhost kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Nov 29 05:35:54 localhost kernel: ACPI: Interpreter enabled
Nov 29 05:35:54 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Nov 29 05:35:54 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Nov 29 05:35:54 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Nov 29 05:35:54 localhost kernel: PCI: Using E820 reservations for host bridge windows
Nov 29 05:35:54 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Nov 29 05:35:54 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Nov 29 05:35:54 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Nov 29 05:35:54 localhost kernel: acpiphp: Slot [3] registered
Nov 29 05:35:54 localhost kernel: acpiphp: Slot [4] registered
Nov 29 05:35:54 localhost kernel: acpiphp: Slot [5] registered
Nov 29 05:35:54 localhost kernel: acpiphp: Slot [6] registered
Nov 29 05:35:54 localhost kernel: acpiphp: Slot [7] registered
Nov 29 05:35:54 localhost kernel: acpiphp: Slot [8] registered
Nov 29 05:35:54 localhost kernel: acpiphp: Slot [9] registered
Nov 29 05:35:54 localhost kernel: acpiphp: Slot [10] registered
Nov 29 05:35:54 localhost kernel: acpiphp: Slot [11] registered
Nov 29 05:35:54 localhost kernel: acpiphp: Slot [12] registered
Nov 29 05:35:54 localhost kernel: acpiphp: Slot [13] registered
Nov 29 05:35:54 localhost kernel: acpiphp: Slot [14] registered
Nov 29 05:35:54 localhost kernel: acpiphp: Slot [15] registered
Nov 29 05:35:54 localhost kernel: acpiphp: Slot [16] registered
Nov 29 05:35:54 localhost kernel: acpiphp: Slot [17] registered
Nov 29 05:35:54 localhost kernel: acpiphp: Slot [18] registered
Nov 29 05:35:54 localhost kernel: acpiphp: Slot [19] registered
Nov 29 05:35:54 localhost kernel: acpiphp: Slot [20] registered
Nov 29 05:35:54 localhost kernel: acpiphp: Slot [21] registered
Nov 29 05:35:54 localhost kernel: acpiphp: Slot [22] registered
Nov 29 05:35:54 localhost kernel: acpiphp: Slot [23] registered
Nov 29 05:35:54 localhost kernel: acpiphp: Slot [24] registered
Nov 29 05:35:54 localhost kernel: acpiphp: Slot [25] registered
Nov 29 05:35:54 localhost kernel: acpiphp: Slot [26] registered
Nov 29 05:35:54 localhost kernel: acpiphp: Slot [27] registered
Nov 29 05:35:54 localhost kernel: acpiphp: Slot [28] registered
Nov 29 05:35:54 localhost kernel: acpiphp: Slot [29] registered
Nov 29 05:35:54 localhost kernel: acpiphp: Slot [30] registered
Nov 29 05:35:54 localhost kernel: acpiphp: Slot [31] registered
Nov 29 05:35:54 localhost kernel: PCI host bridge to bus 0000:00
Nov 29 05:35:54 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Nov 29 05:35:54 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Nov 29 05:35:54 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Nov 29 05:35:54 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Nov 29 05:35:54 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Nov 29 05:35:54 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Nov 29 05:35:54 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Nov 29 05:35:54 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Nov 29 05:35:54 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Nov 29 05:35:54 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Nov 29 05:35:54 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Nov 29 05:35:54 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Nov 29 05:35:54 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Nov 29 05:35:54 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Nov 29 05:35:54 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Nov 29 05:35:54 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Nov 29 05:35:54 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Nov 29 05:35:54 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Nov 29 05:35:54 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Nov 29 05:35:54 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Nov 29 05:35:54 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Nov 29 05:35:54 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Nov 29 05:35:54 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Nov 29 05:35:54 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Nov 29 05:35:54 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Nov 29 05:35:54 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Nov 29 05:35:54 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Nov 29 05:35:54 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Nov 29 05:35:54 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Nov 29 05:35:54 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Nov 29 05:35:54 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Nov 29 05:35:54 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Nov 29 05:35:54 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Nov 29 05:35:54 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Nov 29 05:35:54 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Nov 29 05:35:54 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Nov 29 05:35:54 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Nov 29 05:35:54 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Nov 29 05:35:54 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Nov 29 05:35:54 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Nov 29 05:35:54 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Nov 29 05:35:54 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Nov 29 05:35:54 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Nov 29 05:35:54 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Nov 29 05:35:54 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Nov 29 05:35:54 localhost kernel: iommu: Default domain type: Translated
Nov 29 05:35:54 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Nov 29 05:35:54 localhost kernel: SCSI subsystem initialized
Nov 29 05:35:54 localhost kernel: ACPI: bus type USB registered
Nov 29 05:35:54 localhost kernel: usbcore: registered new interface driver usbfs
Nov 29 05:35:54 localhost kernel: usbcore: registered new interface driver hub
Nov 29 05:35:54 localhost kernel: usbcore: registered new device driver usb
Nov 29 05:35:54 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Nov 29 05:35:54 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Nov 29 05:35:54 localhost kernel: PTP clock support registered
Nov 29 05:35:54 localhost kernel: EDAC MC: Ver: 3.0.0
Nov 29 05:35:54 localhost kernel: NetLabel: Initializing
Nov 29 05:35:54 localhost kernel: NetLabel:  domain hash size = 128
Nov 29 05:35:54 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Nov 29 05:35:54 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Nov 29 05:35:54 localhost kernel: PCI: Using ACPI for IRQ routing
Nov 29 05:35:54 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Nov 29 05:35:54 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Nov 29 05:35:54 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Nov 29 05:35:54 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Nov 29 05:35:54 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Nov 29 05:35:54 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Nov 29 05:35:54 localhost kernel: vgaarb: loaded
Nov 29 05:35:54 localhost kernel: clocksource: Switched to clocksource kvm-clock
Nov 29 05:35:54 localhost kernel: VFS: Disk quotas dquot_6.6.0
Nov 29 05:35:54 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Nov 29 05:35:54 localhost kernel: pnp: PnP ACPI init
Nov 29 05:35:54 localhost kernel: pnp 00:03: [dma 2]
Nov 29 05:35:54 localhost kernel: pnp: PnP ACPI: found 5 devices
Nov 29 05:35:54 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Nov 29 05:35:54 localhost kernel: NET: Registered PF_INET protocol family
Nov 29 05:35:54 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Nov 29 05:35:54 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Nov 29 05:35:54 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Nov 29 05:35:54 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Nov 29 05:35:54 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Nov 29 05:35:54 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Nov 29 05:35:54 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Nov 29 05:35:54 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Nov 29 05:35:54 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Nov 29 05:35:54 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Nov 29 05:35:54 localhost kernel: NET: Registered PF_XDP protocol family
Nov 29 05:35:54 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Nov 29 05:35:54 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Nov 29 05:35:54 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Nov 29 05:35:54 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Nov 29 05:35:54 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Nov 29 05:35:54 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Nov 29 05:35:54 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Nov 29 05:35:54 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Nov 29 05:35:54 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 74141 usecs
Nov 29 05:35:54 localhost kernel: PCI: CLS 0 bytes, default 64
Nov 29 05:35:54 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Nov 29 05:35:54 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Nov 29 05:35:54 localhost kernel: ACPI: bus type thunderbolt registered
Nov 29 05:35:54 localhost kernel: Trying to unpack rootfs image as initramfs...
Nov 29 05:35:54 localhost kernel: Initialise system trusted keyrings
Nov 29 05:35:54 localhost kernel: Key type blacklist registered
Nov 29 05:35:54 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Nov 29 05:35:54 localhost kernel: zbud: loaded
Nov 29 05:35:54 localhost kernel: integrity: Platform Keyring initialized
Nov 29 05:35:54 localhost kernel: integrity: Machine keyring initialized
Nov 29 05:35:54 localhost kernel: Freeing initrd memory: 85868K
Nov 29 05:35:54 localhost kernel: NET: Registered PF_ALG protocol family
Nov 29 05:35:54 localhost kernel: xor: automatically using best checksumming function   avx       
Nov 29 05:35:54 localhost kernel: Key type asymmetric registered
Nov 29 05:35:54 localhost kernel: Asymmetric key parser 'x509' registered
Nov 29 05:35:54 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Nov 29 05:35:54 localhost kernel: io scheduler mq-deadline registered
Nov 29 05:35:54 localhost kernel: io scheduler kyber registered
Nov 29 05:35:54 localhost kernel: io scheduler bfq registered
Nov 29 05:35:54 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Nov 29 05:35:54 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Nov 29 05:35:54 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Nov 29 05:35:54 localhost kernel: ACPI: button: Power Button [PWRF]
Nov 29 05:35:54 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Nov 29 05:35:54 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Nov 29 05:35:54 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Nov 29 05:35:54 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Nov 29 05:35:54 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Nov 29 05:35:54 localhost kernel: Non-volatile memory driver v1.3
Nov 29 05:35:54 localhost kernel: rdac: device handler registered
Nov 29 05:35:54 localhost kernel: hp_sw: device handler registered
Nov 29 05:35:54 localhost kernel: emc: device handler registered
Nov 29 05:35:54 localhost kernel: alua: device handler registered
Nov 29 05:35:54 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Nov 29 05:35:54 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Nov 29 05:35:54 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Nov 29 05:35:54 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Nov 29 05:35:54 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Nov 29 05:35:54 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Nov 29 05:35:54 localhost kernel: usb usb1: Product: UHCI Host Controller
Nov 29 05:35:54 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-642.el9.x86_64 uhci_hcd
Nov 29 05:35:54 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Nov 29 05:35:54 localhost kernel: hub 1-0:1.0: USB hub found
Nov 29 05:35:54 localhost kernel: hub 1-0:1.0: 2 ports detected
Nov 29 05:35:54 localhost kernel: usbcore: registered new interface driver usbserial_generic
Nov 29 05:35:54 localhost kernel: usbserial: USB Serial support registered for generic
Nov 29 05:35:54 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Nov 29 05:35:54 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Nov 29 05:35:54 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Nov 29 05:35:54 localhost kernel: mousedev: PS/2 mouse device common for all mice
Nov 29 05:35:54 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Nov 29 05:35:54 localhost kernel: rtc_cmos 00:04: registered as rtc0
Nov 29 05:35:54 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Nov 29 05:35:54 localhost kernel: rtc_cmos 00:04: setting system clock to 2025-11-29T05:35:53 UTC (1764394553)
Nov 29 05:35:54 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Nov 29 05:35:54 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Nov 29 05:35:54 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Nov 29 05:35:54 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Nov 29 05:35:54 localhost kernel: usbcore: registered new interface driver usbhid
Nov 29 05:35:54 localhost kernel: usbhid: USB HID core driver
Nov 29 05:35:54 localhost kernel: drop_monitor: Initializing network drop monitor service
Nov 29 05:35:54 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Nov 29 05:35:54 localhost kernel: Initializing XFRM netlink socket
Nov 29 05:35:54 localhost kernel: NET: Registered PF_INET6 protocol family
Nov 29 05:35:54 localhost kernel: Segment Routing with IPv6
Nov 29 05:35:54 localhost kernel: NET: Registered PF_PACKET protocol family
Nov 29 05:35:54 localhost kernel: mpls_gso: MPLS GSO support
Nov 29 05:35:54 localhost kernel: IPI shorthand broadcast: enabled
Nov 29 05:35:54 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Nov 29 05:35:54 localhost kernel: AES CTR mode by8 optimization enabled
Nov 29 05:35:54 localhost kernel: sched_clock: Marking stable (1266001970, 150353320)->(1538717959, -122362669)
Nov 29 05:35:54 localhost kernel: registered taskstats version 1
Nov 29 05:35:54 localhost kernel: Loading compiled-in X.509 certificates
Nov 29 05:35:54 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8ec4bd273f582f9a9b9a494ae677ca1f1488f19e'
Nov 29 05:35:54 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Nov 29 05:35:54 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Nov 29 05:35:54 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Nov 29 05:35:54 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Nov 29 05:35:54 localhost kernel: Demotion targets for Node 0: null
Nov 29 05:35:54 localhost kernel: page_owner is disabled
Nov 29 05:35:54 localhost kernel: Key type .fscrypt registered
Nov 29 05:35:54 localhost kernel: Key type fscrypt-provisioning registered
Nov 29 05:35:54 localhost kernel: Key type big_key registered
Nov 29 05:35:54 localhost kernel: Key type encrypted registered
Nov 29 05:35:54 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Nov 29 05:35:54 localhost kernel: Loading compiled-in module X.509 certificates
Nov 29 05:35:54 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8ec4bd273f582f9a9b9a494ae677ca1f1488f19e'
Nov 29 05:35:54 localhost kernel: ima: Allocated hash algorithm: sha256
Nov 29 05:35:54 localhost kernel: ima: No architecture policies found
Nov 29 05:35:54 localhost kernel: evm: Initialising EVM extended attributes:
Nov 29 05:35:54 localhost kernel: evm: security.selinux
Nov 29 05:35:54 localhost kernel: evm: security.SMACK64 (disabled)
Nov 29 05:35:54 localhost kernel: evm: security.SMACK64EXEC (disabled)
Nov 29 05:35:54 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Nov 29 05:35:54 localhost kernel: evm: security.SMACK64MMAP (disabled)
Nov 29 05:35:54 localhost kernel: evm: security.apparmor (disabled)
Nov 29 05:35:54 localhost kernel: evm: security.ima
Nov 29 05:35:54 localhost kernel: evm: security.capability
Nov 29 05:35:54 localhost kernel: evm: HMAC attrs: 0x1
Nov 29 05:35:54 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Nov 29 05:35:54 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Nov 29 05:35:54 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Nov 29 05:35:54 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Nov 29 05:35:54 localhost kernel: usb 1-1: Manufacturer: QEMU
Nov 29 05:35:54 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Nov 29 05:35:54 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Nov 29 05:35:54 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Nov 29 05:35:54 localhost kernel: Running certificate verification RSA selftest
Nov 29 05:35:54 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Nov 29 05:35:54 localhost kernel: Running certificate verification ECDSA selftest
Nov 29 05:35:54 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Nov 29 05:35:54 localhost kernel: clk: Disabling unused clocks
Nov 29 05:35:54 localhost kernel: Freeing unused decrypted memory: 2028K
Nov 29 05:35:54 localhost kernel: Freeing unused kernel image (initmem) memory: 4192K
Nov 29 05:35:54 localhost kernel: Write protecting the kernel read-only data: 30720k
Nov 29 05:35:54 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 436K
Nov 29 05:35:54 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Nov 29 05:35:54 localhost kernel: Run /init as init process
Nov 29 05:35:54 localhost kernel:   with arguments:
Nov 29 05:35:54 localhost kernel:     /init
Nov 29 05:35:54 localhost kernel:   with environment:
Nov 29 05:35:54 localhost kernel:     HOME=/
Nov 29 05:35:54 localhost kernel:     TERM=linux
Nov 29 05:35:54 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64
Nov 29 05:35:54 localhost systemd[1]: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 29 05:35:54 localhost systemd[1]: Detected virtualization kvm.
Nov 29 05:35:54 localhost systemd[1]: Detected architecture x86-64.
Nov 29 05:35:54 localhost systemd[1]: Running in initrd.
Nov 29 05:35:54 localhost systemd[1]: No hostname configured, using default hostname.
Nov 29 05:35:54 localhost systemd[1]: Hostname set to <localhost>.
Nov 29 05:35:54 localhost systemd[1]: Initializing machine ID from VM UUID.
Nov 29 05:35:54 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Nov 29 05:35:54 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Nov 29 05:35:54 localhost systemd[1]: Reached target Local Encrypted Volumes.
Nov 29 05:35:54 localhost systemd[1]: Reached target Initrd /usr File System.
Nov 29 05:35:54 localhost systemd[1]: Reached target Local File Systems.
Nov 29 05:35:54 localhost systemd[1]: Reached target Path Units.
Nov 29 05:35:54 localhost systemd[1]: Reached target Slice Units.
Nov 29 05:35:54 localhost systemd[1]: Reached target Swaps.
Nov 29 05:35:54 localhost systemd[1]: Reached target Timer Units.
Nov 29 05:35:54 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Nov 29 05:35:54 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Nov 29 05:35:54 localhost systemd[1]: Listening on Journal Socket.
Nov 29 05:35:54 localhost systemd[1]: Listening on udev Control Socket.
Nov 29 05:35:54 localhost systemd[1]: Listening on udev Kernel Socket.
Nov 29 05:35:54 localhost systemd[1]: Reached target Socket Units.
Nov 29 05:35:54 localhost systemd[1]: Starting Create List of Static Device Nodes...
Nov 29 05:35:54 localhost systemd[1]: Starting Journal Service...
Nov 29 05:35:54 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Nov 29 05:35:54 localhost systemd[1]: Starting Apply Kernel Variables...
Nov 29 05:35:54 localhost systemd[1]: Starting Create System Users...
Nov 29 05:35:54 localhost systemd[1]: Starting Setup Virtual Console...
Nov 29 05:35:54 localhost systemd[1]: Finished Create List of Static Device Nodes.
Nov 29 05:35:54 localhost systemd[1]: Finished Apply Kernel Variables.
Nov 29 05:35:54 localhost systemd[1]: Finished Create System Users.
Nov 29 05:35:54 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 29 05:35:54 localhost systemd-journald[311]: Journal started
Nov 29 05:35:54 localhost systemd-journald[311]: Runtime Journal (/run/log/journal/2814de55a942416491c192c593f2f35f) is 8.0M, max 153.6M, 145.6M free.
Nov 29 05:35:54 localhost systemd-sysusers[315]: Creating group 'users' with GID 100.
Nov 29 05:35:54 localhost systemd-sysusers[315]: Creating group 'dbus' with GID 81.
Nov 29 05:35:54 localhost systemd-sysusers[315]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Nov 29 05:35:54 localhost systemd[1]: Started Journal Service.
Nov 29 05:35:54 localhost systemd[1]: Starting Create Volatile Files and Directories...
Nov 29 05:35:54 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 29 05:35:54 localhost systemd[1]: Finished Create Volatile Files and Directories.
Nov 29 05:35:54 localhost systemd[1]: Finished Setup Virtual Console.
Nov 29 05:35:54 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Nov 29 05:35:54 localhost systemd[1]: Starting dracut cmdline hook...
Nov 29 05:35:54 localhost dracut-cmdline[331]: dracut-9 dracut-057-102.git20250818.el9
Nov 29 05:35:54 localhost dracut-cmdline[331]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=b277050f-8ace-464d-abb6-4c46d4c45253 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 29 05:35:54 localhost systemd[1]: Finished dracut cmdline hook.
Nov 29 05:35:54 localhost systemd[1]: Starting dracut pre-udev hook...
Nov 29 05:35:54 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Nov 29 05:35:54 localhost kernel: device-mapper: uevent: version 1.0.3
Nov 29 05:35:54 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Nov 29 05:35:54 localhost kernel: RPC: Registered named UNIX socket transport module.
Nov 29 05:35:54 localhost kernel: RPC: Registered udp transport module.
Nov 29 05:35:54 localhost kernel: RPC: Registered tcp transport module.
Nov 29 05:35:54 localhost kernel: RPC: Registered tcp-with-tls transport module.
Nov 29 05:35:54 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Nov 29 05:35:54 localhost rpc.statd[445]: Version 2.5.4 starting
Nov 29 05:35:54 localhost rpc.statd[445]: Initializing NSM state
Nov 29 05:35:54 localhost rpc.idmapd[450]: Setting log level to 0
Nov 29 05:35:54 localhost systemd[1]: Finished dracut pre-udev hook.
Nov 29 05:35:54 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 29 05:35:54 localhost systemd-udevd[463]: Using default interface naming scheme 'rhel-9.0'.
Nov 29 05:35:54 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 29 05:35:54 localhost systemd[1]: Starting dracut pre-trigger hook...
Nov 29 05:35:54 localhost systemd[1]: Finished dracut pre-trigger hook.
Nov 29 05:35:54 localhost systemd[1]: Starting Coldplug All udev Devices...
Nov 29 05:35:55 localhost systemd[1]: Created slice Slice /system/modprobe.
Nov 29 05:35:55 localhost systemd[1]: Starting Load Kernel Module configfs...
Nov 29 05:35:55 localhost systemd[1]: Finished Coldplug All udev Devices.
Nov 29 05:35:55 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 29 05:35:55 localhost systemd[1]: Finished Load Kernel Module configfs.
Nov 29 05:35:55 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 29 05:35:55 localhost systemd[1]: Reached target Network.
Nov 29 05:35:55 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 29 05:35:55 localhost systemd[1]: Starting dracut initqueue hook...
Nov 29 05:35:55 localhost kernel: libata version 3.00 loaded.
Nov 29 05:35:55 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Nov 29 05:35:55 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Nov 29 05:35:55 localhost kernel:  vda: vda1
Nov 29 05:35:55 localhost systemd-udevd[477]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 05:35:55 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Nov 29 05:35:55 localhost kernel: scsi host0: ata_piix
Nov 29 05:35:55 localhost kernel: scsi host1: ata_piix
Nov 29 05:35:55 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Nov 29 05:35:55 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Nov 29 05:35:55 localhost systemd[1]: Found device /dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253.
Nov 29 05:35:55 localhost systemd[1]: Reached target Initrd Root Device.
Nov 29 05:35:55 localhost systemd[1]: Mounting Kernel Configuration File System...
Nov 29 05:35:55 localhost kernel: ata1: found unknown device (class 0)
Nov 29 05:35:55 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Nov 29 05:35:55 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Nov 29 05:35:55 localhost systemd[1]: Mounted Kernel Configuration File System.
Nov 29 05:35:55 localhost systemd[1]: Reached target System Initialization.
Nov 29 05:35:55 localhost systemd[1]: Reached target Basic System.
Nov 29 05:35:55 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Nov 29 05:35:55 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Nov 29 05:35:55 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Nov 29 05:35:55 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Nov 29 05:35:55 localhost systemd[1]: Finished dracut initqueue hook.
Nov 29 05:35:55 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Nov 29 05:35:55 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Nov 29 05:35:55 localhost systemd[1]: Reached target Remote File Systems.
Nov 29 05:35:55 localhost systemd[1]: Starting dracut pre-mount hook...
Nov 29 05:35:55 localhost systemd[1]: Finished dracut pre-mount hook.
Nov 29 05:35:55 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253...
Nov 29 05:35:55 localhost systemd-fsck[559]: /usr/sbin/fsck.xfs: XFS file system.
Nov 29 05:35:55 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253.
Nov 29 05:35:55 localhost systemd[1]: Mounting /sysroot...
Nov 29 05:35:55 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Nov 29 05:35:55 localhost kernel: XFS (vda1): Mounting V5 Filesystem b277050f-8ace-464d-abb6-4c46d4c45253
Nov 29 05:35:56 localhost kernel: XFS (vda1): Ending clean mount
Nov 29 05:35:56 localhost systemd[1]: Mounted /sysroot.
Nov 29 05:35:56 localhost systemd[1]: Reached target Initrd Root File System.
Nov 29 05:35:56 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Nov 29 05:35:56 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Nov 29 05:35:56 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Nov 29 05:35:56 localhost systemd[1]: Reached target Initrd File Systems.
Nov 29 05:35:56 localhost systemd[1]: Reached target Initrd Default Target.
Nov 29 05:35:56 localhost systemd[1]: Starting dracut mount hook...
Nov 29 05:35:56 localhost systemd[1]: Finished dracut mount hook.
Nov 29 05:35:56 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Nov 29 05:35:56 localhost rpc.idmapd[450]: exiting on signal 15
Nov 29 05:35:56 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Nov 29 05:35:56 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Nov 29 05:35:56 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Nov 29 05:35:56 localhost systemd[1]: Stopped target Network.
Nov 29 05:35:56 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Nov 29 05:35:56 localhost systemd[1]: Stopped target Timer Units.
Nov 29 05:35:56 localhost systemd[1]: dbus.socket: Deactivated successfully.
Nov 29 05:35:56 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Nov 29 05:35:56 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Nov 29 05:35:56 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Nov 29 05:35:56 localhost systemd[1]: Stopped target Initrd Default Target.
Nov 29 05:35:56 localhost systemd[1]: Stopped target Basic System.
Nov 29 05:35:56 localhost systemd[1]: Stopped target Initrd Root Device.
Nov 29 05:35:56 localhost systemd[1]: Stopped target Initrd /usr File System.
Nov 29 05:35:56 localhost systemd[1]: Stopped target Path Units.
Nov 29 05:35:56 localhost systemd[1]: Stopped target Remote File Systems.
Nov 29 05:35:56 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Nov 29 05:35:56 localhost systemd[1]: Stopped target Slice Units.
Nov 29 05:35:56 localhost systemd[1]: Stopped target Socket Units.
Nov 29 05:35:56 localhost systemd[1]: Stopped target System Initialization.
Nov 29 05:35:56 localhost systemd[1]: Stopped target Local File Systems.
Nov 29 05:35:56 localhost systemd[1]: Stopped target Swaps.
Nov 29 05:35:56 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Nov 29 05:35:56 localhost systemd[1]: Stopped dracut mount hook.
Nov 29 05:35:56 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Nov 29 05:35:56 localhost systemd[1]: Stopped dracut pre-mount hook.
Nov 29 05:35:56 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Nov 29 05:35:56 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Nov 29 05:35:56 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Nov 29 05:35:56 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Nov 29 05:35:56 localhost systemd[1]: Stopped dracut initqueue hook.
Nov 29 05:35:56 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 29 05:35:56 localhost systemd[1]: Stopped Apply Kernel Variables.
Nov 29 05:35:56 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Nov 29 05:35:56 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Nov 29 05:35:56 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Nov 29 05:35:56 localhost systemd[1]: Stopped Coldplug All udev Devices.
Nov 29 05:35:56 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Nov 29 05:35:56 localhost systemd[1]: Stopped dracut pre-trigger hook.
Nov 29 05:35:56 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Nov 29 05:35:56 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Nov 29 05:35:56 localhost systemd[1]: Stopped Setup Virtual Console.
Nov 29 05:35:56 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Nov 29 05:35:56 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 29 05:35:56 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Nov 29 05:35:56 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Nov 29 05:35:56 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Nov 29 05:35:56 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Nov 29 05:35:56 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Nov 29 05:35:56 localhost systemd[1]: Closed udev Control Socket.
Nov 29 05:35:56 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Nov 29 05:35:56 localhost systemd[1]: Closed udev Kernel Socket.
Nov 29 05:35:56 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Nov 29 05:35:56 localhost systemd[1]: Stopped dracut pre-udev hook.
Nov 29 05:35:56 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Nov 29 05:35:56 localhost systemd[1]: Stopped dracut cmdline hook.
Nov 29 05:35:56 localhost systemd[1]: Starting Cleanup udev Database...
Nov 29 05:35:56 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Nov 29 05:35:56 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Nov 29 05:35:56 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Nov 29 05:35:56 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Nov 29 05:35:56 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Nov 29 05:35:56 localhost systemd[1]: Stopped Create System Users.
Nov 29 05:35:56 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Nov 29 05:35:56 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Nov 29 05:35:56 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Nov 29 05:35:56 localhost systemd[1]: Finished Cleanup udev Database.
Nov 29 05:35:56 localhost systemd[1]: Reached target Switch Root.
Nov 29 05:35:56 localhost systemd[1]: Starting Switch Root...
Nov 29 05:35:56 localhost systemd[1]: Switching root.
Nov 29 05:35:56 localhost systemd-journald[311]: Journal stopped
Nov 29 05:35:57 localhost systemd-journald[311]: Received SIGTERM from PID 1 (systemd).
Nov 29 05:35:57 localhost kernel: audit: type=1404 audit(1764394556.908:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Nov 29 05:35:57 localhost kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 05:35:57 localhost kernel: SELinux:  policy capability open_perms=1
Nov 29 05:35:57 localhost kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 05:35:57 localhost kernel: SELinux:  policy capability always_check_network=0
Nov 29 05:35:57 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 05:35:57 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 05:35:57 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 05:35:57 localhost kernel: audit: type=1403 audit(1764394557.051:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Nov 29 05:35:57 localhost systemd[1]: Successfully loaded SELinux policy in 149.155ms.
Nov 29 05:35:57 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 28.480ms.
Nov 29 05:35:57 localhost systemd[1]: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 29 05:35:57 localhost systemd[1]: Detected virtualization kvm.
Nov 29 05:35:57 localhost systemd[1]: Detected architecture x86-64.
Nov 29 05:35:57 localhost systemd-rc-local-generator[640]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 05:35:57 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Nov 29 05:35:57 localhost systemd[1]: Stopped Switch Root.
Nov 29 05:35:57 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Nov 29 05:35:57 localhost systemd[1]: Created slice Slice /system/getty.
Nov 29 05:35:57 localhost systemd[1]: Created slice Slice /system/serial-getty.
Nov 29 05:35:57 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Nov 29 05:35:57 localhost systemd[1]: Created slice User and Session Slice.
Nov 29 05:35:57 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Nov 29 05:35:57 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Nov 29 05:35:57 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Nov 29 05:35:57 localhost systemd[1]: Reached target Local Encrypted Volumes.
Nov 29 05:35:57 localhost systemd[1]: Stopped target Switch Root.
Nov 29 05:35:57 localhost systemd[1]: Stopped target Initrd File Systems.
Nov 29 05:35:57 localhost systemd[1]: Stopped target Initrd Root File System.
Nov 29 05:35:57 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Nov 29 05:35:57 localhost systemd[1]: Reached target Path Units.
Nov 29 05:35:57 localhost systemd[1]: Reached target rpc_pipefs.target.
Nov 29 05:35:57 localhost systemd[1]: Reached target Slice Units.
Nov 29 05:35:57 localhost systemd[1]: Reached target Swaps.
Nov 29 05:35:57 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Nov 29 05:35:57 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Nov 29 05:35:57 localhost systemd[1]: Reached target RPC Port Mapper.
Nov 29 05:35:57 localhost systemd[1]: Listening on Process Core Dump Socket.
Nov 29 05:35:57 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Nov 29 05:35:57 localhost systemd[1]: Listening on udev Control Socket.
Nov 29 05:35:57 localhost systemd[1]: Listening on udev Kernel Socket.
Nov 29 05:35:57 localhost systemd[1]: Mounting Huge Pages File System...
Nov 29 05:35:57 localhost systemd[1]: Mounting POSIX Message Queue File System...
Nov 29 05:35:57 localhost systemd[1]: Mounting Kernel Debug File System...
Nov 29 05:35:57 localhost systemd[1]: Mounting Kernel Trace File System...
Nov 29 05:35:57 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 29 05:35:57 localhost systemd[1]: Starting Create List of Static Device Nodes...
Nov 29 05:35:57 localhost systemd[1]: Starting Load Kernel Module configfs...
Nov 29 05:35:57 localhost systemd[1]: Starting Load Kernel Module drm...
Nov 29 05:35:57 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Nov 29 05:35:57 localhost systemd[1]: Starting Load Kernel Module fuse...
Nov 29 05:35:57 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Nov 29 05:35:57 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Nov 29 05:35:57 localhost systemd[1]: Stopped File System Check on Root Device.
Nov 29 05:35:57 localhost systemd[1]: Stopped Journal Service.
Nov 29 05:35:57 localhost systemd[1]: Starting Journal Service...
Nov 29 05:35:57 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Nov 29 05:35:57 localhost systemd[1]: Starting Generate network units from Kernel command line...
Nov 29 05:35:57 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 29 05:35:57 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Nov 29 05:35:57 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Nov 29 05:35:57 localhost systemd[1]: Starting Apply Kernel Variables...
Nov 29 05:35:57 localhost kernel: fuse: init (API version 7.37)
Nov 29 05:35:57 localhost systemd[1]: Starting Coldplug All udev Devices...
Nov 29 05:35:57 localhost systemd[1]: Mounted Huge Pages File System.
Nov 29 05:35:57 localhost systemd[1]: Mounted POSIX Message Queue File System.
Nov 29 05:35:57 localhost systemd-journald[682]: Journal started
Nov 29 05:35:57 localhost systemd-journald[682]: Runtime Journal (/run/log/journal/1f988c78c563e12389ab342aced42dbb) is 8.0M, max 153.6M, 145.6M free.
Nov 29 05:35:57 localhost systemd[1]: Queued start job for default target Multi-User System.
Nov 29 05:35:57 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Nov 29 05:35:57 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Nov 29 05:35:57 localhost systemd[1]: Started Journal Service.
Nov 29 05:35:57 localhost systemd[1]: Mounted Kernel Debug File System.
Nov 29 05:35:57 localhost systemd[1]: Mounted Kernel Trace File System.
Nov 29 05:35:57 localhost systemd[1]: Finished Create List of Static Device Nodes.
Nov 29 05:35:57 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 29 05:35:57 localhost systemd[1]: Finished Load Kernel Module configfs.
Nov 29 05:35:57 localhost kernel: ACPI: bus type drm_connector registered
Nov 29 05:35:57 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Nov 29 05:35:57 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Nov 29 05:35:57 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Nov 29 05:35:57 localhost systemd[1]: Finished Load Kernel Module drm.
Nov 29 05:35:57 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Nov 29 05:35:57 localhost systemd[1]: Finished Load Kernel Module fuse.
Nov 29 05:35:57 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Nov 29 05:35:57 localhost systemd[1]: Finished Generate network units from Kernel command line.
Nov 29 05:35:57 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Nov 29 05:35:57 localhost systemd[1]: Finished Apply Kernel Variables.
Nov 29 05:35:57 localhost systemd[1]: Mounting FUSE Control File System...
Nov 29 05:35:57 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 29 05:35:57 localhost systemd[1]: Starting Rebuild Hardware Database...
Nov 29 05:35:57 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Nov 29 05:35:57 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Nov 29 05:35:57 localhost systemd[1]: Starting Load/Save OS Random Seed...
Nov 29 05:35:57 localhost systemd[1]: Starting Create System Users...
Nov 29 05:35:57 localhost systemd-journald[682]: Runtime Journal (/run/log/journal/1f988c78c563e12389ab342aced42dbb) is 8.0M, max 153.6M, 145.6M free.
Nov 29 05:35:57 localhost systemd-journald[682]: Received client request to flush runtime journal.
Nov 29 05:35:57 localhost systemd[1]: Mounted FUSE Control File System.
Nov 29 05:35:57 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Nov 29 05:35:57 localhost systemd[1]: Finished Coldplug All udev Devices.
Nov 29 05:35:57 localhost systemd[1]: Finished Create System Users.
Nov 29 05:35:57 localhost systemd[1]: Finished Load/Save OS Random Seed.
Nov 29 05:35:57 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 29 05:35:57 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 29 05:35:57 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 29 05:35:57 localhost systemd[1]: Reached target Preparation for Local File Systems.
Nov 29 05:35:57 localhost systemd[1]: Reached target Local File Systems.
Nov 29 05:35:57 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Nov 29 05:35:57 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Nov 29 05:35:57 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Nov 29 05:35:57 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Nov 29 05:35:57 localhost systemd[1]: Starting Automatic Boot Loader Update...
Nov 29 05:35:57 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Nov 29 05:35:57 localhost systemd[1]: Starting Create Volatile Files and Directories...
Nov 29 05:35:57 localhost bootctl[700]: Couldn't find EFI system partition, skipping.
Nov 29 05:35:57 localhost systemd[1]: Finished Automatic Boot Loader Update.
Nov 29 05:35:57 localhost systemd[1]: Finished Create Volatile Files and Directories.
Nov 29 05:35:57 localhost systemd[1]: Starting Security Auditing Service...
Nov 29 05:35:57 localhost systemd[1]: Starting RPC Bind...
Nov 29 05:35:57 localhost systemd[1]: Starting Rebuild Journal Catalog...
Nov 29 05:35:58 localhost auditd[706]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Nov 29 05:35:58 localhost auditd[706]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Nov 29 05:35:58 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Nov 29 05:35:58 localhost systemd[1]: Finished Rebuild Journal Catalog.
Nov 29 05:35:58 localhost augenrules[711]: /sbin/augenrules: No change
Nov 29 05:35:58 localhost augenrules[726]: No rules
Nov 29 05:35:58 localhost augenrules[726]: enabled 1
Nov 29 05:35:58 localhost augenrules[726]: failure 1
Nov 29 05:35:58 localhost augenrules[726]: pid 706
Nov 29 05:35:58 localhost augenrules[726]: rate_limit 0
Nov 29 05:35:58 localhost augenrules[726]: backlog_limit 8192
Nov 29 05:35:58 localhost augenrules[726]: lost 0
Nov 29 05:35:58 localhost augenrules[726]: backlog 0
Nov 29 05:35:58 localhost augenrules[726]: backlog_wait_time 60000
Nov 29 05:35:58 localhost augenrules[726]: backlog_wait_time_actual 0
Nov 29 05:35:58 localhost systemd[1]: Started Security Auditing Service.
Nov 29 05:35:58 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Nov 29 05:35:58 localhost systemd[1]: Started RPC Bind.
Nov 29 05:35:58 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Nov 29 05:35:58 localhost systemd[1]: Finished Rebuild Hardware Database.
Nov 29 05:35:58 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 29 05:35:58 localhost systemd[1]: Starting Update is Completed...
Nov 29 05:35:58 localhost systemd[1]: Finished Update is Completed.
Nov 29 05:35:58 localhost systemd-udevd[734]: Using default interface naming scheme 'rhel-9.0'.
Nov 29 05:35:58 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 29 05:35:58 localhost systemd[1]: Reached target System Initialization.
Nov 29 05:35:58 localhost systemd[1]: Started dnf makecache --timer.
Nov 29 05:35:58 localhost systemd[1]: Started Daily rotation of log files.
Nov 29 05:35:58 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Nov 29 05:35:58 localhost systemd[1]: Reached target Timer Units.
Nov 29 05:35:58 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Nov 29 05:35:58 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Nov 29 05:35:58 localhost systemd[1]: Reached target Socket Units.
Nov 29 05:35:58 localhost systemd[1]: Starting D-Bus System Message Bus...
Nov 29 05:35:58 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 29 05:35:58 localhost systemd[1]: Starting Load Kernel Module configfs...
Nov 29 05:35:58 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 29 05:35:58 localhost systemd[1]: Finished Load Kernel Module configfs.
Nov 29 05:35:58 localhost systemd-udevd[746]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 05:35:58 localhost systemd[1]: Started D-Bus System Message Bus.
Nov 29 05:35:58 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Nov 29 05:35:58 localhost systemd[1]: Reached target Basic System.
Nov 29 05:35:58 localhost dbus-broker-lau[772]: Ready
Nov 29 05:35:58 localhost systemd[1]: Starting NTP client/server...
Nov 29 05:35:58 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Nov 29 05:35:58 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Nov 29 05:35:58 localhost systemd[1]: Starting IPv4 firewall with iptables...
Nov 29 05:35:58 localhost systemd[1]: Started irqbalance daemon.
Nov 29 05:35:58 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Nov 29 05:35:58 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 29 05:35:58 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 29 05:35:58 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 29 05:35:58 localhost systemd[1]: Reached target sshd-keygen.target.
Nov 29 05:35:58 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Nov 29 05:35:58 localhost systemd[1]: Reached target User and Group Name Lookups.
Nov 29 05:35:58 localhost systemd[1]: Starting User Login Management...
Nov 29 05:35:58 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Nov 29 05:35:58 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Nov 29 05:35:58 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Nov 29 05:35:58 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Nov 29 05:35:58 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Nov 29 05:35:58 localhost chronyd[793]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Nov 29 05:35:58 localhost chronyd[793]: Loaded 0 symmetric keys
Nov 29 05:35:58 localhost chronyd[793]: Using right/UTC timezone to obtain leap second data
Nov 29 05:35:58 localhost chronyd[793]: Loaded seccomp filter (level 2)
Nov 29 05:35:58 localhost systemd[1]: Started NTP client/server.
Nov 29 05:35:58 localhost systemd-logind[788]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 29 05:35:58 localhost systemd-logind[788]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 29 05:35:58 localhost systemd-logind[788]: New seat seat0.
Nov 29 05:35:58 localhost systemd[1]: Started User Login Management.
Nov 29 05:35:58 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Nov 29 05:35:58 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Nov 29 05:35:58 localhost kernel: Console: switching to colour dummy device 80x25
Nov 29 05:35:58 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Nov 29 05:35:58 localhost kernel: [drm] features: -context_init
Nov 29 05:35:58 localhost kernel: [drm] number of scanouts: 1
Nov 29 05:35:58 localhost kernel: [drm] number of cap sets: 0
Nov 29 05:35:58 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Nov 29 05:35:58 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Nov 29 05:35:58 localhost kernel: Console: switching to colour frame buffer device 128x48
Nov 29 05:35:58 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Nov 29 05:35:58 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Nov 29 05:35:58 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Nov 29 05:35:58 localhost kernel: kvm_amd: TSC scaling supported
Nov 29 05:35:58 localhost kernel: kvm_amd: Nested Virtualization enabled
Nov 29 05:35:58 localhost kernel: kvm_amd: Nested Paging enabled
Nov 29 05:35:58 localhost kernel: kvm_amd: LBR virtualization supported
Nov 29 05:35:58 localhost iptables.init[782]: iptables: Applying firewall rules: [  OK  ]
Nov 29 05:35:59 localhost systemd[1]: Finished IPv4 firewall with iptables.
Nov 29 05:35:59 localhost cloud-init[843]: Cloud-init v. 24.4-7.el9 running 'init-local' at Sat, 29 Nov 2025 05:35:59 +0000. Up 6.89 seconds.
Nov 29 05:35:59 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Nov 29 05:35:59 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Nov 29 05:35:59 localhost systemd[1]: run-cloud\x2dinit-tmp-tmp2l96ejeg.mount: Deactivated successfully.
Nov 29 05:35:59 localhost systemd[1]: Starting Hostname Service...
Nov 29 05:35:59 localhost systemd[1]: Started Hostname Service.
Nov 29 05:35:59 np0005539503.novalocal systemd-hostnamed[857]: Hostname set to <np0005539503.novalocal> (static)
Nov 29 05:35:59 np0005539503.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Nov 29 05:35:59 np0005539503.novalocal systemd[1]: Reached target Preparation for Network.
Nov 29 05:35:59 np0005539503.novalocal systemd[1]: Starting Network Manager...
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8237] NetworkManager (version 1.54.1-1.el9) is starting... (boot:3b7c8c50-55c8-43c8-86aa-f5fa30ebf228)
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8244] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8313] manager[0x5579a9cd1080]: monitoring kernel firmware directory '/lib/firmware'.
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8363] hostname: hostname: using hostnamed
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8363] hostname: static hostname changed from (none) to "np0005539503.novalocal"
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8367] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8464] manager[0x5579a9cd1080]: rfkill: Wi-Fi hardware radio set enabled
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8465] manager[0x5579a9cd1080]: rfkill: WWAN hardware radio set enabled
Nov 29 05:35:59 np0005539503.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8521] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8522] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8522] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8523] manager: Networking is enabled by state file
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8526] settings: Loaded settings plugin: keyfile (internal)
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8540] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8563] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8576] dhcp: init: Using DHCP client 'internal'
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8579] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8593] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8602] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8611] device (lo): Activation: starting connection 'lo' (43d811ee-2cbe-4425-a4fb-d7d92aa3b968)
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8621] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8624] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8659] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8663] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8664] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8666] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8667] device (eth0): carrier: link connected
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8669] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8675] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8681] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8685] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8686] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8687] manager: NetworkManager state is now CONNECTING
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8689] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8694] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8696] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8732] dhcp4 (eth0): state changed new lease, address=38.102.83.110
Nov 29 05:35:59 np0005539503.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8738] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8766] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 05:35:59 np0005539503.novalocal systemd[1]: Started Network Manager.
Nov 29 05:35:59 np0005539503.novalocal systemd[1]: Reached target Network.
Nov 29 05:35:59 np0005539503.novalocal systemd[1]: Starting Network Manager Wait Online...
Nov 29 05:35:59 np0005539503.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Nov 29 05:35:59 np0005539503.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8958] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8960] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8960] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8964] device (lo): Activation: successful, device activated.
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8969] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8971] manager: NetworkManager state is now CONNECTED_SITE
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8973] device (eth0): Activation: successful, device activated.
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8977] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 29 05:35:59 np0005539503.novalocal NetworkManager[861]: <info>  [1764394559.8979] manager: startup complete
Nov 29 05:35:59 np0005539503.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Nov 29 05:35:59 np0005539503.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 29 05:35:59 np0005539503.novalocal systemd[1]: Reached target NFS client services.
Nov 29 05:35:59 np0005539503.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Nov 29 05:35:59 np0005539503.novalocal systemd[1]: Reached target Remote File Systems.
Nov 29 05:35:59 np0005539503.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 29 05:35:59 np0005539503.novalocal systemd[1]: Finished Network Manager Wait Online.
Nov 29 05:35:59 np0005539503.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Nov 29 05:36:00 np0005539503.novalocal cloud-init[927]: Cloud-init v. 24.4-7.el9 running 'init' at Sat, 29 Nov 2025 05:36:00 +0000. Up 7.94 seconds.
Nov 29 05:36:00 np0005539503.novalocal cloud-init[927]: ci-info: ++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Nov 29 05:36:00 np0005539503.novalocal cloud-init[927]: ci-info: +--------+------+-----------------------------+---------------+--------+-------------------+
Nov 29 05:36:00 np0005539503.novalocal cloud-init[927]: ci-info: | Device |  Up  |           Address           |      Mask     | Scope  |     Hw-Address    |
Nov 29 05:36:00 np0005539503.novalocal cloud-init[927]: ci-info: +--------+------+-----------------------------+---------------+--------+-------------------+
Nov 29 05:36:00 np0005539503.novalocal cloud-init[927]: ci-info: |  eth0  | True |        38.102.83.110        | 255.255.255.0 | global | fa:16:3e:22:03:49 |
Nov 29 05:36:00 np0005539503.novalocal cloud-init[927]: ci-info: |  eth0  | True | fe80::f816:3eff:fe22:349/64 |       .       |  link  | fa:16:3e:22:03:49 |
Nov 29 05:36:00 np0005539503.novalocal cloud-init[927]: ci-info: |   lo   | True |          127.0.0.1          |   255.0.0.0   |  host  |         .         |
Nov 29 05:36:00 np0005539503.novalocal cloud-init[927]: ci-info: |   lo   | True |           ::1/128           |       .       |  host  |         .         |
Nov 29 05:36:00 np0005539503.novalocal cloud-init[927]: ci-info: +--------+------+-----------------------------+---------------+--------+-------------------+
Nov 29 05:36:00 np0005539503.novalocal cloud-init[927]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Nov 29 05:36:00 np0005539503.novalocal cloud-init[927]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 29 05:36:00 np0005539503.novalocal cloud-init[927]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Nov 29 05:36:00 np0005539503.novalocal cloud-init[927]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 29 05:36:00 np0005539503.novalocal cloud-init[927]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Nov 29 05:36:00 np0005539503.novalocal cloud-init[927]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Nov 29 05:36:00 np0005539503.novalocal cloud-init[927]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Nov 29 05:36:00 np0005539503.novalocal cloud-init[927]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 29 05:36:00 np0005539503.novalocal cloud-init[927]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Nov 29 05:36:00 np0005539503.novalocal cloud-init[927]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 29 05:36:00 np0005539503.novalocal cloud-init[927]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Nov 29 05:36:00 np0005539503.novalocal cloud-init[927]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 29 05:36:00 np0005539503.novalocal cloud-init[927]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Nov 29 05:36:00 np0005539503.novalocal cloud-init[927]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Nov 29 05:36:00 np0005539503.novalocal cloud-init[927]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 29 05:36:01 np0005539503.novalocal useradd[993]: new group: name=cloud-user, GID=1001
Nov 29 05:36:01 np0005539503.novalocal useradd[993]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Nov 29 05:36:01 np0005539503.novalocal useradd[993]: add 'cloud-user' to group 'adm'
Nov 29 05:36:01 np0005539503.novalocal useradd[993]: add 'cloud-user' to group 'systemd-journal'
Nov 29 05:36:01 np0005539503.novalocal useradd[993]: add 'cloud-user' to shadow group 'adm'
Nov 29 05:36:01 np0005539503.novalocal useradd[993]: add 'cloud-user' to shadow group 'systemd-journal'
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: Generating public/private rsa key pair.
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: The key fingerprint is:
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: SHA256:ARvITHEZtC0zEJWvZUqbN+YsHvnAixDQemKOHTb+Nbk root@np0005539503.novalocal
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: The key's randomart image is:
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: +---[RSA 3072]----+
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: |   +=*B+         |
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: | .  +oo*         |
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: |. .   *.o        |
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: | o    .++.       |
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: |o.*  . BS        |
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: |+* + .=o+        |
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: |..+   O= .       |
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: |   o o.Bo        |
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: |    o.E..        |
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: +----[SHA256]-----+
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: Generating public/private ecdsa key pair.
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: The key fingerprint is:
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: SHA256:IWlN8ktN8d7Kpxo/3WvYDnKTuRAvEqkIfgwV/a5lWL0 root@np0005539503.novalocal
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: The key's randomart image is:
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: +---[ECDSA 256]---+
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: |     .o . o.     |
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: |      .B o .     |
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: |     .+ * o .    |
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: |    .. o =.o .   |
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: |   o    So .o .  |
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: |  . + ...+.Eo.o  |
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: |   . + .+..+oO+. |
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: |    .  .  .o*+=o.|
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: |          ..oooo.|
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: +----[SHA256]-----+
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: Generating public/private ed25519 key pair.
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: The key fingerprint is:
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: SHA256:xpK434GvULOZC1aA6EJWb4aBuHcBtlOEGgxDDqnOQws root@np0005539503.novalocal
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: The key's randomart image is:
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: +--[ED25519 256]--+
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: |Oo+*o            |
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: |**o+=            |
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: |oB+..=           |
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: |E...+o o         |
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: |*o... * S        |
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: |.=   + O         |
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: |  . = = .        |
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: |   . + + .       |
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: |      +.o        |
Nov 29 05:36:01 np0005539503.novalocal cloud-init[927]: +----[SHA256]-----+
Nov 29 05:36:01 np0005539503.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Nov 29 05:36:01 np0005539503.novalocal systemd[1]: Reached target Cloud-config availability.
Nov 29 05:36:01 np0005539503.novalocal systemd[1]: Reached target Network is Online.
Nov 29 05:36:01 np0005539503.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Nov 29 05:36:01 np0005539503.novalocal systemd[1]: Starting Crash recovery kernel arming...
Nov 29 05:36:01 np0005539503.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Nov 29 05:36:01 np0005539503.novalocal systemd[1]: Starting System Logging Service...
Nov 29 05:36:01 np0005539503.novalocal sm-notify[1009]: Version 2.5.4 starting
Nov 29 05:36:01 np0005539503.novalocal systemd[1]: Starting OpenSSH server daemon...
Nov 29 05:36:01 np0005539503.novalocal systemd[1]: Starting Permit User Sessions...
Nov 29 05:36:01 np0005539503.novalocal systemd[1]: Started Notify NFS peers of a restart.
Nov 29 05:36:01 np0005539503.novalocal systemd[1]: Finished Permit User Sessions.
Nov 29 05:36:01 np0005539503.novalocal sshd[1011]: Server listening on 0.0.0.0 port 22.
Nov 29 05:36:01 np0005539503.novalocal sshd[1011]: Server listening on :: port 22.
Nov 29 05:36:01 np0005539503.novalocal systemd[1]: Started Command Scheduler.
Nov 29 05:36:01 np0005539503.novalocal systemd[1]: Started Getty on tty1.
Nov 29 05:36:01 np0005539503.novalocal crond[1014]: (CRON) STARTUP (1.5.7)
Nov 29 05:36:01 np0005539503.novalocal crond[1014]: (CRON) INFO (Syslog will be used instead of sendmail.)
Nov 29 05:36:01 np0005539503.novalocal crond[1014]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 9% if used.)
Nov 29 05:36:01 np0005539503.novalocal crond[1014]: (CRON) INFO (running with inotify support)
Nov 29 05:36:01 np0005539503.novalocal rsyslogd[1010]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1010" x-info="https://www.rsyslog.com"] start
Nov 29 05:36:01 np0005539503.novalocal rsyslogd[1010]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Nov 29 05:36:01 np0005539503.novalocal systemd[1]: Started Serial Getty on ttyS0.
Nov 29 05:36:01 np0005539503.novalocal systemd[1]: Reached target Login Prompts.
Nov 29 05:36:01 np0005539503.novalocal systemd[1]: Started OpenSSH server daemon.
Nov 29 05:36:01 np0005539503.novalocal systemd[1]: Started System Logging Service.
Nov 29 05:36:01 np0005539503.novalocal systemd[1]: Reached target Multi-User System.
Nov 29 05:36:01 np0005539503.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Nov 29 05:36:01 np0005539503.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Nov 29 05:36:01 np0005539503.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Nov 29 05:36:01 np0005539503.novalocal rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 05:36:01 np0005539503.novalocal kdumpctl[1022]: kdump: No kdump initial ramdisk found.
Nov 29 05:36:01 np0005539503.novalocal kdumpctl[1022]: kdump: Rebuilding /boot/initramfs-5.14.0-642.el9.x86_64kdump.img
Nov 29 05:36:01 np0005539503.novalocal cloud-init[1114]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Sat, 29 Nov 2025 05:36:01 +0000. Up 9.58 seconds.
Nov 29 05:36:01 np0005539503.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Nov 29 05:36:01 np0005539503.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Nov 29 05:36:02 np0005539503.novalocal dracut[1270]: dracut-057-102.git20250818.el9
Nov 29 05:36:02 np0005539503.novalocal cloud-init[1288]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Sat, 29 Nov 2025 05:36:02 +0000. Up 9.98 seconds.
Nov 29 05:36:02 np0005539503.novalocal cloud-init[1290]: #############################################################
Nov 29 05:36:02 np0005539503.novalocal cloud-init[1291]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Nov 29 05:36:02 np0005539503.novalocal cloud-init[1293]: 256 SHA256:IWlN8ktN8d7Kpxo/3WvYDnKTuRAvEqkIfgwV/a5lWL0 root@np0005539503.novalocal (ECDSA)
Nov 29 05:36:02 np0005539503.novalocal cloud-init[1296]: 256 SHA256:xpK434GvULOZC1aA6EJWb4aBuHcBtlOEGgxDDqnOQws root@np0005539503.novalocal (ED25519)
Nov 29 05:36:02 np0005539503.novalocal cloud-init[1301]: 3072 SHA256:ARvITHEZtC0zEJWvZUqbN+YsHvnAixDQemKOHTb+Nbk root@np0005539503.novalocal (RSA)
Nov 29 05:36:02 np0005539503.novalocal cloud-init[1303]: -----END SSH HOST KEY FINGERPRINTS-----
Nov 29 05:36:02 np0005539503.novalocal cloud-init[1309]: #############################################################
Nov 29 05:36:02 np0005539503.novalocal dracut[1274]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-642.el9.x86_64kdump.img 5.14.0-642.el9.x86_64
Nov 29 05:36:02 np0005539503.novalocal cloud-init[1288]: Cloud-init v. 24.4-7.el9 finished at Sat, 29 Nov 2025 05:36:02 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 10.14 seconds
Nov 29 05:36:02 np0005539503.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Nov 29 05:36:02 np0005539503.novalocal systemd[1]: Reached target Cloud-init target.
Nov 29 05:36:02 np0005539503.novalocal sshd-session[1357]: Unable to negotiate with 38.102.83.114 port 50210: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Nov 29 05:36:02 np0005539503.novalocal sshd-session[1364]: Unable to negotiate with 38.102.83.114 port 50220: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Nov 29 05:36:02 np0005539503.novalocal sshd-session[1369]: Unable to negotiate with 38.102.83.114 port 50226: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Nov 29 05:36:02 np0005539503.novalocal sshd-session[1347]: Connection closed by 38.102.83.114 port 50196 [preauth]
Nov 29 05:36:02 np0005539503.novalocal sshd-session[1384]: Unable to negotiate with 38.102.83.114 port 50256: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Nov 29 05:36:02 np0005539503.novalocal sshd-session[1359]: Connection closed by 38.102.83.114 port 50212 [preauth]
Nov 29 05:36:02 np0005539503.novalocal sshd-session[1389]: Unable to negotiate with 38.102.83.114 port 50270: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Nov 29 05:36:02 np0005539503.novalocal sshd-session[1374]: Connection closed by 38.102.83.114 port 50238 [preauth]
Nov 29 05:36:02 np0005539503.novalocal sshd-session[1379]: Connection closed by 38.102.83.114 port 50250 [preauth]
Nov 29 05:36:02 np0005539503.novalocal dracut[1274]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Nov 29 05:36:02 np0005539503.novalocal dracut[1274]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Nov 29 05:36:02 np0005539503.novalocal dracut[1274]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Nov 29 05:36:02 np0005539503.novalocal dracut[1274]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 29 05:36:02 np0005539503.novalocal dracut[1274]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 29 05:36:02 np0005539503.novalocal dracut[1274]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 29 05:36:02 np0005539503.novalocal dracut[1274]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 29 05:36:02 np0005539503.novalocal dracut[1274]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 29 05:36:02 np0005539503.novalocal dracut[1274]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 29 05:36:02 np0005539503.novalocal dracut[1274]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 29 05:36:02 np0005539503.novalocal dracut[1274]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 29 05:36:02 np0005539503.novalocal dracut[1274]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 29 05:36:02 np0005539503.novalocal dracut[1274]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 29 05:36:02 np0005539503.novalocal dracut[1274]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: Module 'ifcfg' will not be installed, because it's in the list to be omitted!
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: Module 'plymouth' will not be installed, because it's in the list to be omitted!
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: Module 'resume' will not be installed, because it's in the list to be omitted!
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: Module 'earlykdump' will not be installed, because it's in the list to be omitted!
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: memstrack is not available
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: memstrack is not available
Nov 29 05:36:03 np0005539503.novalocal dracut[1274]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 29 05:36:04 np0005539503.novalocal dracut[1274]: *** Including module: systemd ***
Nov 29 05:36:04 np0005539503.novalocal dracut[1274]: *** Including module: fips ***
Nov 29 05:36:04 np0005539503.novalocal dracut[1274]: *** Including module: systemd-initrd ***
Nov 29 05:36:04 np0005539503.novalocal chronyd[793]: Selected source 162.159.200.123 (2.centos.pool.ntp.org)
Nov 29 05:36:04 np0005539503.novalocal chronyd[793]: System clock TAI offset set to 37 seconds
Nov 29 05:36:04 np0005539503.novalocal dracut[1274]: *** Including module: i18n ***
Nov 29 05:36:04 np0005539503.novalocal dracut[1274]: *** Including module: drm ***
Nov 29 05:36:05 np0005539503.novalocal dracut[1274]: *** Including module: prefixdevname ***
Nov 29 05:36:05 np0005539503.novalocal dracut[1274]: *** Including module: kernel-modules ***
Nov 29 05:36:05 np0005539503.novalocal kernel: block vda: the capability attribute has been deprecated.
Nov 29 05:36:05 np0005539503.novalocal dracut[1274]: *** Including module: kernel-modules-extra ***
Nov 29 05:36:05 np0005539503.novalocal dracut[1274]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Nov 29 05:36:05 np0005539503.novalocal dracut[1274]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Nov 29 05:36:05 np0005539503.novalocal dracut[1274]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Nov 29 05:36:05 np0005539503.novalocal dracut[1274]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Nov 29 05:36:05 np0005539503.novalocal dracut[1274]: *** Including module: qemu ***
Nov 29 05:36:06 np0005539503.novalocal dracut[1274]: *** Including module: fstab-sys ***
Nov 29 05:36:06 np0005539503.novalocal dracut[1274]: *** Including module: rootfs-block ***
Nov 29 05:36:06 np0005539503.novalocal dracut[1274]: *** Including module: terminfo ***
Nov 29 05:36:06 np0005539503.novalocal dracut[1274]: *** Including module: udev-rules ***
Nov 29 05:36:06 np0005539503.novalocal dracut[1274]: Skipping udev rule: 91-permissions.rules
Nov 29 05:36:06 np0005539503.novalocal dracut[1274]: Skipping udev rule: 80-drivers-modprobe.rules
Nov 29 05:36:06 np0005539503.novalocal dracut[1274]: *** Including module: virtiofs ***
Nov 29 05:36:06 np0005539503.novalocal dracut[1274]: *** Including module: dracut-systemd ***
Nov 29 05:36:06 np0005539503.novalocal dracut[1274]: *** Including module: usrmount ***
Nov 29 05:36:06 np0005539503.novalocal dracut[1274]: *** Including module: base ***
Nov 29 05:36:06 np0005539503.novalocal dracut[1274]: *** Including module: fs-lib ***
Nov 29 05:36:07 np0005539503.novalocal dracut[1274]: *** Including module: kdumpbase ***
Nov 29 05:36:07 np0005539503.novalocal dracut[1274]: *** Including module: microcode_ctl-fw_dir_override ***
Nov 29 05:36:07 np0005539503.novalocal dracut[1274]:   microcode_ctl module: mangling fw_dir
Nov 29 05:36:07 np0005539503.novalocal dracut[1274]:     microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Nov 29 05:36:07 np0005539503.novalocal dracut[1274]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Nov 29 05:36:07 np0005539503.novalocal dracut[1274]:     microcode_ctl: configuration "intel" is ignored
Nov 29 05:36:07 np0005539503.novalocal dracut[1274]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Nov 29 05:36:07 np0005539503.novalocal dracut[1274]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Nov 29 05:36:07 np0005539503.novalocal dracut[1274]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Nov 29 05:36:07 np0005539503.novalocal dracut[1274]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Nov 29 05:36:07 np0005539503.novalocal dracut[1274]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Nov 29 05:36:07 np0005539503.novalocal dracut[1274]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Nov 29 05:36:07 np0005539503.novalocal dracut[1274]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Nov 29 05:36:07 np0005539503.novalocal dracut[1274]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Nov 29 05:36:07 np0005539503.novalocal dracut[1274]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Nov 29 05:36:07 np0005539503.novalocal dracut[1274]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Nov 29 05:36:07 np0005539503.novalocal dracut[1274]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Nov 29 05:36:07 np0005539503.novalocal dracut[1274]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Nov 29 05:36:07 np0005539503.novalocal dracut[1274]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Nov 29 05:36:07 np0005539503.novalocal dracut[1274]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Nov 29 05:36:07 np0005539503.novalocal dracut[1274]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Nov 29 05:36:07 np0005539503.novalocal dracut[1274]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Nov 29 05:36:07 np0005539503.novalocal dracut[1274]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Nov 29 05:36:07 np0005539503.novalocal dracut[1274]:     microcode_ctl: configuration "intel-06-8f-08" is ignored
Nov 29 05:36:07 np0005539503.novalocal dracut[1274]:     microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Nov 29 05:36:07 np0005539503.novalocal dracut[1274]: *** Including module: openssl ***
Nov 29 05:36:07 np0005539503.novalocal dracut[1274]: *** Including module: shutdown ***
Nov 29 05:36:08 np0005539503.novalocal dracut[1274]: *** Including module: squash ***
Nov 29 05:36:08 np0005539503.novalocal dracut[1274]: *** Including modules done ***
Nov 29 05:36:08 np0005539503.novalocal dracut[1274]: *** Installing kernel module dependencies ***
Nov 29 05:36:08 np0005539503.novalocal dracut[1274]: *** Installing kernel module dependencies done ***
Nov 29 05:36:08 np0005539503.novalocal dracut[1274]: *** Resolving executable dependencies ***
Nov 29 05:36:09 np0005539503.novalocal irqbalance[786]: Cannot change IRQ 35 affinity: Operation not permitted
Nov 29 05:36:09 np0005539503.novalocal irqbalance[786]: IRQ 35 affinity is now unmanaged
Nov 29 05:36:09 np0005539503.novalocal irqbalance[786]: Cannot change IRQ 33 affinity: Operation not permitted
Nov 29 05:36:09 np0005539503.novalocal irqbalance[786]: IRQ 33 affinity is now unmanaged
Nov 29 05:36:09 np0005539503.novalocal irqbalance[786]: Cannot change IRQ 31 affinity: Operation not permitted
Nov 29 05:36:09 np0005539503.novalocal irqbalance[786]: IRQ 31 affinity is now unmanaged
Nov 29 05:36:09 np0005539503.novalocal irqbalance[786]: Cannot change IRQ 28 affinity: Operation not permitted
Nov 29 05:36:09 np0005539503.novalocal irqbalance[786]: IRQ 28 affinity is now unmanaged
Nov 29 05:36:09 np0005539503.novalocal irqbalance[786]: Cannot change IRQ 34 affinity: Operation not permitted
Nov 29 05:36:09 np0005539503.novalocal irqbalance[786]: IRQ 34 affinity is now unmanaged
Nov 29 05:36:09 np0005539503.novalocal irqbalance[786]: Cannot change IRQ 32 affinity: Operation not permitted
Nov 29 05:36:09 np0005539503.novalocal irqbalance[786]: IRQ 32 affinity is now unmanaged
Nov 29 05:36:09 np0005539503.novalocal irqbalance[786]: Cannot change IRQ 30 affinity: Operation not permitted
Nov 29 05:36:09 np0005539503.novalocal irqbalance[786]: IRQ 30 affinity is now unmanaged
Nov 29 05:36:09 np0005539503.novalocal irqbalance[786]: Cannot change IRQ 29 affinity: Operation not permitted
Nov 29 05:36:09 np0005539503.novalocal irqbalance[786]: IRQ 29 affinity is now unmanaged
Nov 29 05:36:10 np0005539503.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 29 05:36:10 np0005539503.novalocal dracut[1274]: *** Resolving executable dependencies done ***
Nov 29 05:36:10 np0005539503.novalocal dracut[1274]: *** Generating early-microcode cpio image ***
Nov 29 05:36:10 np0005539503.novalocal dracut[1274]: *** Store current command line parameters ***
Nov 29 05:36:10 np0005539503.novalocal dracut[1274]: Stored kernel commandline:
Nov 29 05:36:10 np0005539503.novalocal dracut[1274]: No dracut internal kernel commandline stored in the initramfs
Nov 29 05:36:10 np0005539503.novalocal dracut[1274]: *** Install squash loader ***
Nov 29 05:36:11 np0005539503.novalocal dracut[1274]: *** Squashing the files inside the initramfs ***
Nov 29 05:36:12 np0005539503.novalocal dracut[1274]: *** Squashing the files inside the initramfs done ***
Nov 29 05:36:12 np0005539503.novalocal dracut[1274]: *** Creating image file '/boot/initramfs-5.14.0-642.el9.x86_64kdump.img' ***
Nov 29 05:36:12 np0005539503.novalocal dracut[1274]: *** Hardlinking files ***
Nov 29 05:36:12 np0005539503.novalocal dracut[1274]: Mode:           real
Nov 29 05:36:12 np0005539503.novalocal dracut[1274]: Files:          50
Nov 29 05:36:12 np0005539503.novalocal dracut[1274]: Linked:         0 files
Nov 29 05:36:12 np0005539503.novalocal dracut[1274]: Compared:       0 xattrs
Nov 29 05:36:12 np0005539503.novalocal dracut[1274]: Compared:       0 files
Nov 29 05:36:12 np0005539503.novalocal dracut[1274]: Saved:          0 B
Nov 29 05:36:12 np0005539503.novalocal dracut[1274]: Duration:       0.000489 seconds
Nov 29 05:36:12 np0005539503.novalocal dracut[1274]: *** Hardlinking files done ***
Nov 29 05:36:12 np0005539503.novalocal dracut[1274]: *** Creating initramfs image file '/boot/initramfs-5.14.0-642.el9.x86_64kdump.img' done ***
Nov 29 05:36:13 np0005539503.novalocal kdumpctl[1022]: kdump: kexec: loaded kdump kernel
Nov 29 05:36:13 np0005539503.novalocal kdumpctl[1022]: kdump: Starting kdump: [OK]
Nov 29 05:36:13 np0005539503.novalocal systemd[1]: Finished Crash recovery kernel arming.
Nov 29 05:36:13 np0005539503.novalocal systemd[1]: Startup finished in 1.760s (kernel) + 2.864s (initrd) + 16.256s (userspace) = 20.880s.
Nov 29 05:36:24 np0005539503.novalocal sshd-session[4300]: Connection closed by 45.78.219.251 port 46330 [preauth]
Nov 29 05:36:29 np0005539503.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 29 05:38:11 np0005539503.novalocal sshd[1011]: Timeout before authentication for connection from 182.61.53.1 to 38.102.83.110, pid = 3936
Nov 29 05:38:19 np0005539503.novalocal sshd-session[4304]: Invalid user rstudio from 36.50.176.16 port 37844
Nov 29 05:38:19 np0005539503.novalocal sshd-session[4304]: Received disconnect from 36.50.176.16 port 37844:11: Bye Bye [preauth]
Nov 29 05:38:19 np0005539503.novalocal sshd-session[4304]: Disconnected from invalid user rstudio 36.50.176.16 port 37844 [preauth]
Nov 29 05:38:52 np0005539503.novalocal sshd-session[4306]: Received disconnect from 45.78.219.251 port 46372:11: Bye Bye [preauth]
Nov 29 05:38:52 np0005539503.novalocal sshd-session[4306]: Disconnected from 45.78.219.251 port 46372 [preauth]
Nov 29 05:40:26 np0005539503.novalocal sshd-session[4308]: Received disconnect from 36.50.176.16 port 47652:11: Bye Bye [preauth]
Nov 29 05:40:26 np0005539503.novalocal sshd-session[4308]: Disconnected from authenticating user root 36.50.176.16 port 47652 [preauth]
Nov 29 05:42:30 np0005539503.novalocal sshd-session[4313]: Connection closed by 36.50.176.16 port 33708 [preauth]
Nov 29 05:43:34 np0005539503.novalocal sshd-session[4315]: Connection closed by authenticating user root 141.94.154.244 port 60294 [preauth]
Nov 29 05:46:14 np0005539503.novalocal sshd-session[4317]: Received disconnect from 45.78.219.251 port 37550:11: Bye Bye [preauth]
Nov 29 05:46:14 np0005539503.novalocal sshd-session[4317]: Disconnected from authenticating user root 45.78.219.251 port 37550 [preauth]
Nov 29 05:46:55 np0005539503.novalocal sshd-session[4320]: Accepted publickey for zuul from 38.102.83.114 port 48086 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Nov 29 05:46:55 np0005539503.novalocal systemd[1]: Created slice User Slice of UID 1000.
Nov 29 05:46:55 np0005539503.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Nov 29 05:46:55 np0005539503.novalocal systemd-logind[788]: New session 1 of user zuul.
Nov 29 05:46:55 np0005539503.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Nov 29 05:46:55 np0005539503.novalocal systemd[1]: Starting User Manager for UID 1000...
Nov 29 05:46:55 np0005539503.novalocal systemd[4324]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 05:46:55 np0005539503.novalocal systemd[4324]: Queued start job for default target Main User Target.
Nov 29 05:46:55 np0005539503.novalocal systemd[4324]: Created slice User Application Slice.
Nov 29 05:46:55 np0005539503.novalocal systemd[4324]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 29 05:46:55 np0005539503.novalocal systemd[4324]: Started Daily Cleanup of User's Temporary Directories.
Nov 29 05:46:55 np0005539503.novalocal systemd[4324]: Reached target Paths.
Nov 29 05:46:55 np0005539503.novalocal systemd[4324]: Reached target Timers.
Nov 29 05:46:55 np0005539503.novalocal systemd[4324]: Starting D-Bus User Message Bus Socket...
Nov 29 05:46:55 np0005539503.novalocal systemd[4324]: Starting Create User's Volatile Files and Directories...
Nov 29 05:46:55 np0005539503.novalocal systemd[4324]: Finished Create User's Volatile Files and Directories.
Nov 29 05:46:55 np0005539503.novalocal systemd[4324]: Listening on D-Bus User Message Bus Socket.
Nov 29 05:46:55 np0005539503.novalocal systemd[4324]: Reached target Sockets.
Nov 29 05:46:55 np0005539503.novalocal systemd[4324]: Reached target Basic System.
Nov 29 05:46:55 np0005539503.novalocal systemd[4324]: Reached target Main User Target.
Nov 29 05:46:55 np0005539503.novalocal systemd[4324]: Startup finished in 126ms.
Nov 29 05:46:55 np0005539503.novalocal systemd[1]: Started User Manager for UID 1000.
Nov 29 05:46:55 np0005539503.novalocal systemd[1]: Started Session 1 of User zuul.
Nov 29 05:46:55 np0005539503.novalocal sshd-session[4320]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 05:46:55 np0005539503.novalocal python3[4408]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 05:46:59 np0005539503.novalocal python3[4436]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 05:47:07 np0005539503.novalocal python3[4494]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 05:47:08 np0005539503.novalocal python3[4534]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Nov 29 05:47:10 np0005539503.novalocal python3[4560]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDgFkjZfiFEmT2Jql9lLFt6CMd+9slSl3MrU+Raer5Y68zzzczsYHXSYgggBZM5uz+gWk02zu4ocSLCc0JOe4EmLwZGL6Ezoic8MmIXP1BwfiaeAXto2OGK7Dc7os16Q0SND6rHgOqdWZh8Kyf2kkY5vrdl9/yfrpAOV4V0UE16RT1qCQW53Ky9IytfIZYMSXaZwSmcvRflB6YToX0wepfVb3xbVWsEBI209yBpJ9cNVY5dWwvu1IlNXbIGLhUr4j3UgrB2k+H2+ltPlEHfLXPB0E2e43vS9K00XtLqpM4JZoq24L0kLi1a3RwzEeG1NQhkGbdnesYTkGRJrh5LvfWLiF4tooJWI0nRVs7jaO/R3w1l7zjdLRrSJ0h7Ie09iYSVZ1nuUuZ77A8mwh/mgdp8FEle4ES1X0kEADcAPPXV/6wFLOHevKRKw+jWBtYusFM6hS74njbD8BM8P0xMUAgCMIw7t3AXjeZIFNjZLL1o2fplfERitOr2Mc7dMx1EvfM= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:47:10 np0005539503.novalocal python3[4584]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:47:11 np0005539503.novalocal python3[4683]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 05:47:11 np0005539503.novalocal python3[4754]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764395231.046904-251-176143482136989/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=913550f2748a46ca85451ad1c4228192_id_rsa follow=False checksum=527bb20bedeb4c076b14aeb265edb174c4d8c41f backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:47:12 np0005539503.novalocal python3[4877]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 05:47:12 np0005539503.novalocal python3[4948]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764395231.982619-306-280992285632625/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=913550f2748a46ca85451ad1c4228192_id_rsa.pub follow=False checksum=56c975ef54c9fc5ba54f09c3deb1770b074b7446 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:47:13 np0005539503.novalocal python3[4996]: ansible-ping Invoked with data=pong
Nov 29 05:47:14 np0005539503.novalocal python3[5020]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 05:47:17 np0005539503.novalocal python3[5078]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Nov 29 05:47:18 np0005539503.novalocal python3[5110]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:47:18 np0005539503.novalocal python3[5134]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:47:18 np0005539503.novalocal python3[5158]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:47:19 np0005539503.novalocal python3[5182]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:47:19 np0005539503.novalocal python3[5206]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:47:19 np0005539503.novalocal python3[5230]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:47:21 np0005539503.novalocal sudo[5254]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fspqkmyyahugydeesepvcldnkuglvnjc ; /usr/bin/python3'
Nov 29 05:47:21 np0005539503.novalocal sudo[5254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:47:21 np0005539503.novalocal python3[5256]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:47:21 np0005539503.novalocal sudo[5254]: pam_unix(sudo:session): session closed for user root
Nov 29 05:47:21 np0005539503.novalocal sudo[5332]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obltrllbmkyvzwteujvdkbcaarxbcbrn ; /usr/bin/python3'
Nov 29 05:47:21 np0005539503.novalocal sudo[5332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:47:22 np0005539503.novalocal python3[5334]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 05:47:22 np0005539503.novalocal sudo[5332]: pam_unix(sudo:session): session closed for user root
Nov 29 05:47:22 np0005539503.novalocal sudo[5405]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrvgldsfhkmzualuwttrylxjqcvrwlev ; /usr/bin/python3'
Nov 29 05:47:22 np0005539503.novalocal sudo[5405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:47:22 np0005539503.novalocal python3[5407]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764395241.643556-31-97438511461905/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:47:22 np0005539503.novalocal sudo[5405]: pam_unix(sudo:session): session closed for user root
Nov 29 05:47:23 np0005539503.novalocal python3[5455]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:47:23 np0005539503.novalocal python3[5479]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:47:23 np0005539503.novalocal python3[5503]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:47:24 np0005539503.novalocal python3[5527]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:47:24 np0005539503.novalocal python3[5551]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:47:24 np0005539503.novalocal python3[5575]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:47:24 np0005539503.novalocal python3[5599]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:47:25 np0005539503.novalocal python3[5623]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:47:25 np0005539503.novalocal python3[5647]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:47:25 np0005539503.novalocal python3[5671]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:47:26 np0005539503.novalocal python3[5695]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:47:26 np0005539503.novalocal python3[5719]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:47:26 np0005539503.novalocal python3[5743]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:47:26 np0005539503.novalocal python3[5767]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:47:27 np0005539503.novalocal python3[5791]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:47:27 np0005539503.novalocal python3[5815]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:47:27 np0005539503.novalocal python3[5839]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:47:28 np0005539503.novalocal python3[5863]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:47:28 np0005539503.novalocal python3[5887]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:47:28 np0005539503.novalocal python3[5911]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:47:28 np0005539503.novalocal python3[5935]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:47:29 np0005539503.novalocal python3[5959]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:47:29 np0005539503.novalocal python3[5983]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:47:29 np0005539503.novalocal python3[6007]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:47:29 np0005539503.novalocal python3[6031]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:47:30 np0005539503.novalocal python3[6055]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:47:32 np0005539503.novalocal sudo[6079]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsgzolwssbmdqwgcgzdshkjoqjpjimrl ; /usr/bin/python3'
Nov 29 05:47:32 np0005539503.novalocal sudo[6079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:47:32 np0005539503.novalocal python3[6081]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 29 05:47:33 np0005539503.novalocal systemd[1]: Starting Time & Date Service...
Nov 29 05:47:33 np0005539503.novalocal systemd[1]: Started Time & Date Service.
Nov 29 05:47:33 np0005539503.novalocal systemd-timedated[6083]: Changed time zone to 'UTC' (UTC).
Nov 29 05:47:33 np0005539503.novalocal sudo[6079]: pam_unix(sudo:session): session closed for user root
Nov 29 05:47:33 np0005539503.novalocal sudo[6110]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ooqzbhjdshqadiujguxltulsurvdyqsr ; /usr/bin/python3'
Nov 29 05:47:33 np0005539503.novalocal sudo[6110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:47:33 np0005539503.novalocal python3[6112]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:47:33 np0005539503.novalocal sudo[6110]: pam_unix(sudo:session): session closed for user root
Nov 29 05:47:34 np0005539503.novalocal python3[6188]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 05:47:34 np0005539503.novalocal python3[6259]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1764395253.8371253-251-14828351586945/source _original_basename=tmp9g508a6a follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:47:35 np0005539503.novalocal python3[6359]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 05:47:35 np0005539503.novalocal python3[6430]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764395254.7170765-301-131257366230562/source _original_basename=tmplgs4huc_ follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:47:36 np0005539503.novalocal sudo[6530]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpkquhhpfecujpxxtokpwmmqtbdimcvl ; /usr/bin/python3'
Nov 29 05:47:36 np0005539503.novalocal sudo[6530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:47:36 np0005539503.novalocal python3[6532]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 05:47:36 np0005539503.novalocal sudo[6530]: pam_unix(sudo:session): session closed for user root
Nov 29 05:47:36 np0005539503.novalocal sudo[6603]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmfgkjxhqmezeqcfvwgaaphwksiwytlr ; /usr/bin/python3'
Nov 29 05:47:36 np0005539503.novalocal sudo[6603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:47:36 np0005539503.novalocal python3[6605]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764395255.9250412-381-179525566098680/source _original_basename=tmpvl9mmhg5 follow=False checksum=54ceff67f46a00e80734f8bde7b737fc4d565204 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:47:36 np0005539503.novalocal sudo[6603]: pam_unix(sudo:session): session closed for user root
Nov 29 05:47:37 np0005539503.novalocal python3[6653]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 05:47:37 np0005539503.novalocal python3[6679]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 05:47:37 np0005539503.novalocal sudo[6757]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynpookypcuqmuyvnxnlpuluinimsqhdz ; /usr/bin/python3'
Nov 29 05:47:37 np0005539503.novalocal sudo[6757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:47:37 np0005539503.novalocal python3[6759]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 05:47:37 np0005539503.novalocal sudo[6757]: pam_unix(sudo:session): session closed for user root
Nov 29 05:47:38 np0005539503.novalocal sudo[6830]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnryzzgbpnogbtafegabzttzfvyooyhl ; /usr/bin/python3'
Nov 29 05:47:38 np0005539503.novalocal sudo[6830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:47:38 np0005539503.novalocal python3[6832]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1764395257.5946512-451-131404955845068/source _original_basename=tmptt010gbj follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:47:38 np0005539503.novalocal sudo[6830]: pam_unix(sudo:session): session closed for user root
Nov 29 05:47:38 np0005539503.novalocal sudo[6881]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lowwcnpjsrrxfhvmfbvnrmyupjfevnwf ; /usr/bin/python3'
Nov 29 05:47:38 np0005539503.novalocal sudo[6881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:47:38 np0005539503.novalocal python3[6883]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163efc-24cc-d89d-f1f2-00000000001f-1-compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 05:47:38 np0005539503.novalocal sudo[6881]: pam_unix(sudo:session): session closed for user root
Nov 29 05:47:39 np0005539503.novalocal python3[6911]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163efc-24cc-d89d-f1f2-000000000020-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Nov 29 05:47:41 np0005539503.novalocal python3[6939]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:48:00 np0005539503.novalocal sudo[6964]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htzaysujhjzkaafrwhoyfyiuxtrmrngg ; /usr/bin/python3'
Nov 29 05:48:00 np0005539503.novalocal sudo[6964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:48:00 np0005539503.novalocal python3[6966]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:48:00 np0005539503.novalocal sudo[6964]: pam_unix(sudo:session): session closed for user root
Nov 29 05:48:03 np0005539503.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 29 05:48:27 np0005539503.novalocal sshd-session[6969]: Invalid user test2 from 36.50.176.16 port 35200
Nov 29 05:48:28 np0005539503.novalocal sshd-session[6969]: Received disconnect from 36.50.176.16 port 35200:11: Bye Bye [preauth]
Nov 29 05:48:28 np0005539503.novalocal sshd-session[6969]: Disconnected from invalid user test2 36.50.176.16 port 35200 [preauth]
Nov 29 05:48:34 np0005539503.novalocal sshd-session[6971]: Connection closed by 45.78.219.251 port 50724 [preauth]
Nov 29 05:48:41 np0005539503.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Nov 29 05:48:41 np0005539503.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Nov 29 05:48:41 np0005539503.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Nov 29 05:48:41 np0005539503.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Nov 29 05:48:41 np0005539503.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Nov 29 05:48:41 np0005539503.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Nov 29 05:48:41 np0005539503.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Nov 29 05:48:41 np0005539503.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Nov 29 05:48:41 np0005539503.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Nov 29 05:48:41 np0005539503.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Nov 29 05:48:42 np0005539503.novalocal NetworkManager[861]: <info>  [1764395322.0055] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 29 05:48:42 np0005539503.novalocal systemd-udevd[6974]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 05:48:42 np0005539503.novalocal NetworkManager[861]: <info>  [1764395322.0172] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 05:48:42 np0005539503.novalocal NetworkManager[861]: <info>  [1764395322.0209] settings: (eth1): created default wired connection 'Wired connection 1'
Nov 29 05:48:42 np0005539503.novalocal NetworkManager[861]: <info>  [1764395322.0214] device (eth1): carrier: link connected
Nov 29 05:48:42 np0005539503.novalocal NetworkManager[861]: <info>  [1764395322.0217] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Nov 29 05:48:42 np0005539503.novalocal NetworkManager[861]: <info>  [1764395322.0224] policy: auto-activating connection 'Wired connection 1' (00b5a4bd-aa7d-3265-84d1-52d370bbdb29)
Nov 29 05:48:42 np0005539503.novalocal NetworkManager[861]: <info>  [1764395322.0230] device (eth1): Activation: starting connection 'Wired connection 1' (00b5a4bd-aa7d-3265-84d1-52d370bbdb29)
Nov 29 05:48:42 np0005539503.novalocal NetworkManager[861]: <info>  [1764395322.0232] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 05:48:42 np0005539503.novalocal NetworkManager[861]: <info>  [1764395322.0235] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 05:48:42 np0005539503.novalocal NetworkManager[861]: <info>  [1764395322.0242] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 05:48:42 np0005539503.novalocal NetworkManager[861]: <info>  [1764395322.0247] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 29 05:48:43 np0005539503.novalocal python3[7000]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163efc-24cc-fab1-3db9-000000000128-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 05:48:49 np0005539503.novalocal sudo[7078]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzeuggzopjjxissdqshagjtkszefptwy ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 29 05:48:49 np0005539503.novalocal sudo[7078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:48:50 np0005539503.novalocal python3[7080]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 05:48:50 np0005539503.novalocal sudo[7078]: pam_unix(sudo:session): session closed for user root
Nov 29 05:48:50 np0005539503.novalocal sudo[7151]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwemsnqegexfmqwqvaredgnxqrozkhbq ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 29 05:48:50 np0005539503.novalocal sudo[7151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:48:50 np0005539503.novalocal python3[7153]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764395329.7819402-104-48975405254191/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=846b0529cc02312b35a54e6156bc136b255b5a76 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:48:50 np0005539503.novalocal sudo[7151]: pam_unix(sudo:session): session closed for user root
Nov 29 05:48:51 np0005539503.novalocal sudo[7201]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjhutlohdbifssbhxsxwhvxhfhkctrzv ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 29 05:48:51 np0005539503.novalocal sudo[7201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:48:51 np0005539503.novalocal python3[7203]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 05:48:51 np0005539503.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Nov 29 05:48:51 np0005539503.novalocal systemd[1]: Stopped Network Manager Wait Online.
Nov 29 05:48:51 np0005539503.novalocal systemd[1]: Stopping Network Manager Wait Online...
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[861]: <info>  [1764395331.4098] caught SIGTERM, shutting down normally.
Nov 29 05:48:51 np0005539503.novalocal systemd[1]: Stopping Network Manager...
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[861]: <info>  [1764395331.4105] dhcp4 (eth0): canceled DHCP transaction
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[861]: <info>  [1764395331.4106] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[861]: <info>  [1764395331.4107] dhcp4 (eth0): state changed no lease
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[861]: <info>  [1764395331.4109] manager: NetworkManager state is now CONNECTING
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[861]: <info>  [1764395331.4199] dhcp4 (eth1): canceled DHCP transaction
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[861]: <info>  [1764395331.4199] dhcp4 (eth1): state changed no lease
Nov 29 05:48:51 np0005539503.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[861]: <info>  [1764395331.4268] exiting (success)
Nov 29 05:48:51 np0005539503.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 29 05:48:51 np0005539503.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Nov 29 05:48:51 np0005539503.novalocal systemd[1]: Stopped Network Manager.
Nov 29 05:48:51 np0005539503.novalocal systemd[1]: NetworkManager.service: Consumed 5.033s CPU time, 10.0M memory peak.
Nov 29 05:48:51 np0005539503.novalocal systemd[1]: Starting Network Manager...
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.4794] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:3b7c8c50-55c8-43c8-86aa-f5fa30ebf228)
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.4795] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.4857] manager[0x564ea25d2070]: monitoring kernel firmware directory '/lib/firmware'.
Nov 29 05:48:51 np0005539503.novalocal systemd[1]: Starting Hostname Service...
Nov 29 05:48:51 np0005539503.novalocal systemd[1]: Started Hostname Service.
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.5916] hostname: hostname: using hostnamed
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.5917] hostname: static hostname changed from (none) to "np0005539503.novalocal"
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.5925] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.5932] manager[0x564ea25d2070]: rfkill: Wi-Fi hardware radio set enabled
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.5933] manager[0x564ea25d2070]: rfkill: WWAN hardware radio set enabled
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.5979] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.5979] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.5980] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.5981] manager: Networking is enabled by state file
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.5984] settings: Loaded settings plugin: keyfile (internal)
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.5991] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.6034] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.6049] dhcp: init: Using DHCP client 'internal'
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.6053] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.6064] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.6072] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.6084] device (lo): Activation: starting connection 'lo' (43d811ee-2cbe-4425-a4fb-d7d92aa3b968)
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.6094] device (eth0): carrier: link connected
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.6102] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.6109] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.6110] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.6120] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.6130] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.6139] device (eth1): carrier: link connected
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.6146] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.6153] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (00b5a4bd-aa7d-3265-84d1-52d370bbdb29) (indicated)
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.6154] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.6162] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.6172] device (eth1): Activation: starting connection 'Wired connection 1' (00b5a4bd-aa7d-3265-84d1-52d370bbdb29)
Nov 29 05:48:51 np0005539503.novalocal systemd[1]: Started Network Manager.
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.6184] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.6190] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.6193] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.6195] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.6198] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.6202] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.6205] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.6226] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.6235] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.6250] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.6257] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.6276] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.6281] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.6311] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.6315] dhcp4 (eth0): state changed new lease, address=38.102.83.110
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.6324] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.6333] device (lo): Activation: successful, device activated.
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.6350] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 29 05:48:51 np0005539503.novalocal systemd[1]: Starting Network Manager Wait Online...
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.6435] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.6462] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.6465] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.6470] manager: NetworkManager state is now CONNECTED_SITE
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.6476] device (eth0): Activation: successful, device activated.
Nov 29 05:48:51 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395331.6483] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 29 05:48:51 np0005539503.novalocal sudo[7201]: pam_unix(sudo:session): session closed for user root
Nov 29 05:48:51 np0005539503.novalocal python3[7287]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163efc-24cc-fab1-3db9-0000000000bd-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 05:49:01 np0005539503.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 29 05:49:04 np0005539503.novalocal systemd[4324]: Starting Mark boot as successful...
Nov 29 05:49:04 np0005539503.novalocal systemd[4324]: Finished Mark boot as successful.
Nov 29 05:49:21 np0005539503.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 29 05:49:29 np0005539503.novalocal sshd-session[7297]: error: kex_exchange_identification: read: Connection reset by peer
Nov 29 05:49:29 np0005539503.novalocal sshd-session[7297]: Connection reset by 45.140.17.97 port 10770
Nov 29 05:49:37 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395377.2824] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 29 05:49:37 np0005539503.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 29 05:49:37 np0005539503.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 29 05:49:37 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395377.3225] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 29 05:49:37 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395377.3231] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 29 05:49:37 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395377.3246] device (eth1): Activation: successful, device activated.
Nov 29 05:49:37 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395377.3260] manager: startup complete
Nov 29 05:49:37 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395377.3265] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Nov 29 05:49:37 np0005539503.novalocal NetworkManager[7212]: <warn>  [1764395377.3279] device (eth1): Activation: failed for connection 'Wired connection 1'
Nov 29 05:49:37 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395377.3293] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Nov 29 05:49:37 np0005539503.novalocal systemd[1]: Finished Network Manager Wait Online.
Nov 29 05:49:37 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395377.3453] dhcp4 (eth1): canceled DHCP transaction
Nov 29 05:49:37 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395377.3455] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 29 05:49:37 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395377.3456] dhcp4 (eth1): state changed no lease
Nov 29 05:49:37 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395377.3480] policy: auto-activating connection 'ci-private-network' (8207c9a3-524e-532c-bd71-3fc37e48ed01)
Nov 29 05:49:37 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395377.3489] device (eth1): Activation: starting connection 'ci-private-network' (8207c9a3-524e-532c-bd71-3fc37e48ed01)
Nov 29 05:49:37 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395377.3491] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 05:49:37 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395377.3497] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 05:49:37 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395377.3509] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 05:49:37 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395377.3523] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 05:49:37 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395377.3575] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 05:49:37 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395377.3581] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 05:49:37 np0005539503.novalocal NetworkManager[7212]: <info>  [1764395377.3596] device (eth1): Activation: successful, device activated.
Nov 29 05:49:47 np0005539503.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 29 05:49:52 np0005539503.novalocal sshd-session[4335]: Received disconnect from 38.102.83.114 port 48086:11: disconnected by user
Nov 29 05:49:52 np0005539503.novalocal sshd-session[4335]: Disconnected from user zuul 38.102.83.114 port 48086
Nov 29 05:49:52 np0005539503.novalocal sshd-session[4320]: pam_unix(sshd:session): session closed for user zuul
Nov 29 05:49:52 np0005539503.novalocal systemd-logind[788]: Session 1 logged out. Waiting for processes to exit.
Nov 29 05:50:39 np0005539503.novalocal sshd-session[7321]: Invalid user a from 36.50.176.16 port 59986
Nov 29 05:50:39 np0005539503.novalocal sshd-session[7321]: Received disconnect from 36.50.176.16 port 59986:11: Bye Bye [preauth]
Nov 29 05:50:39 np0005539503.novalocal sshd-session[7321]: Disconnected from invalid user a 36.50.176.16 port 59986 [preauth]
Nov 29 05:50:43 np0005539503.novalocal sshd-session[7323]: Accepted publickey for zuul from 38.102.83.114 port 41668 ssh2: RSA SHA256:OB6VM1CIH1OdgXsE6HCNs7JrsK6Z+Aqk6yzm+7553Qw
Nov 29 05:50:43 np0005539503.novalocal systemd-logind[788]: New session 3 of user zuul.
Nov 29 05:50:43 np0005539503.novalocal systemd[1]: Started Session 3 of User zuul.
Nov 29 05:50:43 np0005539503.novalocal sshd-session[7323]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 05:50:43 np0005539503.novalocal sudo[7402]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhfkxwdmwvwqsvohkrbolmmngftwbwei ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 29 05:50:43 np0005539503.novalocal sudo[7402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:50:43 np0005539503.novalocal python3[7404]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 05:50:43 np0005539503.novalocal sudo[7402]: pam_unix(sudo:session): session closed for user root
Nov 29 05:50:43 np0005539503.novalocal sudo[7475]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gogrmeykzhbhgrbjbziwexfyhvcpwwpu ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 29 05:50:43 np0005539503.novalocal sudo[7475]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:50:44 np0005539503.novalocal python3[7477]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764395443.3133655-365-62282074178475/source _original_basename=tmp4a6ljmhc follow=False checksum=202951a95d8ea5ab635db917083142dc6b9b32e4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:50:44 np0005539503.novalocal sudo[7475]: pam_unix(sudo:session): session closed for user root
Nov 29 05:50:48 np0005539503.novalocal sshd-session[7326]: Connection closed by 38.102.83.114 port 41668
Nov 29 05:50:48 np0005539503.novalocal sshd-session[7323]: pam_unix(sshd:session): session closed for user zuul
Nov 29 05:50:48 np0005539503.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Nov 29 05:50:48 np0005539503.novalocal systemd-logind[788]: Session 3 logged out. Waiting for processes to exit.
Nov 29 05:50:48 np0005539503.novalocal systemd-logind[788]: Removed session 3.
Nov 29 05:51:02 np0005539503.novalocal sshd-session[7503]: Received disconnect from 45.78.219.251 port 34434:11: Bye Bye [preauth]
Nov 29 05:51:02 np0005539503.novalocal sshd-session[7503]: Disconnected from 45.78.219.251 port 34434 [preauth]
Nov 29 05:51:02 np0005539503.novalocal systemd[1]: Starting Cleanup of Temporary Directories...
Nov 29 05:51:02 np0005539503.novalocal systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Nov 29 05:51:02 np0005539503.novalocal systemd[1]: Finished Cleanup of Temporary Directories.
Nov 29 05:51:02 np0005539503.novalocal systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Nov 29 05:52:04 np0005539503.novalocal systemd[4324]: Created slice User Background Tasks Slice.
Nov 29 05:52:04 np0005539503.novalocal systemd[4324]: Starting Cleanup of User's Temporary Files and Directories...
Nov 29 05:52:04 np0005539503.novalocal systemd[4324]: Finished Cleanup of User's Temporary Files and Directories.
Nov 29 05:52:38 np0005539503.novalocal sshd-session[7509]: Invalid user local from 36.50.176.16 port 34096
Nov 29 05:52:38 np0005539503.novalocal sshd-session[7509]: Received disconnect from 36.50.176.16 port 34096:11: Bye Bye [preauth]
Nov 29 05:52:38 np0005539503.novalocal sshd-session[7509]: Disconnected from invalid user local 36.50.176.16 port 34096 [preauth]
Nov 29 05:52:54 np0005539503.novalocal systemd[1]: Starting dnf makecache...
Nov 29 05:52:54 np0005539503.novalocal dnf[7511]: Failed determining last makecache time.
Nov 29 05:52:55 np0005539503.novalocal dnf[7511]: CentOS Stream 9 - BaseOS                         34 kB/s | 7.3 kB     00:00
Nov 29 05:52:55 np0005539503.novalocal dnf[7511]: CentOS Stream 9 - AppStream                      27 kB/s | 7.4 kB     00:00
Nov 29 05:52:55 np0005539503.novalocal dnf[7511]: CentOS Stream 9 - CRB                            82 kB/s | 7.2 kB     00:00
Nov 29 05:52:55 np0005539503.novalocal dnf[7511]: CentOS Stream 9 - Extras packages                74 kB/s | 8.3 kB     00:00
Nov 29 05:52:56 np0005539503.novalocal dnf[7511]: Metadata cache created.
Nov 29 05:52:56 np0005539503.novalocal systemd[1]: dnf-makecache.service: Deactivated successfully.
Nov 29 05:52:56 np0005539503.novalocal systemd[1]: Finished dnf makecache.
Nov 29 05:53:24 np0005539503.novalocal sshd-session[7519]: Invalid user desliga from 45.78.219.251 port 52700
Nov 29 05:53:40 np0005539503.novalocal sshd-session[7519]: Received disconnect from 45.78.219.251 port 52700:11: Bye Bye [preauth]
Nov 29 05:53:40 np0005539503.novalocal sshd-session[7519]: Disconnected from invalid user desliga 45.78.219.251 port 52700 [preauth]
Nov 29 05:55:49 np0005539503.novalocal sshd-session[7522]: Invalid user gerrit from 45.78.219.251 port 60600
Nov 29 05:55:49 np0005539503.novalocal sshd-session[7522]: Received disconnect from 45.78.219.251 port 60600:11: Bye Bye [preauth]
Nov 29 05:55:49 np0005539503.novalocal sshd-session[7522]: Disconnected from invalid user gerrit 45.78.219.251 port 60600 [preauth]
Nov 29 05:56:26 np0005539503.novalocal sshd-session[7526]: Accepted publickey for zuul from 38.102.83.114 port 56712 ssh2: RSA SHA256:OB6VM1CIH1OdgXsE6HCNs7JrsK6Z+Aqk6yzm+7553Qw
Nov 29 05:56:26 np0005539503.novalocal systemd-logind[788]: New session 4 of user zuul.
Nov 29 05:56:26 np0005539503.novalocal systemd[1]: Started Session 4 of User zuul.
Nov 29 05:56:26 np0005539503.novalocal sshd-session[7526]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 05:56:27 np0005539503.novalocal sudo[7553]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcjwzreejeanyzrmvnekicftcbqdqfkl ; /usr/bin/python3'
Nov 29 05:56:27 np0005539503.novalocal sudo[7553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:56:27 np0005539503.novalocal python3[7555]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda _uses_shell=True zuul_log_id=fa163efc-24cc-2d4e-d9c7-000000000ca4-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 05:56:27 np0005539503.novalocal sudo[7553]: pam_unix(sudo:session): session closed for user root
Nov 29 05:56:27 np0005539503.novalocal sudo[7582]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtgulvhvnsznneuhswessotzetoogwwj ; /usr/bin/python3'
Nov 29 05:56:27 np0005539503.novalocal sudo[7582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:56:28 np0005539503.novalocal python3[7584]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:56:28 np0005539503.novalocal sudo[7582]: pam_unix(sudo:session): session closed for user root
Nov 29 05:56:28 np0005539503.novalocal sudo[7608]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtomzyghdhrvyeqdyjguzrygwmurffuz ; /usr/bin/python3'
Nov 29 05:56:28 np0005539503.novalocal sudo[7608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:56:28 np0005539503.novalocal python3[7610]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:56:28 np0005539503.novalocal sudo[7608]: pam_unix(sudo:session): session closed for user root
Nov 29 05:56:28 np0005539503.novalocal sudo[7634]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktwptavojeccmnoirbqsvbidsmggzyoq ; /usr/bin/python3'
Nov 29 05:56:28 np0005539503.novalocal sudo[7634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:56:28 np0005539503.novalocal python3[7636]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:56:28 np0005539503.novalocal sudo[7634]: pam_unix(sudo:session): session closed for user root
Nov 29 05:56:28 np0005539503.novalocal sudo[7660]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdjepojgyxvhwlguboczjcasjqrlljwt ; /usr/bin/python3'
Nov 29 05:56:28 np0005539503.novalocal sudo[7660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:56:28 np0005539503.novalocal python3[7662]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:56:28 np0005539503.novalocal sudo[7660]: pam_unix(sudo:session): session closed for user root
Nov 29 05:56:29 np0005539503.novalocal sudo[7686]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-puowwvqwwalennnecgyptfjhnjsfrupe ; /usr/bin/python3'
Nov 29 05:56:29 np0005539503.novalocal sudo[7686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:56:29 np0005539503.novalocal python3[7688]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:56:29 np0005539503.novalocal sudo[7686]: pam_unix(sudo:session): session closed for user root
Nov 29 05:56:29 np0005539503.novalocal sudo[7764]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-waanweqcnbkpbptdtjwzoqccnohhncxf ; /usr/bin/python3'
Nov 29 05:56:29 np0005539503.novalocal sudo[7764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:56:29 np0005539503.novalocal python3[7766]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 05:56:29 np0005539503.novalocal sudo[7764]: pam_unix(sudo:session): session closed for user root
Nov 29 05:56:30 np0005539503.novalocal sudo[7837]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkxgaicruzriscddnhxaejpkurnnzcyo ; /usr/bin/python3'
Nov 29 05:56:30 np0005539503.novalocal sudo[7837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:56:30 np0005539503.novalocal python3[7839]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764395789.574356-364-267578804087549/source _original_basename=tmppxio4lad follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:56:30 np0005539503.novalocal sudo[7837]: pam_unix(sudo:session): session closed for user root
Nov 29 05:56:31 np0005539503.novalocal sudo[7887]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zivcrvuyolnworyfvyahnnetamsigneb ; /usr/bin/python3'
Nov 29 05:56:31 np0005539503.novalocal sudo[7887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:56:31 np0005539503.novalocal python3[7889]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 05:56:31 np0005539503.novalocal systemd[1]: Reloading.
Nov 29 05:56:31 np0005539503.novalocal systemd-rc-local-generator[7911]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 05:56:31 np0005539503.novalocal sudo[7887]: pam_unix(sudo:session): session closed for user root
Nov 29 05:56:32 np0005539503.novalocal sudo[7942]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkmqgobwhxmurzgvxwksshkldnouyjeu ; /usr/bin/python3'
Nov 29 05:56:32 np0005539503.novalocal sudo[7942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:56:32 np0005539503.novalocal python3[7944]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Nov 29 05:56:32 np0005539503.novalocal sudo[7942]: pam_unix(sudo:session): session closed for user root
Nov 29 05:56:33 np0005539503.novalocal sudo[7968]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjnuozvehpyskyxaahcwwczubdgqletv ; /usr/bin/python3'
Nov 29 05:56:33 np0005539503.novalocal sudo[7968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:56:33 np0005539503.novalocal python3[7970]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 05:56:33 np0005539503.novalocal sudo[7968]: pam_unix(sudo:session): session closed for user root
Nov 29 05:56:33 np0005539503.novalocal sudo[7996]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbmnkejrsngjansnbketmysajiujslng ; /usr/bin/python3'
Nov 29 05:56:33 np0005539503.novalocal sudo[7996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:56:33 np0005539503.novalocal python3[7998]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 05:56:33 np0005539503.novalocal sudo[7996]: pam_unix(sudo:session): session closed for user root
Nov 29 05:56:33 np0005539503.novalocal sudo[8024]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byxkkpteyvrlrynqdtdjjnizlnwwdxkx ; /usr/bin/python3'
Nov 29 05:56:33 np0005539503.novalocal sudo[8024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:56:33 np0005539503.novalocal python3[8026]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 05:56:33 np0005539503.novalocal sudo[8024]: pam_unix(sudo:session): session closed for user root
Nov 29 05:56:34 np0005539503.novalocal sudo[8052]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmpdkepsmfshmbiassillxandkahsxlc ; /usr/bin/python3'
Nov 29 05:56:34 np0005539503.novalocal sudo[8052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:56:34 np0005539503.novalocal python3[8054]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 05:56:34 np0005539503.novalocal sudo[8052]: pam_unix(sudo:session): session closed for user root
Nov 29 05:56:34 np0005539503.novalocal python3[8081]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                                       _uses_shell=True zuul_log_id=fa163efc-24cc-2d4e-d9c7-000000000cab-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 05:56:35 np0005539503.novalocal python3[8111]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 29 05:56:38 np0005539503.novalocal sshd-session[7529]: Connection closed by 38.102.83.114 port 56712
Nov 29 05:56:38 np0005539503.novalocal sshd-session[7526]: pam_unix(sshd:session): session closed for user zuul
Nov 29 05:56:38 np0005539503.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Nov 29 05:56:38 np0005539503.novalocal systemd[1]: session-4.scope: Consumed 4.252s CPU time.
Nov 29 05:56:38 np0005539503.novalocal systemd-logind[788]: Session 4 logged out. Waiting for processes to exit.
Nov 29 05:56:38 np0005539503.novalocal systemd-logind[788]: Removed session 4.
Nov 29 05:56:40 np0005539503.novalocal sshd-session[8116]: Accepted publickey for zuul from 38.102.83.114 port 55918 ssh2: RSA SHA256:OB6VM1CIH1OdgXsE6HCNs7JrsK6Z+Aqk6yzm+7553Qw
Nov 29 05:56:40 np0005539503.novalocal systemd-logind[788]: New session 5 of user zuul.
Nov 29 05:56:40 np0005539503.novalocal systemd[1]: Started Session 5 of User zuul.
Nov 29 05:56:40 np0005539503.novalocal sshd-session[8116]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 05:56:40 np0005539503.novalocal sudo[8143]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-narovebaquxemcsjkzaymnuyccsugiba ; /usr/bin/python3'
Nov 29 05:56:40 np0005539503.novalocal sudo[8143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:56:40 np0005539503.novalocal python3[8145]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 29 05:56:53 np0005539503.novalocal kernel: SELinux:  Converting 385 SID table entries...
Nov 29 05:56:53 np0005539503.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 05:56:53 np0005539503.novalocal kernel: SELinux:  policy capability open_perms=1
Nov 29 05:56:53 np0005539503.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 05:56:53 np0005539503.novalocal kernel: SELinux:  policy capability always_check_network=0
Nov 29 05:56:53 np0005539503.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 05:56:53 np0005539503.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 05:56:53 np0005539503.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 05:57:02 np0005539503.novalocal kernel: SELinux:  Converting 385 SID table entries...
Nov 29 05:57:02 np0005539503.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 05:57:02 np0005539503.novalocal kernel: SELinux:  policy capability open_perms=1
Nov 29 05:57:02 np0005539503.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 05:57:02 np0005539503.novalocal kernel: SELinux:  policy capability always_check_network=0
Nov 29 05:57:02 np0005539503.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 05:57:02 np0005539503.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 05:57:02 np0005539503.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 05:57:11 np0005539503.novalocal kernel: SELinux:  Converting 385 SID table entries...
Nov 29 05:57:11 np0005539503.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 05:57:11 np0005539503.novalocal kernel: SELinux:  policy capability open_perms=1
Nov 29 05:57:11 np0005539503.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 05:57:11 np0005539503.novalocal kernel: SELinux:  policy capability always_check_network=0
Nov 29 05:57:11 np0005539503.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 05:57:11 np0005539503.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 05:57:11 np0005539503.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 05:57:12 np0005539503.novalocal setsebool[8205]: The virt_use_nfs policy boolean was changed to 1 by root
Nov 29 05:57:12 np0005539503.novalocal setsebool[8205]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Nov 29 05:57:22 np0005539503.novalocal kernel: SELinux:  Converting 388 SID table entries...
Nov 29 05:57:22 np0005539503.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 05:57:22 np0005539503.novalocal kernel: SELinux:  policy capability open_perms=1
Nov 29 05:57:22 np0005539503.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 05:57:22 np0005539503.novalocal kernel: SELinux:  policy capability always_check_network=0
Nov 29 05:57:22 np0005539503.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 05:57:22 np0005539503.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 05:57:22 np0005539503.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 05:57:39 np0005539503.novalocal dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Nov 29 05:57:40 np0005539503.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 05:57:40 np0005539503.novalocal systemd[1]: Starting man-db-cache-update.service...
Nov 29 05:57:40 np0005539503.novalocal systemd[1]: Reloading.
Nov 29 05:57:40 np0005539503.novalocal systemd-rc-local-generator[8960]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 05:57:40 np0005539503.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 05:57:41 np0005539503.novalocal sudo[8143]: pam_unix(sudo:session): session closed for user root
Nov 29 05:58:06 np0005539503.novalocal python3[20535]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"
                                                        _uses_shell=True zuul_log_id=fa163efc-24cc-965f-b5ec-00000000000c-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 05:58:07 np0005539503.novalocal kernel: evm: overlay not supported
Nov 29 05:58:07 np0005539503.novalocal systemd[4324]: Starting D-Bus User Message Bus...
Nov 29 05:58:07 np0005539503.novalocal dbus-broker-launch[21068]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Nov 29 05:58:07 np0005539503.novalocal dbus-broker-launch[21068]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Nov 29 05:58:07 np0005539503.novalocal systemd[4324]: Started D-Bus User Message Bus.
Nov 29 05:58:07 np0005539503.novalocal dbus-broker-lau[21068]: Ready
Nov 29 05:58:07 np0005539503.novalocal systemd[4324]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Nov 29 05:58:07 np0005539503.novalocal systemd[4324]: Created slice Slice /user.
Nov 29 05:58:07 np0005539503.novalocal systemd[4324]: podman-20993.scope: unit configures an IP firewall, but not running as root.
Nov 29 05:58:07 np0005539503.novalocal systemd[4324]: (This warning is only shown for the first unit using IP firewalling.)
Nov 29 05:58:07 np0005539503.novalocal systemd[4324]: Started podman-20993.scope.
Nov 29 05:58:08 np0005539503.novalocal systemd[4324]: Started podman-pause-f8c91171.scope.
Nov 29 05:58:08 np0005539503.novalocal sudo[21430]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xndevomoujznbbzwfisnupajzjcaoxzp ; /usr/bin/python3'
Nov 29 05:58:08 np0005539503.novalocal sudo[21430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:58:08 np0005539503.novalocal python3[21442]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]
                                                       location = "38.102.83.97:5001"
                                                       insecure = true path=/etc/containers/registries.conf block=[[registry]]
                                                       location = "38.102.83.97:5001"
                                                       insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:58:08 np0005539503.novalocal python3[21442]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Nov 29 05:58:08 np0005539503.novalocal sudo[21430]: pam_unix(sudo:session): session closed for user root
Nov 29 05:58:09 np0005539503.novalocal sshd-session[8119]: Connection closed by 38.102.83.114 port 55918
Nov 29 05:58:09 np0005539503.novalocal sshd-session[8116]: pam_unix(sshd:session): session closed for user zuul
Nov 29 05:58:09 np0005539503.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Nov 29 05:58:09 np0005539503.novalocal systemd[1]: session-5.scope: Consumed 59.434s CPU time.
Nov 29 05:58:09 np0005539503.novalocal systemd-logind[788]: Session 5 logged out. Waiting for processes to exit.
Nov 29 05:58:09 np0005539503.novalocal systemd-logind[788]: Removed session 5.
Nov 29 05:58:28 np0005539503.novalocal sshd-session[28850]: Connection closed by 38.102.83.36 port 42336 [preauth]
Nov 29 05:58:28 np0005539503.novalocal sshd-session[28851]: Connection closed by 38.102.83.36 port 42344 [preauth]
Nov 29 05:58:28 np0005539503.novalocal sshd-session[28854]: Unable to negotiate with 38.102.83.36 port 42354: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Nov 29 05:58:28 np0005539503.novalocal sshd-session[28852]: Unable to negotiate with 38.102.83.36 port 42364: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Nov 29 05:58:28 np0005539503.novalocal sshd-session[28849]: Unable to negotiate with 38.102.83.36 port 42372: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Nov 29 05:58:30 np0005539503.novalocal systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 05:58:30 np0005539503.novalocal systemd[1]: Finished man-db-cache-update.service.
Nov 29 05:58:30 np0005539503.novalocal systemd[1]: man-db-cache-update.service: Consumed 1min 1.123s CPU time.
Nov 29 05:58:30 np0005539503.novalocal systemd[1]: run-r7e1ac05a46d647a8bb51ec2160484707.service: Deactivated successfully.
Nov 29 05:58:34 np0005539503.novalocal sshd-session[29621]: Accepted publickey for zuul from 38.102.83.114 port 49196 ssh2: RSA SHA256:OB6VM1CIH1OdgXsE6HCNs7JrsK6Z+Aqk6yzm+7553Qw
Nov 29 05:58:34 np0005539503.novalocal systemd-logind[788]: New session 6 of user zuul.
Nov 29 05:58:34 np0005539503.novalocal systemd[1]: Started Session 6 of User zuul.
Nov 29 05:58:34 np0005539503.novalocal sshd-session[29621]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 05:58:34 np0005539503.novalocal python3[29648]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBL3iUg7wOJDjLm9TipkwWPon/M1FO0neD/5ezFHnUJBmbFpPtrL/PoM+teNA62c4mAkgQYtVxx4T3bRgPp78cTw= zuul@np0005539502.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:58:34 np0005539503.novalocal sudo[29672]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxwasxsszzpaqrefotxrwbqcwdmqgygp ; /usr/bin/python3'
Nov 29 05:58:34 np0005539503.novalocal sudo[29672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:58:35 np0005539503.novalocal python3[29674]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBL3iUg7wOJDjLm9TipkwWPon/M1FO0neD/5ezFHnUJBmbFpPtrL/PoM+teNA62c4mAkgQYtVxx4T3bRgPp78cTw= zuul@np0005539502.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:58:35 np0005539503.novalocal sudo[29672]: pam_unix(sudo:session): session closed for user root
Nov 29 05:58:35 np0005539503.novalocal sudo[29698]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-puvvoaezoaqvzsxprmytzrveqyecwedg ; /usr/bin/python3'
Nov 29 05:58:35 np0005539503.novalocal sudo[29698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:58:36 np0005539503.novalocal python3[29700]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005539503.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Nov 29 05:58:36 np0005539503.novalocal useradd[29702]: new group: name=cloud-admin, GID=1002
Nov 29 05:58:36 np0005539503.novalocal useradd[29702]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Nov 29 05:58:36 np0005539503.novalocal sudo[29698]: pam_unix(sudo:session): session closed for user root
Nov 29 05:58:36 np0005539503.novalocal sudo[29732]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phpvdbylgvogmupghtldzxfmyfiwwiyh ; /usr/bin/python3'
Nov 29 05:58:36 np0005539503.novalocal sudo[29732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:58:36 np0005539503.novalocal python3[29734]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBL3iUg7wOJDjLm9TipkwWPon/M1FO0neD/5ezFHnUJBmbFpPtrL/PoM+teNA62c4mAkgQYtVxx4T3bRgPp78cTw= zuul@np0005539502.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 05:58:36 np0005539503.novalocal sudo[29732]: pam_unix(sudo:session): session closed for user root
Nov 29 05:58:37 np0005539503.novalocal sudo[29812]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtidqbbuhnpofcykktbgntcfapnsvxcg ; /usr/bin/python3'
Nov 29 05:58:37 np0005539503.novalocal sudo[29812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:58:37 np0005539503.novalocal python3[29814]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 05:58:37 np0005539503.novalocal sudo[29812]: pam_unix(sudo:session): session closed for user root
Nov 29 05:58:37 np0005539503.novalocal sudo[29885]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skdbverixumokazqqijubiuiivilaknn ; /usr/bin/python3'
Nov 29 05:58:37 np0005539503.novalocal sudo[29885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:58:37 np0005539503.novalocal python3[29887]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764395916.989916-167-245612835171492/source _original_basename=tmp3am5w0_e follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 05:58:37 np0005539503.novalocal sudo[29885]: pam_unix(sudo:session): session closed for user root
Nov 29 05:58:38 np0005539503.novalocal sudo[29935]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzxwamvoftmvmutlefglneispbjxrdyk ; /usr/bin/python3'
Nov 29 05:58:38 np0005539503.novalocal sudo[29935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 05:58:38 np0005539503.novalocal sshd-session[29735]: Invalid user john from 36.50.176.16 port 52260
Nov 29 05:58:38 np0005539503.novalocal python3[29937]: ansible-ansible.builtin.hostname Invoked with name=compute-0 use=systemd
Nov 29 05:58:38 np0005539503.novalocal systemd[1]: Starting Hostname Service...
Nov 29 05:58:38 np0005539503.novalocal sshd-session[29735]: Received disconnect from 36.50.176.16 port 52260:11: Bye Bye [preauth]
Nov 29 05:58:38 np0005539503.novalocal sshd-session[29735]: Disconnected from invalid user john 36.50.176.16 port 52260 [preauth]
Nov 29 05:58:38 np0005539503.novalocal systemd[1]: Started Hostname Service.
Nov 29 05:58:38 np0005539503.novalocal systemd-hostnamed[29941]: Changed pretty hostname to 'compute-0'
Nov 29 05:58:38 compute-0 systemd-hostnamed[29941]: Hostname set to <compute-0> (static)
Nov 29 05:58:38 compute-0 NetworkManager[7212]: <info>  [1764395918.6789] hostname: static hostname changed from "np0005539503.novalocal" to "compute-0"
Nov 29 05:58:38 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 29 05:58:38 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 29 05:58:38 compute-0 sudo[29935]: pam_unix(sudo:session): session closed for user root
Nov 29 05:58:39 compute-0 sshd-session[29624]: Connection closed by 38.102.83.114 port 49196
Nov 29 05:58:39 compute-0 sshd-session[29621]: pam_unix(sshd:session): session closed for user zuul
Nov 29 05:58:39 compute-0 systemd[1]: session-6.scope: Deactivated successfully.
Nov 29 05:58:39 compute-0 systemd[1]: session-6.scope: Consumed 2.106s CPU time.
Nov 29 05:58:39 compute-0 systemd-logind[788]: Session 6 logged out. Waiting for processes to exit.
Nov 29 05:58:39 compute-0 systemd-logind[788]: Removed session 6.
Nov 29 05:58:48 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 29 05:59:08 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 29 06:00:35 compute-0 sshd-session[29960]: Received disconnect from 45.78.219.251 port 60988:11: Bye Bye [preauth]
Nov 29 06:00:35 compute-0 sshd-session[29960]: Disconnected from authenticating user root 45.78.219.251 port 60988 [preauth]
Nov 29 06:00:38 compute-0 sshd-session[29962]: Received disconnect from 36.50.176.16 port 42922:11: Bye Bye [preauth]
Nov 29 06:00:38 compute-0 sshd-session[29962]: Disconnected from authenticating user root 36.50.176.16 port 42922 [preauth]
Nov 29 06:01:01 compute-0 CROND[29965]: (root) CMD (run-parts /etc/cron.hourly)
Nov 29 06:01:01 compute-0 run-parts[29968]: (/etc/cron.hourly) starting 0anacron
Nov 29 06:01:01 compute-0 anacron[29976]: Anacron started on 2025-11-29
Nov 29 06:01:01 compute-0 anacron[29976]: Will run job `cron.daily' in 36 min.
Nov 29 06:01:01 compute-0 anacron[29976]: Will run job `cron.weekly' in 56 min.
Nov 29 06:01:01 compute-0 anacron[29976]: Will run job `cron.monthly' in 76 min.
Nov 29 06:01:01 compute-0 anacron[29976]: Jobs will be executed sequentially
Nov 29 06:01:01 compute-0 run-parts[29978]: (/etc/cron.hourly) finished 0anacron
Nov 29 06:01:01 compute-0 CROND[29964]: (root) CMDEND (run-parts /etc/cron.hourly)
Nov 29 06:02:57 compute-0 sshd-session[29981]: Accepted publickey for zuul from 38.102.83.36 port 55656 ssh2: RSA SHA256:OB6VM1CIH1OdgXsE6HCNs7JrsK6Z+Aqk6yzm+7553Qw
Nov 29 06:02:57 compute-0 systemd-logind[788]: New session 7 of user zuul.
Nov 29 06:02:57 compute-0 systemd[1]: Started Session 7 of User zuul.
Nov 29 06:02:57 compute-0 sshd-session[29981]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:02:58 compute-0 python3[30057]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:03:00 compute-0 sudo[30171]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kggtjnyfptzcigkkncovcpjpuklwjlsp ; /usr/bin/python3'
Nov 29 06:03:00 compute-0 sudo[30171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:03:00 compute-0 python3[30173]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 06:03:00 compute-0 sudo[30171]: pam_unix(sudo:session): session closed for user root
Nov 29 06:03:01 compute-0 sudo[30244]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwptieecjlklqvfmbyjsdxxytnnmyfwm ; /usr/bin/python3'
Nov 29 06:03:01 compute-0 sudo[30244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:03:01 compute-0 python3[30246]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764396180.6373513-34084-4508336754901/source mode=0755 _original_basename=delorean.repo follow=False checksum=a16f090252000d02a7f7d540bb10f7c1c9cd4ac5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:03:01 compute-0 sudo[30244]: pam_unix(sudo:session): session closed for user root
Nov 29 06:03:01 compute-0 sudo[30270]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikyeqecjpfnibiasgrqbpzflpjakboer ; /usr/bin/python3'
Nov 29 06:03:01 compute-0 sudo[30270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:03:01 compute-0 python3[30272]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 06:03:01 compute-0 sudo[30270]: pam_unix(sudo:session): session closed for user root
Nov 29 06:03:01 compute-0 sudo[30343]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wackzivaklysmeufyyapxltakgsiivhd ; /usr/bin/python3'
Nov 29 06:03:01 compute-0 sudo[30343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:03:02 compute-0 python3[30345]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764396180.6373513-34084-4508336754901/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=0bdbb813b840548359ae77c28d76ca272ccaf31b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:03:02 compute-0 sudo[30343]: pam_unix(sudo:session): session closed for user root
Nov 29 06:03:02 compute-0 sudo[30369]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlowxmxqfoiuzermyygaouohzgjyflje ; /usr/bin/python3'
Nov 29 06:03:02 compute-0 sudo[30369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:03:02 compute-0 python3[30371]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 06:03:02 compute-0 sudo[30369]: pam_unix(sudo:session): session closed for user root
Nov 29 06:03:02 compute-0 sudo[30442]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztbuosdgupfjnszxrbgpjxakdsueyojh ; /usr/bin/python3'
Nov 29 06:03:02 compute-0 sudo[30442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:03:02 compute-0 python3[30444]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764396180.6373513-34084-4508336754901/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:03:02 compute-0 sudo[30442]: pam_unix(sudo:session): session closed for user root
Nov 29 06:03:02 compute-0 sudo[30468]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-huakobjdakxrinozanoxfeeteuuwrcgs ; /usr/bin/python3'
Nov 29 06:03:02 compute-0 sudo[30468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:03:02 compute-0 python3[30470]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 06:03:02 compute-0 sudo[30468]: pam_unix(sudo:session): session closed for user root
Nov 29 06:03:03 compute-0 sudo[30541]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-voamnxrrtvgjwxacswjeixvepxsppxyl ; /usr/bin/python3'
Nov 29 06:03:03 compute-0 sudo[30541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:03:03 compute-0 python3[30543]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764396180.6373513-34084-4508336754901/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:03:03 compute-0 sudo[30541]: pam_unix(sudo:session): session closed for user root
Nov 29 06:03:03 compute-0 sudo[30567]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgcpmdsvkohukyayfykwtiztnkxxrlvf ; /usr/bin/python3'
Nov 29 06:03:03 compute-0 sudo[30567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:03:03 compute-0 python3[30569]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 06:03:03 compute-0 sudo[30567]: pam_unix(sudo:session): session closed for user root
Nov 29 06:03:03 compute-0 sudo[30640]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlrnzzgjfkamfjmguooibbehnfhqofol ; /usr/bin/python3'
Nov 29 06:03:03 compute-0 sudo[30640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:03:03 compute-0 python3[30642]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764396180.6373513-34084-4508336754901/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:03:03 compute-0 sudo[30640]: pam_unix(sudo:session): session closed for user root
Nov 29 06:03:03 compute-0 sudo[30666]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-haswjratzwilabfrmexifczcrivttbcf ; /usr/bin/python3'
Nov 29 06:03:03 compute-0 sudo[30666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:03:04 compute-0 python3[30668]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 06:03:04 compute-0 sudo[30666]: pam_unix(sudo:session): session closed for user root
Nov 29 06:03:04 compute-0 sudo[30739]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zaxzmnvgvijcizoulwhsaladwwdqsfph ; /usr/bin/python3'
Nov 29 06:03:04 compute-0 sudo[30739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:03:04 compute-0 python3[30741]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764396180.6373513-34084-4508336754901/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:03:04 compute-0 sudo[30739]: pam_unix(sudo:session): session closed for user root
Nov 29 06:03:04 compute-0 sudo[30765]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-updhyozdwkfjwsoptmsnnydxdqkkkddm ; /usr/bin/python3'
Nov 29 06:03:04 compute-0 sudo[30765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:03:04 compute-0 python3[30767]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 06:03:04 compute-0 sudo[30765]: pam_unix(sudo:session): session closed for user root
Nov 29 06:03:04 compute-0 sudo[30838]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-woslqlqbsqnbctjfhhbxetwzippekwni ; /usr/bin/python3'
Nov 29 06:03:04 compute-0 sudo[30838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:03:04 compute-0 python3[30840]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764396180.6373513-34084-4508336754901/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=25e801a9a05537c191e2aa500f19076ac31d3e5b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:03:04 compute-0 sudo[30838]: pam_unix(sudo:session): session closed for user root
Nov 29 06:03:08 compute-0 sshd-session[30865]: Connection closed by 192.168.122.11 port 42726 [preauth]
Nov 29 06:03:08 compute-0 sshd-session[30866]: Connection closed by 192.168.122.11 port 42720 [preauth]
Nov 29 06:03:08 compute-0 sshd-session[30868]: Unable to negotiate with 192.168.122.11 port 42736: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Nov 29 06:03:08 compute-0 sshd-session[30867]: Unable to negotiate with 192.168.122.11 port 42756: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Nov 29 06:03:08 compute-0 sshd-session[30869]: Unable to negotiate with 192.168.122.11 port 42750: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Nov 29 06:03:12 compute-0 sshd-session[29979]: Connection closed by 45.78.219.251 port 56418 [preauth]
Nov 29 06:03:13 compute-0 python3[30898]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:04:41 compute-0 sshd-session[30902]: Invalid user x from 36.50.176.16 port 45618
Nov 29 06:04:41 compute-0 sshd-session[30902]: Received disconnect from 36.50.176.16 port 45618:11: Bye Bye [preauth]
Nov 29 06:04:41 compute-0 sshd-session[30902]: Disconnected from invalid user x 36.50.176.16 port 45618 [preauth]
Nov 29 06:05:05 compute-0 sshd-session[30904]: Invalid user x from 160.202.8.218 port 42586
Nov 29 06:05:05 compute-0 sshd-session[30904]: Received disconnect from 160.202.8.218 port 42586:11: Bye Bye [preauth]
Nov 29 06:05:05 compute-0 sshd-session[30904]: Disconnected from invalid user x 160.202.8.218 port 42586 [preauth]
Nov 29 06:05:20 compute-0 sshd-session[30907]: Invalid user ubuntu from 45.78.219.251 port 48900
Nov 29 06:05:20 compute-0 sshd-session[30907]: Received disconnect from 45.78.219.251 port 48900:11: Bye Bye [preauth]
Nov 29 06:05:20 compute-0 sshd-session[30907]: Disconnected from invalid user ubuntu 45.78.219.251 port 48900 [preauth]
Nov 29 06:06:07 compute-0 sshd-session[30909]: Invalid user caja from 103.179.56.44 port 54820
Nov 29 06:06:07 compute-0 sshd-session[30909]: Received disconnect from 103.179.56.44 port 54820:11: Bye Bye [preauth]
Nov 29 06:06:07 compute-0 sshd-session[30909]: Disconnected from invalid user caja 103.179.56.44 port 54820 [preauth]
Nov 29 06:06:43 compute-0 sshd-session[30912]: Invalid user user2 from 36.50.176.16 port 48244
Nov 29 06:06:43 compute-0 sshd-session[30912]: Received disconnect from 36.50.176.16 port 48244:11: Bye Bye [preauth]
Nov 29 06:06:43 compute-0 sshd-session[30912]: Disconnected from invalid user user2 36.50.176.16 port 48244 [preauth]
Nov 29 06:07:57 compute-0 sshd-session[30915]: Received disconnect from 45.78.219.251 port 46210:11: Bye Bye [preauth]
Nov 29 06:07:57 compute-0 sshd-session[30915]: Disconnected from 45.78.219.251 port 46210 [preauth]
Nov 29 06:08:13 compute-0 sshd-session[29984]: Received disconnect from 38.102.83.36 port 55656:11: disconnected by user
Nov 29 06:08:13 compute-0 sshd-session[29984]: Disconnected from user zuul 38.102.83.36 port 55656
Nov 29 06:08:13 compute-0 sshd-session[29981]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:08:13 compute-0 systemd[1]: session-7.scope: Deactivated successfully.
Nov 29 06:08:13 compute-0 systemd[1]: session-7.scope: Consumed 4.783s CPU time.
Nov 29 06:08:13 compute-0 systemd-logind[788]: Session 7 logged out. Waiting for processes to exit.
Nov 29 06:08:13 compute-0 systemd-logind[788]: Removed session 7.
Nov 29 06:08:40 compute-0 sshd-session[30917]: Invalid user caja from 36.50.176.16 port 50412
Nov 29 06:08:40 compute-0 sshd-session[30917]: Received disconnect from 36.50.176.16 port 50412:11: Bye Bye [preauth]
Nov 29 06:08:40 compute-0 sshd-session[30917]: Disconnected from invalid user caja 36.50.176.16 port 50412 [preauth]
Nov 29 06:08:55 compute-0 sshd[1011]: Timeout before authentication for connection from 1.194.239.240 to 38.102.83.110, pid = 30914
Nov 29 06:10:06 compute-0 sshd-session[30921]: Invalid user elastic from 160.202.8.218 port 42108
Nov 29 06:10:06 compute-0 sshd-session[30921]: Received disconnect from 160.202.8.218 port 42108:11: Bye Bye [preauth]
Nov 29 06:10:06 compute-0 sshd-session[30921]: Disconnected from invalid user elastic 160.202.8.218 port 42108 [preauth]
Nov 29 06:10:08 compute-0 sshd-session[30923]: Invalid user ubuntu from 45.78.219.251 port 46124
Nov 29 06:10:08 compute-0 sshd-session[30923]: Received disconnect from 45.78.219.251 port 46124:11: Bye Bye [preauth]
Nov 29 06:10:08 compute-0 sshd-session[30923]: Disconnected from invalid user ubuntu 45.78.219.251 port 46124 [preauth]
Nov 29 06:10:18 compute-0 sshd-session[30926]: Received disconnect from 103.179.56.44 port 49854:11: Bye Bye [preauth]
Nov 29 06:10:18 compute-0 sshd-session[30926]: Disconnected from authenticating user root 103.179.56.44 port 49854 [preauth]
Nov 29 06:10:41 compute-0 sshd-session[30928]: Received disconnect from 36.50.176.16 port 37082:11: Bye Bye [preauth]
Nov 29 06:10:41 compute-0 sshd-session[30928]: Disconnected from authenticating user root 36.50.176.16 port 37082 [preauth]
Nov 29 06:11:04 compute-0 sshd[1011]: Timeout before authentication for connection from 106.13.48.156 to 38.102.83.110, pid = 30920
Nov 29 06:11:48 compute-0 sshd-session[30930]: Received disconnect from 160.202.8.218 port 35636:11: Bye Bye [preauth]
Nov 29 06:11:48 compute-0 sshd-session[30930]: Disconnected from authenticating user root 160.202.8.218 port 35636 [preauth]
Nov 29 06:12:10 compute-0 sshd-session[30932]: Received disconnect from 103.179.56.44 port 42112:11: Bye Bye [preauth]
Nov 29 06:12:10 compute-0 sshd-session[30932]: Disconnected from authenticating user root 103.179.56.44 port 42112 [preauth]
Nov 29 06:12:33 compute-0 sshd-session[30935]: Connection closed by 45.78.219.251 port 48396 [preauth]
Nov 29 06:12:41 compute-0 sshd-session[30937]: Invalid user elena from 36.50.176.16 port 44868
Nov 29 06:12:41 compute-0 sshd-session[30937]: Received disconnect from 36.50.176.16 port 44868:11: Bye Bye [preauth]
Nov 29 06:12:41 compute-0 sshd-session[30937]: Disconnected from invalid user elena 36.50.176.16 port 44868 [preauth]
Nov 29 06:13:17 compute-0 sshd-session[30939]: Received disconnect from 160.202.8.218 port 57380:11: Bye Bye [preauth]
Nov 29 06:13:17 compute-0 sshd-session[30939]: Disconnected from authenticating user root 160.202.8.218 port 57380 [preauth]
Nov 29 06:13:27 compute-0 sshd-session[30941]: Invalid user soporte from 1.214.197.163 port 48838
Nov 29 06:13:27 compute-0 sshd-session[30941]: Received disconnect from 1.214.197.163 port 48838:11: Bye Bye [preauth]
Nov 29 06:13:27 compute-0 sshd-session[30941]: Disconnected from invalid user soporte 1.214.197.163 port 48838 [preauth]
Nov 29 06:13:29 compute-0 sshd-session[30943]: Invalid user odoo from 179.125.24.202 port 56418
Nov 29 06:13:29 compute-0 sshd-session[30943]: Received disconnect from 179.125.24.202 port 56418:11: Bye Bye [preauth]
Nov 29 06:13:29 compute-0 sshd-session[30943]: Disconnected from invalid user odoo 179.125.24.202 port 56418 [preauth]
Nov 29 06:14:00 compute-0 sshd-session[30945]: Invalid user github from 103.179.56.44 port 48224
Nov 29 06:14:00 compute-0 sshd-session[30945]: Received disconnect from 103.179.56.44 port 48224:11: Bye Bye [preauth]
Nov 29 06:14:00 compute-0 sshd-session[30945]: Disconnected from invalid user github 103.179.56.44 port 48224 [preauth]
Nov 29 06:14:22 compute-0 sshd-session[30948]: Received disconnect from 45.202.211.6 port 55552:11: Bye Bye [preauth]
Nov 29 06:14:22 compute-0 sshd-session[30948]: Disconnected from authenticating user root 45.202.211.6 port 55552 [preauth]
Nov 29 06:14:41 compute-0 sshd-session[30950]: Invalid user invitado from 36.50.176.16 port 51114
Nov 29 06:14:41 compute-0 sshd-session[30950]: Received disconnect from 36.50.176.16 port 51114:11: Bye Bye [preauth]
Nov 29 06:14:41 compute-0 sshd-session[30950]: Disconnected from invalid user invitado 36.50.176.16 port 51114 [preauth]
Nov 29 06:14:52 compute-0 sshd-session[30952]: Received disconnect from 160.202.8.218 port 50854:11: Bye Bye [preauth]
Nov 29 06:14:52 compute-0 sshd-session[30952]: Disconnected from authenticating user root 160.202.8.218 port 50854 [preauth]
Nov 29 06:14:56 compute-0 sshd-session[30954]: Invalid user kali from 45.78.219.251 port 47910
Nov 29 06:14:56 compute-0 sshd-session[30954]: Received disconnect from 45.78.219.251 port 47910:11: Bye Bye [preauth]
Nov 29 06:14:56 compute-0 sshd-session[30954]: Disconnected from invalid user kali 45.78.219.251 port 47910 [preauth]
Nov 29 06:15:53 compute-0 sshd-session[30956]: Received disconnect from 103.179.56.44 port 56620:11: Bye Bye [preauth]
Nov 29 06:15:53 compute-0 sshd-session[30956]: Disconnected from authenticating user root 103.179.56.44 port 56620 [preauth]
Nov 29 06:16:22 compute-0 sshd-session[30958]: Received disconnect from 179.125.24.202 port 46608:11: Bye Bye [preauth]
Nov 29 06:16:22 compute-0 sshd-session[30958]: Disconnected from authenticating user root 179.125.24.202 port 46608 [preauth]
Nov 29 06:16:26 compute-0 sshd-session[30960]: Invalid user splunk from 160.202.8.218 port 44336
Nov 29 06:16:27 compute-0 sshd-session[30960]: Received disconnect from 160.202.8.218 port 44336:11: Bye Bye [preauth]
Nov 29 06:16:27 compute-0 sshd-session[30960]: Disconnected from invalid user splunk 160.202.8.218 port 44336 [preauth]
Nov 29 06:16:28 compute-0 sshd-session[30962]: Invalid user user01 from 152.32.250.188 port 55382
Nov 29 06:16:28 compute-0 sshd-session[30964]: Invalid user update from 1.214.197.163 port 39400
Nov 29 06:16:28 compute-0 sshd-session[30964]: Received disconnect from 1.214.197.163 port 39400:11: Bye Bye [preauth]
Nov 29 06:16:28 compute-0 sshd-session[30964]: Disconnected from invalid user update 1.214.197.163 port 39400 [preauth]
Nov 29 06:16:29 compute-0 sshd-session[30962]: Received disconnect from 152.32.250.188 port 55382:11: Bye Bye [preauth]
Nov 29 06:16:29 compute-0 sshd-session[30962]: Disconnected from invalid user user01 152.32.250.188 port 55382 [preauth]
Nov 29 06:16:42 compute-0 sshd-session[30967]: Received disconnect from 36.50.176.16 port 55360:11: Bye Bye [preauth]
Nov 29 06:16:42 compute-0 sshd-session[30967]: Disconnected from authenticating user root 36.50.176.16 port 55360 [preauth]
Nov 29 06:16:44 compute-0 sshd-session[30969]: Invalid user root1 from 45.202.211.6 port 60920
Nov 29 06:16:44 compute-0 sshd-session[30969]: Received disconnect from 45.202.211.6 port 60920:11: Bye Bye [preauth]
Nov 29 06:16:44 compute-0 sshd-session[30969]: Disconnected from invalid user root1 45.202.211.6 port 60920 [preauth]
Nov 29 06:17:06 compute-0 sshd-session[30971]: Accepted publickey for zuul from 192.168.122.30 port 59232 ssh2: ECDSA SHA256:Ey+6YDlYfkE2dRe/gjWhHvvrJHee4xZo2Q1JojEuVBA
Nov 29 06:17:06 compute-0 systemd-logind[788]: New session 8 of user zuul.
Nov 29 06:17:06 compute-0 systemd[1]: Started Session 8 of User zuul.
Nov 29 06:17:06 compute-0 sshd-session[30971]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:17:07 compute-0 python3.9[31124]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:17:08 compute-0 sudo[31303]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzlundhbpcltkbwjsxubwvroslmhgpoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397028.377385-61-198714103150833/AnsiballZ_command.py'
Nov 29 06:17:08 compute-0 sudo[31303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:17:09 compute-0 python3.9[31305]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:17:16 compute-0 sudo[31303]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:17 compute-0 sshd-session[30974]: Connection closed by 192.168.122.30 port 59232
Nov 29 06:17:17 compute-0 sshd-session[30971]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:17:17 compute-0 systemd[1]: session-8.scope: Deactivated successfully.
Nov 29 06:17:17 compute-0 systemd[1]: session-8.scope: Consumed 7.990s CPU time.
Nov 29 06:17:17 compute-0 systemd-logind[788]: Session 8 logged out. Waiting for processes to exit.
Nov 29 06:17:17 compute-0 systemd-logind[788]: Removed session 8.
Nov 29 06:17:18 compute-0 sshd-session[31335]: Invalid user cumulus from 45.78.219.251 port 43270
Nov 29 06:17:18 compute-0 sshd-session[31335]: Received disconnect from 45.78.219.251 port 43270:11: Bye Bye [preauth]
Nov 29 06:17:18 compute-0 sshd-session[31335]: Disconnected from invalid user cumulus 45.78.219.251 port 43270 [preauth]
Nov 29 06:17:33 compute-0 sshd-session[31365]: Accepted publickey for zuul from 192.168.122.30 port 55446 ssh2: ECDSA SHA256:Ey+6YDlYfkE2dRe/gjWhHvvrJHee4xZo2Q1JojEuVBA
Nov 29 06:17:33 compute-0 systemd-logind[788]: New session 9 of user zuul.
Nov 29 06:17:33 compute-0 systemd[1]: Started Session 9 of User zuul.
Nov 29 06:17:33 compute-0 sshd-session[31365]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:17:34 compute-0 python3.9[31518]: ansible-ansible.legacy.ping Invoked with data=pong
Nov 29 06:17:35 compute-0 python3.9[31692]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:17:36 compute-0 sudo[31842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kijptlhipjsmvzcmijhxhndvjzynlxga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397055.6143186-98-186075314987546/AnsiballZ_command.py'
Nov 29 06:17:36 compute-0 sudo[31842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:17:36 compute-0 python3.9[31844]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:17:36 compute-0 sudo[31842]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:37 compute-0 sudo[31995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzwcogfygddxfdprtiedunuborwwxsxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397056.8025546-134-241151046826935/AnsiballZ_stat.py'
Nov 29 06:17:37 compute-0 sudo[31995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:17:37 compute-0 python3.9[31997]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:17:37 compute-0 sudo[31995]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:38 compute-0 sudo[32147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfyqvrusuvjytpnaoejgbflrszyswhui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397057.6675205-158-56374781909354/AnsiballZ_file.py'
Nov 29 06:17:38 compute-0 sudo[32147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:17:38 compute-0 python3.9[32149]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:17:38 compute-0 sudo[32147]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:38 compute-0 sudo[32299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igzsweuznvzgpjwzvggsmqxvauejdkzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397058.5989685-182-218117469956926/AnsiballZ_stat.py'
Nov 29 06:17:38 compute-0 sudo[32299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:17:39 compute-0 python3.9[32301]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:17:39 compute-0 sudo[32299]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:39 compute-0 sudo[32422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpmojdcesduhbwpdjifpoylcwivgsuje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397058.5989685-182-218117469956926/AnsiballZ_copy.py'
Nov 29 06:17:39 compute-0 sudo[32422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:17:39 compute-0 python3.9[32424]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397058.5989685-182-218117469956926/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:17:39 compute-0 sudo[32422]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:40 compute-0 sudo[32574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bokbdzstemqshbvatkzztiokmovquiti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397060.0488539-227-36709711039771/AnsiballZ_setup.py'
Nov 29 06:17:40 compute-0 sudo[32574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:17:40 compute-0 python3.9[32576]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:17:40 compute-0 sudo[32574]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:41 compute-0 sudo[32732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efhhdfatkscrdyqoaslcrmcctclzuzon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397061.004254-251-196298926670593/AnsiballZ_file.py'
Nov 29 06:17:41 compute-0 sudo[32732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:17:41 compute-0 python3.9[32734]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:17:41 compute-0 sudo[32732]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:42 compute-0 sudo[32884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvnsjrcbvcmdgisrfphavylzzavmmnma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397061.937014-278-45100065252207/AnsiballZ_file.py'
Nov 29 06:17:42 compute-0 sudo[32884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:17:42 compute-0 sshd-session[32605]: Invalid user david from 103.179.56.44 port 43908
Nov 29 06:17:42 compute-0 python3.9[32886]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:17:42 compute-0 sudo[32884]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:42 compute-0 sshd-session[32605]: Received disconnect from 103.179.56.44 port 43908:11: Bye Bye [preauth]
Nov 29 06:17:42 compute-0 sshd-session[32605]: Disconnected from invalid user david 103.179.56.44 port 43908 [preauth]
Nov 29 06:17:43 compute-0 python3.9[33036]: ansible-ansible.builtin.service_facts Invoked
Nov 29 06:17:47 compute-0 python3.9[33289]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:17:48 compute-0 python3.9[33439]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:17:49 compute-0 python3.9[33593]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:17:50 compute-0 sudo[33749]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oumcaidrigwzueydrqalqwdmntgzcejl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397069.9896004-422-123369296291096/AnsiballZ_setup.py'
Nov 29 06:17:50 compute-0 sudo[33749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:17:50 compute-0 python3.9[33751]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:17:50 compute-0 sudo[33749]: pam_unix(sudo:session): session closed for user root
Nov 29 06:17:51 compute-0 sshd-session[33752]: Invalid user teamspeak from 179.125.24.202 port 60948
Nov 29 06:17:51 compute-0 sudo[33835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkckrowemrwwvnslrvcheyfyjuzkibrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397069.9896004-422-123369296291096/AnsiballZ_dnf.py'
Nov 29 06:17:51 compute-0 sudo[33835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:17:51 compute-0 sshd-session[33752]: Received disconnect from 179.125.24.202 port 60948:11: Bye Bye [preauth]
Nov 29 06:17:51 compute-0 sshd-session[33752]: Disconnected from invalid user teamspeak 179.125.24.202 port 60948 [preauth]
Nov 29 06:17:51 compute-0 python3.9[33837]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:17:53 compute-0 sshd-session[33857]: Received disconnect from 1.214.197.163 port 40786:11: Bye Bye [preauth]
Nov 29 06:17:53 compute-0 sshd-session[33857]: Disconnected from authenticating user root 1.214.197.163 port 40786 [preauth]
Nov 29 06:17:56 compute-0 sshd-session[33903]: Invalid user ubuntu from 160.202.8.218 port 37832
Nov 29 06:17:56 compute-0 sshd-session[33903]: Received disconnect from 160.202.8.218 port 37832:11: Bye Bye [preauth]
Nov 29 06:17:56 compute-0 sshd-session[33903]: Disconnected from invalid user ubuntu 160.202.8.218 port 37832 [preauth]
Nov 29 06:18:01 compute-0 sshd-session[33910]: Received disconnect from 45.202.211.6 port 41212:11: Bye Bye [preauth]
Nov 29 06:18:01 compute-0 sshd-session[33910]: Disconnected from authenticating user root 45.202.211.6 port 41212 [preauth]
Nov 29 06:18:41 compute-0 sshd[1011]: Timeout before authentication for connection from 14.103.118.217 to 38.102.83.110, pid = 30966
Nov 29 06:18:41 compute-0 systemd[1]: Reloading.
Nov 29 06:18:42 compute-0 systemd-rc-local-generator[34043]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:18:42 compute-0 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Nov 29 06:18:42 compute-0 systemd[1]: Reloading.
Nov 29 06:18:42 compute-0 systemd-rc-local-generator[34080]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:18:42 compute-0 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Nov 29 06:18:42 compute-0 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Nov 29 06:18:42 compute-0 systemd[1]: Reloading.
Nov 29 06:18:42 compute-0 systemd-rc-local-generator[34122]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:18:42 compute-0 systemd[1]: Listening on LVM2 poll daemon socket.
Nov 29 06:18:42 compute-0 sshd-session[34012]: Invalid user pivpn from 36.50.176.16 port 37294
Nov 29 06:18:43 compute-0 dbus-broker-launch[772]: Noticed file-system modification, trigger reload.
Nov 29 06:18:43 compute-0 dbus-broker-launch[772]: Noticed file-system modification, trigger reload.
Nov 29 06:18:43 compute-0 dbus-broker-launch[772]: Noticed file-system modification, trigger reload.
Nov 29 06:18:43 compute-0 sshd-session[34012]: Received disconnect from 36.50.176.16 port 37294:11: Bye Bye [preauth]
Nov 29 06:18:43 compute-0 sshd-session[34012]: Disconnected from invalid user pivpn 36.50.176.16 port 37294 [preauth]
Nov 29 06:19:14 compute-0 sshd-session[34230]: Received disconnect from 1.214.197.163 port 42174:11: Bye Bye [preauth]
Nov 29 06:19:14 compute-0 sshd-session[34230]: Disconnected from authenticating user root 1.214.197.163 port 42174 [preauth]
Nov 29 06:19:15 compute-0 sshd-session[34238]: Received disconnect from 45.202.211.6 port 45332:11: Bye Bye [preauth]
Nov 29 06:19:15 compute-0 sshd-session[34238]: Disconnected from authenticating user root 45.202.211.6 port 45332 [preauth]
Nov 29 06:19:16 compute-0 sshd-session[34245]: Invalid user soporte from 179.125.24.202 port 58996
Nov 29 06:19:16 compute-0 sshd-session[34245]: Received disconnect from 179.125.24.202 port 58996:11: Bye Bye [preauth]
Nov 29 06:19:16 compute-0 sshd-session[34245]: Disconnected from invalid user soporte 179.125.24.202 port 58996 [preauth]
Nov 29 06:19:25 compute-0 sshd-session[34277]: Received disconnect from 160.202.8.218 port 59560:11: Bye Bye [preauth]
Nov 29 06:19:25 compute-0 sshd-session[34277]: Disconnected from authenticating user root 160.202.8.218 port 59560 [preauth]
Nov 29 06:19:33 compute-0 sshd-session[34300]: Received disconnect from 103.179.56.44 port 53490:11: Bye Bye [preauth]
Nov 29 06:19:33 compute-0 sshd-session[34300]: Disconnected from authenticating user root 103.179.56.44 port 53490 [preauth]
Nov 29 06:19:59 compute-0 kernel: SELinux:  Converting 2717 SID table entries...
Nov 29 06:19:59 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 06:19:59 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 29 06:19:59 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 06:19:59 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 29 06:19:59 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 06:19:59 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 06:19:59 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 06:19:59 compute-0 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Nov 29 06:19:59 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 06:19:59 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 29 06:19:59 compute-0 systemd[1]: Reloading.
Nov 29 06:19:59 compute-0 systemd-rc-local-generator[34463]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:19:59 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 06:20:01 compute-0 sudo[33835]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:01 compute-0 sudo[35373]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxxvllfivjyeccpxajbrkerlywckghce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397201.2793286-458-89267006252595/AnsiballZ_command.py'
Nov 29 06:20:01 compute-0 sudo[35373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:20:01 compute-0 python3.9[35375]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:20:02 compute-0 sudo[35373]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:03 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 06:20:03 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 29 06:20:03 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.177s CPU time.
Nov 29 06:20:03 compute-0 systemd[1]: run-r28ce7f3bce8947b685bd1aa947908574.service: Deactivated successfully.
Nov 29 06:20:03 compute-0 sudo[35655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amnsumeqmmqncetpiosuewvunhcwbloo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397203.1563406-482-175018457587064/AnsiballZ_selinux.py'
Nov 29 06:20:03 compute-0 sudo[35655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:20:04 compute-0 python3.9[35657]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Nov 29 06:20:04 compute-0 sudo[35655]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:04 compute-0 sudo[35807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjepebkvdlfwmsfceljblpdgutmhxsiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397204.6011176-515-32349992663974/AnsiballZ_command.py'
Nov 29 06:20:04 compute-0 sudo[35807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:20:05 compute-0 python3.9[35809]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Nov 29 06:20:07 compute-0 sudo[35807]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:09 compute-0 sudo[35961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmhnwzxilapcwipgojyjeymfredtkaey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397208.9077523-539-19605380769138/AnsiballZ_file.py'
Nov 29 06:20:09 compute-0 sudo[35961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:20:09 compute-0 python3.9[35963]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:20:09 compute-0 sudo[35961]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:14 compute-0 sudo[36113]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ketvysqfwgfhlsfrqkbyfyryaigbfvaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397213.7574613-563-50200892291505/AnsiballZ_mount.py'
Nov 29 06:20:14 compute-0 sudo[36113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:20:17 compute-0 python3.9[36115]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Nov 29 06:20:17 compute-0 sudo[36113]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:18 compute-0 sudo[36265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvjsvgtuysocentpwicwusbmufvisest ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397218.5435236-647-197702996391349/AnsiballZ_file.py'
Nov 29 06:20:18 compute-0 sudo[36265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:20:19 compute-0 sshd-session[36269]: Invalid user support from 78.128.112.74 port 33456
Nov 29 06:20:19 compute-0 sshd-session[36269]: Connection closed by invalid user support 78.128.112.74 port 33456 [preauth]
Nov 29 06:20:19 compute-0 python3.9[36267]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:20:19 compute-0 sudo[36265]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:20 compute-0 sudo[36420]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnjxapdiasmrzllgawxtjwtbxszcvuse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397219.8627248-671-158846484269430/AnsiballZ_stat.py'
Nov 29 06:20:20 compute-0 sudo[36420]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:20:20 compute-0 python3.9[36422]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:20:20 compute-0 sudo[36420]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:20 compute-0 sudo[36543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkiwihrzrcalzdntdbitoznpcpslhoor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397219.8627248-671-158846484269430/AnsiballZ_copy.py'
Nov 29 06:20:20 compute-0 sudo[36543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:20:20 compute-0 python3.9[36545]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397219.8627248-671-158846484269430/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e40b48f8fd7ce69cedaa6e53dbd579733b9096d3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:20:20 compute-0 sudo[36543]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:27 compute-0 sshd-session[36570]: Received disconnect from 45.202.211.6 port 50724:11: Bye Bye [preauth]
Nov 29 06:20:27 compute-0 sshd-session[36570]: Disconnected from authenticating user root 45.202.211.6 port 50724 [preauth]
Nov 29 06:20:29 compute-0 sudo[36697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssgpofuuuzduforwlyhkdukfpqqagbhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397229.4503763-743-158607311015148/AnsiballZ_stat.py'
Nov 29 06:20:29 compute-0 sudo[36697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:20:29 compute-0 python3.9[36699]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:20:29 compute-0 sudo[36697]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:30 compute-0 sudo[36849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgloyenevoatvurcgqnfbuaaqfswregb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397230.1709113-767-130405579742786/AnsiballZ_command.py'
Nov 29 06:20:30 compute-0 sudo[36849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:20:30 compute-0 python3.9[36851]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:20:30 compute-0 sudo[36849]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:31 compute-0 sudo[37002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktfybnnylakmscvcksjoqrduxojguxjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397230.9178324-791-226816342417579/AnsiballZ_file.py'
Nov 29 06:20:31 compute-0 sudo[37002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:20:31 compute-0 python3.9[37004]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:20:31 compute-0 sudo[37002]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:32 compute-0 sudo[37156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gasqbgahaxriqdesyvdkslscozfaufbg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397231.9172266-824-83562960580161/AnsiballZ_getent.py'
Nov 29 06:20:32 compute-0 sudo[37156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:20:32 compute-0 python3.9[37158]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Nov 29 06:20:32 compute-0 sudo[37156]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:32 compute-0 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 06:20:33 compute-0 sshd-session[37029]: Invalid user bitnami from 1.214.197.163 port 43564
Nov 29 06:20:33 compute-0 sudo[37310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elihcfcucxiefmluhankmibvvoypocik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397232.7845418-848-162961698705042/AnsiballZ_group.py'
Nov 29 06:20:33 compute-0 sudo[37310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:20:33 compute-0 sshd-session[37029]: Received disconnect from 1.214.197.163 port 43564:11: Bye Bye [preauth]
Nov 29 06:20:33 compute-0 sshd-session[37029]: Disconnected from invalid user bitnami 1.214.197.163 port 43564 [preauth]
Nov 29 06:20:33 compute-0 python3.9[37312]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 29 06:20:33 compute-0 groupadd[37313]: group added to /etc/group: name=qemu, GID=107
Nov 29 06:20:33 compute-0 groupadd[37313]: group added to /etc/gshadow: name=qemu
Nov 29 06:20:33 compute-0 groupadd[37313]: new group: name=qemu, GID=107
Nov 29 06:20:33 compute-0 sudo[37310]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:34 compute-0 sudo[37468]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fumjmhdmlieuwyxokyrtvfeiqtvhyzae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397233.9320471-872-154287044622312/AnsiballZ_user.py'
Nov 29 06:20:34 compute-0 sudo[37468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:20:34 compute-0 python3.9[37470]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 29 06:20:34 compute-0 useradd[37472]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Nov 29 06:20:35 compute-0 sudo[37468]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:35 compute-0 sudo[37628]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dycadgvsifwutddscrkkskxbncfimuuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397235.4703724-896-237005681933573/AnsiballZ_getent.py'
Nov 29 06:20:35 compute-0 sudo[37628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:20:35 compute-0 python3.9[37630]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Nov 29 06:20:35 compute-0 sudo[37628]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:36 compute-0 sudo[37781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prrsktenswpdphjgnpsiutwqlsaygvyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397236.2325983-920-270829663729934/AnsiballZ_group.py'
Nov 29 06:20:36 compute-0 sudo[37781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:20:36 compute-0 python3.9[37783]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 29 06:20:36 compute-0 groupadd[37784]: group added to /etc/group: name=hugetlbfs, GID=42477
Nov 29 06:20:36 compute-0 groupadd[37784]: group added to /etc/gshadow: name=hugetlbfs
Nov 29 06:20:36 compute-0 groupadd[37784]: new group: name=hugetlbfs, GID=42477
Nov 29 06:20:36 compute-0 sudo[37781]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:37 compute-0 sudo[37939]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjmmepfmoxktpczewfdpguvawibbirbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397237.193598-947-115441056251267/AnsiballZ_file.py'
Nov 29 06:20:37 compute-0 sudo[37939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:20:37 compute-0 python3.9[37941]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Nov 29 06:20:37 compute-0 sudo[37939]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:38 compute-0 sshd-session[37942]: Invalid user esuser from 179.125.24.202 port 40284
Nov 29 06:20:38 compute-0 sudo[38093]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txczhodtgeembmtiytuasredskvpkyxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397238.2674088-980-35251254894211/AnsiballZ_dnf.py'
Nov 29 06:20:38 compute-0 sudo[38093]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:20:38 compute-0 sshd-session[37942]: Received disconnect from 179.125.24.202 port 40284:11: Bye Bye [preauth]
Nov 29 06:20:38 compute-0 sshd-session[37942]: Disconnected from invalid user esuser 179.125.24.202 port 40284 [preauth]
Nov 29 06:20:38 compute-0 python3.9[38095]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:20:41 compute-0 sudo[38093]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:42 compute-0 sudo[38248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajqzaebjpdohwtqrgddjlxzytneexktc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397242.2743828-1004-216089697711711/AnsiballZ_file.py'
Nov 29 06:20:42 compute-0 sudo[38248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:20:42 compute-0 python3.9[38250]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:20:42 compute-0 sudo[38248]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:43 compute-0 sshd-session[38097]: Invalid user test from 36.50.176.16 port 56258
Nov 29 06:20:43 compute-0 sudo[38400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mttsrenfcqbtmcjdletsssyxiazvsijq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397242.9304304-1028-70843907421443/AnsiballZ_stat.py'
Nov 29 06:20:43 compute-0 sudo[38400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:20:43 compute-0 sshd-session[38097]: Received disconnect from 36.50.176.16 port 56258:11: Bye Bye [preauth]
Nov 29 06:20:43 compute-0 sshd-session[38097]: Disconnected from invalid user test 36.50.176.16 port 56258 [preauth]
Nov 29 06:20:43 compute-0 python3.9[38402]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:20:43 compute-0 sudo[38400]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:43 compute-0 sudo[38523]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvhrwegnaoarxtnjdbwsvuffkkwyjhjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397242.9304304-1028-70843907421443/AnsiballZ_copy.py'
Nov 29 06:20:43 compute-0 sudo[38523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:20:44 compute-0 python3.9[38525]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397242.9304304-1028-70843907421443/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:20:44 compute-0 sudo[38523]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:45 compute-0 sudo[38675]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcnqoipnnlethoaoxetulumsmxbmqymu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397244.4021978-1073-217315807702615/AnsiballZ_systemd.py'
Nov 29 06:20:45 compute-0 sudo[38675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:20:45 compute-0 python3.9[38677]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:20:45 compute-0 systemd[1]: Starting Load Kernel Modules...
Nov 29 06:20:45 compute-0 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Nov 29 06:20:45 compute-0 kernel: Bridge firewalling registered
Nov 29 06:20:45 compute-0 systemd-modules-load[38681]: Inserted module 'br_netfilter'
Nov 29 06:20:45 compute-0 systemd[1]: Finished Load Kernel Modules.
Nov 29 06:20:45 compute-0 sudo[38675]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:45 compute-0 sudo[38834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxvnwavpvsnrinvhchmkuiqzsjkmxsyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397245.6915033-1097-52613899310437/AnsiballZ_stat.py'
Nov 29 06:20:45 compute-0 sudo[38834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:20:46 compute-0 python3.9[38836]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:20:46 compute-0 sudo[38834]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:46 compute-0 sudo[38957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrsoggyngyamyavtxgeobxzaiokusisb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397245.6915033-1097-52613899310437/AnsiballZ_copy.py'
Nov 29 06:20:46 compute-0 sudo[38957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:20:46 compute-0 python3.9[38959]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397245.6915033-1097-52613899310437/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:20:46 compute-0 sudo[38957]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:47 compute-0 sudo[39109]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcdxzawgrmwtdvejgqhckwrhmotafptf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397247.3769453-1151-35063511608833/AnsiballZ_dnf.py'
Nov 29 06:20:47 compute-0 sudo[39109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:20:47 compute-0 python3.9[39111]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:20:52 compute-0 dbus-broker-launch[772]: Noticed file-system modification, trigger reload.
Nov 29 06:20:52 compute-0 dbus-broker-launch[772]: Noticed file-system modification, trigger reload.
Nov 29 06:20:53 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 06:20:53 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 29 06:20:53 compute-0 systemd[1]: Reloading.
Nov 29 06:20:53 compute-0 systemd-rc-local-generator[39169]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:20:53 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 06:20:54 compute-0 sudo[39109]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:55 compute-0 python3.9[40984]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:20:56 compute-0 python3.9[42172]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Nov 29 06:20:57 compute-0 python3.9[43025]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:20:57 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 06:20:57 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 29 06:20:57 compute-0 systemd[1]: man-db-cache-update.service: Consumed 4.593s CPU time.
Nov 29 06:20:57 compute-0 systemd[1]: run-r3b928248656b48dabfc21a099e7c3256.service: Deactivated successfully.
Nov 29 06:20:57 compute-0 sudo[43301]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkhwblhgsapyopkevafxkgltgmielted ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397257.6566014-1268-519715490988/AnsiballZ_command.py'
Nov 29 06:20:57 compute-0 sudo[43301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:20:58 compute-0 python3.9[43303]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:20:58 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 29 06:20:58 compute-0 systemd[1]: Starting Authorization Manager...
Nov 29 06:20:58 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Nov 29 06:20:58 compute-0 polkitd[43520]: Started polkitd version 0.117
Nov 29 06:20:58 compute-0 polkitd[43520]: Loading rules from directory /etc/polkit-1/rules.d
Nov 29 06:20:58 compute-0 polkitd[43520]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 29 06:20:58 compute-0 polkitd[43520]: Finished loading, compiling and executing 2 rules
Nov 29 06:20:58 compute-0 polkitd[43520]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Nov 29 06:20:58 compute-0 systemd[1]: Started Authorization Manager.
Nov 29 06:20:58 compute-0 sudo[43301]: pam_unix(sudo:session): session closed for user root
Nov 29 06:20:59 compute-0 sudo[43690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfmnudkppfmupskdhfbavmwgrdxytcua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397259.1144555-1295-110103036811288/AnsiballZ_systemd.py'
Nov 29 06:20:59 compute-0 sudo[43690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:20:59 compute-0 python3.9[43692]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:20:59 compute-0 systemd[1]: Stopping Dynamic System Tuning Daemon...
Nov 29 06:20:59 compute-0 systemd[1]: tuned.service: Deactivated successfully.
Nov 29 06:20:59 compute-0 systemd[1]: Stopped Dynamic System Tuning Daemon.
Nov 29 06:20:59 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 29 06:20:59 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Nov 29 06:20:59 compute-0 sudo[43690]: pam_unix(sudo:session): session closed for user root
Nov 29 06:21:00 compute-0 sshd-session[43662]: Invalid user nginx from 160.202.8.218 port 53036
Nov 29 06:21:00 compute-0 sshd-session[43662]: Received disconnect from 160.202.8.218 port 53036:11: Bye Bye [preauth]
Nov 29 06:21:00 compute-0 sshd-session[43662]: Disconnected from invalid user nginx 160.202.8.218 port 53036 [preauth]
Nov 29 06:21:00 compute-0 python3.9[43853]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Nov 29 06:21:03 compute-0 sudo[44003]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbhkurzptudmxxvixsasbbnnjozghhri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397263.699918-1466-27696365973489/AnsiballZ_systemd.py'
Nov 29 06:21:03 compute-0 sudo[44003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:21:04 compute-0 python3.9[44005]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:21:04 compute-0 systemd[1]: Reloading.
Nov 29 06:21:04 compute-0 systemd-rc-local-generator[44031]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:21:04 compute-0 sudo[44003]: pam_unix(sudo:session): session closed for user root
Nov 29 06:21:04 compute-0 sudo[44192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzkllnqqoqfczsolgwsdzqdcodozmrob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397264.652399-1466-271689952250795/AnsiballZ_systemd.py'
Nov 29 06:21:04 compute-0 sudo[44192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:21:05 compute-0 python3.9[44194]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:21:05 compute-0 systemd[1]: Reloading.
Nov 29 06:21:05 compute-0 systemd-rc-local-generator[44224]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:21:05 compute-0 sudo[44192]: pam_unix(sudo:session): session closed for user root
Nov 29 06:21:06 compute-0 sudo[44381]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npezqmnwgxvxdqumlydmmgiphxivkycc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397265.954088-1514-228725262424731/AnsiballZ_command.py'
Nov 29 06:21:06 compute-0 sudo[44381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:21:06 compute-0 python3.9[44383]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:21:06 compute-0 sudo[44381]: pam_unix(sudo:session): session closed for user root
Nov 29 06:21:06 compute-0 sudo[44534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blzbloxftihadeududgudijofjunbyki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397266.691362-1538-19675282408057/AnsiballZ_command.py'
Nov 29 06:21:06 compute-0 sudo[44534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:21:07 compute-0 python3.9[44536]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:21:07 compute-0 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Nov 29 06:21:07 compute-0 sudo[44534]: pam_unix(sudo:session): session closed for user root
Nov 29 06:21:07 compute-0 sudo[44687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjtfofnpsddeubrdoeijnzztqfmsbauu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397267.568625-1562-32413409737609/AnsiballZ_command.py'
Nov 29 06:21:07 compute-0 sudo[44687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:21:07 compute-0 python3.9[44689]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:21:09 compute-0 sudo[44687]: pam_unix(sudo:session): session closed for user root
Nov 29 06:21:10 compute-0 sudo[44849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjnwppdxrcykyzvmqbvmfhvhquhzbqtj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397269.7810042-1586-66638622086734/AnsiballZ_command.py'
Nov 29 06:21:10 compute-0 sudo[44849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:21:10 compute-0 python3.9[44851]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:21:10 compute-0 sudo[44849]: pam_unix(sudo:session): session closed for user root
Nov 29 06:21:10 compute-0 sudo[45002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqswvcojxhzwyoviifsabkflegchfrru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397270.4794524-1610-254275792402601/AnsiballZ_systemd.py'
Nov 29 06:21:10 compute-0 sudo[45002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:21:10 compute-0 python3.9[45004]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:21:11 compute-0 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 29 06:21:11 compute-0 systemd[1]: Stopped Apply Kernel Variables.
Nov 29 06:21:11 compute-0 systemd[1]: Stopping Apply Kernel Variables...
Nov 29 06:21:11 compute-0 systemd[1]: Starting Apply Kernel Variables...
Nov 29 06:21:11 compute-0 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 29 06:21:11 compute-0 systemd[1]: Finished Apply Kernel Variables.
Nov 29 06:21:11 compute-0 sudo[45002]: pam_unix(sudo:session): session closed for user root
Nov 29 06:21:11 compute-0 sshd-session[31368]: Connection closed by 192.168.122.30 port 55446
Nov 29 06:21:11 compute-0 sshd-session[31365]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:21:11 compute-0 systemd[1]: session-9.scope: Deactivated successfully.
Nov 29 06:21:11 compute-0 systemd[1]: session-9.scope: Consumed 2min 20.926s CPU time.
Nov 29 06:21:11 compute-0 systemd-logind[788]: Session 9 logged out. Waiting for processes to exit.
Nov 29 06:21:11 compute-0 systemd-logind[788]: Removed session 9.
Nov 29 06:21:16 compute-0 sshd-session[45035]: Accepted publickey for zuul from 192.168.122.30 port 36954 ssh2: ECDSA SHA256:Ey+6YDlYfkE2dRe/gjWhHvvrJHee4xZo2Q1JojEuVBA
Nov 29 06:21:16 compute-0 systemd-logind[788]: New session 10 of user zuul.
Nov 29 06:21:16 compute-0 systemd[1]: Started Session 10 of User zuul.
Nov 29 06:21:16 compute-0 sshd-session[45035]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:21:17 compute-0 python3.9[45188]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:21:18 compute-0 python3.9[45342]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:21:19 compute-0 sudo[45496]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frmyifwsomnykbtdorojhdsjhdmjswaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397279.5291696-115-254927038965946/AnsiballZ_command.py'
Nov 29 06:21:19 compute-0 sudo[45496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:21:21 compute-0 python3.9[45498]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:21:21 compute-0 sudo[45496]: pam_unix(sudo:session): session closed for user root
Nov 29 06:21:22 compute-0 python3.9[45649]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:21:22 compute-0 sudo[45803]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwmkxvlqhnbgcjcwhcjdtxkwwsrlecbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397282.5380492-175-91422706081895/AnsiballZ_setup.py'
Nov 29 06:21:22 compute-0 sudo[45803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:21:23 compute-0 python3.9[45805]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:21:23 compute-0 sudo[45803]: pam_unix(sudo:session): session closed for user root
Nov 29 06:21:23 compute-0 sudo[45887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itcjicfadzbumydjgaznfevmqkxhwnvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397282.5380492-175-91422706081895/AnsiballZ_dnf.py'
Nov 29 06:21:23 compute-0 sudo[45887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:21:23 compute-0 python3.9[45889]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:21:25 compute-0 sudo[45887]: pam_unix(sudo:session): session closed for user root
Nov 29 06:21:26 compute-0 sudo[46042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-advgsdlxveqkvhdekydkvwlyoknzlxlh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397285.8577168-211-29709947477889/AnsiballZ_setup.py'
Nov 29 06:21:26 compute-0 sudo[46042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:21:26 compute-0 python3.9[46044]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:21:26 compute-0 sudo[46042]: pam_unix(sudo:session): session closed for user root
Nov 29 06:21:26 compute-0 sshd-session[45891]: Invalid user grid from 103.179.56.44 port 38332
Nov 29 06:21:27 compute-0 sshd-session[45891]: Received disconnect from 103.179.56.44 port 38332:11: Bye Bye [preauth]
Nov 29 06:21:27 compute-0 sshd-session[45891]: Disconnected from invalid user grid 103.179.56.44 port 38332 [preauth]
Nov 29 06:21:27 compute-0 sudo[46213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzwnwjnfieczuxjotgrodwjksfnjavyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397286.974772-244-72574369474920/AnsiballZ_file.py'
Nov 29 06:21:27 compute-0 sudo[46213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:21:27 compute-0 python3.9[46215]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:21:27 compute-0 sudo[46213]: pam_unix(sudo:session): session closed for user root
Nov 29 06:21:28 compute-0 sudo[46365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-puaeoqlzbgcwzngmgekpkytaaotyigty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397287.8259444-268-141689222797410/AnsiballZ_command.py'
Nov 29 06:21:28 compute-0 sudo[46365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:21:28 compute-0 python3.9[46367]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:21:28 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat1724026363-merged.mount: Deactivated successfully.
Nov 29 06:21:28 compute-0 podman[46368]: 2025-11-29 06:21:28.403544596 +0000 UTC m=+0.128488306 system refresh
Nov 29 06:21:28 compute-0 sudo[46365]: pam_unix(sudo:session): session closed for user root
Nov 29 06:21:29 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:21:29 compute-0 sudo[46529]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idjgykfeiihuzveabxutrlagfkaopaac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397289.3882995-292-261970548196274/AnsiballZ_stat.py'
Nov 29 06:21:29 compute-0 sudo[46529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:21:29 compute-0 python3.9[46531]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:21:29 compute-0 sudo[46529]: pam_unix(sudo:session): session closed for user root
Nov 29 06:21:30 compute-0 sudo[46652]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgihshekrbsyrbiagqinokphipohfcwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397289.3882995-292-261970548196274/AnsiballZ_copy.py'
Nov 29 06:21:30 compute-0 sudo[46652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:21:30 compute-0 python3.9[46654]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397289.3882995-292-261970548196274/.source.json follow=False _original_basename=podman_network_config.j2 checksum=fe2af618af927a7b030f79b166bb2966334d6bf4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:21:30 compute-0 sudo[46652]: pam_unix(sudo:session): session closed for user root
Nov 29 06:21:31 compute-0 sudo[46804]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmdctemjjgfxlliormzvnlrxadrwveas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397290.9016807-337-212711127752516/AnsiballZ_stat.py'
Nov 29 06:21:31 compute-0 sudo[46804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:21:31 compute-0 python3.9[46806]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:21:31 compute-0 sudo[46804]: pam_unix(sudo:session): session closed for user root
Nov 29 06:21:31 compute-0 sudo[46927]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmfffepkbcssfvlzkjpxakugmmebnpxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397290.9016807-337-212711127752516/AnsiballZ_copy.py'
Nov 29 06:21:31 compute-0 sudo[46927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:21:31 compute-0 python3.9[46929]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397290.9016807-337-212711127752516/.source.conf follow=False _original_basename=registries.conf.j2 checksum=25aa6c560e50dcbd81b989ea46a7865cb55b8998 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:21:31 compute-0 sudo[46927]: pam_unix(sudo:session): session closed for user root
Nov 29 06:21:32 compute-0 sudo[47079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkaioiltjoeiivkuyzefhomvozbmwyjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397292.2439961-385-168886805822060/AnsiballZ_ini_file.py'
Nov 29 06:21:32 compute-0 sudo[47079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:21:32 compute-0 python3.9[47081]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:21:32 compute-0 sudo[47079]: pam_unix(sudo:session): session closed for user root
Nov 29 06:21:33 compute-0 sudo[47231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfbdrmukrpxppohudhnaufwmdxgcjwhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397292.997322-385-188969249380483/AnsiballZ_ini_file.py'
Nov 29 06:21:33 compute-0 sudo[47231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:21:33 compute-0 python3.9[47233]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:21:33 compute-0 sudo[47231]: pam_unix(sudo:session): session closed for user root
Nov 29 06:21:33 compute-0 sudo[47383]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtkujfxchgrwtatfjigramprbipvqjfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397293.6426225-385-144068962781199/AnsiballZ_ini_file.py'
Nov 29 06:21:33 compute-0 sudo[47383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:21:34 compute-0 python3.9[47385]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:21:34 compute-0 sudo[47383]: pam_unix(sudo:session): session closed for user root
Nov 29 06:21:34 compute-0 sudo[47535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oiifkijdaccjzddjxtvigseqvhodfblz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397294.2164896-385-222982091931460/AnsiballZ_ini_file.py'
Nov 29 06:21:34 compute-0 sudo[47535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:21:34 compute-0 python3.9[47537]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:21:34 compute-0 sudo[47535]: pam_unix(sudo:session): session closed for user root
Nov 29 06:21:35 compute-0 python3.9[47687]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:21:36 compute-0 sudo[47839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndnfyakbxhimffcfrcooeqavjxjcibvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397296.2695634-505-191449530180162/AnsiballZ_dnf.py'
Nov 29 06:21:36 compute-0 sudo[47839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:21:36 compute-0 python3.9[47841]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 06:21:38 compute-0 sudo[47839]: pam_unix(sudo:session): session closed for user root
Nov 29 06:21:39 compute-0 sudo[47994]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bagywmgmzelyexzhboawmjtyxrzzytjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397299.1902606-529-207691275244398/AnsiballZ_dnf.py'
Nov 29 06:21:39 compute-0 sudo[47994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:21:39 compute-0 sshd-session[47843]: Received disconnect from 45.202.211.6 port 48874:11: Bye Bye [preauth]
Nov 29 06:21:39 compute-0 sshd-session[47843]: Disconnected from authenticating user root 45.202.211.6 port 48874 [preauth]
Nov 29 06:21:39 compute-0 python3.9[47996]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 06:21:42 compute-0 sudo[47994]: pam_unix(sudo:session): session closed for user root
Nov 29 06:21:42 compute-0 sudo[48154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bupjwejcjtkmqodtzjzlphawrhpclgcs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397302.5020173-559-151720768751809/AnsiballZ_dnf.py'
Nov 29 06:21:42 compute-0 sudo[48154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:21:42 compute-0 python3.9[48156]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 06:21:44 compute-0 sudo[48154]: pam_unix(sudo:session): session closed for user root
Nov 29 06:21:45 compute-0 sudo[48307]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjnwitmbzltqhrbedxjhlwsxjgjjkpqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397304.9714642-586-96625896374705/AnsiballZ_dnf.py'
Nov 29 06:21:45 compute-0 sudo[48307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:21:45 compute-0 python3.9[48309]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 06:21:47 compute-0 sudo[48307]: pam_unix(sudo:session): session closed for user root
Nov 29 06:21:48 compute-0 sudo[48460]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkvngzvicchcrycjrjmhfdxjjlbaqfaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397308.5143595-619-252727533552629/AnsiballZ_dnf.py'
Nov 29 06:21:48 compute-0 sudo[48460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:21:49 compute-0 python3.9[48462]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 06:21:50 compute-0 sshd-session[48464]: Received disconnect from 1.214.197.163 port 44946:11: Bye Bye [preauth]
Nov 29 06:21:50 compute-0 sshd-session[48464]: Disconnected from authenticating user root 1.214.197.163 port 44946 [preauth]
Nov 29 06:21:51 compute-0 sudo[48460]: pam_unix(sudo:session): session closed for user root
Nov 29 06:21:52 compute-0 sudo[48618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rheybpahezykwsludquarovjmrwlidvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397311.7477982-643-157773293206144/AnsiballZ_dnf.py'
Nov 29 06:21:52 compute-0 sudo[48618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:21:52 compute-0 python3.9[48620]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 06:21:56 compute-0 sudo[48618]: pam_unix(sudo:session): session closed for user root
Nov 29 06:21:57 compute-0 sudo[48788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-quyyoudkwvtmbqrwhivprkaoitendrfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397316.844846-670-18256417422361/AnsiballZ_dnf.py'
Nov 29 06:21:57 compute-0 sudo[48788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:21:57 compute-0 python3.9[48790]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 06:21:58 compute-0 sudo[48788]: pam_unix(sudo:session): session closed for user root
Nov 29 06:21:59 compute-0 sudo[48941]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ziphohhbvszuwyskmeptstbmgqywikbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397319.1439662-697-23359450905750/AnsiballZ_dnf.py'
Nov 29 06:21:59 compute-0 sudo[48941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:21:59 compute-0 python3.9[48943]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 06:22:02 compute-0 sshd-session[48951]: Invalid user conectar from 179.125.24.202 port 57582
Nov 29 06:22:02 compute-0 sshd-session[48951]: Received disconnect from 179.125.24.202 port 57582:11: Bye Bye [preauth]
Nov 29 06:22:02 compute-0 sshd-session[48951]: Disconnected from invalid user conectar 179.125.24.202 port 57582 [preauth]
Nov 29 06:22:11 compute-0 sshd-session[48956]: Connection closed by 45.78.219.251 port 55546 [preauth]
Nov 29 06:22:20 compute-0 sudo[48941]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:22 compute-0 sudo[49282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blfntrievdtoqalimlyyipxrfenjxgcp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397342.6795356-724-57760206232235/AnsiballZ_dnf.py'
Nov 29 06:22:22 compute-0 sudo[49282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:22:23 compute-0 python3.9[49284]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 06:22:24 compute-0 sudo[49282]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:25 compute-0 sudo[49438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nadujmxowygxoefbedsswwwkwpaomqnv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397345.3483067-757-94285479020325/AnsiballZ_file.py'
Nov 29 06:22:25 compute-0 sudo[49438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:22:25 compute-0 python3.9[49440]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:22:25 compute-0 sudo[49438]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:26 compute-0 sudo[49613]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oascfpxcdzppofkggqkvbymztcpfkbcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397346.1056457-781-201473528165375/AnsiballZ_stat.py'
Nov 29 06:22:26 compute-0 sudo[49613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:22:26 compute-0 python3.9[49615]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:22:26 compute-0 sudo[49613]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:26 compute-0 sudo[49736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azfhnmaujxkyboryvrgxjrzphzlwsfhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397346.1056457-781-201473528165375/AnsiballZ_copy.py'
Nov 29 06:22:26 compute-0 sudo[49736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:22:27 compute-0 python3.9[49738]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1764397346.1056457-781-201473528165375/.source.json _original_basename=.x29jj4r3 follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:22:27 compute-0 sudo[49736]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:28 compute-0 sudo[49888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rclnmdapcafozpxuarngrowioggajnkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397347.8683877-835-167927185926764/AnsiballZ_podman_image.py'
Nov 29 06:22:28 compute-0 sudo[49888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:22:28 compute-0 python3.9[49890]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 29 06:22:28 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:22:33 compute-0 sshd-session[49935]: Invalid user github from 160.202.8.218 port 46554
Nov 29 06:22:33 compute-0 sshd-session[49935]: Received disconnect from 160.202.8.218 port 46554:11: Bye Bye [preauth]
Nov 29 06:22:33 compute-0 sshd-session[49935]: Disconnected from invalid user github 160.202.8.218 port 46554 [preauth]
Nov 29 06:22:34 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat2975142100-lower\x2dmapped.mount: Deactivated successfully.
Nov 29 06:22:41 compute-0 sshd-session[49985]: Invalid user admin from 36.50.176.16 port 60794
Nov 29 06:22:42 compute-0 sshd-session[49985]: Received disconnect from 36.50.176.16 port 60794:11: Bye Bye [preauth]
Nov 29 06:22:42 compute-0 sshd-session[49985]: Disconnected from invalid user admin 36.50.176.16 port 60794 [preauth]
Nov 29 06:22:43 compute-0 podman[49901]: 2025-11-29 06:22:43.32021196 +0000 UTC m=+14.731393576 image pull 52cb1910f3f090372807028d1c2aea98d2557b1086636469529f290368ecdf69 quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 29 06:22:43 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:22:43 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:22:43 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:22:43 compute-0 sudo[49888]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:44 compute-0 sudo[50200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjknknevazdrcsfwcrdcnfravcaxuvly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397364.6148582-874-204896259131156/AnsiballZ_podman_image.py'
Nov 29 06:22:44 compute-0 sudo[50200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:22:45 compute-0 python3.9[50202]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 29 06:22:45 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:22:47 compute-0 podman[50214]: 2025-11-29 06:22:47.447041099 +0000 UTC m=+2.267117686 image pull f275b8d168f7f57f31e3da49224019f39f95c80a833f083696a964527b07b54f quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 29 06:22:47 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:22:47 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:22:47 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:22:47 compute-0 sudo[50200]: pam_unix(sudo:session): session closed for user root
Nov 29 06:22:49 compute-0 sudo[50449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxinpdanahwxwqncbewuenizecespawz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397368.4683995-901-214094019519611/AnsiballZ_podman_image.py'
Nov 29 06:22:49 compute-0 sudo[50449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:22:49 compute-0 python3.9[50451]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 29 06:22:49 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:22:52 compute-0 sshd-session[50507]: Received disconnect from 45.202.211.6 port 44310:11: Bye Bye [preauth]
Nov 29 06:22:52 compute-0 sshd-session[50507]: Disconnected from authenticating user root 45.202.211.6 port 44310 [preauth]
Nov 29 06:23:10 compute-0 sshd-session[50552]: Invalid user esuser from 1.214.197.163 port 46336
Nov 29 06:23:10 compute-0 sshd-session[50552]: Received disconnect from 1.214.197.163 port 46336:11: Bye Bye [preauth]
Nov 29 06:23:10 compute-0 sshd-session[50552]: Disconnected from invalid user esuser 1.214.197.163 port 46336 [preauth]
Nov 29 06:23:22 compute-0 sshd-session[50569]: Connection closed by 152.32.250.188 port 37622 [preauth]
Nov 29 06:23:23 compute-0 podman[50463]: 2025-11-29 06:23:23.306275393 +0000 UTC m=+33.944477393 image pull b65793e7266422f5b94c32d109b906c8ffd974cf2ddf0b6929e463e29e05864a quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 29 06:23:23 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:23:23 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:23:23 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:23:23 compute-0 sudo[50449]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:24 compute-0 sshd-session[50573]: Received disconnect from 103.179.56.44 port 59830:11: Bye Bye [preauth]
Nov 29 06:23:24 compute-0 sshd-session[50573]: Disconnected from authenticating user root 103.179.56.44 port 59830 [preauth]
Nov 29 06:23:26 compute-0 sshd-session[50648]: Received disconnect from 179.125.24.202 port 47268:11: Bye Bye [preauth]
Nov 29 06:23:26 compute-0 sshd-session[50648]: Disconnected from authenticating user root 179.125.24.202 port 47268 [preauth]
Nov 29 06:23:30 compute-0 sudo[50775]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxtjvltwlzrzfjupoqrsjyvqodqkmget ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397409.7590675-934-163650249975781/AnsiballZ_podman_image.py'
Nov 29 06:23:30 compute-0 sudo[50775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:23:30 compute-0 python3.9[50777]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 29 06:23:30 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:23:39 compute-0 podman[50789]: 2025-11-29 06:23:39.655027202 +0000 UTC m=+9.400805526 image pull e6f07353639e492d8c9627d6d615ceeb47cb00ac4d14993b12e8023ee2aeee6f quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Nov 29 06:23:39 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:23:39 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:23:39 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:23:39 compute-0 sudo[50775]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:40 compute-0 sudo[51044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbfmcmubwzpmupkuwjstqabcxmzspfyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397419.9696362-934-272863200728829/AnsiballZ_podman_image.py'
Nov 29 06:23:40 compute-0 sudo[51044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:23:40 compute-0 python3.9[51046]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 29 06:23:43 compute-0 podman[51058]: 2025-11-29 06:23:43.162096096 +0000 UTC m=+2.645752802 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Nov 29 06:23:43 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:23:43 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:23:43 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:23:43 compute-0 sudo[51044]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:43 compute-0 sshd-session[45038]: Connection closed by 192.168.122.30 port 36954
Nov 29 06:23:43 compute-0 sshd-session[45035]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:23:43 compute-0 systemd[1]: session-10.scope: Deactivated successfully.
Nov 29 06:23:43 compute-0 systemd[1]: session-10.scope: Consumed 1min 47.835s CPU time.
Nov 29 06:23:43 compute-0 systemd-logind[788]: Session 10 logged out. Waiting for processes to exit.
Nov 29 06:23:43 compute-0 systemd-logind[788]: Removed session 10.
Nov 29 06:23:49 compute-0 sshd-session[51209]: Accepted publickey for zuul from 192.168.122.30 port 50122 ssh2: ECDSA SHA256:Ey+6YDlYfkE2dRe/gjWhHvvrJHee4xZo2Q1JojEuVBA
Nov 29 06:23:49 compute-0 systemd-logind[788]: New session 11 of user zuul.
Nov 29 06:23:49 compute-0 systemd[1]: Started Session 11 of User zuul.
Nov 29 06:23:49 compute-0 sshd-session[51209]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:23:51 compute-0 python3.9[51362]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:23:52 compute-0 sudo[51516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmdugnyoiundgzrwzhwurkslmhywdaub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397431.9466362-74-161073250269728/AnsiballZ_getent.py'
Nov 29 06:23:52 compute-0 sudo[51516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:23:53 compute-0 python3.9[51518]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Nov 29 06:23:53 compute-0 sudo[51516]: pam_unix(sudo:session): session closed for user root
Nov 29 06:23:53 compute-0 sudo[51669]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgstpnrynolrrwyzorzhnorwvfbxfgiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397433.4464004-98-235114771720648/AnsiballZ_group.py'
Nov 29 06:23:53 compute-0 sudo[51669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:23:56 compute-0 python3.9[51671]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 29 06:24:01 compute-0 groupadd[51673]: group added to /etc/group: name=openvswitch, GID=42476
Nov 29 06:24:02 compute-0 groupadd[51673]: group added to /etc/gshadow: name=openvswitch
Nov 29 06:24:02 compute-0 groupadd[51673]: new group: name=openvswitch, GID=42476
Nov 29 06:24:02 compute-0 sudo[51669]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:03 compute-0 sudo[51830]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfpyoaoquzpphvnwbqkshgydeoaepgif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397442.4468775-122-176361777438234/AnsiballZ_user.py'
Nov 29 06:24:03 compute-0 sudo[51830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:24:03 compute-0 sshd-session[51674]: Invalid user spark from 45.202.211.6 port 58138
Nov 29 06:24:03 compute-0 python3.9[51832]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 29 06:24:03 compute-0 sshd-session[51674]: Received disconnect from 45.202.211.6 port 58138:11: Bye Bye [preauth]
Nov 29 06:24:03 compute-0 sshd-session[51674]: Disconnected from invalid user spark 45.202.211.6 port 58138 [preauth]
Nov 29 06:24:04 compute-0 useradd[51834]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Nov 29 06:24:04 compute-0 useradd[51834]: add 'openvswitch' to group 'hugetlbfs'
Nov 29 06:24:04 compute-0 useradd[51834]: add 'openvswitch' to shadow group 'hugetlbfs'
Nov 29 06:24:05 compute-0 sudo[51830]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:05 compute-0 sudo[51990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnmndekyrfkmbhzbpeslehcxkpboiphz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397445.4726424-152-58949054257066/AnsiballZ_setup.py'
Nov 29 06:24:05 compute-0 sudo[51990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:24:06 compute-0 python3.9[51992]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:24:06 compute-0 sudo[51990]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:06 compute-0 sudo[52076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ernxxcxrkhozneenzfaxpuwhglkhlhyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397445.4726424-152-58949054257066/AnsiballZ_dnf.py'
Nov 29 06:24:06 compute-0 sudo[52076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:24:06 compute-0 python3.9[52078]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 06:24:07 compute-0 sshd-session[51993]: Invalid user cisco from 160.202.8.218 port 40038
Nov 29 06:24:07 compute-0 sshd-session[51993]: Received disconnect from 160.202.8.218 port 40038:11: Bye Bye [preauth]
Nov 29 06:24:07 compute-0 sshd-session[51993]: Disconnected from invalid user cisco 160.202.8.218 port 40038 [preauth]
Nov 29 06:24:12 compute-0 sudo[52076]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:13 compute-0 sudo[52238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjbcbwvvxdytfdyfzoqkuwsbfmgtkbjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397453.2150242-194-32855998588719/AnsiballZ_dnf.py'
Nov 29 06:24:13 compute-0 sudo[52238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:24:13 compute-0 python3.9[52240]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:24:27 compute-0 sshd-session[52256]: Invalid user toto from 1.214.197.163 port 47726
Nov 29 06:24:28 compute-0 sshd-session[52256]: Received disconnect from 1.214.197.163 port 47726:11: Bye Bye [preauth]
Nov 29 06:24:28 compute-0 sshd-session[52256]: Disconnected from invalid user toto 1.214.197.163 port 47726 [preauth]
Nov 29 06:24:31 compute-0 kernel: SELinux:  Converting 2730 SID table entries...
Nov 29 06:24:31 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 06:24:31 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 29 06:24:31 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 06:24:31 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 29 06:24:31 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 06:24:31 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 06:24:31 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 06:24:31 compute-0 groupadd[52266]: group added to /etc/group: name=unbound, GID=993
Nov 29 06:24:31 compute-0 groupadd[52266]: group added to /etc/gshadow: name=unbound
Nov 29 06:24:31 compute-0 groupadd[52266]: new group: name=unbound, GID=993
Nov 29 06:24:31 compute-0 useradd[52273]: new user: name=unbound, UID=993, GID=993, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Nov 29 06:24:32 compute-0 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Nov 29 06:24:32 compute-0 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Nov 29 06:24:33 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 06:24:33 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 29 06:24:33 compute-0 systemd[1]: Reloading.
Nov 29 06:24:33 compute-0 systemd-rc-local-generator[52772]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:24:33 compute-0 systemd-sysv-generator[52776]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:24:33 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 06:24:34 compute-0 sudo[52238]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:34 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 06:24:34 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 29 06:24:34 compute-0 systemd[1]: run-rd51ccf6102d04afcb3c1f1e64c003777.service: Deactivated successfully.
Nov 29 06:24:41 compute-0 sudo[53341]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyyggjmskxstwbmyrwqfnfmprttzfalu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397480.5427873-218-145108661247784/AnsiballZ_systemd.py'
Nov 29 06:24:41 compute-0 sudo[53341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:24:41 compute-0 python3.9[53343]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 06:24:41 compute-0 systemd[1]: Reloading.
Nov 29 06:24:41 compute-0 systemd-sysv-generator[53379]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:24:41 compute-0 systemd-rc-local-generator[53375]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:24:41 compute-0 systemd[1]: Starting Open vSwitch Database Unit...
Nov 29 06:24:41 compute-0 chown[53387]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Nov 29 06:24:41 compute-0 ovs-ctl[53392]: /etc/openvswitch/conf.db does not exist ... (warning).
Nov 29 06:24:42 compute-0 ovs-ctl[53392]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Nov 29 06:24:42 compute-0 ovs-ctl[53392]: Starting ovsdb-server [  OK  ]
Nov 29 06:24:42 compute-0 ovs-vsctl[53441]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Nov 29 06:24:42 compute-0 ovs-vsctl[53460]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"7525db09-7529-4df7-96c0-bba03a4d5548\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Nov 29 06:24:42 compute-0 ovs-ctl[53392]: Configuring Open vSwitch system IDs [  OK  ]
Nov 29 06:24:42 compute-0 ovs-ctl[53392]: Enabling remote OVSDB managers [  OK  ]
Nov 29 06:24:42 compute-0 ovs-vsctl[53467]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Nov 29 06:24:42 compute-0 systemd[1]: Started Open vSwitch Database Unit.
Nov 29 06:24:42 compute-0 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Nov 29 06:24:42 compute-0 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Nov 29 06:24:42 compute-0 systemd[1]: Starting Open vSwitch Forwarding Unit...
Nov 29 06:24:42 compute-0 kernel: openvswitch: Open vSwitch switching datapath
Nov 29 06:24:42 compute-0 ovs-ctl[53511]: Inserting openvswitch module [  OK  ]
Nov 29 06:24:42 compute-0 ovs-ctl[53480]: Starting ovs-vswitchd [  OK  ]
Nov 29 06:24:42 compute-0 ovs-ctl[53480]: Enabling remote OVSDB managers [  OK  ]
Nov 29 06:24:42 compute-0 ovs-vsctl[53531]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Nov 29 06:24:42 compute-0 systemd[1]: Started Open vSwitch Forwarding Unit.
Nov 29 06:24:42 compute-0 systemd[1]: Starting Open vSwitch...
Nov 29 06:24:42 compute-0 systemd[1]: Finished Open vSwitch.
Nov 29 06:24:42 compute-0 sudo[53341]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:43 compute-0 sshd-session[53344]: Invalid user daniel from 36.50.176.16 port 52902
Nov 29 06:24:43 compute-0 sshd-session[53344]: Received disconnect from 36.50.176.16 port 52902:11: Bye Bye [preauth]
Nov 29 06:24:43 compute-0 sshd-session[53344]: Disconnected from invalid user daniel 36.50.176.16 port 52902 [preauth]
Nov 29 06:24:43 compute-0 python3.9[53682]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:24:44 compute-0 sudo[53832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqtqnckhaqbodvyqrtgphgcziojnoeqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397484.0855963-272-177558915159871/AnsiballZ_sefcontext.py'
Nov 29 06:24:44 compute-0 sudo[53832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:24:44 compute-0 python3.9[53834]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Nov 29 06:24:46 compute-0 kernel: SELinux:  Converting 2744 SID table entries...
Nov 29 06:24:46 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 06:24:46 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 29 06:24:46 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 06:24:46 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 29 06:24:46 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 06:24:46 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 06:24:46 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 06:24:46 compute-0 sudo[53832]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:47 compute-0 python3.9[53989]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:24:48 compute-0 sudo[54147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ripulvrqhycdyppznxjmsrzbgjglonze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397487.983432-326-55482375678113/AnsiballZ_dnf.py'
Nov 29 06:24:48 compute-0 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Nov 29 06:24:48 compute-0 sudo[54147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:24:48 compute-0 sshd-session[54020]: Invalid user sinusbot from 179.125.24.202 port 39258
Nov 29 06:24:48 compute-0 python3.9[54149]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:24:49 compute-0 sshd-session[54020]: Received disconnect from 179.125.24.202 port 39258:11: Bye Bye [preauth]
Nov 29 06:24:49 compute-0 sshd-session[54020]: Disconnected from invalid user sinusbot 179.125.24.202 port 39258 [preauth]
Nov 29 06:24:51 compute-0 sudo[54147]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:52 compute-0 sudo[54300]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hudshbngroynlteaigtlsxpkauxwjzda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397492.3388467-350-131910874022946/AnsiballZ_command.py'
Nov 29 06:24:52 compute-0 sudo[54300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:24:52 compute-0 python3.9[54302]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:24:53 compute-0 sudo[54300]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:54 compute-0 sudo[54587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqrfvpnveodnsrwevwtmnrnszajbtkeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397493.9150476-374-45139919335329/AnsiballZ_file.py'
Nov 29 06:24:54 compute-0 sudo[54587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:24:54 compute-0 python3.9[54589]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 29 06:24:54 compute-0 sudo[54587]: pam_unix(sudo:session): session closed for user root
Nov 29 06:24:55 compute-0 python3.9[54739]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:24:55 compute-0 sudo[54891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibhrygffjywocaqhtqvhzshucjtkjqsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397495.6447964-422-78144707819275/AnsiballZ_dnf.py'
Nov 29 06:24:55 compute-0 sudo[54891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:24:56 compute-0 python3.9[54893]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:24:58 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 06:24:58 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 29 06:24:58 compute-0 systemd[1]: Reloading.
Nov 29 06:24:58 compute-0 systemd-rc-local-generator[54931]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:24:58 compute-0 systemd-sysv-generator[54934]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:24:58 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 06:25:01 compute-0 sudo[54891]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:02 compute-0 sudo[55207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-saiduixxszdpegzkoybagfrleqjikxin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397502.1203563-446-1270323550480/AnsiballZ_systemd.py'
Nov 29 06:25:02 compute-0 sudo[55207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:03 compute-0 python3.9[55209]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:25:03 compute-0 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Nov 29 06:25:03 compute-0 systemd[1]: Stopped Network Manager Wait Online.
Nov 29 06:25:03 compute-0 systemd[1]: Stopping Network Manager Wait Online...
Nov 29 06:25:03 compute-0 systemd[1]: Stopping Network Manager...
Nov 29 06:25:03 compute-0 NetworkManager[7212]: <info>  [1764397503.0992] caught SIGTERM, shutting down normally.
Nov 29 06:25:03 compute-0 NetworkManager[7212]: <info>  [1764397503.1014] dhcp4 (eth0): canceled DHCP transaction
Nov 29 06:25:03 compute-0 NetworkManager[7212]: <info>  [1764397503.1016] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 29 06:25:03 compute-0 NetworkManager[7212]: <info>  [1764397503.1017] dhcp4 (eth0): state changed no lease
Nov 29 06:25:03 compute-0 NetworkManager[7212]: <info>  [1764397503.1022] manager: NetworkManager state is now CONNECTED_SITE
Nov 29 06:25:03 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 29 06:25:03 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 29 06:25:04 compute-0 NetworkManager[7212]: <info>  [1764397504.0036] exiting (success)
Nov 29 06:25:04 compute-0 systemd[1]: NetworkManager.service: Deactivated successfully.
Nov 29 06:25:04 compute-0 systemd[1]: Stopped Network Manager.
Nov 29 06:25:04 compute-0 systemd[1]: NetworkManager.service: Consumed 13.702s CPU time, 4.1M memory peak, read 0B from disk, written 29.0K to disk.
Nov 29 06:25:04 compute-0 systemd[1]: Starting Network Manager...
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.0817] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:3b7c8c50-55c8-43c8-86aa-f5fa30ebf228)
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.0819] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.0894] manager[0x564164734090]: monitoring kernel firmware directory '/lib/firmware'.
Nov 29 06:25:04 compute-0 systemd[1]: Starting Hostname Service...
Nov 29 06:25:04 compute-0 systemd[1]: Started Hostname Service.
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.1826] hostname: hostname: using hostnamed
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.1828] hostname: static hostname changed from (none) to "compute-0"
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.1833] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.1838] manager[0x564164734090]: rfkill: Wi-Fi hardware radio set enabled
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.1838] manager[0x564164734090]: rfkill: WWAN hardware radio set enabled
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.1863] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.1873] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.1874] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.1875] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.1876] manager: Networking is enabled by state file
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.1879] settings: Loaded settings plugin: keyfile (internal)
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.1883] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.1913] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.1923] dhcp: init: Using DHCP client 'internal'
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.1926] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.1932] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.1938] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.1947] device (lo): Activation: starting connection 'lo' (43d811ee-2cbe-4425-a4fb-d7d92aa3b968)
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.1954] device (eth0): carrier: link connected
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.1959] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.1964] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.1965] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.1974] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.1982] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.1990] device (eth1): carrier: link connected
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.1995] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.2000] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (8207c9a3-524e-532c-bd71-3fc37e48ed01) (indicated)
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.2000] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.2006] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.2013] device (eth1): Activation: starting connection 'ci-private-network' (8207c9a3-524e-532c-bd71-3fc37e48ed01)
Nov 29 06:25:04 compute-0 systemd[1]: Started Network Manager.
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.2019] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.2037] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.2041] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.2044] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.2049] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.2053] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.2055] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.2058] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.2062] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.2069] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.2073] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.2090] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.2105] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.2116] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.2120] dhcp4 (eth0): state changed new lease, address=38.102.83.110
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.2123] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.2131] device (lo): Activation: successful, device activated.
Nov 29 06:25:04 compute-0 systemd[1]: Starting Network Manager Wait Online...
Nov 29 06:25:04 compute-0 NetworkManager[55227]: <info>  [1764397504.2143] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 29 06:25:04 compute-0 sudo[55207]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:04 compute-0 sudo[55402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkughvvktiprombafdkmjfmslbcubwkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397504.428234-470-105817486335915/AnsiballZ_dnf.py'
Nov 29 06:25:04 compute-0 sudo[55402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:04 compute-0 python3.9[55404]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:25:05 compute-0 NetworkManager[55227]: <info>  [1764397505.0722] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 29 06:25:05 compute-0 NetworkManager[55227]: <info>  [1764397505.0733] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 29 06:25:05 compute-0 NetworkManager[55227]: <info>  [1764397505.0735] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 29 06:25:05 compute-0 NetworkManager[55227]: <info>  [1764397505.0737] manager: NetworkManager state is now CONNECTED_LOCAL
Nov 29 06:25:05 compute-0 NetworkManager[55227]: <info>  [1764397505.0741] device (eth1): Activation: successful, device activated.
Nov 29 06:25:05 compute-0 NetworkManager[55227]: <info>  [1764397505.3522] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 29 06:25:05 compute-0 NetworkManager[55227]: <info>  [1764397505.3525] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 29 06:25:05 compute-0 NetworkManager[55227]: <info>  [1764397505.3529] manager: NetworkManager state is now CONNECTED_SITE
Nov 29 06:25:05 compute-0 NetworkManager[55227]: <info>  [1764397505.3533] device (eth0): Activation: successful, device activated.
Nov 29 06:25:05 compute-0 NetworkManager[55227]: <info>  [1764397505.3537] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 29 06:25:05 compute-0 NetworkManager[55227]: <info>  [1764397505.7921] manager: startup complete
Nov 29 06:25:05 compute-0 systemd[1]: Finished Network Manager Wait Online.
Nov 29 06:25:09 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 06:25:09 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 29 06:25:09 compute-0 systemd[1]: run-r894203bc449d44fd823a21483d7d6c12.service: Deactivated successfully.
Nov 29 06:25:12 compute-0 sshd-session[55451]: Invalid user kodi from 45.202.211.6 port 36702
Nov 29 06:25:12 compute-0 sshd-session[55451]: Received disconnect from 45.202.211.6 port 36702:11: Bye Bye [preauth]
Nov 29 06:25:12 compute-0 sshd-session[55451]: Disconnected from invalid user kodi 45.202.211.6 port 36702 [preauth]
Nov 29 06:25:15 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 29 06:25:16 compute-0 sshd-session[55455]: Received disconnect from 103.179.56.44 port 35752:11: Bye Bye [preauth]
Nov 29 06:25:16 compute-0 sshd-session[55455]: Disconnected from authenticating user root 103.179.56.44 port 35752 [preauth]
Nov 29 06:25:19 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 06:25:19 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 29 06:25:19 compute-0 systemd[1]: Reloading.
Nov 29 06:25:19 compute-0 systemd-rc-local-generator[55489]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:25:19 compute-0 systemd-sysv-generator[55494]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:25:19 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 06:25:21 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 06:25:21 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 29 06:25:21 compute-0 systemd[1]: run-rf47c0415163d43cdb56a556eea4d8c5d.service: Deactivated successfully.
Nov 29 06:25:21 compute-0 sudo[55402]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:24 compute-0 sudo[55897]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dluetkreuinllfhnylrvronsvyjlsqzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397524.1053348-506-195996455115103/AnsiballZ_stat.py'
Nov 29 06:25:24 compute-0 sudo[55897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:24 compute-0 python3.9[55899]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:25:24 compute-0 sudo[55897]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:25 compute-0 sudo[56049]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oijybucxmoeyiszbzsvfldepjolsaiqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397524.8519413-533-212549933550963/AnsiballZ_ini_file.py'
Nov 29 06:25:25 compute-0 sudo[56049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:25 compute-0 python3.9[56051]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:25:25 compute-0 sudo[56049]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:25 compute-0 sudo[56203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqxtxeldxzdaytnqregtwcjkhusixbff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397525.7253618-563-278071505751531/AnsiballZ_ini_file.py'
Nov 29 06:25:25 compute-0 sudo[56203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:26 compute-0 python3.9[56205]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:25:26 compute-0 sudo[56203]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:26 compute-0 sudo[56355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-maulyephohsakbajfciwuzwhipnnjujo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397526.372903-563-58578930772007/AnsiballZ_ini_file.py'
Nov 29 06:25:26 compute-0 sudo[56355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:26 compute-0 python3.9[56357]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:25:26 compute-0 sudo[56355]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:27 compute-0 sudo[56507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rokemlkstehvrvsfofvzacxtvsfwfomo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397527.0822737-608-52571784428638/AnsiballZ_ini_file.py'
Nov 29 06:25:27 compute-0 sudo[56507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:27 compute-0 python3.9[56509]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:25:27 compute-0 sudo[56507]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:27 compute-0 sudo[56659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fslgnkoehvnntvkotejksqmynvjzfhop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397527.6983447-608-77032384692400/AnsiballZ_ini_file.py'
Nov 29 06:25:27 compute-0 sudo[56659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:28 compute-0 python3.9[56661]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:25:28 compute-0 sudo[56659]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:28 compute-0 sudo[56811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcbclvxtqyxyejmpjysouxkcdjwoajba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397528.451513-653-244629867329777/AnsiballZ_stat.py'
Nov 29 06:25:28 compute-0 sudo[56811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:28 compute-0 python3.9[56813]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:25:28 compute-0 sudo[56811]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:29 compute-0 sudo[56934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frmmywqdmqjygbjkakhcnzgddeypbigm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397528.451513-653-244629867329777/AnsiballZ_copy.py'
Nov 29 06:25:29 compute-0 sudo[56934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:31 compute-0 python3.9[56936]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397528.451513-653-244629867329777/.source _original_basename=._pcuzl6j follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:25:32 compute-0 sudo[56934]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:32 compute-0 sudo[57086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdhlrcqleykjmqqfcjrujlfpoojsuhbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397532.172956-698-223168637966775/AnsiballZ_file.py'
Nov 29 06:25:32 compute-0 sudo[57086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:32 compute-0 python3.9[57088]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:25:32 compute-0 sudo[57086]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:34 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 29 06:25:34 compute-0 sudo[57238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmkkaympfzmpdgqvlbqmrmybnhsoyjeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397533.7804036-722-251719017812221/AnsiballZ_edpm_os_net_config_mappings.py'
Nov 29 06:25:34 compute-0 sudo[57238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:34 compute-0 python3.9[57242]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Nov 29 06:25:34 compute-0 sudo[57238]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:35 compute-0 sudo[57392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hywgzhubyflexnrihghsxrlwjjmvvuxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397535.0686061-749-14092873434349/AnsiballZ_file.py'
Nov 29 06:25:35 compute-0 sudo[57392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:35 compute-0 python3.9[57394]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:25:35 compute-0 sudo[57392]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:36 compute-0 sudo[57544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqavosnqhgtmqovyysyivbngtfhrejqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397536.1948397-779-106075875372452/AnsiballZ_stat.py'
Nov 29 06:25:36 compute-0 sudo[57544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:36 compute-0 sudo[57544]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:37 compute-0 sudo[57667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djpftvqgehksnanspglqqnbacneozlbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397536.1948397-779-106075875372452/AnsiballZ_copy.py'
Nov 29 06:25:37 compute-0 sudo[57667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:37 compute-0 sudo[57667]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:37 compute-0 sudo[57819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbcqlcaueknewdzdhmcjdzovtjikdlzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397537.5115387-824-153568178896247/AnsiballZ_slurp.py'
Nov 29 06:25:37 compute-0 sudo[57819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:38 compute-0 python3.9[57821]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Nov 29 06:25:38 compute-0 sudo[57819]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:39 compute-0 sudo[57996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-koukyzvqyfhhbpxsnxupitmxmfthozle ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397538.4620194-851-149773919149690/async_wrapper.py j139613719602 300 /home/zuul/.ansible/tmp/ansible-tmp-1764397538.4620194-851-149773919149690/AnsiballZ_edpm_os_net_config.py _'
Nov 29 06:25:39 compute-0 sudo[57996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:39 compute-0 ansible-async_wrapper.py[57998]: Invoked with j139613719602 300 /home/zuul/.ansible/tmp/ansible-tmp-1764397538.4620194-851-149773919149690/AnsiballZ_edpm_os_net_config.py _
Nov 29 06:25:39 compute-0 ansible-async_wrapper.py[58001]: Starting module and watcher
Nov 29 06:25:39 compute-0 ansible-async_wrapper.py[58001]: Start watching 58002 (300)
Nov 29 06:25:39 compute-0 ansible-async_wrapper.py[58002]: Start module (58002)
Nov 29 06:25:39 compute-0 ansible-async_wrapper.py[57998]: Return async_wrapper task started.
Nov 29 06:25:39 compute-0 sudo[57996]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:39 compute-0 python3.9[58003]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Nov 29 06:25:40 compute-0 sshd-session[57944]: Invalid user local from 160.202.8.218 port 33528
Nov 29 06:25:40 compute-0 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Nov 29 06:25:40 compute-0 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Nov 29 06:25:40 compute-0 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Nov 29 06:25:40 compute-0 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Nov 29 06:25:40 compute-0 kernel: cfg80211: failed to load regulatory.db
Nov 29 06:25:40 compute-0 sshd-session[57944]: Received disconnect from 160.202.8.218 port 33528:11: Bye Bye [preauth]
Nov 29 06:25:40 compute-0 sshd-session[57944]: Disconnected from invalid user local 160.202.8.218 port 33528 [preauth]
Nov 29 06:25:41 compute-0 NetworkManager[55227]: <info>  [1764397541.3033] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58004 uid=0 result="success"
Nov 29 06:25:41 compute-0 NetworkManager[55227]: <info>  [1764397541.3048] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58004 uid=0 result="success"
Nov 29 06:25:41 compute-0 NetworkManager[55227]: <info>  [1764397541.3540] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Nov 29 06:25:41 compute-0 NetworkManager[55227]: <info>  [1764397541.3541] audit: op="connection-add" uuid="297b5746-e850-4721-8594-9a630ec46dd1" name="br-ex-br" pid=58004 uid=0 result="success"
Nov 29 06:25:41 compute-0 NetworkManager[55227]: <info>  [1764397541.3558] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Nov 29 06:25:41 compute-0 NetworkManager[55227]: <info>  [1764397541.3559] audit: op="connection-add" uuid="5c3e53cf-bc40-4365-95c4-92fa2fdbca24" name="br-ex-port" pid=58004 uid=0 result="success"
Nov 29 06:25:41 compute-0 NetworkManager[55227]: <info>  [1764397541.3571] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Nov 29 06:25:41 compute-0 NetworkManager[55227]: <info>  [1764397541.3572] audit: op="connection-add" uuid="26f0940a-c2e4-4f10-82e4-d59e35227273" name="eth1-port" pid=58004 uid=0 result="success"
Nov 29 06:25:41 compute-0 NetworkManager[55227]: <info>  [1764397541.3583] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Nov 29 06:25:41 compute-0 NetworkManager[55227]: <info>  [1764397541.3586] audit: op="connection-add" uuid="cad2ca67-795d-4c44-9e3a-92294d74ba69" name="vlan20-port" pid=58004 uid=0 result="success"
Nov 29 06:25:41 compute-0 NetworkManager[55227]: <info>  [1764397541.3596] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Nov 29 06:25:41 compute-0 NetworkManager[55227]: <info>  [1764397541.3598] audit: op="connection-add" uuid="f38b7c2a-d0cf-405e-a0c3-b42d6e01ccfa" name="vlan21-port" pid=58004 uid=0 result="success"
Nov 29 06:25:41 compute-0 NetworkManager[55227]: <info>  [1764397541.3609] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Nov 29 06:25:41 compute-0 NetworkManager[55227]: <info>  [1764397541.3610] audit: op="connection-add" uuid="79c8fafd-e960-4796-a11e-a95a64021cc7" name="vlan22-port" pid=58004 uid=0 result="success"
Nov 29 06:25:41 compute-0 NetworkManager[55227]: <info>  [1764397541.3629] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="connection.autoconnect-priority,connection.timestamp,ipv6.addr-gen-mode,ipv6.dhcp-timeout,ipv6.method,802-3-ethernet.mtu,ipv4.dhcp-client-id,ipv4.dhcp-timeout" pid=58004 uid=0 result="success"
Nov 29 06:25:41 compute-0 NetworkManager[55227]: <info>  [1764397541.3643] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Nov 29 06:25:41 compute-0 NetworkManager[55227]: <info>  [1764397541.3644] audit: op="connection-add" uuid="da20c687-a137-491f-b85b-6a6b7632ad0f" name="br-ex-if" pid=58004 uid=0 result="success"
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0122] audit: op="connection-update" uuid="8207c9a3-524e-532c-bd71-3fc37e48ed01" name="ci-private-network" args="connection.controller,connection.master,connection.slave-type,connection.port-type,connection.timestamp,ipv6.routing-rules,ipv6.addr-gen-mode,ipv6.dns,ipv6.routes,ipv6.addresses,ipv6.method,ovs-interface.type,ipv4.routing-rules,ipv4.dns,ipv4.routes,ipv4.addresses,ipv4.method,ipv4.never-default,ovs-external-ids.data" pid=58004 uid=0 result="success"
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0149] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0151] audit: op="connection-add" uuid="afadd5e3-7a36-4bb4-8c90-6b47e9fc7fe3" name="vlan20-if" pid=58004 uid=0 result="success"
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0171] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0172] audit: op="connection-add" uuid="722953cb-1245-4be8-89cd-a92fafe58d4f" name="vlan21-if" pid=58004 uid=0 result="success"
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0192] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0193] audit: op="connection-add" uuid="f9649c95-2a18-4267-8b97-9eb406779728" name="vlan22-if" pid=58004 uid=0 result="success"
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0206] audit: op="connection-delete" uuid="00b5a4bd-aa7d-3265-84d1-52d370bbdb29" name="Wired connection 1" pid=58004 uid=0 result="success"
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0221] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0232] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0236] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (297b5746-e850-4721-8594-9a630ec46dd1)
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0237] audit: op="connection-activate" uuid="297b5746-e850-4721-8594-9a630ec46dd1" name="br-ex-br" pid=58004 uid=0 result="success"
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0239] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0245] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0249] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (5c3e53cf-bc40-4365-95c4-92fa2fdbca24)
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0250] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0256] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0261] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (26f0940a-c2e4-4f10-82e4-d59e35227273)
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0262] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0269] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0273] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (cad2ca67-795d-4c44-9e3a-92294d74ba69)
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0275] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0283] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0287] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (f38b7c2a-d0cf-405e-a0c3-b42d6e01ccfa)
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0290] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0297] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0302] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (79c8fafd-e960-4796-a11e-a95a64021cc7)
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0302] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0305] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0307] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0314] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0320] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0324] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (da20c687-a137-491f-b85b-6a6b7632ad0f)
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0325] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0329] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0332] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0334] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0336] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0347] device (eth1): disconnecting for new activation request.
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0348] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0352] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0354] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0355] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0358] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0363] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0369] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (afadd5e3-7a36-4bb4-8c90-6b47e9fc7fe3)
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0370] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0375] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0378] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0380] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0383] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0389] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0395] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (722953cb-1245-4be8-89cd-a92fafe58d4f)
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0396] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0400] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0402] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0404] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0409] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0414] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0420] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (f9649c95-2a18-4267-8b97-9eb406779728)
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0421] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0426] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0428] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0431] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0433] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0449] audit: op="device-reapply" interface="eth0" ifindex=2 args="connection.autoconnect-priority,ipv6.addr-gen-mode,ipv6.method,802-3-ethernet.mtu,ipv4.dhcp-client-id,ipv4.dhcp-timeout" pid=58004 uid=0 result="success"
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0452] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0457] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0460] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0469] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 06:25:42 compute-0 kernel: ovs-system: entered promiscuous mode
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0474] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0501] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0505] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0507] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0512] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 06:25:42 compute-0 kernel: Timeout policy base is empty
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0516] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0520] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 06:25:42 compute-0 systemd-udevd[58009]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0522] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0528] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0532] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0536] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0538] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 06:25:42 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0543] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0547] dhcp4 (eth0): canceled DHCP transaction
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0547] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0547] dhcp4 (eth0): state changed no lease
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0549] dhcp4 (eth0): activation: beginning transaction (no timeout)
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0558] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0561] audit: op="device-reapply" interface="eth1" ifindex=3 pid=58004 uid=0 result="fail" reason="Device is not activated"
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.0569] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Nov 29 06:25:42 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 29 06:25:42 compute-0 kernel: br-ex: entered promiscuous mode
Nov 29 06:25:42 compute-0 kernel: vlan21: entered promiscuous mode
Nov 29 06:25:42 compute-0 systemd-udevd[58008]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 06:25:42 compute-0 kernel: vlan20: entered promiscuous mode
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.3543] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.3547] dhcp4 (eth0): state changed new lease, address=38.102.83.110
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.3562] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.3573] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.3580] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Nov 29 06:25:42 compute-0 NetworkManager[55227]: <info>  [1764397542.3585] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Nov 29 06:25:42 compute-0 kernel: vlan22: entered promiscuous mode
Nov 29 06:25:42 compute-0 sudo[58245]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vticqswlrvwwxpucthbrtzopbosasacu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397542.5956132-851-137257731022701/AnsiballZ_async_status.py'
Nov 29 06:25:42 compute-0 sudo[58245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.0151] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.0384] device (eth1): Activation: starting connection 'ci-private-network' (8207c9a3-524e-532c-bd71-3fc37e48ed01)
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.0392] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.0395] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.0396] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.0398] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.0400] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.0402] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.0404] device (eth1): state change: disconnected -> deactivating (reason 'new-activation', managed-type: 'full')
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.0414] device (eth1): disconnecting for new activation request.
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.0415] audit: op="connection-activate" uuid="8207c9a3-524e-532c-bd71-3fc37e48ed01" name="ci-private-network" pid=58004 uid=0 result="success"
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.0432] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.0436] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.0444] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.0451] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.0456] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.0460] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.0463] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.0467] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.0474] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.0477] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.0482] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.0485] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.0489] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.0512] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.0518] device (eth1): Activation: starting connection 'ci-private-network' (8207c9a3-524e-532c-bd71-3fc37e48ed01)
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.0522] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58004 uid=0 result="success"
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.0524] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.0543] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.0546] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.0553] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.0558] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.0567] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.0573] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.0580] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.0582] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 06:25:43 compute-0 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Nov 29 06:25:43 compute-0 python3.9[58247]: ansible-ansible.legacy.async_status Invoked with jid=j139613719602.57998 mode=status _async_dir=/root/.ansible_async
Nov 29 06:25:43 compute-0 sudo[58245]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.4560] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.4563] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.4564] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.4565] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.4567] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.4577] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.4584] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.4589] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.4594] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.4600] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.4605] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.4610] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.4616] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.4620] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 06:25:43 compute-0 NetworkManager[55227]: <info>  [1764397543.4624] device (eth1): Activation: successful, device activated.
Nov 29 06:25:44 compute-0 ansible-async_wrapper.py[58001]: 58002 still running (300)
Nov 29 06:25:44 compute-0 NetworkManager[55227]: <info>  [1764397544.7254] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58004 uid=0 result="success"
Nov 29 06:25:44 compute-0 NetworkManager[55227]: <info>  [1764397544.8624] checkpoint[0x56416470a950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Nov 29 06:25:44 compute-0 NetworkManager[55227]: <info>  [1764397544.8625] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58004 uid=0 result="success"
Nov 29 06:25:45 compute-0 NetworkManager[55227]: <info>  [1764397545.1507] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58004 uid=0 result="success"
Nov 29 06:25:45 compute-0 NetworkManager[55227]: <info>  [1764397545.1521] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58004 uid=0 result="success"
Nov 29 06:25:45 compute-0 NetworkManager[55227]: <info>  [1764397545.8211] audit: op="networking-control" arg="global-dns-configuration" pid=58004 uid=0 result="success"
Nov 29 06:25:46 compute-0 sudo[58441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mngbvtrernwjsecfmcrivmiccleytbno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397542.5956132-851-137257731022701/AnsiballZ_async_status.py'
Nov 29 06:25:46 compute-0 sudo[58441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:46 compute-0 NetworkManager[55227]: <info>  [1764397546.4111] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Nov 29 06:25:46 compute-0 python3.9[58443]: ansible-ansible.legacy.async_status Invoked with jid=j139613719602.57998 mode=status _async_dir=/root/.ansible_async
Nov 29 06:25:46 compute-0 sudo[58441]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:47 compute-0 NetworkManager[55227]: <info>  [1764397547.8857] audit: op="networking-control" arg="global-dns-configuration" pid=58004 uid=0 result="success"
Nov 29 06:25:47 compute-0 NetworkManager[55227]: <info>  [1764397547.8888] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58004 uid=0 result="success"
Nov 29 06:25:47 compute-0 sshd-session[58445]: Invalid user teamspeak from 1.214.197.163 port 49122
Nov 29 06:25:48 compute-0 NetworkManager[55227]: <info>  [1764397548.0351] checkpoint[0x56416470aa20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Nov 29 06:25:48 compute-0 NetworkManager[55227]: <info>  [1764397548.0355] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58004 uid=0 result="success"
Nov 29 06:25:48 compute-0 ansible-async_wrapper.py[58002]: Module complete (58002)
Nov 29 06:25:48 compute-0 sshd-session[58445]: Received disconnect from 1.214.197.163 port 49122:11: Bye Bye [preauth]
Nov 29 06:25:48 compute-0 sshd-session[58445]: Disconnected from invalid user teamspeak 1.214.197.163 port 49122 [preauth]
Nov 29 06:25:49 compute-0 ansible-async_wrapper.py[58001]: Done in kid B.
Nov 29 06:25:49 compute-0 sudo[58547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hoypqthfikdyskbbxgagzypczvgshxze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397542.5956132-851-137257731022701/AnsiballZ_async_status.py'
Nov 29 06:25:49 compute-0 sudo[58547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:49 compute-0 python3.9[58549]: ansible-ansible.legacy.async_status Invoked with jid=j139613719602.57998 mode=status _async_dir=/root/.ansible_async
Nov 29 06:25:50 compute-0 sudo[58547]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:50 compute-0 sudo[58647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhziyooivpdrncxttzzhpafuegrqigul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397542.5956132-851-137257731022701/AnsiballZ_async_status.py'
Nov 29 06:25:50 compute-0 sudo[58647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:50 compute-0 python3.9[58649]: ansible-ansible.legacy.async_status Invoked with jid=j139613719602.57998 mode=cleanup _async_dir=/root/.ansible_async
Nov 29 06:25:50 compute-0 sudo[58647]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:51 compute-0 sudo[58799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rttavekshnjbrvjsqvbqnahbparbaeuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397551.297974-942-270106658820074/AnsiballZ_stat.py'
Nov 29 06:25:51 compute-0 sudo[58799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:51 compute-0 python3.9[58801]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:25:51 compute-0 sudo[58799]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:52 compute-0 sudo[58922]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csmnmzzivgsqedrdizccbfblyklaaqvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397551.297974-942-270106658820074/AnsiballZ_copy.py'
Nov 29 06:25:52 compute-0 sudo[58922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:53 compute-0 python3.9[58924]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397551.297974-942-270106658820074/.source.returncode _original_basename=.61w_xb_2 follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:25:53 compute-0 sudo[58922]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:53 compute-0 sudo[59075]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqicpsnioqxvocpyshuczrenghqiqxut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397553.3657174-990-237894140489834/AnsiballZ_stat.py'
Nov 29 06:25:53 compute-0 sudo[59075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:53 compute-0 python3.9[59077]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:25:53 compute-0 sudo[59075]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:54 compute-0 sudo[59198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xuspobepodhwssunipvkjogknwyvfigd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397553.3657174-990-237894140489834/AnsiballZ_copy.py'
Nov 29 06:25:54 compute-0 sudo[59198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:54 compute-0 python3.9[59200]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397553.3657174-990-237894140489834/.source.cfg _original_basename=.6zarqer2 follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:25:54 compute-0 sudo[59198]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:54 compute-0 sudo[59350]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocjunvbubgnktkotgiwczcrsapnzsswt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397554.7311254-1035-119222003289258/AnsiballZ_systemd.py'
Nov 29 06:25:54 compute-0 sudo[59350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:25:55 compute-0 python3.9[59352]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:25:55 compute-0 systemd[1]: Reloading Network Manager...
Nov 29 06:25:55 compute-0 NetworkManager[55227]: <info>  [1764397555.3388] audit: op="reload" arg="0" pid=59356 uid=0 result="success"
Nov 29 06:25:55 compute-0 NetworkManager[55227]: <info>  [1764397555.3396] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Nov 29 06:25:55 compute-0 systemd[1]: Reloaded Network Manager.
Nov 29 06:25:55 compute-0 sudo[59350]: pam_unix(sudo:session): session closed for user root
Nov 29 06:25:56 compute-0 sshd-session[51212]: Connection closed by 192.168.122.30 port 50122
Nov 29 06:25:56 compute-0 sshd-session[51209]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:25:56 compute-0 systemd-logind[788]: Session 11 logged out. Waiting for processes to exit.
Nov 29 06:25:56 compute-0 systemd[1]: session-11.scope: Deactivated successfully.
Nov 29 06:25:56 compute-0 systemd[1]: session-11.scope: Consumed 51.270s CPU time.
Nov 29 06:25:56 compute-0 systemd-logind[788]: Removed session 11.
Nov 29 06:26:01 compute-0 sshd-session[59387]: Accepted publickey for zuul from 192.168.122.30 port 55586 ssh2: ECDSA SHA256:Ey+6YDlYfkE2dRe/gjWhHvvrJHee4xZo2Q1JojEuVBA
Nov 29 06:26:01 compute-0 systemd-logind[788]: New session 12 of user zuul.
Nov 29 06:26:01 compute-0 systemd[1]: Started Session 12 of User zuul.
Nov 29 06:26:01 compute-0 sshd-session[59387]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:26:02 compute-0 python3.9[59540]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:26:03 compute-0 python3.9[59695]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:26:05 compute-0 python3.9[59884]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:26:05 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 29 06:26:05 compute-0 sshd-session[59390]: Connection closed by 192.168.122.30 port 55586
Nov 29 06:26:05 compute-0 sshd-session[59387]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:26:05 compute-0 systemd[1]: session-12.scope: Deactivated successfully.
Nov 29 06:26:05 compute-0 systemd[1]: session-12.scope: Consumed 2.452s CPU time.
Nov 29 06:26:05 compute-0 systemd-logind[788]: Session 12 logged out. Waiting for processes to exit.
Nov 29 06:26:05 compute-0 systemd-logind[788]: Removed session 12.
Nov 29 06:26:07 compute-0 sshd-session[59913]: Received disconnect from 179.125.24.202 port 50844:11: Bye Bye [preauth]
Nov 29 06:26:07 compute-0 sshd-session[59913]: Disconnected from authenticating user root 179.125.24.202 port 50844 [preauth]
Nov 29 06:26:14 compute-0 sshd-session[59917]: Accepted publickey for zuul from 192.168.122.30 port 41744 ssh2: ECDSA SHA256:Ey+6YDlYfkE2dRe/gjWhHvvrJHee4xZo2Q1JojEuVBA
Nov 29 06:26:14 compute-0 systemd-logind[788]: New session 13 of user zuul.
Nov 29 06:26:14 compute-0 systemd[1]: Started Session 13 of User zuul.
Nov 29 06:26:14 compute-0 sshd-session[59917]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:26:15 compute-0 python3.9[60070]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:26:16 compute-0 python3.9[60224]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:26:17 compute-0 sudo[60378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aonhumblholpeuhdgpsweqnxwocyqxbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397577.2021527-85-264703108456468/AnsiballZ_setup.py'
Nov 29 06:26:17 compute-0 sudo[60378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:18 compute-0 python3.9[60380]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:26:18 compute-0 sudo[60378]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:18 compute-0 sudo[60462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftyiyjxtevzmvwhvcofglxxpinuscldk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397577.2021527-85-264703108456468/AnsiballZ_dnf.py'
Nov 29 06:26:18 compute-0 sudo[60462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:19 compute-0 python3.9[60464]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:26:20 compute-0 sshd-session[60466]: Invalid user uftp from 45.202.211.6 port 37984
Nov 29 06:26:20 compute-0 sshd-session[60466]: Received disconnect from 45.202.211.6 port 37984:11: Bye Bye [preauth]
Nov 29 06:26:20 compute-0 sshd-session[60466]: Disconnected from invalid user uftp 45.202.211.6 port 37984 [preauth]
Nov 29 06:26:21 compute-0 sudo[60462]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:22 compute-0 sudo[60617]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btoryblayvhfmwzsvpckrvcdtpjznwbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397582.1635728-121-73508845609039/AnsiballZ_setup.py'
Nov 29 06:26:22 compute-0 sudo[60617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:22 compute-0 python3.9[60619]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:26:23 compute-0 sudo[60617]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:24 compute-0 sudo[60808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfrxpbjmlhorgpjwagnebosjexrfdrdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397583.5414667-154-115642085761650/AnsiballZ_file.py'
Nov 29 06:26:24 compute-0 sudo[60808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:24 compute-0 python3.9[60810]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:26:24 compute-0 sudo[60808]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:24 compute-0 sudo[60960]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmudgwowhdzqzuylddratjgmrvgrczmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397584.4139683-178-274967236335686/AnsiballZ_command.py'
Nov 29 06:26:24 compute-0 sudo[60960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:25 compute-0 python3.9[60962]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:26:25 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:26:25 compute-0 sudo[60960]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:25 compute-0 sudo[61122]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ksqauqnnsxztvrpzuxskayadfmedfilq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397585.4241223-202-29453915712981/AnsiballZ_stat.py'
Nov 29 06:26:25 compute-0 sudo[61122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:26 compute-0 python3.9[61124]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:26:26 compute-0 sudo[61122]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:26 compute-0 sudo[61200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iiignslhgfkewpifqjlnnlovgzbprsgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397585.4241223-202-29453915712981/AnsiballZ_file.py'
Nov 29 06:26:26 compute-0 sudo[61200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:26 compute-0 python3.9[61202]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:26:26 compute-0 sudo[61200]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:27 compute-0 sudo[61352]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vornhaxmolcgzrkoxldievzjqtkuaomi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397586.9117944-238-258784467839681/AnsiballZ_stat.py'
Nov 29 06:26:27 compute-0 sudo[61352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:27 compute-0 python3.9[61354]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:26:27 compute-0 sudo[61352]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:27 compute-0 sudo[61430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbzhebikxrnrdetihnwiuzdygaqksnnl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397586.9117944-238-258784467839681/AnsiballZ_file.py'
Nov 29 06:26:27 compute-0 sudo[61430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:28 compute-0 python3.9[61432]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:26:28 compute-0 sudo[61430]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:29 compute-0 sudo[61582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dllsthhgccrukyskgopdycjiyewiioyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397588.5852077-277-104180766274828/AnsiballZ_ini_file.py'
Nov 29 06:26:29 compute-0 sudo[61582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:29 compute-0 python3.9[61584]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:26:29 compute-0 sudo[61582]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:29 compute-0 sudo[61734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmggnexojratrgtylwifbwymwvszksgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397589.420518-277-82812547806581/AnsiballZ_ini_file.py'
Nov 29 06:26:29 compute-0 sudo[61734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:29 compute-0 python3.9[61736]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:26:29 compute-0 sudo[61734]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:30 compute-0 sudo[61886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrsccuwzdxuqvbvfnzhczlbylxpdfktq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397590.0539763-277-223348147809313/AnsiballZ_ini_file.py'
Nov 29 06:26:30 compute-0 sudo[61886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:30 compute-0 python3.9[61888]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:26:30 compute-0 sudo[61886]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:30 compute-0 sudo[62038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgrklbsivfgogpbxyntlqpktzddrqekn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397590.675188-277-8510532913858/AnsiballZ_ini_file.py'
Nov 29 06:26:30 compute-0 sudo[62038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:31 compute-0 python3.9[62040]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:26:31 compute-0 sudo[62038]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:31 compute-0 sudo[62190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzfmwdmurmceuwdjrrirvkegfvazlrfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397591.6115863-370-188067858486661/AnsiballZ_dnf.py'
Nov 29 06:26:31 compute-0 sudo[62190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:32 compute-0 python3.9[62192]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:26:34 compute-0 sudo[62190]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:37 compute-0 sudo[62343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvolskeigdnojbwkggajpetcaowpstcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397596.8570964-403-277845172689903/AnsiballZ_setup.py'
Nov 29 06:26:37 compute-0 sudo[62343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:37 compute-0 python3.9[62345]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:26:37 compute-0 sudo[62343]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:37 compute-0 sudo[62497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcwwsbbbugynrhpidwxqgbtrntwfmqjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397597.720168-427-133535033081613/AnsiballZ_stat.py'
Nov 29 06:26:37 compute-0 sudo[62497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:38 compute-0 python3.9[62499]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:26:38 compute-0 sudo[62497]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:38 compute-0 sudo[62649]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uyvxuxlfglhxkqwirbxdvurenmasofgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397598.4808116-454-5925165505548/AnsiballZ_stat.py'
Nov 29 06:26:38 compute-0 sudo[62649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:38 compute-0 python3.9[62651]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:26:38 compute-0 sudo[62649]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:40 compute-0 sudo[62801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycshqjhkugvfgxtriaviuaxvcdovbnjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397599.7440546-484-254476960503716/AnsiballZ_command.py'
Nov 29 06:26:40 compute-0 sudo[62801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:40 compute-0 python3.9[62803]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:26:40 compute-0 sudo[62801]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:41 compute-0 sudo[62956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvkuzrqpmkhxchwkvaredkrdlzzphkca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397600.6139061-514-107022117495491/AnsiballZ_service_facts.py'
Nov 29 06:26:41 compute-0 sudo[62956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:41 compute-0 python3.9[62958]: ansible-service_facts Invoked
Nov 29 06:26:41 compute-0 network[62975]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 06:26:41 compute-0 network[62976]: 'network-scripts' will be removed from distribution in near future.
Nov 29 06:26:41 compute-0 network[62977]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 06:26:42 compute-0 sshd-session[62836]: Received disconnect from 36.50.176.16 port 35626:11: Bye Bye [preauth]
Nov 29 06:26:42 compute-0 sshd-session[62836]: Disconnected from authenticating user root 36.50.176.16 port 35626 [preauth]
Nov 29 06:26:45 compute-0 sudo[62956]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:46 compute-0 sudo[63260]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oehidsajqowlniblkhthzcyhiunqzsav ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1764397606.031595-559-156662829499009/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1764397606.031595-559-156662829499009/args'
Nov 29 06:26:46 compute-0 sudo[63260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:46 compute-0 sudo[63260]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:47 compute-0 sudo[63427]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwkrtocerkkndkgehcrylekfeoptbxjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397606.824763-592-107953539002083/AnsiballZ_dnf.py'
Nov 29 06:26:47 compute-0 sudo[63427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:47 compute-0 python3.9[63429]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:26:49 compute-0 sudo[63427]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:53 compute-0 sudo[63582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cioyhknwhrpprldzlmymnaexsyvlnlbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397612.8777378-631-103983929804721/AnsiballZ_package_facts.py'
Nov 29 06:26:53 compute-0 sudo[63582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:53 compute-0 python3.9[63584]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Nov 29 06:26:54 compute-0 sudo[63582]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:55 compute-0 sudo[63734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpkccunwgauxwbavtxvyadosqczlhfsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397614.7955322-661-250274920283636/AnsiballZ_stat.py'
Nov 29 06:26:55 compute-0 sudo[63734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:55 compute-0 python3.9[63736]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:26:55 compute-0 sudo[63734]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:56 compute-0 sudo[63859]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmwhgztopskdwfkmlnxnpxeajatufqhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397614.7955322-661-250274920283636/AnsiballZ_copy.py'
Nov 29 06:26:56 compute-0 sudo[63859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:56 compute-0 python3.9[63861]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397614.7955322-661-250274920283636/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:26:56 compute-0 sudo[63859]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:56 compute-0 sudo[64013]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywvhbkeswewjxrxylkpoerwpdufcbztc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397616.6129692-706-78457273237521/AnsiballZ_stat.py'
Nov 29 06:26:56 compute-0 sudo[64013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:57 compute-0 python3.9[64015]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:26:57 compute-0 sudo[64013]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:57 compute-0 sudo[64138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwwkxmvtzuapqqzcjftjzmovslujzqlc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397616.6129692-706-78457273237521/AnsiballZ_copy.py'
Nov 29 06:26:57 compute-0 sudo[64138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:57 compute-0 python3.9[64140]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397616.6129692-706-78457273237521/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:26:57 compute-0 sudo[64138]: pam_unix(sudo:session): session closed for user root
Nov 29 06:26:59 compute-0 sshd-session[63455]: Received disconnect from 45.78.219.251 port 41340:11: Bye Bye [preauth]
Nov 29 06:26:59 compute-0 sshd-session[63455]: Disconnected from authenticating user root 45.78.219.251 port 41340 [preauth]
Nov 29 06:26:59 compute-0 sudo[64292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwrboqovejdnsmgrvemsneuafddubzrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397619.1191068-769-174940323573987/AnsiballZ_lineinfile.py'
Nov 29 06:26:59 compute-0 sudo[64292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:26:59 compute-0 python3.9[64294]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:26:59 compute-0 sudo[64292]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:01 compute-0 sudo[64446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvsjaknqsxzhzzdzwphwrqrpeoxxsjvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397620.7467518-814-103327635759759/AnsiballZ_setup.py'
Nov 29 06:27:01 compute-0 sudo[64446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:01 compute-0 python3.9[64448]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:27:01 compute-0 sudo[64446]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:02 compute-0 sudo[64530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhuwvykdbkvhjgwchsyriipxaesjiypy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397620.7467518-814-103327635759759/AnsiballZ_systemd.py'
Nov 29 06:27:02 compute-0 sudo[64530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:02 compute-0 python3.9[64532]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:27:02 compute-0 sudo[64530]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:03 compute-0 sudo[64684]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpvvedauigfnarsqjrukqgtvlcdrgcfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397623.28673-862-188960709499387/AnsiballZ_setup.py'
Nov 29 06:27:03 compute-0 sudo[64684]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:03 compute-0 python3.9[64686]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:27:04 compute-0 sudo[64684]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:04 compute-0 sudo[64768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyiduhxrndrkwvyizsixjafurbftiptk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397623.28673-862-188960709499387/AnsiballZ_systemd.py'
Nov 29 06:27:04 compute-0 sudo[64768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:06 compute-0 python3.9[64770]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:27:06 compute-0 chronyd[793]: chronyd exiting
Nov 29 06:27:06 compute-0 systemd[1]: Stopping NTP client/server...
Nov 29 06:27:06 compute-0 systemd[1]: chronyd.service: Deactivated successfully.
Nov 29 06:27:06 compute-0 systemd[1]: Stopped NTP client/server.
Nov 29 06:27:06 compute-0 systemd[1]: Starting NTP client/server...
Nov 29 06:27:06 compute-0 chronyd[64778]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Nov 29 06:27:06 compute-0 chronyd[64778]: Frequency -28.341 +/- 0.154 ppm read from /var/lib/chrony/drift
Nov 29 06:27:06 compute-0 chronyd[64778]: Loaded seccomp filter (level 2)
Nov 29 06:27:06 compute-0 systemd[1]: Started NTP client/server.
Nov 29 06:27:06 compute-0 sudo[64768]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:06 compute-0 sshd-session[59920]: Connection closed by 192.168.122.30 port 41744
Nov 29 06:27:06 compute-0 sshd-session[59917]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:27:06 compute-0 systemd[1]: session-13.scope: Deactivated successfully.
Nov 29 06:27:06 compute-0 systemd[1]: session-13.scope: Consumed 25.391s CPU time.
Nov 29 06:27:06 compute-0 systemd-logind[788]: Session 13 logged out. Waiting for processes to exit.
Nov 29 06:27:06 compute-0 systemd-logind[788]: Removed session 13.
Nov 29 06:27:12 compute-0 sshd-session[64806]: Accepted publickey for zuul from 192.168.122.30 port 47832 ssh2: ECDSA SHA256:Ey+6YDlYfkE2dRe/gjWhHvvrJHee4xZo2Q1JojEuVBA
Nov 29 06:27:12 compute-0 systemd-logind[788]: New session 14 of user zuul.
Nov 29 06:27:12 compute-0 systemd[1]: Started Session 14 of User zuul.
Nov 29 06:27:12 compute-0 sshd-session[64806]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:27:13 compute-0 python3.9[64963]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:27:13 compute-0 sshd-session[64804]: Invalid user ftpadmin from 103.179.56.44 port 51456
Nov 29 06:27:13 compute-0 sshd-session[64804]: Received disconnect from 103.179.56.44 port 51456:11: Bye Bye [preauth]
Nov 29 06:27:13 compute-0 sshd-session[64804]: Disconnected from invalid user ftpadmin 103.179.56.44 port 51456 [preauth]
Nov 29 06:27:13 compute-0 sshd-session[64883]: Invalid user centos from 160.202.8.218 port 55266
Nov 29 06:27:14 compute-0 sshd-session[64883]: Received disconnect from 160.202.8.218 port 55266:11: Bye Bye [preauth]
Nov 29 06:27:14 compute-0 sshd-session[64883]: Disconnected from invalid user centos 160.202.8.218 port 55266 [preauth]
Nov 29 06:27:14 compute-0 sudo[65117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bloomovkowrqpjtypnhdmsfizpuelbrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397633.6429622-64-71755812559847/AnsiballZ_file.py'
Nov 29 06:27:14 compute-0 sudo[65117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:14 compute-0 sshd-session[64911]: Received disconnect from 1.214.197.163 port 50524:11: Bye Bye [preauth]
Nov 29 06:27:14 compute-0 sshd-session[64911]: Disconnected from authenticating user root 1.214.197.163 port 50524 [preauth]
Nov 29 06:27:14 compute-0 python3.9[65119]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:27:14 compute-0 sudo[65117]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:14 compute-0 sudo[65292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nytnixtlrgxhpsmouwiyyucwojdieldf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397634.5164855-88-129022872637023/AnsiballZ_stat.py'
Nov 29 06:27:14 compute-0 sudo[65292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:15 compute-0 python3.9[65294]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:27:15 compute-0 sudo[65292]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:15 compute-0 sudo[65370]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipirwvmluvdvlzqxgarofkthmzerkxfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397634.5164855-88-129022872637023/AnsiballZ_file.py'
Nov 29 06:27:15 compute-0 sudo[65370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:15 compute-0 python3.9[65372]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.93u1ar_4 recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:27:15 compute-0 sudo[65370]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:16 compute-0 sudo[65522]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zuemohnoqgkohkuqnejsrhfezcogfmby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397636.3559942-148-267472914586825/AnsiballZ_stat.py'
Nov 29 06:27:16 compute-0 sudo[65522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:16 compute-0 python3.9[65524]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:27:16 compute-0 sudo[65522]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:17 compute-0 sudo[65645]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nymnsetpkjwvippzjrvhpgxmdouuhwwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397636.3559942-148-267472914586825/AnsiballZ_copy.py'
Nov 29 06:27:17 compute-0 sudo[65645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:17 compute-0 python3.9[65647]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397636.3559942-148-267472914586825/.source _original_basename=.rol90uln follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:27:17 compute-0 sudo[65645]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:18 compute-0 sudo[65797]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-femrumshkqzdfttzwrayuaotfhvmbeus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397638.1733153-196-2140078312006/AnsiballZ_file.py'
Nov 29 06:27:18 compute-0 sudo[65797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:18 compute-0 python3.9[65799]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:27:18 compute-0 sudo[65797]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:19 compute-0 sudo[65949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mavmqxualjdnpsyxinzyxqqertxreidg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397638.9321856-220-225187424762000/AnsiballZ_stat.py'
Nov 29 06:27:19 compute-0 sudo[65949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:19 compute-0 python3.9[65951]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:27:19 compute-0 sudo[65949]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:20 compute-0 sudo[66072]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jilqetczokilgrlkxwjytfcooblrdtzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397638.9321856-220-225187424762000/AnsiballZ_copy.py'
Nov 29 06:27:20 compute-0 sudo[66072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:20 compute-0 python3.9[66074]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397638.9321856-220-225187424762000/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:27:20 compute-0 sudo[66072]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:20 compute-0 sudo[66224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnjysvjyrthvsmxqsiwakbrhllxonqfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397640.4391644-220-175718587260442/AnsiballZ_stat.py'
Nov 29 06:27:20 compute-0 sudo[66224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:20 compute-0 python3.9[66226]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:27:20 compute-0 sudo[66224]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:21 compute-0 sudo[66347]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hazfvvafayfjambnmdoyrloqdcvgbgke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397640.4391644-220-175718587260442/AnsiballZ_copy.py'
Nov 29 06:27:21 compute-0 sudo[66347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:21 compute-0 python3.9[66349]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397640.4391644-220-175718587260442/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:27:21 compute-0 sudo[66347]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:22 compute-0 sudo[66499]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvpbgrbktfvfiqqjqxegajuxwrperkte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397642.2167275-307-101656276695285/AnsiballZ_file.py'
Nov 29 06:27:22 compute-0 sudo[66499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:22 compute-0 python3.9[66501]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:27:22 compute-0 sudo[66499]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:23 compute-0 sudo[66651]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftbgfwazlshknirgpodpqpedqlkxkkev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397642.962883-331-265465761367830/AnsiballZ_stat.py'
Nov 29 06:27:23 compute-0 sudo[66651]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:23 compute-0 python3.9[66653]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:27:23 compute-0 sudo[66651]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:23 compute-0 sudo[66774]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzptbwskwdowhfsofryxarodkyecbbwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397642.962883-331-265465761367830/AnsiballZ_copy.py'
Nov 29 06:27:23 compute-0 sudo[66774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:23 compute-0 python3.9[66776]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397642.962883-331-265465761367830/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:27:23 compute-0 sudo[66774]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:24 compute-0 sudo[66926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmtrozxszbgfgxdwqrmrwgpesnekcipl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397644.4671836-376-120214006910041/AnsiballZ_stat.py'
Nov 29 06:27:24 compute-0 sudo[66926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:24 compute-0 python3.9[66928]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:27:24 compute-0 sudo[66926]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:25 compute-0 sudo[67049]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwccktfhdnqdzavoidxpfdrqokggwtrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397644.4671836-376-120214006910041/AnsiballZ_copy.py'
Nov 29 06:27:25 compute-0 sudo[67049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:25 compute-0 python3.9[67051]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397644.4671836-376-120214006910041/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:27:25 compute-0 sudo[67049]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:26 compute-0 sudo[67201]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imiuquaynxqidsorkctjbqzwquduhjzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397645.8287506-421-213345740208815/AnsiballZ_systemd.py'
Nov 29 06:27:26 compute-0 sudo[67201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:26 compute-0 python3.9[67203]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:27:26 compute-0 systemd[1]: Reloading.
Nov 29 06:27:26 compute-0 systemd-rc-local-generator[67229]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:27:26 compute-0 systemd-sysv-generator[67232]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:27:27 compute-0 systemd[1]: Reloading.
Nov 29 06:27:27 compute-0 systemd-rc-local-generator[67269]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:27:27 compute-0 systemd-sysv-generator[67272]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:27:27 compute-0 systemd[1]: Starting EDPM Container Shutdown...
Nov 29 06:27:27 compute-0 systemd[1]: Finished EDPM Container Shutdown.
Nov 29 06:27:27 compute-0 sudo[67201]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:27 compute-0 sudo[67429]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lcltowyittviaffuineqbdghybhcgspv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397647.6281946-445-133929059686599/AnsiballZ_stat.py'
Nov 29 06:27:27 compute-0 sudo[67429]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:28 compute-0 python3.9[67431]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:27:28 compute-0 sudo[67429]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:28 compute-0 sudo[67552]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pakdhutkxvbgecdppgwtjroxusqmjshq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397647.6281946-445-133929059686599/AnsiballZ_copy.py'
Nov 29 06:27:28 compute-0 sudo[67552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:28 compute-0 python3.9[67554]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397647.6281946-445-133929059686599/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:27:28 compute-0 sudo[67552]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:29 compute-0 sudo[67704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umqyyyzamxbbpvovuvzfzyeyznapkyde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397648.9530876-490-275977022788111/AnsiballZ_stat.py'
Nov 29 06:27:29 compute-0 sudo[67704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:29 compute-0 python3.9[67706]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:27:29 compute-0 sudo[67704]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:29 compute-0 sudo[67829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhuzfcoadcdsgyetynqrpuswesxhjfis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397648.9530876-490-275977022788111/AnsiballZ_copy.py'
Nov 29 06:27:29 compute-0 sudo[67829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:29 compute-0 python3.9[67831]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397648.9530876-490-275977022788111/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:27:30 compute-0 sudo[67829]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:30 compute-0 sshd-session[67707]: Invalid user zookeeper from 45.202.211.6 port 48554
Nov 29 06:27:30 compute-0 sudo[67981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwhauryoogkobhjcqdbskvbyadqsqjfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397650.3150034-535-210691451266394/AnsiballZ_systemd.py'
Nov 29 06:27:30 compute-0 sudo[67981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:30 compute-0 sshd-session[67707]: Received disconnect from 45.202.211.6 port 48554:11: Bye Bye [preauth]
Nov 29 06:27:30 compute-0 sshd-session[67707]: Disconnected from invalid user zookeeper 45.202.211.6 port 48554 [preauth]
Nov 29 06:27:30 compute-0 python3.9[67983]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:27:30 compute-0 systemd[1]: Reloading.
Nov 29 06:27:30 compute-0 systemd-rc-local-generator[68009]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:27:30 compute-0 systemd-sysv-generator[68012]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:27:31 compute-0 systemd[1]: Reloading.
Nov 29 06:27:31 compute-0 systemd-sysv-generator[68052]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:27:31 compute-0 systemd-rc-local-generator[68049]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:27:31 compute-0 systemd[1]: Starting Create netns directory...
Nov 29 06:27:31 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 29 06:27:31 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 29 06:27:31 compute-0 systemd[1]: Finished Create netns directory.
Nov 29 06:27:31 compute-0 sudo[67981]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:32 compute-0 sshd-session[68067]: Invalid user marco from 179.125.24.202 port 50794
Nov 29 06:27:32 compute-0 sshd-session[68067]: Received disconnect from 179.125.24.202 port 50794:11: Bye Bye [preauth]
Nov 29 06:27:32 compute-0 sshd-session[68067]: Disconnected from invalid user marco 179.125.24.202 port 50794 [preauth]
Nov 29 06:27:32 compute-0 python3.9[68212]: ansible-ansible.builtin.service_facts Invoked
Nov 29 06:27:32 compute-0 network[68229]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 06:27:32 compute-0 network[68230]: 'network-scripts' will be removed from distribution in near future.
Nov 29 06:27:32 compute-0 network[68231]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 06:27:35 compute-0 sudo[68491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytycphfnbddyumyvxyzrevquruoehygs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397655.4471638-583-137164886757762/AnsiballZ_systemd.py'
Nov 29 06:27:35 compute-0 sudo[68491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:35 compute-0 python3.9[68493]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:27:36 compute-0 systemd[1]: Reloading.
Nov 29 06:27:36 compute-0 systemd-rc-local-generator[68518]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:27:36 compute-0 systemd-sysv-generator[68523]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:27:36 compute-0 systemd[1]: Stopping IPv4 firewall with iptables...
Nov 29 06:27:36 compute-0 iptables.init[68534]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Nov 29 06:27:36 compute-0 iptables.init[68534]: iptables: Flushing firewall rules: [  OK  ]
Nov 29 06:27:36 compute-0 systemd[1]: iptables.service: Deactivated successfully.
Nov 29 06:27:36 compute-0 systemd[1]: Stopped IPv4 firewall with iptables.
Nov 29 06:27:36 compute-0 sudo[68491]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:37 compute-0 sudo[68728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqyzckygzntcxysrxfftmdrmecmfetmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397656.7990184-583-60770348958013/AnsiballZ_systemd.py'
Nov 29 06:27:37 compute-0 sudo[68728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:37 compute-0 python3.9[68730]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:27:37 compute-0 sudo[68728]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:38 compute-0 sudo[68882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftuivdjhrmrshfmhbspvipvjwdsbhmce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397657.9082289-631-132713234327327/AnsiballZ_systemd.py'
Nov 29 06:27:38 compute-0 sudo[68882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:38 compute-0 python3.9[68884]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:27:38 compute-0 systemd[1]: Reloading.
Nov 29 06:27:38 compute-0 systemd-rc-local-generator[68916]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:27:38 compute-0 systemd-sysv-generator[68919]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:27:38 compute-0 systemd[1]: Starting Netfilter Tables...
Nov 29 06:27:38 compute-0 systemd[1]: Finished Netfilter Tables.
Nov 29 06:27:38 compute-0 sudo[68882]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:39 compute-0 sudo[69075]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbvexcmnqiwqyrzveuhftlwplugddbaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397659.1118908-655-56475623086967/AnsiballZ_command.py'
Nov 29 06:27:39 compute-0 sudo[69075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:39 compute-0 python3.9[69077]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:27:39 compute-0 sudo[69075]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:40 compute-0 sudo[69228]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phlmamcriherdwtpksxrpmjzgjvfbhpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397660.381132-697-24147287311963/AnsiballZ_stat.py'
Nov 29 06:27:40 compute-0 sudo[69228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:40 compute-0 python3.9[69230]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:27:40 compute-0 sudo[69228]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:41 compute-0 sudo[69353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcyvztmpcmvkcvjfzipqsnynmkhnacac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397660.381132-697-24147287311963/AnsiballZ_copy.py'
Nov 29 06:27:41 compute-0 sudo[69353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:41 compute-0 python3.9[69355]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397660.381132-697-24147287311963/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:27:41 compute-0 sudo[69353]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:42 compute-0 sudo[69506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lisdewbcwlffiuahcxtvcegbpmhmccjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397661.9162323-742-44177743902722/AnsiballZ_systemd.py'
Nov 29 06:27:42 compute-0 sudo[69506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:42 compute-0 python3.9[69508]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:27:42 compute-0 systemd[1]: Reloading OpenSSH server daemon...
Nov 29 06:27:42 compute-0 sshd[1011]: Received SIGHUP; restarting.
Nov 29 06:27:42 compute-0 systemd[1]: Reloaded OpenSSH server daemon.
Nov 29 06:27:42 compute-0 sshd[1011]: Server listening on 0.0.0.0 port 22.
Nov 29 06:27:42 compute-0 sshd[1011]: Server listening on :: port 22.
Nov 29 06:27:42 compute-0 sudo[69506]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:43 compute-0 sudo[69662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccksgktlfgpksnjjufobfdbgqhfmsprn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397662.8119864-766-12134614171704/AnsiballZ_file.py'
Nov 29 06:27:43 compute-0 sudo[69662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:43 compute-0 python3.9[69664]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:27:43 compute-0 sudo[69662]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:43 compute-0 sudo[69814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogisjvbwvkaydmsiycfueitpufnlloqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397663.5039332-790-235295873848085/AnsiballZ_stat.py'
Nov 29 06:27:43 compute-0 sudo[69814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:44 compute-0 python3.9[69816]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:27:44 compute-0 sudo[69814]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:44 compute-0 sudo[69937]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yuunjimispuacezexewisfsnbfjmhwak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397663.5039332-790-235295873848085/AnsiballZ_copy.py'
Nov 29 06:27:44 compute-0 sudo[69937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:44 compute-0 python3.9[69939]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397663.5039332-790-235295873848085/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:27:44 compute-0 sudo[69937]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:45 compute-0 sudo[70089]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgqjzdaqekhxbvciutwoshautoybqemm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397665.143654-844-47053880899096/AnsiballZ_timezone.py'
Nov 29 06:27:45 compute-0 sudo[70089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:45 compute-0 python3.9[70091]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 29 06:27:45 compute-0 systemd[1]: Starting Time & Date Service...
Nov 29 06:27:45 compute-0 systemd[1]: Started Time & Date Service.
Nov 29 06:27:45 compute-0 sudo[70089]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:46 compute-0 sudo[70245]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izjsjmgrudflwxdabitpdmjuacbjesph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397666.3678155-871-198750294435786/AnsiballZ_file.py'
Nov 29 06:27:46 compute-0 sudo[70245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:46 compute-0 python3.9[70247]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:27:46 compute-0 sudo[70245]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:47 compute-0 sudo[70397]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wptoabeyxltalmutcjxarytlqyqcrpos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397667.233327-895-162251019947136/AnsiballZ_stat.py'
Nov 29 06:27:47 compute-0 sudo[70397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:47 compute-0 python3.9[70399]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:27:47 compute-0 sudo[70397]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:48 compute-0 sudo[70520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggpruhbnehietmsuvrvbyukpradgnjnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397667.233327-895-162251019947136/AnsiballZ_copy.py'
Nov 29 06:27:48 compute-0 sudo[70520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:48 compute-0 python3.9[70522]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397667.233327-895-162251019947136/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:27:48 compute-0 sudo[70520]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:48 compute-0 sudo[70672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwvqdzjaouvqhinsxekuxmglgtokjaab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397668.67768-940-96237380852993/AnsiballZ_stat.py'
Nov 29 06:27:48 compute-0 sudo[70672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:49 compute-0 python3.9[70674]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:27:49 compute-0 sudo[70672]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:49 compute-0 sudo[70795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mefyzbwftgxzascivqzccvuztmhedpmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397668.67768-940-96237380852993/AnsiballZ_copy.py'
Nov 29 06:27:49 compute-0 sudo[70795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:49 compute-0 python3.9[70797]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397668.67768-940-96237380852993/.source.yaml _original_basename=.mh8ukqq0 follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:27:49 compute-0 sudo[70795]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:50 compute-0 sudo[70947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxmkxjaztsrrgsayebadnxsmniatlvoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397670.2133849-985-185132673991818/AnsiballZ_stat.py'
Nov 29 06:27:50 compute-0 sudo[70947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:50 compute-0 python3.9[70949]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:27:50 compute-0 sudo[70947]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:51 compute-0 sudo[71070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmeksbnsatwkklqkzzyfzrrejvedlzkj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397670.2133849-985-185132673991818/AnsiballZ_copy.py'
Nov 29 06:27:51 compute-0 sudo[71070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:51 compute-0 python3.9[71072]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397670.2133849-985-185132673991818/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:27:51 compute-0 sudo[71070]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:51 compute-0 sudo[71222]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utfvxwzujhmnmcwgrnddahlvuxcmnskr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397671.59552-1030-31988916050386/AnsiballZ_command.py'
Nov 29 06:27:51 compute-0 sudo[71222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:52 compute-0 python3.9[71224]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:27:52 compute-0 sudo[71222]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:52 compute-0 sudo[71375]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbdocoblxnfihdxkcsqeewjceoiinedq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397672.370815-1054-259744195533492/AnsiballZ_command.py'
Nov 29 06:27:52 compute-0 sudo[71375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:52 compute-0 python3.9[71377]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:27:52 compute-0 sudo[71375]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:53 compute-0 sudo[71528]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpswefniomjyusgppecnewapapgekbmd ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764397673.14556-1078-53237899457275/AnsiballZ_edpm_nftables_from_files.py'
Nov 29 06:27:53 compute-0 sudo[71528]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:53 compute-0 python3[71530]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 29 06:27:53 compute-0 sudo[71528]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:54 compute-0 sudo[71680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkmiavawtlcyhbklpwdbmxpjrmaafnug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397673.9608788-1102-196356625348024/AnsiballZ_stat.py'
Nov 29 06:27:54 compute-0 sudo[71680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:54 compute-0 python3.9[71682]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:27:54 compute-0 sudo[71680]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:54 compute-0 sudo[71803]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylmleyrloejhwvwshbjfnjxoroyyuups ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397673.9608788-1102-196356625348024/AnsiballZ_copy.py'
Nov 29 06:27:54 compute-0 sudo[71803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:54 compute-0 python3.9[71805]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397673.9608788-1102-196356625348024/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:27:54 compute-0 sudo[71803]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:55 compute-0 sudo[71955]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agydrpysdlnszcrrbgxmcaeduxihwjwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397675.2588832-1147-102527448026930/AnsiballZ_stat.py'
Nov 29 06:27:55 compute-0 sudo[71955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:55 compute-0 python3.9[71957]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:27:55 compute-0 sudo[71955]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:56 compute-0 sudo[72078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujlbjeexrryrndegowxaegbyspskceow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397675.2588832-1147-102527448026930/AnsiballZ_copy.py'
Nov 29 06:27:56 compute-0 sudo[72078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:56 compute-0 python3.9[72080]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397675.2588832-1147-102527448026930/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:27:56 compute-0 sudo[72078]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:56 compute-0 sudo[72230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-doceenhmiosxsxonwcxcsgtkcjinsttf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397676.598237-1192-216786081893451/AnsiballZ_stat.py'
Nov 29 06:27:56 compute-0 sudo[72230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:57 compute-0 python3.9[72232]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:27:57 compute-0 sudo[72230]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:57 compute-0 sudo[72353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwqfjvaymgeqsabbxqvtxrpumxlkhjlj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397676.598237-1192-216786081893451/AnsiballZ_copy.py'
Nov 29 06:27:57 compute-0 sudo[72353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:57 compute-0 python3.9[72355]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397676.598237-1192-216786081893451/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:27:57 compute-0 sudo[72353]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:58 compute-0 sudo[72505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nifhfcvlysjibjgzvhkqzkljkobtqiib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397678.010977-1237-137700418083571/AnsiballZ_stat.py'
Nov 29 06:27:58 compute-0 sudo[72505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:58 compute-0 python3.9[72507]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:27:58 compute-0 sudo[72505]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:58 compute-0 sudo[72628]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vutoxzjlxnfsxxcavnqzrpzewbaasoqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397678.010977-1237-137700418083571/AnsiballZ_copy.py'
Nov 29 06:27:59 compute-0 sudo[72628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:27:59 compute-0 python3.9[72630]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397678.010977-1237-137700418083571/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:27:59 compute-0 sudo[72628]: pam_unix(sudo:session): session closed for user root
Nov 29 06:27:59 compute-0 sudo[72780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apcpaxcwexzwbagxnxkjpxjnwyknrhtx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397679.4311092-1282-74211349028859/AnsiballZ_stat.py'
Nov 29 06:27:59 compute-0 sudo[72780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:00 compute-0 python3.9[72782]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:28:00 compute-0 sudo[72780]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:00 compute-0 sudo[72903]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdxuxgneaskuaarpjzlelifmfshffvxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397679.4311092-1282-74211349028859/AnsiballZ_copy.py'
Nov 29 06:28:00 compute-0 sudo[72903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:00 compute-0 python3.9[72905]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397679.4311092-1282-74211349028859/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:28:00 compute-0 sudo[72903]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:01 compute-0 sudo[73055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvjpgfpdozepjqoroiknbjdzqluhkcvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397681.0429933-1327-167860257084848/AnsiballZ_file.py'
Nov 29 06:28:01 compute-0 sudo[73055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:01 compute-0 python3.9[73057]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:28:01 compute-0 sudo[73055]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:01 compute-0 sudo[73207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbomkydtrwnurdyepvsxyqugfgtrdhsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397681.7013454-1351-100031991988438/AnsiballZ_command.py'
Nov 29 06:28:01 compute-0 sudo[73207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:02 compute-0 python3.9[73209]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:28:02 compute-0 sudo[73207]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:02 compute-0 sudo[73366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjzhbqwnodpocnhqtzdgcdkncdyfjpww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397682.4635546-1375-204705466400089/AnsiballZ_blockinfile.py'
Nov 29 06:28:02 compute-0 sudo[73366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:03 compute-0 python3.9[73368]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:28:03 compute-0 sudo[73366]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:03 compute-0 sudo[73519]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdlnnlonxmvfxijswlcinbpfmktctgcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397683.439084-1402-33075805579836/AnsiballZ_file.py'
Nov 29 06:28:03 compute-0 sudo[73519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:03 compute-0 python3.9[73521]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:28:03 compute-0 sudo[73519]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:04 compute-0 sudo[73671]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysaiaexwxeccdibhmyyoqehxvqkedlbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397684.0493667-1402-93601756467697/AnsiballZ_file.py'
Nov 29 06:28:04 compute-0 sudo[73671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:04 compute-0 python3.9[73673]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:28:04 compute-0 sudo[73671]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:06 compute-0 sudo[73823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctzklgdpmjamhqnmrtfkzssrrghfkyaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397684.8072646-1447-177260703544979/AnsiballZ_mount.py'
Nov 29 06:28:06 compute-0 sudo[73823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:06 compute-0 python3.9[73825]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 29 06:28:06 compute-0 sudo[73823]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:06 compute-0 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 06:28:06 compute-0 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 06:28:06 compute-0 sudo[73977]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqtajcpefpnpwstjkegajdugoaogqard ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397686.437584-1447-186315501338711/AnsiballZ_mount.py'
Nov 29 06:28:06 compute-0 sudo[73977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:06 compute-0 python3.9[73979]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 29 06:28:06 compute-0 sudo[73977]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:07 compute-0 sshd-session[64809]: Connection closed by 192.168.122.30 port 47832
Nov 29 06:28:07 compute-0 sshd-session[64806]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:28:07 compute-0 systemd[1]: session-14.scope: Deactivated successfully.
Nov 29 06:28:07 compute-0 systemd[1]: session-14.scope: Consumed 33.181s CPU time.
Nov 29 06:28:07 compute-0 systemd-logind[788]: Session 14 logged out. Waiting for processes to exit.
Nov 29 06:28:07 compute-0 systemd-logind[788]: Removed session 14.
Nov 29 06:28:10 compute-0 sshd-session[74006]: Invalid user  from 65.49.1.214 port 50111
Nov 29 06:28:13 compute-0 sshd-session[74008]: Accepted publickey for zuul from 192.168.122.30 port 58498 ssh2: ECDSA SHA256:Ey+6YDlYfkE2dRe/gjWhHvvrJHee4xZo2Q1JojEuVBA
Nov 29 06:28:13 compute-0 systemd-logind[788]: New session 15 of user zuul.
Nov 29 06:28:13 compute-0 systemd[1]: Started Session 15 of User zuul.
Nov 29 06:28:13 compute-0 sshd-session[74008]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:28:14 compute-0 sshd-session[74006]: Connection closed by invalid user  65.49.1.214 port 50111 [preauth]
Nov 29 06:28:14 compute-0 sudo[74161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hojvpftpkslaykmiownforzzufofkdiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397693.8579788-23-245398832783296/AnsiballZ_tempfile.py'
Nov 29 06:28:14 compute-0 sudo[74161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:14 compute-0 python3.9[74163]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Nov 29 06:28:14 compute-0 sudo[74161]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:15 compute-0 sudo[74313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhqnrflntichfxcophnkfhszseupbmbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397694.7956831-59-220117564723221/AnsiballZ_stat.py'
Nov 29 06:28:15 compute-0 sudo[74313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:15 compute-0 python3.9[74315]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:28:15 compute-0 sudo[74313]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:15 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 29 06:28:16 compute-0 sudo[74467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-naibhutmggilyfosenpayamjkmyazxfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397695.7370284-89-138378014297161/AnsiballZ_setup.py'
Nov 29 06:28:16 compute-0 sudo[74467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:16 compute-0 python3.9[74469]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:28:16 compute-0 sudo[74467]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:17 compute-0 sudo[74619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ooqaghwzpslatajinfonefzatmzxyfxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397696.9156795-114-204862209757128/AnsiballZ_blockinfile.py'
Nov 29 06:28:17 compute-0 sudo[74619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:17 compute-0 python3.9[74621]: ansible-ansible.builtin.blockinfile Invoked with block=compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDxE9+in1YAsVzo2PkbOP/y9jW13mE04+F1VrPVmmgKUME6PWRBUtuT66AB40zRYi5yO+6N76+VAJcvtF1kNGhm3shwR+EkOKx8SHbU+RviKmRHfsi7XEfyHL7uPXOJMckqz85eUFqMQlXm0T6k8SbAwg/7v0r7w70oz6RysylzQYZWVeFgXZ7UFNiz+TKXL4x8MRY/6V3JMXIBdt/vb6cGmIyDwfTLPa/VxO6oKiuknrmAhd6pKWVOAoLeLvCJFRcnCjfZygatiRnwzibR7Xmo/fWClfIWB/RpJC5vSGru0Y/btrmoNInBd93XAWFRh8/+L/mTAUqvgP7Dy/Ft6JXARlkcmX64/tqwMI7M6a4A9voOZ8Eb1cJyJ/XgWoTXUZB9+cehvGP5J0tLJkw/iGBXKOcXhP99ulw5rvtkAaOXV6omaio88Pl85lT2ISJO6g47/pk27eMMKNXxMdNlhqVOtR5zLQHv3t0Pvd9/HFZhfcx1w86u5aR+V9irnyt3WAc=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIB2l802ocmKW/xzYye+Pzw89MQvA5jQh5a0yLK2ZyZCd
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI7XJn2j/ECeKq3mKYHO54Bh/Op2+6G6UX6ad7xn+hglSDuDDZy9KOJY974X6YapBGPsvID5GfLpKZuusj2w6cw=
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDSD8ZpMWbyfWwat32zE3dwK32EyLj7Y+//yic/Bd8bh7jSKBLK9ym42oYT01mO+NFTdefo2ARchFmERRxMzut4oUrqMlfhrn+mNHsvLaQycoAg+oq19ivJki9YXqDUIR0GwObpBRBSVczn15OcfSZNvJ+5yEWYWoeMXyjR7IpLeP4unrXYU5Gefx+ixYfHqq9U2klSr1mLGklHOYT1257UfS7aFtDHfrGqNLBghhbpbjLBljCPzwbz2JHg+8oO3x0s19DpnMBT0ID3emGqK2CRupsBeiWpZYUfcIDbCqmgcmC5QRkORpTRfGSYdDcsqSjpDOkPShwf1Le1r5QnW7JiFsy0ogLQ0ThcibSAVqVQZpFDROMSTPeqUlnDDqklZEtTgARcUGiVhmiXhR8sIdJXzJ5b1IB28Y3jGlf6kmQpBa9raXRegF/7J3SWDcOHO/sYe7Wh50S0cBgRgix0492hkGz3icxCzNwpQ5H/dTKdLCX7SvWyn/dHYE7411EP0Xc=
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIARLjHbwtuz0VGhEJnZ8jUcmug4YEziBMgu/+Q2Xf/qr
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBN/4QosKjhedc/jgjDOXpXhsciLiDd+ILxSMZxLO5NzR72Gm5KH5lEdveLrailDwVrIBl1+UjfksCNfnn+zVt1w=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCs1lxOj+O3cXQh+L6Hvro0WUX7vGdONQb0UkjJDqrzMWuP0tmX4CuMYeN2kUtGqc5U1dKriurXmo1qGVTvVz1rFJWYr1e1qwcv/DCLijB+4QR8oi61K8+nnWm47XeUoyWOI1GxkiHPeLPUs3QDDbHClDRGD9SWUQ5AtaO0NqAPalgp4eYChWy0Y4soQNnOXqbjnwEsJRK85/mXhogmZpALrFBu87oJtbviSxczqa+4bci7R6jWZ+ZkZbKw2+D3QskWWoHcgFgQVCprAXuj/ebUq1gyCY/d+tnyQs80H9XZ6Ryvmu1e7zEhKJvldu5mAamd8l4EwL79yt1ds7cSRXEH/+ajyYpXXTerzMFIsItjkdt+fg8DiheTqZexiHXvykMSjhPdshC1A9JWSsD+ISIR5qLPmHx5g3kZyVt5WM3mPfqh8WYsG4FM7EzMz492DnLUqdIsJXOBPjExJZhCLYvOdjJI5hMYHQ2GTE4ZlW0rvYr85xi12yOn9K3zmZ6q2SU=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOVwi6LOnwRGXKTYlL5FohHpKT05ra2BKYgm2kBQxP+u
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOCcnxH2XsLxMcRRLaA4DruLY3oryYRdOPfwLiZD7s7kBHBXt+svOGk0QImtaVEKV/k9369qMK8GrFyzO2efaCk=
                                             create=True mode=0644 path=/tmp/ansible.dajwyp0h state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:28:17 compute-0 sudo[74619]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:18 compute-0 sudo[74771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slyojjatwjwxmlctcveikcryxbdeknmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397697.7425282-138-110949070786690/AnsiballZ_command.py'
Nov 29 06:28:18 compute-0 sudo[74771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:18 compute-0 python3.9[74773]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.dajwyp0h' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:28:18 compute-0 sudo[74771]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:19 compute-0 sudo[74925]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fafpylqhwrzjxzttglvbongkooofmple ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397698.5471916-162-277744996438859/AnsiballZ_file.py'
Nov 29 06:28:19 compute-0 sudo[74925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:19 compute-0 python3.9[74927]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.dajwyp0h state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:28:19 compute-0 sudo[74925]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:20 compute-0 sshd-session[74011]: Connection closed by 192.168.122.30 port 58498
Nov 29 06:28:20 compute-0 sshd-session[74008]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:28:20 compute-0 systemd[1]: session-15.scope: Deactivated successfully.
Nov 29 06:28:20 compute-0 systemd[1]: session-15.scope: Consumed 3.408s CPU time.
Nov 29 06:28:20 compute-0 systemd-logind[788]: Session 15 logged out. Waiting for processes to exit.
Nov 29 06:28:20 compute-0 systemd-logind[788]: Removed session 15.
Nov 29 06:28:26 compute-0 sshd-session[74952]: Accepted publickey for zuul from 192.168.122.30 port 45006 ssh2: ECDSA SHA256:Ey+6YDlYfkE2dRe/gjWhHvvrJHee4xZo2Q1JojEuVBA
Nov 29 06:28:26 compute-0 systemd-logind[788]: New session 16 of user zuul.
Nov 29 06:28:26 compute-0 systemd[1]: Started Session 16 of User zuul.
Nov 29 06:28:26 compute-0 sshd-session[74952]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:28:27 compute-0 python3.9[75105]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:28:28 compute-0 sudo[75259]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmovajvyjpuwxkauytlijblqnzclemkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397707.913899-61-153266844554235/AnsiballZ_systemd.py'
Nov 29 06:28:28 compute-0 sudo[75259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:29 compute-0 python3.9[75261]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 29 06:28:29 compute-0 sudo[75259]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:29 compute-0 sudo[75413]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfvzmvkvialkcsjgblapkgjxcaptviyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397709.391007-85-196786510452253/AnsiballZ_systemd.py'
Nov 29 06:28:29 compute-0 sudo[75413]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:30 compute-0 python3.9[75415]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:28:30 compute-0 sudo[75413]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:30 compute-0 sudo[75566]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xepzkaatbiekrodslgfvkxroomqrwown ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397710.402523-112-115825705797319/AnsiballZ_command.py'
Nov 29 06:28:30 compute-0 sudo[75566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:31 compute-0 python3.9[75568]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:28:31 compute-0 sudo[75566]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:31 compute-0 sudo[75719]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmjtdkotfextlikmqklkktupbkzxfyoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397711.3397946-136-27318732981989/AnsiballZ_stat.py'
Nov 29 06:28:31 compute-0 sudo[75719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:32 compute-0 python3.9[75721]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:28:32 compute-0 sudo[75719]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:32 compute-0 sudo[75873]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqeyspsthaihuhwtdrzpajixsyydjicr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397712.2905898-160-166060298698714/AnsiballZ_command.py'
Nov 29 06:28:32 compute-0 sudo[75873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:32 compute-0 python3.9[75875]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:28:32 compute-0 sudo[75873]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:33 compute-0 sudo[76028]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwruqturtmquddbqsdeeefhxsjpynltk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397713.097076-184-144783162230004/AnsiballZ_file.py'
Nov 29 06:28:33 compute-0 sudo[76028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:33 compute-0 python3.9[76030]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:28:33 compute-0 sudo[76028]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:34 compute-0 sshd-session[74955]: Connection closed by 192.168.122.30 port 45006
Nov 29 06:28:34 compute-0 sshd-session[74952]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:28:34 compute-0 systemd[1]: session-16.scope: Deactivated successfully.
Nov 29 06:28:34 compute-0 systemd[1]: session-16.scope: Consumed 4.661s CPU time.
Nov 29 06:28:34 compute-0 systemd-logind[788]: Session 16 logged out. Waiting for processes to exit.
Nov 29 06:28:34 compute-0 systemd-logind[788]: Removed session 16.
Nov 29 06:28:39 compute-0 sshd-session[76055]: Accepted publickey for zuul from 192.168.122.30 port 39360 ssh2: ECDSA SHA256:Ey+6YDlYfkE2dRe/gjWhHvvrJHee4xZo2Q1JojEuVBA
Nov 29 06:28:39 compute-0 systemd-logind[788]: New session 17 of user zuul.
Nov 29 06:28:39 compute-0 systemd[1]: Started Session 17 of User zuul.
Nov 29 06:28:39 compute-0 sshd-session[76055]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:28:40 compute-0 python3.9[76208]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:28:41 compute-0 sudo[76364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmfkuaaeblbhxsbsgrwkhziyisrrgukc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397721.45399-67-181526756482885/AnsiballZ_setup.py'
Nov 29 06:28:41 compute-0 sudo[76364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:42 compute-0 python3.9[76366]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:28:42 compute-0 sudo[76364]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:42 compute-0 sshd-session[76237]: Invalid user gits from 36.50.176.16 port 52330
Nov 29 06:28:42 compute-0 sudo[76448]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwepwbzqgyoetwvmtwkxxekpozvusjjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397721.45399-67-181526756482885/AnsiballZ_dnf.py'
Nov 29 06:28:42 compute-0 sudo[76448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:42 compute-0 sshd-session[76237]: Received disconnect from 36.50.176.16 port 52330:11: Bye Bye [preauth]
Nov 29 06:28:42 compute-0 sshd-session[76237]: Disconnected from invalid user gits 36.50.176.16 port 52330 [preauth]
Nov 29 06:28:42 compute-0 python3.9[76452]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 06:28:43 compute-0 sshd-session[76449]: Invalid user intel from 45.202.211.6 port 45572
Nov 29 06:28:44 compute-0 sshd-session[76449]: Received disconnect from 45.202.211.6 port 45572:11: Bye Bye [preauth]
Nov 29 06:28:44 compute-0 sshd-session[76449]: Disconnected from invalid user intel 45.202.211.6 port 45572 [preauth]
Nov 29 06:28:44 compute-0 sudo[76448]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:45 compute-0 python3.9[76603]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:28:46 compute-0 sshd-session[76604]: Received disconnect from 1.214.197.163 port 51932:11: Bye Bye [preauth]
Nov 29 06:28:46 compute-0 sshd-session[76604]: Disconnected from authenticating user root 1.214.197.163 port 51932 [preauth]
Nov 29 06:28:47 compute-0 python3.9[76756]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 29 06:28:47 compute-0 python3.9[76906]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:28:48 compute-0 python3.9[77056]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:28:49 compute-0 sshd-session[76058]: Connection closed by 192.168.122.30 port 39360
Nov 29 06:28:49 compute-0 sshd-session[76055]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:28:49 compute-0 systemd[1]: session-17.scope: Deactivated successfully.
Nov 29 06:28:49 compute-0 systemd[1]: session-17.scope: Consumed 5.944s CPU time.
Nov 29 06:28:49 compute-0 systemd-logind[788]: Session 17 logged out. Waiting for processes to exit.
Nov 29 06:28:49 compute-0 systemd-logind[788]: Removed session 17.
Nov 29 06:28:49 compute-0 sshd-session[77057]: Invalid user scanner from 160.202.8.218 port 48770
Nov 29 06:28:51 compute-0 sshd-session[77057]: Received disconnect from 160.202.8.218 port 48770:11: Bye Bye [preauth]
Nov 29 06:28:51 compute-0 sshd-session[77057]: Disconnected from invalid user scanner 160.202.8.218 port 48770 [preauth]
Nov 29 06:28:55 compute-0 sshd-session[77083]: Accepted publickey for zuul from 192.168.122.30 port 39522 ssh2: ECDSA SHA256:Ey+6YDlYfkE2dRe/gjWhHvvrJHee4xZo2Q1JojEuVBA
Nov 29 06:28:55 compute-0 systemd-logind[788]: New session 18 of user zuul.
Nov 29 06:28:55 compute-0 systemd[1]: Started Session 18 of User zuul.
Nov 29 06:28:55 compute-0 sshd-session[77083]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:28:57 compute-0 python3.9[77236]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:28:59 compute-0 sudo[77392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdjlxyhaxrbyuwsaazftbjzfzyhvokvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397738.5794702-111-46365293551232/AnsiballZ_file.py'
Nov 29 06:28:59 compute-0 sudo[77392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:59 compute-0 python3.9[77394]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:28:59 compute-0 sudo[77392]: pam_unix(sudo:session): session closed for user root
Nov 29 06:28:59 compute-0 sshd-session[77340]: Received disconnect from 179.125.24.202 port 55430:11: Bye Bye [preauth]
Nov 29 06:28:59 compute-0 sshd-session[77340]: Disconnected from authenticating user root 179.125.24.202 port 55430 [preauth]
Nov 29 06:28:59 compute-0 sudo[77544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwykyiufkrobkoxmxnofryyosfstjwsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397739.4331055-111-104763555190122/AnsiballZ_file.py'
Nov 29 06:28:59 compute-0 sudo[77544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:28:59 compute-0 python3.9[77546]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:29:00 compute-0 sudo[77544]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:00 compute-0 sudo[77696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-joxvzpigsovustqkmrdifohhysxylbub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397740.251983-162-54788964638424/AnsiballZ_stat.py'
Nov 29 06:29:00 compute-0 sudo[77696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:00 compute-0 python3.9[77698]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:29:00 compute-0 sudo[77696]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:01 compute-0 sudo[77819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syaysxdnbpidwdmcjpeeczheeeterhdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397740.251983-162-54788964638424/AnsiballZ_copy.py'
Nov 29 06:29:01 compute-0 sudo[77819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:01 compute-0 python3.9[77821]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397740.251983-162-54788964638424/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=caf3a7d249594c6f96416413d2fa8cb8c94ed752 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:29:01 compute-0 sudo[77819]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:01 compute-0 sudo[77971]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-woqnkqsnmbsborfjszwvqswwwrttuzja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397741.4881144-162-148616803824976/AnsiballZ_stat.py'
Nov 29 06:29:01 compute-0 sudo[77971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:01 compute-0 python3.9[77973]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:29:01 compute-0 sudo[77971]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:02 compute-0 sudo[78094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xuhtpemeqkggcgemgrtpjnptsryhenps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397741.4881144-162-148616803824976/AnsiballZ_copy.py'
Nov 29 06:29:02 compute-0 sudo[78094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:02 compute-0 python3.9[78096]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397741.4881144-162-148616803824976/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=53ec2692be9fe7fa10ffde7cdba9150c4076f3fd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:29:02 compute-0 sudo[78094]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:02 compute-0 sudo[78246]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrnzqgzzkzvztylpclxotqwzjgjngrga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397742.6852314-162-33103087724154/AnsiballZ_stat.py'
Nov 29 06:29:02 compute-0 sudo[78246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:03 compute-0 python3.9[78248]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:29:03 compute-0 sudo[78246]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:03 compute-0 sudo[78369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixrrqhgwidnogkinmrwhfixhvlkjvrdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397742.6852314-162-33103087724154/AnsiballZ_copy.py'
Nov 29 06:29:03 compute-0 sudo[78369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:03 compute-0 python3.9[78371]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397742.6852314-162-33103087724154/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=acd88ba029817a8a277ba756c6926e67830ad6f7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:29:03 compute-0 sudo[78369]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:04 compute-0 sudo[78521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbsgslsbyvgtwbemsmznakjwmmxoecri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397743.9154403-283-31726706746669/AnsiballZ_file.py'
Nov 29 06:29:04 compute-0 sudo[78521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:04 compute-0 python3.9[78523]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:29:04 compute-0 sudo[78521]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:04 compute-0 sudo[78673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmqratqicqysagcadmirhawxaehuzaeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397744.5684671-283-172995711071231/AnsiballZ_file.py'
Nov 29 06:29:04 compute-0 sudo[78673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:05 compute-0 python3.9[78675]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:29:05 compute-0 sudo[78673]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:05 compute-0 sudo[78825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvlkuvdduxuubwiayzwjfrbiacoosxim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397745.2935803-328-163880361277565/AnsiballZ_stat.py'
Nov 29 06:29:05 compute-0 sudo[78825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:05 compute-0 python3.9[78827]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:29:05 compute-0 sudo[78825]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:06 compute-0 sudo[78948]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sabpymgiddvfovocqfdrhaunumisppwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397745.2935803-328-163880361277565/AnsiballZ_copy.py'
Nov 29 06:29:06 compute-0 sudo[78948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:06 compute-0 python3.9[78950]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397745.2935803-328-163880361277565/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=8210bb016e77c67ce5813c8478d9c5870889c23b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:29:06 compute-0 sudo[78948]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:06 compute-0 sudo[79100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcmfablwzfkilleqxjchnvrusjyfbyuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397746.4215543-328-141639754775639/AnsiballZ_stat.py'
Nov 29 06:29:06 compute-0 sudo[79100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:06 compute-0 python3.9[79102]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:29:06 compute-0 sudo[79100]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:07 compute-0 sudo[79223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrprcyxeimisxwcnpqqcvichoupgsuwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397746.4215543-328-141639754775639/AnsiballZ_copy.py'
Nov 29 06:29:07 compute-0 sudo[79223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:07 compute-0 python3.9[79225]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397746.4215543-328-141639754775639/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=bf4291d96f7c0f5cd858ccf4f424f476f6c02cd9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:29:07 compute-0 sudo[79223]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:07 compute-0 sudo[79375]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ronzmosgpkojsiqkdrvpzcjnxkdhzyyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397747.6215527-328-199183709451916/AnsiballZ_stat.py'
Nov 29 06:29:07 compute-0 sudo[79375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:08 compute-0 python3.9[79377]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:29:08 compute-0 sudo[79375]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:08 compute-0 sudo[79498]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aryewzsagjdguprnkbnxgjyxyotniapj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397747.6215527-328-199183709451916/AnsiballZ_copy.py'
Nov 29 06:29:08 compute-0 sudo[79498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:08 compute-0 python3.9[79500]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397747.6215527-328-199183709451916/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=8c5487e842f15d737a8c3bfb9232e30b91aafd3e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:29:08 compute-0 sudo[79498]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:09 compute-0 sudo[79650]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbkysjbuaanxtzphkxrtddihlewnsqko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397748.7678494-451-5183871196668/AnsiballZ_file.py'
Nov 29 06:29:09 compute-0 sudo[79650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:09 compute-0 python3.9[79652]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:29:09 compute-0 sudo[79650]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:09 compute-0 sudo[79802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbrdjqzlczjviqfsbnrxundtsneoouod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397749.3419933-451-77487523214168/AnsiballZ_file.py'
Nov 29 06:29:09 compute-0 sudo[79802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:09 compute-0 python3.9[79804]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:29:09 compute-0 sudo[79802]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:10 compute-0 sudo[79956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fztvqohngvnnsfqwtgjrdnpmgbjohhfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397750.023326-494-124234461277580/AnsiballZ_stat.py'
Nov 29 06:29:10 compute-0 sudo[79956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:10 compute-0 python3.9[79958]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:29:10 compute-0 sudo[79956]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:10 compute-0 sudo[80079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-johhgrhiwuenlbehmbulqdlznwflmarc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397750.023326-494-124234461277580/AnsiballZ_copy.py'
Nov 29 06:29:10 compute-0 sudo[80079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:11 compute-0 python3.9[80081]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397750.023326-494-124234461277580/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=ce63929ef95fdfd5304988033c3f3a3c786fb2cc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:29:11 compute-0 sudo[80079]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:11 compute-0 sudo[80231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtzieesounxmbtviefhzlehqlhpcbxol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397751.1843593-494-227212550091263/AnsiballZ_stat.py'
Nov 29 06:29:11 compute-0 sudo[80231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:11 compute-0 sshd-session[79829]: Received disconnect from 103.179.56.44 port 56834:11: Bye Bye [preauth]
Nov 29 06:29:11 compute-0 sshd-session[79829]: Disconnected from authenticating user root 103.179.56.44 port 56834 [preauth]
Nov 29 06:29:11 compute-0 python3.9[80233]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:29:11 compute-0 sudo[80231]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:11 compute-0 sudo[80354]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbbjqwukmvkbsitkulkqfbvlpzxpdvux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397751.1843593-494-227212550091263/AnsiballZ_copy.py'
Nov 29 06:29:11 compute-0 sudo[80354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:12 compute-0 python3.9[80356]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397751.1843593-494-227212550091263/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=d5ce9fc6df7543706791229321e0116a703016b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:29:12 compute-0 sudo[80354]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:12 compute-0 sudo[80506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vaznpqkklibnihljvznjdsetmlnsoogw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397752.2979963-494-175491078715632/AnsiballZ_stat.py'
Nov 29 06:29:12 compute-0 sudo[80506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:12 compute-0 python3.9[80508]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:29:12 compute-0 sudo[80506]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:13 compute-0 sudo[80629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-forncqxfeqqfhqxmhfyqmaweokjjinmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397752.2979963-494-175491078715632/AnsiballZ_copy.py'
Nov 29 06:29:13 compute-0 sudo[80629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:13 compute-0 python3.9[80631]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397752.2979963-494-175491078715632/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=f7e8489ecf49a29cfebc223162f4ef606eb5112a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:29:13 compute-0 sudo[80629]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:13 compute-0 sudo[80781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-miljqhtrlymgdfyozjxmisdwpugotbuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397753.499453-589-74042303916250/AnsiballZ_file.py'
Nov 29 06:29:13 compute-0 sudo[80781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:13 compute-0 python3.9[80783]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:29:14 compute-0 sudo[80781]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:14 compute-0 sudo[80933]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjytvzikmxweoatxtsekowmddcdewbdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397754.1426036-589-70610487573561/AnsiballZ_file.py'
Nov 29 06:29:14 compute-0 sudo[80933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:14 compute-0 python3.9[80935]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:29:14 compute-0 sudo[80933]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:15 compute-0 sudo[81085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcgzfuopwautoiuzgzdvmeicdwkkoalh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397754.7728827-633-143467881585403/AnsiballZ_stat.py'
Nov 29 06:29:15 compute-0 sudo[81085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:15 compute-0 python3.9[81087]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:29:15 compute-0 sudo[81085]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:15 compute-0 chronyd[64778]: Selected source 198.181.199.86 (pool.ntp.org)
Nov 29 06:29:15 compute-0 sudo[81208]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cuffbvvyvgwuuchheyxcupquycutjtkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397754.7728827-633-143467881585403/AnsiballZ_copy.py'
Nov 29 06:29:15 compute-0 sudo[81208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:15 compute-0 python3.9[81210]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397754.7728827-633-143467881585403/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=d5380b88b0dcc1521c8d785c45e8b23160a41834 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:29:15 compute-0 sudo[81208]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:16 compute-0 sudo[81360]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mseqtsjsguhavndvzdgodwtawwnuwghs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397755.9291818-633-6999891173018/AnsiballZ_stat.py'
Nov 29 06:29:16 compute-0 sudo[81360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:16 compute-0 python3.9[81362]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:29:16 compute-0 sudo[81360]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:16 compute-0 sudo[81483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddaymqdryzaadvaaanytdjzhksovnafx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397755.9291818-633-6999891173018/AnsiballZ_copy.py'
Nov 29 06:29:16 compute-0 sudo[81483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:17 compute-0 python3.9[81485]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397755.9291818-633-6999891173018/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=d5ce9fc6df7543706791229321e0116a703016b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:29:17 compute-0 sudo[81483]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:17 compute-0 sudo[81635]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crezdbzfdawqctrhbgaffylaenghcldt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397757.1400733-633-272864441340666/AnsiballZ_stat.py'
Nov 29 06:29:17 compute-0 sudo[81635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:17 compute-0 python3.9[81637]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:29:17 compute-0 sudo[81635]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:17 compute-0 sudo[81758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-howclbockmzzmoqwdvmnacwpisynwepc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397757.1400733-633-272864441340666/AnsiballZ_copy.py'
Nov 29 06:29:17 compute-0 sudo[81758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:18 compute-0 python3.9[81760]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397757.1400733-633-272864441340666/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=badcd70339cb794d960467b45da9bf0592d9a50c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:29:18 compute-0 sudo[81758]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:19 compute-0 sudo[81910]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhxkeeizukqtyoldjkykjkqspcjuekyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397758.831958-785-49753501098776/AnsiballZ_file.py'
Nov 29 06:29:19 compute-0 sudo[81910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:19 compute-0 python3.9[81912]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:29:19 compute-0 sudo[81910]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:20 compute-0 sudo[82062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iszdnolrkqcchibsxtkkvpwqrpxqgcyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397759.7872095-824-275354875515427/AnsiballZ_stat.py'
Nov 29 06:29:20 compute-0 sudo[82062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:20 compute-0 python3.9[82064]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:29:20 compute-0 sudo[82062]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:20 compute-0 sudo[82186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqmxcywwaglyamlzpadvrerjhrgbsoom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397759.7872095-824-275354875515427/AnsiballZ_copy.py'
Nov 29 06:29:20 compute-0 sudo[82186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:20 compute-0 python3.9[82188]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397759.7872095-824-275354875515427/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e40b48f8fd7ce69cedaa6e53dbd579733b9096d3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:29:20 compute-0 sudo[82186]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:21 compute-0 sudo[82338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsxdrxdorgplqrqztaonjpelprhcjexd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397761.0211425-871-54036799889162/AnsiballZ_file.py'
Nov 29 06:29:21 compute-0 sudo[82338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:21 compute-0 python3.9[82340]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:29:21 compute-0 sudo[82338]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:21 compute-0 sudo[82490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipcsglhtxkrfzffbquafwinqzvhowbov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397761.68346-894-194847951863166/AnsiballZ_stat.py'
Nov 29 06:29:21 compute-0 sudo[82490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:22 compute-0 python3.9[82492]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:29:22 compute-0 sudo[82490]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:22 compute-0 sudo[82613]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkztfhywqnpspxltxsvrteqkmlqmdvck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397761.68346-894-194847951863166/AnsiballZ_copy.py'
Nov 29 06:29:22 compute-0 sudo[82613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:22 compute-0 python3.9[82615]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397761.68346-894-194847951863166/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e40b48f8fd7ce69cedaa6e53dbd579733b9096d3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:29:22 compute-0 sudo[82613]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:23 compute-0 sudo[82765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rauphveqmcwmtfiuxtvjxpczfamttgsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397763.082135-932-76946264771780/AnsiballZ_file.py'
Nov 29 06:29:23 compute-0 sudo[82765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:23 compute-0 python3.9[82767]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:29:23 compute-0 sudo[82765]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:23 compute-0 sudo[82917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drwfmsfrzpvgwbfbzutkzozqtwvxdsdg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397763.7099252-959-198785860111929/AnsiballZ_stat.py'
Nov 29 06:29:23 compute-0 sudo[82917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:24 compute-0 python3.9[82919]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:29:24 compute-0 sudo[82917]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:24 compute-0 sudo[83040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylzzqpnzdtdcuiymnwjhrurpsgtenofi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397763.7099252-959-198785860111929/AnsiballZ_copy.py'
Nov 29 06:29:24 compute-0 sudo[83040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:24 compute-0 python3.9[83042]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397763.7099252-959-198785860111929/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e40b48f8fd7ce69cedaa6e53dbd579733b9096d3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:29:24 compute-0 sudo[83040]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:25 compute-0 sudo[83192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvqetutfzirwwbckuocsszcjzgnwgabq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397765.039298-1004-254652281011876/AnsiballZ_file.py'
Nov 29 06:29:25 compute-0 sudo[83192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:25 compute-0 python3.9[83194]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:29:25 compute-0 sudo[83192]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:25 compute-0 sudo[83344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umvyblmoaklfapeevpfoyxbzsccmigar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397765.624662-1029-149407911526135/AnsiballZ_stat.py'
Nov 29 06:29:25 compute-0 sudo[83344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:26 compute-0 python3.9[83346]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:29:26 compute-0 sudo[83344]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:26 compute-0 sudo[83467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzmjbuaeayvsprakrkjhybcpuvfqtrip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397765.624662-1029-149407911526135/AnsiballZ_copy.py'
Nov 29 06:29:26 compute-0 sudo[83467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:26 compute-0 python3.9[83469]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397765.624662-1029-149407911526135/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e40b48f8fd7ce69cedaa6e53dbd579733b9096d3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:29:26 compute-0 sudo[83467]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:27 compute-0 sudo[83619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncjnqxqemqjuysrcuhtovxnfoobiwbkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397766.825793-1076-228217528445868/AnsiballZ_file.py'
Nov 29 06:29:27 compute-0 sudo[83619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:27 compute-0 python3.9[83621]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:29:27 compute-0 sudo[83619]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:27 compute-0 sudo[83771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrbtsttvoqvqfesgxhwplivkkoapzall ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397767.6547244-1100-259692504865888/AnsiballZ_stat.py'
Nov 29 06:29:27 compute-0 sudo[83771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:28 compute-0 python3.9[83773]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:29:28 compute-0 sudo[83771]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:28 compute-0 sudo[83894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nibgkdjauywbqtlgeogawaooglhgakjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397767.6547244-1100-259692504865888/AnsiballZ_copy.py'
Nov 29 06:29:28 compute-0 sudo[83894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:29 compute-0 python3.9[83896]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397767.6547244-1100-259692504865888/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e40b48f8fd7ce69cedaa6e53dbd579733b9096d3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:29:29 compute-0 sudo[83894]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:29 compute-0 sudo[84046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsgtwzkmxwxwcxvahcfhpsbobszrpwqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397769.260008-1158-18479686688977/AnsiballZ_file.py'
Nov 29 06:29:29 compute-0 sudo[84046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:29 compute-0 python3.9[84048]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:29:29 compute-0 sudo[84046]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:30 compute-0 sudo[84198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzdoxhvxyyaheneqpivhxguitmpjxdsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397769.9293966-1183-93213842921328/AnsiballZ_stat.py'
Nov 29 06:29:30 compute-0 sudo[84198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:30 compute-0 python3.9[84200]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:29:30 compute-0 sudo[84198]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:30 compute-0 sudo[84321]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejfchffuzppoxaznrxjoxyazzkwwocra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397769.9293966-1183-93213842921328/AnsiballZ_copy.py'
Nov 29 06:29:30 compute-0 sudo[84321]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:30 compute-0 python3.9[84323]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397769.9293966-1183-93213842921328/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e40b48f8fd7ce69cedaa6e53dbd579733b9096d3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:29:31 compute-0 sudo[84321]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:31 compute-0 sudo[84473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-niporsrivpgmqsvgnvqavpjcmroehqqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397771.211105-1231-128861690684051/AnsiballZ_file.py'
Nov 29 06:29:31 compute-0 sudo[84473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:31 compute-0 python3.9[84475]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:29:31 compute-0 sudo[84473]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:32 compute-0 sudo[84625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnwsfumtarngbjlckezfcwnqnoxtxqvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397771.9566631-1257-68794819345361/AnsiballZ_stat.py'
Nov 29 06:29:32 compute-0 sudo[84625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:32 compute-0 python3.9[84627]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:29:32 compute-0 sudo[84625]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:32 compute-0 sudo[84748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbikwyysljsvlzbzhcyplwgpkzcrfwaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397771.9566631-1257-68794819345361/AnsiballZ_copy.py'
Nov 29 06:29:32 compute-0 sudo[84748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:33 compute-0 python3.9[84750]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397771.9566631-1257-68794819345361/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e40b48f8fd7ce69cedaa6e53dbd579733b9096d3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:29:33 compute-0 sudo[84748]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:38 compute-0 sshd-session[77086]: Connection closed by 192.168.122.30 port 39522
Nov 29 06:29:38 compute-0 sshd-session[77083]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:29:38 compute-0 systemd[1]: session-18.scope: Deactivated successfully.
Nov 29 06:29:38 compute-0 systemd[1]: session-18.scope: Consumed 27.474s CPU time.
Nov 29 06:29:38 compute-0 systemd-logind[788]: Session 18 logged out. Waiting for processes to exit.
Nov 29 06:29:38 compute-0 systemd-logind[788]: Removed session 18.
Nov 29 06:29:42 compute-0 sshd-session[82065]: Connection closed by 45.78.219.251 port 59928 [preauth]
Nov 29 06:29:43 compute-0 sshd-session[84777]: Accepted publickey for zuul from 192.168.122.30 port 50034 ssh2: ECDSA SHA256:Ey+6YDlYfkE2dRe/gjWhHvvrJHee4xZo2Q1JojEuVBA
Nov 29 06:29:43 compute-0 systemd-logind[788]: New session 19 of user zuul.
Nov 29 06:29:43 compute-0 systemd[1]: Started Session 19 of User zuul.
Nov 29 06:29:43 compute-0 sshd-session[84777]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:29:44 compute-0 python3.9[84930]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:29:45 compute-0 sudo[85084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfpxmeyfgycfzzkcuwtjwoilopelwndq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397785.3326836-66-132309989703143/AnsiballZ_file.py'
Nov 29 06:29:45 compute-0 sudo[85084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:45 compute-0 python3.9[85086]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:29:45 compute-0 sudo[85084]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:46 compute-0 sudo[85236]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfrbwinelmxyflvsgomxytocoxximzfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397786.1105535-66-153358406031314/AnsiballZ_file.py'
Nov 29 06:29:46 compute-0 sudo[85236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:46 compute-0 python3.9[85238]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:29:46 compute-0 sudo[85236]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:47 compute-0 python3.9[85388]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:29:48 compute-0 sudo[85538]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ouzzezhtvqmrrghyzcjqfhlfrthnangg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397787.770679-135-232355855211257/AnsiballZ_seboolean.py'
Nov 29 06:29:48 compute-0 sudo[85538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:48 compute-0 python3.9[85540]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 29 06:29:52 compute-0 sudo[85538]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:53 compute-0 sudo[85694]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycejclsxoahlagbraxsxlxykxatyxkwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397793.4269662-165-80837443489508/AnsiballZ_setup.py'
Nov 29 06:29:53 compute-0 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Nov 29 06:29:53 compute-0 sudo[85694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:53 compute-0 python3.9[85696]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:29:54 compute-0 sudo[85694]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:54 compute-0 sudo[85778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcjbpuyoqawjxzmjhswyoozkfogrvkzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397793.4269662-165-80837443489508/AnsiballZ_dnf.py'
Nov 29 06:29:54 compute-0 sudo[85778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:54 compute-0 python3.9[85780]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:29:56 compute-0 sudo[85778]: pam_unix(sudo:session): session closed for user root
Nov 29 06:29:57 compute-0 sudo[85931]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fquviglnbqglvophdcoiocitnyetcdwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397796.7756035-201-112680528762606/AnsiballZ_systemd.py'
Nov 29 06:29:57 compute-0 sudo[85931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:29:57 compute-0 python3.9[85933]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 06:29:57 compute-0 sudo[85931]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:01 compute-0 sshd-session[85961]: Received disconnect from 45.202.211.6 port 41494:11: Bye Bye [preauth]
Nov 29 06:30:01 compute-0 sshd-session[85961]: Disconnected from authenticating user root 45.202.211.6 port 41494 [preauth]
Nov 29 06:30:01 compute-0 sudo[86088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-waynlgxdpvnzkjmjfmzzbyisqkhqexzp ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764397801.0300875-225-99468430777798/AnsiballZ_edpm_nftables_snippet.py'
Nov 29 06:30:01 compute-0 sudo[86088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:01 compute-0 python3[86090]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                            rule:
                                              proto: udp
                                              dport: 4789
                                          - rule_name: 119 neutron geneve networks
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              state: ["UNTRACKED"]
                                          - rule_name: 120 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: OUTPUT
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                          - rule_name: 121 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: PREROUTING
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Nov 29 06:30:01 compute-0 sudo[86088]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:02 compute-0 sudo[86240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ruvyxoeaulxpfjztayhlpaeaqgguqscq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397802.1078792-252-227843555318121/AnsiballZ_file.py'
Nov 29 06:30:02 compute-0 sudo[86240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:02 compute-0 python3.9[86242]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:30:02 compute-0 sudo[86240]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:03 compute-0 sudo[86392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxmjxuoguzzrowymxaiexmfkgbcrjott ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397802.7893574-276-269716173185230/AnsiballZ_stat.py'
Nov 29 06:30:03 compute-0 sudo[86392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:03 compute-0 python3.9[86394]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:30:03 compute-0 sudo[86392]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:03 compute-0 sudo[86470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-garnqtkndlmndrgmpbqxqbxvwveewjjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397802.7893574-276-269716173185230/AnsiballZ_file.py'
Nov 29 06:30:03 compute-0 sudo[86470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:04 compute-0 python3.9[86472]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:30:04 compute-0 sudo[86470]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:04 compute-0 sudo[86622]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txitafdpgpozdrnwmieixznqjbfxszds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397804.209959-312-9089186331205/AnsiballZ_stat.py'
Nov 29 06:30:04 compute-0 sudo[86622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:04 compute-0 python3.9[86624]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:30:04 compute-0 sudo[86622]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:05 compute-0 sudo[86700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dweqzappoptgmrpualidsuehobjqkchn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397804.209959-312-9089186331205/AnsiballZ_file.py'
Nov 29 06:30:05 compute-0 sudo[86700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:05 compute-0 python3.9[86702]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.tddc2b3f recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:30:05 compute-0 sudo[86700]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:05 compute-0 sudo[86852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umjjrrnfeibykfiorsdsahvzizglaqkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397805.531113-348-109829702251269/AnsiballZ_stat.py'
Nov 29 06:30:05 compute-0 sudo[86852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:06 compute-0 python3.9[86854]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:30:06 compute-0 sudo[86852]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:06 compute-0 sudo[86930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehainqzwzmpnqgxmtlqrlzbawnopgudy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397805.531113-348-109829702251269/AnsiballZ_file.py'
Nov 29 06:30:06 compute-0 sudo[86930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:06 compute-0 python3.9[86932]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:30:06 compute-0 sudo[86930]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:07 compute-0 sudo[87082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkywjzvohuezlciwjopcvkydashreqnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397806.7495463-387-112176197201762/AnsiballZ_command.py'
Nov 29 06:30:07 compute-0 sudo[87082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:07 compute-0 python3.9[87084]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:30:07 compute-0 sudo[87082]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:08 compute-0 sudo[87235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbboxncweiphtrettqmodhlzktrkjhxk ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764397807.6234272-411-161572602863050/AnsiballZ_edpm_nftables_from_files.py'
Nov 29 06:30:08 compute-0 sudo[87235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:08 compute-0 python3[87237]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 29 06:30:08 compute-0 sudo[87235]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:08 compute-0 sudo[87387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etrkniknvtzbjxidmmzhlrfjzgvltslq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397808.5188386-435-2684477495440/AnsiballZ_stat.py'
Nov 29 06:30:08 compute-0 sudo[87387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:09 compute-0 python3.9[87389]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:30:09 compute-0 sudo[87387]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:09 compute-0 sudo[87512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wotzlrfkcmceuwjpahzswgsmirqdview ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397808.5188386-435-2684477495440/AnsiballZ_copy.py'
Nov 29 06:30:09 compute-0 sudo[87512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:09 compute-0 python3.9[87514]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397808.5188386-435-2684477495440/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:30:09 compute-0 sudo[87512]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:10 compute-0 sudo[87664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cncyeihhrbfhdawaafswfftdphcjcarm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397810.1269135-480-54996447617157/AnsiballZ_stat.py'
Nov 29 06:30:10 compute-0 sudo[87664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:10 compute-0 python3.9[87666]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:30:10 compute-0 sudo[87664]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:11 compute-0 sudo[87789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjtvzhmnwevoyuzjfsognyvajtmxocqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397810.1269135-480-54996447617157/AnsiballZ_copy.py'
Nov 29 06:30:11 compute-0 sudo[87789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:11 compute-0 python3.9[87791]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397810.1269135-480-54996447617157/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:30:11 compute-0 sudo[87789]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:14 compute-0 sudo[87941]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjivpxyfvzbwmeihabnnsupgdqxejbty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397813.6915529-525-75202308078296/AnsiballZ_stat.py'
Nov 29 06:30:14 compute-0 sudo[87941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:14 compute-0 python3.9[87943]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:30:14 compute-0 sudo[87941]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:14 compute-0 sudo[88066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elrzlkermsxjkrcjfdgwrqiuddyvwpib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397813.6915529-525-75202308078296/AnsiballZ_copy.py'
Nov 29 06:30:14 compute-0 sudo[88066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:14 compute-0 python3.9[88068]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397813.6915529-525-75202308078296/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:30:14 compute-0 sudo[88066]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:15 compute-0 sudo[88218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvebyfjfnipisywijafwyuipvchophrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397815.2914743-570-160109676401967/AnsiballZ_stat.py'
Nov 29 06:30:15 compute-0 sudo[88218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:16 compute-0 python3.9[88220]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:30:16 compute-0 sudo[88218]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:16 compute-0 sudo[88343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fuwtkryumxnvsmynemecdprnngziejkj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397815.2914743-570-160109676401967/AnsiballZ_copy.py'
Nov 29 06:30:16 compute-0 sudo[88343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:16 compute-0 python3.9[88345]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397815.2914743-570-160109676401967/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:30:16 compute-0 sudo[88343]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:17 compute-0 sudo[88495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grfegvdviujfzjlwwdathdncgfpfmyba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397817.1270838-615-147548069849364/AnsiballZ_stat.py'
Nov 29 06:30:17 compute-0 sudo[88495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:17 compute-0 python3.9[88497]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:30:17 compute-0 sudo[88495]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:18 compute-0 sudo[88620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfubovdjbuqxamojbsnjjuepprmownlv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397817.1270838-615-147548069849364/AnsiballZ_copy.py'
Nov 29 06:30:18 compute-0 sudo[88620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:18 compute-0 python3.9[88622]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397817.1270838-615-147548069849364/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:30:18 compute-0 sudo[88620]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:19 compute-0 sudo[88772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkvqzwgidqleoinywdcopijyvkndcprn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397818.8471842-660-9100544472902/AnsiballZ_file.py'
Nov 29 06:30:19 compute-0 sudo[88772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:19 compute-0 python3.9[88774]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:30:19 compute-0 sudo[88772]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:20 compute-0 sudo[88924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfhxijxbfnimqksuwzykfbtthnztyviu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397820.0121205-684-273167218775912/AnsiballZ_command.py'
Nov 29 06:30:20 compute-0 sudo[88924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:20 compute-0 python3.9[88926]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:30:20 compute-0 sudo[88924]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:21 compute-0 sudo[89079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xendxayxoulgzwnmlrmlhjkauofterph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397820.7204182-708-206762645439165/AnsiballZ_blockinfile.py'
Nov 29 06:30:21 compute-0 sudo[89079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:21 compute-0 python3.9[89081]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:30:21 compute-0 sudo[89079]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:22 compute-0 sudo[89233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkoeyutjijfvblqcugktcbamcbbwgizx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397822.207467-735-228927155870963/AnsiballZ_command.py'
Nov 29 06:30:22 compute-0 sudo[89233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:22 compute-0 python3.9[89235]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:30:22 compute-0 sudo[89233]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:23 compute-0 sudo[89386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fuesemnkvzbscfqdvjshwizhjegtojwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397822.9612236-759-1558956395633/AnsiballZ_stat.py'
Nov 29 06:30:23 compute-0 sudo[89386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:23 compute-0 python3.9[89388]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:30:23 compute-0 sshd-session[89106]: Received disconnect from 1.214.197.163 port 53336:11: Bye Bye [preauth]
Nov 29 06:30:23 compute-0 sshd-session[89106]: Disconnected from authenticating user root 1.214.197.163 port 53336 [preauth]
Nov 29 06:30:23 compute-0 sudo[89386]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:23 compute-0 sudo[89542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icvsbglxwmqiuauvaxaruuhqafvehhtz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397823.676955-783-228986069944044/AnsiballZ_command.py'
Nov 29 06:30:23 compute-0 sudo[89542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:24 compute-0 python3.9[89544]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:30:24 compute-0 sudo[89542]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:24 compute-0 sudo[89697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqxjlpxtpbrkeydghvxctjprzpomoylx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397824.4386306-807-31642835070393/AnsiballZ_file.py'
Nov 29 06:30:24 compute-0 sudo[89697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:24 compute-0 sshd-session[89391]: Received disconnect from 160.202.8.218 port 42238:11: Bye Bye [preauth]
Nov 29 06:30:24 compute-0 sshd-session[89391]: Disconnected from authenticating user root 160.202.8.218 port 42238 [preauth]
Nov 29 06:30:24 compute-0 python3.9[89699]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:30:24 compute-0 sudo[89697]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:26 compute-0 python3.9[89849]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:30:27 compute-0 sudo[90000]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxvwpaurybbturxsttckwbcdoirzvztv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397826.9710824-927-232515353234093/AnsiballZ_command.py'
Nov 29 06:30:27 compute-0 sudo[90000]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:27 compute-0 python3.9[90002]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:1e:0a:c6:22:5a:f7" external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:30:27 compute-0 ovs-vsctl[90003]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:1e:0a:c6:22:5a:f7 external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Nov 29 06:30:27 compute-0 sudo[90000]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:28 compute-0 sudo[90153]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzxustamgnkmapdtyexghrgyhixnmnuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397828.0184252-954-67691442758844/AnsiballZ_command.py'
Nov 29 06:30:28 compute-0 sudo[90153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:28 compute-0 python3.9[90155]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                            ovs-vsctl show | grep -q "Manager"
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:30:28 compute-0 sudo[90153]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:29 compute-0 sudo[90308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxcrxvxamnljgfdemqahbtprpsqulvdz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397828.8294158-978-270720806250627/AnsiballZ_command.py'
Nov 29 06:30:29 compute-0 sudo[90308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:29 compute-0 python3.9[90310]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:30:29 compute-0 ovs-vsctl[90311]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Nov 29 06:30:29 compute-0 sudo[90308]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:30 compute-0 python3.9[90461]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:30:30 compute-0 sudo[90615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyvnhloonwvjhxlxbqmmhanzjixhfmre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397830.3270965-1029-247580901691384/AnsiballZ_file.py'
Nov 29 06:30:30 compute-0 sudo[90615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:30 compute-0 python3.9[90617]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:30:30 compute-0 sudo[90615]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:30 compute-0 sshd-session[90462]: Received disconnect from 179.125.24.202 port 41468:11: Bye Bye [preauth]
Nov 29 06:30:30 compute-0 sshd-session[90462]: Disconnected from authenticating user root 179.125.24.202 port 41468 [preauth]
Nov 29 06:30:31 compute-0 sudo[90767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xubreertiasuaxnvsddzmbdvmitsdyrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397831.0001178-1053-17725417694561/AnsiballZ_stat.py'
Nov 29 06:30:31 compute-0 sudo[90767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:31 compute-0 python3.9[90769]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:30:31 compute-0 sudo[90767]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:31 compute-0 sudo[90845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xlswthvsrvhoslpycgpfddpaezsmclvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397831.0001178-1053-17725417694561/AnsiballZ_file.py'
Nov 29 06:30:31 compute-0 sudo[90845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:31 compute-0 python3.9[90847]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:30:31 compute-0 sudo[90845]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:32 compute-0 sudo[90997]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awdiiqodwklijehzkehhqussghvteqpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397832.0659838-1053-46642554382030/AnsiballZ_stat.py'
Nov 29 06:30:32 compute-0 sudo[90997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:32 compute-0 python3.9[90999]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:30:32 compute-0 sudo[90997]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:32 compute-0 sudo[91075]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqwtycnemqzwjopcyropjiaqdsmxzxfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397832.0659838-1053-46642554382030/AnsiballZ_file.py'
Nov 29 06:30:32 compute-0 sudo[91075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:33 compute-0 python3.9[91077]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:30:33 compute-0 sudo[91075]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:33 compute-0 sudo[91227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzbckmndyqplwtyzjhpdacetctgpryoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397833.4474912-1122-204769550472610/AnsiballZ_file.py'
Nov 29 06:30:33 compute-0 sudo[91227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:33 compute-0 python3.9[91229]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:30:33 compute-0 sudo[91227]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:34 compute-0 sudo[91379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rywfjhmkptpriqmcxwtcxingizwiyzep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397834.1594603-1146-249695986447977/AnsiballZ_stat.py'
Nov 29 06:30:34 compute-0 sudo[91379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:34 compute-0 python3.9[91381]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:30:34 compute-0 sudo[91379]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:34 compute-0 sudo[91457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osuavylneipwtzqlgzookngwozibajwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397834.1594603-1146-249695986447977/AnsiballZ_file.py'
Nov 29 06:30:34 compute-0 sudo[91457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:35 compute-0 python3.9[91459]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:30:35 compute-0 sudo[91457]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:35 compute-0 sudo[91609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzvfrkecmwajyxiwygdcsjeakgdfspth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397835.3067868-1182-109491569851924/AnsiballZ_stat.py'
Nov 29 06:30:35 compute-0 sudo[91609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:36 compute-0 python3.9[91611]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:30:36 compute-0 sudo[91609]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:37 compute-0 sudo[91687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aovuixuubkmzfqpovboamilitqxqbpza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397835.3067868-1182-109491569851924/AnsiballZ_file.py'
Nov 29 06:30:37 compute-0 sudo[91687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:37 compute-0 python3.9[91689]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:30:37 compute-0 sudo[91687]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:37 compute-0 sudo[91839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcttueubazskflbtnahstvsleacbcxlv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397837.4113925-1218-83540077112653/AnsiballZ_systemd.py'
Nov 29 06:30:37 compute-0 sudo[91839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:37 compute-0 python3.9[91841]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:30:37 compute-0 systemd[1]: Reloading.
Nov 29 06:30:38 compute-0 systemd-rc-local-generator[91865]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:30:38 compute-0 systemd-sysv-generator[91873]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:30:38 compute-0 sudo[91839]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:38 compute-0 sudo[92029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlltxecgbupjmxawcwtzvdidjrqhsgxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397838.5601249-1242-97553312530076/AnsiballZ_stat.py'
Nov 29 06:30:38 compute-0 sudo[92029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:39 compute-0 python3.9[92031]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:30:39 compute-0 sudo[92029]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:39 compute-0 sudo[92107]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpjzxqtomoyumbnpuszxriiaalubavih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397838.5601249-1242-97553312530076/AnsiballZ_file.py'
Nov 29 06:30:39 compute-0 sudo[92107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:39 compute-0 python3.9[92109]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:30:39 compute-0 sudo[92107]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:40 compute-0 sudo[92259]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qherpssuucqhvdlalizzqnkesyqetemu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397839.818527-1278-63593558470981/AnsiballZ_stat.py'
Nov 29 06:30:40 compute-0 sudo[92259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:40 compute-0 python3.9[92261]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:30:40 compute-0 sudo[92259]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:40 compute-0 sudo[92337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ienvrkvtabcpkmitlhkxmdtjcxjnphlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397839.818527-1278-63593558470981/AnsiballZ_file.py'
Nov 29 06:30:40 compute-0 sudo[92337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:40 compute-0 python3.9[92339]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:30:40 compute-0 sudo[92337]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:41 compute-0 sudo[92489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwqthkxbpmruzyudopknamuvnbtgayaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397840.9559376-1314-22835661517772/AnsiballZ_systemd.py'
Nov 29 06:30:41 compute-0 sudo[92489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:41 compute-0 python3.9[92491]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:30:41 compute-0 systemd[1]: Reloading.
Nov 29 06:30:41 compute-0 systemd-sysv-generator[92523]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:30:41 compute-0 systemd-rc-local-generator[92517]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:30:41 compute-0 systemd[1]: Starting Create netns directory...
Nov 29 06:30:41 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 29 06:30:41 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 29 06:30:41 compute-0 systemd[1]: Finished Create netns directory.
Nov 29 06:30:41 compute-0 sudo[92489]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:42 compute-0 sudo[92682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btyoxmhatgakpkwykwivcnnsmhhfjftt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397842.3872716-1344-165737940335908/AnsiballZ_file.py'
Nov 29 06:30:42 compute-0 sudo[92682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:42 compute-0 python3.9[92684]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:30:42 compute-0 sudo[92682]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:43 compute-0 sudo[92834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrwlrfxjwztihxwnmsecpsonordvzscm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397843.0810897-1368-120454281764984/AnsiballZ_stat.py'
Nov 29 06:30:43 compute-0 sudo[92834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:43 compute-0 python3.9[92836]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:30:43 compute-0 sudo[92834]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:43 compute-0 sudo[92957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ublugnzgpdaftknaodqqxoslcuaiqahq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397843.0810897-1368-120454281764984/AnsiballZ_copy.py'
Nov 29 06:30:43 compute-0 sudo[92957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:44 compute-0 python3.9[92959]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397843.0810897-1368-120454281764984/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:30:44 compute-0 sudo[92957]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:44 compute-0 sudo[93109]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewowmznwhyzzwqliovgabbgnqdgckqyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397844.6962926-1419-263726512601473/AnsiballZ_file.py'
Nov 29 06:30:44 compute-0 sudo[93109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:45 compute-0 python3.9[93111]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:30:45 compute-0 sudo[93109]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:45 compute-0 sudo[93261]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfjbslznkehgezkrhgbuozkbjdcvvcop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397845.4497428-1443-275375525724732/AnsiballZ_stat.py'
Nov 29 06:30:45 compute-0 sudo[93261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:45 compute-0 python3.9[93263]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:30:45 compute-0 sudo[93261]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:46 compute-0 sudo[93384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmbvwotqhnuxltumarnpeldliygyjyzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397845.4497428-1443-275375525724732/AnsiballZ_copy.py'
Nov 29 06:30:46 compute-0 sudo[93384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:46 compute-0 python3.9[93386]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397845.4497428-1443-275375525724732/.source.json _original_basename=.aaw1duz2 follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:30:46 compute-0 sudo[93384]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:47 compute-0 sudo[93536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnxbxckluzyvptmwrpxckjmnekcbexhg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397846.8847094-1488-52222340225990/AnsiballZ_file.py'
Nov 29 06:30:47 compute-0 sudo[93536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:47 compute-0 python3.9[93538]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:30:47 compute-0 sudo[93536]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:48 compute-0 sudo[93688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpqagdsjfkszgoeazshduwhhmwvjsras ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397847.8515725-1512-123230442465874/AnsiballZ_stat.py'
Nov 29 06:30:48 compute-0 sudo[93688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:48 compute-0 sudo[93688]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:48 compute-0 sudo[93811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxpunfetdxnxrwywmwsdfgbscwnvamuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397847.8515725-1512-123230442465874/AnsiballZ_copy.py'
Nov 29 06:30:48 compute-0 sudo[93811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:49 compute-0 sudo[93811]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:49 compute-0 sudo[93963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrwfarokziftueqrfhbvwuwwkfopznma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397849.517656-1563-161201532562602/AnsiballZ_container_config_data.py'
Nov 29 06:30:49 compute-0 sudo[93963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:50 compute-0 python3.9[93965]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Nov 29 06:30:50 compute-0 sudo[93963]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:50 compute-0 sudo[94115]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owwgjjfxzkfethblwoyilxbazfunikub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397850.489947-1590-151739226626176/AnsiballZ_container_config_hash.py'
Nov 29 06:30:50 compute-0 sudo[94115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:51 compute-0 python3.9[94117]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 29 06:30:51 compute-0 sudo[94115]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:51 compute-0 sudo[94267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwjkfkorycdebmodmqsrjnvhegssztll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397851.469898-1617-137487275943523/AnsiballZ_podman_container_info.py'
Nov 29 06:30:51 compute-0 sudo[94267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:52 compute-0 python3.9[94269]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 29 06:30:52 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:30:52 compute-0 sudo[94267]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:54 compute-0 sudo[94430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qutjdjgwskenubfvfdcwnmlmmyakjsou ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764397853.5277443-1656-277553902237434/AnsiballZ_edpm_container_manage.py'
Nov 29 06:30:54 compute-0 sudo[94430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:54 compute-0 python3[94432]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 29 06:30:54 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:30:54 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:30:54 compute-0 podman[94467]: 2025-11-29 06:30:54.586277403 +0000 UTC m=+0.027587493 image pull 52cb1910f3f090372807028d1c2aea98d2557b1086636469529f290368ecdf69 quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 29 06:30:54 compute-0 podman[94467]: 2025-11-29 06:30:54.922211613 +0000 UTC m=+0.363521683 container create 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 29 06:30:54 compute-0 python3[94432]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 29 06:30:55 compute-0 sudo[94430]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:55 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 06:30:56 compute-0 sudo[94654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isluvymnnuapcryausbdxamegiehupih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397855.7233734-1680-176044589476647/AnsiballZ_stat.py'
Nov 29 06:30:56 compute-0 sudo[94654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:56 compute-0 python3.9[94656]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:30:56 compute-0 sudo[94654]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:57 compute-0 sudo[94808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fibqzivlgojgrajkcfgbpffihykmxyfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397856.9178407-1707-148316221352821/AnsiballZ_file.py'
Nov 29 06:30:57 compute-0 sudo[94808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:57 compute-0 python3.9[94810]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:30:57 compute-0 sudo[94808]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:57 compute-0 sudo[94884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvhvxwnzcyzigotmhulxoypljmmldcfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397856.9178407-1707-148316221352821/AnsiballZ_stat.py'
Nov 29 06:30:57 compute-0 sudo[94884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:57 compute-0 python3.9[94886]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:30:57 compute-0 sudo[94884]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:58 compute-0 sudo[95035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpmdunmbkenwwdadasnwmzaavcpuqncm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397857.872869-1707-64537026384079/AnsiballZ_copy.py'
Nov 29 06:30:58 compute-0 sudo[95035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:58 compute-0 python3.9[95037]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764397857.872869-1707-64537026384079/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:30:58 compute-0 sudo[95035]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:58 compute-0 sudo[95111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbxolhwovetmppfeakoyjihohjuqvgyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397857.872869-1707-64537026384079/AnsiballZ_systemd.py'
Nov 29 06:30:58 compute-0 sudo[95111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:59 compute-0 python3.9[95113]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 06:30:59 compute-0 systemd[1]: Reloading.
Nov 29 06:30:59 compute-0 systemd-sysv-generator[95144]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:30:59 compute-0 systemd-rc-local-generator[95139]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:30:59 compute-0 sudo[95111]: pam_unix(sudo:session): session closed for user root
Nov 29 06:30:59 compute-0 sudo[95223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-delbhyatsiqhyqeacmgkcxnnjeprvskm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397857.872869-1707-64537026384079/AnsiballZ_systemd.py'
Nov 29 06:30:59 compute-0 sudo[95223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:30:59 compute-0 python3.9[95225]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:31:00 compute-0 systemd[1]: Reloading.
Nov 29 06:31:00 compute-0 systemd-sysv-generator[95257]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:31:00 compute-0 systemd-rc-local-generator[95254]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:31:00 compute-0 systemd[1]: Starting ovn_controller container...
Nov 29 06:31:00 compute-0 systemd[1]: Created slice Virtual Machine and Container Slice.
Nov 29 06:31:00 compute-0 systemd[1]: Started libcrun container.
Nov 29 06:31:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a46e6785edaa4db68182908e071cd917096135ab257fe1b37b607f2d7d7a6eb/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 29 06:31:00 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152.
Nov 29 06:31:00 compute-0 podman[95265]: 2025-11-29 06:31:00.516497388 +0000 UTC m=+0.213894463 container init 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller)
Nov 29 06:31:00 compute-0 ovn_controller[95281]: + sudo -E kolla_set_configs
Nov 29 06:31:00 compute-0 podman[95265]: 2025-11-29 06:31:00.541019323 +0000 UTC m=+0.238416368 container start 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 29 06:31:00 compute-0 edpm-start-podman-container[95265]: ovn_controller
Nov 29 06:31:00 compute-0 systemd[1]: Created slice User Slice of UID 0.
Nov 29 06:31:00 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Nov 29 06:31:00 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Nov 29 06:31:00 compute-0 edpm-start-podman-container[95264]: Creating additional drop-in dependency for "ovn_controller" (8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152)
Nov 29 06:31:00 compute-0 systemd[1]: Starting User Manager for UID 0...
Nov 29 06:31:00 compute-0 podman[95287]: 2025-11-29 06:31:00.621897205 +0000 UTC m=+0.070511650 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 06:31:00 compute-0 systemd[1]: 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152-1d0cc3f65a2e948d.service: Main process exited, code=exited, status=1/FAILURE
Nov 29 06:31:00 compute-0 systemd[1]: 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152-1d0cc3f65a2e948d.service: Failed with result 'exit-code'.
Nov 29 06:31:00 compute-0 systemd[95324]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Nov 29 06:31:00 compute-0 systemd[1]: Reloading.
Nov 29 06:31:00 compute-0 systemd-rc-local-generator[95363]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:31:00 compute-0 systemd-sysv-generator[95369]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:31:00 compute-0 systemd[95324]: Queued start job for default target Main User Target.
Nov 29 06:31:00 compute-0 systemd[95324]: Created slice User Application Slice.
Nov 29 06:31:00 compute-0 systemd[95324]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Nov 29 06:31:00 compute-0 systemd[95324]: Started Daily Cleanup of User's Temporary Directories.
Nov 29 06:31:00 compute-0 systemd[95324]: Reached target Paths.
Nov 29 06:31:00 compute-0 systemd[95324]: Reached target Timers.
Nov 29 06:31:00 compute-0 systemd[95324]: Starting D-Bus User Message Bus Socket...
Nov 29 06:31:00 compute-0 systemd[95324]: Starting Create User's Volatile Files and Directories...
Nov 29 06:31:00 compute-0 systemd[95324]: Listening on D-Bus User Message Bus Socket.
Nov 29 06:31:00 compute-0 systemd[95324]: Reached target Sockets.
Nov 29 06:31:00 compute-0 systemd[95324]: Finished Create User's Volatile Files and Directories.
Nov 29 06:31:00 compute-0 systemd[95324]: Reached target Basic System.
Nov 29 06:31:00 compute-0 systemd[95324]: Reached target Main User Target.
Nov 29 06:31:00 compute-0 systemd[95324]: Startup finished in 140ms.
Nov 29 06:31:00 compute-0 systemd[1]: Started User Manager for UID 0.
Nov 29 06:31:00 compute-0 systemd[1]: Started ovn_controller container.
Nov 29 06:31:00 compute-0 systemd[1]: Started Session c1 of User root.
Nov 29 06:31:00 compute-0 sudo[95223]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:00 compute-0 ovn_controller[95281]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 06:31:00 compute-0 ovn_controller[95281]: INFO:__main__:Validating config file
Nov 29 06:31:00 compute-0 ovn_controller[95281]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 06:31:00 compute-0 ovn_controller[95281]: INFO:__main__:Writing out command to execute
Nov 29 06:31:00 compute-0 systemd[1]: session-c1.scope: Deactivated successfully.
Nov 29 06:31:00 compute-0 ovn_controller[95281]: ++ cat /run_command
Nov 29 06:31:00 compute-0 ovn_controller[95281]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Nov 29 06:31:00 compute-0 ovn_controller[95281]: + ARGS=
Nov 29 06:31:00 compute-0 ovn_controller[95281]: + sudo kolla_copy_cacerts
Nov 29 06:31:00 compute-0 systemd[1]: Started Session c2 of User root.
Nov 29 06:31:00 compute-0 ovn_controller[95281]: + [[ ! -n '' ]]
Nov 29 06:31:00 compute-0 ovn_controller[95281]: + . kolla_extend_start
Nov 29 06:31:00 compute-0 ovn_controller[95281]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Nov 29 06:31:00 compute-0 ovn_controller[95281]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Nov 29 06:31:00 compute-0 ovn_controller[95281]: + umask 0022
Nov 29 06:31:00 compute-0 ovn_controller[95281]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Nov 29 06:31:00 compute-0 systemd[1]: session-c2.scope: Deactivated successfully.
Nov 29 06:31:00 compute-0 ovn_controller[95281]: 2025-11-29T06:31:00Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 29 06:31:00 compute-0 ovn_controller[95281]: 2025-11-29T06:31:00Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 29 06:31:00 compute-0 ovn_controller[95281]: 2025-11-29T06:31:00Z|00003|main|INFO|OVN internal version is : [24.03.7-20.33.0-76.8]
Nov 29 06:31:00 compute-0 ovn_controller[95281]: 2025-11-29T06:31:00Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Nov 29 06:31:00 compute-0 ovn_controller[95281]: 2025-11-29T06:31:00Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 29 06:31:00 compute-0 ovn_controller[95281]: 2025-11-29T06:31:00Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Nov 29 06:31:00 compute-0 NetworkManager[55227]: <info>  [1764397860.9972] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Nov 29 06:31:00 compute-0 NetworkManager[55227]: <info>  [1764397860.9981] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 06:31:00 compute-0 NetworkManager[55227]: <info>  [1764397860.9995] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Nov 29 06:31:01 compute-0 NetworkManager[55227]: <info>  [1764397861.0002] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Nov 29 06:31:01 compute-0 NetworkManager[55227]: <info>  [1764397861.0006] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Nov 29 06:31:01 compute-0 kernel: br-int: entered promiscuous mode
Nov 29 06:31:01 compute-0 ovn_controller[95281]: 2025-11-29T06:31:01Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Nov 29 06:31:01 compute-0 ovn_controller[95281]: 2025-11-29T06:31:01Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 29 06:31:01 compute-0 ovn_controller[95281]: 2025-11-29T06:31:01Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 29 06:31:01 compute-0 ovn_controller[95281]: 2025-11-29T06:31:01Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Nov 29 06:31:01 compute-0 ovn_controller[95281]: 2025-11-29T06:31:01Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Nov 29 06:31:01 compute-0 ovn_controller[95281]: 2025-11-29T06:31:01Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Nov 29 06:31:01 compute-0 ovn_controller[95281]: 2025-11-29T06:31:01Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 29 06:31:01 compute-0 ovn_controller[95281]: 2025-11-29T06:31:01Z|00014|main|INFO|OVS feature set changed, force recompute.
Nov 29 06:31:01 compute-0 ovn_controller[95281]: 2025-11-29T06:31:01Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 29 06:31:01 compute-0 ovn_controller[95281]: 2025-11-29T06:31:01Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 29 06:31:01 compute-0 ovn_controller[95281]: 2025-11-29T06:31:01Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 29 06:31:01 compute-0 ovn_controller[95281]: 2025-11-29T06:31:01Z|00018|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Nov 29 06:31:01 compute-0 ovn_controller[95281]: 2025-11-29T06:31:01Z|00019|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Nov 29 06:31:01 compute-0 ovn_controller[95281]: 2025-11-29T06:31:01Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 29 06:31:01 compute-0 ovn_controller[95281]: 2025-11-29T06:31:01Z|00021|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 29 06:31:01 compute-0 ovn_controller[95281]: 2025-11-29T06:31:01Z|00022|main|INFO|OVS feature set changed, force recompute.
Nov 29 06:31:01 compute-0 ovn_controller[95281]: 2025-11-29T06:31:01Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Nov 29 06:31:01 compute-0 ovn_controller[95281]: 2025-11-29T06:31:01Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Nov 29 06:31:01 compute-0 systemd-udevd[95415]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 06:31:01 compute-0 ovn_controller[95281]: 2025-11-29T06:31:01Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 29 06:31:01 compute-0 ovn_controller[95281]: 2025-11-29T06:31:01Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 29 06:31:01 compute-0 ovn_controller[95281]: 2025-11-29T06:31:01Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 29 06:31:01 compute-0 ovn_controller[95281]: 2025-11-29T06:31:01Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 29 06:31:01 compute-0 ovn_controller[95281]: 2025-11-29T06:31:01Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 29 06:31:01 compute-0 ovn_controller[95281]: 2025-11-29T06:31:01Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 29 06:31:01 compute-0 NetworkManager[55227]: <info>  [1764397861.0868] manager: (ovn-bd30a8-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Nov 29 06:31:01 compute-0 NetworkManager[55227]: <info>  [1764397861.0877] manager: (ovn-cdd09c-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/18)
Nov 29 06:31:01 compute-0 kernel: genev_sys_6081: entered promiscuous mode
Nov 29 06:31:01 compute-0 NetworkManager[55227]: <info>  [1764397861.1021] device (genev_sys_6081): carrier: link connected
Nov 29 06:31:01 compute-0 systemd-udevd[95417]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 06:31:01 compute-0 NetworkManager[55227]: <info>  [1764397861.1023] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/19)
Nov 29 06:31:01 compute-0 NetworkManager[55227]: <info>  [1764397861.6080] manager: (ovn-a43628-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Nov 29 06:31:01 compute-0 sudo[95545]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vextjhsmivaqyxtysdzhklrfeiziutth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397861.4997094-1791-140413458389677/AnsiballZ_command.py'
Nov 29 06:31:01 compute-0 sudo[95545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:02 compute-0 python3.9[95547]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:31:02 compute-0 ovs-vsctl[95548]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Nov 29 06:31:02 compute-0 sudo[95545]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:02 compute-0 sudo[95698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgiovcjydddlumqgqoasxwsapasktlwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397862.280533-1815-64255552591788/AnsiballZ_command.py'
Nov 29 06:31:02 compute-0 sudo[95698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:02 compute-0 python3.9[95700]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:31:02 compute-0 ovs-vsctl[95702]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Nov 29 06:31:02 compute-0 sudo[95698]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:03 compute-0 sudo[95853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyoqscztbickriioztjakgenprvtpkxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397863.3435886-1857-84478740201425/AnsiballZ_command.py'
Nov 29 06:31:03 compute-0 sudo[95853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:03 compute-0 python3.9[95855]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:31:03 compute-0 ovs-vsctl[95856]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Nov 29 06:31:04 compute-0 sudo[95853]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:04 compute-0 sshd-session[84780]: Connection closed by 192.168.122.30 port 50034
Nov 29 06:31:04 compute-0 sshd-session[84777]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:31:04 compute-0 systemd[1]: session-19.scope: Deactivated successfully.
Nov 29 06:31:04 compute-0 systemd[1]: session-19.scope: Consumed 44.694s CPU time.
Nov 29 06:31:04 compute-0 systemd-logind[788]: Session 19 logged out. Waiting for processes to exit.
Nov 29 06:31:04 compute-0 systemd-logind[788]: Removed session 19.
Nov 29 06:31:10 compute-0 sshd-session[95883]: Accepted publickey for zuul from 192.168.122.30 port 40824 ssh2: ECDSA SHA256:Ey+6YDlYfkE2dRe/gjWhHvvrJHee4xZo2Q1JojEuVBA
Nov 29 06:31:10 compute-0 systemd-logind[788]: New session 21 of user zuul.
Nov 29 06:31:10 compute-0 systemd[1]: Started Session 21 of User zuul.
Nov 29 06:31:10 compute-0 sshd-session[95883]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:31:11 compute-0 sshd-session[95881]: Invalid user hdfs from 103.179.56.44 port 34248
Nov 29 06:31:11 compute-0 systemd[1]: Stopping User Manager for UID 0...
Nov 29 06:31:11 compute-0 systemd[95324]: Activating special unit Exit the Session...
Nov 29 06:31:11 compute-0 systemd[95324]: Stopped target Main User Target.
Nov 29 06:31:11 compute-0 systemd[95324]: Stopped target Basic System.
Nov 29 06:31:11 compute-0 systemd[95324]: Stopped target Paths.
Nov 29 06:31:11 compute-0 systemd[95324]: Stopped target Sockets.
Nov 29 06:31:11 compute-0 systemd[95324]: Stopped target Timers.
Nov 29 06:31:11 compute-0 systemd[95324]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 29 06:31:11 compute-0 systemd[95324]: Closed D-Bus User Message Bus Socket.
Nov 29 06:31:11 compute-0 systemd[95324]: Stopped Create User's Volatile Files and Directories.
Nov 29 06:31:11 compute-0 systemd[95324]: Removed slice User Application Slice.
Nov 29 06:31:11 compute-0 systemd[95324]: Reached target Shutdown.
Nov 29 06:31:11 compute-0 systemd[95324]: Finished Exit the Session.
Nov 29 06:31:11 compute-0 systemd[95324]: Reached target Exit the Session.
Nov 29 06:31:11 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Nov 29 06:31:11 compute-0 systemd[1]: Stopped User Manager for UID 0.
Nov 29 06:31:11 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Nov 29 06:31:11 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Nov 29 06:31:11 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Nov 29 06:31:11 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Nov 29 06:31:11 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Nov 29 06:31:11 compute-0 sshd-session[95881]: Received disconnect from 103.179.56.44 port 34248:11: Bye Bye [preauth]
Nov 29 06:31:11 compute-0 sshd-session[95881]: Disconnected from invalid user hdfs 103.179.56.44 port 34248 [preauth]
Nov 29 06:31:11 compute-0 python3.9[96039]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:31:12 compute-0 sudo[96193]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxzysiodraheudtzmzpnamivzypdirvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397872.5088427-67-219446274233918/AnsiballZ_file.py'
Nov 29 06:31:12 compute-0 sudo[96193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:13 compute-0 python3.9[96195]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:31:13 compute-0 sudo[96193]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:13 compute-0 sudo[96345]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jchyyxacmavojnnmovseqeaqfvmqtxez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397873.3341806-67-26984332413859/AnsiballZ_file.py'
Nov 29 06:31:13 compute-0 sudo[96345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:13 compute-0 python3.9[96347]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:31:13 compute-0 sudo[96345]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:14 compute-0 sudo[96497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibuvrdhwsexgitpgxpffbelrzbrjljaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397873.9729924-67-172919135518794/AnsiballZ_file.py'
Nov 29 06:31:14 compute-0 sudo[96497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:14 compute-0 python3.9[96499]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:31:14 compute-0 sudo[96497]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:14 compute-0 sudo[96649]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvgtkdfltzdriyybhxpitgdovapnmzfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397874.6370199-67-155134655418266/AnsiballZ_file.py'
Nov 29 06:31:14 compute-0 sudo[96649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:15 compute-0 python3.9[96651]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:31:15 compute-0 sudo[96649]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:15 compute-0 sudo[96801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwicshryrbieczxvprodhkbxqdfhiydc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397875.2903562-67-214572893333100/AnsiballZ_file.py'
Nov 29 06:31:15 compute-0 sudo[96801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:15 compute-0 python3.9[96803]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:31:15 compute-0 sudo[96801]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:17 compute-0 sshd-session[96828]: Received disconnect from 45.202.211.6 port 53120:11: Bye Bye [preauth]
Nov 29 06:31:17 compute-0 sshd-session[96828]: Disconnected from authenticating user root 45.202.211.6 port 53120 [preauth]
Nov 29 06:31:21 compute-0 python3.9[96957]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:31:22 compute-0 sudo[97107]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skvvvhdqemtfyfbiqzcpocxgznczrtks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397882.0818195-199-74737007134394/AnsiballZ_seboolean.py'
Nov 29 06:31:22 compute-0 sudo[97107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:22 compute-0 python3.9[97109]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 29 06:31:23 compute-0 sudo[97107]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:24 compute-0 python3.9[97259]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:31:25 compute-0 python3.9[97380]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397883.6567757-223-108824885814028/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:31:25 compute-0 python3.9[97530]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:31:26 compute-0 python3.9[97651]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397885.2118251-268-107834036126221/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:31:26 compute-0 sudo[97801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xyyenvthrlyyqvaxgeyiytzysfdzehdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397886.6669796-319-97169591201370/AnsiballZ_setup.py'
Nov 29 06:31:26 compute-0 sudo[97801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:27 compute-0 python3.9[97803]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:31:27 compute-0 sudo[97801]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:27 compute-0 sudo[97885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxdanatkpmparffackmrqmyutfcxnsds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397886.6669796-319-97169591201370/AnsiballZ_dnf.py'
Nov 29 06:31:27 compute-0 sudo[97885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:28 compute-0 python3.9[97887]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:31:30 compute-0 sudo[97885]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:30 compute-0 ovn_controller[95281]: 2025-11-29T06:31:30Z|00025|memory|INFO|16384 kB peak resident set size after 29.8 seconds
Nov 29 06:31:30 compute-0 ovn_controller[95281]: 2025-11-29T06:31:30Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:585 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:3
Nov 29 06:31:30 compute-0 podman[97965]: 2025-11-29 06:31:30.861548887 +0000 UTC m=+0.130558270 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 29 06:31:31 compute-0 sudo[98065]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjpwufxdftkhcvxwwvjjcmbcexatgzog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397890.5048988-355-17858650298264/AnsiballZ_systemd.py'
Nov 29 06:31:31 compute-0 sudo[98065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:31 compute-0 python3.9[98067]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 06:31:31 compute-0 sudo[98065]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:32 compute-0 python3.9[98221]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:31:32 compute-0 python3.9[98342]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397891.8404624-379-45777364063749/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:31:33 compute-0 python3.9[98492]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:31:33 compute-0 python3.9[98613]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397893.0429983-379-122092451547147/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:31:35 compute-0 python3.9[98763]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:31:35 compute-0 python3.9[98884]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397894.800304-511-136021324682460/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:31:37 compute-0 python3.9[99034]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:31:37 compute-0 python3.9[99155]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397896.0263603-511-164326917265847/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=3fd0bbe67f8d6b170421a2b4395a288aa69eaea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:31:38 compute-0 python3.9[99305]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:31:38 compute-0 sudo[99457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfrzqhnpxyalnkwxkvznwggyfwllnubg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397898.7641997-625-228138491526707/AnsiballZ_file.py'
Nov 29 06:31:38 compute-0 sudo[99457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:39 compute-0 python3.9[99459]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:31:39 compute-0 sudo[99457]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:39 compute-0 sudo[99609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtwqbkzvnjdawkclbykcdnbxjwdecxlp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397899.389812-649-231751778119895/AnsiballZ_stat.py'
Nov 29 06:31:39 compute-0 sudo[99609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:39 compute-0 python3.9[99611]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:31:39 compute-0 sudo[99609]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:40 compute-0 sudo[99687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kukddoorqoyfydwierntjdohzxtlafcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397899.389812-649-231751778119895/AnsiballZ_file.py'
Nov 29 06:31:40 compute-0 sudo[99687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:40 compute-0 python3.9[99689]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:31:40 compute-0 sudo[99687]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:40 compute-0 sudo[99839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qufeysjdbdpeqjxqnhidacgyedrfglki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397900.4308162-649-126642100126880/AnsiballZ_stat.py'
Nov 29 06:31:40 compute-0 sudo[99839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:40 compute-0 python3.9[99841]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:31:40 compute-0 sudo[99839]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:41 compute-0 sudo[99917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czbwaboxqurdamcvjvmedewmpuwvcsou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397900.4308162-649-126642100126880/AnsiballZ_file.py'
Nov 29 06:31:41 compute-0 sudo[99917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:41 compute-0 python3.9[99919]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:31:41 compute-0 sudo[99917]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:41 compute-0 sudo[100071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibrunmzxpygyeyvximwbfbjdazfevcbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397901.707078-718-15494562632369/AnsiballZ_file.py'
Nov 29 06:31:41 compute-0 sudo[100071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:42 compute-0 python3.9[100073]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:31:42 compute-0 sudo[100071]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:42 compute-0 sshd-session[99933]: Invalid user root1 from 45.78.219.251 port 48546
Nov 29 06:31:42 compute-0 sudo[100223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwoptjiyixbhnwcjumndrqwvolrhuhch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397902.4596593-742-175933069815290/AnsiballZ_stat.py'
Nov 29 06:31:42 compute-0 sudo[100223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:42 compute-0 sshd-session[99933]: Received disconnect from 45.78.219.251 port 48546:11: Bye Bye [preauth]
Nov 29 06:31:42 compute-0 sshd-session[99933]: Disconnected from invalid user root1 45.78.219.251 port 48546 [preauth]
Nov 29 06:31:42 compute-0 python3.9[100225]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:31:43 compute-0 sudo[100223]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:43 compute-0 sudo[100301]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhtgugmwxkgwvyyydyfvmwgetefgdeqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397902.4596593-742-175933069815290/AnsiballZ_file.py'
Nov 29 06:31:43 compute-0 sudo[100301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:43 compute-0 python3.9[100303]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:31:43 compute-0 sudo[100301]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:43 compute-0 sudo[100453]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwfakkxtaauxfxbqubbgdngwwqqgplpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397903.6680875-778-99264530069496/AnsiballZ_stat.py'
Nov 29 06:31:43 compute-0 sudo[100453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:44 compute-0 python3.9[100455]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:31:44 compute-0 sudo[100453]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:44 compute-0 sudo[100531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dndcbqkmddvivaoyxycwluhifrfkuqgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397903.6680875-778-99264530069496/AnsiballZ_file.py'
Nov 29 06:31:44 compute-0 sudo[100531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:44 compute-0 python3.9[100533]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:31:44 compute-0 sudo[100531]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:45 compute-0 sudo[100683]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghufvilaqzzxowrgnutbawtnqhadvdka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397904.774192-814-26903012534134/AnsiballZ_systemd.py'
Nov 29 06:31:45 compute-0 sudo[100683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:45 compute-0 python3.9[100685]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:31:45 compute-0 systemd[1]: Reloading.
Nov 29 06:31:45 compute-0 systemd-rc-local-generator[100709]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:31:45 compute-0 systemd-sysv-generator[100715]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:31:46 compute-0 sudo[100683]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:46 compute-0 sudo[100872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-heywogrqimyovkztqorucreyjvyknmnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397906.6835632-838-199070163571374/AnsiballZ_stat.py'
Nov 29 06:31:46 compute-0 sudo[100872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:47 compute-0 python3.9[100874]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:31:47 compute-0 sudo[100872]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:47 compute-0 sudo[100950]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzwbyuiysaxyyqnnzsotrjxyeuoeusca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397906.6835632-838-199070163571374/AnsiballZ_file.py'
Nov 29 06:31:47 compute-0 sudo[100950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:47 compute-0 python3.9[100952]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:31:47 compute-0 sudo[100950]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:48 compute-0 sudo[101102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctixfqbsavvxjfjgxlfzbgprdzplzvdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397907.8413565-874-225475719713194/AnsiballZ_stat.py'
Nov 29 06:31:48 compute-0 sudo[101102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:48 compute-0 python3.9[101104]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:31:48 compute-0 sudo[101102]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:48 compute-0 sudo[101180]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-padnmmovyoeyrlonvedcaoahxiaqgsze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397907.8413565-874-225475719713194/AnsiballZ_file.py'
Nov 29 06:31:48 compute-0 sudo[101180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:48 compute-0 python3.9[101182]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:31:48 compute-0 sudo[101180]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:49 compute-0 sudo[101332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxhqveksgeylkxhqunymyuzsgwtgmlir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397909.127896-910-166890867947864/AnsiballZ_systemd.py'
Nov 29 06:31:49 compute-0 sudo[101332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:49 compute-0 python3.9[101334]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:31:49 compute-0 systemd[1]: Reloading.
Nov 29 06:31:49 compute-0 systemd-rc-local-generator[101360]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:31:49 compute-0 systemd-sysv-generator[101365]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:31:50 compute-0 systemd[1]: Starting Create netns directory...
Nov 29 06:31:50 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 29 06:31:50 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 29 06:31:50 compute-0 systemd[1]: Finished Create netns directory.
Nov 29 06:31:50 compute-0 sudo[101332]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:50 compute-0 sudo[101525]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhjwbtkhvwwfpehzblrthfmrasxutvxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397910.4436865-940-127516801020976/AnsiballZ_file.py'
Nov 29 06:31:50 compute-0 sudo[101525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:50 compute-0 python3.9[101527]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:31:50 compute-0 sudo[101525]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:51 compute-0 sudo[101677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ciscygbupcztnlkalzbcixfepexzhgra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397911.1395524-964-223032478267808/AnsiballZ_stat.py'
Nov 29 06:31:51 compute-0 sudo[101677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:51 compute-0 python3.9[101679]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:31:51 compute-0 sudo[101677]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:51 compute-0 sudo[101800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojfmkjtotwfezbebizmqvddigmphgmim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397911.1395524-964-223032478267808/AnsiballZ_copy.py'
Nov 29 06:31:51 compute-0 sudo[101800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:52 compute-0 python3.9[101802]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397911.1395524-964-223032478267808/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:31:52 compute-0 sudo[101800]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:52 compute-0 sudo[101952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsckdxxydlyeswvlquclrtswkwtlfqli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397912.7081604-1015-153735021615359/AnsiballZ_file.py'
Nov 29 06:31:52 compute-0 sudo[101952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:53 compute-0 python3.9[101954]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:31:53 compute-0 sudo[101952]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:53 compute-0 sudo[102104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnencvctnidesyakmrhiapzotkmeoxyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397913.4378903-1039-172139409917962/AnsiballZ_stat.py'
Nov 29 06:31:53 compute-0 sudo[102104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:53 compute-0 python3.9[102106]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:31:53 compute-0 sudo[102104]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:54 compute-0 sudo[102227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqfabioqneesanwpedsdfhavdtysbuzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397913.4378903-1039-172139409917962/AnsiballZ_copy.py'
Nov 29 06:31:54 compute-0 sudo[102227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:54 compute-0 python3.9[102229]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397913.4378903-1039-172139409917962/.source.json _original_basename=.xz14tw_s follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:31:54 compute-0 sudo[102227]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:55 compute-0 sudo[102379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwuqrkmxwhxajitknsknrrfqnojjrguv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397914.7971973-1084-21392858430764/AnsiballZ_file.py'
Nov 29 06:31:55 compute-0 sudo[102379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:55 compute-0 python3.9[102381]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:31:55 compute-0 sudo[102379]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:55 compute-0 sudo[102534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apumtciagatrofbfkggacjoukwnfuzzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397915.5533137-1108-168598876310698/AnsiballZ_stat.py'
Nov 29 06:31:55 compute-0 sudo[102534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:56 compute-0 sudo[102534]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:56 compute-0 sudo[102659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywuueyormruavrwxwocqivxqdqncowki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397915.5533137-1108-168598876310698/AnsiballZ_copy.py'
Nov 29 06:31:56 compute-0 sudo[102659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:56 compute-0 sudo[102659]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:56 compute-0 sshd-session[102537]: Invalid user Test from 179.125.24.202 port 55336
Nov 29 06:31:56 compute-0 sshd-session[102482]: Invalid user zjw from 1.214.197.163 port 54752
Nov 29 06:31:56 compute-0 sshd-session[102537]: Received disconnect from 179.125.24.202 port 55336:11: Bye Bye [preauth]
Nov 29 06:31:56 compute-0 sshd-session[102537]: Disconnected from invalid user Test 179.125.24.202 port 55336 [preauth]
Nov 29 06:31:57 compute-0 sshd-session[102482]: Received disconnect from 1.214.197.163 port 54752:11: Bye Bye [preauth]
Nov 29 06:31:57 compute-0 sshd-session[102482]: Disconnected from invalid user zjw 1.214.197.163 port 54752 [preauth]
Nov 29 06:31:57 compute-0 sudo[102811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwrnqssdrclmkbnoyijteqhvgtvthxyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397916.9964194-1159-246795875406945/AnsiballZ_container_config_data.py'
Nov 29 06:31:57 compute-0 sudo[102811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:57 compute-0 python3.9[102813]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Nov 29 06:31:57 compute-0 sudo[102811]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:58 compute-0 sudo[102963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-neuwwlheanxdztpttkeoatzyyoaicoer ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397917.8901212-1186-57534964747850/AnsiballZ_container_config_hash.py'
Nov 29 06:31:58 compute-0 sudo[102963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:58 compute-0 python3.9[102965]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 29 06:31:58 compute-0 sudo[102963]: pam_unix(sudo:session): session closed for user root
Nov 29 06:31:59 compute-0 sudo[103115]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrkxtieidojsazukqezniejqzjiaiwfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397918.932957-1213-19819121281051/AnsiballZ_podman_container_info.py'
Nov 29 06:31:59 compute-0 sudo[103115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:31:59 compute-0 python3.9[103117]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 29 06:31:59 compute-0 sudo[103115]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:00 compute-0 sshd-session[103118]: Received disconnect from 160.202.8.218 port 35778:11: Bye Bye [preauth]
Nov 29 06:32:00 compute-0 sshd-session[103118]: Disconnected from authenticating user root 160.202.8.218 port 35778 [preauth]
Nov 29 06:32:01 compute-0 sudo[103306]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gubeynmsqrxhqhwhnmaaoghxjtrlvefl ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764397920.5852041-1252-188202017078028/AnsiballZ_edpm_container_manage.py'
Nov 29 06:32:01 compute-0 sudo[103306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:01 compute-0 podman[103269]: 2025-11-29 06:32:01.177656457 +0000 UTC m=+0.110169376 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:32:01 compute-0 python3[103314]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 29 06:32:15 compute-0 podman[103336]: 2025-11-29 06:32:15.97843011 +0000 UTC m=+14.532390227 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 06:32:16 compute-0 podman[103434]: 2025-11-29 06:32:16.129217899 +0000 UTC m=+0.049195040 container create 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 29 06:32:16 compute-0 podman[103434]: 2025-11-29 06:32:16.100936469 +0000 UTC m=+0.020913630 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 06:32:16 compute-0 python3[103314]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host 
--pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 06:32:16 compute-0 sudo[103306]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:16 compute-0 sudo[103621]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qogmnspraqmgcvxgzpvkppdpqgwfnfxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397936.4324052-1276-236324786826777/AnsiballZ_stat.py'
Nov 29 06:32:16 compute-0 sudo[103621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:16 compute-0 python3.9[103623]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:32:16 compute-0 sudo[103621]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:17 compute-0 sudo[103775]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wuoudiaeryrjyjygchxqpjehdbmlgrvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397937.217181-1303-137209937426137/AnsiballZ_file.py'
Nov 29 06:32:17 compute-0 sudo[103775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:17 compute-0 python3.9[103777]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:32:17 compute-0 sudo[103775]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:18 compute-0 sudo[103851]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emhkpfdntuousxskentyhslwlkyuovho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397937.217181-1303-137209937426137/AnsiballZ_stat.py'
Nov 29 06:32:18 compute-0 sudo[103851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:18 compute-0 python3.9[103853]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:32:18 compute-0 sudo[103851]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:18 compute-0 sudo[104002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvurzjdxvsjdmbqmenbzsncageoszwlg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397938.2674084-1303-78795252377345/AnsiballZ_copy.py'
Nov 29 06:32:18 compute-0 sudo[104002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:18 compute-0 python3.9[104004]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764397938.2674084-1303-78795252377345/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:32:18 compute-0 sudo[104002]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:19 compute-0 sudo[104078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-annhwfgkqpkhhewjrzdzaizlzwnkxzyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397938.2674084-1303-78795252377345/AnsiballZ_systemd.py'
Nov 29 06:32:19 compute-0 sudo[104078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:19 compute-0 python3.9[104080]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 06:32:19 compute-0 systemd[1]: Reloading.
Nov 29 06:32:19 compute-0 systemd-sysv-generator[104109]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:32:19 compute-0 systemd-rc-local-generator[104106]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:32:20 compute-0 sudo[104078]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:20 compute-0 sudo[104190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plqkslviniortdukecsmwhdfiolwqvtb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397938.2674084-1303-78795252377345/AnsiballZ_systemd.py'
Nov 29 06:32:20 compute-0 sudo[104190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:20 compute-0 python3.9[104192]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:32:20 compute-0 systemd[1]: Reloading.
Nov 29 06:32:20 compute-0 systemd-rc-local-generator[104223]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:32:20 compute-0 systemd-sysv-generator[104228]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:32:21 compute-0 systemd[1]: Starting ovn_metadata_agent container...
Nov 29 06:32:21 compute-0 systemd[1]: Started libcrun container.
Nov 29 06:32:21 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/80474e73ab44dc4613a5df4c9e01b7e80d321530c23c55d9bfd81ec12bcfba7d/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Nov 29 06:32:21 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/80474e73ab44dc4613a5df4c9e01b7e80d321530c23c55d9bfd81ec12bcfba7d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 06:32:22 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e.
Nov 29 06:32:22 compute-0 podman[104233]: 2025-11-29 06:32:22.840618108 +0000 UTC m=+1.683731244 container init 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS 
Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 06:32:22 compute-0 ovn_metadata_agent[104249]: + sudo -E kolla_set_configs
Nov 29 06:32:22 compute-0 podman[104233]: 2025-11-29 06:32:22.863154873 +0000 UTC m=+1.706268029 container start 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 06:32:22 compute-0 ovn_metadata_agent[104249]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 06:32:22 compute-0 ovn_metadata_agent[104249]: INFO:__main__:Validating config file
Nov 29 06:32:22 compute-0 ovn_metadata_agent[104249]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 06:32:22 compute-0 ovn_metadata_agent[104249]: INFO:__main__:Copying service configuration files
Nov 29 06:32:22 compute-0 ovn_metadata_agent[104249]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Nov 29 06:32:22 compute-0 ovn_metadata_agent[104249]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Nov 29 06:32:22 compute-0 ovn_metadata_agent[104249]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Nov 29 06:32:22 compute-0 ovn_metadata_agent[104249]: INFO:__main__:Writing out command to execute
Nov 29 06:32:22 compute-0 ovn_metadata_agent[104249]: INFO:__main__:Setting permission for /var/lib/neutron
Nov 29 06:32:22 compute-0 ovn_metadata_agent[104249]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Nov 29 06:32:22 compute-0 ovn_metadata_agent[104249]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Nov 29 06:32:22 compute-0 ovn_metadata_agent[104249]: INFO:__main__:Setting permission for /var/lib/neutron/external
Nov 29 06:32:22 compute-0 ovn_metadata_agent[104249]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Nov 29 06:32:22 compute-0 ovn_metadata_agent[104249]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Nov 29 06:32:22 compute-0 ovn_metadata_agent[104249]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Nov 29 06:32:22 compute-0 ovn_metadata_agent[104249]: ++ cat /run_command
Nov 29 06:32:22 compute-0 ovn_metadata_agent[104249]: + CMD=neutron-ovn-metadata-agent
Nov 29 06:32:22 compute-0 ovn_metadata_agent[104249]: + ARGS=
Nov 29 06:32:22 compute-0 ovn_metadata_agent[104249]: + sudo kolla_copy_cacerts
Nov 29 06:32:22 compute-0 ovn_metadata_agent[104249]: + [[ ! -n '' ]]
Nov 29 06:32:22 compute-0 ovn_metadata_agent[104249]: + . kolla_extend_start
Nov 29 06:32:22 compute-0 ovn_metadata_agent[104249]: Running command: 'neutron-ovn-metadata-agent'
Nov 29 06:32:22 compute-0 ovn_metadata_agent[104249]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Nov 29 06:32:22 compute-0 ovn_metadata_agent[104249]: + umask 0022
Nov 29 06:32:22 compute-0 ovn_metadata_agent[104249]: + exec neutron-ovn-metadata-agent
Nov 29 06:32:23 compute-0 edpm-start-podman-container[104233]: ovn_metadata_agent
Nov 29 06:32:23 compute-0 edpm-start-podman-container[104232]: Creating additional drop-in dependency for "ovn_metadata_agent" (0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e)
Nov 29 06:32:23 compute-0 systemd[1]: Reloading.
Nov 29 06:32:23 compute-0 podman[104255]: 2025-11-29 06:32:23.687521603 +0000 UTC m=+0.814155669 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent)
Nov 29 06:32:23 compute-0 systemd-rc-local-generator[104330]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:32:23 compute-0 systemd-sysv-generator[104334]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:32:23 compute-0 systemd[1]: Started ovn_metadata_agent container.
Nov 29 06:32:23 compute-0 sudo[104190]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.746 104254 INFO neutron.common.config [-] Logging enabled!
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.746 104254 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.747 104254 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.747 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.747 104254 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.747 104254 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.748 104254 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.748 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.748 104254 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.748 104254 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.748 104254 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.748 104254 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.748 104254 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.748 104254 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.749 104254 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.749 104254 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.749 104254 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.749 104254 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.749 104254 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.749 104254 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.749 104254 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.749 104254 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.749 104254 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.749 104254 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.750 104254 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.750 104254 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.750 104254 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.750 104254 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.750 104254 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.750 104254 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.750 104254 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.750 104254 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.750 104254 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.751 104254 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.751 104254 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.751 104254 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.751 104254 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.751 104254 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.751 104254 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.751 104254 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.751 104254 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.751 104254 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.752 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.752 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.752 104254 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.752 104254 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.752 104254 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.752 104254 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.752 104254 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.752 104254 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.752 104254 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.752 104254 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.753 104254 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.753 104254 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.753 104254 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.753 104254 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.753 104254 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.753 104254 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.753 104254 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.753 104254 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.753 104254 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.753 104254 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.754 104254 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.754 104254 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.754 104254 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.754 104254 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.754 104254 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.754 104254 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.754 104254 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.754 104254 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.755 104254 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.755 104254 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.755 104254 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.755 104254 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-cell1-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.755 104254 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.755 104254 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.755 104254 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.755 104254 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.755 104254 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.756 104254 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.756 104254 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.756 104254 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.756 104254 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.756 104254 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.756 104254 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.756 104254 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.756 104254 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.756 104254 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.756 104254 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.757 104254 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.757 104254 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.757 104254 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.757 104254 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.757 104254 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.757 104254 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.757 104254 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.757 104254 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.757 104254 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.757 104254 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.758 104254 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.758 104254 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.758 104254 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.758 104254 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.758 104254 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.758 104254 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.758 104254 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.758 104254 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.758 104254 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.758 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.759 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.759 104254 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.759 104254 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.759 104254 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.759 104254 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.759 104254 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.759 104254 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.759 104254 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.759 104254 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.760 104254 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.760 104254 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.760 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.760 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.760 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.760 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.760 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.760 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.761 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.761 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.761 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.761 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.761 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.761 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.761 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.761 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.761 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.761 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.762 104254 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.762 104254 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.762 104254 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.762 104254 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.762 104254 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.762 104254 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.762 104254 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.762 104254 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.762 104254 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.763 104254 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.763 104254 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.763 104254 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.763 104254 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.763 104254 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.763 104254 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.763 104254 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.763 104254 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.763 104254 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.763 104254 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.764 104254 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.764 104254 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.764 104254 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.764 104254 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.764 104254 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.764 104254 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.764 104254 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.764 104254 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.764 104254 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.765 104254 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.765 104254 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.765 104254 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.765 104254 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.765 104254 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.765 104254 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.765 104254 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.765 104254 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.765 104254 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.765 104254 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.766 104254 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.766 104254 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.766 104254 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.766 104254 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.766 104254 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.766 104254 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.766 104254 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.766 104254 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.767 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.767 104254 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.767 104254 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.767 104254 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.767 104254 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.767 104254 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.767 104254 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.767 104254 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.768 104254 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.768 104254 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.768 104254 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.768 104254 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.768 104254 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.768 104254 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.768 104254 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.768 104254 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.769 104254 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.769 104254 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.769 104254 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.769 104254 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.769 104254 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.769 104254 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.769 104254 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.769 104254 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.769 104254 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.770 104254 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.770 104254 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.770 104254 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.770 104254 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.770 104254 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.770 104254 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.770 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.770 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.771 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.771 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.771 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.771 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.771 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.771 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.771 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.771 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.771 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.772 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.772 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.772 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.772 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.772 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.772 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.772 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.772 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.773 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.773 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.773 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.773 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.773 104254 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.773 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.773 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.773 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.774 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.774 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.774 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.774 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.774 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.774 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.774 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.774 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.774 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.775 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.775 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.775 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.775 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.775 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.775 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.775 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.775 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.776 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.776 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.776 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.776 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.776 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.776 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.776 104254 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.776 104254 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.776 104254 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.777 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.777 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.777 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.777 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.777 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.777 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.777 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.777 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.777 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.778 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.778 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.778 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.778 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.778 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.778 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.778 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.778 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.778 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.779 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.779 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.779 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.779 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.779 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.779 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.779 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.779 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.779 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.780 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.780 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.780 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.780 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.780 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.780 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.780 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.780 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.780 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.781 104254 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.781 104254 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.790 104254 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.790 104254 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.790 104254 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.791 104254 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.791 104254 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.802 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 7525db09-7529-4df7-96c0-bba03a4d5548 (UUID: 7525db09-7529-4df7-96c0-bba03a4d5548) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.825 104254 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.826 104254 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.826 104254 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.826 104254 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.829 104254 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.835 104254 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.841 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '7525db09-7529-4df7-96c0-bba03a4d5548'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], external_ids={}, name=7525db09-7529-4df7-96c0-bba03a4d5548, nb_cfg_timestamp=1764397869018, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.842 104254 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f3578b7da90>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.842 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.843 104254 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.843 104254 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.843 104254 INFO oslo_service.service [-] Starting 1 workers
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.847 104254 DEBUG oslo_service.service [-] Started child 104361 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.850 104361 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-431745'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.850 104254 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmp0ecupl02/privsep.sock']
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.875 104361 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.876 104361 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.876 104361 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.879 104361 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.885 104361 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Nov 29 06:32:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:24.890 104361 INFO eventlet.wsgi.server [-] (104361) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Nov 29 06:32:25 compute-0 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Nov 29 06:32:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:25.537 104254 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 29 06:32:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:25.538 104254 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp0ecupl02/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Nov 29 06:32:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:25.386 104366 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 29 06:32:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:25.393 104366 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 29 06:32:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:25.397 104366 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Nov 29 06:32:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:25.398 104366 INFO oslo.privsep.daemon [-] privsep daemon running as pid 104366
Nov 29 06:32:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:25.542 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[f99af5cb-8408-44d6-b957-feb8d049ab10]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.084 104366 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.084 104366 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.084 104366 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:32:26 compute-0 sshd-session[95886]: Connection closed by 192.168.122.30 port 40824
Nov 29 06:32:26 compute-0 sshd-session[95883]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:32:26 compute-0 systemd[1]: session-21.scope: Deactivated successfully.
Nov 29 06:32:26 compute-0 systemd[1]: session-21.scope: Consumed 47.848s CPU time.
Nov 29 06:32:26 compute-0 systemd-logind[788]: Session 21 logged out. Waiting for processes to exit.
Nov 29 06:32:26 compute-0 systemd-logind[788]: Removed session 21.
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.626 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[574cec0f-ac0d-4b2a-abff-60bc984c7e9e]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.629 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, column=external_ids, values=({'neutron:ovn-metadata-id': 'ee3577c8-3a9f-5f22-abf0-d76191f88fab'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.695 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.714 104254 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.715 104254 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.715 104254 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.715 104254 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.715 104254 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.715 104254 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.715 104254 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.715 104254 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.715 104254 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.715 104254 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.716 104254 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.716 104254 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.716 104254 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.716 104254 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.716 104254 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.716 104254 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.716 104254 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.716 104254 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.717 104254 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.717 104254 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.717 104254 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.717 104254 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.717 104254 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.717 104254 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.717 104254 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.717 104254 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.718 104254 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.718 104254 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.718 104254 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.718 104254 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.718 104254 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.718 104254 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.718 104254 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.718 104254 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.718 104254 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.719 104254 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.719 104254 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.719 104254 DEBUG oslo_service.service [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.719 104254 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.719 104254 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.719 104254 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.720 104254 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.720 104254 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.720 104254 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.720 104254 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.720 104254 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.720 104254 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.720 104254 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.720 104254 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.720 104254 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.721 104254 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.721 104254 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.721 104254 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.721 104254 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.721 104254 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.721 104254 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.721 104254 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.721 104254 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.722 104254 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.722 104254 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.722 104254 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.722 104254 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.722 104254 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.722 104254 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.722 104254 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.722 104254 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.722 104254 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.723 104254 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.723 104254 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.723 104254 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.723 104254 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.723 104254 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.723 104254 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.723 104254 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-cell1-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.723 104254 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.723 104254 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.723 104254 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.724 104254 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.724 104254 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.724 104254 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.724 104254 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.724 104254 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.724 104254 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.724 104254 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.724 104254 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.725 104254 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.725 104254 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.725 104254 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.725 104254 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.725 104254 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.725 104254 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.725 104254 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.725 104254 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.725 104254 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.726 104254 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.726 104254 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.726 104254 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.726 104254 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.726 104254 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.726 104254 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.726 104254 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.726 104254 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.726 104254 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.727 104254 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.727 104254 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.727 104254 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.727 104254 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.727 104254 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.727 104254 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.727 104254 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.727 104254 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.727 104254 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.728 104254 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.728 104254 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.728 104254 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.728 104254 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.728 104254 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.728 104254 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.728 104254 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.728 104254 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.729 104254 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.729 104254 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.729 104254 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.729 104254 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.729 104254 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.729 104254 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.729 104254 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.729 104254 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.729 104254 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.730 104254 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.730 104254 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.730 104254 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.730 104254 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.730 104254 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.730 104254 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.730 104254 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.730 104254 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.730 104254 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.731 104254 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.731 104254 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.731 104254 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.731 104254 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.731 104254 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.731 104254 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.731 104254 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.731 104254 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.732 104254 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.732 104254 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.732 104254 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.732 104254 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.732 104254 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.732 104254 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.732 104254 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.733 104254 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.733 104254 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.733 104254 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.733 104254 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.733 104254 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.733 104254 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.733 104254 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.733 104254 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.733 104254 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.733 104254 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.734 104254 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.734 104254 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.734 104254 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.734 104254 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.734 104254 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.734 104254 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.734 104254 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.734 104254 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.734 104254 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.734 104254 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.735 104254 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.735 104254 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.735 104254 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.735 104254 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.735 104254 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.735 104254 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.735 104254 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.735 104254 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.735 104254 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.736 104254 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.736 104254 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.736 104254 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.736 104254 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.736 104254 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.736 104254 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.736 104254 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.736 104254 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.736 104254 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.737 104254 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.737 104254 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.737 104254 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.737 104254 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.737 104254 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.737 104254 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.737 104254 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.737 104254 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.737 104254 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.738 104254 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.738 104254 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.738 104254 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.738 104254 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.738 104254 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.738 104254 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.738 104254 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.738 104254 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.738 104254 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.738 104254 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.738 104254 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.739 104254 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.739 104254 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.739 104254 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.739 104254 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.739 104254 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.739 104254 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.739 104254 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.739 104254 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.739 104254 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.739 104254 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.740 104254 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.740 104254 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.740 104254 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.740 104254 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.740 104254 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.740 104254 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.740 104254 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.740 104254 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.740 104254 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.740 104254 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.741 104254 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.741 104254 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.741 104254 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.741 104254 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.741 104254 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.741 104254 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.741 104254 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.741 104254 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.742 104254 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.742 104254 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.742 104254 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.742 104254 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.742 104254 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.742 104254 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.742 104254 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.742 104254 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.742 104254 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.742 104254 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.743 104254 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.743 104254 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.743 104254 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.743 104254 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.743 104254 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.743 104254 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.743 104254 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.743 104254 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.743 104254 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.744 104254 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.744 104254 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.744 104254 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.744 104254 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.744 104254 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.744 104254 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.744 104254 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.744 104254 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.744 104254 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.745 104254 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.745 104254 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.745 104254 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.745 104254 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.745 104254 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.745 104254 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.745 104254 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.745 104254 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.745 104254 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.746 104254 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.746 104254 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.746 104254 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.746 104254 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.746 104254 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.746 104254 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.746 104254 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.746 104254 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.746 104254 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.746 104254 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.747 104254 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.747 104254 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.747 104254 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.747 104254 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.747 104254 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.747 104254 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.747 104254 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.747 104254 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.747 104254 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.748 104254 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.748 104254 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.748 104254 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.748 104254 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.748 104254 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.748 104254 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.748 104254 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.748 104254 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:32:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:32:26.749 104254 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 29 06:32:31 compute-0 sshd-session[104371]: Received disconnect from 45.202.211.6 port 49920:11: Bye Bye [preauth]
Nov 29 06:32:31 compute-0 sshd-session[104371]: Disconnected from authenticating user root 45.202.211.6 port 49920 [preauth]
Nov 29 06:32:31 compute-0 podman[104373]: 2025-11-29 06:32:31.910063269 +0000 UTC m=+0.177208311 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 06:32:32 compute-0 sshd-session[104400]: Accepted publickey for zuul from 192.168.122.30 port 55806 ssh2: ECDSA SHA256:Ey+6YDlYfkE2dRe/gjWhHvvrJHee4xZo2Q1JojEuVBA
Nov 29 06:32:32 compute-0 systemd-logind[788]: New session 22 of user zuul.
Nov 29 06:32:32 compute-0 systemd[1]: Started Session 22 of User zuul.
Nov 29 06:32:32 compute-0 sshd-session[104400]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:32:33 compute-0 python3.9[104553]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:32:34 compute-0 sudo[104707]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkirrwhxbatgcprvkrstzirnvkfuhbwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397954.032362-67-107244985737281/AnsiballZ_command.py'
Nov 29 06:32:34 compute-0 sudo[104707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:34 compute-0 python3.9[104709]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:32:34 compute-0 sudo[104707]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:36 compute-0 sudo[104873]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oslvgrqjcqbteookmoeyjmyrmpifdtqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397956.0224614-100-90347211509566/AnsiballZ_systemd_service.py'
Nov 29 06:32:36 compute-0 sudo[104873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:36 compute-0 python3.9[104875]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 06:32:36 compute-0 systemd[1]: Reloading.
Nov 29 06:32:37 compute-0 systemd-rc-local-generator[104898]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:32:37 compute-0 systemd-sysv-generator[104904]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:32:37 compute-0 sudo[104873]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:39 compute-0 sshd-session[104935]: Invalid user hb from 36.50.176.16 port 35082
Nov 29 06:32:39 compute-0 sshd-session[104935]: Received disconnect from 36.50.176.16 port 35082:11: Bye Bye [preauth]
Nov 29 06:32:39 compute-0 sshd-session[104935]: Disconnected from invalid user hb 36.50.176.16 port 35082 [preauth]
Nov 29 06:32:41 compute-0 python3.9[105062]: ansible-ansible.builtin.service_facts Invoked
Nov 29 06:32:41 compute-0 network[105079]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 06:32:41 compute-0 network[105080]: 'network-scripts' will be removed from distribution in near future.
Nov 29 06:32:41 compute-0 network[105081]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 06:32:45 compute-0 sudo[105340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jajfswnyhawgfniydguzauidkhmqjiqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397965.403812-157-91592764587048/AnsiballZ_systemd_service.py'
Nov 29 06:32:45 compute-0 sudo[105340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:45 compute-0 python3.9[105342]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:32:46 compute-0 sudo[105340]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:46 compute-0 sudo[105493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrzuvsrfgpyydvyjcuxljycmyahtezst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397966.134646-157-138724274975994/AnsiballZ_systemd_service.py'
Nov 29 06:32:46 compute-0 sudo[105493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:47 compute-0 python3.9[105495]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:32:47 compute-0 sudo[105493]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:48 compute-0 sudo[105646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgaputyyvupmjemcgfkgnnkocfwwdnog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397968.003781-157-204691351731118/AnsiballZ_systemd_service.py'
Nov 29 06:32:48 compute-0 sudo[105646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:51 compute-0 python3.9[105648]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:32:51 compute-0 sudo[105646]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:51 compute-0 sudo[105799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umhxsexjitwjgldmrhnbcaizwkijfrfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397971.677853-157-255101337903961/AnsiballZ_systemd_service.py'
Nov 29 06:32:51 compute-0 sudo[105799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:52 compute-0 python3.9[105801]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:32:52 compute-0 sudo[105799]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:52 compute-0 sudo[105952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljwpzhdrigxuzzlzbhcdhetkefjglwrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397972.467326-157-21843948078118/AnsiballZ_systemd_service.py'
Nov 29 06:32:52 compute-0 sudo[105952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:52 compute-0 python3.9[105954]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:32:53 compute-0 sudo[105952]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:53 compute-0 sudo[106105]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-luvacfexfxfcbgxhhsknrtsjuzlzfyef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397973.107394-157-171624275733542/AnsiballZ_systemd_service.py'
Nov 29 06:32:53 compute-0 sudo[106105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:53 compute-0 python3.9[106107]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:32:53 compute-0 sudo[106105]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:54 compute-0 sudo[106264]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jawrfsjspqjcvipeindcaasmgzkyngas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397973.87842-157-96979410877675/AnsiballZ_systemd_service.py'
Nov 29 06:32:54 compute-0 sudo[106264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:54 compute-0 podman[106232]: 2025-11-29 06:32:54.20865389 +0000 UTC m=+0.069936228 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 29 06:32:54 compute-0 python3.9[106269]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:32:54 compute-0 sudo[106264]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:56 compute-0 sudo[106431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycmoczdbuvxlpkwzlvkfntrtsbdbpofg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397975.5817726-313-18476757203498/AnsiballZ_file.py'
Nov 29 06:32:56 compute-0 sudo[106431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:56 compute-0 python3.9[106433]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:32:56 compute-0 sudo[106431]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:56 compute-0 sudo[106583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bcnumfuotdmtpyjjwgtvgpsapptoyhxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397976.358884-313-197393349827141/AnsiballZ_file.py'
Nov 29 06:32:56 compute-0 sudo[106583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:57 compute-0 python3.9[106585]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:32:57 compute-0 sudo[106583]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:57 compute-0 sudo[106735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppntnwtrydvfkduzsznwqdlwywkuzfxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397977.374606-313-168292797345305/AnsiballZ_file.py'
Nov 29 06:32:57 compute-0 sudo[106735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:57 compute-0 python3.9[106737]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:32:57 compute-0 sudo[106735]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:58 compute-0 sudo[106887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brlscankvygwbgqzwguierzdxcsvaztz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397977.9487777-313-43826965379802/AnsiballZ_file.py'
Nov 29 06:32:58 compute-0 sudo[106887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:58 compute-0 python3.9[106889]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:32:58 compute-0 sudo[106887]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:58 compute-0 sudo[107039]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-biwlsscukkggroaylymmdapphqaxvnyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397978.5161424-313-164839357130165/AnsiballZ_file.py'
Nov 29 06:32:58 compute-0 sudo[107039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:58 compute-0 python3.9[107041]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:32:59 compute-0 sudo[107039]: pam_unix(sudo:session): session closed for user root
Nov 29 06:32:59 compute-0 sudo[107191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qnpyneavvsyzixnuddddbzbzpxqbgytd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397979.1232371-313-180988162479639/AnsiballZ_file.py'
Nov 29 06:32:59 compute-0 sudo[107191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:32:59 compute-0 python3.9[107193]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:32:59 compute-0 sudo[107191]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:00 compute-0 sudo[107343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shxgaavrzntfkikietqedklyguyskhlk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397979.7786565-313-152837026821487/AnsiballZ_file.py'
Nov 29 06:33:00 compute-0 sudo[107343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:00 compute-0 python3.9[107345]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:33:00 compute-0 sudo[107343]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:02 compute-0 sudo[107508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-loxbashjdfbmeptrakbynneyztmwqywr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397981.840896-463-94172627933207/AnsiballZ_file.py'
Nov 29 06:33:02 compute-0 sudo[107508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:02 compute-0 podman[107469]: 2025-11-29 06:33:02.169972284 +0000 UTC m=+0.089514640 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Nov 29 06:33:02 compute-0 python3.9[107516]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:33:02 compute-0 sudo[107508]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:02 compute-0 sudo[107673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqqtxunnzuxwjizehjfaoxyecizwwvbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397982.4597719-463-109840375051671/AnsiballZ_file.py'
Nov 29 06:33:02 compute-0 sudo[107673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:02 compute-0 python3.9[107675]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:33:02 compute-0 sudo[107673]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:03 compute-0 sudo[107825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxhdfgrrlcffzrfnbhggwroqkfzkxvru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397983.0697176-463-201553227562533/AnsiballZ_file.py'
Nov 29 06:33:03 compute-0 sudo[107825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:03 compute-0 python3.9[107827]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:33:03 compute-0 sudo[107825]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:03 compute-0 sudo[107977]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imirsqomtfdumhnyovgbqdjdvrwueedy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397983.6241305-463-222316306656969/AnsiballZ_file.py'
Nov 29 06:33:03 compute-0 sudo[107977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:04 compute-0 python3.9[107979]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:33:04 compute-0 sudo[107977]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:04 compute-0 sudo[108129]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytaprxrvhjbikjuhpwveajohanqajmzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397984.2456264-463-5265803225623/AnsiballZ_file.py'
Nov 29 06:33:04 compute-0 sudo[108129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:04 compute-0 python3.9[108131]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:33:04 compute-0 sudo[108129]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:05 compute-0 sudo[108281]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyjxcgqtfbgmwagzqqhpdjngzawwtldz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397984.8324263-463-200342648147565/AnsiballZ_file.py'
Nov 29 06:33:05 compute-0 sudo[108281]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:05 compute-0 python3.9[108283]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:33:05 compute-0 sudo[108281]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:05 compute-0 sudo[108433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-birixeckiutufkzvngdqzgciczmpeskx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397985.43615-463-199105448345969/AnsiballZ_file.py'
Nov 29 06:33:05 compute-0 sudo[108433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:05 compute-0 python3.9[108435]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:33:05 compute-0 sudo[108433]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:07 compute-0 sudo[108585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkcraqokzmbcyzqkrsobslgkjshdihnv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397986.8300662-616-60279806002439/AnsiballZ_command.py'
Nov 29 06:33:07 compute-0 sudo[108585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:07 compute-0 python3.9[108587]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:33:07 compute-0 sudo[108585]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:08 compute-0 python3.9[108741]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 29 06:33:08 compute-0 sudo[108891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkcaibbbvaytdtxwcheonbhqnbnkjvbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397988.6887443-670-167747498883121/AnsiballZ_systemd_service.py'
Nov 29 06:33:08 compute-0 sudo[108891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:09 compute-0 python3.9[108893]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 06:33:09 compute-0 systemd[1]: Reloading.
Nov 29 06:33:09 compute-0 systemd-sysv-generator[108924]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:33:09 compute-0 systemd-rc-local-generator[108921]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:33:09 compute-0 sshd-session[108691]: Invalid user odoo from 103.179.56.44 port 34642
Nov 29 06:33:09 compute-0 sudo[108891]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:09 compute-0 sshd-session[108691]: Received disconnect from 103.179.56.44 port 34642:11: Bye Bye [preauth]
Nov 29 06:33:09 compute-0 sshd-session[108691]: Disconnected from invalid user odoo 103.179.56.44 port 34642 [preauth]
Nov 29 06:33:10 compute-0 sudo[109078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qokaykabvsdslozatkxhpwcwvaufpzmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397989.8780587-694-254325470676850/AnsiballZ_command.py'
Nov 29 06:33:10 compute-0 sudo[109078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:10 compute-0 python3.9[109080]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:33:10 compute-0 sudo[109078]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:10 compute-0 sudo[109231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spbxplryfntiwjtuqzmglxvmpsafuizq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397990.6088538-694-231628832508422/AnsiballZ_command.py'
Nov 29 06:33:10 compute-0 sudo[109231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:11 compute-0 python3.9[109233]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:33:11 compute-0 sudo[109231]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:11 compute-0 sudo[109384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhsscygmxpmpvfkxoweswjurqjqqmcde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397991.227949-694-107537919565616/AnsiballZ_command.py'
Nov 29 06:33:11 compute-0 sudo[109384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:11 compute-0 python3.9[109386]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:33:11 compute-0 sudo[109384]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:12 compute-0 sudo[109537]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxmxqvvtpqgpewjovjfevukhtzqqhtyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397991.8411446-694-140622756267791/AnsiballZ_command.py'
Nov 29 06:33:12 compute-0 sudo[109537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:12 compute-0 python3.9[109539]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:33:12 compute-0 sudo[109537]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:12 compute-0 sudo[109690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vedpmkohqouhqplyhotjlqeuupvrytjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397992.4024384-694-125819034658755/AnsiballZ_command.py'
Nov 29 06:33:12 compute-0 sudo[109690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:12 compute-0 python3.9[109692]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:33:12 compute-0 sudo[109690]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:13 compute-0 sudo[109843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fooyookoydqhfbjrzgdizwzjdkplowlh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397993.0153418-694-232574551902644/AnsiballZ_command.py'
Nov 29 06:33:13 compute-0 sudo[109843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:13 compute-0 python3.9[109845]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:33:13 compute-0 sudo[109843]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:13 compute-0 sudo[109996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kewcpdggpqrpxxbcsmnizfbmufrulvle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397993.6860824-694-129891832961625/AnsiballZ_command.py'
Nov 29 06:33:13 compute-0 sudo[109996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:14 compute-0 python3.9[109998]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:33:14 compute-0 sudo[109996]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:15 compute-0 sudo[110149]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhkrjqihrixnzgdrjfwsjxmgphhmfkpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397995.462464-856-195215614743950/AnsiballZ_getent.py'
Nov 29 06:33:15 compute-0 sudo[110149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:16 compute-0 python3.9[110151]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Nov 29 06:33:16 compute-0 sudo[110149]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:17 compute-0 sudo[110302]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htvpsmgwiljprytpxcjyxxcbxxddygog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397996.49022-880-128629979103221/AnsiballZ_group.py'
Nov 29 06:33:17 compute-0 sudo[110302]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:17 compute-0 python3.9[110304]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 29 06:33:17 compute-0 groupadd[110305]: group added to /etc/group: name=libvirt, GID=42473
Nov 29 06:33:17 compute-0 groupadd[110305]: group added to /etc/gshadow: name=libvirt
Nov 29 06:33:17 compute-0 groupadd[110305]: new group: name=libvirt, GID=42473
Nov 29 06:33:17 compute-0 sudo[110302]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:18 compute-0 sudo[110460]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajkibqmmujdxkvrfanureburijszbuvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397998.3704205-904-252326069227195/AnsiballZ_user.py'
Nov 29 06:33:18 compute-0 sudo[110460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:19 compute-0 python3.9[110462]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 29 06:33:19 compute-0 useradd[110464]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Nov 29 06:33:19 compute-0 sudo[110460]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:19 compute-0 sudo[110620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjhtgcxrvrhrbthxugptleyoixgnwzht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397999.713238-937-100940064139657/AnsiballZ_setup.py'
Nov 29 06:33:19 compute-0 sudo[110620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:20 compute-0 python3.9[110622]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:33:20 compute-0 sudo[110620]: pam_unix(sudo:session): session closed for user root
Nov 29 06:33:20 compute-0 sudo[110704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wymecmbjyxcbneufaecbhepvcfvwcizc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764397999.713238-937-100940064139657/AnsiballZ_dnf.py'
Nov 29 06:33:20 compute-0 sudo[110704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:33:21 compute-0 python3.9[110706]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:33:23 compute-0 sshd-session[110708]: Invalid user sonarqube from 179.125.24.202 port 34864
Nov 29 06:33:23 compute-0 sshd-session[110708]: Received disconnect from 179.125.24.202 port 34864:11: Bye Bye [preauth]
Nov 29 06:33:23 compute-0 sshd-session[110708]: Disconnected from invalid user sonarqube 179.125.24.202 port 34864 [preauth]
Nov 29 06:33:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:33:24.784 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:33:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:33:24.786 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:33:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:33:24.786 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:33:24 compute-0 podman[110719]: 2025-11-29 06:33:24.823906239 +0000 UTC m=+0.088117914 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 29 06:33:29 compute-0 sshd-session[110741]: Invalid user Test from 1.214.197.163 port 56138
Nov 29 06:33:30 compute-0 sshd-session[110741]: Received disconnect from 1.214.197.163 port 56138:11: Bye Bye [preauth]
Nov 29 06:33:30 compute-0 sshd-session[110741]: Disconnected from invalid user Test 1.214.197.163 port 56138 [preauth]
Nov 29 06:33:32 compute-0 podman[110745]: 2025-11-29 06:33:32.827418907 +0000 UTC m=+0.092058697 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 29 06:33:33 compute-0 sshd-session[110743]: Invalid user ranger from 160.202.8.218 port 57548
Nov 29 06:33:33 compute-0 sshd-session[110743]: Received disconnect from 160.202.8.218 port 57548:11: Bye Bye [preauth]
Nov 29 06:33:33 compute-0 sshd-session[110743]: Disconnected from invalid user ranger 160.202.8.218 port 57548 [preauth]
Nov 29 06:33:46 compute-0 sshd-session[110946]: Invalid user ubuntu from 45.202.211.6 port 33512
Nov 29 06:33:47 compute-0 sshd-session[110946]: Received disconnect from 45.202.211.6 port 33512:11: Bye Bye [preauth]
Nov 29 06:33:47 compute-0 sshd-session[110946]: Disconnected from invalid user ubuntu 45.202.211.6 port 33512 [preauth]
Nov 29 06:33:55 compute-0 podman[110954]: 2025-11-29 06:33:55.796925368 +0000 UTC m=+0.058784427 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 29 06:34:00 compute-0 kernel: SELinux:  Converting 2756 SID table entries...
Nov 29 06:34:00 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 06:34:00 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 29 06:34:00 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 06:34:00 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 29 06:34:00 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 06:34:00 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 06:34:00 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 06:34:03 compute-0 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Nov 29 06:34:03 compute-0 podman[110981]: 2025-11-29 06:34:03.848778501 +0000 UTC m=+0.097189368 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 06:34:05 compute-0 sshd-session[110994]: Invalid user jack from 45.78.219.251 port 57970
Nov 29 06:34:07 compute-0 sshd-session[110994]: Received disconnect from 45.78.219.251 port 57970:11: Bye Bye [preauth]
Nov 29 06:34:07 compute-0 sshd-session[110994]: Disconnected from invalid user jack 45.78.219.251 port 57970 [preauth]
Nov 29 06:34:11 compute-0 kernel: SELinux:  Converting 2756 SID table entries...
Nov 29 06:34:11 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 06:34:11 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 29 06:34:11 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 06:34:11 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 29 06:34:11 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 06:34:11 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 06:34:11 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 06:34:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:34:24.785 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:34:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:34:24.787 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:34:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:34:24.787 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:34:26 compute-0 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Nov 29 06:34:26 compute-0 podman[112460]: 2025-11-29 06:34:26.797803897 +0000 UTC m=+0.054902390 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:34:34 compute-0 podman[117906]: 2025-11-29 06:34:34.873172469 +0000 UTC m=+0.139631570 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 29 06:34:41 compute-0 sshd-session[121023]: Received disconnect from 36.50.176.16 port 40190:11: Bye Bye [preauth]
Nov 29 06:34:41 compute-0 sshd-session[121023]: Disconnected from authenticating user root 36.50.176.16 port 40190 [preauth]
Nov 29 06:34:49 compute-0 sshd-session[126961]: Invalid user update from 179.125.24.202 port 45342
Nov 29 06:34:49 compute-0 sshd-session[126961]: Received disconnect from 179.125.24.202 port 45342:11: Bye Bye [preauth]
Nov 29 06:34:49 compute-0 sshd-session[126961]: Disconnected from invalid user update 179.125.24.202 port 45342 [preauth]
Nov 29 06:34:57 compute-0 podman[127866]: 2025-11-29 06:34:57.871379721 +0000 UTC m=+0.072708702 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent)
Nov 29 06:35:01 compute-0 sshd-session[127885]: Received disconnect from 45.202.211.6 port 50298:11: Bye Bye [preauth]
Nov 29 06:35:01 compute-0 sshd-session[127885]: Disconnected from authenticating user root 45.202.211.6 port 50298 [preauth]
Nov 29 06:35:03 compute-0 sshd-session[127887]: Invalid user sonarqube from 1.214.197.163 port 57526
Nov 29 06:35:04 compute-0 sshd-session[127887]: Received disconnect from 1.214.197.163 port 57526:11: Bye Bye [preauth]
Nov 29 06:35:04 compute-0 sshd-session[127887]: Disconnected from invalid user sonarqube 1.214.197.163 port 57526 [preauth]
Nov 29 06:35:05 compute-0 podman[127889]: 2025-11-29 06:35:05.850597141 +0000 UTC m=+0.104311053 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Nov 29 06:35:08 compute-0 sshd-session[127921]: Invalid user ubuntu from 160.202.8.218 port 51068
Nov 29 06:35:09 compute-0 sshd-session[127921]: Received disconnect from 160.202.8.218 port 51068:11: Bye Bye [preauth]
Nov 29 06:35:09 compute-0 sshd-session[127921]: Disconnected from invalid user ubuntu 160.202.8.218 port 51068 [preauth]
Nov 29 06:35:11 compute-0 kernel: SELinux:  Converting 2757 SID table entries...
Nov 29 06:35:11 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 06:35:11 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 29 06:35:11 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 06:35:11 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 29 06:35:11 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 06:35:11 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 06:35:11 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 06:35:14 compute-0 sshd-session[127925]: Invalid user ansadmin from 103.179.56.44 port 34508
Nov 29 06:35:14 compute-0 groupadd[127933]: group added to /etc/group: name=dnsmasq, GID=992
Nov 29 06:35:14 compute-0 sshd-session[127925]: Received disconnect from 103.179.56.44 port 34508:11: Bye Bye [preauth]
Nov 29 06:35:14 compute-0 sshd-session[127925]: Disconnected from invalid user ansadmin 103.179.56.44 port 34508 [preauth]
Nov 29 06:35:14 compute-0 groupadd[127933]: group added to /etc/gshadow: name=dnsmasq
Nov 29 06:35:14 compute-0 groupadd[127933]: new group: name=dnsmasq, GID=992
Nov 29 06:35:15 compute-0 useradd[127940]: new user: name=dnsmasq, UID=992, GID=992, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Nov 29 06:35:15 compute-0 dbus-broker-launch[772]: Noticed file-system modification, trigger reload.
Nov 29 06:35:15 compute-0 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Nov 29 06:35:15 compute-0 dbus-broker-launch[772]: Noticed file-system modification, trigger reload.
Nov 29 06:35:21 compute-0 groupadd[127953]: group added to /etc/group: name=clevis, GID=991
Nov 29 06:35:22 compute-0 groupadd[127953]: group added to /etc/gshadow: name=clevis
Nov 29 06:35:22 compute-0 groupadd[127953]: new group: name=clevis, GID=991
Nov 29 06:35:22 compute-0 useradd[127960]: new user: name=clevis, UID=991, GID=991, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Nov 29 06:35:23 compute-0 usermod[127970]: add 'clevis' to group 'tss'
Nov 29 06:35:23 compute-0 usermod[127970]: add 'clevis' to shadow group 'tss'
Nov 29 06:35:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:35:24.787 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:35:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:35:24.790 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:35:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:35:24.790 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:35:28 compute-0 podman[127991]: 2025-11-29 06:35:28.81160894 +0000 UTC m=+0.068254786 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 29 06:35:30 compute-0 polkitd[43520]: Reloading rules
Nov 29 06:35:30 compute-0 polkitd[43520]: Collecting garbage unconditionally...
Nov 29 06:35:30 compute-0 polkitd[43520]: Loading rules from directory /etc/polkit-1/rules.d
Nov 29 06:35:30 compute-0 polkitd[43520]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 29 06:35:30 compute-0 polkitd[43520]: Finished loading, compiling and executing 3 rules
Nov 29 06:35:30 compute-0 polkitd[43520]: Reloading rules
Nov 29 06:35:30 compute-0 polkitd[43520]: Collecting garbage unconditionally...
Nov 29 06:35:30 compute-0 polkitd[43520]: Loading rules from directory /etc/polkit-1/rules.d
Nov 29 06:35:30 compute-0 polkitd[43520]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 29 06:35:30 compute-0 polkitd[43520]: Finished loading, compiling and executing 3 rules
Nov 29 06:35:35 compute-0 groupadd[128176]: group added to /etc/group: name=ceph, GID=167
Nov 29 06:35:35 compute-0 groupadd[128176]: group added to /etc/gshadow: name=ceph
Nov 29 06:35:35 compute-0 groupadd[128176]: new group: name=ceph, GID=167
Nov 29 06:35:35 compute-0 useradd[128182]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Nov 29 06:35:36 compute-0 podman[128189]: 2025-11-29 06:35:36.963771195 +0000 UTC m=+0.218769390 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 29 06:35:39 compute-0 systemd[1]: Stopping OpenSSH server daemon...
Nov 29 06:35:39 compute-0 sshd[1011]: Received signal 15; terminating.
Nov 29 06:35:39 compute-0 systemd[1]: sshd.service: Deactivated successfully.
Nov 29 06:35:39 compute-0 systemd[1]: Stopped OpenSSH server daemon.
Nov 29 06:35:39 compute-0 systemd[1]: sshd.service: Consumed 5.016s CPU time, read 32.0K from disk, written 176.0K to disk.
Nov 29 06:35:39 compute-0 systemd[1]: Stopped target sshd-keygen.target.
Nov 29 06:35:39 compute-0 systemd[1]: Stopping sshd-keygen.target...
Nov 29 06:35:39 compute-0 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 29 06:35:39 compute-0 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 29 06:35:39 compute-0 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 29 06:35:39 compute-0 systemd[1]: Reached target sshd-keygen.target.
Nov 29 06:35:39 compute-0 systemd[1]: Starting OpenSSH server daemon...
Nov 29 06:35:39 compute-0 sshd[128727]: Server listening on 0.0.0.0 port 22.
Nov 29 06:35:39 compute-0 sshd[128727]: Server listening on :: port 22.
Nov 29 06:35:39 compute-0 systemd[1]: Started OpenSSH server daemon.
Nov 29 06:35:42 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 06:35:42 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 29 06:35:42 compute-0 systemd[1]: Reloading.
Nov 29 06:35:42 compute-0 systemd-rc-local-generator[128980]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:35:42 compute-0 systemd-sysv-generator[128984]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:35:42 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 06:35:59 compute-0 podman[134976]: 2025-11-29 06:35:59.940096703 +0000 UTC m=+0.073207022 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent)
Nov 29 06:36:07 compute-0 podman[137357]: 2025-11-29 06:36:07.616704828 +0000 UTC m=+0.129909749 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 29 06:36:13 compute-0 sshd-session[137383]: Invalid user hu from 179.125.24.202 port 33370
Nov 29 06:36:13 compute-0 sshd-session[137383]: Received disconnect from 179.125.24.202 port 33370:11: Bye Bye [preauth]
Nov 29 06:36:13 compute-0 sshd-session[137383]: Disconnected from invalid user hu 179.125.24.202 port 33370 [preauth]
Nov 29 06:36:14 compute-0 sshd-session[137385]: Invalid user tidb from 45.202.211.6 port 45798
Nov 29 06:36:14 compute-0 sshd-session[137385]: Received disconnect from 45.202.211.6 port 45798:11: Bye Bye [preauth]
Nov 29 06:36:14 compute-0 sshd-session[137385]: Disconnected from invalid user tidb 45.202.211.6 port 45798 [preauth]
Nov 29 06:36:14 compute-0 sudo[110704]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:15 compute-0 sudo[137536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjcqrwctcqiedbrnoebsxusfyskrxpxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398175.1094487-973-219243747836214/AnsiballZ_systemd.py'
Nov 29 06:36:15 compute-0 sudo[137536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:16 compute-0 python3.9[137538]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 06:36:16 compute-0 systemd[1]: Reloading.
Nov 29 06:36:16 compute-0 systemd-rc-local-generator[137569]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:36:16 compute-0 systemd-sysv-generator[137573]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:36:16 compute-0 sudo[137536]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:17 compute-0 sudo[137727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chxxyojzbspnfxgwevmozjmpuswdbwim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398176.969165-973-217382095331206/AnsiballZ_systemd.py'
Nov 29 06:36:17 compute-0 sudo[137727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:17 compute-0 python3.9[137729]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 06:36:17 compute-0 systemd[1]: Reloading.
Nov 29 06:36:17 compute-0 systemd-sysv-generator[137762]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:36:17 compute-0 systemd-rc-local-generator[137758]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:36:18 compute-0 sudo[137727]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:18 compute-0 sudo[137917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eelramnyqzdokxdlrxwcopimtgrehdxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398178.2358377-973-164170038364613/AnsiballZ_systemd.py'
Nov 29 06:36:18 compute-0 sudo[137917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:18 compute-0 python3.9[137919]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 06:36:18 compute-0 systemd[1]: Reloading.
Nov 29 06:36:18 compute-0 systemd-rc-local-generator[137947]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:36:18 compute-0 systemd-sysv-generator[137952]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:36:19 compute-0 sudo[137917]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:19 compute-0 sudo[138107]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdgjfawfrgtrdmtwlnqzqoerqggjlttg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398179.3617098-973-271701920147462/AnsiballZ_systemd.py'
Nov 29 06:36:19 compute-0 sudo[138107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:19 compute-0 python3.9[138109]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 06:36:19 compute-0 systemd[1]: Reloading.
Nov 29 06:36:20 compute-0 systemd-rc-local-generator[138138]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:36:20 compute-0 systemd-sysv-generator[138142]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:36:20 compute-0 sudo[138107]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:20 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 06:36:20 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 29 06:36:20 compute-0 systemd[1]: man-db-cache-update.service: Consumed 10.084s CPU time.
Nov 29 06:36:20 compute-0 systemd[1]: run-rf1fbb7a9973145b89df6eb387feee899.service: Deactivated successfully.
Nov 29 06:36:20 compute-0 sudo[138298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwsjcjcppgqoyeojwijborjlxvrmzriz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398180.675674-1060-275560674398104/AnsiballZ_systemd.py'
Nov 29 06:36:20 compute-0 sudo[138298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:22 compute-0 python3.9[138300]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:36:22 compute-0 systemd[1]: Reloading.
Nov 29 06:36:22 compute-0 systemd-sysv-generator[138329]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:36:22 compute-0 systemd-rc-local-generator[138320]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:36:23 compute-0 sudo[138298]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:23 compute-0 sudo[138488]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsgiptvdiewcoagmfgvdsmjdwuyxeetg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398183.3457205-1060-11548615298676/AnsiballZ_systemd.py'
Nov 29 06:36:23 compute-0 sudo[138488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:23 compute-0 python3.9[138490]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:36:24 compute-0 systemd[1]: Reloading.
Nov 29 06:36:24 compute-0 systemd-rc-local-generator[138520]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:36:24 compute-0 systemd-sysv-generator[138523]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:36:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:36:24.789 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:36:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:36:24.794 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:36:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:36:24.794 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:36:24 compute-0 sudo[138488]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:25 compute-0 sudo[138678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eiismdrglickskyyeigitybuhbixwwae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398184.9837568-1060-176429805496617/AnsiballZ_systemd.py'
Nov 29 06:36:25 compute-0 sudo[138678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:25 compute-0 python3.9[138680]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:36:25 compute-0 systemd[1]: Reloading.
Nov 29 06:36:25 compute-0 systemd-rc-local-generator[138710]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:36:25 compute-0 systemd-sysv-generator[138713]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:36:26 compute-0 sudo[138678]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:26 compute-0 sudo[138868]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybfogipqxtumjcrgxjkriyxqombmspxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398186.6512876-1060-111131107532569/AnsiballZ_systemd.py'
Nov 29 06:36:26 compute-0 sudo[138868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:27 compute-0 python3.9[138870]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:36:27 compute-0 sudo[138868]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:28 compute-0 sudo[139024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msxuvhprjlanoajqfofuiunfzwanhvqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398187.7354624-1060-94715372036158/AnsiballZ_systemd.py'
Nov 29 06:36:28 compute-0 sudo[139024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:28 compute-0 python3.9[139026]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:36:28 compute-0 systemd[1]: Reloading.
Nov 29 06:36:28 compute-0 systemd-rc-local-generator[139054]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:36:28 compute-0 systemd-sysv-generator[139057]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:36:29 compute-0 sudo[139024]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:30 compute-0 sudo[139230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfhwdstouuuhuspuaeufkjqcnqjioeaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398189.7712348-1168-227725040319528/AnsiballZ_systemd.py'
Nov 29 06:36:30 compute-0 sudo[139230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:30 compute-0 podman[139190]: 2025-11-29 06:36:30.288496497 +0000 UTC m=+0.089753668 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 06:36:30 compute-0 python3.9[139236]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 06:36:30 compute-0 systemd[1]: Reloading.
Nov 29 06:36:30 compute-0 systemd-rc-local-generator[139264]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:36:30 compute-0 systemd-sysv-generator[139271]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:36:30 compute-0 sshd-session[139065]: Invalid user wordpress from 45.78.219.251 port 58624
Nov 29 06:36:30 compute-0 systemd[1]: Listening on libvirt proxy daemon socket.
Nov 29 06:36:30 compute-0 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Nov 29 06:36:31 compute-0 sudo[139230]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:31 compute-0 sshd-session[139065]: Received disconnect from 45.78.219.251 port 58624:11: Bye Bye [preauth]
Nov 29 06:36:31 compute-0 sshd-session[139065]: Disconnected from invalid user wordpress 45.78.219.251 port 58624 [preauth]
Nov 29 06:36:31 compute-0 sudo[139428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmspyrvzgfphfxscdzkohfrpnmdzvqys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398191.1898408-1192-94787990247798/AnsiballZ_systemd.py'
Nov 29 06:36:31 compute-0 sudo[139428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:31 compute-0 python3.9[139430]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:36:32 compute-0 sudo[139428]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:33 compute-0 sudo[139585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtogekfhmmyawxfdhubvmumpjktdfnwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398193.008418-1192-177896108138886/AnsiballZ_systemd.py'
Nov 29 06:36:33 compute-0 sudo[139585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:33 compute-0 python3.9[139587]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:36:33 compute-0 sudo[139585]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:34 compute-0 sudo[139740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbnburkjgiqsvckwrkpjomxpywygwtlr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398193.8344717-1192-190792521364261/AnsiballZ_systemd.py'
Nov 29 06:36:34 compute-0 sudo[139740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:34 compute-0 sshd-session[139458]: Invalid user marco from 1.214.197.163 port 58918
Nov 29 06:36:34 compute-0 sshd-session[139458]: Received disconnect from 1.214.197.163 port 58918:11: Bye Bye [preauth]
Nov 29 06:36:34 compute-0 sshd-session[139458]: Disconnected from invalid user marco 1.214.197.163 port 58918 [preauth]
Nov 29 06:36:34 compute-0 python3.9[139742]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:36:34 compute-0 sudo[139740]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:34 compute-0 sudo[139895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yszxffbdpdmwfaoijdyzzypusxeresoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398194.5938601-1192-131963826395251/AnsiballZ_systemd.py'
Nov 29 06:36:34 compute-0 sudo[139895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:35 compute-0 python3.9[139897]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:36:35 compute-0 sudo[139895]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:35 compute-0 sudo[140050]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obznqjxifwkwdnzwxrliefkamhloqcar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398195.3407195-1192-64872953746519/AnsiballZ_systemd.py'
Nov 29 06:36:35 compute-0 sudo[140050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:35 compute-0 python3.9[140052]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:36:35 compute-0 sudo[140050]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:36 compute-0 sudo[140205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlznvpjhkiwmwaorgzztzerotpoqktwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398196.0867343-1192-243127327033915/AnsiballZ_systemd.py'
Nov 29 06:36:36 compute-0 sudo[140205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:36 compute-0 python3.9[140207]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:36:36 compute-0 sudo[140205]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:37 compute-0 sudo[140360]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvgjfiuxazmlfnkvwlthzdhcjncoiheg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398196.9392974-1192-48264569903751/AnsiballZ_systemd.py'
Nov 29 06:36:37 compute-0 sudo[140360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:37 compute-0 python3.9[140362]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:36:37 compute-0 sudo[140360]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:37 compute-0 podman[140413]: 2025-11-29 06:36:37.887525121 +0000 UTC m=+0.154920749 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:36:38 compute-0 sudo[140540]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqjkjpobicjczwpwbkgqvuwouribcjyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398197.713357-1192-267392949694075/AnsiballZ_systemd.py'
Nov 29 06:36:38 compute-0 sudo[140540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:38 compute-0 python3.9[140542]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:36:38 compute-0 sudo[140540]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:38 compute-0 sudo[140695]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsllvbsfcbgcewqzkosxpyzcdqixjlkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398198.5852053-1192-63144775899179/AnsiballZ_systemd.py'
Nov 29 06:36:38 compute-0 sudo[140695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:39 compute-0 python3.9[140697]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:36:39 compute-0 sudo[140695]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:39 compute-0 sudo[140850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iprejsytgnalhrjmiuhbhthzbpkitoud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398199.3689425-1192-143545406311117/AnsiballZ_systemd.py'
Nov 29 06:36:39 compute-0 sudo[140850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:39 compute-0 python3.9[140852]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:36:40 compute-0 sudo[140850]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:40 compute-0 sudo[141007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzylbwmzjxncqvesnporjiuxaqhislin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398200.1753287-1192-258456450973500/AnsiballZ_systemd.py'
Nov 29 06:36:40 compute-0 sudo[141007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:40 compute-0 python3.9[141009]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:36:41 compute-0 sudo[141007]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:41 compute-0 sshd-session[140959]: Invalid user root2 from 160.202.8.218 port 44588
Nov 29 06:36:41 compute-0 sudo[141162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kijlozisxkvvbxgwhvqysjoptnzknant ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398201.1646156-1192-107727137413758/AnsiballZ_systemd.py'
Nov 29 06:36:41 compute-0 sudo[141162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:41 compute-0 sshd-session[140959]: Received disconnect from 160.202.8.218 port 44588:11: Bye Bye [preauth]
Nov 29 06:36:41 compute-0 sshd-session[140959]: Disconnected from invalid user root2 160.202.8.218 port 44588 [preauth]
Nov 29 06:36:41 compute-0 python3.9[141164]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:36:41 compute-0 sudo[141162]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:42 compute-0 sudo[141317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ripoymlrcfaoiyzeubkpkndobcgylbbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398201.9912295-1192-60381491908825/AnsiballZ_systemd.py'
Nov 29 06:36:42 compute-0 sudo[141317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:42 compute-0 python3.9[141319]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:36:42 compute-0 sudo[141317]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:43 compute-0 sudo[141472]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kejclqwpnhskaysomtujugpolmxxbtbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398202.7806654-1192-65877870462205/AnsiballZ_systemd.py'
Nov 29 06:36:43 compute-0 sudo[141472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:43 compute-0 python3.9[141474]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 06:36:43 compute-0 sudo[141472]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:44 compute-0 sudo[141627]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrhuohlmguvqnocujrmagyideqdiodic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398204.0992556-1498-61772496782532/AnsiballZ_file.py'
Nov 29 06:36:44 compute-0 sudo[141627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:44 compute-0 python3.9[141629]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:36:44 compute-0 sudo[141627]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:44 compute-0 sudo[141779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmbrpqrwgdqmoieptxvduesygddckgtb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398204.6865933-1498-56792989552061/AnsiballZ_file.py'
Nov 29 06:36:44 compute-0 sudo[141779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:45 compute-0 python3.9[141781]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:36:45 compute-0 sudo[141779]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:45 compute-0 sudo[141931]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tckfooyruqjyaiearrrscvzfjpyhwxyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398205.4542644-1498-59900789247745/AnsiballZ_file.py'
Nov 29 06:36:45 compute-0 sudo[141931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:45 compute-0 python3.9[141933]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:36:45 compute-0 sudo[141931]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:46 compute-0 sudo[142083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwbutlcixjwgztbvxqzstdrqxlcwsudm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398206.0647578-1498-61682069005227/AnsiballZ_file.py'
Nov 29 06:36:46 compute-0 sudo[142083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:46 compute-0 python3.9[142085]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:36:46 compute-0 sudo[142083]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:46 compute-0 sudo[142235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfbfgkeboyytneuwqmgtrdcyrkaqetfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398206.6780856-1498-135612803864717/AnsiballZ_file.py'
Nov 29 06:36:46 compute-0 sudo[142235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:47 compute-0 python3.9[142237]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:36:47 compute-0 sudo[142235]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:47 compute-0 sudo[142387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cazlidihxsiqbpostanfyezovdkowcae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398207.2792666-1498-145247637798824/AnsiballZ_file.py'
Nov 29 06:36:47 compute-0 sudo[142387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:47 compute-0 python3.9[142389]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:36:47 compute-0 sudo[142387]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:48 compute-0 sudo[142539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hajzjziwftjddehyygxngyqnxubjdhhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398208.410166-1627-228885656429813/AnsiballZ_stat.py'
Nov 29 06:36:48 compute-0 sudo[142539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:49 compute-0 python3.9[142541]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:36:49 compute-0 sudo[142539]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:49 compute-0 sudo[142664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlgmomvnnominvdbgymdzbqyqfjlexhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398208.410166-1627-228885656429813/AnsiballZ_copy.py'
Nov 29 06:36:49 compute-0 sudo[142664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:49 compute-0 python3.9[142666]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398208.410166-1627-228885656429813/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:36:49 compute-0 sudo[142664]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:50 compute-0 sudo[142816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibmphugtsfihskvjehxpnmcabpdfoekm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398209.8732922-1627-188193881869483/AnsiballZ_stat.py'
Nov 29 06:36:50 compute-0 sudo[142816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:50 compute-0 python3.9[142818]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:36:50 compute-0 sudo[142816]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:50 compute-0 sudo[142941]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhecveutpwdhqnkewwtyptlqxspddkse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398209.8732922-1627-188193881869483/AnsiballZ_copy.py'
Nov 29 06:36:50 compute-0 sudo[142941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:50 compute-0 python3.9[142943]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398209.8732922-1627-188193881869483/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:36:50 compute-0 sudo[142941]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:51 compute-0 sudo[143093]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlqiqincddbhejyhdevwceelxgrprsfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398211.0097873-1627-1048501179657/AnsiballZ_stat.py'
Nov 29 06:36:51 compute-0 sudo[143093]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:51 compute-0 python3.9[143095]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:36:51 compute-0 sudo[143093]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:51 compute-0 sudo[143218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzzcbvmoelsuvyfuxfcnssvghbinutyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398211.0097873-1627-1048501179657/AnsiballZ_copy.py'
Nov 29 06:36:51 compute-0 sudo[143218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:52 compute-0 python3.9[143220]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398211.0097873-1627-1048501179657/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:36:52 compute-0 sudo[143218]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:52 compute-0 sudo[143370]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-romwqfmpwrwhtawsuccwdzgxvvlxheiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398212.2432864-1627-191976833910561/AnsiballZ_stat.py'
Nov 29 06:36:52 compute-0 sudo[143370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:52 compute-0 python3.9[143372]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:36:52 compute-0 sudo[143370]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:53 compute-0 sudo[143495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfjlfvfmqrbxzxryyloinwnewjkuwply ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398212.2432864-1627-191976833910561/AnsiballZ_copy.py'
Nov 29 06:36:53 compute-0 sudo[143495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:53 compute-0 python3.9[143497]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398212.2432864-1627-191976833910561/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:36:53 compute-0 sudo[143495]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:53 compute-0 sudo[143647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnrbdnnzfbibhxabhdjfgzvvkjpyxysc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398213.4721699-1627-22529706509779/AnsiballZ_stat.py'
Nov 29 06:36:53 compute-0 sudo[143647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:53 compute-0 python3.9[143649]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:36:53 compute-0 sudo[143647]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:54 compute-0 sudo[143772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lcygteeieplcwnmcevxpemdckrjbtbax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398213.4721699-1627-22529706509779/AnsiballZ_copy.py'
Nov 29 06:36:54 compute-0 sudo[143772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:54 compute-0 python3.9[143774]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398213.4721699-1627-22529706509779/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:36:54 compute-0 sudo[143772]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:54 compute-0 sudo[143924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjprttzjxcyzwtndchdtqbnyxucsdlnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398214.6854668-1627-112516314095692/AnsiballZ_stat.py'
Nov 29 06:36:54 compute-0 sudo[143924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:55 compute-0 python3.9[143926]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:36:55 compute-0 sudo[143924]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:55 compute-0 sudo[144049]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-renccxaewrvwzgwqfgbxuxfhgozqkzvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398214.6854668-1627-112516314095692/AnsiballZ_copy.py'
Nov 29 06:36:55 compute-0 sudo[144049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:55 compute-0 python3.9[144051]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398214.6854668-1627-112516314095692/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:36:55 compute-0 sudo[144049]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:56 compute-0 sudo[144201]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrinillkyqsqndnylcizhomeansppcvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398215.849926-1627-119491024375763/AnsiballZ_stat.py'
Nov 29 06:36:56 compute-0 sudo[144201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:56 compute-0 python3.9[144203]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:36:56 compute-0 sudo[144201]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:56 compute-0 sudo[144324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mykucztnyonfsuzvclkbkrpdzwqoscjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398215.849926-1627-119491024375763/AnsiballZ_copy.py'
Nov 29 06:36:56 compute-0 sudo[144324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:56 compute-0 python3.9[144326]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398215.849926-1627-119491024375763/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:36:56 compute-0 sudo[144324]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:57 compute-0 sudo[144476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltihghxefcoydqurulqufpbyuheomwqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398217.085032-1627-261308043862905/AnsiballZ_stat.py'
Nov 29 06:36:57 compute-0 sudo[144476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:57 compute-0 python3.9[144478]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:36:57 compute-0 sudo[144476]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:57 compute-0 sudo[144601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwvgxcimsuixzisqpxecjererbrjwmdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398217.085032-1627-261308043862905/AnsiballZ_copy.py'
Nov 29 06:36:57 compute-0 sudo[144601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:58 compute-0 python3.9[144603]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398217.085032-1627-261308043862905/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:36:58 compute-0 sudo[144601]: pam_unix(sudo:session): session closed for user root
Nov 29 06:36:59 compute-0 sudo[144753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljdrpckzxoutmdhozbbcuhlquzrslexr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398219.301749-1966-141597560295322/AnsiballZ_command.py'
Nov 29 06:36:59 compute-0 sudo[144753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:36:59 compute-0 python3.9[144755]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Nov 29 06:37:00 compute-0 sudo[144753]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:00 compute-0 sudo[144916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbdovxjwboykkddjlrhgxygyximwdrvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398220.417448-1993-216526471330648/AnsiballZ_file.py'
Nov 29 06:37:00 compute-0 sudo[144916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:00 compute-0 podman[144880]: 2025-11-29 06:37:00.721612055 +0000 UTC m=+0.053071203 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 06:37:00 compute-0 python3.9[144922]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:00 compute-0 sudo[144916]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:01 compute-0 sudo[145076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnhoedweksyepenagyozabvswpbbzewh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398221.0418549-1993-230900012932114/AnsiballZ_file.py'
Nov 29 06:37:01 compute-0 sudo[145076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:01 compute-0 python3.9[145078]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:01 compute-0 sudo[145076]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:01 compute-0 anacron[29976]: Job `cron.daily' started
Nov 29 06:37:01 compute-0 anacron[29976]: Job `cron.daily' terminated
Nov 29 06:37:02 compute-0 sudo[145230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tyckydpbpfnhvbfxulktuerajduloaco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398221.7799518-1993-239663043534993/AnsiballZ_file.py'
Nov 29 06:37:02 compute-0 sudo[145230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:02 compute-0 python3.9[145232]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:02 compute-0 sudo[145230]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:02 compute-0 sudo[145382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njyfnypqcppkcqqhwcwpxldvpmnwpzix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398222.3581078-1993-220653448651347/AnsiballZ_file.py'
Nov 29 06:37:02 compute-0 sudo[145382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:02 compute-0 python3.9[145384]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:02 compute-0 sudo[145382]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:03 compute-0 sudo[145534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btnazervpfuoxiguvecfkhwxfnpgdylk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398222.9302068-1993-166506206081774/AnsiballZ_file.py'
Nov 29 06:37:03 compute-0 sudo[145534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:03 compute-0 python3.9[145536]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:03 compute-0 sudo[145534]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:03 compute-0 sudo[145686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdgzdscwwlbufdepzefiqinodwzmyjqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398223.592726-1993-233568134644180/AnsiballZ_file.py'
Nov 29 06:37:03 compute-0 sudo[145686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:04 compute-0 python3.9[145688]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:04 compute-0 sudo[145686]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:04 compute-0 sudo[145838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vamwspmbxvbomufwrzkyqnucjhribwav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398224.1601024-1993-164952847767761/AnsiballZ_file.py'
Nov 29 06:37:04 compute-0 sudo[145838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:04 compute-0 python3.9[145840]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:04 compute-0 sudo[145838]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:05 compute-0 sudo[145990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yoawuvbyhividphrcetgctigugjanamq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398224.754177-1993-204910816559974/AnsiballZ_file.py'
Nov 29 06:37:05 compute-0 sudo[145990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:05 compute-0 python3.9[145992]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:05 compute-0 sudo[145990]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:05 compute-0 sudo[146142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqhwadwpiesmhpluqcslzmckzddyhhwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398225.393229-1993-216369040095426/AnsiballZ_file.py'
Nov 29 06:37:05 compute-0 sudo[146142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:05 compute-0 python3.9[146144]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:05 compute-0 sudo[146142]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:06 compute-0 sudo[146294]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnruyegjbghtrrnlhcawrsoqveqgnjlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398225.9895256-1993-222750547543111/AnsiballZ_file.py'
Nov 29 06:37:06 compute-0 sudo[146294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:06 compute-0 python3.9[146296]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:06 compute-0 sudo[146294]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:06 compute-0 sudo[146446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocqglweqcljrolmgagedlscdhusgbibl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398226.6188362-1993-53289370707403/AnsiballZ_file.py'
Nov 29 06:37:06 compute-0 sudo[146446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:07 compute-0 python3.9[146448]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:07 compute-0 sudo[146446]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:07 compute-0 sudo[146600]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxodbjtfeofaxrfkwpxvahvlgtllgayp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398227.2330074-1993-48541717651236/AnsiballZ_file.py'
Nov 29 06:37:07 compute-0 sudo[146600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:07 compute-0 python3.9[146602]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:07 compute-0 sudo[146600]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:08 compute-0 sudo[146762]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlewwyabqxubhskzfxayfxvrvfjmjgtj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398227.8112123-1993-250241166480475/AnsiballZ_file.py'
Nov 29 06:37:08 compute-0 sudo[146762]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:08 compute-0 podman[146726]: 2025-11-29 06:37:08.158221836 +0000 UTC m=+0.129579433 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 29 06:37:08 compute-0 python3.9[146769]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:08 compute-0 sudo[146762]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:08 compute-0 sshd-session[146449]: Invalid user init from 103.179.56.44 port 58806
Nov 29 06:37:08 compute-0 sshd-session[146449]: Received disconnect from 103.179.56.44 port 58806:11: Bye Bye [preauth]
Nov 29 06:37:08 compute-0 sshd-session[146449]: Disconnected from invalid user init 103.179.56.44 port 58806 [preauth]
Nov 29 06:37:08 compute-0 sudo[146930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfnddtbqjjlljdszsakpznlgnefscsya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398228.446185-1993-203567538078845/AnsiballZ_file.py'
Nov 29 06:37:08 compute-0 sudo[146930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:08 compute-0 python3.9[146932]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:08 compute-0 sudo[146930]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:10 compute-0 sudo[147082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-korbuaxnyoqwccbgfkwsikdluuryzydc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398229.9886842-2290-1155243850991/AnsiballZ_stat.py'
Nov 29 06:37:10 compute-0 sudo[147082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:10 compute-0 python3.9[147084]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:10 compute-0 sudo[147082]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:10 compute-0 sudo[147205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byzomdcsjqlwpabsrexbfqoxmlpmpatd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398229.9886842-2290-1155243850991/AnsiballZ_copy.py'
Nov 29 06:37:10 compute-0 sudo[147205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:11 compute-0 python3.9[147207]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398229.9886842-2290-1155243850991/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:11 compute-0 sudo[147205]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:11 compute-0 sudo[147357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhxuodbtyzkcwdsgpawufqyfczadllck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398231.2771864-2290-57484670848367/AnsiballZ_stat.py'
Nov 29 06:37:11 compute-0 sudo[147357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:11 compute-0 python3.9[147359]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:11 compute-0 sudo[147357]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:12 compute-0 sudo[147480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsdwzsuvmlgitxaqllzldbvrxwbshmmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398231.2771864-2290-57484670848367/AnsiballZ_copy.py'
Nov 29 06:37:12 compute-0 sudo[147480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:12 compute-0 python3.9[147482]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398231.2771864-2290-57484670848367/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:12 compute-0 sudo[147480]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:12 compute-0 sudo[147632]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlomcmndiphcuhaewxpanbrcyttvoihj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398232.3978298-2290-8034697821926/AnsiballZ_stat.py'
Nov 29 06:37:12 compute-0 sudo[147632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:12 compute-0 python3.9[147634]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:12 compute-0 sudo[147632]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:13 compute-0 sudo[147755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdiptyyvfkdeeooopzlnayeujjwfmdup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398232.3978298-2290-8034697821926/AnsiballZ_copy.py'
Nov 29 06:37:13 compute-0 sudo[147755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:13 compute-0 python3.9[147757]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398232.3978298-2290-8034697821926/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:13 compute-0 sudo[147755]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:13 compute-0 sudo[147907]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxqwqdhpfkjbhbslgsptxadndlbnpoem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398233.498128-2290-267669861743899/AnsiballZ_stat.py'
Nov 29 06:37:13 compute-0 sudo[147907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:13 compute-0 python3.9[147909]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:13 compute-0 sudo[147907]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:14 compute-0 sudo[148030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilaqmvnbkyeoandsgrizupwyozakjphf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398233.498128-2290-267669861743899/AnsiballZ_copy.py'
Nov 29 06:37:14 compute-0 sudo[148030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:14 compute-0 python3.9[148032]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398233.498128-2290-267669861743899/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:14 compute-0 sudo[148030]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:14 compute-0 sudo[148182]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctvksohtttwbgksvnhqocdfjexbwzkjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398234.6095243-2290-42887268857544/AnsiballZ_stat.py'
Nov 29 06:37:14 compute-0 sudo[148182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:15 compute-0 python3.9[148184]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:15 compute-0 sudo[148182]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:15 compute-0 sudo[148305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scpjhfyumeifnajmuepqjphxxuxeijxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398234.6095243-2290-42887268857544/AnsiballZ_copy.py'
Nov 29 06:37:15 compute-0 sudo[148305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:15 compute-0 python3.9[148307]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398234.6095243-2290-42887268857544/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:15 compute-0 sudo[148305]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:15 compute-0 sudo[148457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsqberdaabvomrpakharrrqqlyilaflj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398235.7281573-2290-50563519572579/AnsiballZ_stat.py'
Nov 29 06:37:15 compute-0 sudo[148457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:16 compute-0 python3.9[148459]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:16 compute-0 sudo[148457]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:16 compute-0 sudo[148580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjdtlmmjoiubvbkbuhurfwoznenczcay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398235.7281573-2290-50563519572579/AnsiballZ_copy.py'
Nov 29 06:37:16 compute-0 sudo[148580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:16 compute-0 python3.9[148582]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398235.7281573-2290-50563519572579/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:16 compute-0 sudo[148580]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:17 compute-0 sudo[148732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amglgzxsfucqdkhwkvjwijgmqdigxfpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398236.7730865-2290-104266050639220/AnsiballZ_stat.py'
Nov 29 06:37:17 compute-0 sudo[148732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:17 compute-0 python3.9[148734]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:17 compute-0 sudo[148732]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:17 compute-0 sudo[148855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rppifrrkllgfwgzxtphoncbwydnosffp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398236.7730865-2290-104266050639220/AnsiballZ_copy.py'
Nov 29 06:37:17 compute-0 sudo[148855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:17 compute-0 python3.9[148857]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398236.7730865-2290-104266050639220/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:17 compute-0 sudo[148855]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:18 compute-0 sudo[149007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bionrkmthpqgtwfmolpkdfwomivothyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398237.8241518-2290-25682673962749/AnsiballZ_stat.py'
Nov 29 06:37:18 compute-0 sudo[149007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:18 compute-0 python3.9[149009]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:18 compute-0 sudo[149007]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:18 compute-0 sudo[149130]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kogkitmwmckbuysqusyoeaztmvxakurh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398237.8241518-2290-25682673962749/AnsiballZ_copy.py'
Nov 29 06:37:18 compute-0 sudo[149130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:18 compute-0 python3.9[149132]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398237.8241518-2290-25682673962749/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:18 compute-0 sudo[149130]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:19 compute-0 sudo[149282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkrcahpddaomfyjxvdtfritszepwonip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398238.8906813-2290-247600218653378/AnsiballZ_stat.py'
Nov 29 06:37:19 compute-0 sudo[149282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:19 compute-0 python3.9[149284]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:19 compute-0 sudo[149282]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:19 compute-0 sudo[149405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqezemdehdxqbpfhbbgwxgbfxhmytmcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398238.8906813-2290-247600218653378/AnsiballZ_copy.py'
Nov 29 06:37:19 compute-0 sudo[149405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:19 compute-0 python3.9[149407]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398238.8906813-2290-247600218653378/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:19 compute-0 sudo[149405]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:20 compute-0 sudo[149557]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lukykiqlrqblotqfypqjkcykguvduebc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398240.0154681-2290-86644265381538/AnsiballZ_stat.py'
Nov 29 06:37:20 compute-0 sudo[149557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:20 compute-0 python3.9[149559]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:20 compute-0 sudo[149557]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:20 compute-0 sudo[149680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfkgmzimjuiscofdjlctddiysomlpyno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398240.0154681-2290-86644265381538/AnsiballZ_copy.py'
Nov 29 06:37:20 compute-0 sudo[149680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:20 compute-0 python3.9[149682]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398240.0154681-2290-86644265381538/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:21 compute-0 sudo[149680]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:21 compute-0 sudo[149832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpxolxvrnffysdyugvqlhzrvdpuorrfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398241.1159158-2290-230876776101808/AnsiballZ_stat.py'
Nov 29 06:37:21 compute-0 sudo[149832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:21 compute-0 python3.9[149834]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:21 compute-0 sudo[149832]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:21 compute-0 sudo[149955]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spxwnwygpasdqeuaisszvyyhxqfkpfxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398241.1159158-2290-230876776101808/AnsiballZ_copy.py'
Nov 29 06:37:21 compute-0 sudo[149955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:22 compute-0 python3.9[149957]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398241.1159158-2290-230876776101808/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:22 compute-0 sudo[149955]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:22 compute-0 sudo[150107]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfqirqgwibpoipxdrehokpoucevkgyht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398242.2571528-2290-37566705119260/AnsiballZ_stat.py'
Nov 29 06:37:22 compute-0 sudo[150107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:22 compute-0 python3.9[150109]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:22 compute-0 sudo[150107]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:23 compute-0 sudo[150230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uchrtadddsyxrwobzmxwmgffsnqtbeee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398242.2571528-2290-37566705119260/AnsiballZ_copy.py'
Nov 29 06:37:23 compute-0 sudo[150230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:23 compute-0 python3.9[150232]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398242.2571528-2290-37566705119260/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:23 compute-0 sudo[150230]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:23 compute-0 sudo[150382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rozoyrzphglwuvtigthmnbvvuwigituz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398243.3741412-2290-208820343437792/AnsiballZ_stat.py'
Nov 29 06:37:23 compute-0 sudo[150382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:23 compute-0 python3.9[150384]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:23 compute-0 sudo[150382]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:24 compute-0 sudo[150507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmvuygqopxesvgdzenjbztaqqhdamlnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398243.3741412-2290-208820343437792/AnsiballZ_copy.py'
Nov 29 06:37:24 compute-0 sudo[150507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:24 compute-0 python3.9[150509]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398243.3741412-2290-208820343437792/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:24 compute-0 sudo[150507]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:24 compute-0 sudo[150659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbdadygmaosmfklhyuunuojxgyvnduqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398244.495092-2290-80115258524191/AnsiballZ_stat.py'
Nov 29 06:37:24 compute-0 sudo[150659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:37:24.790 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:37:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:37:24.791 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:37:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:37:24.791 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:37:24 compute-0 python3.9[150661]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:24 compute-0 sudo[150659]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:24 compute-0 sshd-session[150385]: Received disconnect from 45.202.211.6 port 48244:11: Bye Bye [preauth]
Nov 29 06:37:24 compute-0 sshd-session[150385]: Disconnected from authenticating user root 45.202.211.6 port 48244 [preauth]
Nov 29 06:37:25 compute-0 sudo[150782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgoxcdshrbagqhrlqakyheaypxodrnzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398244.495092-2290-80115258524191/AnsiballZ_copy.py'
Nov 29 06:37:25 compute-0 sudo[150782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:25 compute-0 python3.9[150784]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398244.495092-2290-80115258524191/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:25 compute-0 sudo[150782]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:27 compute-0 python3.9[150934]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:37:28 compute-0 sudo[151087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mspjihuaqhlpoogyxfcmvkgraixdlhqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398248.2690806-2908-256676194967683/AnsiballZ_seboolean.py'
Nov 29 06:37:28 compute-0 sudo[151087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:28 compute-0 python3.9[151089]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Nov 29 06:37:31 compute-0 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Nov 29 06:37:31 compute-0 podman[151094]: 2025-11-29 06:37:31.791702413 +0000 UTC m=+0.055329046 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 06:37:34 compute-0 sudo[151087]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:35 compute-0 sshd-session[151116]: Invalid user user01 from 179.125.24.202 port 35798
Nov 29 06:37:35 compute-0 sshd-session[151116]: Received disconnect from 179.125.24.202 port 35798:11: Bye Bye [preauth]
Nov 29 06:37:35 compute-0 sshd-session[151116]: Disconnected from invalid user user01 179.125.24.202 port 35798 [preauth]
Nov 29 06:37:35 compute-0 sudo[151267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvdhtmecrjmbsjekqwgnndlgaibnhyxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398254.9674096-2932-25263192171762/AnsiballZ_copy.py'
Nov 29 06:37:35 compute-0 sudo[151267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:36 compute-0 python3.9[151269]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:36 compute-0 sudo[151267]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:36 compute-0 sudo[151419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djhjzscdwkjrdunwlxijsemwktxcftzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398256.3314893-2932-42442235947745/AnsiballZ_copy.py'
Nov 29 06:37:36 compute-0 sudo[151419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:36 compute-0 python3.9[151421]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:36 compute-0 sudo[151419]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:37 compute-0 sudo[151571]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkzgakesbvqbsdsyrwvxionoadzjpdbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398257.133215-2932-242433052680998/AnsiballZ_copy.py'
Nov 29 06:37:37 compute-0 sudo[151571]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:37 compute-0 python3.9[151573]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:37 compute-0 sudo[151571]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:38 compute-0 sudo[151723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqxfbjfhvfpjieosrwoyygaquihvsewh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398257.8825557-2932-68671824586535/AnsiballZ_copy.py'
Nov 29 06:37:38 compute-0 sudo[151723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:38 compute-0 podman[151725]: 2025-11-29 06:37:38.316695971 +0000 UTC m=+0.084789744 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 29 06:37:38 compute-0 python3.9[151726]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:38 compute-0 sudo[151723]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:39 compute-0 sudo[151901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkfwvmzlghqiapktbniedmzsxlhmjqcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398258.6116133-2932-82429328555824/AnsiballZ_copy.py'
Nov 29 06:37:39 compute-0 sudo[151901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:39 compute-0 python3.9[151903]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:39 compute-0 sudo[151901]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:40 compute-0 sudo[152053]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odvjmvxdkvglwblutvkyxtzmvpppehli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398260.6991389-3040-255170839601502/AnsiballZ_copy.py'
Nov 29 06:37:40 compute-0 sudo[152053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:41 compute-0 python3.9[152055]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:41 compute-0 sudo[152053]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:41 compute-0 sudo[152205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eixfiyenybilifwvjsoaemeetxiehwbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398261.3396232-3040-22323597448493/AnsiballZ_copy.py'
Nov 29 06:37:41 compute-0 sudo[152205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:41 compute-0 python3.9[152207]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:41 compute-0 sudo[152205]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:42 compute-0 sudo[152357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sssvhawmcixbqtksvowzqvokoebjeewm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398261.9747112-3040-26394092730110/AnsiballZ_copy.py'
Nov 29 06:37:42 compute-0 sudo[152357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:42 compute-0 python3.9[152359]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:42 compute-0 sudo[152357]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:42 compute-0 sudo[152509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iidkbszrbzagqfycmpqagvibxgqdqmyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398262.6211638-3040-97156854779114/AnsiballZ_copy.py'
Nov 29 06:37:42 compute-0 sudo[152509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:43 compute-0 python3.9[152511]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:43 compute-0 sudo[152509]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:43 compute-0 sudo[152661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzjihjvfktuntduwqziagcfvdkitmnqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398263.2502053-3040-136625275057836/AnsiballZ_copy.py'
Nov 29 06:37:43 compute-0 sudo[152661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:43 compute-0 python3.9[152663]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:43 compute-0 sudo[152661]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:45 compute-0 sudo[152813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxqahfekbqgydlbgusyhfgvriscxxtvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398264.7804928-3148-104925400950662/AnsiballZ_systemd.py'
Nov 29 06:37:45 compute-0 sudo[152813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:45 compute-0 python3.9[152815]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:37:45 compute-0 systemd[1]: Reloading.
Nov 29 06:37:45 compute-0 systemd-rc-local-generator[152845]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:37:45 compute-0 systemd-sysv-generator[152848]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:37:45 compute-0 systemd[1]: Starting libvirt logging daemon socket...
Nov 29 06:37:45 compute-0 systemd[1]: Listening on libvirt logging daemon socket.
Nov 29 06:37:45 compute-0 systemd[1]: Starting libvirt logging daemon admin socket...
Nov 29 06:37:45 compute-0 systemd[1]: Listening on libvirt logging daemon admin socket.
Nov 29 06:37:45 compute-0 systemd[1]: Starting libvirt logging daemon...
Nov 29 06:37:45 compute-0 systemd[1]: Started libvirt logging daemon.
Nov 29 06:37:45 compute-0 sudo[152813]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:46 compute-0 sudo[153008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqsfscgpxozpzeowrlbxfcolrilenycl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398265.984717-3148-272464952336073/AnsiballZ_systemd.py'
Nov 29 06:37:46 compute-0 sudo[153008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:46 compute-0 python3.9[153010]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:37:46 compute-0 systemd[1]: Reloading.
Nov 29 06:37:46 compute-0 systemd-rc-local-generator[153037]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:37:46 compute-0 systemd-sysv-generator[153040]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:37:46 compute-0 systemd[1]: Starting libvirt nodedev daemon socket...
Nov 29 06:37:46 compute-0 systemd[1]: Listening on libvirt nodedev daemon socket.
Nov 29 06:37:46 compute-0 systemd[1]: Starting libvirt nodedev daemon admin socket...
Nov 29 06:37:46 compute-0 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Nov 29 06:37:46 compute-0 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Nov 29 06:37:46 compute-0 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Nov 29 06:37:46 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Nov 29 06:37:46 compute-0 systemd[1]: Started libvirt nodedev daemon.
Nov 29 06:37:46 compute-0 sudo[153008]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:47 compute-0 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Nov 29 06:37:47 compute-0 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Nov 29 06:37:47 compute-0 sudo[153224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azpehbjxuqcposqnpwvfwidixexkbaqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398267.0679579-3148-12380740495917/AnsiballZ_systemd.py'
Nov 29 06:37:47 compute-0 sudo[153224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:47 compute-0 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Nov 29 06:37:47 compute-0 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Nov 29 06:37:47 compute-0 python3.9[153226]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:37:47 compute-0 systemd[1]: Reloading.
Nov 29 06:37:47 compute-0 systemd-rc-local-generator[153258]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:37:47 compute-0 systemd-sysv-generator[153262]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:37:47 compute-0 systemd[1]: Starting libvirt proxy daemon admin socket...
Nov 29 06:37:47 compute-0 systemd[1]: Starting libvirt proxy daemon read-only socket...
Nov 29 06:37:47 compute-0 systemd[1]: Listening on libvirt proxy daemon admin socket.
Nov 29 06:37:47 compute-0 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Nov 29 06:37:47 compute-0 systemd[1]: Starting libvirt proxy daemon...
Nov 29 06:37:48 compute-0 systemd[1]: Started libvirt proxy daemon.
Nov 29 06:37:48 compute-0 sudo[153224]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:48 compute-0 setroubleshoot[153101]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 8903f98c-662e-41e9-87ca-a3a85626e7a4
Nov 29 06:37:48 compute-0 setroubleshoot[153101]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Nov 29 06:37:48 compute-0 setroubleshoot[153101]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 8903f98c-662e-41e9-87ca-a3a85626e7a4
Nov 29 06:37:48 compute-0 setroubleshoot[153101]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Nov 29 06:37:48 compute-0 sudo[153445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqwvnjipzwfohpouxtnaklfppdwxqiyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398268.444224-3148-270934026079293/AnsiballZ_systemd.py'
Nov 29 06:37:48 compute-0 sudo[153445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:49 compute-0 python3.9[153447]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:37:49 compute-0 systemd[1]: Reloading.
Nov 29 06:37:49 compute-0 systemd-sysv-generator[153476]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:37:49 compute-0 systemd-rc-local-generator[153470]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:37:49 compute-0 systemd[1]: Listening on libvirt locking daemon socket.
Nov 29 06:37:49 compute-0 systemd[1]: Starting libvirt QEMU daemon socket...
Nov 29 06:37:49 compute-0 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Nov 29 06:37:49 compute-0 systemd[1]: Starting Virtual Machine and Container Registration Service...
Nov 29 06:37:49 compute-0 systemd[1]: Listening on libvirt QEMU daemon socket.
Nov 29 06:37:49 compute-0 systemd[1]: Starting libvirt QEMU daemon admin socket...
Nov 29 06:37:49 compute-0 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Nov 29 06:37:49 compute-0 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Nov 29 06:37:49 compute-0 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Nov 29 06:37:49 compute-0 systemd[1]: Started Virtual Machine and Container Registration Service.
Nov 29 06:37:49 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Nov 29 06:37:49 compute-0 systemd[1]: Started libvirt QEMU daemon.
Nov 29 06:37:49 compute-0 sudo[153445]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:49 compute-0 sudo[153659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uavuecppfnetdqpqsmjoanltvzqyanwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398269.5669918-3148-161562853908345/AnsiballZ_systemd.py'
Nov 29 06:37:49 compute-0 sudo[153659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:50 compute-0 python3.9[153661]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:37:50 compute-0 systemd[1]: Reloading.
Nov 29 06:37:50 compute-0 systemd-sysv-generator[153691]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:37:50 compute-0 systemd-rc-local-generator[153685]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:37:50 compute-0 systemd[1]: Starting libvirt secret daemon socket...
Nov 29 06:37:50 compute-0 systemd[1]: Listening on libvirt secret daemon socket.
Nov 29 06:37:50 compute-0 systemd[1]: Starting libvirt secret daemon admin socket...
Nov 29 06:37:50 compute-0 systemd[1]: Starting libvirt secret daemon read-only socket...
Nov 29 06:37:50 compute-0 systemd[1]: Listening on libvirt secret daemon admin socket.
Nov 29 06:37:50 compute-0 systemd[1]: Listening on libvirt secret daemon read-only socket.
Nov 29 06:37:50 compute-0 systemd[1]: Starting libvirt secret daemon...
Nov 29 06:37:50 compute-0 systemd[1]: Started libvirt secret daemon.
Nov 29 06:37:50 compute-0 sudo[153659]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:53 compute-0 sudo[153870]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssazkvnytlahyvxxcszoonwnydtvznmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398273.2730527-3259-20300606969388/AnsiballZ_file.py'
Nov 29 06:37:53 compute-0 sudo[153870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:53 compute-0 python3.9[153872]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:53 compute-0 sudo[153870]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:54 compute-0 sudo[154022]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkbwinwssmhsfxrlpxvtyhigpxoxvwdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398274.048825-3283-247765384274059/AnsiballZ_find.py'
Nov 29 06:37:54 compute-0 sudo[154022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:54 compute-0 python3.9[154024]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 29 06:37:54 compute-0 sudo[154022]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:56 compute-0 sudo[154174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsjnspgubdekfzsylzfyuzfkdnhhtwzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398275.2021077-3325-146555863682113/AnsiballZ_stat.py'
Nov 29 06:37:56 compute-0 sudo[154174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:56 compute-0 python3.9[154176]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:56 compute-0 sudo[154174]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:56 compute-0 sudo[154297]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-waeaqqsczmygwdmblyddvslmtnbyenid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398275.2021077-3325-146555863682113/AnsiballZ_copy.py'
Nov 29 06:37:56 compute-0 sudo[154297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:56 compute-0 python3.9[154299]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398275.2021077-3325-146555863682113/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:57 compute-0 sudo[154297]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:57 compute-0 sudo[154449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcfiqzypmnzmgkrmvdftqofkqiohkcto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398277.5520692-3373-211605178131335/AnsiballZ_file.py'
Nov 29 06:37:57 compute-0 sudo[154449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:58 compute-0 python3.9[154451]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:58 compute-0 sudo[154449]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:58 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Nov 29 06:37:58 compute-0 systemd[1]: setroubleshootd.service: Deactivated successfully.
Nov 29 06:37:58 compute-0 sudo[154601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrjueiopqqsoyefbwovrhzopgqazthcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398278.2998555-3397-89121448549740/AnsiballZ_stat.py'
Nov 29 06:37:58 compute-0 sudo[154601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:58 compute-0 python3.9[154604]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:37:58 compute-0 sudo[154601]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:59 compute-0 sudo[154680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqfmvsdxoxgbvjolbyeurptesjqgulra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398278.2998555-3397-89121448549740/AnsiballZ_file.py'
Nov 29 06:37:59 compute-0 sudo[154680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:37:59 compute-0 python3.9[154682]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:37:59 compute-0 sudo[154680]: pam_unix(sudo:session): session closed for user root
Nov 29 06:37:59 compute-0 sudo[154832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vifbbbfkmadonfvusbnugsbvjoywfdfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398279.5949209-3433-101277022746138/AnsiballZ_stat.py'
Nov 29 06:37:59 compute-0 sudo[154832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:00 compute-0 python3.9[154834]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:38:00 compute-0 sudo[154832]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:00 compute-0 sudo[154910]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnsuogmrvvapdgnhwavrdttuoeofetyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398279.5949209-3433-101277022746138/AnsiballZ_file.py'
Nov 29 06:38:00 compute-0 sudo[154910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:00 compute-0 python3.9[154912]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.olhkqmra recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:00 compute-0 sudo[154910]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:01 compute-0 sudo[155064]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqlhquobbukgilatawvpqxwygmgyvvyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398280.986737-3469-131056033708239/AnsiballZ_stat.py'
Nov 29 06:38:01 compute-0 sudo[155064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:01 compute-0 python3.9[155066]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:38:01 compute-0 sudo[155064]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:01 compute-0 sudo[155142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dihijsnjwcjocgsdiebjhgdojjbzmrus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398280.986737-3469-131056033708239/AnsiballZ_file.py'
Nov 29 06:38:01 compute-0 sudo[155142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:01 compute-0 python3.9[155144]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:01 compute-0 sudo[155142]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:02 compute-0 sshd-session[154990]: Invalid user user01 from 1.214.197.163 port 60304
Nov 29 06:38:02 compute-0 podman[155193]: 2025-11-29 06:38:02.39473707 +0000 UTC m=+0.061331341 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent)
Nov 29 06:38:02 compute-0 sshd-session[154990]: Received disconnect from 1.214.197.163 port 60304:11: Bye Bye [preauth]
Nov 29 06:38:02 compute-0 sshd-session[154990]: Disconnected from invalid user user01 1.214.197.163 port 60304 [preauth]
Nov 29 06:38:02 compute-0 sudo[155311]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nigdwjblayvxbshdpmkziayphhgpmcer ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398282.291847-3508-24707362119508/AnsiballZ_command.py'
Nov 29 06:38:02 compute-0 sudo[155311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:02 compute-0 python3.9[155313]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:38:02 compute-0 sudo[155311]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:03 compute-0 sudo[155464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfnmphzrzetbxvfqmwaquauurqbrnpct ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764398283.0156567-3532-120580825312726/AnsiballZ_edpm_nftables_from_files.py'
Nov 29 06:38:03 compute-0 sudo[155464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:03 compute-0 python3[155466]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 29 06:38:03 compute-0 sudo[155464]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:04 compute-0 sudo[155616]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blwrfhtrfqxiymfqcwkudbyidhixbwvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398283.9462671-3556-240448054805519/AnsiballZ_stat.py'
Nov 29 06:38:04 compute-0 sudo[155616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:04 compute-0 python3.9[155618]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:38:04 compute-0 sudo[155616]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:04 compute-0 sudo[155696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbfxfkxwtdueyfqhrcthpfaqvqkgagrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398283.9462671-3556-240448054805519/AnsiballZ_file.py'
Nov 29 06:38:04 compute-0 sudo[155696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:04 compute-0 python3.9[155698]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:05 compute-0 sudo[155696]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:05 compute-0 sudo[155848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anxnehqijgahazmafgyhvzicsoofpsnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398285.5020576-3592-156003437082837/AnsiballZ_stat.py'
Nov 29 06:38:05 compute-0 sudo[155848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:05 compute-0 python3.9[155850]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:38:06 compute-0 sudo[155848]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:06 compute-0 sshd-session[155619]: Received disconnect from 114.66.38.28 port 32950:11:  [preauth]
Nov 29 06:38:06 compute-0 sshd-session[155619]: Disconnected from authenticating user root 114.66.38.28 port 32950 [preauth]
Nov 29 06:38:06 compute-0 sudo[155926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-veabbiwhqglutoxcbgncepwkinfzohie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398285.5020576-3592-156003437082837/AnsiballZ_file.py'
Nov 29 06:38:06 compute-0 sudo[155926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:06 compute-0 python3.9[155928]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:06 compute-0 sudo[155926]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:07 compute-0 sudo[156078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltavvgxbabnmjyjlbsmusovgcjvupqvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398286.6845133-3628-167974489631234/AnsiballZ_stat.py'
Nov 29 06:38:07 compute-0 sudo[156078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:07 compute-0 python3.9[156080]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:38:07 compute-0 sudo[156078]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:07 compute-0 sudo[156156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aceppgdwbadaucamydtcklehcxlcabmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398286.6845133-3628-167974489631234/AnsiballZ_file.py'
Nov 29 06:38:07 compute-0 sudo[156156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:07 compute-0 python3.9[156158]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:07 compute-0 sudo[156156]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:08 compute-0 sudo[156308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxceoeewegqrpxbdfsqsvbjisxpmcjlt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398287.9323804-3664-90689324262232/AnsiballZ_stat.py'
Nov 29 06:38:08 compute-0 sudo[156308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:08 compute-0 python3.9[156310]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:38:08 compute-0 sudo[156308]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:08 compute-0 sudo[156396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pygdiumzryjhfleecpiiqpjldvgorffn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398287.9323804-3664-90689324262232/AnsiballZ_file.py'
Nov 29 06:38:08 compute-0 sudo[156396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:08 compute-0 podman[156360]: 2025-11-29 06:38:08.807252338 +0000 UTC m=+0.088186468 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 06:38:08 compute-0 python3.9[156404]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:08 compute-0 sudo[156396]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:09 compute-0 sudo[156567]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahrgjtwxjjumapooxntyqtrtdvxbiocu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398289.2983372-3700-5937553293672/AnsiballZ_stat.py'
Nov 29 06:38:09 compute-0 sudo[156567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:09 compute-0 python3.9[156569]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:38:09 compute-0 sudo[156567]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:10 compute-0 sudo[156692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wodcveqgrmbadplmvvihbaetztaqqkje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398289.2983372-3700-5937553293672/AnsiballZ_copy.py'
Nov 29 06:38:10 compute-0 sudo[156692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:10 compute-0 python3.9[156694]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398289.2983372-3700-5937553293672/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:10 compute-0 sudo[156692]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:11 compute-0 sudo[156844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivnfswztejsfefqddybydyaljcdtxdqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398290.7901366-3745-23973804250634/AnsiballZ_file.py'
Nov 29 06:38:11 compute-0 sudo[156844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:11 compute-0 python3.9[156846]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:11 compute-0 sudo[156844]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:11 compute-0 sudo[156996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htednibnulzyusczqelnsnkbiivtluls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398291.5572326-3769-178975226376105/AnsiballZ_command.py'
Nov 29 06:38:11 compute-0 sudo[156996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:12 compute-0 python3.9[156998]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:38:12 compute-0 sudo[156996]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:13 compute-0 sudo[157151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmlbbhjrvdqprbofpmrrdavwffwmrhis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398292.4915395-3793-213945225925993/AnsiballZ_blockinfile.py'
Nov 29 06:38:13 compute-0 sudo[157151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:13 compute-0 python3.9[157153]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:13 compute-0 sudo[157151]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:13 compute-0 sudo[157305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqmcwjvddhruheciogilrlseudrtgntr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398293.5815578-3820-50925401860424/AnsiballZ_command.py'
Nov 29 06:38:13 compute-0 sudo[157305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:14 compute-0 python3.9[157307]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:38:14 compute-0 sudo[157305]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:14 compute-0 sshd-session[157154]: Invalid user linuxacademy from 160.202.8.218 port 38072
Nov 29 06:38:14 compute-0 sshd-session[157154]: Received disconnect from 160.202.8.218 port 38072:11: Bye Bye [preauth]
Nov 29 06:38:14 compute-0 sshd-session[157154]: Disconnected from invalid user linuxacademy 160.202.8.218 port 38072 [preauth]
Nov 29 06:38:14 compute-0 sudo[157458]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvrjusnnegxpplcbivsvxfjrszlemfdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398294.355619-3844-276473023269542/AnsiballZ_stat.py'
Nov 29 06:38:14 compute-0 sudo[157458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:14 compute-0 python3.9[157460]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:38:14 compute-0 sudo[157458]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:15 compute-0 sudo[157612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqofsbvuoecxtcfqknnqupsexvpvmpbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398295.1255465-3868-107816575451439/AnsiballZ_command.py'
Nov 29 06:38:15 compute-0 sudo[157612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:15 compute-0 python3.9[157614]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:38:15 compute-0 sudo[157612]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:16 compute-0 sudo[157767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgehepggxfudmpwrffvfgajneeinemgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398295.8410645-3892-119368562829396/AnsiballZ_file.py'
Nov 29 06:38:16 compute-0 sudo[157767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:16 compute-0 python3.9[157769]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:16 compute-0 sudo[157767]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:16 compute-0 sudo[157919]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtkqlsarkilazuzfgkavxrxzswjbebpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398296.5567544-3916-246328679353896/AnsiballZ_stat.py'
Nov 29 06:38:16 compute-0 sudo[157919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:17 compute-0 python3.9[157921]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:38:17 compute-0 sudo[157919]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:17 compute-0 sudo[158042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgmqcnzskihdtvlmsbboarmmyakwffil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398296.5567544-3916-246328679353896/AnsiballZ_copy.py'
Nov 29 06:38:17 compute-0 sudo[158042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:17 compute-0 python3.9[158044]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398296.5567544-3916-246328679353896/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:17 compute-0 sudo[158042]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:18 compute-0 sudo[158194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgmniicbciqgdiaozvzvpxzufsenxjlv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398298.094169-3961-82447193803109/AnsiballZ_stat.py'
Nov 29 06:38:18 compute-0 sudo[158194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:18 compute-0 python3.9[158196]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:38:18 compute-0 sudo[158194]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:18 compute-0 sudo[158317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwawpxiufaqudenykvlftcuebniacwmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398298.094169-3961-82447193803109/AnsiballZ_copy.py'
Nov 29 06:38:18 compute-0 sudo[158317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:19 compute-0 python3.9[158319]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398298.094169-3961-82447193803109/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:19 compute-0 sudo[158317]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:19 compute-0 sudo[158469]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csjedqwibgyaiczhbebnfwabbndcnyet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398299.3927176-4006-206888797474950/AnsiballZ_stat.py'
Nov 29 06:38:19 compute-0 sudo[158469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:19 compute-0 python3.9[158471]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:38:19 compute-0 sudo[158469]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:20 compute-0 sudo[158592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmfymqbjaoeozdiakifeghsowtkmhbyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398299.3927176-4006-206888797474950/AnsiballZ_copy.py'
Nov 29 06:38:20 compute-0 sudo[158592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:20 compute-0 python3.9[158594]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398299.3927176-4006-206888797474950/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:20 compute-0 sudo[158592]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:21 compute-0 sudo[158744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrssyzojxmnoikcgcmyliruyxwakgjox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398300.856877-4051-253523755905928/AnsiballZ_systemd.py'
Nov 29 06:38:21 compute-0 sudo[158744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:21 compute-0 python3.9[158746]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:38:21 compute-0 systemd[1]: Reloading.
Nov 29 06:38:21 compute-0 systemd-rc-local-generator[158775]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:38:21 compute-0 systemd-sysv-generator[158779]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:38:22 compute-0 systemd[1]: Reached target edpm_libvirt.target.
Nov 29 06:38:22 compute-0 sudo[158744]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:22 compute-0 sudo[158936]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsxpvtdarvenglzfzywhplseidiorcxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398302.2240505-4075-21226243066973/AnsiballZ_systemd.py'
Nov 29 06:38:22 compute-0 sudo[158936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:22 compute-0 python3.9[158938]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 29 06:38:22 compute-0 systemd[1]: Reloading.
Nov 29 06:38:22 compute-0 systemd-rc-local-generator[158967]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:38:22 compute-0 systemd-sysv-generator[158970]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:38:23 compute-0 systemd[1]: Reloading.
Nov 29 06:38:23 compute-0 systemd-rc-local-generator[159005]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:38:23 compute-0 systemd-sysv-generator[159008]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:38:23 compute-0 sudo[158936]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:24 compute-0 sshd-session[104403]: Connection closed by 192.168.122.30 port 55806
Nov 29 06:38:24 compute-0 sshd-session[104400]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:38:24 compute-0 systemd[1]: session-22.scope: Deactivated successfully.
Nov 29 06:38:24 compute-0 systemd[1]: session-22.scope: Consumed 3min 15.679s CPU time.
Nov 29 06:38:24 compute-0 systemd-logind[788]: Session 22 logged out. Waiting for processes to exit.
Nov 29 06:38:24 compute-0 systemd-logind[788]: Removed session 22.
Nov 29 06:38:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:38:24.791 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:38:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:38:24.793 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:38:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:38:24.793 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:38:30 compute-0 sshd-session[159037]: Accepted publickey for zuul from 192.168.122.30 port 36430 ssh2: ECDSA SHA256:Ey+6YDlYfkE2dRe/gjWhHvvrJHee4xZo2Q1JojEuVBA
Nov 29 06:38:30 compute-0 systemd-logind[788]: New session 23 of user zuul.
Nov 29 06:38:30 compute-0 systemd[1]: Started Session 23 of User zuul.
Nov 29 06:38:30 compute-0 sshd-session[159037]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:38:31 compute-0 python3.9[159190]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:38:32 compute-0 podman[159318]: 2025-11-29 06:38:32.776737714 +0000 UTC m=+0.058606684 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent)
Nov 29 06:38:32 compute-0 python3.9[159357]: ansible-ansible.builtin.service_facts Invoked
Nov 29 06:38:32 compute-0 network[159380]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 06:38:33 compute-0 network[159381]: 'network-scripts' will be removed from distribution in near future.
Nov 29 06:38:33 compute-0 network[159382]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 06:38:34 compute-0 sshd-session[159388]: Invalid user openbravo from 45.202.211.6 port 36920
Nov 29 06:38:34 compute-0 sshd-session[159388]: Received disconnect from 45.202.211.6 port 36920:11: Bye Bye [preauth]
Nov 29 06:38:34 compute-0 sshd-session[159388]: Disconnected from invalid user openbravo 45.202.211.6 port 36920 [preauth]
Nov 29 06:38:37 compute-0 sudo[159653]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvohbhggpvpcblvljjcocudmksjiqfam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398317.2599134-106-148768637394126/AnsiballZ_setup.py'
Nov 29 06:38:37 compute-0 sudo[159653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:37 compute-0 python3.9[159655]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 06:38:38 compute-0 sudo[159653]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:38 compute-0 sudo[159737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsfhddvdtzcqklqmlcgyduagrbagqekh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398317.2599134-106-148768637394126/AnsiballZ_dnf.py'
Nov 29 06:38:38 compute-0 sudo[159737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:38 compute-0 python3.9[159739]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:38:39 compute-0 podman[159741]: 2025-11-29 06:38:39.833977772 +0000 UTC m=+0.103681125 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller)
Nov 29 06:38:44 compute-0 sudo[159737]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:45 compute-0 sudo[159916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzshrgegbobfcuzppdqbqgqlrkirsuqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398324.6164703-142-249687148365727/AnsiballZ_stat.py'
Nov 29 06:38:45 compute-0 sudo[159916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:45 compute-0 python3.9[159918]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:38:45 compute-0 sudo[159916]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:45 compute-0 sudo[160068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfnozmpsywkeetykehatztypcimxuhxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398325.5897973-172-181471767348389/AnsiballZ_command.py'
Nov 29 06:38:45 compute-0 sudo[160068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:46 compute-0 python3.9[160070]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:38:46 compute-0 sudo[160068]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:46 compute-0 sudo[160221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wseivczkszuulyvmfpmgsppphnthront ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398326.6589968-202-157249062773665/AnsiballZ_stat.py'
Nov 29 06:38:46 compute-0 sudo[160221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:47 compute-0 python3.9[160223]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:38:47 compute-0 sudo[160221]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:47 compute-0 sudo[160373]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gihnbbryzutfyjxffobnoslkctpwyueb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398327.4398942-226-165897653544574/AnsiballZ_command.py'
Nov 29 06:38:47 compute-0 sudo[160373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:47 compute-0 python3.9[160375]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:38:47 compute-0 sudo[160373]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:48 compute-0 sudo[160526]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcnkapebpulqcfpfdkcielhtodgrphgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398328.1673896-250-87271631455089/AnsiballZ_stat.py'
Nov 29 06:38:48 compute-0 sudo[160526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:48 compute-0 python3.9[160528]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:38:48 compute-0 sudo[160526]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:49 compute-0 sudo[160649]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqyywvkskdpezzcvnahtvoicktzhkbwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398328.1673896-250-87271631455089/AnsiballZ_copy.py'
Nov 29 06:38:49 compute-0 sudo[160649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:49 compute-0 python3.9[160651]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398328.1673896-250-87271631455089/.source.iscsi _original_basename=.89wxq008 follow=False checksum=65251aea09ccdf1ecf14724cc5f66f63e73fa9ed backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:49 compute-0 sudo[160649]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:50 compute-0 sudo[160801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccydoispaaepcjdffwmplimzdqybfwan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398329.5590544-295-122384017820660/AnsiballZ_file.py'
Nov 29 06:38:50 compute-0 sudo[160801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:50 compute-0 python3.9[160803]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:50 compute-0 sudo[160801]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:50 compute-0 sudo[160953]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kiinczwfytdyjmwirhegxmnmakztbuhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398330.4539824-319-87202380403522/AnsiballZ_lineinfile.py'
Nov 29 06:38:50 compute-0 sudo[160953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:51 compute-0 python3.9[160955]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:38:51 compute-0 sudo[160953]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:51 compute-0 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 06:38:51 compute-0 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 06:38:51 compute-0 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 06:38:52 compute-0 sudo[161106]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egcvcguyzjkcfdqbhvpemezliqiaevho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398331.4754326-346-258376639911255/AnsiballZ_systemd_service.py'
Nov 29 06:38:52 compute-0 sudo[161106]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:52 compute-0 python3.9[161108]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:38:52 compute-0 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Nov 29 06:38:52 compute-0 sudo[161106]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:53 compute-0 sudo[161262]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhsxkgygozuaeidzasaqgyhuedydmjqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398332.8000934-370-169582948413668/AnsiballZ_systemd_service.py'
Nov 29 06:38:53 compute-0 sudo[161262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:53 compute-0 python3.9[161264]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:38:53 compute-0 systemd[1]: Reloading.
Nov 29 06:38:53 compute-0 systemd-rc-local-generator[161295]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:38:53 compute-0 systemd-sysv-generator[161298]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:38:53 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Nov 29 06:38:53 compute-0 systemd[1]: Starting Open-iSCSI...
Nov 29 06:38:53 compute-0 kernel: Loading iSCSI transport class v2.0-870.
Nov 29 06:38:53 compute-0 systemd[1]: Started Open-iSCSI.
Nov 29 06:38:53 compute-0 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Nov 29 06:38:53 compute-0 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Nov 29 06:38:53 compute-0 sudo[161262]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:54 compute-0 sshd-session[161265]: Received disconnect from 179.125.24.202 port 44620:11: Bye Bye [preauth]
Nov 29 06:38:54 compute-0 sshd-session[161265]: Disconnected from authenticating user root 179.125.24.202 port 44620 [preauth]
Nov 29 06:38:54 compute-0 sudo[161464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpvejqausronkzfkmtoxdkqxkylgchdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398334.5196583-403-198180412030539/AnsiballZ_service_facts.py'
Nov 29 06:38:54 compute-0 sudo[161464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:55 compute-0 python3.9[161466]: ansible-ansible.builtin.service_facts Invoked
Nov 29 06:38:55 compute-0 network[161483]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 06:38:55 compute-0 network[161484]: 'network-scripts' will be removed from distribution in near future.
Nov 29 06:38:55 compute-0 network[161485]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 06:38:58 compute-0 sudo[161464]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:59 compute-0 sudo[161756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdunweydmxawmcocrhgcflqwiodyvskc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398339.1458995-433-152367595828883/AnsiballZ_file.py'
Nov 29 06:38:59 compute-0 sudo[161756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:38:59 compute-0 python3.9[161758]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 29 06:38:59 compute-0 sshd-session[161572]: Invalid user sftp from 103.179.56.44 port 48546
Nov 29 06:38:59 compute-0 sudo[161756]: pam_unix(sudo:session): session closed for user root
Nov 29 06:38:59 compute-0 sshd-session[161572]: Received disconnect from 103.179.56.44 port 48546:11: Bye Bye [preauth]
Nov 29 06:38:59 compute-0 sshd-session[161572]: Disconnected from invalid user sftp 103.179.56.44 port 48546 [preauth]
Nov 29 06:39:00 compute-0 sudo[161908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhupkowlccxlgxyqlmskvtdcdsqisttj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398339.8484817-457-229180344073629/AnsiballZ_modprobe.py'
Nov 29 06:39:00 compute-0 sudo[161908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:00 compute-0 python3.9[161910]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Nov 29 06:39:00 compute-0 sudo[161908]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:01 compute-0 sudo[162064]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqzqpaulnvleavnevscugjtgfxmwktnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398340.7886028-481-274750068007097/AnsiballZ_stat.py'
Nov 29 06:39:01 compute-0 sudo[162064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:01 compute-0 python3.9[162066]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:39:01 compute-0 sudo[162064]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:01 compute-0 sudo[162187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-myglrktoafdcbqmogqodcppszcbshqjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398340.7886028-481-274750068007097/AnsiballZ_copy.py'
Nov 29 06:39:01 compute-0 sudo[162187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:01 compute-0 python3.9[162189]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398340.7886028-481-274750068007097/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:39:01 compute-0 sudo[162187]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:02 compute-0 sudo[162339]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uoxxunowvkmvqzdpqryxcmklwrvnnbbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398342.1401236-529-93336928237644/AnsiballZ_lineinfile.py'
Nov 29 06:39:02 compute-0 sudo[162339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:02 compute-0 python3.9[162341]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:39:02 compute-0 sudo[162339]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:03 compute-0 sudo[162505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmbomahhhqfbqjtlbyhgodxzhdfbhmfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398342.810901-553-188710268920727/AnsiballZ_systemd.py'
Nov 29 06:39:03 compute-0 sudo[162505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:03 compute-0 podman[162465]: 2025-11-29 06:39:03.472503025 +0000 UTC m=+0.082018231 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Nov 29 06:39:03 compute-0 python3.9[162511]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:39:03 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 29 06:39:03 compute-0 systemd[1]: Stopped Load Kernel Modules.
Nov 29 06:39:03 compute-0 systemd[1]: Stopping Load Kernel Modules...
Nov 29 06:39:03 compute-0 systemd[1]: Starting Load Kernel Modules...
Nov 29 06:39:03 compute-0 systemd[1]: Finished Load Kernel Modules.
Nov 29 06:39:03 compute-0 sudo[162505]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:04 compute-0 sudo[162666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlkpvppyskoxeumkevwvldomndawqdmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398344.0853858-577-154256112608817/AnsiballZ_file.py'
Nov 29 06:39:04 compute-0 sudo[162666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:04 compute-0 python3.9[162668]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:39:04 compute-0 sudo[162666]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:05 compute-0 sudo[162818]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ueduykpckisxwojlgtnttgnfgvvivmck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398344.9191203-604-103584959184924/AnsiballZ_stat.py'
Nov 29 06:39:05 compute-0 sudo[162818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:05 compute-0 python3.9[162820]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:39:05 compute-0 sudo[162818]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:05 compute-0 sudo[162970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkykwrxbbfftwdleoseefvfplptmxbel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398345.6167674-631-245883309602674/AnsiballZ_stat.py'
Nov 29 06:39:05 compute-0 sudo[162970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:06 compute-0 python3.9[162972]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:39:06 compute-0 sudo[162970]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:06 compute-0 sudo[163122]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkabaujxbkfiwoivtmxjypvdpsjydvbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398346.302308-655-260366794142413/AnsiballZ_stat.py'
Nov 29 06:39:06 compute-0 sudo[163122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:06 compute-0 python3.9[163124]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:39:06 compute-0 sudo[163122]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:07 compute-0 sudo[163245]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzrzltfolyqtbyzfckdinoktjkkiohrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398346.302308-655-260366794142413/AnsiballZ_copy.py'
Nov 29 06:39:07 compute-0 sudo[163245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:07 compute-0 python3.9[163247]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398346.302308-655-260366794142413/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:39:07 compute-0 sudo[163245]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:07 compute-0 sudo[163397]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yteltabtyrrqkeuyuhujemcitcldlqgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398347.5922296-700-99653453166538/AnsiballZ_command.py'
Nov 29 06:39:07 compute-0 sudo[163397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:08 compute-0 python3.9[163399]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:39:08 compute-0 sudo[163397]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:08 compute-0 sudo[163550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmmaojyqunjaiygjyowgdvktrsydcsxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398348.2995377-724-153307956227119/AnsiballZ_lineinfile.py'
Nov 29 06:39:08 compute-0 sudo[163550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:08 compute-0 python3.9[163552]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:39:08 compute-0 sudo[163550]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:09 compute-0 sudo[163702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxpbjmzfpdrvzvacnfdhpolczlirxpfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398349.179945-748-5039344985678/AnsiballZ_replace.py'
Nov 29 06:39:09 compute-0 sudo[163702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:09 compute-0 python3.9[163704]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:39:09 compute-0 sudo[163702]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:10 compute-0 sudo[163864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smcsfkvpcbrjnorexjgojygohzqgmlyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398350.0618646-772-133591604369885/AnsiballZ_replace.py'
Nov 29 06:39:10 compute-0 sudo[163864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:10 compute-0 podman[163828]: 2025-11-29 06:39:10.373476589 +0000 UTC m=+0.107167365 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 29 06:39:10 compute-0 python3.9[163873]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:39:10 compute-0 sudo[163864]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:11 compute-0 sudo[164030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wesfqutkmybljfrslmvcyrxpodxepcnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398350.849016-799-64425396526517/AnsiballZ_lineinfile.py'
Nov 29 06:39:11 compute-0 sudo[164030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:11 compute-0 python3.9[164032]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:39:11 compute-0 sudo[164030]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:11 compute-0 sudo[164182]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzjlophpglykxyyrzbuherteqnnlxevf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398351.4840965-799-214561917771229/AnsiballZ_lineinfile.py'
Nov 29 06:39:11 compute-0 sudo[164182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:11 compute-0 python3.9[164184]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:39:11 compute-0 sudo[164182]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:12 compute-0 sudo[164334]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evbjaljobwsbvbprsfwpdghisazfnsyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398352.2030907-799-173152511990533/AnsiballZ_lineinfile.py'
Nov 29 06:39:12 compute-0 sudo[164334]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:12 compute-0 python3.9[164336]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:39:12 compute-0 sudo[164334]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:13 compute-0 sudo[164486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xoopjuytyukrjxmnshxlillwbylrzeed ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398352.8820612-799-164486498763591/AnsiballZ_lineinfile.py'
Nov 29 06:39:13 compute-0 sudo[164486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:13 compute-0 python3.9[164488]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:39:13 compute-0 sudo[164486]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:13 compute-0 sudo[164638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbbparubudfikcxtrvbahiykonchvtkj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398353.7159314-886-2360449761640/AnsiballZ_stat.py'
Nov 29 06:39:13 compute-0 sudo[164638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:14 compute-0 python3.9[164640]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:39:14 compute-0 sudo[164638]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:14 compute-0 sudo[164792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqmhdbloltzrkklbrudgkxrttkakojev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398354.497547-910-91323221401744/AnsiballZ_file.py'
Nov 29 06:39:14 compute-0 sudo[164792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:14 compute-0 python3.9[164794]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:39:14 compute-0 sudo[164792]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:15 compute-0 sudo[164944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzjzbtessbchtekgffnmimbqzyvdrdqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398355.3528774-937-224902740778428/AnsiballZ_file.py'
Nov 29 06:39:15 compute-0 sudo[164944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:15 compute-0 python3.9[164946]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:39:15 compute-0 sudo[164944]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:16 compute-0 sudo[165096]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgmjodnliukqghohqgzcszzhidvgljva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398356.2011395-961-140292490695794/AnsiballZ_stat.py'
Nov 29 06:39:16 compute-0 sudo[165096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:16 compute-0 python3.9[165098]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:39:16 compute-0 sudo[165096]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:16 compute-0 sudo[165174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-roxbuglbiscrjgrflviroruykentjhuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398356.2011395-961-140292490695794/AnsiballZ_file.py'
Nov 29 06:39:16 compute-0 sudo[165174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:17 compute-0 python3.9[165176]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:39:17 compute-0 sudo[165174]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:17 compute-0 sudo[165326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amwslrwbhygepkircqbipvdsynegkowk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398357.2792125-961-240150787199937/AnsiballZ_stat.py'
Nov 29 06:39:17 compute-0 sudo[165326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:17 compute-0 python3.9[165328]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:39:17 compute-0 sudo[165326]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:18 compute-0 sudo[165404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pllczcpcnrultupgkwotznjmagxlscwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398357.2792125-961-240150787199937/AnsiballZ_file.py'
Nov 29 06:39:18 compute-0 sudo[165404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:18 compute-0 python3.9[165406]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:39:18 compute-0 sudo[165404]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:18 compute-0 sudo[165556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xligjqkixyjknvsipcfxrjzjdkypgove ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398358.5141268-1030-158858618221148/AnsiballZ_file.py'
Nov 29 06:39:18 compute-0 sudo[165556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:18 compute-0 python3.9[165558]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:39:18 compute-0 sudo[165556]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:19 compute-0 sudo[165708]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-asdamyacyiqccmmnweelbisngokpsuop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398359.2224996-1054-146916242584905/AnsiballZ_stat.py'
Nov 29 06:39:19 compute-0 sudo[165708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:19 compute-0 python3.9[165710]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:39:19 compute-0 sudo[165708]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:19 compute-0 sudo[165786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcbszydumckplebsarffeabsltbrjtgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398359.2224996-1054-146916242584905/AnsiballZ_file.py'
Nov 29 06:39:19 compute-0 sudo[165786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:20 compute-0 python3.9[165788]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:39:20 compute-0 sudo[165786]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:20 compute-0 sudo[165938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rraufuehymvytyuqmxpzxgwbwyzbgruk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398360.3978112-1090-84857011865518/AnsiballZ_stat.py'
Nov 29 06:39:20 compute-0 sudo[165938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:20 compute-0 python3.9[165940]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:39:20 compute-0 sudo[165938]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:21 compute-0 sudo[166016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qztruyrpadyjjsypdordtzuewxtizkjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398360.3978112-1090-84857011865518/AnsiballZ_file.py'
Nov 29 06:39:21 compute-0 sudo[166016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:21 compute-0 python3.9[166018]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:39:21 compute-0 sudo[166016]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:22 compute-0 sudo[166168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyoggwrgrhzugoiqdrxfganxeebhgrvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398361.792734-1126-93646935064043/AnsiballZ_systemd.py'
Nov 29 06:39:22 compute-0 sudo[166168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:22 compute-0 python3.9[166170]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:39:22 compute-0 systemd[1]: Reloading.
Nov 29 06:39:22 compute-0 systemd-sysv-generator[166199]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:39:22 compute-0 systemd-rc-local-generator[166196]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:39:22 compute-0 sudo[166168]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:23 compute-0 sudo[166357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zywdkniqsqiipcyvivkypdhvoeknbodn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398362.9627154-1150-176155323245437/AnsiballZ_stat.py'
Nov 29 06:39:23 compute-0 sudo[166357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:23 compute-0 python3.9[166359]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:39:23 compute-0 sudo[166357]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:23 compute-0 sudo[166435]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqcuxfhvbkejgadyvcjcxgdfmltsakld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398362.9627154-1150-176155323245437/AnsiballZ_file.py'
Nov 29 06:39:23 compute-0 sudo[166435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:23 compute-0 python3.9[166437]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:39:23 compute-0 sudo[166435]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:24 compute-0 sudo[166587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lydmlgdvabzkhuvhnnaurgqzoxoymyzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398364.167376-1186-118985069074931/AnsiballZ_stat.py'
Nov 29 06:39:24 compute-0 sudo[166587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:24 compute-0 python3.9[166589]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:39:24 compute-0 sudo[166587]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:39:24.792 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:39:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:39:24.794 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:39:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:39:24.795 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:39:24 compute-0 sudo[166665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zguvixtwrccusciunmbzniqqyfijybcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398364.167376-1186-118985069074931/AnsiballZ_file.py'
Nov 29 06:39:24 compute-0 sudo[166665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:25 compute-0 python3.9[166667]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:39:25 compute-0 sudo[166665]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:25 compute-0 sudo[166817]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-navwxrsfsolpzxxuzjqfaoknqtlsjyxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398365.3996139-1222-65336877161287/AnsiballZ_systemd.py'
Nov 29 06:39:25 compute-0 sudo[166817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:25 compute-0 python3.9[166819]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:39:25 compute-0 systemd[1]: Reloading.
Nov 29 06:39:26 compute-0 systemd-rc-local-generator[166844]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:39:26 compute-0 systemd-sysv-generator[166849]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:39:26 compute-0 systemd[1]: Starting Create netns directory...
Nov 29 06:39:26 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 29 06:39:26 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 29 06:39:26 compute-0 systemd[1]: Finished Create netns directory.
Nov 29 06:39:26 compute-0 sudo[166817]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:27 compute-0 sudo[167010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyhkcredkygtdbdefutwzbjjshcyeyed ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398366.9375222-1252-43904574402994/AnsiballZ_file.py'
Nov 29 06:39:27 compute-0 sudo[167010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:27 compute-0 python3.9[167012]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:39:27 compute-0 sudo[167010]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:27 compute-0 sudo[167164]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpudunekhtddsjvpqrykktzwqhudzbhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398367.694957-1276-273448344340379/AnsiballZ_stat.py'
Nov 29 06:39:27 compute-0 sudo[167164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:28 compute-0 python3.9[167166]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:39:28 compute-0 sudo[167164]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:28 compute-0 sudo[167287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-labvrggfvnkunhabynulugerwwmsdsfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398367.694957-1276-273448344340379/AnsiballZ_copy.py'
Nov 29 06:39:28 compute-0 sudo[167287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:28 compute-0 python3.9[167289]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398367.694957-1276-273448344340379/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:39:28 compute-0 sudo[167287]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:29 compute-0 sshd-session[167136]: Invalid user odoo from 1.214.197.163 port 33454
Nov 29 06:39:29 compute-0 sshd-session[167136]: Received disconnect from 1.214.197.163 port 33454:11: Bye Bye [preauth]
Nov 29 06:39:29 compute-0 sshd-session[167136]: Disconnected from invalid user odoo 1.214.197.163 port 33454 [preauth]
Nov 29 06:39:29 compute-0 sudo[167439]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbwtxeuzfritwgduaryhhoztbfbrtfdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398369.3543677-1327-154319995269923/AnsiballZ_file.py'
Nov 29 06:39:29 compute-0 sudo[167439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:29 compute-0 python3.9[167441]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:39:29 compute-0 sudo[167439]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:30 compute-0 sudo[167591]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgqjtabsmtfiihoamfckdgzmkwqckkza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398370.1002133-1351-18819284421003/AnsiballZ_stat.py'
Nov 29 06:39:30 compute-0 sudo[167591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:30 compute-0 python3.9[167593]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:39:30 compute-0 sudo[167591]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:30 compute-0 sudo[167714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygferixycngvufqdtowvpjolvtmqpbja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398370.1002133-1351-18819284421003/AnsiballZ_copy.py'
Nov 29 06:39:30 compute-0 sudo[167714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:31 compute-0 python3.9[167716]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398370.1002133-1351-18819284421003/.source.json _original_basename=.f_dvqk3s follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:39:31 compute-0 sudo[167714]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:31 compute-0 sudo[167866]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tocrqcfqglvhlordlnymsuzuqtfsrfrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398371.6045053-1396-193349776209502/AnsiballZ_file.py'
Nov 29 06:39:31 compute-0 sudo[167866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:32 compute-0 python3.9[167868]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:39:32 compute-0 sudo[167866]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:32 compute-0 sudo[168018]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nibnnaddrzqbdthqaixokhlvnngxweff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398372.3532324-1420-57005982881521/AnsiballZ_stat.py'
Nov 29 06:39:32 compute-0 sudo[168018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:32 compute-0 sudo[168018]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:33 compute-0 sudo[168141]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msfofxxqaqwizcveloqahifdsokbtrwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398372.3532324-1420-57005982881521/AnsiballZ_copy.py'
Nov 29 06:39:33 compute-0 sudo[168141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:33 compute-0 sudo[168141]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:33 compute-0 podman[168168]: 2025-11-29 06:39:33.80352648 +0000 UTC m=+0.072913102 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2)
Nov 29 06:39:34 compute-0 sudo[168312]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvpygfvmwlfbyvdbmmmuyceeshyzgpdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398373.943425-1471-23863656854109/AnsiballZ_container_config_data.py'
Nov 29 06:39:34 compute-0 sudo[168312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:34 compute-0 python3.9[168314]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Nov 29 06:39:34 compute-0 sudo[168312]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:35 compute-0 sudo[168464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjdpzrjnivxnflxlueeynhjpfxuyxlfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398375.0063212-1498-185029952649871/AnsiballZ_container_config_hash.py'
Nov 29 06:39:35 compute-0 sudo[168464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:35 compute-0 python3.9[168466]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 29 06:39:35 compute-0 sudo[168464]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:36 compute-0 sudo[168616]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppgjbrgjdzcgqrhougjjmlvewqdoeaqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398376.0417445-1525-128773111286483/AnsiballZ_podman_container_info.py'
Nov 29 06:39:36 compute-0 sudo[168616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:36 compute-0 python3.9[168618]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 29 06:39:36 compute-0 sudo[168616]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:38 compute-0 sudo[168793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eakqqicfxrhsdmiiftvkjhnfghrasomc ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764398377.7449234-1564-138790651490359/AnsiballZ_edpm_container_manage.py'
Nov 29 06:39:38 compute-0 sudo[168793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:38 compute-0 python3[168795]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 29 06:39:38 compute-0 podman[168831]: 2025-11-29 06:39:38.707675343 +0000 UTC m=+0.026664568 image pull f275b8d168f7f57f31e3da49224019f39f95c80a833f083696a964527b07b54f quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 29 06:39:38 compute-0 podman[168831]: 2025-11-29 06:39:38.818721887 +0000 UTC m=+0.137711102 container create 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 06:39:38 compute-0 python3[168795]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 29 06:39:38 compute-0 sudo[168793]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:39 compute-0 sudo[169019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cppvzlnchjqavutorexxrpwkifyfrpgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398379.3450282-1588-180111456387873/AnsiballZ_stat.py'
Nov 29 06:39:39 compute-0 sudo[169019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:39 compute-0 python3.9[169021]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:39:39 compute-0 sudo[169019]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:40 compute-0 sudo[169190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlvaihybtevveqmkiahjcghxkiglrybq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398380.3266761-1615-252217724596876/AnsiballZ_file.py'
Nov 29 06:39:40 compute-0 sudo[169190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:40 compute-0 podman[169147]: 2025-11-29 06:39:40.699729349 +0000 UTC m=+0.135611553 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Nov 29 06:39:40 compute-0 python3.9[169195]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:39:40 compute-0 sudo[169190]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:41 compute-0 sudo[169275]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gagovdcuffyeegjimhscbsgpbpkloqjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398380.3266761-1615-252217724596876/AnsiballZ_stat.py'
Nov 29 06:39:41 compute-0 sudo[169275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:41 compute-0 python3.9[169277]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:39:41 compute-0 sudo[169275]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:41 compute-0 sudo[169426]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atsfelujgklksbbcndisrhetsegzqsae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398381.4428303-1615-135693381772787/AnsiballZ_copy.py'
Nov 29 06:39:41 compute-0 sudo[169426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:42 compute-0 python3.9[169428]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764398381.4428303-1615-135693381772787/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:39:42 compute-0 sudo[169426]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:42 compute-0 sudo[169502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pybjpxwilhrrjzhvfimeqcikupeheoep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398381.4428303-1615-135693381772787/AnsiballZ_systemd.py'
Nov 29 06:39:42 compute-0 sudo[169502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:42 compute-0 python3.9[169504]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 06:39:42 compute-0 systemd[1]: Reloading.
Nov 29 06:39:42 compute-0 systemd-rc-local-generator[169532]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:39:42 compute-0 systemd-sysv-generator[169535]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:39:42 compute-0 sudo[169502]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:43 compute-0 sudo[169615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwafxbyiyrcrpjswydiswugnpgwomwzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398381.4428303-1615-135693381772787/AnsiballZ_systemd.py'
Nov 29 06:39:43 compute-0 sudo[169615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:43 compute-0 python3.9[169617]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:39:43 compute-0 systemd[1]: Reloading.
Nov 29 06:39:43 compute-0 systemd-rc-local-generator[169647]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:39:43 compute-0 systemd-sysv-generator[169651]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:39:43 compute-0 systemd[1]: Starting multipathd container...
Nov 29 06:39:43 compute-0 sshd-session[169506]: Invalid user m from 45.202.211.6 port 54922
Nov 29 06:39:43 compute-0 systemd[1]: Started libcrun container.
Nov 29 06:39:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6db70deefe2a392e7d4b4d1c8f3da35dd65b48d4dbc2bf72dab40490e1d75938/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 29 06:39:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6db70deefe2a392e7d4b4d1c8f3da35dd65b48d4dbc2bf72dab40490e1d75938/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 29 06:39:43 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205.
Nov 29 06:39:43 compute-0 podman[169657]: 2025-11-29 06:39:43.903600236 +0000 UTC m=+0.120711220 container init 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd)
Nov 29 06:39:43 compute-0 multipathd[169673]: + sudo -E kolla_set_configs
Nov 29 06:39:43 compute-0 sudo[169679]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 29 06:39:43 compute-0 podman[169657]: 2025-11-29 06:39:43.937366605 +0000 UTC m=+0.154477599 container start 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 29 06:39:43 compute-0 sudo[169679]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 29 06:39:43 compute-0 sudo[169679]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 29 06:39:43 compute-0 multipathd[169673]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 06:39:43 compute-0 multipathd[169673]: INFO:__main__:Validating config file
Nov 29 06:39:43 compute-0 multipathd[169673]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 06:39:43 compute-0 multipathd[169673]: INFO:__main__:Writing out command to execute
Nov 29 06:39:43 compute-0 sudo[169679]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:43 compute-0 multipathd[169673]: ++ cat /run_command
Nov 29 06:39:43 compute-0 podman[169657]: multipathd
Nov 29 06:39:43 compute-0 multipathd[169673]: + CMD='/usr/sbin/multipathd -d'
Nov 29 06:39:43 compute-0 multipathd[169673]: + ARGS=
Nov 29 06:39:43 compute-0 multipathd[169673]: + sudo kolla_copy_cacerts
Nov 29 06:39:43 compute-0 systemd[1]: Started multipathd container.
Nov 29 06:39:43 compute-0 sudo[169693]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Nov 29 06:39:44 compute-0 sudo[169693]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 29 06:39:44 compute-0 sudo[169693]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 29 06:39:44 compute-0 sudo[169693]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:44 compute-0 multipathd[169673]: Running command: '/usr/sbin/multipathd -d'
Nov 29 06:39:44 compute-0 multipathd[169673]: + [[ ! -n '' ]]
Nov 29 06:39:44 compute-0 multipathd[169673]: + . kolla_extend_start
Nov 29 06:39:44 compute-0 multipathd[169673]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 29 06:39:44 compute-0 multipathd[169673]: + umask 0022
Nov 29 06:39:44 compute-0 multipathd[169673]: + exec /usr/sbin/multipathd -d
Nov 29 06:39:44 compute-0 multipathd[169673]: 3831.739692 | --------start up--------
Nov 29 06:39:44 compute-0 multipathd[169673]: 3831.739705 | read /etc/multipath.conf
Nov 29 06:39:44 compute-0 sudo[169615]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:44 compute-0 multipathd[169673]: 3831.746967 | path checkers start up
Nov 29 06:39:44 compute-0 sshd-session[169506]: Received disconnect from 45.202.211.6 port 54922:11: Bye Bye [preauth]
Nov 29 06:39:44 compute-0 sshd-session[169506]: Disconnected from invalid user m 45.202.211.6 port 54922 [preauth]
Nov 29 06:39:44 compute-0 podman[169680]: 2025-11-29 06:39:44.048020768 +0000 UTC m=+0.099510008 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 06:39:45 compute-0 python3.9[169862]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:39:45 compute-0 sudo[170014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnwpcuscgqoltbclmotfypiuhxlqywnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398385.661028-1723-237906677081355/AnsiballZ_command.py'
Nov 29 06:39:45 compute-0 sudo[170014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:46 compute-0 python3.9[170016]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:39:46 compute-0 sudo[170014]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:46 compute-0 sudo[170179]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxgupwvhrrsrkpqzxvbimqdeeqzajezx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398386.3994823-1747-144577877258822/AnsiballZ_systemd.py'
Nov 29 06:39:46 compute-0 sudo[170179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:46 compute-0 systemd[1]: virtnodedevd.service: Deactivated successfully.
Nov 29 06:39:46 compute-0 python3.9[170181]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:39:46 compute-0 systemd[1]: Stopping multipathd container...
Nov 29 06:39:47 compute-0 multipathd[169673]: 3834.762434 | exit (signal)
Nov 29 06:39:47 compute-0 multipathd[169673]: 3834.762956 | --------shut down-------
Nov 29 06:39:47 compute-0 systemd[1]: libpod-03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205.scope: Deactivated successfully.
Nov 29 06:39:47 compute-0 podman[170186]: 2025-11-29 06:39:47.075461463 +0000 UTC m=+0.063147715 container died 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=multipathd)
Nov 29 06:39:47 compute-0 systemd[1]: 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205-b9173d448f2a080.timer: Deactivated successfully.
Nov 29 06:39:47 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205.
Nov 29 06:39:47 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205-userdata-shm.mount: Deactivated successfully.
Nov 29 06:39:47 compute-0 systemd[1]: var-lib-containers-storage-overlay-6db70deefe2a392e7d4b4d1c8f3da35dd65b48d4dbc2bf72dab40490e1d75938-merged.mount: Deactivated successfully.
Nov 29 06:39:47 compute-0 podman[170186]: 2025-11-29 06:39:47.112650469 +0000 UTC m=+0.100336731 container cleanup 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:39:47 compute-0 podman[170186]: multipathd
Nov 29 06:39:47 compute-0 podman[170211]: multipathd
Nov 29 06:39:47 compute-0 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Nov 29 06:39:47 compute-0 systemd[1]: Stopped multipathd container.
Nov 29 06:39:47 compute-0 systemd[1]: Starting multipathd container...
Nov 29 06:39:47 compute-0 systemd[1]: Started libcrun container.
Nov 29 06:39:47 compute-0 sshd-session[170017]: Invalid user ftpadmin from 160.202.8.218 port 59798
Nov 29 06:39:47 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6db70deefe2a392e7d4b4d1c8f3da35dd65b48d4dbc2bf72dab40490e1d75938/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 29 06:39:47 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6db70deefe2a392e7d4b4d1c8f3da35dd65b48d4dbc2bf72dab40490e1d75938/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 29 06:39:47 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205.
Nov 29 06:39:47 compute-0 podman[170224]: 2025-11-29 06:39:47.289185274 +0000 UTC m=+0.099744134 container init 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 06:39:47 compute-0 multipathd[170240]: + sudo -E kolla_set_configs
Nov 29 06:39:47 compute-0 sudo[170246]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 29 06:39:47 compute-0 podman[170224]: 2025-11-29 06:39:47.312396013 +0000 UTC m=+0.122954873 container start 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Nov 29 06:39:47 compute-0 sudo[170246]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 29 06:39:47 compute-0 sudo[170246]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 29 06:39:47 compute-0 podman[170224]: multipathd
Nov 29 06:39:47 compute-0 systemd[1]: Started multipathd container.
Nov 29 06:39:47 compute-0 sudo[170179]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:47 compute-0 multipathd[170240]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 06:39:47 compute-0 multipathd[170240]: INFO:__main__:Validating config file
Nov 29 06:39:47 compute-0 multipathd[170240]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 06:39:47 compute-0 multipathd[170240]: INFO:__main__:Writing out command to execute
Nov 29 06:39:47 compute-0 sudo[170246]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:47 compute-0 multipathd[170240]: ++ cat /run_command
Nov 29 06:39:47 compute-0 multipathd[170240]: + CMD='/usr/sbin/multipathd -d'
Nov 29 06:39:47 compute-0 multipathd[170240]: + ARGS=
Nov 29 06:39:47 compute-0 multipathd[170240]: + sudo kolla_copy_cacerts
Nov 29 06:39:47 compute-0 podman[170247]: 2025-11-29 06:39:47.378761378 +0000 UTC m=+0.053693356 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0)
Nov 29 06:39:47 compute-0 sudo[170269]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Nov 29 06:39:47 compute-0 sudo[170269]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 29 06:39:47 compute-0 sudo[170269]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 29 06:39:47 compute-0 systemd[1]: 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205-6542ca14825c2328.service: Main process exited, code=exited, status=1/FAILURE
Nov 29 06:39:47 compute-0 systemd[1]: 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205-6542ca14825c2328.service: Failed with result 'exit-code'.
Nov 29 06:39:47 compute-0 sudo[170269]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:47 compute-0 multipathd[170240]: + [[ ! -n '' ]]
Nov 29 06:39:47 compute-0 multipathd[170240]: + . kolla_extend_start
Nov 29 06:39:47 compute-0 multipathd[170240]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 29 06:39:47 compute-0 multipathd[170240]: Running command: '/usr/sbin/multipathd -d'
Nov 29 06:39:47 compute-0 multipathd[170240]: + umask 0022
Nov 29 06:39:47 compute-0 multipathd[170240]: + exec /usr/sbin/multipathd -d
Nov 29 06:39:47 compute-0 multipathd[170240]: 3835.119659 | --------start up--------
Nov 29 06:39:47 compute-0 multipathd[170240]: 3835.119677 | read /etc/multipath.conf
Nov 29 06:39:47 compute-0 multipathd[170240]: 3835.124253 | path checkers start up
Nov 29 06:39:47 compute-0 sshd-session[170017]: Received disconnect from 160.202.8.218 port 59798:11: Bye Bye [preauth]
Nov 29 06:39:47 compute-0 sshd-session[170017]: Disconnected from invalid user ftpadmin 160.202.8.218 port 59798 [preauth]
Nov 29 06:39:48 compute-0 sudo[170429]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmvzyhzblmjssxcfbpvnylqcvefosoaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398387.794471-1771-52082468602051/AnsiballZ_file.py'
Nov 29 06:39:48 compute-0 sudo[170429]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:48 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 29 06:39:48 compute-0 python3.9[170431]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:39:48 compute-0 sudo[170429]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:49 compute-0 sudo[170582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shafecprffbxcbgqgenzlwredsiiikqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398388.9276364-1807-276917087268660/AnsiballZ_file.py'
Nov 29 06:39:49 compute-0 sudo[170582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:49 compute-0 python3.9[170584]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 29 06:39:49 compute-0 sudo[170582]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:49 compute-0 systemd[1]: virtqemud.service: Deactivated successfully.
Nov 29 06:39:49 compute-0 sudo[170735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzbljmdtfpgfileualjfacpgqprlrzfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398389.7431383-1831-273980648952417/AnsiballZ_modprobe.py'
Nov 29 06:39:49 compute-0 sudo[170735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:50 compute-0 python3.9[170737]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Nov 29 06:39:50 compute-0 kernel: Key type psk registered
Nov 29 06:39:50 compute-0 sudo[170735]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:50 compute-0 systemd[1]: virtsecretd.service: Deactivated successfully.
Nov 29 06:39:50 compute-0 sudo[170897]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpiufvdlcwecmgurdckszcmrgnwykekx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398390.558319-1855-108836804859520/AnsiballZ_stat.py'
Nov 29 06:39:50 compute-0 sudo[170897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:51 compute-0 python3.9[170899]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:39:51 compute-0 sudo[170897]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:51 compute-0 sudo[171020]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlvlbocrkbpamknrpwibjgzztycnotym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398390.558319-1855-108836804859520/AnsiballZ_copy.py'
Nov 29 06:39:51 compute-0 sudo[171020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:51 compute-0 python3.9[171022]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398390.558319-1855-108836804859520/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:39:51 compute-0 sudo[171020]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:52 compute-0 sudo[171172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahhugkfoxpoceqkjpbjhzwwnpyvcwugf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398392.0531673-1903-10187349319902/AnsiballZ_lineinfile.py'
Nov 29 06:39:52 compute-0 sudo[171172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:52 compute-0 python3.9[171174]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:39:52 compute-0 sudo[171172]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:53 compute-0 sudo[171324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtdpisptumxvjvdshsybuxtsqmmkktxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398392.7732604-1927-238221506953760/AnsiballZ_systemd.py'
Nov 29 06:39:53 compute-0 sudo[171324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:53 compute-0 python3.9[171326]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:39:53 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 29 06:39:53 compute-0 systemd[1]: Stopped Load Kernel Modules.
Nov 29 06:39:53 compute-0 systemd[1]: Stopping Load Kernel Modules...
Nov 29 06:39:53 compute-0 systemd[1]: Starting Load Kernel Modules...
Nov 29 06:39:53 compute-0 systemd[1]: Finished Load Kernel Modules.
Nov 29 06:39:53 compute-0 sudo[171324]: pam_unix(sudo:session): session closed for user root
Nov 29 06:39:54 compute-0 sudo[171480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcjdfsqvsghfsqqwaupynvjzojqsgjns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398393.872453-1951-9809935647403/AnsiballZ_dnf.py'
Nov 29 06:39:54 compute-0 sudo[171480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:39:54 compute-0 python3.9[171482]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 06:39:58 compute-0 systemd[1]: Reloading.
Nov 29 06:39:58 compute-0 systemd-rc-local-generator[171514]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:39:58 compute-0 systemd-sysv-generator[171518]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:39:58 compute-0 systemd[1]: Reloading.
Nov 29 06:39:58 compute-0 systemd-rc-local-generator[171550]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:39:58 compute-0 systemd-sysv-generator[171553]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:39:58 compute-0 systemd-logind[788]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 29 06:39:58 compute-0 systemd-logind[788]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 29 06:39:59 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 06:39:59 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 29 06:39:59 compute-0 systemd[1]: Reloading.
Nov 29 06:39:59 compute-0 systemd-rc-local-generator[171647]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:39:59 compute-0 systemd-sysv-generator[171650]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:39:59 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 06:40:00 compute-0 sudo[171480]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:01 compute-0 sudo[172840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcaljjunsitbcudwkvbtshqjlhzvekzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398400.8077936-1975-18617979972956/AnsiballZ_systemd_service.py'
Nov 29 06:40:01 compute-0 sudo[172840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:01 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 06:40:01 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 29 06:40:01 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.701s CPU time.
Nov 29 06:40:01 compute-0 systemd[1]: run-rf00313dc88cf4f3ea0a47d309308cf35.service: Deactivated successfully.
Nov 29 06:40:01 compute-0 python3.9[172874]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:40:01 compute-0 systemd[1]: Stopping Open-iSCSI...
Nov 29 06:40:01 compute-0 iscsid[161306]: iscsid shutting down.
Nov 29 06:40:01 compute-0 systemd[1]: iscsid.service: Deactivated successfully.
Nov 29 06:40:01 compute-0 systemd[1]: Stopped Open-iSCSI.
Nov 29 06:40:01 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Nov 29 06:40:01 compute-0 systemd[1]: Starting Open-iSCSI...
Nov 29 06:40:01 compute-0 systemd[1]: Started Open-iSCSI.
Nov 29 06:40:01 compute-0 sudo[172840]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:02 compute-0 python3.9[173090]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:40:03 compute-0 sudo[173244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrbojslezhsmlctzhskqtzslrgqlvbjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398403.0196283-2027-116389444232759/AnsiballZ_file.py'
Nov 29 06:40:03 compute-0 sudo[173244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:03 compute-0 python3.9[173246]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:03 compute-0 sudo[173244]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:04 compute-0 sudo[173406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvtddimqbwuxgphsoklkylfdchyinzjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398404.1148982-2060-113919925889091/AnsiballZ_systemd_service.py'
Nov 29 06:40:04 compute-0 sudo[173406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:04 compute-0 podman[173370]: 2025-11-29 06:40:04.392897201 +0000 UTC m=+0.051924804 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125)
Nov 29 06:40:04 compute-0 python3.9[173414]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 06:40:04 compute-0 systemd[1]: Reloading.
Nov 29 06:40:04 compute-0 systemd-rc-local-generator[173439]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:40:04 compute-0 systemd-sysv-generator[173443]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:40:04 compute-0 sudo[173406]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:05 compute-0 python3.9[173599]: ansible-ansible.builtin.service_facts Invoked
Nov 29 06:40:05 compute-0 network[173616]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 06:40:05 compute-0 network[173617]: 'network-scripts' will be removed from distribution in near future.
Nov 29 06:40:05 compute-0 network[173618]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 06:40:10 compute-0 podman[173765]: 2025-11-29 06:40:10.892255775 +0000 UTC m=+0.159890971 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:40:11 compute-0 sudo[173915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygamgpdiknhkykxwvhsjywoylgbffjtl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398410.7850204-2117-33880238624735/AnsiballZ_systemd_service.py'
Nov 29 06:40:11 compute-0 sudo[173915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:11 compute-0 python3.9[173917]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:40:11 compute-0 sudo[173915]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:11 compute-0 sudo[174068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbsynkehbydctmlyybxlwhfubbaajzei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398411.6830063-2117-182937606698039/AnsiballZ_systemd_service.py'
Nov 29 06:40:11 compute-0 sudo[174068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:12 compute-0 python3.9[174070]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:40:12 compute-0 sudo[174068]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:12 compute-0 sudo[174221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqhqmoxzufedgpgvmaurlerbzcrbzeit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398412.4589279-2117-230217411970908/AnsiballZ_systemd_service.py'
Nov 29 06:40:12 compute-0 sudo[174221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:13 compute-0 python3.9[174223]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:40:13 compute-0 sudo[174221]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:13 compute-0 sudo[174374]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-einxhsugwcjrgtegxuzqolfsjfdueixg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398413.2751088-2117-221329662045024/AnsiballZ_systemd_service.py'
Nov 29 06:40:13 compute-0 sudo[174374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:13 compute-0 python3.9[174376]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:40:13 compute-0 sudo[174374]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:14 compute-0 sudo[174527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulewiqfhqnubjwaroazemtxlchwkrkmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398414.0514472-2117-277391916767138/AnsiballZ_systemd_service.py'
Nov 29 06:40:14 compute-0 sudo[174527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:14 compute-0 python3.9[174529]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:40:14 compute-0 sudo[174527]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:15 compute-0 sudo[174680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfaalzseyuebylzgayfaxaapbyztrkgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398414.772957-2117-263797738520169/AnsiballZ_systemd_service.py'
Nov 29 06:40:15 compute-0 sudo[174680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:15 compute-0 python3.9[174682]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:40:15 compute-0 sudo[174680]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:15 compute-0 sudo[174833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxrscuoryulsuqvvrwwjigbrbcfmwhmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398415.5454597-2117-120234959238541/AnsiballZ_systemd_service.py'
Nov 29 06:40:15 compute-0 sudo[174833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:16 compute-0 python3.9[174835]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:40:16 compute-0 sudo[174833]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:16 compute-0 sudo[174988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-washioceldhnovnltwctnfvnysqdrraq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398416.3346167-2117-235827914050321/AnsiballZ_systemd_service.py'
Nov 29 06:40:16 compute-0 sudo[174988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:16 compute-0 python3.9[174990]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:40:17 compute-0 sshd-session[174913]: Invalid user toto from 179.125.24.202 port 54460
Nov 29 06:40:17 compute-0 sshd-session[174913]: Received disconnect from 179.125.24.202 port 54460:11: Bye Bye [preauth]
Nov 29 06:40:17 compute-0 sshd-session[174913]: Disconnected from invalid user toto 179.125.24.202 port 54460 [preauth]
Nov 29 06:40:17 compute-0 podman[174992]: 2025-11-29 06:40:17.842896562 +0000 UTC m=+0.090265150 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 29 06:40:17 compute-0 sudo[174988]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:18 compute-0 sudo[175162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xueunqsbvqxfoesywcovpmuasibknokk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398418.3960967-2294-98472873006493/AnsiballZ_file.py'
Nov 29 06:40:18 compute-0 sudo[175162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:18 compute-0 python3.9[175164]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:18 compute-0 sudo[175162]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:19 compute-0 sudo[175314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzowjbthgqkjprqozetqbommnnujgwyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398419.0027468-2294-174905133258485/AnsiballZ_file.py'
Nov 29 06:40:19 compute-0 sudo[175314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:19 compute-0 python3.9[175316]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:19 compute-0 sudo[175314]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:19 compute-0 sudo[175466]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iciqboqovuwnwoavxbccikcmwqlmjnhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398419.6079493-2294-91765665733321/AnsiballZ_file.py'
Nov 29 06:40:19 compute-0 sudo[175466]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:20 compute-0 python3.9[175468]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:20 compute-0 sudo[175466]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:20 compute-0 sudo[175618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjaydboxrnqkzjglhoxuohvjrwujdvfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398420.2095044-2294-153292990910802/AnsiballZ_file.py'
Nov 29 06:40:20 compute-0 sudo[175618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:20 compute-0 python3.9[175620]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:20 compute-0 sudo[175618]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:20 compute-0 sudo[175770]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdqmcooibvlqhkchxnmdbnbhinwwnhtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398420.736722-2294-43791821150069/AnsiballZ_file.py'
Nov 29 06:40:20 compute-0 sudo[175770]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:21 compute-0 python3.9[175772]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:21 compute-0 sudo[175770]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:21 compute-0 sudo[175922]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqkhibaffifehdgaduzmwztifbsgjwir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398421.4568925-2294-84301157811068/AnsiballZ_file.py'
Nov 29 06:40:21 compute-0 sudo[175922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:21 compute-0 python3.9[175924]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:21 compute-0 sudo[175922]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:22 compute-0 sudo[176074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eoxtofhxgddhhdobaxszydjkcdiizajc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398422.0635686-2294-160886835543828/AnsiballZ_file.py'
Nov 29 06:40:22 compute-0 sudo[176074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:22 compute-0 python3.9[176076]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:22 compute-0 sudo[176074]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:22 compute-0 sudo[176226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eisqjdygweucvixueaftmdmvyynwoolz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398422.6509185-2294-213753234733385/AnsiballZ_file.py'
Nov 29 06:40:22 compute-0 sudo[176226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:23 compute-0 python3.9[176228]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:23 compute-0 sudo[176226]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:24 compute-0 sudo[176378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfzmffpibdsmyedditjpmjqqxavggshc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398423.889237-2465-201098861488829/AnsiballZ_file.py'
Nov 29 06:40:24 compute-0 sudo[176378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:24 compute-0 python3.9[176380]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:24 compute-0 sudo[176378]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:24 compute-0 sudo[176530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmtaiuqmkglhbyscdnhvefitlwlhktrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398424.4915712-2465-82565523079832/AnsiballZ_file.py'
Nov 29 06:40:24 compute-0 sudo[176530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:40:24.793 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:40:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:40:24.795 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:40:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:40:24.795 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:40:24 compute-0 python3.9[176532]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:24 compute-0 sudo[176530]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:25 compute-0 sudo[176682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqcpgvwzphvxnusvaspalznocnezhcwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398425.0889103-2465-16588920425554/AnsiballZ_file.py'
Nov 29 06:40:25 compute-0 sudo[176682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:25 compute-0 python3.9[176684]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:25 compute-0 sudo[176682]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:25 compute-0 sudo[176834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbeycdjpgbgdlpjxnpklfpyfvokbzbtc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398425.651011-2465-81181260024522/AnsiballZ_file.py'
Nov 29 06:40:25 compute-0 sudo[176834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:26 compute-0 python3.9[176836]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:26 compute-0 sudo[176834]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:26 compute-0 sudo[176986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtskzrvtiotyygywkdioxjvcxxmaxjxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398426.1981292-2465-234623886819774/AnsiballZ_file.py'
Nov 29 06:40:26 compute-0 sudo[176986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:26 compute-0 python3.9[176988]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:26 compute-0 sudo[176986]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:27 compute-0 sudo[177138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzkopcnfjtxmkvazsopuizvpkqnikanr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398426.8147256-2465-262323489835139/AnsiballZ_file.py'
Nov 29 06:40:27 compute-0 sudo[177138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:27 compute-0 python3.9[177140]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:27 compute-0 sudo[177138]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:27 compute-0 sudo[177290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xsxwhquzynoutpknjdnugpgssfewllaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398427.4044545-2465-110567144490186/AnsiballZ_file.py'
Nov 29 06:40:27 compute-0 sudo[177290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:27 compute-0 python3.9[177292]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:27 compute-0 sudo[177290]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:28 compute-0 sudo[177442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odnmcmaykapdgnqcrsdvkqrifgtcdrff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398428.058987-2465-71724989863568/AnsiballZ_file.py'
Nov 29 06:40:28 compute-0 sudo[177442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:28 compute-0 python3.9[177444]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:40:28 compute-0 sudo[177442]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:29 compute-0 sudo[177594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npytiqynhdfnafkbvfmcbiosijvssjzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398429.4825897-2639-172208676309328/AnsiballZ_command.py'
Nov 29 06:40:29 compute-0 sudo[177594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:30 compute-0 python3.9[177596]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:40:30 compute-0 sudo[177594]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:30 compute-0 python3.9[177748]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 29 06:40:31 compute-0 sudo[177898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjrpdnydmusxnfabllrszfcpnqgnbmfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398431.3001008-2693-68384124186181/AnsiballZ_systemd_service.py'
Nov 29 06:40:31 compute-0 sudo[177898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:31 compute-0 python3.9[177900]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 06:40:31 compute-0 systemd[1]: Reloading.
Nov 29 06:40:32 compute-0 systemd-rc-local-generator[177926]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:40:32 compute-0 systemd-sysv-generator[177931]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:40:32 compute-0 sudo[177898]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:32 compute-0 sudo[178084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqzfvtgcgngeidutnauvnovfuszjcusx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398432.3863971-2717-267629694483382/AnsiballZ_command.py'
Nov 29 06:40:32 compute-0 sudo[178084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:32 compute-0 python3.9[178086]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:40:32 compute-0 sudo[178084]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:33 compute-0 sudo[178237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkxecqkayddfqtgdrbaskypoactqebrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398432.994796-2717-69352157854060/AnsiballZ_command.py'
Nov 29 06:40:33 compute-0 sudo[178237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:33 compute-0 python3.9[178239]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:40:33 compute-0 sudo[178237]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:33 compute-0 sudo[178390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sonjnebmyzjhpjfjgjpmqcrvaqrkzkuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398433.595371-2717-48552031292048/AnsiballZ_command.py'
Nov 29 06:40:33 compute-0 sudo[178390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:34 compute-0 python3.9[178392]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:40:34 compute-0 sudo[178390]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:34 compute-0 sudo[178543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsdscbypyfbgdxzlljxljgsfnpcuykro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398434.1678302-2717-255964641406717/AnsiballZ_command.py'
Nov 29 06:40:34 compute-0 sudo[178543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:34 compute-0 podman[178545]: 2025-11-29 06:40:34.502619684 +0000 UTC m=+0.053793270 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 06:40:34 compute-0 python3.9[178546]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:40:34 compute-0 sudo[178543]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:35 compute-0 sudo[178715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knyebccnzuqwshhujlgvgnethemikkbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398434.767465-2717-111160092318944/AnsiballZ_command.py'
Nov 29 06:40:35 compute-0 sudo[178715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:35 compute-0 python3.9[178717]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:40:35 compute-0 sudo[178715]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:35 compute-0 sudo[178868]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oojpvrtzqvpzgmcwrnfheuzujvstaatc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398435.382736-2717-228093955214724/AnsiballZ_command.py'
Nov 29 06:40:35 compute-0 sudo[178868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:35 compute-0 python3.9[178870]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:40:35 compute-0 sudo[178868]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:36 compute-0 sudo[179021]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iuhhzyjrxcypdjipbkylmohspiertmlk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398436.0879128-2717-105183240899097/AnsiballZ_command.py'
Nov 29 06:40:36 compute-0 sudo[179021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:36 compute-0 python3.9[179023]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:40:36 compute-0 sudo[179021]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:37 compute-0 sudo[179174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dikbfsxedeqkkablmsycprqvcxvdxrzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398436.7183754-2717-219443390356728/AnsiballZ_command.py'
Nov 29 06:40:37 compute-0 sudo[179174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:37 compute-0 python3.9[179176]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:40:37 compute-0 sudo[179174]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:39 compute-0 sudo[179327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lalpzfecitwhxzpewaqywwmyzlpjagxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398438.8559053-2924-194002269937539/AnsiballZ_file.py'
Nov 29 06:40:39 compute-0 sudo[179327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:39 compute-0 python3.9[179329]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:40:39 compute-0 sudo[179327]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:39 compute-0 sudo[179479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtbbuobfyosxcvypxbflgsiqgvreopec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398439.5311444-2924-235295176802701/AnsiballZ_file.py'
Nov 29 06:40:39 compute-0 sudo[179479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:39 compute-0 python3.9[179481]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:40:40 compute-0 sudo[179479]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:40 compute-0 sudo[179631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgfwdiiqwnjapamxagmmwuytzofqoetg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398440.1555579-2924-148689554295407/AnsiballZ_file.py'
Nov 29 06:40:40 compute-0 sudo[179631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:40 compute-0 python3.9[179633]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:40:40 compute-0 sudo[179631]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:41 compute-0 sudo[179803]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkqkzeblqiqewjkckwmjnvkatvipalrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398440.9335277-2990-264022070038232/AnsiballZ_file.py'
Nov 29 06:40:41 compute-0 sudo[179803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:41 compute-0 podman[179757]: 2025-11-29 06:40:41.374722782 +0000 UTC m=+0.093229569 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 06:40:41 compute-0 python3.9[179809]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:40:41 compute-0 sudo[179803]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:41 compute-0 sudo[179961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ammgmvsjafpidvkssaxbxboitbkdvkln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398441.683816-2990-56504326887564/AnsiballZ_file.py'
Nov 29 06:40:41 compute-0 sudo[179961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:42 compute-0 python3.9[179963]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:40:42 compute-0 sudo[179961]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:42 compute-0 sudo[180113]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbxyokmnqozwektwicnwwfxakrnacfqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398442.349795-2990-117116146671093/AnsiballZ_file.py'
Nov 29 06:40:42 compute-0 sudo[180113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:42 compute-0 python3.9[180115]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:40:42 compute-0 sudo[180113]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:43 compute-0 sudo[180265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhxwvarbuhlcbbhkrnoqhluojkbabmhg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398442.9776208-2990-71528124567379/AnsiballZ_file.py'
Nov 29 06:40:43 compute-0 sudo[180265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:43 compute-0 python3.9[180267]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:40:43 compute-0 sudo[180265]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:43 compute-0 sudo[180417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crbgvmetxrkgrudfpnzxskqtgsgjwbkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398443.6134522-2990-265948414876784/AnsiballZ_file.py'
Nov 29 06:40:43 compute-0 sudo[180417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:44 compute-0 python3.9[180419]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:40:44 compute-0 sudo[180417]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:44 compute-0 sudo[180569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjecfuqmruuxtckwkzrammfiscbqlxts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398444.2534852-2990-221575284002973/AnsiballZ_file.py'
Nov 29 06:40:44 compute-0 sudo[180569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:44 compute-0 python3.9[180571]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:40:44 compute-0 sudo[180569]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:45 compute-0 sudo[180721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngxhbtlnpdgramdbmemqreapaeepawwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398444.9845421-2990-216171467926324/AnsiballZ_file.py'
Nov 29 06:40:45 compute-0 sudo[180721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:45 compute-0 python3.9[180723]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:40:45 compute-0 sudo[180721]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:48 compute-0 podman[180748]: 2025-11-29 06:40:48.80572061 +0000 UTC m=+0.060765438 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 29 06:40:50 compute-0 sudo[180895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgrbcrfgbztqqvkbzzerinzurdzvusld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398449.9435828-3295-190345004007245/AnsiballZ_getent.py'
Nov 29 06:40:50 compute-0 sudo[180895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:50 compute-0 python3.9[180897]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Nov 29 06:40:50 compute-0 sudo[180895]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:50 compute-0 sshd-session[180768]: Received disconnect from 103.179.56.44 port 54102:11: Bye Bye [preauth]
Nov 29 06:40:50 compute-0 sshd-session[180768]: Disconnected from authenticating user root 103.179.56.44 port 54102 [preauth]
Nov 29 06:40:51 compute-0 sudo[181048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmepqvcnczakxlxzwjdfzywdshwunqsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398450.8079832-3319-53684113734136/AnsiballZ_group.py'
Nov 29 06:40:51 compute-0 sudo[181048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:51 compute-0 python3.9[181050]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 29 06:40:51 compute-0 groupadd[181051]: group added to /etc/group: name=nova, GID=42436
Nov 29 06:40:51 compute-0 groupadd[181051]: group added to /etc/gshadow: name=nova
Nov 29 06:40:51 compute-0 groupadd[181051]: new group: name=nova, GID=42436
Nov 29 06:40:51 compute-0 sudo[181048]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:52 compute-0 sudo[181206]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fntlovaziwagwwbokgkfgxtltrnkarse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398451.855992-3343-129785583688719/AnsiballZ_user.py'
Nov 29 06:40:52 compute-0 sudo[181206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:40:52 compute-0 python3.9[181208]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 29 06:40:53 compute-0 useradd[181210]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Nov 29 06:40:53 compute-0 useradd[181210]: add 'nova' to group 'libvirt'
Nov 29 06:40:53 compute-0 useradd[181210]: add 'nova' to shadow group 'libvirt'
Nov 29 06:40:53 compute-0 sudo[181206]: pam_unix(sudo:session): session closed for user root
Nov 29 06:40:54 compute-0 sshd-session[181243]: Accepted publickey for zuul from 192.168.122.30 port 33074 ssh2: ECDSA SHA256:Ey+6YDlYfkE2dRe/gjWhHvvrJHee4xZo2Q1JojEuVBA
Nov 29 06:40:54 compute-0 systemd-logind[788]: New session 24 of user zuul.
Nov 29 06:40:54 compute-0 systemd[1]: Started Session 24 of User zuul.
Nov 29 06:40:54 compute-0 sshd-session[181243]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:40:54 compute-0 sshd-session[181246]: Received disconnect from 192.168.122.30 port 33074:11: disconnected by user
Nov 29 06:40:54 compute-0 sshd-session[181246]: Disconnected from user zuul 192.168.122.30 port 33074
Nov 29 06:40:54 compute-0 sshd-session[181243]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:40:54 compute-0 systemd[1]: session-24.scope: Deactivated successfully.
Nov 29 06:40:54 compute-0 systemd-logind[788]: Session 24 logged out. Waiting for processes to exit.
Nov 29 06:40:54 compute-0 systemd-logind[788]: Removed session 24.
Nov 29 06:40:54 compute-0 sshd-session[181241]: Invalid user a from 45.202.211.6 port 53106
Nov 29 06:40:55 compute-0 sshd-session[181241]: Received disconnect from 45.202.211.6 port 53106:11: Bye Bye [preauth]
Nov 29 06:40:55 compute-0 sshd-session[181241]: Disconnected from invalid user a 45.202.211.6 port 53106 [preauth]
Nov 29 06:40:55 compute-0 python3.9[181396]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:40:55 compute-0 python3.9[181517]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398454.693305-3418-206258155438120/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:40:56 compute-0 python3.9[181667]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:40:56 compute-0 python3.9[181743]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:40:57 compute-0 python3.9[181893]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:40:58 compute-0 python3.9[182014]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398457.0172198-3418-173234162139820/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:40:58 compute-0 python3.9[182164]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:40:59 compute-0 python3.9[182287]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398458.248096-3418-181709610173282/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=1feba546d0beacad9258164ab79b8a747685ccc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:40:59 compute-0 python3.9[182437]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:41:00 compute-0 sshd-session[182270]: Invalid user sinusbot from 1.214.197.163 port 34852
Nov 29 06:41:00 compute-0 python3.9[182558]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398459.2866726-3418-869496251812/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:41:00 compute-0 sshd-session[182270]: Received disconnect from 1.214.197.163 port 34852:11: Bye Bye [preauth]
Nov 29 06:41:00 compute-0 sshd-session[182270]: Disconnected from invalid user sinusbot 1.214.197.163 port 34852 [preauth]
Nov 29 06:41:00 compute-0 python3.9[182708]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:41:01 compute-0 python3.9[182829]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398460.4089606-3418-51825984714977/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:41:02 compute-0 sudo[182979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmqsxqrauhlzldkpatitavtzlwqjeozg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398462.1740355-3667-59786133960032/AnsiballZ_file.py'
Nov 29 06:41:02 compute-0 sudo[182979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:02 compute-0 python3.9[182981]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:41:02 compute-0 sudo[182979]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:03 compute-0 sudo[183131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrsdaudikhxgmpobhsaxywmehlfaqmfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398462.8568811-3691-75811525382476/AnsiballZ_copy.py'
Nov 29 06:41:03 compute-0 sudo[183131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:03 compute-0 python3.9[183133]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:41:03 compute-0 sudo[183131]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:03 compute-0 sudo[183283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpmccxbrcuphnaxubdeckqxyxyrddgyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398463.5435805-3715-113952071995481/AnsiballZ_stat.py'
Nov 29 06:41:03 compute-0 sudo[183283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:04 compute-0 python3.9[183285]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:41:04 compute-0 sudo[183283]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:04 compute-0 sudo[183435]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xuchxulbhrmcrmsagcrqxdvzysgsoefl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398464.281429-3739-261708600455430/AnsiballZ_stat.py'
Nov 29 06:41:04 compute-0 sudo[183435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:04 compute-0 podman[183437]: 2025-11-29 06:41:04.591657405 +0000 UTC m=+0.051873341 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent)
Nov 29 06:41:04 compute-0 python3.9[183438]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:41:04 compute-0 sudo[183435]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:05 compute-0 sudo[183578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwzefsunkczfuolsnvqnepuhcbcvmafq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398464.281429-3739-261708600455430/AnsiballZ_copy.py'
Nov 29 06:41:05 compute-0 sudo[183578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:05 compute-0 python3.9[183580]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1764398464.281429-3739-261708600455430/.source _original_basename=.6tu0leap follow=False checksum=d1d605c25c85e9c7fa18485d206f873f29f48546 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Nov 29 06:41:05 compute-0 sudo[183578]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:06 compute-0 python3.9[183732]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:41:06 compute-0 python3.9[183884]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:41:07 compute-0 python3.9[184005]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398466.490419-3817-954632903981/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=211ffd0bca4b407eb4de45a749ef70116a7806fd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:41:08 compute-0 python3.9[184155]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:41:08 compute-0 python3.9[184276]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398467.7481825-3862-266671046710345/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:41:09 compute-0 sudo[184426]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pajgyilbfsglmcjpgdrmnyydjqpovuvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398469.296819-3913-29154318189638/AnsiballZ_container_config_data.py'
Nov 29 06:41:09 compute-0 sudo[184426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:09 compute-0 python3.9[184428]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Nov 29 06:41:09 compute-0 sudo[184426]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:10 compute-0 sudo[184578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezrjjveixfglvdkkbsryzwwjqhnvuwua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398470.0885806-3940-166034876851049/AnsiballZ_container_config_hash.py'
Nov 29 06:41:10 compute-0 sudo[184578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:10 compute-0 python3.9[184580]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 29 06:41:10 compute-0 sudo[184578]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:11 compute-0 sudo[184730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jshkxaabibjyiizwotjxcuzbgpamxqsh ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764398471.0294728-3970-68085443258552/AnsiballZ_edpm_container_manage.py'
Nov 29 06:41:11 compute-0 sudo[184730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:11 compute-0 python3[184732]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Nov 29 06:41:11 compute-0 podman[184771]: 2025-11-29 06:41:11.783852828 +0000 UTC m=+0.055525520 container create aa76c9013abfbca89c47a5a661d424ad01d012dc4f57bb1f8bae5d9a67403c91 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.license=GPLv2, container_name=nova_compute_init)
Nov 29 06:41:11 compute-0 podman[184771]: 2025-11-29 06:41:11.754200217 +0000 UTC m=+0.025872929 image pull b65793e7266422f5b94c32d109b906c8ffd974cf2ddf0b6929e463e29e05864a quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 29 06:41:11 compute-0 python3[184732]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Nov 29 06:41:11 compute-0 podman[184770]: 2025-11-29 06:41:11.846425994 +0000 UTC m=+0.109958360 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 29 06:41:11 compute-0 sudo[184730]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:12 compute-0 sudo[184982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btevezpwwjgyvprqbsaxvojrfamfmgdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398472.3220189-3994-66998819100811/AnsiballZ_stat.py'
Nov 29 06:41:12 compute-0 sudo[184982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:12 compute-0 python3.9[184984]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:41:12 compute-0 sudo[184982]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:13 compute-0 sudo[185136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdptfgkxethqgkrubwimlpcoaoxobfsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398473.5721076-4030-46058070254384/AnsiballZ_container_config_data.py'
Nov 29 06:41:13 compute-0 sudo[185136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:14 compute-0 python3.9[185138]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Nov 29 06:41:14 compute-0 sudo[185136]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:14 compute-0 sudo[185288]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oadzpfposlmfevcpzkjemxfgmvegvdxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398474.3471477-4057-204961715665215/AnsiballZ_container_config_hash.py'
Nov 29 06:41:14 compute-0 sudo[185288]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:14 compute-0 python3.9[185290]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 29 06:41:14 compute-0 sudo[185288]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:15 compute-0 sudo[185440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmltbqozqxvwybbxpeagncpjtqgvufll ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764398475.4131513-4087-157007934439251/AnsiballZ_edpm_container_manage.py'
Nov 29 06:41:15 compute-0 sudo[185440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:15 compute-0 python3[185442]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Nov 29 06:41:16 compute-0 podman[185480]: 2025-11-29 06:41:16.20747855 +0000 UTC m=+0.025583449 image pull b65793e7266422f5b94c32d109b906c8ffd974cf2ddf0b6929e463e29e05864a quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 29 06:41:16 compute-0 podman[185480]: 2025-11-29 06:41:16.695432276 +0000 UTC m=+0.513537135 container create 9e31a8b4136a4421fec7ff06e740bd442e40fd5203e971de48094f5c0ad9b467 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 29 06:41:16 compute-0 python3[185442]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Nov 29 06:41:16 compute-0 sudo[185440]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:17 compute-0 sudo[185669]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yankbzwqilqsghsuiscduullfuntefwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398476.9945776-4111-38014736779312/AnsiballZ_stat.py'
Nov 29 06:41:17 compute-0 sudo[185669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:17 compute-0 python3.9[185671]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:41:17 compute-0 sudo[185669]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:18 compute-0 sudo[185823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iztskbzydodfmtuekdcdzwcvgsfjyjdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398477.9159737-4138-151453190096450/AnsiballZ_file.py'
Nov 29 06:41:18 compute-0 sudo[185823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:18 compute-0 python3.9[185825]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:41:18 compute-0 sudo[185823]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:18 compute-0 sudo[185974]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvawahvksbpvuhxmjetxrjccuuebgftn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398478.463661-4138-40003880050090/AnsiballZ_copy.py'
Nov 29 06:41:18 compute-0 sudo[185974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:18 compute-0 podman[185976]: 2025-11-29 06:41:18.9091974 +0000 UTC m=+0.054315907 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 06:41:19 compute-0 python3.9[185977]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764398478.463661-4138-40003880050090/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:41:19 compute-0 sudo[185974]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:19 compute-0 sudo[186070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlpllrbjfaheominpdrdigafeibsvvqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398478.463661-4138-40003880050090/AnsiballZ_systemd.py'
Nov 29 06:41:19 compute-0 sudo[186070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:19 compute-0 python3.9[186072]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 06:41:19 compute-0 systemd[1]: Reloading.
Nov 29 06:41:19 compute-0 systemd-rc-local-generator[186100]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:41:19 compute-0 systemd-sysv-generator[186103]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:41:20 compute-0 sudo[186070]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:20 compute-0 sudo[186181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fznedvxiqcdrbhznkawcgxepkvvryhht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398478.463661-4138-40003880050090/AnsiballZ_systemd.py'
Nov 29 06:41:20 compute-0 sudo[186181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:20 compute-0 python3.9[186183]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:41:20 compute-0 systemd[1]: Reloading.
Nov 29 06:41:20 compute-0 systemd-rc-local-generator[186216]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:41:20 compute-0 systemd-sysv-generator[186219]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:41:21 compute-0 systemd[1]: Starting nova_compute container...
Nov 29 06:41:22 compute-0 systemd[1]: Started libcrun container.
Nov 29 06:41:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d97f3d504bb082e00ab3578455e4ef7c3c94cdd9e2e07b7ed2948dfa78e077a8/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 29 06:41:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d97f3d504bb082e00ab3578455e4ef7c3c94cdd9e2e07b7ed2948dfa78e077a8/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 29 06:41:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d97f3d504bb082e00ab3578455e4ef7c3c94cdd9e2e07b7ed2948dfa78e077a8/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 29 06:41:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d97f3d504bb082e00ab3578455e4ef7c3c94cdd9e2e07b7ed2948dfa78e077a8/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 29 06:41:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d97f3d504bb082e00ab3578455e4ef7c3c94cdd9e2e07b7ed2948dfa78e077a8/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 29 06:41:22 compute-0 podman[186223]: 2025-11-29 06:41:22.204395759 +0000 UTC m=+0.885233880 container init 9e31a8b4136a4421fec7ff06e740bd442e40fd5203e971de48094f5c0ad9b467 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 06:41:22 compute-0 podman[186223]: 2025-11-29 06:41:22.210773023 +0000 UTC m=+0.891611094 container start 9e31a8b4136a4421fec7ff06e740bd442e40fd5203e971de48094f5c0ad9b467 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 06:41:22 compute-0 nova_compute[186241]: + sudo -E kolla_set_configs
Nov 29 06:41:22 compute-0 podman[186223]: nova_compute
Nov 29 06:41:22 compute-0 systemd[1]: Started nova_compute container.
Nov 29 06:41:22 compute-0 sudo[186181]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:22 compute-0 nova_compute[186241]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 06:41:22 compute-0 nova_compute[186241]: INFO:__main__:Validating config file
Nov 29 06:41:22 compute-0 nova_compute[186241]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 06:41:22 compute-0 nova_compute[186241]: INFO:__main__:Copying service configuration files
Nov 29 06:41:22 compute-0 nova_compute[186241]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 29 06:41:22 compute-0 nova_compute[186241]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 29 06:41:22 compute-0 nova_compute[186241]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 29 06:41:22 compute-0 nova_compute[186241]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 29 06:41:22 compute-0 nova_compute[186241]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 29 06:41:22 compute-0 nova_compute[186241]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 29 06:41:22 compute-0 nova_compute[186241]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 29 06:41:22 compute-0 nova_compute[186241]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 29 06:41:22 compute-0 nova_compute[186241]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 29 06:41:22 compute-0 nova_compute[186241]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 29 06:41:22 compute-0 nova_compute[186241]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 29 06:41:22 compute-0 nova_compute[186241]: INFO:__main__:Deleting /etc/ceph
Nov 29 06:41:22 compute-0 nova_compute[186241]: INFO:__main__:Creating directory /etc/ceph
Nov 29 06:41:22 compute-0 nova_compute[186241]: INFO:__main__:Setting permission for /etc/ceph
Nov 29 06:41:22 compute-0 nova_compute[186241]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 29 06:41:22 compute-0 nova_compute[186241]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 29 06:41:22 compute-0 nova_compute[186241]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 29 06:41:22 compute-0 nova_compute[186241]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 29 06:41:22 compute-0 nova_compute[186241]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 29 06:41:22 compute-0 nova_compute[186241]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 29 06:41:22 compute-0 nova_compute[186241]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 29 06:41:22 compute-0 nova_compute[186241]: INFO:__main__:Writing out command to execute
Nov 29 06:41:22 compute-0 nova_compute[186241]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 29 06:41:22 compute-0 nova_compute[186241]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 29 06:41:22 compute-0 nova_compute[186241]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 29 06:41:22 compute-0 nova_compute[186241]: ++ cat /run_command
Nov 29 06:41:22 compute-0 nova_compute[186241]: + CMD=nova-compute
Nov 29 06:41:22 compute-0 nova_compute[186241]: + ARGS=
Nov 29 06:41:22 compute-0 nova_compute[186241]: + sudo kolla_copy_cacerts
Nov 29 06:41:22 compute-0 nova_compute[186241]: + [[ ! -n '' ]]
Nov 29 06:41:22 compute-0 nova_compute[186241]: + . kolla_extend_start
Nov 29 06:41:22 compute-0 nova_compute[186241]: Running command: 'nova-compute'
Nov 29 06:41:22 compute-0 nova_compute[186241]: + echo 'Running command: '\''nova-compute'\'''
Nov 29 06:41:22 compute-0 nova_compute[186241]: + umask 0022
Nov 29 06:41:22 compute-0 nova_compute[186241]: + exec nova-compute
Nov 29 06:41:22 compute-0 sshd-session[186236]: Invalid user newuser from 160.202.8.218 port 53326
Nov 29 06:41:22 compute-0 sshd-session[186236]: Received disconnect from 160.202.8.218 port 53326:11: Bye Bye [preauth]
Nov 29 06:41:22 compute-0 sshd-session[186236]: Disconnected from invalid user newuser 160.202.8.218 port 53326 [preauth]
Nov 29 06:41:23 compute-0 python3.9[186403]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:41:24 compute-0 nova_compute[186241]: 2025-11-29 06:41:24.307 186245 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 29 06:41:24 compute-0 nova_compute[186241]: 2025-11-29 06:41:24.307 186245 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 29 06:41:24 compute-0 nova_compute[186241]: 2025-11-29 06:41:24.308 186245 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 29 06:41:24 compute-0 nova_compute[186241]: 2025-11-29 06:41:24.308 186245 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Nov 29 06:41:24 compute-0 python3.9[186553]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:41:24 compute-0 nova_compute[186241]: 2025-11-29 06:41:24.454 186245 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:41:24 compute-0 nova_compute[186241]: 2025-11-29 06:41:24.481 186245 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:41:24 compute-0 nova_compute[186241]: 2025-11-29 06:41:24.482 186245 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Nov 29 06:41:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:41:24.794 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:41:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:41:24.795 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:41:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:41:24.795 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:41:24 compute-0 nova_compute[186241]: 2025-11-29 06:41:24.970 186245 INFO nova.virt.driver [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.085 186245 INFO nova.compute.provider_config [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.123 186245 DEBUG oslo_concurrency.lockutils [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.123 186245 DEBUG oslo_concurrency.lockutils [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.123 186245 DEBUG oslo_concurrency.lockutils [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.124 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.124 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.124 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.124 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.124 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.124 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.125 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.125 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.125 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.125 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.125 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.126 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.126 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.126 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.126 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.126 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.126 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.127 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.127 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.127 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.127 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.127 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.127 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.127 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.128 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.128 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.128 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.128 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.128 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.128 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.129 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.129 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.129 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.129 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.129 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.129 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.130 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.130 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.130 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.130 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.130 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.130 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.131 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.131 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.131 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.131 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.131 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.131 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.132 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.132 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.132 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.132 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.132 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.132 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.132 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.133 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.133 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.133 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.133 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.133 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.133 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.133 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.134 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.134 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.134 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.134 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.134 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.134 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.134 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.135 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.135 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.135 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.135 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.135 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.135 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.135 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.136 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.136 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.136 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.136 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.136 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.136 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.136 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.137 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.137 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.137 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.137 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.137 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.137 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.138 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.138 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.138 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.138 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.138 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.138 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.139 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.139 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.139 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.139 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.139 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.140 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.140 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.140 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.140 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.140 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.140 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.141 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.141 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.141 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.141 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.141 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.141 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.141 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.142 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.142 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.142 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.142 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.142 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.142 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.142 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.143 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.143 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.143 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.143 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.143 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.143 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.143 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.144 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.144 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.144 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.144 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.144 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.144 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.144 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.145 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.145 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.145 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.145 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.145 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.145 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.145 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.146 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.146 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.146 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.146 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.146 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.146 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.146 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.147 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.147 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.147 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.147 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.147 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.147 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.147 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.148 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.148 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.148 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.148 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.148 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.148 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.148 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.149 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.149 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.149 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.149 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.149 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.149 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.150 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.150 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.150 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.150 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.150 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.150 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.150 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.151 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.151 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.151 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.151 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.151 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.151 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.151 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.152 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.152 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.152 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.152 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.152 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.152 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.153 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.153 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.153 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.153 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.153 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.153 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.153 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.154 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.154 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.154 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.154 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.154 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.154 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.154 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.155 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.155 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.155 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.155 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.155 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.155 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.155 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.155 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.156 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.156 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.156 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.156 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.156 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.156 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.157 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.157 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.157 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.157 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.157 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.157 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.158 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.158 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.158 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.158 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.158 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.159 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.159 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.159 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.159 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.159 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.159 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.159 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.160 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.160 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.160 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.160 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.160 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.160 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.161 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.161 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.161 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.161 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.161 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.161 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.161 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.162 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.162 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.162 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.162 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.162 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.162 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.162 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.163 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.163 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.163 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.163 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.163 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.163 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.163 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.164 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.164 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.164 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.164 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.164 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.164 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.165 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.165 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.165 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.165 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.165 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.165 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.166 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.166 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.166 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.166 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.166 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.166 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.166 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.167 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.167 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.167 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.167 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.167 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.167 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.168 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.168 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.168 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.168 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.168 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.168 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.168 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.168 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.169 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.169 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.169 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.169 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.169 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.169 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.169 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.170 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.170 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.170 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.170 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.170 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.170 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.170 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.171 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.171 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.171 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.171 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.171 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.171 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.171 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.172 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.172 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.172 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.172 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.172 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.172 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.173 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.173 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.173 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.173 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.173 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.173 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.173 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.174 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.174 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.174 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.174 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.174 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.174 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.175 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.175 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.175 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.175 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.175 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.175 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.175 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.176 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.176 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.176 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.176 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.176 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.176 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.176 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.176 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.177 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.177 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.177 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.177 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.177 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.177 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.178 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.178 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.178 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.178 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.178 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.178 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.179 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.179 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.179 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.179 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.179 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.179 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.179 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.179 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.180 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.180 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.180 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.180 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.180 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.180 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.180 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.181 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.181 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.181 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.181 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.181 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.181 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.181 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.182 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.182 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.182 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.182 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.182 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.182 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.182 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.183 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.183 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.183 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.183 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.183 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.183 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.183 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.183 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.184 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.184 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.184 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.184 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.184 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.184 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.185 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.185 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.185 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.185 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.185 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.185 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.185 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.186 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.186 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.186 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.186 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.186 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.186 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.186 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.186 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.187 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.187 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.187 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.187 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.187 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.187 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.187 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.188 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.188 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.188 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.188 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.188 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.188 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.188 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.189 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.189 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.189 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.189 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.189 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.189 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.189 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.189 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.190 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.190 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.190 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.190 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.190 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.190 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.190 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.191 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.191 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.191 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.191 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.191 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.191 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.192 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.192 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.192 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.192 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.192 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.192 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.192 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.193 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.193 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.193 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.193 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.193 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.193 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.193 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.194 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.194 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.194 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.194 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.194 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.194 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.194 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.195 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.195 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.195 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.195 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.195 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.195 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.195 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.196 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.196 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.196 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.196 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.196 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.196 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.196 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.196 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.197 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.197 186245 WARNING oslo_config.cfg [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 29 06:41:25 compute-0 nova_compute[186241]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 29 06:41:25 compute-0 nova_compute[186241]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 29 06:41:25 compute-0 nova_compute[186241]: and ``live_migration_inbound_addr`` respectively.
Nov 29 06:41:25 compute-0 nova_compute[186241]: ).  Its value may be silently ignored in the future.
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.197 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.197 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.197 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.197 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.198 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.198 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.198 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.198 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.198 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.198 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.199 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.199 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.199 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.199 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.199 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.200 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.200 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.200 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.200 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.200 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.200 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.201 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.201 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.201 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.201 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.201 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.202 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.202 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.202 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.202 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.203 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.203 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.203 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.203 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.203 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.204 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.204 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.204 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.204 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.204 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.205 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.205 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.205 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.205 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.205 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.206 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.206 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.206 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.206 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.206 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.206 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.206 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.207 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.207 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.207 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.207 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.207 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.207 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.208 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.208 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.208 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.208 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.208 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.208 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.208 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.209 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.209 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.209 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.209 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.209 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.209 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.209 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.210 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.210 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.210 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.210 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.210 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.210 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.210 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.210 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.211 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.211 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.211 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.211 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.211 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.211 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.212 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.212 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.212 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.212 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.212 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.212 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.212 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.212 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.213 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.213 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.213 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.213 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.213 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.213 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.213 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.214 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.214 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.214 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.214 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.214 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.214 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.214 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.215 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.215 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.215 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.215 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.215 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.215 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.215 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.216 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.216 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.216 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.216 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.216 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.216 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.216 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.217 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.217 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.217 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.217 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.217 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.217 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.217 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.218 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.218 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.218 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.218 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.218 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.218 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.219 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.219 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.219 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.219 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.219 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.219 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.220 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.220 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.220 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.220 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.220 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.220 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.220 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.220 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.221 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.221 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.221 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.221 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.221 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.221 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.221 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.222 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.222 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.222 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.222 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.222 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.222 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.223 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.223 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.223 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.223 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.223 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.223 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.223 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.223 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.224 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.224 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.224 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.224 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.224 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.224 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.224 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.225 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.225 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.225 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.225 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.225 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.225 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.226 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.226 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.226 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.226 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.226 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.226 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.226 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.227 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.227 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.227 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.227 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.227 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.227 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.227 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.228 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.228 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.228 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.228 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.228 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.228 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.229 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.229 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.229 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.229 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.229 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.229 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.229 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.229 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.230 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.230 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.230 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.230 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.230 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.231 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.231 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.231 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.231 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.231 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.232 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.232 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.232 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.232 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.232 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.232 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.233 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.233 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.233 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.233 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.233 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.234 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.234 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.234 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.234 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.234 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.235 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.235 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.235 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.235 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.236 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.236 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.236 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.236 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.236 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.237 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.237 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.237 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.237 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.238 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.238 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.238 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.238 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.239 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.239 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.239 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.239 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.239 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.240 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.240 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.240 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.240 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.240 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.241 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.241 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.241 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.241 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.241 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.242 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.242 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.242 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.242 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.242 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.243 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.243 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.243 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.243 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.243 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.244 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.244 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.244 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.244 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.244 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.245 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.245 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.245 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.245 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.245 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.246 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.246 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.246 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.246 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.246 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.247 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.247 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.247 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.247 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.248 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.248 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.248 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.248 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.248 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.249 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.249 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.249 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.249 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.249 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.250 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.250 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.250 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.250 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.250 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.251 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.251 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.251 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.251 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.251 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.252 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.252 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.252 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.252 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.252 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.253 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.253 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.253 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.253 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.253 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.254 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.254 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.254 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.254 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.254 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.255 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.255 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.255 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.255 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.255 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.256 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.256 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.256 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.256 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.256 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.257 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.257 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.257 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.257 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.257 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.258 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.258 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.258 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.258 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.258 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.259 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.259 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.259 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.259 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.259 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.260 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.260 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.260 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.260 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.260 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.261 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.261 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.261 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.261 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.261 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.261 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.268 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.268 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.269 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.269 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.269 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.269 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.269 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.269 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.270 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.270 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.270 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.271 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.271 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.271 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.271 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.271 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.272 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.272 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.272 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.272 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.272 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.273 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.273 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.273 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.273 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.273 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.274 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.274 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.274 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.274 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.274 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.274 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.275 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.275 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.275 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.275 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.275 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.276 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.276 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.276 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.276 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.276 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.276 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.277 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.277 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.277 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.277 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.277 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.278 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.278 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.278 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.278 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.278 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.278 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.279 186245 DEBUG oslo_service.service [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.280 186245 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.317 186245 DEBUG nova.virt.libvirt.host [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.317 186245 DEBUG nova.virt.libvirt.host [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.318 186245 DEBUG nova.virt.libvirt.host [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.318 186245 DEBUG nova.virt.libvirt.host [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Nov 29 06:41:25 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Nov 29 06:41:25 compute-0 systemd[1]: Started libvirt QEMU daemon.
Nov 29 06:41:25 compute-0 python3.9[186707]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.394 186245 DEBUG nova.virt.libvirt.host [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f8893fc1760> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.399 186245 DEBUG nova.virt.libvirt.host [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f8893fc1760> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.401 186245 INFO nova.virt.libvirt.driver [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] Connection event '1' reason 'None'
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.426 186245 WARNING nova.virt.libvirt.driver [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Nov 29 06:41:25 compute-0 nova_compute[186241]: 2025-11-29 06:41:25.426 186245 DEBUG nova.virt.libvirt.volume.mount [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Nov 29 06:41:26 compute-0 sudo[186917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbjmcgojrfskniqcnrfuzdglgcaimplq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398485.7135866-4318-65459640116755/AnsiballZ_podman_container.py'
Nov 29 06:41:26 compute-0 sudo[186917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:26 compute-0 nova_compute[186241]: 2025-11-29 06:41:26.226 186245 INFO nova.virt.libvirt.host [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] Libvirt host capabilities <capabilities>
Nov 29 06:41:26 compute-0 nova_compute[186241]: 
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <host>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <uuid>2814de55-a942-4164-91c1-92c593f2f35f</uuid>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <cpu>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <arch>x86_64</arch>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model>EPYC-Rome-v4</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <vendor>AMD</vendor>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <microcode version='16777317'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <signature family='23' model='49' stepping='0'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <maxphysaddr mode='emulate' bits='40'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature name='x2apic'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature name='tsc-deadline'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature name='osxsave'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature name='hypervisor'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature name='tsc_adjust'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature name='spec-ctrl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature name='stibp'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature name='arch-capabilities'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature name='ssbd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature name='cmp_legacy'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature name='topoext'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature name='virt-ssbd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature name='lbrv'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature name='tsc-scale'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature name='vmcb-clean'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature name='pause-filter'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature name='pfthreshold'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature name='svme-addr-chk'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature name='rdctl-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature name='skip-l1dfl-vmentry'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature name='mds-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature name='pschange-mc-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <pages unit='KiB' size='4'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <pages unit='KiB' size='2048'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <pages unit='KiB' size='1048576'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </cpu>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <power_management>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <suspend_mem/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <suspend_disk/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <suspend_hybrid/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </power_management>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <iommu support='no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <migration_features>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <live/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <uri_transports>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <uri_transport>tcp</uri_transport>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <uri_transport>rdma</uri_transport>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </uri_transports>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </migration_features>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <topology>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <cells num='1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <cell id='0'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:           <memory unit='KiB'>7864316</memory>
Nov 29 06:41:26 compute-0 nova_compute[186241]:           <pages unit='KiB' size='4'>1966079</pages>
Nov 29 06:41:26 compute-0 nova_compute[186241]:           <pages unit='KiB' size='2048'>0</pages>
Nov 29 06:41:26 compute-0 nova_compute[186241]:           <pages unit='KiB' size='1048576'>0</pages>
Nov 29 06:41:26 compute-0 nova_compute[186241]:           <distances>
Nov 29 06:41:26 compute-0 nova_compute[186241]:             <sibling id='0' value='10'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:           </distances>
Nov 29 06:41:26 compute-0 nova_compute[186241]:           <cpus num='8'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:           </cpus>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         </cell>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </cells>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </topology>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <cache>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </cache>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <secmodel>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model>selinux</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <doi>0</doi>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </secmodel>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <secmodel>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model>dac</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <doi>0</doi>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <baselabel type='kvm'>+107:+107</baselabel>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <baselabel type='qemu'>+107:+107</baselabel>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </secmodel>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   </host>
Nov 29 06:41:26 compute-0 nova_compute[186241]: 
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <guest>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <os_type>hvm</os_type>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <arch name='i686'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <wordsize>32</wordsize>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <domain type='qemu'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <domain type='kvm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </arch>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <features>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <pae/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <nonpae/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <acpi default='on' toggle='yes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <apic default='on' toggle='no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <cpuselection/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <deviceboot/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <disksnapshot default='on' toggle='no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <externalSnapshot/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </features>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   </guest>
Nov 29 06:41:26 compute-0 nova_compute[186241]: 
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <guest>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <os_type>hvm</os_type>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <arch name='x86_64'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <wordsize>64</wordsize>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <domain type='qemu'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <domain type='kvm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </arch>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <features>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <acpi default='on' toggle='yes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <apic default='on' toggle='no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <cpuselection/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <deviceboot/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <disksnapshot default='on' toggle='no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <externalSnapshot/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </features>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   </guest>
Nov 29 06:41:26 compute-0 nova_compute[186241]: 
Nov 29 06:41:26 compute-0 nova_compute[186241]: </capabilities>
Nov 29 06:41:26 compute-0 nova_compute[186241]: 
Nov 29 06:41:26 compute-0 nova_compute[186241]: 2025-11-29 06:41:26.232 186245 DEBUG nova.virt.libvirt.host [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 29 06:41:26 compute-0 nova_compute[186241]: 2025-11-29 06:41:26.250 186245 DEBUG nova.virt.libvirt.host [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 29 06:41:26 compute-0 nova_compute[186241]: <domainCapabilities>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <path>/usr/libexec/qemu-kvm</path>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <domain>kvm</domain>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <arch>i686</arch>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <vcpu max='4096'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <iothreads supported='yes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <os supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <enum name='firmware'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <loader supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='type'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>rom</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>pflash</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='readonly'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>yes</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>no</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='secure'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>no</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </loader>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   </os>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <cpu>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <mode name='host-passthrough' supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='hostPassthroughMigratable'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>on</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>off</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </mode>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <mode name='maximum' supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='maximumMigratable'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>on</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>off</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </mode>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <mode name='host-model' supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <vendor>AMD</vendor>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='x2apic'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='tsc-deadline'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='hypervisor'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='tsc_adjust'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='spec-ctrl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='stibp'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='ssbd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='cmp_legacy'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='overflow-recov'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='succor'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='ibrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='amd-ssbd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='virt-ssbd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='lbrv'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='tsc-scale'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='vmcb-clean'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='flushbyasid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='pause-filter'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='pfthreshold'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='svme-addr-chk'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='disable' name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </mode>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <mode name='custom' supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Broadwell'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Broadwell-IBRS'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Broadwell-noTSX'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Broadwell-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Broadwell-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Broadwell-v3'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Broadwell-v4'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-v3'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-v4'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-v5'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Cooperlake'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Cooperlake-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Cooperlake-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Denverton'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='mpx'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Denverton-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='mpx'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Denverton-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Denverton-v3'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Dhyana-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='EPYC-Genoa'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amd-psfd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='auto-ibrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='no-nested-data-bp'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='null-sel-clr-base'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='stibp-always-on'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='EPYC-Genoa-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amd-psfd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='auto-ibrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='no-nested-data-bp'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='null-sel-clr-base'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='stibp-always-on'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='EPYC-Milan'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='EPYC-Milan-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='EPYC-Milan-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amd-psfd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='no-nested-data-bp'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='null-sel-clr-base'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='stibp-always-on'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='EPYC-Rome'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='EPYC-Rome-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='EPYC-Rome-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='EPYC-Rome-v3'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='EPYC-v3'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='EPYC-v4'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='GraniteRapids'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-fp16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='mcdt-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pbrsb-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='prefetchiti'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='GraniteRapids-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-fp16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='mcdt-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pbrsb-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='prefetchiti'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='GraniteRapids-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-fp16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx10'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx10-128'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx10-256'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx10-512'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='mcdt-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pbrsb-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='prefetchiti'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Haswell'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Haswell-IBRS'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Haswell-noTSX'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Haswell-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Haswell-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Haswell-v3'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Haswell-v4'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-noTSX'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v3'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v4'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v5'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v6'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v7'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='IvyBridge'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='IvyBridge-IBRS'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='IvyBridge-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='IvyBridge-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='KnightsMill'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-4fmaps'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-4vnniw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512er'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512pf'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='KnightsMill-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-4fmaps'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-4vnniw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512er'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512pf'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Opteron_G4'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fma4'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xop'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Opteron_G4-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fma4'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xop'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Opteron_G5'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fma4'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='tbm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xop'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Opteron_G5-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fma4'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='tbm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xop'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='SapphireRapids'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='SapphireRapids-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='SapphireRapids-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='SapphireRapids-v3'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='SierraForest'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-ne-convert'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-vnni-int8'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='cmpccxadd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='mcdt-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pbrsb-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='SierraForest-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-ne-convert'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-vnni-int8'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='cmpccxadd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='mcdt-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pbrsb-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Client'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Client-IBRS'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Client-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Client-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Client-v3'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Client-v4'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-IBRS'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-v3'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-v4'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-v5'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Snowridge'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='core-capability'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='mpx'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='split-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Snowridge-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='core-capability'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='mpx'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='split-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Snowridge-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='core-capability'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='split-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Snowridge-v3'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='core-capability'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='split-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Snowridge-v4'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='athlon'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='3dnow'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='3dnowext'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='athlon-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='3dnow'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='3dnowext'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='core2duo'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='core2duo-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='coreduo'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='coreduo-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='n270'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='n270-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='phenom'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='3dnow'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='3dnowext'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='phenom-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='3dnow'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='3dnowext'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </mode>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   </cpu>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <memoryBacking supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <enum name='sourceType'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <value>file</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <value>anonymous</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <value>memfd</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   </memoryBacking>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <devices>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <disk supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='diskDevice'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>disk</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>cdrom</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>floppy</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>lun</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='bus'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>fdc</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>scsi</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>virtio</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>usb</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>sata</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='model'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>virtio</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>virtio-transitional</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>virtio-non-transitional</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </disk>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <graphics supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='type'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>vnc</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>egl-headless</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>dbus</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </graphics>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <video supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='modelType'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>vga</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>cirrus</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>virtio</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>none</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>bochs</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>ramfb</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </video>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <hostdev supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='mode'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>subsystem</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='startupPolicy'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>default</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>mandatory</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>requisite</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>optional</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='subsysType'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>usb</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>pci</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>scsi</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='capsType'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='pciBackend'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </hostdev>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <rng supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='model'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>virtio</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>virtio-transitional</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>virtio-non-transitional</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='backendModel'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>random</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>egd</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>builtin</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </rng>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <filesystem supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='driverType'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>path</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>handle</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>virtiofs</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </filesystem>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <tpm supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='model'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>tpm-tis</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>tpm-crb</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='backendModel'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>emulator</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>external</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='backendVersion'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>2.0</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </tpm>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <redirdev supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='bus'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>usb</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </redirdev>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <channel supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='type'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>pty</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>unix</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </channel>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <crypto supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='model'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='type'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>qemu</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='backendModel'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>builtin</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </crypto>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <interface supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='backendType'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>default</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>passt</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </interface>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <panic supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='model'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>isa</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>hyperv</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </panic>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <console supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='type'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>null</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>vc</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>pty</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>dev</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>file</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>pipe</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>stdio</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>udp</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>tcp</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>unix</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>qemu-vdagent</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>dbus</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </console>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   </devices>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <features>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <gic supported='no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <vmcoreinfo supported='yes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <genid supported='yes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <backingStoreInput supported='yes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <backup supported='yes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <async-teardown supported='yes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <ps2 supported='yes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <sev supported='no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <sgx supported='no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <hyperv supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='features'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>relaxed</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>vapic</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>spinlocks</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>vpindex</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>runtime</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>synic</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>stimer</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>reset</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>vendor_id</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>frequencies</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>reenlightenment</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>tlbflush</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>ipi</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>avic</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>emsr_bitmap</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>xmm_input</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <defaults>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <spinlocks>4095</spinlocks>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <stimer_direct>on</stimer_direct>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <tlbflush_direct>on</tlbflush_direct>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <tlbflush_extended>on</tlbflush_extended>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </defaults>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </hyperv>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <launchSecurity supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='sectype'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>tdx</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </launchSecurity>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   </features>
Nov 29 06:41:26 compute-0 nova_compute[186241]: </domainCapabilities>
Nov 29 06:41:26 compute-0 nova_compute[186241]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 29 06:41:26 compute-0 nova_compute[186241]: 2025-11-29 06:41:26.255 186245 DEBUG nova.virt.libvirt.host [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 29 06:41:26 compute-0 nova_compute[186241]: <domainCapabilities>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <path>/usr/libexec/qemu-kvm</path>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <domain>kvm</domain>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <arch>i686</arch>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <vcpu max='240'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <iothreads supported='yes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <os supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <enum name='firmware'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <loader supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='type'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>rom</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>pflash</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='readonly'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>yes</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>no</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='secure'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>no</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </loader>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   </os>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <cpu>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <mode name='host-passthrough' supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='hostPassthroughMigratable'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>on</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>off</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </mode>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <mode name='maximum' supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='maximumMigratable'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>on</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>off</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </mode>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <mode name='host-model' supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <vendor>AMD</vendor>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='x2apic'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='tsc-deadline'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='hypervisor'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='tsc_adjust'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='spec-ctrl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='stibp'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='ssbd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='cmp_legacy'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='overflow-recov'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='succor'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='ibrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='amd-ssbd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='virt-ssbd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='lbrv'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='tsc-scale'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='vmcb-clean'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='flushbyasid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='pause-filter'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='pfthreshold'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='svme-addr-chk'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='disable' name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </mode>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <mode name='custom' supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Broadwell'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Broadwell-IBRS'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Broadwell-noTSX'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Broadwell-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Broadwell-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Broadwell-v3'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Broadwell-v4'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-v3'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-v4'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-v5'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Cooperlake'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Cooperlake-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Cooperlake-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Denverton'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='mpx'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Denverton-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='mpx'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Denverton-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Denverton-v3'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Dhyana-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='EPYC-Genoa'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amd-psfd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='auto-ibrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='no-nested-data-bp'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='null-sel-clr-base'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='stibp-always-on'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='EPYC-Genoa-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amd-psfd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='auto-ibrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='no-nested-data-bp'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='null-sel-clr-base'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='stibp-always-on'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='EPYC-Milan'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='EPYC-Milan-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='EPYC-Milan-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amd-psfd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='no-nested-data-bp'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='null-sel-clr-base'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='stibp-always-on'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='EPYC-Rome'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='EPYC-Rome-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='EPYC-Rome-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='EPYC-Rome-v3'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='EPYC-v3'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='EPYC-v4'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='GraniteRapids'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-fp16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='mcdt-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pbrsb-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='prefetchiti'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='GraniteRapids-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-fp16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='mcdt-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pbrsb-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='prefetchiti'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='GraniteRapids-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-fp16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx10'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx10-128'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx10-256'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx10-512'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='mcdt-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pbrsb-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='prefetchiti'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Haswell'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Haswell-IBRS'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Haswell-noTSX'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Haswell-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Haswell-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Haswell-v3'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Haswell-v4'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-noTSX'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v3'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v4'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v5'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v6'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v7'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='IvyBridge'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='IvyBridge-IBRS'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='IvyBridge-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='IvyBridge-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='KnightsMill'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-4fmaps'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-4vnniw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512er'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512pf'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='KnightsMill-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-4fmaps'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-4vnniw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512er'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512pf'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Opteron_G4'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fma4'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xop'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Opteron_G4-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fma4'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xop'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Opteron_G5'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fma4'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='tbm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xop'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Opteron_G5-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fma4'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='tbm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xop'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='SapphireRapids'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='SapphireRapids-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='SapphireRapids-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='SapphireRapids-v3'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='SierraForest'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-ne-convert'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-vnni-int8'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='cmpccxadd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='mcdt-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pbrsb-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='SierraForest-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-ne-convert'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-vnni-int8'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='cmpccxadd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='mcdt-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pbrsb-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Client'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Client-IBRS'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Client-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Client-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Client-v3'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Client-v4'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-IBRS'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-v3'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-v4'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-v5'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Snowridge'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='core-capability'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='mpx'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='split-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Snowridge-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='core-capability'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='mpx'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='split-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Snowridge-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='core-capability'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='split-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Snowridge-v3'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='core-capability'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='split-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Snowridge-v4'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='athlon'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='3dnow'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='3dnowext'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='athlon-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='3dnow'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='3dnowext'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='core2duo'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='core2duo-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='coreduo'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='coreduo-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='n270'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='n270-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='phenom'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='3dnow'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='3dnowext'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='phenom-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='3dnow'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='3dnowext'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </mode>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   </cpu>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <memoryBacking supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <enum name='sourceType'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <value>file</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <value>anonymous</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <value>memfd</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   </memoryBacking>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <devices>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <disk supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='diskDevice'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>disk</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>cdrom</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>floppy</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>lun</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='bus'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>ide</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>fdc</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>scsi</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>virtio</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>usb</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>sata</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='model'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>virtio</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>virtio-transitional</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>virtio-non-transitional</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </disk>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <graphics supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='type'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>vnc</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>egl-headless</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>dbus</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </graphics>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <video supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='modelType'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>vga</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>cirrus</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>virtio</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>none</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>bochs</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>ramfb</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </video>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <hostdev supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='mode'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>subsystem</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='startupPolicy'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>default</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>mandatory</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>requisite</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>optional</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='subsysType'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>usb</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>pci</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>scsi</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='capsType'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='pciBackend'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </hostdev>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <rng supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='model'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>virtio</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>virtio-transitional</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>virtio-non-transitional</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='backendModel'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>random</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>egd</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>builtin</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </rng>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <filesystem supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='driverType'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>path</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>handle</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>virtiofs</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </filesystem>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <tpm supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='model'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>tpm-tis</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>tpm-crb</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='backendModel'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>emulator</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>external</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='backendVersion'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>2.0</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </tpm>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <redirdev supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='bus'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>usb</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </redirdev>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <channel supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='type'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>pty</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>unix</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </channel>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <crypto supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='model'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='type'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>qemu</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='backendModel'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>builtin</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </crypto>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <interface supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='backendType'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>default</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>passt</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </interface>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <panic supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='model'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>isa</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>hyperv</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </panic>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <console supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='type'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>null</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>vc</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>pty</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>dev</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>file</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>pipe</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>stdio</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>udp</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>tcp</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>unix</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>qemu-vdagent</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>dbus</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </console>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   </devices>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <features>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <gic supported='no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <vmcoreinfo supported='yes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <genid supported='yes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <backingStoreInput supported='yes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <backup supported='yes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <async-teardown supported='yes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <ps2 supported='yes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <sev supported='no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <sgx supported='no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <hyperv supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='features'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>relaxed</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>vapic</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>spinlocks</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>vpindex</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>runtime</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>synic</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>stimer</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>reset</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>vendor_id</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>frequencies</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>reenlightenment</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>tlbflush</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>ipi</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>avic</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>emsr_bitmap</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>xmm_input</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <defaults>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <spinlocks>4095</spinlocks>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <stimer_direct>on</stimer_direct>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <tlbflush_direct>on</tlbflush_direct>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <tlbflush_extended>on</tlbflush_extended>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </defaults>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </hyperv>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <launchSecurity supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='sectype'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>tdx</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </launchSecurity>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   </features>
Nov 29 06:41:26 compute-0 nova_compute[186241]: </domainCapabilities>
Nov 29 06:41:26 compute-0 nova_compute[186241]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 29 06:41:26 compute-0 nova_compute[186241]: 2025-11-29 06:41:26.302 186245 DEBUG nova.virt.libvirt.host [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 29 06:41:26 compute-0 nova_compute[186241]: 2025-11-29 06:41:26.306 186245 DEBUG nova.virt.libvirt.host [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 29 06:41:26 compute-0 nova_compute[186241]: <domainCapabilities>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <path>/usr/libexec/qemu-kvm</path>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <domain>kvm</domain>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <arch>x86_64</arch>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <vcpu max='4096'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <iothreads supported='yes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <os supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <enum name='firmware'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <value>efi</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <loader supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='type'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>rom</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>pflash</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='readonly'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>yes</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>no</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='secure'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>yes</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>no</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </loader>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   </os>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <cpu>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <mode name='host-passthrough' supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='hostPassthroughMigratable'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>on</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>off</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </mode>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <mode name='maximum' supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='maximumMigratable'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>on</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>off</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </mode>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <mode name='host-model' supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <vendor>AMD</vendor>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='x2apic'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='tsc-deadline'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='hypervisor'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='tsc_adjust'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='spec-ctrl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='stibp'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='ssbd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='cmp_legacy'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='overflow-recov'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='succor'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='ibrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='amd-ssbd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='virt-ssbd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='lbrv'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='tsc-scale'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='vmcb-clean'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='flushbyasid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='pause-filter'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='pfthreshold'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='svme-addr-chk'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='disable' name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </mode>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <mode name='custom' supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Broadwell'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Broadwell-IBRS'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Broadwell-noTSX'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Broadwell-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Broadwell-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Broadwell-v3'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Broadwell-v4'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-v3'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-v4'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-v5'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Cooperlake'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Cooperlake-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Cooperlake-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Denverton'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='mpx'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Denverton-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='mpx'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Denverton-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Denverton-v3'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Dhyana-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='EPYC-Genoa'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amd-psfd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='auto-ibrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='no-nested-data-bp'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='null-sel-clr-base'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='stibp-always-on'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='EPYC-Genoa-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amd-psfd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='auto-ibrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='no-nested-data-bp'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='null-sel-clr-base'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='stibp-always-on'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='EPYC-Milan'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='EPYC-Milan-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='EPYC-Milan-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amd-psfd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='no-nested-data-bp'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='null-sel-clr-base'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='stibp-always-on'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='EPYC-Rome'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='EPYC-Rome-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='EPYC-Rome-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='EPYC-Rome-v3'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='EPYC-v3'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='EPYC-v4'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='GraniteRapids'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-fp16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='mcdt-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pbrsb-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='prefetchiti'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='GraniteRapids-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-fp16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='mcdt-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pbrsb-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='prefetchiti'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='GraniteRapids-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-fp16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx10'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx10-128'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx10-256'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx10-512'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='mcdt-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pbrsb-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='prefetchiti'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Haswell'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Haswell-IBRS'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Haswell-noTSX'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Haswell-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Haswell-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Haswell-v3'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Haswell-v4'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-noTSX'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v3'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v4'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v5'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v6'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v7'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='IvyBridge'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='IvyBridge-IBRS'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='IvyBridge-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='IvyBridge-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='KnightsMill'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-4fmaps'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-4vnniw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512er'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512pf'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='KnightsMill-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-4fmaps'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-4vnniw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512er'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512pf'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Opteron_G4'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fma4'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xop'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Opteron_G4-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fma4'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xop'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Opteron_G5'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fma4'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='tbm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xop'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Opteron_G5-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fma4'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='tbm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xop'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='SapphireRapids'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='SapphireRapids-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='SapphireRapids-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='SapphireRapids-v3'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='SierraForest'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-ne-convert'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-vnni-int8'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='cmpccxadd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='mcdt-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pbrsb-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='SierraForest-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-ne-convert'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-vnni-int8'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='cmpccxadd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='mcdt-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pbrsb-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Client'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Client-IBRS'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Client-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Client-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Client-v3'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Client-v4'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-IBRS'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-v3'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-v4'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-v5'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Snowridge'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='core-capability'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='mpx'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='split-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Snowridge-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='core-capability'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='mpx'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='split-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Snowridge-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='core-capability'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='split-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Snowridge-v3'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='core-capability'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='split-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Snowridge-v4'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='athlon'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='3dnow'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='3dnowext'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='athlon-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='3dnow'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='3dnowext'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='core2duo'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='core2duo-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='coreduo'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='coreduo-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='n270'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='n270-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='phenom'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='3dnow'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='3dnowext'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='phenom-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='3dnow'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='3dnowext'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </mode>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   </cpu>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <memoryBacking supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <enum name='sourceType'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <value>file</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <value>anonymous</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <value>memfd</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   </memoryBacking>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <devices>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <disk supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='diskDevice'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>disk</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>cdrom</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>floppy</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>lun</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='bus'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>fdc</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>scsi</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>virtio</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>usb</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>sata</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='model'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>virtio</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>virtio-transitional</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>virtio-non-transitional</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </disk>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <graphics supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='type'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>vnc</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>egl-headless</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>dbus</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </graphics>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <video supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='modelType'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>vga</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>cirrus</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>virtio</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>none</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>bochs</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>ramfb</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </video>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <hostdev supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='mode'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>subsystem</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='startupPolicy'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>default</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>mandatory</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>requisite</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>optional</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='subsysType'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>usb</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>pci</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>scsi</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='capsType'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='pciBackend'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </hostdev>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <rng supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='model'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>virtio</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>virtio-transitional</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>virtio-non-transitional</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='backendModel'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>random</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>egd</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>builtin</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </rng>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <filesystem supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='driverType'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>path</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>handle</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>virtiofs</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </filesystem>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <tpm supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='model'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>tpm-tis</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>tpm-crb</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='backendModel'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>emulator</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>external</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='backendVersion'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>2.0</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </tpm>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <redirdev supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='bus'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>usb</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </redirdev>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <channel supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='type'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>pty</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>unix</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </channel>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <crypto supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='model'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='type'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>qemu</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='backendModel'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>builtin</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </crypto>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <interface supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='backendType'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>default</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>passt</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </interface>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <panic supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='model'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>isa</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>hyperv</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </panic>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <console supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='type'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>null</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>vc</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>pty</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>dev</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>file</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>pipe</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>stdio</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>udp</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>tcp</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>unix</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>qemu-vdagent</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>dbus</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </console>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   </devices>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <features>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <gic supported='no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <vmcoreinfo supported='yes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <genid supported='yes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <backingStoreInput supported='yes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <backup supported='yes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <async-teardown supported='yes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <ps2 supported='yes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <sev supported='no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <sgx supported='no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <hyperv supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='features'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>relaxed</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>vapic</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>spinlocks</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>vpindex</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>runtime</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>synic</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>stimer</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>reset</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>vendor_id</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>frequencies</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>reenlightenment</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>tlbflush</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>ipi</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>avic</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>emsr_bitmap</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>xmm_input</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <defaults>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <spinlocks>4095</spinlocks>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <stimer_direct>on</stimer_direct>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <tlbflush_direct>on</tlbflush_direct>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <tlbflush_extended>on</tlbflush_extended>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </defaults>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </hyperv>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <launchSecurity supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='sectype'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>tdx</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </launchSecurity>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   </features>
Nov 29 06:41:26 compute-0 nova_compute[186241]: </domainCapabilities>
Nov 29 06:41:26 compute-0 nova_compute[186241]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 29 06:41:26 compute-0 nova_compute[186241]: 2025-11-29 06:41:26.369 186245 DEBUG nova.virt.libvirt.host [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 29 06:41:26 compute-0 nova_compute[186241]: <domainCapabilities>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <path>/usr/libexec/qemu-kvm</path>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <domain>kvm</domain>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <arch>x86_64</arch>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <vcpu max='240'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <iothreads supported='yes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <os supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <enum name='firmware'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <loader supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='type'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>rom</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>pflash</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='readonly'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>yes</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>no</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='secure'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>no</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </loader>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   </os>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <cpu>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <mode name='host-passthrough' supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='hostPassthroughMigratable'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>on</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>off</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </mode>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <mode name='maximum' supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='maximumMigratable'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>on</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>off</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </mode>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <mode name='host-model' supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <vendor>AMD</vendor>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='x2apic'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='tsc-deadline'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='hypervisor'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='tsc_adjust'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='spec-ctrl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='stibp'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='ssbd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='cmp_legacy'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='overflow-recov'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='succor'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='ibrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='amd-ssbd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='virt-ssbd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='lbrv'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='tsc-scale'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='vmcb-clean'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='flushbyasid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='pause-filter'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='pfthreshold'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='svme-addr-chk'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <feature policy='disable' name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </mode>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <mode name='custom' supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Broadwell'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Broadwell-IBRS'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Broadwell-noTSX'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Broadwell-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Broadwell-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Broadwell-v3'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Broadwell-v4'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-v3'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-v4'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Cascadelake-Server-v5'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Cooperlake'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Cooperlake-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Cooperlake-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Denverton'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='mpx'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Denverton-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='mpx'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Denverton-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Denverton-v3'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Dhyana-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='EPYC-Genoa'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amd-psfd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='auto-ibrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='no-nested-data-bp'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='null-sel-clr-base'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='stibp-always-on'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='EPYC-Genoa-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amd-psfd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='auto-ibrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='no-nested-data-bp'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='null-sel-clr-base'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='stibp-always-on'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='EPYC-Milan'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 python3.9[186919]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='EPYC-Milan-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='EPYC-Milan-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amd-psfd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='no-nested-data-bp'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='null-sel-clr-base'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='stibp-always-on'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='EPYC-Rome'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='EPYC-Rome-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='EPYC-Rome-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='EPYC-Rome-v3'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='EPYC-v3'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='EPYC-v4'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='GraniteRapids'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-fp16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='mcdt-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pbrsb-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='prefetchiti'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='GraniteRapids-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-fp16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='mcdt-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pbrsb-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='prefetchiti'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='GraniteRapids-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-fp16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx10'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx10-128'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx10-256'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx10-512'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='mcdt-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pbrsb-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='prefetchiti'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Haswell'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Haswell-IBRS'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Haswell-noTSX'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Haswell-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Haswell-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Haswell-v3'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Haswell-v4'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-noTSX'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v3'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v4'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v5'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v6'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Icelake-Server-v7'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='IvyBridge'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='IvyBridge-IBRS'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='IvyBridge-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='IvyBridge-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='KnightsMill'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-4fmaps'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-4vnniw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512er'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512pf'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='KnightsMill-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-4fmaps'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-4vnniw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512er'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512pf'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Opteron_G4'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fma4'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xop'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Opteron_G4-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fma4'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xop'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Opteron_G5'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fma4'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='tbm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xop'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Opteron_G5-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fma4'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='tbm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xop'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='SapphireRapids'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='SapphireRapids-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='SapphireRapids-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='SapphireRapids-v3'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-int8'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='amx-tile'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-bf16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-fp16'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bitalg'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrc'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fzrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='la57'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='taa-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xfd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='SierraForest'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-ne-convert'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-vnni-int8'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='cmpccxadd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='mcdt-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pbrsb-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='SierraForest-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-ifma'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-ne-convert'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-vnni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx-vnni-int8'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='cmpccxadd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fbsdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='fsrs'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ibrs-all'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='mcdt-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pbrsb-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='psdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='serialize'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vaes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Client'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Client-IBRS'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Client-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Client-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Client-v3'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Client-v4'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-IBRS'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='hle'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='rtm'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-v3'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-v4'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Skylake-Server-v5'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512bw'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512cd'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512dq'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512f'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='avx512vl'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='invpcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pcid'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='pku'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Snowridge'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='core-capability'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='mpx'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='split-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Snowridge-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='core-capability'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='mpx'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='split-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Snowridge-v2'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='core-capability'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='split-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Snowridge-v3'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='core-capability'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='split-lock-detect'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='Snowridge-v4'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='cldemote'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='erms'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='gfni'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdir64b'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='movdiri'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='xsaves'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='athlon'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='3dnow'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='3dnowext'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='athlon-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='3dnow'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='3dnowext'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='core2duo'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='core2duo-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='coreduo'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='coreduo-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='n270'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='n270-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='ss'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='phenom'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='3dnow'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='3dnowext'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <blockers model='phenom-v1'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='3dnow'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <feature name='3dnowext'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </blockers>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </mode>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   </cpu>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <memoryBacking supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <enum name='sourceType'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <value>file</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <value>anonymous</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <value>memfd</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   </memoryBacking>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <devices>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <disk supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='diskDevice'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>disk</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>cdrom</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>floppy</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>lun</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='bus'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>ide</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>fdc</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>scsi</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>virtio</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>usb</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>sata</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='model'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>virtio</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>virtio-transitional</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>virtio-non-transitional</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </disk>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <graphics supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='type'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>vnc</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>egl-headless</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>dbus</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </graphics>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <video supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='modelType'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>vga</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>cirrus</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>virtio</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>none</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>bochs</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>ramfb</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </video>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <hostdev supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='mode'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>subsystem</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='startupPolicy'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>default</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>mandatory</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>requisite</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>optional</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='subsysType'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>usb</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>pci</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>scsi</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='capsType'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='pciBackend'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </hostdev>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <rng supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='model'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>virtio</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>virtio-transitional</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>virtio-non-transitional</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='backendModel'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>random</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>egd</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>builtin</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </rng>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <filesystem supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='driverType'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>path</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>handle</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>virtiofs</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </filesystem>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <tpm supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='model'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>tpm-tis</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>tpm-crb</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='backendModel'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>emulator</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>external</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='backendVersion'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>2.0</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </tpm>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <redirdev supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='bus'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>usb</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </redirdev>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <channel supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='type'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>pty</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>unix</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </channel>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <crypto supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='model'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='type'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>qemu</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='backendModel'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>builtin</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </crypto>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <interface supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='backendType'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>default</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>passt</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </interface>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <panic supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='model'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>isa</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>hyperv</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </panic>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <console supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='type'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>null</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>vc</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>pty</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>dev</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>file</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>pipe</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>stdio</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>udp</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>tcp</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>unix</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>qemu-vdagent</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>dbus</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </console>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   </devices>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <features>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <gic supported='no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <vmcoreinfo supported='yes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <genid supported='yes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <backingStoreInput supported='yes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <backup supported='yes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <async-teardown supported='yes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <ps2 supported='yes'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <sev supported='no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <sgx supported='no'/>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <hyperv supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='features'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>relaxed</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>vapic</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>spinlocks</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>vpindex</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>runtime</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>synic</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>stimer</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>reset</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>vendor_id</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>frequencies</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>reenlightenment</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>tlbflush</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>ipi</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>avic</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>emsr_bitmap</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>xmm_input</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <defaults>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <spinlocks>4095</spinlocks>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <stimer_direct>on</stimer_direct>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <tlbflush_direct>on</tlbflush_direct>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <tlbflush_extended>on</tlbflush_extended>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </defaults>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </hyperv>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     <launchSecurity supported='yes'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       <enum name='sectype'>
Nov 29 06:41:26 compute-0 nova_compute[186241]:         <value>tdx</value>
Nov 29 06:41:26 compute-0 nova_compute[186241]:       </enum>
Nov 29 06:41:26 compute-0 nova_compute[186241]:     </launchSecurity>
Nov 29 06:41:26 compute-0 nova_compute[186241]:   </features>
Nov 29 06:41:26 compute-0 nova_compute[186241]: </domainCapabilities>
Nov 29 06:41:26 compute-0 nova_compute[186241]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 29 06:41:26 compute-0 nova_compute[186241]: 2025-11-29 06:41:26.425 186245 DEBUG nova.virt.libvirt.host [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 29 06:41:26 compute-0 nova_compute[186241]: 2025-11-29 06:41:26.425 186245 INFO nova.virt.libvirt.host [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] Secure Boot support detected
Nov 29 06:41:26 compute-0 nova_compute[186241]: 2025-11-29 06:41:26.427 186245 INFO nova.virt.libvirt.driver [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 29 06:41:26 compute-0 nova_compute[186241]: 2025-11-29 06:41:26.428 186245 INFO nova.virt.libvirt.driver [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 29 06:41:26 compute-0 nova_compute[186241]: 2025-11-29 06:41:26.437 186245 DEBUG nova.virt.libvirt.driver [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] cpu compare xml: <cpu match="exact">
Nov 29 06:41:26 compute-0 nova_compute[186241]:   <model>Nehalem</model>
Nov 29 06:41:26 compute-0 nova_compute[186241]: </cpu>
Nov 29 06:41:26 compute-0 nova_compute[186241]:  _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019
Nov 29 06:41:26 compute-0 nova_compute[186241]: 2025-11-29 06:41:26.440 186245 DEBUG nova.virt.libvirt.driver [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Nov 29 06:41:26 compute-0 nova_compute[186241]: 2025-11-29 06:41:26.459 186245 INFO nova.virt.node [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] Determined node identity 4e39a026-df39-4e20-874a-dbb5a40df044 from /var/lib/nova/compute_id
Nov 29 06:41:26 compute-0 nova_compute[186241]: 2025-11-29 06:41:26.480 186245 WARNING nova.compute.manager [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] Compute nodes ['4e39a026-df39-4e20-874a-dbb5a40df044'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Nov 29 06:41:26 compute-0 nova_compute[186241]: 2025-11-29 06:41:26.524 186245 INFO nova.compute.manager [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Nov 29 06:41:26 compute-0 sudo[186917]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:26 compute-0 nova_compute[186241]: 2025-11-29 06:41:26.581 186245 WARNING nova.compute.manager [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Nov 29 06:41:26 compute-0 nova_compute[186241]: 2025-11-29 06:41:26.581 186245 DEBUG oslo_concurrency.lockutils [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:41:26 compute-0 nova_compute[186241]: 2025-11-29 06:41:26.581 186245 DEBUG oslo_concurrency.lockutils [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:41:26 compute-0 nova_compute[186241]: 2025-11-29 06:41:26.582 186245 DEBUG oslo_concurrency.lockutils [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:41:26 compute-0 nova_compute[186241]: 2025-11-29 06:41:26.582 186245 DEBUG nova.compute.resource_tracker [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 06:41:26 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Nov 29 06:41:26 compute-0 systemd[1]: Started libvirt nodedev daemon.
Nov 29 06:41:26 compute-0 nova_compute[186241]: 2025-11-29 06:41:26.860 186245 WARNING nova.virt.libvirt.driver [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:41:26 compute-0 nova_compute[186241]: 2025-11-29 06:41:26.861 186245 DEBUG nova.compute.resource_tracker [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6222MB free_disk=73.54598617553711GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 06:41:26 compute-0 nova_compute[186241]: 2025-11-29 06:41:26.861 186245 DEBUG oslo_concurrency.lockutils [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:41:26 compute-0 nova_compute[186241]: 2025-11-29 06:41:26.861 186245 DEBUG oslo_concurrency.lockutils [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:41:26 compute-0 nova_compute[186241]: 2025-11-29 06:41:26.887 186245 WARNING nova.compute.resource_tracker [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] No compute node record for compute-0.ctlplane.example.com:4e39a026-df39-4e20-874a-dbb5a40df044: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 4e39a026-df39-4e20-874a-dbb5a40df044 could not be found.
Nov 29 06:41:26 compute-0 nova_compute[186241]: 2025-11-29 06:41:26.904 186245 INFO nova.compute.resource_tracker [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] Compute node record created for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com with uuid: 4e39a026-df39-4e20-874a-dbb5a40df044
Nov 29 06:41:26 compute-0 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 06:41:26 compute-0 nova_compute[186241]: 2025-11-29 06:41:26.974 186245 DEBUG nova.compute.resource_tracker [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 06:41:26 compute-0 nova_compute[186241]: 2025-11-29 06:41:26.974 186245 DEBUG nova.compute.resource_tracker [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 06:41:27 compute-0 nova_compute[186241]: 2025-11-29 06:41:27.125 186245 INFO nova.scheduler.client.report [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] [req-fe6eca4b-38c0-4ef7-b4cb-20038af52440] Created resource provider record via placement API for resource provider with UUID 4e39a026-df39-4e20-874a-dbb5a40df044 and name compute-0.ctlplane.example.com.
Nov 29 06:41:27 compute-0 nova_compute[186241]: 2025-11-29 06:41:27.149 186245 DEBUG nova.virt.libvirt.host [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Nov 29 06:41:27 compute-0 nova_compute[186241]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Nov 29 06:41:27 compute-0 nova_compute[186241]: 2025-11-29 06:41:27.149 186245 INFO nova.virt.libvirt.host [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] kernel doesn't support AMD SEV
Nov 29 06:41:27 compute-0 nova_compute[186241]: 2025-11-29 06:41:27.149 186245 DEBUG nova.compute.provider_tree [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] Updating inventory in ProviderTree for provider 4e39a026-df39-4e20-874a-dbb5a40df044 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 29 06:41:27 compute-0 nova_compute[186241]: 2025-11-29 06:41:27.150 186245 DEBUG nova.virt.libvirt.driver [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 06:41:27 compute-0 nova_compute[186241]: 2025-11-29 06:41:27.152 186245 DEBUG nova.virt.libvirt.driver [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] Libvirt baseline CPU <cpu>
Nov 29 06:41:27 compute-0 nova_compute[186241]:   <arch>x86_64</arch>
Nov 29 06:41:27 compute-0 nova_compute[186241]:   <model>Nehalem</model>
Nov 29 06:41:27 compute-0 nova_compute[186241]:   <vendor>AMD</vendor>
Nov 29 06:41:27 compute-0 nova_compute[186241]:   <topology sockets="8" cores="1" threads="1"/>
Nov 29 06:41:27 compute-0 nova_compute[186241]: </cpu>
Nov 29 06:41:27 compute-0 nova_compute[186241]:  _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537
Nov 29 06:41:27 compute-0 sudo[187120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjvkjyjsfmdjzrhvxlkyrvlisdlfosdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398486.92053-4342-255769301104020/AnsiballZ_systemd.py'
Nov 29 06:41:27 compute-0 sudo[187120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:27 compute-0 nova_compute[186241]: 2025-11-29 06:41:27.259 186245 DEBUG nova.scheduler.client.report [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] Updated inventory for provider 4e39a026-df39-4e20-874a-dbb5a40df044 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Nov 29 06:41:27 compute-0 nova_compute[186241]: 2025-11-29 06:41:27.259 186245 DEBUG nova.compute.provider_tree [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] Updating resource provider 4e39a026-df39-4e20-874a-dbb5a40df044 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 29 06:41:27 compute-0 nova_compute[186241]: 2025-11-29 06:41:27.259 186245 DEBUG nova.compute.provider_tree [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] Updating inventory in ProviderTree for provider 4e39a026-df39-4e20-874a-dbb5a40df044 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 29 06:41:27 compute-0 nova_compute[186241]: 2025-11-29 06:41:27.361 186245 DEBUG nova.compute.provider_tree [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] Updating resource provider 4e39a026-df39-4e20-874a-dbb5a40df044 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 29 06:41:27 compute-0 nova_compute[186241]: 2025-11-29 06:41:27.380 186245 DEBUG nova.compute.resource_tracker [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 06:41:27 compute-0 nova_compute[186241]: 2025-11-29 06:41:27.380 186245 DEBUG oslo_concurrency.lockutils [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.519s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:41:27 compute-0 nova_compute[186241]: 2025-11-29 06:41:27.380 186245 DEBUG nova.service [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Nov 29 06:41:27 compute-0 nova_compute[186241]: 2025-11-29 06:41:27.442 186245 DEBUG nova.service [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Nov 29 06:41:27 compute-0 nova_compute[186241]: 2025-11-29 06:41:27.443 186245 DEBUG nova.servicegroup.drivers.db [None req-c265a8b9-1ea7-4247-b1c3-4370502623d2 - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Nov 29 06:41:27 compute-0 python3.9[187122]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:41:27 compute-0 systemd[1]: Stopping nova_compute container...
Nov 29 06:41:27 compute-0 nova_compute[186241]: 2025-11-29 06:41:27.988 186245 WARNING amqp [-] Received method (60, 30) during closing channel 1. This method will be ignored
Nov 29 06:41:27 compute-0 nova_compute[186241]: 2025-11-29 06:41:27.990 186245 DEBUG oslo_concurrency.lockutils [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:41:27 compute-0 nova_compute[186241]: 2025-11-29 06:41:27.990 186245 DEBUG oslo_concurrency.lockutils [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:41:27 compute-0 nova_compute[186241]: 2025-11-29 06:41:27.990 186245 DEBUG oslo_concurrency.lockutils [None req-59200211-3102-4dd4-9024-559a8a8ef22c - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:41:28 compute-0 virtqemud[186729]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Nov 29 06:41:28 compute-0 virtqemud[186729]: hostname: compute-0
Nov 29 06:41:28 compute-0 virtqemud[186729]: End of file while reading data: Input/output error
Nov 29 06:41:28 compute-0 systemd[1]: libpod-9e31a8b4136a4421fec7ff06e740bd442e40fd5203e971de48094f5c0ad9b467.scope: Deactivated successfully.
Nov 29 06:41:28 compute-0 systemd[1]: libpod-9e31a8b4136a4421fec7ff06e740bd442e40fd5203e971de48094f5c0ad9b467.scope: Consumed 3.332s CPU time.
Nov 29 06:41:28 compute-0 podman[187126]: 2025-11-29 06:41:28.45530327 +0000 UTC m=+0.892993381 container died 9e31a8b4136a4421fec7ff06e740bd442e40fd5203e971de48094f5c0ad9b467 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=nova_compute, io.buildah.version=1.41.3, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 06:41:29 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9e31a8b4136a4421fec7ff06e740bd442e40fd5203e971de48094f5c0ad9b467-userdata-shm.mount: Deactivated successfully.
Nov 29 06:41:29 compute-0 systemd[1]: var-lib-containers-storage-overlay-d97f3d504bb082e00ab3578455e4ef7c3c94cdd9e2e07b7ed2948dfa78e077a8-merged.mount: Deactivated successfully.
Nov 29 06:41:30 compute-0 podman[187126]: 2025-11-29 06:41:30.731764515 +0000 UTC m=+3.169454656 container cleanup 9e31a8b4136a4421fec7ff06e740bd442e40fd5203e971de48094f5c0ad9b467 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 06:41:30 compute-0 podman[187126]: nova_compute
Nov 29 06:41:30 compute-0 podman[187156]: nova_compute
Nov 29 06:41:30 compute-0 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Nov 29 06:41:30 compute-0 systemd[1]: Stopped nova_compute container.
Nov 29 06:41:30 compute-0 systemd[1]: Starting nova_compute container...
Nov 29 06:41:30 compute-0 systemd[1]: Started libcrun container.
Nov 29 06:41:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d97f3d504bb082e00ab3578455e4ef7c3c94cdd9e2e07b7ed2948dfa78e077a8/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 29 06:41:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d97f3d504bb082e00ab3578455e4ef7c3c94cdd9e2e07b7ed2948dfa78e077a8/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 29 06:41:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d97f3d504bb082e00ab3578455e4ef7c3c94cdd9e2e07b7ed2948dfa78e077a8/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 29 06:41:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d97f3d504bb082e00ab3578455e4ef7c3c94cdd9e2e07b7ed2948dfa78e077a8/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 29 06:41:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d97f3d504bb082e00ab3578455e4ef7c3c94cdd9e2e07b7ed2948dfa78e077a8/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 29 06:41:31 compute-0 podman[187170]: 2025-11-29 06:41:31.039444404 +0000 UTC m=+0.179275903 container init 9e31a8b4136a4421fec7ff06e740bd442e40fd5203e971de48094f5c0ad9b467 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, container_name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Nov 29 06:41:31 compute-0 podman[187170]: 2025-11-29 06:41:31.044763281 +0000 UTC m=+0.184594750 container start 9e31a8b4136a4421fec7ff06e740bd442e40fd5203e971de48094f5c0ad9b467 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, container_name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Nov 29 06:41:31 compute-0 nova_compute[187185]: + sudo -E kolla_set_configs
Nov 29 06:41:31 compute-0 podman[187170]: nova_compute
Nov 29 06:41:31 compute-0 systemd[1]: Started nova_compute container.
Nov 29 06:41:31 compute-0 nova_compute[187185]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 06:41:31 compute-0 nova_compute[187185]: INFO:__main__:Validating config file
Nov 29 06:41:31 compute-0 nova_compute[187185]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 06:41:31 compute-0 nova_compute[187185]: INFO:__main__:Copying service configuration files
Nov 29 06:41:31 compute-0 nova_compute[187185]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 29 06:41:31 compute-0 nova_compute[187185]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 29 06:41:31 compute-0 nova_compute[187185]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 29 06:41:31 compute-0 nova_compute[187185]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Nov 29 06:41:31 compute-0 nova_compute[187185]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 29 06:41:31 compute-0 nova_compute[187185]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 29 06:41:31 compute-0 nova_compute[187185]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 29 06:41:31 compute-0 nova_compute[187185]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 29 06:41:31 compute-0 nova_compute[187185]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 29 06:41:31 compute-0 nova_compute[187185]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Nov 29 06:41:31 compute-0 nova_compute[187185]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 29 06:41:31 compute-0 nova_compute[187185]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 29 06:41:31 compute-0 nova_compute[187185]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 29 06:41:31 compute-0 nova_compute[187185]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 29 06:41:31 compute-0 nova_compute[187185]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 29 06:41:31 compute-0 nova_compute[187185]: INFO:__main__:Deleting /etc/ceph
Nov 29 06:41:31 compute-0 nova_compute[187185]: INFO:__main__:Creating directory /etc/ceph
Nov 29 06:41:31 compute-0 nova_compute[187185]: INFO:__main__:Setting permission for /etc/ceph
Nov 29 06:41:31 compute-0 nova_compute[187185]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Nov 29 06:41:31 compute-0 nova_compute[187185]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 29 06:41:31 compute-0 nova_compute[187185]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 29 06:41:31 compute-0 nova_compute[187185]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Nov 29 06:41:31 compute-0 nova_compute[187185]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 29 06:41:31 compute-0 nova_compute[187185]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 29 06:41:31 compute-0 nova_compute[187185]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 29 06:41:31 compute-0 nova_compute[187185]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 29 06:41:31 compute-0 nova_compute[187185]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 29 06:41:31 compute-0 nova_compute[187185]: INFO:__main__:Writing out command to execute
Nov 29 06:41:31 compute-0 nova_compute[187185]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 29 06:41:31 compute-0 nova_compute[187185]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 29 06:41:31 compute-0 nova_compute[187185]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 29 06:41:31 compute-0 sudo[187120]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:31 compute-0 nova_compute[187185]: ++ cat /run_command
Nov 29 06:41:31 compute-0 nova_compute[187185]: + CMD=nova-compute
Nov 29 06:41:31 compute-0 nova_compute[187185]: + ARGS=
Nov 29 06:41:31 compute-0 nova_compute[187185]: + sudo kolla_copy_cacerts
Nov 29 06:41:31 compute-0 nova_compute[187185]: + [[ ! -n '' ]]
Nov 29 06:41:31 compute-0 nova_compute[187185]: + . kolla_extend_start
Nov 29 06:41:31 compute-0 nova_compute[187185]: Running command: 'nova-compute'
Nov 29 06:41:31 compute-0 nova_compute[187185]: + echo 'Running command: '\''nova-compute'\'''
Nov 29 06:41:31 compute-0 nova_compute[187185]: + umask 0022
Nov 29 06:41:31 compute-0 nova_compute[187185]: + exec nova-compute
Nov 29 06:41:31 compute-0 sudo[187346]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcwwsnxupxbxdzsnpmddaqyvyzpfkmpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398491.4758234-4369-212812474716274/AnsiballZ_podman_container.py'
Nov 29 06:41:31 compute-0 sudo[187346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:31 compute-0 python3.9[187348]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 29 06:41:32 compute-0 systemd[1]: Started libpod-conmon-aa76c9013abfbca89c47a5a661d424ad01d012dc4f57bb1f8bae5d9a67403c91.scope.
Nov 29 06:41:32 compute-0 systemd[1]: Started libcrun container.
Nov 29 06:41:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58a14b8dfe4afd4df5ee7fd02e1e13022309c93aade516bad3c30e3f7b8fe0cf/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Nov 29 06:41:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58a14b8dfe4afd4df5ee7fd02e1e13022309c93aade516bad3c30e3f7b8fe0cf/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 29 06:41:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58a14b8dfe4afd4df5ee7fd02e1e13022309c93aade516bad3c30e3f7b8fe0cf/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Nov 29 06:41:32 compute-0 podman[187373]: 2025-11-29 06:41:32.569269475 +0000 UTC m=+0.506153245 container init aa76c9013abfbca89c47a5a661d424ad01d012dc4f57bb1f8bae5d9a67403c91 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=nova_compute_init, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 06:41:32 compute-0 podman[187373]: 2025-11-29 06:41:32.613425054 +0000 UTC m=+0.550308774 container start aa76c9013abfbca89c47a5a661d424ad01d012dc4f57bb1f8bae5d9a67403c91 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 29 06:41:32 compute-0 nova_compute_init[187394]: INFO:nova_statedir:Applying nova statedir ownership
Nov 29 06:41:32 compute-0 nova_compute_init[187394]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Nov 29 06:41:32 compute-0 nova_compute_init[187394]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Nov 29 06:41:32 compute-0 nova_compute_init[187394]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Nov 29 06:41:32 compute-0 nova_compute_init[187394]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Nov 29 06:41:32 compute-0 nova_compute_init[187394]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Nov 29 06:41:32 compute-0 nova_compute_init[187394]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Nov 29 06:41:32 compute-0 nova_compute_init[187394]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Nov 29 06:41:32 compute-0 nova_compute_init[187394]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Nov 29 06:41:32 compute-0 nova_compute_init[187394]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Nov 29 06:41:32 compute-0 nova_compute_init[187394]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Nov 29 06:41:32 compute-0 nova_compute_init[187394]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Nov 29 06:41:32 compute-0 nova_compute_init[187394]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Nov 29 06:41:32 compute-0 nova_compute_init[187394]: INFO:nova_statedir:Nova statedir ownership complete
Nov 29 06:41:32 compute-0 systemd[1]: libpod-aa76c9013abfbca89c47a5a661d424ad01d012dc4f57bb1f8bae5d9a67403c91.scope: Deactivated successfully.
Nov 29 06:41:32 compute-0 python3.9[187348]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Nov 29 06:41:32 compute-0 podman[187395]: 2025-11-29 06:41:32.79023136 +0000 UTC m=+0.108518410 container died aa76c9013abfbca89c47a5a661d424ad01d012dc4f57bb1f8bae5d9a67403c91 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, container_name=nova_compute_init)
Nov 29 06:41:33 compute-0 nova_compute[187185]: 2025-11-29 06:41:33.128 187189 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 29 06:41:33 compute-0 nova_compute[187185]: 2025-11-29 06:41:33.128 187189 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 29 06:41:33 compute-0 nova_compute[187185]: 2025-11-29 06:41:33.129 187189 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 29 06:41:33 compute-0 nova_compute[187185]: 2025-11-29 06:41:33.129 187189 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Nov 29 06:41:33 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-aa76c9013abfbca89c47a5a661d424ad01d012dc4f57bb1f8bae5d9a67403c91-userdata-shm.mount: Deactivated successfully.
Nov 29 06:41:33 compute-0 systemd[1]: var-lib-containers-storage-overlay-58a14b8dfe4afd4df5ee7fd02e1e13022309c93aade516bad3c30e3f7b8fe0cf-merged.mount: Deactivated successfully.
Nov 29 06:41:33 compute-0 podman[187395]: 2025-11-29 06:41:33.194476206 +0000 UTC m=+0.512763206 container cleanup aa76c9013abfbca89c47a5a661d424ad01d012dc4f57bb1f8bae5d9a67403c91 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.vendor=CentOS, container_name=nova_compute_init, io.buildah.version=1.41.3, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 06:41:33 compute-0 systemd[1]: libpod-conmon-aa76c9013abfbca89c47a5a661d424ad01d012dc4f57bb1f8bae5d9a67403c91.scope: Deactivated successfully.
Nov 29 06:41:33 compute-0 nova_compute[187185]: 2025-11-29 06:41:33.281 187189 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:41:33 compute-0 nova_compute[187185]: 2025-11-29 06:41:33.305 187189 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:41:33 compute-0 nova_compute[187185]: 2025-11-29 06:41:33.305 187189 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Nov 29 06:41:33 compute-0 sudo[187346]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:33 compute-0 nova_compute[187185]: 2025-11-29 06:41:33.926 187189 INFO nova.virt.driver [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.022 187189 INFO nova.compute.provider_config [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.037 187189 DEBUG oslo_concurrency.lockutils [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.037 187189 DEBUG oslo_concurrency.lockutils [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.037 187189 DEBUG oslo_concurrency.lockutils [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.038 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.038 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.038 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.038 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.038 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.039 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.039 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.039 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.039 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.039 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.039 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.040 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.040 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.040 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.040 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.040 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.041 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.041 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.041 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.041 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.041 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.042 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.042 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.042 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.042 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.042 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.043 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.043 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.043 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.043 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.044 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.044 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.044 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.044 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.044 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.044 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.045 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.045 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.045 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.045 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.046 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.046 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.046 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.046 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.046 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.046 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.047 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.047 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.047 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.047 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.047 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.047 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.048 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.048 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.048 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.048 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.048 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.048 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.048 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.049 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.049 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.049 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.049 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.049 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.049 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.049 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.050 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.050 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.050 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.050 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.050 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.050 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.051 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.051 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.051 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.051 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.051 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.051 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.052 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.052 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.052 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.052 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.052 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.052 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.053 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.053 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.053 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.053 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.053 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.053 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.054 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.054 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.054 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.054 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.054 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.054 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.055 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.055 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.055 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.055 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.055 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.055 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.056 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.056 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.056 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.056 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.056 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.056 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.056 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.057 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.057 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.057 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.057 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.057 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.057 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.058 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.058 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.058 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.058 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.058 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.059 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.059 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.059 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.059 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.059 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.059 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.060 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.060 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.060 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.060 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.060 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.060 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.061 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.061 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.061 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.061 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.061 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.061 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.062 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.062 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.062 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.062 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.062 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.062 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.062 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.063 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.063 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.063 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.063 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.063 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.064 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.064 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.064 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.064 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.064 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.064 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.064 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.065 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.065 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.065 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.065 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.065 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.065 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.066 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.066 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.066 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.066 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.066 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.066 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.066 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.067 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.067 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.067 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.067 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.067 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.067 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.068 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.068 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.068 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.068 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.068 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.068 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.069 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.069 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.069 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.069 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.069 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.069 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.069 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.070 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.070 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.070 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.070 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.070 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.070 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.070 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.071 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.071 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.071 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.071 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.071 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.071 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.071 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.072 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.072 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.072 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.072 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.072 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.072 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.073 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.073 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.073 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.073 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.073 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.073 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.073 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.074 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.074 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.074 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.074 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.074 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.074 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.074 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.075 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.075 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.075 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.075 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.075 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.075 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.076 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.076 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.076 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.076 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.076 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.076 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.076 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.077 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.077 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.077 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.077 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.077 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.077 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.077 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.078 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.078 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.078 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.078 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.078 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.078 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.078 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.079 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.079 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.079 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.079 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.079 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.079 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.079 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.080 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.080 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.080 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.080 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.080 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.080 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.081 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.081 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.081 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.081 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.081 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.081 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.082 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.082 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.082 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.082 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.082 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.082 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.083 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.083 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.083 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.083 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.083 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.084 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.084 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.084 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.084 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.084 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.085 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.085 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.085 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.085 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.085 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.086 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.086 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.086 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.086 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.086 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.086 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.087 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.087 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.087 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.087 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.087 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.088 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.088 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.088 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.088 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.088 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.088 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.089 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.089 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.089 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.089 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.089 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.090 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.090 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.090 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.090 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.090 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.090 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.090 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.091 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.091 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.091 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.091 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.091 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.091 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.092 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.092 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.092 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.092 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.092 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.092 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.092 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.093 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.093 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.093 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.093 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.093 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.093 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.093 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.094 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.094 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.094 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.094 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.094 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.094 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.094 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.095 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.095 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.095 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.095 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.095 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.095 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.095 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.096 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.096 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.096 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.096 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.096 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.097 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.097 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.097 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.097 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.097 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.097 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.097 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.098 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.098 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.098 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.098 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.098 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.098 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.098 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.099 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.099 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.099 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.099 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.099 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.099 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.099 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.099 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.100 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.100 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.100 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.100 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.100 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.100 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.101 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.101 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.101 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.101 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.101 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.102 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.102 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.102 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.102 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.102 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.102 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.103 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.103 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.103 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.103 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.103 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.103 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.103 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.104 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.104 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.104 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.104 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.104 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.104 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.104 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.105 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.105 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.105 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.105 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.105 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.105 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.105 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.106 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.106 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.106 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.106 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.106 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.106 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.106 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.106 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.107 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.107 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.107 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.107 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.107 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.107 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.107 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.108 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.108 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.108 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.108 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.108 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.108 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.108 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.109 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.109 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.109 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.109 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.109 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.109 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.110 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.110 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.110 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.110 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.110 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.110 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.111 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.111 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.111 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.111 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.111 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.111 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.111 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.112 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.112 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.112 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.112 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.112 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.112 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.113 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.113 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.113 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.113 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.113 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.113 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.113 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.114 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.114 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.114 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.114 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.114 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.114 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.114 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.115 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.115 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.115 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.115 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.115 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.115 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.115 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.115 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.116 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.116 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.116 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.116 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.116 187189 WARNING oslo_config.cfg [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 29 06:41:34 compute-0 nova_compute[187185]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 29 06:41:34 compute-0 nova_compute[187185]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 29 06:41:34 compute-0 nova_compute[187185]: and ``live_migration_inbound_addr`` respectively.
Nov 29 06:41:34 compute-0 nova_compute[187185]: ).  Its value may be silently ignored in the future.
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.116 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.117 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.117 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.117 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.117 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.117 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.117 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.118 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.118 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.118 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.118 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.118 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.118 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.118 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.119 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.119 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.119 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.119 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.119 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.119 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.119 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.119 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.120 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.120 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.120 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.120 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.120 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.120 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.121 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.121 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.121 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.121 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.121 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.121 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.121 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.122 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.122 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.122 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.122 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.122 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.122 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.123 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.123 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.123 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.123 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.123 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.123 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.123 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.123 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.124 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.124 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.124 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.124 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.124 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.124 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.125 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.125 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.125 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.125 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.125 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.125 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.125 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.125 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.126 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.126 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.126 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.126 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.126 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.126 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.126 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.127 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.127 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.127 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.127 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.127 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.127 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.127 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.128 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.128 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.128 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.128 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.128 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.128 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.128 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.129 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 sshd-session[159040]: Connection closed by 192.168.122.30 port 36430
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.129 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.129 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.129 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.129 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.129 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.129 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.130 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.130 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.130 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.130 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.130 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.130 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.130 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.130 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.131 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.131 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.131 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.131 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.131 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.131 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.131 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.131 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.132 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 sshd-session[159037]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.132 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.132 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.132 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.132 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.132 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.132 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.133 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.133 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.133 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.133 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.133 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.133 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.134 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.134 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.134 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.134 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.134 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.134 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.134 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.135 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.135 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.135 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.135 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.135 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.135 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.135 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.136 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.136 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.136 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.136 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.136 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.136 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 systemd-logind[788]: Session 23 logged out. Waiting for processes to exit.
Nov 29 06:41:34 compute-0 systemd[1]: session-23.scope: Deactivated successfully.
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.137 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.137 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.137 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 systemd[1]: session-23.scope: Consumed 1min 49.332s CPU time.
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.137 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.137 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.137 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.138 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.138 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.138 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.138 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.138 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.138 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.138 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.139 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.139 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 systemd-logind[788]: Removed session 23.
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.139 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.139 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.139 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.139 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.140 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.140 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.140 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.140 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.140 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.141 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.141 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.141 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.141 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.141 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.141 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.142 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.142 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.142 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.142 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.142 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.142 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.142 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.142 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.143 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.143 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.143 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.143 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.143 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.143 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.144 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.144 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.144 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.144 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.144 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.144 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.145 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.145 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.145 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.145 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.145 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.145 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.146 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.146 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.146 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.146 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.147 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.147 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.147 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.147 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.147 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.147 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.147 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.148 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.148 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.148 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.148 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.148 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.148 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.148 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.149 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.149 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.149 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.149 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.149 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.149 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.149 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.150 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.150 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.150 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.150 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.150 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.150 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.150 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.151 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.151 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.151 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.151 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.151 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.151 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.151 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.152 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.152 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.152 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.152 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.152 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.152 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.152 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.153 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.153 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.153 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.153 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.153 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.153 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.154 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.154 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.154 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.154 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.154 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.154 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.155 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.155 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.155 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.155 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.155 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.155 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.155 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.156 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.156 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.156 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.156 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.156 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.156 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.157 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.157 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.157 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.157 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.157 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.157 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.157 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.158 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.158 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.158 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.158 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.158 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.158 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.159 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.159 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.159 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.159 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.159 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.159 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.160 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.160 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.160 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.160 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.160 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.160 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.160 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.161 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.161 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.161 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.161 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.161 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.161 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.162 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.162 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.162 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.162 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.162 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.162 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.162 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.163 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.163 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.163 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.163 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.163 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.163 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.163 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.164 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.164 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.164 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.164 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.164 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.164 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.165 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.165 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.165 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.165 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.165 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.165 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.165 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.166 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.166 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.166 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.166 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.166 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.166 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.166 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.167 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.167 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.167 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.167 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.167 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.167 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.167 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.168 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.168 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.168 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.168 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.168 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.168 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.168 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.169 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.169 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.169 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.169 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.169 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.169 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.169 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.170 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.170 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.170 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.170 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.170 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.170 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.170 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.171 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.171 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.171 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.171 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.171 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.171 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.171 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.172 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.172 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.172 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.172 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.172 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.172 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.173 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.173 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.173 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.173 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.173 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.173 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.174 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.174 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.174 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.174 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.174 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.174 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.175 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.175 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.175 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.175 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.175 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.175 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.176 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.176 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.176 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.176 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.176 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.177 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.177 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.177 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.177 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.177 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.177 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.177 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.178 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.178 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.178 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.178 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.178 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.178 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.179 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.179 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.179 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.179 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.179 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.179 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.180 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.180 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.180 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.180 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.180 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.180 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.181 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.181 187189 DEBUG oslo_service.service [None req-4abfaad9-c5ae-4086-a637-53f2c04851ed - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.182 187189 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.198 187189 INFO nova.virt.node [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Determined node identity 4e39a026-df39-4e20-874a-dbb5a40df044 from /var/lib/nova/compute_id
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.199 187189 DEBUG nova.virt.libvirt.host [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.200 187189 DEBUG nova.virt.libvirt.host [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.200 187189 DEBUG nova.virt.libvirt.host [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.200 187189 DEBUG nova.virt.libvirt.host [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.213 187189 DEBUG nova.virt.libvirt.host [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f7082a60c70> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.216 187189 DEBUG nova.virt.libvirt.host [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f7082a60c70> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.217 187189 INFO nova.virt.libvirt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Connection event '1' reason 'None'
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.224 187189 INFO nova.virt.libvirt.host [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Libvirt host capabilities <capabilities>
Nov 29 06:41:34 compute-0 nova_compute[187185]: 
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <host>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <uuid>2814de55-a942-4164-91c1-92c593f2f35f</uuid>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <cpu>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <arch>x86_64</arch>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model>EPYC-Rome-v4</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <vendor>AMD</vendor>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <microcode version='16777317'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <signature family='23' model='49' stepping='0'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <maxphysaddr mode='emulate' bits='40'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature name='x2apic'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature name='tsc-deadline'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature name='osxsave'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature name='hypervisor'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature name='tsc_adjust'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature name='spec-ctrl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature name='stibp'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature name='arch-capabilities'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature name='ssbd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature name='cmp_legacy'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature name='topoext'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature name='virt-ssbd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature name='lbrv'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature name='tsc-scale'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature name='vmcb-clean'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature name='pause-filter'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature name='pfthreshold'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature name='svme-addr-chk'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature name='rdctl-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature name='skip-l1dfl-vmentry'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature name='mds-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature name='pschange-mc-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <pages unit='KiB' size='4'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <pages unit='KiB' size='2048'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <pages unit='KiB' size='1048576'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </cpu>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <power_management>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <suspend_mem/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <suspend_disk/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <suspend_hybrid/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </power_management>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <iommu support='no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <migration_features>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <live/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <uri_transports>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <uri_transport>tcp</uri_transport>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <uri_transport>rdma</uri_transport>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </uri_transports>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </migration_features>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <topology>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <cells num='1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <cell id='0'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:           <memory unit='KiB'>7864316</memory>
Nov 29 06:41:34 compute-0 nova_compute[187185]:           <pages unit='KiB' size='4'>1966079</pages>
Nov 29 06:41:34 compute-0 nova_compute[187185]:           <pages unit='KiB' size='2048'>0</pages>
Nov 29 06:41:34 compute-0 nova_compute[187185]:           <pages unit='KiB' size='1048576'>0</pages>
Nov 29 06:41:34 compute-0 nova_compute[187185]:           <distances>
Nov 29 06:41:34 compute-0 nova_compute[187185]:             <sibling id='0' value='10'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:           </distances>
Nov 29 06:41:34 compute-0 nova_compute[187185]:           <cpus num='8'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:           </cpus>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         </cell>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </cells>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </topology>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <cache>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </cache>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <secmodel>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model>selinux</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <doi>0</doi>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </secmodel>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <secmodel>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model>dac</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <doi>0</doi>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <baselabel type='kvm'>+107:+107</baselabel>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <baselabel type='qemu'>+107:+107</baselabel>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </secmodel>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   </host>
Nov 29 06:41:34 compute-0 nova_compute[187185]: 
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <guest>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <os_type>hvm</os_type>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <arch name='i686'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <wordsize>32</wordsize>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <domain type='qemu'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <domain type='kvm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </arch>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <features>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <pae/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <nonpae/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <acpi default='on' toggle='yes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <apic default='on' toggle='no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <cpuselection/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <deviceboot/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <disksnapshot default='on' toggle='no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <externalSnapshot/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </features>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   </guest>
Nov 29 06:41:34 compute-0 nova_compute[187185]: 
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <guest>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <os_type>hvm</os_type>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <arch name='x86_64'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <wordsize>64</wordsize>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <domain type='qemu'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <domain type='kvm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </arch>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <features>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <acpi default='on' toggle='yes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <apic default='on' toggle='no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <cpuselection/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <deviceboot/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <disksnapshot default='on' toggle='no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <externalSnapshot/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </features>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   </guest>
Nov 29 06:41:34 compute-0 nova_compute[187185]: 
Nov 29 06:41:34 compute-0 nova_compute[187185]: </capabilities>
Nov 29 06:41:34 compute-0 nova_compute[187185]: 
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.233 187189 DEBUG nova.virt.libvirt.host [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.237 187189 DEBUG nova.virt.libvirt.host [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 29 06:41:34 compute-0 nova_compute[187185]: <domainCapabilities>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <path>/usr/libexec/qemu-kvm</path>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <domain>kvm</domain>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <arch>i686</arch>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <vcpu max='4096'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <iothreads supported='yes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <os supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <enum name='firmware'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <loader supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='type'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>rom</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>pflash</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='readonly'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>yes</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>no</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='secure'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>no</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </loader>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   </os>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <cpu>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <mode name='host-passthrough' supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='hostPassthroughMigratable'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>on</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>off</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </mode>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <mode name='maximum' supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='maximumMigratable'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>on</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>off</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </mode>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <mode name='host-model' supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <vendor>AMD</vendor>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='x2apic'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='tsc-deadline'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='hypervisor'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='tsc_adjust'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='spec-ctrl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='stibp'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='ssbd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='cmp_legacy'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='overflow-recov'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='succor'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='ibrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='amd-ssbd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='virt-ssbd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='lbrv'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='tsc-scale'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='vmcb-clean'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='flushbyasid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='pause-filter'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='pfthreshold'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='svme-addr-chk'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='disable' name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </mode>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <mode name='custom' supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Broadwell'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Broadwell-IBRS'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Broadwell-noTSX'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Broadwell-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Broadwell-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Broadwell-v3'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Broadwell-v4'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Cascadelake-Server'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Cascadelake-Server-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Cascadelake-Server-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Cascadelake-Server-v3'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Cascadelake-Server-v4'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Cascadelake-Server-v5'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Cooperlake'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Cooperlake-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Cooperlake-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Denverton'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='mpx'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Denverton-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='mpx'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Denverton-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Denverton-v3'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Dhyana-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='EPYC-Genoa'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amd-psfd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='auto-ibrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='no-nested-data-bp'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='null-sel-clr-base'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='stibp-always-on'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='EPYC-Genoa-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amd-psfd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='auto-ibrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='no-nested-data-bp'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='null-sel-clr-base'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='stibp-always-on'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='EPYC-Milan'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='EPYC-Milan-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='EPYC-Milan-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amd-psfd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='no-nested-data-bp'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='null-sel-clr-base'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='stibp-always-on'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='EPYC-Rome'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='EPYC-Rome-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='EPYC-Rome-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='EPYC-Rome-v3'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='EPYC-v3'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='EPYC-v4'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='GraniteRapids'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-fp16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-int8'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-tile'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-fp16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fbsdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrc'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fzrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='mcdt-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pbrsb-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='prefetchiti'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='psdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='serialize'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xfd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='GraniteRapids-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-fp16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-int8'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-tile'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-fp16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fbsdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrc'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fzrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='mcdt-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pbrsb-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='prefetchiti'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='psdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='serialize'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xfd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='GraniteRapids-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-fp16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-int8'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-tile'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx10'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx10-128'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx10-256'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx10-512'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-fp16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='cldemote'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fbsdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrc'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fzrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='mcdt-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdir64b'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdiri'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pbrsb-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='prefetchiti'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='psdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='serialize'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ss'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xfd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Haswell'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Haswell-IBRS'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Haswell-noTSX'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Haswell-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Haswell-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Haswell-v3'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Haswell-v4'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Icelake-Server'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Icelake-Server-noTSX'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Icelake-Server-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Icelake-Server-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Icelake-Server-v3'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Icelake-Server-v4'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Icelake-Server-v5'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Icelake-Server-v6'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Icelake-Server-v7'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='IvyBridge'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='IvyBridge-IBRS'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='IvyBridge-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='IvyBridge-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='KnightsMill'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-4fmaps'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-4vnniw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512er'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512pf'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ss'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='KnightsMill-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-4fmaps'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-4vnniw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512er'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512pf'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ss'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Opteron_G4'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fma4'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xop'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Opteron_G4-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fma4'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xop'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Opteron_G5'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fma4'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='tbm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xop'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Opteron_G5-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fma4'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='tbm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xop'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='SapphireRapids'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-int8'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-tile'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-fp16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrc'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fzrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='serialize'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xfd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='SapphireRapids-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-int8'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-tile'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-fp16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrc'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fzrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='serialize'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xfd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='SapphireRapids-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-int8'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-tile'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-fp16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fbsdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrc'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fzrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='psdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='serialize'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xfd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='SapphireRapids-v3'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-int8'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-tile'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-fp16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='cldemote'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fbsdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrc'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fzrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdir64b'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdiri'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='psdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='serialize'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ss'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xfd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='SierraForest'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-ne-convert'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-vnni-int8'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='cmpccxadd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fbsdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='mcdt-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pbrsb-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='psdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='serialize'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='SierraForest-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-ne-convert'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-vnni-int8'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='cmpccxadd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fbsdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='mcdt-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pbrsb-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='psdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='serialize'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Client'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Client-IBRS'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Client-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Client-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Client-v3'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Client-v4'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Server'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Server-IBRS'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Server-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Server-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Server-v3'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Server-v4'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Server-v5'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Snowridge'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='cldemote'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='core-capability'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdir64b'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdiri'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='mpx'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='split-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Snowridge-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='cldemote'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='core-capability'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdir64b'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdiri'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='mpx'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='split-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Snowridge-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='cldemote'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='core-capability'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdir64b'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdiri'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='split-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Snowridge-v3'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='cldemote'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='core-capability'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdir64b'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdiri'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='split-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Snowridge-v4'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='cldemote'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdir64b'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdiri'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='athlon'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='3dnow'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='3dnowext'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='athlon-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='3dnow'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='3dnowext'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='core2duo'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ss'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='core2duo-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ss'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='coreduo'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ss'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='coreduo-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ss'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='n270'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ss'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='n270-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ss'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='phenom'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='3dnow'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='3dnowext'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='phenom-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='3dnow'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='3dnowext'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </mode>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   </cpu>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <memoryBacking supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <enum name='sourceType'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <value>file</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <value>anonymous</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <value>memfd</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   </memoryBacking>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <devices>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <disk supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='diskDevice'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>disk</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>cdrom</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>floppy</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>lun</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='bus'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>fdc</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>scsi</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>virtio</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>usb</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>sata</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='model'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>virtio</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>virtio-transitional</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>virtio-non-transitional</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </disk>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <graphics supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='type'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>vnc</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>egl-headless</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>dbus</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </graphics>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <video supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='modelType'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>vga</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>cirrus</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>virtio</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>none</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>bochs</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>ramfb</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </video>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <hostdev supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='mode'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>subsystem</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='startupPolicy'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>default</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>mandatory</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>requisite</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>optional</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='subsysType'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>usb</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>pci</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>scsi</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='capsType'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='pciBackend'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </hostdev>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <rng supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='model'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>virtio</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>virtio-transitional</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>virtio-non-transitional</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='backendModel'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>random</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>egd</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>builtin</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </rng>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <filesystem supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='driverType'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>path</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>handle</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>virtiofs</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </filesystem>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <tpm supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='model'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>tpm-tis</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>tpm-crb</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='backendModel'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>emulator</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>external</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='backendVersion'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>2.0</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </tpm>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <redirdev supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='bus'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>usb</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </redirdev>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <channel supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='type'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>pty</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>unix</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </channel>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <crypto supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='model'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='type'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>qemu</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='backendModel'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>builtin</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </crypto>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <interface supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='backendType'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>default</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>passt</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </interface>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <panic supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='model'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>isa</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>hyperv</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </panic>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <console supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='type'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>null</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>vc</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>pty</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>dev</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>file</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>pipe</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>stdio</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>udp</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>tcp</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>unix</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>qemu-vdagent</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>dbus</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </console>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   </devices>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <features>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <gic supported='no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <vmcoreinfo supported='yes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <genid supported='yes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <backingStoreInput supported='yes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <backup supported='yes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <async-teardown supported='yes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <ps2 supported='yes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <sev supported='no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <sgx supported='no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <hyperv supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='features'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>relaxed</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>vapic</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>spinlocks</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>vpindex</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>runtime</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>synic</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>stimer</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>reset</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>vendor_id</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>frequencies</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>reenlightenment</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>tlbflush</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>ipi</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>avic</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>emsr_bitmap</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>xmm_input</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <defaults>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <spinlocks>4095</spinlocks>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <stimer_direct>on</stimer_direct>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <tlbflush_direct>on</tlbflush_direct>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <tlbflush_extended>on</tlbflush_extended>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </defaults>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </hyperv>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <launchSecurity supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='sectype'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>tdx</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </launchSecurity>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   </features>
Nov 29 06:41:34 compute-0 nova_compute[187185]: </domainCapabilities>
Nov 29 06:41:34 compute-0 nova_compute[187185]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.244 187189 DEBUG nova.virt.libvirt.host [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 29 06:41:34 compute-0 nova_compute[187185]: <domainCapabilities>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <path>/usr/libexec/qemu-kvm</path>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <domain>kvm</domain>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <arch>i686</arch>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <vcpu max='240'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <iothreads supported='yes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <os supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <enum name='firmware'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <loader supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='type'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>rom</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>pflash</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='readonly'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>yes</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>no</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='secure'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>no</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </loader>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   </os>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <cpu>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <mode name='host-passthrough' supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='hostPassthroughMigratable'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>on</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>off</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </mode>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <mode name='maximum' supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='maximumMigratable'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>on</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>off</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </mode>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <mode name='host-model' supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <vendor>AMD</vendor>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='x2apic'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='tsc-deadline'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='hypervisor'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='tsc_adjust'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='spec-ctrl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='stibp'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='ssbd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='cmp_legacy'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='overflow-recov'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='succor'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='ibrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='amd-ssbd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='virt-ssbd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='lbrv'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='tsc-scale'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='vmcb-clean'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='flushbyasid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='pause-filter'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='pfthreshold'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='svme-addr-chk'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='disable' name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </mode>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <mode name='custom' supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Broadwell'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Broadwell-IBRS'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Broadwell-noTSX'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Broadwell-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Broadwell-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Broadwell-v3'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Broadwell-v4'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Cascadelake-Server'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Cascadelake-Server-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Cascadelake-Server-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Cascadelake-Server-v3'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Cascadelake-Server-v4'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Cascadelake-Server-v5'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Cooperlake'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Cooperlake-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Cooperlake-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Denverton'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='mpx'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Denverton-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='mpx'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Denverton-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Denverton-v3'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Dhyana-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='EPYC-Genoa'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amd-psfd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='auto-ibrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='no-nested-data-bp'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='null-sel-clr-base'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='stibp-always-on'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='EPYC-Genoa-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amd-psfd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='auto-ibrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='no-nested-data-bp'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='null-sel-clr-base'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='stibp-always-on'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='EPYC-Milan'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='EPYC-Milan-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='EPYC-Milan-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amd-psfd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='no-nested-data-bp'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='null-sel-clr-base'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='stibp-always-on'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='EPYC-Rome'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='EPYC-Rome-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='EPYC-Rome-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='EPYC-Rome-v3'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='EPYC-v3'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='EPYC-v4'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='GraniteRapids'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-fp16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-int8'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-tile'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-fp16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fbsdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrc'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fzrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='mcdt-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pbrsb-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='prefetchiti'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='psdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='serialize'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xfd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='GraniteRapids-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-fp16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-int8'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-tile'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-fp16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fbsdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrc'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fzrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='mcdt-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pbrsb-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='prefetchiti'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='psdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='serialize'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xfd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='GraniteRapids-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-fp16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-int8'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-tile'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx10'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx10-128'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx10-256'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx10-512'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-fp16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='cldemote'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fbsdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrc'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fzrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='mcdt-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdir64b'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdiri'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pbrsb-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='prefetchiti'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='psdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='serialize'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ss'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xfd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Haswell'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Haswell-IBRS'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Haswell-noTSX'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Haswell-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Haswell-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Haswell-v3'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Haswell-v4'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Icelake-Server'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Icelake-Server-noTSX'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Icelake-Server-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Icelake-Server-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Icelake-Server-v3'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Icelake-Server-v4'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Icelake-Server-v5'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Icelake-Server-v6'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Icelake-Server-v7'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='IvyBridge'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='IvyBridge-IBRS'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='IvyBridge-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='IvyBridge-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='KnightsMill'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-4fmaps'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-4vnniw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512er'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512pf'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ss'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='KnightsMill-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-4fmaps'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-4vnniw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512er'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512pf'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ss'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Opteron_G4'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fma4'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xop'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Opteron_G4-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fma4'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xop'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Opteron_G5'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fma4'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='tbm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xop'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Opteron_G5-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fma4'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='tbm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xop'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='SapphireRapids'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-int8'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-tile'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-fp16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrc'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fzrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='serialize'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xfd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='SapphireRapids-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-int8'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-tile'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-fp16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrc'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fzrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='serialize'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xfd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='SapphireRapids-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-int8'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-tile'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-fp16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fbsdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrc'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fzrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='psdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='serialize'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xfd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='SapphireRapids-v3'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-int8'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-tile'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-fp16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='cldemote'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fbsdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrc'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fzrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdir64b'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdiri'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='psdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='serialize'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ss'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xfd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='SierraForest'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-ne-convert'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-vnni-int8'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='cmpccxadd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fbsdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='mcdt-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pbrsb-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='psdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='serialize'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='SierraForest-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-ne-convert'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-vnni-int8'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='cmpccxadd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fbsdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='mcdt-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pbrsb-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='psdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='serialize'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Client'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Client-IBRS'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Client-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Client-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Client-v3'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Client-v4'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Server'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Server-IBRS'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Server-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Server-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Server-v3'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Server-v4'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Server-v5'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Snowridge'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='cldemote'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='core-capability'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdir64b'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdiri'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='mpx'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='split-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Snowridge-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='cldemote'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='core-capability'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdir64b'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdiri'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='mpx'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='split-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Snowridge-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='cldemote'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='core-capability'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdir64b'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdiri'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='split-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Snowridge-v3'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='cldemote'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='core-capability'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdir64b'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdiri'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='split-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Snowridge-v4'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='cldemote'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdir64b'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdiri'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='athlon'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='3dnow'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='3dnowext'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='athlon-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='3dnow'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='3dnowext'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='core2duo'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ss'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='core2duo-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ss'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='coreduo'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ss'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='coreduo-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ss'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='n270'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ss'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='n270-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ss'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='phenom'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='3dnow'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='3dnowext'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='phenom-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='3dnow'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='3dnowext'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </mode>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   </cpu>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <memoryBacking supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <enum name='sourceType'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <value>file</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <value>anonymous</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <value>memfd</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   </memoryBacking>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <devices>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <disk supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='diskDevice'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>disk</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>cdrom</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>floppy</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>lun</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='bus'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>ide</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>fdc</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>scsi</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>virtio</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>usb</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>sata</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='model'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>virtio</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>virtio-transitional</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>virtio-non-transitional</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </disk>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <graphics supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='type'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>vnc</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>egl-headless</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>dbus</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </graphics>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <video supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='modelType'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>vga</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>cirrus</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>virtio</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>none</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>bochs</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>ramfb</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </video>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <hostdev supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='mode'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>subsystem</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='startupPolicy'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>default</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>mandatory</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>requisite</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>optional</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='subsysType'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>usb</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>pci</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>scsi</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='capsType'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='pciBackend'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </hostdev>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <rng supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='model'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>virtio</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>virtio-transitional</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>virtio-non-transitional</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='backendModel'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>random</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>egd</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>builtin</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </rng>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <filesystem supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='driverType'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>path</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>handle</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>virtiofs</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </filesystem>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <tpm supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='model'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>tpm-tis</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>tpm-crb</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='backendModel'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>emulator</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>external</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='backendVersion'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>2.0</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </tpm>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <redirdev supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='bus'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>usb</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </redirdev>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <channel supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='type'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>pty</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>unix</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </channel>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <crypto supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='model'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='type'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>qemu</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='backendModel'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>builtin</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </crypto>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <interface supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='backendType'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>default</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>passt</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </interface>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <panic supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='model'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>isa</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>hyperv</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </panic>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <console supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='type'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>null</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>vc</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>pty</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>dev</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>file</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>pipe</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>stdio</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>udp</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>tcp</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>unix</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>qemu-vdagent</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>dbus</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </console>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   </devices>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <features>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <gic supported='no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <vmcoreinfo supported='yes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <genid supported='yes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <backingStoreInput supported='yes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <backup supported='yes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <async-teardown supported='yes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <ps2 supported='yes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <sev supported='no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <sgx supported='no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <hyperv supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='features'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>relaxed</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>vapic</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>spinlocks</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>vpindex</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>runtime</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>synic</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>stimer</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>reset</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>vendor_id</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>frequencies</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>reenlightenment</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>tlbflush</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>ipi</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>avic</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>emsr_bitmap</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>xmm_input</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <defaults>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <spinlocks>4095</spinlocks>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <stimer_direct>on</stimer_direct>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <tlbflush_direct>on</tlbflush_direct>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <tlbflush_extended>on</tlbflush_extended>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </defaults>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </hyperv>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <launchSecurity supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='sectype'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>tdx</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </launchSecurity>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   </features>
Nov 29 06:41:34 compute-0 nova_compute[187185]: </domainCapabilities>
Nov 29 06:41:34 compute-0 nova_compute[187185]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.284 187189 DEBUG nova.virt.libvirt.host [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.289 187189 DEBUG nova.virt.libvirt.host [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 29 06:41:34 compute-0 nova_compute[187185]: <domainCapabilities>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <path>/usr/libexec/qemu-kvm</path>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <domain>kvm</domain>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <arch>x86_64</arch>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <vcpu max='4096'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <iothreads supported='yes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <os supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <enum name='firmware'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <value>efi</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <loader supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='type'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>rom</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>pflash</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='readonly'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>yes</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>no</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='secure'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>yes</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>no</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </loader>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   </os>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <cpu>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <mode name='host-passthrough' supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='hostPassthroughMigratable'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>on</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>off</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </mode>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <mode name='maximum' supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='maximumMigratable'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>on</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>off</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </mode>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <mode name='host-model' supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <vendor>AMD</vendor>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='x2apic'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='tsc-deadline'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='hypervisor'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='tsc_adjust'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='spec-ctrl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='stibp'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='ssbd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='cmp_legacy'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='overflow-recov'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='succor'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='ibrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='amd-ssbd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='virt-ssbd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='lbrv'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='tsc-scale'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='vmcb-clean'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='flushbyasid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='pause-filter'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='pfthreshold'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='svme-addr-chk'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='disable' name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </mode>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <mode name='custom' supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Broadwell'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Broadwell-IBRS'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Broadwell-noTSX'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Broadwell-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Broadwell-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Broadwell-v3'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Broadwell-v4'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Cascadelake-Server'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Cascadelake-Server-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Cascadelake-Server-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Cascadelake-Server-v3'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Cascadelake-Server-v4'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Cascadelake-Server-v5'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Cooperlake'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Cooperlake-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Cooperlake-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Denverton'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='mpx'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Denverton-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='mpx'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Denverton-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Denverton-v3'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Dhyana-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='EPYC-Genoa'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amd-psfd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='auto-ibrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='no-nested-data-bp'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='null-sel-clr-base'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='stibp-always-on'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='EPYC-Genoa-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amd-psfd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='auto-ibrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='no-nested-data-bp'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='null-sel-clr-base'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='stibp-always-on'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='EPYC-Milan'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='EPYC-Milan-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='EPYC-Milan-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amd-psfd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='no-nested-data-bp'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='null-sel-clr-base'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='stibp-always-on'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='EPYC-Rome'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='EPYC-Rome-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='EPYC-Rome-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='EPYC-Rome-v3'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='EPYC-v3'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='EPYC-v4'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='GraniteRapids'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-fp16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-int8'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-tile'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-fp16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fbsdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrc'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fzrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='mcdt-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pbrsb-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='prefetchiti'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='psdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='serialize'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xfd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='GraniteRapids-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-fp16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-int8'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-tile'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-fp16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fbsdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrc'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fzrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='mcdt-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pbrsb-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='prefetchiti'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='psdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='serialize'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xfd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='GraniteRapids-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-fp16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-int8'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-tile'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx10'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx10-128'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx10-256'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx10-512'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-fp16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='cldemote'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fbsdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrc'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fzrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='mcdt-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdir64b'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdiri'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pbrsb-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='prefetchiti'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='psdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='serialize'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ss'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xfd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Haswell'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Haswell-IBRS'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Haswell-noTSX'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Haswell-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Haswell-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Haswell-v3'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Haswell-v4'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Icelake-Server'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Icelake-Server-noTSX'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Icelake-Server-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Icelake-Server-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Icelake-Server-v3'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Icelake-Server-v4'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Icelake-Server-v5'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Icelake-Server-v6'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Icelake-Server-v7'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='IvyBridge'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='IvyBridge-IBRS'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='IvyBridge-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='IvyBridge-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='KnightsMill'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-4fmaps'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-4vnniw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512er'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512pf'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ss'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='KnightsMill-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-4fmaps'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-4vnniw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512er'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512pf'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ss'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Opteron_G4'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fma4'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xop'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Opteron_G4-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fma4'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xop'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Opteron_G5'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fma4'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='tbm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xop'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Opteron_G5-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fma4'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='tbm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xop'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='SapphireRapids'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-int8'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-tile'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-fp16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrc'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fzrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='serialize'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xfd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='SapphireRapids-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-int8'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-tile'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-fp16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrc'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fzrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='serialize'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xfd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='SapphireRapids-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-int8'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-tile'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-fp16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fbsdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrc'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fzrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='psdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='serialize'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xfd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='SapphireRapids-v3'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-int8'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-tile'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-fp16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='cldemote'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fbsdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrc'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fzrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdir64b'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdiri'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='psdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='serialize'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ss'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xfd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='SierraForest'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-ne-convert'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-vnni-int8'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='cmpccxadd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fbsdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='mcdt-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pbrsb-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='psdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='serialize'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='SierraForest-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-ne-convert'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-vnni-int8'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='cmpccxadd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fbsdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='mcdt-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pbrsb-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='psdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='serialize'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Client'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Client-IBRS'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Client-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Client-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Client-v3'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Client-v4'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Server'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Server-IBRS'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Server-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Server-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Server-v3'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Server-v4'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Server-v5'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Snowridge'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='cldemote'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='core-capability'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdir64b'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdiri'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='mpx'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='split-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Snowridge-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='cldemote'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='core-capability'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdir64b'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdiri'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='mpx'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='split-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Snowridge-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='cldemote'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='core-capability'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdir64b'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdiri'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='split-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Snowridge-v3'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='cldemote'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='core-capability'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdir64b'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdiri'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='split-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Snowridge-v4'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='cldemote'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdir64b'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdiri'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='athlon'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='3dnow'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='3dnowext'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='athlon-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='3dnow'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='3dnowext'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='core2duo'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ss'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='core2duo-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ss'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='coreduo'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ss'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='coreduo-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ss'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='n270'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ss'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='n270-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ss'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='phenom'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='3dnow'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='3dnowext'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='phenom-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='3dnow'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='3dnowext'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </mode>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   </cpu>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <memoryBacking supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <enum name='sourceType'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <value>file</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <value>anonymous</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <value>memfd</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   </memoryBacking>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <devices>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <disk supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='diskDevice'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>disk</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>cdrom</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>floppy</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>lun</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='bus'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>fdc</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>scsi</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>virtio</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>usb</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>sata</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='model'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>virtio</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>virtio-transitional</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>virtio-non-transitional</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </disk>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <graphics supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='type'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>vnc</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>egl-headless</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>dbus</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </graphics>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <video supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='modelType'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>vga</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>cirrus</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>virtio</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>none</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>bochs</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>ramfb</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </video>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <hostdev supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='mode'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>subsystem</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='startupPolicy'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>default</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>mandatory</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>requisite</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>optional</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='subsysType'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>usb</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>pci</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>scsi</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='capsType'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='pciBackend'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </hostdev>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <rng supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='model'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>virtio</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>virtio-transitional</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>virtio-non-transitional</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='backendModel'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>random</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>egd</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>builtin</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </rng>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <filesystem supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='driverType'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>path</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>handle</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>virtiofs</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </filesystem>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <tpm supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='model'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>tpm-tis</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>tpm-crb</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='backendModel'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>emulator</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>external</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='backendVersion'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>2.0</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </tpm>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <redirdev supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='bus'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>usb</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </redirdev>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <channel supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='type'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>pty</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>unix</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </channel>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <crypto supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='model'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='type'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>qemu</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='backendModel'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>builtin</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </crypto>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <interface supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='backendType'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>default</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>passt</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </interface>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <panic supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='model'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>isa</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>hyperv</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </panic>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <console supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='type'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>null</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>vc</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>pty</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>dev</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>file</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>pipe</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>stdio</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>udp</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>tcp</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>unix</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>qemu-vdagent</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>dbus</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </console>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   </devices>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <features>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <gic supported='no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <vmcoreinfo supported='yes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <genid supported='yes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <backingStoreInput supported='yes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <backup supported='yes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <async-teardown supported='yes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <ps2 supported='yes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <sev supported='no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <sgx supported='no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <hyperv supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='features'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>relaxed</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>vapic</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>spinlocks</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>vpindex</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>runtime</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>synic</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>stimer</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>reset</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>vendor_id</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>frequencies</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>reenlightenment</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>tlbflush</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>ipi</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>avic</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>emsr_bitmap</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>xmm_input</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <defaults>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <spinlocks>4095</spinlocks>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <stimer_direct>on</stimer_direct>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <tlbflush_direct>on</tlbflush_direct>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <tlbflush_extended>on</tlbflush_extended>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </defaults>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </hyperv>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <launchSecurity supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='sectype'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>tdx</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </launchSecurity>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   </features>
Nov 29 06:41:34 compute-0 nova_compute[187185]: </domainCapabilities>
Nov 29 06:41:34 compute-0 nova_compute[187185]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.353 187189 DEBUG nova.virt.libvirt.volume.mount [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.362 187189 DEBUG nova.virt.libvirt.host [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 29 06:41:34 compute-0 nova_compute[187185]: <domainCapabilities>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <path>/usr/libexec/qemu-kvm</path>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <domain>kvm</domain>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <arch>x86_64</arch>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <vcpu max='240'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <iothreads supported='yes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <os supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <enum name='firmware'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <loader supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='type'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>rom</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>pflash</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='readonly'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>yes</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>no</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='secure'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>no</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </loader>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   </os>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <cpu>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <mode name='host-passthrough' supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='hostPassthroughMigratable'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>on</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>off</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </mode>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <mode name='maximum' supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='maximumMigratable'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>on</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>off</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </mode>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <mode name='host-model' supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <vendor>AMD</vendor>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='x2apic'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='tsc-deadline'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='hypervisor'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='tsc_adjust'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='spec-ctrl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='stibp'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='ssbd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='cmp_legacy'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='overflow-recov'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='succor'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='ibrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='amd-ssbd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='virt-ssbd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='lbrv'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='tsc-scale'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='vmcb-clean'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='flushbyasid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='pause-filter'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='pfthreshold'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='svme-addr-chk'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <feature policy='disable' name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </mode>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <mode name='custom' supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Broadwell'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Broadwell-IBRS'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Broadwell-noTSX'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Broadwell-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Broadwell-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Broadwell-v3'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Broadwell-v4'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Cascadelake-Server'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Cascadelake-Server-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Cascadelake-Server-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Cascadelake-Server-v3'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Cascadelake-Server-v4'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Cascadelake-Server-v5'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Cooperlake'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Cooperlake-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Cooperlake-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Denverton'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='mpx'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Denverton-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='mpx'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Denverton-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Denverton-v3'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Dhyana-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='EPYC-Genoa'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amd-psfd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='auto-ibrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='no-nested-data-bp'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='null-sel-clr-base'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='stibp-always-on'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='EPYC-Genoa-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amd-psfd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='auto-ibrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='no-nested-data-bp'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='null-sel-clr-base'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='stibp-always-on'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='EPYC-Milan'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='EPYC-Milan-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='EPYC-Milan-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amd-psfd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='no-nested-data-bp'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='null-sel-clr-base'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='stibp-always-on'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='EPYC-Rome'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='EPYC-Rome-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='EPYC-Rome-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='EPYC-Rome-v3'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='EPYC-v3'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='EPYC-v4'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='GraniteRapids'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-fp16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-int8'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-tile'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-fp16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fbsdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrc'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fzrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='mcdt-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pbrsb-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='prefetchiti'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='psdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='serialize'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xfd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='GraniteRapids-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-fp16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-int8'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-tile'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-fp16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fbsdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrc'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fzrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='mcdt-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pbrsb-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='prefetchiti'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='psdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='serialize'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xfd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='GraniteRapids-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-fp16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-int8'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-tile'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx10'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx10-128'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx10-256'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx10-512'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-fp16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='cldemote'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fbsdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrc'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fzrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='mcdt-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdir64b'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdiri'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pbrsb-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='prefetchiti'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='psdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='serialize'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ss'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xfd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Haswell'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Haswell-IBRS'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Haswell-noTSX'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Haswell-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Haswell-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Haswell-v3'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Haswell-v4'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Icelake-Server'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Icelake-Server-noTSX'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Icelake-Server-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Icelake-Server-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Icelake-Server-v3'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Icelake-Server-v4'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Icelake-Server-v5'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Icelake-Server-v6'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Icelake-Server-v7'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='IvyBridge'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='IvyBridge-IBRS'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='IvyBridge-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='IvyBridge-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='KnightsMill'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-4fmaps'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-4vnniw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512er'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512pf'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ss'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='KnightsMill-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-4fmaps'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-4vnniw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512er'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512pf'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ss'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Opteron_G4'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fma4'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xop'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Opteron_G4-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fma4'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xop'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Opteron_G5'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fma4'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='tbm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xop'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Opteron_G5-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fma4'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='tbm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xop'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='SapphireRapids'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-int8'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-tile'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-fp16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrc'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fzrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='serialize'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xfd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='SapphireRapids-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-int8'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-tile'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-fp16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrc'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fzrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='serialize'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xfd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='SapphireRapids-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-int8'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-tile'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-fp16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fbsdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrc'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fzrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='psdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='serialize'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xfd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='SapphireRapids-v3'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-int8'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='amx-tile'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-bf16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-fp16'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512-vpopcntdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bitalg'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vbmi2'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='cldemote'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fbsdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrc'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fzrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='la57'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdir64b'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdiri'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='psdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='serialize'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ss'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='taa-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='tsx-ldtrk'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xfd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='SierraForest'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-ne-convert'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-vnni-int8'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='cmpccxadd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fbsdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='mcdt-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pbrsb-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='psdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='serialize'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='SierraForest-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-ifma'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-ne-convert'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-vnni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx-vnni-int8'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='bus-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='cmpccxadd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fbsdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='fsrs'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ibrs-all'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='mcdt-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pbrsb-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='psdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='sbdr-ssdp-no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='serialize'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vaes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='vpclmulqdq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Client'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Client-IBRS'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Client-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Client-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Client-v3'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Client-v4'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Server'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Server-IBRS'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Server-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Server-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='hle'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='rtm'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Server-v3'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Server-v4'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Skylake-Server-v5'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512bw'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512cd'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512dq'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512f'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='avx512vl'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='invpcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pcid'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='pku'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Snowridge'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='cldemote'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='core-capability'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdir64b'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdiri'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='mpx'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='split-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Snowridge-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='cldemote'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='core-capability'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdir64b'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdiri'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='mpx'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='split-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Snowridge-v2'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='cldemote'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='core-capability'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdir64b'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdiri'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='split-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Snowridge-v3'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='cldemote'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='core-capability'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdir64b'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdiri'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='split-lock-detect'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='Snowridge-v4'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='cldemote'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='erms'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='gfni'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdir64b'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='movdiri'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='xsaves'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='athlon'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='3dnow'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='3dnowext'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='athlon-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='3dnow'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='3dnowext'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='core2duo'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ss'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='core2duo-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ss'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='coreduo'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ss'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='coreduo-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ss'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='n270'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ss'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='n270-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='ss'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='phenom'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='3dnow'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='3dnowext'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <blockers model='phenom-v1'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='3dnow'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <feature name='3dnowext'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </blockers>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </mode>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   </cpu>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <memoryBacking supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <enum name='sourceType'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <value>file</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <value>anonymous</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <value>memfd</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   </memoryBacking>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <devices>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <disk supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='diskDevice'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>disk</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>cdrom</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>floppy</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>lun</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='bus'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>ide</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>fdc</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>scsi</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>virtio</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>usb</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>sata</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='model'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>virtio</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>virtio-transitional</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>virtio-non-transitional</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </disk>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <graphics supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='type'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>vnc</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>egl-headless</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>dbus</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </graphics>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <video supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='modelType'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>vga</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>cirrus</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>virtio</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>none</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>bochs</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>ramfb</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </video>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <hostdev supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='mode'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>subsystem</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='startupPolicy'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>default</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>mandatory</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>requisite</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>optional</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='subsysType'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>usb</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>pci</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>scsi</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='capsType'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='pciBackend'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </hostdev>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <rng supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='model'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>virtio</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>virtio-transitional</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>virtio-non-transitional</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='backendModel'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>random</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>egd</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>builtin</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </rng>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <filesystem supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='driverType'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>path</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>handle</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>virtiofs</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </filesystem>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <tpm supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='model'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>tpm-tis</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>tpm-crb</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='backendModel'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>emulator</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>external</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='backendVersion'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>2.0</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </tpm>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <redirdev supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='bus'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>usb</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </redirdev>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <channel supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='type'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>pty</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>unix</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </channel>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <crypto supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='model'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='type'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>qemu</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='backendModel'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>builtin</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </crypto>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <interface supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='backendType'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>default</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>passt</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </interface>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <panic supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='model'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>isa</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>hyperv</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </panic>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <console supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='type'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>null</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>vc</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>pty</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>dev</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>file</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>pipe</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>stdio</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>udp</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>tcp</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>unix</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>qemu-vdagent</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>dbus</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </console>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   </devices>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <features>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <gic supported='no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <vmcoreinfo supported='yes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <genid supported='yes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <backingStoreInput supported='yes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <backup supported='yes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <async-teardown supported='yes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <ps2 supported='yes'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <sev supported='no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <sgx supported='no'/>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <hyperv supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='features'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>relaxed</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>vapic</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>spinlocks</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>vpindex</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>runtime</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>synic</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>stimer</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>reset</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>vendor_id</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>frequencies</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>reenlightenment</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>tlbflush</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>ipi</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>avic</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>emsr_bitmap</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>xmm_input</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <defaults>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <spinlocks>4095</spinlocks>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <stimer_direct>on</stimer_direct>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <tlbflush_direct>on</tlbflush_direct>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <tlbflush_extended>on</tlbflush_extended>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </defaults>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </hyperv>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     <launchSecurity supported='yes'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       <enum name='sectype'>
Nov 29 06:41:34 compute-0 nova_compute[187185]:         <value>tdx</value>
Nov 29 06:41:34 compute-0 nova_compute[187185]:       </enum>
Nov 29 06:41:34 compute-0 nova_compute[187185]:     </launchSecurity>
Nov 29 06:41:34 compute-0 nova_compute[187185]:   </features>
Nov 29 06:41:34 compute-0 nova_compute[187185]: </domainCapabilities>
Nov 29 06:41:34 compute-0 nova_compute[187185]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.425 187189 DEBUG nova.virt.libvirt.host [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.426 187189 INFO nova.virt.libvirt.host [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Secure Boot support detected
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.428 187189 INFO nova.virt.libvirt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.428 187189 INFO nova.virt.libvirt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.436 187189 DEBUG nova.virt.libvirt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] cpu compare xml: <cpu match="exact">
Nov 29 06:41:34 compute-0 nova_compute[187185]:   <model>Nehalem</model>
Nov 29 06:41:34 compute-0 nova_compute[187185]: </cpu>
Nov 29 06:41:34 compute-0 nova_compute[187185]:  _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.439 187189 DEBUG nova.virt.libvirt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.469 187189 INFO nova.virt.node [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Determined node identity 4e39a026-df39-4e20-874a-dbb5a40df044 from /var/lib/nova/compute_id
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.528 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Verified node 4e39a026-df39-4e20-874a-dbb5a40df044 matches my host compute-0.ctlplane.example.com _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.567 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.712 187189 DEBUG oslo_concurrency.lockutils [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.713 187189 DEBUG oslo_concurrency.lockutils [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.713 187189 DEBUG oslo_concurrency.lockutils [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.713 187189 DEBUG nova.compute.resource_tracker [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 06:41:34 compute-0 podman[187484]: 2025-11-29 06:41:34.791153114 +0000 UTC m=+0.059042163 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.864 187189 WARNING nova.virt.libvirt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.865 187189 DEBUG nova.compute.resource_tracker [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6221MB free_disk=73.5413589477539GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.865 187189 DEBUG oslo_concurrency.lockutils [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:41:34 compute-0 nova_compute[187185]: 2025-11-29 06:41:34.865 187189 DEBUG oslo_concurrency.lockutils [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:41:35 compute-0 nova_compute[187185]: 2025-11-29 06:41:35.080 187189 DEBUG nova.compute.resource_tracker [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 06:41:35 compute-0 nova_compute[187185]: 2025-11-29 06:41:35.081 187189 DEBUG nova.compute.resource_tracker [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 06:41:35 compute-0 nova_compute[187185]: 2025-11-29 06:41:35.216 187189 DEBUG nova.scheduler.client.report [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Refreshing inventories for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 29 06:41:35 compute-0 nova_compute[187185]: 2025-11-29 06:41:35.328 187189 DEBUG nova.scheduler.client.report [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Updating ProviderTree inventory for provider 4e39a026-df39-4e20-874a-dbb5a40df044 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 29 06:41:35 compute-0 nova_compute[187185]: 2025-11-29 06:41:35.328 187189 DEBUG nova.compute.provider_tree [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Updating inventory in ProviderTree for provider 4e39a026-df39-4e20-874a-dbb5a40df044 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 29 06:41:35 compute-0 nova_compute[187185]: 2025-11-29 06:41:35.352 187189 DEBUG nova.scheduler.client.report [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Refreshing aggregate associations for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 29 06:41:35 compute-0 nova_compute[187185]: 2025-11-29 06:41:35.423 187189 DEBUG nova.scheduler.client.report [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Refreshing trait associations for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044, traits: HW_CPU_X86_SSE,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_TRUSTED_CERTS,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 29 06:41:35 compute-0 nova_compute[187185]: 2025-11-29 06:41:35.448 187189 DEBUG nova.virt.libvirt.host [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Nov 29 06:41:35 compute-0 nova_compute[187185]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Nov 29 06:41:35 compute-0 nova_compute[187185]: 2025-11-29 06:41:35.449 187189 INFO nova.virt.libvirt.host [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] kernel doesn't support AMD SEV
Nov 29 06:41:35 compute-0 nova_compute[187185]: 2025-11-29 06:41:35.450 187189 DEBUG nova.compute.provider_tree [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:41:35 compute-0 nova_compute[187185]: 2025-11-29 06:41:35.450 187189 DEBUG nova.virt.libvirt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 06:41:35 compute-0 nova_compute[187185]: 2025-11-29 06:41:35.452 187189 DEBUG nova.virt.libvirt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Libvirt baseline CPU <cpu>
Nov 29 06:41:35 compute-0 nova_compute[187185]:   <arch>x86_64</arch>
Nov 29 06:41:35 compute-0 nova_compute[187185]:   <model>Nehalem</model>
Nov 29 06:41:35 compute-0 nova_compute[187185]:   <vendor>AMD</vendor>
Nov 29 06:41:35 compute-0 nova_compute[187185]:   <topology sockets="8" cores="1" threads="1"/>
Nov 29 06:41:35 compute-0 nova_compute[187185]: </cpu>
Nov 29 06:41:35 compute-0 nova_compute[187185]:  _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537
Nov 29 06:41:35 compute-0 nova_compute[187185]: 2025-11-29 06:41:35.473 187189 DEBUG nova.scheduler.client.report [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:41:35 compute-0 nova_compute[187185]: 2025-11-29 06:41:35.537 187189 DEBUG nova.compute.resource_tracker [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 06:41:35 compute-0 nova_compute[187185]: 2025-11-29 06:41:35.537 187189 DEBUG oslo_concurrency.lockutils [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.672s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:41:35 compute-0 nova_compute[187185]: 2025-11-29 06:41:35.537 187189 DEBUG nova.service [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Nov 29 06:41:35 compute-0 nova_compute[187185]: 2025-11-29 06:41:35.572 187189 DEBUG nova.service [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Nov 29 06:41:35 compute-0 nova_compute[187185]: 2025-11-29 06:41:35.573 187189 DEBUG nova.servicegroup.drivers.db [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Nov 29 06:41:40 compute-0 sshd-session[187503]: Accepted publickey for zuul from 192.168.122.30 port 51330 ssh2: ECDSA SHA256:Ey+6YDlYfkE2dRe/gjWhHvvrJHee4xZo2Q1JojEuVBA
Nov 29 06:41:40 compute-0 systemd-logind[788]: New session 25 of user zuul.
Nov 29 06:41:40 compute-0 systemd[1]: Started Session 25 of User zuul.
Nov 29 06:41:40 compute-0 sshd-session[187503]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 06:41:41 compute-0 sshd-session[187559]: Invalid user aa from 179.125.24.202 port 49150
Nov 29 06:41:41 compute-0 python3.9[187658]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 06:41:41 compute-0 sshd-session[187559]: Received disconnect from 179.125.24.202 port 49150:11: Bye Bye [preauth]
Nov 29 06:41:41 compute-0 sshd-session[187559]: Disconnected from invalid user aa 179.125.24.202 port 49150 [preauth]
Nov 29 06:41:42 compute-0 sudo[187828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iuvikmyihgsnhjqsusqslbufyvhqwujb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398502.119301-72-216748561383574/AnsiballZ_systemd_service.py'
Nov 29 06:41:42 compute-0 sudo[187828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:42 compute-0 podman[187786]: 2025-11-29 06:41:42.859268684 +0000 UTC m=+0.114234490 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible)
Nov 29 06:41:43 compute-0 python3.9[187835]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 06:41:43 compute-0 systemd[1]: Reloading.
Nov 29 06:41:43 compute-0 systemd-rc-local-generator[187870]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:41:43 compute-0 systemd-sysv-generator[187874]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:41:43 compute-0 sudo[187828]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:44 compute-0 python3.9[188027]: ansible-ansible.builtin.service_facts Invoked
Nov 29 06:41:44 compute-0 network[188044]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 06:41:44 compute-0 network[188045]: 'network-scripts' will be removed from distribution in near future.
Nov 29 06:41:44 compute-0 network[188046]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 06:41:49 compute-0 podman[188193]: 2025-11-29 06:41:49.808035779 +0000 UTC m=+0.073577904 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd)
Nov 29 06:41:50 compute-0 sudo[188338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pibjdmrkuneeczziowarebppobrciywe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398509.8259904-129-100856120708474/AnsiballZ_systemd_service.py'
Nov 29 06:41:50 compute-0 sudo[188338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:50 compute-0 python3.9[188340]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:41:50 compute-0 sudo[188338]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:51 compute-0 sudo[188491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plqqsgxhtxkchrydwlgskbxvpwebtvhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398510.893516-159-181422925310638/AnsiballZ_file.py'
Nov 29 06:41:51 compute-0 sudo[188491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:51 compute-0 python3.9[188493]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:41:51 compute-0 sudo[188491]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:51 compute-0 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 06:41:51 compute-0 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 06:41:52 compute-0 sudo[188644]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrsnffsqkmjcesaidupuaxamxtbzaklz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398511.8084145-183-266377255183682/AnsiballZ_file.py'
Nov 29 06:41:52 compute-0 sudo[188644]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:52 compute-0 python3.9[188646]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:41:52 compute-0 sudo[188644]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:53 compute-0 sudo[188796]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avjnulvevmjikdmekwqgwygteixikdcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398512.7043366-210-192742443761616/AnsiballZ_command.py'
Nov 29 06:41:53 compute-0 sudo[188796]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:53 compute-0 python3.9[188798]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:41:53 compute-0 sudo[188796]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:54 compute-0 python3.9[188950]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 29 06:41:55 compute-0 sudo[189100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkhtwitsdqwbgzrfwpfwaqawwinpnqzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398514.8575416-264-54483951430291/AnsiballZ_systemd_service.py'
Nov 29 06:41:55 compute-0 sudo[189100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:55 compute-0 python3.9[189102]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 06:41:55 compute-0 systemd[1]: Reloading.
Nov 29 06:41:55 compute-0 systemd-rc-local-generator[189130]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:41:55 compute-0 systemd-sysv-generator[189133]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:41:55 compute-0 sudo[189100]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:56 compute-0 sudo[189287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aonnkuqietvvhcqabfcaelrimahgcysv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398516.1750844-288-7218365827867/AnsiballZ_command.py'
Nov 29 06:41:56 compute-0 sudo[189287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:56 compute-0 python3.9[189289]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:41:56 compute-0 sudo[189287]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:57 compute-0 sudo[189440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-linbpngtgfzjwvqgwyxnvbqukbavdicn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398517.0592403-315-180597817351812/AnsiballZ_file.py'
Nov 29 06:41:57 compute-0 sudo[189440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:41:57 compute-0 python3.9[189442]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:41:57 compute-0 sudo[189440]: pam_unix(sudo:session): session closed for user root
Nov 29 06:41:58 compute-0 python3.9[189592]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:41:59 compute-0 python3.9[189744]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:42:00 compute-0 python3.9[189865]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398518.865932-363-146080651908027/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=e86e0e43000ce9ccfe5aefbf8e8f2e3d15d05584 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:42:00 compute-0 sudo[190015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lodylrwrzkaazfudkrhygkpsaxnowpoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398520.3635554-408-220204791570976/AnsiballZ_group.py'
Nov 29 06:42:00 compute-0 sudo[190015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:01 compute-0 python3.9[190017]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Nov 29 06:42:01 compute-0 sudo[190015]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:02 compute-0 sudo[190167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycngzvaddlrbjtncafitxfdetyktemis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398521.8760567-441-239768593623617/AnsiballZ_getent.py'
Nov 29 06:42:02 compute-0 sudo[190167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:02 compute-0 python3.9[190169]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Nov 29 06:42:02 compute-0 sudo[190167]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:03 compute-0 sudo[190320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drwsptzhfwhccphhyazhwrzvicsatokf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398522.8093371-465-278970522338747/AnsiballZ_group.py'
Nov 29 06:42:03 compute-0 sudo[190320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:03 compute-0 python3.9[190322]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 29 06:42:03 compute-0 groupadd[190323]: group added to /etc/group: name=ceilometer, GID=42405
Nov 29 06:42:03 compute-0 groupadd[190323]: group added to /etc/gshadow: name=ceilometer
Nov 29 06:42:03 compute-0 groupadd[190323]: new group: name=ceilometer, GID=42405
Nov 29 06:42:03 compute-0 sudo[190320]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:04 compute-0 sudo[190478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvhnttgxsrtqilieatpthnjndvtzwlsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398523.6206005-489-153424928867968/AnsiballZ_user.py'
Nov 29 06:42:04 compute-0 sudo[190478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:04 compute-0 python3.9[190480]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 29 06:42:04 compute-0 useradd[190482]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/0
Nov 29 06:42:04 compute-0 useradd[190482]: add 'ceilometer' to group 'libvirt'
Nov 29 06:42:04 compute-0 useradd[190482]: add 'ceilometer' to shadow group 'libvirt'
Nov 29 06:42:04 compute-0 sudo[190478]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:05 compute-0 podman[190565]: 2025-11-29 06:42:05.819621034 +0000 UTC m=+0.071095156 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Nov 29 06:42:06 compute-0 python3.9[190658]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:42:06 compute-0 python3.9[190779]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764398525.6164627-567-35381116904062/.source.conf _original_basename=ceilometer.conf follow=False checksum=f74f01c63e6cdeca5458ef9aff2a1db5d6a4e4b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:42:07 compute-0 python3.9[190931]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:42:07 compute-0 python3.9[191052]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764398526.7551515-567-133615054833535/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:42:08 compute-0 sshd-session[190900]: Invalid user admin from 45.202.211.6 port 58830
Nov 29 06:42:08 compute-0 sshd-session[190900]: Received disconnect from 45.202.211.6 port 58830:11: Bye Bye [preauth]
Nov 29 06:42:08 compute-0 sshd-session[190900]: Disconnected from invalid user admin 45.202.211.6 port 58830 [preauth]
Nov 29 06:42:08 compute-0 python3.9[191202]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:42:09 compute-0 python3.9[191323]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764398527.8922384-567-119368237691743/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:42:10 compute-0 python3.9[191473]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:42:10 compute-0 python3.9[191625]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:42:11 compute-0 python3.9[191777]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:42:12 compute-0 python3.9[191898]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398531.1577723-744-209726207251870/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:42:12 compute-0 python3.9[192048]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:42:12 compute-0 podman[192098]: 2025-11-29 06:42:12.990876943 +0000 UTC m=+0.081572862 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller)
Nov 29 06:42:13 compute-0 python3.9[192139]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:42:13 compute-0 python3.9[192300]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:42:14 compute-0 python3.9[192421]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398533.2857306-744-67022152150266/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=17453a32c9d181134878b3e453cb84c3cd9bd67d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:42:15 compute-0 python3.9[192571]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:42:15 compute-0 python3.9[192692]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398534.6981442-744-15903832474086/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:42:16 compute-0 python3.9[192842]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:42:16 compute-0 python3.9[192963]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398535.9423318-744-142197978269197/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:42:17 compute-0 python3.9[193113]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:42:18 compute-0 python3.9[193234]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398537.0519683-744-260243901907417/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=6e4982940d2bfae88404914dfaf72552f6356d81 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:42:18 compute-0 python3.9[193384]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:42:19 compute-0 python3.9[193505]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398538.2283428-744-104164714495577/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:42:19 compute-0 python3.9[193655]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:42:20 compute-0 podman[193750]: 2025-11-29 06:42:20.184817067 +0000 UTC m=+0.056002885 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, io.buildah.version=1.41.3)
Nov 29 06:42:20 compute-0 python3.9[193793]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398539.321751-744-3816244423564/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=d474f1e4c3dbd24762592c51cbe5311f0a037273 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:42:20 compute-0 python3.9[193946]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:42:21 compute-0 python3.9[194067]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398540.4919133-744-28444312045659/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=2b6bd0891e609bf38a73282f42888052b750bed6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:42:22 compute-0 python3.9[194217]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:42:22 compute-0 python3.9[194338]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398541.839215-744-18501510443248/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=e342121a88f67e2bae7ebc05d1e6d350470198a5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:42:22 compute-0 auditd[706]: Audit daemon rotating log files
Nov 29 06:42:23 compute-0 python3.9[194488]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:42:24 compute-0 python3.9[194609]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398542.929918-744-266837145370688/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:42:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:42:24.795 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:42:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:42:24.796 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:42:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:42:24.796 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:42:26 compute-0 python3.9[194759]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:42:26 compute-0 python3.9[194835]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/node_exporter.yaml _original_basename=node_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/node_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:42:27 compute-0 python3.9[194985]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:42:27 compute-0 python3.9[195061]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml _original_basename=podman_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/podman_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:42:28 compute-0 python3.9[195211]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:42:28 compute-0 nova_compute[187185]: 2025-11-29 06:42:28.574 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:42:28 compute-0 nova_compute[187185]: 2025-11-29 06:42:28.658 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:42:28 compute-0 python3.9[195287]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml _original_basename=ceilometer_prom_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:42:29 compute-0 sudo[195437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbeclswfnmzhicusvohmxiyjgwoktzug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398549.1934648-1311-190051756141016/AnsiballZ_file.py'
Nov 29 06:42:29 compute-0 sudo[195437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:29 compute-0 python3.9[195439]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:42:29 compute-0 sudo[195437]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:30 compute-0 sudo[195589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpetmmlsfoofclrqfwiryyikqpzhyncp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398550.0032463-1335-270747911728759/AnsiballZ_file.py'
Nov 29 06:42:30 compute-0 sudo[195589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:30 compute-0 python3.9[195591]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:42:30 compute-0 sudo[195589]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:30 compute-0 sudo[195743]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-friosrtibjwdonhhuxlbeavumedtjipa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398550.6839633-1359-205730843371060/AnsiballZ_file.py'
Nov 29 06:42:30 compute-0 sudo[195743]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:31 compute-0 python3.9[195745]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:42:31 compute-0 sudo[195743]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:31 compute-0 sshd-session[195592]: Invalid user devuser from 1.214.197.163 port 36242
Nov 29 06:42:31 compute-0 sudo[195895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebcytoqkvrxutqqoarvghmczfbkcwete ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398551.4354947-1383-27791125566358/AnsiballZ_systemd_service.py'
Nov 29 06:42:31 compute-0 sudo[195895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:31 compute-0 sshd-session[195592]: Received disconnect from 1.214.197.163 port 36242:11: Bye Bye [preauth]
Nov 29 06:42:31 compute-0 sshd-session[195592]: Disconnected from invalid user devuser 1.214.197.163 port 36242 [preauth]
Nov 29 06:42:32 compute-0 python3.9[195897]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:42:32 compute-0 systemd[1]: Reloading.
Nov 29 06:42:32 compute-0 systemd-rc-local-generator[195926]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:42:32 compute-0 systemd-sysv-generator[195931]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:42:32 compute-0 systemd[1]: Listening on Podman API Socket.
Nov 29 06:42:32 compute-0 sudo[195895]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:33 compute-0 nova_compute[187185]: 2025-11-29 06:42:33.318 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:42:33 compute-0 nova_compute[187185]: 2025-11-29 06:42:33.318 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:42:33 compute-0 nova_compute[187185]: 2025-11-29 06:42:33.319 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 06:42:33 compute-0 nova_compute[187185]: 2025-11-29 06:42:33.319 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 06:42:33 compute-0 nova_compute[187185]: 2025-11-29 06:42:33.337 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 06:42:33 compute-0 nova_compute[187185]: 2025-11-29 06:42:33.338 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:42:33 compute-0 nova_compute[187185]: 2025-11-29 06:42:33.338 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:42:33 compute-0 nova_compute[187185]: 2025-11-29 06:42:33.338 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:42:33 compute-0 nova_compute[187185]: 2025-11-29 06:42:33.339 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:42:33 compute-0 nova_compute[187185]: 2025-11-29 06:42:33.339 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:42:33 compute-0 nova_compute[187185]: 2025-11-29 06:42:33.339 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:42:33 compute-0 nova_compute[187185]: 2025-11-29 06:42:33.339 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 06:42:33 compute-0 nova_compute[187185]: 2025-11-29 06:42:33.339 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:42:33 compute-0 nova_compute[187185]: 2025-11-29 06:42:33.362 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:42:33 compute-0 nova_compute[187185]: 2025-11-29 06:42:33.363 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:42:33 compute-0 nova_compute[187185]: 2025-11-29 06:42:33.363 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:42:33 compute-0 nova_compute[187185]: 2025-11-29 06:42:33.363 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 06:42:33 compute-0 nova_compute[187185]: 2025-11-29 06:42:33.521 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:42:33 compute-0 nova_compute[187185]: 2025-11-29 06:42:33.522 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6166MB free_disk=73.54471588134766GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 06:42:33 compute-0 nova_compute[187185]: 2025-11-29 06:42:33.522 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:42:33 compute-0 nova_compute[187185]: 2025-11-29 06:42:33.523 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:42:33 compute-0 nova_compute[187185]: 2025-11-29 06:42:33.626 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 06:42:33 compute-0 nova_compute[187185]: 2025-11-29 06:42:33.626 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 06:42:33 compute-0 nova_compute[187185]: 2025-11-29 06:42:33.646 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:42:33 compute-0 nova_compute[187185]: 2025-11-29 06:42:33.669 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:42:33 compute-0 nova_compute[187185]: 2025-11-29 06:42:33.670 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 06:42:33 compute-0 nova_compute[187185]: 2025-11-29 06:42:33.670 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:42:34 compute-0 sudo[196086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frtftmgddjniwhbmjbfwzrplhdvcqdrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398553.8698652-1410-66718115389603/AnsiballZ_stat.py'
Nov 29 06:42:34 compute-0 sudo[196086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:34 compute-0 python3.9[196088]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:42:34 compute-0 sudo[196086]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:34 compute-0 sudo[196209]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llvsunbpswwmtlbnsykfoptiuujqgkzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398553.8698652-1410-66718115389603/AnsiballZ_copy.py'
Nov 29 06:42:34 compute-0 sudo[196209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:34 compute-0 python3.9[196211]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398553.8698652-1410-66718115389603/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:42:34 compute-0 sudo[196209]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:35 compute-0 sudo[196285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghfuukzhoafzuaevpbdylhywfttxtxsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398553.8698652-1410-66718115389603/AnsiballZ_stat.py'
Nov 29 06:42:35 compute-0 sudo[196285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:35 compute-0 python3.9[196287]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:42:35 compute-0 sudo[196285]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:35 compute-0 sudo[196408]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylrclpgiowcgjjlpxbkldikqfoidayrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398553.8698652-1410-66718115389603/AnsiballZ_copy.py'
Nov 29 06:42:35 compute-0 sudo[196408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:35 compute-0 python3.9[196410]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398553.8698652-1410-66718115389603/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:42:35 compute-0 sudo[196408]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:35 compute-0 podman[196411]: 2025-11-29 06:42:35.929562421 +0000 UTC m=+0.058569487 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 29 06:42:36 compute-0 sudo[196578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thiuqldatalpiusgoiwhvnhltktrlvew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398556.361218-1494-234988546432779/AnsiballZ_container_config_data.py'
Nov 29 06:42:36 compute-0 sudo[196578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:37 compute-0 python3.9[196580]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=ceilometer_agent_compute.json debug=False
Nov 29 06:42:37 compute-0 sudo[196578]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:37 compute-0 sudo[196730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gotxqdfhecpkgqcqxopyebogpzhmaqbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398557.4208276-1521-62572820137601/AnsiballZ_container_config_hash.py'
Nov 29 06:42:37 compute-0 sudo[196730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:38 compute-0 python3.9[196732]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 29 06:42:38 compute-0 sudo[196730]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:39 compute-0 sudo[196882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frzhyqobfvcmgonxcqbugvesoajqcpdk ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764398558.5902145-1551-272464499634426/AnsiballZ_edpm_container_manage.py'
Nov 29 06:42:39 compute-0 sudo[196882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:39 compute-0 python3[196884]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=ceilometer_agent_compute.json log_base_path=/var/log/containers/stdouts debug=False
Nov 29 06:42:39 compute-0 podman[196922]: 2025-11-29 06:42:39.542489382 +0000 UTC m=+0.053059491 container create 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 29 06:42:39 compute-0 podman[196922]: 2025-11-29 06:42:39.515657728 +0000 UTC m=+0.026227857 image pull e6f07353639e492d8c9627d6d615ceeb47cb00ac4d14993b12e8023ee2aeee6f quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Nov 29 06:42:39 compute-0 python3[196884]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck compute --label config_id=edpm --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z --volume /var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start
Nov 29 06:42:39 compute-0 sudo[196882]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:40 compute-0 sudo[197109]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yunycptpdorlfvqsiikigmrzguckphvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398560.0024922-1575-207642926917597/AnsiballZ_stat.py'
Nov 29 06:42:40 compute-0 sudo[197109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:40 compute-0 python3.9[197111]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:42:40 compute-0 sudo[197109]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:41 compute-0 sudo[197263]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrdagpbnbzrzruqgfendfqthxenlsjca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398560.8493907-1602-47004931528260/AnsiballZ_file.py'
Nov 29 06:42:41 compute-0 sudo[197263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:41 compute-0 python3.9[197265]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:42:41 compute-0 sudo[197263]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:41 compute-0 sudo[197414]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-liwujnbteemblocncygucsrhiwesdyvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398561.3597226-1602-125576330630477/AnsiballZ_copy.py'
Nov 29 06:42:41 compute-0 sudo[197414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:41 compute-0 python3.9[197416]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764398561.3597226-1602-125576330630477/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:42:42 compute-0 sudo[197414]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:42 compute-0 sudo[197490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usfqiwktpotrtqzvzmerwrfdcwnyfcnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398561.3597226-1602-125576330630477/AnsiballZ_systemd.py'
Nov 29 06:42:42 compute-0 sudo[197490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:42 compute-0 python3.9[197492]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 06:42:42 compute-0 systemd[1]: Reloading.
Nov 29 06:42:42 compute-0 systemd-rc-local-generator[197518]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:42:42 compute-0 systemd-sysv-generator[197521]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:42:43 compute-0 sudo[197490]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:43 compute-0 podman[197527]: 2025-11-29 06:42:43.400337271 +0000 UTC m=+0.096673101 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Nov 29 06:42:43 compute-0 sudo[197627]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njmslnlyglinydmucexggtekyemwlyhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398561.3597226-1602-125576330630477/AnsiballZ_systemd.py'
Nov 29 06:42:43 compute-0 sudo[197627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:43 compute-0 python3.9[197629]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:42:43 compute-0 systemd[1]: Reloading.
Nov 29 06:42:43 compute-0 systemd-sysv-generator[197661]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:42:43 compute-0 systemd-rc-local-generator[197657]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:42:44 compute-0 systemd[1]: Starting ceilometer_agent_compute container...
Nov 29 06:42:44 compute-0 systemd[1]: Started libcrun container.
Nov 29 06:42:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ca42539d85f63fdfbeda8a8de0b27a696397d648223ab2168bbc43701594685/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 29 06:42:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ca42539d85f63fdfbeda8a8de0b27a696397d648223ab2168bbc43701594685/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Nov 29 06:42:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ca42539d85f63fdfbeda8a8de0b27a696397d648223ab2168bbc43701594685/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Nov 29 06:42:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ca42539d85f63fdfbeda8a8de0b27a696397d648223ab2168bbc43701594685/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Nov 29 06:42:44 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d.
Nov 29 06:42:44 compute-0 podman[197670]: 2025-11-29 06:42:44.344035523 +0000 UTC m=+0.131687068 container init 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 06:42:44 compute-0 ceilometer_agent_compute[197686]: + sudo -E kolla_set_configs
Nov 29 06:42:44 compute-0 podman[197670]: 2025-11-29 06:42:44.370712402 +0000 UTC m=+0.158363927 container start 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm)
Nov 29 06:42:44 compute-0 podman[197670]: ceilometer_agent_compute
Nov 29 06:42:44 compute-0 systemd[1]: Started ceilometer_agent_compute container.
Nov 29 06:42:44 compute-0 ceilometer_agent_compute[197686]: sudo: unable to send audit message: Operation not permitted
Nov 29 06:42:44 compute-0 sudo[197692]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 29 06:42:44 compute-0 sudo[197692]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 29 06:42:44 compute-0 sudo[197692]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Nov 29 06:42:44 compute-0 sudo[197627]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:44 compute-0 ceilometer_agent_compute[197686]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 06:42:44 compute-0 ceilometer_agent_compute[197686]: INFO:__main__:Validating config file
Nov 29 06:42:44 compute-0 ceilometer_agent_compute[197686]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 06:42:44 compute-0 ceilometer_agent_compute[197686]: INFO:__main__:Copying service configuration files
Nov 29 06:42:44 compute-0 ceilometer_agent_compute[197686]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Nov 29 06:42:44 compute-0 ceilometer_agent_compute[197686]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Nov 29 06:42:44 compute-0 ceilometer_agent_compute[197686]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Nov 29 06:42:44 compute-0 ceilometer_agent_compute[197686]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Nov 29 06:42:44 compute-0 ceilometer_agent_compute[197686]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Nov 29 06:42:44 compute-0 ceilometer_agent_compute[197686]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Nov 29 06:42:44 compute-0 ceilometer_agent_compute[197686]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 29 06:42:44 compute-0 ceilometer_agent_compute[197686]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 29 06:42:44 compute-0 ceilometer_agent_compute[197686]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 29 06:42:44 compute-0 ceilometer_agent_compute[197686]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 29 06:42:44 compute-0 ceilometer_agent_compute[197686]: INFO:__main__:Writing out command to execute
Nov 29 06:42:44 compute-0 podman[197693]: 2025-11-29 06:42:44.437319278 +0000 UTC m=+0.055108050 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:42:44 compute-0 sudo[197692]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:44 compute-0 ceilometer_agent_compute[197686]: ++ cat /run_command
Nov 29 06:42:44 compute-0 systemd[1]: 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d-5cbacc8572d90457.service: Main process exited, code=exited, status=1/FAILURE
Nov 29 06:42:44 compute-0 systemd[1]: 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d-5cbacc8572d90457.service: Failed with result 'exit-code'.
Nov 29 06:42:44 compute-0 ceilometer_agent_compute[197686]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 29 06:42:44 compute-0 ceilometer_agent_compute[197686]: + ARGS=
Nov 29 06:42:44 compute-0 ceilometer_agent_compute[197686]: + sudo kolla_copy_cacerts
Nov 29 06:42:44 compute-0 ceilometer_agent_compute[197686]: sudo: unable to send audit message: Operation not permitted
Nov 29 06:42:44 compute-0 sudo[197715]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Nov 29 06:42:44 compute-0 sudo[197715]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 29 06:42:44 compute-0 sudo[197715]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Nov 29 06:42:44 compute-0 sudo[197715]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:44 compute-0 ceilometer_agent_compute[197686]: + [[ ! -n '' ]]
Nov 29 06:42:44 compute-0 ceilometer_agent_compute[197686]: + . kolla_extend_start
Nov 29 06:42:44 compute-0 ceilometer_agent_compute[197686]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Nov 29 06:42:44 compute-0 ceilometer_agent_compute[197686]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 29 06:42:44 compute-0 ceilometer_agent_compute[197686]: + umask 0022
Nov 29 06:42:44 compute-0 ceilometer_agent_compute[197686]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Nov 29 06:42:45 compute-0 sudo[197867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xucfsfuyxhjphkmhqovelqrhlzisqaxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398564.84606-1674-81906060210208/AnsiballZ_systemd.py'
Nov 29 06:42:45 compute-0 sudo[197867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.384 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.384 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.384 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.384 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.385 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.385 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.385 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.385 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.385 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.385 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.385 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.385 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.385 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.385 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.386 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.386 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.386 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.386 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.386 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.386 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.386 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.386 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.386 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.386 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.386 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.387 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.387 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.387 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.387 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.387 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.387 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.387 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.387 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.387 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.387 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.387 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.387 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.387 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.388 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.388 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.388 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.388 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.388 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.388 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.388 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.388 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.388 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.388 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.388 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.388 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.389 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.389 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.389 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.389 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.389 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.389 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.389 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.389 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.389 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.389 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.389 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.390 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.390 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.390 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.390 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.390 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.390 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.390 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.390 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.390 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.390 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.390 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.391 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.391 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.391 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.391 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.391 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.391 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.391 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.391 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.391 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.391 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.391 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.391 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.391 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.392 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.392 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.392 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.392 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.392 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.392 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.392 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.392 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.392 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.392 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.393 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.393 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.393 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.393 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.393 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.393 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.393 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.393 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.393 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.393 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.393 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.394 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.394 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.394 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.394 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.394 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.394 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.394 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.394 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.394 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.394 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.395 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.395 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.395 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.395 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.395 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.395 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.395 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.395 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.395 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.395 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.396 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.396 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.396 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.396 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.396 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.396 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.396 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.397 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.397 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.397 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.397 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.397 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.397 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.397 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.397 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.397 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.397 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.398 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.398 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.398 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.398 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.398 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.398 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.398 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.398 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.398 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.398 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.398 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.398 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.399 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.399 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.399 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.399 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.399 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.399 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.399 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.399 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.399 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.399 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.417 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.418 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.419 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Nov 29 06:42:45 compute-0 python3.9[197869]: ansible-ansible.builtin.systemd Invoked with name=edpm_ceilometer_agent_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:42:45 compute-0 systemd[1]: Stopping ceilometer_agent_compute container...
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.520 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.528 2 INFO cotyledon._service_manager [-] Caught SIGTERM signal, graceful exiting of master process
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.612 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.612 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.612 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.612 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.612 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.612 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.613 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.613 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.613 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.613 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.613 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.613 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.613 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.613 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.614 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.614 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.614 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.614 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.614 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.614 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.614 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.614 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.615 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.615 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.615 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.615 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.615 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.615 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.615 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.615 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.615 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.616 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.616 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.616 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.616 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.616 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.616 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.616 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.616 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.616 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.617 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.617 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.617 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.617 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.617 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.617 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.617 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.617 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.617 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.618 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.618 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.618 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.618 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.618 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.618 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.618 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.618 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.618 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.618 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.619 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.619 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.619 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.619 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.619 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.619 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.619 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.619 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.619 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.619 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.620 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.620 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.620 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.620 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.620 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.620 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.620 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.620 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.620 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.620 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.620 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.620 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.621 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.621 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.621 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.621 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.621 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.621 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.621 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.621 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.621 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.621 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.622 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.622 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.622 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.622 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.622 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.622 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.622 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.622 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.622 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.622 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.623 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.623 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.623 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.623 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.623 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.623 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.623 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.623 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.623 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.624 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.624 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.624 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.624 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.624 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.624 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.624 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.624 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.624 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.624 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.625 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.625 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.625 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.625 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.625 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.625 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.625 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.625 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.625 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.625 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.625 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.626 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.626 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.626 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.626 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.626 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.626 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.626 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.626 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.626 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.626 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.626 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.626 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.627 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.627 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.627 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.627 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.627 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.627 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.627 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.627 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.627 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.627 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.627 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.628 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.628 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.628 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.628 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.628 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.628 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.628 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.628 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.628 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.628 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.628 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.629 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.629 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.629 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.629 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.629 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.629 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.629 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.629 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.629 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.629 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.629 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.629 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.629 2 DEBUG cotyledon._service_manager [-] Killing services with signal SIGTERM _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:304
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.630 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.630 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.630 2 DEBUG cotyledon._service_manager [-] Waiting services to terminate _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:308
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.630 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.630 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.630 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.630 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.630 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.630 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.630 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.630 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.630 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.630 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.631 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.631 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.631 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.631 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.631 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.631 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.631 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.631 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.631 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.631 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.631 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.632 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.632 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.632 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.632 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.632 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.632 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.632 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.632 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.632 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.632 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.632 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.633 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.633 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.633 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.633 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.633 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.633 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.633 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.633 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.634 12 INFO cotyledon._service [-] Caught SIGTERM signal, graceful exiting of service AgentManager(0) [12]
Nov 29 06:42:45 compute-0 ceilometer_agent_compute[197686]: 2025-11-29 06:42:45.641 2 DEBUG cotyledon._service_manager [-] Shutdown finish _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:320
Nov 29 06:42:45 compute-0 virtqemud[186729]: End of file while reading data: Input/output error
Nov 29 06:42:45 compute-0 sshd-session[197666]: Received disconnect from 103.179.56.44 port 49726:11: Bye Bye [preauth]
Nov 29 06:42:45 compute-0 sshd-session[197666]: Disconnected from authenticating user root 103.179.56.44 port 49726 [preauth]
Nov 29 06:42:45 compute-0 systemd[1]: libpod-39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d.scope: Deactivated successfully.
Nov 29 06:42:45 compute-0 systemd[1]: libpod-39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d.scope: Consumed 1.445s CPU time.
Nov 29 06:42:45 compute-0 conmon[197686]: conmon 39c9a04598137e9f2755 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d.scope/container/memory.events
Nov 29 06:42:45 compute-0 podman[197876]: 2025-11-29 06:42:45.804181529 +0000 UTC m=+0.316380403 container died 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Nov 29 06:42:46 compute-0 systemd[1]: 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d-5cbacc8572d90457.timer: Deactivated successfully.
Nov 29 06:42:46 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d.
Nov 29 06:42:46 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d-userdata-shm.mount: Deactivated successfully.
Nov 29 06:42:46 compute-0 systemd[1]: var-lib-containers-storage-overlay-0ca42539d85f63fdfbeda8a8de0b27a696397d648223ab2168bbc43701594685-merged.mount: Deactivated successfully.
Nov 29 06:42:46 compute-0 podman[197876]: 2025-11-29 06:42:46.347620032 +0000 UTC m=+0.859818906 container cleanup 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 29 06:42:46 compute-0 podman[197876]: ceilometer_agent_compute
Nov 29 06:42:46 compute-0 podman[197905]: ceilometer_agent_compute
Nov 29 06:42:46 compute-0 systemd[1]: edpm_ceilometer_agent_compute.service: Deactivated successfully.
Nov 29 06:42:46 compute-0 systemd[1]: Stopped ceilometer_agent_compute container.
Nov 29 06:42:46 compute-0 systemd[1]: Starting ceilometer_agent_compute container...
Nov 29 06:42:46 compute-0 systemd[1]: Started libcrun container.
Nov 29 06:42:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ca42539d85f63fdfbeda8a8de0b27a696397d648223ab2168bbc43701594685/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 29 06:42:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ca42539d85f63fdfbeda8a8de0b27a696397d648223ab2168bbc43701594685/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Nov 29 06:42:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ca42539d85f63fdfbeda8a8de0b27a696397d648223ab2168bbc43701594685/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Nov 29 06:42:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ca42539d85f63fdfbeda8a8de0b27a696397d648223ab2168bbc43701594685/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Nov 29 06:42:46 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d.
Nov 29 06:42:46 compute-0 podman[197914]: 2025-11-29 06:42:46.735007785 +0000 UTC m=+0.270755545 container init 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute)
Nov 29 06:42:46 compute-0 ceilometer_agent_compute[197930]: + sudo -E kolla_set_configs
Nov 29 06:42:46 compute-0 ceilometer_agent_compute[197930]: sudo: unable to send audit message: Operation not permitted
Nov 29 06:42:46 compute-0 sudo[197936]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 29 06:42:46 compute-0 sudo[197936]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 29 06:42:46 compute-0 podman[197914]: 2025-11-29 06:42:46.766731537 +0000 UTC m=+0.302479267 container start 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Nov 29 06:42:46 compute-0 sudo[197936]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Nov 29 06:42:46 compute-0 ceilometer_agent_compute[197930]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 06:42:46 compute-0 ceilometer_agent_compute[197930]: INFO:__main__:Validating config file
Nov 29 06:42:46 compute-0 ceilometer_agent_compute[197930]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 06:42:46 compute-0 ceilometer_agent_compute[197930]: INFO:__main__:Copying service configuration files
Nov 29 06:42:46 compute-0 ceilometer_agent_compute[197930]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Nov 29 06:42:46 compute-0 ceilometer_agent_compute[197930]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Nov 29 06:42:46 compute-0 ceilometer_agent_compute[197930]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Nov 29 06:42:46 compute-0 ceilometer_agent_compute[197930]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Nov 29 06:42:46 compute-0 ceilometer_agent_compute[197930]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Nov 29 06:42:46 compute-0 ceilometer_agent_compute[197930]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Nov 29 06:42:46 compute-0 ceilometer_agent_compute[197930]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 29 06:42:46 compute-0 ceilometer_agent_compute[197930]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 29 06:42:46 compute-0 ceilometer_agent_compute[197930]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 29 06:42:46 compute-0 ceilometer_agent_compute[197930]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 29 06:42:46 compute-0 ceilometer_agent_compute[197930]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 29 06:42:46 compute-0 ceilometer_agent_compute[197930]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 29 06:42:46 compute-0 ceilometer_agent_compute[197930]: INFO:__main__:Writing out command to execute
Nov 29 06:42:46 compute-0 sudo[197936]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:46 compute-0 ceilometer_agent_compute[197930]: ++ cat /run_command
Nov 29 06:42:46 compute-0 ceilometer_agent_compute[197930]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 29 06:42:46 compute-0 ceilometer_agent_compute[197930]: + ARGS=
Nov 29 06:42:46 compute-0 ceilometer_agent_compute[197930]: + sudo kolla_copy_cacerts
Nov 29 06:42:46 compute-0 ceilometer_agent_compute[197930]: sudo: unable to send audit message: Operation not permitted
Nov 29 06:42:46 compute-0 sudo[197951]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Nov 29 06:42:46 compute-0 sudo[197951]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 29 06:42:46 compute-0 sudo[197951]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Nov 29 06:42:46 compute-0 sudo[197951]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:46 compute-0 ceilometer_agent_compute[197930]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 29 06:42:46 compute-0 ceilometer_agent_compute[197930]: + [[ ! -n '' ]]
Nov 29 06:42:46 compute-0 ceilometer_agent_compute[197930]: + . kolla_extend_start
Nov 29 06:42:46 compute-0 ceilometer_agent_compute[197930]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Nov 29 06:42:46 compute-0 ceilometer_agent_compute[197930]: + umask 0022
Nov 29 06:42:46 compute-0 ceilometer_agent_compute[197930]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Nov 29 06:42:46 compute-0 podman[197914]: ceilometer_agent_compute
Nov 29 06:42:46 compute-0 systemd[1]: Started ceilometer_agent_compute container.
Nov 29 06:42:47 compute-0 sudo[197867]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:47 compute-0 podman[197937]: 2025-11-29 06:42:47.057319016 +0000 UTC m=+0.281537152 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 06:42:47 compute-0 systemd[1]: 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d-7231b2b56a723b0a.service: Main process exited, code=exited, status=1/FAILURE
Nov 29 06:42:47 compute-0 systemd[1]: 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d-7231b2b56a723b0a.service: Failed with result 'exit-code'.
Nov 29 06:42:47 compute-0 sudo[198111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lygygcwefqefendupyalnzuopsempiuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398567.1706555-1698-14472969511554/AnsiballZ_stat.py'
Nov 29 06:42:47 compute-0 sudo[198111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:47 compute-0 python3.9[198113]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:42:47 compute-0 sudo[198111]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.760 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.760 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.760 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.760 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.761 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.761 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.761 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.761 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.761 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.761 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.761 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.761 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.762 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.762 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.762 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.762 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.762 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.762 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.762 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.762 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.762 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.762 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.763 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.763 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.763 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.763 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.763 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.763 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.763 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.763 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.763 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.763 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.763 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.763 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.764 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.764 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.764 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.764 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.764 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.764 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.764 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.764 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.764 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.764 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.765 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.765 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.765 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.765 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.765 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.765 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.765 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.765 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.765 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.765 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.765 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.766 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.766 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.766 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.766 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.766 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.766 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.766 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.767 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.767 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.767 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.767 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.767 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.767 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.767 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.767 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.768 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.768 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.768 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.768 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.768 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.768 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.768 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.768 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.769 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.769 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.769 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.769 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.769 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.769 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.769 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.769 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.769 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.769 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.769 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.770 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.770 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.770 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.770 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.770 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.770 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.770 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.770 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.770 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.770 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.770 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.770 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.771 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.771 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.771 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.771 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.771 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.771 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.771 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.771 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.772 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.772 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.772 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.772 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.772 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.772 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.772 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.772 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.772 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.772 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.773 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.773 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.773 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.773 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.773 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.773 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.773 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.773 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.773 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.774 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.774 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.774 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.774 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.774 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.774 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.774 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.774 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.774 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.775 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.775 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.775 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.775 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.775 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.775 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.775 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.775 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.775 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.775 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.775 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.776 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.776 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.776 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.776 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.776 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.776 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.776 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.776 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.776 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.776 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.777 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.777 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.777 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.777 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.777 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.777 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.777 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.795 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.796 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.797 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.809 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.943 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.943 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.943 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.943 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.944 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.944 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.944 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.944 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.944 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.944 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.944 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.944 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.944 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.945 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.945 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.945 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.945 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.945 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.945 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.945 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.946 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.946 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.946 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.946 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.946 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.946 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.946 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.946 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.946 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.946 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.946 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.946 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.947 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.947 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.947 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.947 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.947 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.947 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.947 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.947 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.947 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.947 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.948 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.948 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.948 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.948 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.948 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.948 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.948 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.948 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.948 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.948 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.949 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.949 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.949 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.949 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.949 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.949 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.949 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.949 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.949 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.949 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.950 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.950 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.950 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.950 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.950 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.950 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.950 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.950 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.950 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.950 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.950 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.951 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.951 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.951 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.951 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.951 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.951 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.951 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.951 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.951 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.951 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.951 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.951 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.952 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.952 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.952 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.952 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.952 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.952 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.952 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.952 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.952 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.952 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.953 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.953 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.953 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.953 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.953 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.953 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.953 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.953 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.953 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.953 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.953 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.954 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.954 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.954 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.954 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.954 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.954 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.954 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.954 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.954 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.954 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.954 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.955 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.955 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.955 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.955 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.955 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.955 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.955 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.955 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.955 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.955 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.956 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.956 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.956 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.956 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.956 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.956 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.956 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.956 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.956 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.956 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.957 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.957 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.957 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.957 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.957 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.957 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.957 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 sudo[198237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxbpxrsdyetqbterfvxhqolkbjyugtit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398567.1706555-1698-14472969511554/AnsiballZ_copy.py'
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.957 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.957 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.957 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.958 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.958 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.958 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.958 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.958 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.958 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.958 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.958 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.958 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.958 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.959 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.959 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.959 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.959 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.959 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.959 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.959 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 sudo[198237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.959 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.959 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.960 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.960 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.960 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.960 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.960 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.960 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.960 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.960 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.960 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.960 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.960 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.961 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.961 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.961 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.961 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.961 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.961 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.961 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.961 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.961 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.962 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.962 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.962 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.962 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.962 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.962 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.962 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.962 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.962 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.963 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.963 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.963 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.963 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.963 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.963 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.963 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.963 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.963 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.964 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.964 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.964 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.964 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.964 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.964 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.964 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.964 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.964 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.964 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.965 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.965 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.965 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.965 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.965 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.967 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.973 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.976 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.977 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.977 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.977 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.977 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.977 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.977 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.977 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.977 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.977 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.977 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.978 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.978 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.978 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.978 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.978 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.978 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.978 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.978 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.978 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.979 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.979 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.979 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.979 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:42:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:42:47.979 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:42:48 compute-0 python3.9[198239]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398567.1706555-1698-14472969511554/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:42:48 compute-0 sudo[198237]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:49 compute-0 sudo[198392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xowxpigryxdlxzhtxoqrsjejtopjfvtg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398568.789381-1749-142505300458439/AnsiballZ_container_config_data.py'
Nov 29 06:42:49 compute-0 sudo[198392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:49 compute-0 python3.9[198394]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=node_exporter.json debug=False
Nov 29 06:42:49 compute-0 sudo[198392]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:50 compute-0 sudo[198544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dockfspkjximvjaedubmqztrhrdohjaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398569.783851-1776-47541754673537/AnsiballZ_container_config_hash.py'
Nov 29 06:42:50 compute-0 sudo[198544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:50 compute-0 python3.9[198546]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 29 06:42:50 compute-0 sudo[198544]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:50 compute-0 podman[198571]: 2025-11-29 06:42:50.876373031 +0000 UTC m=+0.135220798 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd)
Nov 29 06:42:51 compute-0 sudo[198716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osfgqubatiwmyjszdiqeidrmezwxnwjb ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764398570.7786396-1806-74794842359808/AnsiballZ_edpm_container_manage.py'
Nov 29 06:42:51 compute-0 sudo[198716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:51 compute-0 python3[198718]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=node_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 29 06:42:51 compute-0 podman[198754]: 2025-11-29 06:42:51.544398019 +0000 UTC m=+0.025449055 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Nov 29 06:42:52 compute-0 podman[198754]: 2025-11-29 06:42:52.002948406 +0000 UTC m=+0.483999452 container create cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, config_id=edpm, container_name=node_exporter, managed_by=edpm_ansible)
Nov 29 06:42:52 compute-0 python3[198718]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck node_exporter --label config_id=edpm --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter:v1.5.0 --web.config.file=/etc/node_exporter/node_exporter.yaml --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Nov 29 06:42:52 compute-0 sudo[198716]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:55 compute-0 sudo[198944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swntbkcauvfbsezttsyqdizowplqgrij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398575.3120046-1830-67034827946889/AnsiballZ_stat.py'
Nov 29 06:42:55 compute-0 sudo[198944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:55 compute-0 python3.9[198946]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:42:55 compute-0 sudo[198944]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:55 compute-0 sshd-session[198817]: Invalid user steam from 160.202.8.218 port 46832
Nov 29 06:42:56 compute-0 sshd-session[198817]: Received disconnect from 160.202.8.218 port 46832:11: Bye Bye [preauth]
Nov 29 06:42:56 compute-0 sshd-session[198817]: Disconnected from invalid user steam 160.202.8.218 port 46832 [preauth]
Nov 29 06:42:56 compute-0 sudo[199098]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzinmpnfmbtumznszadqfohqpsoorwrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398576.1790714-1857-105698740625903/AnsiballZ_file.py'
Nov 29 06:42:56 compute-0 sudo[199098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:56 compute-0 python3.9[199100]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:42:56 compute-0 sudo[199098]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:57 compute-0 sudo[199249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkrlpmouprmtbgdbuwfspmbvekcsmdxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398576.7111123-1857-85599171428098/AnsiballZ_copy.py'
Nov 29 06:42:57 compute-0 sudo[199249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:57 compute-0 python3.9[199251]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764398576.7111123-1857-85599171428098/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:42:57 compute-0 sudo[199249]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:57 compute-0 sudo[199325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gafbecowgaztrpvowxcpzlkwedmrletm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398576.7111123-1857-85599171428098/AnsiballZ_systemd.py'
Nov 29 06:42:57 compute-0 sudo[199325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:57 compute-0 python3.9[199327]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 06:42:57 compute-0 systemd[1]: Reloading.
Nov 29 06:42:57 compute-0 systemd-rc-local-generator[199354]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:42:57 compute-0 systemd-sysv-generator[199357]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:42:58 compute-0 sudo[199325]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:58 compute-0 sudo[199435]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrzqbicijjdpbdxpuqxshndufhimrdsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398576.7111123-1857-85599171428098/AnsiballZ_systemd.py'
Nov 29 06:42:58 compute-0 sudo[199435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:42:58 compute-0 python3.9[199437]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:42:58 compute-0 systemd[1]: Reloading.
Nov 29 06:42:58 compute-0 systemd-sysv-generator[199470]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:42:58 compute-0 systemd-rc-local-generator[199466]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:42:58 compute-0 systemd[1]: Starting node_exporter container...
Nov 29 06:42:59 compute-0 systemd[1]: Started libcrun container.
Nov 29 06:42:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8063fe61ad0386245729e4ed2d0400849d5850f4b74d0f171f34abcbb9fea797/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 29 06:42:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8063fe61ad0386245729e4ed2d0400849d5850f4b74d0f171f34abcbb9fea797/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 29 06:42:59 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd.
Nov 29 06:42:59 compute-0 podman[199477]: 2025-11-29 06:42:59.066409157 +0000 UTC m=+0.120354785 container init cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 06:42:59 compute-0 node_exporter[199492]: ts=2025-11-29T06:42:59.082Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Nov 29 06:42:59 compute-0 node_exporter[199492]: ts=2025-11-29T06:42:59.082Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Nov 29 06:42:59 compute-0 node_exporter[199492]: ts=2025-11-29T06:42:59.082Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Nov 29 06:42:59 compute-0 node_exporter[199492]: ts=2025-11-29T06:42:59.083Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Nov 29 06:42:59 compute-0 node_exporter[199492]: ts=2025-11-29T06:42:59.083Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Nov 29 06:42:59 compute-0 node_exporter[199492]: ts=2025-11-29T06:42:59.083Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Nov 29 06:42:59 compute-0 node_exporter[199492]: ts=2025-11-29T06:42:59.083Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Nov 29 06:42:59 compute-0 node_exporter[199492]: ts=2025-11-29T06:42:59.084Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Nov 29 06:42:59 compute-0 node_exporter[199492]: ts=2025-11-29T06:42:59.084Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Nov 29 06:42:59 compute-0 node_exporter[199492]: ts=2025-11-29T06:42:59.084Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Nov 29 06:42:59 compute-0 node_exporter[199492]: ts=2025-11-29T06:42:59.084Z caller=node_exporter.go:117 level=info collector=arp
Nov 29 06:42:59 compute-0 node_exporter[199492]: ts=2025-11-29T06:42:59.084Z caller=node_exporter.go:117 level=info collector=bcache
Nov 29 06:42:59 compute-0 node_exporter[199492]: ts=2025-11-29T06:42:59.084Z caller=node_exporter.go:117 level=info collector=bonding
Nov 29 06:42:59 compute-0 node_exporter[199492]: ts=2025-11-29T06:42:59.084Z caller=node_exporter.go:117 level=info collector=btrfs
Nov 29 06:42:59 compute-0 node_exporter[199492]: ts=2025-11-29T06:42:59.084Z caller=node_exporter.go:117 level=info collector=conntrack
Nov 29 06:42:59 compute-0 node_exporter[199492]: ts=2025-11-29T06:42:59.084Z caller=node_exporter.go:117 level=info collector=cpu
Nov 29 06:42:59 compute-0 node_exporter[199492]: ts=2025-11-29T06:42:59.084Z caller=node_exporter.go:117 level=info collector=cpufreq
Nov 29 06:42:59 compute-0 node_exporter[199492]: ts=2025-11-29T06:42:59.084Z caller=node_exporter.go:117 level=info collector=diskstats
Nov 29 06:42:59 compute-0 node_exporter[199492]: ts=2025-11-29T06:42:59.084Z caller=node_exporter.go:117 level=info collector=edac
Nov 29 06:42:59 compute-0 node_exporter[199492]: ts=2025-11-29T06:42:59.085Z caller=node_exporter.go:117 level=info collector=fibrechannel
Nov 29 06:42:59 compute-0 node_exporter[199492]: ts=2025-11-29T06:42:59.085Z caller=node_exporter.go:117 level=info collector=filefd
Nov 29 06:42:59 compute-0 node_exporter[199492]: ts=2025-11-29T06:42:59.085Z caller=node_exporter.go:117 level=info collector=filesystem
Nov 29 06:42:59 compute-0 node_exporter[199492]: ts=2025-11-29T06:42:59.085Z caller=node_exporter.go:117 level=info collector=infiniband
Nov 29 06:42:59 compute-0 node_exporter[199492]: ts=2025-11-29T06:42:59.085Z caller=node_exporter.go:117 level=info collector=ipvs
Nov 29 06:42:59 compute-0 node_exporter[199492]: ts=2025-11-29T06:42:59.085Z caller=node_exporter.go:117 level=info collector=loadavg
Nov 29 06:42:59 compute-0 node_exporter[199492]: ts=2025-11-29T06:42:59.085Z caller=node_exporter.go:117 level=info collector=mdadm
Nov 29 06:42:59 compute-0 node_exporter[199492]: ts=2025-11-29T06:42:59.085Z caller=node_exporter.go:117 level=info collector=meminfo
Nov 29 06:42:59 compute-0 node_exporter[199492]: ts=2025-11-29T06:42:59.085Z caller=node_exporter.go:117 level=info collector=netclass
Nov 29 06:42:59 compute-0 node_exporter[199492]: ts=2025-11-29T06:42:59.085Z caller=node_exporter.go:117 level=info collector=netdev
Nov 29 06:42:59 compute-0 node_exporter[199492]: ts=2025-11-29T06:42:59.085Z caller=node_exporter.go:117 level=info collector=netstat
Nov 29 06:42:59 compute-0 node_exporter[199492]: ts=2025-11-29T06:42:59.085Z caller=node_exporter.go:117 level=info collector=nfs
Nov 29 06:42:59 compute-0 node_exporter[199492]: ts=2025-11-29T06:42:59.085Z caller=node_exporter.go:117 level=info collector=nfsd
Nov 29 06:42:59 compute-0 node_exporter[199492]: ts=2025-11-29T06:42:59.085Z caller=node_exporter.go:117 level=info collector=nvme
Nov 29 06:42:59 compute-0 node_exporter[199492]: ts=2025-11-29T06:42:59.085Z caller=node_exporter.go:117 level=info collector=schedstat
Nov 29 06:42:59 compute-0 node_exporter[199492]: ts=2025-11-29T06:42:59.085Z caller=node_exporter.go:117 level=info collector=sockstat
Nov 29 06:42:59 compute-0 node_exporter[199492]: ts=2025-11-29T06:42:59.085Z caller=node_exporter.go:117 level=info collector=softnet
Nov 29 06:42:59 compute-0 node_exporter[199492]: ts=2025-11-29T06:42:59.085Z caller=node_exporter.go:117 level=info collector=systemd
Nov 29 06:42:59 compute-0 node_exporter[199492]: ts=2025-11-29T06:42:59.085Z caller=node_exporter.go:117 level=info collector=tapestats
Nov 29 06:42:59 compute-0 node_exporter[199492]: ts=2025-11-29T06:42:59.085Z caller=node_exporter.go:117 level=info collector=udp_queues
Nov 29 06:42:59 compute-0 node_exporter[199492]: ts=2025-11-29T06:42:59.085Z caller=node_exporter.go:117 level=info collector=vmstat
Nov 29 06:42:59 compute-0 node_exporter[199492]: ts=2025-11-29T06:42:59.085Z caller=node_exporter.go:117 level=info collector=xfs
Nov 29 06:42:59 compute-0 node_exporter[199492]: ts=2025-11-29T06:42:59.085Z caller=node_exporter.go:117 level=info collector=zfs
Nov 29 06:42:59 compute-0 node_exporter[199492]: ts=2025-11-29T06:42:59.087Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Nov 29 06:42:59 compute-0 node_exporter[199492]: ts=2025-11-29T06:42:59.088Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Nov 29 06:42:59 compute-0 podman[199477]: 2025-11-29 06:42:59.09428905 +0000 UTC m=+0.148234678 container start cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 06:42:59 compute-0 podman[199477]: node_exporter
Nov 29 06:42:59 compute-0 systemd[1]: Started node_exporter container.
Nov 29 06:42:59 compute-0 sudo[199435]: pam_unix(sudo:session): session closed for user root
Nov 29 06:42:59 compute-0 podman[199501]: 2025-11-29 06:42:59.18109917 +0000 UTC m=+0.077468235 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 06:43:00 compute-0 sudo[199673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bosczrvpqvbwjrevjhpmsdlcwmvbsibj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398580.7155044-1929-158439959574926/AnsiballZ_systemd.py'
Nov 29 06:43:01 compute-0 sudo[199673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:01 compute-0 python3.9[199675]: ansible-ansible.builtin.systemd Invoked with name=edpm_node_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:43:01 compute-0 systemd[1]: Stopping node_exporter container...
Nov 29 06:43:01 compute-0 systemd[1]: libpod-cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd.scope: Deactivated successfully.
Nov 29 06:43:01 compute-0 podman[199679]: 2025-11-29 06:43:01.437807102 +0000 UTC m=+0.062890251 container died cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 06:43:01 compute-0 systemd[1]: cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd-423b0c9aec4175f2.timer: Deactivated successfully.
Nov 29 06:43:01 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd.
Nov 29 06:43:01 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd-userdata-shm.mount: Deactivated successfully.
Nov 29 06:43:01 compute-0 systemd[1]: var-lib-containers-storage-overlay-8063fe61ad0386245729e4ed2d0400849d5850f4b74d0f171f34abcbb9fea797-merged.mount: Deactivated successfully.
Nov 29 06:43:01 compute-0 podman[199679]: 2025-11-29 06:43:01.485209231 +0000 UTC m=+0.110292380 container cleanup cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 06:43:01 compute-0 podman[199679]: node_exporter
Nov 29 06:43:01 compute-0 systemd[1]: edpm_node_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 29 06:43:01 compute-0 podman[199709]: node_exporter
Nov 29 06:43:01 compute-0 systemd[1]: edpm_node_exporter.service: Failed with result 'exit-code'.
Nov 29 06:43:01 compute-0 systemd[1]: Stopped node_exporter container.
Nov 29 06:43:01 compute-0 systemd[1]: Starting node_exporter container...
Nov 29 06:43:01 compute-0 systemd[1]: Started libcrun container.
Nov 29 06:43:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8063fe61ad0386245729e4ed2d0400849d5850f4b74d0f171f34abcbb9fea797/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 29 06:43:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8063fe61ad0386245729e4ed2d0400849d5850f4b74d0f171f34abcbb9fea797/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 29 06:43:01 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd.
Nov 29 06:43:01 compute-0 podman[199722]: 2025-11-29 06:43:01.674953769 +0000 UTC m=+0.098798432 container init cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 06:43:01 compute-0 node_exporter[199739]: ts=2025-11-29T06:43:01.687Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Nov 29 06:43:01 compute-0 node_exporter[199739]: ts=2025-11-29T06:43:01.687Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Nov 29 06:43:01 compute-0 node_exporter[199739]: ts=2025-11-29T06:43:01.687Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Nov 29 06:43:01 compute-0 node_exporter[199739]: ts=2025-11-29T06:43:01.687Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Nov 29 06:43:01 compute-0 node_exporter[199739]: ts=2025-11-29T06:43:01.687Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Nov 29 06:43:01 compute-0 node_exporter[199739]: ts=2025-11-29T06:43:01.688Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Nov 29 06:43:01 compute-0 node_exporter[199739]: ts=2025-11-29T06:43:01.688Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Nov 29 06:43:01 compute-0 node_exporter[199739]: ts=2025-11-29T06:43:01.688Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Nov 29 06:43:01 compute-0 node_exporter[199739]: ts=2025-11-29T06:43:01.688Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Nov 29 06:43:01 compute-0 node_exporter[199739]: ts=2025-11-29T06:43:01.688Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Nov 29 06:43:01 compute-0 node_exporter[199739]: ts=2025-11-29T06:43:01.688Z caller=node_exporter.go:117 level=info collector=arp
Nov 29 06:43:01 compute-0 node_exporter[199739]: ts=2025-11-29T06:43:01.688Z caller=node_exporter.go:117 level=info collector=bcache
Nov 29 06:43:01 compute-0 node_exporter[199739]: ts=2025-11-29T06:43:01.688Z caller=node_exporter.go:117 level=info collector=bonding
Nov 29 06:43:01 compute-0 node_exporter[199739]: ts=2025-11-29T06:43:01.688Z caller=node_exporter.go:117 level=info collector=btrfs
Nov 29 06:43:01 compute-0 node_exporter[199739]: ts=2025-11-29T06:43:01.688Z caller=node_exporter.go:117 level=info collector=conntrack
Nov 29 06:43:01 compute-0 node_exporter[199739]: ts=2025-11-29T06:43:01.688Z caller=node_exporter.go:117 level=info collector=cpu
Nov 29 06:43:01 compute-0 node_exporter[199739]: ts=2025-11-29T06:43:01.688Z caller=node_exporter.go:117 level=info collector=cpufreq
Nov 29 06:43:01 compute-0 node_exporter[199739]: ts=2025-11-29T06:43:01.688Z caller=node_exporter.go:117 level=info collector=diskstats
Nov 29 06:43:01 compute-0 node_exporter[199739]: ts=2025-11-29T06:43:01.688Z caller=node_exporter.go:117 level=info collector=edac
Nov 29 06:43:01 compute-0 node_exporter[199739]: ts=2025-11-29T06:43:01.688Z caller=node_exporter.go:117 level=info collector=fibrechannel
Nov 29 06:43:01 compute-0 node_exporter[199739]: ts=2025-11-29T06:43:01.688Z caller=node_exporter.go:117 level=info collector=filefd
Nov 29 06:43:01 compute-0 node_exporter[199739]: ts=2025-11-29T06:43:01.688Z caller=node_exporter.go:117 level=info collector=filesystem
Nov 29 06:43:01 compute-0 node_exporter[199739]: ts=2025-11-29T06:43:01.688Z caller=node_exporter.go:117 level=info collector=infiniband
Nov 29 06:43:01 compute-0 node_exporter[199739]: ts=2025-11-29T06:43:01.688Z caller=node_exporter.go:117 level=info collector=ipvs
Nov 29 06:43:01 compute-0 node_exporter[199739]: ts=2025-11-29T06:43:01.688Z caller=node_exporter.go:117 level=info collector=loadavg
Nov 29 06:43:01 compute-0 node_exporter[199739]: ts=2025-11-29T06:43:01.688Z caller=node_exporter.go:117 level=info collector=mdadm
Nov 29 06:43:01 compute-0 node_exporter[199739]: ts=2025-11-29T06:43:01.688Z caller=node_exporter.go:117 level=info collector=meminfo
Nov 29 06:43:01 compute-0 node_exporter[199739]: ts=2025-11-29T06:43:01.688Z caller=node_exporter.go:117 level=info collector=netclass
Nov 29 06:43:01 compute-0 node_exporter[199739]: ts=2025-11-29T06:43:01.688Z caller=node_exporter.go:117 level=info collector=netdev
Nov 29 06:43:01 compute-0 node_exporter[199739]: ts=2025-11-29T06:43:01.688Z caller=node_exporter.go:117 level=info collector=netstat
Nov 29 06:43:01 compute-0 node_exporter[199739]: ts=2025-11-29T06:43:01.688Z caller=node_exporter.go:117 level=info collector=nfs
Nov 29 06:43:01 compute-0 node_exporter[199739]: ts=2025-11-29T06:43:01.688Z caller=node_exporter.go:117 level=info collector=nfsd
Nov 29 06:43:01 compute-0 node_exporter[199739]: ts=2025-11-29T06:43:01.688Z caller=node_exporter.go:117 level=info collector=nvme
Nov 29 06:43:01 compute-0 node_exporter[199739]: ts=2025-11-29T06:43:01.688Z caller=node_exporter.go:117 level=info collector=schedstat
Nov 29 06:43:01 compute-0 node_exporter[199739]: ts=2025-11-29T06:43:01.688Z caller=node_exporter.go:117 level=info collector=sockstat
Nov 29 06:43:01 compute-0 node_exporter[199739]: ts=2025-11-29T06:43:01.688Z caller=node_exporter.go:117 level=info collector=softnet
Nov 29 06:43:01 compute-0 node_exporter[199739]: ts=2025-11-29T06:43:01.688Z caller=node_exporter.go:117 level=info collector=systemd
Nov 29 06:43:01 compute-0 node_exporter[199739]: ts=2025-11-29T06:43:01.688Z caller=node_exporter.go:117 level=info collector=tapestats
Nov 29 06:43:01 compute-0 node_exporter[199739]: ts=2025-11-29T06:43:01.688Z caller=node_exporter.go:117 level=info collector=udp_queues
Nov 29 06:43:01 compute-0 node_exporter[199739]: ts=2025-11-29T06:43:01.688Z caller=node_exporter.go:117 level=info collector=vmstat
Nov 29 06:43:01 compute-0 node_exporter[199739]: ts=2025-11-29T06:43:01.688Z caller=node_exporter.go:117 level=info collector=xfs
Nov 29 06:43:01 compute-0 node_exporter[199739]: ts=2025-11-29T06:43:01.688Z caller=node_exporter.go:117 level=info collector=zfs
Nov 29 06:43:01 compute-0 node_exporter[199739]: ts=2025-11-29T06:43:01.689Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Nov 29 06:43:01 compute-0 node_exporter[199739]: ts=2025-11-29T06:43:01.689Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Nov 29 06:43:01 compute-0 podman[199722]: 2025-11-29 06:43:01.699554749 +0000 UTC m=+0.123399382 container start cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 06:43:01 compute-0 podman[199722]: node_exporter
Nov 29 06:43:01 compute-0 systemd[1]: Started node_exporter container.
Nov 29 06:43:01 compute-0 sudo[199673]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:01 compute-0 podman[199748]: 2025-11-29 06:43:01.766120743 +0000 UTC m=+0.054335927 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 06:43:02 compute-0 sudo[199920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dexxxxcatxltmveqyvqhvwawqxizfeoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398581.917629-1953-68460837675369/AnsiballZ_stat.py'
Nov 29 06:43:02 compute-0 sudo[199920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:02 compute-0 python3.9[199922]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:43:02 compute-0 sudo[199920]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:02 compute-0 sudo[200043]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqyuhsdlcvjfquizcudaiqwwpusylusp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398581.917629-1953-68460837675369/AnsiballZ_copy.py'
Nov 29 06:43:02 compute-0 sudo[200043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:02 compute-0 python3.9[200045]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398581.917629-1953-68460837675369/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:43:02 compute-0 sudo[200043]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:03 compute-0 sudo[200195]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfgdbcvzfzzlvdowivtsbymezvzmbeso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398583.4511952-2004-223463329652252/AnsiballZ_container_config_data.py'
Nov 29 06:43:03 compute-0 sudo[200195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:03 compute-0 python3.9[200197]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Nov 29 06:43:03 compute-0 sudo[200195]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:04 compute-0 sudo[200347]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yompswxpdurrcralqawqufteukygybez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398584.2603152-2031-183195542085712/AnsiballZ_container_config_hash.py'
Nov 29 06:43:04 compute-0 sudo[200347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:04 compute-0 python3.9[200349]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 29 06:43:04 compute-0 sudo[200347]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:05 compute-0 sudo[200499]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydskuqtdpakundaenppascmmmokfxobg ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764398585.1774957-2061-172349813677080/AnsiballZ_edpm_container_manage.py'
Nov 29 06:43:05 compute-0 sudo[200499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:05 compute-0 python3[200501]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 29 06:43:06 compute-0 sshd-session[200502]: Received disconnect from 179.125.24.202 port 41958:11: Bye Bye [preauth]
Nov 29 06:43:06 compute-0 sshd-session[200502]: Disconnected from authenticating user root 179.125.24.202 port 41958 [preauth]
Nov 29 06:43:06 compute-0 podman[200559]: 2025-11-29 06:43:06.853286929 +0000 UTC m=+0.112359818 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:43:07 compute-0 podman[200517]: 2025-11-29 06:43:07.168006794 +0000 UTC m=+1.352010510 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Nov 29 06:43:07 compute-0 podman[200632]: 2025-11-29 06:43:07.311347473 +0000 UTC m=+0.046278528 container create 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 06:43:07 compute-0 podman[200632]: 2025-11-29 06:43:07.289376058 +0000 UTC m=+0.024307133 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Nov 29 06:43:07 compute-0 python3[200501]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Nov 29 06:43:07 compute-0 sudo[200499]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:08 compute-0 sudo[200819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgnzrtuwyinhyraesjqzyabdfcyieegm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398588.423884-2085-240995847442524/AnsiballZ_stat.py'
Nov 29 06:43:08 compute-0 sudo[200819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:08 compute-0 python3.9[200821]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:43:09 compute-0 sudo[200819]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:09 compute-0 sudo[200973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gimkdgoojlcywvdxryivuzsvznrgmqcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398589.426772-2112-224818956204187/AnsiballZ_file.py'
Nov 29 06:43:09 compute-0 sudo[200973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:09 compute-0 python3.9[200975]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:43:09 compute-0 sudo[200973]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:10 compute-0 sudo[201124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldufeqaoxsoyyboyzmosfhuzetkwumxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398589.9106452-2112-279360598529916/AnsiballZ_copy.py'
Nov 29 06:43:10 compute-0 sudo[201124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:10 compute-0 python3.9[201126]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764398589.9106452-2112-279360598529916/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:43:10 compute-0 sudo[201124]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:10 compute-0 sudo[201200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jagnyffbhnxwwwompmmpffdhjwjktzlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398589.9106452-2112-279360598529916/AnsiballZ_systemd.py'
Nov 29 06:43:10 compute-0 sudo[201200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:11 compute-0 python3.9[201202]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 06:43:11 compute-0 systemd[1]: Reloading.
Nov 29 06:43:11 compute-0 systemd-sysv-generator[201234]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:43:11 compute-0 systemd-rc-local-generator[201231]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:43:11 compute-0 sudo[201200]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:11 compute-0 sudo[201313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdrzdbtdovzhalkmfgxrlsxmhywhfzqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398589.9106452-2112-279360598529916/AnsiballZ_systemd.py'
Nov 29 06:43:11 compute-0 sudo[201313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:11 compute-0 python3.9[201315]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:43:12 compute-0 systemd[1]: Reloading.
Nov 29 06:43:12 compute-0 systemd-rc-local-generator[201343]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:43:12 compute-0 systemd-sysv-generator[201346]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:43:12 compute-0 systemd[1]: Starting podman_exporter container...
Nov 29 06:43:12 compute-0 systemd[1]: Started libcrun container.
Nov 29 06:43:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/361ecaed5ce130807667ef4e2b42f9a7d3ac63f4203397b6e872cf11c82fe14a/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 29 06:43:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/361ecaed5ce130807667ef4e2b42f9a7d3ac63f4203397b6e872cf11c82fe14a/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 29 06:43:12 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c.
Nov 29 06:43:12 compute-0 podman[201355]: 2025-11-29 06:43:12.41638805 +0000 UTC m=+0.114150609 container init 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 06:43:12 compute-0 podman_exporter[201370]: ts=2025-11-29T06:43:12.439Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Nov 29 06:43:12 compute-0 podman_exporter[201370]: ts=2025-11-29T06:43:12.439Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Nov 29 06:43:12 compute-0 podman_exporter[201370]: ts=2025-11-29T06:43:12.439Z caller=handler.go:94 level=info msg="enabled collectors"
Nov 29 06:43:12 compute-0 podman_exporter[201370]: ts=2025-11-29T06:43:12.439Z caller=handler.go:105 level=info collector=container
Nov 29 06:43:12 compute-0 podman[201355]: 2025-11-29 06:43:12.445333924 +0000 UTC m=+0.143096453 container start 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 06:43:12 compute-0 podman[201355]: podman_exporter
Nov 29 06:43:12 compute-0 systemd[1]: Starting Podman API Service...
Nov 29 06:43:12 compute-0 systemd[1]: Started Podman API Service.
Nov 29 06:43:12 compute-0 systemd[1]: Started podman_exporter container.
Nov 29 06:43:12 compute-0 podman[201381]: time="2025-11-29T06:43:12Z" level=info msg="/usr/bin/podman filtering at log level info"
Nov 29 06:43:12 compute-0 podman[201381]: time="2025-11-29T06:43:12Z" level=info msg="Setting parallel job count to 25"
Nov 29 06:43:12 compute-0 podman[201381]: time="2025-11-29T06:43:12Z" level=info msg="Using sqlite as database backend"
Nov 29 06:43:12 compute-0 podman[201381]: time="2025-11-29T06:43:12Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Nov 29 06:43:12 compute-0 podman[201381]: time="2025-11-29T06:43:12Z" level=info msg="Using systemd socket activation to determine API endpoint"
Nov 29 06:43:12 compute-0 podman[201381]: time="2025-11-29T06:43:12Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Nov 29 06:43:12 compute-0 sudo[201313]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:12 compute-0 podman[201381]: @ - - [29/Nov/2025:06:43:12 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Nov 29 06:43:12 compute-0 podman[201381]: time="2025-11-29T06:43:12Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 29 06:43:12 compute-0 podman[201381]: @ - - [29/Nov/2025:06:43:12 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 19568 "" "Go-http-client/1.1"
Nov 29 06:43:12 compute-0 podman_exporter[201370]: ts=2025-11-29T06:43:12.517Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Nov 29 06:43:12 compute-0 podman_exporter[201370]: ts=2025-11-29T06:43:12.518Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Nov 29 06:43:12 compute-0 podman_exporter[201370]: ts=2025-11-29T06:43:12.518Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Nov 29 06:43:12 compute-0 podman[201379]: 2025-11-29 06:43:12.535578442 +0000 UTC m=+0.077826546 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 06:43:12 compute-0 systemd[1]: 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c-60d9ba1f96a9b082.service: Main process exited, code=exited, status=1/FAILURE
Nov 29 06:43:12 compute-0 systemd[1]: 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c-60d9ba1f96a9b082.service: Failed with result 'exit-code'.
Nov 29 06:43:13 compute-0 sudo[201578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clnfpophbknfstfzpcodpziiykrfeusz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398593.39634-2184-187123777167371/AnsiballZ_systemd.py'
Nov 29 06:43:13 compute-0 sudo[201578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:13 compute-0 podman[201535]: 2025-11-29 06:43:13.763485309 +0000 UTC m=+0.084596718 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 06:43:14 compute-0 python3.9[201588]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:43:14 compute-0 systemd[1]: Stopping podman_exporter container...
Nov 29 06:43:14 compute-0 podman[201381]: @ - - [29/Nov/2025:06:43:12 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 3437 "" "Go-http-client/1.1"
Nov 29 06:43:14 compute-0 systemd[1]: libpod-78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c.scope: Deactivated successfully.
Nov 29 06:43:14 compute-0 podman[201593]: 2025-11-29 06:43:14.135427512 +0000 UTC m=+0.042545981 container died 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 06:43:14 compute-0 systemd[1]: 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c-60d9ba1f96a9b082.timer: Deactivated successfully.
Nov 29 06:43:14 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c.
Nov 29 06:43:14 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c-userdata-shm.mount: Deactivated successfully.
Nov 29 06:43:14 compute-0 systemd[1]: var-lib-containers-storage-overlay-361ecaed5ce130807667ef4e2b42f9a7d3ac63f4203397b6e872cf11c82fe14a-merged.mount: Deactivated successfully.
Nov 29 06:43:14 compute-0 podman[201593]: 2025-11-29 06:43:14.89694264 +0000 UTC m=+0.804061109 container cleanup 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 06:43:14 compute-0 podman[201593]: podman_exporter
Nov 29 06:43:14 compute-0 systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 29 06:43:14 compute-0 podman[201620]: podman_exporter
Nov 29 06:43:14 compute-0 systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Nov 29 06:43:14 compute-0 systemd[1]: Stopped podman_exporter container.
Nov 29 06:43:14 compute-0 systemd[1]: Starting podman_exporter container...
Nov 29 06:43:15 compute-0 systemd[1]: Started libcrun container.
Nov 29 06:43:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/361ecaed5ce130807667ef4e2b42f9a7d3ac63f4203397b6e872cf11c82fe14a/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 29 06:43:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/361ecaed5ce130807667ef4e2b42f9a7d3ac63f4203397b6e872cf11c82fe14a/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 29 06:43:15 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c.
Nov 29 06:43:15 compute-0 podman[201633]: 2025-11-29 06:43:15.243545852 +0000 UTC m=+0.244964821 container init 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 06:43:15 compute-0 podman_exporter[201647]: ts=2025-11-29T06:43:15.297Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Nov 29 06:43:15 compute-0 podman_exporter[201647]: ts=2025-11-29T06:43:15.297Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Nov 29 06:43:15 compute-0 podman_exporter[201647]: ts=2025-11-29T06:43:15.297Z caller=handler.go:94 level=info msg="enabled collectors"
Nov 29 06:43:15 compute-0 podman_exporter[201647]: ts=2025-11-29T06:43:15.297Z caller=handler.go:105 level=info collector=container
Nov 29 06:43:15 compute-0 podman[201381]: @ - - [29/Nov/2025:06:43:15 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Nov 29 06:43:15 compute-0 podman[201381]: time="2025-11-29T06:43:15Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 29 06:43:15 compute-0 podman[201633]: 2025-11-29 06:43:15.309614282 +0000 UTC m=+0.311033251 container start 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 06:43:15 compute-0 podman[201633]: podman_exporter
Nov 29 06:43:15 compute-0 systemd[1]: Started podman_exporter container.
Nov 29 06:43:15 compute-0 podman[201381]: @ - - [29/Nov/2025:06:43:15 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 19570 "" "Go-http-client/1.1"
Nov 29 06:43:15 compute-0 podman_exporter[201647]: ts=2025-11-29T06:43:15.344Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Nov 29 06:43:15 compute-0 podman_exporter[201647]: ts=2025-11-29T06:43:15.344Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Nov 29 06:43:15 compute-0 podman_exporter[201647]: ts=2025-11-29T06:43:15.344Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Nov 29 06:43:15 compute-0 sudo[201578]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:15 compute-0 podman[201656]: 2025-11-29 06:43:15.403062771 +0000 UTC m=+0.071357931 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 06:43:15 compute-0 sudo[201830]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qejitlexrfhkdpgrbonovyoxdcudfqds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398595.5501647-2208-195016495426576/AnsiballZ_stat.py'
Nov 29 06:43:15 compute-0 sudo[201830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:16 compute-0 python3.9[201832]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:43:16 compute-0 sudo[201830]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:16 compute-0 sudo[201953]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-poqvcbngvudoyadgbjvcsdicsfmgntwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398595.5501647-2208-195016495426576/AnsiballZ_copy.py'
Nov 29 06:43:16 compute-0 sudo[201953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:16 compute-0 python3.9[201955]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398595.5501647-2208-195016495426576/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 06:43:16 compute-0 sudo[201953]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:17 compute-0 sudo[202116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylmarhhaoyqanogeagjahbckxctojjzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398597.0905902-2259-212564863878821/AnsiballZ_container_config_data.py'
Nov 29 06:43:17 compute-0 sudo[202116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:17 compute-0 podman[202079]: 2025-11-29 06:43:17.412830906 +0000 UTC m=+0.075098418 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=2, health_log=, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 29 06:43:17 compute-0 systemd[1]: 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d-7231b2b56a723b0a.service: Main process exited, code=exited, status=1/FAILURE
Nov 29 06:43:17 compute-0 systemd[1]: 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d-7231b2b56a723b0a.service: Failed with result 'exit-code'.
Nov 29 06:43:17 compute-0 python3.9[202126]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Nov 29 06:43:17 compute-0 sudo[202116]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:18 compute-0 sudo[202276]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlkpcpmsiqyciqlaofbjxvpigdrayboz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398597.9119852-2286-28158688865072/AnsiballZ_container_config_hash.py'
Nov 29 06:43:18 compute-0 sudo[202276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:18 compute-0 python3.9[202278]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 29 06:43:18 compute-0 sudo[202276]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:19 compute-0 sudo[202428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvgkgbearcvxtnkuyfwcqjkjkhznvqsz ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764398598.8002512-2316-27088940618620/AnsiballZ_edpm_container_manage.py'
Nov 29 06:43:19 compute-0 sudo[202428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:19 compute-0 python3[202430]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 29 06:43:20 compute-0 sshd-session[202431]: Invalid user intell from 45.202.211.6 port 50140
Nov 29 06:43:20 compute-0 sshd-session[202431]: Received disconnect from 45.202.211.6 port 50140:11: Bye Bye [preauth]
Nov 29 06:43:20 compute-0 sshd-session[202431]: Disconnected from invalid user intell 45.202.211.6 port 50140 [preauth]
Nov 29 06:43:22 compute-0 podman[202490]: 2025-11-29 06:43:22.390668463 +0000 UTC m=+0.656560484 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, io.buildah.version=1.41.3, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 29 06:43:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:43:24.798 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:43:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:43:24.805 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.008s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:43:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:43:24.805 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:43:25 compute-0 podman[202446]: 2025-11-29 06:43:25.240001009 +0000 UTC m=+5.879948983 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Nov 29 06:43:25 compute-0 podman[202565]: 2025-11-29 06:43:25.3710467 +0000 UTC m=+0.026533352 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Nov 29 06:43:25 compute-0 podman[202565]: 2025-11-29 06:43:25.776248323 +0000 UTC m=+0.431734955 container create 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, architecture=x86_64, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, config_id=edpm, version=9.6, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, io.openshift.expose-services=)
Nov 29 06:43:25 compute-0 python3[202430]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Nov 29 06:43:25 compute-0 sudo[202428]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:26 compute-0 sudo[202752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsbovkshxmljbwjlugdkrpfhypjpxwbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398606.0894682-2340-34713910879481/AnsiballZ_stat.py'
Nov 29 06:43:26 compute-0 sudo[202752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:26 compute-0 python3.9[202754]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:43:26 compute-0 sudo[202752]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:27 compute-0 sudo[202906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsxmcqmkdzpbmzgzmouiwtbjyfzmzipm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398606.8580468-2367-150045584326213/AnsiballZ_file.py'
Nov 29 06:43:27 compute-0 sudo[202906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:27 compute-0 python3.9[202908]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:43:27 compute-0 sudo[202906]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:27 compute-0 sudo[203057]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upnpgksggkmyrnwudjzexehzligekpov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398607.384442-2367-177878646566487/AnsiballZ_copy.py'
Nov 29 06:43:27 compute-0 sudo[203057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:28 compute-0 python3.9[203059]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764398607.384442-2367-177878646566487/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:43:28 compute-0 sudo[203057]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:28 compute-0 sudo[203133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylvqojouwvzzgyxgqlfuylnrmhrpgucp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398607.384442-2367-177878646566487/AnsiballZ_systemd.py'
Nov 29 06:43:28 compute-0 sudo[203133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:28 compute-0 python3.9[203135]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 06:43:28 compute-0 systemd[1]: Reloading.
Nov 29 06:43:28 compute-0 systemd-sysv-generator[203166]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:43:28 compute-0 systemd-rc-local-generator[203163]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:43:28 compute-0 sudo[203133]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:29 compute-0 sudo[203244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opwzzybbtkbjubazbraigspmxtadqxui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398607.384442-2367-177878646566487/AnsiballZ_systemd.py'
Nov 29 06:43:29 compute-0 sudo[203244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:29 compute-0 python3.9[203246]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 06:43:29 compute-0 systemd[1]: Reloading.
Nov 29 06:43:29 compute-0 systemd-sysv-generator[203280]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 06:43:29 compute-0 systemd-rc-local-generator[203273]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 06:43:30 compute-0 systemd[1]: Starting openstack_network_exporter container...
Nov 29 06:43:30 compute-0 systemd[1]: Started libcrun container.
Nov 29 06:43:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/764faf2acb9ca1ae5ee21495f51d0e15eb8f688be0943d36fee0331c1c24f82d/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 29 06:43:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/764faf2acb9ca1ae5ee21495f51d0e15eb8f688be0943d36fee0331c1c24f82d/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 29 06:43:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/764faf2acb9ca1ae5ee21495f51d0e15eb8f688be0943d36fee0331c1c24f82d/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 29 06:43:30 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf.
Nov 29 06:43:30 compute-0 podman[203286]: 2025-11-29 06:43:30.8331909 +0000 UTC m=+0.790103737 container init 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.buildah.version=1.33.7, vcs-type=git, container_name=openstack_network_exporter, distribution-scope=public, config_id=edpm, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal)
Nov 29 06:43:30 compute-0 openstack_network_exporter[203300]: INFO    06:43:30 main.go:48: registering *bridge.Collector
Nov 29 06:43:30 compute-0 openstack_network_exporter[203300]: INFO    06:43:30 main.go:48: registering *coverage.Collector
Nov 29 06:43:30 compute-0 openstack_network_exporter[203300]: INFO    06:43:30 main.go:48: registering *datapath.Collector
Nov 29 06:43:30 compute-0 openstack_network_exporter[203300]: INFO    06:43:30 main.go:48: registering *iface.Collector
Nov 29 06:43:30 compute-0 openstack_network_exporter[203300]: INFO    06:43:30 main.go:48: registering *memory.Collector
Nov 29 06:43:30 compute-0 openstack_network_exporter[203300]: INFO    06:43:30 main.go:48: registering *ovnnorthd.Collector
Nov 29 06:43:30 compute-0 openstack_network_exporter[203300]: INFO    06:43:30 main.go:48: registering *ovn.Collector
Nov 29 06:43:30 compute-0 openstack_network_exporter[203300]: INFO    06:43:30 main.go:48: registering *ovsdbserver.Collector
Nov 29 06:43:30 compute-0 openstack_network_exporter[203300]: INFO    06:43:30 main.go:48: registering *pmd_perf.Collector
Nov 29 06:43:30 compute-0 openstack_network_exporter[203300]: INFO    06:43:30 main.go:48: registering *pmd_rxq.Collector
Nov 29 06:43:30 compute-0 openstack_network_exporter[203300]: INFO    06:43:30 main.go:48: registering *vswitch.Collector
Nov 29 06:43:30 compute-0 openstack_network_exporter[203300]: NOTICE  06:43:30 main.go:76: listening on https://:9105/metrics
Nov 29 06:43:30 compute-0 podman[203286]: 2025-11-29 06:43:30.861992778 +0000 UTC m=+0.818905565 container start 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 29 06:43:30 compute-0 podman[203286]: openstack_network_exporter
Nov 29 06:43:30 compute-0 systemd[1]: Started openstack_network_exporter container.
Nov 29 06:43:30 compute-0 sudo[203244]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:30 compute-0 podman[203310]: 2025-11-29 06:43:30.971792161 +0000 UTC m=+0.099968056 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, managed_by=edpm_ansible, name=ubi9-minimal, io.buildah.version=1.33.7, version=9.6, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, release=1755695350, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.openshift.tags=minimal rhel9, vcs-type=git, com.redhat.component=ubi9-minimal-container)
Nov 29 06:43:31 compute-0 sudo[203483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vemcaoujjpgtrttdrlldkaefsodwwfae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398611.134075-2439-233294977996588/AnsiballZ_systemd.py'
Nov 29 06:43:31 compute-0 sudo[203483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:31 compute-0 python3.9[203485]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 06:43:31 compute-0 systemd[1]: Stopping openstack_network_exporter container...
Nov 29 06:43:31 compute-0 systemd[1]: libpod-65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf.scope: Deactivated successfully.
Nov 29 06:43:31 compute-0 podman[203489]: 2025-11-29 06:43:31.864019296 +0000 UTC m=+0.079945360 container died 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.buildah.version=1.33.7, managed_by=edpm_ansible, release=1755695350, build-date=2025-08-20T13:12:41, name=ubi9-minimal, vcs-type=git, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter)
Nov 29 06:43:31 compute-0 systemd[1]: 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf-6e73ed686a4aad6a.timer: Deactivated successfully.
Nov 29 06:43:31 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf.
Nov 29 06:43:32 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf-userdata-shm.mount: Deactivated successfully.
Nov 29 06:43:32 compute-0 systemd[1]: var-lib-containers-storage-overlay-764faf2acb9ca1ae5ee21495f51d0e15eb8f688be0943d36fee0331c1c24f82d-merged.mount: Deactivated successfully.
Nov 29 06:43:32 compute-0 podman[203505]: 2025-11-29 06:43:32.063673389 +0000 UTC m=+0.189012195 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 06:43:33 compute-0 nova_compute[187185]: 2025-11-29 06:43:33.662 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:43:33 compute-0 nova_compute[187185]: 2025-11-29 06:43:33.662 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:43:33 compute-0 nova_compute[187185]: 2025-11-29 06:43:33.705 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:43:33 compute-0 nova_compute[187185]: 2025-11-29 06:43:33.705 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 06:43:33 compute-0 nova_compute[187185]: 2025-11-29 06:43:33.705 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 06:43:33 compute-0 nova_compute[187185]: 2025-11-29 06:43:33.716 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 06:43:33 compute-0 nova_compute[187185]: 2025-11-29 06:43:33.717 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:43:33 compute-0 nova_compute[187185]: 2025-11-29 06:43:33.717 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:43:33 compute-0 nova_compute[187185]: 2025-11-29 06:43:33.717 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:43:34 compute-0 nova_compute[187185]: 2025-11-29 06:43:34.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:43:34 compute-0 nova_compute[187185]: 2025-11-29 06:43:34.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 06:43:34 compute-0 nova_compute[187185]: 2025-11-29 06:43:34.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:43:34 compute-0 nova_compute[187185]: 2025-11-29 06:43:34.340 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:43:34 compute-0 nova_compute[187185]: 2025-11-29 06:43:34.340 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:43:34 compute-0 nova_compute[187185]: 2025-11-29 06:43:34.340 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:43:34 compute-0 nova_compute[187185]: 2025-11-29 06:43:34.340 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 06:43:34 compute-0 nova_compute[187185]: 2025-11-29 06:43:34.481 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:43:34 compute-0 nova_compute[187185]: 2025-11-29 06:43:34.481 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5960MB free_disk=73.37813949584961GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 06:43:34 compute-0 nova_compute[187185]: 2025-11-29 06:43:34.482 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:43:34 compute-0 nova_compute[187185]: 2025-11-29 06:43:34.482 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:43:34 compute-0 nova_compute[187185]: 2025-11-29 06:43:34.672 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 06:43:34 compute-0 nova_compute[187185]: 2025-11-29 06:43:34.672 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 06:43:34 compute-0 nova_compute[187185]: 2025-11-29 06:43:34.693 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:43:34 compute-0 nova_compute[187185]: 2025-11-29 06:43:34.720 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:43:34 compute-0 nova_compute[187185]: 2025-11-29 06:43:34.722 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 06:43:34 compute-0 nova_compute[187185]: 2025-11-29 06:43:34.722 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.240s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:43:35 compute-0 podman[203489]: 2025-11-29 06:43:35.021151417 +0000 UTC m=+3.237077481 container cleanup 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.33.7, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, name=ubi9-minimal, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350)
Nov 29 06:43:35 compute-0 podman[203489]: openstack_network_exporter
Nov 29 06:43:35 compute-0 systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 29 06:43:35 compute-0 podman[203541]: openstack_network_exporter
Nov 29 06:43:35 compute-0 systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Nov 29 06:43:35 compute-0 systemd[1]: Stopped openstack_network_exporter container.
Nov 29 06:43:35 compute-0 systemd[1]: Starting openstack_network_exporter container...
Nov 29 06:43:35 compute-0 systemd[1]: Started libcrun container.
Nov 29 06:43:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/764faf2acb9ca1ae5ee21495f51d0e15eb8f688be0943d36fee0331c1c24f82d/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 29 06:43:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/764faf2acb9ca1ae5ee21495f51d0e15eb8f688be0943d36fee0331c1c24f82d/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 29 06:43:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/764faf2acb9ca1ae5ee21495f51d0e15eb8f688be0943d36fee0331c1c24f82d/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 29 06:43:35 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf.
Nov 29 06:43:35 compute-0 nova_compute[187185]: 2025-11-29 06:43:35.721 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:43:35 compute-0 nova_compute[187185]: 2025-11-29 06:43:35.722 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:43:35 compute-0 podman[203554]: 2025-11-29 06:43:35.966084202 +0000 UTC m=+0.869976302 container init 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.expose-services=, vcs-type=git, container_name=openstack_network_exporter, version=9.6, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, distribution-scope=public, io.buildah.version=1.33.7, config_id=edpm, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., name=ubi9-minimal, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 29 06:43:35 compute-0 openstack_network_exporter[203569]: INFO    06:43:35 main.go:48: registering *bridge.Collector
Nov 29 06:43:35 compute-0 openstack_network_exporter[203569]: INFO    06:43:35 main.go:48: registering *coverage.Collector
Nov 29 06:43:35 compute-0 openstack_network_exporter[203569]: INFO    06:43:35 main.go:48: registering *datapath.Collector
Nov 29 06:43:35 compute-0 openstack_network_exporter[203569]: INFO    06:43:35 main.go:48: registering *iface.Collector
Nov 29 06:43:35 compute-0 openstack_network_exporter[203569]: INFO    06:43:35 main.go:48: registering *memory.Collector
Nov 29 06:43:35 compute-0 openstack_network_exporter[203569]: INFO    06:43:35 main.go:48: registering *ovnnorthd.Collector
Nov 29 06:43:35 compute-0 openstack_network_exporter[203569]: INFO    06:43:35 main.go:48: registering *ovn.Collector
Nov 29 06:43:35 compute-0 openstack_network_exporter[203569]: INFO    06:43:35 main.go:48: registering *ovsdbserver.Collector
Nov 29 06:43:35 compute-0 openstack_network_exporter[203569]: INFO    06:43:35 main.go:48: registering *pmd_perf.Collector
Nov 29 06:43:35 compute-0 openstack_network_exporter[203569]: INFO    06:43:35 main.go:48: registering *pmd_rxq.Collector
Nov 29 06:43:35 compute-0 openstack_network_exporter[203569]: INFO    06:43:35 main.go:48: registering *vswitch.Collector
Nov 29 06:43:35 compute-0 openstack_network_exporter[203569]: NOTICE  06:43:35 main.go:76: listening on https://:9105/metrics
Nov 29 06:43:35 compute-0 podman[203554]: 2025-11-29 06:43:35.996660895 +0000 UTC m=+0.900552985 container start 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vcs-type=git, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, release=1755695350, maintainer=Red Hat, Inc., config_id=edpm, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, container_name=openstack_network_exporter, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41)
Nov 29 06:43:36 compute-0 podman[203554]: openstack_network_exporter
Nov 29 06:43:36 compute-0 systemd[1]: Started openstack_network_exporter container.
Nov 29 06:43:36 compute-0 sudo[203483]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:36 compute-0 podman[203579]: 2025-11-29 06:43:36.235753666 +0000 UTC m=+0.230733231 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-type=git, maintainer=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, release=1755695350, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=minimal rhel9)
Nov 29 06:43:36 compute-0 sudo[203752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrfbjifbuanouachpurqgmeiuuclcudc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398616.3713305-2463-167296053986701/AnsiballZ_find.py'
Nov 29 06:43:36 compute-0 sudo[203752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:36 compute-0 python3.9[203754]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 29 06:43:36 compute-0 sudo[203752]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:37 compute-0 podman[203848]: 2025-11-29 06:43:37.792533437 +0000 UTC m=+0.051032130 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 06:43:37 compute-0 sudo[203924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tevsepvytolaorwjqdspbovmvyncuozv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398617.447057-2491-96915877180899/AnsiballZ_podman_container_info.py'
Nov 29 06:43:37 compute-0 sudo[203924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:38 compute-0 python3.9[203926]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Nov 29 06:43:38 compute-0 sudo[203924]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:38 compute-0 sudo[204089]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsmzeqehdmpfkxgixgaixapkibydqgmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398618.3382437-2499-42139127268473/AnsiballZ_podman_container_exec.py'
Nov 29 06:43:38 compute-0 sudo[204089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:38 compute-0 python3.9[204091]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 29 06:43:39 compute-0 systemd[1]: Started libpod-conmon-8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152.scope.
Nov 29 06:43:39 compute-0 podman[204092]: 2025-11-29 06:43:39.121534763 +0000 UTC m=+0.166376423 container exec 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 06:43:39 compute-0 podman[204112]: 2025-11-29 06:43:39.193027229 +0000 UTC m=+0.053977224 container exec_died 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 06:43:39 compute-0 podman[204092]: 2025-11-29 06:43:39.270440964 +0000 UTC m=+0.315282584 container exec_died 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 06:43:39 compute-0 systemd[1]: libpod-conmon-8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152.scope: Deactivated successfully.
Nov 29 06:43:39 compute-0 sudo[204089]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:39 compute-0 sudo[204275]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xurkbqzxcjzdsznklytvhmxessqsorcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398619.5126824-2507-112504893402931/AnsiballZ_podman_container_exec.py'
Nov 29 06:43:39 compute-0 sudo[204275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:40 compute-0 python3.9[204277]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 29 06:43:40 compute-0 systemd[1]: Started libpod-conmon-8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152.scope.
Nov 29 06:43:40 compute-0 podman[204278]: 2025-11-29 06:43:40.340190164 +0000 UTC m=+0.262316977 container exec 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 29 06:43:40 compute-0 podman[204297]: 2025-11-29 06:43:40.596025116 +0000 UTC m=+0.236101325 container exec_died 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:43:40 compute-0 podman[204278]: 2025-11-29 06:43:40.712889128 +0000 UTC m=+0.635015951 container exec_died 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 06:43:40 compute-0 systemd[1]: libpod-conmon-8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152.scope: Deactivated successfully.
Nov 29 06:43:41 compute-0 sudo[204275]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:41 compute-0 sudo[204459]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exijsyczeulbfxjoyaukkajhuvigbulp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398621.3178325-2515-29750822053911/AnsiballZ_file.py'
Nov 29 06:43:41 compute-0 sudo[204459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:41 compute-0 python3.9[204461]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:43:41 compute-0 sudo[204459]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:42 compute-0 sudo[204611]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtvckovwzhbebsiymzplezsroggmugak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398622.0747926-2524-225481131109007/AnsiballZ_podman_container_info.py'
Nov 29 06:43:42 compute-0 sudo[204611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:42 compute-0 python3.9[204613]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Nov 29 06:43:42 compute-0 sudo[204611]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:43 compute-0 sudo[204777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lznfvxokcmwghiywliwspzmnzmazdwok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398622.7587647-2532-147180717731010/AnsiballZ_podman_container_exec.py'
Nov 29 06:43:43 compute-0 sudo[204777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:43 compute-0 python3.9[204779]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 29 06:43:43 compute-0 systemd[1]: Started libpod-conmon-0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e.scope.
Nov 29 06:43:43 compute-0 podman[204780]: 2025-11-29 06:43:43.372307777 +0000 UTC m=+0.087354737 container exec 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 06:43:43 compute-0 podman[204800]: 2025-11-29 06:43:43.442023918 +0000 UTC m=+0.054899287 container exec_died 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 06:43:43 compute-0 podman[204780]: 2025-11-29 06:43:43.446588553 +0000 UTC m=+0.161635523 container exec_died 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 06:43:43 compute-0 systemd[1]: libpod-conmon-0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e.scope: Deactivated successfully.
Nov 29 06:43:43 compute-0 sudo[204777]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:43 compute-0 sudo[204972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-duqrbcjhoykftdlpqdgsvvyacqobppij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398623.6380901-2540-168029965276151/AnsiballZ_podman_container_exec.py'
Nov 29 06:43:43 compute-0 sudo[204972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:43 compute-0 podman[204936]: 2025-11-29 06:43:43.950113081 +0000 UTC m=+0.087964242 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:43:44 compute-0 python3.9[204980]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 29 06:43:44 compute-0 systemd[1]: Started libpod-conmon-0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e.scope.
Nov 29 06:43:44 compute-0 podman[204990]: 2025-11-29 06:43:44.171662237 +0000 UTC m=+0.067475515 container exec 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:43:44 compute-0 podman[204990]: 2025-11-29 06:43:44.20740631 +0000 UTC m=+0.103219588 container exec_died 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Nov 29 06:43:44 compute-0 systemd[1]: libpod-conmon-0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e.scope: Deactivated successfully.
Nov 29 06:43:44 compute-0 sudo[204972]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:44 compute-0 sudo[205171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpxhvnupepcsbqexrwegoxkzcosstzro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398624.4067307-2548-33879596472573/AnsiballZ_file.py'
Nov 29 06:43:44 compute-0 sudo[205171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:44 compute-0 python3.9[205173]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:43:44 compute-0 sudo[205171]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:45 compute-0 sudo[205323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzcvbzmqjrmglzwpjuhaqltndldnwiuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398625.100592-2557-20486306035480/AnsiballZ_podman_container_info.py'
Nov 29 06:43:45 compute-0 sudo[205323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:45 compute-0 python3.9[205325]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Nov 29 06:43:45 compute-0 sudo[205323]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:45 compute-0 podman[205345]: 2025-11-29 06:43:45.777490126 +0000 UTC m=+0.046688530 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 06:43:46 compute-0 sudo[205513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elnuywsqrcorfwobecmzlfikpklbysyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398625.8686404-2565-96228573254036/AnsiballZ_podman_container_exec.py'
Nov 29 06:43:46 compute-0 sudo[205513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:46 compute-0 python3.9[205515]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 29 06:43:46 compute-0 systemd[1]: Started libpod-conmon-03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205.scope.
Nov 29 06:43:46 compute-0 podman[205516]: 2025-11-29 06:43:46.409608192 +0000 UTC m=+0.074290277 container exec 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd)
Nov 29 06:43:46 compute-0 podman[205516]: 2025-11-29 06:43:46.444167275 +0000 UTC m=+0.108849360 container exec_died 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd)
Nov 29 06:43:46 compute-0 systemd[1]: libpod-conmon-03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205.scope: Deactivated successfully.
Nov 29 06:43:46 compute-0 sudo[205513]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:46 compute-0 sudo[205700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsinabknffytfpryyqlfbzbixckaxfli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398626.667536-2573-242908274536394/AnsiballZ_podman_container_exec.py'
Nov 29 06:43:46 compute-0 sudo[205700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:47 compute-0 python3.9[205702]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 29 06:43:47 compute-0 systemd[1]: Started libpod-conmon-03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205.scope.
Nov 29 06:43:47 compute-0 podman[205703]: 2025-11-29 06:43:47.730939456 +0000 UTC m=+0.523765890 container exec 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 29 06:43:47 compute-0 podman[205724]: 2025-11-29 06:43:47.827115895 +0000 UTC m=+0.079140460 container exec_died 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Nov 29 06:43:47 compute-0 podman[205703]: 2025-11-29 06:43:47.840928714 +0000 UTC m=+0.633755148 container exec_died 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 06:43:47 compute-0 systemd[1]: libpod-conmon-03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205.scope: Deactivated successfully.
Nov 29 06:43:47 compute-0 podman[205721]: 2025-11-29 06:43:47.865994317 +0000 UTC m=+0.129181334 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, health_failing_streak=3, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 29 06:43:47 compute-0 systemd[1]: 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d-7231b2b56a723b0a.service: Main process exited, code=exited, status=1/FAILURE
Nov 29 06:43:47 compute-0 systemd[1]: 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d-7231b2b56a723b0a.service: Failed with result 'exit-code'.
Nov 29 06:43:47 compute-0 sudo[205700]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:48 compute-0 sudo[205905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnksaulbxcdhusiearmiewprcoyyhctq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398628.0670033-2581-45915004285107/AnsiballZ_file.py'
Nov 29 06:43:48 compute-0 sudo[205905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:48 compute-0 python3.9[205907]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:43:48 compute-0 sudo[205905]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:49 compute-0 sudo[206057]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehvpoaldryruyzoipwwnhxewsifijvkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398628.7833905-2590-55480047959036/AnsiballZ_podman_container_info.py'
Nov 29 06:43:49 compute-0 sudo[206057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:49 compute-0 python3.9[206059]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Nov 29 06:43:49 compute-0 sudo[206057]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:49 compute-0 sudo[206223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldpzidxqfwcjezabzueoxtrxybscpblf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398629.5770078-2598-36271755111310/AnsiballZ_podman_container_exec.py'
Nov 29 06:43:49 compute-0 sudo[206223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:50 compute-0 python3.9[206225]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 29 06:43:50 compute-0 systemd[1]: Started libpod-conmon-39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d.scope.
Nov 29 06:43:50 compute-0 podman[206226]: 2025-11-29 06:43:50.562095734 +0000 UTC m=+0.394812713 container exec 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 29 06:43:50 compute-0 podman[206246]: 2025-11-29 06:43:50.63006893 +0000 UTC m=+0.054609359 container exec_died 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 06:43:50 compute-0 podman[206226]: 2025-11-29 06:43:50.645130791 +0000 UTC m=+0.477847730 container exec_died 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:43:50 compute-0 systemd[1]: libpod-conmon-39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d.scope: Deactivated successfully.
Nov 29 06:43:50 compute-0 sudo[206223]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:51 compute-0 sudo[206408]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmtlzwwpnlxfebjvwbixieiloexlbmes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398630.8409371-2606-21923940641822/AnsiballZ_podman_container_exec.py'
Nov 29 06:43:51 compute-0 sudo[206408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:51 compute-0 python3.9[206410]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 29 06:43:51 compute-0 systemd[1]: Started libpod-conmon-39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d.scope.
Nov 29 06:43:51 compute-0 podman[206411]: 2025-11-29 06:43:51.525564019 +0000 UTC m=+0.181457594 container exec 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 06:43:51 compute-0 podman[206430]: 2025-11-29 06:43:51.673991168 +0000 UTC m=+0.134100718 container exec_died 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 29 06:43:51 compute-0 podman[206411]: 2025-11-29 06:43:51.781553915 +0000 UTC m=+0.437447460 container exec_died 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, container_name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:43:51 compute-0 systemd[1]: libpod-conmon-39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d.scope: Deactivated successfully.
Nov 29 06:43:51 compute-0 sudo[206408]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:52 compute-0 sudo[206593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sunceniikbovuhjjsusuedfacropbkri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398632.1185856-2614-32901824874198/AnsiballZ_file.py'
Nov 29 06:43:52 compute-0 sudo[206593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:52 compute-0 python3.9[206595]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:43:52 compute-0 sudo[206593]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:53 compute-0 sudo[206745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjaaicchmvohvsozwyxtsvxgvlfvhwju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398632.8017375-2623-226183321110384/AnsiballZ_podman_container_info.py'
Nov 29 06:43:53 compute-0 sudo[206745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:53 compute-0 python3.9[206747]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Nov 29 06:43:53 compute-0 sudo[206745]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:53 compute-0 podman[206860]: 2025-11-29 06:43:53.854477302 +0000 UTC m=+0.111751654 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 29 06:43:53 compute-0 sudo[206927]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhmprsnctzlethusknyuzqzzgmuzvwaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398633.5537076-2631-99884406654558/AnsiballZ_podman_container_exec.py'
Nov 29 06:43:53 compute-0 sudo[206927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:54 compute-0 python3.9[206929]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 29 06:43:54 compute-0 systemd[1]: Started libpod-conmon-cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd.scope.
Nov 29 06:43:54 compute-0 podman[206930]: 2025-11-29 06:43:54.218163708 +0000 UTC m=+0.085735847 container exec cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 06:43:54 compute-0 podman[206930]: 2025-11-29 06:43:54.25111226 +0000 UTC m=+0.118684399 container exec_died cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 06:43:54 compute-0 systemd[1]: libpod-conmon-cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd.scope: Deactivated successfully.
Nov 29 06:43:54 compute-0 sudo[206927]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:54 compute-0 sudo[207112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uthdsdhygcmdvtglzoqdtpdnbmrfjvlx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398634.5009172-2639-139377125577923/AnsiballZ_podman_container_exec.py'
Nov 29 06:43:54 compute-0 sudo[207112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:55 compute-0 python3.9[207114]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 29 06:43:55 compute-0 systemd[1]: Started libpod-conmon-cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd.scope.
Nov 29 06:43:55 compute-0 podman[207115]: 2025-11-29 06:43:55.082086299 +0000 UTC m=+0.065110826 container exec cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 06:43:55 compute-0 podman[207115]: 2025-11-29 06:43:55.111771378 +0000 UTC m=+0.094795835 container exec_died cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 06:43:55 compute-0 systemd[1]: libpod-conmon-cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd.scope: Deactivated successfully.
Nov 29 06:43:55 compute-0 sudo[207112]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:55 compute-0 sudo[207297]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avwbyeqbruouxobmnoksncptqybkvlob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398635.2911494-2647-211028056039689/AnsiballZ_file.py'
Nov 29 06:43:55 compute-0 sudo[207297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:55 compute-0 python3.9[207299]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:43:55 compute-0 sudo[207297]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:56 compute-0 sudo[207449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccgemebexbssrucnpvxjpxwefkqhtvwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398636.092536-2656-2892966431242/AnsiballZ_podman_container_info.py'
Nov 29 06:43:56 compute-0 sudo[207449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:56 compute-0 python3.9[207451]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Nov 29 06:43:56 compute-0 sudo[207449]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:57 compute-0 sudo[207615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snyjedjplwnfcokijhanxxsdobukpiiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398636.837179-2664-276760062282709/AnsiballZ_podman_container_exec.py'
Nov 29 06:43:57 compute-0 sudo[207615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:57 compute-0 python3.9[207617]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 29 06:43:57 compute-0 systemd[1]: Started libpod-conmon-78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c.scope.
Nov 29 06:43:57 compute-0 podman[207618]: 2025-11-29 06:43:57.49767178 +0000 UTC m=+0.080742710 container exec 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 06:43:57 compute-0 podman[207618]: 2025-11-29 06:43:57.532268804 +0000 UTC m=+0.115339704 container exec_died 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 06:43:57 compute-0 systemd[1]: libpod-conmon-78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c.scope: Deactivated successfully.
Nov 29 06:43:57 compute-0 sudo[207615]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:57 compute-0 sudo[207799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwmiaallstywpjpvyrngzojctuvgyhap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398637.73559-2672-99238203193809/AnsiballZ_podman_container_exec.py'
Nov 29 06:43:57 compute-0 sudo[207799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:58 compute-0 python3.9[207801]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 29 06:43:58 compute-0 systemd[1]: Started libpod-conmon-78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c.scope.
Nov 29 06:43:58 compute-0 podman[207802]: 2025-11-29 06:43:58.702033439 +0000 UTC m=+0.477178823 container exec 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 06:43:58 compute-0 podman[207802]: 2025-11-29 06:43:58.747030746 +0000 UTC m=+0.522176130 container exec_died 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 06:43:58 compute-0 systemd[1]: libpod-conmon-78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c.scope: Deactivated successfully.
Nov 29 06:43:58 compute-0 sudo[207799]: pam_unix(sudo:session): session closed for user root
Nov 29 06:43:59 compute-0 sudo[207984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfmixkrkwyrbmewyxvnozswbazdyecgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398639.0804164-2680-79823823887950/AnsiballZ_file.py'
Nov 29 06:43:59 compute-0 sudo[207984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:43:59 compute-0 python3.9[207986]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:43:59 compute-0 sudo[207984]: pam_unix(sudo:session): session closed for user root
Nov 29 06:44:00 compute-0 sudo[208136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjismptdxytsrhkseyztwbjpzsnwhfqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398639.8555305-2689-240190872146021/AnsiballZ_podman_container_info.py'
Nov 29 06:44:00 compute-0 sudo[208136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:44:00 compute-0 python3.9[208138]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Nov 29 06:44:00 compute-0 sudo[208136]: pam_unix(sudo:session): session closed for user root
Nov 29 06:44:00 compute-0 sudo[208304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsfdlhxunifaklgekxwyuaworfdvrdcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398640.63696-2697-48620887234431/AnsiballZ_podman_container_exec.py'
Nov 29 06:44:00 compute-0 sudo[208304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:44:01 compute-0 python3.9[208306]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 29 06:44:01 compute-0 systemd[1]: Started libpod-conmon-65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf.scope.
Nov 29 06:44:01 compute-0 podman[208307]: 2025-11-29 06:44:01.267540088 +0000 UTC m=+0.120983767 container exec 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., config_id=edpm, distribution-scope=public, architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, managed_by=edpm_ansible, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41)
Nov 29 06:44:01 compute-0 podman[208327]: 2025-11-29 06:44:01.34007649 +0000 UTC m=+0.060210712 container exec_died 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, config_id=edpm, container_name=openstack_network_exporter, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, release=1755695350)
Nov 29 06:44:01 compute-0 podman[208307]: 2025-11-29 06:44:01.346575314 +0000 UTC m=+0.200018993 container exec_died 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc.)
Nov 29 06:44:01 compute-0 systemd[1]: libpod-conmon-65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf.scope: Deactivated successfully.
Nov 29 06:44:01 compute-0 sudo[208304]: pam_unix(sudo:session): session closed for user root
Nov 29 06:44:01 compute-0 sudo[208489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmxbhelzhtmrvibbwcrjoophpbfutxax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398641.5624716-2705-92353254894429/AnsiballZ_podman_container_exec.py'
Nov 29 06:44:01 compute-0 sudo[208489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:44:01 compute-0 sshd-session[208205]: Invalid user system from 1.214.197.163 port 37644
Nov 29 06:44:01 compute-0 python3.9[208491]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 29 06:44:02 compute-0 sshd-session[208205]: Received disconnect from 1.214.197.163 port 37644:11: Bye Bye [preauth]
Nov 29 06:44:02 compute-0 sshd-session[208205]: Disconnected from invalid user system 1.214.197.163 port 37644 [preauth]
Nov 29 06:44:02 compute-0 systemd[1]: Started libpod-conmon-65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf.scope.
Nov 29 06:44:02 compute-0 podman[208492]: 2025-11-29 06:44:02.248070004 +0000 UTC m=+0.237990742 container exec 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, maintainer=Red Hat, Inc., config_id=edpm, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, managed_by=edpm_ansible, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.buildah.version=1.33.7)
Nov 29 06:44:02 compute-0 podman[208492]: 2025-11-29 06:44:02.27997592 +0000 UTC m=+0.269896658 container exec_died 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, version=9.6, vendor=Red Hat, Inc., architecture=x86_64, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=edpm, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=)
Nov 29 06:44:02 compute-0 systemd[1]: libpod-conmon-65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf.scope: Deactivated successfully.
Nov 29 06:44:02 compute-0 sudo[208489]: pam_unix(sudo:session): session closed for user root
Nov 29 06:44:03 compute-0 sudo[208674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhxaxkjkmilzlbsjtpvulmgjitjvmlhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398642.726801-2713-217488284881979/AnsiballZ_file.py'
Nov 29 06:44:03 compute-0 sudo[208674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:44:03 compute-0 python3.9[208676]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:44:03 compute-0 sudo[208674]: pam_unix(sudo:session): session closed for user root
Nov 29 06:44:05 compute-0 podman[208701]: 2025-11-29 06:44:05.808610764 +0000 UTC m=+0.074265907 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 06:44:06 compute-0 podman[208725]: 2025-11-29 06:44:06.774751857 +0000 UTC m=+0.046590408 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, container_name=openstack_network_exporter, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-type=git, version=9.6, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, config_id=edpm, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 29 06:44:08 compute-0 podman[208746]: 2025-11-29 06:44:08.777697706 +0000 UTC m=+0.050041466 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 29 06:44:14 compute-0 podman[208765]: 2025-11-29 06:44:14.904821492 +0000 UTC m=+0.173221836 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:44:16 compute-0 podman[208791]: 2025-11-29 06:44:16.814550988 +0000 UTC m=+0.072922973 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 06:44:18 compute-0 podman[208817]: 2025-11-29 06:44:18.791873631 +0000 UTC m=+0.060394887 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Nov 29 06:44:24 compute-0 podman[208837]: 2025-11-29 06:44:24.790974003 +0000 UTC m=+0.059332335 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:44:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:44:24.799 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:44:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:44:24.802 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:44:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:44:24.802 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:44:30 compute-0 sshd-session[208858]: Received disconnect from 160.202.8.218 port 40350:11: Bye Bye [preauth]
Nov 29 06:44:30 compute-0 sshd-session[208858]: Disconnected from authenticating user root 160.202.8.218 port 40350 [preauth]
Nov 29 06:44:31 compute-0 sshd-session[208860]: Received disconnect from 179.125.24.202 port 35534:11: Bye Bye [preauth]
Nov 29 06:44:31 compute-0 sshd-session[208860]: Disconnected from authenticating user root 179.125.24.202 port 35534 [preauth]
Nov 29 06:44:33 compute-0 sshd-session[208862]: Received disconnect from 45.202.211.6 port 58014:11: Bye Bye [preauth]
Nov 29 06:44:33 compute-0 sshd-session[208862]: Disconnected from authenticating user root 45.202.211.6 port 58014 [preauth]
Nov 29 06:44:33 compute-0 nova_compute[187185]: 2025-11-29 06:44:33.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:44:34 compute-0 nova_compute[187185]: 2025-11-29 06:44:34.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:44:34 compute-0 nova_compute[187185]: 2025-11-29 06:44:34.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 06:44:34 compute-0 nova_compute[187185]: 2025-11-29 06:44:34.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 06:44:34 compute-0 nova_compute[187185]: 2025-11-29 06:44:34.407 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 06:44:34 compute-0 nova_compute[187185]: 2025-11-29 06:44:34.408 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:44:34 compute-0 nova_compute[187185]: 2025-11-29 06:44:34.408 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:44:35 compute-0 nova_compute[187185]: 2025-11-29 06:44:35.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:44:35 compute-0 nova_compute[187185]: 2025-11-29 06:44:35.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:44:35 compute-0 nova_compute[187185]: 2025-11-29 06:44:35.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 06:44:36 compute-0 nova_compute[187185]: 2025-11-29 06:44:36.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:44:36 compute-0 nova_compute[187185]: 2025-11-29 06:44:36.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:44:36 compute-0 podman[208864]: 2025-11-29 06:44:36.818225338 +0000 UTC m=+0.077221315 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 06:44:36 compute-0 podman[208888]: 2025-11-29 06:44:36.902242086 +0000 UTC m=+0.058692389 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, vcs-type=git, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, distribution-scope=public, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, name=ubi9-minimal)
Nov 29 06:44:37 compute-0 nova_compute[187185]: 2025-11-29 06:44:37.231 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:44:37 compute-0 nova_compute[187185]: 2025-11-29 06:44:37.231 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:44:37 compute-0 nova_compute[187185]: 2025-11-29 06:44:37.231 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:44:37 compute-0 nova_compute[187185]: 2025-11-29 06:44:37.232 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 06:44:37 compute-0 nova_compute[187185]: 2025-11-29 06:44:37.398 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:44:37 compute-0 nova_compute[187185]: 2025-11-29 06:44:37.399 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6074MB free_disk=73.378173828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 06:44:37 compute-0 nova_compute[187185]: 2025-11-29 06:44:37.400 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:44:37 compute-0 nova_compute[187185]: 2025-11-29 06:44:37.400 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:44:37 compute-0 nova_compute[187185]: 2025-11-29 06:44:37.545 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 06:44:37 compute-0 nova_compute[187185]: 2025-11-29 06:44:37.546 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 06:44:37 compute-0 nova_compute[187185]: 2025-11-29 06:44:37.582 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:44:37 compute-0 nova_compute[187185]: 2025-11-29 06:44:37.597 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:44:37 compute-0 nova_compute[187185]: 2025-11-29 06:44:37.599 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 06:44:37 compute-0 nova_compute[187185]: 2025-11-29 06:44:37.599 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.200s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:44:38 compute-0 nova_compute[187185]: 2025-11-29 06:44:38.599 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:44:39 compute-0 podman[208909]: 2025-11-29 06:44:39.7830337 +0000 UTC m=+0.055507476 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS 
Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:44:45 compute-0 podman[208929]: 2025-11-29 06:44:45.800106773 +0000 UTC m=+0.071014955 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller)
Nov 29 06:44:47 compute-0 podman[208957]: 2025-11-29 06:44:47.780512 +0000 UTC m=+0.050002495 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 06:44:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:44:47.974 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:44:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:44:47.975 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:44:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:44:47.975 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:44:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:44:47.975 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:44:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:44:47.975 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:44:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:44:47.975 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:44:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:44:47.975 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:44:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:44:47.975 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:44:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:44:47.975 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:44:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:44:47.976 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:44:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:44:47.976 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:44:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:44:47.976 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:44:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:44:47.976 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:44:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:44:47.976 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:44:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:44:47.976 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:44:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:44:47.976 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:44:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:44:47.976 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:44:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:44:47.976 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:44:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:44:47.976 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:44:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:44:47.976 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:44:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:44:47.976 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:44:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:44:47.976 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:44:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:44:47.976 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:44:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:44:47.977 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:44:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:44:47.977 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:44:49 compute-0 sshd-session[208955]: Received disconnect from 103.179.56.44 port 58406:11: Bye Bye [preauth]
Nov 29 06:44:49 compute-0 sshd-session[208955]: Disconnected from authenticating user root 103.179.56.44 port 58406 [preauth]
Nov 29 06:44:49 compute-0 podman[208981]: 2025-11-29 06:44:49.786765351 +0000 UTC m=+0.053261099 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Nov 29 06:44:55 compute-0 podman[209001]: 2025-11-29 06:44:55.785797689 +0000 UTC m=+0.053859715 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:44:58 compute-0 sudo[209146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ayutyxapxsfjcejyyidsxoitkswyhxdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398698.6077373-3186-115507058688037/AnsiballZ_file.py'
Nov 29 06:44:58 compute-0 sudo[209146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:44:59 compute-0 python3.9[209148]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:44:59 compute-0 sudo[209146]: pam_unix(sudo:session): session closed for user root
Nov 29 06:44:59 compute-0 sudo[209298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwhahlciebwfppzctksoddleizkbsupy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398699.3425505-3210-133095921310572/AnsiballZ_stat.py'
Nov 29 06:44:59 compute-0 sudo[209298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:44:59 compute-0 python3.9[209300]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:44:59 compute-0 sudo[209298]: pam_unix(sudo:session): session closed for user root
Nov 29 06:45:00 compute-0 sudo[209421]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhiwburmgjgpxbzfeaqizokcbthhqbcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398699.3425505-3210-133095921310572/AnsiballZ_copy.py'
Nov 29 06:45:00 compute-0 sudo[209421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:45:00 compute-0 python3.9[209423]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398699.3425505-3210-133095921310572/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:45:00 compute-0 sudo[209421]: pam_unix(sudo:session): session closed for user root
Nov 29 06:45:01 compute-0 sudo[209573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lasoccdbeuiorkdlmcgnxohbemxptavl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398700.8571868-3258-202239321237023/AnsiballZ_file.py'
Nov 29 06:45:01 compute-0 sudo[209573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:45:01 compute-0 python3.9[209575]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:45:01 compute-0 sudo[209573]: pam_unix(sudo:session): session closed for user root
Nov 29 06:45:01 compute-0 sudo[209725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvkbvauddlaidykhgfwwnhgjimbnmlti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398701.6431878-3282-102684359572440/AnsiballZ_stat.py'
Nov 29 06:45:01 compute-0 sudo[209725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:45:02 compute-0 python3.9[209727]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:45:02 compute-0 sudo[209725]: pam_unix(sudo:session): session closed for user root
Nov 29 06:45:02 compute-0 sudo[209803]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuiqjxqcgmlrcauladfmopkxidixfsdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398701.6431878-3282-102684359572440/AnsiballZ_file.py'
Nov 29 06:45:02 compute-0 sudo[209803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:45:02 compute-0 python3.9[209805]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:45:02 compute-0 sudo[209803]: pam_unix(sudo:session): session closed for user root
Nov 29 06:45:03 compute-0 sudo[209955]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jshpvzgvkmtlwphsbacbhrinmpwhjttf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398702.911203-3318-25985672598658/AnsiballZ_stat.py'
Nov 29 06:45:03 compute-0 sudo[209955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:45:03 compute-0 python3.9[209957]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:45:03 compute-0 sudo[209955]: pam_unix(sudo:session): session closed for user root
Nov 29 06:45:03 compute-0 sudo[210033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xeklchgcpsahxnwdllqvfcbkrjzhdaqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398702.911203-3318-25985672598658/AnsiballZ_file.py'
Nov 29 06:45:03 compute-0 sudo[210033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:45:03 compute-0 python3.9[210035]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.a80a3qjp recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:45:03 compute-0 sudo[210033]: pam_unix(sudo:session): session closed for user root
Nov 29 06:45:04 compute-0 sudo[210185]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iddbjrrxwguowrlkaxyvdogwnenospqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398704.2140782-3354-276477364568751/AnsiballZ_stat.py'
Nov 29 06:45:04 compute-0 sudo[210185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:45:04 compute-0 python3.9[210187]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:45:04 compute-0 sudo[210185]: pam_unix(sudo:session): session closed for user root
Nov 29 06:45:04 compute-0 sudo[210263]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etrclkrfsvpwthnbtpjenacohwgnvyec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398704.2140782-3354-276477364568751/AnsiballZ_file.py'
Nov 29 06:45:04 compute-0 sudo[210263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:45:05 compute-0 python3.9[210265]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:45:05 compute-0 sudo[210263]: pam_unix(sudo:session): session closed for user root
Nov 29 06:45:05 compute-0 sudo[210415]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-giosymbsjcmayhtbswfvxelcurwfvdch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398705.6189642-3393-89092666562798/AnsiballZ_command.py'
Nov 29 06:45:05 compute-0 sudo[210415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:45:06 compute-0 python3.9[210417]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:45:06 compute-0 sudo[210415]: pam_unix(sudo:session): session closed for user root
Nov 29 06:45:06 compute-0 sudo[210568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhaswsearcljkvnfonilzcwbmdqjstzv ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764398706.361737-3417-174082755135914/AnsiballZ_edpm_nftables_from_files.py'
Nov 29 06:45:06 compute-0 sudo[210568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:45:07 compute-0 python3[210570]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 29 06:45:07 compute-0 sudo[210568]: pam_unix(sudo:session): session closed for user root
Nov 29 06:45:07 compute-0 sudo[210740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lejsqlpszfjfrvultbcarosteonewoww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398707.3894138-3441-92812289030022/AnsiballZ_stat.py'
Nov 29 06:45:07 compute-0 sudo[210740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:45:07 compute-0 podman[210694]: 2025-11-29 06:45:07.742616207 +0000 UTC m=+0.066521549 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, version=9.6, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_id=edpm, io.openshift.expose-services=, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 29 06:45:07 compute-0 podman[210695]: 2025-11-29 06:45:07.765627648 +0000 UTC m=+0.081625837 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 06:45:07 compute-0 python3.9[210748]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:45:07 compute-0 sudo[210740]: pam_unix(sudo:session): session closed for user root
Nov 29 06:45:08 compute-0 sudo[210838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btqjkcvxkzpzkzvrcupejdrcgmzyxodk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398707.3894138-3441-92812289030022/AnsiballZ_file.py'
Nov 29 06:45:08 compute-0 sudo[210838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:45:08 compute-0 python3.9[210840]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:45:08 compute-0 sudo[210838]: pam_unix(sudo:session): session closed for user root
Nov 29 06:45:09 compute-0 sudo[210990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfvsgjcrwxzmgtenkgmojdrchnhrbfqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398708.780137-3477-162103742773296/AnsiballZ_stat.py'
Nov 29 06:45:09 compute-0 sudo[210990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:45:09 compute-0 python3.9[210992]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:45:09 compute-0 sudo[210990]: pam_unix(sudo:session): session closed for user root
Nov 29 06:45:09 compute-0 sudo[211068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofbhnljynyvywwejlwtitfnzlqveklhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398708.780137-3477-162103742773296/AnsiballZ_file.py'
Nov 29 06:45:09 compute-0 sudo[211068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:45:09 compute-0 python3.9[211070]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:45:09 compute-0 sudo[211068]: pam_unix(sudo:session): session closed for user root
Nov 29 06:45:10 compute-0 sudo[211233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yinkizzvnhfvbwqxszfblkidkdwlxrey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398710.0012212-3513-123538981879347/AnsiballZ_stat.py'
Nov 29 06:45:10 compute-0 sudo[211233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:45:10 compute-0 podman[211194]: 2025-11-29 06:45:10.370479165 +0000 UTC m=+0.044572996 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:45:10 compute-0 python3.9[211241]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:45:10 compute-0 sudo[211233]: pam_unix(sudo:session): session closed for user root
Nov 29 06:45:10 compute-0 sudo[211317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkneyireyljcazxjqlgldmzkcalnvpyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398710.0012212-3513-123538981879347/AnsiballZ_file.py'
Nov 29 06:45:10 compute-0 sudo[211317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:45:10 compute-0 python3.9[211319]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:45:10 compute-0 sudo[211317]: pam_unix(sudo:session): session closed for user root
Nov 29 06:45:11 compute-0 sudo[211469]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eacojzmxotxhchqvcwktrnmhixfdorcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398711.2364295-3549-141652239874701/AnsiballZ_stat.py'
Nov 29 06:45:11 compute-0 sudo[211469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:45:11 compute-0 python3.9[211471]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:45:11 compute-0 sudo[211469]: pam_unix(sudo:session): session closed for user root
Nov 29 06:45:11 compute-0 sudo[211547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyzxintfqqtrapubbegwacrfqzwayxbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398711.2364295-3549-141652239874701/AnsiballZ_file.py'
Nov 29 06:45:11 compute-0 sudo[211547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:45:12 compute-0 python3.9[211549]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:45:12 compute-0 sudo[211547]: pam_unix(sudo:session): session closed for user root
Nov 29 06:45:12 compute-0 sudo[211699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjmkvwszykpedeqagmqacknhaozoekja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398712.4408145-3585-276760755171010/AnsiballZ_stat.py'
Nov 29 06:45:12 compute-0 sudo[211699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:45:13 compute-0 python3.9[211701]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 06:45:13 compute-0 sudo[211699]: pam_unix(sudo:session): session closed for user root
Nov 29 06:45:13 compute-0 sudo[211824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytaibrkplbtxhykhwkrxlmcqevteaceq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398712.4408145-3585-276760755171010/AnsiballZ_copy.py'
Nov 29 06:45:13 compute-0 sudo[211824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:45:13 compute-0 python3.9[211826]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398712.4408145-3585-276760755171010/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:45:13 compute-0 sudo[211824]: pam_unix(sudo:session): session closed for user root
Nov 29 06:45:14 compute-0 sudo[211976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utdrmgwjfcszoaizgcfkwtwhwtmokgnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398713.9070635-3630-41082774181734/AnsiballZ_file.py'
Nov 29 06:45:14 compute-0 sudo[211976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:45:14 compute-0 python3.9[211978]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:45:14 compute-0 sudo[211976]: pam_unix(sudo:session): session closed for user root
Nov 29 06:45:15 compute-0 sudo[212128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pabpdnqqscxrobwgqeqasouekqvguemm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398714.6987035-3654-275262131258516/AnsiballZ_command.py'
Nov 29 06:45:15 compute-0 sudo[212128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:45:15 compute-0 python3.9[212130]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:45:15 compute-0 sudo[212128]: pam_unix(sudo:session): session closed for user root
Nov 29 06:45:16 compute-0 sudo[212292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csvfcwfezozzhbrxtuddarttesnwjipn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398715.5618367-3678-113060040573082/AnsiballZ_blockinfile.py'
Nov 29 06:45:16 compute-0 sudo[212292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:45:16 compute-0 podman[212257]: 2025-11-29 06:45:16.162324051 +0000 UTC m=+0.152412186 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 06:45:16 compute-0 python3.9[212303]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:45:16 compute-0 sudo[212292]: pam_unix(sudo:session): session closed for user root
Nov 29 06:45:16 compute-0 sudo[212462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbheauqrdqxdjemxzthjvissvmaprnje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398716.681058-3705-83820295636700/AnsiballZ_command.py'
Nov 29 06:45:16 compute-0 sudo[212462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:45:17 compute-0 python3.9[212464]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:45:17 compute-0 sudo[212462]: pam_unix(sudo:session): session closed for user root
Nov 29 06:45:17 compute-0 sudo[212615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijrwhldxbgfdweiigqvdcbcmtwumcyid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398717.540927-3729-239456461699656/AnsiballZ_stat.py'
Nov 29 06:45:17 compute-0 sudo[212615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:45:17 compute-0 podman[212617]: 2025-11-29 06:45:17.884412144 +0000 UTC m=+0.044488854 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 06:45:18 compute-0 python3.9[212618]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 06:45:18 compute-0 sudo[212615]: pam_unix(sudo:session): session closed for user root
Nov 29 06:45:18 compute-0 sudo[212794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvaughzzevssjskekbcuhoxcgmpqihxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398718.2806675-3753-251415140086887/AnsiballZ_command.py'
Nov 29 06:45:18 compute-0 sudo[212794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:45:18 compute-0 python3.9[212796]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 06:45:18 compute-0 sudo[212794]: pam_unix(sudo:session): session closed for user root
Nov 29 06:45:19 compute-0 sudo[212949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlozzzisbpogfkcpkgqfwgwpdtbgbxxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764398719.0607324-3777-221379655971150/AnsiballZ_file.py'
Nov 29 06:45:19 compute-0 sudo[212949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 06:45:19 compute-0 python3.9[212951]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 06:45:19 compute-0 sudo[212949]: pam_unix(sudo:session): session closed for user root
Nov 29 06:45:20 compute-0 sshd-session[187506]: Connection closed by 192.168.122.30 port 51330
Nov 29 06:45:20 compute-0 sshd-session[187503]: pam_unix(sshd:session): session closed for user zuul
Nov 29 06:45:20 compute-0 systemd-logind[788]: Session 25 logged out. Waiting for processes to exit.
Nov 29 06:45:20 compute-0 systemd[1]: session-25.scope: Deactivated successfully.
Nov 29 06:45:20 compute-0 systemd[1]: session-25.scope: Consumed 1min 35.900s CPU time.
Nov 29 06:45:20 compute-0 systemd-logind[788]: Removed session 25.
Nov 29 06:45:20 compute-0 podman[212976]: 2025-11-29 06:45:20.206947769 +0000 UTC m=+0.053021143 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 29 06:45:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:45:24.801 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:45:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:45:24.803 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:45:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:45:24.803 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:45:26 compute-0 podman[212996]: 2025-11-29 06:45:26.793166338 +0000 UTC m=+0.056351108 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 06:45:32 compute-0 sshd-session[213016]: Received disconnect from 1.214.197.163 port 39036:11: Bye Bye [preauth]
Nov 29 06:45:32 compute-0 sshd-session[213016]: Disconnected from authenticating user root 1.214.197.163 port 39036 [preauth]
Nov 29 06:45:34 compute-0 nova_compute[187185]: 2025-11-29 06:45:34.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:45:34 compute-0 nova_compute[187185]: 2025-11-29 06:45:34.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:45:35 compute-0 nova_compute[187185]: 2025-11-29 06:45:35.311 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:45:35 compute-0 nova_compute[187185]: 2025-11-29 06:45:35.332 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:45:36 compute-0 nova_compute[187185]: 2025-11-29 06:45:36.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:45:36 compute-0 nova_compute[187185]: 2025-11-29 06:45:36.318 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 06:45:36 compute-0 nova_compute[187185]: 2025-11-29 06:45:36.318 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 06:45:36 compute-0 nova_compute[187185]: 2025-11-29 06:45:36.330 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 06:45:37 compute-0 nova_compute[187185]: 2025-11-29 06:45:37.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:45:37 compute-0 nova_compute[187185]: 2025-11-29 06:45:37.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:45:37 compute-0 nova_compute[187185]: 2025-11-29 06:45:37.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:45:37 compute-0 nova_compute[187185]: 2025-11-29 06:45:37.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 06:45:38 compute-0 nova_compute[187185]: 2025-11-29 06:45:38.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:45:38 compute-0 nova_compute[187185]: 2025-11-29 06:45:38.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:45:38 compute-0 nova_compute[187185]: 2025-11-29 06:45:38.348 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:45:38 compute-0 nova_compute[187185]: 2025-11-29 06:45:38.348 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:45:38 compute-0 nova_compute[187185]: 2025-11-29 06:45:38.348 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:45:38 compute-0 nova_compute[187185]: 2025-11-29 06:45:38.349 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 06:45:38 compute-0 nova_compute[187185]: 2025-11-29 06:45:38.476 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:45:38 compute-0 nova_compute[187185]: 2025-11-29 06:45:38.478 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6053MB free_disk=73.37817001342773GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 06:45:38 compute-0 nova_compute[187185]: 2025-11-29 06:45:38.478 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:45:38 compute-0 nova_compute[187185]: 2025-11-29 06:45:38.478 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:45:38 compute-0 nova_compute[187185]: 2025-11-29 06:45:38.534 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 06:45:38 compute-0 nova_compute[187185]: 2025-11-29 06:45:38.534 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 06:45:38 compute-0 nova_compute[187185]: 2025-11-29 06:45:38.561 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:45:38 compute-0 nova_compute[187185]: 2025-11-29 06:45:38.579 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:45:38 compute-0 nova_compute[187185]: 2025-11-29 06:45:38.581 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 06:45:38 compute-0 nova_compute[187185]: 2025-11-29 06:45:38.581 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:45:38 compute-0 podman[213019]: 2025-11-29 06:45:38.789371721 +0000 UTC m=+0.048693733 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 06:45:38 compute-0 podman[213018]: 2025-11-29 06:45:38.793780848 +0000 UTC m=+0.054023476 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, distribution-scope=public, container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-type=git, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.buildah.version=1.33.7, architecture=x86_64, io.openshift.expose-services=, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41)
Nov 29 06:45:40 compute-0 podman[213057]: 2025-11-29 06:45:40.827214941 +0000 UTC m=+0.088261541 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 29 06:45:46 compute-0 sshd-session[213076]: Invalid user Test from 45.202.211.6 port 49734
Nov 29 06:45:46 compute-0 sshd-session[213076]: Received disconnect from 45.202.211.6 port 49734:11: Bye Bye [preauth]
Nov 29 06:45:46 compute-0 sshd-session[213076]: Disconnected from invalid user Test 45.202.211.6 port 49734 [preauth]
Nov 29 06:45:46 compute-0 podman[213078]: 2025-11-29 06:45:46.859304561 +0000 UTC m=+0.115114695 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:45:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:45:47.989 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 06:45:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:45:47.991 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 06:45:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:45:47.993 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:45:48 compute-0 podman[213104]: 2025-11-29 06:45:48.78592795 +0000 UTC m=+0.051074841 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 06:45:50 compute-0 podman[213129]: 2025-11-29 06:45:50.82510811 +0000 UTC m=+0.084194705 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 29 06:45:57 compute-0 sshd-session[213150]: Invalid user devuser from 179.125.24.202 port 44146
Nov 29 06:45:57 compute-0 podman[213152]: 2025-11-29 06:45:57.588236382 +0000 UTC m=+0.066665530 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 06:45:57 compute-0 sshd-session[213150]: Received disconnect from 179.125.24.202 port 44146:11: Bye Bye [preauth]
Nov 29 06:45:57 compute-0 sshd-session[213150]: Disconnected from invalid user devuser 179.125.24.202 port 44146 [preauth]
Nov 29 06:46:03 compute-0 sshd-session[213172]: Received disconnect from 160.202.8.218 port 33872:11: Bye Bye [preauth]
Nov 29 06:46:03 compute-0 sshd-session[213172]: Disconnected from authenticating user ftp 160.202.8.218 port 33872 [preauth]
Nov 29 06:46:09 compute-0 podman[213175]: 2025-11-29 06:46:09.781867384 +0000 UTC m=+0.052055580 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 06:46:09 compute-0 podman[213174]: 2025-11-29 06:46:09.783426539 +0000 UTC m=+0.055927181 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=edpm, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., release=1755695350, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git)
Nov 29 06:46:11 compute-0 podman[213219]: 2025-11-29 06:46:11.788102755 +0000 UTC m=+0.051795002 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 06:46:17 compute-0 podman[213238]: 2025-11-29 06:46:17.833019813 +0000 UTC m=+0.103000306 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 29 06:46:19 compute-0 podman[213264]: 2025-11-29 06:46:19.801149176 +0000 UTC m=+0.060535673 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 06:46:21 compute-0 podman[213288]: 2025-11-29 06:46:21.808516061 +0000 UTC m=+0.070903613 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, io.buildah.version=1.41.3)
Nov 29 06:46:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:46:24.802 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:46:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:46:24.803 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:46:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:46:24.803 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:46:27 compute-0 podman[213309]: 2025-11-29 06:46:27.789456396 +0000 UTC m=+0.057678841 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible)
Nov 29 06:46:33 compute-0 nova_compute[187185]: 2025-11-29 06:46:33.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:46:33 compute-0 nova_compute[187185]: 2025-11-29 06:46:33.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 29 06:46:33 compute-0 nova_compute[187185]: 2025-11-29 06:46:33.341 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 29 06:46:33 compute-0 nova_compute[187185]: 2025-11-29 06:46:33.342 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:46:33 compute-0 nova_compute[187185]: 2025-11-29 06:46:33.343 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 29 06:46:33 compute-0 nova_compute[187185]: 2025-11-29 06:46:33.358 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:46:35 compute-0 nova_compute[187185]: 2025-11-29 06:46:35.389 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:46:35 compute-0 nova_compute[187185]: 2025-11-29 06:46:35.390 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:46:36 compute-0 nova_compute[187185]: 2025-11-29 06:46:36.318 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:46:36 compute-0 nova_compute[187185]: 2025-11-29 06:46:36.319 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 06:46:36 compute-0 nova_compute[187185]: 2025-11-29 06:46:36.319 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 06:46:37 compute-0 nova_compute[187185]: 2025-11-29 06:46:37.237 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 06:46:37 compute-0 nova_compute[187185]: 2025-11-29 06:46:37.237 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:46:37 compute-0 nova_compute[187185]: 2025-11-29 06:46:37.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:46:37 compute-0 nova_compute[187185]: 2025-11-29 06:46:37.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:46:37 compute-0 nova_compute[187185]: 2025-11-29 06:46:37.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 06:46:38 compute-0 nova_compute[187185]: 2025-11-29 06:46:38.311 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:46:39 compute-0 nova_compute[187185]: 2025-11-29 06:46:39.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:46:40 compute-0 podman[213330]: 2025-11-29 06:46:40.789466146 +0000 UTC m=+0.060800417 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vcs-type=git, release=1755695350, container_name=openstack_network_exporter, name=ubi9-minimal, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 29 06:46:40 compute-0 podman[213331]: 2025-11-29 06:46:40.795083332 +0000 UTC m=+0.059581070 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 06:46:42 compute-0 podman[213373]: 2025-11-29 06:46:42.831111856 +0000 UTC m=+0.078658957 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:46:44 compute-0 nova_compute[187185]: 2025-11-29 06:46:44.606 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:46:44 compute-0 nova_compute[187185]: 2025-11-29 06:46:44.607 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:46:44 compute-0 nova_compute[187185]: 2025-11-29 06:46:44.607 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:46:44 compute-0 nova_compute[187185]: 2025-11-29 06:46:44.608 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 06:46:44 compute-0 nova_compute[187185]: 2025-11-29 06:46:44.798 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:46:44 compute-0 nova_compute[187185]: 2025-11-29 06:46:44.800 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6069MB free_disk=73.37719345092773GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 06:46:44 compute-0 nova_compute[187185]: 2025-11-29 06:46:44.800 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:46:44 compute-0 nova_compute[187185]: 2025-11-29 06:46:44.800 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:46:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:46:47.976 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:46:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:46:47.976 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:46:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:46:47.976 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:46:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:46:47.977 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:46:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:46:47.977 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:46:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:46:47.977 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:46:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:46:47.977 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:46:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:46:47.977 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:46:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:46:47.977 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:46:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:46:47.977 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:46:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:46:47.977 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:46:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:46:47.977 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:46:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:46:47.977 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:46:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:46:47.978 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:46:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:46:47.978 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:46:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:46:47.978 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:46:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:46:47.978 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:46:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:46:47.978 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:46:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:46:47.978 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:46:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:46:47.978 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:46:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:46:47.978 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:46:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:46:47.978 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:46:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:46:47.979 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:46:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:46:47.979 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:46:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:46:47.979 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:46:48 compute-0 podman[213394]: 2025-11-29 06:46:48.793145971 +0000 UTC m=+0.066214177 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 06:46:49 compute-0 sshd-session[213392]: Received disconnect from 103.179.56.44 port 58764:11: Bye Bye [preauth]
Nov 29 06:46:49 compute-0 sshd-session[213392]: Disconnected from authenticating user root 103.179.56.44 port 58764 [preauth]
Nov 29 06:46:50 compute-0 nova_compute[187185]: 2025-11-29 06:46:50.347 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 06:46:50 compute-0 nova_compute[187185]: 2025-11-29 06:46:50.348 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 06:46:50 compute-0 nova_compute[187185]: 2025-11-29 06:46:50.437 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Refreshing inventories for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 29 06:46:50 compute-0 nova_compute[187185]: 2025-11-29 06:46:50.502 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Updating ProviderTree inventory for provider 4e39a026-df39-4e20-874a-dbb5a40df044 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 29 06:46:50 compute-0 nova_compute[187185]: 2025-11-29 06:46:50.502 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Updating inventory in ProviderTree for provider 4e39a026-df39-4e20-874a-dbb5a40df044 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 29 06:46:50 compute-0 nova_compute[187185]: 2025-11-29 06:46:50.518 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Refreshing aggregate associations for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 29 06:46:50 compute-0 nova_compute[187185]: 2025-11-29 06:46:50.540 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Refreshing trait associations for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044, traits: HW_CPU_X86_SSE,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_TRUSTED_CERTS,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 29 06:46:50 compute-0 nova_compute[187185]: 2025-11-29 06:46:50.564 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:46:50 compute-0 nova_compute[187185]: 2025-11-29 06:46:50.710 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:46:50 compute-0 nova_compute[187185]: 2025-11-29 06:46:50.712 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 06:46:50 compute-0 nova_compute[187185]: 2025-11-29 06:46:50.713 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 5.912s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:46:50 compute-0 podman[213420]: 2025-11-29 06:46:50.820652172 +0000 UTC m=+0.072523604 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 06:46:51 compute-0 nova_compute[187185]: 2025-11-29 06:46:51.713 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:46:52 compute-0 podman[213444]: 2025-11-29 06:46:52.800897319 +0000 UTC m=+0.068227867 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 06:46:58 compute-0 podman[213464]: 2025-11-29 06:46:58.803505139 +0000 UTC m=+0.064933578 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:47:01 compute-0 sshd-session[213485]: Received disconnect from 45.202.211.6 port 52182:11: Bye Bye [preauth]
Nov 29 06:47:01 compute-0 sshd-session[213485]: Disconnected from authenticating user root 45.202.211.6 port 52182 [preauth]
Nov 29 06:47:07 compute-0 sshd-session[213487]: Invalid user aa from 1.214.197.163 port 40434
Nov 29 06:47:07 compute-0 sshd-session[213487]: Received disconnect from 1.214.197.163 port 40434:11: Bye Bye [preauth]
Nov 29 06:47:07 compute-0 sshd-session[213487]: Disconnected from invalid user aa 1.214.197.163 port 40434 [preauth]
Nov 29 06:47:11 compute-0 podman[213490]: 2025-11-29 06:47:11.789633863 +0000 UTC m=+0.058385374 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 06:47:11 compute-0 podman[213489]: 2025-11-29 06:47:11.803550086 +0000 UTC m=+0.076339147 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, container_name=openstack_network_exporter, managed_by=edpm_ansible, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vcs-type=git, vendor=Red Hat, Inc., config_id=edpm, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6)
Nov 29 06:47:13 compute-0 podman[213534]: 2025-11-29 06:47:13.792917474 +0000 UTC m=+0.060632631 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:47:19 compute-0 podman[213553]: 2025-11-29 06:47:19.816786316 +0000 UTC m=+0.077244434 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 06:47:21 compute-0 podman[213579]: 2025-11-29 06:47:21.77770209 +0000 UTC m=+0.050002705 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 06:47:23 compute-0 podman[213604]: 2025-11-29 06:47:23.823813603 +0000 UTC m=+0.080985435 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:47:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:47:24.804 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:47:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:47:24.804 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:47:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:47:24.805 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:47:26 compute-0 sshd-session[213624]: Received disconnect from 179.125.24.202 port 44806:11: Bye Bye [preauth]
Nov 29 06:47:26 compute-0 sshd-session[213624]: Disconnected from authenticating user root 179.125.24.202 port 44806 [preauth]
Nov 29 06:47:29 compute-0 podman[213626]: 2025-11-29 06:47:29.826779903 +0000 UTC m=+0.079181781 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd)
Nov 29 06:47:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:47:30.684 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 06:47:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:47:30.685 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 06:47:36 compute-0 nova_compute[187185]: 2025-11-29 06:47:36.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:36 compute-0 nova_compute[187185]: 2025-11-29 06:47:36.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 06:47:36 compute-0 nova_compute[187185]: 2025-11-29 06:47:36.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 06:47:36 compute-0 nova_compute[187185]: 2025-11-29 06:47:36.335 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 06:47:36 compute-0 nova_compute[187185]: 2025-11-29 06:47:36.336 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:37 compute-0 nova_compute[187185]: 2025-11-29 06:47:37.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:37 compute-0 sshd-session[213647]: Received disconnect from 160.202.8.218 port 55616:11: Bye Bye [preauth]
Nov 29 06:47:37 compute-0 sshd-session[213647]: Disconnected from authenticating user root 160.202.8.218 port 55616 [preauth]
Nov 29 06:47:38 compute-0 nova_compute[187185]: 2025-11-29 06:47:38.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:38 compute-0 nova_compute[187185]: 2025-11-29 06:47:38.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:38 compute-0 nova_compute[187185]: 2025-11-29 06:47:38.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 06:47:38 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:47:38.687 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:47:39 compute-0 nova_compute[187185]: 2025-11-29 06:47:39.311 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:39 compute-0 nova_compute[187185]: 2025-11-29 06:47:39.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:40 compute-0 nova_compute[187185]: 2025-11-29 06:47:40.310 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:40 compute-0 nova_compute[187185]: 2025-11-29 06:47:40.403 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:40 compute-0 nova_compute[187185]: 2025-11-29 06:47:40.434 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:47:40 compute-0 nova_compute[187185]: 2025-11-29 06:47:40.435 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:47:40 compute-0 nova_compute[187185]: 2025-11-29 06:47:40.435 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:47:40 compute-0 nova_compute[187185]: 2025-11-29 06:47:40.436 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 06:47:40 compute-0 nova_compute[187185]: 2025-11-29 06:47:40.588 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:47:40 compute-0 nova_compute[187185]: 2025-11-29 06:47:40.589 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6072MB free_disk=73.3772964477539GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 06:47:40 compute-0 nova_compute[187185]: 2025-11-29 06:47:40.590 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:47:40 compute-0 nova_compute[187185]: 2025-11-29 06:47:40.590 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:47:40 compute-0 nova_compute[187185]: 2025-11-29 06:47:40.804 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 06:47:40 compute-0 nova_compute[187185]: 2025-11-29 06:47:40.804 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 06:47:40 compute-0 nova_compute[187185]: 2025-11-29 06:47:40.823 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:47:41 compute-0 nova_compute[187185]: 2025-11-29 06:47:41.222 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:47:41 compute-0 nova_compute[187185]: 2025-11-29 06:47:41.224 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 06:47:41 compute-0 nova_compute[187185]: 2025-11-29 06:47:41.224 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.634s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:47:42 compute-0 nova_compute[187185]: 2025-11-29 06:47:42.136 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:47:42 compute-0 podman[213649]: 2025-11-29 06:47:42.792033158 +0000 UTC m=+0.059906837 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, config_id=edpm, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7)
Nov 29 06:47:42 compute-0 podman[213650]: 2025-11-29 06:47:42.818122547 +0000 UTC m=+0.066886753 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 06:47:44 compute-0 podman[213694]: 2025-11-29 06:47:44.778516245 +0000 UTC m=+0.048397887 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 29 06:47:48 compute-0 nova_compute[187185]: 2025-11-29 06:47:48.355 187189 DEBUG oslo_concurrency.lockutils [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Acquiring lock "afe4ae44-2787-45ec-8e0b-72fa7297cebb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:47:48 compute-0 nova_compute[187185]: 2025-11-29 06:47:48.356 187189 DEBUG oslo_concurrency.lockutils [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "afe4ae44-2787-45ec-8e0b-72fa7297cebb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:47:48 compute-0 nova_compute[187185]: 2025-11-29 06:47:48.382 187189 DEBUG nova.compute.manager [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 06:47:48 compute-0 nova_compute[187185]: 2025-11-29 06:47:48.516 187189 DEBUG oslo_concurrency.lockutils [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:47:48 compute-0 nova_compute[187185]: 2025-11-29 06:47:48.516 187189 DEBUG oslo_concurrency.lockutils [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:47:48 compute-0 nova_compute[187185]: 2025-11-29 06:47:48.522 187189 DEBUG nova.virt.hardware [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 06:47:48 compute-0 nova_compute[187185]: 2025-11-29 06:47:48.522 187189 INFO nova.compute.claims [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Claim successful on node compute-0.ctlplane.example.com
Nov 29 06:47:48 compute-0 nova_compute[187185]: 2025-11-29 06:47:48.655 187189 DEBUG nova.compute.provider_tree [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:47:48 compute-0 nova_compute[187185]: 2025-11-29 06:47:48.682 187189 DEBUG nova.scheduler.client.report [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:47:48 compute-0 nova_compute[187185]: 2025-11-29 06:47:48.711 187189 DEBUG oslo_concurrency.lockutils [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.195s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:47:48 compute-0 nova_compute[187185]: 2025-11-29 06:47:48.713 187189 DEBUG nova.compute.manager [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 06:47:48 compute-0 nova_compute[187185]: 2025-11-29 06:47:48.950 187189 DEBUG nova.compute.manager [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948
Nov 29 06:47:48 compute-0 nova_compute[187185]: 2025-11-29 06:47:48.982 187189 INFO nova.virt.libvirt.driver [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 06:47:49 compute-0 nova_compute[187185]: 2025-11-29 06:47:49.010 187189 DEBUG nova.compute.manager [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 06:47:49 compute-0 nova_compute[187185]: 2025-11-29 06:47:49.140 187189 DEBUG nova.compute.manager [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 06:47:49 compute-0 nova_compute[187185]: 2025-11-29 06:47:49.142 187189 DEBUG nova.virt.libvirt.driver [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 06:47:49 compute-0 nova_compute[187185]: 2025-11-29 06:47:49.143 187189 INFO nova.virt.libvirt.driver [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Creating image(s)
Nov 29 06:47:49 compute-0 nova_compute[187185]: 2025-11-29 06:47:49.144 187189 DEBUG oslo_concurrency.lockutils [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Acquiring lock "/var/lib/nova/instances/afe4ae44-2787-45ec-8e0b-72fa7297cebb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:47:49 compute-0 nova_compute[187185]: 2025-11-29 06:47:49.145 187189 DEBUG oslo_concurrency.lockutils [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "/var/lib/nova/instances/afe4ae44-2787-45ec-8e0b-72fa7297cebb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:47:49 compute-0 nova_compute[187185]: 2025-11-29 06:47:49.146 187189 DEBUG oslo_concurrency.lockutils [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "/var/lib/nova/instances/afe4ae44-2787-45ec-8e0b-72fa7297cebb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:47:49 compute-0 nova_compute[187185]: 2025-11-29 06:47:49.147 187189 DEBUG oslo_concurrency.lockutils [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:47:49 compute-0 nova_compute[187185]: 2025-11-29 06:47:49.148 187189 DEBUG oslo_concurrency.lockutils [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:47:50 compute-0 podman[213713]: 2025-11-29 06:47:50.838773687 +0000 UTC m=+0.102850313 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:47:51 compute-0 nova_compute[187185]: 2025-11-29 06:47:51.089 187189 DEBUG oslo_concurrency.processutils [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:47:51 compute-0 nova_compute[187185]: 2025-11-29 06:47:51.144 187189 DEBUG oslo_concurrency.processutils [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28.part --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:47:51 compute-0 nova_compute[187185]: 2025-11-29 06:47:51.145 187189 DEBUG nova.virt.images [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] 5d270706-931c-4fd1-846d-ba6ddeac2a79 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Nov 29 06:47:51 compute-0 nova_compute[187185]: 2025-11-29 06:47:51.146 187189 DEBUG nova.privsep.utils [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 29 06:47:51 compute-0 nova_compute[187185]: 2025-11-29 06:47:51.146 187189 DEBUG oslo_concurrency.processutils [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28.part /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:47:51 compute-0 nova_compute[187185]: 2025-11-29 06:47:51.391 187189 DEBUG oslo_concurrency.processutils [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28.part /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28.converted" returned: 0 in 0.245s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:47:51 compute-0 nova_compute[187185]: 2025-11-29 06:47:51.397 187189 DEBUG oslo_concurrency.processutils [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:47:51 compute-0 nova_compute[187185]: 2025-11-29 06:47:51.447 187189 DEBUG oslo_concurrency.processutils [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28.converted --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:47:51 compute-0 nova_compute[187185]: 2025-11-29 06:47:51.449 187189 DEBUG oslo_concurrency.lockutils [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.301s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:47:51 compute-0 nova_compute[187185]: 2025-11-29 06:47:51.467 187189 INFO oslo.privsep.daemon [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmp6bdy0ttv/privsep.sock']
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.163 187189 INFO oslo.privsep.daemon [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Spawned new privsep daemon via rootwrap
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.052 213758 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.055 213758 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.057 213758 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.057 213758 INFO oslo.privsep.daemon [-] privsep daemon running as pid 213758
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.268 187189 DEBUG oslo_concurrency.processutils [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.318 187189 DEBUG oslo_concurrency.processutils [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.319 187189 DEBUG oslo_concurrency.lockutils [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.320 187189 DEBUG oslo_concurrency.lockutils [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.337 187189 DEBUG oslo_concurrency.processutils [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.387 187189 DEBUG oslo_concurrency.processutils [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.388 187189 DEBUG oslo_concurrency.processutils [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/afe4ae44-2787-45ec-8e0b-72fa7297cebb/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.571 187189 DEBUG oslo_concurrency.processutils [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/afe4ae44-2787-45ec-8e0b-72fa7297cebb/disk 1073741824" returned: 0 in 0.183s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.572 187189 DEBUG oslo_concurrency.lockutils [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.252s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.573 187189 DEBUG oslo_concurrency.processutils [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.663 187189 DEBUG oslo_concurrency.processutils [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.665 187189 DEBUG nova.virt.disk.api [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Checking if we can resize image /var/lib/nova/instances/afe4ae44-2787-45ec-8e0b-72fa7297cebb/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.666 187189 DEBUG oslo_concurrency.processutils [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/afe4ae44-2787-45ec-8e0b-72fa7297cebb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.720 187189 DEBUG oslo_concurrency.processutils [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/afe4ae44-2787-45ec-8e0b-72fa7297cebb/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.721 187189 DEBUG nova.virt.disk.api [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Cannot resize image /var/lib/nova/instances/afe4ae44-2787-45ec-8e0b-72fa7297cebb/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.722 187189 DEBUG nova.objects.instance [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lazy-loading 'migration_context' on Instance uuid afe4ae44-2787-45ec-8e0b-72fa7297cebb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.743 187189 DEBUG nova.virt.libvirt.driver [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.743 187189 DEBUG nova.virt.libvirt.driver [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Ensure instance console log exists: /var/lib/nova/instances/afe4ae44-2787-45ec-8e0b-72fa7297cebb/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.743 187189 DEBUG oslo_concurrency.lockutils [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.744 187189 DEBUG oslo_concurrency.lockutils [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.744 187189 DEBUG oslo_concurrency.lockutils [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.746 187189 DEBUG nova.virt.libvirt.driver [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.752 187189 WARNING nova.virt.libvirt.driver [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.757 187189 DEBUG nova.virt.libvirt.host [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.758 187189 DEBUG nova.virt.libvirt.host [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.761 187189 DEBUG nova.virt.libvirt.host [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.762 187189 DEBUG nova.virt.libvirt.host [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.764 187189 DEBUG nova.virt.libvirt.driver [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.765 187189 DEBUG nova.virt.hardware [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.766 187189 DEBUG nova.virt.hardware [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.766 187189 DEBUG nova.virt.hardware [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.767 187189 DEBUG nova.virt.hardware [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.767 187189 DEBUG nova.virt.hardware [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.767 187189 DEBUG nova.virt.hardware [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.768 187189 DEBUG nova.virt.hardware [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.768 187189 DEBUG nova.virt.hardware [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.769 187189 DEBUG nova.virt.hardware [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.769 187189 DEBUG nova.virt.hardware [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.770 187189 DEBUG nova.virt.hardware [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.774 187189 DEBUG nova.privsep.utils [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.776 187189 DEBUG nova.objects.instance [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lazy-loading 'pci_devices' on Instance uuid afe4ae44-2787-45ec-8e0b-72fa7297cebb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:47:52 compute-0 podman[213773]: 2025-11-29 06:47:52.793017373 +0000 UTC m=+0.060130653 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.795 187189 DEBUG nova.virt.libvirt.driver [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] End _get_guest_xml xml=<domain type="kvm">
Nov 29 06:47:52 compute-0 nova_compute[187185]:   <uuid>afe4ae44-2787-45ec-8e0b-72fa7297cebb</uuid>
Nov 29 06:47:52 compute-0 nova_compute[187185]:   <name>instance-00000002</name>
Nov 29 06:47:52 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 06:47:52 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 06:47:52 compute-0 nova_compute[187185]:   <metadata>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 06:47:52 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:       <nova:name>tempest-AutoAllocateNetworkTest-server-1451606376</nova:name>
Nov 29 06:47:52 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 06:47:52</nova:creationTime>
Nov 29 06:47:52 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 06:47:52 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 06:47:52 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 06:47:52 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 06:47:52 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 06:47:52 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 06:47:52 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 06:47:52 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 06:47:52 compute-0 nova_compute[187185]:         <nova:user uuid="7a31c969c2f744a9810fc9890dd7acb2">tempest-AutoAllocateNetworkTest-224859463-project-member</nova:user>
Nov 29 06:47:52 compute-0 nova_compute[187185]:         <nova:project uuid="6d2e7db012114f9eb8e8e1b0123c9974">tempest-AutoAllocateNetworkTest-224859463</nova:project>
Nov 29 06:47:52 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 06:47:52 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:       <nova:ports/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 06:47:52 compute-0 nova_compute[187185]:   </metadata>
Nov 29 06:47:52 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 06:47:52 compute-0 nova_compute[187185]:     <system>
Nov 29 06:47:52 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 06:47:52 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 06:47:52 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 06:47:52 compute-0 nova_compute[187185]:       <entry name="serial">afe4ae44-2787-45ec-8e0b-72fa7297cebb</entry>
Nov 29 06:47:52 compute-0 nova_compute[187185]:       <entry name="uuid">afe4ae44-2787-45ec-8e0b-72fa7297cebb</entry>
Nov 29 06:47:52 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     </system>
Nov 29 06:47:52 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 06:47:52 compute-0 nova_compute[187185]:   <os>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:   </os>
Nov 29 06:47:52 compute-0 nova_compute[187185]:   <features>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     <apic/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:   </features>
Nov 29 06:47:52 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 06:47:52 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:   </clock>
Nov 29 06:47:52 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 06:47:52 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:   </cpu>
Nov 29 06:47:52 compute-0 nova_compute[187185]:   <devices>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 06:47:52 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/afe4ae44-2787-45ec-8e0b-72fa7297cebb/disk"/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     </disk>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 06:47:52 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/afe4ae44-2787-45ec-8e0b-72fa7297cebb/disk.config"/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     </disk>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 06:47:52 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/afe4ae44-2787-45ec-8e0b-72fa7297cebb/console.log" append="off"/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     </serial>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     <video>
Nov 29 06:47:52 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     </video>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 06:47:52 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     </rng>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 06:47:52 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 06:47:52 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 06:47:52 compute-0 nova_compute[187185]:   </devices>
Nov 29 06:47:52 compute-0 nova_compute[187185]: </domain>
Nov 29 06:47:52 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.848 187189 DEBUG nova.virt.libvirt.driver [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.849 187189 DEBUG nova.virt.libvirt.driver [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 06:47:52 compute-0 nova_compute[187185]: 2025-11-29 06:47:52.850 187189 INFO nova.virt.libvirt.driver [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Using config drive
Nov 29 06:47:53 compute-0 nova_compute[187185]: 2025-11-29 06:47:53.461 187189 INFO nova.virt.libvirt.driver [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Creating config drive at /var/lib/nova/instances/afe4ae44-2787-45ec-8e0b-72fa7297cebb/disk.config
Nov 29 06:47:53 compute-0 nova_compute[187185]: 2025-11-29 06:47:53.470 187189 DEBUG oslo_concurrency.processutils [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/afe4ae44-2787-45ec-8e0b-72fa7297cebb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvut9hq1w execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:47:53 compute-0 nova_compute[187185]: 2025-11-29 06:47:53.594 187189 DEBUG oslo_concurrency.processutils [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/afe4ae44-2787-45ec-8e0b-72fa7297cebb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvut9hq1w" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:47:53 compute-0 systemd-machined[153486]: New machine qemu-1-instance-00000002.
Nov 29 06:47:53 compute-0 systemd[1]: Started Virtual Machine qemu-1-instance-00000002.
Nov 29 06:47:54 compute-0 podman[213823]: 2025-11-29 06:47:54.395428638 +0000 UTC m=+0.079388041 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Nov 29 06:47:54 compute-0 nova_compute[187185]: 2025-11-29 06:47:54.411 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764398874.4107895, afe4ae44-2787-45ec-8e0b-72fa7297cebb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:47:54 compute-0 nova_compute[187185]: 2025-11-29 06:47:54.413 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] VM Resumed (Lifecycle Event)
Nov 29 06:47:54 compute-0 nova_compute[187185]: 2025-11-29 06:47:54.416 187189 DEBUG nova.compute.manager [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 06:47:54 compute-0 nova_compute[187185]: 2025-11-29 06:47:54.417 187189 DEBUG nova.virt.libvirt.driver [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 06:47:54 compute-0 nova_compute[187185]: 2025-11-29 06:47:54.426 187189 INFO nova.virt.libvirt.driver [-] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Instance spawned successfully.
Nov 29 06:47:54 compute-0 nova_compute[187185]: 2025-11-29 06:47:54.427 187189 DEBUG nova.virt.libvirt.driver [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 06:47:54 compute-0 nova_compute[187185]: 2025-11-29 06:47:54.449 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:47:54 compute-0 nova_compute[187185]: 2025-11-29 06:47:54.454 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 06:47:54 compute-0 nova_compute[187185]: 2025-11-29 06:47:54.457 187189 DEBUG nova.virt.libvirt.driver [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:47:54 compute-0 nova_compute[187185]: 2025-11-29 06:47:54.457 187189 DEBUG nova.virt.libvirt.driver [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:47:54 compute-0 nova_compute[187185]: 2025-11-29 06:47:54.457 187189 DEBUG nova.virt.libvirt.driver [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:47:54 compute-0 nova_compute[187185]: 2025-11-29 06:47:54.458 187189 DEBUG nova.virt.libvirt.driver [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:47:54 compute-0 nova_compute[187185]: 2025-11-29 06:47:54.458 187189 DEBUG nova.virt.libvirt.driver [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:47:54 compute-0 nova_compute[187185]: 2025-11-29 06:47:54.459 187189 DEBUG nova.virt.libvirt.driver [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:47:54 compute-0 nova_compute[187185]: 2025-11-29 06:47:54.485 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 06:47:54 compute-0 nova_compute[187185]: 2025-11-29 06:47:54.485 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764398874.4129426, afe4ae44-2787-45ec-8e0b-72fa7297cebb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:47:54 compute-0 nova_compute[187185]: 2025-11-29 06:47:54.485 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] VM Started (Lifecycle Event)
Nov 29 06:47:54 compute-0 nova_compute[187185]: 2025-11-29 06:47:54.516 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:47:54 compute-0 nova_compute[187185]: 2025-11-29 06:47:54.519 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 06:47:54 compute-0 nova_compute[187185]: 2025-11-29 06:47:54.545 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 06:47:54 compute-0 nova_compute[187185]: 2025-11-29 06:47:54.551 187189 INFO nova.compute.manager [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Took 5.41 seconds to spawn the instance on the hypervisor.
Nov 29 06:47:54 compute-0 nova_compute[187185]: 2025-11-29 06:47:54.552 187189 DEBUG nova.compute.manager [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:47:54 compute-0 nova_compute[187185]: 2025-11-29 06:47:54.617 187189 INFO nova.compute.manager [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Took 6.18 seconds to build instance.
Nov 29 06:47:54 compute-0 nova_compute[187185]: 2025-11-29 06:47:54.637 187189 DEBUG oslo_concurrency.lockutils [None req-234e32fa-fe08-4a9f-b42f-626d3ea137e8 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "afe4ae44-2787-45ec-8e0b-72fa7297cebb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.281s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:47:59 compute-0 nova_compute[187185]: 2025-11-29 06:47:59.517 187189 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Acquiring lock "67c594e4-def3-4964-bd5e-472a63536c4c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:47:59 compute-0 nova_compute[187185]: 2025-11-29 06:47:59.517 187189 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "67c594e4-def3-4964-bd5e-472a63536c4c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:47:59 compute-0 nova_compute[187185]: 2025-11-29 06:47:59.562 187189 DEBUG nova.compute.manager [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 06:47:59 compute-0 nova_compute[187185]: 2025-11-29 06:47:59.725 187189 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:47:59 compute-0 nova_compute[187185]: 2025-11-29 06:47:59.726 187189 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:47:59 compute-0 nova_compute[187185]: 2025-11-29 06:47:59.736 187189 DEBUG nova.virt.hardware [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 06:47:59 compute-0 nova_compute[187185]: 2025-11-29 06:47:59.737 187189 INFO nova.compute.claims [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Claim successful on node compute-0.ctlplane.example.com
Nov 29 06:47:59 compute-0 nova_compute[187185]: 2025-11-29 06:47:59.907 187189 DEBUG nova.compute.provider_tree [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Updating inventory in ProviderTree for provider 4e39a026-df39-4e20-874a-dbb5a40df044 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 29 06:47:59 compute-0 nova_compute[187185]: 2025-11-29 06:47:59.966 187189 ERROR nova.scheduler.client.report [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [req-b50064de-e7b1-4f68-a889-93e7132c7f7d] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID 4e39a026-df39-4e20-874a-dbb5a40df044.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-b50064de-e7b1-4f68-a889-93e7132c7f7d"}]}
Nov 29 06:48:00 compute-0 nova_compute[187185]: 2025-11-29 06:48:00.021 187189 DEBUG nova.scheduler.client.report [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Refreshing inventories for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 29 06:48:00 compute-0 nova_compute[187185]: 2025-11-29 06:48:00.069 187189 DEBUG nova.scheduler.client.report [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Updating ProviderTree inventory for provider 4e39a026-df39-4e20-874a-dbb5a40df044 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 29 06:48:00 compute-0 nova_compute[187185]: 2025-11-29 06:48:00.070 187189 DEBUG nova.compute.provider_tree [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Updating inventory in ProviderTree for provider 4e39a026-df39-4e20-874a-dbb5a40df044 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 29 06:48:00 compute-0 nova_compute[187185]: 2025-11-29 06:48:00.094 187189 DEBUG nova.scheduler.client.report [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Refreshing aggregate associations for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 29 06:48:00 compute-0 nova_compute[187185]: 2025-11-29 06:48:00.144 187189 DEBUG nova.scheduler.client.report [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Refreshing trait associations for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044, traits: HW_CPU_X86_SSE,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_TRUSTED_CERTS,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 29 06:48:00 compute-0 nova_compute[187185]: 2025-11-29 06:48:00.242 187189 DEBUG nova.compute.provider_tree [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Updating inventory in ProviderTree for provider 4e39a026-df39-4e20-874a-dbb5a40df044 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 29 06:48:00 compute-0 nova_compute[187185]: 2025-11-29 06:48:00.291 187189 DEBUG nova.scheduler.client.report [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Updated inventory for provider 4e39a026-df39-4e20-874a-dbb5a40df044 with generation 4 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Nov 29 06:48:00 compute-0 nova_compute[187185]: 2025-11-29 06:48:00.291 187189 DEBUG nova.compute.provider_tree [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Updating resource provider 4e39a026-df39-4e20-874a-dbb5a40df044 generation from 4 to 5 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 29 06:48:00 compute-0 nova_compute[187185]: 2025-11-29 06:48:00.292 187189 DEBUG nova.compute.provider_tree [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Updating inventory in ProviderTree for provider 4e39a026-df39-4e20-874a-dbb5a40df044 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 29 06:48:00 compute-0 nova_compute[187185]: 2025-11-29 06:48:00.320 187189 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.594s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:48:00 compute-0 nova_compute[187185]: 2025-11-29 06:48:00.321 187189 DEBUG nova.compute.manager [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 06:48:00 compute-0 nova_compute[187185]: 2025-11-29 06:48:00.387 187189 DEBUG nova.compute.manager [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 06:48:00 compute-0 nova_compute[187185]: 2025-11-29 06:48:00.388 187189 DEBUG nova.network.neutron [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 06:48:00 compute-0 nova_compute[187185]: 2025-11-29 06:48:00.405 187189 INFO nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 06:48:00 compute-0 nova_compute[187185]: 2025-11-29 06:48:00.422 187189 DEBUG nova.compute.manager [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 06:48:00 compute-0 nova_compute[187185]: 2025-11-29 06:48:00.527 187189 DEBUG nova.compute.manager [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 06:48:00 compute-0 nova_compute[187185]: 2025-11-29 06:48:00.530 187189 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 06:48:00 compute-0 nova_compute[187185]: 2025-11-29 06:48:00.530 187189 INFO nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Creating image(s)
Nov 29 06:48:00 compute-0 nova_compute[187185]: 2025-11-29 06:48:00.532 187189 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Acquiring lock "/var/lib/nova/instances/67c594e4-def3-4964-bd5e-472a63536c4c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:48:00 compute-0 nova_compute[187185]: 2025-11-29 06:48:00.532 187189 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "/var/lib/nova/instances/67c594e4-def3-4964-bd5e-472a63536c4c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:48:00 compute-0 nova_compute[187185]: 2025-11-29 06:48:00.533 187189 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "/var/lib/nova/instances/67c594e4-def3-4964-bd5e-472a63536c4c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:48:00 compute-0 nova_compute[187185]: 2025-11-29 06:48:00.560 187189 DEBUG oslo_concurrency.processutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:48:00 compute-0 nova_compute[187185]: 2025-11-29 06:48:00.658 187189 DEBUG oslo_concurrency.processutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:48:00 compute-0 nova_compute[187185]: 2025-11-29 06:48:00.660 187189 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:48:00 compute-0 nova_compute[187185]: 2025-11-29 06:48:00.660 187189 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:48:00 compute-0 nova_compute[187185]: 2025-11-29 06:48:00.676 187189 DEBUG oslo_concurrency.processutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:48:00 compute-0 nova_compute[187185]: 2025-11-29 06:48:00.730 187189 DEBUG oslo_concurrency.processutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:48:00 compute-0 nova_compute[187185]: 2025-11-29 06:48:00.731 187189 DEBUG oslo_concurrency.processutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/67c594e4-def3-4964-bd5e-472a63536c4c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:48:00 compute-0 podman[213850]: 2025-11-29 06:48:00.792000895 +0000 UTC m=+0.058692581 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:48:00 compute-0 nova_compute[187185]: 2025-11-29 06:48:00.854 187189 DEBUG nova.network.neutron [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Automatically allocating a network for project 6d2e7db012114f9eb8e8e1b0123c9974. _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2460
Nov 29 06:48:01 compute-0 nova_compute[187185]: 2025-11-29 06:48:01.116 187189 DEBUG oslo_concurrency.processutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/67c594e4-def3-4964-bd5e-472a63536c4c/disk 1073741824" returned: 0 in 0.385s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:48:01 compute-0 nova_compute[187185]: 2025-11-29 06:48:01.117 187189 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.456s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:48:01 compute-0 nova_compute[187185]: 2025-11-29 06:48:01.117 187189 DEBUG oslo_concurrency.processutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:48:01 compute-0 nova_compute[187185]: 2025-11-29 06:48:01.168 187189 DEBUG oslo_concurrency.processutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:48:01 compute-0 nova_compute[187185]: 2025-11-29 06:48:01.169 187189 DEBUG nova.virt.disk.api [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Checking if we can resize image /var/lib/nova/instances/67c594e4-def3-4964-bd5e-472a63536c4c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 06:48:01 compute-0 nova_compute[187185]: 2025-11-29 06:48:01.169 187189 DEBUG oslo_concurrency.processutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/67c594e4-def3-4964-bd5e-472a63536c4c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:48:01 compute-0 nova_compute[187185]: 2025-11-29 06:48:01.223 187189 DEBUG oslo_concurrency.processutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/67c594e4-def3-4964-bd5e-472a63536c4c/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:48:01 compute-0 nova_compute[187185]: 2025-11-29 06:48:01.224 187189 DEBUG nova.virt.disk.api [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Cannot resize image /var/lib/nova/instances/67c594e4-def3-4964-bd5e-472a63536c4c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 06:48:01 compute-0 nova_compute[187185]: 2025-11-29 06:48:01.225 187189 DEBUG nova.objects.instance [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lazy-loading 'migration_context' on Instance uuid 67c594e4-def3-4964-bd5e-472a63536c4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:48:01 compute-0 nova_compute[187185]: 2025-11-29 06:48:01.239 187189 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 06:48:01 compute-0 nova_compute[187185]: 2025-11-29 06:48:01.240 187189 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Ensure instance console log exists: /var/lib/nova/instances/67c594e4-def3-4964-bd5e-472a63536c4c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 06:48:01 compute-0 nova_compute[187185]: 2025-11-29 06:48:01.240 187189 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:48:01 compute-0 nova_compute[187185]: 2025-11-29 06:48:01.240 187189 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:48:01 compute-0 nova_compute[187185]: 2025-11-29 06:48:01.241 187189 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:48:03 compute-0 sshd-session[213880]: Unable to negotiate with 103.152.48.69 port 58169: no matching host key type found. Their offer: ssh-rsa,ssh-dss [preauth]
Nov 29 06:48:13 compute-0 podman[213896]: 2025-11-29 06:48:13.809364399 +0000 UTC m=+0.073868379 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, name=ubi9-minimal, config_id=edpm, io.openshift.expose-services=, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, architecture=x86_64, version=9.6, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 29 06:48:13 compute-0 podman[213897]: 2025-11-29 06:48:13.83720732 +0000 UTC m=+0.085135251 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 06:48:15 compute-0 podman[213939]: 2025-11-29 06:48:15.821160882 +0000 UTC m=+0.084540113 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 29 06:48:20 compute-0 sshd-session[213960]: Invalid user app from 45.202.211.6 port 52800
Nov 29 06:48:20 compute-0 sshd-session[213960]: Received disconnect from 45.202.211.6 port 52800:11: Bye Bye [preauth]
Nov 29 06:48:20 compute-0 sshd-session[213960]: Disconnected from invalid user app 45.202.211.6 port 52800 [preauth]
Nov 29 06:48:21 compute-0 podman[213962]: 2025-11-29 06:48:21.900330553 +0000 UTC m=+0.151866767 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:48:23 compute-0 podman[213989]: 2025-11-29 06:48:23.842396342 +0000 UTC m=+0.086511031 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 06:48:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:24.805 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:48:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:24.808 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:48:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:24.808 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:48:24 compute-0 podman[214014]: 2025-11-29 06:48:24.843696167 +0000 UTC m=+0.097672109 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 06:48:30 compute-0 nova_compute[187185]: 2025-11-29 06:48:30.396 187189 DEBUG nova.network.neutron [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Automatically allocated network: {'id': '425e933e-ca72-466c-8d2b-499c7ba67318', 'name': 'auto_allocated_network', 'tenant_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'admin_state_up': True, 'mtu': 1442, 'status': 'ACTIVE', 'subnets': ['89838152-b4b2-434b-a7d9-d3f897cb4399', 'a56d2d79-817f-461e-9014-0136415cc45e'], 'shared': False, 'availability_zone_hints': [], 'availability_zones': [], 'ipv4_address_scope': None, 'ipv6_address_scope': None, 'router:external': False, 'description': '', 'qos_policy_id': None, 'port_security_enabled': True, 'dns_domain': '', 'l2_adjacency': True, 'tags': [], 'created_at': '2025-11-29T06:48:00Z', 'updated_at': '2025-11-29T06:48:09Z', 'revision_number': 4, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974'} _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2478
Nov 29 06:48:30 compute-0 nova_compute[187185]: 2025-11-29 06:48:30.408 187189 WARNING oslo_policy.policy [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Nov 29 06:48:30 compute-0 nova_compute[187185]: 2025-11-29 06:48:30.409 187189 WARNING oslo_policy.policy [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Nov 29 06:48:30 compute-0 nova_compute[187185]: 2025-11-29 06:48:30.411 187189 DEBUG nova.policy [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 06:48:31 compute-0 podman[214034]: 2025-11-29 06:48:31.808436195 +0000 UTC m=+0.076549048 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 06:48:32 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:32.196 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 06:48:32 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:32.198 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 06:48:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:33.201 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:48:33 compute-0 nova_compute[187185]: 2025-11-29 06:48:33.500 187189 DEBUG nova.network.neutron [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Successfully created port: 789be005-db43-4c16-8d31-448144c818e2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 06:48:37 compute-0 nova_compute[187185]: 2025-11-29 06:48:37.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:48:37 compute-0 nova_compute[187185]: 2025-11-29 06:48:37.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 06:48:37 compute-0 nova_compute[187185]: 2025-11-29 06:48:37.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 06:48:38 compute-0 nova_compute[187185]: 2025-11-29 06:48:38.080 187189 DEBUG nova.network.neutron [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Successfully updated port: 789be005-db43-4c16-8d31-448144c818e2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 06:48:38 compute-0 nova_compute[187185]: 2025-11-29 06:48:38.598 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Nov 29 06:48:38 compute-0 nova_compute[187185]: 2025-11-29 06:48:38.693 187189 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Acquiring lock "refresh_cache-67c594e4-def3-4964-bd5e-472a63536c4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:48:38 compute-0 nova_compute[187185]: 2025-11-29 06:48:38.693 187189 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Acquired lock "refresh_cache-67c594e4-def3-4964-bd5e-472a63536c4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:48:38 compute-0 nova_compute[187185]: 2025-11-29 06:48:38.693 187189 DEBUG nova.network.neutron [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 06:48:38 compute-0 nova_compute[187185]: 2025-11-29 06:48:38.704 187189 DEBUG nova.compute.manager [req-30be0973-b4ee-4f2d-9ec5-cd6f865f75b0 req-0dd4be9c-642d-485c-9c15-e3477bd07033 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Received event network-changed-789be005-db43-4c16-8d31-448144c818e2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:48:38 compute-0 nova_compute[187185]: 2025-11-29 06:48:38.704 187189 DEBUG nova.compute.manager [req-30be0973-b4ee-4f2d-9ec5-cd6f865f75b0 req-0dd4be9c-642d-485c-9c15-e3477bd07033 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Refreshing instance network info cache due to event network-changed-789be005-db43-4c16-8d31-448144c818e2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 06:48:38 compute-0 nova_compute[187185]: 2025-11-29 06:48:38.704 187189 DEBUG oslo_concurrency.lockutils [req-30be0973-b4ee-4f2d-9ec5-cd6f865f75b0 req-0dd4be9c-642d-485c-9c15-e3477bd07033 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-67c594e4-def3-4964-bd5e-472a63536c4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:48:39 compute-0 nova_compute[187185]: 2025-11-29 06:48:39.038 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "refresh_cache-afe4ae44-2787-45ec-8e0b-72fa7297cebb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:48:39 compute-0 nova_compute[187185]: 2025-11-29 06:48:39.039 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquired lock "refresh_cache-afe4ae44-2787-45ec-8e0b-72fa7297cebb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:48:39 compute-0 nova_compute[187185]: 2025-11-29 06:48:39.039 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 29 06:48:39 compute-0 nova_compute[187185]: 2025-11-29 06:48:39.039 187189 DEBUG nova.objects.instance [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lazy-loading 'info_cache' on Instance uuid afe4ae44-2787-45ec-8e0b-72fa7297cebb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:48:39 compute-0 nova_compute[187185]: 2025-11-29 06:48:39.159 187189 DEBUG nova.network.neutron [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 06:48:40 compute-0 nova_compute[187185]: 2025-11-29 06:48:40.236 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 06:48:41 compute-0 nova_compute[187185]: 2025-11-29 06:48:41.531 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:48:41 compute-0 nova_compute[187185]: 2025-11-29 06:48:41.729 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Releasing lock "refresh_cache-afe4ae44-2787-45ec-8e0b-72fa7297cebb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:48:41 compute-0 nova_compute[187185]: 2025-11-29 06:48:41.729 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 29 06:48:41 compute-0 nova_compute[187185]: 2025-11-29 06:48:41.731 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:48:41 compute-0 nova_compute[187185]: 2025-11-29 06:48:41.732 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:48:41 compute-0 nova_compute[187185]: 2025-11-29 06:48:41.733 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:48:41 compute-0 nova_compute[187185]: 2025-11-29 06:48:41.734 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:48:41 compute-0 nova_compute[187185]: 2025-11-29 06:48:41.734 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:48:41 compute-0 nova_compute[187185]: 2025-11-29 06:48:41.735 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:48:41 compute-0 nova_compute[187185]: 2025-11-29 06:48:41.735 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 06:48:42 compute-0 sshd-session[214069]: Received disconnect from 1.214.197.163 port 41830:11: Bye Bye [preauth]
Nov 29 06:48:42 compute-0 sshd-session[214069]: Disconnected from authenticating user root 1.214.197.163 port 41830 [preauth]
Nov 29 06:48:42 compute-0 nova_compute[187185]: 2025-11-29 06:48:42.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:48:42 compute-0 nova_compute[187185]: 2025-11-29 06:48:42.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:48:42 compute-0 nova_compute[187185]: 2025-11-29 06:48:42.385 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:48:42 compute-0 nova_compute[187185]: 2025-11-29 06:48:42.386 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:48:42 compute-0 nova_compute[187185]: 2025-11-29 06:48:42.387 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:48:42 compute-0 nova_compute[187185]: 2025-11-29 06:48:42.388 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 06:48:42 compute-0 nova_compute[187185]: 2025-11-29 06:48:42.476 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/afe4ae44-2787-45ec-8e0b-72fa7297cebb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:48:42 compute-0 nova_compute[187185]: 2025-11-29 06:48:42.562 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/afe4ae44-2787-45ec-8e0b-72fa7297cebb/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:48:42 compute-0 nova_compute[187185]: 2025-11-29 06:48:42.564 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/afe4ae44-2787-45ec-8e0b-72fa7297cebb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:48:42 compute-0 nova_compute[187185]: 2025-11-29 06:48:42.642 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/afe4ae44-2787-45ec-8e0b-72fa7297cebb/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:48:42 compute-0 nova_compute[187185]: 2025-11-29 06:48:42.801 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:48:42 compute-0 nova_compute[187185]: 2025-11-29 06:48:42.803 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5901MB free_disk=73.3142318725586GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 06:48:42 compute-0 nova_compute[187185]: 2025-11-29 06:48:42.803 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:48:42 compute-0 nova_compute[187185]: 2025-11-29 06:48:42.803 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:48:44 compute-0 podman[214079]: 2025-11-29 06:48:44.799624819 +0000 UTC m=+0.060078964 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 06:48:44 compute-0 podman[214078]: 2025-11-29 06:48:44.808939404 +0000 UTC m=+0.074317150 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.buildah.version=1.33.7, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9-minimal, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc.)
Nov 29 06:48:45 compute-0 nova_compute[187185]: 2025-11-29 06:48:45.304 187189 DEBUG nova.network.neutron [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Updating instance_info_cache with network_info: [{"id": "789be005-db43-4c16-8d31-448144c818e2", "address": "fa:16:3e:3a:47:de", "network": {"id": "425e933e-ca72-466c-8d2b-499c7ba67318", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::1f6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.40", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d2e7db012114f9eb8e8e1b0123c9974", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap789be005-db", "ovs_interfaceid": "789be005-db43-4c16-8d31-448144c818e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:48:46 compute-0 podman[214125]: 2025-11-29 06:48:46.806100649 +0000 UTC m=+0.062301428 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:48:47 compute-0 nova_compute[187185]: 2025-11-29 06:48:47.354 187189 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Releasing lock "refresh_cache-67c594e4-def3-4964-bd5e-472a63536c4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:48:47 compute-0 nova_compute[187185]: 2025-11-29 06:48:47.354 187189 DEBUG nova.compute.manager [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Instance network_info: |[{"id": "789be005-db43-4c16-8d31-448144c818e2", "address": "fa:16:3e:3a:47:de", "network": {"id": "425e933e-ca72-466c-8d2b-499c7ba67318", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::1f6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.40", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d2e7db012114f9eb8e8e1b0123c9974", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap789be005-db", "ovs_interfaceid": "789be005-db43-4c16-8d31-448144c818e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 06:48:47 compute-0 nova_compute[187185]: 2025-11-29 06:48:47.355 187189 DEBUG oslo_concurrency.lockutils [req-30be0973-b4ee-4f2d-9ec5-cd6f865f75b0 req-0dd4be9c-642d-485c-9c15-e3477bd07033 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-67c594e4-def3-4964-bd5e-472a63536c4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:48:47 compute-0 nova_compute[187185]: 2025-11-29 06:48:47.355 187189 DEBUG nova.network.neutron [req-30be0973-b4ee-4f2d-9ec5-cd6f865f75b0 req-0dd4be9c-642d-485c-9c15-e3477bd07033 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Refreshing network info cache for port 789be005-db43-4c16-8d31-448144c818e2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 06:48:47 compute-0 nova_compute[187185]: 2025-11-29 06:48:47.359 187189 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Start _get_guest_xml network_info=[{"id": "789be005-db43-4c16-8d31-448144c818e2", "address": "fa:16:3e:3a:47:de", "network": {"id": "425e933e-ca72-466c-8d2b-499c7ba67318", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::1f6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.40", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d2e7db012114f9eb8e8e1b0123c9974", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap789be005-db", "ovs_interfaceid": "789be005-db43-4c16-8d31-448144c818e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 06:48:47 compute-0 nova_compute[187185]: 2025-11-29 06:48:47.364 187189 WARNING nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:48:47 compute-0 nova_compute[187185]: 2025-11-29 06:48:47.369 187189 DEBUG nova.virt.libvirt.host [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 06:48:47 compute-0 nova_compute[187185]: 2025-11-29 06:48:47.370 187189 DEBUG nova.virt.libvirt.host [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 06:48:47 compute-0 nova_compute[187185]: 2025-11-29 06:48:47.380 187189 DEBUG nova.virt.libvirt.host [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 06:48:47 compute-0 nova_compute[187185]: 2025-11-29 06:48:47.380 187189 DEBUG nova.virt.libvirt.host [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 06:48:47 compute-0 nova_compute[187185]: 2025-11-29 06:48:47.382 187189 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 06:48:47 compute-0 nova_compute[187185]: 2025-11-29 06:48:47.383 187189 DEBUG nova.virt.hardware [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 06:48:47 compute-0 nova_compute[187185]: 2025-11-29 06:48:47.383 187189 DEBUG nova.virt.hardware [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 06:48:47 compute-0 nova_compute[187185]: 2025-11-29 06:48:47.384 187189 DEBUG nova.virt.hardware [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 06:48:47 compute-0 nova_compute[187185]: 2025-11-29 06:48:47.384 187189 DEBUG nova.virt.hardware [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 06:48:47 compute-0 nova_compute[187185]: 2025-11-29 06:48:47.384 187189 DEBUG nova.virt.hardware [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 06:48:47 compute-0 nova_compute[187185]: 2025-11-29 06:48:47.385 187189 DEBUG nova.virt.hardware [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 06:48:47 compute-0 nova_compute[187185]: 2025-11-29 06:48:47.385 187189 DEBUG nova.virt.hardware [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 06:48:47 compute-0 nova_compute[187185]: 2025-11-29 06:48:47.385 187189 DEBUG nova.virt.hardware [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 06:48:47 compute-0 nova_compute[187185]: 2025-11-29 06:48:47.386 187189 DEBUG nova.virt.hardware [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 06:48:47 compute-0 nova_compute[187185]: 2025-11-29 06:48:47.386 187189 DEBUG nova.virt.hardware [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 06:48:47 compute-0 nova_compute[187185]: 2025-11-29 06:48:47.386 187189 DEBUG nova.virt.hardware [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 06:48:47 compute-0 nova_compute[187185]: 2025-11-29 06:48:47.391 187189 DEBUG nova.virt.libvirt.vif [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:47:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-526752650-3',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-526752650-3',id=6,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=2,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6d2e7db012114f9eb8e8e1b0123c9974',ramdisk_id='',reservation_id='r-o9m0h1rn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-224859463',owner_user_name='tempest-AutoAllocateNetworkTest-224859463-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:48:00Z,user_data=None,user_id='7a31c969c2f744a9810fc9890dd7acb2',uuid=67c594e4-def3-4964-bd5e-472a63536c4c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "789be005-db43-4c16-8d31-448144c818e2", "address": "fa:16:3e:3a:47:de", "network": {"id": "425e933e-ca72-466c-8d2b-499c7ba67318", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::1f6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.40", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d2e7db012114f9eb8e8e1b0123c9974", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap789be005-db", "ovs_interfaceid": "789be005-db43-4c16-8d31-448144c818e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 06:48:47 compute-0 nova_compute[187185]: 2025-11-29 06:48:47.391 187189 DEBUG nova.network.os_vif_util [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Converting VIF {"id": "789be005-db43-4c16-8d31-448144c818e2", "address": "fa:16:3e:3a:47:de", "network": {"id": "425e933e-ca72-466c-8d2b-499c7ba67318", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::1f6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.40", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d2e7db012114f9eb8e8e1b0123c9974", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap789be005-db", "ovs_interfaceid": "789be005-db43-4c16-8d31-448144c818e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 06:48:47 compute-0 nova_compute[187185]: 2025-11-29 06:48:47.393 187189 DEBUG nova.network.os_vif_util [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3a:47:de,bridge_name='br-int',has_traffic_filtering=True,id=789be005-db43-4c16-8d31-448144c818e2,network=Network(425e933e-ca72-466c-8d2b-499c7ba67318),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap789be005-db') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 06:48:47 compute-0 nova_compute[187185]: 2025-11-29 06:48:47.395 187189 DEBUG nova.objects.instance [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lazy-loading 'pci_devices' on Instance uuid 67c594e4-def3-4964-bd5e-472a63536c4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:48.321 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}dd148b1b64568516e8e9c4f7dca4b2c96ed9e7a6d38f1387503b8184f9bf9013" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Nov 29 06:48:48 compute-0 nova_compute[187185]: 2025-11-29 06:48:48.492 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance afe4ae44-2787-45ec-8e0b-72fa7297cebb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 06:48:48 compute-0 nova_compute[187185]: 2025-11-29 06:48:48.493 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance 67c594e4-def3-4964-bd5e-472a63536c4c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 06:48:48 compute-0 nova_compute[187185]: 2025-11-29 06:48:48.493 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 06:48:48 compute-0 nova_compute[187185]: 2025-11-29 06:48:48.494 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 06:48:48 compute-0 nova_compute[187185]: 2025-11-29 06:48:48.564 187189 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] End _get_guest_xml xml=<domain type="kvm">
Nov 29 06:48:48 compute-0 nova_compute[187185]:   <uuid>67c594e4-def3-4964-bd5e-472a63536c4c</uuid>
Nov 29 06:48:48 compute-0 nova_compute[187185]:   <name>instance-00000006</name>
Nov 29 06:48:48 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 06:48:48 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 06:48:48 compute-0 nova_compute[187185]:   <metadata>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 06:48:48 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:       <nova:name>tempest-tempest.common.compute-instance-526752650-3</nova:name>
Nov 29 06:48:48 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 06:48:47</nova:creationTime>
Nov 29 06:48:48 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 06:48:48 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 06:48:48 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 06:48:48 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 06:48:48 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 06:48:48 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 06:48:48 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 06:48:48 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 06:48:48 compute-0 nova_compute[187185]:         <nova:user uuid="7a31c969c2f744a9810fc9890dd7acb2">tempest-AutoAllocateNetworkTest-224859463-project-member</nova:user>
Nov 29 06:48:48 compute-0 nova_compute[187185]:         <nova:project uuid="6d2e7db012114f9eb8e8e1b0123c9974">tempest-AutoAllocateNetworkTest-224859463</nova:project>
Nov 29 06:48:48 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 06:48:48 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 06:48:48 compute-0 nova_compute[187185]:         <nova:port uuid="789be005-db43-4c16-8d31-448144c818e2">
Nov 29 06:48:48 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="fdfe:381f:8400::1f6" ipVersion="6"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.1.0.40" ipVersion="4"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 06:48:48 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 06:48:48 compute-0 nova_compute[187185]:   </metadata>
Nov 29 06:48:48 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <system>
Nov 29 06:48:48 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 06:48:48 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 06:48:48 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 06:48:48 compute-0 nova_compute[187185]:       <entry name="serial">67c594e4-def3-4964-bd5e-472a63536c4c</entry>
Nov 29 06:48:48 compute-0 nova_compute[187185]:       <entry name="uuid">67c594e4-def3-4964-bd5e-472a63536c4c</entry>
Nov 29 06:48:48 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     </system>
Nov 29 06:48:48 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 06:48:48 compute-0 nova_compute[187185]:   <os>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:   </os>
Nov 29 06:48:48 compute-0 nova_compute[187185]:   <features>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <apic/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:   </features>
Nov 29 06:48:48 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:   </clock>
Nov 29 06:48:48 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:   </cpu>
Nov 29 06:48:48 compute-0 nova_compute[187185]:   <devices>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 06:48:48 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/67c594e4-def3-4964-bd5e-472a63536c4c/disk"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     </disk>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 06:48:48 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/67c594e4-def3-4964-bd5e-472a63536c4c/disk.config"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     </disk>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 06:48:48 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:3a:47:de"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:       <target dev="tap789be005-db"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     </interface>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 06:48:48 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/67c594e4-def3-4964-bd5e-472a63536c4c/console.log" append="off"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     </serial>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <video>
Nov 29 06:48:48 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     </video>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 06:48:48 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     </rng>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 06:48:48 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 06:48:48 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 06:48:48 compute-0 nova_compute[187185]:   </devices>
Nov 29 06:48:48 compute-0 nova_compute[187185]: </domain>
Nov 29 06:48:48 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 06:48:48 compute-0 nova_compute[187185]: 2025-11-29 06:48:48.564 187189 DEBUG nova.compute.manager [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Preparing to wait for external event network-vif-plugged-789be005-db43-4c16-8d31-448144c818e2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 06:48:48 compute-0 nova_compute[187185]: 2025-11-29 06:48:48.564 187189 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Acquiring lock "67c594e4-def3-4964-bd5e-472a63536c4c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:48:48 compute-0 nova_compute[187185]: 2025-11-29 06:48:48.565 187189 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "67c594e4-def3-4964-bd5e-472a63536c4c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:48:48 compute-0 nova_compute[187185]: 2025-11-29 06:48:48.565 187189 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "67c594e4-def3-4964-bd5e-472a63536c4c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:48:48 compute-0 nova_compute[187185]: 2025-11-29 06:48:48.565 187189 DEBUG nova.virt.libvirt.vif [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:47:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-526752650-3',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-526752650-3',id=6,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=2,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6d2e7db012114f9eb8e8e1b0123c9974',ramdisk_id='',reservation_id='r-o9m0h1rn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-224859463',owner_user_name='tempest-AutoAllocateNetworkTest-224859463-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:48:00Z,user_data=None,user_id='7a31c969c2f744a9810fc9890dd7acb2',uuid=67c594e4-def3-4964-bd5e-472a63536c4c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "789be005-db43-4c16-8d31-448144c818e2", "address": "fa:16:3e:3a:47:de", "network": {"id": "425e933e-ca72-466c-8d2b-499c7ba67318", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::1f6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.40", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d2e7db012114f9eb8e8e1b0123c9974", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap789be005-db", "ovs_interfaceid": "789be005-db43-4c16-8d31-448144c818e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 06:48:48 compute-0 nova_compute[187185]: 2025-11-29 06:48:48.566 187189 DEBUG nova.network.os_vif_util [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Converting VIF {"id": "789be005-db43-4c16-8d31-448144c818e2", "address": "fa:16:3e:3a:47:de", "network": {"id": "425e933e-ca72-466c-8d2b-499c7ba67318", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::1f6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.40", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d2e7db012114f9eb8e8e1b0123c9974", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap789be005-db", "ovs_interfaceid": "789be005-db43-4c16-8d31-448144c818e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 06:48:48 compute-0 nova_compute[187185]: 2025-11-29 06:48:48.566 187189 DEBUG nova.network.os_vif_util [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3a:47:de,bridge_name='br-int',has_traffic_filtering=True,id=789be005-db43-4c16-8d31-448144c818e2,network=Network(425e933e-ca72-466c-8d2b-499c7ba67318),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap789be005-db') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 06:48:48 compute-0 nova_compute[187185]: 2025-11-29 06:48:48.567 187189 DEBUG os_vif [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3a:47:de,bridge_name='br-int',has_traffic_filtering=True,id=789be005-db43-4c16-8d31-448144c818e2,network=Network(425e933e-ca72-466c-8d2b-499c7ba67318),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap789be005-db') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 06:48:48 compute-0 nova_compute[187185]: 2025-11-29 06:48:48.605 187189 DEBUG ovsdbapp.backend.ovs_idl [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 29 06:48:48 compute-0 nova_compute[187185]: 2025-11-29 06:48:48.605 187189 DEBUG ovsdbapp.backend.ovs_idl [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 29 06:48:48 compute-0 nova_compute[187185]: 2025-11-29 06:48:48.605 187189 DEBUG ovsdbapp.backend.ovs_idl [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 29 06:48:48 compute-0 nova_compute[187185]: 2025-11-29 06:48:48.606 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 29 06:48:48 compute-0 nova_compute[187185]: 2025-11-29 06:48:48.607 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [POLLOUT] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:48:48 compute-0 nova_compute[187185]: 2025-11-29 06:48:48.607 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 29 06:48:48 compute-0 nova_compute[187185]: 2025-11-29 06:48:48.608 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:48:48 compute-0 nova_compute[187185]: 2025-11-29 06:48:48.621 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:48:48 compute-0 nova_compute[187185]: 2025-11-29 06:48:48.622 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:48:48 compute-0 nova_compute[187185]: 2025-11-29 06:48:48.622 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 06:48:48 compute-0 nova_compute[187185]: 2025-11-29 06:48:48.623 187189 INFO oslo.privsep.daemon [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpy91wkoe6/privsep.sock']
Nov 29 06:48:48 compute-0 nova_compute[187185]: 2025-11-29 06:48:48.641 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:48:48 compute-0 nova_compute[187185]: 2025-11-29 06:48:48.665 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:48:48 compute-0 nova_compute[187185]: 2025-11-29 06:48:48.708 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 06:48:48 compute-0 nova_compute[187185]: 2025-11-29 06:48:48.708 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 5.905s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:48:49 compute-0 sshd-session[214146]: Invalid user bitwarden from 103.179.56.44 port 33094
Nov 29 06:48:49 compute-0 nova_compute[187185]: 2025-11-29 06:48:49.362 187189 INFO oslo.privsep.daemon [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Spawned new privsep daemon via rootwrap
Nov 29 06:48:49 compute-0 nova_compute[187185]: 2025-11-29 06:48:49.218 214152 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 29 06:48:49 compute-0 nova_compute[187185]: 2025-11-29 06:48:49.223 214152 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 29 06:48:49 compute-0 nova_compute[187185]: 2025-11-29 06:48:49.225 214152 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Nov 29 06:48:49 compute-0 nova_compute[187185]: 2025-11-29 06:48:49.225 214152 INFO oslo.privsep.daemon [-] privsep daemon running as pid 214152
Nov 29 06:48:49 compute-0 sshd-session[214146]: Received disconnect from 103.179.56.44 port 33094:11: Bye Bye [preauth]
Nov 29 06:48:49 compute-0 sshd-session[214146]: Disconnected from invalid user bitwarden 103.179.56.44 port 33094 [preauth]
Nov 29 06:48:49 compute-0 nova_compute[187185]: 2025-11-29 06:48:49.677 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:48:49 compute-0 nova_compute[187185]: 2025-11-29 06:48:49.677 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap789be005-db, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:48:49 compute-0 nova_compute[187185]: 2025-11-29 06:48:49.678 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap789be005-db, col_values=(('external_ids', {'iface-id': '789be005-db43-4c16-8d31-448144c818e2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3a:47:de', 'vm-uuid': '67c594e4-def3-4964-bd5e-472a63536c4c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:48:49 compute-0 nova_compute[187185]: 2025-11-29 06:48:49.680 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:48:49 compute-0 NetworkManager[55227]: <info>  [1764398929.6812] manager: (tap789be005-db): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/21)
Nov 29 06:48:49 compute-0 nova_compute[187185]: 2025-11-29 06:48:49.682 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 06:48:49 compute-0 nova_compute[187185]: 2025-11-29 06:48:49.687 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:48:49 compute-0 nova_compute[187185]: 2025-11-29 06:48:49.688 187189 INFO os_vif [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3a:47:de,bridge_name='br-int',has_traffic_filtering=True,id=789be005-db43-4c16-8d31-448144c818e2,network=Network(425e933e-ca72-466c-8d2b-499c7ba67318),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap789be005-db')
Nov 29 06:48:51 compute-0 nova_compute[187185]: 2025-11-29 06:48:51.201 187189 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 06:48:51 compute-0 nova_compute[187185]: 2025-11-29 06:48:51.202 187189 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 06:48:51 compute-0 nova_compute[187185]: 2025-11-29 06:48:51.202 187189 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] No VIF found with MAC fa:16:3e:3a:47:de, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 06:48:51 compute-0 nova_compute[187185]: 2025-11-29 06:48:51.203 187189 INFO nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Using config drive
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.212 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 644 Content-Type: application/json Date: Sat, 29 Nov 2025 06:48:48 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-3448d31c-9b12-47fc-bb97-2ca66ffb834e x-openstack-request-id: req-3448d31c-9b12-47fc-bb97-2ca66ffb834e _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.212 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31"}]}, {"id": "e29df891-dca5-4a1c-9258-dc512a46956f", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/e29df891-dca5-4a1c-9258-dc512a46956f"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/e29df891-dca5-4a1c-9258-dc512a46956f"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.213 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-3448d31c-9b12-47fc-bb97-2ca66ffb834e request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.215 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}dd148b1b64568516e8e9c4f7dca4b2c96ed9e7a6d38f1387503b8184f9bf9013" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.331 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 495 Content-Type: application/json Date: Sat, 29 Nov 2025 06:48:51 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-a0baf3ba-bff6-46e8-a711-7df9b1b58abc x-openstack-request-id: req-a0baf3ba-bff6-46e8-a711-7df9b1b58abc _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.331 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31", "name": "m1.nano", "ram": 128, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.331 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31 used request id req-a0baf3ba-bff6-46e8-a711-7df9b1b58abc request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.332 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'afe4ae44-2787-45ec-8e0b-72fa7297cebb', 'name': 'tempest-AutoAllocateNetworkTest-server-1451606376', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'hostId': 'a3133a3095ec983db87fd330886dac81e6ce75fbc43387792049327c', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.333 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.356 12 DEBUG ceilometer.compute.pollsters [-] afe4ae44-2787-45ec-8e0b-72fa7297cebb/cpu volume: 10540000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f93f68b2-6c12-4008-88d3-b254f72f60f9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10540000000, 'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': None, 'resource_id': 'afe4ae44-2787-45ec-8e0b-72fa7297cebb', 'timestamp': '2025-11-29T06:48:51.333373', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-1451606376', 'name': 'instance-00000002', 'instance_id': 'afe4ae44-2787-45ec-8e0b-72fa7297cebb', 'instance_type': 'm1.nano', 'host': 'a3133a3095ec983db87fd330886dac81e6ce75fbc43387792049327c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '76f62050-ccef-11f0-8f64-fa163e220349', 'monotonic_time': 4379.07452826, 'message_signature': '3dcd0930624c922a5388d3dcdc75b4d938f878002fd8436abc1924fced6236d3'}]}, 'timestamp': '2025-11-29 06:48:51.358235', '_unique_id': '8c8d730b87594348ab822926baf8a47f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.368 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.374 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.407 12 DEBUG ceilometer.compute.pollsters [-] afe4ae44-2787-45ec-8e0b-72fa7297cebb/disk.device.write.bytes volume: 72953856 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.408 12 DEBUG ceilometer.compute.pollsters [-] afe4ae44-2787-45ec-8e0b-72fa7297cebb/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f53c40f3-fa68-420a-b680-920cdb647d2d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72953856, 'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': None, 'resource_id': 'afe4ae44-2787-45ec-8e0b-72fa7297cebb-vda', 'timestamp': '2025-11-29T06:48:51.374920', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-1451606376', 'name': 'instance-00000002', 'instance_id': 'afe4ae44-2787-45ec-8e0b-72fa7297cebb', 'instance_type': 'm1.nano', 'host': 'a3133a3095ec983db87fd330886dac81e6ce75fbc43387792049327c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '76fdd5e8-ccef-11f0-8f64-fa163e220349', 'monotonic_time': 4379.093352527, 'message_signature': 'f8a1604bea7c09eaa9982410c85a9d10ff5506315ec8c752a3e0d65b2cc34a68'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': None, 'resource_id': 'afe4ae44-2787-45ec-8e0b-72fa7297cebb-sda', 'timestamp': '2025-11-29T06:48:51.374920', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-1451606376', 'name': 'instance-00000002', 'instance_id': 'afe4ae44-2787-45ec-8e0b-72fa7297cebb', 'instance_type': 'm1.nano', 'host': 'a3133a3095ec983db87fd330886dac81e6ce75fbc43387792049327c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '76fdea6a-ccef-11f0-8f64-fa163e220349', 'monotonic_time': 4379.093352527, 'message_signature': '3e5b6b3111368d20a4f1474670add302903e4d3541db45a7d9519a2ae82db546'}]}, 'timestamp': '2025-11-29 06:48:51.409027', '_unique_id': 'e3455aaf03ff4299984b536357b91858'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.410 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.412 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.412 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.412 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-AutoAllocateNetworkTest-server-1451606376>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-AutoAllocateNetworkTest-server-1451606376>]
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.413 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.413 12 DEBUG ceilometer.compute.pollsters [-] afe4ae44-2787-45ec-8e0b-72fa7297cebb/disk.device.read.bytes volume: 30968320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.414 12 DEBUG ceilometer.compute.pollsters [-] afe4ae44-2787-45ec-8e0b-72fa7297cebb/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4689b6be-2e8d-42b2-84d8-3194b2856fa6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30968320, 'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': None, 'resource_id': 'afe4ae44-2787-45ec-8e0b-72fa7297cebb-vda', 'timestamp': '2025-11-29T06:48:51.413568', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-1451606376', 'name': 'instance-00000002', 'instance_id': 'afe4ae44-2787-45ec-8e0b-72fa7297cebb', 'instance_type': 'm1.nano', 'host': 'a3133a3095ec983db87fd330886dac81e6ce75fbc43387792049327c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '76feb24c-ccef-11f0-8f64-fa163e220349', 'monotonic_time': 4379.093352527, 'message_signature': 'b8b684a227b8411cf9be1cc09383abc09a76760123b7f25b9b5293826c2d2686'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': None, 'resource_id': 'afe4ae44-2787-45ec-8e0b-72fa7297cebb-sda', 'timestamp': '2025-11-29T06:48:51.413568', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-1451606376', 'name': 'instance-00000002', 'instance_id': 'afe4ae44-2787-45ec-8e0b-72fa7297cebb', 'instance_type': 'm1.nano', 'host': 'a3133a3095ec983db87fd330886dac81e6ce75fbc43387792049327c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '76fec64c-ccef-11f0-8f64-fa163e220349', 'monotonic_time': 4379.093352527, 'message_signature': 'ab2a1da1df07cbbaca8ac351f06e5924936d2c0a2a22e94dbe4168b4717b9cb0'}]}, 'timestamp': '2025-11-29 06:48:51.414604', '_unique_id': '9df6da461fbb4fb38b78b7ff8f80252f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.415 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.417 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.420 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.420 12 DEBUG ceilometer.compute.pollsters [-] afe4ae44-2787-45ec-8e0b-72fa7297cebb/disk.device.write.latency volume: 22373565300 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.420 12 DEBUG ceilometer.compute.pollsters [-] afe4ae44-2787-45ec-8e0b-72fa7297cebb/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b73e98f7-dbc9-4e37-8af6-74fcb27d8b53', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22373565300, 'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': None, 'resource_id': 'afe4ae44-2787-45ec-8e0b-72fa7297cebb-vda', 'timestamp': '2025-11-29T06:48:51.420332', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-1451606376', 'name': 'instance-00000002', 'instance_id': 'afe4ae44-2787-45ec-8e0b-72fa7297cebb', 'instance_type': 'm1.nano', 'host': 'a3133a3095ec983db87fd330886dac81e6ce75fbc43387792049327c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '76ffb98a-ccef-11f0-8f64-fa163e220349', 'monotonic_time': 4379.093352527, 'message_signature': '495075ca2daea00a0c86bd0224bfe2ad6814a735b5c85f5f3d90d411911c6af1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': None, 'resource_id': 'afe4ae44-2787-45ec-8e0b-72fa7297cebb-sda', 'timestamp': '2025-11-29T06:48:51.420332', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-1451606376', 'name': 'instance-00000002', 'instance_id': 'afe4ae44-2787-45ec-8e0b-72fa7297cebb', 'instance_type': 'm1.nano', 'host': 'a3133a3095ec983db87fd330886dac81e6ce75fbc43387792049327c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '76ffcd62-ccef-11f0-8f64-fa163e220349', 'monotonic_time': 4379.093352527, 'message_signature': '94b6eaa56e51d9074822d74c17dc686c2145d84a79ee17b2a6fdf276ca8d4f33'}]}, 'timestamp': '2025-11-29 06:48:51.421331', '_unique_id': 'c3a913db45104e3f8d0bc831e9dccab4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.422 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.423 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.423 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.436 12 DEBUG ceilometer.compute.pollsters [-] afe4ae44-2787-45ec-8e0b-72fa7297cebb/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.436 12 DEBUG ceilometer.compute.pollsters [-] afe4ae44-2787-45ec-8e0b-72fa7297cebb/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7e1de140-811f-4ba6-986b-131298f64937', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': None, 'resource_id': 'afe4ae44-2787-45ec-8e0b-72fa7297cebb-vda', 'timestamp': '2025-11-29T06:48:51.424060', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-1451606376', 'name': 'instance-00000002', 'instance_id': 'afe4ae44-2787-45ec-8e0b-72fa7297cebb', 'instance_type': 'm1.nano', 'host': 'a3133a3095ec983db87fd330886dac81e6ce75fbc43387792049327c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7702226a-ccef-11f0-8f64-fa163e220349', 'monotonic_time': 4379.142420046, 'message_signature': '11a74405f0e011c60c40b265916245a7dd8a4448b42615aad9f0fce5bf3b7c8a'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': None, 'resource_id': 
'afe4ae44-2787-45ec-8e0b-72fa7297cebb-sda', 'timestamp': '2025-11-29T06:48:51.424060', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-1451606376', 'name': 'instance-00000002', 'instance_id': 'afe4ae44-2787-45ec-8e0b-72fa7297cebb', 'instance_type': 'm1.nano', 'host': 'a3133a3095ec983db87fd330886dac81e6ce75fbc43387792049327c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '77023624-ccef-11f0-8f64-fa163e220349', 'monotonic_time': 4379.142420046, 'message_signature': '9744c0996daa373e88b4e35fe8f2d5caba3d54809a8c140afea7c1442e4cc8e3'}]}, 'timestamp': '2025-11-29 06:48:51.437125', '_unique_id': '23583b6bf47240bd95b045cba603d539'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.438 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.439 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.440 12 DEBUG ceilometer.compute.pollsters [-] afe4ae44-2787-45ec-8e0b-72fa7297cebb/disk.device.write.requests volume: 314 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.440 12 DEBUG ceilometer.compute.pollsters [-] afe4ae44-2787-45ec-8e0b-72fa7297cebb/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0bdb5e56-6d6d-41a9-8934-e39d08ae79be', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 314, 'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': None, 'resource_id': 'afe4ae44-2787-45ec-8e0b-72fa7297cebb-vda', 'timestamp': '2025-11-29T06:48:51.439962', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-1451606376', 'name': 'instance-00000002', 'instance_id': 'afe4ae44-2787-45ec-8e0b-72fa7297cebb', 'instance_type': 'm1.nano', 'host': 'a3133a3095ec983db87fd330886dac81e6ce75fbc43387792049327c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7702b900-ccef-11f0-8f64-fa163e220349', 'monotonic_time': 4379.093352527, 'message_signature': '035881106d2001660cc8ad99de8b3cc6023f8761758b267a21f2d688f827ed07'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': 
None, 'resource_id': 'afe4ae44-2787-45ec-8e0b-72fa7297cebb-sda', 'timestamp': '2025-11-29T06:48:51.439962', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-1451606376', 'name': 'instance-00000002', 'instance_id': 'afe4ae44-2787-45ec-8e0b-72fa7297cebb', 'instance_type': 'm1.nano', 'host': 'a3133a3095ec983db87fd330886dac81e6ce75fbc43387792049327c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7702cc24-ccef-11f0-8f64-fa163e220349', 'monotonic_time': 4379.093352527, 'message_signature': '8c89e77f28ca00b44bab0f1df194bacc23b8ee95ff48489ef413bfa715bb5bc1'}]}, 'timestamp': '2025-11-29 06:48:51.440997', '_unique_id': '9f4272f7acd847c8aeb959b864b53f00'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.442 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.443 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.443 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.443 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.444 12 DEBUG ceilometer.compute.pollsters [-] afe4ae44-2787-45ec-8e0b-72fa7297cebb/disk.device.read.requests volume: 1125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.444 12 DEBUG ceilometer.compute.pollsters [-] afe4ae44-2787-45ec-8e0b-72fa7297cebb/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '11d8c61e-f252-4826-b9ca-b4541698831f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1125, 'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': None, 'resource_id': 'afe4ae44-2787-45ec-8e0b-72fa7297cebb-vda', 'timestamp': '2025-11-29T06:48:51.444028', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-1451606376', 'name': 'instance-00000002', 'instance_id': 'afe4ae44-2787-45ec-8e0b-72fa7297cebb', 'instance_type': 'm1.nano', 'host': 'a3133a3095ec983db87fd330886dac81e6ce75fbc43387792049327c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '77035734-ccef-11f0-8f64-fa163e220349', 'monotonic_time': 4379.093352527, 'message_signature': '5d17af89c46470eeaaa1b16537bca085f8a830df2c0447e603d23a5380c73ee0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': 
None, 'resource_id': 'afe4ae44-2787-45ec-8e0b-72fa7297cebb-sda', 'timestamp': '2025-11-29T06:48:51.444028', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-1451606376', 'name': 'instance-00000002', 'instance_id': 'afe4ae44-2787-45ec-8e0b-72fa7297cebb', 'instance_type': 'm1.nano', 'host': 'a3133a3095ec983db87fd330886dac81e6ce75fbc43387792049327c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '770368be-ccef-11f0-8f64-fa163e220349', 'monotonic_time': 4379.093352527, 'message_signature': '6d9d2c530780e3fc39aa6b9de90d3a4e420e093db1913823b8bbec2f84be9862'}]}, 'timestamp': '2025-11-29 06:48:51.445004', '_unique_id': 'd695b86636154517a9e68891553ffc7f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.446 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.447 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.447 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.447 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.448 12 DEBUG ceilometer.compute.pollsters [-] afe4ae44-2787-45ec-8e0b-72fa7297cebb/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.448 12 DEBUG ceilometer.compute.pollsters [-] afe4ae44-2787-45ec-8e0b-72fa7297cebb/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd1f07eae-a98a-453c-b304-0e1b0956b72d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': None, 'resource_id': 'afe4ae44-2787-45ec-8e0b-72fa7297cebb-vda', 'timestamp': '2025-11-29T06:48:51.447983', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-1451606376', 'name': 'instance-00000002', 'instance_id': 'afe4ae44-2787-45ec-8e0b-72fa7297cebb', 'instance_type': 'm1.nano', 'host': 'a3133a3095ec983db87fd330886dac81e6ce75fbc43387792049327c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7703f0ae-ccef-11f0-8f64-fa163e220349', 'monotonic_time': 4379.142420046, 'message_signature': '46e03c52817351138089c5bd0d7b38234e71ce5561423a9f97c3d4a2563b87ef'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': None, 'resource_id': 
'afe4ae44-2787-45ec-8e0b-72fa7297cebb-sda', 'timestamp': '2025-11-29T06:48:51.447983', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-1451606376', 'name': 'instance-00000002', 'instance_id': 'afe4ae44-2787-45ec-8e0b-72fa7297cebb', 'instance_type': 'm1.nano', 'host': 'a3133a3095ec983db87fd330886dac81e6ce75fbc43387792049327c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '77040206-ccef-11f0-8f64-fa163e220349', 'monotonic_time': 4379.142420046, 'message_signature': '4ba847fe6c386e7d06df5c7df4f947954678a3cf93a7c8c5fcbba66af08ce8a2'}]}, 'timestamp': '2025-11-29 06:48:51.448916', '_unique_id': '21ea560cbbc9483d99b8227e8f878221'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.449 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.451 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.451 12 DEBUG ceilometer.compute.pollsters [-] afe4ae44-2787-45ec-8e0b-72fa7297cebb/disk.device.read.latency volume: 214404744 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.451 12 DEBUG ceilometer.compute.pollsters [-] afe4ae44-2787-45ec-8e0b-72fa7297cebb/disk.device.read.latency volume: 48028775 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '63d17ce1-6439-4b2f-83db-9abeaa8f1a6b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 214404744, 'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': None, 'resource_id': 'afe4ae44-2787-45ec-8e0b-72fa7297cebb-vda', 'timestamp': '2025-11-29T06:48:51.451214', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-1451606376', 'name': 'instance-00000002', 'instance_id': 'afe4ae44-2787-45ec-8e0b-72fa7297cebb', 'instance_type': 'm1.nano', 'host': 'a3133a3095ec983db87fd330886dac81e6ce75fbc43387792049327c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '77046b1a-ccef-11f0-8f64-fa163e220349', 'monotonic_time': 4379.093352527, 'message_signature': 'b97cb259ada5f31f487f97c62efb0e32c92b528ad0dde0c85e856a2a707a53cc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 48028775, 'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': 
None, 'resource_id': 'afe4ae44-2787-45ec-8e0b-72fa7297cebb-sda', 'timestamp': '2025-11-29T06:48:51.451214', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-1451606376', 'name': 'instance-00000002', 'instance_id': 'afe4ae44-2787-45ec-8e0b-72fa7297cebb', 'instance_type': 'm1.nano', 'host': 'a3133a3095ec983db87fd330886dac81e6ce75fbc43387792049327c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '770475ec-ccef-11f0-8f64-fa163e220349', 'monotonic_time': 4379.093352527, 'message_signature': '26d6cd4e05730726bf0a0d3db30df5efb45c0c8fd0b7d54a0e9598beb2312cb0'}]}, 'timestamp': '2025-11-29 06:48:51.451775', '_unique_id': '7eed3e0533784c4fa4eeb77d66f5c128'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.452 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.453 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.453 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.453 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-AutoAllocateNetworkTest-server-1451606376>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-AutoAllocateNetworkTest-server-1451606376>]
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.453 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.453 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.453 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.453 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-AutoAllocateNetworkTest-server-1451606376>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-AutoAllocateNetworkTest-server-1451606376>]
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.454 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.454 12 DEBUG ceilometer.compute.pollsters [-] afe4ae44-2787-45ec-8e0b-72fa7297cebb/disk.device.usage volume: 29884416 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.454 12 DEBUG ceilometer.compute.pollsters [-] afe4ae44-2787-45ec-8e0b-72fa7297cebb/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cb00eef5-238a-4577-abf8-8b0aa216ba76', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29884416, 'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': None, 'resource_id': 'afe4ae44-2787-45ec-8e0b-72fa7297cebb-vda', 'timestamp': '2025-11-29T06:48:51.454136', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-1451606376', 'name': 'instance-00000002', 'instance_id': 'afe4ae44-2787-45ec-8e0b-72fa7297cebb', 'instance_type': 'm1.nano', 'host': 'a3133a3095ec983db87fd330886dac81e6ce75fbc43387792049327c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7704dd52-ccef-11f0-8f64-fa163e220349', 'monotonic_time': 4379.142420046, 'message_signature': '7caab00748fa456c1d0fa14509f4611884c18ad6d9f0f85bbd4c2884c2dbdc53'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': None, 'resource_id': 
'afe4ae44-2787-45ec-8e0b-72fa7297cebb-sda', 'timestamp': '2025-11-29T06:48:51.454136', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-1451606376', 'name': 'instance-00000002', 'instance_id': 'afe4ae44-2787-45ec-8e0b-72fa7297cebb', 'instance_type': 'm1.nano', 'host': 'a3133a3095ec983db87fd330886dac81e6ce75fbc43387792049327c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7704e7fc-ccef-11f0-8f64-fa163e220349', 'monotonic_time': 4379.142420046, 'message_signature': 'c51ca72c46972da16991b6b0ededab02b45517798281f49b49a9d745862eb84c'}]}, 'timestamp': '2025-11-29 06:48:51.454697', '_unique_id': '0159cf20ed73489abac14b65c9543afc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.456 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.457 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.457 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.457 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.457 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-AutoAllocateNetworkTest-server-1451606376>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-AutoAllocateNetworkTest-server-1451606376>]
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.457 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.457 12 DEBUG ceilometer.compute.pollsters [-] afe4ae44-2787-45ec-8e0b-72fa7297cebb/memory.usage volume: 40.828125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a317f8ec-8be5-4b28-9b71-e622d9238de6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.828125, 'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_name': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_name': None, 'resource_id': 'afe4ae44-2787-45ec-8e0b-72fa7297cebb', 'timestamp': '2025-11-29T06:48:51.457602', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-1451606376', 'name': 'instance-00000002', 'instance_id': 'afe4ae44-2787-45ec-8e0b-72fa7297cebb', 'instance_type': 'm1.nano', 'host': 'a3133a3095ec983db87fd330886dac81e6ce75fbc43387792049327c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '770564de-ccef-11f0-8f64-fa163e220349', 'monotonic_time': 4379.07452826, 'message_signature': '1be720cbb2312b85b04c31d92ece70b249bc4f043a2dd3ac0b2470b7b6950a37'}]}, 'timestamp': '2025-11-29 06:48:51.457926', '_unique_id': '8bd5b261715b43cf9902edb675a4e3f0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:48:51 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:48:51.458 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:48:51 compute-0 nova_compute[187185]: 2025-11-29 06:48:51.594 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:48:52 compute-0 nova_compute[187185]: 2025-11-29 06:48:52.627 187189 INFO nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Creating config drive at /var/lib/nova/instances/67c594e4-def3-4964-bd5e-472a63536c4c/disk.config
Nov 29 06:48:52 compute-0 nova_compute[187185]: 2025-11-29 06:48:52.633 187189 DEBUG oslo_concurrency.processutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/67c594e4-def3-4964-bd5e-472a63536c4c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpn_jrh8xt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:48:52 compute-0 nova_compute[187185]: 2025-11-29 06:48:52.772 187189 DEBUG oslo_concurrency.processutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/67c594e4-def3-4964-bd5e-472a63536c4c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpn_jrh8xt" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:48:52 compute-0 podman[214161]: 2025-11-29 06:48:52.818974855 +0000 UTC m=+0.088836084 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 29 06:48:52 compute-0 kernel: tun: Universal TUN/TAP device driver, 1.6
Nov 29 06:48:52 compute-0 kernel: tap789be005-db: entered promiscuous mode
Nov 29 06:48:52 compute-0 ovn_controller[95281]: 2025-11-29T06:48:52Z|00027|binding|INFO|Claiming lport 789be005-db43-4c16-8d31-448144c818e2 for this chassis.
Nov 29 06:48:52 compute-0 ovn_controller[95281]: 2025-11-29T06:48:52Z|00028|binding|INFO|789be005-db43-4c16-8d31-448144c818e2: Claiming fa:16:3e:3a:47:de 10.1.0.40 fdfe:381f:8400::1f6
Nov 29 06:48:52 compute-0 NetworkManager[55227]: <info>  [1764398932.8405] manager: (tap789be005-db): new Tun device (/org/freedesktop/NetworkManager/Devices/22)
Nov 29 06:48:52 compute-0 nova_compute[187185]: 2025-11-29 06:48:52.840 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:48:52 compute-0 nova_compute[187185]: 2025-11-29 06:48:52.843 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:48:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:52.865 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3a:47:de 10.1.0.40 fdfe:381f:8400::1f6'], port_security=['fa:16:3e:3a:47:de 10.1.0.40 fdfe:381f:8400::1f6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.40/26 fdfe:381f:8400::1f6/64', 'neutron:device_id': '67c594e4-def3-4964-bd5e-472a63536c4c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-425e933e-ca72-466c-8d2b-499c7ba67318', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5cacaa01-dff2-46af-9e49-4a741508795b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=236265de-856a-468e-8ed3-00d3e824203d, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=789be005-db43-4c16-8d31-448144c818e2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 06:48:52 compute-0 systemd-udevd[214204]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 06:48:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:52.867 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 789be005-db43-4c16-8d31-448144c818e2 in datapath 425e933e-ca72-466c-8d2b-499c7ba67318 bound to our chassis
Nov 29 06:48:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:52.870 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 425e933e-ca72-466c-8d2b-499c7ba67318
Nov 29 06:48:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:52.871 104254 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpt3kyk75j/privsep.sock']
Nov 29 06:48:52 compute-0 NetworkManager[55227]: <info>  [1764398932.8839] device (tap789be005-db): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 06:48:52 compute-0 NetworkManager[55227]: <info>  [1764398932.8847] device (tap789be005-db): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 06:48:52 compute-0 systemd-machined[153486]: New machine qemu-2-instance-00000006.
Nov 29 06:48:52 compute-0 nova_compute[187185]: 2025-11-29 06:48:52.934 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:48:52 compute-0 systemd[1]: Started Virtual Machine qemu-2-instance-00000006.
Nov 29 06:48:52 compute-0 ovn_controller[95281]: 2025-11-29T06:48:52Z|00029|binding|INFO|Setting lport 789be005-db43-4c16-8d31-448144c818e2 ovn-installed in OVS
Nov 29 06:48:52 compute-0 ovn_controller[95281]: 2025-11-29T06:48:52Z|00030|binding|INFO|Setting lport 789be005-db43-4c16-8d31-448144c818e2 up in Southbound
Nov 29 06:48:52 compute-0 nova_compute[187185]: 2025-11-29 06:48:52.943 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:48:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:53.527 104254 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 29 06:48:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:53.528 104254 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpt3kyk75j/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Nov 29 06:48:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:53.401 214223 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 29 06:48:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:53.407 214223 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 29 06:48:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:53.410 214223 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Nov 29 06:48:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:53.410 214223 INFO oslo.privsep.daemon [-] privsep daemon running as pid 214223
Nov 29 06:48:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:53.531 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[8335f2bf-10b0-4242-89ba-db02926e4ebf]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:48:53 compute-0 nova_compute[187185]: 2025-11-29 06:48:53.577 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764398933.5769796, 67c594e4-def3-4964-bd5e-472a63536c4c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:48:53 compute-0 nova_compute[187185]: 2025-11-29 06:48:53.578 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] VM Started (Lifecycle Event)
Nov 29 06:48:53 compute-0 nova_compute[187185]: 2025-11-29 06:48:53.613 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:48:53 compute-0 nova_compute[187185]: 2025-11-29 06:48:53.621 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764398933.5771391, 67c594e4-def3-4964-bd5e-472a63536c4c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:48:53 compute-0 nova_compute[187185]: 2025-11-29 06:48:53.621 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] VM Paused (Lifecycle Event)
Nov 29 06:48:53 compute-0 nova_compute[187185]: 2025-11-29 06:48:53.657 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:48:53 compute-0 nova_compute[187185]: 2025-11-29 06:48:53.661 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 06:48:53 compute-0 nova_compute[187185]: 2025-11-29 06:48:53.722 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 06:48:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:54.052 214223 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:48:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:54.052 214223 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:48:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:54.052 214223 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:48:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:54.608 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[7a83c138-ac99-4df1-ad93-15c7264b3380]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:48:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:54.610 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap425e933e-c1 in ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 06:48:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:54.612 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap425e933e-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 06:48:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:54.612 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e2c2c813-3b6e-4661-ad19-9d9cfa2466a8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:48:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:54.615 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[f99f1532-8961-4464-90fc-6d1e59d38737]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:48:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:54.637 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[fb200074-4f51-4c01-a2b1-97edb3376e3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:48:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:54.649 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[0bb6513f-bff5-4045-a3ad-31612eae472c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:48:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:54.651 104254 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpiydhnb2s/privsep.sock']
Nov 29 06:48:54 compute-0 nova_compute[187185]: 2025-11-29 06:48:54.680 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:48:54 compute-0 podman[214242]: 2025-11-29 06:48:54.717827107 +0000 UTC m=+0.051968873 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 06:48:54 compute-0 nova_compute[187185]: 2025-11-29 06:48:54.789 187189 DEBUG nova.network.neutron [req-30be0973-b4ee-4f2d-9ec5-cd6f865f75b0 req-0dd4be9c-642d-485c-9c15-e3477bd07033 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Updated VIF entry in instance network info cache for port 789be005-db43-4c16-8d31-448144c818e2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 06:48:54 compute-0 nova_compute[187185]: 2025-11-29 06:48:54.790 187189 DEBUG nova.network.neutron [req-30be0973-b4ee-4f2d-9ec5-cd6f865f75b0 req-0dd4be9c-642d-485c-9c15-e3477bd07033 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Updating instance_info_cache with network_info: [{"id": "789be005-db43-4c16-8d31-448144c818e2", "address": "fa:16:3e:3a:47:de", "network": {"id": "425e933e-ca72-466c-8d2b-499c7ba67318", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::1f6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.40", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d2e7db012114f9eb8e8e1b0123c9974", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap789be005-db", "ovs_interfaceid": "789be005-db43-4c16-8d31-448144c818e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:48:55 compute-0 sshd-session[214236]: Received disconnect from 179.125.24.202 port 40950:11: Bye Bye [preauth]
Nov 29 06:48:55 compute-0 sshd-session[214236]: Disconnected from authenticating user root 179.125.24.202 port 40950 [preauth]
Nov 29 06:48:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:55.293 104254 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 29 06:48:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:55.294 104254 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpiydhnb2s/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Nov 29 06:48:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:55.188 214273 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 29 06:48:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:55.191 214273 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 29 06:48:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:55.193 214273 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Nov 29 06:48:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:55.193 214273 INFO oslo.privsep.daemon [-] privsep daemon running as pid 214273
Nov 29 06:48:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:55.297 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[2bc8aae2-2953-455b-85fe-19910d40615c]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:48:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:55.761 214273 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:48:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:55.761 214273 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:48:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:55.761 214273 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:48:55 compute-0 podman[214278]: 2025-11-29 06:48:55.798593873 +0000 UTC m=+0.065831358 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 29 06:48:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:56.316 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[f2d526c0-d64f-47d4-8ba8-844aef5f3c52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:48:56 compute-0 NetworkManager[55227]: <info>  [1764398936.3341] manager: (tap425e933e-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/23)
Nov 29 06:48:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:56.335 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[827b24f3-96f1-444e-9d96-1f21ddf8d033]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:48:56 compute-0 systemd-udevd[214306]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 06:48:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:56.368 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[56c0076b-5664-4821-b418-5b8e83c84196]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:48:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:56.372 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[f260041f-e338-4916-bbc2-0c64c88e4a0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:48:56 compute-0 NetworkManager[55227]: <info>  [1764398936.3944] device (tap425e933e-c0): carrier: link connected
Nov 29 06:48:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:56.399 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[1cd1554c-d653-41dc-a479-6956ca95e654]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:48:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:56.414 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[91350732-a6f5-4478-89ea-c586437ed470]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap425e933e-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:96:d2:91'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 438405, 'reachable_time': 41256, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214325, 'error': None, 'target': 'ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:48:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:56.433 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[14cf4ccd-1e0d-4f5b-8ab6-7082ecd0c0f2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe96:d291'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 438405, 'tstamp': 438405}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214326, 'error': None, 'target': 'ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:48:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:56.447 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6cd4eb73-3426-4e0d-ab6e-0b2dd9b5aa77]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap425e933e-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:96:d2:91'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 438405, 'reachable_time': 41256, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214327, 'error': None, 'target': 'ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:48:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:56.473 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[80ab1ec2-7e00-46c2-be9d-bed29da032a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:48:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:56.529 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[7fcd1918-ba30-4e80-9628-ea2328112ba6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:48:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:56.532 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap425e933e-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:48:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:56.532 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 06:48:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:56.533 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap425e933e-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:48:56 compute-0 nova_compute[187185]: 2025-11-29 06:48:56.535 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:48:56 compute-0 NetworkManager[55227]: <info>  [1764398936.5359] manager: (tap425e933e-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/24)
Nov 29 06:48:56 compute-0 kernel: tap425e933e-c0: entered promiscuous mode
Nov 29 06:48:56 compute-0 nova_compute[187185]: 2025-11-29 06:48:56.537 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:48:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:56.538 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap425e933e-c0, col_values=(('external_ids', {'iface-id': 'c143daec-964e-4591-a13b-43e2014d70b5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:48:56 compute-0 nova_compute[187185]: 2025-11-29 06:48:56.539 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:48:56 compute-0 ovn_controller[95281]: 2025-11-29T06:48:56Z|00031|binding|INFO|Releasing lport c143daec-964e-4591-a13b-43e2014d70b5 from this chassis (sb_readonly=0)
Nov 29 06:48:56 compute-0 nova_compute[187185]: 2025-11-29 06:48:56.540 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:48:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:56.541 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/425e933e-ca72-466c-8d2b-499c7ba67318.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/425e933e-ca72-466c-8d2b-499c7ba67318.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 06:48:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:56.542 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[bcb1ae56-42fc-4089-b539-51be014628e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:48:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:56.543 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 06:48:56 compute-0 ovn_metadata_agent[104249]: global
Nov 29 06:48:56 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 06:48:56 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-425e933e-ca72-466c-8d2b-499c7ba67318
Nov 29 06:48:56 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 06:48:56 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 06:48:56 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 06:48:56 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/425e933e-ca72-466c-8d2b-499c7ba67318.pid.haproxy
Nov 29 06:48:56 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 06:48:56 compute-0 ovn_metadata_agent[104249]: 
Nov 29 06:48:56 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 06:48:56 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 06:48:56 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 06:48:56 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 06:48:56 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 06:48:56 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 06:48:56 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 06:48:56 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 06:48:56 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 06:48:56 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 06:48:56 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 06:48:56 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 06:48:56 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 06:48:56 compute-0 ovn_metadata_agent[104249]: 
Nov 29 06:48:56 compute-0 ovn_metadata_agent[104249]: 
Nov 29 06:48:56 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 06:48:56 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 06:48:56 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 06:48:56 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID 425e933e-ca72-466c-8d2b-499c7ba67318
Nov 29 06:48:56 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 06:48:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:48:56.544 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318', 'env', 'PROCESS_TAG=haproxy-425e933e-ca72-466c-8d2b-499c7ba67318', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/425e933e-ca72-466c-8d2b-499c7ba67318.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 06:48:56 compute-0 nova_compute[187185]: 2025-11-29 06:48:56.551 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:48:56 compute-0 nova_compute[187185]: 2025-11-29 06:48:56.631 187189 DEBUG oslo_concurrency.lockutils [req-30be0973-b4ee-4f2d-9ec5-cd6f865f75b0 req-0dd4be9c-642d-485c-9c15-e3477bd07033 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-67c594e4-def3-4964-bd5e-472a63536c4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:48:56 compute-0 nova_compute[187185]: 2025-11-29 06:48:56.632 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:48:56 compute-0 podman[214360]: 2025-11-29 06:48:56.944386254 +0000 UTC m=+0.052904000 container create b0e1b2279ba69f682067171ecc4843535797da6aa9500436f6d8f17fbeab084e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 29 06:48:56 compute-0 systemd[1]: Started libpod-conmon-b0e1b2279ba69f682067171ecc4843535797da6aa9500436f6d8f17fbeab084e.scope.
Nov 29 06:48:57 compute-0 podman[214360]: 2025-11-29 06:48:56.917689623 +0000 UTC m=+0.026207399 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 06:48:57 compute-0 systemd[1]: Started libcrun container.
Nov 29 06:48:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/76dc127a5df22cd429196de72b5c18c6b29843d704c6efa6ebfd803b1170bf2a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 06:48:57 compute-0 podman[214360]: 2025-11-29 06:48:57.033652578 +0000 UTC m=+0.142170344 container init b0e1b2279ba69f682067171ecc4843535797da6aa9500436f6d8f17fbeab084e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 29 06:48:57 compute-0 podman[214360]: 2025-11-29 06:48:57.039860365 +0000 UTC m=+0.148378111 container start b0e1b2279ba69f682067171ecc4843535797da6aa9500436f6d8f17fbeab084e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 06:48:57 compute-0 neutron-haproxy-ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318[214375]: [NOTICE]   (214379) : New worker (214381) forked
Nov 29 06:48:57 compute-0 neutron-haproxy-ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318[214375]: [NOTICE]   (214379) : Loading success.
Nov 29 06:48:59 compute-0 nova_compute[187185]: 2025-11-29 06:48:59.682 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:01 compute-0 nova_compute[187185]: 2025-11-29 06:49:01.226 187189 DEBUG nova.compute.manager [req-6f90086e-a01c-4f20-9f80-0998eb30b6ca req-7ac1668d-bf20-484c-853f-b187bb8f11d7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Received event network-vif-plugged-789be005-db43-4c16-8d31-448144c818e2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:49:01 compute-0 nova_compute[187185]: 2025-11-29 06:49:01.227 187189 DEBUG oslo_concurrency.lockutils [req-6f90086e-a01c-4f20-9f80-0998eb30b6ca req-7ac1668d-bf20-484c-853f-b187bb8f11d7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "67c594e4-def3-4964-bd5e-472a63536c4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:49:01 compute-0 nova_compute[187185]: 2025-11-29 06:49:01.227 187189 DEBUG oslo_concurrency.lockutils [req-6f90086e-a01c-4f20-9f80-0998eb30b6ca req-7ac1668d-bf20-484c-853f-b187bb8f11d7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "67c594e4-def3-4964-bd5e-472a63536c4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:49:01 compute-0 nova_compute[187185]: 2025-11-29 06:49:01.227 187189 DEBUG oslo_concurrency.lockutils [req-6f90086e-a01c-4f20-9f80-0998eb30b6ca req-7ac1668d-bf20-484c-853f-b187bb8f11d7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "67c594e4-def3-4964-bd5e-472a63536c4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:49:01 compute-0 nova_compute[187185]: 2025-11-29 06:49:01.227 187189 DEBUG nova.compute.manager [req-6f90086e-a01c-4f20-9f80-0998eb30b6ca req-7ac1668d-bf20-484c-853f-b187bb8f11d7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Processing event network-vif-plugged-789be005-db43-4c16-8d31-448144c818e2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 06:49:01 compute-0 nova_compute[187185]: 2025-11-29 06:49:01.228 187189 DEBUG nova.compute.manager [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Instance event wait completed in 7 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 06:49:01 compute-0 nova_compute[187185]: 2025-11-29 06:49:01.236 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764398941.2364037, 67c594e4-def3-4964-bd5e-472a63536c4c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:49:01 compute-0 nova_compute[187185]: 2025-11-29 06:49:01.237 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] VM Resumed (Lifecycle Event)
Nov 29 06:49:01 compute-0 nova_compute[187185]: 2025-11-29 06:49:01.240 187189 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 06:49:01 compute-0 nova_compute[187185]: 2025-11-29 06:49:01.246 187189 INFO nova.virt.libvirt.driver [-] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Instance spawned successfully.
Nov 29 06:49:01 compute-0 nova_compute[187185]: 2025-11-29 06:49:01.246 187189 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 06:49:01 compute-0 nova_compute[187185]: 2025-11-29 06:49:01.262 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:49:01 compute-0 nova_compute[187185]: 2025-11-29 06:49:01.270 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 06:49:01 compute-0 nova_compute[187185]: 2025-11-29 06:49:01.276 187189 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:49:01 compute-0 nova_compute[187185]: 2025-11-29 06:49:01.276 187189 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:49:01 compute-0 nova_compute[187185]: 2025-11-29 06:49:01.277 187189 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:49:01 compute-0 nova_compute[187185]: 2025-11-29 06:49:01.278 187189 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:49:01 compute-0 nova_compute[187185]: 2025-11-29 06:49:01.279 187189 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:49:01 compute-0 nova_compute[187185]: 2025-11-29 06:49:01.279 187189 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:49:01 compute-0 nova_compute[187185]: 2025-11-29 06:49:01.288 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 06:49:01 compute-0 nova_compute[187185]: 2025-11-29 06:49:01.372 187189 INFO nova.compute.manager [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Took 60.84 seconds to spawn the instance on the hypervisor.
Nov 29 06:49:01 compute-0 nova_compute[187185]: 2025-11-29 06:49:01.372 187189 DEBUG nova.compute.manager [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:49:01 compute-0 nova_compute[187185]: 2025-11-29 06:49:01.512 187189 INFO nova.compute.manager [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Took 61.85 seconds to build instance.
Nov 29 06:49:01 compute-0 nova_compute[187185]: 2025-11-29 06:49:01.570 187189 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "67c594e4-def3-4964-bd5e-472a63536c4c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 62.053s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:49:01 compute-0 nova_compute[187185]: 2025-11-29 06:49:01.632 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:02 compute-0 podman[214390]: 2025-11-29 06:49:02.79488309 +0000 UTC m=+0.064909062 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 06:49:04 compute-0 nova_compute[187185]: 2025-11-29 06:49:04.684 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:05 compute-0 nova_compute[187185]: 2025-11-29 06:49:05.561 187189 DEBUG nova.compute.manager [req-4b016656-d8ec-4bbb-b086-118b4c249906 req-29663eef-fae9-4aa8-9f44-3d609693484c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Received event network-vif-plugged-789be005-db43-4c16-8d31-448144c818e2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:49:05 compute-0 nova_compute[187185]: 2025-11-29 06:49:05.561 187189 DEBUG oslo_concurrency.lockutils [req-4b016656-d8ec-4bbb-b086-118b4c249906 req-29663eef-fae9-4aa8-9f44-3d609693484c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "67c594e4-def3-4964-bd5e-472a63536c4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:49:05 compute-0 nova_compute[187185]: 2025-11-29 06:49:05.562 187189 DEBUG oslo_concurrency.lockutils [req-4b016656-d8ec-4bbb-b086-118b4c249906 req-29663eef-fae9-4aa8-9f44-3d609693484c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "67c594e4-def3-4964-bd5e-472a63536c4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:49:05 compute-0 nova_compute[187185]: 2025-11-29 06:49:05.562 187189 DEBUG oslo_concurrency.lockutils [req-4b016656-d8ec-4bbb-b086-118b4c249906 req-29663eef-fae9-4aa8-9f44-3d609693484c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "67c594e4-def3-4964-bd5e-472a63536c4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:49:05 compute-0 nova_compute[187185]: 2025-11-29 06:49:05.562 187189 DEBUG nova.compute.manager [req-4b016656-d8ec-4bbb-b086-118b4c249906 req-29663eef-fae9-4aa8-9f44-3d609693484c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] No waiting events found dispatching network-vif-plugged-789be005-db43-4c16-8d31-448144c818e2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 06:49:05 compute-0 nova_compute[187185]: 2025-11-29 06:49:05.562 187189 WARNING nova.compute.manager [req-4b016656-d8ec-4bbb-b086-118b4c249906 req-29663eef-fae9-4aa8-9f44-3d609693484c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Received unexpected event network-vif-plugged-789be005-db43-4c16-8d31-448144c818e2 for instance with vm_state active and task_state None.
Nov 29 06:49:06 compute-0 nova_compute[187185]: 2025-11-29 06:49:06.634 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:09 compute-0 nova_compute[187185]: 2025-11-29 06:49:09.686 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:11 compute-0 nova_compute[187185]: 2025-11-29 06:49:11.635 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:13 compute-0 sshd-session[214411]: Received disconnect from 160.202.8.218 port 49150:11: Bye Bye [preauth]
Nov 29 06:49:13 compute-0 sshd-session[214411]: Disconnected from authenticating user root 160.202.8.218 port 49150 [preauth]
Nov 29 06:49:14 compute-0 nova_compute[187185]: 2025-11-29 06:49:14.688 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:15 compute-0 ovn_controller[95281]: 2025-11-29T06:49:15Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:3a:47:de 10.1.0.40
Nov 29 06:49:15 compute-0 ovn_controller[95281]: 2025-11-29T06:49:15Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:3a:47:de 10.1.0.40
Nov 29 06:49:15 compute-0 podman[214435]: 2025-11-29 06:49:15.82077055 +0000 UTC m=+0.074320980 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, name=ubi9-minimal, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, container_name=openstack_network_exporter)
Nov 29 06:49:15 compute-0 podman[214436]: 2025-11-29 06:49:15.835674665 +0000 UTC m=+0.081927707 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 06:49:15 compute-0 nova_compute[187185]: 2025-11-29 06:49:15.916 187189 DEBUG oslo_concurrency.lockutils [None req-42c7baf0-1e84-4894-a21a-a6c4a68d379b 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Acquiring lock "67c594e4-def3-4964-bd5e-472a63536c4c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:49:15 compute-0 nova_compute[187185]: 2025-11-29 06:49:15.917 187189 DEBUG oslo_concurrency.lockutils [None req-42c7baf0-1e84-4894-a21a-a6c4a68d379b 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "67c594e4-def3-4964-bd5e-472a63536c4c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:49:15 compute-0 nova_compute[187185]: 2025-11-29 06:49:15.917 187189 DEBUG oslo_concurrency.lockutils [None req-42c7baf0-1e84-4894-a21a-a6c4a68d379b 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Acquiring lock "67c594e4-def3-4964-bd5e-472a63536c4c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:49:15 compute-0 nova_compute[187185]: 2025-11-29 06:49:15.917 187189 DEBUG oslo_concurrency.lockutils [None req-42c7baf0-1e84-4894-a21a-a6c4a68d379b 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "67c594e4-def3-4964-bd5e-472a63536c4c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:49:15 compute-0 nova_compute[187185]: 2025-11-29 06:49:15.917 187189 DEBUG oslo_concurrency.lockutils [None req-42c7baf0-1e84-4894-a21a-a6c4a68d379b 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "67c594e4-def3-4964-bd5e-472a63536c4c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:49:15 compute-0 nova_compute[187185]: 2025-11-29 06:49:15.930 187189 INFO nova.compute.manager [None req-42c7baf0-1e84-4894-a21a-a6c4a68d379b 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Terminating instance
Nov 29 06:49:15 compute-0 nova_compute[187185]: 2025-11-29 06:49:15.942 187189 DEBUG nova.compute.manager [None req-42c7baf0-1e84-4894-a21a-a6c4a68d379b 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 06:49:15 compute-0 kernel: tap789be005-db (unregistering): left promiscuous mode
Nov 29 06:49:15 compute-0 NetworkManager[55227]: <info>  [1764398955.9768] device (tap789be005-db): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 06:49:15 compute-0 nova_compute[187185]: 2025-11-29 06:49:15.984 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:15 compute-0 ovn_controller[95281]: 2025-11-29T06:49:15Z|00032|binding|INFO|Releasing lport 789be005-db43-4c16-8d31-448144c818e2 from this chassis (sb_readonly=0)
Nov 29 06:49:15 compute-0 ovn_controller[95281]: 2025-11-29T06:49:15Z|00033|binding|INFO|Setting lport 789be005-db43-4c16-8d31-448144c818e2 down in Southbound
Nov 29 06:49:15 compute-0 ovn_controller[95281]: 2025-11-29T06:49:15Z|00034|binding|INFO|Removing iface tap789be005-db ovn-installed in OVS
Nov 29 06:49:15 compute-0 nova_compute[187185]: 2025-11-29 06:49:15.987 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:15.997 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3a:47:de 10.1.0.40 fdfe:381f:8400::1f6'], port_security=['fa:16:3e:3a:47:de 10.1.0.40 fdfe:381f:8400::1f6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.40/26 fdfe:381f:8400::1f6/64', 'neutron:device_id': '67c594e4-def3-4964-bd5e-472a63536c4c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-425e933e-ca72-466c-8d2b-499c7ba67318', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5cacaa01-dff2-46af-9e49-4a741508795b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=236265de-856a-468e-8ed3-00d3e824203d, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=789be005-db43-4c16-8d31-448144c818e2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 06:49:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:15.999 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 789be005-db43-4c16-8d31-448144c818e2 in datapath 425e933e-ca72-466c-8d2b-499c7ba67318 unbound from our chassis
Nov 29 06:49:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:16.000 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 425e933e-ca72-466c-8d2b-499c7ba67318, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 06:49:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:16.002 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[c2d9c217-ad5c-45a2-abd5-ed97939c28b4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:49:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:16.002 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318 namespace which is not needed anymore
Nov 29 06:49:16 compute-0 nova_compute[187185]: 2025-11-29 06:49:16.005 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:16 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000006.scope: Deactivated successfully.
Nov 29 06:49:16 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000006.scope: Consumed 12.847s CPU time.
Nov 29 06:49:16 compute-0 systemd-machined[153486]: Machine qemu-2-instance-00000006 terminated.
Nov 29 06:49:16 compute-0 neutron-haproxy-ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318[214375]: [NOTICE]   (214379) : haproxy version is 2.8.14-c23fe91
Nov 29 06:49:16 compute-0 neutron-haproxy-ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318[214375]: [NOTICE]   (214379) : path to executable is /usr/sbin/haproxy
Nov 29 06:49:16 compute-0 neutron-haproxy-ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318[214375]: [WARNING]  (214379) : Exiting Master process...
Nov 29 06:49:16 compute-0 neutron-haproxy-ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318[214375]: [WARNING]  (214379) : Exiting Master process...
Nov 29 06:49:16 compute-0 neutron-haproxy-ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318[214375]: [ALERT]    (214379) : Current worker (214381) exited with code 143 (Terminated)
Nov 29 06:49:16 compute-0 neutron-haproxy-ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318[214375]: [WARNING]  (214379) : All workers exited. Exiting... (0)
Nov 29 06:49:16 compute-0 systemd[1]: libpod-b0e1b2279ba69f682067171ecc4843535797da6aa9500436f6d8f17fbeab084e.scope: Deactivated successfully.
Nov 29 06:49:16 compute-0 podman[214505]: 2025-11-29 06:49:16.152442727 +0000 UTC m=+0.049249465 container died b0e1b2279ba69f682067171ecc4843535797da6aa9500436f6d8f17fbeab084e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true)
Nov 29 06:49:16 compute-0 nova_compute[187185]: 2025-11-29 06:49:16.169 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:16 compute-0 nova_compute[187185]: 2025-11-29 06:49:16.175 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:16 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b0e1b2279ba69f682067171ecc4843535797da6aa9500436f6d8f17fbeab084e-userdata-shm.mount: Deactivated successfully.
Nov 29 06:49:16 compute-0 systemd[1]: var-lib-containers-storage-overlay-76dc127a5df22cd429196de72b5c18c6b29843d704c6efa6ebfd803b1170bf2a-merged.mount: Deactivated successfully.
Nov 29 06:49:16 compute-0 podman[214505]: 2025-11-29 06:49:16.194519007 +0000 UTC m=+0.091325755 container cleanup b0e1b2279ba69f682067171ecc4843535797da6aa9500436f6d8f17fbeab084e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 29 06:49:16 compute-0 systemd[1]: libpod-conmon-b0e1b2279ba69f682067171ecc4843535797da6aa9500436f6d8f17fbeab084e.scope: Deactivated successfully.
Nov 29 06:49:16 compute-0 nova_compute[187185]: 2025-11-29 06:49:16.217 187189 INFO nova.virt.libvirt.driver [-] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Instance destroyed successfully.
Nov 29 06:49:16 compute-0 nova_compute[187185]: 2025-11-29 06:49:16.218 187189 DEBUG nova.objects.instance [None req-42c7baf0-1e84-4894-a21a-a6c4a68d379b 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lazy-loading 'resources' on Instance uuid 67c594e4-def3-4964-bd5e-472a63536c4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:49:16 compute-0 nova_compute[187185]: 2025-11-29 06:49:16.238 187189 DEBUG nova.virt.libvirt.vif [None req-42c7baf0-1e84-4894-a21a-a6c4a68d379b 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T06:47:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-526752650-3',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-526752650-3',id=6,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=2,launched_at=2025-11-29T06:49:01Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6d2e7db012114f9eb8e8e1b0123c9974',ramdisk_id='',reservation_id='r-o9m0h1rn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AutoAllocateNetworkTest-224859463',owner_user_name='tempest-AutoAllocateNetworkTest-224859463-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T06:49:01Z,user_data=None,user_id='7a31c969c2f744a9810fc9890dd7acb2',uuid=67c594e4-def3-4964-bd5e-472a63536c4c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "789be005-db43-4c16-8d31-448144c818e2", "address": "fa:16:3e:3a:47:de", "network": {"id": "425e933e-ca72-466c-8d2b-499c7ba67318", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::1f6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.40", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d2e7db012114f9eb8e8e1b0123c9974", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap789be005-db", "ovs_interfaceid": "789be005-db43-4c16-8d31-448144c818e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 06:49:16 compute-0 nova_compute[187185]: 2025-11-29 06:49:16.239 187189 DEBUG nova.network.os_vif_util [None req-42c7baf0-1e84-4894-a21a-a6c4a68d379b 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Converting VIF {"id": "789be005-db43-4c16-8d31-448144c818e2", "address": "fa:16:3e:3a:47:de", "network": {"id": "425e933e-ca72-466c-8d2b-499c7ba67318", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::1f6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.40", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d2e7db012114f9eb8e8e1b0123c9974", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap789be005-db", "ovs_interfaceid": "789be005-db43-4c16-8d31-448144c818e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 06:49:16 compute-0 nova_compute[187185]: 2025-11-29 06:49:16.240 187189 DEBUG nova.network.os_vif_util [None req-42c7baf0-1e84-4894-a21a-a6c4a68d379b 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3a:47:de,bridge_name='br-int',has_traffic_filtering=True,id=789be005-db43-4c16-8d31-448144c818e2,network=Network(425e933e-ca72-466c-8d2b-499c7ba67318),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap789be005-db') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 06:49:16 compute-0 nova_compute[187185]: 2025-11-29 06:49:16.241 187189 DEBUG os_vif [None req-42c7baf0-1e84-4894-a21a-a6c4a68d379b 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3a:47:de,bridge_name='br-int',has_traffic_filtering=True,id=789be005-db43-4c16-8d31-448144c818e2,network=Network(425e933e-ca72-466c-8d2b-499c7ba67318),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap789be005-db') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 06:49:16 compute-0 nova_compute[187185]: 2025-11-29 06:49:16.243 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:16 compute-0 nova_compute[187185]: 2025-11-29 06:49:16.244 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap789be005-db, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:49:16 compute-0 nova_compute[187185]: 2025-11-29 06:49:16.246 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:16 compute-0 nova_compute[187185]: 2025-11-29 06:49:16.247 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:16 compute-0 nova_compute[187185]: 2025-11-29 06:49:16.251 187189 INFO os_vif [None req-42c7baf0-1e84-4894-a21a-a6c4a68d379b 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3a:47:de,bridge_name='br-int',has_traffic_filtering=True,id=789be005-db43-4c16-8d31-448144c818e2,network=Network(425e933e-ca72-466c-8d2b-499c7ba67318),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap789be005-db')
Nov 29 06:49:16 compute-0 nova_compute[187185]: 2025-11-29 06:49:16.253 187189 INFO nova.virt.libvirt.driver [None req-42c7baf0-1e84-4894-a21a-a6c4a68d379b 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Deleting instance files /var/lib/nova/instances/67c594e4-def3-4964-bd5e-472a63536c4c_del
Nov 29 06:49:16 compute-0 nova_compute[187185]: 2025-11-29 06:49:16.253 187189 INFO nova.virt.libvirt.driver [None req-42c7baf0-1e84-4894-a21a-a6c4a68d379b 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Deletion of /var/lib/nova/instances/67c594e4-def3-4964-bd5e-472a63536c4c_del complete
Nov 29 06:49:16 compute-0 podman[214547]: 2025-11-29 06:49:16.272952923 +0000 UTC m=+0.053480255 container remove b0e1b2279ba69f682067171ecc4843535797da6aa9500436f6d8f17fbeab084e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:49:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:16.277 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[f0388768-3612-4b38-8f4b-32d87a2cc51a]: (4, ('Sat Nov 29 06:49:16 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318 (b0e1b2279ba69f682067171ecc4843535797da6aa9500436f6d8f17fbeab084e)\nb0e1b2279ba69f682067171ecc4843535797da6aa9500436f6d8f17fbeab084e\nSat Nov 29 06:49:16 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318 (b0e1b2279ba69f682067171ecc4843535797da6aa9500436f6d8f17fbeab084e)\nb0e1b2279ba69f682067171ecc4843535797da6aa9500436f6d8f17fbeab084e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:49:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:16.279 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6a7523a0-dc94-418a-aa14-84bebca4829b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:49:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:16.280 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap425e933e-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:49:16 compute-0 nova_compute[187185]: 2025-11-29 06:49:16.281 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:16 compute-0 kernel: tap425e933e-c0: left promiscuous mode
Nov 29 06:49:16 compute-0 nova_compute[187185]: 2025-11-29 06:49:16.284 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:16.286 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d27ccdb0-7389-406d-8bac-1330c5c34644]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:49:16 compute-0 nova_compute[187185]: 2025-11-29 06:49:16.297 187189 DEBUG nova.compute.manager [req-7a0eae31-976c-4133-883b-e868a0c512a7 req-c9f63369-e11a-4233-ab1e-0623b158f3f0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Received event network-vif-unplugged-789be005-db43-4c16-8d31-448144c818e2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:49:16 compute-0 nova_compute[187185]: 2025-11-29 06:49:16.298 187189 DEBUG oslo_concurrency.lockutils [req-7a0eae31-976c-4133-883b-e868a0c512a7 req-c9f63369-e11a-4233-ab1e-0623b158f3f0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "67c594e4-def3-4964-bd5e-472a63536c4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:49:16 compute-0 nova_compute[187185]: 2025-11-29 06:49:16.298 187189 DEBUG oslo_concurrency.lockutils [req-7a0eae31-976c-4133-883b-e868a0c512a7 req-c9f63369-e11a-4233-ab1e-0623b158f3f0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "67c594e4-def3-4964-bd5e-472a63536c4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:49:16 compute-0 nova_compute[187185]: 2025-11-29 06:49:16.299 187189 DEBUG oslo_concurrency.lockutils [req-7a0eae31-976c-4133-883b-e868a0c512a7 req-c9f63369-e11a-4233-ab1e-0623b158f3f0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "67c594e4-def3-4964-bd5e-472a63536c4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:49:16 compute-0 nova_compute[187185]: 2025-11-29 06:49:16.299 187189 DEBUG nova.compute.manager [req-7a0eae31-976c-4133-883b-e868a0c512a7 req-c9f63369-e11a-4233-ab1e-0623b158f3f0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] No waiting events found dispatching network-vif-unplugged-789be005-db43-4c16-8d31-448144c818e2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 06:49:16 compute-0 nova_compute[187185]: 2025-11-29 06:49:16.299 187189 DEBUG nova.compute.manager [req-7a0eae31-976c-4133-883b-e868a0c512a7 req-c9f63369-e11a-4233-ab1e-0623b158f3f0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Received event network-vif-unplugged-789be005-db43-4c16-8d31-448144c818e2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 06:49:16 compute-0 nova_compute[187185]: 2025-11-29 06:49:16.300 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:16.306 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[c355c4ce-1d45-41b6-8651-a140bd7ad1a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:49:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:16.307 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[14202f53-1078-40ad-848b-ac9ca3b4ce2c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:49:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:16.321 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6b6186c4-17f4-4799-9c63-2c059fc77bb8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 438397, 'reachable_time': 40426, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214562, 'error': None, 'target': 'ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:49:16 compute-0 systemd[1]: run-netns-ovnmeta\x2d425e933e\x2dca72\x2d466c\x2d8d2b\x2d499c7ba67318.mount: Deactivated successfully.
Nov 29 06:49:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:16.331 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 06:49:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:16.332 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[73e0d9e4-4b3d-431c-b7c6-13924ea02670]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:49:16 compute-0 nova_compute[187185]: 2025-11-29 06:49:16.346 187189 DEBUG nova.virt.libvirt.host [None req-42c7baf0-1e84-4894-a21a-a6c4a68d379b 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Nov 29 06:49:16 compute-0 nova_compute[187185]: 2025-11-29 06:49:16.346 187189 INFO nova.virt.libvirt.host [None req-42c7baf0-1e84-4894-a21a-a6c4a68d379b 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] UEFI support detected
Nov 29 06:49:16 compute-0 nova_compute[187185]: 2025-11-29 06:49:16.348 187189 INFO nova.compute.manager [None req-42c7baf0-1e84-4894-a21a-a6c4a68d379b 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Took 0.41 seconds to destroy the instance on the hypervisor.
Nov 29 06:49:16 compute-0 nova_compute[187185]: 2025-11-29 06:49:16.349 187189 DEBUG oslo.service.loopingcall [None req-42c7baf0-1e84-4894-a21a-a6c4a68d379b 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 06:49:16 compute-0 nova_compute[187185]: 2025-11-29 06:49:16.349 187189 DEBUG nova.compute.manager [-] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 06:49:16 compute-0 nova_compute[187185]: 2025-11-29 06:49:16.349 187189 DEBUG nova.network.neutron [-] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 06:49:16 compute-0 nova_compute[187185]: 2025-11-29 06:49:16.637 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:17 compute-0 podman[214564]: 2025-11-29 06:49:17.84191748 +0000 UTC m=+0.101676900 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent)
Nov 29 06:49:18 compute-0 nova_compute[187185]: 2025-11-29 06:49:18.268 187189 DEBUG nova.network.neutron [-] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:49:18 compute-0 nova_compute[187185]: 2025-11-29 06:49:18.316 187189 INFO nova.compute.manager [-] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Took 1.97 seconds to deallocate network for instance.
Nov 29 06:49:18 compute-0 nova_compute[187185]: 2025-11-29 06:49:18.365 187189 DEBUG nova.compute.manager [req-f6480992-9064-4e73-939b-ce0befaec7e8 req-4f256bf9-0bcf-46e5-8134-67dc73787e03 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Received event network-vif-deleted-789be005-db43-4c16-8d31-448144c818e2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:49:18 compute-0 nova_compute[187185]: 2025-11-29 06:49:18.404 187189 DEBUG oslo_concurrency.lockutils [None req-42c7baf0-1e84-4894-a21a-a6c4a68d379b 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:49:18 compute-0 nova_compute[187185]: 2025-11-29 06:49:18.404 187189 DEBUG oslo_concurrency.lockutils [None req-42c7baf0-1e84-4894-a21a-a6c4a68d379b 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:49:18 compute-0 nova_compute[187185]: 2025-11-29 06:49:18.480 187189 DEBUG nova.compute.manager [req-fe4c9d14-63aa-4d4f-91c1-02445be7e2a0 req-c1b5e2d2-5530-404d-93fc-50393186959c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Received event network-vif-plugged-789be005-db43-4c16-8d31-448144c818e2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:49:18 compute-0 nova_compute[187185]: 2025-11-29 06:49:18.481 187189 DEBUG oslo_concurrency.lockutils [req-fe4c9d14-63aa-4d4f-91c1-02445be7e2a0 req-c1b5e2d2-5530-404d-93fc-50393186959c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "67c594e4-def3-4964-bd5e-472a63536c4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:49:18 compute-0 nova_compute[187185]: 2025-11-29 06:49:18.481 187189 DEBUG oslo_concurrency.lockutils [req-fe4c9d14-63aa-4d4f-91c1-02445be7e2a0 req-c1b5e2d2-5530-404d-93fc-50393186959c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "67c594e4-def3-4964-bd5e-472a63536c4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:49:18 compute-0 nova_compute[187185]: 2025-11-29 06:49:18.481 187189 DEBUG oslo_concurrency.lockutils [req-fe4c9d14-63aa-4d4f-91c1-02445be7e2a0 req-c1b5e2d2-5530-404d-93fc-50393186959c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "67c594e4-def3-4964-bd5e-472a63536c4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:49:18 compute-0 nova_compute[187185]: 2025-11-29 06:49:18.482 187189 DEBUG nova.compute.manager [req-fe4c9d14-63aa-4d4f-91c1-02445be7e2a0 req-c1b5e2d2-5530-404d-93fc-50393186959c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] No waiting events found dispatching network-vif-plugged-789be005-db43-4c16-8d31-448144c818e2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 06:49:18 compute-0 nova_compute[187185]: 2025-11-29 06:49:18.482 187189 WARNING nova.compute.manager [req-fe4c9d14-63aa-4d4f-91c1-02445be7e2a0 req-c1b5e2d2-5530-404d-93fc-50393186959c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Received unexpected event network-vif-plugged-789be005-db43-4c16-8d31-448144c818e2 for instance with vm_state deleted and task_state None.
Nov 29 06:49:18 compute-0 nova_compute[187185]: 2025-11-29 06:49:18.493 187189 DEBUG nova.compute.provider_tree [None req-42c7baf0-1e84-4894-a21a-a6c4a68d379b 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:49:18 compute-0 nova_compute[187185]: 2025-11-29 06:49:18.506 187189 DEBUG nova.scheduler.client.report [None req-42c7baf0-1e84-4894-a21a-a6c4a68d379b 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:49:18 compute-0 nova_compute[187185]: 2025-11-29 06:49:18.527 187189 DEBUG oslo_concurrency.lockutils [None req-42c7baf0-1e84-4894-a21a-a6c4a68d379b 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:49:18 compute-0 nova_compute[187185]: 2025-11-29 06:49:18.556 187189 INFO nova.scheduler.client.report [None req-42c7baf0-1e84-4894-a21a-a6c4a68d379b 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Deleted allocations for instance 67c594e4-def3-4964-bd5e-472a63536c4c
Nov 29 06:49:18 compute-0 nova_compute[187185]: 2025-11-29 06:49:18.664 187189 DEBUG oslo_concurrency.lockutils [None req-42c7baf0-1e84-4894-a21a-a6c4a68d379b 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "67c594e4-def3-4964-bd5e-472a63536c4c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.747s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:49:21 compute-0 nova_compute[187185]: 2025-11-29 06:49:21.245 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:21 compute-0 nova_compute[187185]: 2025-11-29 06:49:21.638 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:23 compute-0 podman[214584]: 2025-11-29 06:49:23.870470303 +0000 UTC m=+0.129872034 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller)
Nov 29 06:49:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:24.806 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:49:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:24.807 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:49:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:24.807 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:49:25 compute-0 podman[214610]: 2025-11-29 06:49:25.827226806 +0000 UTC m=+0.082624447 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 06:49:25 compute-0 nova_compute[187185]: 2025-11-29 06:49:25.843 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:25 compute-0 podman[214634]: 2025-11-29 06:49:25.967180296 +0000 UTC m=+0.089275166 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, 
config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 06:49:26 compute-0 nova_compute[187185]: 2025-11-29 06:49:26.247 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:26 compute-0 nova_compute[187185]: 2025-11-29 06:49:26.686 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:29 compute-0 nova_compute[187185]: 2025-11-29 06:49:29.191 187189 DEBUG oslo_concurrency.lockutils [None req-53082ad0-c52a-467c-8ce1-dd3baf6d9301 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Acquiring lock "afe4ae44-2787-45ec-8e0b-72fa7297cebb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:49:29 compute-0 nova_compute[187185]: 2025-11-29 06:49:29.192 187189 DEBUG oslo_concurrency.lockutils [None req-53082ad0-c52a-467c-8ce1-dd3baf6d9301 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "afe4ae44-2787-45ec-8e0b-72fa7297cebb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:49:29 compute-0 nova_compute[187185]: 2025-11-29 06:49:29.192 187189 DEBUG oslo_concurrency.lockutils [None req-53082ad0-c52a-467c-8ce1-dd3baf6d9301 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Acquiring lock "afe4ae44-2787-45ec-8e0b-72fa7297cebb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:49:29 compute-0 nova_compute[187185]: 2025-11-29 06:49:29.192 187189 DEBUG oslo_concurrency.lockutils [None req-53082ad0-c52a-467c-8ce1-dd3baf6d9301 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "afe4ae44-2787-45ec-8e0b-72fa7297cebb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:49:29 compute-0 nova_compute[187185]: 2025-11-29 06:49:29.192 187189 DEBUG oslo_concurrency.lockutils [None req-53082ad0-c52a-467c-8ce1-dd3baf6d9301 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "afe4ae44-2787-45ec-8e0b-72fa7297cebb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:49:29 compute-0 nova_compute[187185]: 2025-11-29 06:49:29.207 187189 INFO nova.compute.manager [None req-53082ad0-c52a-467c-8ce1-dd3baf6d9301 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Terminating instance
Nov 29 06:49:29 compute-0 nova_compute[187185]: 2025-11-29 06:49:29.216 187189 DEBUG oslo_concurrency.lockutils [None req-53082ad0-c52a-467c-8ce1-dd3baf6d9301 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Acquiring lock "refresh_cache-afe4ae44-2787-45ec-8e0b-72fa7297cebb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:49:29 compute-0 nova_compute[187185]: 2025-11-29 06:49:29.217 187189 DEBUG oslo_concurrency.lockutils [None req-53082ad0-c52a-467c-8ce1-dd3baf6d9301 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Acquired lock "refresh_cache-afe4ae44-2787-45ec-8e0b-72fa7297cebb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:49:29 compute-0 nova_compute[187185]: 2025-11-29 06:49:29.217 187189 DEBUG nova.network.neutron [None req-53082ad0-c52a-467c-8ce1-dd3baf6d9301 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 06:49:30 compute-0 nova_compute[187185]: 2025-11-29 06:49:30.047 187189 DEBUG nova.network.neutron [None req-53082ad0-c52a-467c-8ce1-dd3baf6d9301 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 06:49:31 compute-0 nova_compute[187185]: 2025-11-29 06:49:31.162 187189 DEBUG nova.network.neutron [None req-53082ad0-c52a-467c-8ce1-dd3baf6d9301 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:49:31 compute-0 nova_compute[187185]: 2025-11-29 06:49:31.191 187189 DEBUG oslo_concurrency.lockutils [None req-53082ad0-c52a-467c-8ce1-dd3baf6d9301 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Releasing lock "refresh_cache-afe4ae44-2787-45ec-8e0b-72fa7297cebb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:49:31 compute-0 nova_compute[187185]: 2025-11-29 06:49:31.192 187189 DEBUG nova.compute.manager [None req-53082ad0-c52a-467c-8ce1-dd3baf6d9301 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 06:49:31 compute-0 nova_compute[187185]: 2025-11-29 06:49:31.216 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764398956.2149346, 67c594e4-def3-4964-bd5e-472a63536c4c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:49:31 compute-0 nova_compute[187185]: 2025-11-29 06:49:31.217 187189 INFO nova.compute.manager [-] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] VM Stopped (Lifecycle Event)
Nov 29 06:49:31 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Deactivated successfully.
Nov 29 06:49:31 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Consumed 15.595s CPU time.
Nov 29 06:49:31 compute-0 systemd-machined[153486]: Machine qemu-1-instance-00000002 terminated.
Nov 29 06:49:31 compute-0 nova_compute[187185]: 2025-11-29 06:49:31.252 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:31 compute-0 nova_compute[187185]: 2025-11-29 06:49:31.327 187189 DEBUG nova.compute.manager [None req-08ec6454-804d-4d06-a0ff-f533e139ca58 - - - - - -] [instance: 67c594e4-def3-4964-bd5e-472a63536c4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:49:31 compute-0 nova_compute[187185]: 2025-11-29 06:49:31.438 187189 INFO nova.virt.libvirt.driver [-] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Instance destroyed successfully.
Nov 29 06:49:31 compute-0 nova_compute[187185]: 2025-11-29 06:49:31.439 187189 DEBUG nova.objects.instance [None req-53082ad0-c52a-467c-8ce1-dd3baf6d9301 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lazy-loading 'resources' on Instance uuid afe4ae44-2787-45ec-8e0b-72fa7297cebb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:49:31 compute-0 nova_compute[187185]: 2025-11-29 06:49:31.588 187189 INFO nova.virt.libvirt.driver [None req-53082ad0-c52a-467c-8ce1-dd3baf6d9301 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Deleting instance files /var/lib/nova/instances/afe4ae44-2787-45ec-8e0b-72fa7297cebb_del
Nov 29 06:49:31 compute-0 nova_compute[187185]: 2025-11-29 06:49:31.589 187189 INFO nova.virt.libvirt.driver [None req-53082ad0-c52a-467c-8ce1-dd3baf6d9301 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Deletion of /var/lib/nova/instances/afe4ae44-2787-45ec-8e0b-72fa7297cebb_del complete
Nov 29 06:49:31 compute-0 nova_compute[187185]: 2025-11-29 06:49:31.679 187189 INFO nova.compute.manager [None req-53082ad0-c52a-467c-8ce1-dd3baf6d9301 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Took 0.49 seconds to destroy the instance on the hypervisor.
Nov 29 06:49:31 compute-0 nova_compute[187185]: 2025-11-29 06:49:31.679 187189 DEBUG oslo.service.loopingcall [None req-53082ad0-c52a-467c-8ce1-dd3baf6d9301 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 06:49:31 compute-0 nova_compute[187185]: 2025-11-29 06:49:31.680 187189 DEBUG nova.compute.manager [-] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 06:49:31 compute-0 nova_compute[187185]: 2025-11-29 06:49:31.680 187189 DEBUG nova.network.neutron [-] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 06:49:31 compute-0 nova_compute[187185]: 2025-11-29 06:49:31.688 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:31 compute-0 nova_compute[187185]: 2025-11-29 06:49:31.986 187189 DEBUG nova.network.neutron [-] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 06:49:32 compute-0 nova_compute[187185]: 2025-11-29 06:49:32.003 187189 DEBUG nova.network.neutron [-] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:49:32 compute-0 nova_compute[187185]: 2025-11-29 06:49:32.019 187189 INFO nova.compute.manager [-] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Took 0.34 seconds to deallocate network for instance.
Nov 29 06:49:32 compute-0 nova_compute[187185]: 2025-11-29 06:49:32.091 187189 DEBUG oslo_concurrency.lockutils [None req-53082ad0-c52a-467c-8ce1-dd3baf6d9301 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:49:32 compute-0 nova_compute[187185]: 2025-11-29 06:49:32.092 187189 DEBUG oslo_concurrency.lockutils [None req-53082ad0-c52a-467c-8ce1-dd3baf6d9301 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:49:32 compute-0 nova_compute[187185]: 2025-11-29 06:49:32.158 187189 DEBUG nova.compute.provider_tree [None req-53082ad0-c52a-467c-8ce1-dd3baf6d9301 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:49:32 compute-0 nova_compute[187185]: 2025-11-29 06:49:32.173 187189 DEBUG nova.scheduler.client.report [None req-53082ad0-c52a-467c-8ce1-dd3baf6d9301 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:49:32 compute-0 nova_compute[187185]: 2025-11-29 06:49:32.265 187189 DEBUG oslo_concurrency.lockutils [None req-53082ad0-c52a-467c-8ce1-dd3baf6d9301 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:49:32 compute-0 nova_compute[187185]: 2025-11-29 06:49:32.443 187189 INFO nova.scheduler.client.report [None req-53082ad0-c52a-467c-8ce1-dd3baf6d9301 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Deleted allocations for instance afe4ae44-2787-45ec-8e0b-72fa7297cebb
Nov 29 06:49:32 compute-0 nova_compute[187185]: 2025-11-29 06:49:32.882 187189 DEBUG oslo_concurrency.lockutils [None req-53082ad0-c52a-467c-8ce1-dd3baf6d9301 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "afe4ae44-2787-45ec-8e0b-72fa7297cebb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.690s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:49:33 compute-0 sshd-session[214666]: Invalid user celery from 45.202.211.6 port 49186
Nov 29 06:49:33 compute-0 podman[214668]: 2025-11-29 06:49:33.800820577 +0000 UTC m=+0.062069521 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:49:33 compute-0 sshd-session[214666]: Received disconnect from 45.202.211.6 port 49186:11: Bye Bye [preauth]
Nov 29 06:49:33 compute-0 sshd-session[214666]: Disconnected from invalid user celery 45.202.211.6 port 49186 [preauth]
Nov 29 06:49:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:34.246 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 06:49:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:34.248 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 06:49:34 compute-0 nova_compute[187185]: 2025-11-29 06:49:34.299 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:36 compute-0 nova_compute[187185]: 2025-11-29 06:49:36.256 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:36 compute-0 nova_compute[187185]: 2025-11-29 06:49:36.695 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:39.252 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:49:41 compute-0 nova_compute[187185]: 2025-11-29 06:49:41.260 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:41 compute-0 nova_compute[187185]: 2025-11-29 06:49:41.697 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:43 compute-0 nova_compute[187185]: 2025-11-29 06:49:43.709 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:49:43 compute-0 nova_compute[187185]: 2025-11-29 06:49:43.710 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:49:43 compute-0 nova_compute[187185]: 2025-11-29 06:49:43.731 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:49:43 compute-0 nova_compute[187185]: 2025-11-29 06:49:43.731 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 06:49:43 compute-0 nova_compute[187185]: 2025-11-29 06:49:43.732 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 06:49:43 compute-0 nova_compute[187185]: 2025-11-29 06:49:43.744 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 06:49:43 compute-0 nova_compute[187185]: 2025-11-29 06:49:43.744 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:49:43 compute-0 nova_compute[187185]: 2025-11-29 06:49:43.744 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:49:43 compute-0 nova_compute[187185]: 2025-11-29 06:49:43.745 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:49:43 compute-0 nova_compute[187185]: 2025-11-29 06:49:43.745 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:49:43 compute-0 nova_compute[187185]: 2025-11-29 06:49:43.745 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:49:43 compute-0 nova_compute[187185]: 2025-11-29 06:49:43.745 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:49:43 compute-0 nova_compute[187185]: 2025-11-29 06:49:43.746 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 06:49:43 compute-0 nova_compute[187185]: 2025-11-29 06:49:43.746 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:49:43 compute-0 nova_compute[187185]: 2025-11-29 06:49:43.771 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:49:43 compute-0 nova_compute[187185]: 2025-11-29 06:49:43.771 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:49:43 compute-0 nova_compute[187185]: 2025-11-29 06:49:43.772 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:49:43 compute-0 nova_compute[187185]: 2025-11-29 06:49:43.772 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 06:49:43 compute-0 nova_compute[187185]: 2025-11-29 06:49:43.913 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:49:43 compute-0 nova_compute[187185]: 2025-11-29 06:49:43.915 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5782MB free_disk=73.34293746948242GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 06:49:43 compute-0 nova_compute[187185]: 2025-11-29 06:49:43.915 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:49:43 compute-0 nova_compute[187185]: 2025-11-29 06:49:43.916 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:49:44 compute-0 nova_compute[187185]: 2025-11-29 06:49:44.020 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 06:49:44 compute-0 nova_compute[187185]: 2025-11-29 06:49:44.020 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 06:49:44 compute-0 nova_compute[187185]: 2025-11-29 06:49:44.064 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:49:44 compute-0 nova_compute[187185]: 2025-11-29 06:49:44.094 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:49:44 compute-0 nova_compute[187185]: 2025-11-29 06:49:44.119 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 06:49:44 compute-0 nova_compute[187185]: 2025-11-29 06:49:44.120 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.204s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:49:46 compute-0 nova_compute[187185]: 2025-11-29 06:49:46.263 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:46 compute-0 nova_compute[187185]: 2025-11-29 06:49:46.437 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764398971.4353776, afe4ae44-2787-45ec-8e0b-72fa7297cebb => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:49:46 compute-0 nova_compute[187185]: 2025-11-29 06:49:46.437 187189 INFO nova.compute.manager [-] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] VM Stopped (Lifecycle Event)
Nov 29 06:49:46 compute-0 nova_compute[187185]: 2025-11-29 06:49:46.504 187189 DEBUG nova.compute.manager [None req-baa45b5a-b9d9-447c-b200-aabd2c0a74ba - - - - - -] [instance: afe4ae44-2787-45ec-8e0b-72fa7297cebb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:49:46 compute-0 nova_compute[187185]: 2025-11-29 06:49:46.698 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:46 compute-0 podman[214690]: 2025-11-29 06:49:46.788679062 +0000 UTC m=+0.049057667 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 06:49:46 compute-0 podman[214689]: 2025-11-29 06:49:46.794050313 +0000 UTC m=+0.057317240 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.buildah.version=1.33.7, container_name=openstack_network_exporter, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, name=ubi9-minimal, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container)
Nov 29 06:49:47 compute-0 nova_compute[187185]: 2025-11-29 06:49:47.631 187189 DEBUG oslo_concurrency.lockutils [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Acquiring lock "e6b5b54b-9532-4f51-a346-42dee946a9ef" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:49:47 compute-0 nova_compute[187185]: 2025-11-29 06:49:47.631 187189 DEBUG oslo_concurrency.lockutils [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Lock "e6b5b54b-9532-4f51-a346-42dee946a9ef" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:49:47 compute-0 nova_compute[187185]: 2025-11-29 06:49:47.654 187189 DEBUG nova.compute.manager [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 06:49:47 compute-0 nova_compute[187185]: 2025-11-29 06:49:47.787 187189 DEBUG oslo_concurrency.lockutils [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:49:47 compute-0 nova_compute[187185]: 2025-11-29 06:49:47.787 187189 DEBUG oslo_concurrency.lockutils [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:49:47 compute-0 nova_compute[187185]: 2025-11-29 06:49:47.795 187189 DEBUG nova.virt.hardware [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 06:49:47 compute-0 nova_compute[187185]: 2025-11-29 06:49:47.796 187189 INFO nova.compute.claims [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Claim successful on node compute-0.ctlplane.example.com
Nov 29 06:49:47 compute-0 nova_compute[187185]: 2025-11-29 06:49:47.933 187189 DEBUG nova.compute.provider_tree [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:49:47 compute-0 nova_compute[187185]: 2025-11-29 06:49:47.947 187189 DEBUG nova.scheduler.client.report [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:49:47 compute-0 nova_compute[187185]: 2025-11-29 06:49:47.982 187189 DEBUG oslo_concurrency.lockutils [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.195s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:49:47 compute-0 nova_compute[187185]: 2025-11-29 06:49:47.983 187189 DEBUG nova.compute.manager [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 06:49:48 compute-0 nova_compute[187185]: 2025-11-29 06:49:48.058 187189 DEBUG nova.compute.manager [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 06:49:48 compute-0 nova_compute[187185]: 2025-11-29 06:49:48.059 187189 DEBUG nova.network.neutron [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 06:49:48 compute-0 nova_compute[187185]: 2025-11-29 06:49:48.086 187189 INFO nova.virt.libvirt.driver [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 06:49:48 compute-0 nova_compute[187185]: 2025-11-29 06:49:48.115 187189 DEBUG nova.compute.manager [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 06:49:48 compute-0 nova_compute[187185]: 2025-11-29 06:49:48.282 187189 DEBUG nova.compute.manager [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 06:49:48 compute-0 nova_compute[187185]: 2025-11-29 06:49:48.283 187189 DEBUG nova.virt.libvirt.driver [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 06:49:48 compute-0 nova_compute[187185]: 2025-11-29 06:49:48.284 187189 INFO nova.virt.libvirt.driver [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Creating image(s)
Nov 29 06:49:48 compute-0 nova_compute[187185]: 2025-11-29 06:49:48.284 187189 DEBUG oslo_concurrency.lockutils [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Acquiring lock "/var/lib/nova/instances/e6b5b54b-9532-4f51-a346-42dee946a9ef/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:49:48 compute-0 nova_compute[187185]: 2025-11-29 06:49:48.284 187189 DEBUG oslo_concurrency.lockutils [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Lock "/var/lib/nova/instances/e6b5b54b-9532-4f51-a346-42dee946a9ef/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:49:48 compute-0 nova_compute[187185]: 2025-11-29 06:49:48.285 187189 DEBUG oslo_concurrency.lockutils [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Lock "/var/lib/nova/instances/e6b5b54b-9532-4f51-a346-42dee946a9ef/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:49:48 compute-0 nova_compute[187185]: 2025-11-29 06:49:48.298 187189 DEBUG oslo_concurrency.processutils [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:49:48 compute-0 nova_compute[187185]: 2025-11-29 06:49:48.350 187189 DEBUG oslo_concurrency.processutils [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:49:48 compute-0 nova_compute[187185]: 2025-11-29 06:49:48.351 187189 DEBUG oslo_concurrency.lockutils [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:49:48 compute-0 nova_compute[187185]: 2025-11-29 06:49:48.352 187189 DEBUG oslo_concurrency.lockutils [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:49:48 compute-0 nova_compute[187185]: 2025-11-29 06:49:48.362 187189 DEBUG oslo_concurrency.processutils [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:49:48 compute-0 nova_compute[187185]: 2025-11-29 06:49:48.411 187189 DEBUG oslo_concurrency.processutils [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:49:48 compute-0 nova_compute[187185]: 2025-11-29 06:49:48.412 187189 DEBUG oslo_concurrency.processutils [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/e6b5b54b-9532-4f51-a346-42dee946a9ef/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:49:48 compute-0 nova_compute[187185]: 2025-11-29 06:49:48.483 187189 DEBUG nova.policy [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a01fd01629a1493bb3fb6df5a2462226', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2b6eb92d93c24eaaa0c6a3104a54633a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 06:49:48 compute-0 nova_compute[187185]: 2025-11-29 06:49:48.533 187189 DEBUG oslo_concurrency.processutils [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/e6b5b54b-9532-4f51-a346-42dee946a9ef/disk 1073741824" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:49:48 compute-0 nova_compute[187185]: 2025-11-29 06:49:48.534 187189 DEBUG oslo_concurrency.lockutils [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.182s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:49:48 compute-0 nova_compute[187185]: 2025-11-29 06:49:48.534 187189 DEBUG oslo_concurrency.processutils [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:49:48 compute-0 nova_compute[187185]: 2025-11-29 06:49:48.589 187189 DEBUG oslo_concurrency.processutils [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:49:48 compute-0 nova_compute[187185]: 2025-11-29 06:49:48.590 187189 DEBUG nova.virt.disk.api [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Checking if we can resize image /var/lib/nova/instances/e6b5b54b-9532-4f51-a346-42dee946a9ef/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 06:49:48 compute-0 nova_compute[187185]: 2025-11-29 06:49:48.590 187189 DEBUG oslo_concurrency.processutils [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e6b5b54b-9532-4f51-a346-42dee946a9ef/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:49:48 compute-0 nova_compute[187185]: 2025-11-29 06:49:48.647 187189 DEBUG oslo_concurrency.processutils [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e6b5b54b-9532-4f51-a346-42dee946a9ef/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:49:48 compute-0 nova_compute[187185]: 2025-11-29 06:49:48.648 187189 DEBUG nova.virt.disk.api [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Cannot resize image /var/lib/nova/instances/e6b5b54b-9532-4f51-a346-42dee946a9ef/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 06:49:48 compute-0 nova_compute[187185]: 2025-11-29 06:49:48.649 187189 DEBUG nova.objects.instance [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Lazy-loading 'migration_context' on Instance uuid e6b5b54b-9532-4f51-a346-42dee946a9ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:49:48 compute-0 nova_compute[187185]: 2025-11-29 06:49:48.666 187189 DEBUG nova.virt.libvirt.driver [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 06:49:48 compute-0 nova_compute[187185]: 2025-11-29 06:49:48.666 187189 DEBUG nova.virt.libvirt.driver [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Ensure instance console log exists: /var/lib/nova/instances/e6b5b54b-9532-4f51-a346-42dee946a9ef/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 06:49:48 compute-0 nova_compute[187185]: 2025-11-29 06:49:48.667 187189 DEBUG oslo_concurrency.lockutils [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:49:48 compute-0 nova_compute[187185]: 2025-11-29 06:49:48.667 187189 DEBUG oslo_concurrency.lockutils [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:49:48 compute-0 nova_compute[187185]: 2025-11-29 06:49:48.667 187189 DEBUG oslo_concurrency.lockutils [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:49:48 compute-0 podman[214747]: 2025-11-29 06:49:48.77808512 +0000 UTC m=+0.045990771 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 06:49:50 compute-0 nova_compute[187185]: 2025-11-29 06:49:50.132 187189 DEBUG nova.network.neutron [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Successfully updated port: 04956313-39e4-4275-ab3f-18aa7a1a0e46 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 06:49:50 compute-0 nova_compute[187185]: 2025-11-29 06:49:50.163 187189 DEBUG oslo_concurrency.lockutils [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Acquiring lock "refresh_cache-e6b5b54b-9532-4f51-a346-42dee946a9ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:49:50 compute-0 nova_compute[187185]: 2025-11-29 06:49:50.163 187189 DEBUG oslo_concurrency.lockutils [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Acquired lock "refresh_cache-e6b5b54b-9532-4f51-a346-42dee946a9ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:49:50 compute-0 nova_compute[187185]: 2025-11-29 06:49:50.163 187189 DEBUG nova.network.neutron [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 06:49:50 compute-0 nova_compute[187185]: 2025-11-29 06:49:50.303 187189 DEBUG nova.compute.manager [req-5b5232a2-c8bd-47b2-83d7-2f6fdd766622 req-1c756e19-5677-407f-9d33-72551554dfdc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Received event network-changed-04956313-39e4-4275-ab3f-18aa7a1a0e46 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:49:50 compute-0 nova_compute[187185]: 2025-11-29 06:49:50.304 187189 DEBUG nova.compute.manager [req-5b5232a2-c8bd-47b2-83d7-2f6fdd766622 req-1c756e19-5677-407f-9d33-72551554dfdc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Refreshing instance network info cache due to event network-changed-04956313-39e4-4275-ab3f-18aa7a1a0e46. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 06:49:50 compute-0 nova_compute[187185]: 2025-11-29 06:49:50.304 187189 DEBUG oslo_concurrency.lockutils [req-5b5232a2-c8bd-47b2-83d7-2f6fdd766622 req-1c756e19-5677-407f-9d33-72551554dfdc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-e6b5b54b-9532-4f51-a346-42dee946a9ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:49:50 compute-0 nova_compute[187185]: 2025-11-29 06:49:50.429 187189 DEBUG nova.network.neutron [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.267 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.472 187189 DEBUG nova.network.neutron [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Updating instance_info_cache with network_info: [{"id": "04956313-39e4-4275-ab3f-18aa7a1a0e46", "address": "fa:16:3e:b8:f5:6d", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04956313-39", "ovs_interfaceid": "04956313-39e4-4275-ab3f-18aa7a1a0e46", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.491 187189 DEBUG oslo_concurrency.lockutils [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Releasing lock "refresh_cache-e6b5b54b-9532-4f51-a346-42dee946a9ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.492 187189 DEBUG nova.compute.manager [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Instance network_info: |[{"id": "04956313-39e4-4275-ab3f-18aa7a1a0e46", "address": "fa:16:3e:b8:f5:6d", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04956313-39", "ovs_interfaceid": "04956313-39e4-4275-ab3f-18aa7a1a0e46", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.492 187189 DEBUG oslo_concurrency.lockutils [req-5b5232a2-c8bd-47b2-83d7-2f6fdd766622 req-1c756e19-5677-407f-9d33-72551554dfdc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-e6b5b54b-9532-4f51-a346-42dee946a9ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.492 187189 DEBUG nova.network.neutron [req-5b5232a2-c8bd-47b2-83d7-2f6fdd766622 req-1c756e19-5677-407f-9d33-72551554dfdc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Refreshing network info cache for port 04956313-39e4-4275-ab3f-18aa7a1a0e46 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.496 187189 DEBUG nova.virt.libvirt.driver [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Start _get_guest_xml network_info=[{"id": "04956313-39e4-4275-ab3f-18aa7a1a0e46", "address": "fa:16:3e:b8:f5:6d", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04956313-39", "ovs_interfaceid": "04956313-39e4-4275-ab3f-18aa7a1a0e46", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.500 187189 WARNING nova.virt.libvirt.driver [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.515 187189 DEBUG nova.virt.libvirt.host [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.516 187189 DEBUG nova.virt.libvirt.host [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.519 187189 DEBUG nova.virt.libvirt.host [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.520 187189 DEBUG nova.virt.libvirt.host [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.521 187189 DEBUG nova.virt.libvirt.driver [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.521 187189 DEBUG nova.virt.hardware [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.522 187189 DEBUG nova.virt.hardware [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.522 187189 DEBUG nova.virt.hardware [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.522 187189 DEBUG nova.virt.hardware [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.522 187189 DEBUG nova.virt.hardware [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.523 187189 DEBUG nova.virt.hardware [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.523 187189 DEBUG nova.virt.hardware [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.523 187189 DEBUG nova.virt.hardware [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.523 187189 DEBUG nova.virt.hardware [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.524 187189 DEBUG nova.virt.hardware [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.524 187189 DEBUG nova.virt.hardware [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.527 187189 DEBUG nova.virt.libvirt.vif [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:49:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-797555638',display_name='tempest-LiveMigrationTest-server-797555638',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-livemigrationtest-server-797555638',id=10,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2b6eb92d93c24eaaa0c6a3104a54633a',ramdisk_id='',reservation_id='r-duitc0f8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveMigrationTest-440211682',owner_user_name='tempest-LiveMigrationTest-440211682-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:49:48Z,user_data=None,user_id='a01fd01629a1493bb3fb6df5a2462226',uuid=e6b5b54b-9532-4f51-a346-42dee946a9ef,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "04956313-39e4-4275-ab3f-18aa7a1a0e46", "address": "fa:16:3e:b8:f5:6d", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04956313-39", "ovs_interfaceid": "04956313-39e4-4275-ab3f-18aa7a1a0e46", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.527 187189 DEBUG nova.network.os_vif_util [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Converting VIF {"id": "04956313-39e4-4275-ab3f-18aa7a1a0e46", "address": "fa:16:3e:b8:f5:6d", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04956313-39", "ovs_interfaceid": "04956313-39e4-4275-ab3f-18aa7a1a0e46", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.528 187189 DEBUG nova.network.os_vif_util [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b8:f5:6d,bridge_name='br-int',has_traffic_filtering=True,id=04956313-39e4-4275-ab3f-18aa7a1a0e46,network=Network(24ee44f0-2b10-459c-aabf-bf9ef2c8d950),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap04956313-39') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.529 187189 DEBUG nova.objects.instance [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Lazy-loading 'pci_devices' on Instance uuid e6b5b54b-9532-4f51-a346-42dee946a9ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.547 187189 DEBUG nova.virt.libvirt.driver [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] End _get_guest_xml xml=<domain type="kvm">
Nov 29 06:49:51 compute-0 nova_compute[187185]:   <uuid>e6b5b54b-9532-4f51-a346-42dee946a9ef</uuid>
Nov 29 06:49:51 compute-0 nova_compute[187185]:   <name>instance-0000000a</name>
Nov 29 06:49:51 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 06:49:51 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 06:49:51 compute-0 nova_compute[187185]:   <metadata>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 06:49:51 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:       <nova:name>tempest-LiveMigrationTest-server-797555638</nova:name>
Nov 29 06:49:51 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 06:49:51</nova:creationTime>
Nov 29 06:49:51 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 06:49:51 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 06:49:51 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 06:49:51 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 06:49:51 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 06:49:51 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 06:49:51 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 06:49:51 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 06:49:51 compute-0 nova_compute[187185]:         <nova:user uuid="a01fd01629a1493bb3fb6df5a2462226">tempest-LiveMigrationTest-440211682-project-member</nova:user>
Nov 29 06:49:51 compute-0 nova_compute[187185]:         <nova:project uuid="2b6eb92d93c24eaaa0c6a3104a54633a">tempest-LiveMigrationTest-440211682</nova:project>
Nov 29 06:49:51 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 06:49:51 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 06:49:51 compute-0 nova_compute[187185]:         <nova:port uuid="04956313-39e4-4275-ab3f-18aa7a1a0e46">
Nov 29 06:49:51 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 06:49:51 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 06:49:51 compute-0 nova_compute[187185]:   </metadata>
Nov 29 06:49:51 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <system>
Nov 29 06:49:51 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 06:49:51 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 06:49:51 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 06:49:51 compute-0 nova_compute[187185]:       <entry name="serial">e6b5b54b-9532-4f51-a346-42dee946a9ef</entry>
Nov 29 06:49:51 compute-0 nova_compute[187185]:       <entry name="uuid">e6b5b54b-9532-4f51-a346-42dee946a9ef</entry>
Nov 29 06:49:51 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     </system>
Nov 29 06:49:51 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 06:49:51 compute-0 nova_compute[187185]:   <os>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:   </os>
Nov 29 06:49:51 compute-0 nova_compute[187185]:   <features>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <apic/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:   </features>
Nov 29 06:49:51 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:   </clock>
Nov 29 06:49:51 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:   </cpu>
Nov 29 06:49:51 compute-0 nova_compute[187185]:   <devices>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 06:49:51 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/e6b5b54b-9532-4f51-a346-42dee946a9ef/disk"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     </disk>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 06:49:51 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/e6b5b54b-9532-4f51-a346-42dee946a9ef/disk.config"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     </disk>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 06:49:51 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:b8:f5:6d"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:       <target dev="tap04956313-39"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     </interface>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 06:49:51 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/e6b5b54b-9532-4f51-a346-42dee946a9ef/console.log" append="off"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     </serial>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <video>
Nov 29 06:49:51 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     </video>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 06:49:51 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     </rng>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 06:49:51 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 06:49:51 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 06:49:51 compute-0 nova_compute[187185]:   </devices>
Nov 29 06:49:51 compute-0 nova_compute[187185]: </domain>
Nov 29 06:49:51 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.548 187189 DEBUG nova.compute.manager [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Preparing to wait for external event network-vif-plugged-04956313-39e4-4275-ab3f-18aa7a1a0e46 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.548 187189 DEBUG oslo_concurrency.lockutils [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Acquiring lock "e6b5b54b-9532-4f51-a346-42dee946a9ef-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.548 187189 DEBUG oslo_concurrency.lockutils [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Lock "e6b5b54b-9532-4f51-a346-42dee946a9ef-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.549 187189 DEBUG oslo_concurrency.lockutils [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Lock "e6b5b54b-9532-4f51-a346-42dee946a9ef-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.549 187189 DEBUG nova.virt.libvirt.vif [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:49:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-797555638',display_name='tempest-LiveMigrationTest-server-797555638',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-livemigrationtest-server-797555638',id=10,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2b6eb92d93c24eaaa0c6a3104a54633a',ramdisk_id='',reservation_id='r-duitc0f8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveMigrationTest-440211682',owner_user_name='tempest-LiveMigrationTest-440211682-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:49:48Z,user_data=None,user_id='a01fd01629a1493bb3fb6df5a2462226',uuid=e6b5b54b-9532-4f51-a346-42dee946a9ef,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "04956313-39e4-4275-ab3f-18aa7a1a0e46", "address": "fa:16:3e:b8:f5:6d", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04956313-39", "ovs_interfaceid": "04956313-39e4-4275-ab3f-18aa7a1a0e46", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.549 187189 DEBUG nova.network.os_vif_util [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Converting VIF {"id": "04956313-39e4-4275-ab3f-18aa7a1a0e46", "address": "fa:16:3e:b8:f5:6d", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04956313-39", "ovs_interfaceid": "04956313-39e4-4275-ab3f-18aa7a1a0e46", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.550 187189 DEBUG nova.network.os_vif_util [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b8:f5:6d,bridge_name='br-int',has_traffic_filtering=True,id=04956313-39e4-4275-ab3f-18aa7a1a0e46,network=Network(24ee44f0-2b10-459c-aabf-bf9ef2c8d950),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap04956313-39') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.550 187189 DEBUG os_vif [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:f5:6d,bridge_name='br-int',has_traffic_filtering=True,id=04956313-39e4-4275-ab3f-18aa7a1a0e46,network=Network(24ee44f0-2b10-459c-aabf-bf9ef2c8d950),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap04956313-39') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.551 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.551 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.551 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.554 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.555 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap04956313-39, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.555 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap04956313-39, col_values=(('external_ids', {'iface-id': '04956313-39e4-4275-ab3f-18aa7a1a0e46', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b8:f5:6d', 'vm-uuid': 'e6b5b54b-9532-4f51-a346-42dee946a9ef'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.556 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:51 compute-0 NetworkManager[55227]: <info>  [1764398991.5581] manager: (tap04956313-39): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/25)
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.559 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.563 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.564 187189 INFO os_vif [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:f5:6d,bridge_name='br-int',has_traffic_filtering=True,id=04956313-39e4-4275-ab3f-18aa7a1a0e46,network=Network(24ee44f0-2b10-459c-aabf-bf9ef2c8d950),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap04956313-39')
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.699 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.842 187189 DEBUG nova.virt.libvirt.driver [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.843 187189 DEBUG nova.virt.libvirt.driver [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.843 187189 DEBUG nova.virt.libvirt.driver [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] No VIF found with MAC fa:16:3e:b8:f5:6d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 06:49:51 compute-0 nova_compute[187185]: 2025-11-29 06:49:51.843 187189 INFO nova.virt.libvirt.driver [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Using config drive
Nov 29 06:49:52 compute-0 nova_compute[187185]: 2025-11-29 06:49:52.671 187189 INFO nova.virt.libvirt.driver [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Creating config drive at /var/lib/nova/instances/e6b5b54b-9532-4f51-a346-42dee946a9ef/disk.config
Nov 29 06:49:52 compute-0 nova_compute[187185]: 2025-11-29 06:49:52.680 187189 DEBUG oslo_concurrency.processutils [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e6b5b54b-9532-4f51-a346-42dee946a9ef/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpaq9e_9gr execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:49:52 compute-0 nova_compute[187185]: 2025-11-29 06:49:52.821 187189 DEBUG oslo_concurrency.processutils [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e6b5b54b-9532-4f51-a346-42dee946a9ef/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpaq9e_9gr" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:49:52 compute-0 kernel: tap04956313-39: entered promiscuous mode
Nov 29 06:49:52 compute-0 NetworkManager[55227]: <info>  [1764398992.8925] manager: (tap04956313-39): new Tun device (/org/freedesktop/NetworkManager/Devices/26)
Nov 29 06:49:52 compute-0 ovn_controller[95281]: 2025-11-29T06:49:52Z|00035|binding|INFO|Claiming lport 04956313-39e4-4275-ab3f-18aa7a1a0e46 for this chassis.
Nov 29 06:49:52 compute-0 nova_compute[187185]: 2025-11-29 06:49:52.947 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:52 compute-0 ovn_controller[95281]: 2025-11-29T06:49:52Z|00036|binding|INFO|04956313-39e4-4275-ab3f-18aa7a1a0e46: Claiming fa:16:3e:b8:f5:6d 10.100.0.9
Nov 29 06:49:52 compute-0 ovn_controller[95281]: 2025-11-29T06:49:52Z|00037|binding|INFO|Claiming lport 81278169-001b-4894-adbd-075edcc27e49 for this chassis.
Nov 29 06:49:52 compute-0 ovn_controller[95281]: 2025-11-29T06:49:52Z|00038|binding|INFO|81278169-001b-4894-adbd-075edcc27e49: Claiming fa:16:3e:a3:12:0f 19.80.0.181
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:53.158 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a3:12:0f 19.80.0.181'], port_security=['fa:16:3e:a3:12:0f 19.80.0.181'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['04956313-39e4-4275-ab3f-18aa7a1a0e46'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1726918358', 'neutron:cidrs': '19.80.0.181/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1726918358', 'neutron:project_id': '2b6eb92d93c24eaaa0c6a3104a54633a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1b137676-29a0-4a8e-83e8-cda39edaccb8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=ec2078f4-7ef2-4848-8fcd-c69eaba744f4, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=81278169-001b-4894-adbd-075edcc27e49) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:53.160 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b8:f5:6d 10.100.0.9'], port_security=['fa:16:3e:b8:f5:6d 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-1473147608', 'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'e6b5b54b-9532-4f51-a346-42dee946a9ef', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24ee44f0-2b10-459c-aabf-bf9ef2c8d950', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-1473147608', 'neutron:project_id': '2b6eb92d93c24eaaa0c6a3104a54633a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1b137676-29a0-4a8e-83e8-cda39edaccb8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fafd611f-c010-460d-b1cc-2d52a79696f1, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=04956313-39e4-4275-ab3f-18aa7a1a0e46) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:53.161 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 81278169-001b-4894-adbd-075edcc27e49 in datapath 5bda1138-fab5-4b3a-9a12-4d1c90a4dce0 bound to our chassis
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:53.162 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5bda1138-fab5-4b3a-9a12-4d1c90a4dce0
Nov 29 06:49:53 compute-0 nova_compute[187185]: 2025-11-29 06:49:53.164 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:53.173 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[cf56adbb-d814-4225-95cf-ee5d727f7c5a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:53.174 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5bda1138-f1 in ovnmeta-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:53.176 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5bda1138-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:53.176 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[f3e00193-3ff3-4878-8eb2-ecae2760bcde]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:53.177 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e38ffddb-f0ca-4bdd-9806-7b39bd270d03]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:53.190 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[23271aa7-9787-4fc6-b484-9f3f90207612]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:49:53 compute-0 systemd-udevd[214788]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 06:49:53 compute-0 NetworkManager[55227]: <info>  [1764398993.2032] device (tap04956313-39): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 06:49:53 compute-0 NetworkManager[55227]: <info>  [1764398993.2042] device (tap04956313-39): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 06:49:53 compute-0 systemd-machined[153486]: New machine qemu-3-instance-0000000a.
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:53.220 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[0bb6524c-42e8-45f0-b2ec-a57fc913fd9f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:49:53 compute-0 systemd[1]: Started Virtual Machine qemu-3-instance-0000000a.
Nov 29 06:49:53 compute-0 nova_compute[187185]: 2025-11-29 06:49:53.241 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:53.243 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[b4a4cce5-9018-47e6-97dd-ea9c4e92b6f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:49:53 compute-0 ovn_controller[95281]: 2025-11-29T06:49:53Z|00039|binding|INFO|Setting lport 04956313-39e4-4275-ab3f-18aa7a1a0e46 ovn-installed in OVS
Nov 29 06:49:53 compute-0 ovn_controller[95281]: 2025-11-29T06:49:53Z|00040|binding|INFO|Setting lport 04956313-39e4-4275-ab3f-18aa7a1a0e46 up in Southbound
Nov 29 06:49:53 compute-0 ovn_controller[95281]: 2025-11-29T06:49:53Z|00041|binding|INFO|Setting lport 81278169-001b-4894-adbd-075edcc27e49 up in Southbound
Nov 29 06:49:53 compute-0 nova_compute[187185]: 2025-11-29 06:49:53.247 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:53.248 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[760d9888-98c5-4c4c-8452-fc4f37aed3a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:49:53 compute-0 NetworkManager[55227]: <info>  [1764398993.2491] manager: (tap5bda1138-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/27)
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:53.280 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[197a2a3c-7208-4968-8fa2-8ee39051b774]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:53.282 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[3bc44292-7fac-4bb3-aa27-408dccf02d92]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:49:53 compute-0 NetworkManager[55227]: <info>  [1764398993.3050] device (tap5bda1138-f0): carrier: link connected
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:53.305 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[c20a12b1-0f88-4d21-8512-56df7009f7b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:53.317 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[b8be6493-f325-4543-9582-3717431f637e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5bda1138-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:70:5d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 444096, 'reachable_time': 23625, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214820, 'error': None, 'target': 'ovnmeta-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:53.329 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[111c57cb-d511-420c-93fd-e1ae8c1f879a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedd:705d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 444096, 'tstamp': 444096}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214821, 'error': None, 'target': 'ovnmeta-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:53.341 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[4cfe1173-d46e-4f86-9e8c-3473bc1e8b35]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5bda1138-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:70:5d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 444096, 'reachable_time': 23625, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214822, 'error': None, 'target': 'ovnmeta-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:53.367 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6d68b83d-8c44-480e-ae4d-bd5b09e397e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:53.413 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[1ed9b60d-ea29-4d1c-8be4-ff714e9785ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:53.414 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5bda1138-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:53.415 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:53.415 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5bda1138-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:49:53 compute-0 nova_compute[187185]: 2025-11-29 06:49:53.417 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:53 compute-0 NetworkManager[55227]: <info>  [1764398993.4176] manager: (tap5bda1138-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/28)
Nov 29 06:49:53 compute-0 kernel: tap5bda1138-f0: entered promiscuous mode
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:53.420 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5bda1138-f0, col_values=(('external_ids', {'iface-id': 'a5f83360-af8d-41aa-987f-5ef9d63c1561'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:49:53 compute-0 ovn_controller[95281]: 2025-11-29T06:49:53Z|00042|binding|INFO|Releasing lport a5f83360-af8d-41aa-987f-5ef9d63c1561 from this chassis (sb_readonly=0)
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:53.443 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5bda1138-fab5-4b3a-9a12-4d1c90a4dce0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5bda1138-fab5-4b3a-9a12-4d1c90a4dce0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 06:49:53 compute-0 nova_compute[187185]: 2025-11-29 06:49:53.444 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:53.445 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d2372172-0cbb-469a-8635-e8b36fb4ee4c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:53.445 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]: global
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/5bda1138-fab5-4b3a-9a12-4d1c90a4dce0.pid.haproxy
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]: 
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]: 
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]: 
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID 5bda1138-fab5-4b3a-9a12-4d1c90a4dce0
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 06:49:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:53.447 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0', 'env', 'PROCESS_TAG=haproxy-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5bda1138-fab5-4b3a-9a12-4d1c90a4dce0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 06:49:53 compute-0 nova_compute[187185]: 2025-11-29 06:49:53.543 187189 DEBUG nova.network.neutron [req-5b5232a2-c8bd-47b2-83d7-2f6fdd766622 req-1c756e19-5677-407f-9d33-72551554dfdc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Updated VIF entry in instance network info cache for port 04956313-39e4-4275-ab3f-18aa7a1a0e46. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 06:49:53 compute-0 nova_compute[187185]: 2025-11-29 06:49:53.543 187189 DEBUG nova.network.neutron [req-5b5232a2-c8bd-47b2-83d7-2f6fdd766622 req-1c756e19-5677-407f-9d33-72551554dfdc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Updating instance_info_cache with network_info: [{"id": "04956313-39e4-4275-ab3f-18aa7a1a0e46", "address": "fa:16:3e:b8:f5:6d", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04956313-39", "ovs_interfaceid": "04956313-39e4-4275-ab3f-18aa7a1a0e46", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:49:53 compute-0 nova_compute[187185]: 2025-11-29 06:49:53.562 187189 DEBUG oslo_concurrency.lockutils [req-5b5232a2-c8bd-47b2-83d7-2f6fdd766622 req-1c756e19-5677-407f-9d33-72551554dfdc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-e6b5b54b-9532-4f51-a346-42dee946a9ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:49:53 compute-0 podman[214854]: 2025-11-29 06:49:53.819052427 +0000 UTC m=+0.026933652 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 06:49:54 compute-0 podman[214854]: 2025-11-29 06:49:54.20352662 +0000 UTC m=+0.411407785 container create 15907331385acf92b8c052164d5eaffc9c6d32dad52e8a168a5ec0e6c24c5532 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 29 06:49:54 compute-0 systemd[1]: Started libpod-conmon-15907331385acf92b8c052164d5eaffc9c6d32dad52e8a168a5ec0e6c24c5532.scope.
Nov 29 06:49:54 compute-0 systemd[1]: Started libcrun container.
Nov 29 06:49:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf3470abb1a795d0ca4becbf18763f127bec27e1dc68f075ec14203be0614634/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 06:49:54 compute-0 podman[214854]: 2025-11-29 06:49:54.44815617 +0000 UTC m=+0.656037355 container init 15907331385acf92b8c052164d5eaffc9c6d32dad52e8a168a5ec0e6c24c5532 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Nov 29 06:49:54 compute-0 podman[214867]: 2025-11-29 06:49:54.448621894 +0000 UTC m=+0.201923876 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 06:49:54 compute-0 podman[214854]: 2025-11-29 06:49:54.457768832 +0000 UTC m=+0.665650017 container start 15907331385acf92b8c052164d5eaffc9c6d32dad52e8a168a5ec0e6c24c5532 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3)
Nov 29 06:49:54 compute-0 neutron-haproxy-ovnmeta-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0[214883]: [NOTICE]   (214899) : New worker (214901) forked
Nov 29 06:49:54 compute-0 neutron-haproxy-ovnmeta-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0[214883]: [NOTICE]   (214899) : Loading success.
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:54.522 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 04956313-39e4-4275-ab3f-18aa7a1a0e46 in datapath 24ee44f0-2b10-459c-aabf-bf9ef2c8d950 unbound from our chassis
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:54.525 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 24ee44f0-2b10-459c-aabf-bf9ef2c8d950
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:54.538 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d9101a0e-cf7c-45ec-b270-62cd08bb8e2d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:54.540 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap24ee44f0-21 in ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:54.542 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap24ee44f0-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:54.543 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[c223cbb9-a494-4f4f-9222-13d2b52ae562]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:54.544 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[0b750a89-dfc6-4589-94ea-6ceb54cd9d59]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:54.557 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[a53d8997-1aa2-48a2-ad87-555bc3023355]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:54.572 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[32dd4ce0-535e-48eb-b209-a28ffca9fae0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:54.607 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[e092dc92-d304-468c-86c6-0dfafe76d716]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:54.617 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[582b84fb-0ba1-4115-8423-65d5443f105d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:49:54 compute-0 NetworkManager[55227]: <info>  [1764398994.6188] manager: (tap24ee44f0-20): new Veth device (/org/freedesktop/NetworkManager/Devices/29)
Nov 29 06:49:54 compute-0 systemd-udevd[214810]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:54.657 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[0af8d078-a10c-40fe-b3c9-388e583dc72a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:54.662 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[e7acc665-9f9e-420c-9e73-3c4944a1305e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:49:54 compute-0 NetworkManager[55227]: <info>  [1764398994.6907] device (tap24ee44f0-20): carrier: link connected
Nov 29 06:49:54 compute-0 nova_compute[187185]: 2025-11-29 06:49:54.697 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764398994.6963987, e6b5b54b-9532-4f51-a346-42dee946a9ef => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:54.697 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[a2d4c961-d69a-4804-ab87-435a79775b20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:49:54 compute-0 nova_compute[187185]: 2025-11-29 06:49:54.698 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] VM Started (Lifecycle Event)
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:54.719 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[c1083cbb-2ece-4867-a214-6731b84b689a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24ee44f0-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a4:94:0c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 444235, 'reachable_time': 36238, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214928, 'error': None, 'target': 'ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:49:54 compute-0 nova_compute[187185]: 2025-11-29 06:49:54.729 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:49:54 compute-0 nova_compute[187185]: 2025-11-29 06:49:54.738 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764398994.6966496, e6b5b54b-9532-4f51-a346-42dee946a9ef => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:49:54 compute-0 nova_compute[187185]: 2025-11-29 06:49:54.738 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] VM Paused (Lifecycle Event)
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:54.743 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[701ac74b-df6f-4e64-9cb4-97ef71429c5c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea4:940c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 444235, 'tstamp': 444235}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214929, 'error': None, 'target': 'ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:49:54 compute-0 nova_compute[187185]: 2025-11-29 06:49:54.761 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:49:54 compute-0 nova_compute[187185]: 2025-11-29 06:49:54.766 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:54.767 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d7c083e5-77e6-455b-8415-441ee1849cf3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24ee44f0-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a4:94:0c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 444235, 'reachable_time': 36238, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214930, 'error': None, 'target': 'ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:49:54 compute-0 nova_compute[187185]: 2025-11-29 06:49:54.784 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:54.811 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[7f8b9abc-74e7-4f0f-ba9b-2ea33946328c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:54.876 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[aa759a41-6e92-4674-9f7b-42e192365a29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:54.879 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24ee44f0-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:54.880 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:54.881 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap24ee44f0-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:49:54 compute-0 nova_compute[187185]: 2025-11-29 06:49:54.885 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:54 compute-0 NetworkManager[55227]: <info>  [1764398994.8859] manager: (tap24ee44f0-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Nov 29 06:49:54 compute-0 kernel: tap24ee44f0-20: entered promiscuous mode
Nov 29 06:49:54 compute-0 nova_compute[187185]: 2025-11-29 06:49:54.888 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:54.891 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap24ee44f0-20, col_values=(('external_ids', {'iface-id': 'ffbd3b8f-7e45-45d4-84ce-cd74c712f992'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:49:54 compute-0 nova_compute[187185]: 2025-11-29 06:49:54.893 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:54 compute-0 ovn_controller[95281]: 2025-11-29T06:49:54Z|00043|binding|INFO|Releasing lport ffbd3b8f-7e45-45d4-84ce-cd74c712f992 from this chassis (sb_readonly=0)
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:54.895 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/24ee44f0-2b10-459c-aabf-bf9ef2c8d950.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/24ee44f0-2b10-459c-aabf-bf9ef2c8d950.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:54.896 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[074dd756-4635-4252-ac33-c73b1db154dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:54.897 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]: global
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-24ee44f0-2b10-459c-aabf-bf9ef2c8d950
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/24ee44f0-2b10-459c-aabf-bf9ef2c8d950.pid.haproxy
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]: 
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]: 
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]: 
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID 24ee44f0-2b10-459c-aabf-bf9ef2c8d950
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 06:49:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:49:54.898 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950', 'env', 'PROCESS_TAG=haproxy-24ee44f0-2b10-459c-aabf-bf9ef2c8d950', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/24ee44f0-2b10-459c-aabf-bf9ef2c8d950.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 06:49:54 compute-0 nova_compute[187185]: 2025-11-29 06:49:54.905 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:55 compute-0 podman[214961]: 2025-11-29 06:49:55.249410699 +0000 UTC m=+0.026383416 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 06:49:56 compute-0 podman[214961]: 2025-11-29 06:49:56.052102519 +0000 UTC m=+0.829075216 container create 1e886a291adc90ea9db68dab53383e2808ca51cb06bd503b1468169c83a8dbb2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:49:56 compute-0 systemd[1]: Started libpod-conmon-1e886a291adc90ea9db68dab53383e2808ca51cb06bd503b1468169c83a8dbb2.scope.
Nov 29 06:49:56 compute-0 systemd[1]: Started libcrun container.
Nov 29 06:49:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/876280c791202d40c17c8f8bc9e29f36dfba8c2f63858da516c2d2737292db9b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 06:49:56 compute-0 podman[214961]: 2025-11-29 06:49:56.498702177 +0000 UTC m=+1.275674894 container init 1e886a291adc90ea9db68dab53383e2808ca51cb06bd503b1468169c83a8dbb2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 06:49:56 compute-0 podman[214961]: 2025-11-29 06:49:56.506981251 +0000 UTC m=+1.283953958 container start 1e886a291adc90ea9db68dab53383e2808ca51cb06bd503b1468169c83a8dbb2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 29 06:49:56 compute-0 neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950[214998]: [NOTICE]   (215021) : New worker (215023) forked
Nov 29 06:49:56 compute-0 neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950[214998]: [NOTICE]   (215021) : Loading success.
Nov 29 06:49:56 compute-0 podman[214974]: 2025-11-29 06:49:56.596103889 +0000 UTC m=+0.499291888 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 06:49:56 compute-0 nova_compute[187185]: 2025-11-29 06:49:56.615 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:56 compute-0 podman[214975]: 2025-11-29 06:49:56.640489473 +0000 UTC m=+0.536965433 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 06:49:56 compute-0 nova_compute[187185]: 2025-11-29 06:49:56.701 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:49:58 compute-0 nova_compute[187185]: 2025-11-29 06:49:58.892 187189 DEBUG nova.compute.manager [req-aa7f763c-0e87-4b2d-82c0-84b0332acee0 req-6287c555-9592-461c-bde1-4d45af7231d3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Received event network-vif-plugged-04956313-39e4-4275-ab3f-18aa7a1a0e46 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:49:58 compute-0 nova_compute[187185]: 2025-11-29 06:49:58.892 187189 DEBUG oslo_concurrency.lockutils [req-aa7f763c-0e87-4b2d-82c0-84b0332acee0 req-6287c555-9592-461c-bde1-4d45af7231d3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "e6b5b54b-9532-4f51-a346-42dee946a9ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:49:58 compute-0 nova_compute[187185]: 2025-11-29 06:49:58.892 187189 DEBUG oslo_concurrency.lockutils [req-aa7f763c-0e87-4b2d-82c0-84b0332acee0 req-6287c555-9592-461c-bde1-4d45af7231d3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e6b5b54b-9532-4f51-a346-42dee946a9ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:49:58 compute-0 nova_compute[187185]: 2025-11-29 06:49:58.893 187189 DEBUG oslo_concurrency.lockutils [req-aa7f763c-0e87-4b2d-82c0-84b0332acee0 req-6287c555-9592-461c-bde1-4d45af7231d3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e6b5b54b-9532-4f51-a346-42dee946a9ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:49:58 compute-0 nova_compute[187185]: 2025-11-29 06:49:58.893 187189 DEBUG nova.compute.manager [req-aa7f763c-0e87-4b2d-82c0-84b0332acee0 req-6287c555-9592-461c-bde1-4d45af7231d3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Processing event network-vif-plugged-04956313-39e4-4275-ab3f-18aa7a1a0e46 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 06:49:58 compute-0 nova_compute[187185]: 2025-11-29 06:49:58.893 187189 DEBUG nova.compute.manager [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Instance event wait completed in 4 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 06:49:58 compute-0 nova_compute[187185]: 2025-11-29 06:49:58.898 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764398998.897817, e6b5b54b-9532-4f51-a346-42dee946a9ef => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:49:58 compute-0 nova_compute[187185]: 2025-11-29 06:49:58.898 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] VM Resumed (Lifecycle Event)
Nov 29 06:49:58 compute-0 nova_compute[187185]: 2025-11-29 06:49:58.899 187189 DEBUG nova.virt.libvirt.driver [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 06:49:58 compute-0 nova_compute[187185]: 2025-11-29 06:49:58.903 187189 INFO nova.virt.libvirt.driver [-] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Instance spawned successfully.
Nov 29 06:49:58 compute-0 nova_compute[187185]: 2025-11-29 06:49:58.903 187189 DEBUG nova.virt.libvirt.driver [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 06:49:58 compute-0 nova_compute[187185]: 2025-11-29 06:49:58.925 187189 DEBUG nova.virt.libvirt.driver [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:49:58 compute-0 nova_compute[187185]: 2025-11-29 06:49:58.925 187189 DEBUG nova.virt.libvirt.driver [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:49:58 compute-0 nova_compute[187185]: 2025-11-29 06:49:58.926 187189 DEBUG nova.virt.libvirt.driver [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:49:58 compute-0 nova_compute[187185]: 2025-11-29 06:49:58.926 187189 DEBUG nova.virt.libvirt.driver [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:49:58 compute-0 nova_compute[187185]: 2025-11-29 06:49:58.926 187189 DEBUG nova.virt.libvirt.driver [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:49:58 compute-0 nova_compute[187185]: 2025-11-29 06:49:58.927 187189 DEBUG nova.virt.libvirt.driver [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:49:58 compute-0 nova_compute[187185]: 2025-11-29 06:49:58.934 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:49:58 compute-0 nova_compute[187185]: 2025-11-29 06:49:58.938 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 06:49:58 compute-0 nova_compute[187185]: 2025-11-29 06:49:58.960 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 06:49:59 compute-0 nova_compute[187185]: 2025-11-29 06:49:59.030 187189 INFO nova.compute.manager [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Took 10.75 seconds to spawn the instance on the hypervisor.
Nov 29 06:49:59 compute-0 nova_compute[187185]: 2025-11-29 06:49:59.031 187189 DEBUG nova.compute.manager [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:49:59 compute-0 nova_compute[187185]: 2025-11-29 06:49:59.512 187189 INFO nova.compute.manager [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Took 11.77 seconds to build instance.
Nov 29 06:49:59 compute-0 nova_compute[187185]: 2025-11-29 06:49:59.544 187189 DEBUG oslo_concurrency.lockutils [None req-c9ed6643-1889-4bf9-a21b-27b67497e785 a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Lock "e6b5b54b-9532-4f51-a346-42dee946a9ef" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.913s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:50:01 compute-0 nova_compute[187185]: 2025-11-29 06:50:01.270 187189 DEBUG nova.compute.manager [req-359f0657-7b52-4bb6-9eaf-0b22203c27a4 req-704fcf69-c33f-4ca9-b69c-7315d45ace5b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Received event network-vif-plugged-04956313-39e4-4275-ab3f-18aa7a1a0e46 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:50:01 compute-0 nova_compute[187185]: 2025-11-29 06:50:01.271 187189 DEBUG oslo_concurrency.lockutils [req-359f0657-7b52-4bb6-9eaf-0b22203c27a4 req-704fcf69-c33f-4ca9-b69c-7315d45ace5b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "e6b5b54b-9532-4f51-a346-42dee946a9ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:50:01 compute-0 nova_compute[187185]: 2025-11-29 06:50:01.271 187189 DEBUG oslo_concurrency.lockutils [req-359f0657-7b52-4bb6-9eaf-0b22203c27a4 req-704fcf69-c33f-4ca9-b69c-7315d45ace5b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e6b5b54b-9532-4f51-a346-42dee946a9ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:50:01 compute-0 nova_compute[187185]: 2025-11-29 06:50:01.271 187189 DEBUG oslo_concurrency.lockutils [req-359f0657-7b52-4bb6-9eaf-0b22203c27a4 req-704fcf69-c33f-4ca9-b69c-7315d45ace5b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e6b5b54b-9532-4f51-a346-42dee946a9ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:50:01 compute-0 nova_compute[187185]: 2025-11-29 06:50:01.271 187189 DEBUG nova.compute.manager [req-359f0657-7b52-4bb6-9eaf-0b22203c27a4 req-704fcf69-c33f-4ca9-b69c-7315d45ace5b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] No waiting events found dispatching network-vif-plugged-04956313-39e4-4275-ab3f-18aa7a1a0e46 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 06:50:01 compute-0 nova_compute[187185]: 2025-11-29 06:50:01.272 187189 WARNING nova.compute.manager [req-359f0657-7b52-4bb6-9eaf-0b22203c27a4 req-704fcf69-c33f-4ca9-b69c-7315d45ace5b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Received unexpected event network-vif-plugged-04956313-39e4-4275-ab3f-18aa7a1a0e46 for instance with vm_state active and task_state None.
Nov 29 06:50:01 compute-0 nova_compute[187185]: 2025-11-29 06:50:01.621 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:50:01 compute-0 nova_compute[187185]: 2025-11-29 06:50:01.705 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:50:01 compute-0 nova_compute[187185]: 2025-11-29 06:50:01.939 187189 DEBUG nova.virt.libvirt.driver [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Check if temp file /var/lib/nova/instances/tmpafq0nrkl exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Nov 29 06:50:01 compute-0 nova_compute[187185]: 2025-11-29 06:50:01.947 187189 DEBUG oslo_concurrency.processutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e6b5b54b-9532-4f51-a346-42dee946a9ef/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:50:02 compute-0 nova_compute[187185]: 2025-11-29 06:50:02.052 187189 DEBUG oslo_concurrency.processutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e6b5b54b-9532-4f51-a346-42dee946a9ef/disk --force-share --output=json" returned: 0 in 0.105s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:50:02 compute-0 nova_compute[187185]: 2025-11-29 06:50:02.053 187189 DEBUG oslo_concurrency.processutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e6b5b54b-9532-4f51-a346-42dee946a9ef/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:50:02 compute-0 nova_compute[187185]: 2025-11-29 06:50:02.109 187189 DEBUG oslo_concurrency.processutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e6b5b54b-9532-4f51-a346-42dee946a9ef/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:50:02 compute-0 nova_compute[187185]: 2025-11-29 06:50:02.111 187189 DEBUG nova.compute.manager [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpafq0nrkl',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e6b5b54b-9532-4f51-a346-42dee946a9ef',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Nov 29 06:50:02 compute-0 nova_compute[187185]: 2025-11-29 06:50:02.869 187189 DEBUG oslo_concurrency.processutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e6b5b54b-9532-4f51-a346-42dee946a9ef/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:50:02 compute-0 nova_compute[187185]: 2025-11-29 06:50:02.938 187189 DEBUG oslo_concurrency.processutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e6b5b54b-9532-4f51-a346-42dee946a9ef/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:50:02 compute-0 nova_compute[187185]: 2025-11-29 06:50:02.939 187189 DEBUG oslo_concurrency.processutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e6b5b54b-9532-4f51-a346-42dee946a9ef/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:50:03 compute-0 nova_compute[187185]: 2025-11-29 06:50:03.039 187189 DEBUG oslo_concurrency.processutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e6b5b54b-9532-4f51-a346-42dee946a9ef/disk --force-share --output=json" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:50:03 compute-0 nova_compute[187185]: 2025-11-29 06:50:03.040 187189 DEBUG oslo_concurrency.lockutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:50:03 compute-0 nova_compute[187185]: 2025-11-29 06:50:03.041 187189 DEBUG oslo_concurrency.lockutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:50:03 compute-0 nova_compute[187185]: 2025-11-29 06:50:03.050 187189 INFO nova.compute.rpcapi [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66
Nov 29 06:50:03 compute-0 nova_compute[187185]: 2025-11-29 06:50:03.051 187189 DEBUG oslo_concurrency.lockutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:50:04 compute-0 podman[215045]: 2025-11-29 06:50:04.812982977 +0000 UTC m=+0.073674293 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 06:50:05 compute-0 sshd-session[215065]: Accepted publickey for nova from 192.168.122.102 port 41822 ssh2: ECDSA SHA256:TRxHqyAgYthLa8Amy2/P9EvhpVYcaH8++okwzvcHmaY
Nov 29 06:50:05 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Nov 29 06:50:05 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 29 06:50:05 compute-0 systemd-logind[788]: New session 26 of user nova.
Nov 29 06:50:05 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 29 06:50:05 compute-0 systemd[1]: Starting User Manager for UID 42436...
Nov 29 06:50:05 compute-0 systemd[215069]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 29 06:50:05 compute-0 systemd[215069]: Queued start job for default target Main User Target.
Nov 29 06:50:05 compute-0 systemd[215069]: Created slice User Application Slice.
Nov 29 06:50:05 compute-0 systemd[215069]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 29 06:50:05 compute-0 systemd[215069]: Started Daily Cleanup of User's Temporary Directories.
Nov 29 06:50:05 compute-0 systemd[215069]: Reached target Paths.
Nov 29 06:50:05 compute-0 systemd[215069]: Reached target Timers.
Nov 29 06:50:05 compute-0 systemd[215069]: Starting D-Bus User Message Bus Socket...
Nov 29 06:50:05 compute-0 systemd[215069]: Starting Create User's Volatile Files and Directories...
Nov 29 06:50:05 compute-0 systemd[215069]: Listening on D-Bus User Message Bus Socket.
Nov 29 06:50:05 compute-0 systemd[215069]: Reached target Sockets.
Nov 29 06:50:05 compute-0 systemd[215069]: Finished Create User's Volatile Files and Directories.
Nov 29 06:50:05 compute-0 systemd[215069]: Reached target Basic System.
Nov 29 06:50:05 compute-0 systemd[215069]: Reached target Main User Target.
Nov 29 06:50:05 compute-0 systemd[215069]: Startup finished in 184ms.
Nov 29 06:50:05 compute-0 systemd[1]: Started User Manager for UID 42436.
Nov 29 06:50:05 compute-0 systemd[1]: Started Session 26 of User nova.
Nov 29 06:50:05 compute-0 sshd-session[215065]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 29 06:50:05 compute-0 sshd-session[215084]: Received disconnect from 192.168.122.102 port 41822:11: disconnected by user
Nov 29 06:50:05 compute-0 sshd-session[215084]: Disconnected from user nova 192.168.122.102 port 41822
Nov 29 06:50:05 compute-0 sshd-session[215065]: pam_unix(sshd:session): session closed for user nova
Nov 29 06:50:05 compute-0 systemd[1]: session-26.scope: Deactivated successfully.
Nov 29 06:50:05 compute-0 systemd-logind[788]: Session 26 logged out. Waiting for processes to exit.
Nov 29 06:50:05 compute-0 systemd-logind[788]: Removed session 26.
Nov 29 06:50:06 compute-0 nova_compute[187185]: 2025-11-29 06:50:06.656 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:50:06 compute-0 nova_compute[187185]: 2025-11-29 06:50:06.708 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:50:06 compute-0 nova_compute[187185]: 2025-11-29 06:50:06.750 187189 DEBUG nova.compute.manager [req-1ed971b3-6738-45ba-bc34-80beae681d67 req-a5e98a2c-3cd2-4e86-92b3-e3cae8213fc3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Received event network-vif-unplugged-04956313-39e4-4275-ab3f-18aa7a1a0e46 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:50:06 compute-0 nova_compute[187185]: 2025-11-29 06:50:06.751 187189 DEBUG oslo_concurrency.lockutils [req-1ed971b3-6738-45ba-bc34-80beae681d67 req-a5e98a2c-3cd2-4e86-92b3-e3cae8213fc3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "e6b5b54b-9532-4f51-a346-42dee946a9ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:50:06 compute-0 nova_compute[187185]: 2025-11-29 06:50:06.751 187189 DEBUG oslo_concurrency.lockutils [req-1ed971b3-6738-45ba-bc34-80beae681d67 req-a5e98a2c-3cd2-4e86-92b3-e3cae8213fc3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e6b5b54b-9532-4f51-a346-42dee946a9ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:50:06 compute-0 nova_compute[187185]: 2025-11-29 06:50:06.752 187189 DEBUG oslo_concurrency.lockutils [req-1ed971b3-6738-45ba-bc34-80beae681d67 req-a5e98a2c-3cd2-4e86-92b3-e3cae8213fc3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e6b5b54b-9532-4f51-a346-42dee946a9ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:50:06 compute-0 nova_compute[187185]: 2025-11-29 06:50:06.752 187189 DEBUG nova.compute.manager [req-1ed971b3-6738-45ba-bc34-80beae681d67 req-a5e98a2c-3cd2-4e86-92b3-e3cae8213fc3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] No waiting events found dispatching network-vif-unplugged-04956313-39e4-4275-ab3f-18aa7a1a0e46 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 06:50:06 compute-0 nova_compute[187185]: 2025-11-29 06:50:06.752 187189 DEBUG nova.compute.manager [req-1ed971b3-6738-45ba-bc34-80beae681d67 req-a5e98a2c-3cd2-4e86-92b3-e3cae8213fc3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Received event network-vif-unplugged-04956313-39e4-4275-ab3f-18aa7a1a0e46 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 06:50:07 compute-0 nova_compute[187185]: 2025-11-29 06:50:07.745 187189 INFO nova.compute.manager [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Took 4.70 seconds for pre_live_migration on destination host compute-2.ctlplane.example.com.
Nov 29 06:50:07 compute-0 nova_compute[187185]: 2025-11-29 06:50:07.747 187189 DEBUG nova.compute.manager [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 06:50:07 compute-0 nova_compute[187185]: 2025-11-29 06:50:07.768 187189 DEBUG nova.compute.manager [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpafq0nrkl',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e6b5b54b-9532-4f51-a346-42dee946a9ef',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(09644b82-9200-4175-a5be-cd05781c44be),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Nov 29 06:50:07 compute-0 nova_compute[187185]: 2025-11-29 06:50:07.832 187189 DEBUG nova.objects.instance [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Lazy-loading 'migration_context' on Instance uuid e6b5b54b-9532-4f51-a346-42dee946a9ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:50:07 compute-0 nova_compute[187185]: 2025-11-29 06:50:07.834 187189 DEBUG nova.virt.libvirt.driver [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Nov 29 06:50:07 compute-0 nova_compute[187185]: 2025-11-29 06:50:07.838 187189 DEBUG nova.virt.libvirt.driver [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Nov 29 06:50:07 compute-0 nova_compute[187185]: 2025-11-29 06:50:07.838 187189 DEBUG nova.virt.libvirt.driver [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Nov 29 06:50:07 compute-0 nova_compute[187185]: 2025-11-29 06:50:07.858 187189 DEBUG nova.virt.libvirt.vif [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T06:49:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-797555638',display_name='tempest-LiveMigrationTest-server-797555638',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-livemigrationtest-server-797555638',id=10,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:49:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2b6eb92d93c24eaaa0c6a3104a54633a',ramdisk_id='',reservation_id='r-duitc0f8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-440211682',owner_user_name='tempest-LiveMigrationTest-440211682-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T06:49:59Z,user_data=None,user_id='a01fd01629a1493bb3fb6df5a2462226',uuid=e6b5b54b-9532-4f51-a346-42dee946a9ef,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "04956313-39e4-4275-ab3f-18aa7a1a0e46", "address": "fa:16:3e:b8:f5:6d", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap04956313-39", "ovs_interfaceid": "04956313-39e4-4275-ab3f-18aa7a1a0e46", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 06:50:07 compute-0 nova_compute[187185]: 2025-11-29 06:50:07.859 187189 DEBUG nova.network.os_vif_util [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Converting VIF {"id": "04956313-39e4-4275-ab3f-18aa7a1a0e46", "address": "fa:16:3e:b8:f5:6d", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap04956313-39", "ovs_interfaceid": "04956313-39e4-4275-ab3f-18aa7a1a0e46", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 06:50:07 compute-0 nova_compute[187185]: 2025-11-29 06:50:07.860 187189 DEBUG nova.network.os_vif_util [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b8:f5:6d,bridge_name='br-int',has_traffic_filtering=True,id=04956313-39e4-4275-ab3f-18aa7a1a0e46,network=Network(24ee44f0-2b10-459c-aabf-bf9ef2c8d950),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap04956313-39') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 06:50:07 compute-0 nova_compute[187185]: 2025-11-29 06:50:07.862 187189 DEBUG nova.virt.libvirt.migration [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Updating guest XML with vif config: <interface type="ethernet">
Nov 29 06:50:07 compute-0 nova_compute[187185]:   <mac address="fa:16:3e:b8:f5:6d"/>
Nov 29 06:50:07 compute-0 nova_compute[187185]:   <model type="virtio"/>
Nov 29 06:50:07 compute-0 nova_compute[187185]:   <driver name="vhost" rx_queue_size="512"/>
Nov 29 06:50:07 compute-0 nova_compute[187185]:   <mtu size="1442"/>
Nov 29 06:50:07 compute-0 nova_compute[187185]:   <target dev="tap04956313-39"/>
Nov 29 06:50:07 compute-0 nova_compute[187185]: </interface>
Nov 29 06:50:07 compute-0 nova_compute[187185]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Nov 29 06:50:07 compute-0 nova_compute[187185]: 2025-11-29 06:50:07.863 187189 DEBUG nova.virt.libvirt.driver [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Nov 29 06:50:08 compute-0 nova_compute[187185]: 2025-11-29 06:50:08.342 187189 DEBUG nova.virt.libvirt.migration [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 29 06:50:08 compute-0 nova_compute[187185]: 2025-11-29 06:50:08.343 187189 INFO nova.virt.libvirt.migration [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Increasing downtime to 50 ms after 0 sec elapsed time
Nov 29 06:50:08 compute-0 nova_compute[187185]: 2025-11-29 06:50:08.464 187189 INFO nova.virt.libvirt.driver [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Nov 29 06:50:08 compute-0 nova_compute[187185]: 2025-11-29 06:50:08.909 187189 DEBUG nova.compute.manager [req-6533fc99-a5c0-42b3-91ef-09ab27e3173a req-b8b7747d-d3e1-46c2-b66b-82bc1fe44cdc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Received event network-vif-plugged-04956313-39e4-4275-ab3f-18aa7a1a0e46 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:50:08 compute-0 nova_compute[187185]: 2025-11-29 06:50:08.910 187189 DEBUG oslo_concurrency.lockutils [req-6533fc99-a5c0-42b3-91ef-09ab27e3173a req-b8b7747d-d3e1-46c2-b66b-82bc1fe44cdc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "e6b5b54b-9532-4f51-a346-42dee946a9ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:50:08 compute-0 nova_compute[187185]: 2025-11-29 06:50:08.911 187189 DEBUG oslo_concurrency.lockutils [req-6533fc99-a5c0-42b3-91ef-09ab27e3173a req-b8b7747d-d3e1-46c2-b66b-82bc1fe44cdc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e6b5b54b-9532-4f51-a346-42dee946a9ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:50:08 compute-0 nova_compute[187185]: 2025-11-29 06:50:08.911 187189 DEBUG oslo_concurrency.lockutils [req-6533fc99-a5c0-42b3-91ef-09ab27e3173a req-b8b7747d-d3e1-46c2-b66b-82bc1fe44cdc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e6b5b54b-9532-4f51-a346-42dee946a9ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:50:08 compute-0 nova_compute[187185]: 2025-11-29 06:50:08.912 187189 DEBUG nova.compute.manager [req-6533fc99-a5c0-42b3-91ef-09ab27e3173a req-b8b7747d-d3e1-46c2-b66b-82bc1fe44cdc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] No waiting events found dispatching network-vif-plugged-04956313-39e4-4275-ab3f-18aa7a1a0e46 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 06:50:08 compute-0 nova_compute[187185]: 2025-11-29 06:50:08.912 187189 WARNING nova.compute.manager [req-6533fc99-a5c0-42b3-91ef-09ab27e3173a req-b8b7747d-d3e1-46c2-b66b-82bc1fe44cdc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Received unexpected event network-vif-plugged-04956313-39e4-4275-ab3f-18aa7a1a0e46 for instance with vm_state active and task_state migrating.
Nov 29 06:50:08 compute-0 nova_compute[187185]: 2025-11-29 06:50:08.913 187189 DEBUG nova.compute.manager [req-6533fc99-a5c0-42b3-91ef-09ab27e3173a req-b8b7747d-d3e1-46c2-b66b-82bc1fe44cdc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Received event network-changed-04956313-39e4-4275-ab3f-18aa7a1a0e46 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:50:08 compute-0 nova_compute[187185]: 2025-11-29 06:50:08.913 187189 DEBUG nova.compute.manager [req-6533fc99-a5c0-42b3-91ef-09ab27e3173a req-b8b7747d-d3e1-46c2-b66b-82bc1fe44cdc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Refreshing instance network info cache due to event network-changed-04956313-39e4-4275-ab3f-18aa7a1a0e46. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 06:50:08 compute-0 nova_compute[187185]: 2025-11-29 06:50:08.913 187189 DEBUG oslo_concurrency.lockutils [req-6533fc99-a5c0-42b3-91ef-09ab27e3173a req-b8b7747d-d3e1-46c2-b66b-82bc1fe44cdc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-e6b5b54b-9532-4f51-a346-42dee946a9ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:50:08 compute-0 nova_compute[187185]: 2025-11-29 06:50:08.914 187189 DEBUG oslo_concurrency.lockutils [req-6533fc99-a5c0-42b3-91ef-09ab27e3173a req-b8b7747d-d3e1-46c2-b66b-82bc1fe44cdc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-e6b5b54b-9532-4f51-a346-42dee946a9ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:50:08 compute-0 nova_compute[187185]: 2025-11-29 06:50:08.914 187189 DEBUG nova.network.neutron [req-6533fc99-a5c0-42b3-91ef-09ab27e3173a req-b8b7747d-d3e1-46c2-b66b-82bc1fe44cdc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Refreshing network info cache for port 04956313-39e4-4275-ab3f-18aa7a1a0e46 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 06:50:08 compute-0 nova_compute[187185]: 2025-11-29 06:50:08.967 187189 DEBUG nova.virt.libvirt.migration [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 29 06:50:08 compute-0 nova_compute[187185]: 2025-11-29 06:50:08.967 187189 DEBUG nova.virt.libvirt.migration [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 29 06:50:09 compute-0 nova_compute[187185]: 2025-11-29 06:50:09.471 187189 DEBUG nova.virt.libvirt.migration [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 29 06:50:09 compute-0 nova_compute[187185]: 2025-11-29 06:50:09.472 187189 DEBUG nova.virt.libvirt.migration [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 29 06:50:09 compute-0 nova_compute[187185]: 2025-11-29 06:50:09.977 187189 DEBUG nova.virt.libvirt.migration [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 29 06:50:09 compute-0 nova_compute[187185]: 2025-11-29 06:50:09.978 187189 DEBUG nova.virt.libvirt.migration [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 29 06:50:10 compute-0 nova_compute[187185]: 2025-11-29 06:50:10.174 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399010.174281, e6b5b54b-9532-4f51-a346-42dee946a9ef => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:50:10 compute-0 nova_compute[187185]: 2025-11-29 06:50:10.175 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] VM Paused (Lifecycle Event)
Nov 29 06:50:10 compute-0 nova_compute[187185]: 2025-11-29 06:50:10.199 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:50:10 compute-0 nova_compute[187185]: 2025-11-29 06:50:10.203 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 06:50:10 compute-0 nova_compute[187185]: 2025-11-29 06:50:10.226 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] During sync_power_state the instance has a pending task (migrating). Skip.
Nov 29 06:50:10 compute-0 kernel: tap04956313-39 (unregistering): left promiscuous mode
Nov 29 06:50:10 compute-0 NetworkManager[55227]: <info>  [1764399010.3158] device (tap04956313-39): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 06:50:10 compute-0 nova_compute[187185]: 2025-11-29 06:50:10.355 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:50:10 compute-0 ovn_controller[95281]: 2025-11-29T06:50:10Z|00044|binding|INFO|Releasing lport 04956313-39e4-4275-ab3f-18aa7a1a0e46 from this chassis (sb_readonly=0)
Nov 29 06:50:10 compute-0 ovn_controller[95281]: 2025-11-29T06:50:10Z|00045|binding|INFO|Setting lport 04956313-39e4-4275-ab3f-18aa7a1a0e46 down in Southbound
Nov 29 06:50:10 compute-0 ovn_controller[95281]: 2025-11-29T06:50:10Z|00046|binding|INFO|Releasing lport 81278169-001b-4894-adbd-075edcc27e49 from this chassis (sb_readonly=0)
Nov 29 06:50:10 compute-0 ovn_controller[95281]: 2025-11-29T06:50:10Z|00047|binding|INFO|Setting lport 81278169-001b-4894-adbd-075edcc27e49 down in Southbound
Nov 29 06:50:10 compute-0 ovn_controller[95281]: 2025-11-29T06:50:10Z|00048|binding|INFO|Removing iface tap04956313-39 ovn-installed in OVS
Nov 29 06:50:10 compute-0 nova_compute[187185]: 2025-11-29 06:50:10.360 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:50:10 compute-0 nova_compute[187185]: 2025-11-29 06:50:10.382 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:50:10 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Nov 29 06:50:10 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d0000000a.scope: Consumed 13.792s CPU time.
Nov 29 06:50:10 compute-0 systemd-machined[153486]: Machine qemu-3-instance-0000000a terminated.
Nov 29 06:50:10 compute-0 nova_compute[187185]: 2025-11-29 06:50:10.571 187189 DEBUG nova.virt.libvirt.guest [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Nov 29 06:50:10 compute-0 nova_compute[187185]: 2025-11-29 06:50:10.574 187189 INFO nova.virt.libvirt.driver [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Migration operation has completed
Nov 29 06:50:10 compute-0 nova_compute[187185]: 2025-11-29 06:50:10.574 187189 INFO nova.compute.manager [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] _post_live_migration() is started..
Nov 29 06:50:10 compute-0 nova_compute[187185]: 2025-11-29 06:50:10.577 187189 DEBUG nova.virt.libvirt.driver [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Nov 29 06:50:10 compute-0 nova_compute[187185]: 2025-11-29 06:50:10.577 187189 DEBUG nova.virt.libvirt.driver [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Nov 29 06:50:10 compute-0 nova_compute[187185]: 2025-11-29 06:50:10.577 187189 DEBUG nova.virt.libvirt.driver [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Nov 29 06:50:10 compute-0 ovn_controller[95281]: 2025-11-29T06:50:10Z|00049|binding|INFO|Releasing lport ffbd3b8f-7e45-45d4-84ce-cd74c712f992 from this chassis (sb_readonly=0)
Nov 29 06:50:10 compute-0 ovn_controller[95281]: 2025-11-29T06:50:10Z|00050|binding|INFO|Releasing lport a5f83360-af8d-41aa-987f-5ef9d63c1561 from this chassis (sb_readonly=0)
Nov 29 06:50:10 compute-0 nova_compute[187185]: 2025-11-29 06:50:10.626 187189 DEBUG nova.network.neutron [req-6533fc99-a5c0-42b3-91ef-09ab27e3173a req-b8b7747d-d3e1-46c2-b66b-82bc1fe44cdc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Updated VIF entry in instance network info cache for port 04956313-39e4-4275-ab3f-18aa7a1a0e46. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 06:50:10 compute-0 nova_compute[187185]: 2025-11-29 06:50:10.627 187189 DEBUG nova.network.neutron [req-6533fc99-a5c0-42b3-91ef-09ab27e3173a req-b8b7747d-d3e1-46c2-b66b-82bc1fe44cdc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Updating instance_info_cache with network_info: [{"id": "04956313-39e4-4275-ab3f-18aa7a1a0e46", "address": "fa:16:3e:b8:f5:6d", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04956313-39", "ovs_interfaceid": "04956313-39e4-4275-ab3f-18aa7a1a0e46", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-2.ctlplane.example.com"}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:50:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:50:10.627 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a3:12:0f 19.80.0.181'], port_security=['fa:16:3e:a3:12:0f 19.80.0.181'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['04956313-39e4-4275-ab3f-18aa7a1a0e46'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1726918358', 'neutron:cidrs': '19.80.0.181/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1726918358', 'neutron:project_id': '2b6eb92d93c24eaaa0c6a3104a54633a', 'neutron:revision_number': '3', 'neutron:security_group_ids': '1b137676-29a0-4a8e-83e8-cda39edaccb8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=ec2078f4-7ef2-4848-8fcd-c69eaba744f4, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=81278169-001b-4894-adbd-075edcc27e49) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 06:50:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:50:10.630 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b8:f5:6d 10.100.0.9'], port_security=['fa:16:3e:b8:f5:6d 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-2.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-1473147608', 'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'e6b5b54b-9532-4f51-a346-42dee946a9ef', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24ee44f0-2b10-459c-aabf-bf9ef2c8d950', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-1473147608', 'neutron:project_id': '2b6eb92d93c24eaaa0c6a3104a54633a', 'neutron:revision_number': '8', 'neutron:security_group_ids': '1b137676-29a0-4a8e-83e8-cda39edaccb8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fafd611f-c010-460d-b1cc-2d52a79696f1, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=04956313-39e4-4275-ab3f-18aa7a1a0e46) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
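The two `Matched UPDATE` lines above show ovsdbapp dispatching `Port_Binding` row updates to a registered event handler (here `PortBindingUpdatedEvent`, priority 20). The matching idea — an event declares which operations and which OVSDB table it cares about, and registered events are tried in priority order — can be sketched as below. This is an illustration of the pattern only, not the ovsdbapp implementation; the class and variable names are hypothetical.

```python
# Illustrative sketch of ovsdbapp-style event matching: an event declares
# the operations and the OVSDB table it cares about, and registered events
# are tried in descending priority order. NOT the real ovsdbapp classes.

class RowEvent:
    def __init__(self, events, table, priority=20):
        self.events = events        # e.g. ('update',)
        self.table = table          # e.g. 'Port_Binding'
        self.priority = priority

    def matches(self, event, table):
        return event in self.events and table == self.table

# Handlers sorted so the highest-priority match wins.
handlers = sorted(
    [RowEvent(('update',), 'Port_Binding', priority=20),
     RowEvent(('create', 'delete'), 'Chassis', priority=10)],
    key=lambda e: -e.priority)

matched = next(h for h in handlers if h.matches('update', 'Port_Binding'))
print(matched.table, matched.priority)  # Port_Binding 20
```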
Nov 29 06:50:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:50:10.632 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 81278169-001b-4894-adbd-075edcc27e49 in datapath 5bda1138-fab5-4b3a-9a12-4d1c90a4dce0 unbound from our chassis
Nov 29 06:50:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:50:10.635 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5bda1138-fab5-4b3a-9a12-4d1c90a4dce0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 06:50:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:50:10.637 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[759ab6cc-e4ac-4ebd-88bb-32ce88e18262]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:50:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:50:10.638 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0 namespace which is not needed anymore
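The agent behavior in the lines above — the last port on datapath 5bda1138… is unbound from this chassis, no valid VIF ports remain, so the per-network `ovnmeta-<network_id>` namespace is torn down — can be sketched as follows. This is an illustration of the decision logic only, not the Neutron source; `metadata_namespace` and `should_teardown` are hypothetical names.

```python
# Illustrative sketch of the OVN metadata agent's teardown decision: when a
# datapath no longer has any VIF ports bound to this chassis, the per-network
# "ovnmeta-<network_id>" namespace is no longer needed and is cleaned up.
# NOT the Neutron implementation; the helper names below are hypothetical.

NS_PREFIX = "ovnmeta-"

def metadata_namespace(network_id: str) -> str:
    """Build the namespace name the agent uses for a network."""
    return NS_PREFIX + network_id

def should_teardown(vif_ports: list) -> bool:
    """No valid VIF ports left on the datapath -> namespace not needed."""
    return len(vif_ports) == 0

network_id = "5bda1138-fab5-4b3a-9a12-4d1c90a4dce0"
remaining_ports = []  # the last port was just unbound from this chassis

if should_teardown(remaining_ports):
    ns = metadata_namespace(network_id)
    print(f"Cleaning up {ns} namespace which is not needed anymore")
```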
Nov 29 06:50:10 compute-0 nova_compute[187185]: 2025-11-29 06:50:10.692 187189 DEBUG oslo_concurrency.lockutils [req-6533fc99-a5c0-42b3-91ef-09ab27e3173a req-b8b7747d-d3e1-46c2-b66b-82bc1fe44cdc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-e6b5b54b-9532-4f51-a346-42dee946a9ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:50:10 compute-0 nova_compute[187185]: 2025-11-29 06:50:10.692 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:50:11 compute-0 neutron-haproxy-ovnmeta-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0[214883]: [NOTICE]   (214899) : haproxy version is 2.8.14-c23fe91
Nov 29 06:50:11 compute-0 neutron-haproxy-ovnmeta-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0[214883]: [NOTICE]   (214899) : path to executable is /usr/sbin/haproxy
Nov 29 06:50:11 compute-0 neutron-haproxy-ovnmeta-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0[214883]: [WARNING]  (214899) : Exiting Master process...
Nov 29 06:50:11 compute-0 neutron-haproxy-ovnmeta-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0[214883]: [ALERT]    (214899) : Current worker (214901) exited with code 143 (Terminated)
Nov 29 06:50:11 compute-0 neutron-haproxy-ovnmeta-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0[214883]: [WARNING]  (214899) : All workers exited. Exiting... (0)
Nov 29 06:50:11 compute-0 systemd[1]: libpod-15907331385acf92b8c052164d5eaffc9c6d32dad52e8a168a5ec0e6c24c5532.scope: Deactivated successfully.
Nov 29 06:50:11 compute-0 podman[215136]: 2025-11-29 06:50:11.196944148 +0000 UTC m=+0.450533200 container died 15907331385acf92b8c052164d5eaffc9c6d32dad52e8a168a5ec0e6c24c5532 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:50:11 compute-0 nova_compute[187185]: 2025-11-29 06:50:11.207 187189 DEBUG nova.compute.manager [req-7065cf5e-7c3f-426b-a299-e03b789b4221 req-57802549-6331-4dd2-86ec-179fd847bfba 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Received event network-vif-unplugged-04956313-39e4-4275-ab3f-18aa7a1a0e46 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:50:11 compute-0 nova_compute[187185]: 2025-11-29 06:50:11.208 187189 DEBUG oslo_concurrency.lockutils [req-7065cf5e-7c3f-426b-a299-e03b789b4221 req-57802549-6331-4dd2-86ec-179fd847bfba 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "e6b5b54b-9532-4f51-a346-42dee946a9ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:50:11 compute-0 nova_compute[187185]: 2025-11-29 06:50:11.208 187189 DEBUG oslo_concurrency.lockutils [req-7065cf5e-7c3f-426b-a299-e03b789b4221 req-57802549-6331-4dd2-86ec-179fd847bfba 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e6b5b54b-9532-4f51-a346-42dee946a9ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:50:11 compute-0 nova_compute[187185]: 2025-11-29 06:50:11.208 187189 DEBUG oslo_concurrency.lockutils [req-7065cf5e-7c3f-426b-a299-e03b789b4221 req-57802549-6331-4dd2-86ec-179fd847bfba 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e6b5b54b-9532-4f51-a346-42dee946a9ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:50:11 compute-0 nova_compute[187185]: 2025-11-29 06:50:11.209 187189 DEBUG nova.compute.manager [req-7065cf5e-7c3f-426b-a299-e03b789b4221 req-57802549-6331-4dd2-86ec-179fd847bfba 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] No waiting events found dispatching network-vif-unplugged-04956313-39e4-4275-ab3f-18aa7a1a0e46 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 06:50:11 compute-0 nova_compute[187185]: 2025-11-29 06:50:11.209 187189 DEBUG nova.compute.manager [req-7065cf5e-7c3f-426b-a299-e03b789b4221 req-57802549-6331-4dd2-86ec-179fd847bfba 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Received event network-vif-unplugged-04956313-39e4-4275-ab3f-18aa7a1a0e46 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
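The acquire/pop/release sequence in the lockutils lines above — take the per-instance `<uuid>-events` lock, look for a registered waiter for the incoming event, and report "No waiting events found" when nothing is waiting — follows a common lock-guarded registry pattern. A minimal sketch using stdlib `threading` (not Nova's `InstanceEvents`, and the method bodies are hypothetical):

```python
import threading

# Sketch of the per-instance event registry guarded by a lock, as suggested
# by the "Acquiring/acquired/released" lockutils lines above. When no waiter
# registered for the event, pop returns None ("No waiting events found").
# Illustration only; not the nova.compute.manager implementation.

class InstanceEvents:
    def __init__(self):
        self._lock = threading.Lock()
        self._events = {}   # instance_uuid -> {event_name: payload}

    def prepare(self, instance_uuid, event_name, payload):
        """Register a waiter for an expected external event."""
        with self._lock:
            self._events.setdefault(instance_uuid, {})[event_name] = payload

    def pop_instance_event(self, instance_uuid, event_name):
        """Mirror of: acquire '<uuid>-events' lock, pop, release."""
        with self._lock:
            return self._events.get(instance_uuid, {}).pop(event_name, None)

events = InstanceEvents()
uuid = "e6b5b54b-9532-4f51-a346-42dee946a9ef"
got = events.pop_instance_event(uuid, "network-vif-unplugged-04956313")
print(got)  # None -> nothing was waiting; the event is simply logged
```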
Nov 29 06:50:11 compute-0 sshd-session[215094]: Invalid user hu from 1.214.197.163 port 43218
Nov 29 06:50:11 compute-0 nova_compute[187185]: 2025-11-29 06:50:11.659 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:50:11 compute-0 nova_compute[187185]: 2025-11-29 06:50:11.709 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:50:11 compute-0 sshd-session[215094]: Received disconnect from 1.214.197.163 port 43218:11: Bye Bye [preauth]
Nov 29 06:50:11 compute-0 sshd-session[215094]: Disconnected from invalid user hu 1.214.197.163 port 43218 [preauth]
Nov 29 06:50:11 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-15907331385acf92b8c052164d5eaffc9c6d32dad52e8a168a5ec0e6c24c5532-userdata-shm.mount: Deactivated successfully.
Nov 29 06:50:11 compute-0 systemd[1]: var-lib-containers-storage-overlay-cf3470abb1a795d0ca4becbf18763f127bec27e1dc68f075ec14203be0614634-merged.mount: Deactivated successfully.
Nov 29 06:50:12 compute-0 podman[215136]: 2025-11-29 06:50:12.238182857 +0000 UTC m=+1.491771959 container cleanup 15907331385acf92b8c052164d5eaffc9c6d32dad52e8a168a5ec0e6c24c5532 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 29 06:50:12 compute-0 systemd[1]: libpod-conmon-15907331385acf92b8c052164d5eaffc9c6d32dad52e8a168a5ec0e6c24c5532.scope: Deactivated successfully.
Nov 29 06:50:12 compute-0 nova_compute[187185]: 2025-11-29 06:50:12.531 187189 DEBUG nova.network.neutron [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Activated binding for port 04956313-39e4-4275-ab3f-18aa7a1a0e46 and host compute-2.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Nov 29 06:50:12 compute-0 nova_compute[187185]: 2025-11-29 06:50:12.533 187189 DEBUG nova.compute.manager [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "04956313-39e4-4275-ab3f-18aa7a1a0e46", "address": "fa:16:3e:b8:f5:6d", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04956313-39", "ovs_interfaceid": "04956313-39e4-4275-ab3f-18aa7a1a0e46", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Nov 29 06:50:12 compute-0 nova_compute[187185]: 2025-11-29 06:50:12.535 187189 DEBUG nova.virt.libvirt.vif [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T06:49:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-797555638',display_name='tempest-LiveMigrationTest-server-797555638',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-livemigrationtest-server-797555638',id=10,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:49:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2b6eb92d93c24eaaa0c6a3104a54633a',ramdisk_id='',reservation_id='r-duitc0f8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-440211682',owner_user_name='tempest-LiveMigrationTest-440211682-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T06:50:01Z,user_data=None,user_id='a01fd01629a1493bb3fb6df5a2462226',uuid=e6b5b54b-9532-4f51-a346-42dee946a9ef,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "04956313-39e4-4275-ab3f-18aa7a1a0e46", "address": "fa:16:3e:b8:f5:6d", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04956313-39", "ovs_interfaceid": "04956313-39e4-4275-ab3f-18aa7a1a0e46", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828

Nov 29 06:50:12 compute-0 nova_compute[187185]: 2025-11-29 06:50:12.535 187189 DEBUG nova.network.os_vif_util [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Converting VIF {"id": "04956313-39e4-4275-ab3f-18aa7a1a0e46", "address": "fa:16:3e:b8:f5:6d", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04956313-39", "ovs_interfaceid": "04956313-39e4-4275-ab3f-18aa7a1a0e46", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 06:50:12 compute-0 nova_compute[187185]: 2025-11-29 06:50:12.537 187189 DEBUG nova.network.os_vif_util [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b8:f5:6d,bridge_name='br-int',has_traffic_filtering=True,id=04956313-39e4-4275-ab3f-18aa7a1a0e46,network=Network(24ee44f0-2b10-459c-aabf-bf9ef2c8d950),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap04956313-39') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 06:50:12 compute-0 nova_compute[187185]: 2025-11-29 06:50:12.538 187189 DEBUG os_vif [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:f5:6d,bridge_name='br-int',has_traffic_filtering=True,id=04956313-39e4-4275-ab3f-18aa7a1a0e46,network=Network(24ee44f0-2b10-459c-aabf-bf9ef2c8d950),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap04956313-39') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 06:50:12 compute-0 nova_compute[187185]: 2025-11-29 06:50:12.540 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:50:12 compute-0 nova_compute[187185]: 2025-11-29 06:50:12.541 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap04956313-39, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:50:12 compute-0 nova_compute[187185]: 2025-11-29 06:50:12.614 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:50:12 compute-0 nova_compute[187185]: 2025-11-29 06:50:12.618 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 06:50:12 compute-0 nova_compute[187185]: 2025-11-29 06:50:12.621 187189 INFO os_vif [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:f5:6d,bridge_name='br-int',has_traffic_filtering=True,id=04956313-39e4-4275-ab3f-18aa7a1a0e46,network=Network(24ee44f0-2b10-459c-aabf-bf9ef2c8d950),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap04956313-39')
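The `DelPortCommand(... if_exists=True)` transaction a few lines above removes `tap04956313-39` from `br-int`, and `if_exists=True` makes the delete idempotent: removing a port that is already gone succeeds silently instead of raising. That semantics can be sketched with an in-memory bridge model (illustration only, not ovsdbapp itself; `Bridge` and `del_port` are hypothetical names):

```python
# Sketch of the if_exists semantics of the DelPortCommand transaction above:
# deleting a present port removes it; deleting it again is a no-op rather
# than an error. In-memory model only; NOT the ovsdbapp implementation.

class Bridge:
    def __init__(self, name, ports):
        self.name = name
        self.ports = set(ports)

def del_port(bridge, port, if_exists=True):
    """Return True if a port was removed, False if it was already absent."""
    if port not in bridge.ports:
        if if_exists:
            return False          # silently succeed, nothing to remove
        raise RuntimeError(f"port {port} not found on {bridge.name}")
    bridge.ports.remove(port)
    return True

br_int = Bridge("br-int", {"tap04956313-39", "tap5bda1138-f0"})
print(del_port(br_int, "tap04956313-39"))   # True: port removed
print(del_port(br_int, "tap04956313-39"))   # False: already gone, no error
```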
Nov 29 06:50:12 compute-0 nova_compute[187185]: 2025-11-29 06:50:12.622 187189 DEBUG oslo_concurrency.lockutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:50:12 compute-0 nova_compute[187185]: 2025-11-29 06:50:12.623 187189 DEBUG oslo_concurrency.lockutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:50:12 compute-0 nova_compute[187185]: 2025-11-29 06:50:12.624 187189 DEBUG oslo_concurrency.lockutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:50:12 compute-0 nova_compute[187185]: 2025-11-29 06:50:12.624 187189 DEBUG nova.compute.manager [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Nov 29 06:50:12 compute-0 nova_compute[187185]: 2025-11-29 06:50:12.626 187189 INFO nova.virt.libvirt.driver [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Deleting instance files /var/lib/nova/instances/e6b5b54b-9532-4f51-a346-42dee946a9ef_del
Nov 29 06:50:12 compute-0 nova_compute[187185]: 2025-11-29 06:50:12.628 187189 INFO nova.virt.libvirt.driver [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Deletion of /var/lib/nova/instances/e6b5b54b-9532-4f51-a346-42dee946a9ef_del complete
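The two libvirt driver lines above delete the source instance directory via a `<uuid>_del` path: the directory is renamed first and then removed, so an interrupted cleanup leaves an obviously dead `_del` directory rather than a half-deleted live one. A minimal sketch of that rename-then-delete pattern (illustration only; the paths and helper name are hypothetical, not Nova's code):

```python
import shutil
import tempfile
from pathlib import Path

# Sketch of the cleanup pattern in the driver lines above: rename the
# instance directory to "<uuid>_del", then remove it recursively.
# Illustration only; helper name and layout are hypothetical.

def delete_instance_files(instances_dir: Path, uuid: str) -> bool:
    """Rename <uuid> to <uuid>_del, remove it, report whether both are gone."""
    target = instances_dir / uuid
    tomb = instances_dir / f"{uuid}_del"
    if target.exists():
        target.rename(tomb)       # atomic within one filesystem
    if tomb.exists():
        shutil.rmtree(tomb)
    return not target.exists() and not tomb.exists()

root = Path(tempfile.mkdtemp())
inst = root / "e6b5b54b-9532-4f51-a346-42dee946a9ef"
inst.mkdir()
(inst / "disk").write_text("qcow2 data")
print(delete_instance_files(root, inst.name))  # True: deletion complete
```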
Nov 29 06:50:13 compute-0 podman[215168]: 2025-11-29 06:50:13.105972265 +0000 UTC m=+0.843804432 container remove 15907331385acf92b8c052164d5eaffc9c6d32dad52e8a168a5ec0e6c24c5532 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 29 06:50:13 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:50:13.111 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[11c24fe8-9154-4499-a0cb-639cd3505dd3]: (4, ('Sat Nov 29 06:50:10 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0 (15907331385acf92b8c052164d5eaffc9c6d32dad52e8a168a5ec0e6c24c5532)\n15907331385acf92b8c052164d5eaffc9c6d32dad52e8a168a5ec0e6c24c5532\nSat Nov 29 06:50:12 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0 (15907331385acf92b8c052164d5eaffc9c6d32dad52e8a168a5ec0e6c24c5532)\n15907331385acf92b8c052164d5eaffc9c6d32dad52e8a168a5ec0e6c24c5532\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:50:13 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:50:13.112 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[4b7cda3d-162a-4c56-adf0-cdb7ccc4d511]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:50:13 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:50:13.114 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5bda1138-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:50:13 compute-0 nova_compute[187185]: 2025-11-29 06:50:13.116 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:50:13 compute-0 kernel: tap5bda1138-f0: left promiscuous mode
Nov 29 06:50:13 compute-0 nova_compute[187185]: 2025-11-29 06:50:13.127 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:50:13 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:50:13.130 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[33586a90-c9a9-4d86-b390-44e79f32cb5f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:50:13 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:50:13.144 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[5559d91a-3161-4da1-9bfa-bae57bced04f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:50:13 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:50:13.145 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[4c278ab2-e951-4007-8b0b-d0c6c13784b6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:50:13 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:50:13.159 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[874a7da6-fe3c-4354-ae74-e3b804093486]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 444090, 'reachable_time': 25952, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215185, 'error': None, 'target': 'ovnmeta-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:50:13 compute-0 systemd[1]: run-netns-ovnmeta\x2d5bda1138\x2dfab5\x2d4b3a\x2d9a12\x2d4d1c90a4dce0.mount: Deactivated successfully.
Nov 29 06:50:13 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:50:13.163 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 06:50:13 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:50:13.164 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[5d989402-343f-4fdf-b28c-f2b6411b2de6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:50:13 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:50:13.165 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 04956313-39e4-4275-ab3f-18aa7a1a0e46 in datapath 24ee44f0-2b10-459c-aabf-bf9ef2c8d950 unbound from our chassis
Nov 29 06:50:13 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:50:13.166 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 24ee44f0-2b10-459c-aabf-bf9ef2c8d950, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 06:50:13 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:50:13.167 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[1824e2a9-4ab4-4306-8abb-c54af99e8f81]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:50:13 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:50:13.167 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950 namespace which is not needed anymore
Nov 29 06:50:13 compute-0 nova_compute[187185]: 2025-11-29 06:50:13.345 187189 DEBUG nova.compute.manager [req-787b27f2-a1a8-4297-a797-726470a90995 req-9c5f19ec-b6c6-4d01-8449-26b93f2f5ce4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Received event network-vif-plugged-04956313-39e4-4275-ab3f-18aa7a1a0e46 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:50:13 compute-0 nova_compute[187185]: 2025-11-29 06:50:13.345 187189 DEBUG oslo_concurrency.lockutils [req-787b27f2-a1a8-4297-a797-726470a90995 req-9c5f19ec-b6c6-4d01-8449-26b93f2f5ce4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "e6b5b54b-9532-4f51-a346-42dee946a9ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:50:13 compute-0 nova_compute[187185]: 2025-11-29 06:50:13.345 187189 DEBUG oslo_concurrency.lockutils [req-787b27f2-a1a8-4297-a797-726470a90995 req-9c5f19ec-b6c6-4d01-8449-26b93f2f5ce4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e6b5b54b-9532-4f51-a346-42dee946a9ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:50:13 compute-0 nova_compute[187185]: 2025-11-29 06:50:13.346 187189 DEBUG oslo_concurrency.lockutils [req-787b27f2-a1a8-4297-a797-726470a90995 req-9c5f19ec-b6c6-4d01-8449-26b93f2f5ce4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e6b5b54b-9532-4f51-a346-42dee946a9ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:50:13 compute-0 nova_compute[187185]: 2025-11-29 06:50:13.346 187189 DEBUG nova.compute.manager [req-787b27f2-a1a8-4297-a797-726470a90995 req-9c5f19ec-b6c6-4d01-8449-26b93f2f5ce4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] No waiting events found dispatching network-vif-plugged-04956313-39e4-4275-ab3f-18aa7a1a0e46 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 06:50:13 compute-0 nova_compute[187185]: 2025-11-29 06:50:13.346 187189 WARNING nova.compute.manager [req-787b27f2-a1a8-4297-a797-726470a90995 req-9c5f19ec-b6c6-4d01-8449-26b93f2f5ce4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Received unexpected event network-vif-plugged-04956313-39e4-4275-ab3f-18aa7a1a0e46 for instance with vm_state active and task_state migrating.
Nov 29 06:50:13 compute-0 nova_compute[187185]: 2025-11-29 06:50:13.346 187189 DEBUG nova.compute.manager [req-787b27f2-a1a8-4297-a797-726470a90995 req-9c5f19ec-b6c6-4d01-8449-26b93f2f5ce4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Received event network-vif-plugged-04956313-39e4-4275-ab3f-18aa7a1a0e46 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:50:13 compute-0 nova_compute[187185]: 2025-11-29 06:50:13.346 187189 DEBUG oslo_concurrency.lockutils [req-787b27f2-a1a8-4297-a797-726470a90995 req-9c5f19ec-b6c6-4d01-8449-26b93f2f5ce4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "e6b5b54b-9532-4f51-a346-42dee946a9ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:50:13 compute-0 nova_compute[187185]: 2025-11-29 06:50:13.347 187189 DEBUG oslo_concurrency.lockutils [req-787b27f2-a1a8-4297-a797-726470a90995 req-9c5f19ec-b6c6-4d01-8449-26b93f2f5ce4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e6b5b54b-9532-4f51-a346-42dee946a9ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:50:13 compute-0 nova_compute[187185]: 2025-11-29 06:50:13.347 187189 DEBUG oslo_concurrency.lockutils [req-787b27f2-a1a8-4297-a797-726470a90995 req-9c5f19ec-b6c6-4d01-8449-26b93f2f5ce4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e6b5b54b-9532-4f51-a346-42dee946a9ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:50:13 compute-0 nova_compute[187185]: 2025-11-29 06:50:13.347 187189 DEBUG nova.compute.manager [req-787b27f2-a1a8-4297-a797-726470a90995 req-9c5f19ec-b6c6-4d01-8449-26b93f2f5ce4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] No waiting events found dispatching network-vif-plugged-04956313-39e4-4275-ab3f-18aa7a1a0e46 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 06:50:13 compute-0 nova_compute[187185]: 2025-11-29 06:50:13.347 187189 WARNING nova.compute.manager [req-787b27f2-a1a8-4297-a797-726470a90995 req-9c5f19ec-b6c6-4d01-8449-26b93f2f5ce4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Received unexpected event network-vif-plugged-04956313-39e4-4275-ab3f-18aa7a1a0e46 for instance with vm_state active and task_state migrating.
Nov 29 06:50:13 compute-0 nova_compute[187185]: 2025-11-29 06:50:13.347 187189 DEBUG nova.compute.manager [req-787b27f2-a1a8-4297-a797-726470a90995 req-9c5f19ec-b6c6-4d01-8449-26b93f2f5ce4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Received event network-vif-plugged-04956313-39e4-4275-ab3f-18aa7a1a0e46 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:50:13 compute-0 nova_compute[187185]: 2025-11-29 06:50:13.348 187189 DEBUG oslo_concurrency.lockutils [req-787b27f2-a1a8-4297-a797-726470a90995 req-9c5f19ec-b6c6-4d01-8449-26b93f2f5ce4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "e6b5b54b-9532-4f51-a346-42dee946a9ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:50:13 compute-0 nova_compute[187185]: 2025-11-29 06:50:13.348 187189 DEBUG oslo_concurrency.lockutils [req-787b27f2-a1a8-4297-a797-726470a90995 req-9c5f19ec-b6c6-4d01-8449-26b93f2f5ce4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e6b5b54b-9532-4f51-a346-42dee946a9ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:50:13 compute-0 nova_compute[187185]: 2025-11-29 06:50:13.348 187189 DEBUG oslo_concurrency.lockutils [req-787b27f2-a1a8-4297-a797-726470a90995 req-9c5f19ec-b6c6-4d01-8449-26b93f2f5ce4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e6b5b54b-9532-4f51-a346-42dee946a9ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:50:13 compute-0 nova_compute[187185]: 2025-11-29 06:50:13.348 187189 DEBUG nova.compute.manager [req-787b27f2-a1a8-4297-a797-726470a90995 req-9c5f19ec-b6c6-4d01-8449-26b93f2f5ce4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] No waiting events found dispatching network-vif-plugged-04956313-39e4-4275-ab3f-18aa7a1a0e46 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 06:50:13 compute-0 nova_compute[187185]: 2025-11-29 06:50:13.348 187189 WARNING nova.compute.manager [req-787b27f2-a1a8-4297-a797-726470a90995 req-9c5f19ec-b6c6-4d01-8449-26b93f2f5ce4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Received unexpected event network-vif-plugged-04956313-39e4-4275-ab3f-18aa7a1a0e46 for instance with vm_state active and task_state migrating.
Nov 29 06:50:13 compute-0 neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950[214998]: [NOTICE]   (215021) : haproxy version is 2.8.14-c23fe91
Nov 29 06:50:13 compute-0 neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950[214998]: [NOTICE]   (215021) : path to executable is /usr/sbin/haproxy
Nov 29 06:50:13 compute-0 neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950[214998]: [WARNING]  (215021) : Exiting Master process...
Nov 29 06:50:13 compute-0 neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950[214998]: [ALERT]    (215021) : Current worker (215023) exited with code 143 (Terminated)
Nov 29 06:50:13 compute-0 neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950[214998]: [WARNING]  (215021) : All workers exited. Exiting... (0)
Nov 29 06:50:13 compute-0 systemd[1]: libpod-1e886a291adc90ea9db68dab53383e2808ca51cb06bd503b1468169c83a8dbb2.scope: Deactivated successfully.
Nov 29 06:50:13 compute-0 podman[215200]: 2025-11-29 06:50:13.466668136 +0000 UTC m=+0.225016579 container died 1e886a291adc90ea9db68dab53383e2808ca51cb06bd503b1468169c83a8dbb2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 29 06:50:14 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1e886a291adc90ea9db68dab53383e2808ca51cb06bd503b1468169c83a8dbb2-userdata-shm.mount: Deactivated successfully.
Nov 29 06:50:14 compute-0 systemd[1]: var-lib-containers-storage-overlay-876280c791202d40c17c8f8bc9e29f36dfba8c2f63858da516c2d2737292db9b-merged.mount: Deactivated successfully.
Nov 29 06:50:14 compute-0 podman[215200]: 2025-11-29 06:50:14.738651096 +0000 UTC m=+1.496999539 container cleanup 1e886a291adc90ea9db68dab53383e2808ca51cb06bd503b1468169c83a8dbb2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:50:15 compute-0 podman[215228]: 2025-11-29 06:50:15.574055949 +0000 UTC m=+0.817156099 container remove 1e886a291adc90ea9db68dab53383e2808ca51cb06bd503b1468169c83a8dbb2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 29 06:50:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:50:15.579 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[8351a8f7-b357-4423-ace7-ffaf8d43ac9d]: (4, ('Sat Nov 29 06:50:13 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950 (1e886a291adc90ea9db68dab53383e2808ca51cb06bd503b1468169c83a8dbb2)\n1e886a291adc90ea9db68dab53383e2808ca51cb06bd503b1468169c83a8dbb2\nSat Nov 29 06:50:14 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950 (1e886a291adc90ea9db68dab53383e2808ca51cb06bd503b1468169c83a8dbb2)\n1e886a291adc90ea9db68dab53383e2808ca51cb06bd503b1468169c83a8dbb2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:50:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:50:15.581 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[9db154bc-d95f-43a7-bbb7-d46104262a8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:50:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:50:15.582 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24ee44f0-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:50:15 compute-0 nova_compute[187185]: 2025-11-29 06:50:15.583 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:50:15 compute-0 kernel: tap24ee44f0-20: left promiscuous mode
Nov 29 06:50:15 compute-0 nova_compute[187185]: 2025-11-29 06:50:15.594 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:50:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:50:15.598 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[058d80b0-1166-4af0-accb-0c9c3b45dfa8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:50:15 compute-0 systemd[1]: libpod-conmon-1e886a291adc90ea9db68dab53383e2808ca51cb06bd503b1468169c83a8dbb2.scope: Deactivated successfully.
Nov 29 06:50:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:50:15.616 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[8620669e-a8f5-43ba-8d8f-c8ba6fb1b486]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:50:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:50:15.617 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[641ccb07-836a-49b6-ac26-89c17b9cabe2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:50:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:50:15.636 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[10345b5b-1d7b-441b-a747-2bcead79e025]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 444226, 'reachable_time': 30816, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215243, 'error': None, 'target': 'ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:50:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:50:15.639 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 06:50:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:50:15.640 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[b40532ac-9c42-4cf7-a5c6-251dd3a8bffb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:50:15 compute-0 systemd[1]: run-netns-ovnmeta\x2d24ee44f0\x2d2b10\x2d459c\x2daabf\x2dbf9ef2c8d950.mount: Deactivated successfully.
Nov 29 06:50:15 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Nov 29 06:50:15 compute-0 systemd[215069]: Activating special unit Exit the Session...
Nov 29 06:50:15 compute-0 systemd[215069]: Stopped target Main User Target.
Nov 29 06:50:15 compute-0 systemd[215069]: Stopped target Basic System.
Nov 29 06:50:15 compute-0 systemd[215069]: Stopped target Paths.
Nov 29 06:50:15 compute-0 systemd[215069]: Stopped target Sockets.
Nov 29 06:50:15 compute-0 systemd[215069]: Stopped target Timers.
Nov 29 06:50:15 compute-0 systemd[215069]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 29 06:50:15 compute-0 systemd[215069]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 29 06:50:15 compute-0 systemd[215069]: Closed D-Bus User Message Bus Socket.
Nov 29 06:50:15 compute-0 systemd[215069]: Stopped Create User's Volatile Files and Directories.
Nov 29 06:50:15 compute-0 systemd[215069]: Removed slice User Application Slice.
Nov 29 06:50:15 compute-0 systemd[215069]: Reached target Shutdown.
Nov 29 06:50:15 compute-0 systemd[215069]: Finished Exit the Session.
Nov 29 06:50:15 compute-0 systemd[215069]: Reached target Exit the Session.
Nov 29 06:50:15 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Nov 29 06:50:15 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Nov 29 06:50:15 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 29 06:50:16 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 29 06:50:16 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 29 06:50:16 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 29 06:50:16 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Nov 29 06:50:16 compute-0 nova_compute[187185]: 2025-11-29 06:50:16.711 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:50:17 compute-0 nova_compute[187185]: 2025-11-29 06:50:17.614 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:50:17 compute-0 podman[215250]: 2025-11-29 06:50:17.786261252 +0000 UTC m=+0.057785783 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vendor=Red Hat, Inc., config_id=edpm, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, version=9.6)
Nov 29 06:50:17 compute-0 podman[215251]: 2025-11-29 06:50:17.786604232 +0000 UTC m=+0.051973210 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 06:50:19 compute-0 sshd-session[215296]: Invalid user zjw from 179.125.24.202 port 56266
Nov 29 06:50:19 compute-0 podman[215298]: 2025-11-29 06:50:19.278343728 +0000 UTC m=+0.058486103 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 06:50:19 compute-0 sshd-session[215296]: Received disconnect from 179.125.24.202 port 56266:11: Bye Bye [preauth]
Nov 29 06:50:19 compute-0 sshd-session[215296]: Disconnected from invalid user zjw 179.125.24.202 port 56266 [preauth]
Nov 29 06:50:21 compute-0 nova_compute[187185]: 2025-11-29 06:50:21.708 187189 DEBUG oslo_concurrency.lockutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Acquiring lock "e6b5b54b-9532-4f51-a346-42dee946a9ef-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:50:21 compute-0 nova_compute[187185]: 2025-11-29 06:50:21.709 187189 DEBUG oslo_concurrency.lockutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Lock "e6b5b54b-9532-4f51-a346-42dee946a9ef-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:50:21 compute-0 nova_compute[187185]: 2025-11-29 06:50:21.709 187189 DEBUG oslo_concurrency.lockutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Lock "e6b5b54b-9532-4f51-a346-42dee946a9ef-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:50:21 compute-0 nova_compute[187185]: 2025-11-29 06:50:21.713 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:50:21 compute-0 nova_compute[187185]: 2025-11-29 06:50:21.744 187189 DEBUG oslo_concurrency.lockutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:50:21 compute-0 nova_compute[187185]: 2025-11-29 06:50:21.745 187189 DEBUG oslo_concurrency.lockutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:50:21 compute-0 nova_compute[187185]: 2025-11-29 06:50:21.745 187189 DEBUG oslo_concurrency.lockutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:50:21 compute-0 nova_compute[187185]: 2025-11-29 06:50:21.746 187189 DEBUG nova.compute.resource_tracker [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 06:50:21 compute-0 nova_compute[187185]: 2025-11-29 06:50:21.977 187189 WARNING nova.virt.libvirt.driver [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:50:21 compute-0 nova_compute[187185]: 2025-11-29 06:50:21.979 187189 DEBUG nova.compute.resource_tracker [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5768MB free_disk=73.34292221069336GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 06:50:21 compute-0 nova_compute[187185]: 2025-11-29 06:50:21.980 187189 DEBUG oslo_concurrency.lockutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:50:21 compute-0 nova_compute[187185]: 2025-11-29 06:50:21.980 187189 DEBUG oslo_concurrency.lockutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:50:22 compute-0 nova_compute[187185]: 2025-11-29 06:50:22.285 187189 DEBUG nova.compute.resource_tracker [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Migration for instance e6b5b54b-9532-4f51-a346-42dee946a9ef refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Nov 29 06:50:22 compute-0 nova_compute[187185]: 2025-11-29 06:50:22.312 187189 DEBUG nova.compute.resource_tracker [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Nov 29 06:50:22 compute-0 nova_compute[187185]: 2025-11-29 06:50:22.349 187189 DEBUG nova.compute.resource_tracker [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Migration 09644b82-9200-4175-a5be-cd05781c44be is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Nov 29 06:50:22 compute-0 nova_compute[187185]: 2025-11-29 06:50:22.350 187189 DEBUG nova.compute.resource_tracker [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 06:50:22 compute-0 nova_compute[187185]: 2025-11-29 06:50:22.350 187189 DEBUG nova.compute.resource_tracker [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 06:50:22 compute-0 nova_compute[187185]: 2025-11-29 06:50:22.393 187189 DEBUG nova.compute.provider_tree [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:50:22 compute-0 nova_compute[187185]: 2025-11-29 06:50:22.616 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:50:22 compute-0 nova_compute[187185]: 2025-11-29 06:50:22.724 187189 DEBUG nova.scheduler.client.report [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:50:22 compute-0 nova_compute[187185]: 2025-11-29 06:50:22.979 187189 DEBUG nova.compute.resource_tracker [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 06:50:22 compute-0 nova_compute[187185]: 2025-11-29 06:50:22.979 187189 DEBUG oslo_concurrency.lockutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.999s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:50:23 compute-0 nova_compute[187185]: 2025-11-29 06:50:23.344 187189 INFO nova.compute.manager [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Migrating instance to compute-2.ctlplane.example.com finished successfully.
Nov 29 06:50:23 compute-0 nova_compute[187185]: 2025-11-29 06:50:23.445 187189 INFO nova.scheduler.client.report [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Deleted allocation for migration 09644b82-9200-4175-a5be-cd05781c44be
Nov 29 06:50:23 compute-0 nova_compute[187185]: 2025-11-29 06:50:23.445 187189 DEBUG nova.virt.libvirt.driver [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Nov 29 06:50:24 compute-0 nova_compute[187185]: 2025-11-29 06:50:24.296 187189 DEBUG oslo_concurrency.lockutils [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Acquiring lock "7de218fe-a558-4904-b3c9-252e2928c03d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:50:24 compute-0 nova_compute[187185]: 2025-11-29 06:50:24.297 187189 DEBUG oslo_concurrency.lockutils [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Lock "7de218fe-a558-4904-b3c9-252e2928c03d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:50:24 compute-0 nova_compute[187185]: 2025-11-29 06:50:24.313 187189 DEBUG nova.compute.manager [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 06:50:24 compute-0 nova_compute[187185]: 2025-11-29 06:50:24.424 187189 DEBUG oslo_concurrency.lockutils [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:50:24 compute-0 nova_compute[187185]: 2025-11-29 06:50:24.425 187189 DEBUG oslo_concurrency.lockutils [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:50:24 compute-0 nova_compute[187185]: 2025-11-29 06:50:24.433 187189 DEBUG nova.virt.hardware [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 06:50:24 compute-0 nova_compute[187185]: 2025-11-29 06:50:24.434 187189 INFO nova.compute.claims [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Claim successful on node compute-0.ctlplane.example.com
Nov 29 06:50:24 compute-0 nova_compute[187185]: 2025-11-29 06:50:24.644 187189 DEBUG nova.compute.provider_tree [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:50:24 compute-0 nova_compute[187185]: 2025-11-29 06:50:24.657 187189 DEBUG nova.scheduler.client.report [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:50:24 compute-0 nova_compute[187185]: 2025-11-29 06:50:24.686 187189 DEBUG oslo_concurrency.lockutils [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.261s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:50:24 compute-0 nova_compute[187185]: 2025-11-29 06:50:24.687 187189 DEBUG nova.compute.manager [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 06:50:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:50:24.808 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:50:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:50:24.809 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:50:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:50:24.809 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:50:24 compute-0 podman[215318]: 2025-11-29 06:50:24.84366962 +0000 UTC m=+0.113430426 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 06:50:25 compute-0 nova_compute[187185]: 2025-11-29 06:50:25.577 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399010.5707242, e6b5b54b-9532-4f51-a346-42dee946a9ef => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:50:25 compute-0 nova_compute[187185]: 2025-11-29 06:50:25.577 187189 INFO nova.compute.manager [-] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] VM Stopped (Lifecycle Event)
Nov 29 06:50:25 compute-0 nova_compute[187185]: 2025-11-29 06:50:25.781 187189 DEBUG nova.compute.manager [None req-dcd7de40-152c-41ae-b748-73d3d19e8063 - - - - - -] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:50:25 compute-0 nova_compute[187185]: 2025-11-29 06:50:25.806 187189 DEBUG nova.compute.manager [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 06:50:25 compute-0 nova_compute[187185]: 2025-11-29 06:50:25.807 187189 DEBUG nova.network.neutron [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 06:50:25 compute-0 nova_compute[187185]: 2025-11-29 06:50:25.847 187189 INFO nova.virt.libvirt.driver [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 06:50:25 compute-0 nova_compute[187185]: 2025-11-29 06:50:25.883 187189 DEBUG nova.compute.manager [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 06:50:26 compute-0 nova_compute[187185]: 2025-11-29 06:50:26.021 187189 DEBUG nova.compute.manager [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 06:50:26 compute-0 nova_compute[187185]: 2025-11-29 06:50:26.023 187189 DEBUG nova.virt.libvirt.driver [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 06:50:26 compute-0 nova_compute[187185]: 2025-11-29 06:50:26.024 187189 INFO nova.virt.libvirt.driver [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Creating image(s)
Nov 29 06:50:26 compute-0 nova_compute[187185]: 2025-11-29 06:50:26.026 187189 DEBUG oslo_concurrency.lockutils [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Acquiring lock "/var/lib/nova/instances/7de218fe-a558-4904-b3c9-252e2928c03d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:50:26 compute-0 nova_compute[187185]: 2025-11-29 06:50:26.026 187189 DEBUG oslo_concurrency.lockutils [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Lock "/var/lib/nova/instances/7de218fe-a558-4904-b3c9-252e2928c03d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:50:26 compute-0 nova_compute[187185]: 2025-11-29 06:50:26.028 187189 DEBUG oslo_concurrency.lockutils [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Lock "/var/lib/nova/instances/7de218fe-a558-4904-b3c9-252e2928c03d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:50:26 compute-0 nova_compute[187185]: 2025-11-29 06:50:26.060 187189 DEBUG oslo_concurrency.processutils [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:50:26 compute-0 nova_compute[187185]: 2025-11-29 06:50:26.135 187189 DEBUG oslo_concurrency.processutils [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:50:26 compute-0 nova_compute[187185]: 2025-11-29 06:50:26.153 187189 DEBUG oslo_concurrency.lockutils [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:50:26 compute-0 nova_compute[187185]: 2025-11-29 06:50:26.159 187189 DEBUG oslo_concurrency.lockutils [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:50:26 compute-0 nova_compute[187185]: 2025-11-29 06:50:26.178 187189 DEBUG oslo_concurrency.processutils [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:50:26 compute-0 nova_compute[187185]: 2025-11-29 06:50:26.241 187189 DEBUG oslo_concurrency.processutils [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:50:26 compute-0 nova_compute[187185]: 2025-11-29 06:50:26.242 187189 DEBUG oslo_concurrency.processutils [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/7de218fe-a558-4904-b3c9-252e2928c03d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:50:26 compute-0 nova_compute[187185]: 2025-11-29 06:50:26.283 187189 DEBUG nova.network.neutron [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Nov 29 06:50:26 compute-0 nova_compute[187185]: 2025-11-29 06:50:26.285 187189 DEBUG nova.compute.manager [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 06:50:26 compute-0 nova_compute[187185]: 2025-11-29 06:50:26.773 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:50:26 compute-0 nova_compute[187185]: 2025-11-29 06:50:26.813 187189 DEBUG oslo_concurrency.processutils [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/7de218fe-a558-4904-b3c9-252e2928c03d/disk 1073741824" returned: 0 in 0.571s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:50:26 compute-0 nova_compute[187185]: 2025-11-29 06:50:26.814 187189 DEBUG oslo_concurrency.lockutils [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.656s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:50:26 compute-0 nova_compute[187185]: 2025-11-29 06:50:26.815 187189 DEBUG oslo_concurrency.processutils [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:50:26 compute-0 podman[215353]: 2025-11-29 06:50:26.868122778 +0000 UTC m=+0.066732427 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 06:50:26 compute-0 nova_compute[187185]: 2025-11-29 06:50:26.881 187189 DEBUG oslo_concurrency.processutils [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:50:26 compute-0 nova_compute[187185]: 2025-11-29 06:50:26.882 187189 DEBUG nova.virt.disk.api [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Checking if we can resize image /var/lib/nova/instances/7de218fe-a558-4904-b3c9-252e2928c03d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 06:50:26 compute-0 nova_compute[187185]: 2025-11-29 06:50:26.883 187189 DEBUG oslo_concurrency.processutils [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7de218fe-a558-4904-b3c9-252e2928c03d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:50:26 compute-0 podman[215352]: 2025-11-29 06:50:26.923484212 +0000 UTC m=+0.116843562 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3)
Nov 29 06:50:26 compute-0 nova_compute[187185]: 2025-11-29 06:50:26.963 187189 DEBUG oslo_concurrency.processutils [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7de218fe-a558-4904-b3c9-252e2928c03d/disk --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:50:26 compute-0 nova_compute[187185]: 2025-11-29 06:50:26.966 187189 DEBUG nova.virt.disk.api [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Cannot resize image /var/lib/nova/instances/7de218fe-a558-4904-b3c9-252e2928c03d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 06:50:26 compute-0 nova_compute[187185]: 2025-11-29 06:50:26.966 187189 DEBUG nova.objects.instance [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Lazy-loading 'migration_context' on Instance uuid 7de218fe-a558-4904-b3c9-252e2928c03d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:50:27 compute-0 nova_compute[187185]: 2025-11-29 06:50:27.618 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:50:27 compute-0 nova_compute[187185]: 2025-11-29 06:50:27.640 187189 DEBUG nova.virt.libvirt.driver [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 06:50:27 compute-0 nova_compute[187185]: 2025-11-29 06:50:27.641 187189 DEBUG nova.virt.libvirt.driver [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Ensure instance console log exists: /var/lib/nova/instances/7de218fe-a558-4904-b3c9-252e2928c03d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 06:50:27 compute-0 nova_compute[187185]: 2025-11-29 06:50:27.641 187189 DEBUG oslo_concurrency.lockutils [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:50:27 compute-0 nova_compute[187185]: 2025-11-29 06:50:27.642 187189 DEBUG oslo_concurrency.lockutils [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:50:27 compute-0 nova_compute[187185]: 2025-11-29 06:50:27.642 187189 DEBUG oslo_concurrency.lockutils [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:50:27 compute-0 nova_compute[187185]: 2025-11-29 06:50:27.644 187189 DEBUG nova.virt.libvirt.driver [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 06:50:27 compute-0 nova_compute[187185]: 2025-11-29 06:50:27.650 187189 WARNING nova.virt.libvirt.driver [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:50:27 compute-0 nova_compute[187185]: 2025-11-29 06:50:27.657 187189 DEBUG nova.virt.libvirt.host [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 06:50:27 compute-0 nova_compute[187185]: 2025-11-29 06:50:27.658 187189 DEBUG nova.virt.libvirt.host [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 06:50:27 compute-0 nova_compute[187185]: 2025-11-29 06:50:27.661 187189 DEBUG nova.virt.libvirt.host [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 06:50:27 compute-0 nova_compute[187185]: 2025-11-29 06:50:27.662 187189 DEBUG nova.virt.libvirt.host [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 06:50:27 compute-0 nova_compute[187185]: 2025-11-29 06:50:27.664 187189 DEBUG nova.virt.libvirt.driver [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 06:50:27 compute-0 nova_compute[187185]: 2025-11-29 06:50:27.665 187189 DEBUG nova.virt.hardware [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 06:50:27 compute-0 nova_compute[187185]: 2025-11-29 06:50:27.665 187189 DEBUG nova.virt.hardware [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 06:50:27 compute-0 nova_compute[187185]: 2025-11-29 06:50:27.665 187189 DEBUG nova.virt.hardware [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 06:50:27 compute-0 nova_compute[187185]: 2025-11-29 06:50:27.666 187189 DEBUG nova.virt.hardware [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 06:50:27 compute-0 nova_compute[187185]: 2025-11-29 06:50:27.666 187189 DEBUG nova.virt.hardware [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 06:50:27 compute-0 nova_compute[187185]: 2025-11-29 06:50:27.666 187189 DEBUG nova.virt.hardware [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 06:50:27 compute-0 nova_compute[187185]: 2025-11-29 06:50:27.666 187189 DEBUG nova.virt.hardware [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 06:50:27 compute-0 nova_compute[187185]: 2025-11-29 06:50:27.666 187189 DEBUG nova.virt.hardware [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 06:50:27 compute-0 nova_compute[187185]: 2025-11-29 06:50:27.667 187189 DEBUG nova.virt.hardware [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 06:50:27 compute-0 nova_compute[187185]: 2025-11-29 06:50:27.667 187189 DEBUG nova.virt.hardware [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 06:50:27 compute-0 nova_compute[187185]: 2025-11-29 06:50:27.667 187189 DEBUG nova.virt.hardware [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 06:50:27 compute-0 nova_compute[187185]: 2025-11-29 06:50:27.673 187189 DEBUG nova.objects.instance [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Lazy-loading 'pci_devices' on Instance uuid 7de218fe-a558-4904-b3c9-252e2928c03d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:50:28 compute-0 nova_compute[187185]: 2025-11-29 06:50:28.017 187189 DEBUG nova.virt.libvirt.driver [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] End _get_guest_xml xml=<domain type="kvm">
Nov 29 06:50:28 compute-0 nova_compute[187185]:   <uuid>7de218fe-a558-4904-b3c9-252e2928c03d</uuid>
Nov 29 06:50:28 compute-0 nova_compute[187185]:   <name>instance-0000000d</name>
Nov 29 06:50:28 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 06:50:28 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 06:50:28 compute-0 nova_compute[187185]:   <metadata>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 06:50:28 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:       <nova:name>tempest-ServersAdminNegativeTestJSON-server-615044941</nova:name>
Nov 29 06:50:28 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 06:50:27</nova:creationTime>
Nov 29 06:50:28 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 06:50:28 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 06:50:28 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 06:50:28 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 06:50:28 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 06:50:28 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 06:50:28 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 06:50:28 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 06:50:28 compute-0 nova_compute[187185]:         <nova:user uuid="7554120578c443aeb4b37d4ac60be1e6">tempest-ServersAdminNegativeTestJSON-633564556-project-member</nova:user>
Nov 29 06:50:28 compute-0 nova_compute[187185]:         <nova:project uuid="8edc6838ec0a494a86a17e1f5d0d039a">tempest-ServersAdminNegativeTestJSON-633564556</nova:project>
Nov 29 06:50:28 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 06:50:28 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:       <nova:ports/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 06:50:28 compute-0 nova_compute[187185]:   </metadata>
Nov 29 06:50:28 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 06:50:28 compute-0 nova_compute[187185]:     <system>
Nov 29 06:50:28 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 06:50:28 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 06:50:28 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 06:50:28 compute-0 nova_compute[187185]:       <entry name="serial">7de218fe-a558-4904-b3c9-252e2928c03d</entry>
Nov 29 06:50:28 compute-0 nova_compute[187185]:       <entry name="uuid">7de218fe-a558-4904-b3c9-252e2928c03d</entry>
Nov 29 06:50:28 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     </system>
Nov 29 06:50:28 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 06:50:28 compute-0 nova_compute[187185]:   <os>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:   </os>
Nov 29 06:50:28 compute-0 nova_compute[187185]:   <features>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     <apic/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:   </features>
Nov 29 06:50:28 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 06:50:28 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:   </clock>
Nov 29 06:50:28 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 06:50:28 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:   </cpu>
Nov 29 06:50:28 compute-0 nova_compute[187185]:   <devices>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 06:50:28 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/7de218fe-a558-4904-b3c9-252e2928c03d/disk"/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     </disk>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 06:50:28 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/7de218fe-a558-4904-b3c9-252e2928c03d/disk.config"/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     </disk>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 06:50:28 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/7de218fe-a558-4904-b3c9-252e2928c03d/console.log" append="off"/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     </serial>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     <video>
Nov 29 06:50:28 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     </video>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 06:50:28 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     </rng>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 06:50:28 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 06:50:28 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 06:50:28 compute-0 nova_compute[187185]:   </devices>
Nov 29 06:50:28 compute-0 nova_compute[187185]: </domain>
Nov 29 06:50:28 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 06:50:28 compute-0 nova_compute[187185]: 2025-11-29 06:50:28.074 187189 DEBUG nova.virt.libvirt.driver [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 06:50:28 compute-0 nova_compute[187185]: 2025-11-29 06:50:28.074 187189 DEBUG nova.virt.libvirt.driver [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 06:50:28 compute-0 nova_compute[187185]: 2025-11-29 06:50:28.075 187189 INFO nova.virt.libvirt.driver [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Using config drive
Nov 29 06:50:28 compute-0 nova_compute[187185]: 2025-11-29 06:50:28.255 187189 INFO nova.virt.libvirt.driver [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Creating config drive at /var/lib/nova/instances/7de218fe-a558-4904-b3c9-252e2928c03d/disk.config
Nov 29 06:50:28 compute-0 nova_compute[187185]: 2025-11-29 06:50:28.262 187189 DEBUG oslo_concurrency.processutils [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7de218fe-a558-4904-b3c9-252e2928c03d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3ged8el1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:50:28 compute-0 nova_compute[187185]: 2025-11-29 06:50:28.399 187189 DEBUG oslo_concurrency.processutils [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7de218fe-a558-4904-b3c9-252e2928c03d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3ged8el1" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:50:28 compute-0 systemd-machined[153486]: New machine qemu-4-instance-0000000d.
Nov 29 06:50:28 compute-0 systemd[1]: Started Virtual Machine qemu-4-instance-0000000d.
Nov 29 06:50:28 compute-0 nova_compute[187185]: 2025-11-29 06:50:28.803 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399028.8021836, 7de218fe-a558-4904-b3c9-252e2928c03d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:50:28 compute-0 nova_compute[187185]: 2025-11-29 06:50:28.805 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] VM Resumed (Lifecycle Event)
Nov 29 06:50:28 compute-0 nova_compute[187185]: 2025-11-29 06:50:28.808 187189 DEBUG nova.compute.manager [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 06:50:28 compute-0 nova_compute[187185]: 2025-11-29 06:50:28.808 187189 DEBUG nova.virt.libvirt.driver [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 06:50:28 compute-0 nova_compute[187185]: 2025-11-29 06:50:28.812 187189 INFO nova.virt.libvirt.driver [-] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Instance spawned successfully.
Nov 29 06:50:28 compute-0 nova_compute[187185]: 2025-11-29 06:50:28.812 187189 DEBUG nova.virt.libvirt.driver [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 06:50:28 compute-0 nova_compute[187185]: 2025-11-29 06:50:28.996 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:50:29 compute-0 nova_compute[187185]: 2025-11-29 06:50:29.002 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 06:50:29 compute-0 nova_compute[187185]: 2025-11-29 06:50:29.007 187189 DEBUG nova.virt.libvirt.driver [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:50:29 compute-0 nova_compute[187185]: 2025-11-29 06:50:29.008 187189 DEBUG nova.virt.libvirt.driver [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:50:29 compute-0 nova_compute[187185]: 2025-11-29 06:50:29.008 187189 DEBUG nova.virt.libvirt.driver [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:50:29 compute-0 nova_compute[187185]: 2025-11-29 06:50:29.008 187189 DEBUG nova.virt.libvirt.driver [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:50:29 compute-0 nova_compute[187185]: 2025-11-29 06:50:29.009 187189 DEBUG nova.virt.libvirt.driver [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:50:29 compute-0 nova_compute[187185]: 2025-11-29 06:50:29.009 187189 DEBUG nova.virt.libvirt.driver [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:50:29 compute-0 nova_compute[187185]: 2025-11-29 06:50:29.032 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 06:50:29 compute-0 nova_compute[187185]: 2025-11-29 06:50:29.032 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399028.8042817, 7de218fe-a558-4904-b3c9-252e2928c03d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:50:29 compute-0 nova_compute[187185]: 2025-11-29 06:50:29.032 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] VM Started (Lifecycle Event)
Nov 29 06:50:29 compute-0 nova_compute[187185]: 2025-11-29 06:50:29.076 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:50:29 compute-0 nova_compute[187185]: 2025-11-29 06:50:29.089 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 06:50:29 compute-0 nova_compute[187185]: 2025-11-29 06:50:29.130 187189 INFO nova.compute.manager [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Took 3.11 seconds to spawn the instance on the hypervisor.
Nov 29 06:50:29 compute-0 nova_compute[187185]: 2025-11-29 06:50:29.131 187189 DEBUG nova.compute.manager [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:50:29 compute-0 nova_compute[187185]: 2025-11-29 06:50:29.135 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 06:50:29 compute-0 nova_compute[187185]: 2025-11-29 06:50:29.260 187189 INFO nova.compute.manager [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Took 4.88 seconds to build instance.
Nov 29 06:50:29 compute-0 nova_compute[187185]: 2025-11-29 06:50:29.379 187189 DEBUG oslo_concurrency.lockutils [None req-c59101d7-d9a5-4f53-8243-e61974ca8815 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Lock "7de218fe-a558-4904-b3c9-252e2928c03d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.082s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:50:31 compute-0 nova_compute[187185]: 2025-11-29 06:50:31.775 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:50:32 compute-0 nova_compute[187185]: 2025-11-29 06:50:32.621 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:50:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:50:34.906 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 06:50:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:50:34.907 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 06:50:34 compute-0 nova_compute[187185]: 2025-11-29 06:50:34.907 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:50:35 compute-0 podman[215428]: 2025-11-29 06:50:35.843161687 +0000 UTC m=+0.094028118 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd)
Nov 29 06:50:36 compute-0 nova_compute[187185]: 2025-11-29 06:50:36.781 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:50:37 compute-0 nova_compute[187185]: 2025-11-29 06:50:37.627 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:50:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:50:39.909 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:50:41 compute-0 nova_compute[187185]: 2025-11-29 06:50:41.786 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:50:42 compute-0 nova_compute[187185]: 2025-11-29 06:50:42.631 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:50:43 compute-0 sshd-session[215463]: Invalid user a from 45.202.211.6 port 56396
Nov 29 06:50:43 compute-0 sshd-session[215463]: Received disconnect from 45.202.211.6 port 56396:11: Bye Bye [preauth]
Nov 29 06:50:43 compute-0 sshd-session[215463]: Disconnected from invalid user a 45.202.211.6 port 56396 [preauth]
Nov 29 06:50:44 compute-0 nova_compute[187185]: 2025-11-29 06:50:44.122 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:50:44 compute-0 nova_compute[187185]: 2025-11-29 06:50:44.122 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:50:44 compute-0 nova_compute[187185]: 2025-11-29 06:50:44.123 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 06:50:44 compute-0 nova_compute[187185]: 2025-11-29 06:50:44.123 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 06:50:44 compute-0 nova_compute[187185]: 2025-11-29 06:50:44.508 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "refresh_cache-7de218fe-a558-4904-b3c9-252e2928c03d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:50:44 compute-0 nova_compute[187185]: 2025-11-29 06:50:44.508 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquired lock "refresh_cache-7de218fe-a558-4904-b3c9-252e2928c03d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:50:44 compute-0 nova_compute[187185]: 2025-11-29 06:50:44.508 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 29 06:50:44 compute-0 nova_compute[187185]: 2025-11-29 06:50:44.509 187189 DEBUG nova.objects.instance [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 7de218fe-a558-4904-b3c9-252e2928c03d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:50:44 compute-0 nova_compute[187185]: 2025-11-29 06:50:44.814 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 06:50:45 compute-0 nova_compute[187185]: 2025-11-29 06:50:45.246 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:50:45 compute-0 nova_compute[187185]: 2025-11-29 06:50:45.265 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Releasing lock "refresh_cache-7de218fe-a558-4904-b3c9-252e2928c03d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:50:45 compute-0 nova_compute[187185]: 2025-11-29 06:50:45.266 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 29 06:50:45 compute-0 nova_compute[187185]: 2025-11-29 06:50:45.267 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:50:45 compute-0 nova_compute[187185]: 2025-11-29 06:50:45.268 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:50:45 compute-0 nova_compute[187185]: 2025-11-29 06:50:45.268 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:50:45 compute-0 nova_compute[187185]: 2025-11-29 06:50:45.269 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:50:45 compute-0 nova_compute[187185]: 2025-11-29 06:50:45.270 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:50:45 compute-0 nova_compute[187185]: 2025-11-29 06:50:45.270 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:50:45 compute-0 nova_compute[187185]: 2025-11-29 06:50:45.271 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 06:50:45 compute-0 nova_compute[187185]: 2025-11-29 06:50:45.271 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:50:45 compute-0 nova_compute[187185]: 2025-11-29 06:50:45.301 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:50:45 compute-0 nova_compute[187185]: 2025-11-29 06:50:45.302 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:50:45 compute-0 nova_compute[187185]: 2025-11-29 06:50:45.302 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:50:45 compute-0 nova_compute[187185]: 2025-11-29 06:50:45.303 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 06:50:45 compute-0 nova_compute[187185]: 2025-11-29 06:50:45.418 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7de218fe-a558-4904-b3c9-252e2928c03d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:50:45 compute-0 nova_compute[187185]: 2025-11-29 06:50:45.495 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7de218fe-a558-4904-b3c9-252e2928c03d/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:50:45 compute-0 nova_compute[187185]: 2025-11-29 06:50:45.497 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7de218fe-a558-4904-b3c9-252e2928c03d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:50:45 compute-0 nova_compute[187185]: 2025-11-29 06:50:45.556 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7de218fe-a558-4904-b3c9-252e2928c03d/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:50:45 compute-0 nova_compute[187185]: 2025-11-29 06:50:45.691 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:50:45 compute-0 nova_compute[187185]: 2025-11-29 06:50:45.693 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5618MB free_disk=73.3142318725586GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 06:50:45 compute-0 nova_compute[187185]: 2025-11-29 06:50:45.693 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:50:45 compute-0 nova_compute[187185]: 2025-11-29 06:50:45.693 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:50:45 compute-0 nova_compute[187185]: 2025-11-29 06:50:45.772 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance 7de218fe-a558-4904-b3c9-252e2928c03d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 06:50:45 compute-0 nova_compute[187185]: 2025-11-29 06:50:45.772 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 06:50:45 compute-0 nova_compute[187185]: 2025-11-29 06:50:45.772 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 06:50:45 compute-0 nova_compute[187185]: 2025-11-29 06:50:45.812 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:50:45 compute-0 nova_compute[187185]: 2025-11-29 06:50:45.828 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:50:45 compute-0 nova_compute[187185]: 2025-11-29 06:50:45.850 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 06:50:45 compute-0 nova_compute[187185]: 2025-11-29 06:50:45.850 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:50:46 compute-0 nova_compute[187185]: 2025-11-29 06:50:46.481 187189 DEBUG oslo_concurrency.lockutils [None req-68a43e11-205f-418e-a197-f868169458c7 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Acquiring lock "7de218fe-a558-4904-b3c9-252e2928c03d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:50:46 compute-0 nova_compute[187185]: 2025-11-29 06:50:46.482 187189 DEBUG oslo_concurrency.lockutils [None req-68a43e11-205f-418e-a197-f868169458c7 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Lock "7de218fe-a558-4904-b3c9-252e2928c03d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:50:46 compute-0 nova_compute[187185]: 2025-11-29 06:50:46.482 187189 DEBUG oslo_concurrency.lockutils [None req-68a43e11-205f-418e-a197-f868169458c7 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Acquiring lock "7de218fe-a558-4904-b3c9-252e2928c03d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:50:46 compute-0 nova_compute[187185]: 2025-11-29 06:50:46.483 187189 DEBUG oslo_concurrency.lockutils [None req-68a43e11-205f-418e-a197-f868169458c7 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Lock "7de218fe-a558-4904-b3c9-252e2928c03d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:50:46 compute-0 nova_compute[187185]: 2025-11-29 06:50:46.483 187189 DEBUG oslo_concurrency.lockutils [None req-68a43e11-205f-418e-a197-f868169458c7 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Lock "7de218fe-a558-4904-b3c9-252e2928c03d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:50:46 compute-0 sshd-session[215465]: Invalid user vyos from 160.202.8.218 port 42664
Nov 29 06:50:46 compute-0 nova_compute[187185]: 2025-11-29 06:50:46.505 187189 INFO nova.compute.manager [None req-68a43e11-205f-418e-a197-f868169458c7 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Terminating instance
Nov 29 06:50:46 compute-0 nova_compute[187185]: 2025-11-29 06:50:46.531 187189 DEBUG oslo_concurrency.lockutils [None req-68a43e11-205f-418e-a197-f868169458c7 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Acquiring lock "refresh_cache-7de218fe-a558-4904-b3c9-252e2928c03d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:50:46 compute-0 nova_compute[187185]: 2025-11-29 06:50:46.532 187189 DEBUG oslo_concurrency.lockutils [None req-68a43e11-205f-418e-a197-f868169458c7 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Acquired lock "refresh_cache-7de218fe-a558-4904-b3c9-252e2928c03d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:50:46 compute-0 nova_compute[187185]: 2025-11-29 06:50:46.532 187189 DEBUG nova.network.neutron [None req-68a43e11-205f-418e-a197-f868169458c7 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 06:50:46 compute-0 sshd-session[215465]: Received disconnect from 160.202.8.218 port 42664:11: Bye Bye [preauth]
Nov 29 06:50:46 compute-0 sshd-session[215465]: Disconnected from invalid user vyos 160.202.8.218 port 42664 [preauth]
Nov 29 06:50:46 compute-0 nova_compute[187185]: 2025-11-29 06:50:46.738 187189 DEBUG nova.network.neutron [None req-68a43e11-205f-418e-a197-f868169458c7 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 06:50:46 compute-0 nova_compute[187185]: 2025-11-29 06:50:46.809 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:50:47 compute-0 sshd-session[215474]: Invalid user mike from 103.179.56.44 port 40082
Nov 29 06:50:47 compute-0 nova_compute[187185]: 2025-11-29 06:50:47.088 187189 DEBUG nova.network.neutron [None req-68a43e11-205f-418e-a197-f868169458c7 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:50:47 compute-0 nova_compute[187185]: 2025-11-29 06:50:47.243 187189 DEBUG oslo_concurrency.lockutils [None req-68a43e11-205f-418e-a197-f868169458c7 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Releasing lock "refresh_cache-7de218fe-a558-4904-b3c9-252e2928c03d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:50:47 compute-0 nova_compute[187185]: 2025-11-29 06:50:47.245 187189 DEBUG nova.compute.manager [None req-68a43e11-205f-418e-a197-f868169458c7 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 06:50:47 compute-0 sshd-session[215474]: Received disconnect from 103.179.56.44 port 40082:11: Bye Bye [preauth]
Nov 29 06:50:47 compute-0 sshd-session[215474]: Disconnected from invalid user mike 103.179.56.44 port 40082 [preauth]
Nov 29 06:50:47 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Nov 29 06:50:47 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000d.scope: Consumed 12.723s CPU time.
Nov 29 06:50:47 compute-0 systemd-machined[153486]: Machine qemu-4-instance-0000000d terminated.
Nov 29 06:50:47 compute-0 nova_compute[187185]: 2025-11-29 06:50:47.508 187189 INFO nova.virt.libvirt.driver [-] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Instance destroyed successfully.
Nov 29 06:50:47 compute-0 nova_compute[187185]: 2025-11-29 06:50:47.511 187189 DEBUG nova.objects.instance [None req-68a43e11-205f-418e-a197-f868169458c7 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Lazy-loading 'resources' on Instance uuid 7de218fe-a558-4904-b3c9-252e2928c03d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:50:47 compute-0 nova_compute[187185]: 2025-11-29 06:50:47.531 187189 INFO nova.virt.libvirt.driver [None req-68a43e11-205f-418e-a197-f868169458c7 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Deleting instance files /var/lib/nova/instances/7de218fe-a558-4904-b3c9-252e2928c03d_del
Nov 29 06:50:47 compute-0 nova_compute[187185]: 2025-11-29 06:50:47.531 187189 INFO nova.virt.libvirt.driver [None req-68a43e11-205f-418e-a197-f868169458c7 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Deletion of /var/lib/nova/instances/7de218fe-a558-4904-b3c9-252e2928c03d_del complete
Nov 29 06:50:47 compute-0 nova_compute[187185]: 2025-11-29 06:50:47.603 187189 INFO nova.compute.manager [None req-68a43e11-205f-418e-a197-f868169458c7 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Took 0.36 seconds to destroy the instance on the hypervisor.
Nov 29 06:50:47 compute-0 nova_compute[187185]: 2025-11-29 06:50:47.604 187189 DEBUG oslo.service.loopingcall [None req-68a43e11-205f-418e-a197-f868169458c7 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 06:50:47 compute-0 nova_compute[187185]: 2025-11-29 06:50:47.604 187189 DEBUG nova.compute.manager [-] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 06:50:47 compute-0 nova_compute[187185]: 2025-11-29 06:50:47.604 187189 DEBUG nova.network.neutron [-] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 06:50:47 compute-0 nova_compute[187185]: 2025-11-29 06:50:47.636 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:50:47 compute-0 nova_compute[187185]: 2025-11-29 06:50:47.799 187189 DEBUG nova.network.neutron [-] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 06:50:47 compute-0 nova_compute[187185]: 2025-11-29 06:50:47.822 187189 DEBUG nova.network.neutron [-] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:50:47 compute-0 nova_compute[187185]: 2025-11-29 06:50:47.836 187189 INFO nova.compute.manager [-] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Took 0.23 seconds to deallocate network for instance.
Nov 29 06:50:47 compute-0 nova_compute[187185]: 2025-11-29 06:50:47.938 187189 DEBUG oslo_concurrency.lockutils [None req-68a43e11-205f-418e-a197-f868169458c7 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:50:47 compute-0 nova_compute[187185]: 2025-11-29 06:50:47.938 187189 DEBUG oslo_concurrency.lockutils [None req-68a43e11-205f-418e-a197-f868169458c7 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:50:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:50:47.983 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:50:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:50:47.985 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:50:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:50:47.985 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:50:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:50:47.985 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:50:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:50:47.985 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:50:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:50:47.985 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:50:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:50:47.985 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:50:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:50:47.986 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:50:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:50:47.986 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:50:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:50:47.986 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:50:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:50:47.986 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:50:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:50:47.986 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:50:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:50:47.986 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:50:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:50:47.986 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:50:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:50:47.987 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:50:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:50:47.987 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:50:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:50:47.987 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:50:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:50:47.987 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:50:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:50:47.987 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:50:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:50:47.987 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:50:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:50:47.988 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:50:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:50:47.988 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:50:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:50:47.988 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:50:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:50:47.988 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:50:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:50:47.988 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:50:48 compute-0 nova_compute[187185]: 2025-11-29 06:50:48.035 187189 DEBUG nova.compute.provider_tree [None req-68a43e11-205f-418e-a197-f868169458c7 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:50:48 compute-0 nova_compute[187185]: 2025-11-29 06:50:48.055 187189 DEBUG nova.scheduler.client.report [None req-68a43e11-205f-418e-a197-f868169458c7 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:50:48 compute-0 nova_compute[187185]: 2025-11-29 06:50:48.086 187189 DEBUG oslo_concurrency.lockutils [None req-68a43e11-205f-418e-a197-f868169458c7 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:50:48 compute-0 nova_compute[187185]: 2025-11-29 06:50:48.123 187189 INFO nova.scheduler.client.report [None req-68a43e11-205f-418e-a197-f868169458c7 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Deleted allocations for instance 7de218fe-a558-4904-b3c9-252e2928c03d
Nov 29 06:50:48 compute-0 nova_compute[187185]: 2025-11-29 06:50:48.250 187189 DEBUG oslo_concurrency.lockutils [None req-68a43e11-205f-418e-a197-f868169458c7 7554120578c443aeb4b37d4ac60be1e6 8edc6838ec0a494a86a17e1f5d0d039a - - default default] Lock "7de218fe-a558-4904-b3c9-252e2928c03d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.768s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:50:48 compute-0 podman[215486]: 2025-11-29 06:50:48.84111576 +0000 UTC m=+0.096637641 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 06:50:48 compute-0 podman[215485]: 2025-11-29 06:50:48.844973869 +0000 UTC m=+0.102320062 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, architecture=x86_64, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 29 06:50:49 compute-0 podman[215528]: 2025-11-29 06:50:49.815574516 +0000 UTC m=+0.076783731 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible)
Nov 29 06:50:51 compute-0 nova_compute[187185]: 2025-11-29 06:50:51.848 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:50:52 compute-0 nova_compute[187185]: 2025-11-29 06:50:52.638 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:50:55 compute-0 podman[215548]: 2025-11-29 06:50:55.855735105 +0000 UTC m=+0.117840816 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 29 06:50:56 compute-0 nova_compute[187185]: 2025-11-29 06:50:56.849 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:50:57 compute-0 nova_compute[187185]: 2025-11-29 06:50:57.640 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:50:57 compute-0 podman[215575]: 2025-11-29 06:50:57.744936199 +0000 UTC m=+0.076380880 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Nov 29 06:50:57 compute-0 podman[215576]: 2025-11-29 06:50:57.76239812 +0000 UTC m=+0.078277003 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 06:51:01 compute-0 nova_compute[187185]: 2025-11-29 06:51:01.884 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:51:02 compute-0 nova_compute[187185]: 2025-11-29 06:51:02.507 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399047.5054848, 7de218fe-a558-4904-b3c9-252e2928c03d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:51:02 compute-0 nova_compute[187185]: 2025-11-29 06:51:02.508 187189 INFO nova.compute.manager [-] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] VM Stopped (Lifecycle Event)
Nov 29 06:51:02 compute-0 nova_compute[187185]: 2025-11-29 06:51:02.643 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:51:02 compute-0 nova_compute[187185]: 2025-11-29 06:51:02.732 187189 DEBUG nova.compute.manager [None req-82e84a4e-30a1-4c09-ad62-ddddb9e4ba03 - - - - - -] [instance: 7de218fe-a558-4904-b3c9-252e2928c03d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:51:06 compute-0 podman[215619]: 2025-11-29 06:51:06.845890131 +0000 UTC m=+0.093523821 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 06:51:06 compute-0 nova_compute[187185]: 2025-11-29 06:51:06.869 187189 DEBUG oslo_concurrency.lockutils [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Acquiring lock "ec67e8c7-49f4-4ba9-bcf9-edb89515859d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:51:06 compute-0 nova_compute[187185]: 2025-11-29 06:51:06.870 187189 DEBUG oslo_concurrency.lockutils [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Lock "ec67e8c7-49f4-4ba9-bcf9-edb89515859d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:51:06 compute-0 nova_compute[187185]: 2025-11-29 06:51:06.886 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:51:07 compute-0 nova_compute[187185]: 2025-11-29 06:51:07.105 187189 DEBUG nova.compute.manager [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 06:51:07 compute-0 nova_compute[187185]: 2025-11-29 06:51:07.220 187189 DEBUG oslo_concurrency.lockutils [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:51:07 compute-0 nova_compute[187185]: 2025-11-29 06:51:07.221 187189 DEBUG oslo_concurrency.lockutils [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:51:07 compute-0 nova_compute[187185]: 2025-11-29 06:51:07.231 187189 DEBUG nova.virt.hardware [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 06:51:07 compute-0 nova_compute[187185]: 2025-11-29 06:51:07.232 187189 INFO nova.compute.claims [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] Claim successful on node compute-0.ctlplane.example.com
Nov 29 06:51:07 compute-0 nova_compute[187185]: 2025-11-29 06:51:07.370 187189 DEBUG nova.compute.provider_tree [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:51:07 compute-0 nova_compute[187185]: 2025-11-29 06:51:07.390 187189 DEBUG nova.scheduler.client.report [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:51:07 compute-0 nova_compute[187185]: 2025-11-29 06:51:07.415 187189 DEBUG oslo_concurrency.lockutils [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.193s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:51:07 compute-0 nova_compute[187185]: 2025-11-29 06:51:07.416 187189 DEBUG nova.compute.manager [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 06:51:07 compute-0 nova_compute[187185]: 2025-11-29 06:51:07.484 187189 DEBUG nova.compute.manager [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 06:51:07 compute-0 nova_compute[187185]: 2025-11-29 06:51:07.485 187189 DEBUG nova.network.neutron [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 06:51:07 compute-0 nova_compute[187185]: 2025-11-29 06:51:07.503 187189 INFO nova.virt.libvirt.driver [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 06:51:07 compute-0 nova_compute[187185]: 2025-11-29 06:51:07.529 187189 DEBUG nova.compute.manager [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 06:51:07 compute-0 nova_compute[187185]: 2025-11-29 06:51:07.647 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:51:07 compute-0 nova_compute[187185]: 2025-11-29 06:51:07.657 187189 DEBUG nova.compute.manager [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 06:51:07 compute-0 nova_compute[187185]: 2025-11-29 06:51:07.659 187189 DEBUG nova.virt.libvirt.driver [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 06:51:07 compute-0 nova_compute[187185]: 2025-11-29 06:51:07.660 187189 INFO nova.virt.libvirt.driver [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] Creating image(s)
Nov 29 06:51:07 compute-0 nova_compute[187185]: 2025-11-29 06:51:07.660 187189 DEBUG oslo_concurrency.lockutils [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Acquiring lock "/var/lib/nova/instances/ec67e8c7-49f4-4ba9-bcf9-edb89515859d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:51:07 compute-0 nova_compute[187185]: 2025-11-29 06:51:07.661 187189 DEBUG oslo_concurrency.lockutils [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Lock "/var/lib/nova/instances/ec67e8c7-49f4-4ba9-bcf9-edb89515859d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:51:07 compute-0 nova_compute[187185]: 2025-11-29 06:51:07.662 187189 DEBUG oslo_concurrency.lockutils [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Lock "/var/lib/nova/instances/ec67e8c7-49f4-4ba9-bcf9-edb89515859d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:51:07 compute-0 nova_compute[187185]: 2025-11-29 06:51:07.679 187189 DEBUG oslo_concurrency.processutils [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:51:07 compute-0 nova_compute[187185]: 2025-11-29 06:51:07.769 187189 DEBUG oslo_concurrency.processutils [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:51:07 compute-0 nova_compute[187185]: 2025-11-29 06:51:07.771 187189 DEBUG oslo_concurrency.lockutils [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:51:07 compute-0 nova_compute[187185]: 2025-11-29 06:51:07.771 187189 DEBUG oslo_concurrency.lockutils [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:51:07 compute-0 nova_compute[187185]: 2025-11-29 06:51:07.782 187189 DEBUG oslo_concurrency.processutils [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:51:07 compute-0 nova_compute[187185]: 2025-11-29 06:51:07.854 187189 DEBUG oslo_concurrency.processutils [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:51:07 compute-0 nova_compute[187185]: 2025-11-29 06:51:07.856 187189 DEBUG oslo_concurrency.processutils [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/ec67e8c7-49f4-4ba9-bcf9-edb89515859d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:51:07 compute-0 nova_compute[187185]: 2025-11-29 06:51:07.885 187189 DEBUG nova.network.neutron [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Nov 29 06:51:07 compute-0 nova_compute[187185]: 2025-11-29 06:51:07.885 187189 DEBUG nova.compute.manager [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 06:51:08 compute-0 nova_compute[187185]: 2025-11-29 06:51:08.010 187189 DEBUG oslo_concurrency.processutils [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/ec67e8c7-49f4-4ba9-bcf9-edb89515859d/disk 1073741824" returned: 0 in 0.155s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:51:08 compute-0 nova_compute[187185]: 2025-11-29 06:51:08.012 187189 DEBUG oslo_concurrency.lockutils [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.240s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:51:08 compute-0 nova_compute[187185]: 2025-11-29 06:51:08.012 187189 DEBUG oslo_concurrency.processutils [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:51:08 compute-0 nova_compute[187185]: 2025-11-29 06:51:08.092 187189 DEBUG oslo_concurrency.processutils [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:51:08 compute-0 nova_compute[187185]: 2025-11-29 06:51:08.093 187189 DEBUG nova.virt.disk.api [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Checking if we can resize image /var/lib/nova/instances/ec67e8c7-49f4-4ba9-bcf9-edb89515859d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 06:51:08 compute-0 nova_compute[187185]: 2025-11-29 06:51:08.094 187189 DEBUG oslo_concurrency.processutils [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ec67e8c7-49f4-4ba9-bcf9-edb89515859d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:51:08 compute-0 nova_compute[187185]: 2025-11-29 06:51:08.186 187189 DEBUG oslo_concurrency.processutils [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ec67e8c7-49f4-4ba9-bcf9-edb89515859d/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:51:08 compute-0 nova_compute[187185]: 2025-11-29 06:51:08.187 187189 DEBUG nova.virt.disk.api [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Cannot resize image /var/lib/nova/instances/ec67e8c7-49f4-4ba9-bcf9-edb89515859d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 06:51:08 compute-0 nova_compute[187185]: 2025-11-29 06:51:08.188 187189 DEBUG nova.objects.instance [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Lazy-loading 'migration_context' on Instance uuid ec67e8c7-49f4-4ba9-bcf9-edb89515859d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:51:08 compute-0 nova_compute[187185]: 2025-11-29 06:51:08.211 187189 DEBUG nova.virt.libvirt.driver [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 06:51:08 compute-0 nova_compute[187185]: 2025-11-29 06:51:08.212 187189 DEBUG nova.virt.libvirt.driver [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] Ensure instance console log exists: /var/lib/nova/instances/ec67e8c7-49f4-4ba9-bcf9-edb89515859d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 06:51:08 compute-0 nova_compute[187185]: 2025-11-29 06:51:08.213 187189 DEBUG oslo_concurrency.lockutils [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:51:08 compute-0 nova_compute[187185]: 2025-11-29 06:51:08.213 187189 DEBUG oslo_concurrency.lockutils [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:51:08 compute-0 nova_compute[187185]: 2025-11-29 06:51:08.214 187189 DEBUG oslo_concurrency.lockutils [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:51:08 compute-0 nova_compute[187185]: 2025-11-29 06:51:08.217 187189 DEBUG nova.virt.libvirt.driver [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 06:51:08 compute-0 nova_compute[187185]: 2025-11-29 06:51:08.223 187189 WARNING nova.virt.libvirt.driver [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:51:08 compute-0 nova_compute[187185]: 2025-11-29 06:51:08.229 187189 DEBUG nova.virt.libvirt.host [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 06:51:08 compute-0 nova_compute[187185]: 2025-11-29 06:51:08.230 187189 DEBUG nova.virt.libvirt.host [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 06:51:08 compute-0 nova_compute[187185]: 2025-11-29 06:51:08.237 187189 DEBUG nova.virt.libvirt.host [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 06:51:08 compute-0 nova_compute[187185]: 2025-11-29 06:51:08.238 187189 DEBUG nova.virt.libvirt.host [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 06:51:08 compute-0 nova_compute[187185]: 2025-11-29 06:51:08.240 187189 DEBUG nova.virt.libvirt.driver [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 06:51:08 compute-0 nova_compute[187185]: 2025-11-29 06:51:08.241 187189 DEBUG nova.virt.hardware [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 06:51:08 compute-0 nova_compute[187185]: 2025-11-29 06:51:08.242 187189 DEBUG nova.virt.hardware [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 06:51:08 compute-0 nova_compute[187185]: 2025-11-29 06:51:08.242 187189 DEBUG nova.virt.hardware [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 06:51:08 compute-0 nova_compute[187185]: 2025-11-29 06:51:08.243 187189 DEBUG nova.virt.hardware [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 06:51:08 compute-0 nova_compute[187185]: 2025-11-29 06:51:08.243 187189 DEBUG nova.virt.hardware [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 06:51:08 compute-0 nova_compute[187185]: 2025-11-29 06:51:08.243 187189 DEBUG nova.virt.hardware [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 06:51:08 compute-0 nova_compute[187185]: 2025-11-29 06:51:08.244 187189 DEBUG nova.virt.hardware [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 06:51:08 compute-0 nova_compute[187185]: 2025-11-29 06:51:08.244 187189 DEBUG nova.virt.hardware [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 06:51:08 compute-0 nova_compute[187185]: 2025-11-29 06:51:08.245 187189 DEBUG nova.virt.hardware [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 06:51:08 compute-0 nova_compute[187185]: 2025-11-29 06:51:08.245 187189 DEBUG nova.virt.hardware [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 06:51:08 compute-0 nova_compute[187185]: 2025-11-29 06:51:08.246 187189 DEBUG nova.virt.hardware [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 06:51:08 compute-0 nova_compute[187185]: 2025-11-29 06:51:08.251 187189 DEBUG nova.objects.instance [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Lazy-loading 'pci_devices' on Instance uuid ec67e8c7-49f4-4ba9-bcf9-edb89515859d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:51:08 compute-0 nova_compute[187185]: 2025-11-29 06:51:08.265 187189 DEBUG nova.virt.libvirt.driver [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] End _get_guest_xml xml=<domain type="kvm">
Nov 29 06:51:08 compute-0 nova_compute[187185]:   <uuid>ec67e8c7-49f4-4ba9-bcf9-edb89515859d</uuid>
Nov 29 06:51:08 compute-0 nova_compute[187185]:   <name>instance-00000011</name>
Nov 29 06:51:08 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 06:51:08 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 06:51:08 compute-0 nova_compute[187185]:   <metadata>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 06:51:08 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:       <nova:name>tempest-TenantUsagesTestJSON-server-1847492086</nova:name>
Nov 29 06:51:08 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 06:51:08</nova:creationTime>
Nov 29 06:51:08 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 06:51:08 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 06:51:08 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 06:51:08 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 06:51:08 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 06:51:08 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 06:51:08 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 06:51:08 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 06:51:08 compute-0 nova_compute[187185]:         <nova:user uuid="3f3456486e4244088e418ed04de4f32a">tempest-TenantUsagesTestJSON-1713460248-project-member</nova:user>
Nov 29 06:51:08 compute-0 nova_compute[187185]:         <nova:project uuid="6aaa2c8f4690456c89e6d3a1335b3abf">tempest-TenantUsagesTestJSON-1713460248</nova:project>
Nov 29 06:51:08 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 06:51:08 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:       <nova:ports/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 06:51:08 compute-0 nova_compute[187185]:   </metadata>
Nov 29 06:51:08 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 06:51:08 compute-0 nova_compute[187185]:     <system>
Nov 29 06:51:08 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 06:51:08 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 06:51:08 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 06:51:08 compute-0 nova_compute[187185]:       <entry name="serial">ec67e8c7-49f4-4ba9-bcf9-edb89515859d</entry>
Nov 29 06:51:08 compute-0 nova_compute[187185]:       <entry name="uuid">ec67e8c7-49f4-4ba9-bcf9-edb89515859d</entry>
Nov 29 06:51:08 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     </system>
Nov 29 06:51:08 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 06:51:08 compute-0 nova_compute[187185]:   <os>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:   </os>
Nov 29 06:51:08 compute-0 nova_compute[187185]:   <features>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     <apic/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:   </features>
Nov 29 06:51:08 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 06:51:08 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:   </clock>
Nov 29 06:51:08 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 06:51:08 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:   </cpu>
Nov 29 06:51:08 compute-0 nova_compute[187185]:   <devices>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 06:51:08 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/ec67e8c7-49f4-4ba9-bcf9-edb89515859d/disk"/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     </disk>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 06:51:08 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/ec67e8c7-49f4-4ba9-bcf9-edb89515859d/disk.config"/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     </disk>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 06:51:08 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/ec67e8c7-49f4-4ba9-bcf9-edb89515859d/console.log" append="off"/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     </serial>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     <video>
Nov 29 06:51:08 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     </video>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 06:51:08 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     </rng>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 06:51:08 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 06:51:08 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 06:51:08 compute-0 nova_compute[187185]:   </devices>
Nov 29 06:51:08 compute-0 nova_compute[187185]: </domain>
Nov 29 06:51:08 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 06:51:08 compute-0 nova_compute[187185]: 2025-11-29 06:51:08.363 187189 DEBUG nova.virt.libvirt.driver [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 06:51:08 compute-0 nova_compute[187185]: 2025-11-29 06:51:08.363 187189 DEBUG nova.virt.libvirt.driver [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 06:51:08 compute-0 nova_compute[187185]: 2025-11-29 06:51:08.364 187189 INFO nova.virt.libvirt.driver [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] Using config drive
Nov 29 06:51:08 compute-0 nova_compute[187185]: 2025-11-29 06:51:08.615 187189 INFO nova.virt.libvirt.driver [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] Creating config drive at /var/lib/nova/instances/ec67e8c7-49f4-4ba9-bcf9-edb89515859d/disk.config
Nov 29 06:51:08 compute-0 nova_compute[187185]: 2025-11-29 06:51:08.620 187189 DEBUG oslo_concurrency.processutils [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ec67e8c7-49f4-4ba9-bcf9-edb89515859d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptvj33hbo execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:51:08 compute-0 nova_compute[187185]: 2025-11-29 06:51:08.760 187189 DEBUG oslo_concurrency.processutils [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ec67e8c7-49f4-4ba9-bcf9-edb89515859d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptvj33hbo" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:51:08 compute-0 systemd-machined[153486]: New machine qemu-5-instance-00000011.
Nov 29 06:51:08 compute-0 systemd[1]: Started Virtual Machine qemu-5-instance-00000011.
Nov 29 06:51:09 compute-0 nova_compute[187185]: 2025-11-29 06:51:09.144 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399069.1438088, ec67e8c7-49f4-4ba9-bcf9-edb89515859d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:51:09 compute-0 nova_compute[187185]: 2025-11-29 06:51:09.145 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] VM Resumed (Lifecycle Event)
Nov 29 06:51:09 compute-0 nova_compute[187185]: 2025-11-29 06:51:09.148 187189 DEBUG nova.compute.manager [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 06:51:09 compute-0 nova_compute[187185]: 2025-11-29 06:51:09.149 187189 DEBUG nova.virt.libvirt.driver [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 06:51:09 compute-0 nova_compute[187185]: 2025-11-29 06:51:09.153 187189 INFO nova.virt.libvirt.driver [-] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] Instance spawned successfully.
Nov 29 06:51:09 compute-0 nova_compute[187185]: 2025-11-29 06:51:09.153 187189 DEBUG nova.virt.libvirt.driver [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 06:51:09 compute-0 nova_compute[187185]: 2025-11-29 06:51:09.196 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:51:09 compute-0 nova_compute[187185]: 2025-11-29 06:51:09.202 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 06:51:09 compute-0 nova_compute[187185]: 2025-11-29 06:51:09.367 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 06:51:09 compute-0 nova_compute[187185]: 2025-11-29 06:51:09.367 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399069.1477513, ec67e8c7-49f4-4ba9-bcf9-edb89515859d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:51:09 compute-0 nova_compute[187185]: 2025-11-29 06:51:09.368 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] VM Started (Lifecycle Event)
Nov 29 06:51:09 compute-0 nova_compute[187185]: 2025-11-29 06:51:09.370 187189 DEBUG nova.virt.libvirt.driver [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:51:09 compute-0 nova_compute[187185]: 2025-11-29 06:51:09.371 187189 DEBUG nova.virt.libvirt.driver [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:51:09 compute-0 nova_compute[187185]: 2025-11-29 06:51:09.371 187189 DEBUG nova.virt.libvirt.driver [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:51:09 compute-0 nova_compute[187185]: 2025-11-29 06:51:09.372 187189 DEBUG nova.virt.libvirt.driver [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:51:09 compute-0 nova_compute[187185]: 2025-11-29 06:51:09.372 187189 DEBUG nova.virt.libvirt.driver [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:51:09 compute-0 nova_compute[187185]: 2025-11-29 06:51:09.373 187189 DEBUG nova.virt.libvirt.driver [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:51:09 compute-0 nova_compute[187185]: 2025-11-29 06:51:09.425 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:51:09 compute-0 nova_compute[187185]: 2025-11-29 06:51:09.430 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 06:51:09 compute-0 nova_compute[187185]: 2025-11-29 06:51:09.459 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 06:51:09 compute-0 nova_compute[187185]: 2025-11-29 06:51:09.463 187189 INFO nova.compute.manager [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] Took 1.80 seconds to spawn the instance on the hypervisor.
Nov 29 06:51:09 compute-0 nova_compute[187185]: 2025-11-29 06:51:09.463 187189 DEBUG nova.compute.manager [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:51:09 compute-0 nova_compute[187185]: 2025-11-29 06:51:09.556 187189 INFO nova.compute.manager [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] Took 2.38 seconds to build instance.
Nov 29 06:51:09 compute-0 nova_compute[187185]: 2025-11-29 06:51:09.578 187189 DEBUG oslo_concurrency.lockutils [None req-6e020183-67b0-417a-a215-99f3482b08e3 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Lock "ec67e8c7-49f4-4ba9-bcf9-edb89515859d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.708s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:51:10 compute-0 ovn_controller[95281]: 2025-11-29T06:51:10Z|00051|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Nov 29 06:51:11 compute-0 nova_compute[187185]: 2025-11-29 06:51:11.073 187189 DEBUG oslo_concurrency.lockutils [None req-b65df903-b87e-4e87-b681-c591dfd327ef 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Acquiring lock "ec67e8c7-49f4-4ba9-bcf9-edb89515859d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:51:11 compute-0 nova_compute[187185]: 2025-11-29 06:51:11.073 187189 DEBUG oslo_concurrency.lockutils [None req-b65df903-b87e-4e87-b681-c591dfd327ef 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Lock "ec67e8c7-49f4-4ba9-bcf9-edb89515859d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:51:11 compute-0 nova_compute[187185]: 2025-11-29 06:51:11.074 187189 DEBUG oslo_concurrency.lockutils [None req-b65df903-b87e-4e87-b681-c591dfd327ef 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Acquiring lock "ec67e8c7-49f4-4ba9-bcf9-edb89515859d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:51:11 compute-0 nova_compute[187185]: 2025-11-29 06:51:11.074 187189 DEBUG oslo_concurrency.lockutils [None req-b65df903-b87e-4e87-b681-c591dfd327ef 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Lock "ec67e8c7-49f4-4ba9-bcf9-edb89515859d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:51:11 compute-0 nova_compute[187185]: 2025-11-29 06:51:11.074 187189 DEBUG oslo_concurrency.lockutils [None req-b65df903-b87e-4e87-b681-c591dfd327ef 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Lock "ec67e8c7-49f4-4ba9-bcf9-edb89515859d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:51:11 compute-0 nova_compute[187185]: 2025-11-29 06:51:11.087 187189 INFO nova.compute.manager [None req-b65df903-b87e-4e87-b681-c591dfd327ef 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] Terminating instance
Nov 29 06:51:11 compute-0 nova_compute[187185]: 2025-11-29 06:51:11.097 187189 DEBUG oslo_concurrency.lockutils [None req-b65df903-b87e-4e87-b681-c591dfd327ef 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Acquiring lock "refresh_cache-ec67e8c7-49f4-4ba9-bcf9-edb89515859d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:51:11 compute-0 nova_compute[187185]: 2025-11-29 06:51:11.098 187189 DEBUG oslo_concurrency.lockutils [None req-b65df903-b87e-4e87-b681-c591dfd327ef 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Acquired lock "refresh_cache-ec67e8c7-49f4-4ba9-bcf9-edb89515859d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:51:11 compute-0 nova_compute[187185]: 2025-11-29 06:51:11.098 187189 DEBUG nova.network.neutron [None req-b65df903-b87e-4e87-b681-c591dfd327ef 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 06:51:11 compute-0 nova_compute[187185]: 2025-11-29 06:51:11.253 187189 DEBUG nova.network.neutron [None req-b65df903-b87e-4e87-b681-c591dfd327ef 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 06:51:11 compute-0 nova_compute[187185]: 2025-11-29 06:51:11.523 187189 DEBUG nova.network.neutron [None req-b65df903-b87e-4e87-b681-c591dfd327ef 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:51:11 compute-0 nova_compute[187185]: 2025-11-29 06:51:11.551 187189 DEBUG oslo_concurrency.lockutils [None req-b65df903-b87e-4e87-b681-c591dfd327ef 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Releasing lock "refresh_cache-ec67e8c7-49f4-4ba9-bcf9-edb89515859d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:51:11 compute-0 nova_compute[187185]: 2025-11-29 06:51:11.552 187189 DEBUG nova.compute.manager [None req-b65df903-b87e-4e87-b681-c591dfd327ef 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 06:51:11 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000011.scope: Deactivated successfully.
Nov 29 06:51:11 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000011.scope: Consumed 2.617s CPU time.
Nov 29 06:51:11 compute-0 systemd-machined[153486]: Machine qemu-5-instance-00000011 terminated.
Nov 29 06:51:11 compute-0 nova_compute[187185]: 2025-11-29 06:51:11.810 187189 INFO nova.virt.libvirt.driver [-] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] Instance destroyed successfully.
Nov 29 06:51:11 compute-0 nova_compute[187185]: 2025-11-29 06:51:11.811 187189 DEBUG nova.objects.instance [None req-b65df903-b87e-4e87-b681-c591dfd327ef 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Lazy-loading 'resources' on Instance uuid ec67e8c7-49f4-4ba9-bcf9-edb89515859d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:51:11 compute-0 nova_compute[187185]: 2025-11-29 06:51:11.826 187189 INFO nova.virt.libvirt.driver [None req-b65df903-b87e-4e87-b681-c591dfd327ef 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] Deleting instance files /var/lib/nova/instances/ec67e8c7-49f4-4ba9-bcf9-edb89515859d_del
Nov 29 06:51:11 compute-0 nova_compute[187185]: 2025-11-29 06:51:11.828 187189 INFO nova.virt.libvirt.driver [None req-b65df903-b87e-4e87-b681-c591dfd327ef 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] Deletion of /var/lib/nova/instances/ec67e8c7-49f4-4ba9-bcf9-edb89515859d_del complete
Nov 29 06:51:11 compute-0 nova_compute[187185]: 2025-11-29 06:51:11.918 187189 INFO nova.compute.manager [None req-b65df903-b87e-4e87-b681-c591dfd327ef 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] Took 0.36 seconds to destroy the instance on the hypervisor.
Nov 29 06:51:11 compute-0 nova_compute[187185]: 2025-11-29 06:51:11.919 187189 DEBUG oslo.service.loopingcall [None req-b65df903-b87e-4e87-b681-c591dfd327ef 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 06:51:11 compute-0 nova_compute[187185]: 2025-11-29 06:51:11.920 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:51:11 compute-0 nova_compute[187185]: 2025-11-29 06:51:11.922 187189 DEBUG nova.compute.manager [-] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 06:51:11 compute-0 nova_compute[187185]: 2025-11-29 06:51:11.923 187189 DEBUG nova.network.neutron [-] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 06:51:12 compute-0 nova_compute[187185]: 2025-11-29 06:51:12.116 187189 DEBUG nova.network.neutron [-] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 06:51:12 compute-0 nova_compute[187185]: 2025-11-29 06:51:12.262 187189 DEBUG nova.network.neutron [-] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:51:12 compute-0 nova_compute[187185]: 2025-11-29 06:51:12.281 187189 INFO nova.compute.manager [-] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] Took 0.36 seconds to deallocate network for instance.
Nov 29 06:51:12 compute-0 nova_compute[187185]: 2025-11-29 06:51:12.351 187189 DEBUG oslo_concurrency.lockutils [None req-b65df903-b87e-4e87-b681-c591dfd327ef 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:51:12 compute-0 nova_compute[187185]: 2025-11-29 06:51:12.351 187189 DEBUG oslo_concurrency.lockutils [None req-b65df903-b87e-4e87-b681-c591dfd327ef 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:51:12 compute-0 nova_compute[187185]: 2025-11-29 06:51:12.395 187189 DEBUG nova.compute.provider_tree [None req-b65df903-b87e-4e87-b681-c591dfd327ef 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:51:12 compute-0 nova_compute[187185]: 2025-11-29 06:51:12.413 187189 DEBUG nova.scheduler.client.report [None req-b65df903-b87e-4e87-b681-c591dfd327ef 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:51:12 compute-0 nova_compute[187185]: 2025-11-29 06:51:12.434 187189 DEBUG oslo_concurrency.lockutils [None req-b65df903-b87e-4e87-b681-c591dfd327ef 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.083s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:51:12 compute-0 nova_compute[187185]: 2025-11-29 06:51:12.472 187189 INFO nova.scheduler.client.report [None req-b65df903-b87e-4e87-b681-c591dfd327ef 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Deleted allocations for instance ec67e8c7-49f4-4ba9-bcf9-edb89515859d
Nov 29 06:51:12 compute-0 nova_compute[187185]: 2025-11-29 06:51:12.595 187189 DEBUG oslo_concurrency.lockutils [None req-b65df903-b87e-4e87-b681-c591dfd327ef 3f3456486e4244088e418ed04de4f32a 6aaa2c8f4690456c89e6d3a1335b3abf - - default default] Lock "ec67e8c7-49f4-4ba9-bcf9-edb89515859d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.522s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:51:12 compute-0 nova_compute[187185]: 2025-11-29 06:51:12.649 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:51:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:51:16.154 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 06:51:16 compute-0 nova_compute[187185]: 2025-11-29 06:51:16.156 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:51:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:51:16.156 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 06:51:16 compute-0 nova_compute[187185]: 2025-11-29 06:51:16.959 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:51:17 compute-0 nova_compute[187185]: 2025-11-29 06:51:17.653 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:51:19 compute-0 podman[215691]: 2025-11-29 06:51:19.817578558 +0000 UTC m=+0.080691021 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., version=9.6, config_id=edpm, io.buildah.version=1.33.7, io.openshift.expose-services=, managed_by=edpm_ansible, release=1755695350, distribution-scope=public, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 29 06:51:19 compute-0 podman[215692]: 2025-11-29 06:51:19.837284002 +0000 UTC m=+0.090024633 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 06:51:19 compute-0 podman[215736]: 2025-11-29 06:51:19.938302753 +0000 UTC m=+0.075092153 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 29 06:51:22 compute-0 nova_compute[187185]: 2025-11-29 06:51:22.007 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:51:22 compute-0 nova_compute[187185]: 2025-11-29 06:51:22.656 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:51:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:51:24.159 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:51:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:51:24.810 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:51:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:51:24.811 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:51:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:51:24.811 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:51:26 compute-0 nova_compute[187185]: 2025-11-29 06:51:26.809 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399071.8075159, ec67e8c7-49f4-4ba9-bcf9-edb89515859d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:51:26 compute-0 nova_compute[187185]: 2025-11-29 06:51:26.809 187189 INFO nova.compute.manager [-] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] VM Stopped (Lifecycle Event)
Nov 29 06:51:26 compute-0 podman[215753]: 2025-11-29 06:51:26.820688509 +0000 UTC m=+0.086904676 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:51:26 compute-0 nova_compute[187185]: 2025-11-29 06:51:26.836 187189 DEBUG nova.compute.manager [None req-3a30f2c9-d4a3-4d58-843c-3597a40576bd - - - - - -] [instance: ec67e8c7-49f4-4ba9-bcf9-edb89515859d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:51:27 compute-0 nova_compute[187185]: 2025-11-29 06:51:27.009 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:51:27 compute-0 nova_compute[187185]: 2025-11-29 06:51:27.659 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:51:28 compute-0 podman[215780]: 2025-11-29 06:51:28.801023014 +0000 UTC m=+0.060698659 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 06:51:28 compute-0 podman[215779]: 2025-11-29 06:51:28.804942144 +0000 UTC m=+0.067633323 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 06:51:32 compute-0 nova_compute[187185]: 2025-11-29 06:51:32.011 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:51:32 compute-0 nova_compute[187185]: 2025-11-29 06:51:32.690 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:51:34 compute-0 nova_compute[187185]: 2025-11-29 06:51:34.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:51:34 compute-0 nova_compute[187185]: 2025-11-29 06:51:34.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 29 06:51:37 compute-0 nova_compute[187185]: 2025-11-29 06:51:37.051 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:51:37 compute-0 nova_compute[187185]: 2025-11-29 06:51:37.692 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:51:37 compute-0 podman[215821]: 2025-11-29 06:51:37.812029758 +0000 UTC m=+0.081384701 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 06:51:39 compute-0 nova_compute[187185]: 2025-11-29 06:51:39.350 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:51:39 compute-0 nova_compute[187185]: 2025-11-29 06:51:39.352 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 06:51:39 compute-0 nova_compute[187185]: 2025-11-29 06:51:39.353 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 06:51:39 compute-0 nova_compute[187185]: 2025-11-29 06:51:39.456 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 06:51:39 compute-0 sshd-session[215842]: Received disconnect from 1.214.197.163 port 44604:11: Bye Bye [preauth]
Nov 29 06:51:39 compute-0 sshd-session[215842]: Disconnected from authenticating user root 1.214.197.163 port 44604 [preauth]
Nov 29 06:51:40 compute-0 sshd-session[215844]: Invalid user bitnami from 179.125.24.202 port 40170
Nov 29 06:51:40 compute-0 sshd-session[215844]: Received disconnect from 179.125.24.202 port 40170:11: Bye Bye [preauth]
Nov 29 06:51:40 compute-0 sshd-session[215844]: Disconnected from invalid user bitnami 179.125.24.202 port 40170 [preauth]
Nov 29 06:51:40 compute-0 nova_compute[187185]: 2025-11-29 06:51:40.416 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:51:41 compute-0 nova_compute[187185]: 2025-11-29 06:51:41.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:51:41 compute-0 nova_compute[187185]: 2025-11-29 06:51:41.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:51:42 compute-0 nova_compute[187185]: 2025-11-29 06:51:42.082 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:51:42 compute-0 nova_compute[187185]: 2025-11-29 06:51:42.311 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:51:42 compute-0 nova_compute[187185]: 2025-11-29 06:51:42.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:51:42 compute-0 nova_compute[187185]: 2025-11-29 06:51:42.695 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:51:43 compute-0 nova_compute[187185]: 2025-11-29 06:51:43.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:51:43 compute-0 nova_compute[187185]: 2025-11-29 06:51:43.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:51:44 compute-0 nova_compute[187185]: 2025-11-29 06:51:44.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:51:44 compute-0 nova_compute[187185]: 2025-11-29 06:51:44.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 06:51:44 compute-0 nova_compute[187185]: 2025-11-29 06:51:44.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:51:44 compute-0 nova_compute[187185]: 2025-11-29 06:51:44.356 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:51:44 compute-0 nova_compute[187185]: 2025-11-29 06:51:44.357 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:51:44 compute-0 nova_compute[187185]: 2025-11-29 06:51:44.358 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:51:44 compute-0 nova_compute[187185]: 2025-11-29 06:51:44.358 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 06:51:44 compute-0 nova_compute[187185]: 2025-11-29 06:51:44.666 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:51:44 compute-0 nova_compute[187185]: 2025-11-29 06:51:44.667 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5791MB free_disk=73.33899688720703GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 06:51:44 compute-0 nova_compute[187185]: 2025-11-29 06:51:44.667 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:51:44 compute-0 nova_compute[187185]: 2025-11-29 06:51:44.667 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:51:44 compute-0 nova_compute[187185]: 2025-11-29 06:51:44.912 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 06:51:44 compute-0 nova_compute[187185]: 2025-11-29 06:51:44.913 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 06:51:44 compute-0 nova_compute[187185]: 2025-11-29 06:51:44.949 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:51:44 compute-0 nova_compute[187185]: 2025-11-29 06:51:44.986 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:51:45 compute-0 nova_compute[187185]: 2025-11-29 06:51:45.051 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 06:51:45 compute-0 nova_compute[187185]: 2025-11-29 06:51:45.052 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.385s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:51:45 compute-0 nova_compute[187185]: 2025-11-29 06:51:45.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:51:45 compute-0 nova_compute[187185]: 2025-11-29 06:51:45.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 29 06:51:45 compute-0 nova_compute[187185]: 2025-11-29 06:51:45.352 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 29 06:51:47 compute-0 nova_compute[187185]: 2025-11-29 06:51:47.117 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:51:47 compute-0 nova_compute[187185]: 2025-11-29 06:51:47.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:51:47 compute-0 nova_compute[187185]: 2025-11-29 06:51:47.698 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:51:50 compute-0 podman[215847]: 2025-11-29 06:51:50.890130386 +0000 UTC m=+0.143189168 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 29 06:51:50 compute-0 podman[215848]: 2025-11-29 06:51:50.906032374 +0000 UTC m=+0.158770037 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., distribution-scope=public, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, io.openshift.expose-services=)
Nov 29 06:51:50 compute-0 podman[215849]: 2025-11-29 06:51:50.916050145 +0000 UTC m=+0.152551221 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 06:51:52 compute-0 nova_compute[187185]: 2025-11-29 06:51:52.119 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:51:52 compute-0 nova_compute[187185]: 2025-11-29 06:51:52.702 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:51:57 compute-0 nova_compute[187185]: 2025-11-29 06:51:57.163 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:51:57 compute-0 nova_compute[187185]: 2025-11-29 06:51:57.706 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:51:57 compute-0 podman[215909]: 2025-11-29 06:51:57.896305814 +0000 UTC m=+0.150552188 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Nov 29 06:51:59 compute-0 podman[215936]: 2025-11-29 06:51:59.844672808 +0000 UTC m=+0.088953333 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 06:51:59 compute-0 podman[215935]: 2025-11-29 06:51:59.860710926 +0000 UTC m=+0.111568471 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, 
org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:52:02 compute-0 nova_compute[187185]: 2025-11-29 06:52:02.166 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:02 compute-0 nova_compute[187185]: 2025-11-29 06:52:02.709 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:07 compute-0 nova_compute[187185]: 2025-11-29 06:52:07.223 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:07 compute-0 nova_compute[187185]: 2025-11-29 06:52:07.718 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:08 compute-0 podman[215979]: 2025-11-29 06:52:08.865863508 +0000 UTC m=+0.091752570 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 06:52:12 compute-0 nova_compute[187185]: 2025-11-29 06:52:12.240 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:12 compute-0 nova_compute[187185]: 2025-11-29 06:52:12.726 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:17 compute-0 nova_compute[187185]: 2025-11-29 06:52:17.305 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:17 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:17.482 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 06:52:17 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:17.483 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 06:52:17 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:17.485 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:52:17 compute-0 nova_compute[187185]: 2025-11-29 06:52:17.485 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:17 compute-0 nova_compute[187185]: 2025-11-29 06:52:17.728 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:19 compute-0 sshd-session[215999]: Received disconnect from 160.202.8.218 port 36190:11: Bye Bye [preauth]
Nov 29 06:52:19 compute-0 sshd-session[215999]: Disconnected from authenticating user root 160.202.8.218 port 36190 [preauth]
Nov 29 06:52:21 compute-0 podman[216001]: 2025-11-29 06:52:21.802875007 +0000 UTC m=+0.066976622 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 06:52:21 compute-0 podman[216003]: 2025-11-29 06:52:21.808383858 +0000 UTC m=+0.066484639 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 06:52:21 compute-0 podman[216002]: 2025-11-29 06:52:21.811167894 +0000 UTC m=+0.071586039 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.expose-services=, version=9.6, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc.)
Nov 29 06:52:22 compute-0 nova_compute[187185]: 2025-11-29 06:52:22.372 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:22 compute-0 nova_compute[187185]: 2025-11-29 06:52:22.730 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:24.810 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:52:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:24.811 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:52:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:24.811 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:52:25 compute-0 nova_compute[187185]: 2025-11-29 06:52:25.107 187189 DEBUG oslo_concurrency.lockutils [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Acquiring lock "942e977f-fd74-45d1-b0be-661b15431eca" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:52:25 compute-0 nova_compute[187185]: 2025-11-29 06:52:25.107 187189 DEBUG oslo_concurrency.lockutils [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Lock "942e977f-fd74-45d1-b0be-661b15431eca" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:52:25 compute-0 nova_compute[187185]: 2025-11-29 06:52:25.124 187189 DEBUG nova.compute.manager [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 06:52:25 compute-0 nova_compute[187185]: 2025-11-29 06:52:25.338 187189 DEBUG oslo_concurrency.lockutils [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:52:25 compute-0 nova_compute[187185]: 2025-11-29 06:52:25.339 187189 DEBUG oslo_concurrency.lockutils [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:52:25 compute-0 nova_compute[187185]: 2025-11-29 06:52:25.349 187189 DEBUG nova.virt.hardware [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 06:52:25 compute-0 nova_compute[187185]: 2025-11-29 06:52:25.349 187189 INFO nova.compute.claims [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Claim successful on node compute-0.ctlplane.example.com
Nov 29 06:52:25 compute-0 nova_compute[187185]: 2025-11-29 06:52:25.636 187189 DEBUG nova.compute.provider_tree [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:52:25 compute-0 nova_compute[187185]: 2025-11-29 06:52:25.653 187189 DEBUG nova.scheduler.client.report [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:52:25 compute-0 nova_compute[187185]: 2025-11-29 06:52:25.679 187189 DEBUG oslo_concurrency.lockutils [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.341s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:52:25 compute-0 nova_compute[187185]: 2025-11-29 06:52:25.680 187189 DEBUG nova.compute.manager [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 06:52:25 compute-0 nova_compute[187185]: 2025-11-29 06:52:25.748 187189 DEBUG nova.compute.manager [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 06:52:25 compute-0 nova_compute[187185]: 2025-11-29 06:52:25.748 187189 DEBUG nova.network.neutron [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 06:52:25 compute-0 nova_compute[187185]: 2025-11-29 06:52:25.770 187189 INFO nova.virt.libvirt.driver [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 06:52:25 compute-0 nova_compute[187185]: 2025-11-29 06:52:25.786 187189 DEBUG nova.compute.manager [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 06:52:25 compute-0 nova_compute[187185]: 2025-11-29 06:52:25.885 187189 DEBUG nova.compute.manager [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 06:52:25 compute-0 nova_compute[187185]: 2025-11-29 06:52:25.887 187189 DEBUG nova.virt.libvirt.driver [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 06:52:25 compute-0 nova_compute[187185]: 2025-11-29 06:52:25.887 187189 INFO nova.virt.libvirt.driver [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Creating image(s)
Nov 29 06:52:25 compute-0 nova_compute[187185]: 2025-11-29 06:52:25.888 187189 DEBUG oslo_concurrency.lockutils [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Acquiring lock "/var/lib/nova/instances/942e977f-fd74-45d1-b0be-661b15431eca/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:52:25 compute-0 nova_compute[187185]: 2025-11-29 06:52:25.888 187189 DEBUG oslo_concurrency.lockutils [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Lock "/var/lib/nova/instances/942e977f-fd74-45d1-b0be-661b15431eca/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:52:25 compute-0 nova_compute[187185]: 2025-11-29 06:52:25.889 187189 DEBUG oslo_concurrency.lockutils [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Lock "/var/lib/nova/instances/942e977f-fd74-45d1-b0be-661b15431eca/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:52:25 compute-0 nova_compute[187185]: 2025-11-29 06:52:25.901 187189 DEBUG oslo_concurrency.processutils [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:52:25 compute-0 nova_compute[187185]: 2025-11-29 06:52:25.971 187189 DEBUG oslo_concurrency.processutils [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:52:25 compute-0 nova_compute[187185]: 2025-11-29 06:52:25.972 187189 DEBUG oslo_concurrency.lockutils [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:52:25 compute-0 nova_compute[187185]: 2025-11-29 06:52:25.973 187189 DEBUG oslo_concurrency.lockutils [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:52:25 compute-0 nova_compute[187185]: 2025-11-29 06:52:25.983 187189 DEBUG oslo_concurrency.processutils [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:52:26 compute-0 nova_compute[187185]: 2025-11-29 06:52:26.038 187189 DEBUG oslo_concurrency.processutils [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:52:26 compute-0 nova_compute[187185]: 2025-11-29 06:52:26.039 187189 DEBUG oslo_concurrency.processutils [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/942e977f-fd74-45d1-b0be-661b15431eca/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:52:26 compute-0 nova_compute[187185]: 2025-11-29 06:52:26.074 187189 DEBUG oslo_concurrency.processutils [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/942e977f-fd74-45d1-b0be-661b15431eca/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:52:26 compute-0 nova_compute[187185]: 2025-11-29 06:52:26.075 187189 DEBUG oslo_concurrency.lockutils [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:52:26 compute-0 nova_compute[187185]: 2025-11-29 06:52:26.076 187189 DEBUG oslo_concurrency.processutils [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:52:26 compute-0 nova_compute[187185]: 2025-11-29 06:52:26.138 187189 DEBUG oslo_concurrency.processutils [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:52:26 compute-0 nova_compute[187185]: 2025-11-29 06:52:26.139 187189 DEBUG nova.virt.disk.api [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Checking if we can resize image /var/lib/nova/instances/942e977f-fd74-45d1-b0be-661b15431eca/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 06:52:26 compute-0 nova_compute[187185]: 2025-11-29 06:52:26.140 187189 DEBUG oslo_concurrency.processutils [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/942e977f-fd74-45d1-b0be-661b15431eca/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:52:26 compute-0 nova_compute[187185]: 2025-11-29 06:52:26.160 187189 DEBUG nova.policy [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 06:52:26 compute-0 nova_compute[187185]: 2025-11-29 06:52:26.197 187189 DEBUG oslo_concurrency.processutils [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/942e977f-fd74-45d1-b0be-661b15431eca/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:52:26 compute-0 nova_compute[187185]: 2025-11-29 06:52:26.197 187189 DEBUG nova.virt.disk.api [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Cannot resize image /var/lib/nova/instances/942e977f-fd74-45d1-b0be-661b15431eca/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 06:52:26 compute-0 nova_compute[187185]: 2025-11-29 06:52:26.198 187189 DEBUG nova.objects.instance [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Lazy-loading 'migration_context' on Instance uuid 942e977f-fd74-45d1-b0be-661b15431eca obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:52:26 compute-0 nova_compute[187185]: 2025-11-29 06:52:26.214 187189 DEBUG nova.virt.libvirt.driver [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 06:52:26 compute-0 nova_compute[187185]: 2025-11-29 06:52:26.214 187189 DEBUG nova.virt.libvirt.driver [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Ensure instance console log exists: /var/lib/nova/instances/942e977f-fd74-45d1-b0be-661b15431eca/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 06:52:26 compute-0 nova_compute[187185]: 2025-11-29 06:52:26.215 187189 DEBUG oslo_concurrency.lockutils [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:52:26 compute-0 nova_compute[187185]: 2025-11-29 06:52:26.215 187189 DEBUG oslo_concurrency.lockutils [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:52:26 compute-0 nova_compute[187185]: 2025-11-29 06:52:26.215 187189 DEBUG oslo_concurrency.lockutils [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:52:27 compute-0 nova_compute[187185]: 2025-11-29 06:52:27.415 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:27 compute-0 nova_compute[187185]: 2025-11-29 06:52:27.732 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:28 compute-0 nova_compute[187185]: 2025-11-29 06:52:28.579 187189 DEBUG nova.network.neutron [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Successfully created port: 586fc8d7-18ba-4421-a518-60f4d0a6950c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 06:52:28 compute-0 nova_compute[187185]: 2025-11-29 06:52:28.588 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:52:28 compute-0 nova_compute[187185]: 2025-11-29 06:52:28.635 187189 WARNING nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] While synchronizing instance power states, found 1 instances in the database and 0 instances on the hypervisor.
Nov 29 06:52:28 compute-0 nova_compute[187185]: 2025-11-29 06:52:28.636 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Triggering sync for uuid 942e977f-fd74-45d1-b0be-661b15431eca _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Nov 29 06:52:28 compute-0 nova_compute[187185]: 2025-11-29 06:52:28.636 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "942e977f-fd74-45d1-b0be-661b15431eca" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:52:28 compute-0 podman[216077]: 2025-11-29 06:52:28.876614565 +0000 UTC m=+0.140074761 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Nov 29 06:52:30 compute-0 podman[216105]: 2025-11-29 06:52:30.801821627 +0000 UTC m=+0.066294833 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 06:52:30 compute-0 nova_compute[187185]: 2025-11-29 06:52:30.810 187189 DEBUG nova.network.neutron [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Successfully updated port: 586fc8d7-18ba-4421-a518-60f4d0a6950c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 06:52:30 compute-0 nova_compute[187185]: 2025-11-29 06:52:30.825 187189 DEBUG oslo_concurrency.lockutils [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Acquiring lock "refresh_cache-942e977f-fd74-45d1-b0be-661b15431eca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:52:30 compute-0 nova_compute[187185]: 2025-11-29 06:52:30.825 187189 DEBUG oslo_concurrency.lockutils [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Acquired lock "refresh_cache-942e977f-fd74-45d1-b0be-661b15431eca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:52:30 compute-0 nova_compute[187185]: 2025-11-29 06:52:30.825 187189 DEBUG nova.network.neutron [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 06:52:30 compute-0 podman[216104]: 2025-11-29 06:52:30.835177169 +0000 UTC m=+0.102214546 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Nov 29 06:52:30 compute-0 nova_compute[187185]: 2025-11-29 06:52:30.990 187189 DEBUG nova.compute.manager [req-9441b4ee-f27a-400d-af04-20467752bda5 req-e5466404-8d0d-40c0-83dc-b9eac80cb432 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Received event network-changed-586fc8d7-18ba-4421-a518-60f4d0a6950c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:52:30 compute-0 nova_compute[187185]: 2025-11-29 06:52:30.990 187189 DEBUG nova.compute.manager [req-9441b4ee-f27a-400d-af04-20467752bda5 req-e5466404-8d0d-40c0-83dc-b9eac80cb432 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Refreshing instance network info cache due to event network-changed-586fc8d7-18ba-4421-a518-60f4d0a6950c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 06:52:30 compute-0 nova_compute[187185]: 2025-11-29 06:52:30.991 187189 DEBUG oslo_concurrency.lockutils [req-9441b4ee-f27a-400d-af04-20467752bda5 req-e5466404-8d0d-40c0-83dc-b9eac80cb432 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-942e977f-fd74-45d1-b0be-661b15431eca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:52:31 compute-0 nova_compute[187185]: 2025-11-29 06:52:31.095 187189 DEBUG nova.network.neutron [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.108 187189 DEBUG nova.network.neutron [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Updating instance_info_cache with network_info: [{"id": "586fc8d7-18ba-4421-a518-60f4d0a6950c", "address": "fa:16:3e:90:18:7a", "network": {"id": "3c63c551-2e9f-4b47-9e49-c73140efe20a", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-555773636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71af3e88884e42c48fb244d7d6ca31e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap586fc8d7-18", "ovs_interfaceid": "586fc8d7-18ba-4421-a518-60f4d0a6950c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.126 187189 DEBUG oslo_concurrency.lockutils [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Releasing lock "refresh_cache-942e977f-fd74-45d1-b0be-661b15431eca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.126 187189 DEBUG nova.compute.manager [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Instance network_info: |[{"id": "586fc8d7-18ba-4421-a518-60f4d0a6950c", "address": "fa:16:3e:90:18:7a", "network": {"id": "3c63c551-2e9f-4b47-9e49-c73140efe20a", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-555773636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71af3e88884e42c48fb244d7d6ca31e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap586fc8d7-18", "ovs_interfaceid": "586fc8d7-18ba-4421-a518-60f4d0a6950c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.127 187189 DEBUG oslo_concurrency.lockutils [req-9441b4ee-f27a-400d-af04-20467752bda5 req-e5466404-8d0d-40c0-83dc-b9eac80cb432 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-942e977f-fd74-45d1-b0be-661b15431eca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.127 187189 DEBUG nova.network.neutron [req-9441b4ee-f27a-400d-af04-20467752bda5 req-e5466404-8d0d-40c0-83dc-b9eac80cb432 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Refreshing network info cache for port 586fc8d7-18ba-4421-a518-60f4d0a6950c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.129 187189 DEBUG nova.virt.libvirt.driver [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Start _get_guest_xml network_info=[{"id": "586fc8d7-18ba-4421-a518-60f4d0a6950c", "address": "fa:16:3e:90:18:7a", "network": {"id": "3c63c551-2e9f-4b47-9e49-c73140efe20a", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-555773636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71af3e88884e42c48fb244d7d6ca31e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap586fc8d7-18", "ovs_interfaceid": "586fc8d7-18ba-4421-a518-60f4d0a6950c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.135 187189 WARNING nova.virt.libvirt.driver [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.147 187189 DEBUG nova.virt.libvirt.host [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.147 187189 DEBUG nova.virt.libvirt.host [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.151 187189 DEBUG nova.virt.libvirt.host [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.152 187189 DEBUG nova.virt.libvirt.host [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.153 187189 DEBUG nova.virt.libvirt.driver [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.153 187189 DEBUG nova.virt.hardware [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.154 187189 DEBUG nova.virt.hardware [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.154 187189 DEBUG nova.virt.hardware [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.154 187189 DEBUG nova.virt.hardware [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.154 187189 DEBUG nova.virt.hardware [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.154 187189 DEBUG nova.virt.hardware [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.155 187189 DEBUG nova.virt.hardware [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.155 187189 DEBUG nova.virt.hardware [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.155 187189 DEBUG nova.virt.hardware [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.155 187189 DEBUG nova.virt.hardware [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.156 187189 DEBUG nova.virt.hardware [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.159 187189 DEBUG nova.virt.libvirt.vif [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:52:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-1366973524',display_name='tempest-FloatingIPsAssociationTestJSON-server-1366973524',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-1366973524',id=20,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71af3e88884e42c48fb244d7d6ca31e2',ramdisk_id='',reservation_id='r-tesdhtm5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationTestJSON-940149563',owner_user_n
ame='tempest-FloatingIPsAssociationTestJSON-940149563-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:52:25Z,user_data=None,user_id='a0fcd4f4de7e4072be30f7e3d4ac7c77',uuid=942e977f-fd74-45d1-b0be-661b15431eca,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "586fc8d7-18ba-4421-a518-60f4d0a6950c", "address": "fa:16:3e:90:18:7a", "network": {"id": "3c63c551-2e9f-4b47-9e49-c73140efe20a", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-555773636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71af3e88884e42c48fb244d7d6ca31e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap586fc8d7-18", "ovs_interfaceid": "586fc8d7-18ba-4421-a518-60f4d0a6950c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.159 187189 DEBUG nova.network.os_vif_util [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Converting VIF {"id": "586fc8d7-18ba-4421-a518-60f4d0a6950c", "address": "fa:16:3e:90:18:7a", "network": {"id": "3c63c551-2e9f-4b47-9e49-c73140efe20a", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-555773636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71af3e88884e42c48fb244d7d6ca31e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap586fc8d7-18", "ovs_interfaceid": "586fc8d7-18ba-4421-a518-60f4d0a6950c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.160 187189 DEBUG nova.network.os_vif_util [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:90:18:7a,bridge_name='br-int',has_traffic_filtering=True,id=586fc8d7-18ba-4421-a518-60f4d0a6950c,network=Network(3c63c551-2e9f-4b47-9e49-c73140efe20a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap586fc8d7-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.161 187189 DEBUG nova.objects.instance [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Lazy-loading 'pci_devices' on Instance uuid 942e977f-fd74-45d1-b0be-661b15431eca obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.184 187189 DEBUG nova.virt.libvirt.driver [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] End _get_guest_xml xml=<domain type="kvm">
Nov 29 06:52:32 compute-0 nova_compute[187185]:   <uuid>942e977f-fd74-45d1-b0be-661b15431eca</uuid>
Nov 29 06:52:32 compute-0 nova_compute[187185]:   <name>instance-00000014</name>
Nov 29 06:52:32 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 06:52:32 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 06:52:32 compute-0 nova_compute[187185]:   <metadata>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 06:52:32 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:       <nova:name>tempest-FloatingIPsAssociationTestJSON-server-1366973524</nova:name>
Nov 29 06:52:32 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 06:52:32</nova:creationTime>
Nov 29 06:52:32 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 06:52:32 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 06:52:32 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 06:52:32 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 06:52:32 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 06:52:32 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 06:52:32 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 06:52:32 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 06:52:32 compute-0 nova_compute[187185]:         <nova:user uuid="a0fcd4f4de7e4072be30f7e3d4ac7c77">tempest-FloatingIPsAssociationTestJSON-940149563-project-member</nova:user>
Nov 29 06:52:32 compute-0 nova_compute[187185]:         <nova:project uuid="71af3e88884e42c48fb244d7d6ca31e2">tempest-FloatingIPsAssociationTestJSON-940149563</nova:project>
Nov 29 06:52:32 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 06:52:32 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 06:52:32 compute-0 nova_compute[187185]:         <nova:port uuid="586fc8d7-18ba-4421-a518-60f4d0a6950c">
Nov 29 06:52:32 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 06:52:32 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 06:52:32 compute-0 nova_compute[187185]:   </metadata>
Nov 29 06:52:32 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <system>
Nov 29 06:52:32 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 06:52:32 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 06:52:32 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 06:52:32 compute-0 nova_compute[187185]:       <entry name="serial">942e977f-fd74-45d1-b0be-661b15431eca</entry>
Nov 29 06:52:32 compute-0 nova_compute[187185]:       <entry name="uuid">942e977f-fd74-45d1-b0be-661b15431eca</entry>
Nov 29 06:52:32 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     </system>
Nov 29 06:52:32 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 06:52:32 compute-0 nova_compute[187185]:   <os>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:   </os>
Nov 29 06:52:32 compute-0 nova_compute[187185]:   <features>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <apic/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:   </features>
Nov 29 06:52:32 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:   </clock>
Nov 29 06:52:32 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:   </cpu>
Nov 29 06:52:32 compute-0 nova_compute[187185]:   <devices>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 06:52:32 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/942e977f-fd74-45d1-b0be-661b15431eca/disk"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     </disk>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 06:52:32 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/942e977f-fd74-45d1-b0be-661b15431eca/disk.config"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     </disk>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 06:52:32 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:90:18:7a"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:       <target dev="tap586fc8d7-18"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     </interface>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 06:52:32 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/942e977f-fd74-45d1-b0be-661b15431eca/console.log" append="off"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     </serial>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <video>
Nov 29 06:52:32 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     </video>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 06:52:32 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     </rng>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 06:52:32 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 06:52:32 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 06:52:32 compute-0 nova_compute[187185]:   </devices>
Nov 29 06:52:32 compute-0 nova_compute[187185]: </domain>
Nov 29 06:52:32 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.185 187189 DEBUG nova.compute.manager [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Preparing to wait for external event network-vif-plugged-586fc8d7-18ba-4421-a518-60f4d0a6950c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.185 187189 DEBUG oslo_concurrency.lockutils [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Acquiring lock "942e977f-fd74-45d1-b0be-661b15431eca-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.185 187189 DEBUG oslo_concurrency.lockutils [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Lock "942e977f-fd74-45d1-b0be-661b15431eca-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.186 187189 DEBUG oslo_concurrency.lockutils [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Lock "942e977f-fd74-45d1-b0be-661b15431eca-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.186 187189 DEBUG nova.virt.libvirt.vif [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:52:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-1366973524',display_name='tempest-FloatingIPsAssociationTestJSON-server-1366973524',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-1366973524',id=20,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71af3e88884e42c48fb244d7d6ca31e2',ramdisk_id='',reservation_id='r-tesdhtm5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationTestJSON-940149563',ow
ner_user_name='tempest-FloatingIPsAssociationTestJSON-940149563-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:52:25Z,user_data=None,user_id='a0fcd4f4de7e4072be30f7e3d4ac7c77',uuid=942e977f-fd74-45d1-b0be-661b15431eca,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "586fc8d7-18ba-4421-a518-60f4d0a6950c", "address": "fa:16:3e:90:18:7a", "network": {"id": "3c63c551-2e9f-4b47-9e49-c73140efe20a", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-555773636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71af3e88884e42c48fb244d7d6ca31e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap586fc8d7-18", "ovs_interfaceid": "586fc8d7-18ba-4421-a518-60f4d0a6950c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.187 187189 DEBUG nova.network.os_vif_util [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Converting VIF {"id": "586fc8d7-18ba-4421-a518-60f4d0a6950c", "address": "fa:16:3e:90:18:7a", "network": {"id": "3c63c551-2e9f-4b47-9e49-c73140efe20a", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-555773636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71af3e88884e42c48fb244d7d6ca31e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap586fc8d7-18", "ovs_interfaceid": "586fc8d7-18ba-4421-a518-60f4d0a6950c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.187 187189 DEBUG nova.network.os_vif_util [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:90:18:7a,bridge_name='br-int',has_traffic_filtering=True,id=586fc8d7-18ba-4421-a518-60f4d0a6950c,network=Network(3c63c551-2e9f-4b47-9e49-c73140efe20a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap586fc8d7-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.187 187189 DEBUG os_vif [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:90:18:7a,bridge_name='br-int',has_traffic_filtering=True,id=586fc8d7-18ba-4421-a518-60f4d0a6950c,network=Network(3c63c551-2e9f-4b47-9e49-c73140efe20a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap586fc8d7-18') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.188 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.188 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.189 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.195 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.196 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap586fc8d7-18, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.196 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap586fc8d7-18, col_values=(('external_ids', {'iface-id': '586fc8d7-18ba-4421-a518-60f4d0a6950c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:90:18:7a', 'vm-uuid': '942e977f-fd74-45d1-b0be-661b15431eca'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.198 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:32 compute-0 NetworkManager[55227]: <info>  [1764399152.2003] manager: (tap586fc8d7-18): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/31)
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.202 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.211 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.212 187189 INFO os_vif [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:90:18:7a,bridge_name='br-int',has_traffic_filtering=True,id=586fc8d7-18ba-4421-a518-60f4d0a6950c,network=Network(3c63c551-2e9f-4b47-9e49-c73140efe20a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap586fc8d7-18')
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.451 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.649 187189 DEBUG nova.virt.libvirt.driver [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.649 187189 DEBUG nova.virt.libvirt.driver [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.650 187189 DEBUG nova.virt.libvirt.driver [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] No VIF found with MAC fa:16:3e:90:18:7a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 06:52:32 compute-0 nova_compute[187185]: 2025-11-29 06:52:32.650 187189 INFO nova.virt.libvirt.driver [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Using config drive
Nov 29 06:52:33 compute-0 nova_compute[187185]: 2025-11-29 06:52:33.422 187189 INFO nova.virt.libvirt.driver [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Creating config drive at /var/lib/nova/instances/942e977f-fd74-45d1-b0be-661b15431eca/disk.config
Nov 29 06:52:33 compute-0 nova_compute[187185]: 2025-11-29 06:52:33.427 187189 DEBUG oslo_concurrency.processutils [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/942e977f-fd74-45d1-b0be-661b15431eca/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9q9gb8tn execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:52:33 compute-0 nova_compute[187185]: 2025-11-29 06:52:33.558 187189 DEBUG oslo_concurrency.processutils [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/942e977f-fd74-45d1-b0be-661b15431eca/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9q9gb8tn" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:52:33 compute-0 kernel: tap586fc8d7-18: entered promiscuous mode
Nov 29 06:52:33 compute-0 NetworkManager[55227]: <info>  [1764399153.6339] manager: (tap586fc8d7-18): new Tun device (/org/freedesktop/NetworkManager/Devices/32)
Nov 29 06:52:33 compute-0 systemd-udevd[216164]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 06:52:33 compute-0 NetworkManager[55227]: <info>  [1764399153.6827] device (tap586fc8d7-18): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 06:52:33 compute-0 NetworkManager[55227]: <info>  [1764399153.6844] device (tap586fc8d7-18): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 06:52:33 compute-0 ovn_controller[95281]: 2025-11-29T06:52:33Z|00052|binding|INFO|Claiming lport 586fc8d7-18ba-4421-a518-60f4d0a6950c for this chassis.
Nov 29 06:52:33 compute-0 ovn_controller[95281]: 2025-11-29T06:52:33Z|00053|binding|INFO|586fc8d7-18ba-4421-a518-60f4d0a6950c: Claiming fa:16:3e:90:18:7a 10.100.0.13
Nov 29 06:52:33 compute-0 nova_compute[187185]: 2025-11-29 06:52:33.715 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:33.731 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:90:18:7a 10.100.0.13'], port_security=['fa:16:3e:90:18:7a 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '942e977f-fd74-45d1-b0be-661b15431eca', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3c63c551-2e9f-4b47-9e49-c73140efe20a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5f562e81-d2bf-4e2c-b0ea-0aa5dfe52d68', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=86b613b3-d246-4b07-a5b7-9ab1b7da74dc, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=586fc8d7-18ba-4421-a518-60f4d0a6950c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 06:52:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:33.732 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 586fc8d7-18ba-4421-a518-60f4d0a6950c in datapath 3c63c551-2e9f-4b47-9e49-c73140efe20a bound to our chassis
Nov 29 06:52:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:33.734 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3c63c551-2e9f-4b47-9e49-c73140efe20a
Nov 29 06:52:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:33.746 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[fcc18990-0476-41c4-9cc7-9fe2435b49b5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:52:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:33.748 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3c63c551-21 in ovnmeta-3c63c551-2e9f-4b47-9e49-c73140efe20a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 06:52:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:33.750 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3c63c551-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 06:52:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:33.751 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[1b52112a-391b-44f4-a6a9-187483371a21]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:52:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:33.753 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[fc831b63-2382-4b8f-bf4b-07b78cd1eec0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:52:33 compute-0 systemd-machined[153486]: New machine qemu-6-instance-00000014.
Nov 29 06:52:33 compute-0 nova_compute[187185]: 2025-11-29 06:52:33.777 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:33 compute-0 ovn_controller[95281]: 2025-11-29T06:52:33Z|00054|binding|INFO|Setting lport 586fc8d7-18ba-4421-a518-60f4d0a6950c ovn-installed in OVS
Nov 29 06:52:33 compute-0 ovn_controller[95281]: 2025-11-29T06:52:33Z|00055|binding|INFO|Setting lport 586fc8d7-18ba-4421-a518-60f4d0a6950c up in Southbound
Nov 29 06:52:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:33.781 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[c02d9fa9-f8b6-4ef0-91c9-be70fc809583]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:52:33 compute-0 systemd[1]: Started Virtual Machine qemu-6-instance-00000014.
Nov 29 06:52:33 compute-0 nova_compute[187185]: 2025-11-29 06:52:33.784 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:33.803 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[71c851f0-d252-412f-bcf4-3facead1ca9c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:52:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:33.838 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[722a4015-f07f-424e-8c41-b781e4e83bfe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:52:33 compute-0 systemd-udevd[216166]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 06:52:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:33.852 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[7bc6c6c2-7f11-4e63-aa9c-84cf734b2dda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:52:33 compute-0 NetworkManager[55227]: <info>  [1764399153.8611] manager: (tap3c63c551-20): new Veth device (/org/freedesktop/NetworkManager/Devices/33)
Nov 29 06:52:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:33.896 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[e79d4cd4-08bc-4114-8030-6cb1172bedd6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:52:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:33.901 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[23517c2c-cd43-40b1-96e2-fce124df7dd8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:52:33 compute-0 NetworkManager[55227]: <info>  [1764399153.9215] device (tap3c63c551-20): carrier: link connected
Nov 29 06:52:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:33.926 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[2f7e09a9-1963-4d93-9014-89a6a253f085]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:52:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:33.941 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[62fcb978-0030-47ec-9f8f-8ee0b8d37cd7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3c63c551-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9b:20:eb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 19], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 460158, 'reachable_time': 15590, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216201, 'error': None, 'target': 'ovnmeta-3c63c551-2e9f-4b47-9e49-c73140efe20a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:52:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:33.955 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[cb1ba347-8be5-4912-95a6-6e35a60759e0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9b:20eb'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 460158, 'tstamp': 460158}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216202, 'error': None, 'target': 'ovnmeta-3c63c551-2e9f-4b47-9e49-c73140efe20a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:52:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:33.973 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[5cf57b34-1bca-4c0b-a09a-b82643f605a6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3c63c551-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9b:20:eb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 19], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 460158, 'reachable_time': 15590, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216203, 'error': None, 'target': 'ovnmeta-3c63c551-2e9f-4b47-9e49-c73140efe20a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:52:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:34.001 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d5beefdf-164f-4b9f-a8c0-3e19629f38ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:52:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:34.053 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[f6ae2baa-f2cb-47fe-a90e-07c6a578f2ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:52:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:34.054 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3c63c551-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:52:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:34.055 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 06:52:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:34.055 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3c63c551-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:52:34 compute-0 nova_compute[187185]: 2025-11-29 06:52:34.057 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:34 compute-0 kernel: tap3c63c551-20: entered promiscuous mode
Nov 29 06:52:34 compute-0 NetworkManager[55227]: <info>  [1764399154.0598] manager: (tap3c63c551-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/34)
Nov 29 06:52:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:34.060 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3c63c551-20, col_values=(('external_ids', {'iface-id': '90a33ad8-e32a-4cc0-85e0-1ed390ab00fa'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:52:34 compute-0 nova_compute[187185]: 2025-11-29 06:52:34.060 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:34 compute-0 ovn_controller[95281]: 2025-11-29T06:52:34Z|00056|binding|INFO|Releasing lport 90a33ad8-e32a-4cc0-85e0-1ed390ab00fa from this chassis (sb_readonly=0)
Nov 29 06:52:34 compute-0 nova_compute[187185]: 2025-11-29 06:52:34.061 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:34 compute-0 nova_compute[187185]: 2025-11-29 06:52:34.074 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:34 compute-0 nova_compute[187185]: 2025-11-29 06:52:34.074 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:34.075 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3c63c551-2e9f-4b47-9e49-c73140efe20a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3c63c551-2e9f-4b47-9e49-c73140efe20a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 06:52:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:34.077 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[cc13dd4e-3618-4a08-9835-2130e2b1dea9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:52:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:34.077 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 06:52:34 compute-0 ovn_metadata_agent[104249]: global
Nov 29 06:52:34 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 06:52:34 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-3c63c551-2e9f-4b47-9e49-c73140efe20a
Nov 29 06:52:34 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 06:52:34 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 06:52:34 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 06:52:34 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/3c63c551-2e9f-4b47-9e49-c73140efe20a.pid.haproxy
Nov 29 06:52:34 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 06:52:34 compute-0 ovn_metadata_agent[104249]: 
Nov 29 06:52:34 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 06:52:34 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 06:52:34 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 06:52:34 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 06:52:34 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 06:52:34 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 06:52:34 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 06:52:34 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 06:52:34 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 06:52:34 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 06:52:34 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 06:52:34 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 06:52:34 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 06:52:34 compute-0 ovn_metadata_agent[104249]: 
Nov 29 06:52:34 compute-0 ovn_metadata_agent[104249]: 
Nov 29 06:52:34 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 06:52:34 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 06:52:34 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 06:52:34 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID 3c63c551-2e9f-4b47-9e49-c73140efe20a
Nov 29 06:52:34 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 06:52:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:34.078 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3c63c551-2e9f-4b47-9e49-c73140efe20a', 'env', 'PROCESS_TAG=haproxy-3c63c551-2e9f-4b47-9e49-c73140efe20a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3c63c551-2e9f-4b47-9e49-c73140efe20a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 06:52:34 compute-0 nova_compute[187185]: 2025-11-29 06:52:34.157 187189 DEBUG nova.compute.manager [req-29b2e679-cc91-40fa-a229-3eb36c0afcec req-bc717d6e-6df8-4287-b24d-2d6e879ce759 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Received event network-vif-plugged-586fc8d7-18ba-4421-a518-60f4d0a6950c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:52:34 compute-0 nova_compute[187185]: 2025-11-29 06:52:34.157 187189 DEBUG oslo_concurrency.lockutils [req-29b2e679-cc91-40fa-a229-3eb36c0afcec req-bc717d6e-6df8-4287-b24d-2d6e879ce759 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "942e977f-fd74-45d1-b0be-661b15431eca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:52:34 compute-0 nova_compute[187185]: 2025-11-29 06:52:34.158 187189 DEBUG oslo_concurrency.lockutils [req-29b2e679-cc91-40fa-a229-3eb36c0afcec req-bc717d6e-6df8-4287-b24d-2d6e879ce759 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "942e977f-fd74-45d1-b0be-661b15431eca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:52:34 compute-0 nova_compute[187185]: 2025-11-29 06:52:34.158 187189 DEBUG oslo_concurrency.lockutils [req-29b2e679-cc91-40fa-a229-3eb36c0afcec req-bc717d6e-6df8-4287-b24d-2d6e879ce759 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "942e977f-fd74-45d1-b0be-661b15431eca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:52:34 compute-0 nova_compute[187185]: 2025-11-29 06:52:34.158 187189 DEBUG nova.compute.manager [req-29b2e679-cc91-40fa-a229-3eb36c0afcec req-bc717d6e-6df8-4287-b24d-2d6e879ce759 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Processing event network-vif-plugged-586fc8d7-18ba-4421-a518-60f4d0a6950c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 06:52:34 compute-0 nova_compute[187185]: 2025-11-29 06:52:34.267 187189 DEBUG nova.network.neutron [req-9441b4ee-f27a-400d-af04-20467752bda5 req-e5466404-8d0d-40c0-83dc-b9eac80cb432 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Updated VIF entry in instance network info cache for port 586fc8d7-18ba-4421-a518-60f4d0a6950c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 06:52:34 compute-0 nova_compute[187185]: 2025-11-29 06:52:34.268 187189 DEBUG nova.network.neutron [req-9441b4ee-f27a-400d-af04-20467752bda5 req-e5466404-8d0d-40c0-83dc-b9eac80cb432 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Updating instance_info_cache with network_info: [{"id": "586fc8d7-18ba-4421-a518-60f4d0a6950c", "address": "fa:16:3e:90:18:7a", "network": {"id": "3c63c551-2e9f-4b47-9e49-c73140efe20a", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-555773636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71af3e88884e42c48fb244d7d6ca31e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap586fc8d7-18", "ovs_interfaceid": "586fc8d7-18ba-4421-a518-60f4d0a6950c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:52:34 compute-0 nova_compute[187185]: 2025-11-29 06:52:34.287 187189 DEBUG oslo_concurrency.lockutils [req-9441b4ee-f27a-400d-af04-20467752bda5 req-e5466404-8d0d-40c0-83dc-b9eac80cb432 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-942e977f-fd74-45d1-b0be-661b15431eca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:52:34 compute-0 podman[216235]: 2025-11-29 06:52:34.409305458 +0000 UTC m=+0.023792482 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 06:52:34 compute-0 podman[216235]: 2025-11-29 06:52:34.591717746 +0000 UTC m=+0.206204760 container create b814c2b33f28d15a79f7ceae2c7ea208fdacedc32e186c111a63d1751b6fad30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c63c551-2e9f-4b47-9e49-c73140efe20a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 06:52:34 compute-0 systemd[1]: Started libpod-conmon-b814c2b33f28d15a79f7ceae2c7ea208fdacedc32e186c111a63d1751b6fad30.scope.
Nov 29 06:52:34 compute-0 systemd[1]: Started libcrun container.
Nov 29 06:52:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/499831e352667b04e531c33ccd01d6a5fbf69076b11ff0e78e057770d809c5a2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 06:52:34 compute-0 nova_compute[187185]: 2025-11-29 06:52:34.809 187189 DEBUG nova.compute.manager [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 06:52:34 compute-0 nova_compute[187185]: 2025-11-29 06:52:34.810 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399154.8086915, 942e977f-fd74-45d1-b0be-661b15431eca => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:52:34 compute-0 nova_compute[187185]: 2025-11-29 06:52:34.810 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] VM Started (Lifecycle Event)
Nov 29 06:52:34 compute-0 nova_compute[187185]: 2025-11-29 06:52:34.813 187189 DEBUG nova.virt.libvirt.driver [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 06:52:34 compute-0 nova_compute[187185]: 2025-11-29 06:52:34.817 187189 INFO nova.virt.libvirt.driver [-] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Instance spawned successfully.
Nov 29 06:52:34 compute-0 nova_compute[187185]: 2025-11-29 06:52:34.817 187189 DEBUG nova.virt.libvirt.driver [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 06:52:34 compute-0 nova_compute[187185]: 2025-11-29 06:52:34.853 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:52:34 compute-0 nova_compute[187185]: 2025-11-29 06:52:34.855 187189 DEBUG nova.virt.libvirt.driver [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:52:34 compute-0 nova_compute[187185]: 2025-11-29 06:52:34.856 187189 DEBUG nova.virt.libvirt.driver [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:52:34 compute-0 nova_compute[187185]: 2025-11-29 06:52:34.856 187189 DEBUG nova.virt.libvirt.driver [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:52:34 compute-0 nova_compute[187185]: 2025-11-29 06:52:34.856 187189 DEBUG nova.virt.libvirt.driver [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:52:34 compute-0 nova_compute[187185]: 2025-11-29 06:52:34.857 187189 DEBUG nova.virt.libvirt.driver [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:52:34 compute-0 nova_compute[187185]: 2025-11-29 06:52:34.857 187189 DEBUG nova.virt.libvirt.driver [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:52:34 compute-0 nova_compute[187185]: 2025-11-29 06:52:34.862 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 06:52:34 compute-0 nova_compute[187185]: 2025-11-29 06:52:34.905 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 06:52:34 compute-0 nova_compute[187185]: 2025-11-29 06:52:34.905 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399154.808987, 942e977f-fd74-45d1-b0be-661b15431eca => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:52:34 compute-0 nova_compute[187185]: 2025-11-29 06:52:34.906 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] VM Paused (Lifecycle Event)
Nov 29 06:52:34 compute-0 nova_compute[187185]: 2025-11-29 06:52:34.924 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:52:34 compute-0 nova_compute[187185]: 2025-11-29 06:52:34.928 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399154.813527, 942e977f-fd74-45d1-b0be-661b15431eca => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:52:34 compute-0 nova_compute[187185]: 2025-11-29 06:52:34.928 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] VM Resumed (Lifecycle Event)
Nov 29 06:52:34 compute-0 nova_compute[187185]: 2025-11-29 06:52:34.945 187189 INFO nova.compute.manager [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Took 9.06 seconds to spawn the instance on the hypervisor.
Nov 29 06:52:34 compute-0 nova_compute[187185]: 2025-11-29 06:52:34.946 187189 DEBUG nova.compute.manager [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:52:34 compute-0 nova_compute[187185]: 2025-11-29 06:52:34.950 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:52:34 compute-0 nova_compute[187185]: 2025-11-29 06:52:34.956 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 06:52:35 compute-0 nova_compute[187185]: 2025-11-29 06:52:35.027 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 06:52:35 compute-0 podman[216235]: 2025-11-29 06:52:35.164158109 +0000 UTC m=+0.778645133 container init b814c2b33f28d15a79f7ceae2c7ea208fdacedc32e186c111a63d1751b6fad30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c63c551-2e9f-4b47-9e49-c73140efe20a, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:52:35 compute-0 podman[216235]: 2025-11-29 06:52:35.176663441 +0000 UTC m=+0.791150455 container start b814c2b33f28d15a79f7ceae2c7ea208fdacedc32e186c111a63d1751b6fad30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c63c551-2e9f-4b47-9e49-c73140efe20a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Nov 29 06:52:35 compute-0 nova_compute[187185]: 2025-11-29 06:52:35.196 187189 INFO nova.compute.manager [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Took 10.02 seconds to build instance.
Nov 29 06:52:35 compute-0 nova_compute[187185]: 2025-11-29 06:52:35.219 187189 DEBUG oslo_concurrency.lockutils [None req-d6a43831-df4e-4202-a485-8d3a2fba0fdf a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Lock "942e977f-fd74-45d1-b0be-661b15431eca" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:52:35 compute-0 nova_compute[187185]: 2025-11-29 06:52:35.220 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "942e977f-fd74-45d1-b0be-661b15431eca" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 6.584s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:52:35 compute-0 nova_compute[187185]: 2025-11-29 06:52:35.221 187189 INFO nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 06:52:35 compute-0 nova_compute[187185]: 2025-11-29 06:52:35.221 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "942e977f-fd74-45d1-b0be-661b15431eca" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:52:35 compute-0 neutron-haproxy-ovnmeta-3c63c551-2e9f-4b47-9e49-c73140efe20a[216250]: [NOTICE]   (216261) : New worker (216263) forked
Nov 29 06:52:35 compute-0 neutron-haproxy-ovnmeta-3c63c551-2e9f-4b47-9e49-c73140efe20a[216250]: [NOTICE]   (216261) : Loading success.
Nov 29 06:52:36 compute-0 nova_compute[187185]: 2025-11-29 06:52:36.272 187189 DEBUG nova.compute.manager [req-4728986b-b43f-486c-813b-31a9a1d1a4e3 req-dcf93876-ee66-4732-9fa6-52f07363a356 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Received event network-vif-plugged-586fc8d7-18ba-4421-a518-60f4d0a6950c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:52:36 compute-0 nova_compute[187185]: 2025-11-29 06:52:36.274 187189 DEBUG oslo_concurrency.lockutils [req-4728986b-b43f-486c-813b-31a9a1d1a4e3 req-dcf93876-ee66-4732-9fa6-52f07363a356 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "942e977f-fd74-45d1-b0be-661b15431eca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:52:36 compute-0 nova_compute[187185]: 2025-11-29 06:52:36.275 187189 DEBUG oslo_concurrency.lockutils [req-4728986b-b43f-486c-813b-31a9a1d1a4e3 req-dcf93876-ee66-4732-9fa6-52f07363a356 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "942e977f-fd74-45d1-b0be-661b15431eca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:52:36 compute-0 nova_compute[187185]: 2025-11-29 06:52:36.275 187189 DEBUG oslo_concurrency.lockutils [req-4728986b-b43f-486c-813b-31a9a1d1a4e3 req-dcf93876-ee66-4732-9fa6-52f07363a356 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "942e977f-fd74-45d1-b0be-661b15431eca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:52:36 compute-0 nova_compute[187185]: 2025-11-29 06:52:36.276 187189 DEBUG nova.compute.manager [req-4728986b-b43f-486c-813b-31a9a1d1a4e3 req-dcf93876-ee66-4732-9fa6-52f07363a356 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] No waiting events found dispatching network-vif-plugged-586fc8d7-18ba-4421-a518-60f4d0a6950c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 06:52:36 compute-0 nova_compute[187185]: 2025-11-29 06:52:36.276 187189 WARNING nova.compute.manager [req-4728986b-b43f-486c-813b-31a9a1d1a4e3 req-dcf93876-ee66-4732-9fa6-52f07363a356 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Received unexpected event network-vif-plugged-586fc8d7-18ba-4421-a518-60f4d0a6950c for instance with vm_state active and task_state None.
Nov 29 06:52:37 compute-0 nova_compute[187185]: 2025-11-29 06:52:37.199 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:37 compute-0 nova_compute[187185]: 2025-11-29 06:52:37.497 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:39 compute-0 podman[216272]: 2025-11-29 06:52:39.822008039 +0000 UTC m=+0.080923293 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 29 06:52:40 compute-0 nova_compute[187185]: 2025-11-29 06:52:40.365 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:52:40 compute-0 nova_compute[187185]: 2025-11-29 06:52:40.367 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 06:52:40 compute-0 nova_compute[187185]: 2025-11-29 06:52:40.367 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 06:52:40 compute-0 nova_compute[187185]: 2025-11-29 06:52:40.686 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "refresh_cache-942e977f-fd74-45d1-b0be-661b15431eca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:52:40 compute-0 nova_compute[187185]: 2025-11-29 06:52:40.687 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquired lock "refresh_cache-942e977f-fd74-45d1-b0be-661b15431eca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:52:40 compute-0 nova_compute[187185]: 2025-11-29 06:52:40.687 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 29 06:52:40 compute-0 nova_compute[187185]: 2025-11-29 06:52:40.688 187189 DEBUG nova.objects.instance [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 942e977f-fd74-45d1-b0be-661b15431eca obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:52:42 compute-0 nova_compute[187185]: 2025-11-29 06:52:42.205 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:42 compute-0 sshd-session[216291]: Invalid user svn from 103.179.56.44 port 52282
Nov 29 06:52:42 compute-0 nova_compute[187185]: 2025-11-29 06:52:42.532 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Updating instance_info_cache with network_info: [{"id": "586fc8d7-18ba-4421-a518-60f4d0a6950c", "address": "fa:16:3e:90:18:7a", "network": {"id": "3c63c551-2e9f-4b47-9e49-c73140efe20a", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-555773636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71af3e88884e42c48fb244d7d6ca31e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap586fc8d7-18", "ovs_interfaceid": "586fc8d7-18ba-4421-a518-60f4d0a6950c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:52:42 compute-0 nova_compute[187185]: 2025-11-29 06:52:42.562 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Releasing lock "refresh_cache-942e977f-fd74-45d1-b0be-661b15431eca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:52:42 compute-0 nova_compute[187185]: 2025-11-29 06:52:42.562 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 29 06:52:42 compute-0 nova_compute[187185]: 2025-11-29 06:52:42.563 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:52:42 compute-0 nova_compute[187185]: 2025-11-29 06:52:42.580 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:42 compute-0 sshd-session[216291]: Received disconnect from 103.179.56.44 port 52282:11: Bye Bye [preauth]
Nov 29 06:52:42 compute-0 sshd-session[216291]: Disconnected from invalid user svn 103.179.56.44 port 52282 [preauth]
Nov 29 06:52:43 compute-0 nova_compute[187185]: 2025-11-29 06:52:43.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:52:43 compute-0 nova_compute[187185]: 2025-11-29 06:52:43.319 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:52:44 compute-0 nova_compute[187185]: 2025-11-29 06:52:44.313 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:52:44 compute-0 nova_compute[187185]: 2025-11-29 06:52:44.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:52:45 compute-0 nova_compute[187185]: 2025-11-29 06:52:45.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:52:45 compute-0 nova_compute[187185]: 2025-11-29 06:52:45.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:52:45 compute-0 nova_compute[187185]: 2025-11-29 06:52:45.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 06:52:45 compute-0 nova_compute[187185]: 2025-11-29 06:52:45.318 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:52:45 compute-0 nova_compute[187185]: 2025-11-29 06:52:45.369 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:52:45 compute-0 nova_compute[187185]: 2025-11-29 06:52:45.369 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:52:45 compute-0 nova_compute[187185]: 2025-11-29 06:52:45.370 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:52:45 compute-0 nova_compute[187185]: 2025-11-29 06:52:45.370 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 06:52:45 compute-0 nova_compute[187185]: 2025-11-29 06:52:45.463 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/942e977f-fd74-45d1-b0be-661b15431eca/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:52:45 compute-0 nova_compute[187185]: 2025-11-29 06:52:45.524 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/942e977f-fd74-45d1-b0be-661b15431eca/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:52:45 compute-0 nova_compute[187185]: 2025-11-29 06:52:45.525 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/942e977f-fd74-45d1-b0be-661b15431eca/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:52:45 compute-0 nova_compute[187185]: 2025-11-29 06:52:45.579 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/942e977f-fd74-45d1-b0be-661b15431eca/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:52:45 compute-0 nova_compute[187185]: 2025-11-29 06:52:45.716 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:52:45 compute-0 nova_compute[187185]: 2025-11-29 06:52:45.718 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5629MB free_disk=73.33818054199219GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 06:52:45 compute-0 nova_compute[187185]: 2025-11-29 06:52:45.718 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:52:45 compute-0 nova_compute[187185]: 2025-11-29 06:52:45.719 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:52:45 compute-0 nova_compute[187185]: 2025-11-29 06:52:45.802 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance 942e977f-fd74-45d1-b0be-661b15431eca actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 06:52:45 compute-0 nova_compute[187185]: 2025-11-29 06:52:45.803 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 06:52:45 compute-0 nova_compute[187185]: 2025-11-29 06:52:45.803 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 06:52:45 compute-0 nova_compute[187185]: 2025-11-29 06:52:45.838 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:52:45 compute-0 nova_compute[187185]: 2025-11-29 06:52:45.854 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:52:45 compute-0 nova_compute[187185]: 2025-11-29 06:52:45.879 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 06:52:45 compute-0 nova_compute[187185]: 2025-11-29 06:52:45.880 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.161s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:52:46 compute-0 nova_compute[187185]: 2025-11-29 06:52:46.132 187189 DEBUG oslo_concurrency.lockutils [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Acquiring lock "91bf50da-3c1f-4f88-a67f-21ec183c3812" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:52:46 compute-0 nova_compute[187185]: 2025-11-29 06:52:46.133 187189 DEBUG oslo_concurrency.lockutils [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Lock "91bf50da-3c1f-4f88-a67f-21ec183c3812" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:52:46 compute-0 nova_compute[187185]: 2025-11-29 06:52:46.148 187189 DEBUG nova.compute.manager [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 06:52:46 compute-0 nova_compute[187185]: 2025-11-29 06:52:46.227 187189 DEBUG oslo_concurrency.lockutils [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:52:46 compute-0 nova_compute[187185]: 2025-11-29 06:52:46.228 187189 DEBUG oslo_concurrency.lockutils [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:52:46 compute-0 nova_compute[187185]: 2025-11-29 06:52:46.233 187189 DEBUG nova.virt.hardware [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 06:52:46 compute-0 nova_compute[187185]: 2025-11-29 06:52:46.234 187189 INFO nova.compute.claims [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Claim successful on node compute-0.ctlplane.example.com
Nov 29 06:52:46 compute-0 nova_compute[187185]: 2025-11-29 06:52:46.362 187189 DEBUG nova.compute.provider_tree [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:52:46 compute-0 nova_compute[187185]: 2025-11-29 06:52:46.384 187189 DEBUG nova.scheduler.client.report [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:52:46 compute-0 nova_compute[187185]: 2025-11-29 06:52:46.413 187189 DEBUG oslo_concurrency.lockutils [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.184s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:52:46 compute-0 nova_compute[187185]: 2025-11-29 06:52:46.414 187189 DEBUG nova.compute.manager [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 06:52:46 compute-0 nova_compute[187185]: 2025-11-29 06:52:46.479 187189 DEBUG nova.compute.manager [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 06:52:46 compute-0 nova_compute[187185]: 2025-11-29 06:52:46.480 187189 DEBUG nova.network.neutron [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 06:52:46 compute-0 nova_compute[187185]: 2025-11-29 06:52:46.529 187189 INFO nova.virt.libvirt.driver [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 06:52:46 compute-0 nova_compute[187185]: 2025-11-29 06:52:46.576 187189 DEBUG nova.compute.manager [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 06:52:46 compute-0 nova_compute[187185]: 2025-11-29 06:52:46.710 187189 DEBUG nova.compute.manager [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 06:52:46 compute-0 nova_compute[187185]: 2025-11-29 06:52:46.711 187189 DEBUG nova.virt.libvirt.driver [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 06:52:46 compute-0 nova_compute[187185]: 2025-11-29 06:52:46.712 187189 INFO nova.virt.libvirt.driver [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Creating image(s)
Nov 29 06:52:46 compute-0 nova_compute[187185]: 2025-11-29 06:52:46.713 187189 DEBUG oslo_concurrency.lockutils [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Acquiring lock "/var/lib/nova/instances/91bf50da-3c1f-4f88-a67f-21ec183c3812/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:52:46 compute-0 nova_compute[187185]: 2025-11-29 06:52:46.713 187189 DEBUG oslo_concurrency.lockutils [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Lock "/var/lib/nova/instances/91bf50da-3c1f-4f88-a67f-21ec183c3812/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:52:46 compute-0 nova_compute[187185]: 2025-11-29 06:52:46.714 187189 DEBUG oslo_concurrency.lockutils [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Lock "/var/lib/nova/instances/91bf50da-3c1f-4f88-a67f-21ec183c3812/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:52:46 compute-0 nova_compute[187185]: 2025-11-29 06:52:46.738 187189 DEBUG oslo_concurrency.processutils [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:52:46 compute-0 nova_compute[187185]: 2025-11-29 06:52:46.759 187189 DEBUG nova.policy [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '840f4cdf2bf9409f9b4fd2a7218fcfbb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '462ed9e91718488eab9f1fece4b6b34b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 06:52:46 compute-0 nova_compute[187185]: 2025-11-29 06:52:46.795 187189 DEBUG oslo_concurrency.processutils [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:52:46 compute-0 nova_compute[187185]: 2025-11-29 06:52:46.796 187189 DEBUG oslo_concurrency.lockutils [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:52:46 compute-0 nova_compute[187185]: 2025-11-29 06:52:46.797 187189 DEBUG oslo_concurrency.lockutils [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:52:46 compute-0 nova_compute[187185]: 2025-11-29 06:52:46.809 187189 DEBUG oslo_concurrency.processutils [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:52:46 compute-0 ovn_controller[95281]: 2025-11-29T06:52:46Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:90:18:7a 10.100.0.13
Nov 29 06:52:46 compute-0 ovn_controller[95281]: 2025-11-29T06:52:46Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:90:18:7a 10.100.0.13
Nov 29 06:52:46 compute-0 nova_compute[187185]: 2025-11-29 06:52:46.878 187189 DEBUG oslo_concurrency.processutils [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:52:46 compute-0 nova_compute[187185]: 2025-11-29 06:52:46.879 187189 DEBUG oslo_concurrency.processutils [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/91bf50da-3c1f-4f88-a67f-21ec183c3812/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:52:47 compute-0 nova_compute[187185]: 2025-11-29 06:52:47.110 187189 DEBUG oslo_concurrency.processutils [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/91bf50da-3c1f-4f88-a67f-21ec183c3812/disk 1073741824" returned: 0 in 0.231s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:52:47 compute-0 nova_compute[187185]: 2025-11-29 06:52:47.112 187189 DEBUG oslo_concurrency.lockutils [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.315s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:52:47 compute-0 nova_compute[187185]: 2025-11-29 06:52:47.113 187189 DEBUG oslo_concurrency.processutils [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:52:47 compute-0 nova_compute[187185]: 2025-11-29 06:52:47.177 187189 DEBUG oslo_concurrency.processutils [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:52:47 compute-0 nova_compute[187185]: 2025-11-29 06:52:47.178 187189 DEBUG nova.virt.disk.api [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Checking if we can resize image /var/lib/nova/instances/91bf50da-3c1f-4f88-a67f-21ec183c3812/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 06:52:47 compute-0 nova_compute[187185]: 2025-11-29 06:52:47.179 187189 DEBUG oslo_concurrency.processutils [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/91bf50da-3c1f-4f88-a67f-21ec183c3812/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:52:47 compute-0 nova_compute[187185]: 2025-11-29 06:52:47.237 187189 DEBUG oslo_concurrency.processutils [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/91bf50da-3c1f-4f88-a67f-21ec183c3812/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:52:47 compute-0 nova_compute[187185]: 2025-11-29 06:52:47.238 187189 DEBUG nova.virt.disk.api [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Cannot resize image /var/lib/nova/instances/91bf50da-3c1f-4f88-a67f-21ec183c3812/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 06:52:47 compute-0 nova_compute[187185]: 2025-11-29 06:52:47.239 187189 DEBUG nova.objects.instance [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Lazy-loading 'migration_context' on Instance uuid 91bf50da-3c1f-4f88-a67f-21ec183c3812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:52:47 compute-0 nova_compute[187185]: 2025-11-29 06:52:47.251 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:47 compute-0 nova_compute[187185]: 2025-11-29 06:52:47.254 187189 DEBUG nova.virt.libvirt.driver [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 06:52:47 compute-0 nova_compute[187185]: 2025-11-29 06:52:47.254 187189 DEBUG nova.virt.libvirt.driver [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Ensure instance console log exists: /var/lib/nova/instances/91bf50da-3c1f-4f88-a67f-21ec183c3812/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 06:52:47 compute-0 nova_compute[187185]: 2025-11-29 06:52:47.255 187189 DEBUG oslo_concurrency.lockutils [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:52:47 compute-0 nova_compute[187185]: 2025-11-29 06:52:47.255 187189 DEBUG oslo_concurrency.lockutils [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:52:47 compute-0 nova_compute[187185]: 2025-11-29 06:52:47.256 187189 DEBUG oslo_concurrency.lockutils [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:52:47 compute-0 nova_compute[187185]: 2025-11-29 06:52:47.583 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:47.986 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '942e977f-fd74-45d1-b0be-661b15431eca', 'name': 'tempest-FloatingIPsAssociationTestJSON-server-1366973524', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000014', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '71af3e88884e42c48fb244d7d6ca31e2', 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'hostId': 'e201be18799c6716174431ca08031e83bd94cf42dfb4c354d60e5a7d', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 06:52:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:47.987 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.015 12 DEBUG ceilometer.compute.pollsters [-] 942e977f-fd74-45d1-b0be-661b15431eca/disk.device.read.latency volume: 194311363 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.016 12 DEBUG ceilometer.compute.pollsters [-] 942e977f-fd74-45d1-b0be-661b15431eca/disk.device.read.latency volume: 22957105 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a0a1b8b8-c10d-4e9f-8882-ef3ada1c6657', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 194311363, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': '942e977f-fd74-45d1-b0be-661b15431eca-vda', 'timestamp': '2025-11-29T06:52:47.987630', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1366973524', 'name': 'instance-00000014', 'instance_id': '942e977f-fd74-45d1-b0be-661b15431eca', 'instance_type': 'm1.nano', 'host': 'e201be18799c6716174431ca08031e83bd94cf42dfb4c354d60e5a7d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '04054f5c-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4615.705866646, 'message_signature': '6f279e9b0f454e8ce9fa1eef7d7d42c34820d05c089a9857187b32b65b819f8d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22957105, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 
'project_name': None, 'resource_id': '942e977f-fd74-45d1-b0be-661b15431eca-sda', 'timestamp': '2025-11-29T06:52:47.987630', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1366973524', 'name': 'instance-00000014', 'instance_id': '942e977f-fd74-45d1-b0be-661b15431eca', 'instance_type': 'm1.nano', 'host': 'e201be18799c6716174431ca08031e83bd94cf42dfb4c354d60e5a7d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '040560a0-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4615.705866646, 'message_signature': '32d952906484c21305c4b4086a2aabf3fc79b59da1e2fc8106ff66eb81502515'}]}, 'timestamp': '2025-11-29 06:52:48.016576', '_unique_id': 'f0140e2ac77b405894d789c2f06c0522'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.018 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.019 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.030 12 DEBUG ceilometer.compute.pollsters [-] 942e977f-fd74-45d1-b0be-661b15431eca/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.030 12 DEBUG ceilometer.compute.pollsters [-] 942e977f-fd74-45d1-b0be-661b15431eca/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e05c9330-3f16-4809-bff7-8c92338e8cf2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': '942e977f-fd74-45d1-b0be-661b15431eca-vda', 'timestamp': '2025-11-29T06:52:48.019492', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1366973524', 'name': 'instance-00000014', 'instance_id': '942e977f-fd74-45d1-b0be-661b15431eca', 'instance_type': 'm1.nano', 'host': 'e201be18799c6716174431ca08031e83bd94cf42dfb4c354d60e5a7d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0407813c-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4615.737748487, 'message_signature': '6e76d88df2d9f2424f4f56fff0fe5baa4c17064180952314ccb465d7d0eded45'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 
'resource_id': '942e977f-fd74-45d1-b0be-661b15431eca-sda', 'timestamp': '2025-11-29T06:52:48.019492', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1366973524', 'name': 'instance-00000014', 'instance_id': '942e977f-fd74-45d1-b0be-661b15431eca', 'instance_type': 'm1.nano', 'host': 'e201be18799c6716174431ca08031e83bd94cf42dfb4c354d60e5a7d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '04078d08-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4615.737748487, 'message_signature': '4c21818d22fbfe0875dd2e350c5b0d65b80e1570e7bd00f9e7b9c63ba9efe095'}]}, 'timestamp': '2025-11-29 06:52:48.030757', '_unique_id': '1c9272c7b9d941eb811f44ea947e77b2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.031 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.032 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.032 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.032 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-1366973524>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-1366973524>]
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.032 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.035 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 942e977f-fd74-45d1-b0be-661b15431eca / tap586fc8d7-18 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.035 12 DEBUG ceilometer.compute.pollsters [-] 942e977f-fd74-45d1-b0be-661b15431eca/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e2a736d2-6372-4a01-9814-fc5aa5828b2c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': 'instance-00000014-942e977f-fd74-45d1-b0be-661b15431eca-tap586fc8d7-18', 'timestamp': '2025-11-29T06:52:48.032956', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1366973524', 'name': 'tap586fc8d7-18', 'instance_id': '942e977f-fd74-45d1-b0be-661b15431eca', 'instance_type': 'm1.nano', 'host': 'e201be18799c6716174431ca08031e83bd94cf42dfb4c354d60e5a7d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:90:18:7a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap586fc8d7-18'}, 'message_id': '04084ffe-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4615.751196725, 'message_signature': '045e50e014a3c6393e1feb0e6ced8361ca5786698faa032bf08e0b56af696455'}]}, 'timestamp': '2025-11-29 06:52:48.035823', '_unique_id': '511a9fdd59a4402daa5eca5cf1f6cd43'}: kombu.exceptions.OperationalError: [Errno 
111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.036 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.037 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.037 12 DEBUG ceilometer.compute.pollsters [-] 942e977f-fd74-45d1-b0be-661b15431eca/disk.device.read.bytes volume: 30288384 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.037 12 DEBUG ceilometer.compute.pollsters [-] 942e977f-fd74-45d1-b0be-661b15431eca/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1667f272-437a-43cb-a51d-c640203281bd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30288384, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': '942e977f-fd74-45d1-b0be-661b15431eca-vda', 'timestamp': '2025-11-29T06:52:48.037425', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1366973524', 'name': 'instance-00000014', 'instance_id': '942e977f-fd74-45d1-b0be-661b15431eca', 'instance_type': 'm1.nano', 'host': 'e201be18799c6716174431ca08031e83bd94cf42dfb4c354d60e5a7d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '04089ab8-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4615.705866646, 'message_signature': '943b89c351fca05f33efec8aa49dc20f2de23b4daabbc024019fb363518d3b76'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': '942e977f-fd74-45d1-b0be-661b15431eca-sda', 'timestamp': '2025-11-29T06:52:48.037425', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1366973524', 'name': 'instance-00000014', 'instance_id': '942e977f-fd74-45d1-b0be-661b15431eca', 'instance_type': 'm1.nano', 'host': 'e201be18799c6716174431ca08031e83bd94cf42dfb4c354d60e5a7d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0408a288-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4615.705866646, 'message_signature': '4a179e251b5777167abfa39c5bae14adb36d35e261e94100c64d34d1be2daf82'}]}, 'timestamp': '2025-11-29 06:52:48.037859', '_unique_id': 'd3896193f91e47ea92ac2f2f23283dc2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.038 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 DEBUG ceilometer.compute.pollsters [-] 942e977f-fd74-45d1-b0be-661b15431eca/network.outgoing.bytes volume: 992 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'af8c3bf4-a542-4c6f-8771-6508c457554c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 992, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': 'instance-00000014-942e977f-fd74-45d1-b0be-661b15431eca-tap586fc8d7-18', 'timestamp': '2025-11-29T06:52:48.039212', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1366973524', 'name': 'tap586fc8d7-18', 'instance_id': '942e977f-fd74-45d1-b0be-661b15431eca', 'instance_type': 'm1.nano', 'host': 'e201be18799c6716174431ca08031e83bd94cf42dfb4c354d60e5a7d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:90:18:7a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap586fc8d7-18'}, 'message_id': '0408e158-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4615.751196725, 'message_signature': '02414a542576425b649df3bcb1f9e7c1327e07d20ad043d720f25b1d569b515c'}]}, 'timestamp': '2025-11-29 06:52:48.039479', '_unique_id': '34a38e8b05794f7b8739df7d3f96d425'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.039 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.040 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.040 12 DEBUG ceilometer.compute.pollsters [-] 942e977f-fd74-45d1-b0be-661b15431eca/network.incoming.packets volume: 9 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0f592ed0-11bb-439e-adf5-fafc9d8e00a1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 9, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': 'instance-00000014-942e977f-fd74-45d1-b0be-661b15431eca-tap586fc8d7-18', 'timestamp': '2025-11-29T06:52:48.040655', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1366973524', 'name': 'tap586fc8d7-18', 'instance_id': '942e977f-fd74-45d1-b0be-661b15431eca', 'instance_type': 'm1.nano', 'host': 'e201be18799c6716174431ca08031e83bd94cf42dfb4c354d60e5a7d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:90:18:7a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap586fc8d7-18'}, 'message_id': '04091a42-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4615.751196725, 'message_signature': '4bf723af30c5b396de90d7c29a22cf604ca8ff1c0e0ae4d333e2535a4eb35342'}]}, 'timestamp': '2025-11-29 06:52:48.040976', '_unique_id': '839f6c78f0ef44a09e6ece5a72475be3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.041 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.042 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.042 12 DEBUG ceilometer.compute.pollsters [-] 942e977f-fd74-45d1-b0be-661b15431eca/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.042 12 DEBUG ceilometer.compute.pollsters [-] 942e977f-fd74-45d1-b0be-661b15431eca/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2e0dd94e-fcbb-47ee-89d4-beb3ef53cb66', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': '942e977f-fd74-45d1-b0be-661b15431eca-vda', 'timestamp': '2025-11-29T06:52:48.042159', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1366973524', 'name': 'instance-00000014', 'instance_id': '942e977f-fd74-45d1-b0be-661b15431eca', 'instance_type': 'm1.nano', 'host': 'e201be18799c6716174431ca08031e83bd94cf42dfb4c354d60e5a7d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '04095480-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4615.737748487, 'message_signature': '5d5ebf25c0303550270940a4bd3d7fb43a9fd6c972361ac82b952d2dc9564194'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': '942e977f-fd74-45d1-b0be-661b15431eca-sda', 'timestamp': '2025-11-29T06:52:48.042159', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1366973524', 'name': 'instance-00000014', 'instance_id': '942e977f-fd74-45d1-b0be-661b15431eca', 'instance_type': 'm1.nano', 'host': 'e201be18799c6716174431ca08031e83bd94cf42dfb4c354d60e5a7d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '04095c5a-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4615.737748487, 'message_signature': 'b94ae066e023d2064c2e39013612cd2ab9aa45039d68c3d152f89d6f46639a72'}]}, 'timestamp': '2025-11-29 06:52:48.042598', '_unique_id': '42f11b7142be4786809371f58e96d62b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.043 12 DEBUG ceilometer.compute.pollsters [-] 942e977f-fd74-45d1-b0be-661b15431eca/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '29573f7b-c92c-483f-ba58-e6fbff899797', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': 'instance-00000014-942e977f-fd74-45d1-b0be-661b15431eca-tap586fc8d7-18', 'timestamp': '2025-11-29T06:52:48.043725', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1366973524', 'name': 'tap586fc8d7-18', 'instance_id': '942e977f-fd74-45d1-b0be-661b15431eca', 'instance_type': 'm1.nano', 'host': 'e201be18799c6716174431ca08031e83bd94cf42dfb4c354d60e5a7d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:90:18:7a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap586fc8d7-18'}, 'message_id': '0409918e-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4615.751196725, 'message_signature': '7dbae93db4ee554fc6422cc01ea3f9654c04d6385fcfc7f715d64e878f77e11d'}]}, 'timestamp': '2025-11-29 06:52:48.043990', '_unique_id': 'ed3462b0b8aa46069020d5aff3885985'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.044 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 DEBUG ceilometer.compute.pollsters [-] 942e977f-fd74-45d1-b0be-661b15431eca/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d61df29-9c92-4ed4-9591-6a3200ff9c83', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': 'instance-00000014-942e977f-fd74-45d1-b0be-661b15431eca-tap586fc8d7-18', 'timestamp': '2025-11-29T06:52:48.045263', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1366973524', 'name': 'tap586fc8d7-18', 'instance_id': '942e977f-fd74-45d1-b0be-661b15431eca', 'instance_type': 'm1.nano', 'host': 'e201be18799c6716174431ca08031e83bd94cf42dfb4c354d60e5a7d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:90:18:7a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap586fc8d7-18'}, 'message_id': '0409cd34-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4615.751196725, 'message_signature': '03bd1c941393841301a4314f97d7974016ffb800a7e507f490245ff55ffd7682'}]}, 'timestamp': '2025-11-29 06:52:48.045503', '_unique_id': 'abc5a81077284bc89a9900bcceac5549'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.045 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.046 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.046 12 DEBUG ceilometer.compute.pollsters [-] 942e977f-fd74-45d1-b0be-661b15431eca/network.outgoing.packets volume: 6 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2129de32-ff98-49db-9953-72ac636f812a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 6, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': 'instance-00000014-942e977f-fd74-45d1-b0be-661b15431eca-tap586fc8d7-18', 'timestamp': '2025-11-29T06:52:48.046721', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1366973524', 'name': 'tap586fc8d7-18', 'instance_id': '942e977f-fd74-45d1-b0be-661b15431eca', 'instance_type': 'm1.nano', 'host': 'e201be18799c6716174431ca08031e83bd94cf42dfb4c354d60e5a7d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:90:18:7a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap586fc8d7-18'}, 'message_id': '040a0790-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4615.751196725, 'message_signature': '037362e78871b640fd8b9b72654b9ba0c9f03c8ec0472d4283dd7d5c20591f76'}]}, 'timestamp': '2025-11-29 06:52:48.047016', '_unique_id': '6e4071e64b3c470885ebe1cd04c3b137'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.047 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.048 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.048 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.048 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-1366973524>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-1366973524>]
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.048 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.048 12 DEBUG ceilometer.compute.pollsters [-] 942e977f-fd74-45d1-b0be-661b15431eca/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6786a64c-dff9-4af0-9006-6c0b8a3850b8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': 'instance-00000014-942e977f-fd74-45d1-b0be-661b15431eca-tap586fc8d7-18', 'timestamp': '2025-11-29T06:52:48.048566', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1366973524', 'name': 'tap586fc8d7-18', 'instance_id': '942e977f-fd74-45d1-b0be-661b15431eca', 'instance_type': 'm1.nano', 'host': 'e201be18799c6716174431ca08031e83bd94cf42dfb4c354d60e5a7d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:90:18:7a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap586fc8d7-18'}, 'message_id': '040a4f98-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4615.751196725, 'message_signature': 'b591573915ef79b7c10df133e60ecb930155f1618e2b4449b119394c26083a82'}]}, 'timestamp': '2025-11-29 06:52:48.048885', '_unique_id': '383c72497f014b63835da8d651b8cca9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.049 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.050 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.050 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.050 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-1366973524>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-1366973524>]
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.050 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.050 12 DEBUG ceilometer.compute.pollsters [-] 942e977f-fd74-45d1-b0be-661b15431eca/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cc00123d-5685-4a56-908a-c9fa3cd41bc9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': 'instance-00000014-942e977f-fd74-45d1-b0be-661b15431eca-tap586fc8d7-18', 'timestamp': '2025-11-29T06:52:48.050436', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1366973524', 'name': 'tap586fc8d7-18', 'instance_id': '942e977f-fd74-45d1-b0be-661b15431eca', 'instance_type': 'm1.nano', 'host': 'e201be18799c6716174431ca08031e83bd94cf42dfb4c354d60e5a7d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:90:18:7a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap586fc8d7-18'}, 'message_id': '040a970a-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4615.751196725, 'message_signature': '093b321211b0e997adef17d4481b3d21d1219b302dcd2c51efba58bff9998df8'}]}, 'timestamp': '2025-11-29 06:52:48.050670', '_unique_id': 'f810b3ce974344be92b5e5ac1b23b1f0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.051 12 DEBUG ceilometer.compute.pollsters [-] 942e977f-fd74-45d1-b0be-661b15431eca/disk.device.write.bytes volume: 72695808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 DEBUG ceilometer.compute.pollsters [-] 942e977f-fd74-45d1-b0be-661b15431eca/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6f0f50aa-92b6-43ce-b4a4-9737f460d0ed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72695808, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': '942e977f-fd74-45d1-b0be-661b15431eca-vda', 'timestamp': '2025-11-29T06:52:48.051916', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1366973524', 'name': 'instance-00000014', 'instance_id': '942e977f-fd74-45d1-b0be-661b15431eca', 'instance_type': 'm1.nano', 'host': 'e201be18799c6716174431ca08031e83bd94cf42dfb4c354d60e5a7d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '040ad09e-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4615.705866646, 'message_signature': '25b879c6f5147e9dd32545fbd21d4ab920bd31dcdebe627847fb7ccabe2052b6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 
'resource_id': '942e977f-fd74-45d1-b0be-661b15431eca-sda', 'timestamp': '2025-11-29T06:52:48.051916', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1366973524', 'name': 'instance-00000014', 'instance_id': '942e977f-fd74-45d1-b0be-661b15431eca', 'instance_type': 'm1.nano', 'host': 'e201be18799c6716174431ca08031e83bd94cf42dfb4c354d60e5a7d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '040ad86e-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4615.705866646, 'message_signature': '395c36ebb00a551f522b8d8a1599a2f9bc4831c108b67c7d390e224f105f2489'}]}, 'timestamp': '2025-11-29 06:52:48.052328', '_unique_id': '9b236c398658414cb4b16d717617e143'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.052 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.053 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.053 12 DEBUG ceilometer.compute.pollsters [-] 942e977f-fd74-45d1-b0be-661b15431eca/disk.device.read.requests volume: 1086 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.053 12 DEBUG ceilometer.compute.pollsters [-] 942e977f-fd74-45d1-b0be-661b15431eca/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c09a764c-2432-4f2e-9fdb-ae7a094b93c0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1086, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': '942e977f-fd74-45d1-b0be-661b15431eca-vda', 'timestamp': '2025-11-29T06:52:48.053606', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1366973524', 'name': 'instance-00000014', 'instance_id': '942e977f-fd74-45d1-b0be-661b15431eca', 'instance_type': 'm1.nano', 'host': 'e201be18799c6716174431ca08031e83bd94cf42dfb4c354d60e5a7d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '040b127a-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4615.705866646, 'message_signature': 'faac0b09e4302cd2f7eb6b2d9bf40d51635e0c6d82e60b9f7b08a51209423ecd'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 
'project_name': None, 'resource_id': '942e977f-fd74-45d1-b0be-661b15431eca-sda', 'timestamp': '2025-11-29T06:52:48.053606', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1366973524', 'name': 'instance-00000014', 'instance_id': '942e977f-fd74-45d1-b0be-661b15431eca', 'instance_type': 'm1.nano', 'host': 'e201be18799c6716174431ca08031e83bd94cf42dfb4c354d60e5a7d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '040b1c98-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4615.705866646, 'message_signature': 'ff092b9554a9193bb5e1d25736c0bf8443c011bf1f4eb5a3633084988c11a71d'}]}, 'timestamp': '2025-11-29 06:52:48.054079', '_unique_id': '95c35c71b6a642699fb9ab3f72cd0a7d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.054 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.055 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.070 12 DEBUG ceilometer.compute.pollsters [-] 942e977f-fd74-45d1-b0be-661b15431eca/memory.usage volume: 40.37890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dc9406a0-4d3a-44d0-a289-51c06491bf27', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.37890625, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': '942e977f-fd74-45d1-b0be-661b15431eca', 'timestamp': '2025-11-29T06:52:48.055316', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1366973524', 'name': 'instance-00000014', 'instance_id': '942e977f-fd74-45d1-b0be-661b15431eca', 'instance_type': 'm1.nano', 'host': 'e201be18799c6716174431ca08031e83bd94cf42dfb4c354d60e5a7d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '040db7dc-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4615.788797783, 'message_signature': '853a85dc7ab37b9ebb482cdda8bcb6d631f499004e01f7397f79a8599bebcd53'}]}, 'timestamp': '2025-11-29 06:52:48.071476', '_unique_id': '30133331caef4651aeed4ab23e261d35'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.072 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.073 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.073 12 DEBUG ceilometer.compute.pollsters [-] 942e977f-fd74-45d1-b0be-661b15431eca/network.incoming.bytes volume: 1352 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b70eeef6-31d6-4279-b7f8-f5223460bf69', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1352, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': 'instance-00000014-942e977f-fd74-45d1-b0be-661b15431eca-tap586fc8d7-18', 'timestamp': '2025-11-29T06:52:48.073901', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1366973524', 'name': 'tap586fc8d7-18', 'instance_id': '942e977f-fd74-45d1-b0be-661b15431eca', 'instance_type': 'm1.nano', 'host': 'e201be18799c6716174431ca08031e83bd94cf42dfb4c354d60e5a7d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:90:18:7a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap586fc8d7-18'}, 'message_id': '040e2e38-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4615.751196725, 'message_signature': '8a3101b10a0d0616982da43b5e08f09ba5933831af5ee22e4e71e6b1250f8094'}]}, 'timestamp': '2025-11-29 06:52:48.074231', '_unique_id': '35f21401438f4aa8a90a0d933ee79482'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.074 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.075 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.075 12 DEBUG ceilometer.compute.pollsters [-] 942e977f-fd74-45d1-b0be-661b15431eca/cpu volume: 11340000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dcfb3e31-4731-4db5-a673-d55e872c4c0a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11340000000, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': '942e977f-fd74-45d1-b0be-661b15431eca', 'timestamp': '2025-11-29T06:52:48.075657', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1366973524', 'name': 'instance-00000014', 'instance_id': '942e977f-fd74-45d1-b0be-661b15431eca', 'instance_type': 'm1.nano', 'host': 'e201be18799c6716174431ca08031e83bd94cf42dfb4c354d60e5a7d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '040e7186-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4615.788797783, 'message_signature': '882ff0b8273fdec14ad5a448dd65439cddcadd2c2f846ddac92ef2a6832203d8'}]}, 'timestamp': '2025-11-29 06:52:48.075978', '_unique_id': '639e0959dce741ad8170f1011d83c89d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.076 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.077 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.077 12 DEBUG ceilometer.compute.pollsters [-] 942e977f-fd74-45d1-b0be-661b15431eca/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0cdbb59a-e5d6-4272-b4f9-7c3f743a9861', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': 'instance-00000014-942e977f-fd74-45d1-b0be-661b15431eca-tap586fc8d7-18', 'timestamp': '2025-11-29T06:52:48.077447', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1366973524', 'name': 'tap586fc8d7-18', 'instance_id': '942e977f-fd74-45d1-b0be-661b15431eca', 'instance_type': 'm1.nano', 'host': 'e201be18799c6716174431ca08031e83bd94cf42dfb4c354d60e5a7d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:90:18:7a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap586fc8d7-18'}, 'message_id': '040eb790-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4615.751196725, 'message_signature': '65f4bd4ca9a8f1413b21e269e5ec0a547b8e216a4a775aefd9e492768045f468'}]}, 'timestamp': '2025-11-29 06:52:48.077761', '_unique_id': '3fb8ae4f7f5e4bfaafc90527c9482ab8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.078 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.079 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.079 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.079 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-1366973524>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-1366973524>]
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.079 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.079 12 DEBUG ceilometer.compute.pollsters [-] 942e977f-fd74-45d1-b0be-661b15431eca/disk.device.write.latency volume: 2455596578 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.079 12 DEBUG ceilometer.compute.pollsters [-] 942e977f-fd74-45d1-b0be-661b15431eca/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '72c7adbd-cfe9-495f-b7ad-d3a83c9565b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2455596578, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': '942e977f-fd74-45d1-b0be-661b15431eca-vda', 'timestamp': '2025-11-29T06:52:48.079638', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1366973524', 'name': 'instance-00000014', 'instance_id': '942e977f-fd74-45d1-b0be-661b15431eca', 'instance_type': 'm1.nano', 'host': 'e201be18799c6716174431ca08031e83bd94cf42dfb4c354d60e5a7d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '040f0d9e-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4615.705866646, 'message_signature': 'bfc9cd695678649a84f861501fcc89df63a924b8c38234f751d1469403e1b519'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': '942e977f-fd74-45d1-b0be-661b15431eca-sda', 'timestamp': '2025-11-29T06:52:48.079638', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1366973524', 'name': 'instance-00000014', 'instance_id': '942e977f-fd74-45d1-b0be-661b15431eca', 'instance_type': 'm1.nano', 'host': 'e201be18799c6716174431ca08031e83bd94cf42dfb4c354d60e5a7d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '040f1974-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4615.705866646, 'message_signature': '5b13eca9adc801414729a7914b5b58ac221bcfcc03606a6703d627a57d3a2b8f'}]}, 'timestamp': '2025-11-29 06:52:48.080254', '_unique_id': 'fdb72e028c1442d6b61205526e44dd87'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.080 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.081 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.081 12 DEBUG ceilometer.compute.pollsters [-] 942e977f-fd74-45d1-b0be-661b15431eca/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 DEBUG ceilometer.compute.pollsters [-] 942e977f-fd74-45d1-b0be-661b15431eca/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b902ac09-6753-4686-bd9a-cc3ce3059beb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': '942e977f-fd74-45d1-b0be-661b15431eca-vda', 'timestamp': '2025-11-29T06:52:48.081757', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1366973524', 'name': 'instance-00000014', 'instance_id': '942e977f-fd74-45d1-b0be-661b15431eca', 'instance_type': 'm1.nano', 'host': 'e201be18799c6716174431ca08031e83bd94cf42dfb4c354d60e5a7d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '040f60b4-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4615.737748487, 'message_signature': 'e2d128c9da90f0ecd08abb55aa1bf44a50a64b7e35bd15d08bbf767e42d872db'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': '942e977f-fd74-45d1-b0be-661b15431eca-sda', 'timestamp': '2025-11-29T06:52:48.081757', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1366973524', 'name': 'instance-00000014', 'instance_id': '942e977f-fd74-45d1-b0be-661b15431eca', 'instance_type': 'm1.nano', 'host': 'e201be18799c6716174431ca08031e83bd94cf42dfb4c354d60e5a7d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '040f6c12-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4615.737748487, 'message_signature': '2c300b5df7da82d2b8a000cab9c002355935d60af10af3fbed22477f69385d00'}]}, 'timestamp': '2025-11-29 06:52:48.082368', '_unique_id': 'd55bc7a9279e4ace8b0adbe653c20cb4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.082 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.083 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.083 12 DEBUG ceilometer.compute.pollsters [-] 942e977f-fd74-45d1-b0be-661b15431eca/disk.device.write.requests volume: 309 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.084 12 DEBUG ceilometer.compute.pollsters [-] 942e977f-fd74-45d1-b0be-661b15431eca/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9e149d5d-db87-477a-93e4-bfab141c79ea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 309, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': '942e977f-fd74-45d1-b0be-661b15431eca-vda', 'timestamp': '2025-11-29T06:52:48.083946', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1366973524', 'name': 'instance-00000014', 'instance_id': '942e977f-fd74-45d1-b0be-661b15431eca', 'instance_type': 'm1.nano', 'host': 'e201be18799c6716174431ca08031e83bd94cf42dfb4c354d60e5a7d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '040fb528-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4615.705866646, 'message_signature': '6e802d5d91ff22298bbfbe1fc8ac2fcf2bcf3b8049797fc5a09bdabc01bb75c6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a0fcd4f4de7e4072be30f7e3d4ac7c77', 'user_name': None, 'project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'project_name': None, 'resource_id': '942e977f-fd74-45d1-b0be-661b15431eca-sda', 'timestamp': '2025-11-29T06:52:48.083946', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-1366973524', 'name': 'instance-00000014', 'instance_id': '942e977f-fd74-45d1-b0be-661b15431eca', 'instance_type': 'm1.nano', 'host': 'e201be18799c6716174431ca08031e83bd94cf42dfb4c354d60e5a7d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '040fc144-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4615.705866646, 'message_signature': '6ff7b546d3ed24e100d4afd612e85bbbc9c360636a0dddf4779e34974525d1ae'}]}, 'timestamp': '2025-11-29 06:52:48.084549', '_unique_id': '1011095a9cdf4e629d4ec3ba9ffaab5b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:52:48.085 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:52:48 compute-0 nova_compute[187185]: 2025-11-29 06:52:48.875 187189 DEBUG nova.network.neutron [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Successfully created port: db7d3de5-1042-4ad8-a3e2-d1f59d68f37d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 06:52:50 compute-0 nova_compute[187185]: 2025-11-29 06:52:50.725 187189 DEBUG nova.network.neutron [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Successfully updated port: db7d3de5-1042-4ad8-a3e2-d1f59d68f37d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 06:52:50 compute-0 nova_compute[187185]: 2025-11-29 06:52:50.748 187189 DEBUG oslo_concurrency.lockutils [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Acquiring lock "refresh_cache-91bf50da-3c1f-4f88-a67f-21ec183c3812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:52:50 compute-0 nova_compute[187185]: 2025-11-29 06:52:50.748 187189 DEBUG oslo_concurrency.lockutils [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Acquired lock "refresh_cache-91bf50da-3c1f-4f88-a67f-21ec183c3812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:52:50 compute-0 nova_compute[187185]: 2025-11-29 06:52:50.748 187189 DEBUG nova.network.neutron [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 06:52:50 compute-0 nova_compute[187185]: 2025-11-29 06:52:50.908 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:50 compute-0 NetworkManager[55227]: <info>  [1764399170.9101] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/35)
Nov 29 06:52:50 compute-0 NetworkManager[55227]: <info>  [1764399170.9116] device (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 06:52:50 compute-0 NetworkManager[55227]: <info>  [1764399170.9147] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/36)
Nov 29 06:52:50 compute-0 NetworkManager[55227]: <info>  [1764399170.9158] device (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 06:52:50 compute-0 NetworkManager[55227]: <info>  [1764399170.9185] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Nov 29 06:52:50 compute-0 NetworkManager[55227]: <info>  [1764399170.9203] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/38)
Nov 29 06:52:50 compute-0 NetworkManager[55227]: <info>  [1764399170.9215] device (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Nov 29 06:52:50 compute-0 NetworkManager[55227]: <info>  [1764399170.9225] device (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Nov 29 06:52:50 compute-0 nova_compute[187185]: 2025-11-29 06:52:50.985 187189 DEBUG nova.network.neutron [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 06:52:51 compute-0 nova_compute[187185]: 2025-11-29 06:52:51.089 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:51 compute-0 ovn_controller[95281]: 2025-11-29T06:52:51Z|00057|binding|INFO|Releasing lport 90a33ad8-e32a-4cc0-85e0-1ed390ab00fa from this chassis (sb_readonly=0)
Nov 29 06:52:51 compute-0 nova_compute[187185]: 2025-11-29 06:52:51.121 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:51 compute-0 nova_compute[187185]: 2025-11-29 06:52:51.880 187189 DEBUG nova.network.neutron [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Updating instance_info_cache with network_info: [{"id": "db7d3de5-1042-4ad8-a3e2-d1f59d68f37d", "address": "fa:16:3e:92:df:75", "network": {"id": "fb6261a0-1734-4a19-8eaa-94660a8ddab1", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1526889533-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "462ed9e91718488eab9f1fece4b6b34b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb7d3de5-10", "ovs_interfaceid": "db7d3de5-1042-4ad8-a3e2-d1f59d68f37d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:52:51 compute-0 nova_compute[187185]: 2025-11-29 06:52:51.903 187189 DEBUG oslo_concurrency.lockutils [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Releasing lock "refresh_cache-91bf50da-3c1f-4f88-a67f-21ec183c3812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:52:51 compute-0 nova_compute[187185]: 2025-11-29 06:52:51.904 187189 DEBUG nova.compute.manager [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Instance network_info: |[{"id": "db7d3de5-1042-4ad8-a3e2-d1f59d68f37d", "address": "fa:16:3e:92:df:75", "network": {"id": "fb6261a0-1734-4a19-8eaa-94660a8ddab1", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1526889533-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "462ed9e91718488eab9f1fece4b6b34b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb7d3de5-10", "ovs_interfaceid": "db7d3de5-1042-4ad8-a3e2-d1f59d68f37d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 06:52:51 compute-0 nova_compute[187185]: 2025-11-29 06:52:51.906 187189 DEBUG nova.virt.libvirt.driver [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Start _get_guest_xml network_info=[{"id": "db7d3de5-1042-4ad8-a3e2-d1f59d68f37d", "address": "fa:16:3e:92:df:75", "network": {"id": "fb6261a0-1734-4a19-8eaa-94660a8ddab1", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1526889533-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "462ed9e91718488eab9f1fece4b6b34b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb7d3de5-10", "ovs_interfaceid": "db7d3de5-1042-4ad8-a3e2-d1f59d68f37d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 06:52:51 compute-0 nova_compute[187185]: 2025-11-29 06:52:51.911 187189 WARNING nova.virt.libvirt.driver [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:52:51 compute-0 nova_compute[187185]: 2025-11-29 06:52:51.917 187189 DEBUG nova.virt.libvirt.host [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 06:52:51 compute-0 nova_compute[187185]: 2025-11-29 06:52:51.918 187189 DEBUG nova.virt.libvirt.host [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 06:52:51 compute-0 nova_compute[187185]: 2025-11-29 06:52:51.921 187189 DEBUG nova.virt.libvirt.host [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 06:52:51 compute-0 nova_compute[187185]: 2025-11-29 06:52:51.922 187189 DEBUG nova.virt.libvirt.host [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 06:52:51 compute-0 nova_compute[187185]: 2025-11-29 06:52:51.924 187189 DEBUG nova.virt.libvirt.driver [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 06:52:51 compute-0 nova_compute[187185]: 2025-11-29 06:52:51.924 187189 DEBUG nova.virt.hardware [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 06:52:51 compute-0 nova_compute[187185]: 2025-11-29 06:52:51.924 187189 DEBUG nova.virt.hardware [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 06:52:51 compute-0 nova_compute[187185]: 2025-11-29 06:52:51.925 187189 DEBUG nova.virt.hardware [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 06:52:51 compute-0 nova_compute[187185]: 2025-11-29 06:52:51.925 187189 DEBUG nova.virt.hardware [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 06:52:51 compute-0 nova_compute[187185]: 2025-11-29 06:52:51.925 187189 DEBUG nova.virt.hardware [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 06:52:51 compute-0 nova_compute[187185]: 2025-11-29 06:52:51.926 187189 DEBUG nova.virt.hardware [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 06:52:51 compute-0 nova_compute[187185]: 2025-11-29 06:52:51.926 187189 DEBUG nova.virt.hardware [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 06:52:51 compute-0 nova_compute[187185]: 2025-11-29 06:52:51.926 187189 DEBUG nova.virt.hardware [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 06:52:51 compute-0 nova_compute[187185]: 2025-11-29 06:52:51.926 187189 DEBUG nova.virt.hardware [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 06:52:51 compute-0 nova_compute[187185]: 2025-11-29 06:52:51.927 187189 DEBUG nova.virt.hardware [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 06:52:51 compute-0 nova_compute[187185]: 2025-11-29 06:52:51.927 187189 DEBUG nova.virt.hardware [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 06:52:51 compute-0 nova_compute[187185]: 2025-11-29 06:52:51.931 187189 DEBUG nova.virt.libvirt.vif [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:52:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesUnderV243Test-server-519568352',display_name='tempest-AttachInterfacesUnderV243Test-server-519568352',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesunderv243test-server-519568352',id=24,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCvDH9qxVoUCEjlJhQsaEAMbNOzskbLtC6DtWT7gJN5DrOZvy7OhqKmDO7brKeHOYs7363P/8xYAF1DlOOdDyjNgDnx8R2KTRFVjZOrU6WuZomi7dZ/t1KQJzhHGDhHODw==',key_name='tempest-keypair-1451011827',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='462ed9e91718488eab9f1fece4b6b34b',ramdisk_id='',reservation_id='r-z9rgod64',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesUnderV243Test-153400252',owner_user_name='tempest-AttachInterfacesUnderV243Test-153400252-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:52:46Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='840f4cdf2bf9409f9b4fd2a7218fcfbb',uuid=91bf50da-3c1f-4f88-a67f-21ec183c3812,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "db7d3de5-1042-4ad8-a3e2-d1f59d68f37d", "address": "fa:16:3e:92:df:75", "network": {"id": "fb6261a0-1734-4a19-8eaa-94660a8ddab1", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1526889533-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "462ed9e91718488eab9f1fece4b6b34b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb7d3de5-10", "ovs_interfaceid": "db7d3de5-1042-4ad8-a3e2-d1f59d68f37d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 06:52:51 compute-0 nova_compute[187185]: 2025-11-29 06:52:51.932 187189 DEBUG nova.network.os_vif_util [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Converting VIF {"id": "db7d3de5-1042-4ad8-a3e2-d1f59d68f37d", "address": "fa:16:3e:92:df:75", "network": {"id": "fb6261a0-1734-4a19-8eaa-94660a8ddab1", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1526889533-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "462ed9e91718488eab9f1fece4b6b34b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb7d3de5-10", "ovs_interfaceid": "db7d3de5-1042-4ad8-a3e2-d1f59d68f37d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 06:52:51 compute-0 nova_compute[187185]: 2025-11-29 06:52:51.933 187189 DEBUG nova.network.os_vif_util [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:92:df:75,bridge_name='br-int',has_traffic_filtering=True,id=db7d3de5-1042-4ad8-a3e2-d1f59d68f37d,network=Network(fb6261a0-1734-4a19-8eaa-94660a8ddab1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb7d3de5-10') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 06:52:51 compute-0 nova_compute[187185]: 2025-11-29 06:52:51.934 187189 DEBUG nova.objects.instance [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Lazy-loading 'pci_devices' on Instance uuid 91bf50da-3c1f-4f88-a67f-21ec183c3812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:52:51 compute-0 nova_compute[187185]: 2025-11-29 06:52:51.951 187189 DEBUG nova.virt.libvirt.driver [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] End _get_guest_xml xml=<domain type="kvm">
Nov 29 06:52:51 compute-0 nova_compute[187185]:   <uuid>91bf50da-3c1f-4f88-a67f-21ec183c3812</uuid>
Nov 29 06:52:51 compute-0 nova_compute[187185]:   <name>instance-00000018</name>
Nov 29 06:52:51 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 06:52:51 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 06:52:51 compute-0 nova_compute[187185]:   <metadata>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 06:52:51 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:       <nova:name>tempest-AttachInterfacesUnderV243Test-server-519568352</nova:name>
Nov 29 06:52:51 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 06:52:51</nova:creationTime>
Nov 29 06:52:51 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 06:52:51 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 06:52:51 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 06:52:51 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 06:52:51 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 06:52:51 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 06:52:51 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 06:52:51 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 06:52:51 compute-0 nova_compute[187185]:         <nova:user uuid="840f4cdf2bf9409f9b4fd2a7218fcfbb">tempest-AttachInterfacesUnderV243Test-153400252-project-member</nova:user>
Nov 29 06:52:51 compute-0 nova_compute[187185]:         <nova:project uuid="462ed9e91718488eab9f1fece4b6b34b">tempest-AttachInterfacesUnderV243Test-153400252</nova:project>
Nov 29 06:52:51 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 06:52:51 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 06:52:51 compute-0 nova_compute[187185]:         <nova:port uuid="db7d3de5-1042-4ad8-a3e2-d1f59d68f37d">
Nov 29 06:52:51 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 06:52:51 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 06:52:51 compute-0 nova_compute[187185]:   </metadata>
Nov 29 06:52:51 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <system>
Nov 29 06:52:51 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 06:52:51 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 06:52:51 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 06:52:51 compute-0 nova_compute[187185]:       <entry name="serial">91bf50da-3c1f-4f88-a67f-21ec183c3812</entry>
Nov 29 06:52:51 compute-0 nova_compute[187185]:       <entry name="uuid">91bf50da-3c1f-4f88-a67f-21ec183c3812</entry>
Nov 29 06:52:51 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     </system>
Nov 29 06:52:51 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 06:52:51 compute-0 nova_compute[187185]:   <os>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:   </os>
Nov 29 06:52:51 compute-0 nova_compute[187185]:   <features>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <apic/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:   </features>
Nov 29 06:52:51 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:   </clock>
Nov 29 06:52:51 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:   </cpu>
Nov 29 06:52:51 compute-0 nova_compute[187185]:   <devices>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 06:52:51 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/91bf50da-3c1f-4f88-a67f-21ec183c3812/disk"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     </disk>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 06:52:51 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/91bf50da-3c1f-4f88-a67f-21ec183c3812/disk.config"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     </disk>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 06:52:51 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:92:df:75"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:       <target dev="tapdb7d3de5-10"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     </interface>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 06:52:51 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/91bf50da-3c1f-4f88-a67f-21ec183c3812/console.log" append="off"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     </serial>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <video>
Nov 29 06:52:51 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     </video>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 06:52:51 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     </rng>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 06:52:51 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 06:52:51 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 06:52:51 compute-0 nova_compute[187185]:   </devices>
Nov 29 06:52:51 compute-0 nova_compute[187185]: </domain>
Nov 29 06:52:51 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 06:52:51 compute-0 nova_compute[187185]: 2025-11-29 06:52:51.953 187189 DEBUG nova.compute.manager [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Preparing to wait for external event network-vif-plugged-db7d3de5-1042-4ad8-a3e2-d1f59d68f37d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 06:52:51 compute-0 nova_compute[187185]: 2025-11-29 06:52:51.954 187189 DEBUG oslo_concurrency.lockutils [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Acquiring lock "91bf50da-3c1f-4f88-a67f-21ec183c3812-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:52:51 compute-0 nova_compute[187185]: 2025-11-29 06:52:51.954 187189 DEBUG oslo_concurrency.lockutils [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Lock "91bf50da-3c1f-4f88-a67f-21ec183c3812-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:52:51 compute-0 nova_compute[187185]: 2025-11-29 06:52:51.954 187189 DEBUG oslo_concurrency.lockutils [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Lock "91bf50da-3c1f-4f88-a67f-21ec183c3812-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:52:51 compute-0 nova_compute[187185]: 2025-11-29 06:52:51.955 187189 DEBUG nova.virt.libvirt.vif [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:52:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesUnderV243Test-server-519568352',display_name='tempest-AttachInterfacesUnderV243Test-server-519568352',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesunderv243test-server-519568352',id=24,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCvDH9qxVoUCEjlJhQsaEAMbNOzskbLtC6DtWT7gJN5DrOZvy7OhqKmDO7brKeHOYs7363P/8xYAF1DlOOdDyjNgDnx8R2KTRFVjZOrU6WuZomi7dZ/t1KQJzhHGDhHODw==',key_name='tempest-keypair-1451011827',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='462ed9e91718488eab9f1fece4b6b34b',ramdisk_id='',reservation_id='r-z9rgod64',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesUnderV243Test-153400252',owner_user_name='tempest-AttachInterfacesUnderV243Test-153400252-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:52:46Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='840f4cdf2bf9409f9b4fd2a7218fcfbb',uuid=91bf50da-3c1f-4f88-a67f-21ec183c3812,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "db7d3de5-1042-4ad8-a3e2-d1f59d68f37d", "address": "fa:16:3e:92:df:75", "network": {"id": "fb6261a0-1734-4a19-8eaa-94660a8ddab1", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1526889533-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "462ed9e91718488eab9f1fece4b6b34b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb7d3de5-10", "ovs_interfaceid": "db7d3de5-1042-4ad8-a3e2-d1f59d68f37d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 06:52:51 compute-0 nova_compute[187185]: 2025-11-29 06:52:51.955 187189 DEBUG nova.network.os_vif_util [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Converting VIF {"id": "db7d3de5-1042-4ad8-a3e2-d1f59d68f37d", "address": "fa:16:3e:92:df:75", "network": {"id": "fb6261a0-1734-4a19-8eaa-94660a8ddab1", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1526889533-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "462ed9e91718488eab9f1fece4b6b34b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb7d3de5-10", "ovs_interfaceid": "db7d3de5-1042-4ad8-a3e2-d1f59d68f37d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 06:52:51 compute-0 nova_compute[187185]: 2025-11-29 06:52:51.956 187189 DEBUG nova.network.os_vif_util [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:92:df:75,bridge_name='br-int',has_traffic_filtering=True,id=db7d3de5-1042-4ad8-a3e2-d1f59d68f37d,network=Network(fb6261a0-1734-4a19-8eaa-94660a8ddab1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb7d3de5-10') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 06:52:51 compute-0 nova_compute[187185]: 2025-11-29 06:52:51.956 187189 DEBUG os_vif [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:92:df:75,bridge_name='br-int',has_traffic_filtering=True,id=db7d3de5-1042-4ad8-a3e2-d1f59d68f37d,network=Network(fb6261a0-1734-4a19-8eaa-94660a8ddab1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb7d3de5-10') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 06:52:51 compute-0 nova_compute[187185]: 2025-11-29 06:52:51.957 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:51 compute-0 nova_compute[187185]: 2025-11-29 06:52:51.957 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:52:51 compute-0 nova_compute[187185]: 2025-11-29 06:52:51.958 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 06:52:51 compute-0 nova_compute[187185]: 2025-11-29 06:52:51.962 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:51 compute-0 nova_compute[187185]: 2025-11-29 06:52:51.962 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdb7d3de5-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:52:51 compute-0 nova_compute[187185]: 2025-11-29 06:52:51.962 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdb7d3de5-10, col_values=(('external_ids', {'iface-id': 'db7d3de5-1042-4ad8-a3e2-d1f59d68f37d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:92:df:75', 'vm-uuid': '91bf50da-3c1f-4f88-a67f-21ec183c3812'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:52:52 compute-0 NetworkManager[55227]: <info>  [1764399171.9999] manager: (tapdb7d3de5-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/39)
Nov 29 06:52:52 compute-0 nova_compute[187185]: 2025-11-29 06:52:51.999 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:52 compute-0 nova_compute[187185]: 2025-11-29 06:52:52.002 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 06:52:52 compute-0 nova_compute[187185]: 2025-11-29 06:52:52.011 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:52 compute-0 nova_compute[187185]: 2025-11-29 06:52:52.012 187189 INFO os_vif [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:92:df:75,bridge_name='br-int',has_traffic_filtering=True,id=db7d3de5-1042-4ad8-a3e2-d1f59d68f37d,network=Network(fb6261a0-1734-4a19-8eaa-94660a8ddab1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb7d3de5-10')
Nov 29 06:52:52 compute-0 nova_compute[187185]: 2025-11-29 06:52:52.146 187189 DEBUG nova.compute.manager [req-3158e465-b18e-42d3-ac61-b38e4fd03433 req-14da3d22-85f0-484c-b1aa-f11ca9db9075 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Received event network-changed-db7d3de5-1042-4ad8-a3e2-d1f59d68f37d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:52:52 compute-0 nova_compute[187185]: 2025-11-29 06:52:52.146 187189 DEBUG nova.compute.manager [req-3158e465-b18e-42d3-ac61-b38e4fd03433 req-14da3d22-85f0-484c-b1aa-f11ca9db9075 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Refreshing instance network info cache due to event network-changed-db7d3de5-1042-4ad8-a3e2-d1f59d68f37d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 06:52:52 compute-0 nova_compute[187185]: 2025-11-29 06:52:52.148 187189 DEBUG oslo_concurrency.lockutils [req-3158e465-b18e-42d3-ac61-b38e4fd03433 req-14da3d22-85f0-484c-b1aa-f11ca9db9075 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-91bf50da-3c1f-4f88-a67f-21ec183c3812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:52:52 compute-0 nova_compute[187185]: 2025-11-29 06:52:52.148 187189 DEBUG oslo_concurrency.lockutils [req-3158e465-b18e-42d3-ac61-b38e4fd03433 req-14da3d22-85f0-484c-b1aa-f11ca9db9075 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-91bf50da-3c1f-4f88-a67f-21ec183c3812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:52:52 compute-0 nova_compute[187185]: 2025-11-29 06:52:52.148 187189 DEBUG nova.network.neutron [req-3158e465-b18e-42d3-ac61-b38e4fd03433 req-14da3d22-85f0-484c-b1aa-f11ca9db9075 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Refreshing network info cache for port db7d3de5-1042-4ad8-a3e2-d1f59d68f37d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 06:52:52 compute-0 nova_compute[187185]: 2025-11-29 06:52:52.215 187189 DEBUG nova.virt.libvirt.driver [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 06:52:52 compute-0 nova_compute[187185]: 2025-11-29 06:52:52.215 187189 DEBUG nova.virt.libvirt.driver [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 06:52:52 compute-0 nova_compute[187185]: 2025-11-29 06:52:52.215 187189 DEBUG nova.virt.libvirt.driver [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] No VIF found with MAC fa:16:3e:92:df:75, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 06:52:52 compute-0 nova_compute[187185]: 2025-11-29 06:52:52.216 187189 INFO nova.virt.libvirt.driver [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Using config drive
Nov 29 06:52:52 compute-0 nova_compute[187185]: 2025-11-29 06:52:52.536 187189 INFO nova.virt.libvirt.driver [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Creating config drive at /var/lib/nova/instances/91bf50da-3c1f-4f88-a67f-21ec183c3812/disk.config
Nov 29 06:52:52 compute-0 nova_compute[187185]: 2025-11-29 06:52:52.543 187189 DEBUG oslo_concurrency.processutils [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/91bf50da-3c1f-4f88-a67f-21ec183c3812/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwi95m5a8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:52:52 compute-0 nova_compute[187185]: 2025-11-29 06:52:52.586 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:52 compute-0 nova_compute[187185]: 2025-11-29 06:52:52.674 187189 DEBUG oslo_concurrency.processutils [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/91bf50da-3c1f-4f88-a67f-21ec183c3812/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwi95m5a8" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:52:52 compute-0 kernel: tapdb7d3de5-10: entered promiscuous mode
Nov 29 06:52:52 compute-0 NetworkManager[55227]: <info>  [1764399172.7662] manager: (tapdb7d3de5-10): new Tun device (/org/freedesktop/NetworkManager/Devices/40)
Nov 29 06:52:52 compute-0 ovn_controller[95281]: 2025-11-29T06:52:52Z|00058|binding|INFO|Claiming lport db7d3de5-1042-4ad8-a3e2-d1f59d68f37d for this chassis.
Nov 29 06:52:52 compute-0 nova_compute[187185]: 2025-11-29 06:52:52.769 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:52 compute-0 ovn_controller[95281]: 2025-11-29T06:52:52Z|00059|binding|INFO|db7d3de5-1042-4ad8-a3e2-d1f59d68f37d: Claiming fa:16:3e:92:df:75 10.100.0.9
Nov 29 06:52:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:52.780 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:92:df:75 10.100.0.9'], port_security=['fa:16:3e:92:df:75 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '91bf50da-3c1f-4f88-a67f-21ec183c3812', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fb6261a0-1734-4a19-8eaa-94660a8ddab1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '462ed9e91718488eab9f1fece4b6b34b', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c22a403f-3305-4d7a-b0bf-8c5bc0b8a78a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=65e466a2-60ed-4685-8786-88e3e79b87bd, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=db7d3de5-1042-4ad8-a3e2-d1f59d68f37d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 06:52:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:52.784 104254 INFO neutron.agent.ovn.metadata.agent [-] Port db7d3de5-1042-4ad8-a3e2-d1f59d68f37d in datapath fb6261a0-1734-4a19-8eaa-94660a8ddab1 bound to our chassis
Nov 29 06:52:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:52.788 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fb6261a0-1734-4a19-8eaa-94660a8ddab1
Nov 29 06:52:52 compute-0 ovn_controller[95281]: 2025-11-29T06:52:52Z|00060|binding|INFO|Setting lport db7d3de5-1042-4ad8-a3e2-d1f59d68f37d ovn-installed in OVS
Nov 29 06:52:52 compute-0 ovn_controller[95281]: 2025-11-29T06:52:52Z|00061|binding|INFO|Setting lport db7d3de5-1042-4ad8-a3e2-d1f59d68f37d up in Southbound
Nov 29 06:52:52 compute-0 nova_compute[187185]: 2025-11-29 06:52:52.793 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:52 compute-0 nova_compute[187185]: 2025-11-29 06:52:52.796 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:52.808 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[ff624875-3b29-4fe1-b8c1-b5f41fddaaa5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:52:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:52.809 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfb6261a0-11 in ovnmeta-fb6261a0-1734-4a19-8eaa-94660a8ddab1 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 06:52:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:52.811 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfb6261a0-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 06:52:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:52.811 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[5ecfe4ed-07e7-4a21-ad2e-81d4a172cce6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:52:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:52.812 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[232d8ab1-15d0-4c04-994a-4e8c4acb0314]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:52:52 compute-0 systemd-udevd[216375]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 06:52:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:52.833 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[fe31e5ab-1c79-4680-a398-19dd929e7862]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:52:52 compute-0 systemd-machined[153486]: New machine qemu-7-instance-00000018.
Nov 29 06:52:52 compute-0 NetworkManager[55227]: <info>  [1764399172.8457] device (tapdb7d3de5-10): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 06:52:52 compute-0 NetworkManager[55227]: <info>  [1764399172.8474] device (tapdb7d3de5-10): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 06:52:52 compute-0 systemd[1]: Started Virtual Machine qemu-7-instance-00000018.
Nov 29 06:52:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:52.850 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[5ae55a1d-b5d0-4ed5-ae74-151f78a354ec]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:52:52 compute-0 podman[216334]: 2025-11-29 06:52:52.866526288 +0000 UTC m=+0.112117797 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 06:52:52 compute-0 podman[216337]: 2025-11-29 06:52:52.875966556 +0000 UTC m=+0.102872944 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, managed_by=edpm_ansible, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public)
Nov 29 06:52:52 compute-0 podman[216338]: 2025-11-29 06:52:52.877737704 +0000 UTC m=+0.118504091 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 06:52:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:52.890 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[5e8a1b2c-6e36-40ca-9107-b3ac4fe049fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:52:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:52.896 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[2e5aed9c-7454-4d1c-b063-6f8fc284f211]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:52:52 compute-0 NetworkManager[55227]: <info>  [1764399172.8970] manager: (tapfb6261a0-10): new Veth device (/org/freedesktop/NetworkManager/Devices/41)
Nov 29 06:52:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:52.931 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[94e9ef65-8442-401f-95e7-1d3473fb4c8d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:52:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:52.934 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[3034664f-5ec0-481e-b410-25df15193139]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:52:52 compute-0 NetworkManager[55227]: <info>  [1764399172.9564] device (tapfb6261a0-10): carrier: link connected
Nov 29 06:52:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:52.960 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[d36c8d64-ed04-41b2-bea0-a2dd13948278]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:52:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:52.975 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[3d27739a-6044-4610-96c3-0f6f101e7e11]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfb6261a0-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e5:8f:2b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 462062, 'reachable_time': 37566, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216437, 'error': None, 'target': 'ovnmeta-fb6261a0-1734-4a19-8eaa-94660a8ddab1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:52:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:52.992 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[c7f31468-ebaa-4a66-9b06-e284da4ebd66]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee5:8f2b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 462062, 'tstamp': 462062}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216439, 'error': None, 'target': 'ovnmeta-fb6261a0-1734-4a19-8eaa-94660a8ddab1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:52:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:53.008 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[05610947-3585-4877-adf9-33cd1539948d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfb6261a0-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e5:8f:2b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 462062, 'reachable_time': 37566, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216440, 'error': None, 'target': 'ovnmeta-fb6261a0-1734-4a19-8eaa-94660a8ddab1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:52:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:53.050 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[90287861-bbed-4747-a076-16a07ec6e7eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:52:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:53.109 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[859c2f6d-860b-400c-9e84-294df5534091]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:52:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:53.112 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfb6261a0-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:52:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:53.112 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 06:52:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:53.113 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfb6261a0-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:52:53 compute-0 NetworkManager[55227]: <info>  [1764399173.1524] manager: (tapfb6261a0-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/42)
Nov 29 06:52:53 compute-0 kernel: tapfb6261a0-10: entered promiscuous mode
Nov 29 06:52:53 compute-0 nova_compute[187185]: 2025-11-29 06:52:53.152 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:53 compute-0 nova_compute[187185]: 2025-11-29 06:52:53.154 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:53 compute-0 nova_compute[187185]: 2025-11-29 06:52:53.157 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:53.156 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfb6261a0-10, col_values=(('external_ids', {'iface-id': 'd72b5273-d899-47b6-b2e9-b2f0ad4da789'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:52:53 compute-0 ovn_controller[95281]: 2025-11-29T06:52:53Z|00062|binding|INFO|Releasing lport d72b5273-d899-47b6-b2e9-b2f0ad4da789 from this chassis (sb_readonly=0)
Nov 29 06:52:53 compute-0 nova_compute[187185]: 2025-11-29 06:52:53.173 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:53.174 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fb6261a0-1734-4a19-8eaa-94660a8ddab1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fb6261a0-1734-4a19-8eaa-94660a8ddab1.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 06:52:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:53.175 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6f8a80c9-90e5-4e10-add1-cac6206e3dfc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:52:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:53.175 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 06:52:53 compute-0 ovn_metadata_agent[104249]: global
Nov 29 06:52:53 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 06:52:53 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-fb6261a0-1734-4a19-8eaa-94660a8ddab1
Nov 29 06:52:53 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 06:52:53 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 06:52:53 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 06:52:53 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/fb6261a0-1734-4a19-8eaa-94660a8ddab1.pid.haproxy
Nov 29 06:52:53 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 06:52:53 compute-0 ovn_metadata_agent[104249]: 
Nov 29 06:52:53 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 06:52:53 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 06:52:53 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 06:52:53 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 06:52:53 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 06:52:53 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 06:52:53 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 06:52:53 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 06:52:53 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 06:52:53 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 06:52:53 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 06:52:53 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 06:52:53 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 06:52:53 compute-0 ovn_metadata_agent[104249]: 
Nov 29 06:52:53 compute-0 ovn_metadata_agent[104249]: 
Nov 29 06:52:53 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 06:52:53 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 06:52:53 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 06:52:53 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID fb6261a0-1734-4a19-8eaa-94660a8ddab1
Nov 29 06:52:53 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 06:52:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:52:53.176 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fb6261a0-1734-4a19-8eaa-94660a8ddab1', 'env', 'PROCESS_TAG=haproxy-fb6261a0-1734-4a19-8eaa-94660a8ddab1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fb6261a0-1734-4a19-8eaa-94660a8ddab1.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 06:52:53 compute-0 nova_compute[187185]: 2025-11-29 06:52:53.367 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399173.3665695, 91bf50da-3c1f-4f88-a67f-21ec183c3812 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:52:53 compute-0 nova_compute[187185]: 2025-11-29 06:52:53.368 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] VM Started (Lifecycle Event)
Nov 29 06:52:53 compute-0 nova_compute[187185]: 2025-11-29 06:52:53.403 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:52:53 compute-0 nova_compute[187185]: 2025-11-29 06:52:53.408 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399173.367363, 91bf50da-3c1f-4f88-a67f-21ec183c3812 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:52:53 compute-0 nova_compute[187185]: 2025-11-29 06:52:53.408 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] VM Paused (Lifecycle Event)
Nov 29 06:52:53 compute-0 nova_compute[187185]: 2025-11-29 06:52:53.427 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:52:53 compute-0 nova_compute[187185]: 2025-11-29 06:52:53.430 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 06:52:53 compute-0 nova_compute[187185]: 2025-11-29 06:52:53.451 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 06:52:53 compute-0 podman[216479]: 2025-11-29 06:52:53.596206679 +0000 UTC m=+0.066547551 container create e22e0c7565c6b7fc577b9e2fab3682cdd3ff1f074f042b0f7643d39e5e7a394c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fb6261a0-1734-4a19-8eaa-94660a8ddab1, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 06:52:53 compute-0 systemd[1]: Started libpod-conmon-e22e0c7565c6b7fc577b9e2fab3682cdd3ff1f074f042b0f7643d39e5e7a394c.scope.
Nov 29 06:52:53 compute-0 podman[216479]: 2025-11-29 06:52:53.558924379 +0000 UTC m=+0.029265301 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 06:52:53 compute-0 systemd[1]: Started libcrun container.
Nov 29 06:52:53 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d88424dbe4b767114ba09b5ee20405388066c623725d6c8c3a0ef92ec39c70a4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 06:52:53 compute-0 podman[216479]: 2025-11-29 06:52:53.699162534 +0000 UTC m=+0.169503396 container init e22e0c7565c6b7fc577b9e2fab3682cdd3ff1f074f042b0f7643d39e5e7a394c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fb6261a0-1734-4a19-8eaa-94660a8ddab1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 06:52:53 compute-0 podman[216479]: 2025-11-29 06:52:53.705580829 +0000 UTC m=+0.175921671 container start e22e0c7565c6b7fc577b9e2fab3682cdd3ff1f074f042b0f7643d39e5e7a394c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fb6261a0-1734-4a19-8eaa-94660a8ddab1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 06:52:53 compute-0 neutron-haproxy-ovnmeta-fb6261a0-1734-4a19-8eaa-94660a8ddab1[216493]: [NOTICE]   (216497) : New worker (216499) forked
Nov 29 06:52:53 compute-0 neutron-haproxy-ovnmeta-fb6261a0-1734-4a19-8eaa-94660a8ddab1[216493]: [NOTICE]   (216497) : Loading success.
Nov 29 06:52:54 compute-0 nova_compute[187185]: 2025-11-29 06:52:54.230 187189 DEBUG nova.network.neutron [req-3158e465-b18e-42d3-ac61-b38e4fd03433 req-14da3d22-85f0-484c-b1aa-f11ca9db9075 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Updated VIF entry in instance network info cache for port db7d3de5-1042-4ad8-a3e2-d1f59d68f37d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 06:52:54 compute-0 nova_compute[187185]: 2025-11-29 06:52:54.230 187189 DEBUG nova.network.neutron [req-3158e465-b18e-42d3-ac61-b38e4fd03433 req-14da3d22-85f0-484c-b1aa-f11ca9db9075 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Updating instance_info_cache with network_info: [{"id": "db7d3de5-1042-4ad8-a3e2-d1f59d68f37d", "address": "fa:16:3e:92:df:75", "network": {"id": "fb6261a0-1734-4a19-8eaa-94660a8ddab1", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1526889533-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "462ed9e91718488eab9f1fece4b6b34b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb7d3de5-10", "ovs_interfaceid": "db7d3de5-1042-4ad8-a3e2-d1f59d68f37d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:52:54 compute-0 nova_compute[187185]: 2025-11-29 06:52:54.305 187189 DEBUG oslo_concurrency.lockutils [req-3158e465-b18e-42d3-ac61-b38e4fd03433 req-14da3d22-85f0-484c-b1aa-f11ca9db9075 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-91bf50da-3c1f-4f88-a67f-21ec183c3812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:52:54 compute-0 nova_compute[187185]: 2025-11-29 06:52:54.306 187189 DEBUG nova.compute.manager [req-3158e465-b18e-42d3-ac61-b38e4fd03433 req-14da3d22-85f0-484c-b1aa-f11ca9db9075 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Received event network-changed-586fc8d7-18ba-4421-a518-60f4d0a6950c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:52:54 compute-0 nova_compute[187185]: 2025-11-29 06:52:54.306 187189 DEBUG nova.compute.manager [req-3158e465-b18e-42d3-ac61-b38e4fd03433 req-14da3d22-85f0-484c-b1aa-f11ca9db9075 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Refreshing instance network info cache due to event network-changed-586fc8d7-18ba-4421-a518-60f4d0a6950c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 06:52:54 compute-0 nova_compute[187185]: 2025-11-29 06:52:54.306 187189 DEBUG oslo_concurrency.lockutils [req-3158e465-b18e-42d3-ac61-b38e4fd03433 req-14da3d22-85f0-484c-b1aa-f11ca9db9075 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-942e977f-fd74-45d1-b0be-661b15431eca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:52:54 compute-0 nova_compute[187185]: 2025-11-29 06:52:54.306 187189 DEBUG oslo_concurrency.lockutils [req-3158e465-b18e-42d3-ac61-b38e4fd03433 req-14da3d22-85f0-484c-b1aa-f11ca9db9075 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-942e977f-fd74-45d1-b0be-661b15431eca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:52:54 compute-0 nova_compute[187185]: 2025-11-29 06:52:54.307 187189 DEBUG nova.network.neutron [req-3158e465-b18e-42d3-ac61-b38e4fd03433 req-14da3d22-85f0-484c-b1aa-f11ca9db9075 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Refreshing network info cache for port 586fc8d7-18ba-4421-a518-60f4d0a6950c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 06:52:56 compute-0 nova_compute[187185]: 2025-11-29 06:52:56.361 187189 DEBUG nova.compute.manager [req-f8463604-07d5-42d0-bda1-1d29f91c58a7 req-3ebae108-2b3e-490f-b979-565bd17a494b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Received event network-changed-586fc8d7-18ba-4421-a518-60f4d0a6950c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:52:56 compute-0 nova_compute[187185]: 2025-11-29 06:52:56.362 187189 DEBUG nova.compute.manager [req-f8463604-07d5-42d0-bda1-1d29f91c58a7 req-3ebae108-2b3e-490f-b979-565bd17a494b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Refreshing instance network info cache due to event network-changed-586fc8d7-18ba-4421-a518-60f4d0a6950c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 06:52:56 compute-0 nova_compute[187185]: 2025-11-29 06:52:56.363 187189 DEBUG oslo_concurrency.lockutils [req-f8463604-07d5-42d0-bda1-1d29f91c58a7 req-3ebae108-2b3e-490f-b979-565bd17a494b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-942e977f-fd74-45d1-b0be-661b15431eca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:52:56 compute-0 nova_compute[187185]: 2025-11-29 06:52:56.880 187189 DEBUG nova.network.neutron [req-3158e465-b18e-42d3-ac61-b38e4fd03433 req-14da3d22-85f0-484c-b1aa-f11ca9db9075 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Updated VIF entry in instance network info cache for port 586fc8d7-18ba-4421-a518-60f4d0a6950c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 06:52:56 compute-0 nova_compute[187185]: 2025-11-29 06:52:56.880 187189 DEBUG nova.network.neutron [req-3158e465-b18e-42d3-ac61-b38e4fd03433 req-14da3d22-85f0-484c-b1aa-f11ca9db9075 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Updating instance_info_cache with network_info: [{"id": "586fc8d7-18ba-4421-a518-60f4d0a6950c", "address": "fa:16:3e:90:18:7a", "network": {"id": "3c63c551-2e9f-4b47-9e49-c73140efe20a", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-555773636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71af3e88884e42c48fb244d7d6ca31e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap586fc8d7-18", "ovs_interfaceid": "586fc8d7-18ba-4421-a518-60f4d0a6950c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:52:56 compute-0 nova_compute[187185]: 2025-11-29 06:52:56.903 187189 DEBUG oslo_concurrency.lockutils [req-3158e465-b18e-42d3-ac61-b38e4fd03433 req-14da3d22-85f0-484c-b1aa-f11ca9db9075 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-942e977f-fd74-45d1-b0be-661b15431eca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:52:56 compute-0 nova_compute[187185]: 2025-11-29 06:52:56.904 187189 DEBUG oslo_concurrency.lockutils [req-f8463604-07d5-42d0-bda1-1d29f91c58a7 req-3ebae108-2b3e-490f-b979-565bd17a494b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-942e977f-fd74-45d1-b0be-661b15431eca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:52:56 compute-0 nova_compute[187185]: 2025-11-29 06:52:56.904 187189 DEBUG nova.network.neutron [req-f8463604-07d5-42d0-bda1-1d29f91c58a7 req-3ebae108-2b3e-490f-b979-565bd17a494b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Refreshing network info cache for port 586fc8d7-18ba-4421-a518-60f4d0a6950c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 06:52:57 compute-0 nova_compute[187185]: 2025-11-29 06:52:57.001 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:57 compute-0 nova_compute[187185]: 2025-11-29 06:52:57.484 187189 DEBUG nova.compute.manager [req-41bccbb2-8042-432d-b4a2-38f4ac624060 req-b20ea527-a32b-4913-81e6-e22b93296daf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Received event network-vif-plugged-db7d3de5-1042-4ad8-a3e2-d1f59d68f37d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:52:57 compute-0 nova_compute[187185]: 2025-11-29 06:52:57.484 187189 DEBUG oslo_concurrency.lockutils [req-41bccbb2-8042-432d-b4a2-38f4ac624060 req-b20ea527-a32b-4913-81e6-e22b93296daf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "91bf50da-3c1f-4f88-a67f-21ec183c3812-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:52:57 compute-0 nova_compute[187185]: 2025-11-29 06:52:57.484 187189 DEBUG oslo_concurrency.lockutils [req-41bccbb2-8042-432d-b4a2-38f4ac624060 req-b20ea527-a32b-4913-81e6-e22b93296daf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "91bf50da-3c1f-4f88-a67f-21ec183c3812-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:52:57 compute-0 nova_compute[187185]: 2025-11-29 06:52:57.485 187189 DEBUG oslo_concurrency.lockutils [req-41bccbb2-8042-432d-b4a2-38f4ac624060 req-b20ea527-a32b-4913-81e6-e22b93296daf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "91bf50da-3c1f-4f88-a67f-21ec183c3812-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:52:57 compute-0 nova_compute[187185]: 2025-11-29 06:52:57.485 187189 DEBUG nova.compute.manager [req-41bccbb2-8042-432d-b4a2-38f4ac624060 req-b20ea527-a32b-4913-81e6-e22b93296daf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Processing event network-vif-plugged-db7d3de5-1042-4ad8-a3e2-d1f59d68f37d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 06:52:57 compute-0 nova_compute[187185]: 2025-11-29 06:52:57.485 187189 DEBUG nova.compute.manager [req-41bccbb2-8042-432d-b4a2-38f4ac624060 req-b20ea527-a32b-4913-81e6-e22b93296daf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Received event network-vif-plugged-db7d3de5-1042-4ad8-a3e2-d1f59d68f37d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:52:57 compute-0 nova_compute[187185]: 2025-11-29 06:52:57.486 187189 DEBUG oslo_concurrency.lockutils [req-41bccbb2-8042-432d-b4a2-38f4ac624060 req-b20ea527-a32b-4913-81e6-e22b93296daf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "91bf50da-3c1f-4f88-a67f-21ec183c3812-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:52:57 compute-0 nova_compute[187185]: 2025-11-29 06:52:57.486 187189 DEBUG oslo_concurrency.lockutils [req-41bccbb2-8042-432d-b4a2-38f4ac624060 req-b20ea527-a32b-4913-81e6-e22b93296daf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "91bf50da-3c1f-4f88-a67f-21ec183c3812-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:52:57 compute-0 nova_compute[187185]: 2025-11-29 06:52:57.486 187189 DEBUG oslo_concurrency.lockutils [req-41bccbb2-8042-432d-b4a2-38f4ac624060 req-b20ea527-a32b-4913-81e6-e22b93296daf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "91bf50da-3c1f-4f88-a67f-21ec183c3812-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:52:57 compute-0 nova_compute[187185]: 2025-11-29 06:52:57.486 187189 DEBUG nova.compute.manager [req-41bccbb2-8042-432d-b4a2-38f4ac624060 req-b20ea527-a32b-4913-81e6-e22b93296daf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] No waiting events found dispatching network-vif-plugged-db7d3de5-1042-4ad8-a3e2-d1f59d68f37d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 06:52:57 compute-0 nova_compute[187185]: 2025-11-29 06:52:57.487 187189 WARNING nova.compute.manager [req-41bccbb2-8042-432d-b4a2-38f4ac624060 req-b20ea527-a32b-4913-81e6-e22b93296daf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Received unexpected event network-vif-plugged-db7d3de5-1042-4ad8-a3e2-d1f59d68f37d for instance with vm_state building and task_state spawning.
Nov 29 06:52:57 compute-0 nova_compute[187185]: 2025-11-29 06:52:57.487 187189 DEBUG nova.compute.manager [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Instance event wait completed in 4 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 06:52:57 compute-0 nova_compute[187185]: 2025-11-29 06:52:57.492 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399177.491734, 91bf50da-3c1f-4f88-a67f-21ec183c3812 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:52:57 compute-0 nova_compute[187185]: 2025-11-29 06:52:57.493 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] VM Resumed (Lifecycle Event)
Nov 29 06:52:57 compute-0 nova_compute[187185]: 2025-11-29 06:52:57.495 187189 DEBUG nova.virt.libvirt.driver [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 06:52:57 compute-0 nova_compute[187185]: 2025-11-29 06:52:57.499 187189 INFO nova.virt.libvirt.driver [-] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Instance spawned successfully.
Nov 29 06:52:57 compute-0 nova_compute[187185]: 2025-11-29 06:52:57.500 187189 DEBUG nova.virt.libvirt.driver [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 06:52:57 compute-0 nova_compute[187185]: 2025-11-29 06:52:57.524 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:52:57 compute-0 nova_compute[187185]: 2025-11-29 06:52:57.528 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 06:52:57 compute-0 nova_compute[187185]: 2025-11-29 06:52:57.539 187189 DEBUG nova.virt.libvirt.driver [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:52:57 compute-0 nova_compute[187185]: 2025-11-29 06:52:57.539 187189 DEBUG nova.virt.libvirt.driver [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:52:57 compute-0 nova_compute[187185]: 2025-11-29 06:52:57.540 187189 DEBUG nova.virt.libvirt.driver [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:52:57 compute-0 nova_compute[187185]: 2025-11-29 06:52:57.541 187189 DEBUG nova.virt.libvirt.driver [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:52:57 compute-0 nova_compute[187185]: 2025-11-29 06:52:57.541 187189 DEBUG nova.virt.libvirt.driver [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:52:57 compute-0 nova_compute[187185]: 2025-11-29 06:52:57.542 187189 DEBUG nova.virt.libvirt.driver [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:52:57 compute-0 nova_compute[187185]: 2025-11-29 06:52:57.563 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 06:52:57 compute-0 nova_compute[187185]: 2025-11-29 06:52:57.590 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:52:57 compute-0 nova_compute[187185]: 2025-11-29 06:52:57.665 187189 INFO nova.compute.manager [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Took 10.95 seconds to spawn the instance on the hypervisor.
Nov 29 06:52:57 compute-0 nova_compute[187185]: 2025-11-29 06:52:57.666 187189 DEBUG nova.compute.manager [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:52:57 compute-0 nova_compute[187185]: 2025-11-29 06:52:57.797 187189 INFO nova.compute.manager [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Took 11.60 seconds to build instance.
Nov 29 06:52:57 compute-0 nova_compute[187185]: 2025-11-29 06:52:57.821 187189 DEBUG oslo_concurrency.lockutils [None req-6dd52099-f4a3-44b4-8439-20d85540227a 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Lock "91bf50da-3c1f-4f88-a67f-21ec183c3812" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.688s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:52:58 compute-0 nova_compute[187185]: 2025-11-29 06:52:58.258 187189 DEBUG nova.network.neutron [req-f8463604-07d5-42d0-bda1-1d29f91c58a7 req-3ebae108-2b3e-490f-b979-565bd17a494b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Updated VIF entry in instance network info cache for port 586fc8d7-18ba-4421-a518-60f4d0a6950c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 06:52:58 compute-0 nova_compute[187185]: 2025-11-29 06:52:58.259 187189 DEBUG nova.network.neutron [req-f8463604-07d5-42d0-bda1-1d29f91c58a7 req-3ebae108-2b3e-490f-b979-565bd17a494b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Updating instance_info_cache with network_info: [{"id": "586fc8d7-18ba-4421-a518-60f4d0a6950c", "address": "fa:16:3e:90:18:7a", "network": {"id": "3c63c551-2e9f-4b47-9e49-c73140efe20a", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-555773636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71af3e88884e42c48fb244d7d6ca31e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap586fc8d7-18", "ovs_interfaceid": "586fc8d7-18ba-4421-a518-60f4d0a6950c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:52:58 compute-0 nova_compute[187185]: 2025-11-29 06:52:58.274 187189 DEBUG oslo_concurrency.lockutils [req-f8463604-07d5-42d0-bda1-1d29f91c58a7 req-3ebae108-2b3e-490f-b979-565bd17a494b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-942e977f-fd74-45d1-b0be-661b15431eca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:52:59 compute-0 podman[216508]: 2025-11-29 06:52:59.83537733 +0000 UTC m=+0.100516313 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Nov 29 06:53:01 compute-0 podman[216534]: 2025-11-29 06:53:01.796815683 +0000 UTC m=+0.063274388 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 06:53:01 compute-0 podman[216533]: 2025-11-29 06:53:01.796876635 +0000 UTC m=+0.064154762 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Nov 29 06:53:02 compute-0 nova_compute[187185]: 2025-11-29 06:53:02.005 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:02 compute-0 nova_compute[187185]: 2025-11-29 06:53:02.592 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:02 compute-0 nova_compute[187185]: 2025-11-29 06:53:02.626 187189 DEBUG nova.compute.manager [req-fcdecbb1-b5a9-4f4d-8ecc-5aa5f42e6ecc req-126f70df-f722-4efe-b8e3-d6f39d253205 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Received event network-changed-db7d3de5-1042-4ad8-a3e2-d1f59d68f37d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:53:02 compute-0 nova_compute[187185]: 2025-11-29 06:53:02.627 187189 DEBUG nova.compute.manager [req-fcdecbb1-b5a9-4f4d-8ecc-5aa5f42e6ecc req-126f70df-f722-4efe-b8e3-d6f39d253205 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Refreshing instance network info cache due to event network-changed-db7d3de5-1042-4ad8-a3e2-d1f59d68f37d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 06:53:02 compute-0 nova_compute[187185]: 2025-11-29 06:53:02.628 187189 DEBUG oslo_concurrency.lockutils [req-fcdecbb1-b5a9-4f4d-8ecc-5aa5f42e6ecc req-126f70df-f722-4efe-b8e3-d6f39d253205 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-91bf50da-3c1f-4f88-a67f-21ec183c3812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:53:02 compute-0 nova_compute[187185]: 2025-11-29 06:53:02.628 187189 DEBUG oslo_concurrency.lockutils [req-fcdecbb1-b5a9-4f4d-8ecc-5aa5f42e6ecc req-126f70df-f722-4efe-b8e3-d6f39d253205 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-91bf50da-3c1f-4f88-a67f-21ec183c3812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:53:02 compute-0 nova_compute[187185]: 2025-11-29 06:53:02.629 187189 DEBUG nova.network.neutron [req-fcdecbb1-b5a9-4f4d-8ecc-5aa5f42e6ecc req-126f70df-f722-4efe-b8e3-d6f39d253205 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Refreshing network info cache for port db7d3de5-1042-4ad8-a3e2-d1f59d68f37d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 06:53:02 compute-0 sshd-session[216577]: Received disconnect from 179.125.24.202 port 52556:11: Bye Bye [preauth]
Nov 29 06:53:02 compute-0 sshd-session[216577]: Disconnected from authenticating user root 179.125.24.202 port 52556 [preauth]
Nov 29 06:53:04 compute-0 nova_compute[187185]: 2025-11-29 06:53:04.644 187189 DEBUG nova.network.neutron [req-fcdecbb1-b5a9-4f4d-8ecc-5aa5f42e6ecc req-126f70df-f722-4efe-b8e3-d6f39d253205 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Updated VIF entry in instance network info cache for port db7d3de5-1042-4ad8-a3e2-d1f59d68f37d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 06:53:04 compute-0 nova_compute[187185]: 2025-11-29 06:53:04.646 187189 DEBUG nova.network.neutron [req-fcdecbb1-b5a9-4f4d-8ecc-5aa5f42e6ecc req-126f70df-f722-4efe-b8e3-d6f39d253205 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Updating instance_info_cache with network_info: [{"id": "db7d3de5-1042-4ad8-a3e2-d1f59d68f37d", "address": "fa:16:3e:92:df:75", "network": {"id": "fb6261a0-1734-4a19-8eaa-94660a8ddab1", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1526889533-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "462ed9e91718488eab9f1fece4b6b34b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb7d3de5-10", "ovs_interfaceid": "db7d3de5-1042-4ad8-a3e2-d1f59d68f37d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:53:04 compute-0 nova_compute[187185]: 2025-11-29 06:53:04.669 187189 DEBUG oslo_concurrency.lockutils [req-fcdecbb1-b5a9-4f4d-8ecc-5aa5f42e6ecc req-126f70df-f722-4efe-b8e3-d6f39d253205 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-91bf50da-3c1f-4f88-a67f-21ec183c3812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:53:07 compute-0 nova_compute[187185]: 2025-11-29 06:53:07.008 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:07 compute-0 nova_compute[187185]: 2025-11-29 06:53:07.592 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:08 compute-0 nova_compute[187185]: 2025-11-29 06:53:08.591 187189 DEBUG nova.compute.manager [req-b0834aba-14d2-45c8-aaee-aa556276fa35 req-cf366764-d53a-4c30-9da7-9fa0530afefb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Received event network-changed-586fc8d7-18ba-4421-a518-60f4d0a6950c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:53:08 compute-0 nova_compute[187185]: 2025-11-29 06:53:08.592 187189 DEBUG nova.compute.manager [req-b0834aba-14d2-45c8-aaee-aa556276fa35 req-cf366764-d53a-4c30-9da7-9fa0530afefb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Refreshing instance network info cache due to event network-changed-586fc8d7-18ba-4421-a518-60f4d0a6950c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 06:53:08 compute-0 nova_compute[187185]: 2025-11-29 06:53:08.593 187189 DEBUG oslo_concurrency.lockutils [req-b0834aba-14d2-45c8-aaee-aa556276fa35 req-cf366764-d53a-4c30-9da7-9fa0530afefb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-942e977f-fd74-45d1-b0be-661b15431eca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:53:08 compute-0 nova_compute[187185]: 2025-11-29 06:53:08.594 187189 DEBUG oslo_concurrency.lockutils [req-b0834aba-14d2-45c8-aaee-aa556276fa35 req-cf366764-d53a-4c30-9da7-9fa0530afefb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-942e977f-fd74-45d1-b0be-661b15431eca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:53:08 compute-0 nova_compute[187185]: 2025-11-29 06:53:08.595 187189 DEBUG nova.network.neutron [req-b0834aba-14d2-45c8-aaee-aa556276fa35 req-cf366764-d53a-4c30-9da7-9fa0530afefb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Refreshing network info cache for port 586fc8d7-18ba-4421-a518-60f4d0a6950c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 06:53:09 compute-0 sshd-session[216579]: Received disconnect from 1.214.197.163 port 46010:11: Bye Bye [preauth]
Nov 29 06:53:09 compute-0 sshd-session[216579]: Disconnected from authenticating user root 1.214.197.163 port 46010 [preauth]
Nov 29 06:53:10 compute-0 podman[216600]: 2025-11-29 06:53:10.813025043 +0000 UTC m=+0.079609258 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 29 06:53:11 compute-0 nova_compute[187185]: 2025-11-29 06:53:11.218 187189 DEBUG nova.network.neutron [req-b0834aba-14d2-45c8-aaee-aa556276fa35 req-cf366764-d53a-4c30-9da7-9fa0530afefb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Updated VIF entry in instance network info cache for port 586fc8d7-18ba-4421-a518-60f4d0a6950c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 06:53:11 compute-0 nova_compute[187185]: 2025-11-29 06:53:11.220 187189 DEBUG nova.network.neutron [req-b0834aba-14d2-45c8-aaee-aa556276fa35 req-cf366764-d53a-4c30-9da7-9fa0530afefb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Updating instance_info_cache with network_info: [{"id": "586fc8d7-18ba-4421-a518-60f4d0a6950c", "address": "fa:16:3e:90:18:7a", "network": {"id": "3c63c551-2e9f-4b47-9e49-c73140efe20a", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-555773636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71af3e88884e42c48fb244d7d6ca31e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap586fc8d7-18", "ovs_interfaceid": "586fc8d7-18ba-4421-a518-60f4d0a6950c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:53:11 compute-0 nova_compute[187185]: 2025-11-29 06:53:11.237 187189 DEBUG oslo_concurrency.lockutils [req-b0834aba-14d2-45c8-aaee-aa556276fa35 req-cf366764-d53a-4c30-9da7-9fa0530afefb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-942e977f-fd74-45d1-b0be-661b15431eca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:53:11 compute-0 ovn_controller[95281]: 2025-11-29T06:53:11Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:92:df:75 10.100.0.9
Nov 29 06:53:11 compute-0 ovn_controller[95281]: 2025-11-29T06:53:11Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:92:df:75 10.100.0.9
Nov 29 06:53:12 compute-0 nova_compute[187185]: 2025-11-29 06:53:12.011 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:12 compute-0 nova_compute[187185]: 2025-11-29 06:53:12.595 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:14 compute-0 nova_compute[187185]: 2025-11-29 06:53:14.319 187189 DEBUG nova.compute.manager [req-fa6bde2c-f054-45ed-bc95-62187656b3c7 req-f81e00c1-60cf-4eae-9b03-eeb5237e4ea9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Received event network-changed-586fc8d7-18ba-4421-a518-60f4d0a6950c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:53:14 compute-0 nova_compute[187185]: 2025-11-29 06:53:14.319 187189 DEBUG nova.compute.manager [req-fa6bde2c-f054-45ed-bc95-62187656b3c7 req-f81e00c1-60cf-4eae-9b03-eeb5237e4ea9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Refreshing instance network info cache due to event network-changed-586fc8d7-18ba-4421-a518-60f4d0a6950c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 06:53:14 compute-0 nova_compute[187185]: 2025-11-29 06:53:14.320 187189 DEBUG oslo_concurrency.lockutils [req-fa6bde2c-f054-45ed-bc95-62187656b3c7 req-f81e00c1-60cf-4eae-9b03-eeb5237e4ea9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-942e977f-fd74-45d1-b0be-661b15431eca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:53:14 compute-0 nova_compute[187185]: 2025-11-29 06:53:14.320 187189 DEBUG oslo_concurrency.lockutils [req-fa6bde2c-f054-45ed-bc95-62187656b3c7 req-f81e00c1-60cf-4eae-9b03-eeb5237e4ea9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-942e977f-fd74-45d1-b0be-661b15431eca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:53:14 compute-0 nova_compute[187185]: 2025-11-29 06:53:14.320 187189 DEBUG nova.network.neutron [req-fa6bde2c-f054-45ed-bc95-62187656b3c7 req-f81e00c1-60cf-4eae-9b03-eeb5237e4ea9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Refreshing network info cache for port 586fc8d7-18ba-4421-a518-60f4d0a6950c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 06:53:17 compute-0 nova_compute[187185]: 2025-11-29 06:53:17.016 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:17 compute-0 nova_compute[187185]: 2025-11-29 06:53:17.650 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:17 compute-0 nova_compute[187185]: 2025-11-29 06:53:17.909 187189 DEBUG nova.network.neutron [req-fa6bde2c-f054-45ed-bc95-62187656b3c7 req-f81e00c1-60cf-4eae-9b03-eeb5237e4ea9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Updated VIF entry in instance network info cache for port 586fc8d7-18ba-4421-a518-60f4d0a6950c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 06:53:17 compute-0 nova_compute[187185]: 2025-11-29 06:53:17.910 187189 DEBUG nova.network.neutron [req-fa6bde2c-f054-45ed-bc95-62187656b3c7 req-f81e00c1-60cf-4eae-9b03-eeb5237e4ea9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Updating instance_info_cache with network_info: [{"id": "586fc8d7-18ba-4421-a518-60f4d0a6950c", "address": "fa:16:3e:90:18:7a", "network": {"id": "3c63c551-2e9f-4b47-9e49-c73140efe20a", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-555773636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71af3e88884e42c48fb244d7d6ca31e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap586fc8d7-18", "ovs_interfaceid": "586fc8d7-18ba-4421-a518-60f4d0a6950c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:53:17 compute-0 nova_compute[187185]: 2025-11-29 06:53:17.938 187189 DEBUG oslo_concurrency.lockutils [req-fa6bde2c-f054-45ed-bc95-62187656b3c7 req-f81e00c1-60cf-4eae-9b03-eeb5237e4ea9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-942e977f-fd74-45d1-b0be-661b15431eca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:53:18 compute-0 nova_compute[187185]: 2025-11-29 06:53:18.466 187189 DEBUG nova.objects.instance [None req-cd00d234-cd4a-454e-b8d7-f30dc7823e53 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Lazy-loading 'flavor' on Instance uuid 91bf50da-3c1f-4f88-a67f-21ec183c3812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:53:18 compute-0 nova_compute[187185]: 2025-11-29 06:53:18.496 187189 DEBUG oslo_concurrency.lockutils [None req-cd00d234-cd4a-454e-b8d7-f30dc7823e53 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Acquiring lock "refresh_cache-91bf50da-3c1f-4f88-a67f-21ec183c3812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:53:18 compute-0 nova_compute[187185]: 2025-11-29 06:53:18.497 187189 DEBUG oslo_concurrency.lockutils [None req-cd00d234-cd4a-454e-b8d7-f30dc7823e53 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Acquired lock "refresh_cache-91bf50da-3c1f-4f88-a67f-21ec183c3812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:53:20 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:20.932 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 06:53:20 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:20.934 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 06:53:20 compute-0 nova_compute[187185]: 2025-11-29 06:53:20.949 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:21 compute-0 nova_compute[187185]: 2025-11-29 06:53:21.048 187189 DEBUG nova.network.neutron [None req-cd00d234-cd4a-454e-b8d7-f30dc7823e53 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 06:53:21 compute-0 nova_compute[187185]: 2025-11-29 06:53:21.167 187189 DEBUG nova.compute.manager [req-e1d3631f-7265-4fc6-8ceb-c4e1c1c8e9a5 req-f38090f2-d65b-4f8f-87a3-90a21da9a71c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Received event network-changed-db7d3de5-1042-4ad8-a3e2-d1f59d68f37d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:53:21 compute-0 nova_compute[187185]: 2025-11-29 06:53:21.168 187189 DEBUG nova.compute.manager [req-e1d3631f-7265-4fc6-8ceb-c4e1c1c8e9a5 req-f38090f2-d65b-4f8f-87a3-90a21da9a71c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Refreshing instance network info cache due to event network-changed-db7d3de5-1042-4ad8-a3e2-d1f59d68f37d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 06:53:21 compute-0 nova_compute[187185]: 2025-11-29 06:53:21.169 187189 DEBUG oslo_concurrency.lockutils [req-e1d3631f-7265-4fc6-8ceb-c4e1c1c8e9a5 req-f38090f2-d65b-4f8f-87a3-90a21da9a71c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-91bf50da-3c1f-4f88-a67f-21ec183c3812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:53:21 compute-0 nova_compute[187185]: 2025-11-29 06:53:21.410 187189 DEBUG oslo_concurrency.lockutils [None req-5c94d1ec-6d0b-46e9-86fc-7530d6de8a85 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Acquiring lock "942e977f-fd74-45d1-b0be-661b15431eca" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:53:21 compute-0 nova_compute[187185]: 2025-11-29 06:53:21.411 187189 DEBUG oslo_concurrency.lockutils [None req-5c94d1ec-6d0b-46e9-86fc-7530d6de8a85 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Lock "942e977f-fd74-45d1-b0be-661b15431eca" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:53:21 compute-0 nova_compute[187185]: 2025-11-29 06:53:21.411 187189 DEBUG oslo_concurrency.lockutils [None req-5c94d1ec-6d0b-46e9-86fc-7530d6de8a85 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Acquiring lock "942e977f-fd74-45d1-b0be-661b15431eca-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:53:21 compute-0 nova_compute[187185]: 2025-11-29 06:53:21.412 187189 DEBUG oslo_concurrency.lockutils [None req-5c94d1ec-6d0b-46e9-86fc-7530d6de8a85 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Lock "942e977f-fd74-45d1-b0be-661b15431eca-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:53:21 compute-0 nova_compute[187185]: 2025-11-29 06:53:21.412 187189 DEBUG oslo_concurrency.lockutils [None req-5c94d1ec-6d0b-46e9-86fc-7530d6de8a85 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Lock "942e977f-fd74-45d1-b0be-661b15431eca-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:53:21 compute-0 nova_compute[187185]: 2025-11-29 06:53:21.425 187189 INFO nova.compute.manager [None req-5c94d1ec-6d0b-46e9-86fc-7530d6de8a85 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Terminating instance
Nov 29 06:53:21 compute-0 nova_compute[187185]: 2025-11-29 06:53:21.437 187189 DEBUG nova.compute.manager [None req-5c94d1ec-6d0b-46e9-86fc-7530d6de8a85 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 06:53:21 compute-0 kernel: tap586fc8d7-18 (unregistering): left promiscuous mode
Nov 29 06:53:21 compute-0 NetworkManager[55227]: <info>  [1764399201.5596] device (tap586fc8d7-18): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 06:53:21 compute-0 ovn_controller[95281]: 2025-11-29T06:53:21Z|00063|binding|INFO|Releasing lport 586fc8d7-18ba-4421-a518-60f4d0a6950c from this chassis (sb_readonly=0)
Nov 29 06:53:21 compute-0 ovn_controller[95281]: 2025-11-29T06:53:21Z|00064|binding|INFO|Setting lport 586fc8d7-18ba-4421-a518-60f4d0a6950c down in Southbound
Nov 29 06:53:21 compute-0 nova_compute[187185]: 2025-11-29 06:53:21.575 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:21 compute-0 ovn_controller[95281]: 2025-11-29T06:53:21Z|00065|binding|INFO|Removing iface tap586fc8d7-18 ovn-installed in OVS
Nov 29 06:53:21 compute-0 nova_compute[187185]: 2025-11-29 06:53:21.578 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:21 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:21.601 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:90:18:7a 10.100.0.13'], port_security=['fa:16:3e:90:18:7a 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '942e977f-fd74-45d1-b0be-661b15431eca', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3c63c551-2e9f-4b47-9e49-c73140efe20a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71af3e88884e42c48fb244d7d6ca31e2', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5f562e81-d2bf-4e2c-b0ea-0aa5dfe52d68', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=86b613b3-d246-4b07-a5b7-9ab1b7da74dc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=586fc8d7-18ba-4421-a518-60f4d0a6950c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 06:53:21 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:21.602 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 586fc8d7-18ba-4421-a518-60f4d0a6950c in datapath 3c63c551-2e9f-4b47-9e49-c73140efe20a unbound from our chassis
Nov 29 06:53:21 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:21.604 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3c63c551-2e9f-4b47-9e49-c73140efe20a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 06:53:21 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:21.605 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[7968ef0a-76b7-4074-a989-3085877dd3da]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:53:21 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:21.605 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3c63c551-2e9f-4b47-9e49-c73140efe20a namespace which is not needed anymore
Nov 29 06:53:21 compute-0 nova_compute[187185]: 2025-11-29 06:53:21.605 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:21 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000014.scope: Deactivated successfully.
Nov 29 06:53:21 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000014.scope: Consumed 15.000s CPU time.
Nov 29 06:53:21 compute-0 systemd-machined[153486]: Machine qemu-6-instance-00000014 terminated.
Nov 29 06:53:21 compute-0 nova_compute[187185]: 2025-11-29 06:53:21.665 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:21 compute-0 nova_compute[187185]: 2025-11-29 06:53:21.671 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:21 compute-0 nova_compute[187185]: 2025-11-29 06:53:21.712 187189 INFO nova.virt.libvirt.driver [-] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Instance destroyed successfully.
Nov 29 06:53:21 compute-0 nova_compute[187185]: 2025-11-29 06:53:21.713 187189 DEBUG nova.objects.instance [None req-5c94d1ec-6d0b-46e9-86fc-7530d6de8a85 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Lazy-loading 'resources' on Instance uuid 942e977f-fd74-45d1-b0be-661b15431eca obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:53:21 compute-0 nova_compute[187185]: 2025-11-29 06:53:21.728 187189 DEBUG nova.virt.libvirt.vif [None req-5c94d1ec-6d0b-46e9-86fc-7530d6de8a85 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T06:52:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-1366973524',display_name='tempest-FloatingIPsAssociationTestJSON-server-1366973524',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-1366973524',id=20,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:52:34Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='71af3e88884e42c48fb244d7d6ca31e2',ramdisk_id='',reservation_id='r-tesdhtm5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_v
if_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIPsAssociationTestJSON-940149563',owner_user_name='tempest-FloatingIPsAssociationTestJSON-940149563-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T06:52:35Z,user_data=None,user_id='a0fcd4f4de7e4072be30f7e3d4ac7c77',uuid=942e977f-fd74-45d1-b0be-661b15431eca,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "586fc8d7-18ba-4421-a518-60f4d0a6950c", "address": "fa:16:3e:90:18:7a", "network": {"id": "3c63c551-2e9f-4b47-9e49-c73140efe20a", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-555773636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71af3e88884e42c48fb244d7d6ca31e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap586fc8d7-18", "ovs_interfaceid": "586fc8d7-18ba-4421-a518-60f4d0a6950c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 06:53:21 compute-0 nova_compute[187185]: 2025-11-29 06:53:21.728 187189 DEBUG nova.network.os_vif_util [None req-5c94d1ec-6d0b-46e9-86fc-7530d6de8a85 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Converting VIF {"id": "586fc8d7-18ba-4421-a518-60f4d0a6950c", "address": "fa:16:3e:90:18:7a", "network": {"id": "3c63c551-2e9f-4b47-9e49-c73140efe20a", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-555773636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71af3e88884e42c48fb244d7d6ca31e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap586fc8d7-18", "ovs_interfaceid": "586fc8d7-18ba-4421-a518-60f4d0a6950c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 06:53:21 compute-0 nova_compute[187185]: 2025-11-29 06:53:21.730 187189 DEBUG nova.network.os_vif_util [None req-5c94d1ec-6d0b-46e9-86fc-7530d6de8a85 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:90:18:7a,bridge_name='br-int',has_traffic_filtering=True,id=586fc8d7-18ba-4421-a518-60f4d0a6950c,network=Network(3c63c551-2e9f-4b47-9e49-c73140efe20a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap586fc8d7-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 06:53:21 compute-0 nova_compute[187185]: 2025-11-29 06:53:21.730 187189 DEBUG os_vif [None req-5c94d1ec-6d0b-46e9-86fc-7530d6de8a85 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:90:18:7a,bridge_name='br-int',has_traffic_filtering=True,id=586fc8d7-18ba-4421-a518-60f4d0a6950c,network=Network(3c63c551-2e9f-4b47-9e49-c73140efe20a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap586fc8d7-18') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 06:53:21 compute-0 nova_compute[187185]: 2025-11-29 06:53:21.732 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:21 compute-0 nova_compute[187185]: 2025-11-29 06:53:21.732 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap586fc8d7-18, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:53:21 compute-0 nova_compute[187185]: 2025-11-29 06:53:21.734 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:21 compute-0 nova_compute[187185]: 2025-11-29 06:53:21.735 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:21 compute-0 nova_compute[187185]: 2025-11-29 06:53:21.739 187189 INFO os_vif [None req-5c94d1ec-6d0b-46e9-86fc-7530d6de8a85 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:90:18:7a,bridge_name='br-int',has_traffic_filtering=True,id=586fc8d7-18ba-4421-a518-60f4d0a6950c,network=Network(3c63c551-2e9f-4b47-9e49-c73140efe20a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap586fc8d7-18')
Nov 29 06:53:21 compute-0 nova_compute[187185]: 2025-11-29 06:53:21.740 187189 INFO nova.virt.libvirt.driver [None req-5c94d1ec-6d0b-46e9-86fc-7530d6de8a85 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Deleting instance files /var/lib/nova/instances/942e977f-fd74-45d1-b0be-661b15431eca_del
Nov 29 06:53:21 compute-0 nova_compute[187185]: 2025-11-29 06:53:21.741 187189 INFO nova.virt.libvirt.driver [None req-5c94d1ec-6d0b-46e9-86fc-7530d6de8a85 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Deletion of /var/lib/nova/instances/942e977f-fd74-45d1-b0be-661b15431eca_del complete
Nov 29 06:53:21 compute-0 neutron-haproxy-ovnmeta-3c63c551-2e9f-4b47-9e49-c73140efe20a[216250]: [NOTICE]   (216261) : haproxy version is 2.8.14-c23fe91
Nov 29 06:53:21 compute-0 neutron-haproxy-ovnmeta-3c63c551-2e9f-4b47-9e49-c73140efe20a[216250]: [NOTICE]   (216261) : path to executable is /usr/sbin/haproxy
Nov 29 06:53:21 compute-0 neutron-haproxy-ovnmeta-3c63c551-2e9f-4b47-9e49-c73140efe20a[216250]: [WARNING]  (216261) : Exiting Master process...
Nov 29 06:53:21 compute-0 neutron-haproxy-ovnmeta-3c63c551-2e9f-4b47-9e49-c73140efe20a[216250]: [WARNING]  (216261) : Exiting Master process...
Nov 29 06:53:21 compute-0 neutron-haproxy-ovnmeta-3c63c551-2e9f-4b47-9e49-c73140efe20a[216250]: [ALERT]    (216261) : Current worker (216263) exited with code 143 (Terminated)
Nov 29 06:53:21 compute-0 neutron-haproxy-ovnmeta-3c63c551-2e9f-4b47-9e49-c73140efe20a[216250]: [WARNING]  (216261) : All workers exited. Exiting... (0)
Nov 29 06:53:21 compute-0 systemd[1]: libpod-b814c2b33f28d15a79f7ceae2c7ea208fdacedc32e186c111a63d1751b6fad30.scope: Deactivated successfully.
Nov 29 06:53:21 compute-0 podman[216657]: 2025-11-29 06:53:21.776660163 +0000 UTC m=+0.050175615 container died b814c2b33f28d15a79f7ceae2c7ea208fdacedc32e186c111a63d1751b6fad30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c63c551-2e9f-4b47-9e49-c73140efe20a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 06:53:21 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b814c2b33f28d15a79f7ceae2c7ea208fdacedc32e186c111a63d1751b6fad30-userdata-shm.mount: Deactivated successfully.
Nov 29 06:53:21 compute-0 systemd[1]: var-lib-containers-storage-overlay-499831e352667b04e531c33ccd01d6a5fbf69076b11ff0e78e057770d809c5a2-merged.mount: Deactivated successfully.
Nov 29 06:53:21 compute-0 podman[216657]: 2025-11-29 06:53:21.821133752 +0000 UTC m=+0.094649194 container cleanup b814c2b33f28d15a79f7ceae2c7ea208fdacedc32e186c111a63d1751b6fad30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c63c551-2e9f-4b47-9e49-c73140efe20a, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 29 06:53:21 compute-0 systemd[1]: libpod-conmon-b814c2b33f28d15a79f7ceae2c7ea208fdacedc32e186c111a63d1751b6fad30.scope: Deactivated successfully.
Nov 29 06:53:21 compute-0 podman[216686]: 2025-11-29 06:53:21.898707014 +0000 UTC m=+0.052061474 container remove b814c2b33f28d15a79f7ceae2c7ea208fdacedc32e186c111a63d1751b6fad30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c63c551-2e9f-4b47-9e49-c73140efe20a, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:53:21 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:21.904 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e02f8b43-a8aa-4fa8-ab2d-8f022f440e90]: (4, ('Sat Nov 29 06:53:21 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-3c63c551-2e9f-4b47-9e49-c73140efe20a (b814c2b33f28d15a79f7ceae2c7ea208fdacedc32e186c111a63d1751b6fad30)\nb814c2b33f28d15a79f7ceae2c7ea208fdacedc32e186c111a63d1751b6fad30\nSat Nov 29 06:53:21 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-3c63c551-2e9f-4b47-9e49-c73140efe20a (b814c2b33f28d15a79f7ceae2c7ea208fdacedc32e186c111a63d1751b6fad30)\nb814c2b33f28d15a79f7ceae2c7ea208fdacedc32e186c111a63d1751b6fad30\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:53:21 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:21.908 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[bf0b309f-e8f0-40a5-9cfb-20bf368e2a94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:53:21 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:21.909 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3c63c551-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:53:21 compute-0 nova_compute[187185]: 2025-11-29 06:53:21.911 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:21 compute-0 kernel: tap3c63c551-20: left promiscuous mode
Nov 29 06:53:21 compute-0 nova_compute[187185]: 2025-11-29 06:53:21.926 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:21 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:21.931 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[a0d0fe33-935b-432c-9779-e99e3fe85903]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:53:21 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:21.950 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[fe449ccb-86cd-40b5-b406-ffe2cbc8f397]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:53:21 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:21.952 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6f1b2122-3289-4ea1-917f-24d502c3603a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:53:21 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:21.972 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e1d1c1cb-96e7-49cf-90dd-8307a46af12b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 460149, 'reachable_time': 40796, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216701, 'error': None, 'target': 'ovnmeta-3c63c551-2e9f-4b47-9e49-c73140efe20a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:53:21 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:21.977 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3c63c551-2e9f-4b47-9e49-c73140efe20a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 06:53:21 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:21.978 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[fd62cef5-e321-43dd-ba15-25c6f45831c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:53:21 compute-0 systemd[1]: run-netns-ovnmeta\x2d3c63c551\x2d2e9f\x2d4b47\x2d9e49\x2dc73140efe20a.mount: Deactivated successfully.
Nov 29 06:53:22 compute-0 nova_compute[187185]: 2025-11-29 06:53:22.034 187189 INFO nova.compute.manager [None req-5c94d1ec-6d0b-46e9-86fc-7530d6de8a85 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Took 0.60 seconds to destroy the instance on the hypervisor.
Nov 29 06:53:22 compute-0 nova_compute[187185]: 2025-11-29 06:53:22.036 187189 DEBUG oslo.service.loopingcall [None req-5c94d1ec-6d0b-46e9-86fc-7530d6de8a85 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 06:53:22 compute-0 nova_compute[187185]: 2025-11-29 06:53:22.037 187189 DEBUG nova.compute.manager [-] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 06:53:22 compute-0 nova_compute[187185]: 2025-11-29 06:53:22.037 187189 DEBUG nova.network.neutron [-] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 06:53:22 compute-0 nova_compute[187185]: 2025-11-29 06:53:22.616 187189 DEBUG nova.network.neutron [-] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:53:22 compute-0 nova_compute[187185]: 2025-11-29 06:53:22.633 187189 INFO nova.compute.manager [-] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Took 0.60 seconds to deallocate network for instance.
Nov 29 06:53:22 compute-0 nova_compute[187185]: 2025-11-29 06:53:22.681 187189 DEBUG nova.network.neutron [None req-cd00d234-cd4a-454e-b8d7-f30dc7823e53 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Updating instance_info_cache with network_info: [{"id": "db7d3de5-1042-4ad8-a3e2-d1f59d68f37d", "address": "fa:16:3e:92:df:75", "network": {"id": "fb6261a0-1734-4a19-8eaa-94660a8ddab1", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1526889533-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}, {"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "462ed9e91718488eab9f1fece4b6b34b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb7d3de5-10", "ovs_interfaceid": "db7d3de5-1042-4ad8-a3e2-d1f59d68f37d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:53:22 compute-0 nova_compute[187185]: 2025-11-29 06:53:22.713 187189 DEBUG oslo_concurrency.lockutils [None req-cd00d234-cd4a-454e-b8d7-f30dc7823e53 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Releasing lock "refresh_cache-91bf50da-3c1f-4f88-a67f-21ec183c3812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:53:22 compute-0 nova_compute[187185]: 2025-11-29 06:53:22.714 187189 DEBUG nova.compute.manager [None req-cd00d234-cd4a-454e-b8d7-f30dc7823e53 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Inject network info _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7144
Nov 29 06:53:22 compute-0 nova_compute[187185]: 2025-11-29 06:53:22.714 187189 DEBUG nova.compute.manager [None req-cd00d234-cd4a-454e-b8d7-f30dc7823e53 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] network_info to inject: |[{"id": "db7d3de5-1042-4ad8-a3e2-d1f59d68f37d", "address": "fa:16:3e:92:df:75", "network": {"id": "fb6261a0-1734-4a19-8eaa-94660a8ddab1", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1526889533-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}, {"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "462ed9e91718488eab9f1fece4b6b34b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb7d3de5-10", "ovs_interfaceid": "db7d3de5-1042-4ad8-a3e2-d1f59d68f37d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7145
Nov 29 06:53:22 compute-0 nova_compute[187185]: 2025-11-29 06:53:22.716 187189 DEBUG oslo_concurrency.lockutils [req-e1d3631f-7265-4fc6-8ceb-c4e1c1c8e9a5 req-f38090f2-d65b-4f8f-87a3-90a21da9a71c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-91bf50da-3c1f-4f88-a67f-21ec183c3812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:53:22 compute-0 nova_compute[187185]: 2025-11-29 06:53:22.716 187189 DEBUG nova.network.neutron [req-e1d3631f-7265-4fc6-8ceb-c4e1c1c8e9a5 req-f38090f2-d65b-4f8f-87a3-90a21da9a71c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Refreshing network info cache for port db7d3de5-1042-4ad8-a3e2-d1f59d68f37d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 06:53:22 compute-0 nova_compute[187185]: 2025-11-29 06:53:22.750 187189 DEBUG oslo_concurrency.lockutils [None req-5c94d1ec-6d0b-46e9-86fc-7530d6de8a85 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:53:22 compute-0 nova_compute[187185]: 2025-11-29 06:53:22.751 187189 DEBUG oslo_concurrency.lockutils [None req-5c94d1ec-6d0b-46e9-86fc-7530d6de8a85 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:53:22 compute-0 nova_compute[187185]: 2025-11-29 06:53:22.753 187189 DEBUG nova.compute.manager [req-9c6c3b48-02fa-447f-881f-5beab51252fc req-9761d745-31e9-49da-9eea-1717395c0d57 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Received event network-vif-deleted-586fc8d7-18ba-4421-a518-60f4d0a6950c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:53:22 compute-0 nova_compute[187185]: 2025-11-29 06:53:22.778 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:22 compute-0 nova_compute[187185]: 2025-11-29 06:53:22.806 187189 DEBUG nova.scheduler.client.report [None req-5c94d1ec-6d0b-46e9-86fc-7530d6de8a85 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Refreshing inventories for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 29 06:53:22 compute-0 nova_compute[187185]: 2025-11-29 06:53:22.825 187189 DEBUG nova.scheduler.client.report [None req-5c94d1ec-6d0b-46e9-86fc-7530d6de8a85 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Updating ProviderTree inventory for provider 4e39a026-df39-4e20-874a-dbb5a40df044 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 29 06:53:22 compute-0 nova_compute[187185]: 2025-11-29 06:53:22.825 187189 DEBUG nova.compute.provider_tree [None req-5c94d1ec-6d0b-46e9-86fc-7530d6de8a85 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Updating inventory in ProviderTree for provider 4e39a026-df39-4e20-874a-dbb5a40df044 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 29 06:53:22 compute-0 nova_compute[187185]: 2025-11-29 06:53:22.839 187189 DEBUG nova.scheduler.client.report [None req-5c94d1ec-6d0b-46e9-86fc-7530d6de8a85 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Refreshing aggregate associations for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 29 06:53:22 compute-0 nova_compute[187185]: 2025-11-29 06:53:22.863 187189 DEBUG nova.scheduler.client.report [None req-5c94d1ec-6d0b-46e9-86fc-7530d6de8a85 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Refreshing trait associations for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044, traits: HW_CPU_X86_SSE,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_TRUSTED_CERTS,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 29 06:53:23 compute-0 nova_compute[187185]: 2025-11-29 06:53:23.174 187189 DEBUG nova.compute.provider_tree [None req-5c94d1ec-6d0b-46e9-86fc-7530d6de8a85 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:53:23 compute-0 nova_compute[187185]: 2025-11-29 06:53:23.229 187189 DEBUG nova.scheduler.client.report [None req-5c94d1ec-6d0b-46e9-86fc-7530d6de8a85 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:53:23 compute-0 nova_compute[187185]: 2025-11-29 06:53:23.253 187189 DEBUG oslo_concurrency.lockutils [None req-5c94d1ec-6d0b-46e9-86fc-7530d6de8a85 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.503s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:53:23 compute-0 nova_compute[187185]: 2025-11-29 06:53:23.285 187189 INFO nova.scheduler.client.report [None req-5c94d1ec-6d0b-46e9-86fc-7530d6de8a85 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Deleted allocations for instance 942e977f-fd74-45d1-b0be-661b15431eca
Nov 29 06:53:23 compute-0 nova_compute[187185]: 2025-11-29 06:53:23.316 187189 DEBUG nova.compute.manager [req-0ec098bf-3b01-4000-9ed9-dfce68b03558 req-713bf48f-1fad-4254-a4c1-b4acbba7527f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Received event network-vif-unplugged-586fc8d7-18ba-4421-a518-60f4d0a6950c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:53:23 compute-0 nova_compute[187185]: 2025-11-29 06:53:23.316 187189 DEBUG oslo_concurrency.lockutils [req-0ec098bf-3b01-4000-9ed9-dfce68b03558 req-713bf48f-1fad-4254-a4c1-b4acbba7527f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "942e977f-fd74-45d1-b0be-661b15431eca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:53:23 compute-0 nova_compute[187185]: 2025-11-29 06:53:23.316 187189 DEBUG oslo_concurrency.lockutils [req-0ec098bf-3b01-4000-9ed9-dfce68b03558 req-713bf48f-1fad-4254-a4c1-b4acbba7527f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "942e977f-fd74-45d1-b0be-661b15431eca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:53:23 compute-0 nova_compute[187185]: 2025-11-29 06:53:23.317 187189 DEBUG oslo_concurrency.lockutils [req-0ec098bf-3b01-4000-9ed9-dfce68b03558 req-713bf48f-1fad-4254-a4c1-b4acbba7527f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "942e977f-fd74-45d1-b0be-661b15431eca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:53:23 compute-0 nova_compute[187185]: 2025-11-29 06:53:23.317 187189 DEBUG nova.compute.manager [req-0ec098bf-3b01-4000-9ed9-dfce68b03558 req-713bf48f-1fad-4254-a4c1-b4acbba7527f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] No waiting events found dispatching network-vif-unplugged-586fc8d7-18ba-4421-a518-60f4d0a6950c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 06:53:23 compute-0 nova_compute[187185]: 2025-11-29 06:53:23.317 187189 WARNING nova.compute.manager [req-0ec098bf-3b01-4000-9ed9-dfce68b03558 req-713bf48f-1fad-4254-a4c1-b4acbba7527f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Received unexpected event network-vif-unplugged-586fc8d7-18ba-4421-a518-60f4d0a6950c for instance with vm_state deleted and task_state None.
Nov 29 06:53:23 compute-0 nova_compute[187185]: 2025-11-29 06:53:23.317 187189 DEBUG nova.compute.manager [req-0ec098bf-3b01-4000-9ed9-dfce68b03558 req-713bf48f-1fad-4254-a4c1-b4acbba7527f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Received event network-vif-plugged-586fc8d7-18ba-4421-a518-60f4d0a6950c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:53:23 compute-0 nova_compute[187185]: 2025-11-29 06:53:23.317 187189 DEBUG oslo_concurrency.lockutils [req-0ec098bf-3b01-4000-9ed9-dfce68b03558 req-713bf48f-1fad-4254-a4c1-b4acbba7527f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "942e977f-fd74-45d1-b0be-661b15431eca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:53:23 compute-0 nova_compute[187185]: 2025-11-29 06:53:23.317 187189 DEBUG oslo_concurrency.lockutils [req-0ec098bf-3b01-4000-9ed9-dfce68b03558 req-713bf48f-1fad-4254-a4c1-b4acbba7527f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "942e977f-fd74-45d1-b0be-661b15431eca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:53:23 compute-0 nova_compute[187185]: 2025-11-29 06:53:23.318 187189 DEBUG oslo_concurrency.lockutils [req-0ec098bf-3b01-4000-9ed9-dfce68b03558 req-713bf48f-1fad-4254-a4c1-b4acbba7527f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "942e977f-fd74-45d1-b0be-661b15431eca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:53:23 compute-0 nova_compute[187185]: 2025-11-29 06:53:23.318 187189 DEBUG nova.compute.manager [req-0ec098bf-3b01-4000-9ed9-dfce68b03558 req-713bf48f-1fad-4254-a4c1-b4acbba7527f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] No waiting events found dispatching network-vif-plugged-586fc8d7-18ba-4421-a518-60f4d0a6950c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 06:53:23 compute-0 nova_compute[187185]: 2025-11-29 06:53:23.318 187189 WARNING nova.compute.manager [req-0ec098bf-3b01-4000-9ed9-dfce68b03558 req-713bf48f-1fad-4254-a4c1-b4acbba7527f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Received unexpected event network-vif-plugged-586fc8d7-18ba-4421-a518-60f4d0a6950c for instance with vm_state deleted and task_state None.
Nov 29 06:53:23 compute-0 nova_compute[187185]: 2025-11-29 06:53:23.411 187189 DEBUG oslo_concurrency.lockutils [None req-5c94d1ec-6d0b-46e9-86fc-7530d6de8a85 a0fcd4f4de7e4072be30f7e3d4ac7c77 71af3e88884e42c48fb244d7d6ca31e2 - - default default] Lock "942e977f-fd74-45d1-b0be-661b15431eca" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:53:23 compute-0 podman[216702]: 2025-11-29 06:53:23.827932999 +0000 UTC m=+0.077049919 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Nov 29 06:53:23 compute-0 podman[216703]: 2025-11-29 06:53:23.832276787 +0000 UTC m=+0.083359740 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, version=9.6, architecture=x86_64, io.openshift.tags=minimal rhel9, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.expose-services=, name=ubi9-minimal, vcs-type=git, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 29 06:53:23 compute-0 podman[216704]: 2025-11-29 06:53:23.872603784 +0000 UTC m=+0.111097718 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 06:53:24 compute-0 nova_compute[187185]: 2025-11-29 06:53:24.147 187189 DEBUG nova.objects.instance [None req-3aa76950-357e-422d-a9f1-4d5cf9d3ea25 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Lazy-loading 'flavor' on Instance uuid 91bf50da-3c1f-4f88-a67f-21ec183c3812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:53:24 compute-0 nova_compute[187185]: 2025-11-29 06:53:24.192 187189 DEBUG oslo_concurrency.lockutils [None req-3aa76950-357e-422d-a9f1-4d5cf9d3ea25 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Acquiring lock "refresh_cache-91bf50da-3c1f-4f88-a67f-21ec183c3812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:53:24 compute-0 nova_compute[187185]: 2025-11-29 06:53:24.691 187189 DEBUG nova.network.neutron [req-e1d3631f-7265-4fc6-8ceb-c4e1c1c8e9a5 req-f38090f2-d65b-4f8f-87a3-90a21da9a71c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Updated VIF entry in instance network info cache for port db7d3de5-1042-4ad8-a3e2-d1f59d68f37d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 06:53:24 compute-0 nova_compute[187185]: 2025-11-29 06:53:24.693 187189 DEBUG nova.network.neutron [req-e1d3631f-7265-4fc6-8ceb-c4e1c1c8e9a5 req-f38090f2-d65b-4f8f-87a3-90a21da9a71c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Updating instance_info_cache with network_info: [{"id": "db7d3de5-1042-4ad8-a3e2-d1f59d68f37d", "address": "fa:16:3e:92:df:75", "network": {"id": "fb6261a0-1734-4a19-8eaa-94660a8ddab1", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1526889533-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}, {"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "462ed9e91718488eab9f1fece4b6b34b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb7d3de5-10", "ovs_interfaceid": "db7d3de5-1042-4ad8-a3e2-d1f59d68f37d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:53:24 compute-0 nova_compute[187185]: 2025-11-29 06:53:24.715 187189 DEBUG oslo_concurrency.lockutils [req-e1d3631f-7265-4fc6-8ceb-c4e1c1c8e9a5 req-f38090f2-d65b-4f8f-87a3-90a21da9a71c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-91bf50da-3c1f-4f88-a67f-21ec183c3812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:53:24 compute-0 nova_compute[187185]: 2025-11-29 06:53:24.716 187189 DEBUG oslo_concurrency.lockutils [None req-3aa76950-357e-422d-a9f1-4d5cf9d3ea25 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Acquired lock "refresh_cache-91bf50da-3c1f-4f88-a67f-21ec183c3812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:53:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:24.812 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:53:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:24.813 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:53:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:24.813 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:53:26 compute-0 nova_compute[187185]: 2025-11-29 06:53:26.436 187189 DEBUG nova.network.neutron [None req-3aa76950-357e-422d-a9f1-4d5cf9d3ea25 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 06:53:26 compute-0 nova_compute[187185]: 2025-11-29 06:53:26.623 187189 DEBUG nova.compute.manager [req-04100cc4-47ab-431a-8196-1d433200c00f req-8f0372bb-2628-460d-b8f4-9965104a92d5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Received event network-changed-db7d3de5-1042-4ad8-a3e2-d1f59d68f37d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:53:26 compute-0 nova_compute[187185]: 2025-11-29 06:53:26.623 187189 DEBUG nova.compute.manager [req-04100cc4-47ab-431a-8196-1d433200c00f req-8f0372bb-2628-460d-b8f4-9965104a92d5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Refreshing instance network info cache due to event network-changed-db7d3de5-1042-4ad8-a3e2-d1f59d68f37d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 06:53:26 compute-0 nova_compute[187185]: 2025-11-29 06:53:26.623 187189 DEBUG oslo_concurrency.lockutils [req-04100cc4-47ab-431a-8196-1d433200c00f req-8f0372bb-2628-460d-b8f4-9965104a92d5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-91bf50da-3c1f-4f88-a67f-21ec183c3812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:53:26 compute-0 nova_compute[187185]: 2025-11-29 06:53:26.736 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:27 compute-0 nova_compute[187185]: 2025-11-29 06:53:27.782 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:28 compute-0 ovn_controller[95281]: 2025-11-29T06:53:28Z|00066|binding|INFO|Releasing lport d72b5273-d899-47b6-b2e9-b2f0ad4da789 from this chassis (sb_readonly=0)
Nov 29 06:53:28 compute-0 nova_compute[187185]: 2025-11-29 06:53:28.364 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:28 compute-0 nova_compute[187185]: 2025-11-29 06:53:28.491 187189 DEBUG nova.network.neutron [None req-3aa76950-357e-422d-a9f1-4d5cf9d3ea25 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Updating instance_info_cache with network_info: [{"id": "db7d3de5-1042-4ad8-a3e2-d1f59d68f37d", "address": "fa:16:3e:92:df:75", "network": {"id": "fb6261a0-1734-4a19-8eaa-94660a8ddab1", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1526889533-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "462ed9e91718488eab9f1fece4b6b34b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb7d3de5-10", "ovs_interfaceid": "db7d3de5-1042-4ad8-a3e2-d1f59d68f37d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:53:28 compute-0 nova_compute[187185]: 2025-11-29 06:53:28.515 187189 DEBUG oslo_concurrency.lockutils [None req-3aa76950-357e-422d-a9f1-4d5cf9d3ea25 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Releasing lock "refresh_cache-91bf50da-3c1f-4f88-a67f-21ec183c3812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:53:28 compute-0 nova_compute[187185]: 2025-11-29 06:53:28.515 187189 DEBUG nova.compute.manager [None req-3aa76950-357e-422d-a9f1-4d5cf9d3ea25 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Inject network info _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7144
Nov 29 06:53:28 compute-0 nova_compute[187185]: 2025-11-29 06:53:28.516 187189 DEBUG nova.compute.manager [None req-3aa76950-357e-422d-a9f1-4d5cf9d3ea25 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] network_info to inject: |[{"id": "db7d3de5-1042-4ad8-a3e2-d1f59d68f37d", "address": "fa:16:3e:92:df:75", "network": {"id": "fb6261a0-1734-4a19-8eaa-94660a8ddab1", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1526889533-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "462ed9e91718488eab9f1fece4b6b34b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb7d3de5-10", "ovs_interfaceid": "db7d3de5-1042-4ad8-a3e2-d1f59d68f37d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7145
Nov 29 06:53:28 compute-0 nova_compute[187185]: 2025-11-29 06:53:28.518 187189 DEBUG oslo_concurrency.lockutils [req-04100cc4-47ab-431a-8196-1d433200c00f req-8f0372bb-2628-460d-b8f4-9965104a92d5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-91bf50da-3c1f-4f88-a67f-21ec183c3812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:53:28 compute-0 nova_compute[187185]: 2025-11-29 06:53:28.519 187189 DEBUG nova.network.neutron [req-04100cc4-47ab-431a-8196-1d433200c00f req-8f0372bb-2628-460d-b8f4-9965104a92d5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Refreshing network info cache for port db7d3de5-1042-4ad8-a3e2-d1f59d68f37d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 06:53:29 compute-0 nova_compute[187185]: 2025-11-29 06:53:29.027 187189 DEBUG oslo_concurrency.lockutils [None req-ee8ca00e-8ad0-4484-8b27-68ae2b93dc87 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Acquiring lock "91bf50da-3c1f-4f88-a67f-21ec183c3812" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:53:29 compute-0 nova_compute[187185]: 2025-11-29 06:53:29.029 187189 DEBUG oslo_concurrency.lockutils [None req-ee8ca00e-8ad0-4484-8b27-68ae2b93dc87 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Lock "91bf50da-3c1f-4f88-a67f-21ec183c3812" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:53:29 compute-0 nova_compute[187185]: 2025-11-29 06:53:29.030 187189 DEBUG oslo_concurrency.lockutils [None req-ee8ca00e-8ad0-4484-8b27-68ae2b93dc87 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Acquiring lock "91bf50da-3c1f-4f88-a67f-21ec183c3812-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:53:29 compute-0 nova_compute[187185]: 2025-11-29 06:53:29.030 187189 DEBUG oslo_concurrency.lockutils [None req-ee8ca00e-8ad0-4484-8b27-68ae2b93dc87 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Lock "91bf50da-3c1f-4f88-a67f-21ec183c3812-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:53:29 compute-0 nova_compute[187185]: 2025-11-29 06:53:29.030 187189 DEBUG oslo_concurrency.lockutils [None req-ee8ca00e-8ad0-4484-8b27-68ae2b93dc87 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Lock "91bf50da-3c1f-4f88-a67f-21ec183c3812-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:53:29 compute-0 nova_compute[187185]: 2025-11-29 06:53:29.042 187189 INFO nova.compute.manager [None req-ee8ca00e-8ad0-4484-8b27-68ae2b93dc87 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Terminating instance
Nov 29 06:53:29 compute-0 nova_compute[187185]: 2025-11-29 06:53:29.056 187189 DEBUG nova.compute.manager [None req-ee8ca00e-8ad0-4484-8b27-68ae2b93dc87 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 06:53:29 compute-0 kernel: tapdb7d3de5-10 (unregistering): left promiscuous mode
Nov 29 06:53:29 compute-0 NetworkManager[55227]: <info>  [1764399209.0866] device (tapdb7d3de5-10): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 06:53:29 compute-0 ovn_controller[95281]: 2025-11-29T06:53:29Z|00067|binding|INFO|Releasing lport db7d3de5-1042-4ad8-a3e2-d1f59d68f37d from this chassis (sb_readonly=0)
Nov 29 06:53:29 compute-0 ovn_controller[95281]: 2025-11-29T06:53:29Z|00068|binding|INFO|Setting lport db7d3de5-1042-4ad8-a3e2-d1f59d68f37d down in Southbound
Nov 29 06:53:29 compute-0 ovn_controller[95281]: 2025-11-29T06:53:29Z|00069|binding|INFO|Removing iface tapdb7d3de5-10 ovn-installed in OVS
Nov 29 06:53:29 compute-0 nova_compute[187185]: 2025-11-29 06:53:29.092 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:29 compute-0 nova_compute[187185]: 2025-11-29 06:53:29.096 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:29 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:29.105 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:92:df:75 10.100.0.9'], port_security=['fa:16:3e:92:df:75 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '91bf50da-3c1f-4f88-a67f-21ec183c3812', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fb6261a0-1734-4a19-8eaa-94660a8ddab1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '462ed9e91718488eab9f1fece4b6b34b', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'c22a403f-3305-4d7a-b0bf-8c5bc0b8a78a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.199'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=65e466a2-60ed-4685-8786-88e3e79b87bd, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=db7d3de5-1042-4ad8-a3e2-d1f59d68f37d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 06:53:29 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:29.107 104254 INFO neutron.agent.ovn.metadata.agent [-] Port db7d3de5-1042-4ad8-a3e2-d1f59d68f37d in datapath fb6261a0-1734-4a19-8eaa-94660a8ddab1 unbound from our chassis
Nov 29 06:53:29 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:29.109 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fb6261a0-1734-4a19-8eaa-94660a8ddab1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 06:53:29 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:29.110 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[ae7fcdfc-da80-4328-bfe7-b13d2abd3ad6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:53:29 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:29.110 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fb6261a0-1734-4a19-8eaa-94660a8ddab1 namespace which is not needed anymore
Nov 29 06:53:29 compute-0 nova_compute[187185]: 2025-11-29 06:53:29.112 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:29 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000018.scope: Deactivated successfully.
Nov 29 06:53:29 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000018.scope: Consumed 13.609s CPU time.
Nov 29 06:53:29 compute-0 systemd-machined[153486]: Machine qemu-7-instance-00000018 terminated.
Nov 29 06:53:29 compute-0 nova_compute[187185]: 2025-11-29 06:53:29.293 187189 DEBUG nova.compute.manager [req-40d35de3-769d-4a3e-ba75-4802422d494b req-80ae24dc-f96e-4c0c-b10a-70500afe1a3b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Received event network-vif-unplugged-db7d3de5-1042-4ad8-a3e2-d1f59d68f37d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:53:29 compute-0 nova_compute[187185]: 2025-11-29 06:53:29.294 187189 DEBUG oslo_concurrency.lockutils [req-40d35de3-769d-4a3e-ba75-4802422d494b req-80ae24dc-f96e-4c0c-b10a-70500afe1a3b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "91bf50da-3c1f-4f88-a67f-21ec183c3812-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:53:29 compute-0 nova_compute[187185]: 2025-11-29 06:53:29.294 187189 DEBUG oslo_concurrency.lockutils [req-40d35de3-769d-4a3e-ba75-4802422d494b req-80ae24dc-f96e-4c0c-b10a-70500afe1a3b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "91bf50da-3c1f-4f88-a67f-21ec183c3812-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:53:29 compute-0 nova_compute[187185]: 2025-11-29 06:53:29.295 187189 DEBUG oslo_concurrency.lockutils [req-40d35de3-769d-4a3e-ba75-4802422d494b req-80ae24dc-f96e-4c0c-b10a-70500afe1a3b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "91bf50da-3c1f-4f88-a67f-21ec183c3812-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:53:29 compute-0 nova_compute[187185]: 2025-11-29 06:53:29.295 187189 DEBUG nova.compute.manager [req-40d35de3-769d-4a3e-ba75-4802422d494b req-80ae24dc-f96e-4c0c-b10a-70500afe1a3b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] No waiting events found dispatching network-vif-unplugged-db7d3de5-1042-4ad8-a3e2-d1f59d68f37d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 06:53:29 compute-0 nova_compute[187185]: 2025-11-29 06:53:29.296 187189 DEBUG nova.compute.manager [req-40d35de3-769d-4a3e-ba75-4802422d494b req-80ae24dc-f96e-4c0c-b10a-70500afe1a3b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Received event network-vif-unplugged-db7d3de5-1042-4ad8-a3e2-d1f59d68f37d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 06:53:29 compute-0 nova_compute[187185]: 2025-11-29 06:53:29.326 187189 INFO nova.virt.libvirt.driver [-] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Instance destroyed successfully.
Nov 29 06:53:29 compute-0 nova_compute[187185]: 2025-11-29 06:53:29.326 187189 DEBUG nova.objects.instance [None req-ee8ca00e-8ad0-4484-8b27-68ae2b93dc87 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Lazy-loading 'resources' on Instance uuid 91bf50da-3c1f-4f88-a67f-21ec183c3812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:53:29 compute-0 nova_compute[187185]: 2025-11-29 06:53:29.344 187189 DEBUG nova.virt.libvirt.vif [None req-ee8ca00e-8ad0-4484-8b27-68ae2b93dc87 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T06:52:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesUnderV243Test-server-519568352',display_name='tempest-AttachInterfacesUnderV243Test-server-519568352',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesunderv243test-server-519568352',id=24,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCvDH9qxVoUCEjlJhQsaEAMbNOzskbLtC6DtWT7gJN5DrOZvy7OhqKmDO7brKeHOYs7363P/8xYAF1DlOOdDyjNgDnx8R2KTRFVjZOrU6WuZomi7dZ/t1KQJzhHGDhHODw==',key_name='tempest-keypair-1451011827',keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:52:57Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='462ed9e91718488eab9f1fece4b6b34b',ramdisk_id='',reservation_id='r-z9rgod64',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesUnderV243Test-153400252',owner_user_name='tempest-AttachInterfacesUnderV243Test-153400252-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T06:53:28Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='840f4cdf2bf9409f9b4fd2a7218fcfbb',uuid=91bf50da-3c1f-4f88-a67f-21ec183c3812,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "db7d3de5-1042-4ad8-a3e2-d1f59d68f37d", "address": "fa:16:3e:92:df:75", "network": {"id": "fb6261a0-1734-4a19-8eaa-94660a8ddab1", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1526889533-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "462ed9e91718488eab9f1fece4b6b34b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb7d3de5-10", "ovs_interfaceid": "db7d3de5-1042-4ad8-a3e2-d1f59d68f37d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 06:53:29 compute-0 nova_compute[187185]: 2025-11-29 06:53:29.345 187189 DEBUG nova.network.os_vif_util [None req-ee8ca00e-8ad0-4484-8b27-68ae2b93dc87 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Converting VIF {"id": "db7d3de5-1042-4ad8-a3e2-d1f59d68f37d", "address": "fa:16:3e:92:df:75", "network": {"id": "fb6261a0-1734-4a19-8eaa-94660a8ddab1", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1526889533-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "462ed9e91718488eab9f1fece4b6b34b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb7d3de5-10", "ovs_interfaceid": "db7d3de5-1042-4ad8-a3e2-d1f59d68f37d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 06:53:29 compute-0 nova_compute[187185]: 2025-11-29 06:53:29.345 187189 DEBUG nova.network.os_vif_util [None req-ee8ca00e-8ad0-4484-8b27-68ae2b93dc87 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:92:df:75,bridge_name='br-int',has_traffic_filtering=True,id=db7d3de5-1042-4ad8-a3e2-d1f59d68f37d,network=Network(fb6261a0-1734-4a19-8eaa-94660a8ddab1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb7d3de5-10') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 06:53:29 compute-0 nova_compute[187185]: 2025-11-29 06:53:29.346 187189 DEBUG os_vif [None req-ee8ca00e-8ad0-4484-8b27-68ae2b93dc87 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:92:df:75,bridge_name='br-int',has_traffic_filtering=True,id=db7d3de5-1042-4ad8-a3e2-d1f59d68f37d,network=Network(fb6261a0-1734-4a19-8eaa-94660a8ddab1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb7d3de5-10') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 06:53:29 compute-0 nova_compute[187185]: 2025-11-29 06:53:29.348 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:29 compute-0 nova_compute[187185]: 2025-11-29 06:53:29.348 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdb7d3de5-10, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:53:29 compute-0 nova_compute[187185]: 2025-11-29 06:53:29.350 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:29 compute-0 nova_compute[187185]: 2025-11-29 06:53:29.351 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:29 compute-0 nova_compute[187185]: 2025-11-29 06:53:29.354 187189 INFO os_vif [None req-ee8ca00e-8ad0-4484-8b27-68ae2b93dc87 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:92:df:75,bridge_name='br-int',has_traffic_filtering=True,id=db7d3de5-1042-4ad8-a3e2-d1f59d68f37d,network=Network(fb6261a0-1734-4a19-8eaa-94660a8ddab1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb7d3de5-10')
Nov 29 06:53:29 compute-0 nova_compute[187185]: 2025-11-29 06:53:29.355 187189 INFO nova.virt.libvirt.driver [None req-ee8ca00e-8ad0-4484-8b27-68ae2b93dc87 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Deleting instance files /var/lib/nova/instances/91bf50da-3c1f-4f88-a67f-21ec183c3812_del
Nov 29 06:53:29 compute-0 nova_compute[187185]: 2025-11-29 06:53:29.356 187189 INFO nova.virt.libvirt.driver [None req-ee8ca00e-8ad0-4484-8b27-68ae2b93dc87 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Deletion of /var/lib/nova/instances/91bf50da-3c1f-4f88-a67f-21ec183c3812_del complete
Nov 29 06:53:29 compute-0 neutron-haproxy-ovnmeta-fb6261a0-1734-4a19-8eaa-94660a8ddab1[216493]: [NOTICE]   (216497) : haproxy version is 2.8.14-c23fe91
Nov 29 06:53:29 compute-0 neutron-haproxy-ovnmeta-fb6261a0-1734-4a19-8eaa-94660a8ddab1[216493]: [NOTICE]   (216497) : path to executable is /usr/sbin/haproxy
Nov 29 06:53:29 compute-0 neutron-haproxy-ovnmeta-fb6261a0-1734-4a19-8eaa-94660a8ddab1[216493]: [WARNING]  (216497) : Exiting Master process...
Nov 29 06:53:29 compute-0 neutron-haproxy-ovnmeta-fb6261a0-1734-4a19-8eaa-94660a8ddab1[216493]: [ALERT]    (216497) : Current worker (216499) exited with code 143 (Terminated)
Nov 29 06:53:29 compute-0 neutron-haproxy-ovnmeta-fb6261a0-1734-4a19-8eaa-94660a8ddab1[216493]: [WARNING]  (216497) : All workers exited. Exiting... (0)
Nov 29 06:53:29 compute-0 systemd[1]: libpod-e22e0c7565c6b7fc577b9e2fab3682cdd3ff1f074f042b0f7643d39e5e7a394c.scope: Deactivated successfully.
Nov 29 06:53:29 compute-0 podman[216789]: 2025-11-29 06:53:29.370240223 +0000 UTC m=+0.159018211 container died e22e0c7565c6b7fc577b9e2fab3682cdd3ff1f074f042b0f7643d39e5e7a394c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fb6261a0-1734-4a19-8eaa-94660a8ddab1, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 06:53:29 compute-0 nova_compute[187185]: 2025-11-29 06:53:29.433 187189 INFO nova.compute.manager [None req-ee8ca00e-8ad0-4484-8b27-68ae2b93dc87 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Took 0.38 seconds to destroy the instance on the hypervisor.
Nov 29 06:53:29 compute-0 nova_compute[187185]: 2025-11-29 06:53:29.434 187189 DEBUG oslo.service.loopingcall [None req-ee8ca00e-8ad0-4484-8b27-68ae2b93dc87 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 06:53:29 compute-0 nova_compute[187185]: 2025-11-29 06:53:29.435 187189 DEBUG nova.compute.manager [-] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 06:53:29 compute-0 nova_compute[187185]: 2025-11-29 06:53:29.435 187189 DEBUG nova.network.neutron [-] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 06:53:29 compute-0 nova_compute[187185]: 2025-11-29 06:53:29.818 187189 DEBUG nova.network.neutron [req-04100cc4-47ab-431a-8196-1d433200c00f req-8f0372bb-2628-460d-b8f4-9965104a92d5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Updated VIF entry in instance network info cache for port db7d3de5-1042-4ad8-a3e2-d1f59d68f37d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 06:53:29 compute-0 nova_compute[187185]: 2025-11-29 06:53:29.819 187189 DEBUG nova.network.neutron [req-04100cc4-47ab-431a-8196-1d433200c00f req-8f0372bb-2628-460d-b8f4-9965104a92d5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Updating instance_info_cache with network_info: [{"id": "db7d3de5-1042-4ad8-a3e2-d1f59d68f37d", "address": "fa:16:3e:92:df:75", "network": {"id": "fb6261a0-1734-4a19-8eaa-94660a8ddab1", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1526889533-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "462ed9e91718488eab9f1fece4b6b34b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb7d3de5-10", "ovs_interfaceid": "db7d3de5-1042-4ad8-a3e2-d1f59d68f37d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:53:29 compute-0 nova_compute[187185]: 2025-11-29 06:53:29.838 187189 DEBUG oslo_concurrency.lockutils [req-04100cc4-47ab-431a-8196-1d433200c00f req-8f0372bb-2628-460d-b8f4-9965104a92d5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-91bf50da-3c1f-4f88-a67f-21ec183c3812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:53:29 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:29.935 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:53:30 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e22e0c7565c6b7fc577b9e2fab3682cdd3ff1f074f042b0f7643d39e5e7a394c-userdata-shm.mount: Deactivated successfully.
Nov 29 06:53:30 compute-0 systemd[1]: var-lib-containers-storage-overlay-d88424dbe4b767114ba09b5ee20405388066c623725d6c8c3a0ef92ec39c70a4-merged.mount: Deactivated successfully.
Nov 29 06:53:30 compute-0 podman[216789]: 2025-11-29 06:53:30.30794324 +0000 UTC m=+1.096721278 container cleanup e22e0c7565c6b7fc577b9e2fab3682cdd3ff1f074f042b0f7643d39e5e7a394c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fb6261a0-1734-4a19-8eaa-94660a8ddab1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 29 06:53:30 compute-0 podman[216834]: 2025-11-29 06:53:30.315510924 +0000 UTC m=+0.396519898 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 06:53:30 compute-0 systemd[1]: libpod-conmon-e22e0c7565c6b7fc577b9e2fab3682cdd3ff1f074f042b0f7643d39e5e7a394c.scope: Deactivated successfully.
Nov 29 06:53:30 compute-0 nova_compute[187185]: 2025-11-29 06:53:30.643 187189 DEBUG nova.network.neutron [-] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:53:30 compute-0 nova_compute[187185]: 2025-11-29 06:53:30.666 187189 INFO nova.compute.manager [-] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Took 1.23 seconds to deallocate network for instance.
Nov 29 06:53:30 compute-0 nova_compute[187185]: 2025-11-29 06:53:30.747 187189 DEBUG oslo_concurrency.lockutils [None req-ee8ca00e-8ad0-4484-8b27-68ae2b93dc87 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:53:30 compute-0 nova_compute[187185]: 2025-11-29 06:53:30.748 187189 DEBUG oslo_concurrency.lockutils [None req-ee8ca00e-8ad0-4484-8b27-68ae2b93dc87 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:53:30 compute-0 nova_compute[187185]: 2025-11-29 06:53:30.766 187189 DEBUG nova.compute.manager [req-85594a79-421a-4f43-8e6c-6a00bc034d1b req-7da51007-dd6a-462e-8ffa-17a2aca784f1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Received event network-vif-deleted-db7d3de5-1042-4ad8-a3e2-d1f59d68f37d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:53:30 compute-0 nova_compute[187185]: 2025-11-29 06:53:30.830 187189 DEBUG nova.compute.provider_tree [None req-ee8ca00e-8ad0-4484-8b27-68ae2b93dc87 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:53:30 compute-0 nova_compute[187185]: 2025-11-29 06:53:30.849 187189 DEBUG nova.scheduler.client.report [None req-ee8ca00e-8ad0-4484-8b27-68ae2b93dc87 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:53:30 compute-0 nova_compute[187185]: 2025-11-29 06:53:30.868 187189 DEBUG oslo_concurrency.lockutils [None req-ee8ca00e-8ad0-4484-8b27-68ae2b93dc87 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:53:30 compute-0 podman[216863]: 2025-11-29 06:53:30.88845688 +0000 UTC m=+0.541096868 container remove e22e0c7565c6b7fc577b9e2fab3682cdd3ff1f074f042b0f7643d39e5e7a394c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fb6261a0-1734-4a19-8eaa-94660a8ddab1, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:53:30 compute-0 nova_compute[187185]: 2025-11-29 06:53:30.895 187189 INFO nova.scheduler.client.report [None req-ee8ca00e-8ad0-4484-8b27-68ae2b93dc87 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Deleted allocations for instance 91bf50da-3c1f-4f88-a67f-21ec183c3812
Nov 29 06:53:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:30.898 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[ff4bf439-9cef-451b-9aac-a3ee7315bf18]: (4, ('Sat Nov 29 06:53:29 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-fb6261a0-1734-4a19-8eaa-94660a8ddab1 (e22e0c7565c6b7fc577b9e2fab3682cdd3ff1f074f042b0f7643d39e5e7a394c)\ne22e0c7565c6b7fc577b9e2fab3682cdd3ff1f074f042b0f7643d39e5e7a394c\nSat Nov 29 06:53:30 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-fb6261a0-1734-4a19-8eaa-94660a8ddab1 (e22e0c7565c6b7fc577b9e2fab3682cdd3ff1f074f042b0f7643d39e5e7a394c)\ne22e0c7565c6b7fc577b9e2fab3682cdd3ff1f074f042b0f7643d39e5e7a394c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:53:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:30.901 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[fdb22eba-99e2-459a-8e4e-bd633012e25b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:53:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:30.902 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfb6261a0-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:53:30 compute-0 nova_compute[187185]: 2025-11-29 06:53:30.904 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:30 compute-0 kernel: tapfb6261a0-10: left promiscuous mode
Nov 29 06:53:30 compute-0 nova_compute[187185]: 2025-11-29 06:53:30.917 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:30.922 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[943988ea-37ab-4c8e-9816-cafc0da61c6b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:53:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:30.945 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[03d3de5f-97f6-4397-a540-e4195f0e2b17]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:53:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:30.947 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[528857cf-b6db-4d1f-bae9-bb576b841879]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:53:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:30.964 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[96a0b744-53a8-42f5-8aca-a2adb8bf5e36]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 462054, 'reachable_time': 23300, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216880, 'error': None, 'target': 'ovnmeta-fb6261a0-1734-4a19-8eaa-94660a8ddab1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:53:30 compute-0 systemd[1]: run-netns-ovnmeta\x2dfb6261a0\x2d1734\x2d4a19\x2d8eaa\x2d94660a8ddab1.mount: Deactivated successfully.
Nov 29 06:53:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:30.968 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fb6261a0-1734-4a19-8eaa-94660a8ddab1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 06:53:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:30.969 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[82630023-5474-4b27-ae53-abce8aff42dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:53:30 compute-0 nova_compute[187185]: 2025-11-29 06:53:30.997 187189 DEBUG oslo_concurrency.lockutils [None req-ee8ca00e-8ad0-4484-8b27-68ae2b93dc87 840f4cdf2bf9409f9b4fd2a7218fcfbb 462ed9e91718488eab9f1fece4b6b34b - - default default] Lock "91bf50da-3c1f-4f88-a67f-21ec183c3812" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.968s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:53:31 compute-0 nova_compute[187185]: 2025-11-29 06:53:31.525 187189 DEBUG nova.compute.manager [req-a0f22198-d982-411e-925e-3dbc1939d416 req-c85546ef-a020-457c-abe2-801e80d92dc1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Received event network-vif-plugged-db7d3de5-1042-4ad8-a3e2-d1f59d68f37d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:53:31 compute-0 nova_compute[187185]: 2025-11-29 06:53:31.525 187189 DEBUG oslo_concurrency.lockutils [req-a0f22198-d982-411e-925e-3dbc1939d416 req-c85546ef-a020-457c-abe2-801e80d92dc1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "91bf50da-3c1f-4f88-a67f-21ec183c3812-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:53:31 compute-0 nova_compute[187185]: 2025-11-29 06:53:31.526 187189 DEBUG oslo_concurrency.lockutils [req-a0f22198-d982-411e-925e-3dbc1939d416 req-c85546ef-a020-457c-abe2-801e80d92dc1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "91bf50da-3c1f-4f88-a67f-21ec183c3812-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:53:31 compute-0 nova_compute[187185]: 2025-11-29 06:53:31.526 187189 DEBUG oslo_concurrency.lockutils [req-a0f22198-d982-411e-925e-3dbc1939d416 req-c85546ef-a020-457c-abe2-801e80d92dc1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "91bf50da-3c1f-4f88-a67f-21ec183c3812-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:53:31 compute-0 nova_compute[187185]: 2025-11-29 06:53:31.527 187189 DEBUG nova.compute.manager [req-a0f22198-d982-411e-925e-3dbc1939d416 req-c85546ef-a020-457c-abe2-801e80d92dc1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] No waiting events found dispatching network-vif-plugged-db7d3de5-1042-4ad8-a3e2-d1f59d68f37d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 06:53:31 compute-0 nova_compute[187185]: 2025-11-29 06:53:31.527 187189 WARNING nova.compute.manager [req-a0f22198-d982-411e-925e-3dbc1939d416 req-c85546ef-a020-457c-abe2-801e80d92dc1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Received unexpected event network-vif-plugged-db7d3de5-1042-4ad8-a3e2-d1f59d68f37d for instance with vm_state deleted and task_state None.
Nov 29 06:53:32 compute-0 nova_compute[187185]: 2025-11-29 06:53:32.783 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:32 compute-0 podman[216882]: 2025-11-29 06:53:32.812861554 +0000 UTC m=+0.067269726 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 06:53:32 compute-0 podman[216881]: 2025-11-29 06:53:32.836919623 +0000 UTC m=+0.089359142 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 06:53:34 compute-0 nova_compute[187185]: 2025-11-29 06:53:34.393 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:35 compute-0 nova_compute[187185]: 2025-11-29 06:53:35.590 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:36 compute-0 nova_compute[187185]: 2025-11-29 06:53:36.473 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:36 compute-0 nova_compute[187185]: 2025-11-29 06:53:36.699 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:36 compute-0 nova_compute[187185]: 2025-11-29 06:53:36.711 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399201.7096746, 942e977f-fd74-45d1-b0be-661b15431eca => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:53:36 compute-0 nova_compute[187185]: 2025-11-29 06:53:36.711 187189 INFO nova.compute.manager [-] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] VM Stopped (Lifecycle Event)
Nov 29 06:53:36 compute-0 nova_compute[187185]: 2025-11-29 06:53:36.731 187189 DEBUG nova.compute.manager [None req-263cb103-462e-447b-8b15-dabd0cec9b31 - - - - - -] [instance: 942e977f-fd74-45d1-b0be-661b15431eca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:53:37 compute-0 nova_compute[187185]: 2025-11-29 06:53:37.818 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:39 compute-0 nova_compute[187185]: 2025-11-29 06:53:39.396 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:41 compute-0 podman[216927]: 2025-11-29 06:53:41.786775021 +0000 UTC m=+0.053151355 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible)
Nov 29 06:53:41 compute-0 nova_compute[187185]: 2025-11-29 06:53:41.879 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:53:41 compute-0 nova_compute[187185]: 2025-11-29 06:53:41.879 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 06:53:41 compute-0 nova_compute[187185]: 2025-11-29 06:53:41.879 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 06:53:41 compute-0 nova_compute[187185]: 2025-11-29 06:53:41.894 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 06:53:41 compute-0 nova_compute[187185]: 2025-11-29 06:53:41.895 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:53:42 compute-0 nova_compute[187185]: 2025-11-29 06:53:42.820 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:43 compute-0 nova_compute[187185]: 2025-11-29 06:53:43.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:53:43 compute-0 nova_compute[187185]: 2025-11-29 06:53:43.339 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:53:44 compute-0 nova_compute[187185]: 2025-11-29 06:53:44.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:53:44 compute-0 nova_compute[187185]: 2025-11-29 06:53:44.324 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399209.3230748, 91bf50da-3c1f-4f88-a67f-21ec183c3812 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:53:44 compute-0 nova_compute[187185]: 2025-11-29 06:53:44.324 187189 INFO nova.compute.manager [-] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] VM Stopped (Lifecycle Event)
Nov 29 06:53:44 compute-0 nova_compute[187185]: 2025-11-29 06:53:44.346 187189 DEBUG nova.compute.manager [None req-b33caefa-c828-4a25-9bbb-40e4ee3b9df4 - - - - - -] [instance: 91bf50da-3c1f-4f88-a67f-21ec183c3812] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:53:44 compute-0 nova_compute[187185]: 2025-11-29 06:53:44.400 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:45 compute-0 nova_compute[187185]: 2025-11-29 06:53:45.311 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:53:45 compute-0 nova_compute[187185]: 2025-11-29 06:53:45.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:53:45 compute-0 nova_compute[187185]: 2025-11-29 06:53:45.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:53:45 compute-0 nova_compute[187185]: 2025-11-29 06:53:45.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 06:53:45 compute-0 nova_compute[187185]: 2025-11-29 06:53:45.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:53:45 compute-0 nova_compute[187185]: 2025-11-29 06:53:45.343 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:53:45 compute-0 nova_compute[187185]: 2025-11-29 06:53:45.344 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:53:45 compute-0 nova_compute[187185]: 2025-11-29 06:53:45.344 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:53:45 compute-0 nova_compute[187185]: 2025-11-29 06:53:45.344 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 06:53:45 compute-0 nova_compute[187185]: 2025-11-29 06:53:45.539 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:53:45 compute-0 nova_compute[187185]: 2025-11-29 06:53:45.540 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5777MB free_disk=73.33890914916992GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 06:53:45 compute-0 nova_compute[187185]: 2025-11-29 06:53:45.540 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:53:45 compute-0 nova_compute[187185]: 2025-11-29 06:53:45.541 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:53:45 compute-0 nova_compute[187185]: 2025-11-29 06:53:45.630 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 06:53:45 compute-0 nova_compute[187185]: 2025-11-29 06:53:45.631 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 06:53:45 compute-0 nova_compute[187185]: 2025-11-29 06:53:45.661 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:53:45 compute-0 nova_compute[187185]: 2025-11-29 06:53:45.675 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:53:45 compute-0 nova_compute[187185]: 2025-11-29 06:53:45.704 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 06:53:45 compute-0 nova_compute[187185]: 2025-11-29 06:53:45.705 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.164s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:53:46 compute-0 nova_compute[187185]: 2025-11-29 06:53:46.706 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:53:47 compute-0 nova_compute[187185]: 2025-11-29 06:53:47.824 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:49 compute-0 nova_compute[187185]: 2025-11-29 06:53:49.405 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:50 compute-0 nova_compute[187185]: 2025-11-29 06:53:50.790 187189 DEBUG oslo_concurrency.lockutils [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "0e798fb6-eb0e-40cc-a4ce-b4ff86357e92" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:53:50 compute-0 nova_compute[187185]: 2025-11-29 06:53:50.791 187189 DEBUG oslo_concurrency.lockutils [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "0e798fb6-eb0e-40cc-a4ce-b4ff86357e92" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:53:50 compute-0 nova_compute[187185]: 2025-11-29 06:53:50.818 187189 DEBUG nova.compute.manager [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 06:53:50 compute-0 nova_compute[187185]: 2025-11-29 06:53:50.942 187189 DEBUG oslo_concurrency.lockutils [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:53:50 compute-0 nova_compute[187185]: 2025-11-29 06:53:50.943 187189 DEBUG oslo_concurrency.lockutils [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:53:50 compute-0 nova_compute[187185]: 2025-11-29 06:53:50.951 187189 DEBUG nova.virt.hardware [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 06:53:50 compute-0 nova_compute[187185]: 2025-11-29 06:53:50.952 187189 INFO nova.compute.claims [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Claim successful on node compute-0.ctlplane.example.com
Nov 29 06:53:51 compute-0 nova_compute[187185]: 2025-11-29 06:53:51.085 187189 DEBUG nova.compute.provider_tree [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:53:51 compute-0 nova_compute[187185]: 2025-11-29 06:53:51.106 187189 DEBUG nova.scheduler.client.report [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:53:51 compute-0 nova_compute[187185]: 2025-11-29 06:53:51.135 187189 DEBUG oslo_concurrency.lockutils [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.192s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:53:51 compute-0 nova_compute[187185]: 2025-11-29 06:53:51.136 187189 DEBUG nova.compute.manager [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 06:53:51 compute-0 nova_compute[187185]: 2025-11-29 06:53:51.210 187189 DEBUG nova.compute.manager [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 06:53:51 compute-0 nova_compute[187185]: 2025-11-29 06:53:51.211 187189 DEBUG nova.network.neutron [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 06:53:51 compute-0 nova_compute[187185]: 2025-11-29 06:53:51.242 187189 INFO nova.virt.libvirt.driver [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 06:53:51 compute-0 nova_compute[187185]: 2025-11-29 06:53:51.266 187189 DEBUG nova.compute.manager [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 06:53:51 compute-0 nova_compute[187185]: 2025-11-29 06:53:51.404 187189 DEBUG nova.compute.manager [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 06:53:51 compute-0 nova_compute[187185]: 2025-11-29 06:53:51.405 187189 DEBUG nova.virt.libvirt.driver [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 06:53:51 compute-0 nova_compute[187185]: 2025-11-29 06:53:51.406 187189 INFO nova.virt.libvirt.driver [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Creating image(s)
Nov 29 06:53:51 compute-0 nova_compute[187185]: 2025-11-29 06:53:51.406 187189 DEBUG oslo_concurrency.lockutils [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "/var/lib/nova/instances/0e798fb6-eb0e-40cc-a4ce-b4ff86357e92/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:53:51 compute-0 nova_compute[187185]: 2025-11-29 06:53:51.407 187189 DEBUG oslo_concurrency.lockutils [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "/var/lib/nova/instances/0e798fb6-eb0e-40cc-a4ce-b4ff86357e92/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:53:51 compute-0 nova_compute[187185]: 2025-11-29 06:53:51.407 187189 DEBUG oslo_concurrency.lockutils [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "/var/lib/nova/instances/0e798fb6-eb0e-40cc-a4ce-b4ff86357e92/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:53:51 compute-0 nova_compute[187185]: 2025-11-29 06:53:51.419 187189 DEBUG nova.policy [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 06:53:51 compute-0 nova_compute[187185]: 2025-11-29 06:53:51.422 187189 DEBUG oslo_concurrency.processutils [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:53:51 compute-0 nova_compute[187185]: 2025-11-29 06:53:51.486 187189 DEBUG oslo_concurrency.processutils [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:53:51 compute-0 nova_compute[187185]: 2025-11-29 06:53:51.487 187189 DEBUG oslo_concurrency.lockutils [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:53:51 compute-0 nova_compute[187185]: 2025-11-29 06:53:51.489 187189 DEBUG oslo_concurrency.lockutils [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:53:51 compute-0 nova_compute[187185]: 2025-11-29 06:53:51.501 187189 DEBUG oslo_concurrency.processutils [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:53:51 compute-0 nova_compute[187185]: 2025-11-29 06:53:51.570 187189 DEBUG oslo_concurrency.processutils [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:53:51 compute-0 nova_compute[187185]: 2025-11-29 06:53:51.572 187189 DEBUG oslo_concurrency.processutils [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/0e798fb6-eb0e-40cc-a4ce-b4ff86357e92/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:53:51 compute-0 nova_compute[187185]: 2025-11-29 06:53:51.617 187189 DEBUG oslo_concurrency.processutils [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/0e798fb6-eb0e-40cc-a4ce-b4ff86357e92/disk 1073741824" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:53:51 compute-0 nova_compute[187185]: 2025-11-29 06:53:51.619 187189 DEBUG oslo_concurrency.lockutils [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:53:51 compute-0 nova_compute[187185]: 2025-11-29 06:53:51.619 187189 DEBUG oslo_concurrency.processutils [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:53:51 compute-0 nova_compute[187185]: 2025-11-29 06:53:51.706 187189 DEBUG oslo_concurrency.processutils [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:53:51 compute-0 nova_compute[187185]: 2025-11-29 06:53:51.709 187189 DEBUG nova.virt.disk.api [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Checking if we can resize image /var/lib/nova/instances/0e798fb6-eb0e-40cc-a4ce-b4ff86357e92/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 06:53:51 compute-0 nova_compute[187185]: 2025-11-29 06:53:51.709 187189 DEBUG oslo_concurrency.processutils [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0e798fb6-eb0e-40cc-a4ce-b4ff86357e92/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:53:51 compute-0 nova_compute[187185]: 2025-11-29 06:53:51.774 187189 DEBUG oslo_concurrency.processutils [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0e798fb6-eb0e-40cc-a4ce-b4ff86357e92/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:53:51 compute-0 nova_compute[187185]: 2025-11-29 06:53:51.775 187189 DEBUG nova.virt.disk.api [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Cannot resize image /var/lib/nova/instances/0e798fb6-eb0e-40cc-a4ce-b4ff86357e92/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 06:53:51 compute-0 nova_compute[187185]: 2025-11-29 06:53:51.776 187189 DEBUG nova.objects.instance [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lazy-loading 'migration_context' on Instance uuid 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:53:51 compute-0 nova_compute[187185]: 2025-11-29 06:53:51.792 187189 DEBUG nova.virt.libvirt.driver [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 06:53:51 compute-0 nova_compute[187185]: 2025-11-29 06:53:51.793 187189 DEBUG nova.virt.libvirt.driver [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Ensure instance console log exists: /var/lib/nova/instances/0e798fb6-eb0e-40cc-a4ce-b4ff86357e92/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 06:53:51 compute-0 nova_compute[187185]: 2025-11-29 06:53:51.794 187189 DEBUG oslo_concurrency.lockutils [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:53:51 compute-0 nova_compute[187185]: 2025-11-29 06:53:51.794 187189 DEBUG oslo_concurrency.lockutils [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:53:51 compute-0 nova_compute[187185]: 2025-11-29 06:53:51.795 187189 DEBUG oslo_concurrency.lockutils [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:53:52 compute-0 nova_compute[187185]: 2025-11-29 06:53:52.827 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:53 compute-0 nova_compute[187185]: 2025-11-29 06:53:53.045 187189 DEBUG nova.network.neutron [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Successfully created port: d40e2418-d6ae-4f44-9f23-f141b0ab11a6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 06:53:54 compute-0 nova_compute[187185]: 2025-11-29 06:53:54.062 187189 DEBUG nova.network.neutron [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Successfully updated port: d40e2418-d6ae-4f44-9f23-f141b0ab11a6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 06:53:54 compute-0 nova_compute[187185]: 2025-11-29 06:53:54.085 187189 DEBUG oslo_concurrency.lockutils [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "refresh_cache-0e798fb6-eb0e-40cc-a4ce-b4ff86357e92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:53:54 compute-0 nova_compute[187185]: 2025-11-29 06:53:54.085 187189 DEBUG oslo_concurrency.lockutils [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquired lock "refresh_cache-0e798fb6-eb0e-40cc-a4ce-b4ff86357e92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:53:54 compute-0 nova_compute[187185]: 2025-11-29 06:53:54.086 187189 DEBUG nova.network.neutron [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 06:53:54 compute-0 sshd-session[216964]: Invalid user tmp from 160.202.8.218 port 57924
Nov 29 06:53:54 compute-0 podman[216967]: 2025-11-29 06:53:54.231390856 +0000 UTC m=+0.096706680 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., name=ubi9-minimal, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_id=edpm, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 29 06:53:54 compute-0 podman[216968]: 2025-11-29 06:53:54.231375035 +0000 UTC m=+0.079038353 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 06:53:54 compute-0 podman[216966]: 2025-11-29 06:53:54.259050612 +0000 UTC m=+0.123126883 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:53:54 compute-0 nova_compute[187185]: 2025-11-29 06:53:54.260 187189 DEBUG nova.compute.manager [req-71b2947a-c1f4-4ae4-8c1b-21dac7880fce req-b46c2135-db42-4f93-93aa-3ae155051af6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Received event network-changed-d40e2418-d6ae-4f44-9f23-f141b0ab11a6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:53:54 compute-0 nova_compute[187185]: 2025-11-29 06:53:54.261 187189 DEBUG nova.compute.manager [req-71b2947a-c1f4-4ae4-8c1b-21dac7880fce req-b46c2135-db42-4f93-93aa-3ae155051af6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Refreshing instance network info cache due to event network-changed-d40e2418-d6ae-4f44-9f23-f141b0ab11a6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 06:53:54 compute-0 nova_compute[187185]: 2025-11-29 06:53:54.261 187189 DEBUG oslo_concurrency.lockutils [req-71b2947a-c1f4-4ae4-8c1b-21dac7880fce req-b46c2135-db42-4f93-93aa-3ae155051af6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-0e798fb6-eb0e-40cc-a4ce-b4ff86357e92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:53:54 compute-0 nova_compute[187185]: 2025-11-29 06:53:54.294 187189 DEBUG nova.network.neutron [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 06:53:54 compute-0 sshd-session[216964]: Received disconnect from 160.202.8.218 port 57924:11: Bye Bye [preauth]
Nov 29 06:53:54 compute-0 sshd-session[216964]: Disconnected from invalid user tmp 160.202.8.218 port 57924 [preauth]
Nov 29 06:53:54 compute-0 nova_compute[187185]: 2025-11-29 06:53:54.408 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.641 187189 DEBUG nova.network.neutron [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Updating instance_info_cache with network_info: [{"id": "d40e2418-d6ae-4f44-9f23-f141b0ab11a6", "address": "fa:16:3e:dd:6e:1f", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd40e2418-d6", "ovs_interfaceid": "d40e2418-d6ae-4f44-9f23-f141b0ab11a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.693 187189 DEBUG oslo_concurrency.lockutils [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Releasing lock "refresh_cache-0e798fb6-eb0e-40cc-a4ce-b4ff86357e92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.694 187189 DEBUG nova.compute.manager [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Instance network_info: |[{"id": "d40e2418-d6ae-4f44-9f23-f141b0ab11a6", "address": "fa:16:3e:dd:6e:1f", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd40e2418-d6", "ovs_interfaceid": "d40e2418-d6ae-4f44-9f23-f141b0ab11a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.694 187189 DEBUG oslo_concurrency.lockutils [req-71b2947a-c1f4-4ae4-8c1b-21dac7880fce req-b46c2135-db42-4f93-93aa-3ae155051af6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-0e798fb6-eb0e-40cc-a4ce-b4ff86357e92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.695 187189 DEBUG nova.network.neutron [req-71b2947a-c1f4-4ae4-8c1b-21dac7880fce req-b46c2135-db42-4f93-93aa-3ae155051af6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Refreshing network info cache for port d40e2418-d6ae-4f44-9f23-f141b0ab11a6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.698 187189 DEBUG nova.virt.libvirt.driver [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Start _get_guest_xml network_info=[{"id": "d40e2418-d6ae-4f44-9f23-f141b0ab11a6", "address": "fa:16:3e:dd:6e:1f", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd40e2418-d6", "ovs_interfaceid": "d40e2418-d6ae-4f44-9f23-f141b0ab11a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.705 187189 WARNING nova.virt.libvirt.driver [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.725 187189 DEBUG nova.virt.libvirt.host [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.726 187189 DEBUG nova.virt.libvirt.host [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.734 187189 DEBUG nova.virt.libvirt.host [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.735 187189 DEBUG nova.virt.libvirt.host [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.736 187189 DEBUG nova.virt.libvirt.driver [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.736 187189 DEBUG nova.virt.hardware [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.737 187189 DEBUG nova.virt.hardware [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.737 187189 DEBUG nova.virt.hardware [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.737 187189 DEBUG nova.virt.hardware [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.737 187189 DEBUG nova.virt.hardware [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.738 187189 DEBUG nova.virt.hardware [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.738 187189 DEBUG nova.virt.hardware [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.738 187189 DEBUG nova.virt.hardware [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.738 187189 DEBUG nova.virt.hardware [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.738 187189 DEBUG nova.virt.hardware [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.739 187189 DEBUG nova.virt.hardware [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.742 187189 DEBUG nova.virt.libvirt.vif [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:53:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1151506089',display_name='tempest-ImagesTestJSON-server-1151506089',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-1151506089',id=26,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='78f8ba841bbe4fdcb9d9e2237d97bf73',ramdisk_id='',reservation_id='r-9ur3ibun',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-1674785298',owner_user_name='tempest-ImagesTestJSON-1674785298-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:53:51Z,user_data=None,user_id='315be492c2ce4b9f8af2898e6794a256',uuid=0e798fb6-eb0e-40cc-a4ce-b4ff86357e92,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d40e2418-d6ae-4f44-9f23-f141b0ab11a6", "address": "fa:16:3e:dd:6e:1f", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd40e2418-d6", "ovs_interfaceid": "d40e2418-d6ae-4f44-9f23-f141b0ab11a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.742 187189 DEBUG nova.network.os_vif_util [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Converting VIF {"id": "d40e2418-d6ae-4f44-9f23-f141b0ab11a6", "address": "fa:16:3e:dd:6e:1f", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd40e2418-d6", "ovs_interfaceid": "d40e2418-d6ae-4f44-9f23-f141b0ab11a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.744 187189 DEBUG nova.network.os_vif_util [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dd:6e:1f,bridge_name='br-int',has_traffic_filtering=True,id=d40e2418-d6ae-4f44-9f23-f141b0ab11a6,network=Network(17ec2ca4-3fa9-41aa-80ef-35bf92d404ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd40e2418-d6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.745 187189 DEBUG nova.objects.instance [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.765 187189 DEBUG nova.virt.libvirt.driver [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] End _get_guest_xml xml=<domain type="kvm">
Nov 29 06:53:55 compute-0 nova_compute[187185]:   <uuid>0e798fb6-eb0e-40cc-a4ce-b4ff86357e92</uuid>
Nov 29 06:53:55 compute-0 nova_compute[187185]:   <name>instance-0000001a</name>
Nov 29 06:53:55 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 06:53:55 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 06:53:55 compute-0 nova_compute[187185]:   <metadata>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 06:53:55 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:       <nova:name>tempest-ImagesTestJSON-server-1151506089</nova:name>
Nov 29 06:53:55 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 06:53:55</nova:creationTime>
Nov 29 06:53:55 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 06:53:55 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 06:53:55 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 06:53:55 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 06:53:55 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 06:53:55 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 06:53:55 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 06:53:55 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 06:53:55 compute-0 nova_compute[187185]:         <nova:user uuid="315be492c2ce4b9f8af2898e6794a256">tempest-ImagesTestJSON-1674785298-project-member</nova:user>
Nov 29 06:53:55 compute-0 nova_compute[187185]:         <nova:project uuid="78f8ba841bbe4fdcb9d9e2237d97bf73">tempest-ImagesTestJSON-1674785298</nova:project>
Nov 29 06:53:55 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 06:53:55 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 06:53:55 compute-0 nova_compute[187185]:         <nova:port uuid="d40e2418-d6ae-4f44-9f23-f141b0ab11a6">
Nov 29 06:53:55 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 06:53:55 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 06:53:55 compute-0 nova_compute[187185]:   </metadata>
Nov 29 06:53:55 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <system>
Nov 29 06:53:55 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 06:53:55 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 06:53:55 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 06:53:55 compute-0 nova_compute[187185]:       <entry name="serial">0e798fb6-eb0e-40cc-a4ce-b4ff86357e92</entry>
Nov 29 06:53:55 compute-0 nova_compute[187185]:       <entry name="uuid">0e798fb6-eb0e-40cc-a4ce-b4ff86357e92</entry>
Nov 29 06:53:55 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     </system>
Nov 29 06:53:55 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 06:53:55 compute-0 nova_compute[187185]:   <os>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:   </os>
Nov 29 06:53:55 compute-0 nova_compute[187185]:   <features>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <apic/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:   </features>
Nov 29 06:53:55 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:   </clock>
Nov 29 06:53:55 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:   </cpu>
Nov 29 06:53:55 compute-0 nova_compute[187185]:   <devices>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 06:53:55 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/0e798fb6-eb0e-40cc-a4ce-b4ff86357e92/disk"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     </disk>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 06:53:55 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/0e798fb6-eb0e-40cc-a4ce-b4ff86357e92/disk.config"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     </disk>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 06:53:55 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:dd:6e:1f"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:       <target dev="tapd40e2418-d6"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     </interface>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 06:53:55 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/0e798fb6-eb0e-40cc-a4ce-b4ff86357e92/console.log" append="off"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     </serial>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <video>
Nov 29 06:53:55 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     </video>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 06:53:55 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     </rng>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 06:53:55 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 06:53:55 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 06:53:55 compute-0 nova_compute[187185]:   </devices>
Nov 29 06:53:55 compute-0 nova_compute[187185]: </domain>
Nov 29 06:53:55 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.766 187189 DEBUG nova.compute.manager [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Preparing to wait for external event network-vif-plugged-d40e2418-d6ae-4f44-9f23-f141b0ab11a6 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.766 187189 DEBUG oslo_concurrency.lockutils [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "0e798fb6-eb0e-40cc-a4ce-b4ff86357e92-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.767 187189 DEBUG oslo_concurrency.lockutils [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "0e798fb6-eb0e-40cc-a4ce-b4ff86357e92-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.767 187189 DEBUG oslo_concurrency.lockutils [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "0e798fb6-eb0e-40cc-a4ce-b4ff86357e92-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.767 187189 DEBUG nova.virt.libvirt.vif [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:53:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1151506089',display_name='tempest-ImagesTestJSON-server-1151506089',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-1151506089',id=26,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='78f8ba841bbe4fdcb9d9e2237d97bf73',ramdisk_id='',reservation_id='r-9ur3ibun',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-1674785298',owner_user_name='tempest-ImagesTestJSON-1674785298-project-member
'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:53:51Z,user_data=None,user_id='315be492c2ce4b9f8af2898e6794a256',uuid=0e798fb6-eb0e-40cc-a4ce-b4ff86357e92,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d40e2418-d6ae-4f44-9f23-f141b0ab11a6", "address": "fa:16:3e:dd:6e:1f", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd40e2418-d6", "ovs_interfaceid": "d40e2418-d6ae-4f44-9f23-f141b0ab11a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.768 187189 DEBUG nova.network.os_vif_util [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Converting VIF {"id": "d40e2418-d6ae-4f44-9f23-f141b0ab11a6", "address": "fa:16:3e:dd:6e:1f", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd40e2418-d6", "ovs_interfaceid": "d40e2418-d6ae-4f44-9f23-f141b0ab11a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.768 187189 DEBUG nova.network.os_vif_util [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dd:6e:1f,bridge_name='br-int',has_traffic_filtering=True,id=d40e2418-d6ae-4f44-9f23-f141b0ab11a6,network=Network(17ec2ca4-3fa9-41aa-80ef-35bf92d404ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd40e2418-d6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.769 187189 DEBUG os_vif [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:6e:1f,bridge_name='br-int',has_traffic_filtering=True,id=d40e2418-d6ae-4f44-9f23-f141b0ab11a6,network=Network(17ec2ca4-3fa9-41aa-80ef-35bf92d404ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd40e2418-d6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.769 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.770 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.770 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.773 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.773 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd40e2418-d6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.774 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd40e2418-d6, col_values=(('external_ids', {'iface-id': 'd40e2418-d6ae-4f44-9f23-f141b0ab11a6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:dd:6e:1f', 'vm-uuid': '0e798fb6-eb0e-40cc-a4ce-b4ff86357e92'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.814 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:55 compute-0 NetworkManager[55227]: <info>  [1764399235.8153] manager: (tapd40e2418-d6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/43)
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.816 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.823 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.824 187189 INFO os_vif [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:6e:1f,bridge_name='br-int',has_traffic_filtering=True,id=d40e2418-d6ae-4f44-9f23-f141b0ab11a6,network=Network(17ec2ca4-3fa9-41aa-80ef-35bf92d404ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd40e2418-d6')
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.990 187189 DEBUG nova.virt.libvirt.driver [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.991 187189 DEBUG nova.virt.libvirt.driver [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.991 187189 DEBUG nova.virt.libvirt.driver [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] No VIF found with MAC fa:16:3e:dd:6e:1f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 06:53:55 compute-0 nova_compute[187185]: 2025-11-29 06:53:55.991 187189 INFO nova.virt.libvirt.driver [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Using config drive
Nov 29 06:53:56 compute-0 nova_compute[187185]: 2025-11-29 06:53:56.733 187189 INFO nova.virt.libvirt.driver [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Creating config drive at /var/lib/nova/instances/0e798fb6-eb0e-40cc-a4ce-b4ff86357e92/disk.config
Nov 29 06:53:56 compute-0 nova_compute[187185]: 2025-11-29 06:53:56.743 187189 DEBUG oslo_concurrency.processutils [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0e798fb6-eb0e-40cc-a4ce-b4ff86357e92/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpktf0vwxn execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:53:56 compute-0 nova_compute[187185]: 2025-11-29 06:53:56.870 187189 DEBUG oslo_concurrency.processutils [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0e798fb6-eb0e-40cc-a4ce-b4ff86357e92/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpktf0vwxn" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:53:56 compute-0 kernel: tapd40e2418-d6: entered promiscuous mode
Nov 29 06:53:56 compute-0 NetworkManager[55227]: <info>  [1764399236.9628] manager: (tapd40e2418-d6): new Tun device (/org/freedesktop/NetworkManager/Devices/44)
Nov 29 06:53:56 compute-0 ovn_controller[95281]: 2025-11-29T06:53:56Z|00070|binding|INFO|Claiming lport d40e2418-d6ae-4f44-9f23-f141b0ab11a6 for this chassis.
Nov 29 06:53:56 compute-0 ovn_controller[95281]: 2025-11-29T06:53:56Z|00071|binding|INFO|d40e2418-d6ae-4f44-9f23-f141b0ab11a6: Claiming fa:16:3e:dd:6e:1f 10.100.0.6
Nov 29 06:53:56 compute-0 nova_compute[187185]: 2025-11-29 06:53:56.971 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:57.005 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:6e:1f 10.100.0.6'], port_security=['fa:16:3e:dd:6e:1f 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '0e798fb6-eb0e-40cc-a4ce-b4ff86357e92', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ca8fef31-1a4b-4249-948f-73ea087430b2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8122595a-c31d-4e3d-a668-dbae500c1d72, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=d40e2418-d6ae-4f44-9f23-f141b0ab11a6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:57.007 104254 INFO neutron.agent.ovn.metadata.agent [-] Port d40e2418-d6ae-4f44-9f23-f141b0ab11a6 in datapath 17ec2ca4-3fa9-41aa-80ef-35bf92d404ba bound to our chassis
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:57.011 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 17ec2ca4-3fa9-41aa-80ef-35bf92d404ba
Nov 29 06:53:57 compute-0 systemd-udevd[217047]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 06:53:57 compute-0 systemd-machined[153486]: New machine qemu-8-instance-0000001a.
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:57.027 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d5392604-9590-4eac-8994-24163c76ef35]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:57.029 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap17ec2ca4-31 in ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:57.032 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap17ec2ca4-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:57.032 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[ac58b180-b5b3-4cff-9ffd-4e7e09cfe378]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:57.033 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[ba264549-0c92-4517-8dce-8efd9753b001]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:53:57 compute-0 NetworkManager[55227]: <info>  [1764399237.0385] device (tapd40e2418-d6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 06:53:57 compute-0 NetworkManager[55227]: <info>  [1764399237.0397] device (tapd40e2418-d6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:57.048 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[a77903a8-3de0-46bd-8a6e-e66ae9d04d12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:53:57 compute-0 nova_compute[187185]: 2025-11-29 06:53:57.056 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:57 compute-0 systemd[1]: Started Virtual Machine qemu-8-instance-0000001a.
Nov 29 06:53:57 compute-0 ovn_controller[95281]: 2025-11-29T06:53:57Z|00072|binding|INFO|Setting lport d40e2418-d6ae-4f44-9f23-f141b0ab11a6 ovn-installed in OVS
Nov 29 06:53:57 compute-0 ovn_controller[95281]: 2025-11-29T06:53:57Z|00073|binding|INFO|Setting lport d40e2418-d6ae-4f44-9f23-f141b0ab11a6 up in Southbound
Nov 29 06:53:57 compute-0 nova_compute[187185]: 2025-11-29 06:53:57.061 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:57.074 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[61238f7b-8b0a-45e5-b7a5-c8da318ca05a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:57.117 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[4ba372c5-dba2-4fb4-b7a2-b7684dff1994]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:53:57 compute-0 NetworkManager[55227]: <info>  [1764399237.1246] manager: (tap17ec2ca4-30): new Veth device (/org/freedesktop/NetworkManager/Devices/45)
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:57.122 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[0c466c81-9073-4f2c-90b7-796afb69029d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:57.163 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[3b4572d0-cfbc-4e57-8706-1f76dfda0fc5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:57.168 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[e74f685e-b45f-462e-8aa5-557770a94ac6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:53:57 compute-0 NetworkManager[55227]: <info>  [1764399237.2006] device (tap17ec2ca4-30): carrier: link connected
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:57.206 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[cc5e4130-8f40-444a-8f44-4e15d89ee0ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:57.227 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[30bff226-66c3-4954-b7d2-690f08c09336]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap17ec2ca4-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ec:55:6b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 468486, 'reachable_time': 39534, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217080, 'error': None, 'target': 'ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:57.246 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[c8a581a9-b5c1-49ee-9156-ab90faedfd3c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feec:556b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 468486, 'tstamp': 468486}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217081, 'error': None, 'target': 'ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:57.264 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[bc57603a-423e-4a2f-a92e-2c3938443c38]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap17ec2ca4-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ec:55:6b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 468486, 'reachable_time': 39534, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217082, 'error': None, 'target': 'ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:57.306 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[7434c128-e99a-44de-8ac2-746508dff4fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:57.373 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[0b7424aa-76b4-4420-8168-1bc9b25d1db8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:57.375 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap17ec2ca4-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:57.376 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:57.377 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap17ec2ca4-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:53:57 compute-0 nova_compute[187185]: 2025-11-29 06:53:57.381 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:57 compute-0 kernel: tap17ec2ca4-30: entered promiscuous mode
Nov 29 06:53:57 compute-0 NetworkManager[55227]: <info>  [1764399237.3822] manager: (tap17ec2ca4-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Nov 29 06:53:57 compute-0 nova_compute[187185]: 2025-11-29 06:53:57.383 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:57.387 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap17ec2ca4-30, col_values=(('external_ids', {'iface-id': '97d66506-c891-4bf7-8595-2d091560f247'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:53:57 compute-0 nova_compute[187185]: 2025-11-29 06:53:57.389 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:57 compute-0 ovn_controller[95281]: 2025-11-29T06:53:57Z|00074|binding|INFO|Releasing lport 97d66506-c891-4bf7-8595-2d091560f247 from this chassis (sb_readonly=0)
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:57.392 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/17ec2ca4-3fa9-41aa-80ef-35bf92d404ba.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/17ec2ca4-3fa9-41aa-80ef-35bf92d404ba.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:57.394 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[128a70bc-aaa3-4442-9929-59a6205f52e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:57.396 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]: global
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/17ec2ca4-3fa9-41aa-80ef-35bf92d404ba.pid.haproxy
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]: 
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]: 
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]: 
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID 17ec2ca4-3fa9-41aa-80ef-35bf92d404ba
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 06:53:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:53:57.397 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'env', 'PROCESS_TAG=haproxy-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/17ec2ca4-3fa9-41aa-80ef-35bf92d404ba.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 06:53:57 compute-0 nova_compute[187185]: 2025-11-29 06:53:57.403 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:57 compute-0 nova_compute[187185]: 2025-11-29 06:53:57.520 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399237.5194, 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:53:57 compute-0 nova_compute[187185]: 2025-11-29 06:53:57.521 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] VM Started (Lifecycle Event)
Nov 29 06:53:57 compute-0 nova_compute[187185]: 2025-11-29 06:53:57.552 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:53:57 compute-0 nova_compute[187185]: 2025-11-29 06:53:57.559 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399237.5212085, 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:53:57 compute-0 nova_compute[187185]: 2025-11-29 06:53:57.559 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] VM Paused (Lifecycle Event)
Nov 29 06:53:57 compute-0 nova_compute[187185]: 2025-11-29 06:53:57.587 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:53:57 compute-0 nova_compute[187185]: 2025-11-29 06:53:57.593 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 06:53:57 compute-0 nova_compute[187185]: 2025-11-29 06:53:57.617 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 06:53:57 compute-0 nova_compute[187185]: 2025-11-29 06:53:57.656 187189 DEBUG nova.network.neutron [req-71b2947a-c1f4-4ae4-8c1b-21dac7880fce req-b46c2135-db42-4f93-93aa-3ae155051af6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Updated VIF entry in instance network info cache for port d40e2418-d6ae-4f44-9f23-f141b0ab11a6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 06:53:57 compute-0 nova_compute[187185]: 2025-11-29 06:53:57.657 187189 DEBUG nova.network.neutron [req-71b2947a-c1f4-4ae4-8c1b-21dac7880fce req-b46c2135-db42-4f93-93aa-3ae155051af6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Updating instance_info_cache with network_info: [{"id": "d40e2418-d6ae-4f44-9f23-f141b0ab11a6", "address": "fa:16:3e:dd:6e:1f", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd40e2418-d6", "ovs_interfaceid": "d40e2418-d6ae-4f44-9f23-f141b0ab11a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:53:57 compute-0 nova_compute[187185]: 2025-11-29 06:53:57.676 187189 DEBUG oslo_concurrency.lockutils [req-71b2947a-c1f4-4ae4-8c1b-21dac7880fce req-b46c2135-db42-4f93-93aa-3ae155051af6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-0e798fb6-eb0e-40cc-a4ce-b4ff86357e92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:53:57 compute-0 nova_compute[187185]: 2025-11-29 06:53:57.830 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:53:57 compute-0 podman[217121]: 2025-11-29 06:53:57.887569668 +0000 UTC m=+0.043950817 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 06:53:58 compute-0 podman[217121]: 2025-11-29 06:53:58.432505169 +0000 UTC m=+0.588886328 container create 3c02ac4adca38526a13c14a39a0bf389e271f4793b6c70b2354488597af61c8f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 29 06:53:58 compute-0 systemd[1]: Started libpod-conmon-3c02ac4adca38526a13c14a39a0bf389e271f4793b6c70b2354488597af61c8f.scope.
Nov 29 06:53:58 compute-0 systemd[1]: Started libcrun container.
Nov 29 06:53:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c12e14704a2dd788ea4183980ae6c9b3aa2b47f42d2a61ac3b1db63ba565b586/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 06:53:58 compute-0 podman[217121]: 2025-11-29 06:53:58.565294191 +0000 UTC m=+0.721675330 container init 3c02ac4adca38526a13c14a39a0bf389e271f4793b6c70b2354488597af61c8f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:53:58 compute-0 podman[217121]: 2025-11-29 06:53:58.571363045 +0000 UTC m=+0.727744194 container start 3c02ac4adca38526a13c14a39a0bf389e271f4793b6c70b2354488597af61c8f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Nov 29 06:53:58 compute-0 neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba[217137]: [NOTICE]   (217141) : New worker (217143) forked
Nov 29 06:53:58 compute-0 neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba[217137]: [NOTICE]   (217141) : Loading success.
Nov 29 06:53:59 compute-0 nova_compute[187185]: 2025-11-29 06:53:59.128 187189 DEBUG nova.compute.manager [req-d6bbf176-9844-464e-8fcb-2901489f6454 req-4d3adac5-7bd4-4290-843f-8f131bf13bab 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Received event network-vif-plugged-d40e2418-d6ae-4f44-9f23-f141b0ab11a6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:53:59 compute-0 nova_compute[187185]: 2025-11-29 06:53:59.129 187189 DEBUG oslo_concurrency.lockutils [req-d6bbf176-9844-464e-8fcb-2901489f6454 req-4d3adac5-7bd4-4290-843f-8f131bf13bab 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "0e798fb6-eb0e-40cc-a4ce-b4ff86357e92-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:53:59 compute-0 nova_compute[187185]: 2025-11-29 06:53:59.129 187189 DEBUG oslo_concurrency.lockutils [req-d6bbf176-9844-464e-8fcb-2901489f6454 req-4d3adac5-7bd4-4290-843f-8f131bf13bab 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0e798fb6-eb0e-40cc-a4ce-b4ff86357e92-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:53:59 compute-0 nova_compute[187185]: 2025-11-29 06:53:59.129 187189 DEBUG oslo_concurrency.lockutils [req-d6bbf176-9844-464e-8fcb-2901489f6454 req-4d3adac5-7bd4-4290-843f-8f131bf13bab 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0e798fb6-eb0e-40cc-a4ce-b4ff86357e92-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:53:59 compute-0 nova_compute[187185]: 2025-11-29 06:53:59.129 187189 DEBUG nova.compute.manager [req-d6bbf176-9844-464e-8fcb-2901489f6454 req-4d3adac5-7bd4-4290-843f-8f131bf13bab 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Processing event network-vif-plugged-d40e2418-d6ae-4f44-9f23-f141b0ab11a6 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 06:53:59 compute-0 nova_compute[187185]: 2025-11-29 06:53:59.130 187189 DEBUG nova.compute.manager [req-d6bbf176-9844-464e-8fcb-2901489f6454 req-4d3adac5-7bd4-4290-843f-8f131bf13bab 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Received event network-vif-plugged-d40e2418-d6ae-4f44-9f23-f141b0ab11a6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:53:59 compute-0 nova_compute[187185]: 2025-11-29 06:53:59.130 187189 DEBUG oslo_concurrency.lockutils [req-d6bbf176-9844-464e-8fcb-2901489f6454 req-4d3adac5-7bd4-4290-843f-8f131bf13bab 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "0e798fb6-eb0e-40cc-a4ce-b4ff86357e92-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:53:59 compute-0 nova_compute[187185]: 2025-11-29 06:53:59.130 187189 DEBUG oslo_concurrency.lockutils [req-d6bbf176-9844-464e-8fcb-2901489f6454 req-4d3adac5-7bd4-4290-843f-8f131bf13bab 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0e798fb6-eb0e-40cc-a4ce-b4ff86357e92-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:53:59 compute-0 nova_compute[187185]: 2025-11-29 06:53:59.130 187189 DEBUG oslo_concurrency.lockutils [req-d6bbf176-9844-464e-8fcb-2901489f6454 req-4d3adac5-7bd4-4290-843f-8f131bf13bab 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0e798fb6-eb0e-40cc-a4ce-b4ff86357e92-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:53:59 compute-0 nova_compute[187185]: 2025-11-29 06:53:59.131 187189 DEBUG nova.compute.manager [req-d6bbf176-9844-464e-8fcb-2901489f6454 req-4d3adac5-7bd4-4290-843f-8f131bf13bab 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] No waiting events found dispatching network-vif-plugged-d40e2418-d6ae-4f44-9f23-f141b0ab11a6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 06:53:59 compute-0 nova_compute[187185]: 2025-11-29 06:53:59.131 187189 WARNING nova.compute.manager [req-d6bbf176-9844-464e-8fcb-2901489f6454 req-4d3adac5-7bd4-4290-843f-8f131bf13bab 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Received unexpected event network-vif-plugged-d40e2418-d6ae-4f44-9f23-f141b0ab11a6 for instance with vm_state building and task_state spawning.
Nov 29 06:53:59 compute-0 nova_compute[187185]: 2025-11-29 06:53:59.131 187189 DEBUG nova.compute.manager [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 06:53:59 compute-0 nova_compute[187185]: 2025-11-29 06:53:59.137 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399239.1374629, 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:53:59 compute-0 nova_compute[187185]: 2025-11-29 06:53:59.138 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] VM Resumed (Lifecycle Event)
Nov 29 06:53:59 compute-0 nova_compute[187185]: 2025-11-29 06:53:59.141 187189 DEBUG nova.virt.libvirt.driver [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 06:53:59 compute-0 nova_compute[187185]: 2025-11-29 06:53:59.145 187189 INFO nova.virt.libvirt.driver [-] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Instance spawned successfully.
Nov 29 06:53:59 compute-0 nova_compute[187185]: 2025-11-29 06:53:59.146 187189 DEBUG nova.virt.libvirt.driver [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 06:53:59 compute-0 nova_compute[187185]: 2025-11-29 06:53:59.178 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:53:59 compute-0 nova_compute[187185]: 2025-11-29 06:53:59.183 187189 DEBUG nova.virt.libvirt.driver [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:53:59 compute-0 nova_compute[187185]: 2025-11-29 06:53:59.184 187189 DEBUG nova.virt.libvirt.driver [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:53:59 compute-0 nova_compute[187185]: 2025-11-29 06:53:59.185 187189 DEBUG nova.virt.libvirt.driver [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:53:59 compute-0 nova_compute[187185]: 2025-11-29 06:53:59.185 187189 DEBUG nova.virt.libvirt.driver [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:53:59 compute-0 nova_compute[187185]: 2025-11-29 06:53:59.185 187189 DEBUG nova.virt.libvirt.driver [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:53:59 compute-0 nova_compute[187185]: 2025-11-29 06:53:59.186 187189 DEBUG nova.virt.libvirt.driver [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:53:59 compute-0 nova_compute[187185]: 2025-11-29 06:53:59.190 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 06:53:59 compute-0 nova_compute[187185]: 2025-11-29 06:53:59.236 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 06:53:59 compute-0 nova_compute[187185]: 2025-11-29 06:53:59.319 187189 INFO nova.compute.manager [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Took 7.91 seconds to spawn the instance on the hypervisor.
Nov 29 06:53:59 compute-0 nova_compute[187185]: 2025-11-29 06:53:59.319 187189 DEBUG nova.compute.manager [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:53:59 compute-0 nova_compute[187185]: 2025-11-29 06:53:59.419 187189 INFO nova.compute.manager [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Took 8.53 seconds to build instance.
Nov 29 06:53:59 compute-0 nova_compute[187185]: 2025-11-29 06:53:59.442 187189 DEBUG oslo_concurrency.lockutils [None req-14618f75-c783-40a9-806c-867694721b32 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "0e798fb6-eb0e-40cc-a4ce-b4ff86357e92" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.651s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:53:59 compute-0 nova_compute[187185]: 2025-11-29 06:53:59.855 187189 DEBUG oslo_concurrency.lockutils [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Acquiring lock "ac6c9936-a82b-4dc4-b2b3-3aaae70701ab" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:53:59 compute-0 nova_compute[187185]: 2025-11-29 06:53:59.856 187189 DEBUG oslo_concurrency.lockutils [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Lock "ac6c9936-a82b-4dc4-b2b3-3aaae70701ab" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:53:59 compute-0 nova_compute[187185]: 2025-11-29 06:53:59.885 187189 DEBUG nova.compute.manager [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 06:54:00 compute-0 nova_compute[187185]: 2025-11-29 06:54:00.053 187189 DEBUG oslo_concurrency.lockutils [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:54:00 compute-0 nova_compute[187185]: 2025-11-29 06:54:00.054 187189 DEBUG oslo_concurrency.lockutils [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:54:00 compute-0 nova_compute[187185]: 2025-11-29 06:54:00.063 187189 DEBUG nova.virt.hardware [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 06:54:00 compute-0 nova_compute[187185]: 2025-11-29 06:54:00.064 187189 INFO nova.compute.claims [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Claim successful on node compute-0.ctlplane.example.com
Nov 29 06:54:00 compute-0 nova_compute[187185]: 2025-11-29 06:54:00.267 187189 DEBUG oslo_concurrency.lockutils [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Acquiring lock "6af9191a-9cf3-47b8-9172-1f844e3f2d44" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:54:00 compute-0 nova_compute[187185]: 2025-11-29 06:54:00.267 187189 DEBUG oslo_concurrency.lockutils [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Lock "6af9191a-9cf3-47b8-9172-1f844e3f2d44" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:54:00 compute-0 nova_compute[187185]: 2025-11-29 06:54:00.298 187189 DEBUG nova.compute.manager [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 06:54:00 compute-0 nova_compute[187185]: 2025-11-29 06:54:00.365 187189 DEBUG nova.compute.provider_tree [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:54:00 compute-0 nova_compute[187185]: 2025-11-29 06:54:00.471 187189 DEBUG nova.scheduler.client.report [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:54:00 compute-0 nova_compute[187185]: 2025-11-29 06:54:00.507 187189 DEBUG oslo_concurrency.lockutils [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.454s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:54:00 compute-0 nova_compute[187185]: 2025-11-29 06:54:00.508 187189 DEBUG nova.compute.manager [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 06:54:00 compute-0 nova_compute[187185]: 2025-11-29 06:54:00.520 187189 DEBUG nova.objects.instance [None req-f54455e6-d818-4dca-91f3-a6b6e8cbef25 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:54:00 compute-0 nova_compute[187185]: 2025-11-29 06:54:00.567 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399240.567084, 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:54:00 compute-0 nova_compute[187185]: 2025-11-29 06:54:00.568 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] VM Paused (Lifecycle Event)
Nov 29 06:54:00 compute-0 nova_compute[187185]: 2025-11-29 06:54:00.571 187189 DEBUG oslo_concurrency.lockutils [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:54:00 compute-0 nova_compute[187185]: 2025-11-29 06:54:00.571 187189 DEBUG oslo_concurrency.lockutils [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:54:00 compute-0 nova_compute[187185]: 2025-11-29 06:54:00.583 187189 DEBUG nova.virt.hardware [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 06:54:00 compute-0 nova_compute[187185]: 2025-11-29 06:54:00.584 187189 INFO nova.compute.claims [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Claim successful on node compute-0.ctlplane.example.com
Nov 29 06:54:00 compute-0 nova_compute[187185]: 2025-11-29 06:54:00.594 187189 DEBUG nova.compute.manager [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 06:54:00 compute-0 nova_compute[187185]: 2025-11-29 06:54:00.595 187189 DEBUG nova.network.neutron [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 06:54:00 compute-0 nova_compute[187185]: 2025-11-29 06:54:00.608 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:54:00 compute-0 nova_compute[187185]: 2025-11-29 06:54:00.613 187189 INFO nova.virt.libvirt.driver [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 06:54:00 compute-0 nova_compute[187185]: 2025-11-29 06:54:00.617 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 06:54:00 compute-0 nova_compute[187185]: 2025-11-29 06:54:00.647 187189 DEBUG nova.compute.manager [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 06:54:00 compute-0 nova_compute[187185]: 2025-11-29 06:54:00.650 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] During sync_power_state the instance has a pending task (suspending). Skip.
Nov 29 06:54:00 compute-0 nova_compute[187185]: 2025-11-29 06:54:00.763 187189 DEBUG nova.compute.provider_tree [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:54:00 compute-0 nova_compute[187185]: 2025-11-29 06:54:00.769 187189 DEBUG nova.compute.manager [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 06:54:00 compute-0 nova_compute[187185]: 2025-11-29 06:54:00.771 187189 DEBUG nova.virt.libvirt.driver [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 06:54:00 compute-0 nova_compute[187185]: 2025-11-29 06:54:00.772 187189 INFO nova.virt.libvirt.driver [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Creating image(s)
Nov 29 06:54:00 compute-0 nova_compute[187185]: 2025-11-29 06:54:00.773 187189 DEBUG oslo_concurrency.lockutils [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Acquiring lock "/var/lib/nova/instances/ac6c9936-a82b-4dc4-b2b3-3aaae70701ab/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:54:00 compute-0 nova_compute[187185]: 2025-11-29 06:54:00.774 187189 DEBUG oslo_concurrency.lockutils [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Lock "/var/lib/nova/instances/ac6c9936-a82b-4dc4-b2b3-3aaae70701ab/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:54:00 compute-0 nova_compute[187185]: 2025-11-29 06:54:00.775 187189 DEBUG oslo_concurrency.lockutils [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Lock "/var/lib/nova/instances/ac6c9936-a82b-4dc4-b2b3-3aaae70701ab/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:54:00 compute-0 nova_compute[187185]: 2025-11-29 06:54:00.804 187189 DEBUG nova.scheduler.client.report [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:54:00 compute-0 nova_compute[187185]: 2025-11-29 06:54:00.809 187189 DEBUG oslo_concurrency.processutils [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:54:00 compute-0 nova_compute[187185]: 2025-11-29 06:54:00.842 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:00 compute-0 nova_compute[187185]: 2025-11-29 06:54:00.881 187189 DEBUG oslo_concurrency.lockutils [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.310s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:54:00 compute-0 nova_compute[187185]: 2025-11-29 06:54:00.882 187189 DEBUG nova.compute.manager [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 06:54:00 compute-0 podman[217155]: 2025-11-29 06:54:00.911534626 +0000 UTC m=+0.169082152 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller)
Nov 29 06:54:00 compute-0 nova_compute[187185]: 2025-11-29 06:54:00.922 187189 DEBUG oslo_concurrency.processutils [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.112s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:54:00 compute-0 nova_compute[187185]: 2025-11-29 06:54:00.922 187189 DEBUG oslo_concurrency.lockutils [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:54:00 compute-0 nova_compute[187185]: 2025-11-29 06:54:00.923 187189 DEBUG oslo_concurrency.lockutils [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:54:00 compute-0 nova_compute[187185]: 2025-11-29 06:54:00.936 187189 DEBUG oslo_concurrency.processutils [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:54:00 compute-0 nova_compute[187185]: 2025-11-29 06:54:00.974 187189 DEBUG nova.compute.manager [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 06:54:00 compute-0 nova_compute[187185]: 2025-11-29 06:54:00.975 187189 DEBUG nova.network.neutron [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 06:54:01 compute-0 nova_compute[187185]: 2025-11-29 06:54:01.008 187189 INFO nova.virt.libvirt.driver [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 06:54:01 compute-0 nova_compute[187185]: 2025-11-29 06:54:01.025 187189 DEBUG oslo_concurrency.processutils [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:54:01 compute-0 nova_compute[187185]: 2025-11-29 06:54:01.026 187189 DEBUG oslo_concurrency.processutils [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/ac6c9936-a82b-4dc4-b2b3-3aaae70701ab/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:54:01 compute-0 nova_compute[187185]: 2025-11-29 06:54:01.056 187189 DEBUG nova.compute.manager [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 06:54:01 compute-0 nova_compute[187185]: 2025-11-29 06:54:01.174 187189 DEBUG nova.compute.manager [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 06:54:01 compute-0 nova_compute[187185]: 2025-11-29 06:54:01.177 187189 DEBUG nova.virt.libvirt.driver [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 06:54:01 compute-0 nova_compute[187185]: 2025-11-29 06:54:01.178 187189 INFO nova.virt.libvirt.driver [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Creating image(s)
Nov 29 06:54:01 compute-0 nova_compute[187185]: 2025-11-29 06:54:01.179 187189 DEBUG oslo_concurrency.lockutils [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Acquiring lock "/var/lib/nova/instances/6af9191a-9cf3-47b8-9172-1f844e3f2d44/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:54:01 compute-0 nova_compute[187185]: 2025-11-29 06:54:01.180 187189 DEBUG oslo_concurrency.lockutils [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Lock "/var/lib/nova/instances/6af9191a-9cf3-47b8-9172-1f844e3f2d44/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:54:01 compute-0 nova_compute[187185]: 2025-11-29 06:54:01.181 187189 DEBUG oslo_concurrency.lockutils [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Lock "/var/lib/nova/instances/6af9191a-9cf3-47b8-9172-1f844e3f2d44/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:54:01 compute-0 nova_compute[187185]: 2025-11-29 06:54:01.212 187189 DEBUG oslo_concurrency.processutils [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:54:01 compute-0 nova_compute[187185]: 2025-11-29 06:54:01.298 187189 DEBUG nova.policy [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4ba956b833ca4d31936e5917bd1c2e96', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4d0114f1ba5f4ee6bdcaf9c8cf95d744', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 06:54:01 compute-0 nova_compute[187185]: 2025-11-29 06:54:01.314 187189 DEBUG oslo_concurrency.processutils [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.102s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:54:01 compute-0 nova_compute[187185]: 2025-11-29 06:54:01.315 187189 DEBUG oslo_concurrency.lockutils [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:54:01 compute-0 nova_compute[187185]: 2025-11-29 06:54:01.394 187189 DEBUG nova.network.neutron [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Nov 29 06:54:01 compute-0 nova_compute[187185]: 2025-11-29 06:54:01.395 187189 DEBUG nova.compute.manager [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 06:54:02 compute-0 kernel: tapd40e2418-d6 (unregistering): left promiscuous mode
Nov 29 06:54:02 compute-0 NetworkManager[55227]: <info>  [1764399242.7447] device (tapd40e2418-d6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 06:54:02 compute-0 nova_compute[187185]: 2025-11-29 06:54:02.801 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:02 compute-0 ovn_controller[95281]: 2025-11-29T06:54:02Z|00075|binding|INFO|Releasing lport d40e2418-d6ae-4f44-9f23-f141b0ab11a6 from this chassis (sb_readonly=0)
Nov 29 06:54:02 compute-0 ovn_controller[95281]: 2025-11-29T06:54:02Z|00076|binding|INFO|Setting lport d40e2418-d6ae-4f44-9f23-f141b0ab11a6 down in Southbound
Nov 29 06:54:02 compute-0 ovn_controller[95281]: 2025-11-29T06:54:02Z|00077|binding|INFO|Removing iface tapd40e2418-d6 ovn-installed in OVS
Nov 29 06:54:02 compute-0 nova_compute[187185]: 2025-11-29 06:54:02.817 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:02 compute-0 nova_compute[187185]: 2025-11-29 06:54:02.832 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:02 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000001a.scope: Deactivated successfully.
Nov 29 06:54:02 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000001a.scope: Consumed 2.095s CPU time.
Nov 29 06:54:02 compute-0 systemd-machined[153486]: Machine qemu-8-instance-0000001a terminated.
Nov 29 06:54:02 compute-0 podman[217200]: 2025-11-29 06:54:02.95197858 +0000 UTC m=+0.090991476 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 06:54:02 compute-0 nova_compute[187185]: 2025-11-29 06:54:02.954 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:02 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:02.993 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:6e:1f 10.100.0.6'], port_security=['fa:16:3e:dd:6e:1f 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '0e798fb6-eb0e-40cc-a4ce-b4ff86357e92', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ca8fef31-1a4b-4249-948f-73ea087430b2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8122595a-c31d-4e3d-a668-dbae500c1d72, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=d40e2418-d6ae-4f44-9f23-f141b0ab11a6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 06:54:02 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:02.997 104254 INFO neutron.agent.ovn.metadata.agent [-] Port d40e2418-d6ae-4f44-9f23-f141b0ab11a6 in datapath 17ec2ca4-3fa9-41aa-80ef-35bf92d404ba unbound from our chassis
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:02.999 187189 DEBUG nova.compute.manager [None req-f54455e6-d818-4dca-91f3-a6b6e8cbef25 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:54:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:03.001 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 17ec2ca4-3fa9-41aa-80ef-35bf92d404ba, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 06:54:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:03.005 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[21d50393-ab8c-430b-8e8c-0c810c650f1d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:54:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:03.007 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba namespace which is not needed anymore
Nov 29 06:54:03 compute-0 podman[217203]: 2025-11-29 06:54:03.013615723 +0000 UTC m=+0.123283417 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.094 187189 DEBUG oslo_concurrency.processutils [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/ac6c9936-a82b-4dc4-b2b3-3aaae70701ab/disk 1073741824" returned: 0 in 2.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.095 187189 DEBUG oslo_concurrency.lockutils [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 2.172s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.095 187189 DEBUG oslo_concurrency.processutils [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.124 187189 DEBUG oslo_concurrency.lockutils [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 1.810s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.139 187189 DEBUG oslo_concurrency.processutils [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.167 187189 DEBUG oslo_concurrency.processutils [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.168 187189 DEBUG nova.virt.disk.api [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Checking if we can resize image /var/lib/nova/instances/ac6c9936-a82b-4dc4-b2b3-3aaae70701ab/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.169 187189 DEBUG oslo_concurrency.processutils [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ac6c9936-a82b-4dc4-b2b3-3aaae70701ab/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.189 187189 DEBUG nova.network.neutron [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Successfully created port: 59322bae-60f4-453d-8167-213727034e0a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.203 187189 DEBUG oslo_concurrency.processutils [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.204 187189 DEBUG oslo_concurrency.processutils [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/6af9191a-9cf3-47b8-9172-1f844e3f2d44/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.226 187189 DEBUG oslo_concurrency.processutils [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ac6c9936-a82b-4dc4-b2b3-3aaae70701ab/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.227 187189 DEBUG nova.virt.disk.api [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Cannot resize image /var/lib/nova/instances/ac6c9936-a82b-4dc4-b2b3-3aaae70701ab/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.228 187189 DEBUG nova.objects.instance [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Lazy-loading 'migration_context' on Instance uuid ac6c9936-a82b-4dc4-b2b3-3aaae70701ab obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.242 187189 DEBUG nova.virt.libvirt.driver [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.243 187189 DEBUG nova.virt.libvirt.driver [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Ensure instance console log exists: /var/lib/nova/instances/ac6c9936-a82b-4dc4-b2b3-3aaae70701ab/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.243 187189 DEBUG oslo_concurrency.lockutils [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.243 187189 DEBUG oslo_concurrency.lockutils [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.244 187189 DEBUG oslo_concurrency.lockutils [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.354 187189 DEBUG oslo_concurrency.processutils [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/6af9191a-9cf3-47b8-9172-1f844e3f2d44/disk 1073741824" returned: 0 in 0.150s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.355 187189 DEBUG oslo_concurrency.lockutils [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.231s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.357 187189 DEBUG oslo_concurrency.processutils [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:54:03 compute-0 neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba[217137]: [NOTICE]   (217141) : haproxy version is 2.8.14-c23fe91
Nov 29 06:54:03 compute-0 neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba[217137]: [NOTICE]   (217141) : path to executable is /usr/sbin/haproxy
Nov 29 06:54:03 compute-0 neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba[217137]: [WARNING]  (217141) : Exiting Master process...
Nov 29 06:54:03 compute-0 neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba[217137]: [ALERT]    (217141) : Current worker (217143) exited with code 143 (Terminated)
Nov 29 06:54:03 compute-0 neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba[217137]: [WARNING]  (217141) : All workers exited. Exiting... (0)
Nov 29 06:54:03 compute-0 systemd[1]: libpod-3c02ac4adca38526a13c14a39a0bf389e271f4793b6c70b2354488597af61c8f.scope: Deactivated successfully.
Nov 29 06:54:03 compute-0 podman[217278]: 2025-11-29 06:54:03.370189192 +0000 UTC m=+0.251454334 container died 3c02ac4adca38526a13c14a39a0bf389e271f4793b6c70b2354488597af61c8f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true)
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.401 187189 DEBUG nova.compute.manager [req-b46e12be-2af6-4256-a697-c3c8f4682b1d req-d87973d8-5a81-4f57-bf46-c15c410e2048 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Received event network-vif-unplugged-d40e2418-d6ae-4f44-9f23-f141b0ab11a6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.402 187189 DEBUG oslo_concurrency.lockutils [req-b46e12be-2af6-4256-a697-c3c8f4682b1d req-d87973d8-5a81-4f57-bf46-c15c410e2048 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "0e798fb6-eb0e-40cc-a4ce-b4ff86357e92-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.403 187189 DEBUG oslo_concurrency.lockutils [req-b46e12be-2af6-4256-a697-c3c8f4682b1d req-d87973d8-5a81-4f57-bf46-c15c410e2048 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0e798fb6-eb0e-40cc-a4ce-b4ff86357e92-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.404 187189 DEBUG oslo_concurrency.lockutils [req-b46e12be-2af6-4256-a697-c3c8f4682b1d req-d87973d8-5a81-4f57-bf46-c15c410e2048 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0e798fb6-eb0e-40cc-a4ce-b4ff86357e92-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.404 187189 DEBUG nova.compute.manager [req-b46e12be-2af6-4256-a697-c3c8f4682b1d req-d87973d8-5a81-4f57-bf46-c15c410e2048 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] No waiting events found dispatching network-vif-unplugged-d40e2418-d6ae-4f44-9f23-f141b0ab11a6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.405 187189 WARNING nova.compute.manager [req-b46e12be-2af6-4256-a697-c3c8f4682b1d req-d87973d8-5a81-4f57-bf46-c15c410e2048 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Received unexpected event network-vif-unplugged-d40e2418-d6ae-4f44-9f23-f141b0ab11a6 for instance with vm_state suspended and task_state None.
Nov 29 06:54:03 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3c02ac4adca38526a13c14a39a0bf389e271f4793b6c70b2354488597af61c8f-userdata-shm.mount: Deactivated successfully.
Nov 29 06:54:03 compute-0 systemd[1]: var-lib-containers-storage-overlay-c12e14704a2dd788ea4183980ae6c9b3aa2b47f42d2a61ac3b1db63ba565b586-merged.mount: Deactivated successfully.
Nov 29 06:54:03 compute-0 podman[217278]: 2025-11-29 06:54:03.425551976 +0000 UTC m=+0.306817098 container cleanup 3c02ac4adca38526a13c14a39a0bf389e271f4793b6c70b2354488597af61c8f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 06:54:03 compute-0 systemd[1]: libpod-conmon-3c02ac4adca38526a13c14a39a0bf389e271f4793b6c70b2354488597af61c8f.scope: Deactivated successfully.
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.456 187189 DEBUG oslo_concurrency.processutils [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.457 187189 DEBUG nova.virt.disk.api [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Checking if we can resize image /var/lib/nova/instances/6af9191a-9cf3-47b8-9172-1f844e3f2d44/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.458 187189 DEBUG oslo_concurrency.processutils [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6af9191a-9cf3-47b8-9172-1f844e3f2d44/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:54:03 compute-0 podman[217321]: 2025-11-29 06:54:03.497076775 +0000 UTC m=+0.045699304 container remove 3c02ac4adca38526a13c14a39a0bf389e271f4793b6c70b2354488597af61c8f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 29 06:54:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:03.502 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[cd6aab0c-a205-4254-a49b-66d29642cd9d]: (4, ('Sat Nov 29 06:54:03 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba (3c02ac4adca38526a13c14a39a0bf389e271f4793b6c70b2354488597af61c8f)\n3c02ac4adca38526a13c14a39a0bf389e271f4793b6c70b2354488597af61c8f\nSat Nov 29 06:54:03 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba (3c02ac4adca38526a13c14a39a0bf389e271f4793b6c70b2354488597af61c8f)\n3c02ac4adca38526a13c14a39a0bf389e271f4793b6c70b2354488597af61c8f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:54:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:03.504 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[12a83627-90cb-4cae-93a1-c44a0e3af815]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:54:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:03.506 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap17ec2ca4-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.508 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:03 compute-0 kernel: tap17ec2ca4-30: left promiscuous mode
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.517 187189 DEBUG oslo_concurrency.processutils [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6af9191a-9cf3-47b8-9172-1f844e3f2d44/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.518 187189 DEBUG nova.virt.disk.api [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Cannot resize image /var/lib/nova/instances/6af9191a-9cf3-47b8-9172-1f844e3f2d44/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.518 187189 DEBUG nova.objects.instance [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Lazy-loading 'migration_context' on Instance uuid 6af9191a-9cf3-47b8-9172-1f844e3f2d44 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.526 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:03.530 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6a8422a6-95b7-4b6e-8fa9-1126b7a48a63]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.533 187189 DEBUG nova.virt.libvirt.driver [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.534 187189 DEBUG nova.virt.libvirt.driver [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Ensure instance console log exists: /var/lib/nova/instances/6af9191a-9cf3-47b8-9172-1f844e3f2d44/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.534 187189 DEBUG oslo_concurrency.lockutils [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.535 187189 DEBUG oslo_concurrency.lockutils [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.536 187189 DEBUG oslo_concurrency.lockutils [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.538 187189 DEBUG nova.virt.libvirt.driver [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.544 187189 WARNING nova.virt.libvirt.driver [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:54:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:03.548 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[56e0aaa8-a29c-41e2-bbc7-8d3080d6429a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:54:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:03.549 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[5a03fb00-6d43-4b47-9143-cc420f4f155d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.549 187189 DEBUG nova.virt.libvirt.host [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.550 187189 DEBUG nova.virt.libvirt.host [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.555 187189 DEBUG nova.virt.libvirt.host [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.555 187189 DEBUG nova.virt.libvirt.host [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.557 187189 DEBUG nova.virt.libvirt.driver [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.557 187189 DEBUG nova.virt.hardware [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.558 187189 DEBUG nova.virt.hardware [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.558 187189 DEBUG nova.virt.hardware [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.558 187189 DEBUG nova.virt.hardware [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.558 187189 DEBUG nova.virt.hardware [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.558 187189 DEBUG nova.virt.hardware [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.559 187189 DEBUG nova.virt.hardware [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.559 187189 DEBUG nova.virt.hardware [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.559 187189 DEBUG nova.virt.hardware [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.559 187189 DEBUG nova.virt.hardware [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.560 187189 DEBUG nova.virt.hardware [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.564 187189 DEBUG nova.objects.instance [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Lazy-loading 'pci_devices' on Instance uuid 6af9191a-9cf3-47b8-9172-1f844e3f2d44 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:54:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:03.565 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[9f5d18f2-87f0-4648-b281-eb31390c3501]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 468477, 'reachable_time': 24753, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217343, 'error': None, 'target': 'ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:54:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:03.568 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 06:54:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:03.569 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[f54415fb-4547-48ce-af4f-97977334815c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:54:03 compute-0 systemd[1]: run-netns-ovnmeta\x2d17ec2ca4\x2d3fa9\x2d41aa\x2d80ef\x2d35bf92d404ba.mount: Deactivated successfully.
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.578 187189 DEBUG nova.virt.libvirt.driver [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] End _get_guest_xml xml=<domain type="kvm">
Nov 29 06:54:03 compute-0 nova_compute[187185]:   <uuid>6af9191a-9cf3-47b8-9172-1f844e3f2d44</uuid>
Nov 29 06:54:03 compute-0 nova_compute[187185]:   <name>instance-0000001e</name>
Nov 29 06:54:03 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 06:54:03 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 06:54:03 compute-0 nova_compute[187185]:   <metadata>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 06:54:03 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:       <nova:name>tempest-ListImageFiltersTestJSON-server-1291053205</nova:name>
Nov 29 06:54:03 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 06:54:03</nova:creationTime>
Nov 29 06:54:03 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 06:54:03 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 06:54:03 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 06:54:03 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 06:54:03 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 06:54:03 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 06:54:03 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 06:54:03 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 06:54:03 compute-0 nova_compute[187185]:         <nova:user uuid="c480a3bf2f8f485889154b20872eada2">tempest-ListImageFiltersTestJSON-573297769-project-member</nova:user>
Nov 29 06:54:03 compute-0 nova_compute[187185]:         <nova:project uuid="34d1587b0fbb4c3cbf3c8c4a71d1a6be">tempest-ListImageFiltersTestJSON-573297769</nova:project>
Nov 29 06:54:03 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 06:54:03 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:       <nova:ports/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 06:54:03 compute-0 nova_compute[187185]:   </metadata>
Nov 29 06:54:03 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 06:54:03 compute-0 nova_compute[187185]:     <system>
Nov 29 06:54:03 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 06:54:03 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 06:54:03 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 06:54:03 compute-0 nova_compute[187185]:       <entry name="serial">6af9191a-9cf3-47b8-9172-1f844e3f2d44</entry>
Nov 29 06:54:03 compute-0 nova_compute[187185]:       <entry name="uuid">6af9191a-9cf3-47b8-9172-1f844e3f2d44</entry>
Nov 29 06:54:03 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     </system>
Nov 29 06:54:03 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 06:54:03 compute-0 nova_compute[187185]:   <os>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:   </os>
Nov 29 06:54:03 compute-0 nova_compute[187185]:   <features>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     <apic/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:   </features>
Nov 29 06:54:03 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 06:54:03 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:   </clock>
Nov 29 06:54:03 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 06:54:03 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:   </cpu>
Nov 29 06:54:03 compute-0 nova_compute[187185]:   <devices>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 06:54:03 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/6af9191a-9cf3-47b8-9172-1f844e3f2d44/disk"/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     </disk>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 06:54:03 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/6af9191a-9cf3-47b8-9172-1f844e3f2d44/disk.config"/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     </disk>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 06:54:03 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/6af9191a-9cf3-47b8-9172-1f844e3f2d44/console.log" append="off"/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     </serial>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     <video>
Nov 29 06:54:03 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     </video>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 06:54:03 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     </rng>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 06:54:03 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 06:54:03 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 06:54:03 compute-0 nova_compute[187185]:   </devices>
Nov 29 06:54:03 compute-0 nova_compute[187185]: </domain>
Nov 29 06:54:03 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.638 187189 DEBUG nova.virt.libvirt.driver [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.638 187189 DEBUG nova.virt.libvirt.driver [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 06:54:03 compute-0 nova_compute[187185]: 2025-11-29 06:54:03.639 187189 INFO nova.virt.libvirt.driver [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Using config drive
Nov 29 06:54:04 compute-0 nova_compute[187185]: 2025-11-29 06:54:04.136 187189 INFO nova.virt.libvirt.driver [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Creating config drive at /var/lib/nova/instances/6af9191a-9cf3-47b8-9172-1f844e3f2d44/disk.config
Nov 29 06:54:04 compute-0 nova_compute[187185]: 2025-11-29 06:54:04.144 187189 DEBUG oslo_concurrency.processutils [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6af9191a-9cf3-47b8-9172-1f844e3f2d44/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8ner5yea execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:54:04 compute-0 nova_compute[187185]: 2025-11-29 06:54:04.295 187189 DEBUG oslo_concurrency.processutils [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6af9191a-9cf3-47b8-9172-1f844e3f2d44/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8ner5yea" returned: 0 in 0.151s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:54:04 compute-0 nova_compute[187185]: 2025-11-29 06:54:04.331 187189 DEBUG nova.network.neutron [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Successfully updated port: 59322bae-60f4-453d-8167-213727034e0a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 06:54:04 compute-0 nova_compute[187185]: 2025-11-29 06:54:04.355 187189 DEBUG oslo_concurrency.lockutils [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Acquiring lock "refresh_cache-ac6c9936-a82b-4dc4-b2b3-3aaae70701ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:54:04 compute-0 nova_compute[187185]: 2025-11-29 06:54:04.355 187189 DEBUG oslo_concurrency.lockutils [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Acquired lock "refresh_cache-ac6c9936-a82b-4dc4-b2b3-3aaae70701ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:54:04 compute-0 nova_compute[187185]: 2025-11-29 06:54:04.355 187189 DEBUG nova.network.neutron [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 06:54:04 compute-0 systemd-machined[153486]: New machine qemu-9-instance-0000001e.
Nov 29 06:54:04 compute-0 systemd[1]: Started Virtual Machine qemu-9-instance-0000001e.
Nov 29 06:54:04 compute-0 nova_compute[187185]: 2025-11-29 06:54:04.427 187189 DEBUG nova.compute.manager [req-2c61f9e0-5b02-4b27-b9ba-f4fdf263d54b req-3b82a95c-a50e-4511-9153-c760259e4840 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Received event network-changed-59322bae-60f4-453d-8167-213727034e0a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:54:04 compute-0 nova_compute[187185]: 2025-11-29 06:54:04.427 187189 DEBUG nova.compute.manager [req-2c61f9e0-5b02-4b27-b9ba-f4fdf263d54b req-3b82a95c-a50e-4511-9153-c760259e4840 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Refreshing instance network info cache due to event network-changed-59322bae-60f4-453d-8167-213727034e0a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 06:54:04 compute-0 nova_compute[187185]: 2025-11-29 06:54:04.428 187189 DEBUG oslo_concurrency.lockutils [req-2c61f9e0-5b02-4b27-b9ba-f4fdf263d54b req-3b82a95c-a50e-4511-9153-c760259e4840 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-ac6c9936-a82b-4dc4-b2b3-3aaae70701ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:54:04 compute-0 nova_compute[187185]: 2025-11-29 06:54:04.504 187189 DEBUG nova.network.neutron [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 06:54:04 compute-0 nova_compute[187185]: 2025-11-29 06:54:04.989 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399244.9885669, 6af9191a-9cf3-47b8-9172-1f844e3f2d44 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:54:04 compute-0 nova_compute[187185]: 2025-11-29 06:54:04.990 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] VM Resumed (Lifecycle Event)
Nov 29 06:54:04 compute-0 nova_compute[187185]: 2025-11-29 06:54:04.994 187189 DEBUG nova.compute.manager [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 06:54:04 compute-0 nova_compute[187185]: 2025-11-29 06:54:04.995 187189 DEBUG nova.virt.libvirt.driver [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.001 187189 INFO nova.virt.libvirt.driver [-] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Instance spawned successfully.
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.001 187189 DEBUG nova.virt.libvirt.driver [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.015 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.024 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.028 187189 DEBUG nova.virt.libvirt.driver [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.029 187189 DEBUG nova.virt.libvirt.driver [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.029 187189 DEBUG nova.virt.libvirt.driver [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.030 187189 DEBUG nova.virt.libvirt.driver [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.031 187189 DEBUG nova.virt.libvirt.driver [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.031 187189 DEBUG nova.virt.libvirt.driver [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.057 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.057 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399244.989889, 6af9191a-9cf3-47b8-9172-1f844e3f2d44 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.057 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] VM Started (Lifecycle Event)
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.085 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.089 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.135 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.147 187189 INFO nova.compute.manager [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Took 3.97 seconds to spawn the instance on the hypervisor.
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.147 187189 DEBUG nova.compute.manager [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.266 187189 INFO nova.compute.manager [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Took 4.73 seconds to build instance.
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.354 187189 DEBUG oslo_concurrency.lockutils [None req-5b5c3ecf-0813-415d-a2bc-7d31635507cb c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Lock "6af9191a-9cf3-47b8-9172-1f844e3f2d44" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.087s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.410 187189 DEBUG nova.compute.manager [None req-e1160efe-3adc-45f8-84a5-953149be27b2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.487 187189 INFO nova.compute.manager [None req-e1160efe-3adc-45f8-84a5-953149be27b2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] instance snapshotting
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.488 187189 WARNING nova.compute.manager [None req-e1160efe-3adc-45f8-84a5-953149be27b2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] trying to snapshot a non-running instance: (state: 4 expected: 1)
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.545 187189 DEBUG nova.compute.manager [req-5234fe8b-0fe0-48b9-86b7-f8d1917cf986 req-216f5ae0-0ff1-4418-9d0c-1d2df4cd1369 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Received event network-vif-plugged-d40e2418-d6ae-4f44-9f23-f141b0ab11a6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.546 187189 DEBUG oslo_concurrency.lockutils [req-5234fe8b-0fe0-48b9-86b7-f8d1917cf986 req-216f5ae0-0ff1-4418-9d0c-1d2df4cd1369 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "0e798fb6-eb0e-40cc-a4ce-b4ff86357e92-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.548 187189 DEBUG oslo_concurrency.lockutils [req-5234fe8b-0fe0-48b9-86b7-f8d1917cf986 req-216f5ae0-0ff1-4418-9d0c-1d2df4cd1369 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0e798fb6-eb0e-40cc-a4ce-b4ff86357e92-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.548 187189 DEBUG oslo_concurrency.lockutils [req-5234fe8b-0fe0-48b9-86b7-f8d1917cf986 req-216f5ae0-0ff1-4418-9d0c-1d2df4cd1369 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0e798fb6-eb0e-40cc-a4ce-b4ff86357e92-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.549 187189 DEBUG nova.compute.manager [req-5234fe8b-0fe0-48b9-86b7-f8d1917cf986 req-216f5ae0-0ff1-4418-9d0c-1d2df4cd1369 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] No waiting events found dispatching network-vif-plugged-d40e2418-d6ae-4f44-9f23-f141b0ab11a6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.549 187189 WARNING nova.compute.manager [req-5234fe8b-0fe0-48b9-86b7-f8d1917cf986 req-216f5ae0-0ff1-4418-9d0c-1d2df4cd1369 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Received unexpected event network-vif-plugged-d40e2418-d6ae-4f44-9f23-f141b0ab11a6 for instance with vm_state suspended and task_state image_snapshot.
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.762 187189 DEBUG nova.network.neutron [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Updating instance_info_cache with network_info: [{"id": "59322bae-60f4-453d-8167-213727034e0a", "address": "fa:16:3e:f0:49:52", "network": {"id": "e21746f3-c39c-4f95-ab09-ca8da7420cb0", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-835937901-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d0114f1ba5f4ee6bdcaf9c8cf95d744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap59322bae-60", "ovs_interfaceid": "59322bae-60f4-453d-8167-213727034e0a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.766 187189 INFO nova.virt.libvirt.driver [None req-e1160efe-3adc-45f8-84a5-953149be27b2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Beginning cold snapshot process
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.800 187189 DEBUG oslo_concurrency.lockutils [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Releasing lock "refresh_cache-ac6c9936-a82b-4dc4-b2b3-3aaae70701ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.800 187189 DEBUG nova.compute.manager [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Instance network_info: |[{"id": "59322bae-60f4-453d-8167-213727034e0a", "address": "fa:16:3e:f0:49:52", "network": {"id": "e21746f3-c39c-4f95-ab09-ca8da7420cb0", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-835937901-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d0114f1ba5f4ee6bdcaf9c8cf95d744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap59322bae-60", "ovs_interfaceid": "59322bae-60f4-453d-8167-213727034e0a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.801 187189 DEBUG oslo_concurrency.lockutils [req-2c61f9e0-5b02-4b27-b9ba-f4fdf263d54b req-3b82a95c-a50e-4511-9153-c760259e4840 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-ac6c9936-a82b-4dc4-b2b3-3aaae70701ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.801 187189 DEBUG nova.network.neutron [req-2c61f9e0-5b02-4b27-b9ba-f4fdf263d54b req-3b82a95c-a50e-4511-9153-c760259e4840 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Refreshing network info cache for port 59322bae-60f4-453d-8167-213727034e0a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.804 187189 DEBUG nova.virt.libvirt.driver [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Start _get_guest_xml network_info=[{"id": "59322bae-60f4-453d-8167-213727034e0a", "address": "fa:16:3e:f0:49:52", "network": {"id": "e21746f3-c39c-4f95-ab09-ca8da7420cb0", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-835937901-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d0114f1ba5f4ee6bdcaf9c8cf95d744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap59322bae-60", "ovs_interfaceid": "59322bae-60f4-453d-8167-213727034e0a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.808 187189 WARNING nova.virt.libvirt.driver [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.815 187189 DEBUG nova.virt.libvirt.host [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.817 187189 DEBUG nova.virt.libvirt.host [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.820 187189 DEBUG nova.virt.libvirt.host [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.821 187189 DEBUG nova.virt.libvirt.host [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.822 187189 DEBUG nova.virt.libvirt.driver [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.822 187189 DEBUG nova.virt.hardware [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.822 187189 DEBUG nova.virt.hardware [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.823 187189 DEBUG nova.virt.hardware [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.823 187189 DEBUG nova.virt.hardware [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.823 187189 DEBUG nova.virt.hardware [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.823 187189 DEBUG nova.virt.hardware [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.823 187189 DEBUG nova.virt.hardware [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.823 187189 DEBUG nova.virt.hardware [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.823 187189 DEBUG nova.virt.hardware [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.824 187189 DEBUG nova.virt.hardware [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.824 187189 DEBUG nova.virt.hardware [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.827 187189 DEBUG nova.virt.libvirt.vif [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] vif_type=ovs instance=Instance(access_ip_v4=2.2.2.2,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:53:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='guest-instance-1.domain.com',display_name='guest-instance-1.domain.com',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='guest-instance-1-domain-com',id=29,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI0h1tyJ10sGVfgbtaTfaukoV4S7P+aEjK9eLgPUiG9CPE4jsFdm+3ZESYhXBmq2VKuHkx2B1jNhgo5RY3RmEHOxb4evCuMePeLG8sRUx+ZQvFMNXMQJhi2ttCvkNaBsRw==',key_name='tempest-keypair-992509475',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4d0114f1ba5f4ee6bdcaf9c8cf95d744',ramdisk_id='',reservation_id='r-np7rmq9z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestFqdnHostnames-645710174',owner_user_name='tempest-ServersTestFqdnHostnames-645710174-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:54:00Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='4ba956b833ca4d31936e5917bd1c2e96',uuid=ac6c9936-a82b-4dc4-b2b3-3aaae70701ab,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "59322bae-60f4-453d-8167-213727034e0a", "address": "fa:16:3e:f0:49:52", "network": {"id": "e21746f3-c39c-4f95-ab09-ca8da7420cb0", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-835937901-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d0114f1ba5f4ee6bdcaf9c8cf95d744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap59322bae-60", "ovs_interfaceid": "59322bae-60f4-453d-8167-213727034e0a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.827 187189 DEBUG nova.network.os_vif_util [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Converting VIF {"id": "59322bae-60f4-453d-8167-213727034e0a", "address": "fa:16:3e:f0:49:52", "network": {"id": "e21746f3-c39c-4f95-ab09-ca8da7420cb0", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-835937901-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d0114f1ba5f4ee6bdcaf9c8cf95d744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap59322bae-60", "ovs_interfaceid": "59322bae-60f4-453d-8167-213727034e0a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.828 187189 DEBUG nova.network.os_vif_util [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f0:49:52,bridge_name='br-int',has_traffic_filtering=True,id=59322bae-60f4-453d-8167-213727034e0a,network=Network(e21746f3-c39c-4f95-ab09-ca8da7420cb0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap59322bae-60') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.828 187189 DEBUG nova.objects.instance [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Lazy-loading 'pci_devices' on Instance uuid ac6c9936-a82b-4dc4-b2b3-3aaae70701ab obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.845 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.849 187189 DEBUG nova.virt.libvirt.driver [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] End _get_guest_xml xml=<domain type="kvm">
Nov 29 06:54:05 compute-0 nova_compute[187185]:   <uuid>ac6c9936-a82b-4dc4-b2b3-3aaae70701ab</uuid>
Nov 29 06:54:05 compute-0 nova_compute[187185]:   <name>instance-0000001d</name>
Nov 29 06:54:05 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 06:54:05 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 06:54:05 compute-0 nova_compute[187185]:   <metadata>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 06:54:05 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:       <nova:name>guest-instance-1.domain.com</nova:name>
Nov 29 06:54:05 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 06:54:05</nova:creationTime>
Nov 29 06:54:05 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 06:54:05 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 06:54:05 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 06:54:05 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 06:54:05 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 06:54:05 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 06:54:05 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 06:54:05 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 06:54:05 compute-0 nova_compute[187185]:         <nova:user uuid="4ba956b833ca4d31936e5917bd1c2e96">tempest-ServersTestFqdnHostnames-645710174-project-member</nova:user>
Nov 29 06:54:05 compute-0 nova_compute[187185]:         <nova:project uuid="4d0114f1ba5f4ee6bdcaf9c8cf95d744">tempest-ServersTestFqdnHostnames-645710174</nova:project>
Nov 29 06:54:05 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 06:54:05 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 06:54:05 compute-0 nova_compute[187185]:         <nova:port uuid="59322bae-60f4-453d-8167-213727034e0a">
Nov 29 06:54:05 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 06:54:05 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 06:54:05 compute-0 nova_compute[187185]:   </metadata>
Nov 29 06:54:05 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <system>
Nov 29 06:54:05 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 06:54:05 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 06:54:05 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 06:54:05 compute-0 nova_compute[187185]:       <entry name="serial">ac6c9936-a82b-4dc4-b2b3-3aaae70701ab</entry>
Nov 29 06:54:05 compute-0 nova_compute[187185]:       <entry name="uuid">ac6c9936-a82b-4dc4-b2b3-3aaae70701ab</entry>
Nov 29 06:54:05 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     </system>
Nov 29 06:54:05 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 06:54:05 compute-0 nova_compute[187185]:   <os>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:   </os>
Nov 29 06:54:05 compute-0 nova_compute[187185]:   <features>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <apic/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:   </features>
Nov 29 06:54:05 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:   </clock>
Nov 29 06:54:05 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:   </cpu>
Nov 29 06:54:05 compute-0 nova_compute[187185]:   <devices>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 06:54:05 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/ac6c9936-a82b-4dc4-b2b3-3aaae70701ab/disk"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     </disk>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 06:54:05 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/ac6c9936-a82b-4dc4-b2b3-3aaae70701ab/disk.config"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     </disk>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 06:54:05 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:f0:49:52"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:       <target dev="tap59322bae-60"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     </interface>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 06:54:05 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/ac6c9936-a82b-4dc4-b2b3-3aaae70701ab/console.log" append="off"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     </serial>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <video>
Nov 29 06:54:05 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     </video>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 06:54:05 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     </rng>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 06:54:05 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 06:54:05 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 06:54:05 compute-0 nova_compute[187185]:   </devices>
Nov 29 06:54:05 compute-0 nova_compute[187185]: </domain>
Nov 29 06:54:05 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.849 187189 DEBUG nova.compute.manager [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Preparing to wait for external event network-vif-plugged-59322bae-60f4-453d-8167-213727034e0a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.850 187189 DEBUG oslo_concurrency.lockutils [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Acquiring lock "ac6c9936-a82b-4dc4-b2b3-3aaae70701ab-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.850 187189 DEBUG oslo_concurrency.lockutils [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Lock "ac6c9936-a82b-4dc4-b2b3-3aaae70701ab-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.850 187189 DEBUG oslo_concurrency.lockutils [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Lock "ac6c9936-a82b-4dc4-b2b3-3aaae70701ab-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.851 187189 DEBUG nova.virt.libvirt.vif [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] vif_type=ovs instance=Instance(access_ip_v4=2.2.2.2,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:53:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='guest-instance-1.domain.com',display_name='guest-instance-1.domain.com',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='guest-instance-1-domain-com',id=29,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI0h1tyJ10sGVfgbtaTfaukoV4S7P+aEjK9eLgPUiG9CPE4jsFdm+3ZESYhXBmq2VKuHkx2B1jNhgo5RY3RmEHOxb4evCuMePeLG8sRUx+ZQvFMNXMQJhi2ttCvkNaBsRw==',key_name='tempest-keypair-992509475',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4d0114f1ba5f4ee6bdcaf9c8cf95d744',ramdisk_id='',reservation_id='r-np7rmq9z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestFqdnHostnames-645710174',owner_user_name='tempest-ServersTestFqdnHostnames-645710174-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:54:00Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='4ba956b833ca4d31936e5917bd1c2e96',uuid=ac6c9936-a82b-4dc4-b2b3-3aaae70701ab,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "59322bae-60f4-453d-8167-213727034e0a", "address": "fa:16:3e:f0:49:52", "network": {"id": "e21746f3-c39c-4f95-ab09-ca8da7420cb0", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-835937901-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d0114f1ba5f4ee6bdcaf9c8cf95d744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap59322bae-60", "ovs_interfaceid": "59322bae-60f4-453d-8167-213727034e0a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.851 187189 DEBUG nova.network.os_vif_util [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Converting VIF {"id": "59322bae-60f4-453d-8167-213727034e0a", "address": "fa:16:3e:f0:49:52", "network": {"id": "e21746f3-c39c-4f95-ab09-ca8da7420cb0", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-835937901-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d0114f1ba5f4ee6bdcaf9c8cf95d744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap59322bae-60", "ovs_interfaceid": "59322bae-60f4-453d-8167-213727034e0a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.851 187189 DEBUG nova.network.os_vif_util [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f0:49:52,bridge_name='br-int',has_traffic_filtering=True,id=59322bae-60f4-453d-8167-213727034e0a,network=Network(e21746f3-c39c-4f95-ab09-ca8da7420cb0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap59322bae-60') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.852 187189 DEBUG os_vif [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f0:49:52,bridge_name='br-int',has_traffic_filtering=True,id=59322bae-60f4-453d-8167-213727034e0a,network=Network(e21746f3-c39c-4f95-ab09-ca8da7420cb0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap59322bae-60') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.852 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.852 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.853 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.855 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.855 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap59322bae-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.856 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap59322bae-60, col_values=(('external_ids', {'iface-id': '59322bae-60f4-453d-8167-213727034e0a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f0:49:52', 'vm-uuid': 'ac6c9936-a82b-4dc4-b2b3-3aaae70701ab'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:54:05 compute-0 NetworkManager[55227]: <info>  [1764399245.8581] manager: (tap59322bae-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.857 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.861 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.867 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.868 187189 INFO os_vif [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f0:49:52,bridge_name='br-int',has_traffic_filtering=True,id=59322bae-60f4-453d-8167-213727034e0a,network=Network(e21746f3-c39c-4f95-ab09-ca8da7420cb0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap59322bae-60')
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.958 187189 DEBUG nova.privsep.utils [None req-e1160efe-3adc-45f8-84a5-953149be27b2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 29 06:54:05 compute-0 nova_compute[187185]: 2025-11-29 06:54:05.958 187189 DEBUG oslo_concurrency.processutils [None req-e1160efe-3adc-45f8-84a5-953149be27b2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/0e798fb6-eb0e-40cc-a4ce-b4ff86357e92/disk /var/lib/nova/instances/snapshots/tmp9_gvrn9x/32ad4fc4bf144bc1af9b03faf1bea1f4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:54:06 compute-0 nova_compute[187185]: 2025-11-29 06:54:06.018 187189 DEBUG nova.virt.libvirt.driver [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 06:54:06 compute-0 nova_compute[187185]: 2025-11-29 06:54:06.019 187189 DEBUG nova.virt.libvirt.driver [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 06:54:06 compute-0 nova_compute[187185]: 2025-11-29 06:54:06.019 187189 DEBUG nova.virt.libvirt.driver [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] No VIF found with MAC fa:16:3e:f0:49:52, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 06:54:06 compute-0 nova_compute[187185]: 2025-11-29 06:54:06.020 187189 INFO nova.virt.libvirt.driver [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Using config drive
Nov 29 06:54:06 compute-0 nova_compute[187185]: 2025-11-29 06:54:06.507 187189 INFO nova.virt.libvirt.driver [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Creating config drive at /var/lib/nova/instances/ac6c9936-a82b-4dc4-b2b3-3aaae70701ab/disk.config
Nov 29 06:54:06 compute-0 nova_compute[187185]: 2025-11-29 06:54:06.517 187189 DEBUG oslo_concurrency.processutils [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ac6c9936-a82b-4dc4-b2b3-3aaae70701ab/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpapvzqx9d execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:54:06 compute-0 nova_compute[187185]: 2025-11-29 06:54:06.669 187189 DEBUG oslo_concurrency.processutils [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ac6c9936-a82b-4dc4-b2b3-3aaae70701ab/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpapvzqx9d" returned: 0 in 0.152s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:54:06 compute-0 NetworkManager[55227]: <info>  [1764399246.7289] manager: (tap59322bae-60): new Tun device (/org/freedesktop/NetworkManager/Devices/48)
Nov 29 06:54:06 compute-0 kernel: tap59322bae-60: entered promiscuous mode
Nov 29 06:54:06 compute-0 ovn_controller[95281]: 2025-11-29T06:54:06Z|00078|binding|INFO|Claiming lport 59322bae-60f4-453d-8167-213727034e0a for this chassis.
Nov 29 06:54:06 compute-0 ovn_controller[95281]: 2025-11-29T06:54:06Z|00079|binding|INFO|59322bae-60f4-453d-8167-213727034e0a: Claiming fa:16:3e:f0:49:52 10.100.0.11
Nov 29 06:54:06 compute-0 nova_compute[187185]: 2025-11-29 06:54:06.742 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:06.762 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:49:52 10.100.0.11'], port_security=['fa:16:3e:f0:49:52 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'ac6c9936-a82b-4dc4-b2b3-3aaae70701ab', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e21746f3-c39c-4f95-ab09-ca8da7420cb0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d0114f1ba5f4ee6bdcaf9c8cf95d744', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b4431f2a-96df-4c7a-a674-7ab10a2da638', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d3e8b222-3dc2-4009-a628-f393d12cd667, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=59322bae-60f4-453d-8167-213727034e0a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 06:54:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:06.763 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 59322bae-60f4-453d-8167-213727034e0a in datapath e21746f3-c39c-4f95-ab09-ca8da7420cb0 bound to our chassis
Nov 29 06:54:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:06.765 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e21746f3-c39c-4f95-ab09-ca8da7420cb0
Nov 29 06:54:06 compute-0 systemd-machined[153486]: New machine qemu-10-instance-0000001d.
Nov 29 06:54:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:06.781 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[bb8205bc-2a2a-400c-a9b3-c550f5e802e5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:54:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:06.782 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape21746f3-c1 in ovnmeta-e21746f3-c39c-4f95-ab09-ca8da7420cb0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 06:54:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:06.786 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape21746f3-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 06:54:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:06.786 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e04c8935-e8de-4efc-9809-ee8bffb23238]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:54:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:06.787 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[88220f0a-66cb-4fa6-aa7d-180e0c092941]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:54:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:06.801 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[ab2f241d-c3f5-4cea-8af9-67a5f7ba2b6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:54:06 compute-0 ovn_controller[95281]: 2025-11-29T06:54:06Z|00080|binding|INFO|Setting lport 59322bae-60f4-453d-8167-213727034e0a ovn-installed in OVS
Nov 29 06:54:06 compute-0 ovn_controller[95281]: 2025-11-29T06:54:06Z|00081|binding|INFO|Setting lport 59322bae-60f4-453d-8167-213727034e0a up in Southbound
Nov 29 06:54:06 compute-0 systemd[1]: Started Virtual Machine qemu-10-instance-0000001d.
Nov 29 06:54:06 compute-0 nova_compute[187185]: 2025-11-29 06:54:06.822 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:06 compute-0 systemd-udevd[217403]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 06:54:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:06.829 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[870bc999-a997-4c8b-b243-0865f91acb42]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:54:06 compute-0 nova_compute[187185]: 2025-11-29 06:54:06.840 187189 DEBUG oslo_concurrency.processutils [None req-e1160efe-3adc-45f8-84a5-953149be27b2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/0e798fb6-eb0e-40cc-a4ce-b4ff86357e92/disk /var/lib/nova/instances/snapshots/tmp9_gvrn9x/32ad4fc4bf144bc1af9b03faf1bea1f4" returned: 0 in 0.882s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:54:06 compute-0 nova_compute[187185]: 2025-11-29 06:54:06.841 187189 INFO nova.virt.libvirt.driver [None req-e1160efe-3adc-45f8-84a5-953149be27b2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Snapshot extracted, beginning image upload
Nov 29 06:54:06 compute-0 NetworkManager[55227]: <info>  [1764399246.8426] device (tap59322bae-60): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 06:54:06 compute-0 NetworkManager[55227]: <info>  [1764399246.8440] device (tap59322bae-60): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 06:54:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:06.871 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[06ff5035-c478-4921-a5ef-6b95d794a479]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:54:06 compute-0 systemd-udevd[217406]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 06:54:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:06.879 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[a5326aa0-eb51-4fdd-b391-a3648b6917d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:54:06 compute-0 NetworkManager[55227]: <info>  [1764399246.8803] manager: (tape21746f3-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/49)
Nov 29 06:54:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:06.923 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[1715383c-9788-4c2d-ad48-780bdbe7d2c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:54:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:06.927 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[9b8bca44-af81-4d71-850c-8c80017fe46b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:54:06 compute-0 NetworkManager[55227]: <info>  [1764399246.9535] device (tape21746f3-c0): carrier: link connected
Nov 29 06:54:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:06.963 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[8f0e9b05-fc96-4d77-9ee3-9f6dd7fffa79]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:54:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:06.982 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e2f36210-91ec-482f-a474-d146841c9fdb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape21746f3-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0c:5d:87'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 469461, 'reachable_time': 21699, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217436, 'error': None, 'target': 'ovnmeta-e21746f3-c39c-4f95-ab09-ca8da7420cb0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:54:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:06.997 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[fe0bee8f-673c-43a8-9e52-c8791a383b6e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0c:5d87'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 469461, 'tstamp': 469461}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217437, 'error': None, 'target': 'ovnmeta-e21746f3-c39c-4f95-ab09-ca8da7420cb0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:54:07 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:07.017 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[36621dfd-d097-457f-ac27-43eb0808ece8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape21746f3-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0c:5d:87'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 469461, 'reachable_time': 21699, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217438, 'error': None, 'target': 'ovnmeta-e21746f3-c39c-4f95-ab09-ca8da7420cb0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:54:07 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:07.046 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[23eba7c7-a2f0-45db-91b9-5872c5f84ae1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:54:07 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:07.112 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[42d47ffc-5f90-4bd4-8c5a-99de8b9b9e86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:54:07 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:07.113 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape21746f3-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:54:07 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:07.114 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 06:54:07 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:07.114 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape21746f3-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:54:07 compute-0 NetworkManager[55227]: <info>  [1764399247.1171] manager: (tape21746f3-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/50)
Nov 29 06:54:07 compute-0 nova_compute[187185]: 2025-11-29 06:54:07.116 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:07 compute-0 kernel: tape21746f3-c0: entered promiscuous mode
Nov 29 06:54:07 compute-0 nova_compute[187185]: 2025-11-29 06:54:07.124 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:07 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:07.126 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape21746f3-c0, col_values=(('external_ids', {'iface-id': '9fe03714-858c-48b5-b3c9-530adc56ad95'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:54:07 compute-0 nova_compute[187185]: 2025-11-29 06:54:07.127 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:07 compute-0 ovn_controller[95281]: 2025-11-29T06:54:07Z|00082|binding|INFO|Releasing lport 9fe03714-858c-48b5-b3c9-530adc56ad95 from this chassis (sb_readonly=0)
Nov 29 06:54:07 compute-0 nova_compute[187185]: 2025-11-29 06:54:07.141 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:07 compute-0 nova_compute[187185]: 2025-11-29 06:54:07.150 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:07 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:07.152 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e21746f3-c39c-4f95-ab09-ca8da7420cb0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e21746f3-c39c-4f95-ab09-ca8da7420cb0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 06:54:07 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:07.153 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[b28f78ce-df9f-4fe5-9b8e-9d9cf74044a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:54:07 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:07.154 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 06:54:07 compute-0 ovn_metadata_agent[104249]: global
Nov 29 06:54:07 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 06:54:07 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-e21746f3-c39c-4f95-ab09-ca8da7420cb0
Nov 29 06:54:07 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 06:54:07 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 06:54:07 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 06:54:07 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/e21746f3-c39c-4f95-ab09-ca8da7420cb0.pid.haproxy
Nov 29 06:54:07 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 06:54:07 compute-0 ovn_metadata_agent[104249]: 
Nov 29 06:54:07 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 06:54:07 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 06:54:07 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 06:54:07 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 06:54:07 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 06:54:07 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 06:54:07 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 06:54:07 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 06:54:07 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 06:54:07 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 06:54:07 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 06:54:07 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 06:54:07 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 06:54:07 compute-0 ovn_metadata_agent[104249]: 
Nov 29 06:54:07 compute-0 ovn_metadata_agent[104249]: 
Nov 29 06:54:07 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 06:54:07 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 06:54:07 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 06:54:07 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID e21746f3-c39c-4f95-ab09-ca8da7420cb0
Nov 29 06:54:07 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 06:54:07 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:07.155 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e21746f3-c39c-4f95-ab09-ca8da7420cb0', 'env', 'PROCESS_TAG=haproxy-e21746f3-c39c-4f95-ab09-ca8da7420cb0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e21746f3-c39c-4f95-ab09-ca8da7420cb0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 06:54:07 compute-0 nova_compute[187185]: 2025-11-29 06:54:07.197 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399247.1967344, ac6c9936-a82b-4dc4-b2b3-3aaae70701ab => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:54:07 compute-0 nova_compute[187185]: 2025-11-29 06:54:07.198 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] VM Started (Lifecycle Event)
Nov 29 06:54:07 compute-0 nova_compute[187185]: 2025-11-29 06:54:07.225 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:54:07 compute-0 nova_compute[187185]: 2025-11-29 06:54:07.232 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399247.1978972, ac6c9936-a82b-4dc4-b2b3-3aaae70701ab => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:54:07 compute-0 nova_compute[187185]: 2025-11-29 06:54:07.233 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] VM Paused (Lifecycle Event)
Nov 29 06:54:07 compute-0 nova_compute[187185]: 2025-11-29 06:54:07.257 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:54:07 compute-0 nova_compute[187185]: 2025-11-29 06:54:07.261 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 06:54:07 compute-0 nova_compute[187185]: 2025-11-29 06:54:07.284 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 06:54:07 compute-0 nova_compute[187185]: 2025-11-29 06:54:07.555 187189 DEBUG nova.compute.manager [req-7e5c8d39-c77d-4099-a7fb-2303f532b9ea req-b23a8b56-1551-4c2f-b91e-a5289bf50347 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Received event network-vif-plugged-59322bae-60f4-453d-8167-213727034e0a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:54:07 compute-0 nova_compute[187185]: 2025-11-29 06:54:07.555 187189 DEBUG oslo_concurrency.lockutils [req-7e5c8d39-c77d-4099-a7fb-2303f532b9ea req-b23a8b56-1551-4c2f-b91e-a5289bf50347 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "ac6c9936-a82b-4dc4-b2b3-3aaae70701ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:54:07 compute-0 nova_compute[187185]: 2025-11-29 06:54:07.556 187189 DEBUG oslo_concurrency.lockutils [req-7e5c8d39-c77d-4099-a7fb-2303f532b9ea req-b23a8b56-1551-4c2f-b91e-a5289bf50347 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ac6c9936-a82b-4dc4-b2b3-3aaae70701ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:54:07 compute-0 nova_compute[187185]: 2025-11-29 06:54:07.556 187189 DEBUG oslo_concurrency.lockutils [req-7e5c8d39-c77d-4099-a7fb-2303f532b9ea req-b23a8b56-1551-4c2f-b91e-a5289bf50347 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ac6c9936-a82b-4dc4-b2b3-3aaae70701ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:54:07 compute-0 nova_compute[187185]: 2025-11-29 06:54:07.556 187189 DEBUG nova.compute.manager [req-7e5c8d39-c77d-4099-a7fb-2303f532b9ea req-b23a8b56-1551-4c2f-b91e-a5289bf50347 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Processing event network-vif-plugged-59322bae-60f4-453d-8167-213727034e0a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 06:54:07 compute-0 nova_compute[187185]: 2025-11-29 06:54:07.557 187189 DEBUG nova.compute.manager [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 06:54:07 compute-0 nova_compute[187185]: 2025-11-29 06:54:07.563 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399247.5621617, ac6c9936-a82b-4dc4-b2b3-3aaae70701ab => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:54:07 compute-0 nova_compute[187185]: 2025-11-29 06:54:07.564 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] VM Resumed (Lifecycle Event)
Nov 29 06:54:07 compute-0 nova_compute[187185]: 2025-11-29 06:54:07.574 187189 DEBUG nova.virt.libvirt.driver [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 06:54:07 compute-0 nova_compute[187185]: 2025-11-29 06:54:07.601 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:54:07 compute-0 nova_compute[187185]: 2025-11-29 06:54:07.608 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 06:54:07 compute-0 nova_compute[187185]: 2025-11-29 06:54:07.610 187189 INFO nova.virt.libvirt.driver [-] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Instance spawned successfully.
Nov 29 06:54:07 compute-0 nova_compute[187185]: 2025-11-29 06:54:07.611 187189 DEBUG nova.virt.libvirt.driver [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 06:54:07 compute-0 nova_compute[187185]: 2025-11-29 06:54:07.641 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 06:54:07 compute-0 nova_compute[187185]: 2025-11-29 06:54:07.647 187189 DEBUG nova.virt.libvirt.driver [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:54:07 compute-0 nova_compute[187185]: 2025-11-29 06:54:07.648 187189 DEBUG nova.virt.libvirt.driver [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:54:07 compute-0 nova_compute[187185]: 2025-11-29 06:54:07.649 187189 DEBUG nova.virt.libvirt.driver [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:54:07 compute-0 nova_compute[187185]: 2025-11-29 06:54:07.649 187189 DEBUG nova.virt.libvirt.driver [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:54:07 compute-0 nova_compute[187185]: 2025-11-29 06:54:07.650 187189 DEBUG nova.virt.libvirt.driver [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:54:07 compute-0 nova_compute[187185]: 2025-11-29 06:54:07.651 187189 DEBUG nova.virt.libvirt.driver [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:54:07 compute-0 podman[217482]: 2025-11-29 06:54:07.594324587 +0000 UTC m=+0.064140094 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 06:54:07 compute-0 nova_compute[187185]: 2025-11-29 06:54:07.751 187189 INFO nova.compute.manager [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Took 6.98 seconds to spawn the instance on the hypervisor.
Nov 29 06:54:07 compute-0 nova_compute[187185]: 2025-11-29 06:54:07.753 187189 DEBUG nova.compute.manager [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:54:07 compute-0 nova_compute[187185]: 2025-11-29 06:54:07.758 187189 DEBUG nova.network.neutron [req-2c61f9e0-5b02-4b27-b9ba-f4fdf263d54b req-3b82a95c-a50e-4511-9153-c760259e4840 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Updated VIF entry in instance network info cache for port 59322bae-60f4-453d-8167-213727034e0a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 06:54:07 compute-0 nova_compute[187185]: 2025-11-29 06:54:07.759 187189 DEBUG nova.network.neutron [req-2c61f9e0-5b02-4b27-b9ba-f4fdf263d54b req-3b82a95c-a50e-4511-9153-c760259e4840 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Updating instance_info_cache with network_info: [{"id": "59322bae-60f4-453d-8167-213727034e0a", "address": "fa:16:3e:f0:49:52", "network": {"id": "e21746f3-c39c-4f95-ab09-ca8da7420cb0", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-835937901-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d0114f1ba5f4ee6bdcaf9c8cf95d744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap59322bae-60", "ovs_interfaceid": "59322bae-60f4-453d-8167-213727034e0a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:54:07 compute-0 nova_compute[187185]: 2025-11-29 06:54:07.798 187189 DEBUG oslo_concurrency.lockutils [req-2c61f9e0-5b02-4b27-b9ba-f4fdf263d54b req-3b82a95c-a50e-4511-9153-c760259e4840 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-ac6c9936-a82b-4dc4-b2b3-3aaae70701ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:54:07 compute-0 nova_compute[187185]: 2025-11-29 06:54:07.833 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:07 compute-0 nova_compute[187185]: 2025-11-29 06:54:07.873 187189 INFO nova.compute.manager [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Took 7.88 seconds to build instance.
Nov 29 06:54:07 compute-0 nova_compute[187185]: 2025-11-29 06:54:07.903 187189 DEBUG oslo_concurrency.lockutils [None req-e4ae101c-8916-4fe8-993a-9c6d25c12c65 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Lock "ac6c9936-a82b-4dc4-b2b3-3aaae70701ab" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.048s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:54:08 compute-0 podman[217482]: 2025-11-29 06:54:08.271018886 +0000 UTC m=+0.740834373 container create c88cc16fb00948d46364cd5416c0b1514bb6ac56cc03e2d602f1559dfbb22442 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e21746f3-c39c-4f95-ab09-ca8da7420cb0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 06:54:08 compute-0 systemd[1]: Started libpod-conmon-c88cc16fb00948d46364cd5416c0b1514bb6ac56cc03e2d602f1559dfbb22442.scope.
Nov 29 06:54:08 compute-0 systemd[1]: Started libcrun container.
Nov 29 06:54:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/633b69d72dfb0c9db51fe512107d02c13df108631b865afab2105ba479305aba/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 06:54:08 compute-0 podman[217482]: 2025-11-29 06:54:08.51706684 +0000 UTC m=+0.986882317 container init c88cc16fb00948d46364cd5416c0b1514bb6ac56cc03e2d602f1559dfbb22442 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e21746f3-c39c-4f95-ab09-ca8da7420cb0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 06:54:08 compute-0 podman[217482]: 2025-11-29 06:54:08.524147672 +0000 UTC m=+0.993963129 container start c88cc16fb00948d46364cd5416c0b1514bb6ac56cc03e2d602f1559dfbb22442 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e21746f3-c39c-4f95-ab09-ca8da7420cb0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Nov 29 06:54:08 compute-0 neutron-haproxy-ovnmeta-e21746f3-c39c-4f95-ab09-ca8da7420cb0[217498]: [NOTICE]   (217502) : New worker (217504) forked
Nov 29 06:54:08 compute-0 neutron-haproxy-ovnmeta-e21746f3-c39c-4f95-ab09-ca8da7420cb0[217498]: [NOTICE]   (217502) : Loading success.
Nov 29 06:54:09 compute-0 nova_compute[187185]: 2025-11-29 06:54:09.304 187189 INFO nova.virt.libvirt.driver [None req-e1160efe-3adc-45f8-84a5-953149be27b2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Snapshot image upload complete
Nov 29 06:54:09 compute-0 nova_compute[187185]: 2025-11-29 06:54:09.307 187189 INFO nova.compute.manager [None req-e1160efe-3adc-45f8-84a5-953149be27b2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Took 3.80 seconds to snapshot the instance on the hypervisor.
Nov 29 06:54:09 compute-0 nova_compute[187185]: 2025-11-29 06:54:09.409 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:09 compute-0 NetworkManager[55227]: <info>  [1764399249.4103] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/51)
Nov 29 06:54:09 compute-0 NetworkManager[55227]: <info>  [1764399249.4115] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/52)
Nov 29 06:54:09 compute-0 ovn_controller[95281]: 2025-11-29T06:54:09Z|00083|binding|INFO|Releasing lport 9fe03714-858c-48b5-b3c9-530adc56ad95 from this chassis (sb_readonly=0)
Nov 29 06:54:09 compute-0 nova_compute[187185]: 2025-11-29 06:54:09.553 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:09 compute-0 nova_compute[187185]: 2025-11-29 06:54:09.560 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:09 compute-0 nova_compute[187185]: 2025-11-29 06:54:09.637 187189 DEBUG nova.compute.manager [req-7a7ab6c5-41af-4dac-90bc-c82f82b8903f req-732d99bd-659c-47da-b6de-fad7415030dd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Received event network-vif-plugged-59322bae-60f4-453d-8167-213727034e0a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:54:09 compute-0 nova_compute[187185]: 2025-11-29 06:54:09.639 187189 DEBUG oslo_concurrency.lockutils [req-7a7ab6c5-41af-4dac-90bc-c82f82b8903f req-732d99bd-659c-47da-b6de-fad7415030dd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "ac6c9936-a82b-4dc4-b2b3-3aaae70701ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:54:09 compute-0 nova_compute[187185]: 2025-11-29 06:54:09.639 187189 DEBUG oslo_concurrency.lockutils [req-7a7ab6c5-41af-4dac-90bc-c82f82b8903f req-732d99bd-659c-47da-b6de-fad7415030dd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ac6c9936-a82b-4dc4-b2b3-3aaae70701ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:54:09 compute-0 nova_compute[187185]: 2025-11-29 06:54:09.640 187189 DEBUG oslo_concurrency.lockutils [req-7a7ab6c5-41af-4dac-90bc-c82f82b8903f req-732d99bd-659c-47da-b6de-fad7415030dd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ac6c9936-a82b-4dc4-b2b3-3aaae70701ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:54:09 compute-0 nova_compute[187185]: 2025-11-29 06:54:09.641 187189 DEBUG nova.compute.manager [req-7a7ab6c5-41af-4dac-90bc-c82f82b8903f req-732d99bd-659c-47da-b6de-fad7415030dd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] No waiting events found dispatching network-vif-plugged-59322bae-60f4-453d-8167-213727034e0a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 06:54:09 compute-0 nova_compute[187185]: 2025-11-29 06:54:09.641 187189 WARNING nova.compute.manager [req-7a7ab6c5-41af-4dac-90bc-c82f82b8903f req-732d99bd-659c-47da-b6de-fad7415030dd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Received unexpected event network-vif-plugged-59322bae-60f4-453d-8167-213727034e0a for instance with vm_state active and task_state None.
Nov 29 06:54:09 compute-0 nova_compute[187185]: 2025-11-29 06:54:09.752 187189 DEBUG nova.compute.manager [req-1cad6759-b66f-4435-9870-766e605f2918 req-84c435dc-185a-4d5c-bd2a-e132ae4065d8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Received event network-changed-59322bae-60f4-453d-8167-213727034e0a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:54:09 compute-0 nova_compute[187185]: 2025-11-29 06:54:09.752 187189 DEBUG nova.compute.manager [req-1cad6759-b66f-4435-9870-766e605f2918 req-84c435dc-185a-4d5c-bd2a-e132ae4065d8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Refreshing instance network info cache due to event network-changed-59322bae-60f4-453d-8167-213727034e0a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 06:54:09 compute-0 nova_compute[187185]: 2025-11-29 06:54:09.753 187189 DEBUG oslo_concurrency.lockutils [req-1cad6759-b66f-4435-9870-766e605f2918 req-84c435dc-185a-4d5c-bd2a-e132ae4065d8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-ac6c9936-a82b-4dc4-b2b3-3aaae70701ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:54:09 compute-0 nova_compute[187185]: 2025-11-29 06:54:09.753 187189 DEBUG oslo_concurrency.lockutils [req-1cad6759-b66f-4435-9870-766e605f2918 req-84c435dc-185a-4d5c-bd2a-e132ae4065d8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-ac6c9936-a82b-4dc4-b2b3-3aaae70701ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:54:09 compute-0 nova_compute[187185]: 2025-11-29 06:54:09.754 187189 DEBUG nova.network.neutron [req-1cad6759-b66f-4435-9870-766e605f2918 req-84c435dc-185a-4d5c-bd2a-e132ae4065d8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Refreshing network info cache for port 59322bae-60f4-453d-8167-213727034e0a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 06:54:10 compute-0 nova_compute[187185]: 2025-11-29 06:54:10.905 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:11 compute-0 nova_compute[187185]: 2025-11-29 06:54:11.285 187189 DEBUG oslo_concurrency.lockutils [None req-da0d0384-9afe-40a6-98c7-5dc9f4622b65 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "0e798fb6-eb0e-40cc-a4ce-b4ff86357e92" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:54:11 compute-0 nova_compute[187185]: 2025-11-29 06:54:11.286 187189 DEBUG oslo_concurrency.lockutils [None req-da0d0384-9afe-40a6-98c7-5dc9f4622b65 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "0e798fb6-eb0e-40cc-a4ce-b4ff86357e92" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:54:11 compute-0 nova_compute[187185]: 2025-11-29 06:54:11.287 187189 DEBUG oslo_concurrency.lockutils [None req-da0d0384-9afe-40a6-98c7-5dc9f4622b65 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "0e798fb6-eb0e-40cc-a4ce-b4ff86357e92-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:54:11 compute-0 nova_compute[187185]: 2025-11-29 06:54:11.287 187189 DEBUG oslo_concurrency.lockutils [None req-da0d0384-9afe-40a6-98c7-5dc9f4622b65 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "0e798fb6-eb0e-40cc-a4ce-b4ff86357e92-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:54:11 compute-0 nova_compute[187185]: 2025-11-29 06:54:11.288 187189 DEBUG oslo_concurrency.lockutils [None req-da0d0384-9afe-40a6-98c7-5dc9f4622b65 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "0e798fb6-eb0e-40cc-a4ce-b4ff86357e92-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:54:11 compute-0 nova_compute[187185]: 2025-11-29 06:54:11.301 187189 INFO nova.compute.manager [None req-da0d0384-9afe-40a6-98c7-5dc9f4622b65 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Terminating instance
Nov 29 06:54:11 compute-0 nova_compute[187185]: 2025-11-29 06:54:11.314 187189 DEBUG nova.compute.manager [None req-da0d0384-9afe-40a6-98c7-5dc9f4622b65 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 06:54:11 compute-0 nova_compute[187185]: 2025-11-29 06:54:11.322 187189 INFO nova.virt.libvirt.driver [-] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Instance destroyed successfully.
Nov 29 06:54:11 compute-0 nova_compute[187185]: 2025-11-29 06:54:11.322 187189 DEBUG nova.objects.instance [None req-da0d0384-9afe-40a6-98c7-5dc9f4622b65 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lazy-loading 'resources' on Instance uuid 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:54:11 compute-0 nova_compute[187185]: 2025-11-29 06:54:11.351 187189 DEBUG nova.virt.libvirt.vif [None req-da0d0384-9afe-40a6-98c7-5dc9f4622b65 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T06:53:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1151506089',display_name='tempest-ImagesTestJSON-server-1151506089',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-1151506089',id=26,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:53:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='78f8ba841bbe4fdcb9d9e2237d97bf73',ramdisk_id='',reservation_id='r-9ur3ibun',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ImagesTestJSON-1674785298',owner_user_name='tempest-ImagesTestJSON-1674785298-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T06:54:09Z,user_data=None,user_id='315be492c2ce4b9f8af2898e6794a256',uuid=0e798fb6-eb0e-40cc-a4ce-b4ff86357e92,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "d40e2418-d6ae-4f44-9f23-f141b0ab11a6", "address": "fa:16:3e:dd:6e:1f", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd40e2418-d6", "ovs_interfaceid": "d40e2418-d6ae-4f44-9f23-f141b0ab11a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 06:54:11 compute-0 nova_compute[187185]: 2025-11-29 06:54:11.352 187189 DEBUG nova.network.os_vif_util [None req-da0d0384-9afe-40a6-98c7-5dc9f4622b65 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Converting VIF {"id": "d40e2418-d6ae-4f44-9f23-f141b0ab11a6", "address": "fa:16:3e:dd:6e:1f", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd40e2418-d6", "ovs_interfaceid": "d40e2418-d6ae-4f44-9f23-f141b0ab11a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 06:54:11 compute-0 nova_compute[187185]: 2025-11-29 06:54:11.353 187189 DEBUG nova.network.os_vif_util [None req-da0d0384-9afe-40a6-98c7-5dc9f4622b65 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dd:6e:1f,bridge_name='br-int',has_traffic_filtering=True,id=d40e2418-d6ae-4f44-9f23-f141b0ab11a6,network=Network(17ec2ca4-3fa9-41aa-80ef-35bf92d404ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd40e2418-d6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 06:54:11 compute-0 nova_compute[187185]: 2025-11-29 06:54:11.353 187189 DEBUG os_vif [None req-da0d0384-9afe-40a6-98c7-5dc9f4622b65 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:6e:1f,bridge_name='br-int',has_traffic_filtering=True,id=d40e2418-d6ae-4f44-9f23-f141b0ab11a6,network=Network(17ec2ca4-3fa9-41aa-80ef-35bf92d404ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd40e2418-d6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 06:54:11 compute-0 nova_compute[187185]: 2025-11-29 06:54:11.356 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:11 compute-0 nova_compute[187185]: 2025-11-29 06:54:11.356 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd40e2418-d6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:54:11 compute-0 nova_compute[187185]: 2025-11-29 06:54:11.360 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:11 compute-0 nova_compute[187185]: 2025-11-29 06:54:11.363 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 06:54:11 compute-0 nova_compute[187185]: 2025-11-29 06:54:11.368 187189 INFO os_vif [None req-da0d0384-9afe-40a6-98c7-5dc9f4622b65 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:6e:1f,bridge_name='br-int',has_traffic_filtering=True,id=d40e2418-d6ae-4f44-9f23-f141b0ab11a6,network=Network(17ec2ca4-3fa9-41aa-80ef-35bf92d404ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd40e2418-d6')
Nov 29 06:54:11 compute-0 nova_compute[187185]: 2025-11-29 06:54:11.369 187189 INFO nova.virt.libvirt.driver [None req-da0d0384-9afe-40a6-98c7-5dc9f4622b65 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Deleting instance files /var/lib/nova/instances/0e798fb6-eb0e-40cc-a4ce-b4ff86357e92_del
Nov 29 06:54:11 compute-0 nova_compute[187185]: 2025-11-29 06:54:11.370 187189 INFO nova.virt.libvirt.driver [None req-da0d0384-9afe-40a6-98c7-5dc9f4622b65 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Deletion of /var/lib/nova/instances/0e798fb6-eb0e-40cc-a4ce-b4ff86357e92_del complete
Nov 29 06:54:11 compute-0 nova_compute[187185]: 2025-11-29 06:54:11.488 187189 DEBUG nova.network.neutron [req-1cad6759-b66f-4435-9870-766e605f2918 req-84c435dc-185a-4d5c-bd2a-e132ae4065d8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Updated VIF entry in instance network info cache for port 59322bae-60f4-453d-8167-213727034e0a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 06:54:11 compute-0 nova_compute[187185]: 2025-11-29 06:54:11.489 187189 DEBUG nova.network.neutron [req-1cad6759-b66f-4435-9870-766e605f2918 req-84c435dc-185a-4d5c-bd2a-e132ae4065d8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Updating instance_info_cache with network_info: [{"id": "59322bae-60f4-453d-8167-213727034e0a", "address": "fa:16:3e:f0:49:52", "network": {"id": "e21746f3-c39c-4f95-ab09-ca8da7420cb0", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-835937901-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d0114f1ba5f4ee6bdcaf9c8cf95d744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap59322bae-60", "ovs_interfaceid": "59322bae-60f4-453d-8167-213727034e0a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:54:11 compute-0 nova_compute[187185]: 2025-11-29 06:54:11.496 187189 INFO nova.compute.manager [None req-da0d0384-9afe-40a6-98c7-5dc9f4622b65 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Took 0.18 seconds to destroy the instance on the hypervisor.
Nov 29 06:54:11 compute-0 nova_compute[187185]: 2025-11-29 06:54:11.498 187189 DEBUG oslo.service.loopingcall [None req-da0d0384-9afe-40a6-98c7-5dc9f4622b65 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 06:54:11 compute-0 nova_compute[187185]: 2025-11-29 06:54:11.498 187189 DEBUG nova.compute.manager [-] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 06:54:11 compute-0 nova_compute[187185]: 2025-11-29 06:54:11.499 187189 DEBUG nova.network.neutron [-] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 06:54:11 compute-0 nova_compute[187185]: 2025-11-29 06:54:11.513 187189 DEBUG oslo_concurrency.lockutils [req-1cad6759-b66f-4435-9870-766e605f2918 req-84c435dc-185a-4d5c-bd2a-e132ae4065d8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-ac6c9936-a82b-4dc4-b2b3-3aaae70701ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:54:12 compute-0 nova_compute[187185]: 2025-11-29 06:54:12.504 187189 DEBUG nova.network.neutron [-] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:54:12 compute-0 nova_compute[187185]: 2025-11-29 06:54:12.543 187189 INFO nova.compute.manager [-] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Took 1.04 seconds to deallocate network for instance.
Nov 29 06:54:12 compute-0 nova_compute[187185]: 2025-11-29 06:54:12.694 187189 DEBUG oslo_concurrency.lockutils [None req-da0d0384-9afe-40a6-98c7-5dc9f4622b65 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:54:12 compute-0 nova_compute[187185]: 2025-11-29 06:54:12.695 187189 DEBUG oslo_concurrency.lockutils [None req-da0d0384-9afe-40a6-98c7-5dc9f4622b65 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:54:12 compute-0 nova_compute[187185]: 2025-11-29 06:54:12.794 187189 DEBUG nova.compute.provider_tree [None req-da0d0384-9afe-40a6-98c7-5dc9f4622b65 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:54:12 compute-0 podman[217515]: 2025-11-29 06:54:12.827759356 +0000 UTC m=+0.087780466 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 06:54:12 compute-0 nova_compute[187185]: 2025-11-29 06:54:12.837 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:12 compute-0 nova_compute[187185]: 2025-11-29 06:54:12.888 187189 DEBUG nova.scheduler.client.report [None req-da0d0384-9afe-40a6-98c7-5dc9f4622b65 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:54:12 compute-0 nova_compute[187185]: 2025-11-29 06:54:12.941 187189 DEBUG oslo_concurrency.lockutils [None req-da0d0384-9afe-40a6-98c7-5dc9f4622b65 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.246s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:54:13 compute-0 nova_compute[187185]: 2025-11-29 06:54:13.034 187189 INFO nova.scheduler.client.report [None req-da0d0384-9afe-40a6-98c7-5dc9f4622b65 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Deleted allocations for instance 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92
Nov 29 06:54:13 compute-0 nova_compute[187185]: 2025-11-29 06:54:13.316 187189 DEBUG oslo_concurrency.lockutils [None req-da0d0384-9afe-40a6-98c7-5dc9f4622b65 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "0e798fb6-eb0e-40cc-a4ce-b4ff86357e92" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.030s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:54:13 compute-0 nova_compute[187185]: 2025-11-29 06:54:13.393 187189 DEBUG nova.compute.manager [None req-8b230dff-5771-43fc-9beb-948a34934f9d c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:54:13 compute-0 nova_compute[187185]: 2025-11-29 06:54:13.470 187189 INFO nova.compute.manager [None req-8b230dff-5771-43fc-9beb-948a34934f9d c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] instance snapshotting
Nov 29 06:54:13 compute-0 nova_compute[187185]: 2025-11-29 06:54:13.654 187189 INFO nova.virt.libvirt.driver [None req-8b230dff-5771-43fc-9beb-948a34934f9d c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Beginning live snapshot process
Nov 29 06:54:13 compute-0 nova_compute[187185]: 2025-11-29 06:54:13.731 187189 DEBUG nova.compute.manager [req-178c72f0-b743-428b-80e3-d23122d5cee9 req-8156dc61-ea64-4527-a4e1-1c486b41da5d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Received event network-vif-deleted-d40e2418-d6ae-4f44-9f23-f141b0ab11a6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:54:14 compute-0 virtqemud[186729]: invalid argument: disk vda does not have an active block job
Nov 29 06:54:14 compute-0 nova_compute[187185]: 2025-11-29 06:54:14.124 187189 DEBUG oslo_concurrency.processutils [None req-8b230dff-5771-43fc-9beb-948a34934f9d c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6af9191a-9cf3-47b8-9172-1f844e3f2d44/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:54:14 compute-0 nova_compute[187185]: 2025-11-29 06:54:14.224 187189 DEBUG oslo_concurrency.processutils [None req-8b230dff-5771-43fc-9beb-948a34934f9d c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6af9191a-9cf3-47b8-9172-1f844e3f2d44/disk --force-share --output=json -f qcow2" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:54:14 compute-0 nova_compute[187185]: 2025-11-29 06:54:14.226 187189 DEBUG oslo_concurrency.processutils [None req-8b230dff-5771-43fc-9beb-948a34934f9d c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6af9191a-9cf3-47b8-9172-1f844e3f2d44/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:54:14 compute-0 nova_compute[187185]: 2025-11-29 06:54:14.325 187189 DEBUG oslo_concurrency.processutils [None req-8b230dff-5771-43fc-9beb-948a34934f9d c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6af9191a-9cf3-47b8-9172-1f844e3f2d44/disk --force-share --output=json -f qcow2" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:54:14 compute-0 nova_compute[187185]: 2025-11-29 06:54:14.350 187189 DEBUG oslo_concurrency.processutils [None req-8b230dff-5771-43fc-9beb-948a34934f9d c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:54:14 compute-0 nova_compute[187185]: 2025-11-29 06:54:14.451 187189 DEBUG oslo_concurrency.processutils [None req-8b230dff-5771-43fc-9beb-948a34934f9d c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:54:14 compute-0 nova_compute[187185]: 2025-11-29 06:54:14.456 187189 DEBUG oslo_concurrency.processutils [None req-8b230dff-5771-43fc-9beb-948a34934f9d c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpuwlxp2h9/190831f8ee0c4662867a8fa15df0a27a.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:54:15 compute-0 nova_compute[187185]: 2025-11-29 06:54:15.298 187189 DEBUG oslo_concurrency.processutils [None req-8b230dff-5771-43fc-9beb-948a34934f9d c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpuwlxp2h9/190831f8ee0c4662867a8fa15df0a27a.delta 1073741824" returned: 0 in 0.842s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:54:15 compute-0 nova_compute[187185]: 2025-11-29 06:54:15.300 187189 INFO nova.virt.libvirt.driver [None req-8b230dff-5771-43fc-9beb-948a34934f9d c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Quiescing instance not available: QEMU guest agent is not enabled.
Nov 29 06:54:15 compute-0 nova_compute[187185]: 2025-11-29 06:54:15.350 187189 DEBUG nova.virt.libvirt.guest [None req-8b230dff-5771-43fc-9beb-948a34934f9d c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Nov 29 06:54:15 compute-0 nova_compute[187185]: 2025-11-29 06:54:15.355 187189 INFO nova.virt.libvirt.driver [None req-8b230dff-5771-43fc-9beb-948a34934f9d c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Skipping quiescing instance: QEMU guest agent is not enabled.
Nov 29 06:54:15 compute-0 nova_compute[187185]: 2025-11-29 06:54:15.519 187189 DEBUG nova.privsep.utils [None req-8b230dff-5771-43fc-9beb-948a34934f9d c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 29 06:54:15 compute-0 nova_compute[187185]: 2025-11-29 06:54:15.520 187189 DEBUG oslo_concurrency.processutils [None req-8b230dff-5771-43fc-9beb-948a34934f9d c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpuwlxp2h9/190831f8ee0c4662867a8fa15df0a27a.delta /var/lib/nova/instances/snapshots/tmpuwlxp2h9/190831f8ee0c4662867a8fa15df0a27a execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:54:15 compute-0 nova_compute[187185]: 2025-11-29 06:54:15.779 187189 DEBUG oslo_concurrency.processutils [None req-8b230dff-5771-43fc-9beb-948a34934f9d c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpuwlxp2h9/190831f8ee0c4662867a8fa15df0a27a.delta /var/lib/nova/instances/snapshots/tmpuwlxp2h9/190831f8ee0c4662867a8fa15df0a27a" returned: 0 in 0.259s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:54:15 compute-0 nova_compute[187185]: 2025-11-29 06:54:15.781 187189 INFO nova.virt.libvirt.driver [None req-8b230dff-5771-43fc-9beb-948a34934f9d c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Snapshot extracted, beginning image upload
Nov 29 06:54:16 compute-0 nova_compute[187185]: 2025-11-29 06:54:16.360 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:17 compute-0 nova_compute[187185]: 2025-11-29 06:54:17.839 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:18 compute-0 nova_compute[187185]: 2025-11-29 06:54:18.001 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399242.9990282, 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:54:18 compute-0 nova_compute[187185]: 2025-11-29 06:54:18.002 187189 INFO nova.compute.manager [-] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] VM Stopped (Lifecycle Event)
Nov 29 06:54:18 compute-0 nova_compute[187185]: 2025-11-29 06:54:18.023 187189 DEBUG nova.compute.manager [None req-667c12c8-85b0-41f4-8661-afa7f372727f - - - - - -] [instance: 0e798fb6-eb0e-40cc-a4ce-b4ff86357e92] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:54:18 compute-0 nova_compute[187185]: 2025-11-29 06:54:18.031 187189 INFO nova.virt.libvirt.driver [None req-8b230dff-5771-43fc-9beb-948a34934f9d c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Snapshot image upload complete
Nov 29 06:54:18 compute-0 nova_compute[187185]: 2025-11-29 06:54:18.031 187189 INFO nova.compute.manager [None req-8b230dff-5771-43fc-9beb-948a34934f9d c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Took 4.54 seconds to snapshot the instance on the hypervisor.
Nov 29 06:54:21 compute-0 nova_compute[187185]: 2025-11-29 06:54:21.365 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:21 compute-0 ovn_controller[95281]: 2025-11-29T06:54:21Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f0:49:52 10.100.0.11
Nov 29 06:54:21 compute-0 ovn_controller[95281]: 2025-11-29T06:54:21Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f0:49:52 10.100.0.11
Nov 29 06:54:22 compute-0 nova_compute[187185]: 2025-11-29 06:54:22.904 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:24.313 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 06:54:24 compute-0 nova_compute[187185]: 2025-11-29 06:54:24.314 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:24.315 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 06:54:24 compute-0 podman[217584]: 2025-11-29 06:54:24.791684627 +0000 UTC m=+0.057341671 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 06:54:24 compute-0 podman[217586]: 2025-11-29 06:54:24.799774357 +0000 UTC m=+0.058355340 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 06:54:24 compute-0 podman[217585]: 2025-11-29 06:54:24.799898361 +0000 UTC m=+0.065383660 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, config_id=edpm, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, architecture=x86_64)
Nov 29 06:54:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:24.813 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:54:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:24.814 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:54:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:24.815 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:54:26 compute-0 nova_compute[187185]: 2025-11-29 06:54:26.406 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:27 compute-0 sshd-session[217646]: Invalid user system from 179.125.24.202 port 37092
Nov 29 06:54:27 compute-0 sshd-session[217646]: Received disconnect from 179.125.24.202 port 37092:11: Bye Bye [preauth]
Nov 29 06:54:27 compute-0 sshd-session[217646]: Disconnected from invalid user system 179.125.24.202 port 37092 [preauth]
Nov 29 06:54:27 compute-0 nova_compute[187185]: 2025-11-29 06:54:27.912 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:31 compute-0 nova_compute[187185]: 2025-11-29 06:54:31.408 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:31 compute-0 podman[217648]: 2025-11-29 06:54:31.84900101 +0000 UTC m=+0.112603913 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:54:32 compute-0 nova_compute[187185]: 2025-11-29 06:54:32.964 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:33.318 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:54:33 compute-0 podman[217677]: 2025-11-29 06:54:33.836207397 +0000 UTC m=+0.081996482 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 06:54:33 compute-0 podman[217676]: 2025-11-29 06:54:33.837133423 +0000 UTC m=+0.094821817 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:54:34 compute-0 sshd-session[217674]: Invalid user bitrix from 103.179.56.44 port 49370
Nov 29 06:54:35 compute-0 sshd-session[217674]: Received disconnect from 103.179.56.44 port 49370:11: Bye Bye [preauth]
Nov 29 06:54:35 compute-0 sshd-session[217674]: Disconnected from invalid user bitrix 103.179.56.44 port 49370 [preauth]
Nov 29 06:54:36 compute-0 nova_compute[187185]: 2025-11-29 06:54:36.453 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:36 compute-0 nova_compute[187185]: 2025-11-29 06:54:36.826 187189 DEBUG oslo_concurrency.lockutils [None req-ca14f78a-67a7-4f45-a188-ff1dfac128d6 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Acquiring lock "6af9191a-9cf3-47b8-9172-1f844e3f2d44" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:54:36 compute-0 nova_compute[187185]: 2025-11-29 06:54:36.827 187189 DEBUG oslo_concurrency.lockutils [None req-ca14f78a-67a7-4f45-a188-ff1dfac128d6 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Lock "6af9191a-9cf3-47b8-9172-1f844e3f2d44" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:54:36 compute-0 nova_compute[187185]: 2025-11-29 06:54:36.827 187189 DEBUG oslo_concurrency.lockutils [None req-ca14f78a-67a7-4f45-a188-ff1dfac128d6 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Acquiring lock "6af9191a-9cf3-47b8-9172-1f844e3f2d44-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:54:36 compute-0 nova_compute[187185]: 2025-11-29 06:54:36.828 187189 DEBUG oslo_concurrency.lockutils [None req-ca14f78a-67a7-4f45-a188-ff1dfac128d6 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Lock "6af9191a-9cf3-47b8-9172-1f844e3f2d44-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:54:36 compute-0 nova_compute[187185]: 2025-11-29 06:54:36.829 187189 DEBUG oslo_concurrency.lockutils [None req-ca14f78a-67a7-4f45-a188-ff1dfac128d6 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Lock "6af9191a-9cf3-47b8-9172-1f844e3f2d44-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:54:36 compute-0 nova_compute[187185]: 2025-11-29 06:54:36.906 187189 INFO nova.compute.manager [None req-ca14f78a-67a7-4f45-a188-ff1dfac128d6 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Terminating instance
Nov 29 06:54:36 compute-0 nova_compute[187185]: 2025-11-29 06:54:36.921 187189 DEBUG oslo_concurrency.lockutils [None req-ca14f78a-67a7-4f45-a188-ff1dfac128d6 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Acquiring lock "refresh_cache-6af9191a-9cf3-47b8-9172-1f844e3f2d44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:54:36 compute-0 nova_compute[187185]: 2025-11-29 06:54:36.922 187189 DEBUG oslo_concurrency.lockutils [None req-ca14f78a-67a7-4f45-a188-ff1dfac128d6 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Acquired lock "refresh_cache-6af9191a-9cf3-47b8-9172-1f844e3f2d44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:54:36 compute-0 nova_compute[187185]: 2025-11-29 06:54:36.923 187189 DEBUG nova.network.neutron [None req-ca14f78a-67a7-4f45-a188-ff1dfac128d6 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 06:54:37 compute-0 nova_compute[187185]: 2025-11-29 06:54:37.160 187189 DEBUG nova.network.neutron [None req-ca14f78a-67a7-4f45-a188-ff1dfac128d6 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 06:54:37 compute-0 nova_compute[187185]: 2025-11-29 06:54:37.598 187189 DEBUG nova.network.neutron [None req-ca14f78a-67a7-4f45-a188-ff1dfac128d6 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:54:37 compute-0 nova_compute[187185]: 2025-11-29 06:54:37.619 187189 DEBUG oslo_concurrency.lockutils [None req-ca14f78a-67a7-4f45-a188-ff1dfac128d6 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Releasing lock "refresh_cache-6af9191a-9cf3-47b8-9172-1f844e3f2d44" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:54:37 compute-0 nova_compute[187185]: 2025-11-29 06:54:37.621 187189 DEBUG nova.compute.manager [None req-ca14f78a-67a7-4f45-a188-ff1dfac128d6 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 06:54:37 compute-0 ovn_controller[95281]: 2025-11-29T06:54:37Z|00084|binding|INFO|Releasing lport 9fe03714-858c-48b5-b3c9-530adc56ad95 from this chassis (sb_readonly=0)
Nov 29 06:54:37 compute-0 nova_compute[187185]: 2025-11-29 06:54:37.677 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:37 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000001e.scope: Deactivated successfully.
Nov 29 06:54:37 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000001e.scope: Consumed 14.057s CPU time.
Nov 29 06:54:37 compute-0 systemd-machined[153486]: Machine qemu-9-instance-0000001e terminated.
Nov 29 06:54:37 compute-0 nova_compute[187185]: 2025-11-29 06:54:37.889 187189 INFO nova.virt.libvirt.driver [-] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Instance destroyed successfully.
Nov 29 06:54:37 compute-0 nova_compute[187185]: 2025-11-29 06:54:37.891 187189 DEBUG nova.objects.instance [None req-ca14f78a-67a7-4f45-a188-ff1dfac128d6 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Lazy-loading 'resources' on Instance uuid 6af9191a-9cf3-47b8-9172-1f844e3f2d44 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:54:37 compute-0 nova_compute[187185]: 2025-11-29 06:54:37.914 187189 INFO nova.virt.libvirt.driver [None req-ca14f78a-67a7-4f45-a188-ff1dfac128d6 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Deleting instance files /var/lib/nova/instances/6af9191a-9cf3-47b8-9172-1f844e3f2d44_del
Nov 29 06:54:37 compute-0 nova_compute[187185]: 2025-11-29 06:54:37.915 187189 INFO nova.virt.libvirt.driver [None req-ca14f78a-67a7-4f45-a188-ff1dfac128d6 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Deletion of /var/lib/nova/instances/6af9191a-9cf3-47b8-9172-1f844e3f2d44_del complete
Nov 29 06:54:37 compute-0 nova_compute[187185]: 2025-11-29 06:54:37.966 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:38 compute-0 nova_compute[187185]: 2025-11-29 06:54:38.076 187189 INFO nova.compute.manager [None req-ca14f78a-67a7-4f45-a188-ff1dfac128d6 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Took 0.46 seconds to destroy the instance on the hypervisor.
Nov 29 06:54:38 compute-0 nova_compute[187185]: 2025-11-29 06:54:38.077 187189 DEBUG oslo.service.loopingcall [None req-ca14f78a-67a7-4f45-a188-ff1dfac128d6 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 06:54:38 compute-0 nova_compute[187185]: 2025-11-29 06:54:38.081 187189 DEBUG nova.compute.manager [-] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 06:54:38 compute-0 nova_compute[187185]: 2025-11-29 06:54:38.081 187189 DEBUG nova.network.neutron [-] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 06:54:38 compute-0 nova_compute[187185]: 2025-11-29 06:54:38.238 187189 DEBUG nova.network.neutron [-] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 06:54:38 compute-0 nova_compute[187185]: 2025-11-29 06:54:38.256 187189 DEBUG nova.network.neutron [-] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:54:38 compute-0 nova_compute[187185]: 2025-11-29 06:54:38.274 187189 INFO nova.compute.manager [-] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Took 0.19 seconds to deallocate network for instance.
Nov 29 06:54:38 compute-0 nova_compute[187185]: 2025-11-29 06:54:38.366 187189 DEBUG oslo_concurrency.lockutils [None req-ca14f78a-67a7-4f45-a188-ff1dfac128d6 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:54:38 compute-0 nova_compute[187185]: 2025-11-29 06:54:38.367 187189 DEBUG oslo_concurrency.lockutils [None req-ca14f78a-67a7-4f45-a188-ff1dfac128d6 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:54:38 compute-0 nova_compute[187185]: 2025-11-29 06:54:38.466 187189 DEBUG nova.compute.provider_tree [None req-ca14f78a-67a7-4f45-a188-ff1dfac128d6 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:54:38 compute-0 nova_compute[187185]: 2025-11-29 06:54:38.485 187189 DEBUG nova.scheduler.client.report [None req-ca14f78a-67a7-4f45-a188-ff1dfac128d6 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:54:38 compute-0 nova_compute[187185]: 2025-11-29 06:54:38.585 187189 DEBUG oslo_concurrency.lockutils [None req-ca14f78a-67a7-4f45-a188-ff1dfac128d6 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.218s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:54:38 compute-0 nova_compute[187185]: 2025-11-29 06:54:38.620 187189 INFO nova.scheduler.client.report [None req-ca14f78a-67a7-4f45-a188-ff1dfac128d6 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Deleted allocations for instance 6af9191a-9cf3-47b8-9172-1f844e3f2d44
Nov 29 06:54:39 compute-0 nova_compute[187185]: 2025-11-29 06:54:39.268 187189 DEBUG oslo_concurrency.lockutils [None req-ca14f78a-67a7-4f45-a188-ff1dfac128d6 c480a3bf2f8f485889154b20872eada2 34d1587b0fbb4c3cbf3c8c4a71d1a6be - - default default] Lock "6af9191a-9cf3-47b8-9172-1f844e3f2d44" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.441s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:54:40 compute-0 sshd-session[217729]: Invalid user hu from 1.214.197.163 port 47408
Nov 29 06:54:40 compute-0 sshd-session[217729]: Received disconnect from 1.214.197.163 port 47408:11: Bye Bye [preauth]
Nov 29 06:54:40 compute-0 sshd-session[217729]: Disconnected from invalid user hu 1.214.197.163 port 47408 [preauth]
Nov 29 06:54:41 compute-0 nova_compute[187185]: 2025-11-29 06:54:41.498 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:41 compute-0 nova_compute[187185]: 2025-11-29 06:54:41.539 187189 DEBUG oslo_concurrency.lockutils [None req-a2460934-2493-4b83-9e5e-4b7ddcf31982 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Acquiring lock "ac6c9936-a82b-4dc4-b2b3-3aaae70701ab" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:54:41 compute-0 nova_compute[187185]: 2025-11-29 06:54:41.540 187189 DEBUG oslo_concurrency.lockutils [None req-a2460934-2493-4b83-9e5e-4b7ddcf31982 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Lock "ac6c9936-a82b-4dc4-b2b3-3aaae70701ab" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:54:41 compute-0 nova_compute[187185]: 2025-11-29 06:54:41.541 187189 DEBUG oslo_concurrency.lockutils [None req-a2460934-2493-4b83-9e5e-4b7ddcf31982 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Acquiring lock "ac6c9936-a82b-4dc4-b2b3-3aaae70701ab-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:54:41 compute-0 nova_compute[187185]: 2025-11-29 06:54:41.541 187189 DEBUG oslo_concurrency.lockutils [None req-a2460934-2493-4b83-9e5e-4b7ddcf31982 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Lock "ac6c9936-a82b-4dc4-b2b3-3aaae70701ab-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:54:41 compute-0 nova_compute[187185]: 2025-11-29 06:54:41.541 187189 DEBUG oslo_concurrency.lockutils [None req-a2460934-2493-4b83-9e5e-4b7ddcf31982 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Lock "ac6c9936-a82b-4dc4-b2b3-3aaae70701ab-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:54:41 compute-0 nova_compute[187185]: 2025-11-29 06:54:41.560 187189 INFO nova.compute.manager [None req-a2460934-2493-4b83-9e5e-4b7ddcf31982 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Terminating instance
Nov 29 06:54:41 compute-0 nova_compute[187185]: 2025-11-29 06:54:41.571 187189 DEBUG nova.compute.manager [None req-a2460934-2493-4b83-9e5e-4b7ddcf31982 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 06:54:41 compute-0 kernel: tap59322bae-60 (unregistering): left promiscuous mode
Nov 29 06:54:41 compute-0 NetworkManager[55227]: <info>  [1764399281.5988] device (tap59322bae-60): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 06:54:41 compute-0 nova_compute[187185]: 2025-11-29 06:54:41.625 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:41 compute-0 ovn_controller[95281]: 2025-11-29T06:54:41Z|00085|binding|INFO|Releasing lport 59322bae-60f4-453d-8167-213727034e0a from this chassis (sb_readonly=0)
Nov 29 06:54:41 compute-0 ovn_controller[95281]: 2025-11-29T06:54:41Z|00086|binding|INFO|Setting lport 59322bae-60f4-453d-8167-213727034e0a down in Southbound
Nov 29 06:54:41 compute-0 ovn_controller[95281]: 2025-11-29T06:54:41Z|00087|binding|INFO|Removing iface tap59322bae-60 ovn-installed in OVS
Nov 29 06:54:41 compute-0 nova_compute[187185]: 2025-11-29 06:54:41.629 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:41.635 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:49:52 10.100.0.11'], port_security=['fa:16:3e:f0:49:52 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'ac6c9936-a82b-4dc4-b2b3-3aaae70701ab', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e21746f3-c39c-4f95-ab09-ca8da7420cb0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d0114f1ba5f4ee6bdcaf9c8cf95d744', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b4431f2a-96df-4c7a-a674-7ab10a2da638', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.222'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d3e8b222-3dc2-4009-a628-f393d12cd667, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=59322bae-60f4-453d-8167-213727034e0a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 06:54:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:41.638 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 59322bae-60f4-453d-8167-213727034e0a in datapath e21746f3-c39c-4f95-ab09-ca8da7420cb0 unbound from our chassis
Nov 29 06:54:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:41.642 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e21746f3-c39c-4f95-ab09-ca8da7420cb0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 06:54:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:41.646 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[634e0c11-cc02-4fcd-a43b-45e698942a1b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:54:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:41.649 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e21746f3-c39c-4f95-ab09-ca8da7420cb0 namespace which is not needed anymore
Nov 29 06:54:41 compute-0 nova_compute[187185]: 2025-11-29 06:54:41.651 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:41 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000001d.scope: Deactivated successfully.
Nov 29 06:54:41 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000001d.scope: Consumed 13.838s CPU time.
Nov 29 06:54:41 compute-0 systemd-machined[153486]: Machine qemu-10-instance-0000001d terminated.
Nov 29 06:54:41 compute-0 neutron-haproxy-ovnmeta-e21746f3-c39c-4f95-ab09-ca8da7420cb0[217498]: [NOTICE]   (217502) : haproxy version is 2.8.14-c23fe91
Nov 29 06:54:41 compute-0 neutron-haproxy-ovnmeta-e21746f3-c39c-4f95-ab09-ca8da7420cb0[217498]: [NOTICE]   (217502) : path to executable is /usr/sbin/haproxy
Nov 29 06:54:41 compute-0 neutron-haproxy-ovnmeta-e21746f3-c39c-4f95-ab09-ca8da7420cb0[217498]: [WARNING]  (217502) : Exiting Master process...
Nov 29 06:54:41 compute-0 neutron-haproxy-ovnmeta-e21746f3-c39c-4f95-ab09-ca8da7420cb0[217498]: [ALERT]    (217502) : Current worker (217504) exited with code 143 (Terminated)
Nov 29 06:54:41 compute-0 neutron-haproxy-ovnmeta-e21746f3-c39c-4f95-ab09-ca8da7420cb0[217498]: [WARNING]  (217502) : All workers exited. Exiting... (0)
Nov 29 06:54:41 compute-0 systemd[1]: libpod-c88cc16fb00948d46364cd5416c0b1514bb6ac56cc03e2d602f1559dfbb22442.scope: Deactivated successfully.
Nov 29 06:54:41 compute-0 podman[217755]: 2025-11-29 06:54:41.828592895 +0000 UTC m=+0.059089231 container died c88cc16fb00948d46364cd5416c0b1514bb6ac56cc03e2d602f1559dfbb22442 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e21746f3-c39c-4f95-ab09-ca8da7420cb0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 29 06:54:41 compute-0 nova_compute[187185]: 2025-11-29 06:54:41.850 187189 INFO nova.virt.libvirt.driver [-] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Instance destroyed successfully.
Nov 29 06:54:41 compute-0 nova_compute[187185]: 2025-11-29 06:54:41.851 187189 DEBUG nova.objects.instance [None req-a2460934-2493-4b83-9e5e-4b7ddcf31982 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Lazy-loading 'resources' on Instance uuid ac6c9936-a82b-4dc4-b2b3-3aaae70701ab obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:54:41 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c88cc16fb00948d46364cd5416c0b1514bb6ac56cc03e2d602f1559dfbb22442-userdata-shm.mount: Deactivated successfully.
Nov 29 06:54:41 compute-0 systemd[1]: var-lib-containers-storage-overlay-633b69d72dfb0c9db51fe512107d02c13df108631b865afab2105ba479305aba-merged.mount: Deactivated successfully.
Nov 29 06:54:41 compute-0 podman[217755]: 2025-11-29 06:54:41.88259117 +0000 UTC m=+0.113087536 container cleanup c88cc16fb00948d46364cd5416c0b1514bb6ac56cc03e2d602f1559dfbb22442 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e21746f3-c39c-4f95-ab09-ca8da7420cb0, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 06:54:41 compute-0 systemd[1]: libpod-conmon-c88cc16fb00948d46364cd5416c0b1514bb6ac56cc03e2d602f1559dfbb22442.scope: Deactivated successfully.
Nov 29 06:54:41 compute-0 podman[217802]: 2025-11-29 06:54:41.956181272 +0000 UTC m=+0.044130405 container remove c88cc16fb00948d46364cd5416c0b1514bb6ac56cc03e2d602f1559dfbb22442 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e21746f3-c39c-4f95-ab09-ca8da7420cb0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 29 06:54:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:41.963 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[a870ac89-f93e-413d-bcd3-0e2a9df7ffe4]: (4, ('Sat Nov 29 06:54:41 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-e21746f3-c39c-4f95-ab09-ca8da7420cb0 (c88cc16fb00948d46364cd5416c0b1514bb6ac56cc03e2d602f1559dfbb22442)\nc88cc16fb00948d46364cd5416c0b1514bb6ac56cc03e2d602f1559dfbb22442\nSat Nov 29 06:54:41 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-e21746f3-c39c-4f95-ab09-ca8da7420cb0 (c88cc16fb00948d46364cd5416c0b1514bb6ac56cc03e2d602f1559dfbb22442)\nc88cc16fb00948d46364cd5416c0b1514bb6ac56cc03e2d602f1559dfbb22442\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:54:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:41.966 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[eb6fe669-2827-4676-9a11-dfa9852edd7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:54:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:41.967 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape21746f3-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:54:41 compute-0 nova_compute[187185]: 2025-11-29 06:54:41.969 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:41 compute-0 kernel: tape21746f3-c0: left promiscuous mode
Nov 29 06:54:41 compute-0 nova_compute[187185]: 2025-11-29 06:54:41.994 187189 DEBUG nova.compute.manager [req-1b1a3bfd-aa71-4703-9072-2fe2d42daf4a req-3c5fc8a4-6c0f-4744-a306-afc6e9956a86 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Received event network-vif-unplugged-59322bae-60f4-453d-8167-213727034e0a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:54:41 compute-0 nova_compute[187185]: 2025-11-29 06:54:41.995 187189 DEBUG oslo_concurrency.lockutils [req-1b1a3bfd-aa71-4703-9072-2fe2d42daf4a req-3c5fc8a4-6c0f-4744-a306-afc6e9956a86 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "ac6c9936-a82b-4dc4-b2b3-3aaae70701ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:54:41 compute-0 nova_compute[187185]: 2025-11-29 06:54:41.995 187189 DEBUG oslo_concurrency.lockutils [req-1b1a3bfd-aa71-4703-9072-2fe2d42daf4a req-3c5fc8a4-6c0f-4744-a306-afc6e9956a86 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ac6c9936-a82b-4dc4-b2b3-3aaae70701ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:54:41 compute-0 nova_compute[187185]: 2025-11-29 06:54:41.995 187189 DEBUG oslo_concurrency.lockutils [req-1b1a3bfd-aa71-4703-9072-2fe2d42daf4a req-3c5fc8a4-6c0f-4744-a306-afc6e9956a86 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ac6c9936-a82b-4dc4-b2b3-3aaae70701ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:54:41 compute-0 nova_compute[187185]: 2025-11-29 06:54:41.995 187189 DEBUG nova.compute.manager [req-1b1a3bfd-aa71-4703-9072-2fe2d42daf4a req-3c5fc8a4-6c0f-4744-a306-afc6e9956a86 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] No waiting events found dispatching network-vif-unplugged-59322bae-60f4-453d-8167-213727034e0a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 06:54:41 compute-0 nova_compute[187185]: 2025-11-29 06:54:41.995 187189 DEBUG nova.compute.manager [req-1b1a3bfd-aa71-4703-9072-2fe2d42daf4a req-3c5fc8a4-6c0f-4744-a306-afc6e9956a86 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Received event network-vif-unplugged-59322bae-60f4-453d-8167-213727034e0a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 06:54:41 compute-0 nova_compute[187185]: 2025-11-29 06:54:41.996 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:41 compute-0 nova_compute[187185]: 2025-11-29 06:54:41.997 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:42.000 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[19f5a0c9-e1da-413b-8978-33fa7a38d207]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:54:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:42.017 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d9fa6f25-112b-4f13-ba10-87d5c75c6d62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:54:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:42.020 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[c948aa0d-20ec-4729-a275-88d3fa5414a8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:54:42 compute-0 nova_compute[187185]: 2025-11-29 06:54:42.026 187189 DEBUG nova.virt.libvirt.vif [None req-a2460934-2493-4b83-9e5e-4b7ddcf31982 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] vif_type=ovs instance=Instance(access_ip_v4=2.2.2.2,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T06:53:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='guest-instance-1.domain.com',display_name='guest-instance-1.domain.com',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='guest-instance-1-domain-com',id=29,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI0h1tyJ10sGVfgbtaTfaukoV4S7P+aEjK9eLgPUiG9CPE4jsFdm+3ZESYhXBmq2VKuHkx2B1jNhgo5RY3RmEHOxb4evCuMePeLG8sRUx+ZQvFMNXMQJhi2ttCvkNaBsRw==',key_name='tempest-keypair-992509475',keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:54:07Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4d0114f1ba5f4ee6bdcaf9c8cf95d744',ramdisk_id='',reservation_id='r-np7rmq9z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestFqdnHostnames-645710174',owner_user_name='tempest-ServersTestFqdnHostnames-645710174-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T06:54:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='4ba956b833ca4d31936e5917bd1c2e96',uuid=ac6c9936-a82b-4dc4-b2b3-3aaae70701ab,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "59322bae-60f4-453d-8167-213727034e0a", "address": "fa:16:3e:f0:49:52", "network": {"id": "e21746f3-c39c-4f95-ab09-ca8da7420cb0", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-835937901-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d0114f1ba5f4ee6bdcaf9c8cf95d744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap59322bae-60", "ovs_interfaceid": "59322bae-60f4-453d-8167-213727034e0a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 06:54:42 compute-0 nova_compute[187185]: 2025-11-29 06:54:42.027 187189 DEBUG nova.network.os_vif_util [None req-a2460934-2493-4b83-9e5e-4b7ddcf31982 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Converting VIF {"id": "59322bae-60f4-453d-8167-213727034e0a", "address": "fa:16:3e:f0:49:52", "network": {"id": "e21746f3-c39c-4f95-ab09-ca8da7420cb0", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-835937901-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d0114f1ba5f4ee6bdcaf9c8cf95d744", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap59322bae-60", "ovs_interfaceid": "59322bae-60f4-453d-8167-213727034e0a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 06:54:42 compute-0 nova_compute[187185]: 2025-11-29 06:54:42.028 187189 DEBUG nova.network.os_vif_util [None req-a2460934-2493-4b83-9e5e-4b7ddcf31982 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f0:49:52,bridge_name='br-int',has_traffic_filtering=True,id=59322bae-60f4-453d-8167-213727034e0a,network=Network(e21746f3-c39c-4f95-ab09-ca8da7420cb0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap59322bae-60') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 06:54:42 compute-0 nova_compute[187185]: 2025-11-29 06:54:42.028 187189 DEBUG os_vif [None req-a2460934-2493-4b83-9e5e-4b7ddcf31982 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f0:49:52,bridge_name='br-int',has_traffic_filtering=True,id=59322bae-60f4-453d-8167-213727034e0a,network=Network(e21746f3-c39c-4f95-ab09-ca8da7420cb0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap59322bae-60') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 06:54:42 compute-0 nova_compute[187185]: 2025-11-29 06:54:42.031 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:42 compute-0 nova_compute[187185]: 2025-11-29 06:54:42.031 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap59322bae-60, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:54:42 compute-0 nova_compute[187185]: 2025-11-29 06:54:42.033 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:42 compute-0 nova_compute[187185]: 2025-11-29 06:54:42.035 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:42 compute-0 nova_compute[187185]: 2025-11-29 06:54:42.041 187189 INFO os_vif [None req-a2460934-2493-4b83-9e5e-4b7ddcf31982 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f0:49:52,bridge_name='br-int',has_traffic_filtering=True,id=59322bae-60f4-453d-8167-213727034e0a,network=Network(e21746f3-c39c-4f95-ab09-ca8da7420cb0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap59322bae-60')
Nov 29 06:54:42 compute-0 nova_compute[187185]: 2025-11-29 06:54:42.042 187189 INFO nova.virt.libvirt.driver [None req-a2460934-2493-4b83-9e5e-4b7ddcf31982 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Deleting instance files /var/lib/nova/instances/ac6c9936-a82b-4dc4-b2b3-3aaae70701ab_del
Nov 29 06:54:42 compute-0 nova_compute[187185]: 2025-11-29 06:54:42.043 187189 INFO nova.virt.libvirt.driver [None req-a2460934-2493-4b83-9e5e-4b7ddcf31982 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Deletion of /var/lib/nova/instances/ac6c9936-a82b-4dc4-b2b3-3aaae70701ab_del complete
Nov 29 06:54:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:42.049 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[afe12513-e8f2-4efb-929d-710a13fc3a79]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 469452, 'reachable_time': 35417, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217821, 'error': None, 'target': 'ovnmeta-e21746f3-c39c-4f95-ab09-ca8da7420cb0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:54:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:42.054 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e21746f3-c39c-4f95-ab09-ca8da7420cb0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 06:54:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:54:42.054 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[9c1e3c12-8431-4e43-bb4a-159cfd40662b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:54:42 compute-0 systemd[1]: run-netns-ovnmeta\x2de21746f3\x2dc39c\x2d4f95\x2dab09\x2dca8da7420cb0.mount: Deactivated successfully.
Nov 29 06:54:42 compute-0 nova_compute[187185]: 2025-11-29 06:54:42.140 187189 INFO nova.compute.manager [None req-a2460934-2493-4b83-9e5e-4b7ddcf31982 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Took 0.57 seconds to destroy the instance on the hypervisor.
Nov 29 06:54:42 compute-0 nova_compute[187185]: 2025-11-29 06:54:42.141 187189 DEBUG oslo.service.loopingcall [None req-a2460934-2493-4b83-9e5e-4b7ddcf31982 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 06:54:42 compute-0 nova_compute[187185]: 2025-11-29 06:54:42.141 187189 DEBUG nova.compute.manager [-] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 06:54:42 compute-0 nova_compute[187185]: 2025-11-29 06:54:42.141 187189 DEBUG nova.network.neutron [-] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 06:54:43 compute-0 nova_compute[187185]: 2025-11-29 06:54:43.009 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:43 compute-0 nova_compute[187185]: 2025-11-29 06:54:43.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:54:43 compute-0 nova_compute[187185]: 2025-11-29 06:54:43.318 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 06:54:43 compute-0 nova_compute[187185]: 2025-11-29 06:54:43.319 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 06:54:43 compute-0 nova_compute[187185]: 2025-11-29 06:54:43.349 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875
Nov 29 06:54:43 compute-0 nova_compute[187185]: 2025-11-29 06:54:43.349 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 06:54:43 compute-0 nova_compute[187185]: 2025-11-29 06:54:43.350 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:54:43 compute-0 nova_compute[187185]: 2025-11-29 06:54:43.351 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:54:43 compute-0 nova_compute[187185]: 2025-11-29 06:54:43.738 187189 DEBUG nova.network.neutron [-] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:54:43 compute-0 nova_compute[187185]: 2025-11-29 06:54:43.765 187189 INFO nova.compute.manager [-] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Took 1.62 seconds to deallocate network for instance.
Nov 29 06:54:43 compute-0 nova_compute[187185]: 2025-11-29 06:54:43.797 187189 DEBUG nova.compute.manager [req-49b608b6-8ef2-4d90-96df-d2abd356f798 req-3d1115d6-cad8-42dc-b83a-2c1941d80025 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Received event network-vif-deleted-59322bae-60f4-453d-8167-213727034e0a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:54:43 compute-0 podman[217822]: 2025-11-29 06:54:43.839081135 +0000 UTC m=+0.088183368 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3)
Nov 29 06:54:43 compute-0 nova_compute[187185]: 2025-11-29 06:54:43.871 187189 DEBUG oslo_concurrency.lockutils [None req-a2460934-2493-4b83-9e5e-4b7ddcf31982 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:54:43 compute-0 nova_compute[187185]: 2025-11-29 06:54:43.872 187189 DEBUG oslo_concurrency.lockutils [None req-a2460934-2493-4b83-9e5e-4b7ddcf31982 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:54:43 compute-0 nova_compute[187185]: 2025-11-29 06:54:43.933 187189 DEBUG nova.compute.provider_tree [None req-a2460934-2493-4b83-9e5e-4b7ddcf31982 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:54:43 compute-0 nova_compute[187185]: 2025-11-29 06:54:43.958 187189 DEBUG nova.scheduler.client.report [None req-a2460934-2493-4b83-9e5e-4b7ddcf31982 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:54:43 compute-0 nova_compute[187185]: 2025-11-29 06:54:43.988 187189 DEBUG oslo_concurrency.lockutils [None req-a2460934-2493-4b83-9e5e-4b7ddcf31982 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:54:44 compute-0 nova_compute[187185]: 2025-11-29 06:54:44.079 187189 INFO nova.scheduler.client.report [None req-a2460934-2493-4b83-9e5e-4b7ddcf31982 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Deleted allocations for instance ac6c9936-a82b-4dc4-b2b3-3aaae70701ab
Nov 29 06:54:44 compute-0 nova_compute[187185]: 2025-11-29 06:54:44.159 187189 DEBUG oslo_concurrency.lockutils [None req-a2460934-2493-4b83-9e5e-4b7ddcf31982 4ba956b833ca4d31936e5917bd1c2e96 4d0114f1ba5f4ee6bdcaf9c8cf95d744 - - default default] Lock "ac6c9936-a82b-4dc4-b2b3-3aaae70701ab" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.619s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:54:44 compute-0 nova_compute[187185]: 2025-11-29 06:54:44.162 187189 DEBUG nova.compute.manager [req-f146aec3-4343-4bc8-99ee-189fd01daeb8 req-4b868b34-56c3-4ccc-aecd-e8cfc116f44e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Received event network-vif-plugged-59322bae-60f4-453d-8167-213727034e0a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:54:44 compute-0 nova_compute[187185]: 2025-11-29 06:54:44.162 187189 DEBUG oslo_concurrency.lockutils [req-f146aec3-4343-4bc8-99ee-189fd01daeb8 req-4b868b34-56c3-4ccc-aecd-e8cfc116f44e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "ac6c9936-a82b-4dc4-b2b3-3aaae70701ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:54:44 compute-0 nova_compute[187185]: 2025-11-29 06:54:44.162 187189 DEBUG oslo_concurrency.lockutils [req-f146aec3-4343-4bc8-99ee-189fd01daeb8 req-4b868b34-56c3-4ccc-aecd-e8cfc116f44e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ac6c9936-a82b-4dc4-b2b3-3aaae70701ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:54:44 compute-0 nova_compute[187185]: 2025-11-29 06:54:44.163 187189 DEBUG oslo_concurrency.lockutils [req-f146aec3-4343-4bc8-99ee-189fd01daeb8 req-4b868b34-56c3-4ccc-aecd-e8cfc116f44e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ac6c9936-a82b-4dc4-b2b3-3aaae70701ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:54:44 compute-0 nova_compute[187185]: 2025-11-29 06:54:44.163 187189 DEBUG nova.compute.manager [req-f146aec3-4343-4bc8-99ee-189fd01daeb8 req-4b868b34-56c3-4ccc-aecd-e8cfc116f44e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] No waiting events found dispatching network-vif-plugged-59322bae-60f4-453d-8167-213727034e0a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 06:54:44 compute-0 nova_compute[187185]: 2025-11-29 06:54:44.163 187189 WARNING nova.compute.manager [req-f146aec3-4343-4bc8-99ee-189fd01daeb8 req-4b868b34-56c3-4ccc-aecd-e8cfc116f44e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Received unexpected event network-vif-plugged-59322bae-60f4-453d-8167-213727034e0a for instance with vm_state deleted and task_state None.
Nov 29 06:54:44 compute-0 nova_compute[187185]: 2025-11-29 06:54:44.310 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:44 compute-0 nova_compute[187185]: 2025-11-29 06:54:44.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:54:44 compute-0 nova_compute[187185]: 2025-11-29 06:54:44.492 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:45 compute-0 nova_compute[187185]: 2025-11-29 06:54:45.311 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:54:46 compute-0 nova_compute[187185]: 2025-11-29 06:54:46.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:54:46 compute-0 nova_compute[187185]: 2025-11-29 06:54:46.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:54:46 compute-0 nova_compute[187185]: 2025-11-29 06:54:46.339 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:54:46 compute-0 nova_compute[187185]: 2025-11-29 06:54:46.340 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:54:46 compute-0 nova_compute[187185]: 2025-11-29 06:54:46.340 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:54:46 compute-0 nova_compute[187185]: 2025-11-29 06:54:46.341 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 06:54:46 compute-0 nova_compute[187185]: 2025-11-29 06:54:46.516 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:54:46 compute-0 nova_compute[187185]: 2025-11-29 06:54:46.517 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5721MB free_disk=73.33886337280273GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 06:54:46 compute-0 nova_compute[187185]: 2025-11-29 06:54:46.517 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:54:46 compute-0 nova_compute[187185]: 2025-11-29 06:54:46.517 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:54:46 compute-0 nova_compute[187185]: 2025-11-29 06:54:46.633 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 06:54:46 compute-0 nova_compute[187185]: 2025-11-29 06:54:46.634 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 06:54:46 compute-0 nova_compute[187185]: 2025-11-29 06:54:46.664 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:54:46 compute-0 nova_compute[187185]: 2025-11-29 06:54:46.676 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:54:46 compute-0 nova_compute[187185]: 2025-11-29 06:54:46.701 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 06:54:46 compute-0 nova_compute[187185]: 2025-11-29 06:54:46.702 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.184s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:54:47 compute-0 nova_compute[187185]: 2025-11-29 06:54:47.033 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:47 compute-0 nova_compute[187185]: 2025-11-29 06:54:47.702 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:54:47 compute-0 nova_compute[187185]: 2025-11-29 06:54:47.703 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:54:47 compute-0 nova_compute[187185]: 2025-11-29 06:54:47.703 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 06:54:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:54:47.989 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:54:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:54:47.991 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:54:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:54:47.991 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:54:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:54:47.991 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:54:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:54:47.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:54:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:54:47.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:54:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:54:47.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:54:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:54:47.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:54:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:54:47.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:54:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:54:47.993 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:54:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:54:47.993 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:54:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:54:47.993 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:54:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:54:47.993 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:54:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:54:47.993 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:54:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:54:47.993 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:54:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:54:47.994 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:54:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:54:47.994 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:54:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:54:47.994 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:54:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:54:47.994 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:54:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:54:47.994 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:54:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:54:47.994 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:54:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:54:47.995 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:54:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:54:47.995 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:54:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:54:47.995 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:54:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:54:47.995 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 06:54:48 compute-0 nova_compute[187185]: 2025-11-29 06:54:48.010 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:52 compute-0 nova_compute[187185]: 2025-11-29 06:54:52.035 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:52 compute-0 nova_compute[187185]: 2025-11-29 06:54:52.886 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399277.883386, 6af9191a-9cf3-47b8-9172-1f844e3f2d44 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:54:52 compute-0 nova_compute[187185]: 2025-11-29 06:54:52.887 187189 INFO nova.compute.manager [-] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] VM Stopped (Lifecycle Event)
Nov 29 06:54:52 compute-0 nova_compute[187185]: 2025-11-29 06:54:52.930 187189 DEBUG nova.compute.manager [None req-71008666-aaf0-4a4c-b04b-3e26be77d05f - - - - - -] [instance: 6af9191a-9cf3-47b8-9172-1f844e3f2d44] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:54:53 compute-0 nova_compute[187185]: 2025-11-29 06:54:53.059 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:55 compute-0 podman[217847]: 2025-11-29 06:54:55.823203362 +0000 UTC m=+0.072823721 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 06:54:55 compute-0 podman[217846]: 2025-11-29 06:54:55.843028796 +0000 UTC m=+0.091746560 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, release=1755695350, container_name=openstack_network_exporter, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, maintainer=Red Hat, Inc., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, build-date=2025-08-20T13:12:41)
Nov 29 06:54:55 compute-0 podman[217845]: 2025-11-29 06:54:55.868001776 +0000 UTC m=+0.115224647 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent)
Nov 29 06:54:56 compute-0 nova_compute[187185]: 2025-11-29 06:54:56.848 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399281.8446848, ac6c9936-a82b-4dc4-b2b3-3aaae70701ab => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:54:56 compute-0 nova_compute[187185]: 2025-11-29 06:54:56.849 187189 INFO nova.compute.manager [-] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] VM Stopped (Lifecycle Event)
Nov 29 06:54:56 compute-0 nova_compute[187185]: 2025-11-29 06:54:56.894 187189 DEBUG nova.compute.manager [None req-198f140f-5425-4360-b1d0-389c910a98fe - - - - - -] [instance: ac6c9936-a82b-4dc4-b2b3-3aaae70701ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:54:57 compute-0 nova_compute[187185]: 2025-11-29 06:54:57.037 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:58 compute-0 nova_compute[187185]: 2025-11-29 06:54:58.063 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:54:58 compute-0 nova_compute[187185]: 2025-11-29 06:54:58.745 187189 DEBUG oslo_concurrency.lockutils [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "66b9235f-7cc8-40d4-877b-b690613298a4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:54:58 compute-0 nova_compute[187185]: 2025-11-29 06:54:58.746 187189 DEBUG oslo_concurrency.lockutils [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "66b9235f-7cc8-40d4-877b-b690613298a4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:54:58 compute-0 nova_compute[187185]: 2025-11-29 06:54:58.762 187189 DEBUG nova.compute.manager [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 06:54:59 compute-0 nova_compute[187185]: 2025-11-29 06:54:59.159 187189 DEBUG oslo_concurrency.lockutils [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:54:59 compute-0 nova_compute[187185]: 2025-11-29 06:54:59.159 187189 DEBUG oslo_concurrency.lockutils [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:54:59 compute-0 nova_compute[187185]: 2025-11-29 06:54:59.167 187189 DEBUG nova.virt.hardware [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 06:54:59 compute-0 nova_compute[187185]: 2025-11-29 06:54:59.167 187189 INFO nova.compute.claims [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Claim successful on node compute-0.ctlplane.example.com
Nov 29 06:54:59 compute-0 nova_compute[187185]: 2025-11-29 06:54:59.355 187189 DEBUG nova.compute.provider_tree [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:54:59 compute-0 nova_compute[187185]: 2025-11-29 06:54:59.373 187189 DEBUG nova.scheduler.client.report [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:54:59 compute-0 nova_compute[187185]: 2025-11-29 06:54:59.397 187189 DEBUG oslo_concurrency.lockutils [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.238s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:54:59 compute-0 nova_compute[187185]: 2025-11-29 06:54:59.398 187189 DEBUG nova.compute.manager [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 06:54:59 compute-0 nova_compute[187185]: 2025-11-29 06:54:59.474 187189 DEBUG nova.compute.manager [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 06:54:59 compute-0 nova_compute[187185]: 2025-11-29 06:54:59.475 187189 DEBUG nova.network.neutron [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 06:54:59 compute-0 nova_compute[187185]: 2025-11-29 06:54:59.533 187189 INFO nova.virt.libvirt.driver [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 06:54:59 compute-0 nova_compute[187185]: 2025-11-29 06:54:59.563 187189 DEBUG nova.compute.manager [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 06:54:59 compute-0 nova_compute[187185]: 2025-11-29 06:54:59.774 187189 DEBUG nova.compute.manager [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 06:54:59 compute-0 nova_compute[187185]: 2025-11-29 06:54:59.776 187189 DEBUG nova.virt.libvirt.driver [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 06:54:59 compute-0 nova_compute[187185]: 2025-11-29 06:54:59.776 187189 INFO nova.virt.libvirt.driver [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Creating image(s)
Nov 29 06:54:59 compute-0 nova_compute[187185]: 2025-11-29 06:54:59.777 187189 DEBUG oslo_concurrency.lockutils [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "/var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:54:59 compute-0 nova_compute[187185]: 2025-11-29 06:54:59.778 187189 DEBUG oslo_concurrency.lockutils [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "/var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:54:59 compute-0 nova_compute[187185]: 2025-11-29 06:54:59.779 187189 DEBUG oslo_concurrency.lockutils [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "/var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:54:59 compute-0 nova_compute[187185]: 2025-11-29 06:54:59.800 187189 DEBUG oslo_concurrency.processutils [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:54:59 compute-0 nova_compute[187185]: 2025-11-29 06:54:59.882 187189 DEBUG oslo_concurrency.processutils [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:54:59 compute-0 nova_compute[187185]: 2025-11-29 06:54:59.883 187189 DEBUG oslo_concurrency.lockutils [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:54:59 compute-0 nova_compute[187185]: 2025-11-29 06:54:59.884 187189 DEBUG oslo_concurrency.lockutils [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:54:59 compute-0 nova_compute[187185]: 2025-11-29 06:54:59.894 187189 DEBUG oslo_concurrency.processutils [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:54:59 compute-0 nova_compute[187185]: 2025-11-29 06:54:59.927 187189 DEBUG oslo_concurrency.lockutils [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "6171b282-fd8f-42ba-9875-2ae45ea80365" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:54:59 compute-0 nova_compute[187185]: 2025-11-29 06:54:59.928 187189 DEBUG oslo_concurrency.lockutils [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "6171b282-fd8f-42ba-9875-2ae45ea80365" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:54:59 compute-0 nova_compute[187185]: 2025-11-29 06:54:59.949 187189 DEBUG nova.compute.manager [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 06:54:59 compute-0 nova_compute[187185]: 2025-11-29 06:54:59.961 187189 DEBUG oslo_concurrency.processutils [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:54:59 compute-0 nova_compute[187185]: 2025-11-29 06:54:59.961 187189 DEBUG oslo_concurrency.processutils [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.115 187189 DEBUG oslo_concurrency.lockutils [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.115 187189 DEBUG oslo_concurrency.lockutils [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.122 187189 DEBUG nova.virt.hardware [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.122 187189 INFO nova.compute.claims [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Claim successful on node compute-0.ctlplane.example.com
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.376 187189 DEBUG nova.compute.provider_tree [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.393 187189 DEBUG nova.scheduler.client.report [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.420 187189 DEBUG oslo_concurrency.lockutils [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.305s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.421 187189 DEBUG nova.compute.manager [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.480 187189 DEBUG nova.compute.manager [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.480 187189 DEBUG nova.network.neutron [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.510 187189 INFO nova.virt.libvirt.driver [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.532 187189 DEBUG nova.compute.manager [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.674 187189 DEBUG nova.compute.manager [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.676 187189 DEBUG nova.virt.libvirt.driver [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.677 187189 INFO nova.virt.libvirt.driver [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Creating image(s)
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.677 187189 DEBUG oslo_concurrency.lockutils [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "/var/lib/nova/instances/6171b282-fd8f-42ba-9875-2ae45ea80365/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.677 187189 DEBUG oslo_concurrency.lockutils [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "/var/lib/nova/instances/6171b282-fd8f-42ba-9875-2ae45ea80365/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.678 187189 DEBUG oslo_concurrency.lockutils [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "/var/lib/nova/instances/6171b282-fd8f-42ba-9875-2ae45ea80365/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.690 187189 DEBUG oslo_concurrency.processutils [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.719 187189 DEBUG nova.network.neutron [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.720 187189 DEBUG nova.compute.manager [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.723 187189 DEBUG oslo_concurrency.processutils [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk 1073741824" returned: 0 in 0.762s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.724 187189 DEBUG oslo_concurrency.lockutils [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.840s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.724 187189 DEBUG oslo_concurrency.processutils [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.787 187189 DEBUG oslo_concurrency.processutils [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.788 187189 DEBUG oslo_concurrency.lockutils [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.789 187189 DEBUG oslo_concurrency.lockutils [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.800 187189 DEBUG oslo_concurrency.processutils [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.827 187189 DEBUG oslo_concurrency.processutils [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.103s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.829 187189 DEBUG nova.virt.disk.api [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Checking if we can resize image /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.829 187189 DEBUG oslo_concurrency.processutils [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.882 187189 DEBUG oslo_concurrency.processutils [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.883 187189 DEBUG oslo_concurrency.processutils [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/6171b282-fd8f-42ba-9875-2ae45ea80365/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.904 187189 DEBUG nova.policy [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.918 187189 DEBUG oslo_concurrency.processutils [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.919 187189 DEBUG nova.virt.disk.api [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Cannot resize image /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.919 187189 DEBUG nova.objects.instance [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lazy-loading 'migration_context' on Instance uuid 66b9235f-7cc8-40d4-877b-b690613298a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.940 187189 DEBUG nova.virt.libvirt.driver [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.941 187189 DEBUG nova.virt.libvirt.driver [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Ensure instance console log exists: /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.941 187189 DEBUG oslo_concurrency.lockutils [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.942 187189 DEBUG oslo_concurrency.lockutils [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.942 187189 DEBUG oslo_concurrency.lockutils [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.943 187189 DEBUG nova.virt.libvirt.driver [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.949 187189 WARNING nova.virt.libvirt.driver [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.954 187189 DEBUG nova.virt.libvirt.host [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.955 187189 DEBUG nova.virt.libvirt.host [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.962 187189 DEBUG nova.virt.libvirt.host [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.963 187189 DEBUG nova.virt.libvirt.host [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.964 187189 DEBUG nova.virt.libvirt.driver [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.964 187189 DEBUG nova.virt.hardware [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.965 187189 DEBUG nova.virt.hardware [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.965 187189 DEBUG nova.virt.hardware [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.965 187189 DEBUG nova.virt.hardware [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.965 187189 DEBUG nova.virt.hardware [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.966 187189 DEBUG nova.virt.hardware [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.966 187189 DEBUG nova.virt.hardware [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.966 187189 DEBUG nova.virt.hardware [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.966 187189 DEBUG nova.virt.hardware [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.967 187189 DEBUG nova.virt.hardware [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.967 187189 DEBUG nova.virt.hardware [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 06:55:00 compute-0 nova_compute[187185]: 2025-11-29 06:55:00.971 187189 DEBUG nova.objects.instance [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lazy-loading 'pci_devices' on Instance uuid 66b9235f-7cc8-40d4-877b-b690613298a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:55:01 compute-0 nova_compute[187185]: 2025-11-29 06:55:01.002 187189 DEBUG nova.virt.libvirt.driver [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] End _get_guest_xml xml=<domain type="kvm">
Nov 29 06:55:01 compute-0 nova_compute[187185]:   <uuid>66b9235f-7cc8-40d4-877b-b690613298a4</uuid>
Nov 29 06:55:01 compute-0 nova_compute[187185]:   <name>instance-00000021</name>
Nov 29 06:55:01 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 06:55:01 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 06:55:01 compute-0 nova_compute[187185]:   <metadata>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 06:55:01 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:       <nova:name>tempest-MigrationsAdminTest-server-2086906237</nova:name>
Nov 29 06:55:01 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 06:55:00</nova:creationTime>
Nov 29 06:55:01 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 06:55:01 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 06:55:01 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 06:55:01 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 06:55:01 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 06:55:01 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 06:55:01 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 06:55:01 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 06:55:01 compute-0 nova_compute[187185]:         <nova:user uuid="53ee944c04484336b9b14d84235a62b8">tempest-MigrationsAdminTest-1601255173-project-member</nova:user>
Nov 29 06:55:01 compute-0 nova_compute[187185]:         <nova:project uuid="890f94a625b342fdb17128922403c925">tempest-MigrationsAdminTest-1601255173</nova:project>
Nov 29 06:55:01 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 06:55:01 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:       <nova:ports/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 06:55:01 compute-0 nova_compute[187185]:   </metadata>
Nov 29 06:55:01 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 06:55:01 compute-0 nova_compute[187185]:     <system>
Nov 29 06:55:01 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 06:55:01 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 06:55:01 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 06:55:01 compute-0 nova_compute[187185]:       <entry name="serial">66b9235f-7cc8-40d4-877b-b690613298a4</entry>
Nov 29 06:55:01 compute-0 nova_compute[187185]:       <entry name="uuid">66b9235f-7cc8-40d4-877b-b690613298a4</entry>
Nov 29 06:55:01 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     </system>
Nov 29 06:55:01 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 06:55:01 compute-0 nova_compute[187185]:   <os>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:   </os>
Nov 29 06:55:01 compute-0 nova_compute[187185]:   <features>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     <apic/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:   </features>
Nov 29 06:55:01 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 06:55:01 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:   </clock>
Nov 29 06:55:01 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 06:55:01 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:   </cpu>
Nov 29 06:55:01 compute-0 nova_compute[187185]:   <devices>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 06:55:01 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk"/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     </disk>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 06:55:01 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk.config"/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     </disk>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 06:55:01 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/console.log" append="off"/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     </serial>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     <video>
Nov 29 06:55:01 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     </video>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 06:55:01 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     </rng>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 06:55:01 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 06:55:01 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 06:55:01 compute-0 nova_compute[187185]:   </devices>
Nov 29 06:55:01 compute-0 nova_compute[187185]: </domain>
Nov 29 06:55:01 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 06:55:01 compute-0 nova_compute[187185]: 2025-11-29 06:55:01.436 187189 DEBUG nova.virt.libvirt.driver [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 06:55:01 compute-0 nova_compute[187185]: 2025-11-29 06:55:01.437 187189 DEBUG nova.virt.libvirt.driver [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 06:55:01 compute-0 nova_compute[187185]: 2025-11-29 06:55:01.438 187189 INFO nova.virt.libvirt.driver [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Using config drive
Nov 29 06:55:01 compute-0 nova_compute[187185]: 2025-11-29 06:55:01.751 187189 DEBUG nova.network.neutron [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Successfully created port: 43628527-f640-422f-909a-1feda8b1b46b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 06:55:01 compute-0 nova_compute[187185]: 2025-11-29 06:55:01.871 187189 INFO nova.virt.libvirt.driver [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Creating config drive at /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk.config
Nov 29 06:55:01 compute-0 nova_compute[187185]: 2025-11-29 06:55:01.883 187189 DEBUG oslo_concurrency.processutils [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprruonrq_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:55:01 compute-0 nova_compute[187185]: 2025-11-29 06:55:01.965 187189 DEBUG oslo_concurrency.processutils [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/6171b282-fd8f-42ba-9875-2ae45ea80365/disk 1073741824" returned: 0 in 1.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:55:01 compute-0 nova_compute[187185]: 2025-11-29 06:55:01.967 187189 DEBUG oslo_concurrency.lockutils [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 1.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:55:01 compute-0 nova_compute[187185]: 2025-11-29 06:55:01.968 187189 DEBUG oslo_concurrency.processutils [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:55:02 compute-0 nova_compute[187185]: 2025-11-29 06:55:02.033 187189 DEBUG oslo_concurrency.processutils [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprruonrq_" returned: 0 in 0.150s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:55:02 compute-0 nova_compute[187185]: 2025-11-29 06:55:02.039 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:55:02 compute-0 nova_compute[187185]: 2025-11-29 06:55:02.051 187189 DEBUG oslo_concurrency.processutils [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:55:02 compute-0 nova_compute[187185]: 2025-11-29 06:55:02.053 187189 DEBUG nova.virt.disk.api [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Checking if we can resize image /var/lib/nova/instances/6171b282-fd8f-42ba-9875-2ae45ea80365/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 06:55:02 compute-0 nova_compute[187185]: 2025-11-29 06:55:02.054 187189 DEBUG oslo_concurrency.processutils [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6171b282-fd8f-42ba-9875-2ae45ea80365/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:55:02 compute-0 nova_compute[187185]: 2025-11-29 06:55:02.179 187189 DEBUG oslo_concurrency.processutils [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6171b282-fd8f-42ba-9875-2ae45ea80365/disk --force-share --output=json" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:55:02 compute-0 nova_compute[187185]: 2025-11-29 06:55:02.181 187189 DEBUG nova.virt.disk.api [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Cannot resize image /var/lib/nova/instances/6171b282-fd8f-42ba-9875-2ae45ea80365/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 06:55:02 compute-0 nova_compute[187185]: 2025-11-29 06:55:02.182 187189 DEBUG nova.objects.instance [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lazy-loading 'migration_context' on Instance uuid 6171b282-fd8f-42ba-9875-2ae45ea80365 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:55:02 compute-0 systemd-machined[153486]: New machine qemu-11-instance-00000021.
Nov 29 06:55:02 compute-0 systemd[1]: Started Virtual Machine qemu-11-instance-00000021.
Nov 29 06:55:02 compute-0 nova_compute[187185]: 2025-11-29 06:55:02.199 187189 DEBUG nova.virt.libvirt.driver [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 06:55:02 compute-0 nova_compute[187185]: 2025-11-29 06:55:02.200 187189 DEBUG nova.virt.libvirt.driver [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Ensure instance console log exists: /var/lib/nova/instances/6171b282-fd8f-42ba-9875-2ae45ea80365/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 06:55:02 compute-0 nova_compute[187185]: 2025-11-29 06:55:02.200 187189 DEBUG oslo_concurrency.lockutils [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:55:02 compute-0 nova_compute[187185]: 2025-11-29 06:55:02.200 187189 DEBUG oslo_concurrency.lockutils [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:55:02 compute-0 nova_compute[187185]: 2025-11-29 06:55:02.200 187189 DEBUG oslo_concurrency.lockutils [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:55:02 compute-0 podman[217944]: 2025-11-29 06:55:02.274182478 +0000 UTC m=+0.113518588 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 06:55:02 compute-0 nova_compute[187185]: 2025-11-29 06:55:02.572 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399302.5713823, 66b9235f-7cc8-40d4-877b-b690613298a4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:55:02 compute-0 nova_compute[187185]: 2025-11-29 06:55:02.573 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] VM Resumed (Lifecycle Event)
Nov 29 06:55:02 compute-0 nova_compute[187185]: 2025-11-29 06:55:02.578 187189 DEBUG nova.compute.manager [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 06:55:02 compute-0 nova_compute[187185]: 2025-11-29 06:55:02.579 187189 DEBUG nova.virt.libvirt.driver [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 06:55:02 compute-0 nova_compute[187185]: 2025-11-29 06:55:02.584 187189 INFO nova.virt.libvirt.driver [-] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Instance spawned successfully.
Nov 29 06:55:02 compute-0 nova_compute[187185]: 2025-11-29 06:55:02.585 187189 DEBUG nova.virt.libvirt.driver [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 06:55:02 compute-0 nova_compute[187185]: 2025-11-29 06:55:02.609 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:55:02 compute-0 nova_compute[187185]: 2025-11-29 06:55:02.617 187189 DEBUG nova.virt.libvirt.driver [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:55:02 compute-0 nova_compute[187185]: 2025-11-29 06:55:02.618 187189 DEBUG nova.virt.libvirt.driver [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:55:02 compute-0 nova_compute[187185]: 2025-11-29 06:55:02.618 187189 DEBUG nova.virt.libvirt.driver [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:55:02 compute-0 nova_compute[187185]: 2025-11-29 06:55:02.619 187189 DEBUG nova.virt.libvirt.driver [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:55:02 compute-0 nova_compute[187185]: 2025-11-29 06:55:02.620 187189 DEBUG nova.virt.libvirt.driver [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:55:02 compute-0 nova_compute[187185]: 2025-11-29 06:55:02.621 187189 DEBUG nova.virt.libvirt.driver [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:55:02 compute-0 nova_compute[187185]: 2025-11-29 06:55:02.627 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 06:55:02 compute-0 nova_compute[187185]: 2025-11-29 06:55:02.667 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 06:55:02 compute-0 nova_compute[187185]: 2025-11-29 06:55:02.668 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399302.5775316, 66b9235f-7cc8-40d4-877b-b690613298a4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:55:02 compute-0 nova_compute[187185]: 2025-11-29 06:55:02.668 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] VM Started (Lifecycle Event)
Nov 29 06:55:02 compute-0 nova_compute[187185]: 2025-11-29 06:55:02.702 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:55:02 compute-0 nova_compute[187185]: 2025-11-29 06:55:02.709 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 06:55:02 compute-0 nova_compute[187185]: 2025-11-29 06:55:02.734 187189 INFO nova.compute.manager [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Took 2.96 seconds to spawn the instance on the hypervisor.
Nov 29 06:55:02 compute-0 nova_compute[187185]: 2025-11-29 06:55:02.735 187189 DEBUG nova.compute.manager [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:55:02 compute-0 nova_compute[187185]: 2025-11-29 06:55:02.746 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 06:55:02 compute-0 nova_compute[187185]: 2025-11-29 06:55:02.844 187189 INFO nova.compute.manager [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Took 3.76 seconds to build instance.
Nov 29 06:55:02 compute-0 nova_compute[187185]: 2025-11-29 06:55:02.882 187189 DEBUG oslo_concurrency.lockutils [None req-48f98868-ce56-4a92-9da7-83675ccb3f8b 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "66b9235f-7cc8-40d4-877b-b690613298a4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.136s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:55:02 compute-0 nova_compute[187185]: 2025-11-29 06:55:02.952 187189 DEBUG nova.network.neutron [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Successfully updated port: 43628527-f640-422f-909a-1feda8b1b46b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 06:55:02 compute-0 nova_compute[187185]: 2025-11-29 06:55:02.976 187189 DEBUG oslo_concurrency.lockutils [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "refresh_cache-6171b282-fd8f-42ba-9875-2ae45ea80365" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:55:02 compute-0 nova_compute[187185]: 2025-11-29 06:55:02.976 187189 DEBUG oslo_concurrency.lockutils [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquired lock "refresh_cache-6171b282-fd8f-42ba-9875-2ae45ea80365" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:55:02 compute-0 nova_compute[187185]: 2025-11-29 06:55:02.977 187189 DEBUG nova.network.neutron [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 06:55:03 compute-0 nova_compute[187185]: 2025-11-29 06:55:03.107 187189 DEBUG nova.compute.manager [req-49d0f041-f9d2-4faa-bcfa-51d00ac8bd33 req-f8b49190-24f9-490c-93d4-4a4116a21f49 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Received event network-changed-43628527-f640-422f-909a-1feda8b1b46b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:55:03 compute-0 nova_compute[187185]: 2025-11-29 06:55:03.109 187189 DEBUG nova.compute.manager [req-49d0f041-f9d2-4faa-bcfa-51d00ac8bd33 req-f8b49190-24f9-490c-93d4-4a4116a21f49 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Refreshing instance network info cache due to event network-changed-43628527-f640-422f-909a-1feda8b1b46b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 06:55:03 compute-0 nova_compute[187185]: 2025-11-29 06:55:03.110 187189 DEBUG oslo_concurrency.lockutils [req-49d0f041-f9d2-4faa-bcfa-51d00ac8bd33 req-f8b49190-24f9-490c-93d4-4a4116a21f49 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-6171b282-fd8f-42ba-9875-2ae45ea80365" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:55:03 compute-0 nova_compute[187185]: 2025-11-29 06:55:03.113 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:55:03 compute-0 nova_compute[187185]: 2025-11-29 06:55:03.356 187189 DEBUG nova.network.neutron [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 06:55:04 compute-0 podman[217990]: 2025-11-29 06:55:04.827948674 +0000 UTC m=+0.080597642 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 06:55:04 compute-0 podman[217989]: 2025-11-29 06:55:04.839215545 +0000 UTC m=+0.100854369 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.009 187189 DEBUG nova.network.neutron [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Updating instance_info_cache with network_info: [{"id": "43628527-f640-422f-909a-1feda8b1b46b", "address": "fa:16:3e:9d:c7:52", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43628527-f6", "ovs_interfaceid": "43628527-f640-422f-909a-1feda8b1b46b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.059 187189 DEBUG oslo_concurrency.lockutils [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Releasing lock "refresh_cache-6171b282-fd8f-42ba-9875-2ae45ea80365" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.060 187189 DEBUG nova.compute.manager [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Instance network_info: |[{"id": "43628527-f640-422f-909a-1feda8b1b46b", "address": "fa:16:3e:9d:c7:52", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43628527-f6", "ovs_interfaceid": "43628527-f640-422f-909a-1feda8b1b46b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.061 187189 DEBUG oslo_concurrency.lockutils [req-49d0f041-f9d2-4faa-bcfa-51d00ac8bd33 req-f8b49190-24f9-490c-93d4-4a4116a21f49 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-6171b282-fd8f-42ba-9875-2ae45ea80365" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.062 187189 DEBUG nova.network.neutron [req-49d0f041-f9d2-4faa-bcfa-51d00ac8bd33 req-f8b49190-24f9-490c-93d4-4a4116a21f49 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Refreshing network info cache for port 43628527-f640-422f-909a-1feda8b1b46b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.066 187189 DEBUG nova.virt.libvirt.driver [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Start _get_guest_xml network_info=[{"id": "43628527-f640-422f-909a-1feda8b1b46b", "address": "fa:16:3e:9d:c7:52", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43628527-f6", "ovs_interfaceid": "43628527-f640-422f-909a-1feda8b1b46b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.085 187189 WARNING nova.virt.libvirt.driver [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.092 187189 DEBUG nova.virt.libvirt.host [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.093 187189 DEBUG nova.virt.libvirt.host [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.097 187189 DEBUG nova.virt.libvirt.host [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.098 187189 DEBUG nova.virt.libvirt.host [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.099 187189 DEBUG nova.virt.libvirt.driver [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.100 187189 DEBUG nova.virt.hardware [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.101 187189 DEBUG nova.virt.hardware [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.101 187189 DEBUG nova.virt.hardware [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.101 187189 DEBUG nova.virt.hardware [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.102 187189 DEBUG nova.virt.hardware [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.102 187189 DEBUG nova.virt.hardware [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.102 187189 DEBUG nova.virt.hardware [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.103 187189 DEBUG nova.virt.hardware [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.103 187189 DEBUG nova.virt.hardware [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.103 187189 DEBUG nova.virt.hardware [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.104 187189 DEBUG nova.virt.hardware [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.108 187189 DEBUG nova.virt.libvirt.vif [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:54:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-797406206',display_name='tempest-ImagesTestJSON-server-797406206',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-797406206',id=34,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='78f8ba841bbe4fdcb9d9e2237d97bf73',ramdisk_id='',reservation_id='r-q8w53trj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-1674785298',owner_user_name='tempest-ImagesTestJSON-1674785298-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:55:00Z,user_data=None,user_id='315be492c2ce4b9f8af2898e6794a256',uuid=6171b282-fd8f-42ba-9875-2ae45ea80365,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "43628527-f640-422f-909a-1feda8b1b46b", "address": "fa:16:3e:9d:c7:52", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43628527-f6", "ovs_interfaceid": "43628527-f640-422f-909a-1feda8b1b46b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.108 187189 DEBUG nova.network.os_vif_util [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Converting VIF {"id": "43628527-f640-422f-909a-1feda8b1b46b", "address": "fa:16:3e:9d:c7:52", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43628527-f6", "ovs_interfaceid": "43628527-f640-422f-909a-1feda8b1b46b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.110 187189 DEBUG nova.network.os_vif_util [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9d:c7:52,bridge_name='br-int',has_traffic_filtering=True,id=43628527-f640-422f-909a-1feda8b1b46b,network=Network(17ec2ca4-3fa9-41aa-80ef-35bf92d404ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43628527-f6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.111 187189 DEBUG nova.objects.instance [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6171b282-fd8f-42ba-9875-2ae45ea80365 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.139 187189 DEBUG nova.virt.libvirt.driver [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] End _get_guest_xml xml=<domain type="kvm">
Nov 29 06:55:05 compute-0 nova_compute[187185]:   <uuid>6171b282-fd8f-42ba-9875-2ae45ea80365</uuid>
Nov 29 06:55:05 compute-0 nova_compute[187185]:   <name>instance-00000022</name>
Nov 29 06:55:05 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 06:55:05 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 06:55:05 compute-0 nova_compute[187185]:   <metadata>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 06:55:05 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:       <nova:name>tempest-ImagesTestJSON-server-797406206</nova:name>
Nov 29 06:55:05 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 06:55:05</nova:creationTime>
Nov 29 06:55:05 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 06:55:05 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 06:55:05 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 06:55:05 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 06:55:05 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 06:55:05 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 06:55:05 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 06:55:05 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 06:55:05 compute-0 nova_compute[187185]:         <nova:user uuid="315be492c2ce4b9f8af2898e6794a256">tempest-ImagesTestJSON-1674785298-project-member</nova:user>
Nov 29 06:55:05 compute-0 nova_compute[187185]:         <nova:project uuid="78f8ba841bbe4fdcb9d9e2237d97bf73">tempest-ImagesTestJSON-1674785298</nova:project>
Nov 29 06:55:05 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 06:55:05 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 06:55:05 compute-0 nova_compute[187185]:         <nova:port uuid="43628527-f640-422f-909a-1feda8b1b46b">
Nov 29 06:55:05 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 06:55:05 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 06:55:05 compute-0 nova_compute[187185]:   </metadata>
Nov 29 06:55:05 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <system>
Nov 29 06:55:05 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 06:55:05 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 06:55:05 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 06:55:05 compute-0 nova_compute[187185]:       <entry name="serial">6171b282-fd8f-42ba-9875-2ae45ea80365</entry>
Nov 29 06:55:05 compute-0 nova_compute[187185]:       <entry name="uuid">6171b282-fd8f-42ba-9875-2ae45ea80365</entry>
Nov 29 06:55:05 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     </system>
Nov 29 06:55:05 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 06:55:05 compute-0 nova_compute[187185]:   <os>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:   </os>
Nov 29 06:55:05 compute-0 nova_compute[187185]:   <features>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <apic/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:   </features>
Nov 29 06:55:05 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:   </clock>
Nov 29 06:55:05 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:   </cpu>
Nov 29 06:55:05 compute-0 nova_compute[187185]:   <devices>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 06:55:05 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/6171b282-fd8f-42ba-9875-2ae45ea80365/disk"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     </disk>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 06:55:05 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/6171b282-fd8f-42ba-9875-2ae45ea80365/disk.config"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     </disk>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 06:55:05 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:9d:c7:52"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:       <target dev="tap43628527-f6"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     </interface>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 06:55:05 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/6171b282-fd8f-42ba-9875-2ae45ea80365/console.log" append="off"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     </serial>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <video>
Nov 29 06:55:05 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     </video>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 06:55:05 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     </rng>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 06:55:05 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 06:55:05 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 06:55:05 compute-0 nova_compute[187185]:   </devices>
Nov 29 06:55:05 compute-0 nova_compute[187185]: </domain>
Nov 29 06:55:05 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.147 187189 DEBUG nova.compute.manager [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Preparing to wait for external event network-vif-plugged-43628527-f640-422f-909a-1feda8b1b46b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.148 187189 DEBUG oslo_concurrency.lockutils [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "6171b282-fd8f-42ba-9875-2ae45ea80365-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.148 187189 DEBUG oslo_concurrency.lockutils [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "6171b282-fd8f-42ba-9875-2ae45ea80365-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.148 187189 DEBUG oslo_concurrency.lockutils [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "6171b282-fd8f-42ba-9875-2ae45ea80365-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.149 187189 DEBUG nova.virt.libvirt.vif [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:54:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-797406206',display_name='tempest-ImagesTestJSON-server-797406206',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-797406206',id=34,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='78f8ba841bbe4fdcb9d9e2237d97bf73',ramdisk_id='',reservation_id='r-q8w53trj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-1674785298',owner_user_name='tempest-ImagesTestJSON-1674785298-project-member'},
tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:55:00Z,user_data=None,user_id='315be492c2ce4b9f8af2898e6794a256',uuid=6171b282-fd8f-42ba-9875-2ae45ea80365,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "43628527-f640-422f-909a-1feda8b1b46b", "address": "fa:16:3e:9d:c7:52", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43628527-f6", "ovs_interfaceid": "43628527-f640-422f-909a-1feda8b1b46b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.149 187189 DEBUG nova.network.os_vif_util [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Converting VIF {"id": "43628527-f640-422f-909a-1feda8b1b46b", "address": "fa:16:3e:9d:c7:52", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43628527-f6", "ovs_interfaceid": "43628527-f640-422f-909a-1feda8b1b46b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.150 187189 DEBUG nova.network.os_vif_util [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9d:c7:52,bridge_name='br-int',has_traffic_filtering=True,id=43628527-f640-422f-909a-1feda8b1b46b,network=Network(17ec2ca4-3fa9-41aa-80ef-35bf92d404ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43628527-f6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.151 187189 DEBUG os_vif [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9d:c7:52,bridge_name='br-int',has_traffic_filtering=True,id=43628527-f640-422f-909a-1feda8b1b46b,network=Network(17ec2ca4-3fa9-41aa-80ef-35bf92d404ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43628527-f6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.152 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.152 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.153 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.156 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.157 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap43628527-f6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.157 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap43628527-f6, col_values=(('external_ids', {'iface-id': '43628527-f640-422f-909a-1feda8b1b46b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9d:c7:52', 'vm-uuid': '6171b282-fd8f-42ba-9875-2ae45ea80365'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.159 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:55:05 compute-0 NetworkManager[55227]: <info>  [1764399305.1612] manager: (tap43628527-f6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/53)
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.161 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.168 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.169 187189 INFO os_vif [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9d:c7:52,bridge_name='br-int',has_traffic_filtering=True,id=43628527-f640-422f-909a-1feda8b1b46b,network=Network(17ec2ca4-3fa9-41aa-80ef-35bf92d404ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43628527-f6')
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.798 187189 DEBUG nova.virt.libvirt.driver [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.799 187189 DEBUG nova.virt.libvirt.driver [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.799 187189 DEBUG nova.virt.libvirt.driver [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] No VIF found with MAC fa:16:3e:9d:c7:52, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 06:55:05 compute-0 nova_compute[187185]: 2025-11-29 06:55:05.800 187189 INFO nova.virt.libvirt.driver [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Using config drive
Nov 29 06:55:06 compute-0 nova_compute[187185]: 2025-11-29 06:55:06.505 187189 INFO nova.virt.libvirt.driver [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Creating config drive at /var/lib/nova/instances/6171b282-fd8f-42ba-9875-2ae45ea80365/disk.config
Nov 29 06:55:06 compute-0 nova_compute[187185]: 2025-11-29 06:55:06.512 187189 DEBUG oslo_concurrency.processutils [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6171b282-fd8f-42ba-9875-2ae45ea80365/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf71ia1t6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:55:06 compute-0 nova_compute[187185]: 2025-11-29 06:55:06.659 187189 DEBUG oslo_concurrency.processutils [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6171b282-fd8f-42ba-9875-2ae45ea80365/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf71ia1t6" returned: 0 in 0.147s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:55:06 compute-0 kernel: tap43628527-f6: entered promiscuous mode
Nov 29 06:55:06 compute-0 NetworkManager[55227]: <info>  [1764399306.7729] manager: (tap43628527-f6): new Tun device (/org/freedesktop/NetworkManager/Devices/54)
Nov 29 06:55:06 compute-0 ovn_controller[95281]: 2025-11-29T06:55:06Z|00088|binding|INFO|Claiming lport 43628527-f640-422f-909a-1feda8b1b46b for this chassis.
Nov 29 06:55:06 compute-0 ovn_controller[95281]: 2025-11-29T06:55:06Z|00089|binding|INFO|43628527-f640-422f-909a-1feda8b1b46b: Claiming fa:16:3e:9d:c7:52 10.100.0.13
Nov 29 06:55:06 compute-0 nova_compute[187185]: 2025-11-29 06:55:06.772 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:55:06 compute-0 nova_compute[187185]: 2025-11-29 06:55:06.780 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:55:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:06.797 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9d:c7:52 10.100.0.13'], port_security=['fa:16:3e:9d:c7:52 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '6171b282-fd8f-42ba-9875-2ae45ea80365', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ca8fef31-1a4b-4249-948f-73ea087430b2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8122595a-c31d-4e3d-a668-dbae500c1d72, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=43628527-f640-422f-909a-1feda8b1b46b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 06:55:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:06.799 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 43628527-f640-422f-909a-1feda8b1b46b in datapath 17ec2ca4-3fa9-41aa-80ef-35bf92d404ba bound to our chassis
Nov 29 06:55:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:06.801 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 17ec2ca4-3fa9-41aa-80ef-35bf92d404ba
Nov 29 06:55:06 compute-0 systemd-udevd[218050]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 06:55:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:06.824 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[bc3e9e71-72cb-48cc-b95c-00af582c7579]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:55:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:06.825 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap17ec2ca4-31 in ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 06:55:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:06.828 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap17ec2ca4-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 06:55:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:06.829 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[70c7085d-9b14-43bd-82e8-a3e5cfbcdc7a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:55:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:06.829 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[7f10219f-bd63-4ace-9003-38c8778cfe34]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:55:06 compute-0 NetworkManager[55227]: <info>  [1764399306.8447] device (tap43628527-f6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 06:55:06 compute-0 NetworkManager[55227]: <info>  [1764399306.8462] device (tap43628527-f6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 06:55:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:06.854 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[85ccd574-021e-44a4-a34b-b717fc1c8f16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:55:06 compute-0 nova_compute[187185]: 2025-11-29 06:55:06.865 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:55:06 compute-0 ovn_controller[95281]: 2025-11-29T06:55:06Z|00090|binding|INFO|Setting lport 43628527-f640-422f-909a-1feda8b1b46b ovn-installed in OVS
Nov 29 06:55:06 compute-0 ovn_controller[95281]: 2025-11-29T06:55:06Z|00091|binding|INFO|Setting lport 43628527-f640-422f-909a-1feda8b1b46b up in Southbound
Nov 29 06:55:06 compute-0 systemd-machined[153486]: New machine qemu-12-instance-00000022.
Nov 29 06:55:06 compute-0 nova_compute[187185]: 2025-11-29 06:55:06.872 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:55:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:06.886 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[ad0824fb-3a55-4633-a19a-d51dfb10f61a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:55:06 compute-0 systemd[1]: Started Virtual Machine qemu-12-instance-00000022.
Nov 29 06:55:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:06.927 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[08ae3e07-f258-45b1-8f7b-ec41f29055d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:55:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:06.934 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[1af87257-dc66-4d05-bc66-bc211a640f18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:55:06 compute-0 NetworkManager[55227]: <info>  [1764399306.9357] manager: (tap17ec2ca4-30): new Veth device (/org/freedesktop/NetworkManager/Devices/55)
Nov 29 06:55:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:06.974 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[e17f3d91-e419-43df-9d96-df29aebe109b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:55:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:06.977 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[a60b2cea-c0e7-40de-a6f2-70edbdf1d87b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:55:07 compute-0 NetworkManager[55227]: <info>  [1764399307.0037] device (tap17ec2ca4-30): carrier: link connected
Nov 29 06:55:07 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:07.012 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[89e09710-0a3a-41f7-a941-456296250b16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:55:07 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:07.035 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[161b9ef7-17fa-45fc-bfd3-b9316cfd9a27]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap17ec2ca4-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ec:55:6b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 475466, 'reachable_time': 21193, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218085, 'error': None, 'target': 'ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:55:07 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:07.056 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[781776ae-d2ba-40b7-a495-144feed95888]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feec:556b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 475466, 'tstamp': 475466}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218086, 'error': None, 'target': 'ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:55:07 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:07.080 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[9c380d90-428a-417f-9524-a6ae9393de73]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap17ec2ca4-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ec:55:6b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 475466, 'reachable_time': 21193, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218092, 'error': None, 'target': 'ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:55:07 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:07.117 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[5df65be3-aacd-4c69-b69f-680ea2da06d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:55:07 compute-0 nova_compute[187185]: 2025-11-29 06:55:07.164 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399307.1632392, 6171b282-fd8f-42ba-9875-2ae45ea80365 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:55:07 compute-0 nova_compute[187185]: 2025-11-29 06:55:07.165 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] VM Started (Lifecycle Event)
Nov 29 06:55:07 compute-0 nova_compute[187185]: 2025-11-29 06:55:07.194 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:55:07 compute-0 nova_compute[187185]: 2025-11-29 06:55:07.200 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399307.1659622, 6171b282-fd8f-42ba-9875-2ae45ea80365 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:55:07 compute-0 nova_compute[187185]: 2025-11-29 06:55:07.200 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] VM Paused (Lifecycle Event)
Nov 29 06:55:07 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:07.203 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[3c91f4de-d1ca-4457-87d8-4770136037cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:55:07 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:07.204 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap17ec2ca4-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:55:07 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:07.205 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 06:55:07 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:07.205 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap17ec2ca4-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:55:07 compute-0 nova_compute[187185]: 2025-11-29 06:55:07.207 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:55:07 compute-0 NetworkManager[55227]: <info>  [1764399307.2082] manager: (tap17ec2ca4-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Nov 29 06:55:07 compute-0 kernel: tap17ec2ca4-30: entered promiscuous mode
Nov 29 06:55:07 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:07.211 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap17ec2ca4-30, col_values=(('external_ids', {'iface-id': '97d66506-c891-4bf7-8595-2d091560f247'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:55:07 compute-0 nova_compute[187185]: 2025-11-29 06:55:07.212 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:55:07 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:07.213 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/17ec2ca4-3fa9-41aa-80ef-35bf92d404ba.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/17ec2ca4-3fa9-41aa-80ef-35bf92d404ba.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 06:55:07 compute-0 ovn_controller[95281]: 2025-11-29T06:55:07Z|00092|binding|INFO|Releasing lport 97d66506-c891-4bf7-8595-2d091560f247 from this chassis (sb_readonly=0)
Nov 29 06:55:07 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:07.221 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e9c54d74-5d2f-4a7a-a11d-585161b3c95d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:55:07 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:07.222 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 06:55:07 compute-0 ovn_metadata_agent[104249]: global
Nov 29 06:55:07 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 06:55:07 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba
Nov 29 06:55:07 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 06:55:07 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 06:55:07 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 06:55:07 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/17ec2ca4-3fa9-41aa-80ef-35bf92d404ba.pid.haproxy
Nov 29 06:55:07 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 06:55:07 compute-0 ovn_metadata_agent[104249]: 
Nov 29 06:55:07 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 06:55:07 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 06:55:07 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 06:55:07 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 06:55:07 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 06:55:07 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 06:55:07 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 06:55:07 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 06:55:07 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 06:55:07 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 06:55:07 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 06:55:07 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 06:55:07 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 06:55:07 compute-0 ovn_metadata_agent[104249]: 
Nov 29 06:55:07 compute-0 ovn_metadata_agent[104249]: 
Nov 29 06:55:07 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 06:55:07 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 06:55:07 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 06:55:07 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID 17ec2ca4-3fa9-41aa-80ef-35bf92d404ba
Nov 29 06:55:07 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 06:55:07 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:07.223 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'env', 'PROCESS_TAG=haproxy-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/17ec2ca4-3fa9-41aa-80ef-35bf92d404ba.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 06:55:07 compute-0 nova_compute[187185]: 2025-11-29 06:55:07.226 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:55:07 compute-0 nova_compute[187185]: 2025-11-29 06:55:07.231 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 06:55:07 compute-0 nova_compute[187185]: 2025-11-29 06:55:07.234 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:55:07 compute-0 nova_compute[187185]: 2025-11-29 06:55:07.252 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 06:55:07 compute-0 nova_compute[187185]: 2025-11-29 06:55:07.356 187189 DEBUG nova.compute.manager [req-74159d5d-54ac-465a-a3d2-dd1ac7b0047f req-060b27d1-2c60-40bd-8d2c-70c3cc8b24d5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Received event network-vif-plugged-43628527-f640-422f-909a-1feda8b1b46b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:55:07 compute-0 nova_compute[187185]: 2025-11-29 06:55:07.369 187189 DEBUG oslo_concurrency.lockutils [req-74159d5d-54ac-465a-a3d2-dd1ac7b0047f req-060b27d1-2c60-40bd-8d2c-70c3cc8b24d5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "6171b282-fd8f-42ba-9875-2ae45ea80365-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:55:07 compute-0 nova_compute[187185]: 2025-11-29 06:55:07.370 187189 DEBUG oslo_concurrency.lockutils [req-74159d5d-54ac-465a-a3d2-dd1ac7b0047f req-060b27d1-2c60-40bd-8d2c-70c3cc8b24d5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "6171b282-fd8f-42ba-9875-2ae45ea80365-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:55:07 compute-0 nova_compute[187185]: 2025-11-29 06:55:07.370 187189 DEBUG oslo_concurrency.lockutils [req-74159d5d-54ac-465a-a3d2-dd1ac7b0047f req-060b27d1-2c60-40bd-8d2c-70c3cc8b24d5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "6171b282-fd8f-42ba-9875-2ae45ea80365-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:55:07 compute-0 nova_compute[187185]: 2025-11-29 06:55:07.371 187189 DEBUG nova.compute.manager [req-74159d5d-54ac-465a-a3d2-dd1ac7b0047f req-060b27d1-2c60-40bd-8d2c-70c3cc8b24d5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Processing event network-vif-plugged-43628527-f640-422f-909a-1feda8b1b46b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 06:55:07 compute-0 nova_compute[187185]: 2025-11-29 06:55:07.372 187189 DEBUG nova.compute.manager [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 06:55:07 compute-0 nova_compute[187185]: 2025-11-29 06:55:07.376 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399307.3765638, 6171b282-fd8f-42ba-9875-2ae45ea80365 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:55:07 compute-0 nova_compute[187185]: 2025-11-29 06:55:07.377 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] VM Resumed (Lifecycle Event)
Nov 29 06:55:07 compute-0 nova_compute[187185]: 2025-11-29 06:55:07.381 187189 DEBUG nova.virt.libvirt.driver [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 06:55:07 compute-0 nova_compute[187185]: 2025-11-29 06:55:07.386 187189 INFO nova.virt.libvirt.driver [-] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Instance spawned successfully.
Nov 29 06:55:07 compute-0 nova_compute[187185]: 2025-11-29 06:55:07.387 187189 DEBUG nova.virt.libvirt.driver [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 06:55:07 compute-0 nova_compute[187185]: 2025-11-29 06:55:07.408 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:55:07 compute-0 nova_compute[187185]: 2025-11-29 06:55:07.419 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 06:55:07 compute-0 nova_compute[187185]: 2025-11-29 06:55:07.426 187189 DEBUG nova.virt.libvirt.driver [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:55:07 compute-0 nova_compute[187185]: 2025-11-29 06:55:07.427 187189 DEBUG nova.virt.libvirt.driver [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:55:07 compute-0 nova_compute[187185]: 2025-11-29 06:55:07.428 187189 DEBUG nova.virt.libvirt.driver [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:55:07 compute-0 nova_compute[187185]: 2025-11-29 06:55:07.429 187189 DEBUG nova.virt.libvirt.driver [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:55:07 compute-0 nova_compute[187185]: 2025-11-29 06:55:07.430 187189 DEBUG nova.virt.libvirt.driver [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:55:07 compute-0 nova_compute[187185]: 2025-11-29 06:55:07.431 187189 DEBUG nova.virt.libvirt.driver [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:55:07 compute-0 nova_compute[187185]: 2025-11-29 06:55:07.439 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 06:55:07 compute-0 nova_compute[187185]: 2025-11-29 06:55:07.513 187189 INFO nova.compute.manager [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Took 6.84 seconds to spawn the instance on the hypervisor.
Nov 29 06:55:07 compute-0 nova_compute[187185]: 2025-11-29 06:55:07.514 187189 DEBUG nova.compute.manager [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:55:07 compute-0 nova_compute[187185]: 2025-11-29 06:55:07.713 187189 DEBUG oslo_concurrency.lockutils [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Acquiring lock "refresh_cache-66b9235f-7cc8-40d4-877b-b690613298a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:55:07 compute-0 nova_compute[187185]: 2025-11-29 06:55:07.715 187189 DEBUG oslo_concurrency.lockutils [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Acquired lock "refresh_cache-66b9235f-7cc8-40d4-877b-b690613298a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:55:07 compute-0 nova_compute[187185]: 2025-11-29 06:55:07.716 187189 DEBUG nova.network.neutron [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 06:55:07 compute-0 nova_compute[187185]: 2025-11-29 06:55:07.739 187189 INFO nova.compute.manager [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Took 7.68 seconds to build instance.
Nov 29 06:55:07 compute-0 podman[218125]: 2025-11-29 06:55:07.660385142 +0000 UTC m=+0.043106077 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 06:55:07 compute-0 nova_compute[187185]: 2025-11-29 06:55:07.763 187189 DEBUG oslo_concurrency.lockutils [None req-9d110088-41fd-4cd3-b131-8e6045ffb8a6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "6171b282-fd8f-42ba-9875-2ae45ea80365" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.836s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:55:07 compute-0 nova_compute[187185]: 2025-11-29 06:55:07.965 187189 DEBUG nova.network.neutron [req-49d0f041-f9d2-4faa-bcfa-51d00ac8bd33 req-f8b49190-24f9-490c-93d4-4a4116a21f49 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Updated VIF entry in instance network info cache for port 43628527-f640-422f-909a-1feda8b1b46b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 06:55:07 compute-0 nova_compute[187185]: 2025-11-29 06:55:07.969 187189 DEBUG nova.network.neutron [req-49d0f041-f9d2-4faa-bcfa-51d00ac8bd33 req-f8b49190-24f9-490c-93d4-4a4116a21f49 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Updating instance_info_cache with network_info: [{"id": "43628527-f640-422f-909a-1feda8b1b46b", "address": "fa:16:3e:9d:c7:52", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43628527-f6", "ovs_interfaceid": "43628527-f640-422f-909a-1feda8b1b46b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:55:08 compute-0 nova_compute[187185]: 2025-11-29 06:55:08.002 187189 DEBUG nova.network.neutron [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 06:55:08 compute-0 nova_compute[187185]: 2025-11-29 06:55:08.009 187189 DEBUG oslo_concurrency.lockutils [req-49d0f041-f9d2-4faa-bcfa-51d00ac8bd33 req-f8b49190-24f9-490c-93d4-4a4116a21f49 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-6171b282-fd8f-42ba-9875-2ae45ea80365" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:55:08 compute-0 nova_compute[187185]: 2025-11-29 06:55:08.114 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:55:08 compute-0 podman[218125]: 2025-11-29 06:55:08.240600118 +0000 UTC m=+0.623321043 container create 89eaf63b3bc77ec68b3e7bab5443b6aa967d9614bc70c6f941f99673de9c4f87 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:55:08 compute-0 systemd[1]: Started libpod-conmon-89eaf63b3bc77ec68b3e7bab5443b6aa967d9614bc70c6f941f99673de9c4f87.scope.
Nov 29 06:55:08 compute-0 systemd[1]: Started libcrun container.
Nov 29 06:55:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4df7b7fd28614d455b6b400d92a5a5a1adad67384cb8db154712c8c31cf4899c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 06:55:08 compute-0 podman[218125]: 2025-11-29 06:55:08.667790433 +0000 UTC m=+1.050511348 container init 89eaf63b3bc77ec68b3e7bab5443b6aa967d9614bc70c6f941f99673de9c4f87 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 29 06:55:08 compute-0 podman[218125]: 2025-11-29 06:55:08.674108173 +0000 UTC m=+1.056829068 container start 89eaf63b3bc77ec68b3e7bab5443b6aa967d9614bc70c6f941f99673de9c4f87 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Nov 29 06:55:08 compute-0 neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba[218140]: [NOTICE]   (218145) : New worker (218147) forked
Nov 29 06:55:08 compute-0 neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba[218140]: [NOTICE]   (218145) : Loading success.
Nov 29 06:55:08 compute-0 nova_compute[187185]: 2025-11-29 06:55:08.732 187189 DEBUG nova.network.neutron [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:55:08 compute-0 nova_compute[187185]: 2025-11-29 06:55:08.753 187189 DEBUG oslo_concurrency.lockutils [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Releasing lock "refresh_cache-66b9235f-7cc8-40d4-877b-b690613298a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:55:08 compute-0 nova_compute[187185]: 2025-11-29 06:55:08.927 187189 DEBUG nova.virt.libvirt.driver [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Nov 29 06:55:08 compute-0 nova_compute[187185]: 2025-11-29 06:55:08.929 187189 DEBUG nova.virt.libvirt.volume.remotefs [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Creating file /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/97bf34eae73743f7a962b89f49a8bc38.tmp on remote host 192.168.122.101 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Nov 29 06:55:08 compute-0 nova_compute[187185]: 2025-11-29 06:55:08.930 187189 DEBUG oslo_concurrency.processutils [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/97bf34eae73743f7a962b89f49a8bc38.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:55:09 compute-0 nova_compute[187185]: 2025-11-29 06:55:09.431 187189 DEBUG nova.compute.manager [None req-ef31d579-2b96-4887-8607-ff17c3862ec6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:55:09 compute-0 nova_compute[187185]: 2025-11-29 06:55:09.445 187189 DEBUG oslo_concurrency.processutils [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/97bf34eae73743f7a962b89f49a8bc38.tmp" returned: 1 in 0.516s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:55:09 compute-0 nova_compute[187185]: 2025-11-29 06:55:09.447 187189 DEBUG oslo_concurrency.processutils [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] 'ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/97bf34eae73743f7a962b89f49a8bc38.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Nov 29 06:55:09 compute-0 nova_compute[187185]: 2025-11-29 06:55:09.447 187189 DEBUG nova.virt.libvirt.volume.remotefs [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Creating directory /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4 on remote host 192.168.122.101 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Nov 29 06:55:09 compute-0 nova_compute[187185]: 2025-11-29 06:55:09.447 187189 DEBUG oslo_concurrency.processutils [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:55:09 compute-0 nova_compute[187185]: 2025-11-29 06:55:09.538 187189 INFO nova.compute.manager [None req-ef31d579-2b96-4887-8607-ff17c3862ec6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] instance snapshotting
Nov 29 06:55:09 compute-0 nova_compute[187185]: 2025-11-29 06:55:09.560 187189 DEBUG nova.compute.manager [req-a6c5fa0d-56e0-4828-8380-5b388a6a5287 req-ecb73fa6-584b-4d60-a84c-47479edb50b9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Received event network-vif-plugged-43628527-f640-422f-909a-1feda8b1b46b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:55:09 compute-0 nova_compute[187185]: 2025-11-29 06:55:09.560 187189 DEBUG oslo_concurrency.lockutils [req-a6c5fa0d-56e0-4828-8380-5b388a6a5287 req-ecb73fa6-584b-4d60-a84c-47479edb50b9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "6171b282-fd8f-42ba-9875-2ae45ea80365-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:55:09 compute-0 nova_compute[187185]: 2025-11-29 06:55:09.561 187189 DEBUG oslo_concurrency.lockutils [req-a6c5fa0d-56e0-4828-8380-5b388a6a5287 req-ecb73fa6-584b-4d60-a84c-47479edb50b9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "6171b282-fd8f-42ba-9875-2ae45ea80365-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:55:09 compute-0 nova_compute[187185]: 2025-11-29 06:55:09.561 187189 DEBUG oslo_concurrency.lockutils [req-a6c5fa0d-56e0-4828-8380-5b388a6a5287 req-ecb73fa6-584b-4d60-a84c-47479edb50b9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "6171b282-fd8f-42ba-9875-2ae45ea80365-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:55:09 compute-0 nova_compute[187185]: 2025-11-29 06:55:09.561 187189 DEBUG nova.compute.manager [req-a6c5fa0d-56e0-4828-8380-5b388a6a5287 req-ecb73fa6-584b-4d60-a84c-47479edb50b9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] No waiting events found dispatching network-vif-plugged-43628527-f640-422f-909a-1feda8b1b46b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 06:55:09 compute-0 nova_compute[187185]: 2025-11-29 06:55:09.562 187189 WARNING nova.compute.manager [req-a6c5fa0d-56e0-4828-8380-5b388a6a5287 req-ecb73fa6-584b-4d60-a84c-47479edb50b9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Received unexpected event network-vif-plugged-43628527-f640-422f-909a-1feda8b1b46b for instance with vm_state active and task_state image_snapshot.
Nov 29 06:55:09 compute-0 nova_compute[187185]: 2025-11-29 06:55:09.682 187189 DEBUG oslo_concurrency.processutils [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4" returned: 0 in 0.234s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:55:09 compute-0 nova_compute[187185]: 2025-11-29 06:55:09.691 187189 DEBUG nova.virt.libvirt.driver [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Nov 29 06:55:10 compute-0 nova_compute[187185]: 2025-11-29 06:55:10.117 187189 INFO nova.virt.libvirt.driver [None req-ef31d579-2b96-4887-8607-ff17c3862ec6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Beginning live snapshot process
Nov 29 06:55:10 compute-0 nova_compute[187185]: 2025-11-29 06:55:10.161 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:55:10 compute-0 virtqemud[186729]: invalid argument: disk vda does not have an active block job
Nov 29 06:55:10 compute-0 nova_compute[187185]: 2025-11-29 06:55:10.353 187189 DEBUG oslo_concurrency.processutils [None req-ef31d579-2b96-4887-8607-ff17c3862ec6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6171b282-fd8f-42ba-9875-2ae45ea80365/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:55:10 compute-0 nova_compute[187185]: 2025-11-29 06:55:10.422 187189 DEBUG oslo_concurrency.processutils [None req-ef31d579-2b96-4887-8607-ff17c3862ec6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6171b282-fd8f-42ba-9875-2ae45ea80365/disk --force-share --output=json -f qcow2" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:55:10 compute-0 nova_compute[187185]: 2025-11-29 06:55:10.427 187189 DEBUG oslo_concurrency.processutils [None req-ef31d579-2b96-4887-8607-ff17c3862ec6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6171b282-fd8f-42ba-9875-2ae45ea80365/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:55:10 compute-0 nova_compute[187185]: 2025-11-29 06:55:10.486 187189 DEBUG oslo_concurrency.processutils [None req-ef31d579-2b96-4887-8607-ff17c3862ec6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6171b282-fd8f-42ba-9875-2ae45ea80365/disk --force-share --output=json -f qcow2" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:55:10 compute-0 nova_compute[187185]: 2025-11-29 06:55:10.517 187189 DEBUG oslo_concurrency.processutils [None req-ef31d579-2b96-4887-8607-ff17c3862ec6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:55:10 compute-0 nova_compute[187185]: 2025-11-29 06:55:10.576 187189 DEBUG oslo_concurrency.processutils [None req-ef31d579-2b96-4887-8607-ff17c3862ec6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:55:10 compute-0 nova_compute[187185]: 2025-11-29 06:55:10.578 187189 DEBUG oslo_concurrency.processutils [None req-ef31d579-2b96-4887-8607-ff17c3862ec6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpgjtzszpr/e997ac9e020c48209fb323b2b4326cd1.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:55:11 compute-0 nova_compute[187185]: 2025-11-29 06:55:11.918 187189 DEBUG oslo_concurrency.processutils [None req-ef31d579-2b96-4887-8607-ff17c3862ec6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpgjtzszpr/e997ac9e020c48209fb323b2b4326cd1.delta 1073741824" returned: 0 in 1.340s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:55:11 compute-0 nova_compute[187185]: 2025-11-29 06:55:11.919 187189 INFO nova.virt.libvirt.driver [None req-ef31d579-2b96-4887-8607-ff17c3862ec6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Quiescing instance not available: QEMU guest agent is not enabled.
Nov 29 06:55:11 compute-0 nova_compute[187185]: 2025-11-29 06:55:11.970 187189 DEBUG nova.virt.libvirt.guest [None req-ef31d579-2b96-4887-8607-ff17c3862ec6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Nov 29 06:55:11 compute-0 nova_compute[187185]: 2025-11-29 06:55:11.976 187189 INFO nova.virt.libvirt.driver [None req-ef31d579-2b96-4887-8607-ff17c3862ec6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Skipping quiescing instance: QEMU guest agent is not enabled.
Nov 29 06:55:12 compute-0 nova_compute[187185]: 2025-11-29 06:55:12.212 187189 DEBUG nova.privsep.utils [None req-ef31d579-2b96-4887-8607-ff17c3862ec6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 29 06:55:12 compute-0 nova_compute[187185]: 2025-11-29 06:55:12.213 187189 DEBUG oslo_concurrency.processutils [None req-ef31d579-2b96-4887-8607-ff17c3862ec6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpgjtzszpr/e997ac9e020c48209fb323b2b4326cd1.delta /var/lib/nova/instances/snapshots/tmpgjtzszpr/e997ac9e020c48209fb323b2b4326cd1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:55:13 compute-0 nova_compute[187185]: 2025-11-29 06:55:13.175 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:55:13 compute-0 nova_compute[187185]: 2025-11-29 06:55:13.640 187189 DEBUG oslo_concurrency.processutils [None req-ef31d579-2b96-4887-8607-ff17c3862ec6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpgjtzszpr/e997ac9e020c48209fb323b2b4326cd1.delta /var/lib/nova/instances/snapshots/tmpgjtzszpr/e997ac9e020c48209fb323b2b4326cd1" returned: 0 in 1.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:55:13 compute-0 nova_compute[187185]: 2025-11-29 06:55:13.641 187189 INFO nova.virt.libvirt.driver [None req-ef31d579-2b96-4887-8607-ff17c3862ec6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Snapshot extracted, beginning image upload
Nov 29 06:55:14 compute-0 nova_compute[187185]: 2025-11-29 06:55:14.172 187189 WARNING nova.compute.manager [None req-ef31d579-2b96-4887-8607-ff17c3862ec6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Image not found during snapshot: nova.exception.ImageNotFound: Image bbe05586-d701-416c-adb6-940bd9dd2fc7 could not be found.
Nov 29 06:55:14 compute-0 podman[218188]: 2025-11-29 06:55:14.814728286 +0000 UTC m=+0.074976048 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 29 06:55:15 compute-0 nova_compute[187185]: 2025-11-29 06:55:15.164 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:55:16 compute-0 nova_compute[187185]: 2025-11-29 06:55:16.464 187189 DEBUG oslo_concurrency.lockutils [None req-778dbeb8-7eec-477d-b6cb-20890f262ee2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "6171b282-fd8f-42ba-9875-2ae45ea80365" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:55:16 compute-0 nova_compute[187185]: 2025-11-29 06:55:16.465 187189 DEBUG oslo_concurrency.lockutils [None req-778dbeb8-7eec-477d-b6cb-20890f262ee2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "6171b282-fd8f-42ba-9875-2ae45ea80365" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:55:16 compute-0 nova_compute[187185]: 2025-11-29 06:55:16.465 187189 DEBUG oslo_concurrency.lockutils [None req-778dbeb8-7eec-477d-b6cb-20890f262ee2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "6171b282-fd8f-42ba-9875-2ae45ea80365-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:55:16 compute-0 nova_compute[187185]: 2025-11-29 06:55:16.465 187189 DEBUG oslo_concurrency.lockutils [None req-778dbeb8-7eec-477d-b6cb-20890f262ee2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "6171b282-fd8f-42ba-9875-2ae45ea80365-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:55:16 compute-0 nova_compute[187185]: 2025-11-29 06:55:16.465 187189 DEBUG oslo_concurrency.lockutils [None req-778dbeb8-7eec-477d-b6cb-20890f262ee2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "6171b282-fd8f-42ba-9875-2ae45ea80365-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:55:16 compute-0 nova_compute[187185]: 2025-11-29 06:55:16.479 187189 INFO nova.compute.manager [None req-778dbeb8-7eec-477d-b6cb-20890f262ee2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Terminating instance
Nov 29 06:55:16 compute-0 nova_compute[187185]: 2025-11-29 06:55:16.497 187189 DEBUG nova.compute.manager [None req-778dbeb8-7eec-477d-b6cb-20890f262ee2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 06:55:16 compute-0 kernel: tap43628527-f6 (unregistering): left promiscuous mode
Nov 29 06:55:16 compute-0 NetworkManager[55227]: <info>  [1764399316.5844] device (tap43628527-f6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 06:55:16 compute-0 ovn_controller[95281]: 2025-11-29T06:55:16Z|00093|binding|INFO|Releasing lport 43628527-f640-422f-909a-1feda8b1b46b from this chassis (sb_readonly=0)
Nov 29 06:55:16 compute-0 ovn_controller[95281]: 2025-11-29T06:55:16Z|00094|binding|INFO|Setting lport 43628527-f640-422f-909a-1feda8b1b46b down in Southbound
Nov 29 06:55:16 compute-0 nova_compute[187185]: 2025-11-29 06:55:16.593 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:55:16 compute-0 ovn_controller[95281]: 2025-11-29T06:55:16Z|00095|binding|INFO|Removing iface tap43628527-f6 ovn-installed in OVS
Nov 29 06:55:16 compute-0 nova_compute[187185]: 2025-11-29 06:55:16.616 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:55:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:16.619 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9d:c7:52 10.100.0.13'], port_security=['fa:16:3e:9d:c7:52 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '6171b282-fd8f-42ba-9875-2ae45ea80365', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ca8fef31-1a4b-4249-948f-73ea087430b2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8122595a-c31d-4e3d-a668-dbae500c1d72, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=43628527-f640-422f-909a-1feda8b1b46b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 06:55:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:16.622 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 43628527-f640-422f-909a-1feda8b1b46b in datapath 17ec2ca4-3fa9-41aa-80ef-35bf92d404ba unbound from our chassis
Nov 29 06:55:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:16.625 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 17ec2ca4-3fa9-41aa-80ef-35bf92d404ba, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 06:55:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:16.627 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[f575dff7-8378-41c8-a4c3-6e8879165f6d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:55:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:16.628 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba namespace which is not needed anymore
Nov 29 06:55:16 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000022.scope: Deactivated successfully.
Nov 29 06:55:16 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000022.scope: Consumed 9.508s CPU time.
Nov 29 06:55:16 compute-0 systemd-machined[153486]: Machine qemu-12-instance-00000022 terminated.
Nov 29 06:55:16 compute-0 nova_compute[187185]: 2025-11-29 06:55:16.727 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:55:16 compute-0 nova_compute[187185]: 2025-11-29 06:55:16.732 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:55:16 compute-0 nova_compute[187185]: 2025-11-29 06:55:16.769 187189 INFO nova.virt.libvirt.driver [-] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Instance destroyed successfully.
Nov 29 06:55:16 compute-0 nova_compute[187185]: 2025-11-29 06:55:16.770 187189 DEBUG nova.objects.instance [None req-778dbeb8-7eec-477d-b6cb-20890f262ee2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lazy-loading 'resources' on Instance uuid 6171b282-fd8f-42ba-9875-2ae45ea80365 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:55:16 compute-0 nova_compute[187185]: 2025-11-29 06:55:16.837 187189 DEBUG nova.virt.libvirt.vif [None req-778dbeb8-7eec-477d-b6cb-20890f262ee2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T06:54:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-797406206',display_name='tempest-ImagesTestJSON-server-797406206',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-797406206',id=34,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:55:07Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='78f8ba841bbe4fdcb9d9e2237d97bf73',ramdisk_id='',reservation_id='r-q8w53trj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesTestJSON-1674785298',owner_user_name='tempest-ImagesTestJSON-1674785298-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T06:55:14Z,user_data=None,user_id='315be492c2ce4b9f8af2898e6794a256',uuid=6171b282-fd8f-42ba-9875-2ae45ea80365,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "43628527-f640-422f-909a-1feda8b1b46b", "address": "fa:16:3e:9d:c7:52", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43628527-f6", "ovs_interfaceid": "43628527-f640-422f-909a-1feda8b1b46b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 06:55:16 compute-0 nova_compute[187185]: 2025-11-29 06:55:16.839 187189 DEBUG nova.network.os_vif_util [None req-778dbeb8-7eec-477d-b6cb-20890f262ee2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Converting VIF {"id": "43628527-f640-422f-909a-1feda8b1b46b", "address": "fa:16:3e:9d:c7:52", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43628527-f6", "ovs_interfaceid": "43628527-f640-422f-909a-1feda8b1b46b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 06:55:16 compute-0 nova_compute[187185]: 2025-11-29 06:55:16.840 187189 DEBUG nova.network.os_vif_util [None req-778dbeb8-7eec-477d-b6cb-20890f262ee2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9d:c7:52,bridge_name='br-int',has_traffic_filtering=True,id=43628527-f640-422f-909a-1feda8b1b46b,network=Network(17ec2ca4-3fa9-41aa-80ef-35bf92d404ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43628527-f6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 06:55:16 compute-0 nova_compute[187185]: 2025-11-29 06:55:16.841 187189 DEBUG os_vif [None req-778dbeb8-7eec-477d-b6cb-20890f262ee2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9d:c7:52,bridge_name='br-int',has_traffic_filtering=True,id=43628527-f640-422f-909a-1feda8b1b46b,network=Network(17ec2ca4-3fa9-41aa-80ef-35bf92d404ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43628527-f6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 06:55:16 compute-0 nova_compute[187185]: 2025-11-29 06:55:16.844 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:55:16 compute-0 nova_compute[187185]: 2025-11-29 06:55:16.845 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap43628527-f6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:55:16 compute-0 nova_compute[187185]: 2025-11-29 06:55:16.847 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:55:16 compute-0 nova_compute[187185]: 2025-11-29 06:55:16.850 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 06:55:16 compute-0 nova_compute[187185]: 2025-11-29 06:55:16.855 187189 INFO os_vif [None req-778dbeb8-7eec-477d-b6cb-20890f262ee2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9d:c7:52,bridge_name='br-int',has_traffic_filtering=True,id=43628527-f640-422f-909a-1feda8b1b46b,network=Network(17ec2ca4-3fa9-41aa-80ef-35bf92d404ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43628527-f6')
Nov 29 06:55:16 compute-0 nova_compute[187185]: 2025-11-29 06:55:16.856 187189 INFO nova.virt.libvirt.driver [None req-778dbeb8-7eec-477d-b6cb-20890f262ee2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Deleting instance files /var/lib/nova/instances/6171b282-fd8f-42ba-9875-2ae45ea80365_del
Nov 29 06:55:16 compute-0 nova_compute[187185]: 2025-11-29 06:55:16.857 187189 INFO nova.virt.libvirt.driver [None req-778dbeb8-7eec-477d-b6cb-20890f262ee2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Deletion of /var/lib/nova/instances/6171b282-fd8f-42ba-9875-2ae45ea80365_del complete
Nov 29 06:55:16 compute-0 neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba[218140]: [NOTICE]   (218145) : haproxy version is 2.8.14-c23fe91
Nov 29 06:55:16 compute-0 neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba[218140]: [NOTICE]   (218145) : path to executable is /usr/sbin/haproxy
Nov 29 06:55:16 compute-0 neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba[218140]: [WARNING]  (218145) : Exiting Master process...
Nov 29 06:55:16 compute-0 neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba[218140]: [ALERT]    (218145) : Current worker (218147) exited with code 143 (Terminated)
Nov 29 06:55:16 compute-0 neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba[218140]: [WARNING]  (218145) : All workers exited. Exiting... (0)
Nov 29 06:55:16 compute-0 systemd[1]: libpod-89eaf63b3bc77ec68b3e7bab5443b6aa967d9614bc70c6f941f99673de9c4f87.scope: Deactivated successfully.
Nov 29 06:55:16 compute-0 podman[218250]: 2025-11-29 06:55:16.877712568 +0000 UTC m=+0.152407175 container died 89eaf63b3bc77ec68b3e7bab5443b6aa967d9614bc70c6f941f99673de9c4f87 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 29 06:55:17 compute-0 nova_compute[187185]: 2025-11-29 06:55:17.072 187189 INFO nova.compute.manager [None req-778dbeb8-7eec-477d-b6cb-20890f262ee2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Took 0.57 seconds to destroy the instance on the hypervisor.
Nov 29 06:55:17 compute-0 nova_compute[187185]: 2025-11-29 06:55:17.073 187189 DEBUG oslo.service.loopingcall [None req-778dbeb8-7eec-477d-b6cb-20890f262ee2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 06:55:17 compute-0 nova_compute[187185]: 2025-11-29 06:55:17.073 187189 DEBUG nova.compute.manager [-] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 06:55:17 compute-0 nova_compute[187185]: 2025-11-29 06:55:17.074 187189 DEBUG nova.network.neutron [-] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 06:55:17 compute-0 nova_compute[187185]: 2025-11-29 06:55:17.096 187189 DEBUG nova.compute.manager [req-71f1cc7d-5f91-477d-ac7e-6f0c235d42a6 req-8caa0c4d-f410-46bd-8f90-d3fd19f4321c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Received event network-vif-unplugged-43628527-f640-422f-909a-1feda8b1b46b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:55:17 compute-0 nova_compute[187185]: 2025-11-29 06:55:17.097 187189 DEBUG oslo_concurrency.lockutils [req-71f1cc7d-5f91-477d-ac7e-6f0c235d42a6 req-8caa0c4d-f410-46bd-8f90-d3fd19f4321c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "6171b282-fd8f-42ba-9875-2ae45ea80365-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:55:17 compute-0 nova_compute[187185]: 2025-11-29 06:55:17.097 187189 DEBUG oslo_concurrency.lockutils [req-71f1cc7d-5f91-477d-ac7e-6f0c235d42a6 req-8caa0c4d-f410-46bd-8f90-d3fd19f4321c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "6171b282-fd8f-42ba-9875-2ae45ea80365-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:55:17 compute-0 nova_compute[187185]: 2025-11-29 06:55:17.097 187189 DEBUG oslo_concurrency.lockutils [req-71f1cc7d-5f91-477d-ac7e-6f0c235d42a6 req-8caa0c4d-f410-46bd-8f90-d3fd19f4321c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "6171b282-fd8f-42ba-9875-2ae45ea80365-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:55:17 compute-0 nova_compute[187185]: 2025-11-29 06:55:17.098 187189 DEBUG nova.compute.manager [req-71f1cc7d-5f91-477d-ac7e-6f0c235d42a6 req-8caa0c4d-f410-46bd-8f90-d3fd19f4321c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] No waiting events found dispatching network-vif-unplugged-43628527-f640-422f-909a-1feda8b1b46b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 06:55:17 compute-0 nova_compute[187185]: 2025-11-29 06:55:17.098 187189 DEBUG nova.compute.manager [req-71f1cc7d-5f91-477d-ac7e-6f0c235d42a6 req-8caa0c4d-f410-46bd-8f90-d3fd19f4321c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Received event network-vif-unplugged-43628527-f640-422f-909a-1feda8b1b46b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 06:55:17 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-89eaf63b3bc77ec68b3e7bab5443b6aa967d9614bc70c6f941f99673de9c4f87-userdata-shm.mount: Deactivated successfully.
Nov 29 06:55:17 compute-0 systemd[1]: var-lib-containers-storage-overlay-4df7b7fd28614d455b6b400d92a5a5a1adad67384cb8db154712c8c31cf4899c-merged.mount: Deactivated successfully.
Nov 29 06:55:18 compute-0 nova_compute[187185]: 2025-11-29 06:55:18.165 187189 DEBUG nova.network.neutron [-] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:55:18 compute-0 nova_compute[187185]: 2025-11-29 06:55:18.174 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:55:18 compute-0 nova_compute[187185]: 2025-11-29 06:55:18.198 187189 INFO nova.compute.manager [-] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Took 1.12 seconds to deallocate network for instance.
Nov 29 06:55:18 compute-0 nova_compute[187185]: 2025-11-29 06:55:18.299 187189 DEBUG oslo_concurrency.lockutils [None req-778dbeb8-7eec-477d-b6cb-20890f262ee2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:55:18 compute-0 nova_compute[187185]: 2025-11-29 06:55:18.300 187189 DEBUG oslo_concurrency.lockutils [None req-778dbeb8-7eec-477d-b6cb-20890f262ee2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:55:18 compute-0 nova_compute[187185]: 2025-11-29 06:55:18.303 187189 DEBUG nova.compute.manager [req-9d8a6e25-0292-4fe9-8052-b31c1109973a req-2613528b-05a8-4515-9628-4ff4fdd95bdf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Received event network-vif-deleted-43628527-f640-422f-909a-1feda8b1b46b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:55:18 compute-0 nova_compute[187185]: 2025-11-29 06:55:18.398 187189 DEBUG nova.compute.provider_tree [None req-778dbeb8-7eec-477d-b6cb-20890f262ee2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:55:18 compute-0 podman[218250]: 2025-11-29 06:55:18.407767502 +0000 UTC m=+1.682462109 container cleanup 89eaf63b3bc77ec68b3e7bab5443b6aa967d9614bc70c6f941f99673de9c4f87 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Nov 29 06:55:18 compute-0 nova_compute[187185]: 2025-11-29 06:55:18.419 187189 DEBUG nova.scheduler.client.report [None req-778dbeb8-7eec-477d-b6cb-20890f262ee2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:55:18 compute-0 systemd[1]: libpod-conmon-89eaf63b3bc77ec68b3e7bab5443b6aa967d9614bc70c6f941f99673de9c4f87.scope: Deactivated successfully.
Nov 29 06:55:18 compute-0 nova_compute[187185]: 2025-11-29 06:55:18.467 187189 DEBUG oslo_concurrency.lockutils [None req-778dbeb8-7eec-477d-b6cb-20890f262ee2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.167s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:55:18 compute-0 nova_compute[187185]: 2025-11-29 06:55:18.494 187189 INFO nova.scheduler.client.report [None req-778dbeb8-7eec-477d-b6cb-20890f262ee2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Deleted allocations for instance 6171b282-fd8f-42ba-9875-2ae45ea80365
Nov 29 06:55:18 compute-0 nova_compute[187185]: 2025-11-29 06:55:18.588 187189 DEBUG oslo_concurrency.lockutils [None req-778dbeb8-7eec-477d-b6cb-20890f262ee2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "6171b282-fd8f-42ba-9875-2ae45ea80365" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:55:18 compute-0 podman[218297]: 2025-11-29 06:55:18.602084679 +0000 UTC m=+0.145282283 container remove 89eaf63b3bc77ec68b3e7bab5443b6aa967d9614bc70c6f941f99673de9c4f87 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:55:18 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:18.611 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[aaf8a007-d1e4-4ef3-8acd-42224d2b4214]: (4, ('Sat Nov 29 06:55:16 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba (89eaf63b3bc77ec68b3e7bab5443b6aa967d9614bc70c6f941f99673de9c4f87)\n89eaf63b3bc77ec68b3e7bab5443b6aa967d9614bc70c6f941f99673de9c4f87\nSat Nov 29 06:55:18 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba (89eaf63b3bc77ec68b3e7bab5443b6aa967d9614bc70c6f941f99673de9c4f87)\n89eaf63b3bc77ec68b3e7bab5443b6aa967d9614bc70c6f941f99673de9c4f87\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:55:18 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:18.614 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[5ee67563-a6d7-4f8a-b487-c685db0d463c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:55:18 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:18.616 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap17ec2ca4-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:55:18 compute-0 nova_compute[187185]: 2025-11-29 06:55:18.620 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:55:18 compute-0 kernel: tap17ec2ca4-30: left promiscuous mode
Nov 29 06:55:18 compute-0 nova_compute[187185]: 2025-11-29 06:55:18.645 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:55:18 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:18.648 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[5208bee7-0be3-48c4-9302-dc13bcd3ad81]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:55:18 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:18.670 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[68374d97-9c6e-4d57-82d4-4e30a5599724]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:55:18 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:18.671 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[3766ada6-9afb-49e1-b2cd-9ffafdacec37]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:55:18 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:18.690 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[00d472b3-4c3c-4cb8-a1e5-5cc49935713e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 475458, 'reachable_time': 21782, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218313, 'error': None, 'target': 'ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:55:18 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:18.694 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 06:55:18 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:18.695 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[4c15e464-5d7a-4303-8036-143a7376cccb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:55:18 compute-0 systemd[1]: run-netns-ovnmeta\x2d17ec2ca4\x2d3fa9\x2d41aa\x2d80ef\x2d35bf92d404ba.mount: Deactivated successfully.
Nov 29 06:55:19 compute-0 nova_compute[187185]: 2025-11-29 06:55:19.358 187189 DEBUG nova.compute.manager [req-3f83fa02-eda3-4022-8598-a95d10b270a5 req-572e6795-b450-4255-b79e-b8edad7f2169 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Received event network-vif-plugged-43628527-f640-422f-909a-1feda8b1b46b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:55:19 compute-0 nova_compute[187185]: 2025-11-29 06:55:19.359 187189 DEBUG oslo_concurrency.lockutils [req-3f83fa02-eda3-4022-8598-a95d10b270a5 req-572e6795-b450-4255-b79e-b8edad7f2169 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "6171b282-fd8f-42ba-9875-2ae45ea80365-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:55:19 compute-0 nova_compute[187185]: 2025-11-29 06:55:19.360 187189 DEBUG oslo_concurrency.lockutils [req-3f83fa02-eda3-4022-8598-a95d10b270a5 req-572e6795-b450-4255-b79e-b8edad7f2169 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "6171b282-fd8f-42ba-9875-2ae45ea80365-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:55:19 compute-0 nova_compute[187185]: 2025-11-29 06:55:19.360 187189 DEBUG oslo_concurrency.lockutils [req-3f83fa02-eda3-4022-8598-a95d10b270a5 req-572e6795-b450-4255-b79e-b8edad7f2169 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "6171b282-fd8f-42ba-9875-2ae45ea80365-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:55:19 compute-0 nova_compute[187185]: 2025-11-29 06:55:19.361 187189 DEBUG nova.compute.manager [req-3f83fa02-eda3-4022-8598-a95d10b270a5 req-572e6795-b450-4255-b79e-b8edad7f2169 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] No waiting events found dispatching network-vif-plugged-43628527-f640-422f-909a-1feda8b1b46b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 06:55:19 compute-0 nova_compute[187185]: 2025-11-29 06:55:19.361 187189 WARNING nova.compute.manager [req-3f83fa02-eda3-4022-8598-a95d10b270a5 req-572e6795-b450-4255-b79e-b8edad7f2169 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Received unexpected event network-vif-plugged-43628527-f640-422f-909a-1feda8b1b46b for instance with vm_state deleted and task_state None.
Nov 29 06:55:19 compute-0 nova_compute[187185]: 2025-11-29 06:55:19.775 187189 DEBUG nova.virt.libvirt.driver [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Nov 29 06:55:21 compute-0 nova_compute[187185]: 2025-11-29 06:55:21.849 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:55:21 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000021.scope: Deactivated successfully.
Nov 29 06:55:21 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000021.scope: Consumed 12.896s CPU time.
Nov 29 06:55:21 compute-0 systemd-machined[153486]: Machine qemu-11-instance-00000021 terminated.
Nov 29 06:55:22 compute-0 nova_compute[187185]: 2025-11-29 06:55:22.794 187189 INFO nova.virt.libvirt.driver [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Instance shutdown successfully after 13 seconds.
Nov 29 06:55:22 compute-0 nova_compute[187185]: 2025-11-29 06:55:22.801 187189 INFO nova.virt.libvirt.driver [-] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Instance destroyed successfully.
Nov 29 06:55:22 compute-0 nova_compute[187185]: 2025-11-29 06:55:22.805 187189 DEBUG oslo_concurrency.processutils [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:55:22 compute-0 nova_compute[187185]: 2025-11-29 06:55:22.920 187189 DEBUG oslo_concurrency.processutils [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk --force-share --output=json" returned: 0 in 0.115s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:55:22 compute-0 nova_compute[187185]: 2025-11-29 06:55:22.922 187189 DEBUG oslo_concurrency.processutils [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:55:23 compute-0 nova_compute[187185]: 2025-11-29 06:55:23.023 187189 DEBUG oslo_concurrency.processutils [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk --force-share --output=json" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:55:23 compute-0 nova_compute[187185]: 2025-11-29 06:55:23.025 187189 DEBUG nova.virt.libvirt.volume.remotefs [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Copying file /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4_resize/disk to 192.168.122.101:/var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 29 06:55:23 compute-0 nova_compute[187185]: 2025-11-29 06:55:23.025 187189 DEBUG oslo_concurrency.processutils [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4_resize/disk 192.168.122.101:/var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:55:23 compute-0 nova_compute[187185]: 2025-11-29 06:55:23.177 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:55:24 compute-0 nova_compute[187185]: 2025-11-29 06:55:24.654 187189 DEBUG oslo_concurrency.processutils [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] CMD "scp -r /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4_resize/disk 192.168.122.101:/var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk" returned: 0 in 1.628s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:55:24 compute-0 nova_compute[187185]: 2025-11-29 06:55:24.655 187189 DEBUG nova.virt.libvirt.volume.remotefs [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Copying file /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4_resize/disk.config to 192.168.122.101:/var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 29 06:55:24 compute-0 nova_compute[187185]: 2025-11-29 06:55:24.655 187189 DEBUG oslo_concurrency.processutils [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4_resize/disk.config 192.168.122.101:/var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:55:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:24.815 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:55:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:24.816 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:55:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:24.816 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:55:24 compute-0 nova_compute[187185]: 2025-11-29 06:55:24.897 187189 DEBUG oslo_concurrency.processutils [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] CMD "scp -C -r /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4_resize/disk.config 192.168.122.101:/var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk.config" returned: 0 in 0.242s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:55:24 compute-0 nova_compute[187185]: 2025-11-29 06:55:24.899 187189 DEBUG nova.virt.libvirt.volume.remotefs [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Copying file /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4_resize/disk.info to 192.168.122.101:/var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 29 06:55:24 compute-0 nova_compute[187185]: 2025-11-29 06:55:24.900 187189 DEBUG oslo_concurrency.processutils [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4_resize/disk.info 192.168.122.101:/var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:55:25 compute-0 nova_compute[187185]: 2025-11-29 06:55:25.137 187189 DEBUG oslo_concurrency.processutils [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] CMD "scp -C -r /var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4_resize/disk.info 192.168.122.101:/var/lib/nova/instances/66b9235f-7cc8-40d4-877b-b690613298a4/disk.info" returned: 0 in 0.237s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:55:25 compute-0 nova_compute[187185]: 2025-11-29 06:55:25.281 187189 DEBUG oslo_concurrency.lockutils [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Acquiring lock "66b9235f-7cc8-40d4-877b-b690613298a4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:55:25 compute-0 nova_compute[187185]: 2025-11-29 06:55:25.282 187189 DEBUG oslo_concurrency.lockutils [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Lock "66b9235f-7cc8-40d4-877b-b690613298a4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:55:25 compute-0 nova_compute[187185]: 2025-11-29 06:55:25.283 187189 DEBUG oslo_concurrency.lockutils [None req-9cfa627a-01ba-43f5-83d6-12da0fdd9279 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Lock "66b9235f-7cc8-40d4-877b-b690613298a4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:55:26 compute-0 podman[218335]: 2025-11-29 06:55:26.808467221 +0000 UTC m=+0.065805700 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 29 06:55:26 compute-0 podman[218336]: 2025-11-29 06:55:26.82472626 +0000 UTC m=+0.074627549 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, maintainer=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, release=1755695350, vendor=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public)
Nov 29 06:55:26 compute-0 podman[218337]: 2025-11-29 06:55:26.837860071 +0000 UTC m=+0.084314173 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 06:55:26 compute-0 nova_compute[187185]: 2025-11-29 06:55:26.853 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:55:28 compute-0 nova_compute[187185]: 2025-11-29 06:55:28.209 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:55:29 compute-0 nova_compute[187185]: 2025-11-29 06:55:29.787 187189 DEBUG oslo_concurrency.lockutils [None req-9fc844ed-03b6-4046-a853-bf4bdaa8be76 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "66b9235f-7cc8-40d4-877b-b690613298a4" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:55:29 compute-0 nova_compute[187185]: 2025-11-29 06:55:29.787 187189 DEBUG oslo_concurrency.lockutils [None req-9fc844ed-03b6-4046-a853-bf4bdaa8be76 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "66b9235f-7cc8-40d4-877b-b690613298a4" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:55:29 compute-0 nova_compute[187185]: 2025-11-29 06:55:29.788 187189 DEBUG nova.compute.manager [None req-9fc844ed-03b6-4046-a853-bf4bdaa8be76 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Going to confirm migration 7 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679
Nov 29 06:55:29 compute-0 nova_compute[187185]: 2025-11-29 06:55:29.894 187189 DEBUG nova.objects.instance [None req-9fc844ed-03b6-4046-a853-bf4bdaa8be76 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lazy-loading 'info_cache' on Instance uuid 66b9235f-7cc8-40d4-877b-b690613298a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:55:29 compute-0 nova_compute[187185]: 2025-11-29 06:55:29.897 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:55:30 compute-0 nova_compute[187185]: 2025-11-29 06:55:30.521 187189 DEBUG oslo_concurrency.lockutils [None req-9fc844ed-03b6-4046-a853-bf4bdaa8be76 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "refresh_cache-66b9235f-7cc8-40d4-877b-b690613298a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:55:30 compute-0 nova_compute[187185]: 2025-11-29 06:55:30.521 187189 DEBUG oslo_concurrency.lockutils [None req-9fc844ed-03b6-4046-a853-bf4bdaa8be76 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquired lock "refresh_cache-66b9235f-7cc8-40d4-877b-b690613298a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:55:30 compute-0 nova_compute[187185]: 2025-11-29 06:55:30.521 187189 DEBUG nova.network.neutron [None req-9fc844ed-03b6-4046-a853-bf4bdaa8be76 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 06:55:30 compute-0 nova_compute[187185]: 2025-11-29 06:55:30.767 187189 DEBUG nova.network.neutron [None req-9fc844ed-03b6-4046-a853-bf4bdaa8be76 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 06:55:31 compute-0 nova_compute[187185]: 2025-11-29 06:55:31.211 187189 DEBUG nova.network.neutron [None req-9fc844ed-03b6-4046-a853-bf4bdaa8be76 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:55:31 compute-0 nova_compute[187185]: 2025-11-29 06:55:31.239 187189 DEBUG oslo_concurrency.lockutils [None req-9fc844ed-03b6-4046-a853-bf4bdaa8be76 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Releasing lock "refresh_cache-66b9235f-7cc8-40d4-877b-b690613298a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:55:31 compute-0 nova_compute[187185]: 2025-11-29 06:55:31.240 187189 DEBUG nova.objects.instance [None req-9fc844ed-03b6-4046-a853-bf4bdaa8be76 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lazy-loading 'migration_context' on Instance uuid 66b9235f-7cc8-40d4-877b-b690613298a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:55:31 compute-0 nova_compute[187185]: 2025-11-29 06:55:31.277 187189 DEBUG oslo_concurrency.lockutils [None req-9fc844ed-03b6-4046-a853-bf4bdaa8be76 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:55:31 compute-0 nova_compute[187185]: 2025-11-29 06:55:31.278 187189 DEBUG oslo_concurrency.lockutils [None req-9fc844ed-03b6-4046-a853-bf4bdaa8be76 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:55:31 compute-0 nova_compute[187185]: 2025-11-29 06:55:31.408 187189 DEBUG nova.compute.provider_tree [None req-9fc844ed-03b6-4046-a853-bf4bdaa8be76 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:55:31 compute-0 nova_compute[187185]: 2025-11-29 06:55:31.426 187189 DEBUG nova.scheduler.client.report [None req-9fc844ed-03b6-4046-a853-bf4bdaa8be76 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:55:31 compute-0 nova_compute[187185]: 2025-11-29 06:55:31.514 187189 DEBUG oslo_concurrency.lockutils [None req-9fc844ed-03b6-4046-a853-bf4bdaa8be76 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.235s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:55:31 compute-0 nova_compute[187185]: 2025-11-29 06:55:31.698 187189 INFO nova.scheduler.client.report [None req-9fc844ed-03b6-4046-a853-bf4bdaa8be76 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Deleted allocation for migration 3da3d3f5-2569-489b-86cb-65e3eee7704d
Nov 29 06:55:31 compute-0 nova_compute[187185]: 2025-11-29 06:55:31.768 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399316.7667403, 6171b282-fd8f-42ba-9875-2ae45ea80365 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:55:31 compute-0 nova_compute[187185]: 2025-11-29 06:55:31.769 187189 INFO nova.compute.manager [-] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] VM Stopped (Lifecycle Event)
Nov 29 06:55:31 compute-0 nova_compute[187185]: 2025-11-29 06:55:31.796 187189 DEBUG nova.compute.manager [None req-bcaeecbc-c0fd-493f-8e61-71f03411d2ac - - - - - -] [instance: 6171b282-fd8f-42ba-9875-2ae45ea80365] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:55:31 compute-0 nova_compute[187185]: 2025-11-29 06:55:31.806 187189 DEBUG oslo_concurrency.lockutils [None req-9fc844ed-03b6-4046-a853-bf4bdaa8be76 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "66b9235f-7cc8-40d4-877b-b690613298a4" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 2.018s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:55:31 compute-0 nova_compute[187185]: 2025-11-29 06:55:31.855 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:55:32 compute-0 podman[218395]: 2025-11-29 06:55:32.914095785 +0000 UTC m=+0.176068633 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Nov 29 06:55:33 compute-0 nova_compute[187185]: 2025-11-29 06:55:33.211 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:55:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:33.331 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 06:55:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:33.332 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 06:55:33 compute-0 nova_compute[187185]: 2025-11-29 06:55:33.333 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:55:35 compute-0 podman[218423]: 2025-11-29 06:55:35.796245687 +0000 UTC m=+0.058820492 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 06:55:35 compute-0 podman[218422]: 2025-11-29 06:55:35.846131176 +0000 UTC m=+0.105456829 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:55:36 compute-0 nova_compute[187185]: 2025-11-29 06:55:36.858 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:55:37 compute-0 nova_compute[187185]: 2025-11-29 06:55:37.186 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399322.1844518, 66b9235f-7cc8-40d4-877b-b690613298a4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:55:37 compute-0 nova_compute[187185]: 2025-11-29 06:55:37.186 187189 INFO nova.compute.manager [-] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] VM Stopped (Lifecycle Event)
Nov 29 06:55:37 compute-0 nova_compute[187185]: 2025-11-29 06:55:37.213 187189 DEBUG nova.compute.manager [None req-4950498f-2fcd-406a-b5d3-d48252a33d3a - - - - - -] [instance: 66b9235f-7cc8-40d4-877b-b690613298a4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:55:38 compute-0 nova_compute[187185]: 2025-11-29 06:55:38.220 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:55:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:55:41.335 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:55:41 compute-0 nova_compute[187185]: 2025-11-29 06:55:41.861 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:55:43 compute-0 nova_compute[187185]: 2025-11-29 06:55:43.272 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:55:43 compute-0 nova_compute[187185]: 2025-11-29 06:55:43.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:55:44 compute-0 nova_compute[187185]: 2025-11-29 06:55:44.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:55:44 compute-0 nova_compute[187185]: 2025-11-29 06:55:44.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 06:55:44 compute-0 nova_compute[187185]: 2025-11-29 06:55:44.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 06:55:44 compute-0 nova_compute[187185]: 2025-11-29 06:55:44.359 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 06:55:45 compute-0 nova_compute[187185]: 2025-11-29 06:55:45.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:55:45 compute-0 nova_compute[187185]: 2025-11-29 06:55:45.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:55:45 compute-0 podman[218466]: 2025-11-29 06:55:45.792889341 +0000 UTC m=+0.060584452 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 06:55:46 compute-0 nova_compute[187185]: 2025-11-29 06:55:46.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:55:46 compute-0 nova_compute[187185]: 2025-11-29 06:55:46.864 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:55:47 compute-0 nova_compute[187185]: 2025-11-29 06:55:47.310 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:55:47 compute-0 nova_compute[187185]: 2025-11-29 06:55:47.347 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:55:47 compute-0 nova_compute[187185]: 2025-11-29 06:55:47.392 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:55:47 compute-0 nova_compute[187185]: 2025-11-29 06:55:47.393 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:55:47 compute-0 nova_compute[187185]: 2025-11-29 06:55:47.393 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:55:47 compute-0 nova_compute[187185]: 2025-11-29 06:55:47.393 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 06:55:47 compute-0 nova_compute[187185]: 2025-11-29 06:55:47.592 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:55:47 compute-0 nova_compute[187185]: 2025-11-29 06:55:47.593 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5748MB free_disk=73.3388442993164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 06:55:47 compute-0 nova_compute[187185]: 2025-11-29 06:55:47.593 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:55:47 compute-0 nova_compute[187185]: 2025-11-29 06:55:47.593 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:55:47 compute-0 nova_compute[187185]: 2025-11-29 06:55:47.845 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 06:55:47 compute-0 nova_compute[187185]: 2025-11-29 06:55:47.846 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 06:55:47 compute-0 nova_compute[187185]: 2025-11-29 06:55:47.887 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:55:47 compute-0 nova_compute[187185]: 2025-11-29 06:55:47.918 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:55:47 compute-0 nova_compute[187185]: 2025-11-29 06:55:47.972 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 06:55:47 compute-0 nova_compute[187185]: 2025-11-29 06:55:47.973 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.379s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:55:48 compute-0 nova_compute[187185]: 2025-11-29 06:55:48.273 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:55:48 compute-0 nova_compute[187185]: 2025-11-29 06:55:48.943 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:55:49 compute-0 nova_compute[187185]: 2025-11-29 06:55:49.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:55:49 compute-0 nova_compute[187185]: 2025-11-29 06:55:49.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:55:49 compute-0 nova_compute[187185]: 2025-11-29 06:55:49.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 06:55:51 compute-0 sshd-session[218487]: Invalid user hu from 179.125.24.202 port 58688
Nov 29 06:55:51 compute-0 nova_compute[187185]: 2025-11-29 06:55:51.868 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:55:51 compute-0 sshd-session[218487]: Received disconnect from 179.125.24.202 port 58688:11: Bye Bye [preauth]
Nov 29 06:55:51 compute-0 sshd-session[218487]: Disconnected from invalid user hu 179.125.24.202 port 58688 [preauth]
Nov 29 06:55:53 compute-0 nova_compute[187185]: 2025-11-29 06:55:53.276 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:55:56 compute-0 nova_compute[187185]: 2025-11-29 06:55:56.871 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:55:57 compute-0 podman[218490]: 2025-11-29 06:55:57.831101803 +0000 UTC m=+0.086515994 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.33.7, managed_by=edpm_ansible, version=9.6, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal)
Nov 29 06:55:57 compute-0 podman[218489]: 2025-11-29 06:55:57.834652603 +0000 UTC m=+0.089285042 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 29 06:55:57 compute-0 podman[218491]: 2025-11-29 06:55:57.843207485 +0000 UTC m=+0.088914102 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 06:55:58 compute-0 nova_compute[187185]: 2025-11-29 06:55:58.278 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:56:01 compute-0 nova_compute[187185]: 2025-11-29 06:56:01.874 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:56:03 compute-0 ovn_controller[95281]: 2025-11-29T06:56:03Z|00096|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 29 06:56:03 compute-0 nova_compute[187185]: 2025-11-29 06:56:03.281 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:56:03 compute-0 podman[218548]: 2025-11-29 06:56:03.876936598 +0000 UTC m=+0.133077379 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 29 06:56:06 compute-0 podman[218573]: 2025-11-29 06:56:06.807980182 +0000 UTC m=+0.071881711 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:56:06 compute-0 podman[218574]: 2025-11-29 06:56:06.818985362 +0000 UTC m=+0.084557508 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 06:56:06 compute-0 nova_compute[187185]: 2025-11-29 06:56:06.878 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:56:08 compute-0 nova_compute[187185]: 2025-11-29 06:56:08.283 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:56:11 compute-0 sshd-session[218615]: Invalid user conectar from 1.214.197.163 port 48810
Nov 29 06:56:11 compute-0 sshd-session[218615]: Received disconnect from 1.214.197.163 port 48810:11: Bye Bye [preauth]
Nov 29 06:56:11 compute-0 sshd-session[218615]: Disconnected from invalid user conectar 1.214.197.163 port 48810 [preauth]
Nov 29 06:56:11 compute-0 nova_compute[187185]: 2025-11-29 06:56:11.881 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:56:13 compute-0 nova_compute[187185]: 2025-11-29 06:56:13.286 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:56:13 compute-0 nova_compute[187185]: 2025-11-29 06:56:13.317 187189 DEBUG oslo_concurrency.lockutils [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "6aebe65a-3191-4d58-acfd-8d663b9b0a8e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:56:13 compute-0 nova_compute[187185]: 2025-11-29 06:56:13.318 187189 DEBUG oslo_concurrency.lockutils [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "6aebe65a-3191-4d58-acfd-8d663b9b0a8e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:56:13 compute-0 nova_compute[187185]: 2025-11-29 06:56:13.352 187189 DEBUG nova.compute.manager [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 06:56:13 compute-0 nova_compute[187185]: 2025-11-29 06:56:13.467 187189 DEBUG oslo_concurrency.lockutils [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:56:13 compute-0 nova_compute[187185]: 2025-11-29 06:56:13.467 187189 DEBUG oslo_concurrency.lockutils [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:56:13 compute-0 nova_compute[187185]: 2025-11-29 06:56:13.475 187189 DEBUG nova.virt.hardware [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 06:56:13 compute-0 nova_compute[187185]: 2025-11-29 06:56:13.475 187189 INFO nova.compute.claims [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Claim successful on node compute-0.ctlplane.example.com
Nov 29 06:56:13 compute-0 nova_compute[187185]: 2025-11-29 06:56:13.574 187189 DEBUG nova.compute.provider_tree [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:56:13 compute-0 nova_compute[187185]: 2025-11-29 06:56:13.589 187189 DEBUG nova.scheduler.client.report [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:56:13 compute-0 nova_compute[187185]: 2025-11-29 06:56:13.611 187189 DEBUG oslo_concurrency.lockutils [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:56:13 compute-0 nova_compute[187185]: 2025-11-29 06:56:13.612 187189 DEBUG nova.compute.manager [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 06:56:13 compute-0 nova_compute[187185]: 2025-11-29 06:56:13.663 187189 DEBUG nova.compute.manager [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 06:56:13 compute-0 nova_compute[187185]: 2025-11-29 06:56:13.664 187189 DEBUG nova.network.neutron [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 06:56:13 compute-0 nova_compute[187185]: 2025-11-29 06:56:13.683 187189 INFO nova.virt.libvirt.driver [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 06:56:13 compute-0 nova_compute[187185]: 2025-11-29 06:56:13.702 187189 DEBUG nova.compute.manager [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 06:56:13 compute-0 nova_compute[187185]: 2025-11-29 06:56:13.802 187189 DEBUG nova.compute.manager [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 06:56:13 compute-0 nova_compute[187185]: 2025-11-29 06:56:13.804 187189 DEBUG nova.virt.libvirt.driver [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 06:56:13 compute-0 nova_compute[187185]: 2025-11-29 06:56:13.805 187189 INFO nova.virt.libvirt.driver [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Creating image(s)
Nov 29 06:56:13 compute-0 nova_compute[187185]: 2025-11-29 06:56:13.806 187189 DEBUG oslo_concurrency.lockutils [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "/var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:56:13 compute-0 nova_compute[187185]: 2025-11-29 06:56:13.809 187189 DEBUG oslo_concurrency.lockutils [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "/var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:56:13 compute-0 nova_compute[187185]: 2025-11-29 06:56:13.811 187189 DEBUG oslo_concurrency.lockutils [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "/var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:56:13 compute-0 nova_compute[187185]: 2025-11-29 06:56:13.838 187189 DEBUG oslo_concurrency.processutils [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:56:13 compute-0 nova_compute[187185]: 2025-11-29 06:56:13.904 187189 DEBUG oslo_concurrency.processutils [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:56:13 compute-0 nova_compute[187185]: 2025-11-29 06:56:13.905 187189 DEBUG oslo_concurrency.lockutils [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:56:13 compute-0 nova_compute[187185]: 2025-11-29 06:56:13.907 187189 DEBUG oslo_concurrency.lockutils [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:56:13 compute-0 nova_compute[187185]: 2025-11-29 06:56:13.930 187189 DEBUG oslo_concurrency.processutils [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:56:14 compute-0 nova_compute[187185]: 2025-11-29 06:56:14.026 187189 DEBUG oslo_concurrency.processutils [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:56:14 compute-0 nova_compute[187185]: 2025-11-29 06:56:14.029 187189 DEBUG oslo_concurrency.processutils [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:56:14 compute-0 nova_compute[187185]: 2025-11-29 06:56:14.269 187189 DEBUG oslo_concurrency.processutils [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk 1073741824" returned: 0 in 0.240s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:56:14 compute-0 nova_compute[187185]: 2025-11-29 06:56:14.271 187189 DEBUG oslo_concurrency.lockutils [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.365s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:56:14 compute-0 nova_compute[187185]: 2025-11-29 06:56:14.272 187189 DEBUG oslo_concurrency.processutils [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:56:14 compute-0 nova_compute[187185]: 2025-11-29 06:56:14.367 187189 DEBUG oslo_concurrency.processutils [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:56:14 compute-0 nova_compute[187185]: 2025-11-29 06:56:14.369 187189 DEBUG nova.virt.disk.api [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Checking if we can resize image /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 06:56:14 compute-0 nova_compute[187185]: 2025-11-29 06:56:14.370 187189 DEBUG oslo_concurrency.processutils [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:56:14 compute-0 nova_compute[187185]: 2025-11-29 06:56:14.442 187189 DEBUG oslo_concurrency.processutils [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:56:14 compute-0 nova_compute[187185]: 2025-11-29 06:56:14.444 187189 DEBUG nova.virt.disk.api [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Cannot resize image /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 06:56:14 compute-0 nova_compute[187185]: 2025-11-29 06:56:14.445 187189 DEBUG nova.objects.instance [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lazy-loading 'migration_context' on Instance uuid 6aebe65a-3191-4d58-acfd-8d663b9b0a8e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:56:14 compute-0 nova_compute[187185]: 2025-11-29 06:56:14.463 187189 DEBUG nova.virt.libvirt.driver [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 06:56:14 compute-0 nova_compute[187185]: 2025-11-29 06:56:14.463 187189 DEBUG nova.virt.libvirt.driver [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Ensure instance console log exists: /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 06:56:14 compute-0 nova_compute[187185]: 2025-11-29 06:56:14.464 187189 DEBUG oslo_concurrency.lockutils [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:56:14 compute-0 nova_compute[187185]: 2025-11-29 06:56:14.464 187189 DEBUG oslo_concurrency.lockutils [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:56:14 compute-0 nova_compute[187185]: 2025-11-29 06:56:14.464 187189 DEBUG oslo_concurrency.lockutils [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:56:15 compute-0 nova_compute[187185]: 2025-11-29 06:56:15.244 187189 DEBUG nova.network.neutron [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Nov 29 06:56:15 compute-0 nova_compute[187185]: 2025-11-29 06:56:15.245 187189 DEBUG nova.compute.manager [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 06:56:15 compute-0 nova_compute[187185]: 2025-11-29 06:56:15.248 187189 DEBUG nova.virt.libvirt.driver [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 06:56:15 compute-0 nova_compute[187185]: 2025-11-29 06:56:15.254 187189 WARNING nova.virt.libvirt.driver [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:56:15 compute-0 nova_compute[187185]: 2025-11-29 06:56:15.260 187189 DEBUG nova.virt.libvirt.host [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 06:56:15 compute-0 nova_compute[187185]: 2025-11-29 06:56:15.260 187189 DEBUG nova.virt.libvirt.host [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 06:56:15 compute-0 nova_compute[187185]: 2025-11-29 06:56:15.263 187189 DEBUG nova.virt.libvirt.host [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 06:56:15 compute-0 nova_compute[187185]: 2025-11-29 06:56:15.263 187189 DEBUG nova.virt.libvirt.host [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 06:56:15 compute-0 nova_compute[187185]: 2025-11-29 06:56:15.264 187189 DEBUG nova.virt.libvirt.driver [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 06:56:15 compute-0 nova_compute[187185]: 2025-11-29 06:56:15.265 187189 DEBUG nova.virt.hardware [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:56:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='268377f2-662f-436c-9b23-42f8b1992a21',id=30,is_public=True,memory_mb=128,name='tempest-test_resize_flavor_-1559473648',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 06:56:15 compute-0 nova_compute[187185]: 2025-11-29 06:56:15.265 187189 DEBUG nova.virt.hardware [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 06:56:15 compute-0 nova_compute[187185]: 2025-11-29 06:56:15.265 187189 DEBUG nova.virt.hardware [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 06:56:15 compute-0 nova_compute[187185]: 2025-11-29 06:56:15.266 187189 DEBUG nova.virt.hardware [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 06:56:15 compute-0 nova_compute[187185]: 2025-11-29 06:56:15.266 187189 DEBUG nova.virt.hardware [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 06:56:15 compute-0 nova_compute[187185]: 2025-11-29 06:56:15.266 187189 DEBUG nova.virt.hardware [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 06:56:15 compute-0 nova_compute[187185]: 2025-11-29 06:56:15.266 187189 DEBUG nova.virt.hardware [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 06:56:15 compute-0 nova_compute[187185]: 2025-11-29 06:56:15.266 187189 DEBUG nova.virt.hardware [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 06:56:15 compute-0 nova_compute[187185]: 2025-11-29 06:56:15.267 187189 DEBUG nova.virt.hardware [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 06:56:15 compute-0 nova_compute[187185]: 2025-11-29 06:56:15.267 187189 DEBUG nova.virt.hardware [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 06:56:15 compute-0 nova_compute[187185]: 2025-11-29 06:56:15.267 187189 DEBUG nova.virt.hardware [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 06:56:15 compute-0 nova_compute[187185]: 2025-11-29 06:56:15.272 187189 DEBUG nova.objects.instance [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6aebe65a-3191-4d58-acfd-8d663b9b0a8e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:56:15 compute-0 nova_compute[187185]: 2025-11-29 06:56:15.291 187189 DEBUG nova.virt.libvirt.driver [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] End _get_guest_xml xml=<domain type="kvm">
Nov 29 06:56:15 compute-0 nova_compute[187185]:   <uuid>6aebe65a-3191-4d58-acfd-8d663b9b0a8e</uuid>
Nov 29 06:56:15 compute-0 nova_compute[187185]:   <name>instance-00000024</name>
Nov 29 06:56:15 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 06:56:15 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 06:56:15 compute-0 nova_compute[187185]:   <metadata>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 06:56:15 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:       <nova:name>tempest-MigrationsAdminTest-server-1402593290</nova:name>
Nov 29 06:56:15 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 06:56:15</nova:creationTime>
Nov 29 06:56:15 compute-0 nova_compute[187185]:       <nova:flavor name="tempest-test_resize_flavor_-1559473648">
Nov 29 06:56:15 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 06:56:15 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 06:56:15 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 06:56:15 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 06:56:15 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 06:56:15 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 06:56:15 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 06:56:15 compute-0 nova_compute[187185]:         <nova:user uuid="53ee944c04484336b9b14d84235a62b8">tempest-MigrationsAdminTest-1601255173-project-member</nova:user>
Nov 29 06:56:15 compute-0 nova_compute[187185]:         <nova:project uuid="890f94a625b342fdb17128922403c925">tempest-MigrationsAdminTest-1601255173</nova:project>
Nov 29 06:56:15 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 06:56:15 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:       <nova:ports/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 06:56:15 compute-0 nova_compute[187185]:   </metadata>
Nov 29 06:56:15 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 06:56:15 compute-0 nova_compute[187185]:     <system>
Nov 29 06:56:15 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 06:56:15 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 06:56:15 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 06:56:15 compute-0 nova_compute[187185]:       <entry name="serial">6aebe65a-3191-4d58-acfd-8d663b9b0a8e</entry>
Nov 29 06:56:15 compute-0 nova_compute[187185]:       <entry name="uuid">6aebe65a-3191-4d58-acfd-8d663b9b0a8e</entry>
Nov 29 06:56:15 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     </system>
Nov 29 06:56:15 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 06:56:15 compute-0 nova_compute[187185]:   <os>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:   </os>
Nov 29 06:56:15 compute-0 nova_compute[187185]:   <features>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     <apic/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:   </features>
Nov 29 06:56:15 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 06:56:15 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:   </clock>
Nov 29 06:56:15 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 06:56:15 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:   </cpu>
Nov 29 06:56:15 compute-0 nova_compute[187185]:   <devices>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 06:56:15 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk"/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     </disk>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 06:56:15 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk.config"/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     </disk>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 06:56:15 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/console.log" append="off"/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     </serial>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     <video>
Nov 29 06:56:15 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     </video>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 06:56:15 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     </rng>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 06:56:15 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 06:56:15 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 06:56:15 compute-0 nova_compute[187185]:   </devices>
Nov 29 06:56:15 compute-0 nova_compute[187185]: </domain>
Nov 29 06:56:15 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 06:56:15 compute-0 nova_compute[187185]: 2025-11-29 06:56:15.528 187189 DEBUG nova.virt.libvirt.driver [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 06:56:15 compute-0 nova_compute[187185]: 2025-11-29 06:56:15.528 187189 DEBUG nova.virt.libvirt.driver [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 06:56:15 compute-0 nova_compute[187185]: 2025-11-29 06:56:15.529 187189 INFO nova.virt.libvirt.driver [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Using config drive
Nov 29 06:56:16 compute-0 nova_compute[187185]: 2025-11-29 06:56:16.347 187189 INFO nova.virt.libvirt.driver [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Creating config drive at /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk.config
Nov 29 06:56:16 compute-0 nova_compute[187185]: 2025-11-29 06:56:16.357 187189 DEBUG oslo_concurrency.processutils [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptmx1xz1o execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:56:16 compute-0 nova_compute[187185]: 2025-11-29 06:56:16.503 187189 DEBUG oslo_concurrency.processutils [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptmx1xz1o" returned: 0 in 0.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:56:16 compute-0 systemd-machined[153486]: New machine qemu-13-instance-00000024.
Nov 29 06:56:16 compute-0 systemd[1]: Started Virtual Machine qemu-13-instance-00000024.
Nov 29 06:56:16 compute-0 podman[218642]: 2025-11-29 06:56:16.708661173 +0000 UTC m=+0.092270429 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:56:16 compute-0 nova_compute[187185]: 2025-11-29 06:56:16.884 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:56:17 compute-0 nova_compute[187185]: 2025-11-29 06:56:17.187 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399377.1869352, 6aebe65a-3191-4d58-acfd-8d663b9b0a8e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:56:17 compute-0 nova_compute[187185]: 2025-11-29 06:56:17.189 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] VM Resumed (Lifecycle Event)
Nov 29 06:56:17 compute-0 nova_compute[187185]: 2025-11-29 06:56:17.196 187189 DEBUG nova.compute.manager [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 06:56:17 compute-0 nova_compute[187185]: 2025-11-29 06:56:17.197 187189 DEBUG nova.virt.libvirt.driver [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 06:56:17 compute-0 nova_compute[187185]: 2025-11-29 06:56:17.204 187189 INFO nova.virt.libvirt.driver [-] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Instance spawned successfully.
Nov 29 06:56:17 compute-0 nova_compute[187185]: 2025-11-29 06:56:17.205 187189 DEBUG nova.virt.libvirt.driver [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 06:56:17 compute-0 nova_compute[187185]: 2025-11-29 06:56:17.219 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:56:17 compute-0 nova_compute[187185]: 2025-11-29 06:56:17.228 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 06:56:17 compute-0 nova_compute[187185]: 2025-11-29 06:56:17.235 187189 DEBUG nova.virt.libvirt.driver [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:56:17 compute-0 nova_compute[187185]: 2025-11-29 06:56:17.236 187189 DEBUG nova.virt.libvirt.driver [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:56:17 compute-0 nova_compute[187185]: 2025-11-29 06:56:17.237 187189 DEBUG nova.virt.libvirt.driver [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:56:17 compute-0 nova_compute[187185]: 2025-11-29 06:56:17.238 187189 DEBUG nova.virt.libvirt.driver [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:56:17 compute-0 nova_compute[187185]: 2025-11-29 06:56:17.239 187189 DEBUG nova.virt.libvirt.driver [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:56:17 compute-0 nova_compute[187185]: 2025-11-29 06:56:17.240 187189 DEBUG nova.virt.libvirt.driver [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:56:17 compute-0 nova_compute[187185]: 2025-11-29 06:56:17.250 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 06:56:17 compute-0 nova_compute[187185]: 2025-11-29 06:56:17.251 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399377.1886523, 6aebe65a-3191-4d58-acfd-8d663b9b0a8e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:56:17 compute-0 nova_compute[187185]: 2025-11-29 06:56:17.251 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] VM Started (Lifecycle Event)
Nov 29 06:56:17 compute-0 nova_compute[187185]: 2025-11-29 06:56:17.275 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:56:17 compute-0 nova_compute[187185]: 2025-11-29 06:56:17.282 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 06:56:17 compute-0 nova_compute[187185]: 2025-11-29 06:56:17.314 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 06:56:17 compute-0 nova_compute[187185]: 2025-11-29 06:56:17.328 187189 INFO nova.compute.manager [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Took 3.53 seconds to spawn the instance on the hypervisor.
Nov 29 06:56:17 compute-0 nova_compute[187185]: 2025-11-29 06:56:17.330 187189 DEBUG nova.compute.manager [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:56:17 compute-0 nova_compute[187185]: 2025-11-29 06:56:17.434 187189 INFO nova.compute.manager [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Took 4.00 seconds to build instance.
Nov 29 06:56:17 compute-0 nova_compute[187185]: 2025-11-29 06:56:17.459 187189 DEBUG oslo_concurrency.lockutils [None req-e3dc5b8e-dc56-4f12-a4f0-2364bc707ffa 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "6aebe65a-3191-4d58-acfd-8d663b9b0a8e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:56:18 compute-0 nova_compute[187185]: 2025-11-29 06:56:18.290 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:56:20 compute-0 nova_compute[187185]: 2025-11-29 06:56:20.471 187189 DEBUG oslo_concurrency.lockutils [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "refresh_cache-6aebe65a-3191-4d58-acfd-8d663b9b0a8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:56:20 compute-0 nova_compute[187185]: 2025-11-29 06:56:20.471 187189 DEBUG oslo_concurrency.lockutils [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquired lock "refresh_cache-6aebe65a-3191-4d58-acfd-8d663b9b0a8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:56:20 compute-0 nova_compute[187185]: 2025-11-29 06:56:20.471 187189 DEBUG nova.network.neutron [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 06:56:20 compute-0 nova_compute[187185]: 2025-11-29 06:56:20.773 187189 DEBUG nova.network.neutron [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 06:56:21 compute-0 nova_compute[187185]: 2025-11-29 06:56:21.089 187189 DEBUG nova.network.neutron [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:56:21 compute-0 nova_compute[187185]: 2025-11-29 06:56:21.112 187189 DEBUG oslo_concurrency.lockutils [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Releasing lock "refresh_cache-6aebe65a-3191-4d58-acfd-8d663b9b0a8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:56:21 compute-0 nova_compute[187185]: 2025-11-29 06:56:21.335 187189 DEBUG nova.virt.libvirt.driver [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Nov 29 06:56:21 compute-0 nova_compute[187185]: 2025-11-29 06:56:21.335 187189 DEBUG nova.virt.libvirt.volume.remotefs [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Creating file /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/c18aba771c0346f1aeafae7d620d0bd1.tmp on remote host 192.168.122.101 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Nov 29 06:56:21 compute-0 nova_compute[187185]: 2025-11-29 06:56:21.336 187189 DEBUG oslo_concurrency.processutils [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/c18aba771c0346f1aeafae7d620d0bd1.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:56:21 compute-0 nova_compute[187185]: 2025-11-29 06:56:21.887 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:56:22 compute-0 nova_compute[187185]: 2025-11-29 06:56:22.078 187189 DEBUG oslo_concurrency.processutils [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/c18aba771c0346f1aeafae7d620d0bd1.tmp" returned: 1 in 0.742s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:56:22 compute-0 nova_compute[187185]: 2025-11-29 06:56:22.079 187189 DEBUG oslo_concurrency.processutils [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] 'ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/c18aba771c0346f1aeafae7d620d0bd1.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Nov 29 06:56:22 compute-0 nova_compute[187185]: 2025-11-29 06:56:22.080 187189 DEBUG nova.virt.libvirt.volume.remotefs [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Creating directory /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e on remote host 192.168.122.101 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Nov 29 06:56:22 compute-0 nova_compute[187185]: 2025-11-29 06:56:22.080 187189 DEBUG oslo_concurrency.processutils [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:56:22 compute-0 nova_compute[187185]: 2025-11-29 06:56:22.346 187189 DEBUG oslo_concurrency.processutils [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e" returned: 0 in 0.265s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:56:22 compute-0 nova_compute[187185]: 2025-11-29 06:56:22.350 187189 DEBUG nova.virt.libvirt.driver [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Nov 29 06:56:23 compute-0 nova_compute[187185]: 2025-11-29 06:56:23.299 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:56:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:56:24.817 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:56:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:56:24.818 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:56:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:56:24.818 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:56:26 compute-0 nova_compute[187185]: 2025-11-29 06:56:26.890 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:56:27 compute-0 sshd-session[218681]: Received disconnect from 103.179.56.44 port 57404:11: Bye Bye [preauth]
Nov 29 06:56:27 compute-0 sshd-session[218681]: Disconnected from authenticating user root 103.179.56.44 port 57404 [preauth]
Nov 29 06:56:28 compute-0 nova_compute[187185]: 2025-11-29 06:56:28.299 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:56:28 compute-0 podman[218701]: 2025-11-29 06:56:28.815763134 +0000 UTC m=+0.062777418 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 06:56:28 compute-0 podman[218690]: 2025-11-29 06:56:28.816123245 +0000 UTC m=+0.069533489 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 29 06:56:28 compute-0 podman[218700]: 2025-11-29 06:56:28.826763764 +0000 UTC m=+0.075807245 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, vcs-type=git, maintainer=Red Hat, Inc., version=9.6, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, release=1755695350, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 29 06:56:31 compute-0 nova_compute[187185]: 2025-11-29 06:56:31.895 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:56:32 compute-0 nova_compute[187185]: 2025-11-29 06:56:32.412 187189 DEBUG nova.virt.libvirt.driver [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Nov 29 06:56:33 compute-0 nova_compute[187185]: 2025-11-29 06:56:33.302 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:56:34 compute-0 podman[218760]: 2025-11-29 06:56:34.909400175 +0000 UTC m=+0.167022693 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:56:35 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000024.scope: Deactivated successfully.
Nov 29 06:56:35 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000024.scope: Consumed 13.119s CPU time.
Nov 29 06:56:35 compute-0 systemd-machined[153486]: Machine qemu-13-instance-00000024 terminated.
Nov 29 06:56:35 compute-0 nova_compute[187185]: 2025-11-29 06:56:35.682 187189 INFO nova.virt.libvirt.driver [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Instance shutdown successfully after 13 seconds.
Nov 29 06:56:35 compute-0 nova_compute[187185]: 2025-11-29 06:56:35.688 187189 INFO nova.virt.libvirt.driver [-] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Instance destroyed successfully.
Nov 29 06:56:35 compute-0 nova_compute[187185]: 2025-11-29 06:56:35.694 187189 DEBUG oslo_concurrency.processutils [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:56:35 compute-0 nova_compute[187185]: 2025-11-29 06:56:35.793 187189 DEBUG oslo_concurrency.processutils [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk --force-share --output=json" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:56:35 compute-0 nova_compute[187185]: 2025-11-29 06:56:35.796 187189 DEBUG oslo_concurrency.processutils [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:56:35 compute-0 nova_compute[187185]: 2025-11-29 06:56:35.852 187189 DEBUG oslo_concurrency.processutils [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:56:35 compute-0 nova_compute[187185]: 2025-11-29 06:56:35.855 187189 DEBUG nova.virt.libvirt.volume.remotefs [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Copying file /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e_resize/disk to 192.168.122.101:/var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 29 06:56:35 compute-0 nova_compute[187185]: 2025-11-29 06:56:35.856 187189 DEBUG oslo_concurrency.processutils [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e_resize/disk 192.168.122.101:/var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:56:36 compute-0 nova_compute[187185]: 2025-11-29 06:56:36.790 187189 DEBUG oslo_concurrency.processutils [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CMD "scp -r /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e_resize/disk 192.168.122.101:/var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk" returned: 0 in 0.934s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:56:36 compute-0 nova_compute[187185]: 2025-11-29 06:56:36.793 187189 DEBUG nova.virt.libvirt.volume.remotefs [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Copying file /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e_resize/disk.config to 192.168.122.101:/var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 29 06:56:36 compute-0 nova_compute[187185]: 2025-11-29 06:56:36.793 187189 DEBUG oslo_concurrency.processutils [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e_resize/disk.config 192.168.122.101:/var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:56:36 compute-0 nova_compute[187185]: 2025-11-29 06:56:36.899 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:56:37 compute-0 nova_compute[187185]: 2025-11-29 06:56:37.054 187189 DEBUG oslo_concurrency.processutils [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CMD "scp -C -r /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e_resize/disk.config 192.168.122.101:/var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk.config" returned: 0 in 0.260s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:56:37 compute-0 nova_compute[187185]: 2025-11-29 06:56:37.055 187189 DEBUG nova.virt.libvirt.volume.remotefs [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Copying file /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e_resize/disk.info to 192.168.122.101:/var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 29 06:56:37 compute-0 nova_compute[187185]: 2025-11-29 06:56:37.056 187189 DEBUG oslo_concurrency.processutils [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e_resize/disk.info 192.168.122.101:/var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:56:37 compute-0 nova_compute[187185]: 2025-11-29 06:56:37.384 187189 DEBUG oslo_concurrency.processutils [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CMD "scp -C -r /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e_resize/disk.info 192.168.122.101:/var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk.info" returned: 0 in 0.328s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:56:37 compute-0 nova_compute[187185]: 2025-11-29 06:56:37.524 187189 DEBUG oslo_concurrency.lockutils [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "6aebe65a-3191-4d58-acfd-8d663b9b0a8e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:56:37 compute-0 nova_compute[187185]: 2025-11-29 06:56:37.525 187189 DEBUG oslo_concurrency.lockutils [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "6aebe65a-3191-4d58-acfd-8d663b9b0a8e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:56:37 compute-0 nova_compute[187185]: 2025-11-29 06:56:37.526 187189 DEBUG oslo_concurrency.lockutils [None req-aa8115cb-90ff-46bd-8ce5-1543db8f5c1a 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "6aebe65a-3191-4d58-acfd-8d663b9b0a8e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:56:37 compute-0 podman[218808]: 2025-11-29 06:56:37.847676014 +0000 UTC m=+0.090763756 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 06:56:37 compute-0 podman[218807]: 2025-11-29 06:56:37.866877085 +0000 UTC m=+0.116401458 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 06:56:38 compute-0 nova_compute[187185]: 2025-11-29 06:56:38.342 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:56:41 compute-0 nova_compute[187185]: 2025-11-29 06:56:41.903 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:56:42 compute-0 nova_compute[187185]: 2025-11-29 06:56:42.780 187189 INFO nova.compute.manager [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Swapping old allocation on dict_keys(['4e39a026-df39-4e20-874a-dbb5a40df044']) held by migration b7911716-661a-44a8-8b8d-33fa6a185908 for instance
Nov 29 06:56:42 compute-0 nova_compute[187185]: 2025-11-29 06:56:42.814 187189 DEBUG nova.scheduler.client.report [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Overwriting current allocation {'allocations': {'1c526389-06f6-4ffd-8e90-a84c6c39f0bc': {'resources': {'VCPU': 1, 'MEMORY_MB': 192, 'DISK_GB': 1}, 'generation': 27}}, 'project_id': '890f94a625b342fdb17128922403c925', 'user_id': '53ee944c04484336b9b14d84235a62b8', 'consumer_generation': 1} on consumer 6aebe65a-3191-4d58-acfd-8d663b9b0a8e move_allocations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:2018
Nov 29 06:56:43 compute-0 nova_compute[187185]: 2025-11-29 06:56:43.111 187189 DEBUG oslo_concurrency.lockutils [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "refresh_cache-6aebe65a-3191-4d58-acfd-8d663b9b0a8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:56:43 compute-0 nova_compute[187185]: 2025-11-29 06:56:43.112 187189 DEBUG oslo_concurrency.lockutils [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquired lock "refresh_cache-6aebe65a-3191-4d58-acfd-8d663b9b0a8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:56:43 compute-0 nova_compute[187185]: 2025-11-29 06:56:43.112 187189 DEBUG nova.network.neutron [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 06:56:43 compute-0 nova_compute[187185]: 2025-11-29 06:56:43.329 187189 DEBUG nova.network.neutron [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 06:56:43 compute-0 nova_compute[187185]: 2025-11-29 06:56:43.345 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:56:43 compute-0 nova_compute[187185]: 2025-11-29 06:56:43.704 187189 DEBUG nova.network.neutron [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:56:43 compute-0 nova_compute[187185]: 2025-11-29 06:56:43.722 187189 DEBUG oslo_concurrency.lockutils [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Releasing lock "refresh_cache-6aebe65a-3191-4d58-acfd-8d663b9b0a8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:56:43 compute-0 nova_compute[187185]: 2025-11-29 06:56:43.723 187189 DEBUG nova.virt.libvirt.driver [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Starting finish_revert_migration finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11843
Nov 29 06:56:43 compute-0 nova_compute[187185]: 2025-11-29 06:56:43.735 187189 DEBUG nova.virt.libvirt.driver [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 06:56:43 compute-0 nova_compute[187185]: 2025-11-29 06:56:43.743 187189 WARNING nova.virt.libvirt.driver [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:56:43 compute-0 nova_compute[187185]: 2025-11-29 06:56:43.758 187189 DEBUG nova.virt.libvirt.host [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 06:56:43 compute-0 nova_compute[187185]: 2025-11-29 06:56:43.760 187189 DEBUG nova.virt.libvirt.host [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 06:56:43 compute-0 nova_compute[187185]: 2025-11-29 06:56:43.766 187189 DEBUG nova.virt.libvirt.host [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 06:56:43 compute-0 nova_compute[187185]: 2025-11-29 06:56:43.767 187189 DEBUG nova.virt.libvirt.host [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 06:56:43 compute-0 nova_compute[187185]: 2025-11-29 06:56:43.769 187189 DEBUG nova.virt.libvirt.driver [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 06:56:43 compute-0 nova_compute[187185]: 2025-11-29 06:56:43.769 187189 DEBUG nova.virt.hardware [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:56:10Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='268377f2-662f-436c-9b23-42f8b1992a21',id=30,is_public=True,memory_mb=128,name='tempest-test_resize_flavor_-1559473648',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 06:56:43 compute-0 nova_compute[187185]: 2025-11-29 06:56:43.770 187189 DEBUG nova.virt.hardware [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 06:56:43 compute-0 nova_compute[187185]: 2025-11-29 06:56:43.771 187189 DEBUG nova.virt.hardware [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 06:56:43 compute-0 nova_compute[187185]: 2025-11-29 06:56:43.772 187189 DEBUG nova.virt.hardware [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 06:56:43 compute-0 nova_compute[187185]: 2025-11-29 06:56:43.772 187189 DEBUG nova.virt.hardware [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 06:56:43 compute-0 nova_compute[187185]: 2025-11-29 06:56:43.773 187189 DEBUG nova.virt.hardware [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 06:56:43 compute-0 nova_compute[187185]: 2025-11-29 06:56:43.773 187189 DEBUG nova.virt.hardware [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 06:56:43 compute-0 nova_compute[187185]: 2025-11-29 06:56:43.774 187189 DEBUG nova.virt.hardware [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 06:56:43 compute-0 nova_compute[187185]: 2025-11-29 06:56:43.774 187189 DEBUG nova.virt.hardware [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 06:56:43 compute-0 nova_compute[187185]: 2025-11-29 06:56:43.775 187189 DEBUG nova.virt.hardware [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 06:56:43 compute-0 nova_compute[187185]: 2025-11-29 06:56:43.775 187189 DEBUG nova.virt.hardware [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 06:56:43 compute-0 nova_compute[187185]: 2025-11-29 06:56:43.776 187189 DEBUG nova.objects.instance [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 6aebe65a-3191-4d58-acfd-8d663b9b0a8e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:56:43 compute-0 nova_compute[187185]: 2025-11-29 06:56:43.801 187189 DEBUG oslo_concurrency.processutils [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:56:43 compute-0 nova_compute[187185]: 2025-11-29 06:56:43.902 187189 DEBUG oslo_concurrency.processutils [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk.config --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:56:43 compute-0 nova_compute[187185]: 2025-11-29 06:56:43.904 187189 DEBUG oslo_concurrency.lockutils [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "/var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:56:43 compute-0 nova_compute[187185]: 2025-11-29 06:56:43.905 187189 DEBUG oslo_concurrency.lockutils [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "/var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:56:43 compute-0 nova_compute[187185]: 2025-11-29 06:56:43.907 187189 DEBUG oslo_concurrency.lockutils [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "/var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:56:43 compute-0 nova_compute[187185]: 2025-11-29 06:56:43.912 187189 DEBUG nova.virt.libvirt.driver [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] End _get_guest_xml xml=<domain type="kvm">
Nov 29 06:56:43 compute-0 nova_compute[187185]:   <uuid>6aebe65a-3191-4d58-acfd-8d663b9b0a8e</uuid>
Nov 29 06:56:43 compute-0 nova_compute[187185]:   <name>instance-00000024</name>
Nov 29 06:56:43 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 06:56:43 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 06:56:43 compute-0 nova_compute[187185]:   <metadata>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 06:56:43 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:       <nova:name>tempest-MigrationsAdminTest-server-1402593290</nova:name>
Nov 29 06:56:43 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 06:56:43</nova:creationTime>
Nov 29 06:56:43 compute-0 nova_compute[187185]:       <nova:flavor name="tempest-test_resize_flavor_-1559473648">
Nov 29 06:56:43 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 06:56:43 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 06:56:43 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 06:56:43 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 06:56:43 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 06:56:43 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 06:56:43 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 06:56:43 compute-0 nova_compute[187185]:         <nova:user uuid="53ee944c04484336b9b14d84235a62b8">tempest-MigrationsAdminTest-1601255173-project-member</nova:user>
Nov 29 06:56:43 compute-0 nova_compute[187185]:         <nova:project uuid="890f94a625b342fdb17128922403c925">tempest-MigrationsAdminTest-1601255173</nova:project>
Nov 29 06:56:43 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 06:56:43 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:       <nova:ports/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 06:56:43 compute-0 nova_compute[187185]:   </metadata>
Nov 29 06:56:43 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <system>
Nov 29 06:56:43 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 06:56:43 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 06:56:43 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 06:56:43 compute-0 nova_compute[187185]:       <entry name="serial">6aebe65a-3191-4d58-acfd-8d663b9b0a8e</entry>
Nov 29 06:56:43 compute-0 nova_compute[187185]:       <entry name="uuid">6aebe65a-3191-4d58-acfd-8d663b9b0a8e</entry>
Nov 29 06:56:43 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     </system>
Nov 29 06:56:43 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 06:56:43 compute-0 nova_compute[187185]:   <os>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:   </os>
Nov 29 06:56:43 compute-0 nova_compute[187185]:   <features>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <apic/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:   </features>
Nov 29 06:56:43 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:   </clock>
Nov 29 06:56:43 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:   </cpu>
Nov 29 06:56:43 compute-0 nova_compute[187185]:   <devices>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 06:56:43 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk"/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     </disk>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 06:56:43 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk.config"/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     </disk>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 06:56:43 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/console.log" append="off"/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     </serial>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <video>
Nov 29 06:56:43 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     </video>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <input type="keyboard" bus="usb"/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 06:56:43 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     </rng>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 06:56:43 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 06:56:43 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 06:56:43 compute-0 nova_compute[187185]:   </devices>
Nov 29 06:56:43 compute-0 nova_compute[187185]: </domain>
Nov 29 06:56:43 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 06:56:44 compute-0 nova_compute[187185]: 2025-11-29 06:56:44.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:56:44 compute-0 nova_compute[187185]: 2025-11-29 06:56:44.321 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 06:56:44 compute-0 nova_compute[187185]: 2025-11-29 06:56:44.321 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 06:56:44 compute-0 nova_compute[187185]: 2025-11-29 06:56:44.352 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "refresh_cache-6aebe65a-3191-4d58-acfd-8d663b9b0a8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:56:44 compute-0 nova_compute[187185]: 2025-11-29 06:56:44.353 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquired lock "refresh_cache-6aebe65a-3191-4d58-acfd-8d663b9b0a8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:56:44 compute-0 nova_compute[187185]: 2025-11-29 06:56:44.353 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 29 06:56:44 compute-0 nova_compute[187185]: 2025-11-29 06:56:44.353 187189 DEBUG nova.objects.instance [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 6aebe65a-3191-4d58-acfd-8d663b9b0a8e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:56:44 compute-0 systemd-machined[153486]: New machine qemu-14-instance-00000024.
Nov 29 06:56:44 compute-0 systemd[1]: Started Virtual Machine qemu-14-instance-00000024.
Nov 29 06:56:44 compute-0 nova_compute[187185]: 2025-11-29 06:56:44.576 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 06:56:45 compute-0 nova_compute[187185]: 2025-11-29 06:56:45.048 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:56:45 compute-0 nova_compute[187185]: 2025-11-29 06:56:45.064 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Releasing lock "refresh_cache-6aebe65a-3191-4d58-acfd-8d663b9b0a8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:56:45 compute-0 nova_compute[187185]: 2025-11-29 06:56:45.064 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 29 06:56:45 compute-0 nova_compute[187185]: 2025-11-29 06:56:45.064 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:56:45 compute-0 nova_compute[187185]: 2025-11-29 06:56:45.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:56:45 compute-0 nova_compute[187185]: 2025-11-29 06:56:45.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 29 06:56:45 compute-0 nova_compute[187185]: 2025-11-29 06:56:45.784 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:56:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:56:45.781 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 06:56:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:56:45.792 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 06:56:46 compute-0 nova_compute[187185]: 2025-11-29 06:56:46.343 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:56:46 compute-0 nova_compute[187185]: 2025-11-29 06:56:46.345 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:56:46 compute-0 nova_compute[187185]: 2025-11-29 06:56:46.345 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:56:46 compute-0 nova_compute[187185]: 2025-11-29 06:56:46.393 187189 DEBUG nova.virt.libvirt.host [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Removed pending event for 6aebe65a-3191-4d58-acfd-8d663b9b0a8e due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Nov 29 06:56:46 compute-0 nova_compute[187185]: 2025-11-29 06:56:46.394 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399406.3900352, 6aebe65a-3191-4d58-acfd-8d663b9b0a8e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:56:46 compute-0 nova_compute[187185]: 2025-11-29 06:56:46.395 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] VM Resumed (Lifecycle Event)
Nov 29 06:56:46 compute-0 nova_compute[187185]: 2025-11-29 06:56:46.398 187189 DEBUG nova.compute.manager [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 06:56:46 compute-0 nova_compute[187185]: 2025-11-29 06:56:46.406 187189 INFO nova.virt.libvirt.driver [-] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Instance running successfully.
Nov 29 06:56:46 compute-0 nova_compute[187185]: 2025-11-29 06:56:46.406 187189 DEBUG nova.virt.libvirt.driver [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] finish_revert_migration finished successfully. finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11887
Nov 29 06:56:46 compute-0 nova_compute[187185]: 2025-11-29 06:56:46.416 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:56:46 compute-0 nova_compute[187185]: 2025-11-29 06:56:46.421 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 06:56:46 compute-0 nova_compute[187185]: 2025-11-29 06:56:46.685 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] During sync_power_state the instance has a pending task (resize_reverting). Skip.
Nov 29 06:56:46 compute-0 nova_compute[187185]: 2025-11-29 06:56:46.686 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399406.3944225, 6aebe65a-3191-4d58-acfd-8d663b9b0a8e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:56:46 compute-0 nova_compute[187185]: 2025-11-29 06:56:46.686 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] VM Started (Lifecycle Event)
Nov 29 06:56:46 compute-0 nova_compute[187185]: 2025-11-29 06:56:46.718 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:56:46 compute-0 nova_compute[187185]: 2025-11-29 06:56:46.723 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 06:56:46 compute-0 nova_compute[187185]: 2025-11-29 06:56:46.759 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] During sync_power_state the instance has a pending task (resize_reverting). Skip.
Nov 29 06:56:46 compute-0 nova_compute[187185]: 2025-11-29 06:56:46.783 187189 INFO nova.compute.manager [None req-ec46e3b9-5b08-4c8f-9cb9-30a00987bd8d 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Updating instance to original state: 'active'
Nov 29 06:56:46 compute-0 nova_compute[187185]: 2025-11-29 06:56:46.906 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:56:47 compute-0 podman[218881]: 2025-11-29 06:56:47.817960267 +0000 UTC m=+0.081203607 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 29 06:56:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:47.995 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}dd148b1b64568516e8e9c4f7dca4b2c96ed9e7a6d38f1387503b8184f9bf9013" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.161 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 644 Content-Type: application/json Date: Sat, 29 Nov 2025 06:56:48 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-d324a14f-77be-4c02-9c78-b65514ee9539 x-openstack-request-id: req-d324a14f-77be-4c02-9c78-b65514ee9539 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.161 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31"}]}, {"id": "e29df891-dca5-4a1c-9258-dc512a46956f", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/e29df891-dca5-4a1c-9258-dc512a46956f"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/e29df891-dca5-4a1c-9258-dc512a46956f"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.162 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-d324a14f-77be-4c02-9c78-b65514ee9539 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.165 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '6aebe65a-3191-4d58-acfd-8d663b9b0a8e', 'name': 'tempest-MigrationsAdminTest-server-1402593290', 'flavor': {'id': 'tempest-test_resize_flavor_-1559473648', 'name': 'tempest-test_resize_flavor_-1559473648', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000024', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '890f94a625b342fdb17128922403c925', 'user_id': '53ee944c04484336b9b14d84235a62b8', 'hostId': '5d2d68639a181e87249e9aeb15eeec6cf8d9527b057e75f6fc9bad8c', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.166 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.169 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.202 12 DEBUG ceilometer.compute.pollsters [-] 6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk.device.read.latency volume: 210142231 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.203 12 DEBUG ceilometer.compute.pollsters [-] 6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk.device.read.latency volume: 408012 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8145108a-c523-46bc-934e-076b0bca0915', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 210142231, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': '6aebe65a-3191-4d58-acfd-8d663b9b0a8e-vda', 'timestamp': '2025-11-29T06:56:48.169929', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1402593290', 'name': 'instance-00000024', 'instance_id': '6aebe65a-3191-4d58-acfd-8d663b9b0a8e', 'instance_type': 'tempest-test_resize_flavor_-1559473648', 'host': '5d2d68639a181e87249e9aeb15eeec6cf8d9527b057e75f6fc9bad8c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-1559473648', 'name': 'tempest-test_resize_flavor_-1559473648', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '932ef110-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4855.888166037, 'message_signature': 'e4bb8231da29def2cfbb8bcdd22d1ed2f5be5b6415f458b2dc18349f47306874'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 408012, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': '6aebe65a-3191-4d58-acfd-8d663b9b0a8e-sda', 'timestamp': '2025-11-29T06:56:48.169929', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1402593290', 'name': 'instance-00000024', 'instance_id': '6aebe65a-3191-4d58-acfd-8d663b9b0a8e', 'instance_type': 'tempest-test_resize_flavor_-1559473648', 'host': '5d2d68639a181e87249e9aeb15eeec6cf8d9527b057e75f6fc9bad8c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-1559473648', 'name': 'tempest-test_resize_flavor_-1559473648', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '932f18c0-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4855.888166037, 'message_signature': 'f7575ab929499e2074bb4be4eb225901cc41c6b54e61af4a2009994f657b629a'}]}, 'timestamp': '2025-11-29 06:56:48.204420', '_unique_id': '9b433010451246df855298bf642a96d8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.207 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.210 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.210 12 DEBUG ceilometer.compute.pollsters [-] 6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.211 12 DEBUG ceilometer.compute.pollsters [-] 6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '40684957-ecb6-4e93-9a20-6ba373e7ae11', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': '6aebe65a-3191-4d58-acfd-8d663b9b0a8e-vda', 'timestamp': '2025-11-29T06:56:48.210482', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1402593290', 'name': 'instance-00000024', 'instance_id': '6aebe65a-3191-4d58-acfd-8d663b9b0a8e', 'instance_type': 'tempest-test_resize_flavor_-1559473648', 'host': '5d2d68639a181e87249e9aeb15eeec6cf8d9527b057e75f6fc9bad8c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-1559473648', 'name': 'tempest-test_resize_flavor_-1559473648', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '93302198-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4855.888166037, 'message_signature': '644fe5110d3a0c6c85db7e70dde31895f5b967fe5b083679217f15512c2aef7f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': '6aebe65a-3191-4d58-acfd-8d663b9b0a8e-sda', 'timestamp': '2025-11-29T06:56:48.210482', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1402593290', 'name': 'instance-00000024', 'instance_id': '6aebe65a-3191-4d58-acfd-8d663b9b0a8e', 'instance_type': 'tempest-test_resize_flavor_-1559473648', 'host': '5d2d68639a181e87249e9aeb15eeec6cf8d9527b057e75f6fc9bad8c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-1559473648', 'name': 'tempest-test_resize_flavor_-1559473648', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9330371e-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4855.888166037, 'message_signature': '08bbce96b56647ccf1ed7be07a12d06108605699146c4b379b57db1d02d13413'}]}, 'timestamp': '2025-11-29 06:56:48.211557', '_unique_id': '2735ec782f844d56b83889e9ad76100a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.212 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.214 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.214 12 DEBUG ceilometer.compute.pollsters [-] 6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.214 12 DEBUG ceilometer.compute.pollsters [-] 6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3fb7f6bc-5462-4f46-9a10-af158d559684', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': '6aebe65a-3191-4d58-acfd-8d663b9b0a8e-vda', 'timestamp': '2025-11-29T06:56:48.214326', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1402593290', 'name': 'instance-00000024', 'instance_id': '6aebe65a-3191-4d58-acfd-8d663b9b0a8e', 'instance_type': 'tempest-test_resize_flavor_-1559473648', 'host': '5d2d68639a181e87249e9aeb15eeec6cf8d9527b057e75f6fc9bad8c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-1559473648', 'name': 'tempest-test_resize_flavor_-1559473648', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9330b6e4-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4855.888166037, 'message_signature': '7d45fd16f0ca6ed296cc8afbf25f32f47f0f4cc14beb931211b680cf1bf3fe90'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': '6aebe65a-3191-4d58-acfd-8d663b9b0a8e-sda', 'timestamp': '2025-11-29T06:56:48.214326', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1402593290', 'name': 'instance-00000024', 'instance_id': '6aebe65a-3191-4d58-acfd-8d663b9b0a8e', 'instance_type': 'tempest-test_resize_flavor_-1559473648', 'host': '5d2d68639a181e87249e9aeb15eeec6cf8d9527b057e75f6fc9bad8c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-1559473648', 'name': 'tempest-test_resize_flavor_-1559473648', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9330cbde-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4855.888166037, 'message_signature': 'cc44cb938a8f70f28f63d31dd0d586d212a7a8ba5376b13ce54bc10cb3b19efa'}]}, 'timestamp': '2025-11-29 06:56:48.215365', '_unique_id': 'b09b58cdc9bf48faae062379572a997b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.216 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.217 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.217 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.218 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.218 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.218 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.218 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-MigrationsAdminTest-server-1402593290>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-MigrationsAdminTest-server-1402593290>]
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.219 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.234 12 DEBUG ceilometer.compute.pollsters [-] 6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.234 12 DEBUG ceilometer.compute.pollsters [-] 6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3dccf4d0-3692-4247-a3c0-4d49eceb9823', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': '6aebe65a-3191-4d58-acfd-8d663b9b0a8e-vda', 'timestamp': '2025-11-29T06:56:48.219341', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1402593290', 'name': 'instance-00000024', 'instance_id': '6aebe65a-3191-4d58-acfd-8d663b9b0a8e', 'instance_type': 'tempest-test_resize_flavor_-1559473648', 'host': '5d2d68639a181e87249e9aeb15eeec6cf8d9527b057e75f6fc9bad8c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-1559473648', 'name': 'tempest-test_resize_flavor_-1559473648', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9333bfd8-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4855.93766755, 'message_signature': 'ed8bd6e10dfbb435b281b53665a0022bb8c4a637a4e6c55b0502be14a74524c1'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': '6aebe65a-3191-4d58-acfd-8d663b9b0a8e-sda', 'timestamp': '2025-11-29T06:56:48.219341', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1402593290', 'name': 'instance-00000024', 'instance_id': '6aebe65a-3191-4d58-acfd-8d663b9b0a8e', 'instance_type': 'tempest-test_resize_flavor_-1559473648', 'host': '5d2d68639a181e87249e9aeb15eeec6cf8d9527b057e75f6fc9bad8c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-1559473648', 'name': 'tempest-test_resize_flavor_-1559473648', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9333d784-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4855.93766755, 'message_signature': '600c084d138d104455e0ca098a27af1153e4a0882f4ee1ffbcd762ba1f465360'}]}, 'timestamp': '2025-11-29 06:56:48.235336', '_unique_id': 'b30a72b98a5e4cf4a62f8b63fe3db68a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.236 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.238 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.238 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.238 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.238 12 DEBUG ceilometer.compute.pollsters [-] 6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.239 12 DEBUG ceilometer.compute.pollsters [-] 6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a6c05b2a-528a-45fe-aaab-13e63a1c5fcb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': '6aebe65a-3191-4d58-acfd-8d663b9b0a8e-vda', 'timestamp': '2025-11-29T06:56:48.238790', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1402593290', 'name': 'instance-00000024', 'instance_id': '6aebe65a-3191-4d58-acfd-8d663b9b0a8e', 'instance_type': 'tempest-test_resize_flavor_-1559473648', 'host': '5d2d68639a181e87249e9aeb15eeec6cf8d9527b057e75f6fc9bad8c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-1559473648', 'name': 'tempest-test_resize_flavor_-1559473648', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '93347540-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4855.93766755, 'message_signature': 'dadc9489bb60becbd0925adb0b67759a73517a1edbb2c8580126cc1336cabd80'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': '6aebe65a-3191-4d58-acfd-8d663b9b0a8e-sda', 'timestamp': '2025-11-29T06:56:48.238790', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1402593290', 'name': 'instance-00000024', 'instance_id': '6aebe65a-3191-4d58-acfd-8d663b9b0a8e', 'instance_type': 'tempest-test_resize_flavor_-1559473648', 'host': '5d2d68639a181e87249e9aeb15eeec6cf8d9527b057e75f6fc9bad8c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-1559473648', 'name': 'tempest-test_resize_flavor_-1559473648', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '93348ba2-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4855.93766755, 'message_signature': '0efcc710358bc063a7632fe8c00d38239f5db92465d0e9c80b66ad3bfdd814e7'}]}, 'timestamp': '2025-11-29 06:56:48.239970', '_unique_id': '2284a7682d7d479593bcba09d4a53b06'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.241 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.242 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.242 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.243 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-MigrationsAdminTest-server-1402593290>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-MigrationsAdminTest-server-1402593290>]
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.243 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.243 12 DEBUG ceilometer.compute.pollsters [-] 6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.244 12 DEBUG ceilometer.compute.pollsters [-] 6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cac93906-64b4-45ce-958e-a4db59b104d7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': '6aebe65a-3191-4d58-acfd-8d663b9b0a8e-vda', 'timestamp': '2025-11-29T06:56:48.243544', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1402593290', 'name': 'instance-00000024', 'instance_id': '6aebe65a-3191-4d58-acfd-8d663b9b0a8e', 'instance_type': 'tempest-test_resize_flavor_-1559473648', 'host': '5d2d68639a181e87249e9aeb15eeec6cf8d9527b057e75f6fc9bad8c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-1559473648', 'name': 'tempest-test_resize_flavor_-1559473648', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '93352b70-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4855.888166037, 'message_signature': '56b0e8d24c7a96a92627a24ea22290a053f1945fb78ce07e4bb3878c954173d1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': 
'890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': '6aebe65a-3191-4d58-acfd-8d663b9b0a8e-sda', 'timestamp': '2025-11-29T06:56:48.243544', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1402593290', 'name': 'instance-00000024', 'instance_id': '6aebe65a-3191-4d58-acfd-8d663b9b0a8e', 'instance_type': 'tempest-test_resize_flavor_-1559473648', 'host': '5d2d68639a181e87249e9aeb15eeec6cf8d9527b057e75f6fc9bad8c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-1559473648', 'name': 'tempest-test_resize_flavor_-1559473648', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '93353e94-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4855.888166037, 'message_signature': 'dd77c9da18cfce63351ea082b598e32a2447544352b25f19b824e1cdcab22f3b'}]}, 'timestamp': '2025-11-29 06:56:48.244499', '_unique_id': '6ccc85a3dd9e4cbb9bd9d982557b75c2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.245 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.247 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.267 12 DEBUG ceilometer.compute.pollsters [-] 6aebe65a-3191-4d58-acfd-8d663b9b0a8e/cpu volume: 1740000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0fd47d05-bf68-4246-8f09-e3e37f9c98e9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1740000000, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': '6aebe65a-3191-4d58-acfd-8d663b9b0a8e', 'timestamp': '2025-11-29T06:56:48.247327', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1402593290', 'name': 'instance-00000024', 'instance_id': '6aebe65a-3191-4d58-acfd-8d663b9b0a8e', 'instance_type': 'tempest-test_resize_flavor_-1559473648', 'host': '5d2d68639a181e87249e9aeb15eeec6cf8d9527b057e75f6fc9bad8c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-1559473648', 'name': 'tempest-test_resize_flavor_-1559473648', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '9338d3e2-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4855.985484116, 'message_signature': 'af3b637678b55da5cb187f0ef046dbc9461d7b2f3af36c35e8eec30f74168d77'}]}, 'timestamp': '2025-11-29 06:56:48.268099', '_unique_id': 'a8d3bb168d404a729058fb6459f1a026'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.269 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.270 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.271 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.271 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-MigrationsAdminTest-server-1402593290>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-MigrationsAdminTest-server-1402593290>]
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.271 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.271 12 DEBUG ceilometer.compute.pollsters [-] 6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.272 12 DEBUG ceilometer.compute.pollsters [-] 6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fe7d954a-ae56-4d84-a88c-7ba77ab75726', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': '6aebe65a-3191-4d58-acfd-8d663b9b0a8e-vda', 'timestamp': '2025-11-29T06:56:48.271625', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1402593290', 'name': 'instance-00000024', 'instance_id': '6aebe65a-3191-4d58-acfd-8d663b9b0a8e', 'instance_type': 'tempest-test_resize_flavor_-1559473648', 'host': '5d2d68639a181e87249e9aeb15eeec6cf8d9527b057e75f6fc9bad8c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-1559473648', 'name': 'tempest-test_resize_flavor_-1559473648', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '93397644-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4855.888166037, 'message_signature': '899d4bc9bf3af9e21829b9bf98dbcaa6a7a2a8082ef5ba1c7126dd910644b605'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 
'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': '6aebe65a-3191-4d58-acfd-8d663b9b0a8e-sda', 'timestamp': '2025-11-29T06:56:48.271625', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1402593290', 'name': 'instance-00000024', 'instance_id': '6aebe65a-3191-4d58-acfd-8d663b9b0a8e', 'instance_type': 'tempest-test_resize_flavor_-1559473648', 'host': '5d2d68639a181e87249e9aeb15eeec6cf8d9527b057e75f6fc9bad8c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-1559473648', 'name': 'tempest-test_resize_flavor_-1559473648', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '93398774-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4855.888166037, 'message_signature': '41a1a0121cddd92e1c43568fbb5d763f17f4f7ea96877f42bf94b55f0aa2a4cc'}]}, 'timestamp': '2025-11-29 06:56:48.272602', '_unique_id': '6dfce731a2f847fabee82badff2ded29'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.273 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.275 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.275 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.275 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.275 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-MigrationsAdminTest-server-1402593290>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-MigrationsAdminTest-server-1402593290>]
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.276 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.276 12 DEBUG ceilometer.compute.pollsters [-] 6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.276 12 DEBUG ceilometer.compute.pollsters [-] 6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ce45f9de-e2dc-44b7-be31-319aae7e1578', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': '6aebe65a-3191-4d58-acfd-8d663b9b0a8e-vda', 'timestamp': '2025-11-29T06:56:48.276371', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1402593290', 'name': 'instance-00000024', 'instance_id': '6aebe65a-3191-4d58-acfd-8d663b9b0a8e', 'instance_type': 'tempest-test_resize_flavor_-1559473648', 'host': '5d2d68639a181e87249e9aeb15eeec6cf8d9527b057e75f6fc9bad8c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-1559473648', 'name': 'tempest-test_resize_flavor_-1559473648', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '933a2dd2-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4855.888166037, 'message_signature': '2dc39132dbb0bde29bc13555416c9bad3a3b4672787539fd38f7ad3d38ea9ddd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': 
'890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': '6aebe65a-3191-4d58-acfd-8d663b9b0a8e-sda', 'timestamp': '2025-11-29T06:56:48.276371', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1402593290', 'name': 'instance-00000024', 'instance_id': '6aebe65a-3191-4d58-acfd-8d663b9b0a8e', 'instance_type': 'tempest-test_resize_flavor_-1559473648', 'host': '5d2d68639a181e87249e9aeb15eeec6cf8d9527b057e75f6fc9bad8c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-1559473648', 'name': 'tempest-test_resize_flavor_-1559473648', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '933a4128-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4855.888166037, 'message_signature': '4cd39e86f7db7adda8165aa49c2aa5b6f0c224380bdabb15dec4016e1022318f'}]}, 'timestamp': '2025-11-29 06:56:48.277334', '_unique_id': '8acfbfdeffef4c86a93dc2bd92735ceb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.278 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.279 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.280 12 DEBUG ceilometer.compute.pollsters [-] 6aebe65a-3191-4d58-acfd-8d663b9b0a8e/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.280 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 6aebe65a-3191-4d58-acfd-8d663b9b0a8e: ceilometer.compute.pollsters.NoVolumeException
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.280 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.280 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.281 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.281 12 DEBUG ceilometer.compute.pollsters [-] 6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk.device.allocation volume: 30744576 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.281 12 DEBUG ceilometer.compute.pollsters [-] 6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '826e14bc-093b-4749-881b-1a1cc902f718', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30744576, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': '890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': '6aebe65a-3191-4d58-acfd-8d663b9b0a8e-vda', 'timestamp': '2025-11-29T06:56:48.281356', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1402593290', 'name': 'instance-00000024', 'instance_id': '6aebe65a-3191-4d58-acfd-8d663b9b0a8e', 'instance_type': 'tempest-test_resize_flavor_-1559473648', 'host': '5d2d68639a181e87249e9aeb15eeec6cf8d9527b057e75f6fc9bad8c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-1559473648', 'name': 'tempest-test_resize_flavor_-1559473648', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '933af050-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4855.93766755, 'message_signature': '42cc0670ffb5bc300487d93a09b473d691461963594bc54fac857f3bc6ed6bf5'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '53ee944c04484336b9b14d84235a62b8', 'user_name': None, 'project_id': 
'890f94a625b342fdb17128922403c925', 'project_name': None, 'resource_id': '6aebe65a-3191-4d58-acfd-8d663b9b0a8e-sda', 'timestamp': '2025-11-29T06:56:48.281356', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1402593290', 'name': 'instance-00000024', 'instance_id': '6aebe65a-3191-4d58-acfd-8d663b9b0a8e', 'instance_type': 'tempest-test_resize_flavor_-1559473648', 'host': '5d2d68639a181e87249e9aeb15eeec6cf8d9527b057e75f6fc9bad8c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'tempest-test_resize_flavor_-1559473648', 'name': 'tempest-test_resize_flavor_-1559473648', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '933b02e8-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4855.93766755, 'message_signature': 'b8901c69809ca52e92a7f00c868b7d114008b26d0d08bdbbeb33828c24822e08'}]}, 'timestamp': '2025-11-29 06:56:48.282378', '_unique_id': '1ea9030d5b924ccf95a28b83e178f19f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.283 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:56:48.284 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 06:56:48 compute-0 nova_compute[187185]: 2025-11-29 06:56:48.348 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:56:49 compute-0 nova_compute[187185]: 2025-11-29 06:56:49.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:56:49 compute-0 nova_compute[187185]: 2025-11-29 06:56:49.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:56:49 compute-0 nova_compute[187185]: 2025-11-29 06:56:49.430 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:56:49 compute-0 nova_compute[187185]: 2025-11-29 06:56:49.431 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:56:49 compute-0 nova_compute[187185]: 2025-11-29 06:56:49.432 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:56:49 compute-0 nova_compute[187185]: 2025-11-29 06:56:49.432 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 06:56:49 compute-0 nova_compute[187185]: 2025-11-29 06:56:49.814 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:56:49 compute-0 nova_compute[187185]: 2025-11-29 06:56:49.913 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:56:49 compute-0 nova_compute[187185]: 2025-11-29 06:56:49.915 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:56:49 compute-0 nova_compute[187185]: 2025-11-29 06:56:49.997 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:56:50 compute-0 nova_compute[187185]: 2025-11-29 06:56:50.223 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:56:50 compute-0 nova_compute[187185]: 2025-11-29 06:56:50.226 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5601MB free_disk=73.30966186523438GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 06:56:50 compute-0 nova_compute[187185]: 2025-11-29 06:56:50.227 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:56:50 compute-0 nova_compute[187185]: 2025-11-29 06:56:50.228 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:56:50 compute-0 nova_compute[187185]: 2025-11-29 06:56:50.663 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance 6aebe65a-3191-4d58-acfd-8d663b9b0a8e actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 06:56:50 compute-0 nova_compute[187185]: 2025-11-29 06:56:50.665 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 06:56:50 compute-0 nova_compute[187185]: 2025-11-29 06:56:50.665 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 06:56:50 compute-0 nova_compute[187185]: 2025-11-29 06:56:50.711 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:56:51 compute-0 nova_compute[187185]: 2025-11-29 06:56:51.002 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:56:51 compute-0 nova_compute[187185]: 2025-11-29 06:56:51.938 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:56:51 compute-0 nova_compute[187185]: 2025-11-29 06:56:51.977 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 06:56:51 compute-0 nova_compute[187185]: 2025-11-29 06:56:51.978 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.750s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:56:52 compute-0 nova_compute[187185]: 2025-11-29 06:56:52.978 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:56:52 compute-0 nova_compute[187185]: 2025-11-29 06:56:52.978 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:56:52 compute-0 nova_compute[187185]: 2025-11-29 06:56:52.978 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 06:56:53 compute-0 nova_compute[187185]: 2025-11-29 06:56:53.349 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:56:54 compute-0 nova_compute[187185]: 2025-11-29 06:56:54.812 187189 DEBUG oslo_concurrency.lockutils [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "72856fd1-9e86-48df-817f-42b206cc0bea" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:56:54 compute-0 nova_compute[187185]: 2025-11-29 06:56:54.812 187189 DEBUG oslo_concurrency.lockutils [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "72856fd1-9e86-48df-817f-42b206cc0bea" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:56:54 compute-0 nova_compute[187185]: 2025-11-29 06:56:54.855 187189 DEBUG nova.compute.manager [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 06:56:55 compute-0 nova_compute[187185]: 2025-11-29 06:56:55.105 187189 DEBUG oslo_concurrency.lockutils [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:56:55 compute-0 nova_compute[187185]: 2025-11-29 06:56:55.106 187189 DEBUG oslo_concurrency.lockutils [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:56:55 compute-0 nova_compute[187185]: 2025-11-29 06:56:55.118 187189 DEBUG nova.virt.hardware [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 06:56:55 compute-0 nova_compute[187185]: 2025-11-29 06:56:55.118 187189 INFO nova.compute.claims [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Claim successful on node compute-0.ctlplane.example.com
Nov 29 06:56:55 compute-0 nova_compute[187185]: 2025-11-29 06:56:55.391 187189 DEBUG nova.compute.provider_tree [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:56:55 compute-0 nova_compute[187185]: 2025-11-29 06:56:55.714 187189 DEBUG nova.scheduler.client.report [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:56:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:56:55.795 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:56:55 compute-0 nova_compute[187185]: 2025-11-29 06:56:55.834 187189 DEBUG oslo_concurrency.lockutils [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.727s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:56:55 compute-0 nova_compute[187185]: 2025-11-29 06:56:55.835 187189 DEBUG nova.compute.manager [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 06:56:56 compute-0 nova_compute[187185]: 2025-11-29 06:56:56.252 187189 DEBUG nova.compute.manager [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 06:56:56 compute-0 nova_compute[187185]: 2025-11-29 06:56:56.253 187189 DEBUG nova.network.neutron [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 06:56:56 compute-0 nova_compute[187185]: 2025-11-29 06:56:56.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:56:56 compute-0 nova_compute[187185]: 2025-11-29 06:56:56.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 29 06:56:56 compute-0 nova_compute[187185]: 2025-11-29 06:56:56.554 187189 DEBUG nova.network.neutron [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Nov 29 06:56:56 compute-0 nova_compute[187185]: 2025-11-29 06:56:56.555 187189 DEBUG nova.compute.manager [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 06:56:56 compute-0 nova_compute[187185]: 2025-11-29 06:56:56.879 187189 INFO nova.virt.libvirt.driver [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 06:56:56 compute-0 nova_compute[187185]: 2025-11-29 06:56:56.909 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 29 06:56:56 compute-0 nova_compute[187185]: 2025-11-29 06:56:56.942 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:56:57 compute-0 nova_compute[187185]: 2025-11-29 06:56:57.928 187189 DEBUG nova.compute.manager [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 06:56:58 compute-0 nova_compute[187185]: 2025-11-29 06:56:58.351 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:56:58 compute-0 nova_compute[187185]: 2025-11-29 06:56:58.701 187189 DEBUG nova.compute.manager [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 06:56:58 compute-0 nova_compute[187185]: 2025-11-29 06:56:58.703 187189 DEBUG nova.virt.libvirt.driver [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 06:56:58 compute-0 nova_compute[187185]: 2025-11-29 06:56:58.704 187189 INFO nova.virt.libvirt.driver [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Creating image(s)
Nov 29 06:56:58 compute-0 nova_compute[187185]: 2025-11-29 06:56:58.705 187189 DEBUG oslo_concurrency.lockutils [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "/var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:56:58 compute-0 nova_compute[187185]: 2025-11-29 06:56:58.706 187189 DEBUG oslo_concurrency.lockutils [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "/var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:56:58 compute-0 nova_compute[187185]: 2025-11-29 06:56:58.707 187189 DEBUG oslo_concurrency.lockutils [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "/var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:56:58 compute-0 nova_compute[187185]: 2025-11-29 06:56:58.736 187189 DEBUG oslo_concurrency.processutils [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:56:58 compute-0 nova_compute[187185]: 2025-11-29 06:56:58.828 187189 DEBUG oslo_concurrency.processutils [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:56:58 compute-0 nova_compute[187185]: 2025-11-29 06:56:58.830 187189 DEBUG oslo_concurrency.lockutils [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:56:58 compute-0 nova_compute[187185]: 2025-11-29 06:56:58.831 187189 DEBUG oslo_concurrency.lockutils [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:56:58 compute-0 nova_compute[187185]: 2025-11-29 06:56:58.856 187189 DEBUG oslo_concurrency.processutils [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:56:58 compute-0 nova_compute[187185]: 2025-11-29 06:56:58.939 187189 DEBUG oslo_concurrency.processutils [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:56:58 compute-0 nova_compute[187185]: 2025-11-29 06:56:58.940 187189 DEBUG oslo_concurrency.processutils [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:56:59 compute-0 nova_compute[187185]: 2025-11-29 06:56:59.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:56:59 compute-0 podman[218929]: 2025-11-29 06:56:59.843759251 +0000 UTC m=+0.085884678 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 06:56:59 compute-0 podman[218927]: 2025-11-29 06:56:59.852114147 +0000 UTC m=+0.099690578 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:56:59 compute-0 podman[218928]: 2025-11-29 06:56:59.865188425 +0000 UTC m=+0.106030956 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vendor=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41)
Nov 29 06:57:00 compute-0 nova_compute[187185]: 2025-11-29 06:57:00.847 187189 DEBUG oslo_concurrency.processutils [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/disk 1073741824" returned: 0 in 1.907s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:57:00 compute-0 nova_compute[187185]: 2025-11-29 06:57:00.848 187189 DEBUG oslo_concurrency.lockutils [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 2.017s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:57:00 compute-0 nova_compute[187185]: 2025-11-29 06:57:00.848 187189 DEBUG oslo_concurrency.processutils [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:57:00 compute-0 nova_compute[187185]: 2025-11-29 06:57:00.936 187189 DEBUG oslo_concurrency.processutils [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:57:00 compute-0 nova_compute[187185]: 2025-11-29 06:57:00.937 187189 DEBUG nova.virt.disk.api [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Checking if we can resize image /var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 06:57:00 compute-0 nova_compute[187185]: 2025-11-29 06:57:00.938 187189 DEBUG oslo_concurrency.processutils [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:57:01 compute-0 nova_compute[187185]: 2025-11-29 06:57:01.000 187189 DEBUG oslo_concurrency.processutils [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:57:01 compute-0 nova_compute[187185]: 2025-11-29 06:57:01.002 187189 DEBUG nova.virt.disk.api [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Cannot resize image /var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 06:57:01 compute-0 nova_compute[187185]: 2025-11-29 06:57:01.003 187189 DEBUG nova.objects.instance [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lazy-loading 'migration_context' on Instance uuid 72856fd1-9e86-48df-817f-42b206cc0bea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:57:01 compute-0 nova_compute[187185]: 2025-11-29 06:57:01.017 187189 DEBUG nova.virt.libvirt.driver [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 06:57:01 compute-0 nova_compute[187185]: 2025-11-29 06:57:01.018 187189 DEBUG nova.virt.libvirt.driver [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Ensure instance console log exists: /var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 06:57:01 compute-0 nova_compute[187185]: 2025-11-29 06:57:01.019 187189 DEBUG oslo_concurrency.lockutils [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:57:01 compute-0 nova_compute[187185]: 2025-11-29 06:57:01.020 187189 DEBUG oslo_concurrency.lockutils [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:57:01 compute-0 nova_compute[187185]: 2025-11-29 06:57:01.020 187189 DEBUG oslo_concurrency.lockutils [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:57:01 compute-0 nova_compute[187185]: 2025-11-29 06:57:01.024 187189 DEBUG nova.virt.libvirt.driver [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 06:57:01 compute-0 nova_compute[187185]: 2025-11-29 06:57:01.032 187189 WARNING nova.virt.libvirt.driver [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:57:01 compute-0 nova_compute[187185]: 2025-11-29 06:57:01.039 187189 DEBUG nova.virt.libvirt.host [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 06:57:01 compute-0 nova_compute[187185]: 2025-11-29 06:57:01.041 187189 DEBUG nova.virt.libvirt.host [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 06:57:01 compute-0 nova_compute[187185]: 2025-11-29 06:57:01.045 187189 DEBUG nova.virt.libvirt.host [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 06:57:01 compute-0 nova_compute[187185]: 2025-11-29 06:57:01.046 187189 DEBUG nova.virt.libvirt.host [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 06:57:01 compute-0 nova_compute[187185]: 2025-11-29 06:57:01.048 187189 DEBUG nova.virt.libvirt.driver [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 06:57:01 compute-0 nova_compute[187185]: 2025-11-29 06:57:01.049 187189 DEBUG nova.virt.hardware [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 06:57:01 compute-0 nova_compute[187185]: 2025-11-29 06:57:01.050 187189 DEBUG nova.virt.hardware [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 06:57:01 compute-0 nova_compute[187185]: 2025-11-29 06:57:01.050 187189 DEBUG nova.virt.hardware [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 06:57:01 compute-0 nova_compute[187185]: 2025-11-29 06:57:01.051 187189 DEBUG nova.virt.hardware [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 06:57:01 compute-0 nova_compute[187185]: 2025-11-29 06:57:01.051 187189 DEBUG nova.virt.hardware [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 06:57:01 compute-0 nova_compute[187185]: 2025-11-29 06:57:01.052 187189 DEBUG nova.virt.hardware [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 06:57:01 compute-0 nova_compute[187185]: 2025-11-29 06:57:01.052 187189 DEBUG nova.virt.hardware [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 06:57:01 compute-0 nova_compute[187185]: 2025-11-29 06:57:01.053 187189 DEBUG nova.virt.hardware [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 06:57:01 compute-0 nova_compute[187185]: 2025-11-29 06:57:01.053 187189 DEBUG nova.virt.hardware [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 06:57:01 compute-0 nova_compute[187185]: 2025-11-29 06:57:01.054 187189 DEBUG nova.virt.hardware [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 06:57:01 compute-0 nova_compute[187185]: 2025-11-29 06:57:01.055 187189 DEBUG nova.virt.hardware [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 06:57:01 compute-0 nova_compute[187185]: 2025-11-29 06:57:01.061 187189 DEBUG nova.objects.instance [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lazy-loading 'pci_devices' on Instance uuid 72856fd1-9e86-48df-817f-42b206cc0bea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:57:01 compute-0 nova_compute[187185]: 2025-11-29 06:57:01.077 187189 DEBUG nova.virt.libvirt.driver [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] End _get_guest_xml xml=<domain type="kvm">
Nov 29 06:57:01 compute-0 nova_compute[187185]:   <uuid>72856fd1-9e86-48df-817f-42b206cc0bea</uuid>
Nov 29 06:57:01 compute-0 nova_compute[187185]:   <name>instance-00000028</name>
Nov 29 06:57:01 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 06:57:01 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 06:57:01 compute-0 nova_compute[187185]:   <metadata>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 06:57:01 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:       <nova:name>tempest-MigrationsAdminTest-server-2041941217</nova:name>
Nov 29 06:57:01 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 06:57:01</nova:creationTime>
Nov 29 06:57:01 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 06:57:01 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 06:57:01 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 06:57:01 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 06:57:01 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 06:57:01 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 06:57:01 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 06:57:01 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 06:57:01 compute-0 nova_compute[187185]:         <nova:user uuid="53ee944c04484336b9b14d84235a62b8">tempest-MigrationsAdminTest-1601255173-project-member</nova:user>
Nov 29 06:57:01 compute-0 nova_compute[187185]:         <nova:project uuid="890f94a625b342fdb17128922403c925">tempest-MigrationsAdminTest-1601255173</nova:project>
Nov 29 06:57:01 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 06:57:01 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:       <nova:ports/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 06:57:01 compute-0 nova_compute[187185]:   </metadata>
Nov 29 06:57:01 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 06:57:01 compute-0 nova_compute[187185]:     <system>
Nov 29 06:57:01 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 06:57:01 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 06:57:01 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 06:57:01 compute-0 nova_compute[187185]:       <entry name="serial">72856fd1-9e86-48df-817f-42b206cc0bea</entry>
Nov 29 06:57:01 compute-0 nova_compute[187185]:       <entry name="uuid">72856fd1-9e86-48df-817f-42b206cc0bea</entry>
Nov 29 06:57:01 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     </system>
Nov 29 06:57:01 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 06:57:01 compute-0 nova_compute[187185]:   <os>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:   </os>
Nov 29 06:57:01 compute-0 nova_compute[187185]:   <features>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     <apic/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:   </features>
Nov 29 06:57:01 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 06:57:01 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:   </clock>
Nov 29 06:57:01 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 06:57:01 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:   </cpu>
Nov 29 06:57:01 compute-0 nova_compute[187185]:   <devices>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 06:57:01 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/disk"/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     </disk>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 06:57:01 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/disk.config"/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     </disk>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 06:57:01 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/console.log" append="off"/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     </serial>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     <video>
Nov 29 06:57:01 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     </video>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 06:57:01 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     </rng>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 06:57:01 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 06:57:01 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 06:57:01 compute-0 nova_compute[187185]:   </devices>
Nov 29 06:57:01 compute-0 nova_compute[187185]: </domain>
Nov 29 06:57:01 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 06:57:01 compute-0 nova_compute[187185]: 2025-11-29 06:57:01.333 187189 DEBUG nova.virt.libvirt.driver [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 06:57:01 compute-0 nova_compute[187185]: 2025-11-29 06:57:01.334 187189 DEBUG nova.virt.libvirt.driver [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 06:57:01 compute-0 nova_compute[187185]: 2025-11-29 06:57:01.335 187189 INFO nova.virt.libvirt.driver [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Using config drive
Nov 29 06:57:01 compute-0 nova_compute[187185]: 2025-11-29 06:57:01.570 187189 INFO nova.virt.libvirt.driver [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Creating config drive at /var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/disk.config
Nov 29 06:57:01 compute-0 nova_compute[187185]: 2025-11-29 06:57:01.579 187189 DEBUG oslo_concurrency.processutils [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpilghpvy_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:57:01 compute-0 anacron[29976]: Job `cron.weekly' started
Nov 29 06:57:01 compute-0 anacron[29976]: Job `cron.weekly' terminated
Nov 29 06:57:01 compute-0 nova_compute[187185]: 2025-11-29 06:57:01.711 187189 DEBUG oslo_concurrency.processutils [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpilghpvy_" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:57:01 compute-0 systemd-machined[153486]: New machine qemu-15-instance-00000028.
Nov 29 06:57:01 compute-0 systemd[1]: Started Virtual Machine qemu-15-instance-00000028.
Nov 29 06:57:01 compute-0 nova_compute[187185]: 2025-11-29 06:57:01.944 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:57:02 compute-0 nova_compute[187185]: 2025-11-29 06:57:02.239 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399422.239181, 72856fd1-9e86-48df-817f-42b206cc0bea => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:57:02 compute-0 nova_compute[187185]: 2025-11-29 06:57:02.240 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] VM Resumed (Lifecycle Event)
Nov 29 06:57:02 compute-0 nova_compute[187185]: 2025-11-29 06:57:02.245 187189 DEBUG nova.compute.manager [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 06:57:02 compute-0 nova_compute[187185]: 2025-11-29 06:57:02.246 187189 DEBUG nova.virt.libvirt.driver [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 06:57:02 compute-0 nova_compute[187185]: 2025-11-29 06:57:02.252 187189 INFO nova.virt.libvirt.driver [-] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Instance spawned successfully.
Nov 29 06:57:02 compute-0 nova_compute[187185]: 2025-11-29 06:57:02.253 187189 DEBUG nova.virt.libvirt.driver [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 06:57:02 compute-0 nova_compute[187185]: 2025-11-29 06:57:02.262 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:57:02 compute-0 nova_compute[187185]: 2025-11-29 06:57:02.267 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 06:57:02 compute-0 nova_compute[187185]: 2025-11-29 06:57:02.279 187189 DEBUG nova.virt.libvirt.driver [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:57:02 compute-0 nova_compute[187185]: 2025-11-29 06:57:02.280 187189 DEBUG nova.virt.libvirt.driver [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:57:02 compute-0 nova_compute[187185]: 2025-11-29 06:57:02.280 187189 DEBUG nova.virt.libvirt.driver [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:57:02 compute-0 nova_compute[187185]: 2025-11-29 06:57:02.281 187189 DEBUG nova.virt.libvirt.driver [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:57:02 compute-0 nova_compute[187185]: 2025-11-29 06:57:02.282 187189 DEBUG nova.virt.libvirt.driver [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:57:02 compute-0 nova_compute[187185]: 2025-11-29 06:57:02.283 187189 DEBUG nova.virt.libvirt.driver [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:57:02 compute-0 nova_compute[187185]: 2025-11-29 06:57:02.289 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 06:57:02 compute-0 nova_compute[187185]: 2025-11-29 06:57:02.289 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399422.241153, 72856fd1-9e86-48df-817f-42b206cc0bea => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:57:02 compute-0 nova_compute[187185]: 2025-11-29 06:57:02.290 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] VM Started (Lifecycle Event)
Nov 29 06:57:02 compute-0 nova_compute[187185]: 2025-11-29 06:57:02.334 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:57:02 compute-0 nova_compute[187185]: 2025-11-29 06:57:02.338 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 06:57:02 compute-0 nova_compute[187185]: 2025-11-29 06:57:02.362 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 06:57:02 compute-0 nova_compute[187185]: 2025-11-29 06:57:02.366 187189 INFO nova.compute.manager [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Took 3.66 seconds to spawn the instance on the hypervisor.
Nov 29 06:57:02 compute-0 nova_compute[187185]: 2025-11-29 06:57:02.366 187189 DEBUG nova.compute.manager [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:57:02 compute-0 nova_compute[187185]: 2025-11-29 06:57:02.458 187189 INFO nova.compute.manager [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Took 7.53 seconds to build instance.
Nov 29 06:57:02 compute-0 nova_compute[187185]: 2025-11-29 06:57:02.481 187189 DEBUG oslo_concurrency.lockutils [None req-05e55f49-e3da-44f4-8ce0-49a65e5d7b7e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "72856fd1-9e86-48df-817f-42b206cc0bea" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.669s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:57:03 compute-0 nova_compute[187185]: 2025-11-29 06:57:03.353 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:57:05 compute-0 nova_compute[187185]: 2025-11-29 06:57:05.793 187189 DEBUG oslo_concurrency.lockutils [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Acquiring lock "refresh_cache-72856fd1-9e86-48df-817f-42b206cc0bea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:57:05 compute-0 nova_compute[187185]: 2025-11-29 06:57:05.796 187189 DEBUG oslo_concurrency.lockutils [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Acquired lock "refresh_cache-72856fd1-9e86-48df-817f-42b206cc0bea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:57:05 compute-0 nova_compute[187185]: 2025-11-29 06:57:05.797 187189 DEBUG nova.network.neutron [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 06:57:05 compute-0 podman[219020]: 2025-11-29 06:57:05.846648906 +0000 UTC m=+0.111525870 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 06:57:05 compute-0 nova_compute[187185]: 2025-11-29 06:57:05.962 187189 DEBUG nova.network.neutron [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 06:57:06 compute-0 nova_compute[187185]: 2025-11-29 06:57:06.825 187189 DEBUG nova.network.neutron [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:57:06 compute-0 nova_compute[187185]: 2025-11-29 06:57:06.844 187189 DEBUG oslo_concurrency.lockutils [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Releasing lock "refresh_cache-72856fd1-9e86-48df-817f-42b206cc0bea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:57:06 compute-0 nova_compute[187185]: 2025-11-29 06:57:06.948 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:57:06 compute-0 nova_compute[187185]: 2025-11-29 06:57:06.983 187189 DEBUG nova.virt.libvirt.driver [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Nov 29 06:57:06 compute-0 nova_compute[187185]: 2025-11-29 06:57:06.984 187189 DEBUG nova.virt.libvirt.volume.remotefs [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Creating file /var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/4431a345815f42818e13869bf94558bb.tmp on remote host 192.168.122.102 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Nov 29 06:57:06 compute-0 nova_compute[187185]: 2025-11-29 06:57:06.985 187189 DEBUG oslo_concurrency.processutils [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/4431a345815f42818e13869bf94558bb.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:57:07 compute-0 nova_compute[187185]: 2025-11-29 06:57:07.530 187189 DEBUG oslo_concurrency.processutils [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/4431a345815f42818e13869bf94558bb.tmp" returned: 1 in 0.545s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:57:07 compute-0 nova_compute[187185]: 2025-11-29 06:57:07.532 187189 DEBUG oslo_concurrency.processutils [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] 'ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/4431a345815f42818e13869bf94558bb.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Nov 29 06:57:07 compute-0 nova_compute[187185]: 2025-11-29 06:57:07.533 187189 DEBUG nova.virt.libvirt.volume.remotefs [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Creating directory /var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea on remote host 192.168.122.102 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Nov 29 06:57:07 compute-0 nova_compute[187185]: 2025-11-29 06:57:07.533 187189 DEBUG oslo_concurrency.processutils [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:57:07 compute-0 nova_compute[187185]: 2025-11-29 06:57:07.741 187189 DEBUG oslo_concurrency.processutils [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea" returned: 0 in 0.207s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:57:07 compute-0 nova_compute[187185]: 2025-11-29 06:57:07.746 187189 DEBUG nova.virt.libvirt.driver [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Nov 29 06:57:08 compute-0 nova_compute[187185]: 2025-11-29 06:57:08.355 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:57:08 compute-0 podman[219050]: 2025-11-29 06:57:08.808000125 +0000 UTC m=+0.061832522 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 06:57:08 compute-0 podman[219049]: 2025-11-29 06:57:08.818089369 +0000 UTC m=+0.075543738 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=edpm, io.buildah.version=1.41.3, tcib_managed=true)
Nov 29 06:57:11 compute-0 nova_compute[187185]: 2025-11-29 06:57:11.981 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:57:13 compute-0 nova_compute[187185]: 2025-11-29 06:57:13.357 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:57:16 compute-0 nova_compute[187185]: 2025-11-29 06:57:16.984 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:57:17 compute-0 nova_compute[187185]: 2025-11-29 06:57:17.793 187189 DEBUG nova.virt.libvirt.driver [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Nov 29 06:57:18 compute-0 nova_compute[187185]: 2025-11-29 06:57:18.358 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:57:18 compute-0 podman[219106]: 2025-11-29 06:57:18.86613313 +0000 UTC m=+0.101157958 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd)
Nov 29 06:57:21 compute-0 nova_compute[187185]: 2025-11-29 06:57:21.986 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:57:23 compute-0 nova_compute[187185]: 2025-11-29 06:57:23.392 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:57:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:57:24.817 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:57:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:57:24.818 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:57:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:57:24.818 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:57:26 compute-0 nova_compute[187185]: 2025-11-29 06:57:26.990 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:57:28 compute-0 nova_compute[187185]: 2025-11-29 06:57:28.414 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:57:29 compute-0 nova_compute[187185]: 2025-11-29 06:57:29.016 187189 DEBUG nova.virt.libvirt.driver [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Instance in state 1 after 21 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Nov 29 06:57:30 compute-0 podman[219127]: 2025-11-29 06:57:30.793358536 +0000 UTC m=+0.053562437 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Nov 29 06:57:30 compute-0 podman[219129]: 2025-11-29 06:57:30.793445838 +0000 UTC m=+0.050043068 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 06:57:30 compute-0 podman[219128]: 2025-11-29 06:57:30.814122266 +0000 UTC m=+0.063492284 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, config_id=edpm, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, architecture=x86_64, container_name=openstack_network_exporter, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7)
Nov 29 06:57:31 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000028.scope: Deactivated successfully.
Nov 29 06:57:31 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000028.scope: Consumed 13.328s CPU time.
Nov 29 06:57:31 compute-0 systemd-machined[153486]: Machine qemu-15-instance-00000028 terminated.
Nov 29 06:57:31 compute-0 nova_compute[187185]: 2025-11-29 06:57:31.992 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:57:32 compute-0 nova_compute[187185]: 2025-11-29 06:57:32.111 187189 INFO nova.virt.libvirt.driver [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Instance shutdown successfully after 24 seconds.
Nov 29 06:57:32 compute-0 nova_compute[187185]: 2025-11-29 06:57:32.116 187189 INFO nova.virt.libvirt.driver [-] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Instance destroyed successfully.
Nov 29 06:57:32 compute-0 nova_compute[187185]: 2025-11-29 06:57:32.120 187189 DEBUG oslo_concurrency.processutils [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:57:32 compute-0 nova_compute[187185]: 2025-11-29 06:57:32.180 187189 DEBUG oslo_concurrency.processutils [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:57:32 compute-0 nova_compute[187185]: 2025-11-29 06:57:32.181 187189 DEBUG oslo_concurrency.processutils [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:57:32 compute-0 nova_compute[187185]: 2025-11-29 06:57:32.236 187189 DEBUG oslo_concurrency.processutils [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:57:32 compute-0 nova_compute[187185]: 2025-11-29 06:57:32.238 187189 DEBUG nova.virt.libvirt.volume.remotefs [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Copying file /var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea_resize/disk to 192.168.122.102:/var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 29 06:57:32 compute-0 nova_compute[187185]: 2025-11-29 06:57:32.239 187189 DEBUG oslo_concurrency.processutils [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea_resize/disk 192.168.122.102:/var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:57:33 compute-0 nova_compute[187185]: 2025-11-29 06:57:33.417 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:57:33 compute-0 nova_compute[187185]: 2025-11-29 06:57:33.563 187189 DEBUG oslo_concurrency.processutils [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] CMD "scp -r /var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea_resize/disk 192.168.122.102:/var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/disk" returned: 0 in 1.325s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:57:33 compute-0 nova_compute[187185]: 2025-11-29 06:57:33.565 187189 DEBUG nova.virt.libvirt.volume.remotefs [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Copying file /var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea_resize/disk.config to 192.168.122.102:/var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 29 06:57:33 compute-0 nova_compute[187185]: 2025-11-29 06:57:33.566 187189 DEBUG oslo_concurrency.processutils [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea_resize/disk.config 192.168.122.102:/var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:57:33 compute-0 nova_compute[187185]: 2025-11-29 06:57:33.830 187189 DEBUG oslo_concurrency.processutils [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] CMD "scp -C -r /var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea_resize/disk.config 192.168.122.102:/var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/disk.config" returned: 0 in 0.264s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:57:33 compute-0 nova_compute[187185]: 2025-11-29 06:57:33.831 187189 DEBUG nova.virt.libvirt.volume.remotefs [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Copying file /var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea_resize/disk.info to 192.168.122.102:/var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 29 06:57:33 compute-0 nova_compute[187185]: 2025-11-29 06:57:33.832 187189 DEBUG oslo_concurrency.processutils [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea_resize/disk.info 192.168.122.102:/var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:57:34 compute-0 nova_compute[187185]: 2025-11-29 06:57:34.055 187189 DEBUG oslo_concurrency.processutils [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] CMD "scp -C -r /var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea_resize/disk.info 192.168.122.102:/var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/disk.info" returned: 0 in 0.223s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:57:34 compute-0 nova_compute[187185]: 2025-11-29 06:57:34.226 187189 DEBUG oslo_concurrency.lockutils [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Acquiring lock "72856fd1-9e86-48df-817f-42b206cc0bea-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:57:34 compute-0 nova_compute[187185]: 2025-11-29 06:57:34.227 187189 DEBUG oslo_concurrency.lockutils [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Lock "72856fd1-9e86-48df-817f-42b206cc0bea-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:57:34 compute-0 nova_compute[187185]: 2025-11-29 06:57:34.227 187189 DEBUG oslo_concurrency.lockutils [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Lock "72856fd1-9e86-48df-817f-42b206cc0bea-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:57:36 compute-0 podman[219209]: 2025-11-29 06:57:36.891008718 +0000 UTC m=+0.143602931 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller)
Nov 29 06:57:36 compute-0 nova_compute[187185]: 2025-11-29 06:57:36.994 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:57:38 compute-0 nova_compute[187185]: 2025-11-29 06:57:38.421 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:57:39 compute-0 podman[219238]: 2025-11-29 06:57:39.80231151 +0000 UTC m=+0.057358723 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 06:57:39 compute-0 podman[219237]: 2025-11-29 06:57:39.822866504 +0000 UTC m=+0.079972695 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 29 06:57:40 compute-0 sshd-session[219235]: Received disconnect from 1.214.197.163 port 50208:11: Bye Bye [preauth]
Nov 29 06:57:40 compute-0 sshd-session[219235]: Disconnected from authenticating user root 1.214.197.163 port 50208 [preauth]
Nov 29 06:57:42 compute-0 nova_compute[187185]: 2025-11-29 06:57:41.999 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:57:42 compute-0 nova_compute[187185]: 2025-11-29 06:57:42.522 187189 INFO nova.compute.manager [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Swapping old allocation on dict_keys(['4e39a026-df39-4e20-874a-dbb5a40df044']) held by migration 32315c80-25a1-4b88-8a75-5380619fbfaf for instance
Nov 29 06:57:42 compute-0 nova_compute[187185]: 2025-11-29 06:57:42.737 187189 DEBUG nova.scheduler.client.report [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Overwriting current allocation {'allocations': {'2d55ea77-8118-4f48-9bb5-d62d10fd53c0': {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}, 'generation': 36}}, 'project_id': '890f94a625b342fdb17128922403c925', 'user_id': '53ee944c04484336b9b14d84235a62b8', 'consumer_generation': 1} on consumer 72856fd1-9e86-48df-817f-42b206cc0bea move_allocations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:2018
Nov 29 06:57:43 compute-0 nova_compute[187185]: 2025-11-29 06:57:43.408 187189 DEBUG oslo_concurrency.lockutils [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "refresh_cache-72856fd1-9e86-48df-817f-42b206cc0bea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:57:43 compute-0 nova_compute[187185]: 2025-11-29 06:57:43.409 187189 DEBUG oslo_concurrency.lockutils [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquired lock "refresh_cache-72856fd1-9e86-48df-817f-42b206cc0bea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:57:43 compute-0 nova_compute[187185]: 2025-11-29 06:57:43.409 187189 DEBUG nova.network.neutron [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 06:57:43 compute-0 nova_compute[187185]: 2025-11-29 06:57:43.422 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:57:43 compute-0 nova_compute[187185]: 2025-11-29 06:57:43.658 187189 DEBUG nova.network.neutron [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 06:57:43 compute-0 nova_compute[187185]: 2025-11-29 06:57:43.861 187189 DEBUG nova.network.neutron [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:57:43 compute-0 nova_compute[187185]: 2025-11-29 06:57:43.877 187189 DEBUG oslo_concurrency.lockutils [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Releasing lock "refresh_cache-72856fd1-9e86-48df-817f-42b206cc0bea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:57:43 compute-0 nova_compute[187185]: 2025-11-29 06:57:43.878 187189 DEBUG nova.virt.libvirt.driver [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Starting finish_revert_migration finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11843
Nov 29 06:57:43 compute-0 nova_compute[187185]: 2025-11-29 06:57:43.890 187189 DEBUG nova.virt.libvirt.driver [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 06:57:43 compute-0 nova_compute[187185]: 2025-11-29 06:57:43.896 187189 WARNING nova.virt.libvirt.driver [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:57:43 compute-0 nova_compute[187185]: 2025-11-29 06:57:43.903 187189 DEBUG nova.virt.libvirt.host [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 06:57:43 compute-0 nova_compute[187185]: 2025-11-29 06:57:43.904 187189 DEBUG nova.virt.libvirt.host [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 06:57:43 compute-0 nova_compute[187185]: 2025-11-29 06:57:43.909 187189 DEBUG nova.virt.libvirt.host [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 06:57:43 compute-0 nova_compute[187185]: 2025-11-29 06:57:43.909 187189 DEBUG nova.virt.libvirt.host [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 06:57:43 compute-0 nova_compute[187185]: 2025-11-29 06:57:43.911 187189 DEBUG nova.virt.libvirt.driver [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 06:57:43 compute-0 nova_compute[187185]: 2025-11-29 06:57:43.911 187189 DEBUG nova.virt.hardware [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 06:57:43 compute-0 nova_compute[187185]: 2025-11-29 06:57:43.912 187189 DEBUG nova.virt.hardware [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 06:57:43 compute-0 nova_compute[187185]: 2025-11-29 06:57:43.912 187189 DEBUG nova.virt.hardware [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 06:57:43 compute-0 nova_compute[187185]: 2025-11-29 06:57:43.913 187189 DEBUG nova.virt.hardware [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 06:57:43 compute-0 nova_compute[187185]: 2025-11-29 06:57:43.913 187189 DEBUG nova.virt.hardware [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 06:57:43 compute-0 nova_compute[187185]: 2025-11-29 06:57:43.913 187189 DEBUG nova.virt.hardware [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 06:57:43 compute-0 nova_compute[187185]: 2025-11-29 06:57:43.913 187189 DEBUG nova.virt.hardware [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 06:57:43 compute-0 nova_compute[187185]: 2025-11-29 06:57:43.914 187189 DEBUG nova.virt.hardware [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 06:57:43 compute-0 nova_compute[187185]: 2025-11-29 06:57:43.914 187189 DEBUG nova.virt.hardware [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 06:57:43 compute-0 nova_compute[187185]: 2025-11-29 06:57:43.914 187189 DEBUG nova.virt.hardware [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 06:57:43 compute-0 nova_compute[187185]: 2025-11-29 06:57:43.915 187189 DEBUG nova.virt.hardware [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 06:57:43 compute-0 nova_compute[187185]: 2025-11-29 06:57:43.915 187189 DEBUG nova.objects.instance [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 72856fd1-9e86-48df-817f-42b206cc0bea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:57:43 compute-0 nova_compute[187185]: 2025-11-29 06:57:43.931 187189 DEBUG oslo_concurrency.processutils [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:57:44 compute-0 nova_compute[187185]: 2025-11-29 06:57:44.024 187189 DEBUG oslo_concurrency.processutils [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/disk.config --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:57:44 compute-0 nova_compute[187185]: 2025-11-29 06:57:44.026 187189 DEBUG oslo_concurrency.lockutils [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "/var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:57:44 compute-0 nova_compute[187185]: 2025-11-29 06:57:44.026 187189 DEBUG oslo_concurrency.lockutils [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "/var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:57:44 compute-0 nova_compute[187185]: 2025-11-29 06:57:44.027 187189 DEBUG oslo_concurrency.lockutils [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "/var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:57:44 compute-0 nova_compute[187185]: 2025-11-29 06:57:44.030 187189 DEBUG nova.virt.libvirt.driver [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] End _get_guest_xml xml=<domain type="kvm">
Nov 29 06:57:44 compute-0 nova_compute[187185]:   <uuid>72856fd1-9e86-48df-817f-42b206cc0bea</uuid>
Nov 29 06:57:44 compute-0 nova_compute[187185]:   <name>instance-00000028</name>
Nov 29 06:57:44 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 06:57:44 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 06:57:44 compute-0 nova_compute[187185]:   <metadata>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 06:57:44 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:       <nova:name>tempest-MigrationsAdminTest-server-2041941217</nova:name>
Nov 29 06:57:44 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 06:57:43</nova:creationTime>
Nov 29 06:57:44 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 06:57:44 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 06:57:44 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 06:57:44 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 06:57:44 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 06:57:44 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 06:57:44 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 06:57:44 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 06:57:44 compute-0 nova_compute[187185]:         <nova:user uuid="53ee944c04484336b9b14d84235a62b8">tempest-MigrationsAdminTest-1601255173-project-member</nova:user>
Nov 29 06:57:44 compute-0 nova_compute[187185]:         <nova:project uuid="890f94a625b342fdb17128922403c925">tempest-MigrationsAdminTest-1601255173</nova:project>
Nov 29 06:57:44 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 06:57:44 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:       <nova:ports/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 06:57:44 compute-0 nova_compute[187185]:   </metadata>
Nov 29 06:57:44 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <system>
Nov 29 06:57:44 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 06:57:44 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 06:57:44 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 06:57:44 compute-0 nova_compute[187185]:       <entry name="serial">72856fd1-9e86-48df-817f-42b206cc0bea</entry>
Nov 29 06:57:44 compute-0 nova_compute[187185]:       <entry name="uuid">72856fd1-9e86-48df-817f-42b206cc0bea</entry>
Nov 29 06:57:44 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     </system>
Nov 29 06:57:44 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 06:57:44 compute-0 nova_compute[187185]:   <os>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:   </os>
Nov 29 06:57:44 compute-0 nova_compute[187185]:   <features>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <apic/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:   </features>
Nov 29 06:57:44 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:   </clock>
Nov 29 06:57:44 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:   </cpu>
Nov 29 06:57:44 compute-0 nova_compute[187185]:   <devices>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 06:57:44 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/disk"/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     </disk>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 06:57:44 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/disk.config"/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     </disk>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 06:57:44 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/console.log" append="off"/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     </serial>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <video>
Nov 29 06:57:44 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     </video>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <input type="keyboard" bus="usb"/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 06:57:44 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     </rng>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 06:57:44 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 06:57:44 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 06:57:44 compute-0 nova_compute[187185]:   </devices>
Nov 29 06:57:44 compute-0 nova_compute[187185]: </domain>
Nov 29 06:57:44 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 06:57:44 compute-0 systemd-machined[153486]: New machine qemu-16-instance-00000028.
Nov 29 06:57:44 compute-0 systemd[1]: Started Virtual Machine qemu-16-instance-00000028.
Nov 29 06:57:45 compute-0 nova_compute[187185]: 2025-11-29 06:57:45.873 187189 DEBUG nova.virt.libvirt.host [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Removed pending event for 72856fd1-9e86-48df-817f-42b206cc0bea due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Nov 29 06:57:45 compute-0 nova_compute[187185]: 2025-11-29 06:57:45.874 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399465.8729408, 72856fd1-9e86-48df-817f-42b206cc0bea => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:57:45 compute-0 nova_compute[187185]: 2025-11-29 06:57:45.874 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] VM Resumed (Lifecycle Event)
Nov 29 06:57:45 compute-0 nova_compute[187185]: 2025-11-29 06:57:45.876 187189 DEBUG nova.compute.manager [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 06:57:45 compute-0 nova_compute[187185]: 2025-11-29 06:57:45.882 187189 INFO nova.virt.libvirt.driver [-] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Instance running successfully.
Nov 29 06:57:45 compute-0 nova_compute[187185]: 2025-11-29 06:57:45.882 187189 DEBUG nova.virt.libvirt.driver [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] finish_revert_migration finished successfully. finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11887
Nov 29 06:57:45 compute-0 nova_compute[187185]: 2025-11-29 06:57:45.901 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:57:45 compute-0 nova_compute[187185]: 2025-11-29 06:57:45.910 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 06:57:45 compute-0 nova_compute[187185]: 2025-11-29 06:57:45.950 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] During sync_power_state the instance has a pending task (resize_reverting). Skip.
Nov 29 06:57:45 compute-0 nova_compute[187185]: 2025-11-29 06:57:45.950 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399465.8741987, 72856fd1-9e86-48df-817f-42b206cc0bea => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:57:45 compute-0 nova_compute[187185]: 2025-11-29 06:57:45.951 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] VM Started (Lifecycle Event)
Nov 29 06:57:45 compute-0 nova_compute[187185]: 2025-11-29 06:57:45.970 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:57:45 compute-0 nova_compute[187185]: 2025-11-29 06:57:45.973 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Synchronizing instance power state after lifecycle event "Started"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 06:57:45 compute-0 nova_compute[187185]: 2025-11-29 06:57:45.988 187189 INFO nova.compute.manager [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Updating instance to original state: 'active'
Nov 29 06:57:45 compute-0 nova_compute[187185]: 2025-11-29 06:57:45.995 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] During sync_power_state the instance has a pending task (resize_reverting). Skip.
Nov 29 06:57:46 compute-0 nova_compute[187185]: 2025-11-29 06:57:46.417 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:57:46 compute-0 nova_compute[187185]: 2025-11-29 06:57:46.418 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 06:57:46 compute-0 nova_compute[187185]: 2025-11-29 06:57:46.418 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 06:57:46 compute-0 nova_compute[187185]: 2025-11-29 06:57:46.547 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "refresh_cache-6aebe65a-3191-4d58-acfd-8d663b9b0a8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:57:46 compute-0 nova_compute[187185]: 2025-11-29 06:57:46.547 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquired lock "refresh_cache-6aebe65a-3191-4d58-acfd-8d663b9b0a8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:57:46 compute-0 nova_compute[187185]: 2025-11-29 06:57:46.547 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 29 06:57:46 compute-0 nova_compute[187185]: 2025-11-29 06:57:46.548 187189 DEBUG nova.objects.instance [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 6aebe65a-3191-4d58-acfd-8d663b9b0a8e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:57:46 compute-0 nova_compute[187185]: 2025-11-29 06:57:46.729 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 06:57:47 compute-0 nova_compute[187185]: 2025-11-29 06:57:47.003 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:57:47 compute-0 nova_compute[187185]: 2025-11-29 06:57:47.407 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:57:47 compute-0 nova_compute[187185]: 2025-11-29 06:57:47.430 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Releasing lock "refresh_cache-6aebe65a-3191-4d58-acfd-8d663b9b0a8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:57:47 compute-0 nova_compute[187185]: 2025-11-29 06:57:47.430 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 29 06:57:47 compute-0 nova_compute[187185]: 2025-11-29 06:57:47.431 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:57:47 compute-0 nova_compute[187185]: 2025-11-29 06:57:47.432 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:57:47 compute-0 nova_compute[187185]: 2025-11-29 06:57:47.668 187189 DEBUG oslo_concurrency.lockutils [None req-2bfc7432-bf8c-496d-80c6-1c71cf977697 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "72856fd1-9e86-48df-817f-42b206cc0bea" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:57:47 compute-0 nova_compute[187185]: 2025-11-29 06:57:47.669 187189 DEBUG oslo_concurrency.lockutils [None req-2bfc7432-bf8c-496d-80c6-1c71cf977697 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "72856fd1-9e86-48df-817f-42b206cc0bea" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:57:47 compute-0 nova_compute[187185]: 2025-11-29 06:57:47.670 187189 DEBUG oslo_concurrency.lockutils [None req-2bfc7432-bf8c-496d-80c6-1c71cf977697 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "72856fd1-9e86-48df-817f-42b206cc0bea-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:57:47 compute-0 nova_compute[187185]: 2025-11-29 06:57:47.670 187189 DEBUG oslo_concurrency.lockutils [None req-2bfc7432-bf8c-496d-80c6-1c71cf977697 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "72856fd1-9e86-48df-817f-42b206cc0bea-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:57:47 compute-0 nova_compute[187185]: 2025-11-29 06:57:47.670 187189 DEBUG oslo_concurrency.lockutils [None req-2bfc7432-bf8c-496d-80c6-1c71cf977697 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "72856fd1-9e86-48df-817f-42b206cc0bea-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:57:47 compute-0 nova_compute[187185]: 2025-11-29 06:57:47.683 187189 INFO nova.compute.manager [None req-2bfc7432-bf8c-496d-80c6-1c71cf977697 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Terminating instance
Nov 29 06:57:47 compute-0 nova_compute[187185]: 2025-11-29 06:57:47.702 187189 DEBUG oslo_concurrency.lockutils [None req-2bfc7432-bf8c-496d-80c6-1c71cf977697 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "refresh_cache-72856fd1-9e86-48df-817f-42b206cc0bea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:57:47 compute-0 nova_compute[187185]: 2025-11-29 06:57:47.703 187189 DEBUG oslo_concurrency.lockutils [None req-2bfc7432-bf8c-496d-80c6-1c71cf977697 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquired lock "refresh_cache-72856fd1-9e86-48df-817f-42b206cc0bea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:57:47 compute-0 nova_compute[187185]: 2025-11-29 06:57:47.703 187189 DEBUG nova.network.neutron [None req-2bfc7432-bf8c-496d-80c6-1c71cf977697 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 06:57:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:57:47.846 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 06:57:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:57:47.848 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 06:57:47 compute-0 nova_compute[187185]: 2025-11-29 06:57:47.850 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:57:47 compute-0 nova_compute[187185]: 2025-11-29 06:57:47.859 187189 DEBUG nova.network.neutron [None req-2bfc7432-bf8c-496d-80c6-1c71cf977697 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 06:57:48 compute-0 nova_compute[187185]: 2025-11-29 06:57:48.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:57:48 compute-0 nova_compute[187185]: 2025-11-29 06:57:48.318 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:57:48 compute-0 nova_compute[187185]: 2025-11-29 06:57:48.427 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:57:48 compute-0 nova_compute[187185]: 2025-11-29 06:57:48.519 187189 DEBUG nova.network.neutron [None req-2bfc7432-bf8c-496d-80c6-1c71cf977697 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:57:48 compute-0 nova_compute[187185]: 2025-11-29 06:57:48.673 187189 DEBUG oslo_concurrency.lockutils [None req-2bfc7432-bf8c-496d-80c6-1c71cf977697 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Releasing lock "refresh_cache-72856fd1-9e86-48df-817f-42b206cc0bea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:57:48 compute-0 nova_compute[187185]: 2025-11-29 06:57:48.674 187189 DEBUG nova.compute.manager [None req-2bfc7432-bf8c-496d-80c6-1c71cf977697 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 06:57:48 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000028.scope: Deactivated successfully.
Nov 29 06:57:48 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000028.scope: Consumed 4.112s CPU time.
Nov 29 06:57:48 compute-0 systemd-machined[153486]: Machine qemu-16-instance-00000028 terminated.
Nov 29 06:57:48 compute-0 nova_compute[187185]: 2025-11-29 06:57:48.936 187189 INFO nova.virt.libvirt.driver [-] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Instance destroyed successfully.
Nov 29 06:57:48 compute-0 nova_compute[187185]: 2025-11-29 06:57:48.937 187189 DEBUG nova.objects.instance [None req-2bfc7432-bf8c-496d-80c6-1c71cf977697 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lazy-loading 'resources' on Instance uuid 72856fd1-9e86-48df-817f-42b206cc0bea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:57:48 compute-0 nova_compute[187185]: 2025-11-29 06:57:48.951 187189 INFO nova.virt.libvirt.driver [None req-2bfc7432-bf8c-496d-80c6-1c71cf977697 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Deleting instance files /var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea_del
Nov 29 06:57:48 compute-0 nova_compute[187185]: 2025-11-29 06:57:48.958 187189 INFO nova.virt.libvirt.driver [None req-2bfc7432-bf8c-496d-80c6-1c71cf977697 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Deletion of /var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea_del complete
Nov 29 06:57:49 compute-0 nova_compute[187185]: 2025-11-29 06:57:49.037 187189 INFO nova.compute.manager [None req-2bfc7432-bf8c-496d-80c6-1c71cf977697 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Took 0.36 seconds to destroy the instance on the hypervisor.
Nov 29 06:57:49 compute-0 nova_compute[187185]: 2025-11-29 06:57:49.038 187189 DEBUG oslo.service.loopingcall [None req-2bfc7432-bf8c-496d-80c6-1c71cf977697 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 06:57:49 compute-0 nova_compute[187185]: 2025-11-29 06:57:49.038 187189 DEBUG nova.compute.manager [-] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 06:57:49 compute-0 nova_compute[187185]: 2025-11-29 06:57:49.039 187189 DEBUG nova.network.neutron [-] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 06:57:49 compute-0 nova_compute[187185]: 2025-11-29 06:57:49.214 187189 DEBUG nova.network.neutron [-] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 06:57:49 compute-0 nova_compute[187185]: 2025-11-29 06:57:49.238 187189 DEBUG nova.network.neutron [-] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:57:49 compute-0 nova_compute[187185]: 2025-11-29 06:57:49.254 187189 INFO nova.compute.manager [-] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Took 0.22 seconds to deallocate network for instance.
Nov 29 06:57:49 compute-0 nova_compute[187185]: 2025-11-29 06:57:49.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:57:49 compute-0 nova_compute[187185]: 2025-11-29 06:57:49.333 187189 DEBUG oslo_concurrency.lockutils [None req-2bfc7432-bf8c-496d-80c6-1c71cf977697 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:57:49 compute-0 nova_compute[187185]: 2025-11-29 06:57:49.334 187189 DEBUG oslo_concurrency.lockutils [None req-2bfc7432-bf8c-496d-80c6-1c71cf977697 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:57:49 compute-0 nova_compute[187185]: 2025-11-29 06:57:49.338 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:57:49 compute-0 nova_compute[187185]: 2025-11-29 06:57:49.410 187189 DEBUG nova.compute.provider_tree [None req-2bfc7432-bf8c-496d-80c6-1c71cf977697 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:57:49 compute-0 nova_compute[187185]: 2025-11-29 06:57:49.429 187189 DEBUG nova.scheduler.client.report [None req-2bfc7432-bf8c-496d-80c6-1c71cf977697 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:57:49 compute-0 nova_compute[187185]: 2025-11-29 06:57:49.449 187189 DEBUG oslo_concurrency.lockutils [None req-2bfc7432-bf8c-496d-80c6-1c71cf977697 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:57:49 compute-0 nova_compute[187185]: 2025-11-29 06:57:49.451 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:57:49 compute-0 nova_compute[187185]: 2025-11-29 06:57:49.451 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:57:49 compute-0 nova_compute[187185]: 2025-11-29 06:57:49.452 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 06:57:49 compute-0 nova_compute[187185]: 2025-11-29 06:57:49.497 187189 INFO nova.scheduler.client.report [None req-2bfc7432-bf8c-496d-80c6-1c71cf977697 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Deleted allocations for instance 72856fd1-9e86-48df-817f-42b206cc0bea
Nov 29 06:57:49 compute-0 nova_compute[187185]: 2025-11-29 06:57:49.532 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:57:49 compute-0 podman[219319]: 2025-11-29 06:57:49.579250963 +0000 UTC m=+0.070824459 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Nov 29 06:57:49 compute-0 nova_compute[187185]: 2025-11-29 06:57:49.578 187189 DEBUG oslo_concurrency.lockutils [None req-2bfc7432-bf8c-496d-80c6-1c71cf977697 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "72856fd1-9e86-48df-817f-42b206cc0bea" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.909s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:57:49 compute-0 nova_compute[187185]: 2025-11-29 06:57:49.614 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:57:49 compute-0 nova_compute[187185]: 2025-11-29 06:57:49.615 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:57:49 compute-0 nova_compute[187185]: 2025-11-29 06:57:49.691 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:57:49 compute-0 nova_compute[187185]: 2025-11-29 06:57:49.845 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:57:49 compute-0 nova_compute[187185]: 2025-11-29 06:57:49.847 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5593MB free_disk=73.3095588684082GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 06:57:49 compute-0 nova_compute[187185]: 2025-11-29 06:57:49.847 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:57:49 compute-0 nova_compute[187185]: 2025-11-29 06:57:49.847 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:57:49 compute-0 nova_compute[187185]: 2025-11-29 06:57:49.939 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance 6aebe65a-3191-4d58-acfd-8d663b9b0a8e actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 06:57:49 compute-0 nova_compute[187185]: 2025-11-29 06:57:49.941 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 06:57:49 compute-0 nova_compute[187185]: 2025-11-29 06:57:49.941 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 06:57:49 compute-0 nova_compute[187185]: 2025-11-29 06:57:49.996 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:57:50 compute-0 nova_compute[187185]: 2025-11-29 06:57:50.020 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:57:50 compute-0 nova_compute[187185]: 2025-11-29 06:57:50.047 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 06:57:50 compute-0 nova_compute[187185]: 2025-11-29 06:57:50.047 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.200s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:57:50 compute-0 nova_compute[187185]: 2025-11-29 06:57:50.581 187189 DEBUG oslo_concurrency.lockutils [None req-d878a004-9700-4f41-ab8a-278d001c95f3 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "6aebe65a-3191-4d58-acfd-8d663b9b0a8e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:57:50 compute-0 nova_compute[187185]: 2025-11-29 06:57:50.582 187189 DEBUG oslo_concurrency.lockutils [None req-d878a004-9700-4f41-ab8a-278d001c95f3 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "6aebe65a-3191-4d58-acfd-8d663b9b0a8e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:57:50 compute-0 nova_compute[187185]: 2025-11-29 06:57:50.582 187189 DEBUG oslo_concurrency.lockutils [None req-d878a004-9700-4f41-ab8a-278d001c95f3 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "6aebe65a-3191-4d58-acfd-8d663b9b0a8e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:57:50 compute-0 nova_compute[187185]: 2025-11-29 06:57:50.583 187189 DEBUG oslo_concurrency.lockutils [None req-d878a004-9700-4f41-ab8a-278d001c95f3 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "6aebe65a-3191-4d58-acfd-8d663b9b0a8e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:57:50 compute-0 nova_compute[187185]: 2025-11-29 06:57:50.583 187189 DEBUG oslo_concurrency.lockutils [None req-d878a004-9700-4f41-ab8a-278d001c95f3 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "6aebe65a-3191-4d58-acfd-8d663b9b0a8e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:57:50 compute-0 nova_compute[187185]: 2025-11-29 06:57:50.603 187189 INFO nova.compute.manager [None req-d878a004-9700-4f41-ab8a-278d001c95f3 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Terminating instance
Nov 29 06:57:50 compute-0 nova_compute[187185]: 2025-11-29 06:57:50.617 187189 DEBUG oslo_concurrency.lockutils [None req-d878a004-9700-4f41-ab8a-278d001c95f3 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "refresh_cache-6aebe65a-3191-4d58-acfd-8d663b9b0a8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:57:50 compute-0 nova_compute[187185]: 2025-11-29 06:57:50.617 187189 DEBUG oslo_concurrency.lockutils [None req-d878a004-9700-4f41-ab8a-278d001c95f3 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquired lock "refresh_cache-6aebe65a-3191-4d58-acfd-8d663b9b0a8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:57:50 compute-0 nova_compute[187185]: 2025-11-29 06:57:50.618 187189 DEBUG nova.network.neutron [None req-d878a004-9700-4f41-ab8a-278d001c95f3 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 06:57:50 compute-0 nova_compute[187185]: 2025-11-29 06:57:50.977 187189 DEBUG nova.network.neutron [None req-d878a004-9700-4f41-ab8a-278d001c95f3 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 06:57:51 compute-0 nova_compute[187185]: 2025-11-29 06:57:51.420 187189 DEBUG nova.network.neutron [None req-d878a004-9700-4f41-ab8a-278d001c95f3 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:57:51 compute-0 nova_compute[187185]: 2025-11-29 06:57:51.441 187189 DEBUG oslo_concurrency.lockutils [None req-d878a004-9700-4f41-ab8a-278d001c95f3 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Releasing lock "refresh_cache-6aebe65a-3191-4d58-acfd-8d663b9b0a8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:57:51 compute-0 nova_compute[187185]: 2025-11-29 06:57:51.442 187189 DEBUG nova.compute.manager [None req-d878a004-9700-4f41-ab8a-278d001c95f3 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 06:57:52 compute-0 nova_compute[187185]: 2025-11-29 06:57:52.007 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:57:52 compute-0 nova_compute[187185]: 2025-11-29 06:57:52.043 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:57:52 compute-0 nova_compute[187185]: 2025-11-29 06:57:52.789 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:57:52 compute-0 nova_compute[187185]: 2025-11-29 06:57:52.790 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:57:52 compute-0 nova_compute[187185]: 2025-11-29 06:57:52.790 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:57:52 compute-0 nova_compute[187185]: 2025-11-29 06:57:52.790 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 06:57:52 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000024.scope: Deactivated successfully.
Nov 29 06:57:52 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000024.scope: Consumed 16.493s CPU time.
Nov 29 06:57:52 compute-0 systemd-machined[153486]: Machine qemu-14-instance-00000024 terminated.
Nov 29 06:57:53 compute-0 nova_compute[187185]: 2025-11-29 06:57:53.091 187189 INFO nova.virt.libvirt.driver [-] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Instance destroyed successfully.
Nov 29 06:57:53 compute-0 nova_compute[187185]: 2025-11-29 06:57:53.092 187189 DEBUG nova.objects.instance [None req-d878a004-9700-4f41-ab8a-278d001c95f3 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lazy-loading 'resources' on Instance uuid 6aebe65a-3191-4d58-acfd-8d663b9b0a8e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:57:53 compute-0 nova_compute[187185]: 2025-11-29 06:57:53.115 187189 INFO nova.virt.libvirt.driver [None req-d878a004-9700-4f41-ab8a-278d001c95f3 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Deleting instance files /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e_del
Nov 29 06:57:53 compute-0 nova_compute[187185]: 2025-11-29 06:57:53.121 187189 INFO nova.virt.libvirt.driver [None req-d878a004-9700-4f41-ab8a-278d001c95f3 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Deletion of /var/lib/nova/instances/6aebe65a-3191-4d58-acfd-8d663b9b0a8e_del complete
Nov 29 06:57:53 compute-0 nova_compute[187185]: 2025-11-29 06:57:53.211 187189 INFO nova.compute.manager [None req-d878a004-9700-4f41-ab8a-278d001c95f3 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Took 1.77 seconds to destroy the instance on the hypervisor.
Nov 29 06:57:53 compute-0 nova_compute[187185]: 2025-11-29 06:57:53.212 187189 DEBUG oslo.service.loopingcall [None req-d878a004-9700-4f41-ab8a-278d001c95f3 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 06:57:53 compute-0 nova_compute[187185]: 2025-11-29 06:57:53.212 187189 DEBUG nova.compute.manager [-] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 06:57:53 compute-0 nova_compute[187185]: 2025-11-29 06:57:53.212 187189 DEBUG nova.network.neutron [-] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 06:57:53 compute-0 nova_compute[187185]: 2025-11-29 06:57:53.414 187189 DEBUG nova.network.neutron [-] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 06:57:53 compute-0 nova_compute[187185]: 2025-11-29 06:57:53.428 187189 DEBUG nova.network.neutron [-] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:57:53 compute-0 nova_compute[187185]: 2025-11-29 06:57:53.431 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:57:53 compute-0 nova_compute[187185]: 2025-11-29 06:57:53.448 187189 INFO nova.compute.manager [-] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Took 0.24 seconds to deallocate network for instance.
Nov 29 06:57:53 compute-0 nova_compute[187185]: 2025-11-29 06:57:53.521 187189 DEBUG oslo_concurrency.lockutils [None req-d878a004-9700-4f41-ab8a-278d001c95f3 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:57:53 compute-0 nova_compute[187185]: 2025-11-29 06:57:53.522 187189 DEBUG oslo_concurrency.lockutils [None req-d878a004-9700-4f41-ab8a-278d001c95f3 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:57:53 compute-0 nova_compute[187185]: 2025-11-29 06:57:53.642 187189 DEBUG nova.compute.provider_tree [None req-d878a004-9700-4f41-ab8a-278d001c95f3 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:57:53 compute-0 nova_compute[187185]: 2025-11-29 06:57:53.662 187189 DEBUG nova.scheduler.client.report [None req-d878a004-9700-4f41-ab8a-278d001c95f3 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:57:53 compute-0 nova_compute[187185]: 2025-11-29 06:57:53.697 187189 DEBUG oslo_concurrency.lockutils [None req-d878a004-9700-4f41-ab8a-278d001c95f3 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:57:53 compute-0 nova_compute[187185]: 2025-11-29 06:57:53.728 187189 INFO nova.scheduler.client.report [None req-d878a004-9700-4f41-ab8a-278d001c95f3 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Deleted allocations for instance 6aebe65a-3191-4d58-acfd-8d663b9b0a8e
Nov 29 06:57:53 compute-0 nova_compute[187185]: 2025-11-29 06:57:53.818 187189 DEBUG oslo_concurrency.lockutils [None req-d878a004-9700-4f41-ab8a-278d001c95f3 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "6aebe65a-3191-4d58-acfd-8d663b9b0a8e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.237s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:57:57 compute-0 nova_compute[187185]: 2025-11-29 06:57:57.011 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:57:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:57:57.851 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:57:58 compute-0 nova_compute[187185]: 2025-11-29 06:57:58.428 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:58:01 compute-0 podman[219358]: 2025-11-29 06:58:01.809021923 +0000 UTC m=+0.063432452 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., architecture=x86_64, name=ubi9-minimal, config_id=edpm, release=1755695350, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 
'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 29 06:58:01 compute-0 podman[219357]: 2025-11-29 06:58:01.821769589 +0000 UTC m=+0.082135665 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:58:01 compute-0 podman[219359]: 2025-11-29 06:58:01.831999425 +0000 UTC m=+0.085841098 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 06:58:02 compute-0 nova_compute[187185]: 2025-11-29 06:58:02.015 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:58:03 compute-0 nova_compute[187185]: 2025-11-29 06:58:03.430 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:58:03 compute-0 nova_compute[187185]: 2025-11-29 06:58:03.934 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399468.933674, 72856fd1-9e86-48df-817f-42b206cc0bea => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:58:03 compute-0 nova_compute[187185]: 2025-11-29 06:58:03.935 187189 INFO nova.compute.manager [-] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] VM Stopped (Lifecycle Event)
Nov 29 06:58:03 compute-0 nova_compute[187185]: 2025-11-29 06:58:03.958 187189 DEBUG nova.compute.manager [None req-2fe5525f-e656-4933-91e5-dbb9cb328678 - - - - - -] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:58:07 compute-0 nova_compute[187185]: 2025-11-29 06:58:07.017 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:58:07 compute-0 nova_compute[187185]: 2025-11-29 06:58:07.117 187189 DEBUG oslo_concurrency.lockutils [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Acquiring lock "3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:58:07 compute-0 nova_compute[187185]: 2025-11-29 06:58:07.118 187189 DEBUG oslo_concurrency.lockutils [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Lock "3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:58:07 compute-0 nova_compute[187185]: 2025-11-29 06:58:07.138 187189 DEBUG nova.compute.manager [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 06:58:07 compute-0 nova_compute[187185]: 2025-11-29 06:58:07.245 187189 DEBUG oslo_concurrency.lockutils [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:58:07 compute-0 nova_compute[187185]: 2025-11-29 06:58:07.246 187189 DEBUG oslo_concurrency.lockutils [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:58:07 compute-0 nova_compute[187185]: 2025-11-29 06:58:07.256 187189 DEBUG nova.virt.hardware [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 06:58:07 compute-0 nova_compute[187185]: 2025-11-29 06:58:07.256 187189 INFO nova.compute.claims [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Claim successful on node compute-0.ctlplane.example.com
Nov 29 06:58:07 compute-0 nova_compute[187185]: 2025-11-29 06:58:07.389 187189 DEBUG nova.compute.provider_tree [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:58:07 compute-0 nova_compute[187185]: 2025-11-29 06:58:07.455 187189 DEBUG nova.scheduler.client.report [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:58:07 compute-0 nova_compute[187185]: 2025-11-29 06:58:07.483 187189 DEBUG oslo_concurrency.lockutils [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.236s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:58:07 compute-0 nova_compute[187185]: 2025-11-29 06:58:07.483 187189 DEBUG nova.compute.manager [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 06:58:07 compute-0 nova_compute[187185]: 2025-11-29 06:58:07.551 187189 DEBUG nova.compute.manager [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 06:58:07 compute-0 nova_compute[187185]: 2025-11-29 06:58:07.551 187189 DEBUG nova.network.neutron [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 06:58:07 compute-0 nova_compute[187185]: 2025-11-29 06:58:07.568 187189 INFO nova.virt.libvirt.driver [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 06:58:07 compute-0 nova_compute[187185]: 2025-11-29 06:58:07.584 187189 DEBUG nova.compute.manager [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 06:58:07 compute-0 nova_compute[187185]: 2025-11-29 06:58:07.728 187189 DEBUG nova.compute.manager [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 06:58:07 compute-0 nova_compute[187185]: 2025-11-29 06:58:07.730 187189 DEBUG nova.virt.libvirt.driver [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 06:58:07 compute-0 nova_compute[187185]: 2025-11-29 06:58:07.731 187189 INFO nova.virt.libvirt.driver [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Creating image(s)
Nov 29 06:58:07 compute-0 nova_compute[187185]: 2025-11-29 06:58:07.732 187189 DEBUG oslo_concurrency.lockutils [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Acquiring lock "/var/lib/nova/instances/3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:58:07 compute-0 nova_compute[187185]: 2025-11-29 06:58:07.732 187189 DEBUG oslo_concurrency.lockutils [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Lock "/var/lib/nova/instances/3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:58:07 compute-0 nova_compute[187185]: 2025-11-29 06:58:07.733 187189 DEBUG oslo_concurrency.lockutils [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Lock "/var/lib/nova/instances/3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:58:07 compute-0 nova_compute[187185]: 2025-11-29 06:58:07.760 187189 DEBUG oslo_concurrency.processutils [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:58:07 compute-0 nova_compute[187185]: 2025-11-29 06:58:07.829 187189 DEBUG nova.policy [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1472c12f52724fe5aae37390978dd8fe', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c25ab1655ced493998b50733e2d514fc', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 06:58:07 compute-0 nova_compute[187185]: 2025-11-29 06:58:07.865 187189 DEBUG oslo_concurrency.processutils [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.105s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:58:07 compute-0 nova_compute[187185]: 2025-11-29 06:58:07.867 187189 DEBUG oslo_concurrency.lockutils [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:58:07 compute-0 nova_compute[187185]: 2025-11-29 06:58:07.868 187189 DEBUG oslo_concurrency.lockutils [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:58:07 compute-0 nova_compute[187185]: 2025-11-29 06:58:07.884 187189 DEBUG oslo_concurrency.processutils [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:58:07 compute-0 podman[219421]: 2025-11-29 06:58:07.888060869 +0000 UTC m=+0.150627648 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 06:58:07 compute-0 nova_compute[187185]: 2025-11-29 06:58:07.948 187189 DEBUG oslo_concurrency.processutils [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:58:07 compute-0 nova_compute[187185]: 2025-11-29 06:58:07.950 187189 DEBUG oslo_concurrency.processutils [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:58:08 compute-0 nova_compute[187185]: 2025-11-29 06:58:08.090 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399473.0894973, 6aebe65a-3191-4d58-acfd-8d663b9b0a8e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:58:08 compute-0 nova_compute[187185]: 2025-11-29 06:58:08.091 187189 INFO nova.compute.manager [-] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] VM Stopped (Lifecycle Event)
Nov 29 06:58:08 compute-0 nova_compute[187185]: 2025-11-29 06:58:08.118 187189 DEBUG nova.compute.manager [None req-36c832c5-52b8-409d-88ec-6b7437df98c9 - - - - - -] [instance: 6aebe65a-3191-4d58-acfd-8d663b9b0a8e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:58:08 compute-0 nova_compute[187185]: 2025-11-29 06:58:08.434 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:58:08 compute-0 nova_compute[187185]: 2025-11-29 06:58:08.864 187189 DEBUG nova.network.neutron [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Successfully created port: b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 06:58:10 compute-0 nova_compute[187185]: 2025-11-29 06:58:10.123 187189 DEBUG oslo_concurrency.processutils [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/disk 1073741824" returned: 0 in 2.173s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:58:10 compute-0 nova_compute[187185]: 2025-11-29 06:58:10.125 187189 DEBUG oslo_concurrency.lockutils [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 2.256s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:58:10 compute-0 nova_compute[187185]: 2025-11-29 06:58:10.125 187189 DEBUG oslo_concurrency.processutils [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:58:10 compute-0 nova_compute[187185]: 2025-11-29 06:58:10.217 187189 DEBUG oslo_concurrency.processutils [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:58:10 compute-0 nova_compute[187185]: 2025-11-29 06:58:10.219 187189 DEBUG nova.virt.disk.api [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Checking if we can resize image /var/lib/nova/instances/3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 06:58:10 compute-0 nova_compute[187185]: 2025-11-29 06:58:10.220 187189 DEBUG oslo_concurrency.processutils [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:58:10 compute-0 nova_compute[187185]: 2025-11-29 06:58:10.295 187189 DEBUG oslo_concurrency.processutils [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:58:10 compute-0 nova_compute[187185]: 2025-11-29 06:58:10.297 187189 DEBUG nova.virt.disk.api [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Cannot resize image /var/lib/nova/instances/3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 06:58:10 compute-0 nova_compute[187185]: 2025-11-29 06:58:10.298 187189 DEBUG nova.objects.instance [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Lazy-loading 'migration_context' on Instance uuid 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:58:10 compute-0 nova_compute[187185]: 2025-11-29 06:58:10.343 187189 DEBUG nova.virt.libvirt.driver [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 06:58:10 compute-0 nova_compute[187185]: 2025-11-29 06:58:10.343 187189 DEBUG nova.virt.libvirt.driver [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Ensure instance console log exists: /var/lib/nova/instances/3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 06:58:10 compute-0 nova_compute[187185]: 2025-11-29 06:58:10.344 187189 DEBUG oslo_concurrency.lockutils [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:58:10 compute-0 nova_compute[187185]: 2025-11-29 06:58:10.345 187189 DEBUG oslo_concurrency.lockutils [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:58:10 compute-0 nova_compute[187185]: 2025-11-29 06:58:10.346 187189 DEBUG oslo_concurrency.lockutils [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:58:10 compute-0 podman[219465]: 2025-11-29 06:58:10.830796409 +0000 UTC m=+0.071741395 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 06:58:10 compute-0 podman[219464]: 2025-11-29 06:58:10.847431643 +0000 UTC m=+0.093977645 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 06:58:11 compute-0 nova_compute[187185]: 2025-11-29 06:58:11.462 187189 DEBUG nova.network.neutron [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Successfully updated port: b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 06:58:11 compute-0 nova_compute[187185]: 2025-11-29 06:58:11.487 187189 DEBUG oslo_concurrency.lockutils [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Acquiring lock "refresh_cache-3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:58:11 compute-0 nova_compute[187185]: 2025-11-29 06:58:11.488 187189 DEBUG oslo_concurrency.lockutils [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Acquired lock "refresh_cache-3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:58:11 compute-0 nova_compute[187185]: 2025-11-29 06:58:11.488 187189 DEBUG nova.network.neutron [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 06:58:12 compute-0 nova_compute[187185]: 2025-11-29 06:58:12.021 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:58:12 compute-0 nova_compute[187185]: 2025-11-29 06:58:12.464 187189 DEBUG nova.network.neutron [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 06:58:12 compute-0 nova_compute[187185]: 2025-11-29 06:58:12.690 187189 DEBUG nova.compute.manager [req-5fdc4d1c-2326-4657-b989-f6a146418e34 req-6f8145f1-32ed-41f5-b068-9f191e7dcbf6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Received event network-changed-b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:58:12 compute-0 nova_compute[187185]: 2025-11-29 06:58:12.691 187189 DEBUG nova.compute.manager [req-5fdc4d1c-2326-4657-b989-f6a146418e34 req-6f8145f1-32ed-41f5-b068-9f191e7dcbf6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Refreshing instance network info cache due to event network-changed-b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 06:58:12 compute-0 nova_compute[187185]: 2025-11-29 06:58:12.691 187189 DEBUG oslo_concurrency.lockutils [req-5fdc4d1c-2326-4657-b989-f6a146418e34 req-6f8145f1-32ed-41f5-b068-9f191e7dcbf6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:58:13 compute-0 nova_compute[187185]: 2025-11-29 06:58:13.493 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:58:13 compute-0 nova_compute[187185]: 2025-11-29 06:58:13.897 187189 DEBUG nova.network.neutron [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Updating instance_info_cache with network_info: [{"id": "b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e", "address": "fa:16:3e:e6:38:dc", "network": {"id": "5c312f91-84b2-4c6f-94ed-87030fed964b", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-218404435-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c25ab1655ced493998b50733e2d514fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb9ec3ba2-52", "ovs_interfaceid": "b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:58:13 compute-0 nova_compute[187185]: 2025-11-29 06:58:13.920 187189 DEBUG oslo_concurrency.lockutils [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Releasing lock "refresh_cache-3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:58:13 compute-0 nova_compute[187185]: 2025-11-29 06:58:13.920 187189 DEBUG nova.compute.manager [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Instance network_info: |[{"id": "b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e", "address": "fa:16:3e:e6:38:dc", "network": {"id": "5c312f91-84b2-4c6f-94ed-87030fed964b", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-218404435-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c25ab1655ced493998b50733e2d514fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb9ec3ba2-52", "ovs_interfaceid": "b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 06:58:13 compute-0 nova_compute[187185]: 2025-11-29 06:58:13.921 187189 DEBUG oslo_concurrency.lockutils [req-5fdc4d1c-2326-4657-b989-f6a146418e34 req-6f8145f1-32ed-41f5-b068-9f191e7dcbf6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:58:13 compute-0 nova_compute[187185]: 2025-11-29 06:58:13.922 187189 DEBUG nova.network.neutron [req-5fdc4d1c-2326-4657-b989-f6a146418e34 req-6f8145f1-32ed-41f5-b068-9f191e7dcbf6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Refreshing network info cache for port b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 06:58:13 compute-0 nova_compute[187185]: 2025-11-29 06:58:13.928 187189 DEBUG nova.virt.libvirt.driver [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Start _get_guest_xml network_info=[{"id": "b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e", "address": "fa:16:3e:e6:38:dc", "network": {"id": "5c312f91-84b2-4c6f-94ed-87030fed964b", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-218404435-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c25ab1655ced493998b50733e2d514fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb9ec3ba2-52", "ovs_interfaceid": "b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 06:58:13 compute-0 nova_compute[187185]: 2025-11-29 06:58:13.936 187189 WARNING nova.virt.libvirt.driver [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:58:13 compute-0 nova_compute[187185]: 2025-11-29 06:58:13.943 187189 DEBUG nova.virt.libvirt.host [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 06:58:13 compute-0 nova_compute[187185]: 2025-11-29 06:58:13.944 187189 DEBUG nova.virt.libvirt.host [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 06:58:13 compute-0 nova_compute[187185]: 2025-11-29 06:58:13.961 187189 DEBUG nova.virt.libvirt.host [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 06:58:13 compute-0 nova_compute[187185]: 2025-11-29 06:58:13.962 187189 DEBUG nova.virt.libvirt.host [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 06:58:13 compute-0 nova_compute[187185]: 2025-11-29 06:58:13.965 187189 DEBUG nova.virt.libvirt.driver [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 06:58:13 compute-0 nova_compute[187185]: 2025-11-29 06:58:13.966 187189 DEBUG nova.virt.hardware [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 06:58:13 compute-0 nova_compute[187185]: 2025-11-29 06:58:13.966 187189 DEBUG nova.virt.hardware [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 06:58:13 compute-0 nova_compute[187185]: 2025-11-29 06:58:13.967 187189 DEBUG nova.virt.hardware [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 06:58:13 compute-0 nova_compute[187185]: 2025-11-29 06:58:13.967 187189 DEBUG nova.virt.hardware [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 06:58:13 compute-0 nova_compute[187185]: 2025-11-29 06:58:13.968 187189 DEBUG nova.virt.hardware [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 06:58:13 compute-0 nova_compute[187185]: 2025-11-29 06:58:13.968 187189 DEBUG nova.virt.hardware [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 06:58:13 compute-0 nova_compute[187185]: 2025-11-29 06:58:13.969 187189 DEBUG nova.virt.hardware [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 06:58:13 compute-0 nova_compute[187185]: 2025-11-29 06:58:13.969 187189 DEBUG nova.virt.hardware [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 06:58:13 compute-0 nova_compute[187185]: 2025-11-29 06:58:13.970 187189 DEBUG nova.virt.hardware [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 06:58:13 compute-0 nova_compute[187185]: 2025-11-29 06:58:13.970 187189 DEBUG nova.virt.hardware [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 06:58:13 compute-0 nova_compute[187185]: 2025-11-29 06:58:13.971 187189 DEBUG nova.virt.hardware [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 06:58:13 compute-0 nova_compute[187185]: 2025-11-29 06:58:13.978 187189 DEBUG nova.virt.libvirt.vif [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:58:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestManualDisk-server-1088610341',display_name='tempest-ServersTestManualDisk-server-1088610341',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmanualdisk-server-1088610341',id=44,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMEM6sFLOUOrNh6eIgURmzcYIUeOV5JocPOhQ5sp4OUldoYBnfHHW9kH+GzZIRDjuYfhnkJMZvDdFm6zREoS69pxxAvYkbzTFiySaqfRHd87bUOgR0mLJJIq/7DNCsg0Lw==',key_name='tempest-keypair-1082752568',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c25ab1655ced493998b50733e2d514fc',ramdisk_id='',reservation_id='r-kuffkjmg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestManualDisk-1726842339',owner_user_name='tempest-ServersTestManualDisk-1726842339-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:58:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1472c12f52724fe5aae37390978dd8fe',uuid=3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e", "address": "fa:16:3e:e6:38:dc", "network": {"id": "5c312f91-84b2-4c6f-94ed-87030fed964b", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-218404435-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c25ab1655ced493998b50733e2d514fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb9ec3ba2-52", "ovs_interfaceid": "b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 06:58:13 compute-0 nova_compute[187185]: 2025-11-29 06:58:13.979 187189 DEBUG nova.network.os_vif_util [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Converting VIF {"id": "b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e", "address": "fa:16:3e:e6:38:dc", "network": {"id": "5c312f91-84b2-4c6f-94ed-87030fed964b", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-218404435-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c25ab1655ced493998b50733e2d514fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb9ec3ba2-52", "ovs_interfaceid": "b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 06:58:13 compute-0 nova_compute[187185]: 2025-11-29 06:58:13.980 187189 DEBUG nova.network.os_vif_util [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e6:38:dc,bridge_name='br-int',has_traffic_filtering=True,id=b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e,network=Network(5c312f91-84b2-4c6f-94ed-87030fed964b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb9ec3ba2-52') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 06:58:13 compute-0 nova_compute[187185]: 2025-11-29 06:58:13.982 187189 DEBUG nova.objects.instance [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Lazy-loading 'pci_devices' on Instance uuid 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:58:14 compute-0 nova_compute[187185]: 2025-11-29 06:58:14.005 187189 DEBUG nova.virt.libvirt.driver [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] End _get_guest_xml xml=<domain type="kvm">
Nov 29 06:58:14 compute-0 nova_compute[187185]:   <uuid>3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a</uuid>
Nov 29 06:58:14 compute-0 nova_compute[187185]:   <name>instance-0000002c</name>
Nov 29 06:58:14 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 06:58:14 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 06:58:14 compute-0 nova_compute[187185]:   <metadata>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 06:58:14 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:       <nova:name>tempest-ServersTestManualDisk-server-1088610341</nova:name>
Nov 29 06:58:14 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 06:58:13</nova:creationTime>
Nov 29 06:58:14 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 06:58:14 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 06:58:14 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 06:58:14 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 06:58:14 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 06:58:14 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 06:58:14 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 06:58:14 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 06:58:14 compute-0 nova_compute[187185]:         <nova:user uuid="1472c12f52724fe5aae37390978dd8fe">tempest-ServersTestManualDisk-1726842339-project-member</nova:user>
Nov 29 06:58:14 compute-0 nova_compute[187185]:         <nova:project uuid="c25ab1655ced493998b50733e2d514fc">tempest-ServersTestManualDisk-1726842339</nova:project>
Nov 29 06:58:14 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 06:58:14 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 06:58:14 compute-0 nova_compute[187185]:         <nova:port uuid="b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e">
Nov 29 06:58:14 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 06:58:14 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 06:58:14 compute-0 nova_compute[187185]:   </metadata>
Nov 29 06:58:14 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <system>
Nov 29 06:58:14 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 06:58:14 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 06:58:14 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 06:58:14 compute-0 nova_compute[187185]:       <entry name="serial">3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a</entry>
Nov 29 06:58:14 compute-0 nova_compute[187185]:       <entry name="uuid">3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a</entry>
Nov 29 06:58:14 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     </system>
Nov 29 06:58:14 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 06:58:14 compute-0 nova_compute[187185]:   <os>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:   </os>
Nov 29 06:58:14 compute-0 nova_compute[187185]:   <features>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <apic/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:   </features>
Nov 29 06:58:14 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:   </clock>
Nov 29 06:58:14 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:   </cpu>
Nov 29 06:58:14 compute-0 nova_compute[187185]:   <devices>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 06:58:14 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/disk"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     </disk>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 06:58:14 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/disk.config"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     </disk>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 06:58:14 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:e6:38:dc"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:       <target dev="tapb9ec3ba2-52"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     </interface>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 06:58:14 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/console.log" append="off"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     </serial>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <video>
Nov 29 06:58:14 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     </video>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 06:58:14 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     </rng>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 06:58:14 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 06:58:14 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 06:58:14 compute-0 nova_compute[187185]:   </devices>
Nov 29 06:58:14 compute-0 nova_compute[187185]: </domain>
Nov 29 06:58:14 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 06:58:14 compute-0 nova_compute[187185]: 2025-11-29 06:58:14.007 187189 DEBUG nova.compute.manager [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Preparing to wait for external event network-vif-plugged-b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 06:58:14 compute-0 nova_compute[187185]: 2025-11-29 06:58:14.008 187189 DEBUG oslo_concurrency.lockutils [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Acquiring lock "3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:58:14 compute-0 nova_compute[187185]: 2025-11-29 06:58:14.008 187189 DEBUG oslo_concurrency.lockutils [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Lock "3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:58:14 compute-0 nova_compute[187185]: 2025-11-29 06:58:14.008 187189 DEBUG oslo_concurrency.lockutils [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Lock "3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:58:14 compute-0 nova_compute[187185]: 2025-11-29 06:58:14.009 187189 DEBUG nova.virt.libvirt.vif [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:58:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestManualDisk-server-1088610341',display_name='tempest-ServersTestManualDisk-server-1088610341',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmanualdisk-server-1088610341',id=44,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMEM6sFLOUOrNh6eIgURmzcYIUeOV5JocPOhQ5sp4OUldoYBnfHHW9kH+GzZIRDjuYfhnkJMZvDdFm6zREoS69pxxAvYkbzTFiySaqfRHd87bUOgR0mLJJIq/7DNCsg0Lw==',key_name='tempest-keypair-1082752568',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c25ab1655ced493998b50733e2d514fc',ramdisk_id='',reservation_id='r-kuffkjmg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestManualDisk-1726842339',owner_user_name='tempest-ServersTestManualDisk-1726842339-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:58:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1472c12f52724fe5aae37390978dd8fe',uuid=3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e", "address": "fa:16:3e:e6:38:dc", "network": {"id": "5c312f91-84b2-4c6f-94ed-87030fed964b", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-218404435-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c25ab1655ced493998b50733e2d514fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb9ec3ba2-52", "ovs_interfaceid": "b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 06:58:14 compute-0 nova_compute[187185]: 2025-11-29 06:58:14.010 187189 DEBUG nova.network.os_vif_util [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Converting VIF {"id": "b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e", "address": "fa:16:3e:e6:38:dc", "network": {"id": "5c312f91-84b2-4c6f-94ed-87030fed964b", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-218404435-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c25ab1655ced493998b50733e2d514fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb9ec3ba2-52", "ovs_interfaceid": "b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 06:58:14 compute-0 nova_compute[187185]: 2025-11-29 06:58:14.011 187189 DEBUG nova.network.os_vif_util [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e6:38:dc,bridge_name='br-int',has_traffic_filtering=True,id=b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e,network=Network(5c312f91-84b2-4c6f-94ed-87030fed964b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb9ec3ba2-52') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 06:58:14 compute-0 nova_compute[187185]: 2025-11-29 06:58:14.011 187189 DEBUG os_vif [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e6:38:dc,bridge_name='br-int',has_traffic_filtering=True,id=b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e,network=Network(5c312f91-84b2-4c6f-94ed-87030fed964b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb9ec3ba2-52') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 06:58:14 compute-0 nova_compute[187185]: 2025-11-29 06:58:14.012 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:58:14 compute-0 nova_compute[187185]: 2025-11-29 06:58:14.012 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:58:14 compute-0 nova_compute[187185]: 2025-11-29 06:58:14.013 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 06:58:14 compute-0 nova_compute[187185]: 2025-11-29 06:58:14.018 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:58:14 compute-0 nova_compute[187185]: 2025-11-29 06:58:14.018 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb9ec3ba2-52, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:58:14 compute-0 nova_compute[187185]: 2025-11-29 06:58:14.019 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb9ec3ba2-52, col_values=(('external_ids', {'iface-id': 'b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e6:38:dc', 'vm-uuid': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:58:14 compute-0 nova_compute[187185]: 2025-11-29 06:58:14.022 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:58:14 compute-0 nova_compute[187185]: 2025-11-29 06:58:14.024 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 06:58:14 compute-0 NetworkManager[55227]: <info>  [1764399494.0256] manager: (tapb9ec3ba2-52): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/57)
Nov 29 06:58:14 compute-0 nova_compute[187185]: 2025-11-29 06:58:14.034 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:58:14 compute-0 nova_compute[187185]: 2025-11-29 06:58:14.036 187189 INFO os_vif [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e6:38:dc,bridge_name='br-int',has_traffic_filtering=True,id=b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e,network=Network(5c312f91-84b2-4c6f-94ed-87030fed964b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb9ec3ba2-52')
Nov 29 06:58:14 compute-0 nova_compute[187185]: 2025-11-29 06:58:14.414 187189 DEBUG nova.virt.libvirt.driver [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 06:58:14 compute-0 nova_compute[187185]: 2025-11-29 06:58:14.415 187189 DEBUG nova.virt.libvirt.driver [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 06:58:14 compute-0 nova_compute[187185]: 2025-11-29 06:58:14.415 187189 DEBUG nova.virt.libvirt.driver [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] No VIF found with MAC fa:16:3e:e6:38:dc, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 06:58:14 compute-0 nova_compute[187185]: 2025-11-29 06:58:14.416 187189 INFO nova.virt.libvirt.driver [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Using config drive
Nov 29 06:58:15 compute-0 nova_compute[187185]: 2025-11-29 06:58:15.317 187189 INFO nova.virt.libvirt.driver [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Creating config drive at /var/lib/nova/instances/3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/disk.config
Nov 29 06:58:15 compute-0 nova_compute[187185]: 2025-11-29 06:58:15.324 187189 DEBUG oslo_concurrency.processutils [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgu3chiph execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:58:15 compute-0 nova_compute[187185]: 2025-11-29 06:58:15.471 187189 DEBUG oslo_concurrency.processutils [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgu3chiph" returned: 0 in 0.147s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:58:15 compute-0 kernel: tapb9ec3ba2-52: entered promiscuous mode
Nov 29 06:58:15 compute-0 nova_compute[187185]: 2025-11-29 06:58:15.567 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:58:15 compute-0 NetworkManager[55227]: <info>  [1764399495.5700] manager: (tapb9ec3ba2-52): new Tun device (/org/freedesktop/NetworkManager/Devices/58)
Nov 29 06:58:15 compute-0 ovn_controller[95281]: 2025-11-29T06:58:15Z|00097|binding|INFO|Claiming lport b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e for this chassis.
Nov 29 06:58:15 compute-0 ovn_controller[95281]: 2025-11-29T06:58:15Z|00098|binding|INFO|b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e: Claiming fa:16:3e:e6:38:dc 10.100.0.8
Nov 29 06:58:15 compute-0 nova_compute[187185]: 2025-11-29 06:58:15.577 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:58:15 compute-0 nova_compute[187185]: 2025-11-29 06:58:15.583 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:58:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:58:15.595 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e6:38:dc 10.100.0.8'], port_security=['fa:16:3e:e6:38:dc 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5c312f91-84b2-4c6f-94ed-87030fed964b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c25ab1655ced493998b50733e2d514fc', 'neutron:revision_number': '2', 'neutron:security_group_ids': '49e53e3a-7c88-48ec-a482-6b67c8d5d0fa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ca554f30-048a-4396-b118-d70c3e4bb5fc, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 06:58:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:58:15.599 104254 INFO neutron.agent.ovn.metadata.agent [-] Port b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e in datapath 5c312f91-84b2-4c6f-94ed-87030fed964b bound to our chassis
Nov 29 06:58:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:58:15.602 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5c312f91-84b2-4c6f-94ed-87030fed964b
Nov 29 06:58:15 compute-0 systemd-udevd[219529]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 06:58:15 compute-0 systemd-machined[153486]: New machine qemu-17-instance-0000002c.
Nov 29 06:58:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:58:15.625 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[8a4a50cc-8bc7-4c94-b882-8d93561b4837]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:58:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:58:15.627 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5c312f91-81 in ovnmeta-5c312f91-84b2-4c6f-94ed-87030fed964b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 06:58:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:58:15.632 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5c312f91-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 06:58:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:58:15.632 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[9f4a3f19-19c0-4141-bfd8-d206bc9147ab]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:58:15 compute-0 NetworkManager[55227]: <info>  [1764399495.6340] device (tapb9ec3ba2-52): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 06:58:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:58:15.633 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[b7907aaa-e94e-4d3c-8c1e-853ee9ef5294]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:58:15 compute-0 NetworkManager[55227]: <info>  [1764399495.6366] device (tapb9ec3ba2-52): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 06:58:15 compute-0 systemd[1]: Started Virtual Machine qemu-17-instance-0000002c.
Nov 29 06:58:15 compute-0 nova_compute[187185]: 2025-11-29 06:58:15.653 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:58:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:58:15.652 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[160cb522-adab-4879-9f52-6b3e2cc536f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:58:15 compute-0 ovn_controller[95281]: 2025-11-29T06:58:15Z|00099|binding|INFO|Setting lport b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e ovn-installed in OVS
Nov 29 06:58:15 compute-0 ovn_controller[95281]: 2025-11-29T06:58:15Z|00100|binding|INFO|Setting lport b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e up in Southbound
Nov 29 06:58:15 compute-0 nova_compute[187185]: 2025-11-29 06:58:15.659 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:58:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:58:15.673 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6cc26fce-5c25-4d09-a269-da22ae77c4f5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:58:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:58:15.718 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[4706c042-e2a6-4946-a4bc-d81188e57602]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:58:15 compute-0 systemd-udevd[219531]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 06:58:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:58:15.723 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[a30273d1-30b9-49a0-80f0-3c480f10cb2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:58:15 compute-0 NetworkManager[55227]: <info>  [1764399495.7253] manager: (tap5c312f91-80): new Veth device (/org/freedesktop/NetworkManager/Devices/59)
Nov 29 06:58:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:58:15.764 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[362681c3-1705-4f4f-823b-88ceda30a649]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:58:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:58:15.769 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[6a41ae33-5d7f-4e77-bd3d-8f55f6a173aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:58:15 compute-0 NetworkManager[55227]: <info>  [1764399495.8007] device (tap5c312f91-80): carrier: link connected
Nov 29 06:58:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:58:15.809 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[25d50a34-b0e1-41f2-a57e-979d95a9a338]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:58:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:58:15.838 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[786e08d3-df26-452d-8268-da28797d144f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5c312f91-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a1:ec:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 34], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 494346, 'reachable_time': 36908, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219561, 'error': None, 'target': 'ovnmeta-5c312f91-84b2-4c6f-94ed-87030fed964b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:58:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:58:15.865 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[07cc073e-2491-4e0b-aab8-7f421e947687]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea1:ec60'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 494346, 'tstamp': 494346}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219562, 'error': None, 'target': 'ovnmeta-5c312f91-84b2-4c6f-94ed-87030fed964b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:58:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:58:15.896 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[04d2ff9f-9a60-480e-84f1-0e5d42a8a1cf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5c312f91-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a1:ec:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 34], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 494346, 'reachable_time': 36908, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219563, 'error': None, 'target': 'ovnmeta-5c312f91-84b2-4c6f-94ed-87030fed964b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:58:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:58:15.938 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e027057e-20b8-432c-a922-3608f6c33bbf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:58:16 compute-0 nova_compute[187185]: 2025-11-29 06:58:16.015 187189 DEBUG nova.network.neutron [req-5fdc4d1c-2326-4657-b989-f6a146418e34 req-6f8145f1-32ed-41f5-b068-9f191e7dcbf6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Updated VIF entry in instance network info cache for port b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 06:58:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:58:16.014 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[8aa1fc8f-ae3a-45c7-b916-1c55772c2326]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:58:16 compute-0 nova_compute[187185]: 2025-11-29 06:58:16.015 187189 DEBUG nova.network.neutron [req-5fdc4d1c-2326-4657-b989-f6a146418e34 req-6f8145f1-32ed-41f5-b068-9f191e7dcbf6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Updating instance_info_cache with network_info: [{"id": "b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e", "address": "fa:16:3e:e6:38:dc", "network": {"id": "5c312f91-84b2-4c6f-94ed-87030fed964b", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-218404435-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c25ab1655ced493998b50733e2d514fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb9ec3ba2-52", "ovs_interfaceid": "b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:58:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:58:16.017 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5c312f91-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:58:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:58:16.017 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 06:58:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:58:16.018 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5c312f91-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:58:16 compute-0 nova_compute[187185]: 2025-11-29 06:58:16.021 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:58:16 compute-0 NetworkManager[55227]: <info>  [1764399496.0221] manager: (tap5c312f91-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/60)
Nov 29 06:58:16 compute-0 kernel: tap5c312f91-80: entered promiscuous mode
Nov 29 06:58:16 compute-0 nova_compute[187185]: 2025-11-29 06:58:16.023 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:58:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:58:16.025 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5c312f91-80, col_values=(('external_ids', {'iface-id': '8f6e6a63-3adc-455a-821d-4f8f756e8b07'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:58:16 compute-0 nova_compute[187185]: 2025-11-29 06:58:16.026 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:58:16 compute-0 ovn_controller[95281]: 2025-11-29T06:58:16Z|00101|binding|INFO|Releasing lport 8f6e6a63-3adc-455a-821d-4f8f756e8b07 from this chassis (sb_readonly=0)
Nov 29 06:58:16 compute-0 nova_compute[187185]: 2025-11-29 06:58:16.032 187189 DEBUG oslo_concurrency.lockutils [req-5fdc4d1c-2326-4657-b989-f6a146418e34 req-6f8145f1-32ed-41f5-b068-9f191e7dcbf6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:58:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:58:16.053 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5c312f91-84b2-4c6f-94ed-87030fed964b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5c312f91-84b2-4c6f-94ed-87030fed964b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 06:58:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:58:16.053 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[25df420c-9a7c-4baf-84fa-31652e5e3e8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:58:16 compute-0 nova_compute[187185]: 2025-11-29 06:58:16.053 187189 DEBUG nova.compute.manager [req-04839a89-e9f8-4206-a0fa-41005f470b62 req-97d80745-6e26-4252-a79c-980889b64fae 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Received event network-vif-plugged-b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:58:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:58:16.054 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 06:58:16 compute-0 ovn_metadata_agent[104249]: global
Nov 29 06:58:16 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 06:58:16 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-5c312f91-84b2-4c6f-94ed-87030fed964b
Nov 29 06:58:16 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 06:58:16 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 06:58:16 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 06:58:16 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/5c312f91-84b2-4c6f-94ed-87030fed964b.pid.haproxy
Nov 29 06:58:16 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 06:58:16 compute-0 ovn_metadata_agent[104249]: 
Nov 29 06:58:16 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 06:58:16 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 06:58:16 compute-0 nova_compute[187185]: 2025-11-29 06:58:16.054 187189 DEBUG oslo_concurrency.lockutils [req-04839a89-e9f8-4206-a0fa-41005f470b62 req-97d80745-6e26-4252-a79c-980889b64fae 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:58:16 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 06:58:16 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 06:58:16 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 06:58:16 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 06:58:16 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 06:58:16 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 06:58:16 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 06:58:16 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 06:58:16 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 06:58:16 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 06:58:16 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 06:58:16 compute-0 ovn_metadata_agent[104249]: 
Nov 29 06:58:16 compute-0 ovn_metadata_agent[104249]: 
Nov 29 06:58:16 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 06:58:16 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 06:58:16 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 06:58:16 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID 5c312f91-84b2-4c6f-94ed-87030fed964b
Nov 29 06:58:16 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 06:58:16 compute-0 nova_compute[187185]: 2025-11-29 06:58:16.055 187189 DEBUG oslo_concurrency.lockutils [req-04839a89-e9f8-4206-a0fa-41005f470b62 req-97d80745-6e26-4252-a79c-980889b64fae 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:58:16 compute-0 nova_compute[187185]: 2025-11-29 06:58:16.055 187189 DEBUG oslo_concurrency.lockutils [req-04839a89-e9f8-4206-a0fa-41005f470b62 req-97d80745-6e26-4252-a79c-980889b64fae 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:58:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:58:16.055 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5c312f91-84b2-4c6f-94ed-87030fed964b', 'env', 'PROCESS_TAG=haproxy-5c312f91-84b2-4c6f-94ed-87030fed964b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5c312f91-84b2-4c6f-94ed-87030fed964b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 06:58:16 compute-0 nova_compute[187185]: 2025-11-29 06:58:16.055 187189 DEBUG nova.compute.manager [req-04839a89-e9f8-4206-a0fa-41005f470b62 req-97d80745-6e26-4252-a79c-980889b64fae 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Processing event network-vif-plugged-b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 06:58:16 compute-0 nova_compute[187185]: 2025-11-29 06:58:16.056 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:58:16 compute-0 nova_compute[187185]: 2025-11-29 06:58:16.178 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399496.1776168, 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:58:16 compute-0 nova_compute[187185]: 2025-11-29 06:58:16.178 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] VM Started (Lifecycle Event)
Nov 29 06:58:16 compute-0 nova_compute[187185]: 2025-11-29 06:58:16.182 187189 DEBUG nova.compute.manager [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 06:58:16 compute-0 nova_compute[187185]: 2025-11-29 06:58:16.194 187189 DEBUG nova.virt.libvirt.driver [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 06:58:16 compute-0 nova_compute[187185]: 2025-11-29 06:58:16.199 187189 INFO nova.virt.libvirt.driver [-] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Instance spawned successfully.
Nov 29 06:58:16 compute-0 nova_compute[187185]: 2025-11-29 06:58:16.199 187189 DEBUG nova.virt.libvirt.driver [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 06:58:16 compute-0 nova_compute[187185]: 2025-11-29 06:58:16.213 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:58:16 compute-0 nova_compute[187185]: 2025-11-29 06:58:16.222 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 06:58:16 compute-0 nova_compute[187185]: 2025-11-29 06:58:16.228 187189 DEBUG nova.virt.libvirt.driver [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:58:16 compute-0 nova_compute[187185]: 2025-11-29 06:58:16.229 187189 DEBUG nova.virt.libvirt.driver [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:58:16 compute-0 nova_compute[187185]: 2025-11-29 06:58:16.229 187189 DEBUG nova.virt.libvirt.driver [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:58:16 compute-0 nova_compute[187185]: 2025-11-29 06:58:16.230 187189 DEBUG nova.virt.libvirt.driver [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:58:16 compute-0 nova_compute[187185]: 2025-11-29 06:58:16.231 187189 DEBUG nova.virt.libvirt.driver [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:58:16 compute-0 nova_compute[187185]: 2025-11-29 06:58:16.232 187189 DEBUG nova.virt.libvirt.driver [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 06:58:16 compute-0 nova_compute[187185]: 2025-11-29 06:58:16.266 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 06:58:16 compute-0 nova_compute[187185]: 2025-11-29 06:58:16.267 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399496.1814268, 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:58:16 compute-0 nova_compute[187185]: 2025-11-29 06:58:16.267 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] VM Paused (Lifecycle Event)
Nov 29 06:58:16 compute-0 nova_compute[187185]: 2025-11-29 06:58:16.303 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:58:16 compute-0 nova_compute[187185]: 2025-11-29 06:58:16.308 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399496.1855786, 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:58:16 compute-0 nova_compute[187185]: 2025-11-29 06:58:16.308 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] VM Resumed (Lifecycle Event)
Nov 29 06:58:16 compute-0 nova_compute[187185]: 2025-11-29 06:58:16.328 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:58:16 compute-0 nova_compute[187185]: 2025-11-29 06:58:16.334 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 06:58:16 compute-0 nova_compute[187185]: 2025-11-29 06:58:16.353 187189 INFO nova.compute.manager [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Took 8.62 seconds to spawn the instance on the hypervisor.
Nov 29 06:58:16 compute-0 nova_compute[187185]: 2025-11-29 06:58:16.354 187189 DEBUG nova.compute.manager [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:58:16 compute-0 nova_compute[187185]: 2025-11-29 06:58:16.365 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 06:58:16 compute-0 nova_compute[187185]: 2025-11-29 06:58:16.459 187189 INFO nova.compute.manager [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Took 9.26 seconds to build instance.
Nov 29 06:58:16 compute-0 nova_compute[187185]: 2025-11-29 06:58:16.482 187189 DEBUG oslo_concurrency.lockutils [None req-d60a60df-4c9d-457b-a23b-8fa3b34cd072 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Lock "3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.365s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:58:16 compute-0 podman[219602]: 2025-11-29 06:58:16.490308806 +0000 UTC m=+0.029948377 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 06:58:18 compute-0 podman[219602]: 2025-11-29 06:58:18.112143379 +0000 UTC m=+1.651782910 container create 0a7219f28e89b5e111fee55ab5369cfcb7fb9171ce9b80ea6ea3f96f22eb2176 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5c312f91-84b2-4c6f-94ed-87030fed964b, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 29 06:58:18 compute-0 nova_compute[187185]: 2025-11-29 06:58:18.147 187189 DEBUG nova.compute.manager [req-675e05b7-8d64-4c1e-91fc-dbbdbb73a8e2 req-ab3e1796-4237-4fa2-9479-6cc3c2cf6426 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Received event network-vif-plugged-b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:58:18 compute-0 nova_compute[187185]: 2025-11-29 06:58:18.148 187189 DEBUG oslo_concurrency.lockutils [req-675e05b7-8d64-4c1e-91fc-dbbdbb73a8e2 req-ab3e1796-4237-4fa2-9479-6cc3c2cf6426 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:58:18 compute-0 nova_compute[187185]: 2025-11-29 06:58:18.149 187189 DEBUG oslo_concurrency.lockutils [req-675e05b7-8d64-4c1e-91fc-dbbdbb73a8e2 req-ab3e1796-4237-4fa2-9479-6cc3c2cf6426 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:58:18 compute-0 nova_compute[187185]: 2025-11-29 06:58:18.149 187189 DEBUG oslo_concurrency.lockutils [req-675e05b7-8d64-4c1e-91fc-dbbdbb73a8e2 req-ab3e1796-4237-4fa2-9479-6cc3c2cf6426 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:58:18 compute-0 nova_compute[187185]: 2025-11-29 06:58:18.149 187189 DEBUG nova.compute.manager [req-675e05b7-8d64-4c1e-91fc-dbbdbb73a8e2 req-ab3e1796-4237-4fa2-9479-6cc3c2cf6426 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] No waiting events found dispatching network-vif-plugged-b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 06:58:18 compute-0 nova_compute[187185]: 2025-11-29 06:58:18.150 187189 WARNING nova.compute.manager [req-675e05b7-8d64-4c1e-91fc-dbbdbb73a8e2 req-ab3e1796-4237-4fa2-9479-6cc3c2cf6426 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Received unexpected event network-vif-plugged-b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e for instance with vm_state active and task_state None.
Nov 29 06:58:18 compute-0 NetworkManager[55227]: <info>  [1764399498.5568] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/61)
Nov 29 06:58:18 compute-0 NetworkManager[55227]: <info>  [1764399498.5583] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/62)
Nov 29 06:58:18 compute-0 nova_compute[187185]: 2025-11-29 06:58:18.555 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:58:18 compute-0 sshd-session[219613]: Received disconnect from 103.179.56.44 port 36910:11: Bye Bye [preauth]
Nov 29 06:58:18 compute-0 sshd-session[219613]: Disconnected from authenticating user root 103.179.56.44 port 36910 [preauth]
Nov 29 06:58:18 compute-0 nova_compute[187185]: 2025-11-29 06:58:18.637 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:58:18 compute-0 ovn_controller[95281]: 2025-11-29T06:58:18Z|00102|binding|INFO|Releasing lport 8f6e6a63-3adc-455a-821d-4f8f756e8b07 from this chassis (sb_readonly=0)
Nov 29 06:58:18 compute-0 systemd[1]: Started libpod-conmon-0a7219f28e89b5e111fee55ab5369cfcb7fb9171ce9b80ea6ea3f96f22eb2176.scope.
Nov 29 06:58:18 compute-0 nova_compute[187185]: 2025-11-29 06:58:18.662 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:58:18 compute-0 systemd[1]: Started libcrun container.
Nov 29 06:58:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a8025cd861c71a8fc26d222bb9d3e3d398a73dbf69a87ef8a91b8ddcd2d1acf6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 06:58:19 compute-0 nova_compute[187185]: 2025-11-29 06:58:19.021 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:58:19 compute-0 podman[219602]: 2025-11-29 06:58:19.432336696 +0000 UTC m=+2.971976257 container init 0a7219f28e89b5e111fee55ab5369cfcb7fb9171ce9b80ea6ea3f96f22eb2176 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5c312f91-84b2-4c6f-94ed-87030fed964b, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:58:19 compute-0 podman[219602]: 2025-11-29 06:58:19.443071855 +0000 UTC m=+2.982711386 container start 0a7219f28e89b5e111fee55ab5369cfcb7fb9171ce9b80ea6ea3f96f22eb2176 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5c312f91-84b2-4c6f-94ed-87030fed964b, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:58:19 compute-0 neutron-haproxy-ovnmeta-5c312f91-84b2-4c6f-94ed-87030fed964b[219620]: [NOTICE]   (219624) : New worker (219626) forked
Nov 29 06:58:19 compute-0 neutron-haproxy-ovnmeta-5c312f91-84b2-4c6f-94ed-87030fed964b[219620]: [NOTICE]   (219624) : Loading success.
Nov 29 06:58:20 compute-0 podman[219635]: 2025-11-29 06:58:20.370481925 +0000 UTC m=+0.615771148 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 29 06:58:20 compute-0 nova_compute[187185]: 2025-11-29 06:58:20.808 187189 DEBUG nova.compute.manager [req-1bdfe681-0108-4f62-8b26-2c21deb93bea req-482c13cd-981c-41b2-8506-815e2f57384a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Received event network-changed-b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:58:20 compute-0 nova_compute[187185]: 2025-11-29 06:58:20.809 187189 DEBUG nova.compute.manager [req-1bdfe681-0108-4f62-8b26-2c21deb93bea req-482c13cd-981c-41b2-8506-815e2f57384a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Refreshing instance network info cache due to event network-changed-b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 06:58:20 compute-0 nova_compute[187185]: 2025-11-29 06:58:20.809 187189 DEBUG oslo_concurrency.lockutils [req-1bdfe681-0108-4f62-8b26-2c21deb93bea req-482c13cd-981c-41b2-8506-815e2f57384a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 06:58:20 compute-0 nova_compute[187185]: 2025-11-29 06:58:20.809 187189 DEBUG oslo_concurrency.lockutils [req-1bdfe681-0108-4f62-8b26-2c21deb93bea req-482c13cd-981c-41b2-8506-815e2f57384a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 06:58:20 compute-0 nova_compute[187185]: 2025-11-29 06:58:20.809 187189 DEBUG nova.network.neutron [req-1bdfe681-0108-4f62-8b26-2c21deb93bea req-482c13cd-981c-41b2-8506-815e2f57384a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Refreshing network info cache for port b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 06:58:23 compute-0 nova_compute[187185]: 2025-11-29 06:58:23.484 187189 DEBUG nova.network.neutron [req-1bdfe681-0108-4f62-8b26-2c21deb93bea req-482c13cd-981c-41b2-8506-815e2f57384a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Updated VIF entry in instance network info cache for port b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 06:58:23 compute-0 nova_compute[187185]: 2025-11-29 06:58:23.485 187189 DEBUG nova.network.neutron [req-1bdfe681-0108-4f62-8b26-2c21deb93bea req-482c13cd-981c-41b2-8506-815e2f57384a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Updating instance_info_cache with network_info: [{"id": "b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e", "address": "fa:16:3e:e6:38:dc", "network": {"id": "5c312f91-84b2-4c6f-94ed-87030fed964b", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-218404435-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c25ab1655ced493998b50733e2d514fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb9ec3ba2-52", "ovs_interfaceid": "b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:58:23 compute-0 nova_compute[187185]: 2025-11-29 06:58:23.513 187189 DEBUG oslo_concurrency.lockutils [req-1bdfe681-0108-4f62-8b26-2c21deb93bea req-482c13cd-981c-41b2-8506-815e2f57384a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 06:58:23 compute-0 nova_compute[187185]: 2025-11-29 06:58:23.639 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:58:24 compute-0 nova_compute[187185]: 2025-11-29 06:58:24.024 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:58:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:58:24.818 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:58:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:58:24.820 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:58:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:58:24.821 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:58:27 compute-0 ovn_controller[95281]: 2025-11-29T06:58:27Z|00103|binding|INFO|Releasing lport 8f6e6a63-3adc-455a-821d-4f8f756e8b07 from this chassis (sb_readonly=0)
Nov 29 06:58:28 compute-0 nova_compute[187185]: 2025-11-29 06:58:28.008 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:58:28 compute-0 nova_compute[187185]: 2025-11-29 06:58:28.640 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:58:29 compute-0 nova_compute[187185]: 2025-11-29 06:58:29.027 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:58:32 compute-0 podman[219666]: 2025-11-29 06:58:32.830922155 +0000 UTC m=+0.083406870 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 06:58:32 compute-0 podman[219668]: 2025-11-29 06:58:32.839464325 +0000 UTC m=+0.078066259 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 06:58:32 compute-0 podman[219667]: 2025-11-29 06:58:32.851633688 +0000 UTC m=+0.100886542 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, com.redhat.component=ubi9-minimal-container, distribution-scope=public, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6)
Nov 29 06:58:33 compute-0 nova_compute[187185]: 2025-11-29 06:58:33.642 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:58:34 compute-0 nova_compute[187185]: 2025-11-29 06:58:34.029 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:58:36 compute-0 ovn_controller[95281]: 2025-11-29T06:58:36Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e6:38:dc 10.100.0.8
Nov 29 06:58:36 compute-0 ovn_controller[95281]: 2025-11-29T06:58:36Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e6:38:dc 10.100.0.8
Nov 29 06:58:38 compute-0 nova_compute[187185]: 2025-11-29 06:58:38.497 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:58:38 compute-0 nova_compute[187185]: 2025-11-29 06:58:38.644 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:58:38 compute-0 podman[219733]: 2025-11-29 06:58:38.868145103 +0000 UTC m=+0.127666866 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 06:58:39 compute-0 nova_compute[187185]: 2025-11-29 06:58:39.031 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:58:41 compute-0 podman[219759]: 2025-11-29 06:58:41.839666851 +0000 UTC m=+0.093138894 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:58:41 compute-0 podman[219760]: 2025-11-29 06:58:41.839894707 +0000 UTC m=+0.086975930 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 06:58:43 compute-0 nova_compute[187185]: 2025-11-29 06:58:43.647 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:58:44 compute-0 nova_compute[187185]: 2025-11-29 06:58:44.034 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:58:45 compute-0 nova_compute[187185]: 2025-11-29 06:58:45.540 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:58:47 compute-0 nova_compute[187185]: 2025-11-29 06:58:47.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:58:47 compute-0 nova_compute[187185]: 2025-11-29 06:58:47.318 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.989 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a', 'name': 'tempest-ServersTestManualDisk-server-1088610341', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000002c', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c25ab1655ced493998b50733e2d514fc', 'user_id': '1472c12f52724fe5aae37390978dd8fe', 'hostId': '4b732d56a3efd101acada57417219e779ad529700523ec3187948204', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.989 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.995 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a / tapb9ec3ba2-52 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.995 12 DEBUG ceilometer.compute.pollsters [-] 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '989423f8-b4ef-4031-8eb9-104c549470ec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1472c12f52724fe5aae37390978dd8fe', 'user_name': None, 'project_id': 'c25ab1655ced493998b50733e2d514fc', 'project_name': None, 'resource_id': 'instance-0000002c-3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a-tapb9ec3ba2-52', 'timestamp': '2025-11-29T06:58:47.990071', 'resource_metadata': {'display_name': 'tempest-ServersTestManualDisk-server-1088610341', 'name': 'tapb9ec3ba2-52', 'instance_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a', 'instance_type': 'm1.nano', 'host': '4b732d56a3efd101acada57417219e779ad529700523ec3187948204', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e6:38:dc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb9ec3ba2-52'}, 'message_id': 'da95deec-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4975.708319128, 'message_signature': '4c3e77a652565a024cfa8ba5cbd808dd13357caa7a917b83e7395e18111fad5a'}]}, 'timestamp': '2025-11-29 06:58:47.996076', '_unique_id': '2b60f8b4a6a945f8bb2430d776fdb152'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.997 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.998 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.998 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.998 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServersTestManualDisk-server-1088610341>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersTestManualDisk-server-1088610341>]
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.998 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 06:58:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.998 12 DEBUG ceilometer.compute.pollsters [-] 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '37d04dec-da7e-42ce-bbef-c61e5d14288a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1472c12f52724fe5aae37390978dd8fe', 'user_name': None, 'project_id': 'c25ab1655ced493998b50733e2d514fc', 'project_name': None, 'resource_id': 'instance-0000002c-3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a-tapb9ec3ba2-52', 'timestamp': '2025-11-29T06:58:47.998879', 'resource_metadata': {'display_name': 'tempest-ServersTestManualDisk-server-1088610341', 'name': 'tapb9ec3ba2-52', 'instance_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a', 'instance_type': 'm1.nano', 'host': '4b732d56a3efd101acada57417219e779ad529700523ec3187948204', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e6:38:dc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb9ec3ba2-52'}, 'message_id': 'da965d18-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4975.708319128, 'message_signature': '62efb73358337557909807e288f4dfae0d82e3ce22ed0b4681977ab1b0581922'}]}, 'timestamp': '2025-11-29 06:58:47.999120', '_unique_id': '386310d45baf4aedad8e91d174b34f31'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:47.999 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 DEBUG ceilometer.compute.pollsters [-] 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/network.outgoing.packets volume: 17 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '89d3a3fb-d0c8-47bb-b267-4627f87de74f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 17, 'user_id': '1472c12f52724fe5aae37390978dd8fe', 'user_name': None, 'project_id': 'c25ab1655ced493998b50733e2d514fc', 'project_name': None, 'resource_id': 'instance-0000002c-3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a-tapb9ec3ba2-52', 'timestamp': '2025-11-29T06:58:48.000266', 'resource_metadata': {'display_name': 'tempest-ServersTestManualDisk-server-1088610341', 'name': 'tapb9ec3ba2-52', 'instance_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a', 'instance_type': 'm1.nano', 'host': '4b732d56a3efd101acada57417219e779ad529700523ec3187948204', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e6:38:dc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb9ec3ba2-52'}, 'message_id': 'da969332-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4975.708319128, 'message_signature': '3f23e9821e7efb22e3c1b51ee915223e235cd05b898fc35234ddad98432c66bb'}]}, 'timestamp': '2025-11-29 06:58:48.000497', '_unique_id': '84502f50b5744db79de4a640a22ab71c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.000 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.001 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.001 12 DEBUG ceilometer.compute.pollsters [-] 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '07ebc416-12ca-4bb3-8bc6-998bd7fc2d80', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1472c12f52724fe5aae37390978dd8fe', 'user_name': None, 'project_id': 'c25ab1655ced493998b50733e2d514fc', 'project_name': None, 'resource_id': 'instance-0000002c-3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a-tapb9ec3ba2-52', 'timestamp': '2025-11-29T06:58:48.001617', 'resource_metadata': {'display_name': 'tempest-ServersTestManualDisk-server-1088610341', 'name': 'tapb9ec3ba2-52', 'instance_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a', 'instance_type': 'm1.nano', 'host': '4b732d56a3efd101acada57417219e779ad529700523ec3187948204', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e6:38:dc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb9ec3ba2-52'}, 'message_id': 'da96c80c-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4975.708319128, 'message_signature': '93d6230da0b352bd3de4c8980bbe2eeebe4c86441451794ecd44ce3b9eb08034'}]}, 'timestamp': '2025-11-29 06:58:48.001873', '_unique_id': 'c6a94461b42e46a692cf8055073ea566'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.002 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.041 12 DEBUG ceilometer.compute.pollsters [-] 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/disk.device.write.bytes volume: 72912896 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.042 12 DEBUG ceilometer.compute.pollsters [-] 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6f6300a2-771b-45df-9518-a44f19272249', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72912896, 'user_id': '1472c12f52724fe5aae37390978dd8fe', 'user_name': None, 'project_id': 'c25ab1655ced493998b50733e2d514fc', 'project_name': None, 'resource_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a-vda', 'timestamp': '2025-11-29T06:58:48.002953', 'resource_metadata': {'display_name': 'tempest-ServersTestManualDisk-server-1088610341', 'name': 'instance-0000002c', 'instance_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a', 'instance_type': 'm1.nano', 'host': '4b732d56a3efd101acada57417219e779ad529700523ec3187948204', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da9cf452-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4975.72118977, 'message_signature': 'ca17d04852808516aae1ab09e59bf6ca8c6935fd589ef864d0d8c6f723f3c78a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1472c12f52724fe5aae37390978dd8fe', 'user_name': None, 'project_id': 'c25ab1655ced493998b50733e2d514fc', 'project_name': None, 'resource_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a-sda', 'timestamp': '2025-11-29T06:58:48.002953', 'resource_metadata': {'display_name': 'tempest-ServersTestManualDisk-server-1088610341', 'name': 'instance-0000002c', 'instance_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a', 'instance_type': 'm1.nano', 'host': '4b732d56a3efd101acada57417219e779ad529700523ec3187948204', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da9d02bc-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4975.72118977, 'message_signature': 'f54a8ca1573101864e7eff1adbf87b10b0a836850817d171480f9fca3aca5e3f'}]}, 'timestamp': '2025-11-29 06:58:48.042694', '_unique_id': 'a02931994b3f4ed0be8c8b5de8009f90'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.044 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.069 12 DEBUG ceilometer.compute.pollsters [-] 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/memory.usage volume: 42.390625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8040fd4e-e7b9-4202-b0ea-1cbcb7f82790', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.390625, 'user_id': '1472c12f52724fe5aae37390978dd8fe', 'user_name': None, 'project_id': 'c25ab1655ced493998b50733e2d514fc', 'project_name': None, 'resource_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a', 'timestamp': '2025-11-29T06:58:48.044941', 'resource_metadata': {'display_name': 'tempest-ServersTestManualDisk-server-1088610341', 'name': 'instance-0000002c', 'instance_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a', 'instance_type': 'm1.nano', 'host': '4b732d56a3efd101acada57417219e779ad529700523ec3187948204', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'daa1248c-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4975.787226989, 'message_signature': 'ab2804fd1cff9dea5d95c9dd4d521ac947d2f96318386b9b045f441f60de5174'}]}, 'timestamp': '2025-11-29 06:58:48.069903', '_unique_id': 'afbb9bed5e84498ea50bcd39dbcae260'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.071 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.072 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.072 12 DEBUG ceilometer.compute.pollsters [-] 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/disk.device.read.bytes volume: 31091200 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.072 12 DEBUG ceilometer.compute.pollsters [-] 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/disk.device.read.bytes volume: 299326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f3e18389-965e-4e59-988b-eb9d5c05aed0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31091200, 'user_id': '1472c12f52724fe5aae37390978dd8fe', 'user_name': None, 'project_id': 'c25ab1655ced493998b50733e2d514fc', 'project_name': None, 'resource_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a-vda', 'timestamp': '2025-11-29T06:58:48.072540', 'resource_metadata': {'display_name': 'tempest-ServersTestManualDisk-server-1088610341', 'name': 'instance-0000002c', 'instance_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a', 'instance_type': 'm1.nano', 'host': '4b732d56a3efd101acada57417219e779ad529700523ec3187948204', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'daa19b06-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4975.72118977, 'message_signature': '7969547348b06e7a7a766f7af26bc6fa496ca572b8960bf8b34086febd90eab6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 299326, 'user_id': '1472c12f52724fe5aae37390978dd8fe', 'user_name': None, 'project_id': 'c25ab1655ced493998b50733e2d514fc', 'project_name': None, 'resource_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a-sda', 'timestamp': '2025-11-29T06:58:48.072540', 'resource_metadata': {'display_name': 'tempest-ServersTestManualDisk-server-1088610341', 'name': 'instance-0000002c', 'instance_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a', 'instance_type': 'm1.nano', 'host': '4b732d56a3efd101acada57417219e779ad529700523ec3187948204', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'daa1a614-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4975.72118977, 'message_signature': '795b1c0b1fb36e5473d9314c0e3f97cdf379be4a9f66760f6ef9a01fa3135174'}]}, 'timestamp': '2025-11-29 06:58:48.073081', '_unique_id': 'b0db367f82ff474a97c81bd959a52445'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.073 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.074 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.074 12 DEBUG ceilometer.compute.pollsters [-] 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/cpu volume: 12120000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '20f41ada-5f08-488d-8691-406c0508aac7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12120000000, 'user_id': '1472c12f52724fe5aae37390978dd8fe', 'user_name': None, 'project_id': 'c25ab1655ced493998b50733e2d514fc', 'project_name': None, 'resource_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a', 'timestamp': '2025-11-29T06:58:48.074380', 'resource_metadata': {'display_name': 'tempest-ServersTestManualDisk-server-1088610341', 'name': 'instance-0000002c', 'instance_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a', 'instance_type': 'm1.nano', 'host': '4b732d56a3efd101acada57417219e779ad529700523ec3187948204', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'daa1e26e-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4975.787226989, 'message_signature': '5563bb8f4f22c75223f32795ca145935926109da9f91e76983d62ccbccb0a4d5'}]}, 'timestamp': '2025-11-29 06:58:48.074612', '_unique_id': '519af283fa664a22914dae805bd57e90'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 DEBUG ceilometer.compute.pollsters [-] 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/disk.device.write.requests volume: 298 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.075 12 DEBUG ceilometer.compute.pollsters [-] 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ac116ee0-5035-41d4-9019-dfdbe1156013', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 298, 'user_id': '1472c12f52724fe5aae37390978dd8fe', 'user_name': None, 'project_id': 'c25ab1655ced493998b50733e2d514fc', 'project_name': None, 'resource_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a-vda', 'timestamp': '2025-11-29T06:58:48.075739', 'resource_metadata': {'display_name': 'tempest-ServersTestManualDisk-server-1088610341', 'name': 'instance-0000002c', 'instance_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a', 'instance_type': 'm1.nano', 'host': '4b732d56a3efd101acada57417219e779ad529700523ec3187948204', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'daa21874-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4975.72118977, 'message_signature': '903d027a84d476b27cf8865c92f6c2331bbabac45ba82cd175cf616732bca2ba'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '1472c12f52724fe5aae37390978dd8fe', 'user_name': None, 'project_id': 'c25ab1655ced493998b50733e2d514fc', 'project_name': 
None, 'resource_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a-sda', 'timestamp': '2025-11-29T06:58:48.075739', 'resource_metadata': {'display_name': 'tempest-ServersTestManualDisk-server-1088610341', 'name': 'instance-0000002c', 'instance_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a', 'instance_type': 'm1.nano', 'host': '4b732d56a3efd101acada57417219e779ad529700523ec3187948204', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'daa220a8-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4975.72118977, 'message_signature': 'bfe14b7141e3a1c39eff04c9369c2a3e0c5574f6dd5eaabdafa674ba16aeb567'}]}, 'timestamp': '2025-11-29 06:58:48.076194', '_unique_id': '1dc0caa4afc1406687fb6dcb7701acfc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.076 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.077 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.077 12 DEBUG ceilometer.compute.pollsters [-] 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/disk.device.write.latency volume: 208241701145 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.077 12 DEBUG ceilometer.compute.pollsters [-] 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b9c7d0cc-aefc-44c5-add0-5591a21bc59d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 208241701145, 'user_id': '1472c12f52724fe5aae37390978dd8fe', 'user_name': None, 'project_id': 'c25ab1655ced493998b50733e2d514fc', 'project_name': None, 'resource_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a-vda', 'timestamp': '2025-11-29T06:58:48.077409', 'resource_metadata': {'display_name': 'tempest-ServersTestManualDisk-server-1088610341', 'name': 'instance-0000002c', 'instance_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a', 'instance_type': 'm1.nano', 'host': '4b732d56a3efd101acada57417219e779ad529700523ec3187948204', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'daa25a78-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4975.72118977, 'message_signature': '879593d962bc4c92dd6084ae1bd30d55ed1c1cd94498d276d98e8126ce349637'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '1472c12f52724fe5aae37390978dd8fe', 'user_name': None, 'project_id': 'c25ab1655ced493998b50733e2d514fc', 'project_name': None, 
'resource_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a-sda', 'timestamp': '2025-11-29T06:58:48.077409', 'resource_metadata': {'display_name': 'tempest-ServersTestManualDisk-server-1088610341', 'name': 'instance-0000002c', 'instance_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a', 'instance_type': 'm1.nano', 'host': '4b732d56a3efd101acada57417219e779ad529700523ec3187948204', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'daa2670c-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4975.72118977, 'message_signature': 'f6b317b92d1df48861b8b206647a8e248710a677459620eb5bd080b90294f71c'}]}, 'timestamp': '2025-11-29 06:58:48.078005', '_unique_id': '542650d4819343da834e91a6c4eb24e0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.078 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.079 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.079 12 DEBUG ceilometer.compute.pollsters [-] 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/disk.device.read.requests volume: 1141 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.079 12 DEBUG ceilometer.compute.pollsters [-] 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/disk.device.read.requests volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bc4db1f7-f8ff-4698-b844-814107af14e6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1141, 'user_id': '1472c12f52724fe5aae37390978dd8fe', 'user_name': None, 'project_id': 'c25ab1655ced493998b50733e2d514fc', 'project_name': None, 'resource_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a-vda', 'timestamp': '2025-11-29T06:58:48.079289', 'resource_metadata': {'display_name': 'tempest-ServersTestManualDisk-server-1088610341', 'name': 'instance-0000002c', 'instance_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a', 'instance_type': 'm1.nano', 'host': '4b732d56a3efd101acada57417219e779ad529700523ec3187948204', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'daa2a23a-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4975.72118977, 'message_signature': '1ada4d244e3448d36aff5534d1f9301a8ce00fb347eca8ed73aafce3a3597197'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 120, 'user_id': '1472c12f52724fe5aae37390978dd8fe', 'user_name': None, 'project_id': 'c25ab1655ced493998b50733e2d514fc', 'project_name': None, 'resource_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a-sda', 'timestamp': '2025-11-29T06:58:48.079289', 'resource_metadata': {'display_name': 'tempest-ServersTestManualDisk-server-1088610341', 'name': 'instance-0000002c', 'instance_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a', 'instance_type': 'm1.nano', 'host': '4b732d56a3efd101acada57417219e779ad529700523ec3187948204', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'daa2aa0a-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4975.72118977, 'message_signature': '2dba90302d230bd321acc0c02878ca2d8362618b1001ea0ac6769ace6d3346a0'}]}, 'timestamp': '2025-11-29 06:58:48.079710', '_unique_id': 'c5ba1ff0680147bfa6b49165d49ad4d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.080 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.081 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.081 12 DEBUG ceilometer.compute.pollsters [-] 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/network.incoming.bytes volume: 1862 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1ebeeb88-326f-45a7-b2b8-b57d248b32ae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1862, 'user_id': '1472c12f52724fe5aae37390978dd8fe', 'user_name': None, 'project_id': 'c25ab1655ced493998b50733e2d514fc', 'project_name': None, 'resource_id': 'instance-0000002c-3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a-tapb9ec3ba2-52', 'timestamp': '2025-11-29T06:58:48.081154', 'resource_metadata': {'display_name': 'tempest-ServersTestManualDisk-server-1088610341', 'name': 'tapb9ec3ba2-52', 'instance_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a', 'instance_type': 'm1.nano', 'host': '4b732d56a3efd101acada57417219e779ad529700523ec3187948204', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e6:38:dc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb9ec3ba2-52'}, 'message_id': 'daa2f3ca-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4975.708319128, 'message_signature': 'f730f519f076d89ff7e18b7f309556c7aaac27113849b93fa433492545be9664'}]}, 'timestamp': '2025-11-29 06:58:48.081655', '_unique_id': '7d08594ac56a466ba36c93f700d0697a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.082 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.083 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.083 12 DEBUG ceilometer.compute.pollsters [-] 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f36af91b-2f98-4e0c-a235-845c7cb43b8a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1472c12f52724fe5aae37390978dd8fe', 'user_name': None, 'project_id': 'c25ab1655ced493998b50733e2d514fc', 'project_name': None, 'resource_id': 'instance-0000002c-3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a-tapb9ec3ba2-52', 'timestamp': '2025-11-29T06:58:48.083253', 'resource_metadata': {'display_name': 'tempest-ServersTestManualDisk-server-1088610341', 'name': 'tapb9ec3ba2-52', 'instance_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a', 'instance_type': 'm1.nano', 'host': '4b732d56a3efd101acada57417219e779ad529700523ec3187948204', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e6:38:dc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb9ec3ba2-52'}, 'message_id': 'daa33d58-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4975.708319128, 'message_signature': '26919debaa0236fc8a317d5b1f4a29c5914d2a56e05af304bdb51dc74a15b1de'}]}, 'timestamp': '2025-11-29 06:58:48.083523', '_unique_id': 'e97a57e8a5e74dbda4f1cd2abc9c23aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.084 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.100 12 DEBUG ceilometer.compute.pollsters [-] 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.101 12 DEBUG ceilometer.compute.pollsters [-] 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '722a70f2-934a-4c12-a2a4-1e971d49bd4f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1472c12f52724fe5aae37390978dd8fe', 'user_name': None, 'project_id': 'c25ab1655ced493998b50733e2d514fc', 'project_name': None, 'resource_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a-vda', 'timestamp': '2025-11-29T06:58:48.084855', 'resource_metadata': {'display_name': 'tempest-ServersTestManualDisk-server-1088610341', 'name': 'instance-0000002c', 'instance_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a', 'instance_type': 'm1.nano', 'host': '4b732d56a3efd101acada57417219e779ad529700523ec3187948204', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'daa5f3a4-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4975.803106027, 'message_signature': '1366c98bf9cb7711814dab0c91d502f39d5accc75de044a4b0c10a725d6458ba'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '1472c12f52724fe5aae37390978dd8fe', 'user_name': None, 'project_id': 'c25ab1655ced493998b50733e2d514fc', 'project_name': None, 'resource_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a-sda', 'timestamp': '2025-11-29T06:58:48.084855', 'resource_metadata': {'display_name': 'tempest-ServersTestManualDisk-server-1088610341', 'name': 'instance-0000002c', 'instance_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a', 'instance_type': 'm1.nano', 'host': '4b732d56a3efd101acada57417219e779ad529700523ec3187948204', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'daa60010-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4975.803106027, 'message_signature': '2459ac7e7ac277f20ab81313b47b23d739c7f77cbbbc7054bd01623df3d24c8e'}]}, 'timestamp': '2025-11-29 06:58:48.101593', '_unique_id': 'ac52139d0b464c838642b874483cd59d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.102 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.103 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.103 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.103 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServersTestManualDisk-server-1088610341>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersTestManualDisk-server-1088610341>]
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 DEBUG ceilometer.compute.pollsters [-] 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/network.outgoing.bytes volume: 1690 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f1f0d307-0996-4cfd-89b3-15303f70db01', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1690, 'user_id': '1472c12f52724fe5aae37390978dd8fe', 'user_name': None, 'project_id': 'c25ab1655ced493998b50733e2d514fc', 'project_name': None, 'resource_id': 'instance-0000002c-3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a-tapb9ec3ba2-52', 'timestamp': '2025-11-29T06:58:48.104161', 'resource_metadata': {'display_name': 'tempest-ServersTestManualDisk-server-1088610341', 'name': 'tapb9ec3ba2-52', 'instance_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a', 'instance_type': 'm1.nano', 'host': '4b732d56a3efd101acada57417219e779ad529700523ec3187948204', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e6:38:dc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb9ec3ba2-52'}, 'message_id': 'daa66e24-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4975.708319128, 'message_signature': '1ef8a4998e302755f51768c06d95b2359acd836d68fa26fbd3904bbde2ca03b3'}]}, 'timestamp': '2025-11-29 06:58:48.104408', '_unique_id': '4e35df2198b94a5f9a7127032ebcbe85'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.104 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.105 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.105 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.105 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServersTestManualDisk-server-1088610341>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersTestManualDisk-server-1088610341>]
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.105 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.105 12 DEBUG ceilometer.compute.pollsters [-] 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cadb7635-4b92-457b-b611-b8cec95fe9ac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1472c12f52724fe5aae37390978dd8fe', 'user_name': None, 'project_id': 'c25ab1655ced493998b50733e2d514fc', 'project_name': None, 'resource_id': 'instance-0000002c-3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a-tapb9ec3ba2-52', 'timestamp': '2025-11-29T06:58:48.105923', 'resource_metadata': {'display_name': 'tempest-ServersTestManualDisk-server-1088610341', 'name': 'tapb9ec3ba2-52', 'instance_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a', 'instance_type': 'm1.nano', 'host': '4b732d56a3efd101acada57417219e779ad529700523ec3187948204', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e6:38:dc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb9ec3ba2-52'}, 'message_id': 'daa6b230-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4975.708319128, 'message_signature': 'dd5c5e3eaf980ae06923682172e136d138c3d3ce2048529fedc6e56acccae71b'}]}, 'timestamp': '2025-11-29 06:58:48.106150', '_unique_id': '1ebd08e4c015472aa46d1416fa03e1cd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.106 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.107 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.107 12 DEBUG ceilometer.compute.pollsters [-] 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/disk.device.read.latency volume: 498926735 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.107 12 DEBUG ceilometer.compute.pollsters [-] 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/disk.device.read.latency volume: 124479275 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f14d6ab8-9581-4b52-a442-edec3e66c389', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 498926735, 'user_id': '1472c12f52724fe5aae37390978dd8fe', 'user_name': None, 'project_id': 'c25ab1655ced493998b50733e2d514fc', 'project_name': None, 'resource_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a-vda', 'timestamp': '2025-11-29T06:58:48.107449', 'resource_metadata': {'display_name': 'tempest-ServersTestManualDisk-server-1088610341', 'name': 'instance-0000002c', 'instance_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a', 'instance_type': 'm1.nano', 'host': '4b732d56a3efd101acada57417219e779ad529700523ec3187948204', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'daa6f024-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4975.72118977, 'message_signature': 'd01c9712c92c761d1b918d96fe7a79e3deb9104e939f77d72c6b4ad139b5e7f4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 124479275, 'user_id': '1472c12f52724fe5aae37390978dd8fe', 'user_name': None, 'project_id': 'c25ab1655ced493998b50733e2d514fc', 'project_name': None, 'resource_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a-sda', 'timestamp': '2025-11-29T06:58:48.107449', 'resource_metadata': {'display_name': 'tempest-ServersTestManualDisk-server-1088610341', 'name': 'instance-0000002c', 'instance_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a', 'instance_type': 'm1.nano', 'host': '4b732d56a3efd101acada57417219e779ad529700523ec3187948204', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'daa6f934-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4975.72118977, 'message_signature': '85503e80b11f2e50c5d1f58d5d05d4d4729dfd680fa610dcc13698bd23a5795a'}]}, 'timestamp': '2025-11-29 06:58:48.107989', '_unique_id': '46520244c42246e9be5f347d63b5a82c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:58:48 compute-0 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.108 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.109 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.109 12 DEBUG ceilometer.compute.pollsters [-] 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bf23f274-6ae7-4618-9e07-e4c73f072ea7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1472c12f52724fe5aae37390978dd8fe', 'user_name': None, 'project_id': 'c25ab1655ced493998b50733e2d514fc', 'project_name': None, 'resource_id': 'instance-0000002c-3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a-tapb9ec3ba2-52', 'timestamp': '2025-11-29T06:58:48.109257', 'resource_metadata': {'display_name': 'tempest-ServersTestManualDisk-server-1088610341', 'name': 'tapb9ec3ba2-52', 'instance_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a', 'instance_type': 'm1.nano', 'host': '4b732d56a3efd101acada57417219e779ad529700523ec3187948204', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e6:38:dc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb9ec3ba2-52'}, 'message_id': 'daa73534-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4975.708319128, 'message_signature': 'c7d7c270293b3fd5e6f5075561e19bfc7bdad7339093338766510dad4e863a37'}]}, 'timestamp': '2025-11-29 06:58:48.109504', '_unique_id': '3d1ee2b4331f466e904fa8c183373423'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServersTestManualDisk-server-1088610341>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersTestManualDisk-server-1088610341>]
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.110 12 DEBUG ceilometer.compute.pollsters [-] 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/network.incoming.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c46e35db-1683-4b5b-bc3d-e5201e6c4308', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': '1472c12f52724fe5aae37390978dd8fe', 'user_name': None, 'project_id': 'c25ab1655ced493998b50733e2d514fc', 'project_name': None, 'resource_id': 'instance-0000002c-3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a-tapb9ec3ba2-52', 'timestamp': '2025-11-29T06:58:48.110977', 'resource_metadata': {'display_name': 'tempest-ServersTestManualDisk-server-1088610341', 'name': 'tapb9ec3ba2-52', 'instance_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a', 'instance_type': 'm1.nano', 'host': '4b732d56a3efd101acada57417219e779ad529700523ec3187948204', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e6:38:dc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb9ec3ba2-52'}, 'message_id': 'daa7779c-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4975.708319128, 'message_signature': '0eb9551f716c7c01aca774b959401caf5133a974069cc423abfd92eb9e17ad25'}]}, 'timestamp': '2025-11-29 06:58:48.111202', '_unique_id': 'dddd49cb9bf64847af4bf841c8f9e359'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.111 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.112 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.112 12 DEBUG ceilometer.compute.pollsters [-] 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.112 12 DEBUG ceilometer.compute.pollsters [-] 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9cdaa41a-cb73-47a8-8d6d-f0fa0ddada5f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '1472c12f52724fe5aae37390978dd8fe', 'user_name': None, 'project_id': 'c25ab1655ced493998b50733e2d514fc', 'project_name': None, 'resource_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a-vda', 'timestamp': '2025-11-29T06:58:48.112303', 'resource_metadata': {'display_name': 'tempest-ServersTestManualDisk-server-1088610341', 'name': 'instance-0000002c', 'instance_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a', 'instance_type': 'm1.nano', 'host': '4b732d56a3efd101acada57417219e779ad529700523ec3187948204', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'daa7ab5e-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4975.803106027, 'message_signature': '84d3693a75c84d921f9ab3f1fc8f4388b8f688b0cf5487b14b8b51f66d2b9e03'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '1472c12f52724fe5aae37390978dd8fe', 'user_name': None, 'project_id': 'c25ab1655ced493998b50733e2d514fc', 'project_name': None, 'resource_id': 
'3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a-sda', 'timestamp': '2025-11-29T06:58:48.112303', 'resource_metadata': {'display_name': 'tempest-ServersTestManualDisk-server-1088610341', 'name': 'instance-0000002c', 'instance_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a', 'instance_type': 'm1.nano', 'host': '4b732d56a3efd101acada57417219e779ad529700523ec3187948204', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'daa7b388-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4975.803106027, 'message_signature': '46c398072c4607feac5a94e0bd37d9c82cbed34a275159d6d9e535a5ad26156f'}]}, 'timestamp': '2025-11-29 06:58:48.112721', '_unique_id': '955b1cd4281a437eb57cf63ccc1ec3c0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.113 12 DEBUG ceilometer.compute.pollsters [-] 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 DEBUG ceilometer.compute.pollsters [-] 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7c5c3a17-4c72-4b7f-a324-75d03f4a26e7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '1472c12f52724fe5aae37390978dd8fe', 'user_name': None, 'project_id': 'c25ab1655ced493998b50733e2d514fc', 'project_name': None, 'resource_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a-vda', 'timestamp': '2025-11-29T06:58:48.113783', 'resource_metadata': {'display_name': 'tempest-ServersTestManualDisk-server-1088610341', 'name': 'instance-0000002c', 'instance_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a', 'instance_type': 'm1.nano', 'host': '4b732d56a3efd101acada57417219e779ad529700523ec3187948204', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'daa7e628-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4975.803106027, 'message_signature': '9780589dab925e5a54cc281163c037789027589067f01cdcf24561b4bec650c7'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': '1472c12f52724fe5aae37390978dd8fe', 'user_name': None, 'project_id': 'c25ab1655ced493998b50733e2d514fc', 'project_name': None, 'resource_id': 
'3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a-sda', 'timestamp': '2025-11-29T06:58:48.113783', 'resource_metadata': {'display_name': 'tempest-ServersTestManualDisk-server-1088610341', 'name': 'instance-0000002c', 'instance_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a', 'instance_type': 'm1.nano', 'host': '4b732d56a3efd101acada57417219e779ad529700523ec3187948204', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'daa7edc6-ccf0-11f0-8f64-fa163e220349', 'monotonic_time': 4975.803106027, 'message_signature': '9e4a15e3dba6b6ef31e5c985b3ba5d5e796436e1c57ea86abc5227f835b6eb99'}]}, 'timestamp': '2025-11-29 06:58:48.114212', '_unique_id': '54959aa4b5a446bdb42eed9e0e39f9ae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 06:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 06:58:48.114 12 ERROR oslo_messaging.notify.messaging 
Nov 29 06:58:48 compute-0 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 06:58:48 compute-0 nova_compute[187185]: 2025-11-29 06:58:48.318 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:58:48 compute-0 nova_compute[187185]: 2025-11-29 06:58:48.319 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 06:58:48 compute-0 nova_compute[187185]: 2025-11-29 06:58:48.650 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:58:49 compute-0 nova_compute[187185]: 2025-11-29 06:58:49.036 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:58:50 compute-0 podman[219802]: 2025-11-29 06:58:50.820600155 +0000 UTC m=+0.080192349 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 29 06:58:51 compute-0 nova_compute[187185]: 2025-11-29 06:58:51.003 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 06:58:51 compute-0 nova_compute[187185]: 2025-11-29 06:58:51.004 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:58:51 compute-0 nova_compute[187185]: 2025-11-29 06:58:51.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:58:51 compute-0 nova_compute[187185]: 2025-11-29 06:58:51.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:58:51 compute-0 nova_compute[187185]: 2025-11-29 06:58:51.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:58:51 compute-0 nova_compute[187185]: 2025-11-29 06:58:51.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 06:58:51 compute-0 nova_compute[187185]: 2025-11-29 06:58:51.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:58:51 compute-0 nova_compute[187185]: 2025-11-29 06:58:51.576 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:58:51 compute-0 nova_compute[187185]: 2025-11-29 06:58:51.577 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:58:51 compute-0 nova_compute[187185]: 2025-11-29 06:58:51.578 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:58:51 compute-0 nova_compute[187185]: 2025-11-29 06:58:51.578 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 06:58:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:58:53.449 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 06:58:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:58:53.451 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 06:58:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:58:53.452 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:58:53 compute-0 nova_compute[187185]: 2025-11-29 06:58:53.505 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:58:53 compute-0 nova_compute[187185]: 2025-11-29 06:58:53.516 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:58:53 compute-0 nova_compute[187185]: 2025-11-29 06:58:53.580 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:58:53 compute-0 nova_compute[187185]: 2025-11-29 06:58:53.582 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 06:58:53 compute-0 nova_compute[187185]: 2025-11-29 06:58:53.642 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 06:58:53 compute-0 nova_compute[187185]: 2025-11-29 06:58:53.651 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:58:53 compute-0 nova_compute[187185]: 2025-11-29 06:58:53.812 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:58:53 compute-0 nova_compute[187185]: 2025-11-29 06:58:53.813 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5569MB free_disk=73.31024169921875GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 06:58:53 compute-0 nova_compute[187185]: 2025-11-29 06:58:53.814 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:58:53 compute-0 nova_compute[187185]: 2025-11-29 06:58:53.814 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:58:54 compute-0 nova_compute[187185]: 2025-11-29 06:58:54.039 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:58:55 compute-0 nova_compute[187185]: 2025-11-29 06:58:55.750 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 06:58:55 compute-0 nova_compute[187185]: 2025-11-29 06:58:55.750 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 06:58:55 compute-0 nova_compute[187185]: 2025-11-29 06:58:55.751 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 06:58:55 compute-0 nova_compute[187185]: 2025-11-29 06:58:55.829 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Refreshing inventories for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 29 06:58:55 compute-0 nova_compute[187185]: 2025-11-29 06:58:55.853 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Updating ProviderTree inventory for provider 4e39a026-df39-4e20-874a-dbb5a40df044 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 29 06:58:55 compute-0 nova_compute[187185]: 2025-11-29 06:58:55.853 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Updating inventory in ProviderTree for provider 4e39a026-df39-4e20-874a-dbb5a40df044 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 29 06:58:55 compute-0 nova_compute[187185]: 2025-11-29 06:58:55.941 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Refreshing aggregate associations for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 29 06:58:55 compute-0 nova_compute[187185]: 2025-11-29 06:58:55.978 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Refreshing trait associations for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044, traits: HW_CPU_X86_SSE,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_TRUSTED_CERTS,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 29 06:58:56 compute-0 nova_compute[187185]: 2025-11-29 06:58:56.050 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:58:58 compute-0 nova_compute[187185]: 2025-11-29 06:58:58.655 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:58:59 compute-0 nova_compute[187185]: 2025-11-29 06:58:59.042 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:59:01 compute-0 nova_compute[187185]: 2025-11-29 06:59:01.059 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:59:01 compute-0 nova_compute[187185]: 2025-11-29 06:59:01.238 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 06:59:01 compute-0 nova_compute[187185]: 2025-11-29 06:59:01.238 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 7.424s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:59:02 compute-0 nova_compute[187185]: 2025-11-29 06:59:02.856 187189 DEBUG oslo_concurrency.lockutils [None req-97bb66a3-eaba-4455-865d-c5dff300ebde 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Acquiring lock "3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:59:02 compute-0 nova_compute[187185]: 2025-11-29 06:59:02.857 187189 DEBUG oslo_concurrency.lockutils [None req-97bb66a3-eaba-4455-865d-c5dff300ebde 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Lock "3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:59:02 compute-0 nova_compute[187185]: 2025-11-29 06:59:02.857 187189 DEBUG oslo_concurrency.lockutils [None req-97bb66a3-eaba-4455-865d-c5dff300ebde 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Acquiring lock "3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:59:02 compute-0 nova_compute[187185]: 2025-11-29 06:59:02.858 187189 DEBUG oslo_concurrency.lockutils [None req-97bb66a3-eaba-4455-865d-c5dff300ebde 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Lock "3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:59:02 compute-0 nova_compute[187185]: 2025-11-29 06:59:02.858 187189 DEBUG oslo_concurrency.lockutils [None req-97bb66a3-eaba-4455-865d-c5dff300ebde 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Lock "3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:59:02 compute-0 nova_compute[187185]: 2025-11-29 06:59:02.894 187189 INFO nova.compute.manager [None req-97bb66a3-eaba-4455-865d-c5dff300ebde 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Terminating instance
Nov 29 06:59:02 compute-0 nova_compute[187185]: 2025-11-29 06:59:02.906 187189 DEBUG nova.compute.manager [None req-97bb66a3-eaba-4455-865d-c5dff300ebde 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 06:59:03 compute-0 nova_compute[187185]: 2025-11-29 06:59:03.237 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:59:03 compute-0 ovn_controller[95281]: 2025-11-29T06:59:03Z|00104|binding|INFO|Releasing lport 8f6e6a63-3adc-455a-821d-4f8f756e8b07 from this chassis (sb_readonly=0)
Nov 29 06:59:03 compute-0 nova_compute[187185]: 2025-11-29 06:59:03.675 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:59:03 compute-0 kernel: tapb9ec3ba2-52 (unregistering): left promiscuous mode
Nov 29 06:59:03 compute-0 NetworkManager[55227]: <info>  [1764399543.6878] device (tapb9ec3ba2-52): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 06:59:03 compute-0 ovn_controller[95281]: 2025-11-29T06:59:03Z|00105|binding|INFO|Releasing lport b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e from this chassis (sb_readonly=0)
Nov 29 06:59:03 compute-0 ovn_controller[95281]: 2025-11-29T06:59:03Z|00106|binding|INFO|Setting lport b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e down in Southbound
Nov 29 06:59:03 compute-0 nova_compute[187185]: 2025-11-29 06:59:03.698 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:59:03 compute-0 ovn_controller[95281]: 2025-11-29T06:59:03Z|00107|binding|INFO|Removing iface tapb9ec3ba2-52 ovn-installed in OVS
Nov 29 06:59:03 compute-0 nova_compute[187185]: 2025-11-29 06:59:03.700 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:59:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:59:03.707 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e6:38:dc 10.100.0.8'], port_security=['fa:16:3e:e6:38:dc 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5c312f91-84b2-4c6f-94ed-87030fed964b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c25ab1655ced493998b50733e2d514fc', 'neutron:revision_number': '4', 'neutron:security_group_ids': '49e53e3a-7c88-48ec-a482-6b67c8d5d0fa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.204'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ca554f30-048a-4396-b118-d70c3e4bb5fc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 06:59:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:59:03.709 104254 INFO neutron.agent.ovn.metadata.agent [-] Port b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e in datapath 5c312f91-84b2-4c6f-94ed-87030fed964b unbound from our chassis
Nov 29 06:59:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:59:03.711 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5c312f91-84b2-4c6f-94ed-87030fed964b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 06:59:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:59:03.714 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e5b6a481-59e6-44f2-960e-25fcca5f1474]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:59:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:59:03.715 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5c312f91-84b2-4c6f-94ed-87030fed964b namespace which is not needed anymore
Nov 29 06:59:03 compute-0 nova_compute[187185]: 2025-11-29 06:59:03.716 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:59:03 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d0000002c.scope: Deactivated successfully.
Nov 29 06:59:03 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d0000002c.scope: Consumed 15.242s CPU time.
Nov 29 06:59:03 compute-0 systemd-machined[153486]: Machine qemu-17-instance-0000002c terminated.
Nov 29 06:59:03 compute-0 podman[219836]: 2025-11-29 06:59:03.824996971 +0000 UTC m=+0.080016144 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 06:59:03 compute-0 podman[219832]: 2025-11-29 06:59:03.825008041 +0000 UTC m=+0.090299943 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 06:59:03 compute-0 podman[219834]: 2025-11-29 06:59:03.860526371 +0000 UTC m=+0.119009482 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, config_id=edpm, maintainer=Red Hat, Inc., name=ubi9-minimal, io.buildah.version=1.33.7, managed_by=edpm_ansible, version=9.6, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, architecture=x86_64)
Nov 29 06:59:03 compute-0 nova_compute[187185]: 2025-11-29 06:59:03.926 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:59:03 compute-0 nova_compute[187185]: 2025-11-29 06:59:03.932 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:59:03 compute-0 nova_compute[187185]: 2025-11-29 06:59:03.972 187189 INFO nova.virt.libvirt.driver [-] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Instance destroyed successfully.
Nov 29 06:59:03 compute-0 nova_compute[187185]: 2025-11-29 06:59:03.973 187189 DEBUG nova.objects.instance [None req-97bb66a3-eaba-4455-865d-c5dff300ebde 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Lazy-loading 'resources' on Instance uuid 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 06:59:03 compute-0 nova_compute[187185]: 2025-11-29 06:59:03.996 187189 DEBUG nova.virt.libvirt.vif [None req-97bb66a3-eaba-4455-865d-c5dff300ebde 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T06:58:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestManualDisk-server-1088610341',display_name='tempest-ServersTestManualDisk-server-1088610341',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmanualdisk-server-1088610341',id=44,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMEM6sFLOUOrNh6eIgURmzcYIUeOV5JocPOhQ5sp4OUldoYBnfHHW9kH+GzZIRDjuYfhnkJMZvDdFm6zREoS69pxxAvYkbzTFiySaqfRHd87bUOgR0mLJJIq/7DNCsg0Lw==',key_name='tempest-keypair-1082752568',keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:58:16Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c25ab1655ced493998b50733e2d514fc',ramdisk_id='',reservation_id='r-kuffkjmg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestManualDisk-1726842339',owner_user_name='tempest-ServersTestManualDisk-1726842339-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T06:58:16Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1472c12f52724fe5aae37390978dd8fe',uuid=3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e", "address": "fa:16:3e:e6:38:dc", "network": {"id": "5c312f91-84b2-4c6f-94ed-87030fed964b", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-218404435-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": 
[], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c25ab1655ced493998b50733e2d514fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb9ec3ba2-52", "ovs_interfaceid": "b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 06:59:03 compute-0 nova_compute[187185]: 2025-11-29 06:59:03.996 187189 DEBUG nova.network.os_vif_util [None req-97bb66a3-eaba-4455-865d-c5dff300ebde 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Converting VIF {"id": "b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e", "address": "fa:16:3e:e6:38:dc", "network": {"id": "5c312f91-84b2-4c6f-94ed-87030fed964b", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-218404435-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c25ab1655ced493998b50733e2d514fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb9ec3ba2-52", "ovs_interfaceid": "b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 06:59:03 compute-0 nova_compute[187185]: 2025-11-29 06:59:03.997 187189 DEBUG nova.network.os_vif_util [None req-97bb66a3-eaba-4455-865d-c5dff300ebde 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e6:38:dc,bridge_name='br-int',has_traffic_filtering=True,id=b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e,network=Network(5c312f91-84b2-4c6f-94ed-87030fed964b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb9ec3ba2-52') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 06:59:03 compute-0 nova_compute[187185]: 2025-11-29 06:59:03.997 187189 DEBUG os_vif [None req-97bb66a3-eaba-4455-865d-c5dff300ebde 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e6:38:dc,bridge_name='br-int',has_traffic_filtering=True,id=b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e,network=Network(5c312f91-84b2-4c6f-94ed-87030fed964b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb9ec3ba2-52') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 06:59:03 compute-0 nova_compute[187185]: 2025-11-29 06:59:03.999 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:59:03 compute-0 nova_compute[187185]: 2025-11-29 06:59:03.999 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb9ec3ba2-52, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:59:04 compute-0 nova_compute[187185]: 2025-11-29 06:59:04.001 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:59:04 compute-0 nova_compute[187185]: 2025-11-29 06:59:04.004 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 06:59:04 compute-0 nova_compute[187185]: 2025-11-29 06:59:04.008 187189 INFO os_vif [None req-97bb66a3-eaba-4455-865d-c5dff300ebde 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e6:38:dc,bridge_name='br-int',has_traffic_filtering=True,id=b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e,network=Network(5c312f91-84b2-4c6f-94ed-87030fed964b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb9ec3ba2-52')
Nov 29 06:59:04 compute-0 nova_compute[187185]: 2025-11-29 06:59:04.009 187189 INFO nova.virt.libvirt.driver [None req-97bb66a3-eaba-4455-865d-c5dff300ebde 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Deleting instance files /var/lib/nova/instances/3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a_del
Nov 29 06:59:04 compute-0 nova_compute[187185]: 2025-11-29 06:59:04.010 187189 INFO nova.virt.libvirt.driver [None req-97bb66a3-eaba-4455-865d-c5dff300ebde 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Deletion of /var/lib/nova/instances/3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a_del complete
Nov 29 06:59:04 compute-0 nova_compute[187185]: 2025-11-29 06:59:04.029 187189 DEBUG nova.compute.manager [req-c9d61237-9cfd-4e74-8a26-89e31d60f028 req-75a1d880-080f-44ea-abeb-b4171c66c3ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Received event network-vif-unplugged-b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:59:04 compute-0 nova_compute[187185]: 2025-11-29 06:59:04.029 187189 DEBUG oslo_concurrency.lockutils [req-c9d61237-9cfd-4e74-8a26-89e31d60f028 req-75a1d880-080f-44ea-abeb-b4171c66c3ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:59:04 compute-0 nova_compute[187185]: 2025-11-29 06:59:04.029 187189 DEBUG oslo_concurrency.lockutils [req-c9d61237-9cfd-4e74-8a26-89e31d60f028 req-75a1d880-080f-44ea-abeb-b4171c66c3ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:59:04 compute-0 nova_compute[187185]: 2025-11-29 06:59:04.030 187189 DEBUG oslo_concurrency.lockutils [req-c9d61237-9cfd-4e74-8a26-89e31d60f028 req-75a1d880-080f-44ea-abeb-b4171c66c3ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:59:04 compute-0 nova_compute[187185]: 2025-11-29 06:59:04.030 187189 DEBUG nova.compute.manager [req-c9d61237-9cfd-4e74-8a26-89e31d60f028 req-75a1d880-080f-44ea-abeb-b4171c66c3ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] No waiting events found dispatching network-vif-unplugged-b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 06:59:04 compute-0 nova_compute[187185]: 2025-11-29 06:59:04.030 187189 DEBUG nova.compute.manager [req-c9d61237-9cfd-4e74-8a26-89e31d60f028 req-75a1d880-080f-44ea-abeb-b4171c66c3ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Received event network-vif-unplugged-b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 06:59:04 compute-0 nova_compute[187185]: 2025-11-29 06:59:04.164 187189 INFO nova.compute.manager [None req-97bb66a3-eaba-4455-865d-c5dff300ebde 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Took 1.26 seconds to destroy the instance on the hypervisor.
Nov 29 06:59:04 compute-0 nova_compute[187185]: 2025-11-29 06:59:04.165 187189 DEBUG oslo.service.loopingcall [None req-97bb66a3-eaba-4455-865d-c5dff300ebde 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 06:59:04 compute-0 nova_compute[187185]: 2025-11-29 06:59:04.166 187189 DEBUG nova.compute.manager [-] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 06:59:04 compute-0 nova_compute[187185]: 2025-11-29 06:59:04.167 187189 DEBUG nova.network.neutron [-] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 06:59:04 compute-0 neutron-haproxy-ovnmeta-5c312f91-84b2-4c6f-94ed-87030fed964b[219620]: [NOTICE]   (219624) : haproxy version is 2.8.14-c23fe91
Nov 29 06:59:04 compute-0 neutron-haproxy-ovnmeta-5c312f91-84b2-4c6f-94ed-87030fed964b[219620]: [NOTICE]   (219624) : path to executable is /usr/sbin/haproxy
Nov 29 06:59:04 compute-0 neutron-haproxy-ovnmeta-5c312f91-84b2-4c6f-94ed-87030fed964b[219620]: [WARNING]  (219624) : Exiting Master process...
Nov 29 06:59:04 compute-0 neutron-haproxy-ovnmeta-5c312f91-84b2-4c6f-94ed-87030fed964b[219620]: [ALERT]    (219624) : Current worker (219626) exited with code 143 (Terminated)
Nov 29 06:59:04 compute-0 neutron-haproxy-ovnmeta-5c312f91-84b2-4c6f-94ed-87030fed964b[219620]: [WARNING]  (219624) : All workers exited. Exiting... (0)
Nov 29 06:59:04 compute-0 systemd[1]: libpod-0a7219f28e89b5e111fee55ab5369cfcb7fb9171ce9b80ea6ea3f96f22eb2176.scope: Deactivated successfully.
Nov 29 06:59:04 compute-0 podman[219908]: 2025-11-29 06:59:04.328084036 +0000 UTC m=+0.481198720 container died 0a7219f28e89b5e111fee55ab5369cfcb7fb9171ce9b80ea6ea3f96f22eb2176 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5c312f91-84b2-4c6f-94ed-87030fed964b, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:59:04 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0a7219f28e89b5e111fee55ab5369cfcb7fb9171ce9b80ea6ea3f96f22eb2176-userdata-shm.mount: Deactivated successfully.
Nov 29 06:59:04 compute-0 systemd[1]: var-lib-containers-storage-overlay-a8025cd861c71a8fc26d222bb9d3e3d398a73dbf69a87ef8a91b8ddcd2d1acf6-merged.mount: Deactivated successfully.
Nov 29 06:59:05 compute-0 podman[219908]: 2025-11-29 06:59:05.44261751 +0000 UTC m=+1.595732184 container cleanup 0a7219f28e89b5e111fee55ab5369cfcb7fb9171ce9b80ea6ea3f96f22eb2176 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5c312f91-84b2-4c6f-94ed-87030fed964b, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:59:05 compute-0 systemd[1]: libpod-conmon-0a7219f28e89b5e111fee55ab5369cfcb7fb9171ce9b80ea6ea3f96f22eb2176.scope: Deactivated successfully.
Nov 29 06:59:05 compute-0 podman[219953]: 2025-11-29 06:59:05.97477907 +0000 UTC m=+0.498484664 container remove 0a7219f28e89b5e111fee55ab5369cfcb7fb9171ce9b80ea6ea3f96f22eb2176 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5c312f91-84b2-4c6f-94ed-87030fed964b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 06:59:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:59:05.983 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[9ab76f67-b122-4afa-b3bb-707d297773b8]: (4, ('Sat Nov 29 06:59:03 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-5c312f91-84b2-4c6f-94ed-87030fed964b (0a7219f28e89b5e111fee55ab5369cfcb7fb9171ce9b80ea6ea3f96f22eb2176)\n0a7219f28e89b5e111fee55ab5369cfcb7fb9171ce9b80ea6ea3f96f22eb2176\nSat Nov 29 06:59:05 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-5c312f91-84b2-4c6f-94ed-87030fed964b (0a7219f28e89b5e111fee55ab5369cfcb7fb9171ce9b80ea6ea3f96f22eb2176)\n0a7219f28e89b5e111fee55ab5369cfcb7fb9171ce9b80ea6ea3f96f22eb2176\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:59:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:59:05.985 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d5698ed4-46a8-4e56-abf6-0310eb6f34b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:59:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:59:05.986 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5c312f91-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:59:05 compute-0 nova_compute[187185]: 2025-11-29 06:59:05.989 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:59:05 compute-0 kernel: tap5c312f91-80: left promiscuous mode
Nov 29 06:59:06 compute-0 nova_compute[187185]: 2025-11-29 06:59:06.001 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:59:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:59:06.004 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[1f3ea683-d9cc-4500-9ac9-cf3a99ca1042]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:59:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:59:06.024 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[1c2735a7-2fd4-41b2-a600-ea86b22ce6f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:59:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:59:06.025 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[67221ac4-67f9-4caf-bce2-940a4cf4840e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:59:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:59:06.045 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[b937294d-43bf-427e-bee7-0ea72f6e370b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 494337, 'reachable_time': 27869, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219969, 'error': None, 'target': 'ovnmeta-5c312f91-84b2-4c6f-94ed-87030fed964b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:59:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:59:06.048 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5c312f91-84b2-4c6f-94ed-87030fed964b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 06:59:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:59:06.048 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[f2bc1c60-ca7d-488c-bc7a-10dc7bf875ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 06:59:06 compute-0 systemd[1]: run-netns-ovnmeta\x2d5c312f91\x2d84b2\x2d4c6f\x2d94ed\x2d87030fed964b.mount: Deactivated successfully.
Nov 29 06:59:06 compute-0 nova_compute[187185]: 2025-11-29 06:59:06.208 187189 DEBUG nova.network.neutron [-] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 06:59:06 compute-0 nova_compute[187185]: 2025-11-29 06:59:06.234 187189 INFO nova.compute.manager [-] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Took 2.07 seconds to deallocate network for instance.
Nov 29 06:59:06 compute-0 nova_compute[187185]: 2025-11-29 06:59:06.372 187189 DEBUG nova.compute.manager [req-1e44121b-4b60-49f0-b6d4-7346a71d4f39 req-01c0a6f2-398e-4af9-91de-76634c7ed41e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Received event network-vif-plugged-b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:59:06 compute-0 nova_compute[187185]: 2025-11-29 06:59:06.373 187189 DEBUG oslo_concurrency.lockutils [req-1e44121b-4b60-49f0-b6d4-7346a71d4f39 req-01c0a6f2-398e-4af9-91de-76634c7ed41e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:59:06 compute-0 nova_compute[187185]: 2025-11-29 06:59:06.373 187189 DEBUG oslo_concurrency.lockutils [req-1e44121b-4b60-49f0-b6d4-7346a71d4f39 req-01c0a6f2-398e-4af9-91de-76634c7ed41e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:59:06 compute-0 nova_compute[187185]: 2025-11-29 06:59:06.374 187189 DEBUG oslo_concurrency.lockutils [req-1e44121b-4b60-49f0-b6d4-7346a71d4f39 req-01c0a6f2-398e-4af9-91de-76634c7ed41e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:59:06 compute-0 nova_compute[187185]: 2025-11-29 06:59:06.374 187189 DEBUG nova.compute.manager [req-1e44121b-4b60-49f0-b6d4-7346a71d4f39 req-01c0a6f2-398e-4af9-91de-76634c7ed41e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] No waiting events found dispatching network-vif-plugged-b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 06:59:06 compute-0 nova_compute[187185]: 2025-11-29 06:59:06.375 187189 WARNING nova.compute.manager [req-1e44121b-4b60-49f0-b6d4-7346a71d4f39 req-01c0a6f2-398e-4af9-91de-76634c7ed41e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Received unexpected event network-vif-plugged-b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e for instance with vm_state deleted and task_state None.
Nov 29 06:59:06 compute-0 nova_compute[187185]: 2025-11-29 06:59:06.396 187189 DEBUG oslo_concurrency.lockutils [None req-97bb66a3-eaba-4455-865d-c5dff300ebde 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:59:06 compute-0 nova_compute[187185]: 2025-11-29 06:59:06.397 187189 DEBUG oslo_concurrency.lockutils [None req-97bb66a3-eaba-4455-865d-c5dff300ebde 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:59:06 compute-0 nova_compute[187185]: 2025-11-29 06:59:06.487 187189 DEBUG nova.compute.manager [req-46e9520c-000f-4989-b177-215cfa1a2e98 req-4f38b2ba-231b-4097-804b-b788a8cb0013 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Received event network-vif-deleted-b9ec3ba2-527d-4ee3-a13b-5d8ea4c53c6e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 06:59:06 compute-0 nova_compute[187185]: 2025-11-29 06:59:06.496 187189 DEBUG nova.compute.provider_tree [None req-97bb66a3-eaba-4455-865d-c5dff300ebde 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:59:06 compute-0 nova_compute[187185]: 2025-11-29 06:59:06.514 187189 DEBUG nova.scheduler.client.report [None req-97bb66a3-eaba-4455-865d-c5dff300ebde 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:59:06 compute-0 nova_compute[187185]: 2025-11-29 06:59:06.590 187189 DEBUG oslo_concurrency.lockutils [None req-97bb66a3-eaba-4455-865d-c5dff300ebde 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.194s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:59:06 compute-0 nova_compute[187185]: 2025-11-29 06:59:06.631 187189 INFO nova.scheduler.client.report [None req-97bb66a3-eaba-4455-865d-c5dff300ebde 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Deleted allocations for instance 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a
Nov 29 06:59:06 compute-0 nova_compute[187185]: 2025-11-29 06:59:06.826 187189 DEBUG oslo_concurrency.lockutils [None req-97bb66a3-eaba-4455-865d-c5dff300ebde 1472c12f52724fe5aae37390978dd8fe c25ab1655ced493998b50733e2d514fc - - default default] Lock "3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.969s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:59:08 compute-0 nova_compute[187185]: 2025-11-29 06:59:08.677 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:59:09 compute-0 nova_compute[187185]: 2025-11-29 06:59:09.002 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:59:09 compute-0 podman[219970]: 2025-11-29 06:59:09.822020266 +0000 UTC m=+0.078362377 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 06:59:12 compute-0 podman[219997]: 2025-11-29 06:59:12.818532689 +0000 UTC m=+0.067241974 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 06:59:12 compute-0 podman[219996]: 2025-11-29 06:59:12.825630619 +0000 UTC m=+0.082084762 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 06:59:13 compute-0 nova_compute[187185]: 2025-11-29 06:59:13.678 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:59:14 compute-0 nova_compute[187185]: 2025-11-29 06:59:14.004 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:59:18 compute-0 nova_compute[187185]: 2025-11-29 06:59:18.681 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:59:18 compute-0 nova_compute[187185]: 2025-11-29 06:59:18.970 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399543.9690373, 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 06:59:18 compute-0 nova_compute[187185]: 2025-11-29 06:59:18.971 187189 INFO nova.compute.manager [-] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] VM Stopped (Lifecycle Event)
Nov 29 06:59:19 compute-0 nova_compute[187185]: 2025-11-29 06:59:19.006 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:59:19 compute-0 nova_compute[187185]: 2025-11-29 06:59:19.019 187189 DEBUG nova.compute.manager [None req-ddcb4e46-281d-4e8b-970b-56c514f0579d - - - - - -] [instance: 3c0c4af9-1bfc-4a31-acf5-0ec01c62fe9a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 06:59:21 compute-0 podman[220038]: 2025-11-29 06:59:21.807119288 +0000 UTC m=+0.066070161 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd)
Nov 29 06:59:23 compute-0 nova_compute[187185]: 2025-11-29 06:59:23.686 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:59:24 compute-0 nova_compute[187185]: 2025-11-29 06:59:24.008 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:59:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:59:24.819 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:59:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:59:24.819 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:59:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:59:24.820 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:59:25 compute-0 nova_compute[187185]: 2025-11-29 06:59:25.637 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:59:25 compute-0 nova_compute[187185]: 2025-11-29 06:59:25.751 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:59:28 compute-0 nova_compute[187185]: 2025-11-29 06:59:28.689 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:59:29 compute-0 nova_compute[187185]: 2025-11-29 06:59:29.010 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:59:33 compute-0 nova_compute[187185]: 2025-11-29 06:59:33.690 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:59:34 compute-0 nova_compute[187185]: 2025-11-29 06:59:34.013 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:59:34 compute-0 podman[220062]: 2025-11-29 06:59:34.792272405 +0000 UTC m=+0.054926558 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, config_id=edpm, vcs-type=git, io.openshift.tags=minimal rhel9, architecture=x86_64, release=1755695350, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, container_name=openstack_network_exporter, distribution-scope=public)
Nov 29 06:59:34 compute-0 podman[220063]: 2025-11-29 06:59:34.802924555 +0000 UTC m=+0.056840412 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 06:59:34 compute-0 podman[220061]: 2025-11-29 06:59:34.816287461 +0000 UTC m=+0.081589018 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 06:59:38 compute-0 nova_compute[187185]: 2025-11-29 06:59:38.692 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:59:39 compute-0 nova_compute[187185]: 2025-11-29 06:59:39.016 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:59:40 compute-0 podman[220123]: 2025-11-29 06:59:40.889898663 +0000 UTC m=+0.148266876 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 06:59:43 compute-0 nova_compute[187185]: 2025-11-29 06:59:43.694 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:59:43 compute-0 podman[220151]: 2025-11-29 06:59:43.820676214 +0000 UTC m=+0.078775399 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Nov 29 06:59:43 compute-0 podman[220152]: 2025-11-29 06:59:43.843210978 +0000 UTC m=+0.086969399 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 06:59:44 compute-0 nova_compute[187185]: 2025-11-29 06:59:44.018 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:59:48 compute-0 nova_compute[187185]: 2025-11-29 06:59:48.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:59:48 compute-0 nova_compute[187185]: 2025-11-29 06:59:48.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:59:48 compute-0 nova_compute[187185]: 2025-11-29 06:59:48.698 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:59:49 compute-0 nova_compute[187185]: 2025-11-29 06:59:49.022 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:59:49 compute-0 nova_compute[187185]: 2025-11-29 06:59:49.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:59:49 compute-0 nova_compute[187185]: 2025-11-29 06:59:49.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 06:59:49 compute-0 nova_compute[187185]: 2025-11-29 06:59:49.318 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 06:59:49 compute-0 nova_compute[187185]: 2025-11-29 06:59:49.350 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 06:59:50 compute-0 nova_compute[187185]: 2025-11-29 06:59:50.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:59:51 compute-0 nova_compute[187185]: 2025-11-29 06:59:51.311 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:59:51 compute-0 nova_compute[187185]: 2025-11-29 06:59:51.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:59:51 compute-0 nova_compute[187185]: 2025-11-29 06:59:51.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:59:51 compute-0 nova_compute[187185]: 2025-11-29 06:59:51.477 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:59:51 compute-0 nova_compute[187185]: 2025-11-29 06:59:51.477 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:59:51 compute-0 nova_compute[187185]: 2025-11-29 06:59:51.477 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:59:51 compute-0 nova_compute[187185]: 2025-11-29 06:59:51.478 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 06:59:51 compute-0 nova_compute[187185]: 2025-11-29 06:59:51.676 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 06:59:51 compute-0 nova_compute[187185]: 2025-11-29 06:59:51.678 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5777MB free_disk=73.33893966674805GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 06:59:51 compute-0 nova_compute[187185]: 2025-11-29 06:59:51.678 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 06:59:51 compute-0 nova_compute[187185]: 2025-11-29 06:59:51.678 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 06:59:52 compute-0 nova_compute[187185]: 2025-11-29 06:59:52.063 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 06:59:52 compute-0 nova_compute[187185]: 2025-11-29 06:59:52.064 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 06:59:52 compute-0 nova_compute[187185]: 2025-11-29 06:59:52.085 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 06:59:52 compute-0 nova_compute[187185]: 2025-11-29 06:59:52.256 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 06:59:52 compute-0 nova_compute[187185]: 2025-11-29 06:59:52.388 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 06:59:52 compute-0 nova_compute[187185]: 2025-11-29 06:59:52.389 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.710s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 06:59:52 compute-0 podman[220196]: 2025-11-29 06:59:52.838818254 +0000 UTC m=+0.087228505 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd)
Nov 29 06:59:53 compute-0 nova_compute[187185]: 2025-11-29 06:59:53.389 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:59:53 compute-0 nova_compute[187185]: 2025-11-29 06:59:53.390 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 06:59:53 compute-0 nova_compute[187185]: 2025-11-29 06:59:53.706 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:59:54 compute-0 nova_compute[187185]: 2025-11-29 06:59:54.024 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:59:54 compute-0 nova_compute[187185]: 2025-11-29 06:59:54.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:59:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:59:54.768 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 06:59:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:59:54.769 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 06:59:54 compute-0 nova_compute[187185]: 2025-11-29 06:59:54.769 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:59:55 compute-0 nova_compute[187185]: 2025-11-29 06:59:55.314 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 06:59:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 06:59:56.772 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 06:59:58 compute-0 nova_compute[187185]: 2025-11-29 06:59:58.708 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 06:59:59 compute-0 nova_compute[187185]: 2025-11-29 06:59:59.027 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:00:03 compute-0 nova_compute[187185]: 2025-11-29 07:00:03.710 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:00:04 compute-0 nova_compute[187185]: 2025-11-29 07:00:04.029 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:00:05 compute-0 podman[220216]: 2025-11-29 07:00:05.790089989 +0000 UTC m=+0.050661318 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 29 07:00:05 compute-0 podman[220217]: 2025-11-29 07:00:05.806771498 +0000 UTC m=+0.057545321 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, distribution-scope=public, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_id=edpm, architecture=x86_64, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 29 07:00:05 compute-0 podman[220218]: 2025-11-29 07:00:05.825725632 +0000 UTC m=+0.076253368 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 07:00:08 compute-0 nova_compute[187185]: 2025-11-29 07:00:08.712 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:00:09 compute-0 nova_compute[187185]: 2025-11-29 07:00:09.031 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:00:11 compute-0 podman[220283]: 2025-11-29 07:00:11.839822568 +0000 UTC m=+0.108318651 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 07:00:12 compute-0 sshd-session[220281]: Invalid user linuxacademy from 103.179.56.44 port 49248
Nov 29 07:00:12 compute-0 sshd-session[220281]: Received disconnect from 103.179.56.44 port 49248:11: Bye Bye [preauth]
Nov 29 07:00:12 compute-0 sshd-session[220281]: Disconnected from invalid user linuxacademy 103.179.56.44 port 49248 [preauth]
Nov 29 07:00:13 compute-0 nova_compute[187185]: 2025-11-29 07:00:13.714 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:00:14 compute-0 nova_compute[187185]: 2025-11-29 07:00:14.033 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:00:14 compute-0 podman[220311]: 2025-11-29 07:00:14.795752497 +0000 UTC m=+0.063421126 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:00:14 compute-0 podman[220312]: 2025-11-29 07:00:14.805723888 +0000 UTC m=+0.066159354 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 07:00:15 compute-0 ovn_controller[95281]: 2025-11-29T07:00:15Z|00108|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 29 07:00:18 compute-0 nova_compute[187185]: 2025-11-29 07:00:18.716 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:00:19 compute-0 nova_compute[187185]: 2025-11-29 07:00:19.036 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:00:23 compute-0 nova_compute[187185]: 2025-11-29 07:00:23.718 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:00:24 compute-0 nova_compute[187185]: 2025-11-29 07:00:24.093 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:00:24 compute-0 podman[220355]: 2025-11-29 07:00:24.581370737 +0000 UTC m=+0.833490758 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 29 07:00:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:00:24.819 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:00:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:00:24.820 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:00:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:00:24.820 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:00:26 compute-0 nova_compute[187185]: 2025-11-29 07:00:26.422 187189 DEBUG oslo_concurrency.lockutils [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "0c53e488-5068-4650-b5ab-66c486f03efa" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:00:26 compute-0 nova_compute[187185]: 2025-11-29 07:00:26.422 187189 DEBUG oslo_concurrency.lockutils [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "0c53e488-5068-4650-b5ab-66c486f03efa" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:00:26 compute-0 nova_compute[187185]: 2025-11-29 07:00:26.442 187189 DEBUG nova.compute.manager [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 07:00:26 compute-0 nova_compute[187185]: 2025-11-29 07:00:26.609 187189 DEBUG oslo_concurrency.lockutils [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:00:26 compute-0 nova_compute[187185]: 2025-11-29 07:00:26.609 187189 DEBUG oslo_concurrency.lockutils [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:00:26 compute-0 nova_compute[187185]: 2025-11-29 07:00:26.621 187189 DEBUG nova.virt.hardware [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 07:00:26 compute-0 nova_compute[187185]: 2025-11-29 07:00:26.621 187189 INFO nova.compute.claims [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Claim successful on node compute-0.ctlplane.example.com
Nov 29 07:00:26 compute-0 nova_compute[187185]: 2025-11-29 07:00:26.807 187189 DEBUG nova.compute.provider_tree [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:00:26 compute-0 nova_compute[187185]: 2025-11-29 07:00:26.824 187189 DEBUG nova.scheduler.client.report [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:00:26 compute-0 nova_compute[187185]: 2025-11-29 07:00:26.854 187189 DEBUG oslo_concurrency.lockutils [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.245s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:00:26 compute-0 nova_compute[187185]: 2025-11-29 07:00:26.855 187189 DEBUG nova.compute.manager [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 07:00:26 compute-0 nova_compute[187185]: 2025-11-29 07:00:26.981 187189 DEBUG nova.compute.manager [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 07:00:26 compute-0 nova_compute[187185]: 2025-11-29 07:00:26.981 187189 DEBUG nova.network.neutron [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 07:00:27 compute-0 nova_compute[187185]: 2025-11-29 07:00:27.021 187189 INFO nova.virt.libvirt.driver [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 07:00:27 compute-0 nova_compute[187185]: 2025-11-29 07:00:27.052 187189 DEBUG nova.compute.manager [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 07:00:27 compute-0 nova_compute[187185]: 2025-11-29 07:00:27.293 187189 DEBUG nova.compute.manager [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 07:00:27 compute-0 nova_compute[187185]: 2025-11-29 07:00:27.294 187189 DEBUG nova.virt.libvirt.driver [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 07:00:27 compute-0 nova_compute[187185]: 2025-11-29 07:00:27.295 187189 INFO nova.virt.libvirt.driver [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Creating image(s)
Nov 29 07:00:27 compute-0 nova_compute[187185]: 2025-11-29 07:00:27.296 187189 DEBUG oslo_concurrency.lockutils [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "/var/lib/nova/instances/0c53e488-5068-4650-b5ab-66c486f03efa/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:00:27 compute-0 nova_compute[187185]: 2025-11-29 07:00:27.296 187189 DEBUG oslo_concurrency.lockutils [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "/var/lib/nova/instances/0c53e488-5068-4650-b5ab-66c486f03efa/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:00:27 compute-0 nova_compute[187185]: 2025-11-29 07:00:27.298 187189 DEBUG oslo_concurrency.lockutils [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "/var/lib/nova/instances/0c53e488-5068-4650-b5ab-66c486f03efa/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:00:27 compute-0 nova_compute[187185]: 2025-11-29 07:00:27.317 187189 DEBUG oslo_concurrency.processutils [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:00:27 compute-0 nova_compute[187185]: 2025-11-29 07:00:27.375 187189 DEBUG oslo_concurrency.processutils [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:00:27 compute-0 nova_compute[187185]: 2025-11-29 07:00:27.376 187189 DEBUG oslo_concurrency.lockutils [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:00:27 compute-0 nova_compute[187185]: 2025-11-29 07:00:27.377 187189 DEBUG oslo_concurrency.lockutils [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:00:27 compute-0 nova_compute[187185]: 2025-11-29 07:00:27.387 187189 DEBUG oslo_concurrency.processutils [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:00:27 compute-0 nova_compute[187185]: 2025-11-29 07:00:27.422 187189 DEBUG nova.policy [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 07:00:27 compute-0 nova_compute[187185]: 2025-11-29 07:00:27.438 187189 DEBUG oslo_concurrency.processutils [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:00:27 compute-0 nova_compute[187185]: 2025-11-29 07:00:27.438 187189 DEBUG oslo_concurrency.processutils [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/0c53e488-5068-4650-b5ab-66c486f03efa/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:00:28 compute-0 nova_compute[187185]: 2025-11-29 07:00:28.681 187189 DEBUG oslo_concurrency.processutils [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/0c53e488-5068-4650-b5ab-66c486f03efa/disk 1073741824" returned: 0 in 1.242s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:00:28 compute-0 nova_compute[187185]: 2025-11-29 07:00:28.682 187189 DEBUG oslo_concurrency.lockutils [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 1.305s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:00:28 compute-0 nova_compute[187185]: 2025-11-29 07:00:28.683 187189 DEBUG oslo_concurrency.processutils [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:00:28 compute-0 nova_compute[187185]: 2025-11-29 07:00:28.720 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:00:28 compute-0 nova_compute[187185]: 2025-11-29 07:00:28.773 187189 DEBUG oslo_concurrency.processutils [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:00:28 compute-0 nova_compute[187185]: 2025-11-29 07:00:28.774 187189 DEBUG nova.virt.disk.api [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Checking if we can resize image /var/lib/nova/instances/0c53e488-5068-4650-b5ab-66c486f03efa/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:00:28 compute-0 nova_compute[187185]: 2025-11-29 07:00:28.775 187189 DEBUG oslo_concurrency.processutils [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0c53e488-5068-4650-b5ab-66c486f03efa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:00:28 compute-0 nova_compute[187185]: 2025-11-29 07:00:28.828 187189 DEBUG oslo_concurrency.processutils [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0c53e488-5068-4650-b5ab-66c486f03efa/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:00:28 compute-0 nova_compute[187185]: 2025-11-29 07:00:28.829 187189 DEBUG nova.virt.disk.api [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Cannot resize image /var/lib/nova/instances/0c53e488-5068-4650-b5ab-66c486f03efa/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:00:28 compute-0 nova_compute[187185]: 2025-11-29 07:00:28.830 187189 DEBUG nova.objects.instance [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lazy-loading 'migration_context' on Instance uuid 0c53e488-5068-4650-b5ab-66c486f03efa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:00:28 compute-0 nova_compute[187185]: 2025-11-29 07:00:28.846 187189 DEBUG nova.virt.libvirt.driver [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 07:00:28 compute-0 nova_compute[187185]: 2025-11-29 07:00:28.847 187189 DEBUG nova.virt.libvirt.driver [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Ensure instance console log exists: /var/lib/nova/instances/0c53e488-5068-4650-b5ab-66c486f03efa/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 07:00:28 compute-0 nova_compute[187185]: 2025-11-29 07:00:28.848 187189 DEBUG oslo_concurrency.lockutils [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:00:28 compute-0 nova_compute[187185]: 2025-11-29 07:00:28.848 187189 DEBUG oslo_concurrency.lockutils [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:00:28 compute-0 nova_compute[187185]: 2025-11-29 07:00:28.848 187189 DEBUG oslo_concurrency.lockutils [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:00:28 compute-0 nova_compute[187185]: 2025-11-29 07:00:28.961 187189 DEBUG nova.network.neutron [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Successfully created port: f3c83dc3-5763-4272-83eb-749a084d4129 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 07:00:29 compute-0 nova_compute[187185]: 2025-11-29 07:00:29.142 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:00:31 compute-0 nova_compute[187185]: 2025-11-29 07:00:31.916 187189 DEBUG nova.network.neutron [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Successfully updated port: f3c83dc3-5763-4272-83eb-749a084d4129 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 07:00:32 compute-0 nova_compute[187185]: 2025-11-29 07:00:32.007 187189 DEBUG oslo_concurrency.lockutils [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "refresh_cache-0c53e488-5068-4650-b5ab-66c486f03efa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:00:32 compute-0 nova_compute[187185]: 2025-11-29 07:00:32.007 187189 DEBUG oslo_concurrency.lockutils [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquired lock "refresh_cache-0c53e488-5068-4650-b5ab-66c486f03efa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:00:32 compute-0 nova_compute[187185]: 2025-11-29 07:00:32.008 187189 DEBUG nova.network.neutron [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:00:32 compute-0 nova_compute[187185]: 2025-11-29 07:00:32.187 187189 DEBUG nova.compute.manager [req-689711d9-4282-4360-beac-3afe92806e14 req-e1b4331c-4c56-4c85-adfe-b4e61fd7b426 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Received event network-changed-f3c83dc3-5763-4272-83eb-749a084d4129 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:00:32 compute-0 nova_compute[187185]: 2025-11-29 07:00:32.187 187189 DEBUG nova.compute.manager [req-689711d9-4282-4360-beac-3afe92806e14 req-e1b4331c-4c56-4c85-adfe-b4e61fd7b426 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Refreshing instance network info cache due to event network-changed-f3c83dc3-5763-4272-83eb-749a084d4129. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:00:32 compute-0 nova_compute[187185]: 2025-11-29 07:00:32.187 187189 DEBUG oslo_concurrency.lockutils [req-689711d9-4282-4360-beac-3afe92806e14 req-e1b4331c-4c56-4c85-adfe-b4e61fd7b426 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-0c53e488-5068-4650-b5ab-66c486f03efa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:00:32 compute-0 nova_compute[187185]: 2025-11-29 07:00:32.687 187189 DEBUG nova.network.neutron [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:00:33 compute-0 nova_compute[187185]: 2025-11-29 07:00:33.722 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:00:34 compute-0 nova_compute[187185]: 2025-11-29 07:00:34.144 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:00:35 compute-0 nova_compute[187185]: 2025-11-29 07:00:35.093 187189 DEBUG nova.network.neutron [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Updating instance_info_cache with network_info: [{"id": "f3c83dc3-5763-4272-83eb-749a084d4129", "address": "fa:16:3e:32:b2:fb", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3c83dc3-57", "ovs_interfaceid": "f3c83dc3-5763-4272-83eb-749a084d4129", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:00:35 compute-0 nova_compute[187185]: 2025-11-29 07:00:35.124 187189 DEBUG oslo_concurrency.lockutils [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Releasing lock "refresh_cache-0c53e488-5068-4650-b5ab-66c486f03efa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:00:35 compute-0 nova_compute[187185]: 2025-11-29 07:00:35.125 187189 DEBUG nova.compute.manager [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Instance network_info: |[{"id": "f3c83dc3-5763-4272-83eb-749a084d4129", "address": "fa:16:3e:32:b2:fb", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3c83dc3-57", "ovs_interfaceid": "f3c83dc3-5763-4272-83eb-749a084d4129", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 07:00:35 compute-0 nova_compute[187185]: 2025-11-29 07:00:35.126 187189 DEBUG oslo_concurrency.lockutils [req-689711d9-4282-4360-beac-3afe92806e14 req-e1b4331c-4c56-4c85-adfe-b4e61fd7b426 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-0c53e488-5068-4650-b5ab-66c486f03efa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:00:35 compute-0 nova_compute[187185]: 2025-11-29 07:00:35.126 187189 DEBUG nova.network.neutron [req-689711d9-4282-4360-beac-3afe92806e14 req-e1b4331c-4c56-4c85-adfe-b4e61fd7b426 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Refreshing network info cache for port f3c83dc3-5763-4272-83eb-749a084d4129 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:00:35 compute-0 nova_compute[187185]: 2025-11-29 07:00:35.131 187189 DEBUG nova.virt.libvirt.driver [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Start _get_guest_xml network_info=[{"id": "f3c83dc3-5763-4272-83eb-749a084d4129", "address": "fa:16:3e:32:b2:fb", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3c83dc3-57", "ovs_interfaceid": "f3c83dc3-5763-4272-83eb-749a084d4129", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:00:35 compute-0 nova_compute[187185]: 2025-11-29 07:00:35.139 187189 WARNING nova.virt.libvirt.driver [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:00:35 compute-0 nova_compute[187185]: 2025-11-29 07:00:35.147 187189 DEBUG nova.virt.libvirt.host [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:00:35 compute-0 nova_compute[187185]: 2025-11-29 07:00:35.148 187189 DEBUG nova.virt.libvirt.host [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:00:35 compute-0 nova_compute[187185]: 2025-11-29 07:00:35.159 187189 DEBUG nova.virt.libvirt.host [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:00:35 compute-0 nova_compute[187185]: 2025-11-29 07:00:35.160 187189 DEBUG nova.virt.libvirt.host [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:00:35 compute-0 nova_compute[187185]: 2025-11-29 07:00:35.162 187189 DEBUG nova.virt.libvirt.driver [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:00:35 compute-0 nova_compute[187185]: 2025-11-29 07:00:35.162 187189 DEBUG nova.virt.hardware [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:00:35 compute-0 nova_compute[187185]: 2025-11-29 07:00:35.163 187189 DEBUG nova.virt.hardware [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:00:35 compute-0 nova_compute[187185]: 2025-11-29 07:00:35.163 187189 DEBUG nova.virt.hardware [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:00:35 compute-0 nova_compute[187185]: 2025-11-29 07:00:35.164 187189 DEBUG nova.virt.hardware [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:00:35 compute-0 nova_compute[187185]: 2025-11-29 07:00:35.164 187189 DEBUG nova.virt.hardware [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:00:35 compute-0 nova_compute[187185]: 2025-11-29 07:00:35.165 187189 DEBUG nova.virt.hardware [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:00:35 compute-0 nova_compute[187185]: 2025-11-29 07:00:35.165 187189 DEBUG nova.virt.hardware [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:00:35 compute-0 nova_compute[187185]: 2025-11-29 07:00:35.165 187189 DEBUG nova.virt.hardware [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:00:35 compute-0 nova_compute[187185]: 2025-11-29 07:00:35.166 187189 DEBUG nova.virt.hardware [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:00:35 compute-0 nova_compute[187185]: 2025-11-29 07:00:35.166 187189 DEBUG nova.virt.hardware [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:00:35 compute-0 nova_compute[187185]: 2025-11-29 07:00:35.167 187189 DEBUG nova.virt.hardware [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:00:35 compute-0 nova_compute[187185]: 2025-11-29 07:00:35.173 187189 DEBUG nova.virt.libvirt.vif [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:00:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-13163110',display_name='tempest-DeleteServersTestJSON-server-13163110',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-13163110',id=52,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98df116965b74e4a9985049062e65162',ramdisk_id='',reservation_id='r-z3dqqzux',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-1973671383',owner_user_name='tempest-DeleteServersTestJSON-1973671383-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:00:27Z,user_data=None,user_id='4ecd161098b5422084003b39f0504a8f',uuid=0c53e488-5068-4650-b5ab-66c486f03efa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f3c83dc3-5763-4272-83eb-749a084d4129", "address": "fa:16:3e:32:b2:fb", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3c83dc3-57", "ovs_interfaceid": "f3c83dc3-5763-4272-83eb-749a084d4129", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:00:35 compute-0 nova_compute[187185]: 2025-11-29 07:00:35.174 187189 DEBUG nova.network.os_vif_util [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Converting VIF {"id": "f3c83dc3-5763-4272-83eb-749a084d4129", "address": "fa:16:3e:32:b2:fb", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3c83dc3-57", "ovs_interfaceid": "f3c83dc3-5763-4272-83eb-749a084d4129", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:00:35 compute-0 nova_compute[187185]: 2025-11-29 07:00:35.175 187189 DEBUG nova.network.os_vif_util [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:32:b2:fb,bridge_name='br-int',has_traffic_filtering=True,id=f3c83dc3-5763-4272-83eb-749a084d4129,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3c83dc3-57') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:00:35 compute-0 nova_compute[187185]: 2025-11-29 07:00:35.176 187189 DEBUG nova.objects.instance [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0c53e488-5068-4650-b5ab-66c486f03efa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:00:35 compute-0 nova_compute[187185]: 2025-11-29 07:00:35.205 187189 DEBUG nova.virt.libvirt.driver [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:00:35 compute-0 nova_compute[187185]:   <uuid>0c53e488-5068-4650-b5ab-66c486f03efa</uuid>
Nov 29 07:00:35 compute-0 nova_compute[187185]:   <name>instance-00000034</name>
Nov 29 07:00:35 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 07:00:35 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:00:35 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:00:35 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:       <nova:name>tempest-DeleteServersTestJSON-server-13163110</nova:name>
Nov 29 07:00:35 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:00:35</nova:creationTime>
Nov 29 07:00:35 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 07:00:35 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 07:00:35 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:00:35 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:00:35 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:00:35 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:00:35 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:00:35 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:00:35 compute-0 nova_compute[187185]:         <nova:user uuid="4ecd161098b5422084003b39f0504a8f">tempest-DeleteServersTestJSON-1973671383-project-member</nova:user>
Nov 29 07:00:35 compute-0 nova_compute[187185]:         <nova:project uuid="98df116965b74e4a9985049062e65162">tempest-DeleteServersTestJSON-1973671383</nova:project>
Nov 29 07:00:35 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:00:35 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 07:00:35 compute-0 nova_compute[187185]:         <nova:port uuid="f3c83dc3-5763-4272-83eb-749a084d4129">
Nov 29 07:00:35 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:00:35 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:00:35 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:00:35 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <system>
Nov 29 07:00:35 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:00:35 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:00:35 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:00:35 compute-0 nova_compute[187185]:       <entry name="serial">0c53e488-5068-4650-b5ab-66c486f03efa</entry>
Nov 29 07:00:35 compute-0 nova_compute[187185]:       <entry name="uuid">0c53e488-5068-4650-b5ab-66c486f03efa</entry>
Nov 29 07:00:35 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     </system>
Nov 29 07:00:35 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:00:35 compute-0 nova_compute[187185]:   <os>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:   </os>
Nov 29 07:00:35 compute-0 nova_compute[187185]:   <features>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:   </features>
Nov 29 07:00:35 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:00:35 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:00:35 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:00:35 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/0c53e488-5068-4650-b5ab-66c486f03efa/disk"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:00:35 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/0c53e488-5068-4650-b5ab-66c486f03efa/disk.config"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:00:35 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:32:b2:fb"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:       <target dev="tapf3c83dc3-57"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:00:35 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/0c53e488-5068-4650-b5ab-66c486f03efa/console.log" append="off"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <video>
Nov 29 07:00:35 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     </video>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:00:35 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:00:35 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:00:35 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:00:35 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:00:35 compute-0 nova_compute[187185]: </domain>
Nov 29 07:00:35 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:00:35 compute-0 nova_compute[187185]: 2025-11-29 07:00:35.207 187189 DEBUG nova.compute.manager [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Preparing to wait for external event network-vif-plugged-f3c83dc3-5763-4272-83eb-749a084d4129 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 07:00:35 compute-0 nova_compute[187185]: 2025-11-29 07:00:35.207 187189 DEBUG oslo_concurrency.lockutils [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "0c53e488-5068-4650-b5ab-66c486f03efa-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:00:35 compute-0 nova_compute[187185]: 2025-11-29 07:00:35.208 187189 DEBUG oslo_concurrency.lockutils [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "0c53e488-5068-4650-b5ab-66c486f03efa-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:00:35 compute-0 nova_compute[187185]: 2025-11-29 07:00:35.208 187189 DEBUG oslo_concurrency.lockutils [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "0c53e488-5068-4650-b5ab-66c486f03efa-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:00:35 compute-0 nova_compute[187185]: 2025-11-29 07:00:35.210 187189 DEBUG nova.virt.libvirt.vif [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:00:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-13163110',display_name='tempest-DeleteServersTestJSON-server-13163110',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-13163110',id=52,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98df116965b74e4a9985049062e65162',ramdisk_id='',reservation_id='r-z3dqqzux',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-1973671383',owner_user_name='tempest-DeleteServersTestJSON-1973671383-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:00:27Z,user_data=None,user_id='4ecd161098b5422084003b39f0504a8f',uuid=0c53e488-5068-4650-b5ab-66c486f03efa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f3c83dc3-5763-4272-83eb-749a084d4129", "address": "fa:16:3e:32:b2:fb", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3c83dc3-57", "ovs_interfaceid": "f3c83dc3-5763-4272-83eb-749a084d4129", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:00:35 compute-0 nova_compute[187185]: 2025-11-29 07:00:35.210 187189 DEBUG nova.network.os_vif_util [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Converting VIF {"id": "f3c83dc3-5763-4272-83eb-749a084d4129", "address": "fa:16:3e:32:b2:fb", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3c83dc3-57", "ovs_interfaceid": "f3c83dc3-5763-4272-83eb-749a084d4129", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:00:35 compute-0 nova_compute[187185]: 2025-11-29 07:00:35.211 187189 DEBUG nova.network.os_vif_util [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:32:b2:fb,bridge_name='br-int',has_traffic_filtering=True,id=f3c83dc3-5763-4272-83eb-749a084d4129,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3c83dc3-57') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:00:35 compute-0 nova_compute[187185]: 2025-11-29 07:00:35.212 187189 DEBUG os_vif [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:32:b2:fb,bridge_name='br-int',has_traffic_filtering=True,id=f3c83dc3-5763-4272-83eb-749a084d4129,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3c83dc3-57') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:00:35 compute-0 nova_compute[187185]: 2025-11-29 07:00:35.213 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:00:35 compute-0 nova_compute[187185]: 2025-11-29 07:00:35.214 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:00:35 compute-0 nova_compute[187185]: 2025-11-29 07:00:35.215 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:00:35 compute-0 nova_compute[187185]: 2025-11-29 07:00:35.219 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:00:35 compute-0 nova_compute[187185]: 2025-11-29 07:00:35.219 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf3c83dc3-57, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:00:35 compute-0 nova_compute[187185]: 2025-11-29 07:00:35.220 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf3c83dc3-57, col_values=(('external_ids', {'iface-id': 'f3c83dc3-5763-4272-83eb-749a084d4129', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:32:b2:fb', 'vm-uuid': '0c53e488-5068-4650-b5ab-66c486f03efa'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:00:35 compute-0 nova_compute[187185]: 2025-11-29 07:00:35.223 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:00:35 compute-0 NetworkManager[55227]: <info>  [1764399635.2257] manager: (tapf3c83dc3-57): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/63)
Nov 29 07:00:35 compute-0 nova_compute[187185]: 2025-11-29 07:00:35.227 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:00:35 compute-0 nova_compute[187185]: 2025-11-29 07:00:35.232 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:00:35 compute-0 nova_compute[187185]: 2025-11-29 07:00:35.233 187189 INFO os_vif [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:32:b2:fb,bridge_name='br-int',has_traffic_filtering=True,id=f3c83dc3-5763-4272-83eb-749a084d4129,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3c83dc3-57')
Nov 29 07:00:36 compute-0 nova_compute[187185]: 2025-11-29 07:00:36.310 187189 DEBUG nova.virt.libvirt.driver [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:00:36 compute-0 nova_compute[187185]: 2025-11-29 07:00:36.311 187189 DEBUG nova.virt.libvirt.driver [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:00:36 compute-0 nova_compute[187185]: 2025-11-29 07:00:36.311 187189 DEBUG nova.virt.libvirt.driver [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] No VIF found with MAC fa:16:3e:32:b2:fb, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:00:36 compute-0 nova_compute[187185]: 2025-11-29 07:00:36.312 187189 INFO nova.virt.libvirt.driver [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Using config drive
Nov 29 07:00:36 compute-0 podman[220393]: 2025-11-29 07:00:36.785978169 +0000 UTC m=+0.049423878 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 07:00:36 compute-0 podman[220395]: 2025-11-29 07:00:36.829357134 +0000 UTC m=+0.082096610 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 07:00:36 compute-0 podman[220394]: 2025-11-29 07:00:36.837993039 +0000 UTC m=+0.088953835 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_id=edpm, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, release=1755695350, container_name=openstack_network_exporter, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vendor=Red Hat, Inc., maintainer=Red Hat, Inc.)
Nov 29 07:00:36 compute-0 nova_compute[187185]: 2025-11-29 07:00:36.893 187189 INFO nova.virt.libvirt.driver [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Creating config drive at /var/lib/nova/instances/0c53e488-5068-4650-b5ab-66c486f03efa/disk.config
Nov 29 07:00:36 compute-0 nova_compute[187185]: 2025-11-29 07:00:36.899 187189 DEBUG oslo_concurrency.processutils [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0c53e488-5068-4650-b5ab-66c486f03efa/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwt_ivlkc execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:00:37 compute-0 nova_compute[187185]: 2025-11-29 07:00:37.029 187189 DEBUG oslo_concurrency.processutils [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0c53e488-5068-4650-b5ab-66c486f03efa/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwt_ivlkc" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:00:37 compute-0 kernel: tapf3c83dc3-57: entered promiscuous mode
Nov 29 07:00:37 compute-0 NetworkManager[55227]: <info>  [1764399637.1267] manager: (tapf3c83dc3-57): new Tun device (/org/freedesktop/NetworkManager/Devices/64)
Nov 29 07:00:37 compute-0 ovn_controller[95281]: 2025-11-29T07:00:37Z|00109|binding|INFO|Claiming lport f3c83dc3-5763-4272-83eb-749a084d4129 for this chassis.
Nov 29 07:00:37 compute-0 nova_compute[187185]: 2025-11-29 07:00:37.128 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:00:37 compute-0 ovn_controller[95281]: 2025-11-29T07:00:37Z|00110|binding|INFO|f3c83dc3-5763-4272-83eb-749a084d4129: Claiming fa:16:3e:32:b2:fb 10.100.0.14
Nov 29 07:00:37 compute-0 nova_compute[187185]: 2025-11-29 07:00:37.133 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:00:37 compute-0 nova_compute[187185]: 2025-11-29 07:00:37.135 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:00:37 compute-0 nova_compute[187185]: 2025-11-29 07:00:37.141 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:00:37.157 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:32:b2:fb 10.100.0.14'], port_security=['fa:16:3e:32:b2:fb 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '0c53e488-5068-4650-b5ab-66c486f03efa', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98df116965b74e4a9985049062e65162', 'neutron:revision_number': '2', 'neutron:security_group_ids': '234720a9-9cd1-4b87-9bec-1abfe8ff0514', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e694bb30-a43a-4d18-87fa-e5c0dd8850c2, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=f3c83dc3-5763-4272-83eb-749a084d4129) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:00:37.160 104254 INFO neutron.agent.ovn.metadata.agent [-] Port f3c83dc3-5763-4272-83eb-749a084d4129 in datapath fd9eb57e-b1f8-4bae-a60f-8e40613556cd bound to our chassis
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:00:37.163 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fd9eb57e-b1f8-4bae-a60f-8e40613556cd
Nov 29 07:00:37 compute-0 systemd-machined[153486]: New machine qemu-18-instance-00000034.
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:00:37.180 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[512a0014-6f02-4bb9-aceb-c6eda90b6685]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:00:37.181 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfd9eb57e-b1 in ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:00:37.183 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfd9eb57e-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:00:37.184 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[f4ebd969-018e-4252-9004-8e56ee89fab2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:00:37.185 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[5ea2e68d-f1db-444a-9bb9-632e8ffb06a2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:00:37.201 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[abd31324-05b7-487d-857f-e7e3b437cee7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:00:37 compute-0 ovn_controller[95281]: 2025-11-29T07:00:37Z|00111|binding|INFO|Setting lport f3c83dc3-5763-4272-83eb-749a084d4129 ovn-installed in OVS
Nov 29 07:00:37 compute-0 ovn_controller[95281]: 2025-11-29T07:00:37Z|00112|binding|INFO|Setting lport f3c83dc3-5763-4272-83eb-749a084d4129 up in Southbound
Nov 29 07:00:37 compute-0 systemd[1]: Started Virtual Machine qemu-18-instance-00000034.
Nov 29 07:00:37 compute-0 nova_compute[187185]: 2025-11-29 07:00:37.206 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:00:37.225 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6b5dc375-5375-498d-b0c5-fc58da7a39f7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:00:37 compute-0 systemd-udevd[220475]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:00:37 compute-0 NetworkManager[55227]: <info>  [1764399637.2494] device (tapf3c83dc3-57): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:00:37 compute-0 NetworkManager[55227]: <info>  [1764399637.2505] device (tapf3c83dc3-57): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:00:37.286 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[7e93541f-79ec-46c1-9c87-6c00f2f5acfe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:00:37 compute-0 NetworkManager[55227]: <info>  [1764399637.2952] manager: (tapfd9eb57e-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/65)
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:00:37.293 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[964c6af2-5431-4364-b5d2-1abbce54d6bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:00:37.347 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[57944480-4a20-4e12-a8ff-c440ae881f20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:00:37.352 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[f6675ced-6af7-43f8-8b02-d26a63980348]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:00:37 compute-0 NetworkManager[55227]: <info>  [1764399637.3874] device (tapfd9eb57e-b0): carrier: link connected
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:00:37.405 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[82f868d7-7232-4553-8dab-ee6679583773]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:00:37.431 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[17383ed0-7887-4d46-b183-dc5cd0ed2ae9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfd9eb57e-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ec:80:ac'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 508505, 'reachable_time': 20270, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220505, 'error': None, 'target': 'ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:00:37.464 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[34532ccb-8454-4f9a-84c9-319f919e23c8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feec:80ac'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 508505, 'tstamp': 508505}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220506, 'error': None, 'target': 'ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:00:37.494 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[3ff10959-4448-47c7-8327-4107a495c97d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfd9eb57e-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ec:80:ac'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 508505, 'reachable_time': 20270, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220507, 'error': None, 'target': 'ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:00:37.546 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[cd3cee4c-4485-47e4-a54c-788418983771]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:00:37 compute-0 nova_compute[187185]: 2025-11-29 07:00:37.629 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399637.6287928, 0c53e488-5068-4650-b5ab-66c486f03efa => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:00:37 compute-0 nova_compute[187185]: 2025-11-29 07:00:37.631 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] VM Started (Lifecycle Event)
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:00:37.636 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[353bc49a-80ad-4dc2-9789-638a63a7b264]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:00:37.639 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfd9eb57e-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:00:37.640 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:00:37.642 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfd9eb57e-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:00:37 compute-0 nova_compute[187185]: 2025-11-29 07:00:37.645 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:00:37 compute-0 kernel: tapfd9eb57e-b0: entered promiscuous mode
Nov 29 07:00:37 compute-0 NetworkManager[55227]: <info>  [1764399637.6470] manager: (tapfd9eb57e-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:00:37.649 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfd9eb57e-b0, col_values=(('external_ids', {'iface-id': 'e7b4cb4f-cb6d-4f0e-8c8d-34c743671595'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:00:37 compute-0 ovn_controller[95281]: 2025-11-29T07:00:37Z|00113|binding|INFO|Releasing lport e7b4cb4f-cb6d-4f0e-8c8d-34c743671595 from this chassis (sb_readonly=0)
Nov 29 07:00:37 compute-0 nova_compute[187185]: 2025-11-29 07:00:37.650 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:00:37.654 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fd9eb57e-b1f8-4bae-a60f-8e40613556cd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fd9eb57e-b1f8-4bae-a60f-8e40613556cd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:00:37.655 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e1f81364-12c6-446f-8984-c7991d767c09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:00:37.656 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-fd9eb57e-b1f8-4bae-a60f-8e40613556cd
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/fd9eb57e-b1f8-4bae-a60f-8e40613556cd.pid.haproxy
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID fd9eb57e-b1f8-4bae-a60f-8e40613556cd
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:00:37 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:00:37.657 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'env', 'PROCESS_TAG=haproxy-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fd9eb57e-b1f8-4bae-a60f-8e40613556cd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:00:37 compute-0 nova_compute[187185]: 2025-11-29 07:00:37.668 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:00:37 compute-0 nova_compute[187185]: 2025-11-29 07:00:37.677 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:00:37 compute-0 nova_compute[187185]: 2025-11-29 07:00:37.683 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399637.6289654, 0c53e488-5068-4650-b5ab-66c486f03efa => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:00:37 compute-0 nova_compute[187185]: 2025-11-29 07:00:37.684 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] VM Paused (Lifecycle Event)
Nov 29 07:00:37 compute-0 nova_compute[187185]: 2025-11-29 07:00:37.706 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:00:37 compute-0 nova_compute[187185]: 2025-11-29 07:00:37.712 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:00:37 compute-0 nova_compute[187185]: 2025-11-29 07:00:37.754 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:00:38 compute-0 podman[220546]: 2025-11-29 07:00:38.039092243 +0000 UTC m=+0.021893570 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:00:38 compute-0 nova_compute[187185]: 2025-11-29 07:00:38.386 187189 DEBUG nova.network.neutron [req-689711d9-4282-4360-beac-3afe92806e14 req-e1b4331c-4c56-4c85-adfe-b4e61fd7b426 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Updated VIF entry in instance network info cache for port f3c83dc3-5763-4272-83eb-749a084d4129. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:00:38 compute-0 nova_compute[187185]: 2025-11-29 07:00:38.387 187189 DEBUG nova.network.neutron [req-689711d9-4282-4360-beac-3afe92806e14 req-e1b4331c-4c56-4c85-adfe-b4e61fd7b426 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Updating instance_info_cache with network_info: [{"id": "f3c83dc3-5763-4272-83eb-749a084d4129", "address": "fa:16:3e:32:b2:fb", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3c83dc3-57", "ovs_interfaceid": "f3c83dc3-5763-4272-83eb-749a084d4129", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:00:38 compute-0 nova_compute[187185]: 2025-11-29 07:00:38.411 187189 DEBUG oslo_concurrency.lockutils [req-689711d9-4282-4360-beac-3afe92806e14 req-e1b4331c-4c56-4c85-adfe-b4e61fd7b426 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-0c53e488-5068-4650-b5ab-66c486f03efa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:00:38 compute-0 nova_compute[187185]: 2025-11-29 07:00:38.724 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:00:39 compute-0 podman[220546]: 2025-11-29 07:00:39.197606393 +0000 UTC m=+1.180407740 container create 972ef85c735ee92eda64c27290f50599ec0a74c197a1aa9412ca504fc45bceb2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 07:00:40 compute-0 systemd[1]: Started libpod-conmon-972ef85c735ee92eda64c27290f50599ec0a74c197a1aa9412ca504fc45bceb2.scope.
Nov 29 07:00:40 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:00:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/92295d971936649fc5cfe4e3a99d1330c6e5e244025c6d270069672112aa6eca/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:00:40 compute-0 nova_compute[187185]: 2025-11-29 07:00:40.225 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:00:40 compute-0 podman[220546]: 2025-11-29 07:00:40.384612148 +0000 UTC m=+2.367413495 container init 972ef85c735ee92eda64c27290f50599ec0a74c197a1aa9412ca504fc45bceb2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:00:40 compute-0 podman[220546]: 2025-11-29 07:00:40.394990481 +0000 UTC m=+2.377791798 container start 972ef85c735ee92eda64c27290f50599ec0a74c197a1aa9412ca504fc45bceb2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Nov 29 07:00:40 compute-0 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[220562]: [NOTICE]   (220566) : New worker (220568) forked
Nov 29 07:00:40 compute-0 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[220562]: [NOTICE]   (220566) : Loading success.
Nov 29 07:00:41 compute-0 nova_compute[187185]: 2025-11-29 07:00:41.868 187189 DEBUG nova.compute.manager [req-d0a30965-6038-4207-a860-71eebec28b0d req-717e27d0-6041-4763-b963-2414ad7c1af8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Received event network-vif-plugged-f3c83dc3-5763-4272-83eb-749a084d4129 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:00:41 compute-0 nova_compute[187185]: 2025-11-29 07:00:41.869 187189 DEBUG oslo_concurrency.lockutils [req-d0a30965-6038-4207-a860-71eebec28b0d req-717e27d0-6041-4763-b963-2414ad7c1af8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "0c53e488-5068-4650-b5ab-66c486f03efa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:00:41 compute-0 nova_compute[187185]: 2025-11-29 07:00:41.869 187189 DEBUG oslo_concurrency.lockutils [req-d0a30965-6038-4207-a860-71eebec28b0d req-717e27d0-6041-4763-b963-2414ad7c1af8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0c53e488-5068-4650-b5ab-66c486f03efa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:00:41 compute-0 nova_compute[187185]: 2025-11-29 07:00:41.870 187189 DEBUG oslo_concurrency.lockutils [req-d0a30965-6038-4207-a860-71eebec28b0d req-717e27d0-6041-4763-b963-2414ad7c1af8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0c53e488-5068-4650-b5ab-66c486f03efa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:00:41 compute-0 nova_compute[187185]: 2025-11-29 07:00:41.870 187189 DEBUG nova.compute.manager [req-d0a30965-6038-4207-a860-71eebec28b0d req-717e27d0-6041-4763-b963-2414ad7c1af8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Processing event network-vif-plugged-f3c83dc3-5763-4272-83eb-749a084d4129 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 07:00:41 compute-0 nova_compute[187185]: 2025-11-29 07:00:41.871 187189 DEBUG nova.compute.manager [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Instance event wait completed in 4 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:00:41 compute-0 nova_compute[187185]: 2025-11-29 07:00:41.882 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399641.8813734, 0c53e488-5068-4650-b5ab-66c486f03efa => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:00:41 compute-0 nova_compute[187185]: 2025-11-29 07:00:41.883 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] VM Resumed (Lifecycle Event)
Nov 29 07:00:41 compute-0 nova_compute[187185]: 2025-11-29 07:00:41.886 187189 DEBUG nova.virt.libvirt.driver [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 07:00:41 compute-0 nova_compute[187185]: 2025-11-29 07:00:41.892 187189 INFO nova.virt.libvirt.driver [-] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Instance spawned successfully.
Nov 29 07:00:41 compute-0 nova_compute[187185]: 2025-11-29 07:00:41.893 187189 DEBUG nova.virt.libvirt.driver [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 07:00:41 compute-0 nova_compute[187185]: 2025-11-29 07:00:41.922 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:00:41 compute-0 nova_compute[187185]: 2025-11-29 07:00:41.928 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:00:41 compute-0 nova_compute[187185]: 2025-11-29 07:00:41.962 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:00:41 compute-0 nova_compute[187185]: 2025-11-29 07:00:41.969 187189 DEBUG nova.virt.libvirt.driver [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:00:41 compute-0 nova_compute[187185]: 2025-11-29 07:00:41.970 187189 DEBUG nova.virt.libvirt.driver [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:00:41 compute-0 nova_compute[187185]: 2025-11-29 07:00:41.971 187189 DEBUG nova.virt.libvirt.driver [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:00:41 compute-0 nova_compute[187185]: 2025-11-29 07:00:41.972 187189 DEBUG nova.virt.libvirt.driver [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:00:41 compute-0 nova_compute[187185]: 2025-11-29 07:00:41.972 187189 DEBUG nova.virt.libvirt.driver [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:00:41 compute-0 nova_compute[187185]: 2025-11-29 07:00:41.973 187189 DEBUG nova.virt.libvirt.driver [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:00:42 compute-0 nova_compute[187185]: 2025-11-29 07:00:42.081 187189 INFO nova.compute.manager [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Took 14.79 seconds to spawn the instance on the hypervisor.
Nov 29 07:00:42 compute-0 nova_compute[187185]: 2025-11-29 07:00:42.082 187189 DEBUG nova.compute.manager [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:00:42 compute-0 nova_compute[187185]: 2025-11-29 07:00:42.254 187189 INFO nova.compute.manager [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Took 15.73 seconds to build instance.
Nov 29 07:00:42 compute-0 nova_compute[187185]: 2025-11-29 07:00:42.278 187189 DEBUG oslo_concurrency.lockutils [None req-fb386bb3-7d6f-4697-8464-220d1cfabd0d 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "0c53e488-5068-4650-b5ab-66c486f03efa" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.856s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:00:42 compute-0 podman[220577]: 2025-11-29 07:00:42.901027093 +0000 UTC m=+0.165088857 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:00:43 compute-0 nova_compute[187185]: 2025-11-29 07:00:43.402 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:00:43 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:00:43.402 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:00:43 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:00:43.404 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:00:43 compute-0 nova_compute[187185]: 2025-11-29 07:00:43.726 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:00:44 compute-0 nova_compute[187185]: 2025-11-29 07:00:44.136 187189 DEBUG nova.compute.manager [req-99f4c097-ce9c-4621-a39e-0a403468c511 req-c1fe9de7-6ddd-4e0f-a1cf-7a1da32c596d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Received event network-vif-plugged-f3c83dc3-5763-4272-83eb-749a084d4129 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:00:44 compute-0 nova_compute[187185]: 2025-11-29 07:00:44.136 187189 DEBUG oslo_concurrency.lockutils [req-99f4c097-ce9c-4621-a39e-0a403468c511 req-c1fe9de7-6ddd-4e0f-a1cf-7a1da32c596d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "0c53e488-5068-4650-b5ab-66c486f03efa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:00:44 compute-0 nova_compute[187185]: 2025-11-29 07:00:44.136 187189 DEBUG oslo_concurrency.lockutils [req-99f4c097-ce9c-4621-a39e-0a403468c511 req-c1fe9de7-6ddd-4e0f-a1cf-7a1da32c596d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0c53e488-5068-4650-b5ab-66c486f03efa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:00:44 compute-0 nova_compute[187185]: 2025-11-29 07:00:44.136 187189 DEBUG oslo_concurrency.lockutils [req-99f4c097-ce9c-4621-a39e-0a403468c511 req-c1fe9de7-6ddd-4e0f-a1cf-7a1da32c596d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0c53e488-5068-4650-b5ab-66c486f03efa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:00:44 compute-0 nova_compute[187185]: 2025-11-29 07:00:44.137 187189 DEBUG nova.compute.manager [req-99f4c097-ce9c-4621-a39e-0a403468c511 req-c1fe9de7-6ddd-4e0f-a1cf-7a1da32c596d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] No waiting events found dispatching network-vif-plugged-f3c83dc3-5763-4272-83eb-749a084d4129 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:00:44 compute-0 nova_compute[187185]: 2025-11-29 07:00:44.137 187189 WARNING nova.compute.manager [req-99f4c097-ce9c-4621-a39e-0a403468c511 req-c1fe9de7-6ddd-4e0f-a1cf-7a1da32c596d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Received unexpected event network-vif-plugged-f3c83dc3-5763-4272-83eb-749a084d4129 for instance with vm_state active and task_state None.
Nov 29 07:00:45 compute-0 nova_compute[187185]: 2025-11-29 07:00:45.193 187189 DEBUG oslo_concurrency.lockutils [None req-5631dd08-c7b1-4122-8c4b-0ffbe881c7f7 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "0c53e488-5068-4650-b5ab-66c486f03efa" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:00:45 compute-0 nova_compute[187185]: 2025-11-29 07:00:45.194 187189 DEBUG oslo_concurrency.lockutils [None req-5631dd08-c7b1-4122-8c4b-0ffbe881c7f7 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "0c53e488-5068-4650-b5ab-66c486f03efa" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:00:45 compute-0 nova_compute[187185]: 2025-11-29 07:00:45.195 187189 INFO nova.compute.manager [None req-5631dd08-c7b1-4122-8c4b-0ffbe881c7f7 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Shelving
Nov 29 07:00:45 compute-0 nova_compute[187185]: 2025-11-29 07:00:45.235 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:00:45 compute-0 nova_compute[187185]: 2025-11-29 07:00:45.257 187189 DEBUG nova.virt.libvirt.driver [None req-5631dd08-c7b1-4122-8c4b-0ffbe881c7f7 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Nov 29 07:00:45 compute-0 podman[220603]: 2025-11-29 07:00:45.829015149 +0000 UTC m=+0.080528597 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm)
Nov 29 07:00:45 compute-0 podman[220604]: 2025-11-29 07:00:45.850115125 +0000 UTC m=+0.094921593 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 07:00:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:00:47.409 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:00:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:47.994 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '0c53e488-5068-4650-b5ab-66c486f03efa', 'name': 'tempest-DeleteServersTestJSON-server-13163110', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000034', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '98df116965b74e4a9985049062e65162', 'user_id': '4ecd161098b5422084003b39f0504a8f', 'hostId': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 07:00:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:47.995 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 07:00:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:47.995 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:00:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:47.996 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-DeleteServersTestJSON-server-13163110>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-DeleteServersTestJSON-server-13163110>]
Nov 29 07:00:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:47.996 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.000 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 0c53e488-5068-4650-b5ab-66c486f03efa / tapf3c83dc3-57 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.000 12 DEBUG ceilometer.compute.pollsters [-] 0c53e488-5068-4650-b5ab-66c486f03efa/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0a514ef3-0440-4026-a510-621b4ee26ac6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'instance-00000034-0c53e488-5068-4650-b5ab-66c486f03efa-tapf3c83dc3-57', 'timestamp': '2025-11-29T07:00:47.996578', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-13163110', 'name': 'tapf3c83dc3-57', 'instance_id': '0c53e488-5068-4650-b5ab-66c486f03efa', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:32:b2:fb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf3c83dc3-57'}, 'message_id': '221d4660-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5095.714832118, 'message_signature': '83bdfd3ddc07edac8c97e7e03629f5c8a9eda807f2e911af0d0a35b075fc76c3'}]}, 'timestamp': '2025-11-29 07:00:48.001883', '_unique_id': 'dcbe331fed514e618ebd3335be391782'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.004 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.005 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.005 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.005 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-DeleteServersTestJSON-server-13163110>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-DeleteServersTestJSON-server-13163110>]
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.006 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.029 12 DEBUG ceilometer.compute.pollsters [-] 0c53e488-5068-4650-b5ab-66c486f03efa/cpu volume: 5840000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cf900b94-611d-4118-90a1-5ff177d19727', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 5840000000, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': '0c53e488-5068-4650-b5ab-66c486f03efa', 'timestamp': '2025-11-29T07:00:48.006102', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-13163110', 'name': 'instance-00000034', 'instance_id': '0c53e488-5068-4650-b5ab-66c486f03efa', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '222196c0-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5095.747196373, 'message_signature': '6ffba413254bda7898ced8d73933a698b029f3ba9ff78850d11996506d9058ae'}]}, 'timestamp': '2025-11-29 07:00:48.029954', '_unique_id': '02588962e029490d84262f867d5947d9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.031 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.032 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.057 12 DEBUG ceilometer.compute.pollsters [-] 0c53e488-5068-4650-b5ab-66c486f03efa/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.058 12 DEBUG ceilometer.compute.pollsters [-] 0c53e488-5068-4650-b5ab-66c486f03efa/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '98ed023f-594c-4401-b244-184f87a871be', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': '0c53e488-5068-4650-b5ab-66c486f03efa-vda', 'timestamp': '2025-11-29T07:00:48.032394', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-13163110', 'name': 'instance-00000034', 'instance_id': '0c53e488-5068-4650-b5ab-66c486f03efa', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2225fce2-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5095.75062444, 'message_signature': '12410b6fd1ee5c11fae6deedd314d8427f789932770f6504f662f52fe2df3b67'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': '0c53e488-5068-4650-b5ab-66c486f03efa-sda', 'timestamp': '2025-11-29T07:00:48.032394', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-13163110', 'name': 'instance-00000034', 'instance_id': '0c53e488-5068-4650-b5ab-66c486f03efa', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '22260e1c-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5095.75062444, 'message_signature': 'e3a28ebb97f392e8f55d378df1a9b22f8da05c7cdfde9ce782d6a917634e4314'}]}, 'timestamp': '2025-11-29 07:00:48.059095', '_unique_id': '501e4d6d688444a4bf1cbbfde46f37b8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.060 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.061 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.061 12 DEBUG ceilometer.compute.pollsters [-] 0c53e488-5068-4650-b5ab-66c486f03efa/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9b0766f1-4c18-40e6-a9ed-137c958924a2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'instance-00000034-0c53e488-5068-4650-b5ab-66c486f03efa-tapf3c83dc3-57', 'timestamp': '2025-11-29T07:00:48.061626', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-13163110', 'name': 'tapf3c83dc3-57', 'instance_id': '0c53e488-5068-4650-b5ab-66c486f03efa', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:32:b2:fb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf3c83dc3-57'}, 'message_id': '22267ef6-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5095.714832118, 'message_signature': '806d8c2eb964fab36bd31efd77e9957bf2a2ff84d3c5066a10139b3a8abe4b4f'}]}, 'timestamp': '2025-11-29 07:00:48.061981', '_unique_id': '29f1d924fe9c45f1a2dc44527ae53e59'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.062 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.063 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.079 12 DEBUG ceilometer.compute.pollsters [-] 0c53e488-5068-4650-b5ab-66c486f03efa/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.079 12 DEBUG ceilometer.compute.pollsters [-] 0c53e488-5068-4650-b5ab-66c486f03efa/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e800b437-1c3b-44f5-b8b8-1c6f2abf21c8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': '0c53e488-5068-4650-b5ab-66c486f03efa-vda', 'timestamp': '2025-11-29T07:00:48.063363', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-13163110', 'name': 'instance-00000034', 'instance_id': '0c53e488-5068-4650-b5ab-66c486f03efa', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '222934a2-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5095.781622026, 'message_signature': 'd161d651574edd1457cf719eb8bb04d601bc5d23705d03ee28bcc0f92ae9cc9a'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': '0c53e488-5068-4650-b5ab-66c486f03efa-sda', 'timestamp': '2025-11-29T07:00:48.063363', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-13163110', 'name': 'instance-00000034', 'instance_id': '0c53e488-5068-4650-b5ab-66c486f03efa', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '22294424-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5095.781622026, 'message_signature': '386ff0b2c7b6dac4530a4eb0476180f62f150c364cd4a3bd5d8f5fa12159d0d5'}]}, 'timestamp': '2025-11-29 07:00:48.080136', '_unique_id': '7d27441f143c4242b52bd359468488ff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.081 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.082 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.082 12 DEBUG ceilometer.compute.pollsters [-] 0c53e488-5068-4650-b5ab-66c486f03efa/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.082 12 DEBUG ceilometer.compute.pollsters [-] 0c53e488-5068-4650-b5ab-66c486f03efa/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0697aadf-ab9b-47c1-a75e-1159f06f1e33', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': '0c53e488-5068-4650-b5ab-66c486f03efa-vda', 'timestamp': '2025-11-29T07:00:48.082663', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-13163110', 'name': 'instance-00000034', 'instance_id': '0c53e488-5068-4650-b5ab-66c486f03efa', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2229b472-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5095.781622026, 'message_signature': 'e7030146d1e557f54fa5847335ea27a0b0760f345397b12e4c7c4c1a3efd23c7'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': '0c53e488-5068-4650-b5ab-66c486f03efa-sda', 'timestamp': '2025-11-29T07:00:48.082663', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-13163110', 'name': 'instance-00000034', 'instance_id': '0c53e488-5068-4650-b5ab-66c486f03efa', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2229bcd8-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5095.781622026, 'message_signature': 'f9f69da2c5a39a5e775b560db21db3e0b0b20ef457bd00e1ba135dc1a681e060'}]}, 'timestamp': '2025-11-29 07:00:48.083167', '_unique_id': 'a38e49c8af3e4090b291641290096182'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.083 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.084 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.084 12 DEBUG ceilometer.compute.pollsters [-] 0c53e488-5068-4650-b5ab-66c486f03efa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.084 12 DEBUG ceilometer.compute.pollsters [-] 0c53e488-5068-4650-b5ab-66c486f03efa/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9aba9ef8-848d-4e56-8c9c-92e26df8d19a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': '0c53e488-5068-4650-b5ab-66c486f03efa-vda', 'timestamp': '2025-11-29T07:00:48.084429', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-13163110', 'name': 'instance-00000034', 'instance_id': '0c53e488-5068-4650-b5ab-66c486f03efa', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2229f7ac-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5095.781622026, 'message_signature': '1f5dda4bce60c0d722a09f4658e297ed891740fbdf34b692f608bda68450d855'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': '0c53e488-5068-4650-b5ab-66c486f03efa-sda', 'timestamp': '2025-11-29T07:00:48.084429', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-13163110', 'name': 'instance-00000034', 'instance_id': '0c53e488-5068-4650-b5ab-66c486f03efa', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '222a00e4-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5095.781622026, 'message_signature': '9cda92e634ec546548c9211065a2487538e0873b68ad164dd71b4b61448a2662'}]}, 'timestamp': '2025-11-29 07:00:48.084925', '_unique_id': '71a5fa8cf40e46e293a196f670b18080'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.085 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.086 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.086 12 DEBUG ceilometer.compute.pollsters [-] 0c53e488-5068-4650-b5ab-66c486f03efa/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2cdaba16-f7d1-47ea-8475-06281cb47309', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'instance-00000034-0c53e488-5068-4650-b5ab-66c486f03efa-tapf3c83dc3-57', 'timestamp': '2025-11-29T07:00:48.086145', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-13163110', 'name': 'tapf3c83dc3-57', 'instance_id': '0c53e488-5068-4650-b5ab-66c486f03efa', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:32:b2:fb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf3c83dc3-57'}, 'message_id': '222a3a3c-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5095.714832118, 'message_signature': '0bbd4b2bea14af14166503e16d767caaf12c121c24231f14cf25705ef7184b82'}]}, 'timestamp': '2025-11-29 07:00:48.086435', '_unique_id': '17c5288480204a9fbb6dba36f168f76e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.087 12 DEBUG ceilometer.compute.pollsters [-] 0c53e488-5068-4650-b5ab-66c486f03efa/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '315d3e04-4ea8-40fb-b002-3ab6b8865b1c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'instance-00000034-0c53e488-5068-4650-b5ab-66c486f03efa-tapf3c83dc3-57', 'timestamp': '2025-11-29T07:00:48.087931', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-13163110', 'name': 'tapf3c83dc3-57', 'instance_id': '0c53e488-5068-4650-b5ab-66c486f03efa', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:32:b2:fb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf3c83dc3-57'}, 'message_id': '222a80c8-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5095.714832118, 'message_signature': 'b86821d453b4e142b1d2a5311bdad07f26e6126bbdd5b5f46ec87bd115fd16bd'}]}, 'timestamp': '2025-11-29 07:00:48.088233', '_unique_id': 'f2606911be514a498c60013c32aae4cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.089 12 DEBUG ceilometer.compute.pollsters [-] 0c53e488-5068-4650-b5ab-66c486f03efa/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b6f316cb-f14f-43d7-9f4c-d262865b8b9c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'instance-00000034-0c53e488-5068-4650-b5ab-66c486f03efa-tapf3c83dc3-57', 'timestamp': '2025-11-29T07:00:48.089958', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-13163110', 'name': 'tapf3c83dc3-57', 'instance_id': '0c53e488-5068-4650-b5ab-66c486f03efa', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:32:b2:fb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf3c83dc3-57'}, 'message_id': '222acfce-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5095.714832118, 'message_signature': '30829af3ec0cf8200f251857b598e015e8d9f882d95205b4730e6c32c382dfea'}]}, 'timestamp': '2025-11-29 07:00:48.090246', '_unique_id': '30f0a8e189864893a4a339a04577434d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.091 12 DEBUG ceilometer.compute.pollsters [-] 0c53e488-5068-4650-b5ab-66c486f03efa/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 DEBUG ceilometer.compute.pollsters [-] 0c53e488-5068-4650-b5ab-66c486f03efa/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e1d0d564-e144-4d49-95b2-a0867416c2f0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': '0c53e488-5068-4650-b5ab-66c486f03efa-vda', 'timestamp': '2025-11-29T07:00:48.091771', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-13163110', 'name': 'instance-00000034', 'instance_id': '0c53e488-5068-4650-b5ab-66c486f03efa', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '222b17a4-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5095.75062444, 'message_signature': '2933a9b743410bd4554dbde25cf1edb3791245aa79acdf3bea048bfafc012414'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': '0c53e488-5068-4650-b5ab-66c486f03efa-sda', 'timestamp': '2025-11-29T07:00:48.091771', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-13163110', 'name': 'instance-00000034', 'instance_id': '0c53e488-5068-4650-b5ab-66c486f03efa', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '222b2244-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5095.75062444, 'message_signature': 'f68e032c847cd940582ec0510c681fb917d9ac6488e306f4e9e8dad6d3c3b51b'}]}, 'timestamp': '2025-11-29 07:00:48.092353', '_unique_id': 'a54037ca51e1472b9abe31170094bfca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.092 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.093 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.093 12 DEBUG ceilometer.compute.pollsters [-] 0c53e488-5068-4650-b5ab-66c486f03efa/disk.device.read.latency volume: 217072731 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.094 12 DEBUG ceilometer.compute.pollsters [-] 0c53e488-5068-4650-b5ab-66c486f03efa/disk.device.read.latency volume: 487894 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '65a668b3-0b9a-4cce-b01a-fa302b6ac18f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 217072731, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': '0c53e488-5068-4650-b5ab-66c486f03efa-vda', 'timestamp': '2025-11-29T07:00:48.093831', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-13163110', 'name': 'instance-00000034', 'instance_id': '0c53e488-5068-4650-b5ab-66c486f03efa', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '222b6844-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5095.75062444, 'message_signature': '8e50f5f5c10d97f3da1eae6667c0e8464d6007bb59364afb5234184095013f9d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 487894, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': '0c53e488-5068-4650-b5ab-66c486f03efa-sda', 'timestamp': '2025-11-29T07:00:48.093831', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-13163110', 'name': 'instance-00000034', 'instance_id': '0c53e488-5068-4650-b5ab-66c486f03efa', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '222b732a-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5095.75062444, 'message_signature': '8c941fc745bef4981d2c1e0269031e1f40e5af60109ecae12a9aa5d7de0ebcfb'}]}, 'timestamp': '2025-11-29 07:00:48.094424', '_unique_id': '660c76a2c7174a6ba6d93e37536112ff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.095 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.096 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.096 12 DEBUG ceilometer.compute.pollsters [-] 0c53e488-5068-4650-b5ab-66c486f03efa/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.096 12 DEBUG ceilometer.compute.pollsters [-] 0c53e488-5068-4650-b5ab-66c486f03efa/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '189fa482-405d-4a09-b631-abd253f47071', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': '0c53e488-5068-4650-b5ab-66c486f03efa-vda', 'timestamp': '2025-11-29T07:00:48.096125', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-13163110', 'name': 'instance-00000034', 'instance_id': '0c53e488-5068-4650-b5ab-66c486f03efa', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '222bc0f0-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5095.75062444, 'message_signature': 'd44970f3e8b468a752b0d92b4c6e73b7ee51255c433135b532e7d2ddc771699e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 
'resource_id': '0c53e488-5068-4650-b5ab-66c486f03efa-sda', 'timestamp': '2025-11-29T07:00:48.096125', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-13163110', 'name': 'instance-00000034', 'instance_id': '0c53e488-5068-4650-b5ab-66c486f03efa', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '222bcc6c-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5095.75062444, 'message_signature': 'c8431e993b7f41c6ed8e519965ed0d2f92ef5b83a2881cc9820b7475fdc7b602'}]}, 'timestamp': '2025-11-29 07:00:48.096720', '_unique_id': '8b4176931ccf486eaf41a1dc0e4140a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.097 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.098 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.098 12 DEBUG ceilometer.compute.pollsters [-] 0c53e488-5068-4650-b5ab-66c486f03efa/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.098 12 DEBUG ceilometer.compute.pollsters [-] 0c53e488-5068-4650-b5ab-66c486f03efa/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1849e144-3bc0-459c-bc81-34dc97cb55b8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': '0c53e488-5068-4650-b5ab-66c486f03efa-vda', 'timestamp': '2025-11-29T07:00:48.098232', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-13163110', 'name': 'instance-00000034', 'instance_id': '0c53e488-5068-4650-b5ab-66c486f03efa', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '222c1370-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5095.75062444, 'message_signature': 'a514dde196f576e5ec0ebce74c9fd4f2923578721671797617453e6b31e32aba'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 
'resource_id': '0c53e488-5068-4650-b5ab-66c486f03efa-sda', 'timestamp': '2025-11-29T07:00:48.098232', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-13163110', 'name': 'instance-00000034', 'instance_id': '0c53e488-5068-4650-b5ab-66c486f03efa', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '222c1df2-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5095.75062444, 'message_signature': '742377e143c2d95664b9bad52f43a49d9d80693a52b670540b6c7e66ee82412c'}]}, 'timestamp': '2025-11-29 07:00:48.098805', '_unique_id': '8de6d22b0e1d49df9f66b7b3dbcb9a7d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.099 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.100 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.100 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.100 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-DeleteServersTestJSON-server-13163110>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-DeleteServersTestJSON-server-13163110>]
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.100 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.101 12 DEBUG ceilometer.compute.pollsters [-] 0c53e488-5068-4650-b5ab-66c486f03efa/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aed5f1c3-f775-4ad6-87b3-89c4e0895111', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'instance-00000034-0c53e488-5068-4650-b5ab-66c486f03efa-tapf3c83dc3-57', 'timestamp': '2025-11-29T07:00:48.100999', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-13163110', 'name': 'tapf3c83dc3-57', 'instance_id': '0c53e488-5068-4650-b5ab-66c486f03efa', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:32:b2:fb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf3c83dc3-57'}, 'message_id': '222c808a-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5095.714832118, 'message_signature': '307b949d9e991d81bd47ef61bd73393278b4b1753b7aec58ff26ffae99dd95b3'}]}, 'timestamp': '2025-11-29 07:00:48.101369', '_unique_id': 'eae7e022f95845fab370725607ad68fc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.102 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.103 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.103 12 DEBUG ceilometer.compute.pollsters [-] 0c53e488-5068-4650-b5ab-66c486f03efa/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '82bf81d0-4c60-4eee-9432-719ecda17597', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'instance-00000034-0c53e488-5068-4650-b5ab-66c486f03efa-tapf3c83dc3-57', 'timestamp': '2025-11-29T07:00:48.103134', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-13163110', 'name': 'tapf3c83dc3-57', 'instance_id': '0c53e488-5068-4650-b5ab-66c486f03efa', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:32:b2:fb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf3c83dc3-57'}, 'message_id': '222cd332-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5095.714832118, 'message_signature': 'd7c408b50f85661b21b23e84b569d75261ff4a61f09bba241f6bed4aa6e5ef8e'}]}, 'timestamp': '2025-11-29 07:00:48.103473', '_unique_id': '294b04e73feb437c8a0b60c68aacde82'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.104 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.105 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.105 12 DEBUG ceilometer.compute.pollsters [-] 0c53e488-5068-4650-b5ab-66c486f03efa/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7b227313-1e18-4c09-8eec-188e0340bfb1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'instance-00000034-0c53e488-5068-4650-b5ab-66c486f03efa-tapf3c83dc3-57', 'timestamp': '2025-11-29T07:00:48.105202', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-13163110', 'name': 'tapf3c83dc3-57', 'instance_id': '0c53e488-5068-4650-b5ab-66c486f03efa', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:32:b2:fb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf3c83dc3-57'}, 'message_id': '222d2346-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5095.714832118, 'message_signature': '79b1e66ea5cb286be514798d1670aac1ea53708d9eea6c4ec6d4b74064e2b682'}]}, 'timestamp': '2025-11-29 07:00:48.105525', '_unique_id': '1086a98e014a429da2f322a9406d00f0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.106 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.107 12 DEBUG ceilometer.compute.pollsters [-] 0c53e488-5068-4650-b5ab-66c486f03efa/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.107 12 DEBUG ceilometer.compute.pollsters [-] 0c53e488-5068-4650-b5ab-66c486f03efa/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bf910199-823e-4bf3-a5af-29f276733aa5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': '0c53e488-5068-4650-b5ab-66c486f03efa-vda', 'timestamp': '2025-11-29T07:00:48.107010', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-13163110', 'name': 'instance-00000034', 'instance_id': '0c53e488-5068-4650-b5ab-66c486f03efa', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '222d69be-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5095.75062444, 'message_signature': '58bda274e278aa750cc0c7fd3e14c3aec6dd5c2b17f0f9a62218aad1f08485f5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': '0c53e488-5068-4650-b5ab-66c486f03efa-sda', 'timestamp': '2025-11-29T07:00:48.107010', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-13163110', 'name': 'instance-00000034', 'instance_id': '0c53e488-5068-4650-b5ab-66c486f03efa', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '222d7526-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5095.75062444, 'message_signature': '117a325b0edfc388816ccedc26a73eaa1ae6362f6e958f7f1e8226df868024f5'}]}, 'timestamp': '2025-11-29 07:00:48.107587', '_unique_id': 'e2acf037ddcb4297a76981767cbef09e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.108 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.109 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.109 12 DEBUG ceilometer.compute.pollsters [-] 0c53e488-5068-4650-b5ab-66c486f03efa/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '55b36faf-b7f0-471d-beee-a81db09e1685', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'instance-00000034-0c53e488-5068-4650-b5ab-66c486f03efa-tapf3c83dc3-57', 'timestamp': '2025-11-29T07:00:48.109100', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-13163110', 'name': 'tapf3c83dc3-57', 'instance_id': '0c53e488-5068-4650-b5ab-66c486f03efa', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:32:b2:fb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf3c83dc3-57'}, 'message_id': '222dbc2a-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5095.714832118, 'message_signature': '5bf2830e08bca9fd706d0f1262aeb39be7fd0dc92842116966ecda8e9269eb82'}]}, 'timestamp': '2025-11-29 07:00:48.109411', '_unique_id': '9e4a4b315b1d4392ad18bb4a9870889a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.110 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.111 12 DEBUG ceilometer.compute.pollsters [-] 0c53e488-5068-4650-b5ab-66c486f03efa/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.111 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 0c53e488-5068-4650-b5ab-66c486f03efa: ceilometer.compute.pollsters.NoVolumeException
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.111 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.111 12 DEBUG ceilometer.compute.pollsters [-] 0c53e488-5068-4650-b5ab-66c486f03efa/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '325d5119-a9a6-4907-a89b-e2979c0c3b2c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'instance-00000034-0c53e488-5068-4650-b5ab-66c486f03efa-tapf3c83dc3-57', 'timestamp': '2025-11-29T07:00:48.111273', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-13163110', 'name': 'tapf3c83dc3-57', 'instance_id': '0c53e488-5068-4650-b5ab-66c486f03efa', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:32:b2:fb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf3c83dc3-57'}, 'message_id': '222e10a8-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5095.714832118, 'message_signature': '310667872119d3a17afe6e6691d18561ff8bb8667dc21eb0de99415938a94ae7'}]}, 'timestamp': '2025-11-29 07:00:48.111617', '_unique_id': 'b60c5f19bd6f4542a380a2a4cd00e160'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.112 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.113 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.113 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:00:48.113 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-DeleteServersTestJSON-server-13163110>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-DeleteServersTestJSON-server-13163110>]
Nov 29 07:00:48 compute-0 nova_compute[187185]: 2025-11-29 07:00:48.729 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:00:49 compute-0 nova_compute[187185]: 2025-11-29 07:00:49.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:00:50 compute-0 nova_compute[187185]: 2025-11-29 07:00:50.238 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:00:50 compute-0 nova_compute[187185]: 2025-11-29 07:00:50.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:00:50 compute-0 nova_compute[187185]: 2025-11-29 07:00:50.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:00:50 compute-0 nova_compute[187185]: 2025-11-29 07:00:50.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:00:50 compute-0 nova_compute[187185]: 2025-11-29 07:00:50.346 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "refresh_cache-0c53e488-5068-4650-b5ab-66c486f03efa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:00:50 compute-0 nova_compute[187185]: 2025-11-29 07:00:50.346 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquired lock "refresh_cache-0c53e488-5068-4650-b5ab-66c486f03efa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:00:50 compute-0 nova_compute[187185]: 2025-11-29 07:00:50.346 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 29 07:00:50 compute-0 nova_compute[187185]: 2025-11-29 07:00:50.347 187189 DEBUG nova.objects.instance [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 0c53e488-5068-4650-b5ab-66c486f03efa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:00:53 compute-0 nova_compute[187185]: 2025-11-29 07:00:53.733 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:00:54 compute-0 podman[220650]: 2025-11-29 07:00:54.826692188 +0000 UTC m=+0.083281644 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 07:00:55 compute-0 nova_compute[187185]: 2025-11-29 07:00:55.241 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:00:55 compute-0 nova_compute[187185]: 2025-11-29 07:00:55.312 187189 DEBUG nova.virt.libvirt.driver [None req-5631dd08-c7b1-4122-8c4b-0ffbe881c7f7 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Nov 29 07:00:55 compute-0 nova_compute[187185]: 2025-11-29 07:00:55.804 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Updating instance_info_cache with network_info: [{"id": "f3c83dc3-5763-4272-83eb-749a084d4129", "address": "fa:16:3e:32:b2:fb", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3c83dc3-57", "ovs_interfaceid": "f3c83dc3-5763-4272-83eb-749a084d4129", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:00:56 compute-0 nova_compute[187185]: 2025-11-29 07:00:56.027 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Releasing lock "refresh_cache-0c53e488-5068-4650-b5ab-66c486f03efa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:00:56 compute-0 nova_compute[187185]: 2025-11-29 07:00:56.028 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 29 07:00:56 compute-0 nova_compute[187185]: 2025-11-29 07:00:56.028 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:00:56 compute-0 nova_compute[187185]: 2025-11-29 07:00:56.029 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:00:56 compute-0 nova_compute[187185]: 2025-11-29 07:00:56.029 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:00:56 compute-0 nova_compute[187185]: 2025-11-29 07:00:56.029 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:00:56 compute-0 nova_compute[187185]: 2025-11-29 07:00:56.029 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:00:56 compute-0 nova_compute[187185]: 2025-11-29 07:00:56.030 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:00:56 compute-0 nova_compute[187185]: 2025-11-29 07:00:56.030 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:00:56 compute-0 nova_compute[187185]: 2025-11-29 07:00:56.074 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:00:56 compute-0 nova_compute[187185]: 2025-11-29 07:00:56.075 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:00:56 compute-0 nova_compute[187185]: 2025-11-29 07:00:56.075 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:00:56 compute-0 nova_compute[187185]: 2025-11-29 07:00:56.075 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:00:56 compute-0 nova_compute[187185]: 2025-11-29 07:00:56.182 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0c53e488-5068-4650-b5ab-66c486f03efa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:00:56 compute-0 nova_compute[187185]: 2025-11-29 07:00:56.273 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0c53e488-5068-4650-b5ab-66c486f03efa/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:00:56 compute-0 nova_compute[187185]: 2025-11-29 07:00:56.275 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0c53e488-5068-4650-b5ab-66c486f03efa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:00:56 compute-0 nova_compute[187185]: 2025-11-29 07:00:56.331 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0c53e488-5068-4650-b5ab-66c486f03efa/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:00:56 compute-0 nova_compute[187185]: 2025-11-29 07:00:56.483 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:00:56 compute-0 nova_compute[187185]: 2025-11-29 07:00:56.485 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5548MB free_disk=73.31115341186523GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:00:56 compute-0 nova_compute[187185]: 2025-11-29 07:00:56.485 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:00:56 compute-0 nova_compute[187185]: 2025-11-29 07:00:56.486 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:00:56 compute-0 nova_compute[187185]: 2025-11-29 07:00:56.651 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance 0c53e488-5068-4650-b5ab-66c486f03efa actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 07:00:56 compute-0 nova_compute[187185]: 2025-11-29 07:00:56.652 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:00:56 compute-0 nova_compute[187185]: 2025-11-29 07:00:56.652 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:00:56 compute-0 nova_compute[187185]: 2025-11-29 07:00:56.731 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:00:56 compute-0 ovn_controller[95281]: 2025-11-29T07:00:56Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:32:b2:fb 10.100.0.14
Nov 29 07:00:56 compute-0 ovn_controller[95281]: 2025-11-29T07:00:56Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:32:b2:fb 10.100.0.14
Nov 29 07:00:56 compute-0 nova_compute[187185]: 2025-11-29 07:00:56.934 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:00:56 compute-0 nova_compute[187185]: 2025-11-29 07:00:56.978 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:00:56 compute-0 nova_compute[187185]: 2025-11-29 07:00:56.979 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.493s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:00:57 compute-0 nova_compute[187185]: 2025-11-29 07:00:57.974 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:00:58 compute-0 nova_compute[187185]: 2025-11-29 07:00:58.736 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:00 compute-0 nova_compute[187185]: 2025-11-29 07:01:00.245 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:01 compute-0 CROND[220695]: (root) CMD (run-parts /etc/cron.hourly)
Nov 29 07:01:01 compute-0 run-parts[220698]: (/etc/cron.hourly) starting 0anacron
Nov 29 07:01:01 compute-0 run-parts[220704]: (/etc/cron.hourly) finished 0anacron
Nov 29 07:01:01 compute-0 CROND[220694]: (root) CMDEND (run-parts /etc/cron.hourly)
Nov 29 07:01:03 compute-0 nova_compute[187185]: 2025-11-29 07:01:03.738 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:05 compute-0 nova_compute[187185]: 2025-11-29 07:01:05.249 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:06 compute-0 nova_compute[187185]: 2025-11-29 07:01:06.364 187189 DEBUG nova.virt.libvirt.driver [None req-5631dd08-c7b1-4122-8c4b-0ffbe881c7f7 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Instance in state 1 after 21 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Nov 29 07:01:07 compute-0 podman[220705]: 2025-11-29 07:01:07.834065704 +0000 UTC m=+0.082557984 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent)
Nov 29 07:01:07 compute-0 podman[220707]: 2025-11-29 07:01:07.84062288 +0000 UTC m=+0.082179514 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 07:01:07 compute-0 podman[220706]: 2025-11-29 07:01:07.853937246 +0000 UTC m=+0.098603898 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.openshift.tags=minimal rhel9, config_id=edpm, io.buildah.version=1.33.7, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Nov 29 07:01:08 compute-0 nova_compute[187185]: 2025-11-29 07:01:08.741 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:09 compute-0 kernel: tapf3c83dc3-57 (unregistering): left promiscuous mode
Nov 29 07:01:09 compute-0 NetworkManager[55227]: <info>  [1764399669.2668] device (tapf3c83dc3-57): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:01:09 compute-0 ovn_controller[95281]: 2025-11-29T07:01:09Z|00114|binding|INFO|Releasing lport f3c83dc3-5763-4272-83eb-749a084d4129 from this chassis (sb_readonly=0)
Nov 29 07:01:09 compute-0 ovn_controller[95281]: 2025-11-29T07:01:09Z|00115|binding|INFO|Setting lport f3c83dc3-5763-4272-83eb-749a084d4129 down in Southbound
Nov 29 07:01:09 compute-0 ovn_controller[95281]: 2025-11-29T07:01:09Z|00116|binding|INFO|Removing iface tapf3c83dc3-57 ovn-installed in OVS
Nov 29 07:01:09 compute-0 nova_compute[187185]: 2025-11-29 07:01:09.282 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:09.312 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:32:b2:fb 10.100.0.14'], port_security=['fa:16:3e:32:b2:fb 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '0c53e488-5068-4650-b5ab-66c486f03efa', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98df116965b74e4a9985049062e65162', 'neutron:revision_number': '4', 'neutron:security_group_ids': '234720a9-9cd1-4b87-9bec-1abfe8ff0514', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e694bb30-a43a-4d18-87fa-e5c0dd8850c2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=f3c83dc3-5763-4272-83eb-749a084d4129) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:01:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:09.313 104254 INFO neutron.agent.ovn.metadata.agent [-] Port f3c83dc3-5763-4272-83eb-749a084d4129 in datapath fd9eb57e-b1f8-4bae-a60f-8e40613556cd unbound from our chassis
Nov 29 07:01:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:09.315 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fd9eb57e-b1f8-4bae-a60f-8e40613556cd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:01:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:09.317 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[85708700-487e-408b-b611-38cf4024b895]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:09.318 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd namespace which is not needed anymore
Nov 29 07:01:09 compute-0 nova_compute[187185]: 2025-11-29 07:01:09.322 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:09 compute-0 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000034.scope: Deactivated successfully.
Nov 29 07:01:09 compute-0 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000034.scope: Consumed 14.295s CPU time.
Nov 29 07:01:09 compute-0 systemd-machined[153486]: Machine qemu-18-instance-00000034 terminated.
Nov 29 07:01:09 compute-0 nova_compute[187185]: 2025-11-29 07:01:09.513 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:09 compute-0 nova_compute[187185]: 2025-11-29 07:01:09.520 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:09 compute-0 nova_compute[187185]: 2025-11-29 07:01:09.566 187189 INFO nova.virt.libvirt.driver [None req-5631dd08-c7b1-4122-8c4b-0ffbe881c7f7 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Instance shutdown successfully after 24 seconds.
Nov 29 07:01:09 compute-0 nova_compute[187185]: 2025-11-29 07:01:09.574 187189 INFO nova.virt.libvirt.driver [-] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Instance destroyed successfully.
Nov 29 07:01:09 compute-0 nova_compute[187185]: 2025-11-29 07:01:09.575 187189 DEBUG nova.objects.instance [None req-5631dd08-c7b1-4122-8c4b-0ffbe881c7f7 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lazy-loading 'numa_topology' on Instance uuid 0c53e488-5068-4650-b5ab-66c486f03efa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:01:09 compute-0 nova_compute[187185]: 2025-11-29 07:01:09.584 187189 DEBUG nova.compute.manager [req-2f30626f-d560-4350-a447-20903b419d48 req-215346a2-1f91-4d36-83de-c02c1ffc5080 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Received event network-vif-unplugged-f3c83dc3-5763-4272-83eb-749a084d4129 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:01:09 compute-0 nova_compute[187185]: 2025-11-29 07:01:09.585 187189 DEBUG oslo_concurrency.lockutils [req-2f30626f-d560-4350-a447-20903b419d48 req-215346a2-1f91-4d36-83de-c02c1ffc5080 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "0c53e488-5068-4650-b5ab-66c486f03efa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:01:09 compute-0 nova_compute[187185]: 2025-11-29 07:01:09.585 187189 DEBUG oslo_concurrency.lockutils [req-2f30626f-d560-4350-a447-20903b419d48 req-215346a2-1f91-4d36-83de-c02c1ffc5080 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0c53e488-5068-4650-b5ab-66c486f03efa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:01:09 compute-0 nova_compute[187185]: 2025-11-29 07:01:09.585 187189 DEBUG oslo_concurrency.lockutils [req-2f30626f-d560-4350-a447-20903b419d48 req-215346a2-1f91-4d36-83de-c02c1ffc5080 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0c53e488-5068-4650-b5ab-66c486f03efa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:01:09 compute-0 nova_compute[187185]: 2025-11-29 07:01:09.585 187189 DEBUG nova.compute.manager [req-2f30626f-d560-4350-a447-20903b419d48 req-215346a2-1f91-4d36-83de-c02c1ffc5080 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] No waiting events found dispatching network-vif-unplugged-f3c83dc3-5763-4272-83eb-749a084d4129 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:01:09 compute-0 nova_compute[187185]: 2025-11-29 07:01:09.586 187189 WARNING nova.compute.manager [req-2f30626f-d560-4350-a447-20903b419d48 req-215346a2-1f91-4d36-83de-c02c1ffc5080 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Received unexpected event network-vif-unplugged-f3c83dc3-5763-4272-83eb-749a084d4129 for instance with vm_state active and task_state shelving.
Nov 29 07:01:09 compute-0 nova_compute[187185]: 2025-11-29 07:01:09.932 187189 INFO nova.virt.libvirt.driver [None req-5631dd08-c7b1-4122-8c4b-0ffbe881c7f7 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Beginning cold snapshot process
Nov 29 07:01:10 compute-0 nova_compute[187185]: 2025-11-29 07:01:10.293 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:10 compute-0 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[220562]: [NOTICE]   (220566) : haproxy version is 2.8.14-c23fe91
Nov 29 07:01:10 compute-0 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[220562]: [NOTICE]   (220566) : path to executable is /usr/sbin/haproxy
Nov 29 07:01:10 compute-0 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[220562]: [WARNING]  (220566) : Exiting Master process...
Nov 29 07:01:10 compute-0 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[220562]: [ALERT]    (220566) : Current worker (220568) exited with code 143 (Terminated)
Nov 29 07:01:10 compute-0 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[220562]: [WARNING]  (220566) : All workers exited. Exiting... (0)
Nov 29 07:01:10 compute-0 systemd[1]: libpod-972ef85c735ee92eda64c27290f50599ec0a74c197a1aa9412ca504fc45bceb2.scope: Deactivated successfully.
Nov 29 07:01:10 compute-0 conmon[220562]: conmon 972ef85c735ee92eda64 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-972ef85c735ee92eda64c27290f50599ec0a74c197a1aa9412ca504fc45bceb2.scope/container/memory.events
Nov 29 07:01:10 compute-0 podman[220789]: 2025-11-29 07:01:10.817275762 +0000 UTC m=+1.362268919 container died 972ef85c735ee92eda64c27290f50599ec0a74c197a1aa9412ca504fc45bceb2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:01:11 compute-0 nova_compute[187185]: 2025-11-29 07:01:11.790 187189 DEBUG nova.compute.manager [req-23e4f064-f85c-4b57-9738-e28656496674 req-31123d42-1526-4462-b0af-e4ef8f3a7bda 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Received event network-vif-plugged-f3c83dc3-5763-4272-83eb-749a084d4129 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:01:11 compute-0 nova_compute[187185]: 2025-11-29 07:01:11.790 187189 DEBUG oslo_concurrency.lockutils [req-23e4f064-f85c-4b57-9738-e28656496674 req-31123d42-1526-4462-b0af-e4ef8f3a7bda 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "0c53e488-5068-4650-b5ab-66c486f03efa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:01:11 compute-0 nova_compute[187185]: 2025-11-29 07:01:11.791 187189 DEBUG oslo_concurrency.lockutils [req-23e4f064-f85c-4b57-9738-e28656496674 req-31123d42-1526-4462-b0af-e4ef8f3a7bda 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0c53e488-5068-4650-b5ab-66c486f03efa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:01:11 compute-0 nova_compute[187185]: 2025-11-29 07:01:11.791 187189 DEBUG oslo_concurrency.lockutils [req-23e4f064-f85c-4b57-9738-e28656496674 req-31123d42-1526-4462-b0af-e4ef8f3a7bda 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0c53e488-5068-4650-b5ab-66c486f03efa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:01:11 compute-0 nova_compute[187185]: 2025-11-29 07:01:11.792 187189 DEBUG nova.compute.manager [req-23e4f064-f85c-4b57-9738-e28656496674 req-31123d42-1526-4462-b0af-e4ef8f3a7bda 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] No waiting events found dispatching network-vif-plugged-f3c83dc3-5763-4272-83eb-749a084d4129 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:01:11 compute-0 nova_compute[187185]: 2025-11-29 07:01:11.792 187189 WARNING nova.compute.manager [req-23e4f064-f85c-4b57-9738-e28656496674 req-31123d42-1526-4462-b0af-e4ef8f3a7bda 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Received unexpected event network-vif-plugged-f3c83dc3-5763-4272-83eb-749a084d4129 for instance with vm_state active and task_state shelving_image_pending_upload.
Nov 29 07:01:13 compute-0 nova_compute[187185]: 2025-11-29 07:01:13.223 187189 DEBUG nova.privsep.utils [None req-5631dd08-c7b1-4122-8c4b-0ffbe881c7f7 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 29 07:01:13 compute-0 nova_compute[187185]: 2025-11-29 07:01:13.224 187189 DEBUG oslo_concurrency.processutils [None req-5631dd08-c7b1-4122-8c4b-0ffbe881c7f7 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/0c53e488-5068-4650-b5ab-66c486f03efa/disk /var/lib/nova/instances/snapshots/tmpxkjxgzud/0ba0fd5d66f14fc39037e52a83a7c6bf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:01:13 compute-0 nova_compute[187185]: 2025-11-29 07:01:13.752 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:14 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-972ef85c735ee92eda64c27290f50599ec0a74c197a1aa9412ca504fc45bceb2-userdata-shm.mount: Deactivated successfully.
Nov 29 07:01:14 compute-0 systemd[1]: var-lib-containers-storage-overlay-92295d971936649fc5cfe4e3a99d1330c6e5e244025c6d270069672112aa6eca-merged.mount: Deactivated successfully.
Nov 29 07:01:15 compute-0 podman[220835]: 2025-11-29 07:01:15.021961658 +0000 UTC m=+1.272593764 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:01:15 compute-0 nova_compute[187185]: 2025-11-29 07:01:15.297 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:18 compute-0 nova_compute[187185]: 2025-11-29 07:01:18.235 187189 DEBUG oslo_concurrency.lockutils [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Acquiring lock "5f771a98-65c5-4910-b222-13d29157fdf5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:01:18 compute-0 nova_compute[187185]: 2025-11-29 07:01:18.236 187189 DEBUG oslo_concurrency.lockutils [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lock "5f771a98-65c5-4910-b222-13d29157fdf5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:01:18 compute-0 nova_compute[187185]: 2025-11-29 07:01:18.260 187189 DEBUG nova.compute.manager [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 07:01:18 compute-0 nova_compute[187185]: 2025-11-29 07:01:18.389 187189 DEBUG oslo_concurrency.lockutils [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:01:18 compute-0 nova_compute[187185]: 2025-11-29 07:01:18.390 187189 DEBUG oslo_concurrency.lockutils [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:01:18 compute-0 nova_compute[187185]: 2025-11-29 07:01:18.398 187189 DEBUG nova.virt.hardware [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 07:01:18 compute-0 nova_compute[187185]: 2025-11-29 07:01:18.399 187189 INFO nova.compute.claims [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Claim successful on node compute-0.ctlplane.example.com
Nov 29 07:01:18 compute-0 nova_compute[187185]: 2025-11-29 07:01:18.589 187189 DEBUG nova.compute.provider_tree [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:01:18 compute-0 nova_compute[187185]: 2025-11-29 07:01:18.676 187189 DEBUG nova.scheduler.client.report [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:01:18 compute-0 nova_compute[187185]: 2025-11-29 07:01:18.719 187189 DEBUG oslo_concurrency.lockutils [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.329s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:01:18 compute-0 nova_compute[187185]: 2025-11-29 07:01:18.720 187189 DEBUG nova.compute.manager [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 07:01:18 compute-0 nova_compute[187185]: 2025-11-29 07:01:18.755 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:18 compute-0 nova_compute[187185]: 2025-11-29 07:01:18.780 187189 DEBUG nova.compute.manager [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 07:01:18 compute-0 nova_compute[187185]: 2025-11-29 07:01:18.780 187189 DEBUG nova.network.neutron [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 07:01:18 compute-0 nova_compute[187185]: 2025-11-29 07:01:18.800 187189 INFO nova.virt.libvirt.driver [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 07:01:18 compute-0 nova_compute[187185]: 2025-11-29 07:01:18.823 187189 DEBUG nova.compute.manager [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 07:01:18 compute-0 nova_compute[187185]: 2025-11-29 07:01:18.967 187189 DEBUG nova.compute.manager [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 07:01:18 compute-0 nova_compute[187185]: 2025-11-29 07:01:18.970 187189 DEBUG nova.virt.libvirt.driver [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 07:01:18 compute-0 nova_compute[187185]: 2025-11-29 07:01:18.970 187189 INFO nova.virt.libvirt.driver [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Creating image(s)
Nov 29 07:01:18 compute-0 nova_compute[187185]: 2025-11-29 07:01:18.971 187189 DEBUG oslo_concurrency.lockutils [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Acquiring lock "/var/lib/nova/instances/5f771a98-65c5-4910-b222-13d29157fdf5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:01:18 compute-0 nova_compute[187185]: 2025-11-29 07:01:18.971 187189 DEBUG oslo_concurrency.lockutils [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lock "/var/lib/nova/instances/5f771a98-65c5-4910-b222-13d29157fdf5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:01:18 compute-0 nova_compute[187185]: 2025-11-29 07:01:18.972 187189 DEBUG oslo_concurrency.lockutils [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lock "/var/lib/nova/instances/5f771a98-65c5-4910-b222-13d29157fdf5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:01:18 compute-0 nova_compute[187185]: 2025-11-29 07:01:18.986 187189 DEBUG oslo_concurrency.processutils [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:01:19 compute-0 nova_compute[187185]: 2025-11-29 07:01:19.040 187189 DEBUG nova.policy [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '11e9982557a44d40b2ebaf04bf99c371', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'de73e0af4d994da4a30deaebd1a7e86b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 07:01:19 compute-0 nova_compute[187185]: 2025-11-29 07:01:19.082 187189 DEBUG oslo_concurrency.processutils [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:01:19 compute-0 nova_compute[187185]: 2025-11-29 07:01:19.084 187189 DEBUG oslo_concurrency.lockutils [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:01:19 compute-0 nova_compute[187185]: 2025-11-29 07:01:19.085 187189 DEBUG oslo_concurrency.lockutils [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:01:19 compute-0 nova_compute[187185]: 2025-11-29 07:01:19.097 187189 DEBUG oslo_concurrency.processutils [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:01:19 compute-0 nova_compute[187185]: 2025-11-29 07:01:19.156 187189 DEBUG oslo_concurrency.processutils [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:01:19 compute-0 nova_compute[187185]: 2025-11-29 07:01:19.158 187189 DEBUG oslo_concurrency.processutils [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/5f771a98-65c5-4910-b222-13d29157fdf5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:01:20 compute-0 nova_compute[187185]: 2025-11-29 07:01:20.300 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:20 compute-0 nova_compute[187185]: 2025-11-29 07:01:20.642 187189 DEBUG nova.network.neutron [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Successfully created port: c0762766-7e10-403f-82f8-da85dbb8bc40 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 07:01:20 compute-0 podman[220789]: 2025-11-29 07:01:20.936285621 +0000 UTC m=+11.481278758 container cleanup 972ef85c735ee92eda64c27290f50599ec0a74c197a1aa9412ca504fc45bceb2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 07:01:20 compute-0 systemd[1]: libpod-conmon-972ef85c735ee92eda64c27290f50599ec0a74c197a1aa9412ca504fc45bceb2.scope: Deactivated successfully.
Nov 29 07:01:22 compute-0 podman[220871]: 2025-11-29 07:01:22.580695672 +0000 UTC m=+6.654154191 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 07:01:22 compute-0 podman[220870]: 2025-11-29 07:01:22.582028029 +0000 UTC m=+6.661867058 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:01:22 compute-0 nova_compute[187185]: 2025-11-29 07:01:22.870 187189 DEBUG oslo_concurrency.processutils [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/5f771a98-65c5-4910-b222-13d29157fdf5/disk 1073741824" returned: 0 in 3.712s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:01:22 compute-0 nova_compute[187185]: 2025-11-29 07:01:22.870 187189 DEBUG oslo_concurrency.lockutils [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 3.785s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:01:22 compute-0 nova_compute[187185]: 2025-11-29 07:01:22.871 187189 DEBUG oslo_concurrency.processutils [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:01:22 compute-0 nova_compute[187185]: 2025-11-29 07:01:22.960 187189 DEBUG oslo_concurrency.processutils [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:01:22 compute-0 nova_compute[187185]: 2025-11-29 07:01:22.961 187189 DEBUG nova.virt.disk.api [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Checking if we can resize image /var/lib/nova/instances/5f771a98-65c5-4910-b222-13d29157fdf5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:01:22 compute-0 nova_compute[187185]: 2025-11-29 07:01:22.961 187189 DEBUG oslo_concurrency.processutils [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f771a98-65c5-4910-b222-13d29157fdf5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:01:23 compute-0 nova_compute[187185]: 2025-11-29 07:01:23.026 187189 DEBUG oslo_concurrency.processutils [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f771a98-65c5-4910-b222-13d29157fdf5/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:01:23 compute-0 nova_compute[187185]: 2025-11-29 07:01:23.028 187189 DEBUG nova.virt.disk.api [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Cannot resize image /var/lib/nova/instances/5f771a98-65c5-4910-b222-13d29157fdf5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:01:23 compute-0 nova_compute[187185]: 2025-11-29 07:01:23.028 187189 DEBUG nova.objects.instance [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lazy-loading 'migration_context' on Instance uuid 5f771a98-65c5-4910-b222-13d29157fdf5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:01:23 compute-0 nova_compute[187185]: 2025-11-29 07:01:23.086 187189 DEBUG nova.network.neutron [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Successfully created port: 6549d57e-4605-44d6-b1cd-d909be8fd972 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 07:01:23 compute-0 nova_compute[187185]: 2025-11-29 07:01:23.152 187189 DEBUG oslo_concurrency.processutils [None req-5631dd08-c7b1-4122-8c4b-0ffbe881c7f7 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/0c53e488-5068-4650-b5ab-66c486f03efa/disk /var/lib/nova/instances/snapshots/tmpxkjxgzud/0ba0fd5d66f14fc39037e52a83a7c6bf" returned: 0 in 9.928s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:01:23 compute-0 nova_compute[187185]: 2025-11-29 07:01:23.153 187189 INFO nova.virt.libvirt.driver [None req-5631dd08-c7b1-4122-8c4b-0ffbe881c7f7 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Snapshot extracted, beginning image upload
Nov 29 07:01:23 compute-0 nova_compute[187185]: 2025-11-29 07:01:23.240 187189 DEBUG nova.virt.libvirt.driver [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 07:01:23 compute-0 nova_compute[187185]: 2025-11-29 07:01:23.240 187189 DEBUG nova.virt.libvirt.driver [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Ensure instance console log exists: /var/lib/nova/instances/5f771a98-65c5-4910-b222-13d29157fdf5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 07:01:23 compute-0 nova_compute[187185]: 2025-11-29 07:01:23.241 187189 DEBUG oslo_concurrency.lockutils [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:01:23 compute-0 nova_compute[187185]: 2025-11-29 07:01:23.242 187189 DEBUG oslo_concurrency.lockutils [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:01:23 compute-0 nova_compute[187185]: 2025-11-29 07:01:23.243 187189 DEBUG oslo_concurrency.lockutils [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:01:23 compute-0 podman[220902]: 2025-11-29 07:01:23.698033579 +0000 UTC m=+2.733638735 container remove 972ef85c735ee92eda64c27290f50599ec0a74c197a1aa9412ca504fc45bceb2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 29 07:01:23 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:23.705 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[2aaf9cb9-c2c5-419b-b3e4-2ad4bfa4af66]: (4, ('Sat Nov 29 07:01:09 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd (972ef85c735ee92eda64c27290f50599ec0a74c197a1aa9412ca504fc45bceb2)\n972ef85c735ee92eda64c27290f50599ec0a74c197a1aa9412ca504fc45bceb2\nSat Nov 29 07:01:20 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd (972ef85c735ee92eda64c27290f50599ec0a74c197a1aa9412ca504fc45bceb2)\n972ef85c735ee92eda64c27290f50599ec0a74c197a1aa9412ca504fc45bceb2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:23 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:23.707 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[b125e135-4587-40c6-8839-d5d34011bb44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:23 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:23.708 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfd9eb57e-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:01:23 compute-0 nova_compute[187185]: 2025-11-29 07:01:23.710 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:23 compute-0 kernel: tapfd9eb57e-b0: left promiscuous mode
Nov 29 07:01:23 compute-0 nova_compute[187185]: 2025-11-29 07:01:23.727 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:23 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:23.730 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[421043cb-8609-4b94-a967-679db2dc21a3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:23 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:23.746 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[693093c6-ba07-45b1-aec0-1ef5f3b5c649]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:23 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:23.747 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[c1440ef6-146b-44f7-989d-e62995b6f344]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:23 compute-0 nova_compute[187185]: 2025-11-29 07:01:23.757 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:23 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:23.764 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[65efeeb1-a1b0-4877-9ea4-ae9e6e08e8bd]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 508493, 'reachable_time': 34800, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220950, 'error': None, 'target': 'ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:23 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:23.770 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:01:23 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:23.771 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[805b515c-1385-4f55-a24e-cfafc11e722f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:23 compute-0 systemd[1]: run-netns-ovnmeta\x2dfd9eb57e\x2db1f8\x2d4bae\x2da60f\x2d8e40613556cd.mount: Deactivated successfully.
Nov 29 07:01:24 compute-0 nova_compute[187185]: 2025-11-29 07:01:24.566 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399669.5650074, 0c53e488-5068-4650-b5ab-66c486f03efa => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:01:24 compute-0 nova_compute[187185]: 2025-11-29 07:01:24.567 187189 INFO nova.compute.manager [-] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] VM Stopped (Lifecycle Event)
Nov 29 07:01:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:24.820 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:01:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:24.820 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:01:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:24.820 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:01:24 compute-0 nova_compute[187185]: 2025-11-29 07:01:24.925 187189 DEBUG nova.compute.manager [None req-23e0b815-4980-49d0-b25f-5250b7d36bed - - - - - -] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:01:24 compute-0 nova_compute[187185]: 2025-11-29 07:01:24.930 187189 DEBUG nova.compute.manager [None req-23e0b815-4980-49d0-b25f-5250b7d36bed - - - - - -] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: active, current task_state: shelving_image_uploading, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:01:24 compute-0 nova_compute[187185]: 2025-11-29 07:01:24.985 187189 INFO nova.compute.manager [None req-23e0b815-4980-49d0-b25f-5250b7d36bed - - - - - -] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] During sync_power_state the instance has a pending task (shelving_image_uploading). Skip.
Nov 29 07:01:25 compute-0 nova_compute[187185]: 2025-11-29 07:01:25.341 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:25 compute-0 nova_compute[187185]: 2025-11-29 07:01:25.847 187189 DEBUG nova.network.neutron [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Successfully updated port: c0762766-7e10-403f-82f8-da85dbb8bc40 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 07:01:25 compute-0 podman[220956]: 2025-11-29 07:01:25.852160354 +0000 UTC m=+0.100475920 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true)
Nov 29 07:01:27 compute-0 nova_compute[187185]: 2025-11-29 07:01:27.860 187189 DEBUG nova.compute.manager [req-10cce38b-dca1-4823-9ef3-1ea078590fa5 req-6680e7d4-d4e0-401b-9258-ac89694cfe72 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Received event network-changed-c0762766-7e10-403f-82f8-da85dbb8bc40 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:01:27 compute-0 nova_compute[187185]: 2025-11-29 07:01:27.860 187189 DEBUG nova.compute.manager [req-10cce38b-dca1-4823-9ef3-1ea078590fa5 req-6680e7d4-d4e0-401b-9258-ac89694cfe72 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Refreshing instance network info cache due to event network-changed-c0762766-7e10-403f-82f8-da85dbb8bc40. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:01:27 compute-0 nova_compute[187185]: 2025-11-29 07:01:27.860 187189 DEBUG oslo_concurrency.lockutils [req-10cce38b-dca1-4823-9ef3-1ea078590fa5 req-6680e7d4-d4e0-401b-9258-ac89694cfe72 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-5f771a98-65c5-4910-b222-13d29157fdf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:01:27 compute-0 nova_compute[187185]: 2025-11-29 07:01:27.861 187189 DEBUG oslo_concurrency.lockutils [req-10cce38b-dca1-4823-9ef3-1ea078590fa5 req-6680e7d4-d4e0-401b-9258-ac89694cfe72 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-5f771a98-65c5-4910-b222-13d29157fdf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:01:27 compute-0 nova_compute[187185]: 2025-11-29 07:01:27.861 187189 DEBUG nova.network.neutron [req-10cce38b-dca1-4823-9ef3-1ea078590fa5 req-6680e7d4-d4e0-401b-9258-ac89694cfe72 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Refreshing network info cache for port c0762766-7e10-403f-82f8-da85dbb8bc40 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:01:27 compute-0 nova_compute[187185]: 2025-11-29 07:01:27.929 187189 INFO nova.virt.libvirt.driver [None req-5631dd08-c7b1-4122-8c4b-0ffbe881c7f7 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Snapshot image upload complete
Nov 29 07:01:27 compute-0 nova_compute[187185]: 2025-11-29 07:01:27.930 187189 DEBUG nova.compute.manager [None req-5631dd08-c7b1-4122-8c4b-0ffbe881c7f7 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:01:28 compute-0 nova_compute[187185]: 2025-11-29 07:01:28.753 187189 INFO nova.compute.manager [None req-5631dd08-c7b1-4122-8c4b-0ffbe881c7f7 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Shelve offloading
Nov 29 07:01:28 compute-0 nova_compute[187185]: 2025-11-29 07:01:28.811 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:28 compute-0 nova_compute[187185]: 2025-11-29 07:01:28.868 187189 INFO nova.virt.libvirt.driver [-] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Instance destroyed successfully.
Nov 29 07:01:28 compute-0 nova_compute[187185]: 2025-11-29 07:01:28.868 187189 DEBUG nova.compute.manager [None req-5631dd08-c7b1-4122-8c4b-0ffbe881c7f7 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:01:28 compute-0 nova_compute[187185]: 2025-11-29 07:01:28.872 187189 DEBUG oslo_concurrency.lockutils [None req-5631dd08-c7b1-4122-8c4b-0ffbe881c7f7 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "refresh_cache-0c53e488-5068-4650-b5ab-66c486f03efa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:01:28 compute-0 nova_compute[187185]: 2025-11-29 07:01:28.872 187189 DEBUG oslo_concurrency.lockutils [None req-5631dd08-c7b1-4122-8c4b-0ffbe881c7f7 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquired lock "refresh_cache-0c53e488-5068-4650-b5ab-66c486f03efa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:01:28 compute-0 nova_compute[187185]: 2025-11-29 07:01:28.872 187189 DEBUG nova.network.neutron [None req-5631dd08-c7b1-4122-8c4b-0ffbe881c7f7 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:01:28 compute-0 nova_compute[187185]: 2025-11-29 07:01:28.999 187189 DEBUG nova.network.neutron [req-10cce38b-dca1-4823-9ef3-1ea078590fa5 req-6680e7d4-d4e0-401b-9258-ac89694cfe72 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:01:29 compute-0 nova_compute[187185]: 2025-11-29 07:01:29.318 187189 DEBUG nova.network.neutron [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Successfully updated port: 6549d57e-4605-44d6-b1cd-d909be8fd972 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 07:01:29 compute-0 nova_compute[187185]: 2025-11-29 07:01:29.336 187189 DEBUG oslo_concurrency.lockutils [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Acquiring lock "refresh_cache-5f771a98-65c5-4910-b222-13d29157fdf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:01:29 compute-0 nova_compute[187185]: 2025-11-29 07:01:29.544 187189 DEBUG nova.network.neutron [req-10cce38b-dca1-4823-9ef3-1ea078590fa5 req-6680e7d4-d4e0-401b-9258-ac89694cfe72 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:01:29 compute-0 nova_compute[187185]: 2025-11-29 07:01:29.569 187189 DEBUG oslo_concurrency.lockutils [req-10cce38b-dca1-4823-9ef3-1ea078590fa5 req-6680e7d4-d4e0-401b-9258-ac89694cfe72 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-5f771a98-65c5-4910-b222-13d29157fdf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:01:29 compute-0 nova_compute[187185]: 2025-11-29 07:01:29.571 187189 DEBUG oslo_concurrency.lockutils [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Acquired lock "refresh_cache-5f771a98-65c5-4910-b222-13d29157fdf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:01:29 compute-0 nova_compute[187185]: 2025-11-29 07:01:29.571 187189 DEBUG nova.network.neutron [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:01:29 compute-0 nova_compute[187185]: 2025-11-29 07:01:29.870 187189 DEBUG nova.network.neutron [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:01:30 compute-0 nova_compute[187185]: 2025-11-29 07:01:30.051 187189 DEBUG nova.compute.manager [req-352d80ec-664e-415f-a009-6b3ea70e6d6c req-aba06cc0-db14-44d6-830e-e885a09f51d6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Received event network-changed-6549d57e-4605-44d6-b1cd-d909be8fd972 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:01:30 compute-0 nova_compute[187185]: 2025-11-29 07:01:30.052 187189 DEBUG nova.compute.manager [req-352d80ec-664e-415f-a009-6b3ea70e6d6c req-aba06cc0-db14-44d6-830e-e885a09f51d6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Refreshing instance network info cache due to event network-changed-6549d57e-4605-44d6-b1cd-d909be8fd972. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:01:30 compute-0 nova_compute[187185]: 2025-11-29 07:01:30.052 187189 DEBUG oslo_concurrency.lockutils [req-352d80ec-664e-415f-a009-6b3ea70e6d6c req-aba06cc0-db14-44d6-830e-e885a09f51d6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-5f771a98-65c5-4910-b222-13d29157fdf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:01:30 compute-0 nova_compute[187185]: 2025-11-29 07:01:30.345 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:30 compute-0 nova_compute[187185]: 2025-11-29 07:01:30.556 187189 DEBUG nova.network.neutron [None req-5631dd08-c7b1-4122-8c4b-0ffbe881c7f7 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Updating instance_info_cache with network_info: [{"id": "f3c83dc3-5763-4272-83eb-749a084d4129", "address": "fa:16:3e:32:b2:fb", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3c83dc3-57", "ovs_interfaceid": "f3c83dc3-5763-4272-83eb-749a084d4129", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:01:30 compute-0 nova_compute[187185]: 2025-11-29 07:01:30.651 187189 DEBUG oslo_concurrency.lockutils [None req-5631dd08-c7b1-4122-8c4b-0ffbe881c7f7 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Releasing lock "refresh_cache-0c53e488-5068-4650-b5ab-66c486f03efa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.739 187189 DEBUG nova.network.neutron [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Updating instance_info_cache with network_info: [{"id": "c0762766-7e10-403f-82f8-da85dbb8bc40", "address": "fa:16:3e:42:4b:e3", "network": {"id": "498d9ea4-23cf-4d91-b24d-062d633f08bb", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-266622578", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.252", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0762766-7e", "ovs_interfaceid": "c0762766-7e10-403f-82f8-da85dbb8bc40", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6549d57e-4605-44d6-b1cd-d909be8fd972", "address": "fa:16:3e:05:05:1c", "network": {"id": "97e790ce-fab4-4f38-8ea5-27e8c5836b93", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2050249622", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.49", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6549d57e-46", "ovs_interfaceid": "6549d57e-4605-44d6-b1cd-d909be8fd972", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.781 187189 DEBUG oslo_concurrency.lockutils [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Releasing lock "refresh_cache-5f771a98-65c5-4910-b222-13d29157fdf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.782 187189 DEBUG nova.compute.manager [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Instance network_info: |[{"id": "c0762766-7e10-403f-82f8-da85dbb8bc40", "address": "fa:16:3e:42:4b:e3", "network": {"id": "498d9ea4-23cf-4d91-b24d-062d633f08bb", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-266622578", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.252", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0762766-7e", "ovs_interfaceid": "c0762766-7e10-403f-82f8-da85dbb8bc40", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6549d57e-4605-44d6-b1cd-d909be8fd972", "address": "fa:16:3e:05:05:1c", "network": {"id": "97e790ce-fab4-4f38-8ea5-27e8c5836b93", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2050249622", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.49", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6549d57e-46", "ovs_interfaceid": "6549d57e-4605-44d6-b1cd-d909be8fd972", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.783 187189 DEBUG oslo_concurrency.lockutils [req-352d80ec-664e-415f-a009-6b3ea70e6d6c req-aba06cc0-db14-44d6-830e-e885a09f51d6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-5f771a98-65c5-4910-b222-13d29157fdf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.783 187189 DEBUG nova.network.neutron [req-352d80ec-664e-415f-a009-6b3ea70e6d6c req-aba06cc0-db14-44d6-830e-e885a09f51d6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Refreshing network info cache for port 6549d57e-4605-44d6-b1cd-d909be8fd972 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.791 187189 DEBUG nova.virt.libvirt.driver [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Start _get_guest_xml network_info=[{"id": "c0762766-7e10-403f-82f8-da85dbb8bc40", "address": "fa:16:3e:42:4b:e3", "network": {"id": "498d9ea4-23cf-4d91-b24d-062d633f08bb", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-266622578", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.252", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0762766-7e", "ovs_interfaceid": "c0762766-7e10-403f-82f8-da85dbb8bc40", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6549d57e-4605-44d6-b1cd-d909be8fd972", "address": "fa:16:3e:05:05:1c", "network": {"id": "97e790ce-fab4-4f38-8ea5-27e8c5836b93", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2050249622", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.49", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6549d57e-46", "ovs_interfaceid": "6549d57e-4605-44d6-b1cd-d909be8fd972", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.800 187189 WARNING nova.virt.libvirt.driver [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.817 187189 DEBUG nova.virt.libvirt.host [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.819 187189 DEBUG nova.virt.libvirt.host [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.825 187189 DEBUG nova.virt.libvirt.host [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.826 187189 DEBUG nova.virt.libvirt.host [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.830 187189 DEBUG nova.virt.libvirt.driver [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.830 187189 DEBUG nova.virt.hardware [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.831 187189 DEBUG nova.virt.hardware [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.832 187189 DEBUG nova.virt.hardware [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.833 187189 DEBUG nova.virt.hardware [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.833 187189 DEBUG nova.virt.hardware [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.834 187189 DEBUG nova.virt.hardware [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.834 187189 DEBUG nova.virt.hardware [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.835 187189 DEBUG nova.virt.hardware [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.835 187189 DEBUG nova.virt.hardware [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.835 187189 DEBUG nova.virt.hardware [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.836 187189 DEBUG nova.virt.hardware [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.845 187189 DEBUG nova.virt.libvirt.vif [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:01:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-2071189335',display_name='tempest-ServersTestMultiNic-server-2071189335',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-2071189335',id=57,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='de73e0af4d994da4a30deaebd1a7e86b',ramdisk_id='',reservation_id='r-9rmm5zry',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1778452684',owner_user_name='tempest-ServersTestMultiNic-1778452684-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:01:18Z,user_data=None,user_id='11e9982557a44d40b2ebaf04bf99c371',uuid=5f771a98-65c5-4910-b222-13d29157fdf5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c0762766-7e10-403f-82f8-da85dbb8bc40", "address": "fa:16:3e:42:4b:e3", "network": {"id": "498d9ea4-23cf-4d91-b24d-062d633f08bb", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-266622578", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.252", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0762766-7e", "ovs_interfaceid": "c0762766-7e10-403f-82f8-da85dbb8bc40", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.846 187189 DEBUG nova.network.os_vif_util [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Converting VIF {"id": "c0762766-7e10-403f-82f8-da85dbb8bc40", "address": "fa:16:3e:42:4b:e3", "network": {"id": "498d9ea4-23cf-4d91-b24d-062d633f08bb", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-266622578", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.252", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0762766-7e", "ovs_interfaceid": "c0762766-7e10-403f-82f8-da85dbb8bc40", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.847 187189 DEBUG nova.network.os_vif_util [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:42:4b:e3,bridge_name='br-int',has_traffic_filtering=True,id=c0762766-7e10-403f-82f8-da85dbb8bc40,network=Network(498d9ea4-23cf-4d91-b24d-062d633f08bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0762766-7e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.849 187189 DEBUG nova.virt.libvirt.vif [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:01:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-2071189335',display_name='tempest-ServersTestMultiNic-server-2071189335',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-2071189335',id=57,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='de73e0af4d994da4a30deaebd1a7e86b',ramdisk_id='',reservation_id='r-9rmm5zry',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1778452684',owner_user_name='tempest-ServersTestMultiNic-1778452684
-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:01:18Z,user_data=None,user_id='11e9982557a44d40b2ebaf04bf99c371',uuid=5f771a98-65c5-4910-b222-13d29157fdf5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6549d57e-4605-44d6-b1cd-d909be8fd972", "address": "fa:16:3e:05:05:1c", "network": {"id": "97e790ce-fab4-4f38-8ea5-27e8c5836b93", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2050249622", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.49", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6549d57e-46", "ovs_interfaceid": "6549d57e-4605-44d6-b1cd-d909be8fd972", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.849 187189 DEBUG nova.network.os_vif_util [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Converting VIF {"id": "6549d57e-4605-44d6-b1cd-d909be8fd972", "address": "fa:16:3e:05:05:1c", "network": {"id": "97e790ce-fab4-4f38-8ea5-27e8c5836b93", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2050249622", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.49", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6549d57e-46", "ovs_interfaceid": "6549d57e-4605-44d6-b1cd-d909be8fd972", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.852 187189 DEBUG nova.network.os_vif_util [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:05:05:1c,bridge_name='br-int',has_traffic_filtering=True,id=6549d57e-4605-44d6-b1cd-d909be8fd972,network=Network(97e790ce-fab4-4f38-8ea5-27e8c5836b93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6549d57e-46') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.854 187189 DEBUG nova.objects.instance [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lazy-loading 'pci_devices' on Instance uuid 5f771a98-65c5-4910-b222-13d29157fdf5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.898 187189 DEBUG nova.virt.libvirt.driver [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:01:31 compute-0 nova_compute[187185]:   <uuid>5f771a98-65c5-4910-b222-13d29157fdf5</uuid>
Nov 29 07:01:31 compute-0 nova_compute[187185]:   <name>instance-00000039</name>
Nov 29 07:01:31 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 07:01:31 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:01:31 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:01:31 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:       <nova:name>tempest-ServersTestMultiNic-server-2071189335</nova:name>
Nov 29 07:01:31 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:01:31</nova:creationTime>
Nov 29 07:01:31 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 07:01:31 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 07:01:31 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:01:31 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:01:31 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:01:31 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:01:31 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:01:31 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:01:31 compute-0 nova_compute[187185]:         <nova:user uuid="11e9982557a44d40b2ebaf04bf99c371">tempest-ServersTestMultiNic-1778452684-project-member</nova:user>
Nov 29 07:01:31 compute-0 nova_compute[187185]:         <nova:project uuid="de73e0af4d994da4a30deaebd1a7e86b">tempest-ServersTestMultiNic-1778452684</nova:project>
Nov 29 07:01:31 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:01:31 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 07:01:31 compute-0 nova_compute[187185]:         <nova:port uuid="c0762766-7e10-403f-82f8-da85dbb8bc40">
Nov 29 07:01:31 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.252" ipVersion="4"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:01:31 compute-0 nova_compute[187185]:         <nova:port uuid="6549d57e-4605-44d6-b1cd-d909be8fd972">
Nov 29 07:01:31 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.1.49" ipVersion="4"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:01:31 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:01:31 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:01:31 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <system>
Nov 29 07:01:31 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:01:31 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:01:31 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:01:31 compute-0 nova_compute[187185]:       <entry name="serial">5f771a98-65c5-4910-b222-13d29157fdf5</entry>
Nov 29 07:01:31 compute-0 nova_compute[187185]:       <entry name="uuid">5f771a98-65c5-4910-b222-13d29157fdf5</entry>
Nov 29 07:01:31 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     </system>
Nov 29 07:01:31 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:01:31 compute-0 nova_compute[187185]:   <os>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:   </os>
Nov 29 07:01:31 compute-0 nova_compute[187185]:   <features>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:   </features>
Nov 29 07:01:31 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:01:31 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:01:31 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:01:31 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/5f771a98-65c5-4910-b222-13d29157fdf5/disk"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:01:31 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/5f771a98-65c5-4910-b222-13d29157fdf5/disk.config"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:01:31 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:42:4b:e3"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:       <target dev="tapc0762766-7e"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:01:31 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:05:05:1c"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:       <target dev="tap6549d57e-46"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:01:31 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/5f771a98-65c5-4910-b222-13d29157fdf5/console.log" append="off"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <video>
Nov 29 07:01:31 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     </video>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:01:31 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:01:31 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:01:31 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:01:31 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:01:31 compute-0 nova_compute[187185]: </domain>
Nov 29 07:01:31 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.900 187189 DEBUG nova.compute.manager [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Preparing to wait for external event network-vif-plugged-c0762766-7e10-403f-82f8-da85dbb8bc40 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.900 187189 DEBUG oslo_concurrency.lockutils [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Acquiring lock "5f771a98-65c5-4910-b222-13d29157fdf5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.900 187189 DEBUG oslo_concurrency.lockutils [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lock "5f771a98-65c5-4910-b222-13d29157fdf5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.901 187189 DEBUG oslo_concurrency.lockutils [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lock "5f771a98-65c5-4910-b222-13d29157fdf5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.901 187189 DEBUG nova.compute.manager [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Preparing to wait for external event network-vif-plugged-6549d57e-4605-44d6-b1cd-d909be8fd972 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.901 187189 DEBUG oslo_concurrency.lockutils [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Acquiring lock "5f771a98-65c5-4910-b222-13d29157fdf5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.901 187189 DEBUG oslo_concurrency.lockutils [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lock "5f771a98-65c5-4910-b222-13d29157fdf5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.902 187189 DEBUG oslo_concurrency.lockutils [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lock "5f771a98-65c5-4910-b222-13d29157fdf5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.903 187189 DEBUG nova.virt.libvirt.vif [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:01:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-2071189335',display_name='tempest-ServersTestMultiNic-server-2071189335',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-2071189335',id=57,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='de73e0af4d994da4a30deaebd1a7e86b',ramdisk_id='',reservation_id='r-9rmm5zry',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1778452684',owner_user_name='tempest-ServersTestMultiNic-
1778452684-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:01:18Z,user_data=None,user_id='11e9982557a44d40b2ebaf04bf99c371',uuid=5f771a98-65c5-4910-b222-13d29157fdf5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c0762766-7e10-403f-82f8-da85dbb8bc40", "address": "fa:16:3e:42:4b:e3", "network": {"id": "498d9ea4-23cf-4d91-b24d-062d633f08bb", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-266622578", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.252", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0762766-7e", "ovs_interfaceid": "c0762766-7e10-403f-82f8-da85dbb8bc40", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.903 187189 DEBUG nova.network.os_vif_util [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Converting VIF {"id": "c0762766-7e10-403f-82f8-da85dbb8bc40", "address": "fa:16:3e:42:4b:e3", "network": {"id": "498d9ea4-23cf-4d91-b24d-062d633f08bb", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-266622578", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.252", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0762766-7e", "ovs_interfaceid": "c0762766-7e10-403f-82f8-da85dbb8bc40", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.904 187189 DEBUG nova.network.os_vif_util [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:42:4b:e3,bridge_name='br-int',has_traffic_filtering=True,id=c0762766-7e10-403f-82f8-da85dbb8bc40,network=Network(498d9ea4-23cf-4d91-b24d-062d633f08bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0762766-7e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.904 187189 DEBUG os_vif [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:42:4b:e3,bridge_name='br-int',has_traffic_filtering=True,id=c0762766-7e10-403f-82f8-da85dbb8bc40,network=Network(498d9ea4-23cf-4d91-b24d-062d633f08bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0762766-7e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.905 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.906 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.906 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.910 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.910 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc0762766-7e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.911 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc0762766-7e, col_values=(('external_ids', {'iface-id': 'c0762766-7e10-403f-82f8-da85dbb8bc40', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:42:4b:e3', 'vm-uuid': '5f771a98-65c5-4910-b222-13d29157fdf5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.913 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:31 compute-0 NetworkManager[55227]: <info>  [1764399691.9145] manager: (tapc0762766-7e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.915 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.926 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.928 187189 INFO os_vif [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:42:4b:e3,bridge_name='br-int',has_traffic_filtering=True,id=c0762766-7e10-403f-82f8-da85dbb8bc40,network=Network(498d9ea4-23cf-4d91-b24d-062d633f08bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0762766-7e')
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.929 187189 DEBUG nova.virt.libvirt.vif [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:01:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-2071189335',display_name='tempest-ServersTestMultiNic-server-2071189335',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-2071189335',id=57,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='de73e0af4d994da4a30deaebd1a7e86b',ramdisk_id='',reservation_id='r-9rmm5zry',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-1778452684',owner_user_name='tempest-ServersTestMultiNic-1778452684-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:01:18Z,user_data=None,user_id='11e9982557a44d40b2ebaf04bf99c371',uuid=5f771a98-65c5-4910-b222-13d29157fdf5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6549d57e-4605-44d6-b1cd-d909be8fd972", "address": "fa:16:3e:05:05:1c", "network": {"id": "97e790ce-fab4-4f38-8ea5-27e8c5836b93", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2050249622", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.49", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6549d57e-46", "ovs_interfaceid": "6549d57e-4605-44d6-b1cd-d909be8fd972", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.930 187189 DEBUG nova.network.os_vif_util [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Converting VIF {"id": "6549d57e-4605-44d6-b1cd-d909be8fd972", "address": "fa:16:3e:05:05:1c", "network": {"id": "97e790ce-fab4-4f38-8ea5-27e8c5836b93", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2050249622", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.49", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6549d57e-46", "ovs_interfaceid": "6549d57e-4605-44d6-b1cd-d909be8fd972", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.931 187189 DEBUG nova.network.os_vif_util [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:05:05:1c,bridge_name='br-int',has_traffic_filtering=True,id=6549d57e-4605-44d6-b1cd-d909be8fd972,network=Network(97e790ce-fab4-4f38-8ea5-27e8c5836b93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6549d57e-46') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.931 187189 DEBUG os_vif [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:05:05:1c,bridge_name='br-int',has_traffic_filtering=True,id=6549d57e-4605-44d6-b1cd-d909be8fd972,network=Network(97e790ce-fab4-4f38-8ea5-27e8c5836b93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6549d57e-46') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.932 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.932 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.933 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.936 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.936 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6549d57e-46, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.937 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6549d57e-46, col_values=(('external_ids', {'iface-id': '6549d57e-4605-44d6-b1cd-d909be8fd972', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:05:05:1c', 'vm-uuid': '5f771a98-65c5-4910-b222-13d29157fdf5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.938 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:31 compute-0 NetworkManager[55227]: <info>  [1764399691.9396] manager: (tap6549d57e-46): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/68)
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.940 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.951 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:31 compute-0 nova_compute[187185]: 2025-11-29 07:01:31.953 187189 INFO os_vif [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:05:05:1c,bridge_name='br-int',has_traffic_filtering=True,id=6549d57e-4605-44d6-b1cd-d909be8fd972,network=Network(97e790ce-fab4-4f38-8ea5-27e8c5836b93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6549d57e-46')
Nov 29 07:01:32 compute-0 nova_compute[187185]: 2025-11-29 07:01:32.430 187189 DEBUG nova.virt.libvirt.driver [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:01:32 compute-0 nova_compute[187185]: 2025-11-29 07:01:32.430 187189 DEBUG nova.virt.libvirt.driver [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:01:32 compute-0 nova_compute[187185]: 2025-11-29 07:01:32.431 187189 DEBUG nova.virt.libvirt.driver [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] No VIF found with MAC fa:16:3e:42:4b:e3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:01:32 compute-0 nova_compute[187185]: 2025-11-29 07:01:32.431 187189 DEBUG nova.virt.libvirt.driver [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] No VIF found with MAC fa:16:3e:05:05:1c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:01:32 compute-0 nova_compute[187185]: 2025-11-29 07:01:32.432 187189 INFO nova.virt.libvirt.driver [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Using config drive
Nov 29 07:01:33 compute-0 nova_compute[187185]: 2025-11-29 07:01:33.815 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:33 compute-0 nova_compute[187185]: 2025-11-29 07:01:33.846 187189 INFO nova.virt.libvirt.driver [-] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Instance destroyed successfully.
Nov 29 07:01:33 compute-0 nova_compute[187185]: 2025-11-29 07:01:33.847 187189 DEBUG nova.objects.instance [None req-5631dd08-c7b1-4122-8c4b-0ffbe881c7f7 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lazy-loading 'resources' on Instance uuid 0c53e488-5068-4650-b5ab-66c486f03efa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:01:33 compute-0 nova_compute[187185]: 2025-11-29 07:01:33.871 187189 DEBUG nova.virt.libvirt.vif [None req-5631dd08-c7b1-4122-8c4b-0ffbe881c7f7 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:00:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-13163110',display_name='tempest-DeleteServersTestJSON-server-13163110',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-13163110',id=52,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:00:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='98df116965b74e4a9985049062e65162',ramdisk_id='',reservation_id='r-z3dqqzux',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-DeleteServersTestJSON-1973671383',owner_user_name='tempest-DeleteServersTestJSON-1973671383-project-member',shelved_at='2025-11-29T07:01:27.930016',shelved_host='compute-0.ctlplane.example.com',shelved_image_id='31b45d29-ccfd-4cad-83b0-da0396a9b13c'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:01:23Z,user_data=None,user_id='4ecd161098b5422084003b39f0504a8f',uuid=0c53e488-5068-4650-b5ab-66c486f03efa,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "f3c83dc3-5763-4272-83eb-749a084d4129", "address": "fa:16:3e:32:b2:fb", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3c83dc3-57", "ovs_interfaceid": "f3c83dc3-5763-4272-83eb-749a084d4129", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:01:33 compute-0 nova_compute[187185]: 2025-11-29 07:01:33.872 187189 DEBUG nova.network.os_vif_util [None req-5631dd08-c7b1-4122-8c4b-0ffbe881c7f7 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Converting VIF {"id": "f3c83dc3-5763-4272-83eb-749a084d4129", "address": "fa:16:3e:32:b2:fb", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3c83dc3-57", "ovs_interfaceid": "f3c83dc3-5763-4272-83eb-749a084d4129", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:01:33 compute-0 nova_compute[187185]: 2025-11-29 07:01:33.874 187189 DEBUG nova.network.os_vif_util [None req-5631dd08-c7b1-4122-8c4b-0ffbe881c7f7 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:32:b2:fb,bridge_name='br-int',has_traffic_filtering=True,id=f3c83dc3-5763-4272-83eb-749a084d4129,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3c83dc3-57') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:01:33 compute-0 nova_compute[187185]: 2025-11-29 07:01:33.875 187189 DEBUG os_vif [None req-5631dd08-c7b1-4122-8c4b-0ffbe881c7f7 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:32:b2:fb,bridge_name='br-int',has_traffic_filtering=True,id=f3c83dc3-5763-4272-83eb-749a084d4129,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3c83dc3-57') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:01:33 compute-0 nova_compute[187185]: 2025-11-29 07:01:33.883 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:33 compute-0 nova_compute[187185]: 2025-11-29 07:01:33.884 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf3c83dc3-57, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:01:33 compute-0 nova_compute[187185]: 2025-11-29 07:01:33.887 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:33 compute-0 nova_compute[187185]: 2025-11-29 07:01:33.890 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:01:33 compute-0 nova_compute[187185]: 2025-11-29 07:01:33.895 187189 INFO nova.virt.libvirt.driver [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Creating config drive at /var/lib/nova/instances/5f771a98-65c5-4910-b222-13d29157fdf5/disk.config
Nov 29 07:01:33 compute-0 nova_compute[187185]: 2025-11-29 07:01:33.902 187189 DEBUG oslo_concurrency.processutils [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5f771a98-65c5-4910-b222-13d29157fdf5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0aiafzb9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:01:33 compute-0 nova_compute[187185]: 2025-11-29 07:01:33.934 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:33 compute-0 nova_compute[187185]: 2025-11-29 07:01:33.939 187189 INFO os_vif [None req-5631dd08-c7b1-4122-8c4b-0ffbe881c7f7 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:32:b2:fb,bridge_name='br-int',has_traffic_filtering=True,id=f3c83dc3-5763-4272-83eb-749a084d4129,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3c83dc3-57')
Nov 29 07:01:33 compute-0 nova_compute[187185]: 2025-11-29 07:01:33.940 187189 INFO nova.virt.libvirt.driver [None req-5631dd08-c7b1-4122-8c4b-0ffbe881c7f7 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Deleting instance files /var/lib/nova/instances/0c53e488-5068-4650-b5ab-66c486f03efa_del
Nov 29 07:01:33 compute-0 nova_compute[187185]: 2025-11-29 07:01:33.945 187189 INFO nova.virt.libvirt.driver [None req-5631dd08-c7b1-4122-8c4b-0ffbe881c7f7 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Deletion of /var/lib/nova/instances/0c53e488-5068-4650-b5ab-66c486f03efa_del complete
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.046 187189 DEBUG oslo_concurrency.processutils [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5f771a98-65c5-4910-b222-13d29157fdf5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0aiafzb9" returned: 0 in 0.144s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:01:34 compute-0 kernel: tapc0762766-7e: entered promiscuous mode
Nov 29 07:01:34 compute-0 NetworkManager[55227]: <info>  [1764399694.1482] manager: (tapc0762766-7e): new Tun device (/org/freedesktop/NetworkManager/Devices/69)
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.146 187189 DEBUG nova.compute.manager [req-a9b47b80-b832-4404-9cc9-e6bdf3756f00 req-771c242b-f3a8-479c-abb2-612bb8857304 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Received event network-changed-f3c83dc3-5763-4272-83eb-749a084d4129 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.147 187189 DEBUG nova.compute.manager [req-a9b47b80-b832-4404-9cc9-e6bdf3756f00 req-771c242b-f3a8-479c-abb2-612bb8857304 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Refreshing instance network info cache due to event network-changed-f3c83dc3-5763-4272-83eb-749a084d4129. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.148 187189 DEBUG oslo_concurrency.lockutils [req-a9b47b80-b832-4404-9cc9-e6bdf3756f00 req-771c242b-f3a8-479c-abb2-612bb8857304 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-0c53e488-5068-4650-b5ab-66c486f03efa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.148 187189 DEBUG oslo_concurrency.lockutils [req-a9b47b80-b832-4404-9cc9-e6bdf3756f00 req-771c242b-f3a8-479c-abb2-612bb8857304 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-0c53e488-5068-4650-b5ab-66c486f03efa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.149 187189 DEBUG nova.network.neutron [req-a9b47b80-b832-4404-9cc9-e6bdf3756f00 req-771c242b-f3a8-479c-abb2-612bb8857304 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Refreshing network info cache for port f3c83dc3-5763-4272-83eb-749a084d4129 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:01:34 compute-0 ovn_controller[95281]: 2025-11-29T07:01:34Z|00117|binding|INFO|Claiming lport c0762766-7e10-403f-82f8-da85dbb8bc40 for this chassis.
Nov 29 07:01:34 compute-0 ovn_controller[95281]: 2025-11-29T07:01:34Z|00118|binding|INFO|c0762766-7e10-403f-82f8-da85dbb8bc40: Claiming fa:16:3e:42:4b:e3 10.100.0.252
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.155 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:34 compute-0 NetworkManager[55227]: <info>  [1764399694.1870] manager: (tap6549d57e-46): new Tun device (/org/freedesktop/NetworkManager/Devices/70)
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:34.188 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:42:4b:e3 10.100.0.252'], port_security=['fa:16:3e:42:4b:e3 10.100.0.252'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.252/24', 'neutron:device_id': '5f771a98-65c5-4910-b222-13d29157fdf5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-498d9ea4-23cf-4d91-b24d-062d633f08bb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'de73e0af4d994da4a30deaebd1a7e86b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '16743b8e-22ae-4a05-a3ff-6798c8e786c6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c48bf91d-6152-4f5c-bcdc-06e8d04ea73a, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=c0762766-7e10-403f-82f8-da85dbb8bc40) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:34.191 104254 INFO neutron.agent.ovn.metadata.agent [-] Port c0762766-7e10-403f-82f8-da85dbb8bc40 in datapath 498d9ea4-23cf-4d91-b24d-062d633f08bb bound to our chassis
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:34.195 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 498d9ea4-23cf-4d91-b24d-062d633f08bb
Nov 29 07:01:34 compute-0 kernel: tap6549d57e-46: entered promiscuous mode
Nov 29 07:01:34 compute-0 systemd-udevd[221004]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:01:34 compute-0 systemd-udevd[221005]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:01:34 compute-0 ovn_controller[95281]: 2025-11-29T07:01:34Z|00119|binding|INFO|Setting lport c0762766-7e10-403f-82f8-da85dbb8bc40 ovn-installed in OVS
Nov 29 07:01:34 compute-0 ovn_controller[95281]: 2025-11-29T07:01:34Z|00120|binding|INFO|Setting lport c0762766-7e10-403f-82f8-da85dbb8bc40 up in Southbound
Nov 29 07:01:34 compute-0 NetworkManager[55227]: <info>  [1764399694.2306] device (tap6549d57e-46): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:01:34 compute-0 NetworkManager[55227]: <info>  [1764399694.2315] device (tapc0762766-7e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.235 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:34 compute-0 NetworkManager[55227]: <info>  [1764399694.2362] device (tap6549d57e-46): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:01:34 compute-0 NetworkManager[55227]: <info>  [1764399694.2369] device (tapc0762766-7e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:01:34 compute-0 ovn_controller[95281]: 2025-11-29T07:01:34Z|00121|binding|INFO|Claiming lport 6549d57e-4605-44d6-b1cd-d909be8fd972 for this chassis.
Nov 29 07:01:34 compute-0 ovn_controller[95281]: 2025-11-29T07:01:34Z|00122|binding|INFO|6549d57e-4605-44d6-b1cd-d909be8fd972: Claiming fa:16:3e:05:05:1c 10.100.1.49
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:34.244 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[f2456922-cece-4f2e-a6ef-dd8fac7bfb58]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:34.245 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap498d9ea4-21 in ovnmeta-498d9ea4-23cf-4d91-b24d-062d633f08bb namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:34.250 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:05:05:1c 10.100.1.49'], port_security=['fa:16:3e:05:05:1c 10.100.1.49'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.1.49/24', 'neutron:device_id': '5f771a98-65c5-4910-b222-13d29157fdf5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-97e790ce-fab4-4f38-8ea5-27e8c5836b93', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'de73e0af4d994da4a30deaebd1a7e86b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '16743b8e-22ae-4a05-a3ff-6798c8e786c6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=63e87163-aedc-4ebc-8733-979e39666f7e, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=6549d57e-4605-44d6-b1cd-d909be8fd972) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:34.252 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap498d9ea4-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:34.253 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[3f2da595-7944-4cb1-8d60-e30075f2d343]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:34.254 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[f810b2e3-83f6-4b27-bb67-0871196bc8fa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:34 compute-0 ovn_controller[95281]: 2025-11-29T07:01:34Z|00123|binding|INFO|Setting lport 6549d57e-4605-44d6-b1cd-d909be8fd972 ovn-installed in OVS
Nov 29 07:01:34 compute-0 ovn_controller[95281]: 2025-11-29T07:01:34Z|00124|binding|INFO|Setting lport 6549d57e-4605-44d6-b1cd-d909be8fd972 up in Southbound
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.272 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:34.272 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[49af0792-7a0f-4bd2-9230-987148d4c15c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:34 compute-0 systemd-machined[153486]: New machine qemu-19-instance-00000039.
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.282 187189 INFO nova.scheduler.client.report [None req-5631dd08-c7b1-4122-8c4b-0ffbe881c7f7 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Deleted allocations for instance 0c53e488-5068-4650-b5ab-66c486f03efa
Nov 29 07:01:34 compute-0 systemd[1]: Started Virtual Machine qemu-19-instance-00000039.
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:34.302 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[ca39e481-1803-4935-a33d-dfbb424f3016]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:34.347 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[a67d57f0-fad3-4bc8-a0f9-d745ea2e19a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:34 compute-0 NetworkManager[55227]: <info>  [1764399694.3564] manager: (tap498d9ea4-20): new Veth device (/org/freedesktop/NetworkManager/Devices/71)
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:34.355 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[2231ec43-6b56-48f4-b6af-7967d97c34e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:34.393 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[dedd1804-f9f2-4306-b146-5f77587222c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:34.397 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[0b3c02d4-f8b8-4358-90d0-ba3c63bdfe8d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:34 compute-0 NetworkManager[55227]: <info>  [1764399694.4288] device (tap498d9ea4-20): carrier: link connected
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:34.441 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[3dbc51d4-d321-4fb3-acd9-f0fe9ead3bb2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:34.462 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[f7629050-054d-48ab-90e1-a9443b5b1e60]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap498d9ea4-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d9:f0:ad'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 514209, 'reachable_time': 32122, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221041, 'error': None, 'target': 'ovnmeta-498d9ea4-23cf-4d91-b24d-062d633f08bb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:34.483 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[67a764df-b217-457b-b842-a4b589592e4a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed9:f0ad'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 514209, 'tstamp': 514209}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221042, 'error': None, 'target': 'ovnmeta-498d9ea4-23cf-4d91-b24d-062d633f08bb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:34.503 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[4778bd05-9eb4-47fd-a04a-a50e3ed5fa7c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap498d9ea4-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d9:f0:ad'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 514209, 'reachable_time': 32122, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221043, 'error': None, 'target': 'ovnmeta-498d9ea4-23cf-4d91-b24d-062d633f08bb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.537 187189 DEBUG oslo_concurrency.lockutils [None req-5631dd08-c7b1-4122-8c4b-0ffbe881c7f7 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.538 187189 DEBUG oslo_concurrency.lockutils [None req-5631dd08-c7b1-4122-8c4b-0ffbe881c7f7 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:34.550 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[aba0fc57-32ae-4871-a4f9-89865805ed60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.572 187189 DEBUG nova.compute.manager [req-fd0044ef-af8a-45ab-a6b2-c3ce7556e29b req-affc59e4-fd71-41e3-895b-596d1575ab84 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Received event network-vif-plugged-6549d57e-4605-44d6-b1cd-d909be8fd972 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.573 187189 DEBUG oslo_concurrency.lockutils [req-fd0044ef-af8a-45ab-a6b2-c3ce7556e29b req-affc59e4-fd71-41e3-895b-596d1575ab84 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5f771a98-65c5-4910-b222-13d29157fdf5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.573 187189 DEBUG oslo_concurrency.lockutils [req-fd0044ef-af8a-45ab-a6b2-c3ce7556e29b req-affc59e4-fd71-41e3-895b-596d1575ab84 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f771a98-65c5-4910-b222-13d29157fdf5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.573 187189 DEBUG oslo_concurrency.lockutils [req-fd0044ef-af8a-45ab-a6b2-c3ce7556e29b req-affc59e4-fd71-41e3-895b-596d1575ab84 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f771a98-65c5-4910-b222-13d29157fdf5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.574 187189 DEBUG nova.compute.manager [req-fd0044ef-af8a-45ab-a6b2-c3ce7556e29b req-affc59e4-fd71-41e3-895b-596d1575ab84 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Processing event network-vif-plugged-6549d57e-4605-44d6-b1cd-d909be8fd972 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.589 187189 DEBUG nova.compute.provider_tree [None req-5631dd08-c7b1-4122-8c4b-0ffbe881c7f7 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.606 187189 DEBUG nova.scheduler.client.report [None req-5631dd08-c7b1-4122-8c4b-0ffbe881c7f7 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.633 187189 DEBUG oslo_concurrency.lockutils [None req-5631dd08-c7b1-4122-8c4b-0ffbe881c7f7 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.095s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:34.657 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d1da3c65-157c-4ce4-866e-9542fb46318a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:34.661 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap498d9ea4-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:34.661 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:34.662 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap498d9ea4-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.665 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:34 compute-0 kernel: tap498d9ea4-20: entered promiscuous mode
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.667 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:34.669 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap498d9ea4-20, col_values=(('external_ids', {'iface-id': '603241bf-6708-4a14-8b1f-1a9d949abf12'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:01:34 compute-0 ovn_controller[95281]: 2025-11-29T07:01:34Z|00125|binding|INFO|Releasing lport 603241bf-6708-4a14-8b1f-1a9d949abf12 from this chassis (sb_readonly=0)
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.672 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:34 compute-0 NetworkManager[55227]: <info>  [1764399694.6736] manager: (tap498d9ea4-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/72)
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:34.675 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/498d9ea4-23cf-4d91-b24d-062d633f08bb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/498d9ea4-23cf-4d91-b24d-062d633f08bb.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:34.677 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[adb3c058-51de-4b87-8b29-61460b116dac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:34.679 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-498d9ea4-23cf-4d91-b24d-062d633f08bb
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/498d9ea4-23cf-4d91-b24d-062d633f08bb.pid.haproxy
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID 498d9ea4-23cf-4d91-b24d-062d633f08bb
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:01:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:34.680 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-498d9ea4-23cf-4d91-b24d-062d633f08bb', 'env', 'PROCESS_TAG=haproxy-498d9ea4-23cf-4d91-b24d-062d633f08bb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/498d9ea4-23cf-4d91-b24d-062d633f08bb.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.684 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.698 187189 DEBUG oslo_concurrency.lockutils [None req-5631dd08-c7b1-4122-8c4b-0ffbe881c7f7 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "0c53e488-5068-4650-b5ab-66c486f03efa" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 49.504s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.732 187189 DEBUG nova.compute.manager [req-701d2023-ae37-4e3f-b92e-0ee40da3a296 req-aeb2c4d9-6836-45a6-9853-7d57687469df 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Received event network-vif-plugged-c0762766-7e10-403f-82f8-da85dbb8bc40 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.732 187189 DEBUG oslo_concurrency.lockutils [req-701d2023-ae37-4e3f-b92e-0ee40da3a296 req-aeb2c4d9-6836-45a6-9853-7d57687469df 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5f771a98-65c5-4910-b222-13d29157fdf5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.732 187189 DEBUG oslo_concurrency.lockutils [req-701d2023-ae37-4e3f-b92e-0ee40da3a296 req-aeb2c4d9-6836-45a6-9853-7d57687469df 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f771a98-65c5-4910-b222-13d29157fdf5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.733 187189 DEBUG oslo_concurrency.lockutils [req-701d2023-ae37-4e3f-b92e-0ee40da3a296 req-aeb2c4d9-6836-45a6-9853-7d57687469df 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f771a98-65c5-4910-b222-13d29157fdf5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.733 187189 DEBUG nova.compute.manager [req-701d2023-ae37-4e3f-b92e-0ee40da3a296 req-aeb2c4d9-6836-45a6-9853-7d57687469df 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Processing event network-vif-plugged-c0762766-7e10-403f-82f8-da85dbb8bc40 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.787 187189 DEBUG nova.compute.manager [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Instance event wait completed in 0 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.788 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399694.7876112, 5f771a98-65c5-4910-b222-13d29157fdf5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.788 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] VM Started (Lifecycle Event)
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.792 187189 DEBUG nova.virt.libvirt.driver [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.801 187189 INFO nova.virt.libvirt.driver [-] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Instance spawned successfully.
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.802 187189 DEBUG nova.virt.libvirt.driver [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.806 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.809 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.827 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.828 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399694.7880197, 5f771a98-65c5-4910-b222-13d29157fdf5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.828 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] VM Paused (Lifecycle Event)
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.831 187189 DEBUG nova.virt.libvirt.driver [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.831 187189 DEBUG nova.virt.libvirt.driver [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.832 187189 DEBUG nova.virt.libvirt.driver [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.832 187189 DEBUG nova.virt.libvirt.driver [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.833 187189 DEBUG nova.virt.libvirt.driver [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.833 187189 DEBUG nova.virt.libvirt.driver [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.862 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.866 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399694.7933073, 5f771a98-65c5-4910-b222-13d29157fdf5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.867 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] VM Resumed (Lifecycle Event)
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.902 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.908 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.935 187189 INFO nova.compute.manager [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Took 15.97 seconds to spawn the instance on the hypervisor.
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.936 187189 DEBUG nova.compute.manager [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:01:34 compute-0 nova_compute[187185]: 2025-11-29 07:01:34.942 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:01:35 compute-0 nova_compute[187185]: 2025-11-29 07:01:35.086 187189 INFO nova.compute.manager [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Took 16.76 seconds to build instance.
Nov 29 07:01:35 compute-0 nova_compute[187185]: 2025-11-29 07:01:35.106 187189 DEBUG oslo_concurrency.lockutils [None req-c0216e79-1919-4dc9-8769-e39fb2d96c8d 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lock "5f771a98-65c5-4910-b222-13d29157fdf5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.870s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:01:35 compute-0 podman[221083]: 2025-11-29 07:01:35.114538864 +0000 UTC m=+0.026720396 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:01:35 compute-0 nova_compute[187185]: 2025-11-29 07:01:35.279 187189 DEBUG nova.network.neutron [req-352d80ec-664e-415f-a009-6b3ea70e6d6c req-aba06cc0-db14-44d6-830e-e885a09f51d6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Updated VIF entry in instance network info cache for port 6549d57e-4605-44d6-b1cd-d909be8fd972. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:01:35 compute-0 nova_compute[187185]: 2025-11-29 07:01:35.280 187189 DEBUG nova.network.neutron [req-352d80ec-664e-415f-a009-6b3ea70e6d6c req-aba06cc0-db14-44d6-830e-e885a09f51d6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Updating instance_info_cache with network_info: [{"id": "c0762766-7e10-403f-82f8-da85dbb8bc40", "address": "fa:16:3e:42:4b:e3", "network": {"id": "498d9ea4-23cf-4d91-b24d-062d633f08bb", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-266622578", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.252", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0762766-7e", "ovs_interfaceid": "c0762766-7e10-403f-82f8-da85dbb8bc40", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6549d57e-4605-44d6-b1cd-d909be8fd972", "address": "fa:16:3e:05:05:1c", "network": {"id": "97e790ce-fab4-4f38-8ea5-27e8c5836b93", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2050249622", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.49", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6549d57e-46", "ovs_interfaceid": "6549d57e-4605-44d6-b1cd-d909be8fd972", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:01:35 compute-0 nova_compute[187185]: 2025-11-29 07:01:35.297 187189 DEBUG oslo_concurrency.lockutils [req-352d80ec-664e-415f-a009-6b3ea70e6d6c req-aba06cc0-db14-44d6-830e-e885a09f51d6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-5f771a98-65c5-4910-b222-13d29157fdf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:01:35 compute-0 nova_compute[187185]: 2025-11-29 07:01:35.666 187189 DEBUG nova.network.neutron [req-a9b47b80-b832-4404-9cc9-e6bdf3756f00 req-771c242b-f3a8-479c-abb2-612bb8857304 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Updated VIF entry in instance network info cache for port f3c83dc3-5763-4272-83eb-749a084d4129. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:01:35 compute-0 nova_compute[187185]: 2025-11-29 07:01:35.666 187189 DEBUG nova.network.neutron [req-a9b47b80-b832-4404-9cc9-e6bdf3756f00 req-771c242b-f3a8-479c-abb2-612bb8857304 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0c53e488-5068-4650-b5ab-66c486f03efa] Updating instance_info_cache with network_info: [{"id": "f3c83dc3-5763-4272-83eb-749a084d4129", "address": "fa:16:3e:32:b2:fb", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": null, "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tapf3c83dc3-57", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:01:35 compute-0 nova_compute[187185]: 2025-11-29 07:01:35.688 187189 DEBUG oslo_concurrency.lockutils [req-a9b47b80-b832-4404-9cc9-e6bdf3756f00 req-771c242b-f3a8-479c-abb2-612bb8857304 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-0c53e488-5068-4650-b5ab-66c486f03efa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:01:36 compute-0 nova_compute[187185]: 2025-11-29 07:01:36.716 187189 DEBUG nova.compute.manager [req-482a5952-e29c-437a-81d5-ade7eea115b7 req-4713d65d-4b9f-48bc-9441-dc1a0ed3104c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Received event network-vif-plugged-6549d57e-4605-44d6-b1cd-d909be8fd972 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:01:36 compute-0 nova_compute[187185]: 2025-11-29 07:01:36.716 187189 DEBUG oslo_concurrency.lockutils [req-482a5952-e29c-437a-81d5-ade7eea115b7 req-4713d65d-4b9f-48bc-9441-dc1a0ed3104c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5f771a98-65c5-4910-b222-13d29157fdf5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:01:36 compute-0 nova_compute[187185]: 2025-11-29 07:01:36.717 187189 DEBUG oslo_concurrency.lockutils [req-482a5952-e29c-437a-81d5-ade7eea115b7 req-4713d65d-4b9f-48bc-9441-dc1a0ed3104c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f771a98-65c5-4910-b222-13d29157fdf5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:01:36 compute-0 nova_compute[187185]: 2025-11-29 07:01:36.717 187189 DEBUG oslo_concurrency.lockutils [req-482a5952-e29c-437a-81d5-ade7eea115b7 req-4713d65d-4b9f-48bc-9441-dc1a0ed3104c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f771a98-65c5-4910-b222-13d29157fdf5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:01:36 compute-0 nova_compute[187185]: 2025-11-29 07:01:36.717 187189 DEBUG nova.compute.manager [req-482a5952-e29c-437a-81d5-ade7eea115b7 req-4713d65d-4b9f-48bc-9441-dc1a0ed3104c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] No waiting events found dispatching network-vif-plugged-6549d57e-4605-44d6-b1cd-d909be8fd972 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:01:36 compute-0 nova_compute[187185]: 2025-11-29 07:01:36.717 187189 WARNING nova.compute.manager [req-482a5952-e29c-437a-81d5-ade7eea115b7 req-4713d65d-4b9f-48bc-9441-dc1a0ed3104c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Received unexpected event network-vif-plugged-6549d57e-4605-44d6-b1cd-d909be8fd972 for instance with vm_state active and task_state None.
Nov 29 07:01:36 compute-0 nova_compute[187185]: 2025-11-29 07:01:36.873 187189 DEBUG nova.compute.manager [req-bc31f932-fdf9-4cbd-9868-8a521e150a55 req-acdc93d2-c107-43aa-bc5d-ed13a25aed27 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Received event network-vif-plugged-c0762766-7e10-403f-82f8-da85dbb8bc40 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:01:36 compute-0 nova_compute[187185]: 2025-11-29 07:01:36.874 187189 DEBUG oslo_concurrency.lockutils [req-bc31f932-fdf9-4cbd-9868-8a521e150a55 req-acdc93d2-c107-43aa-bc5d-ed13a25aed27 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5f771a98-65c5-4910-b222-13d29157fdf5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:01:36 compute-0 nova_compute[187185]: 2025-11-29 07:01:36.874 187189 DEBUG oslo_concurrency.lockutils [req-bc31f932-fdf9-4cbd-9868-8a521e150a55 req-acdc93d2-c107-43aa-bc5d-ed13a25aed27 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f771a98-65c5-4910-b222-13d29157fdf5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:01:36 compute-0 nova_compute[187185]: 2025-11-29 07:01:36.875 187189 DEBUG oslo_concurrency.lockutils [req-bc31f932-fdf9-4cbd-9868-8a521e150a55 req-acdc93d2-c107-43aa-bc5d-ed13a25aed27 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f771a98-65c5-4910-b222-13d29157fdf5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:01:36 compute-0 nova_compute[187185]: 2025-11-29 07:01:36.875 187189 DEBUG nova.compute.manager [req-bc31f932-fdf9-4cbd-9868-8a521e150a55 req-acdc93d2-c107-43aa-bc5d-ed13a25aed27 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] No waiting events found dispatching network-vif-plugged-c0762766-7e10-403f-82f8-da85dbb8bc40 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:01:36 compute-0 nova_compute[187185]: 2025-11-29 07:01:36.875 187189 WARNING nova.compute.manager [req-bc31f932-fdf9-4cbd-9868-8a521e150a55 req-acdc93d2-c107-43aa-bc5d-ed13a25aed27 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Received unexpected event network-vif-plugged-c0762766-7e10-403f-82f8-da85dbb8bc40 for instance with vm_state active and task_state deleting.
Nov 29 07:01:36 compute-0 nova_compute[187185]: 2025-11-29 07:01:36.915 187189 DEBUG oslo_concurrency.lockutils [None req-956740d3-121a-4c30-ba5a-8a9e2e6d5387 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Acquiring lock "5f771a98-65c5-4910-b222-13d29157fdf5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:01:36 compute-0 nova_compute[187185]: 2025-11-29 07:01:36.916 187189 DEBUG oslo_concurrency.lockutils [None req-956740d3-121a-4c30-ba5a-8a9e2e6d5387 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lock "5f771a98-65c5-4910-b222-13d29157fdf5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:01:36 compute-0 nova_compute[187185]: 2025-11-29 07:01:36.917 187189 DEBUG oslo_concurrency.lockutils [None req-956740d3-121a-4c30-ba5a-8a9e2e6d5387 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Acquiring lock "5f771a98-65c5-4910-b222-13d29157fdf5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:01:36 compute-0 nova_compute[187185]: 2025-11-29 07:01:36.917 187189 DEBUG oslo_concurrency.lockutils [None req-956740d3-121a-4c30-ba5a-8a9e2e6d5387 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lock "5f771a98-65c5-4910-b222-13d29157fdf5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:01:36 compute-0 nova_compute[187185]: 2025-11-29 07:01:36.918 187189 DEBUG oslo_concurrency.lockutils [None req-956740d3-121a-4c30-ba5a-8a9e2e6d5387 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lock "5f771a98-65c5-4910-b222-13d29157fdf5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:01:36 compute-0 nova_compute[187185]: 2025-11-29 07:01:36.936 187189 INFO nova.compute.manager [None req-956740d3-121a-4c30-ba5a-8a9e2e6d5387 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Terminating instance
Nov 29 07:01:36 compute-0 nova_compute[187185]: 2025-11-29 07:01:36.952 187189 DEBUG nova.compute.manager [None req-956740d3-121a-4c30-ba5a-8a9e2e6d5387 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 07:01:37 compute-0 kernel: tapc0762766-7e (unregistering): left promiscuous mode
Nov 29 07:01:37 compute-0 NetworkManager[55227]: <info>  [1764399697.0633] device (tapc0762766-7e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:01:37 compute-0 ovn_controller[95281]: 2025-11-29T07:01:37Z|00126|binding|INFO|Releasing lport c0762766-7e10-403f-82f8-da85dbb8bc40 from this chassis (sb_readonly=0)
Nov 29 07:01:37 compute-0 ovn_controller[95281]: 2025-11-29T07:01:37Z|00127|binding|INFO|Setting lport c0762766-7e10-403f-82f8-da85dbb8bc40 down in Southbound
Nov 29 07:01:37 compute-0 nova_compute[187185]: 2025-11-29 07:01:37.079 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:37 compute-0 ovn_controller[95281]: 2025-11-29T07:01:37Z|00128|binding|INFO|Removing iface tapc0762766-7e ovn-installed in OVS
Nov 29 07:01:37 compute-0 nova_compute[187185]: 2025-11-29 07:01:37.083 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:37 compute-0 nova_compute[187185]: 2025-11-29 07:01:37.105 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:37 compute-0 kernel: tap6549d57e-46 (unregistering): left promiscuous mode
Nov 29 07:01:37 compute-0 NetworkManager[55227]: <info>  [1764399697.1375] device (tap6549d57e-46): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:01:37 compute-0 nova_compute[187185]: 2025-11-29 07:01:37.154 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:37 compute-0 ovn_controller[95281]: 2025-11-29T07:01:37Z|00129|binding|INFO|Releasing lport 6549d57e-4605-44d6-b1cd-d909be8fd972 from this chassis (sb_readonly=1)
Nov 29 07:01:37 compute-0 ovn_controller[95281]: 2025-11-29T07:01:37Z|00130|binding|INFO|Removing iface tap6549d57e-46 ovn-installed in OVS
Nov 29 07:01:37 compute-0 ovn_controller[95281]: 2025-11-29T07:01:37Z|00131|if_status|INFO|Not setting lport 6549d57e-4605-44d6-b1cd-d909be8fd972 down as sb is readonly
Nov 29 07:01:37 compute-0 nova_compute[187185]: 2025-11-29 07:01:37.158 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:37 compute-0 nova_compute[187185]: 2025-11-29 07:01:37.180 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:37 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000039.scope: Deactivated successfully.
Nov 29 07:01:37 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000039.scope: Consumed 2.447s CPU time.
Nov 29 07:01:37 compute-0 systemd-machined[153486]: Machine qemu-19-instance-00000039 terminated.
Nov 29 07:01:37 compute-0 NetworkManager[55227]: <info>  [1764399697.3774] manager: (tapc0762766-7e): new Tun device (/org/freedesktop/NetworkManager/Devices/73)
Nov 29 07:01:37 compute-0 NetworkManager[55227]: <info>  [1764399697.3915] manager: (tap6549d57e-46): new Tun device (/org/freedesktop/NetworkManager/Devices/74)
Nov 29 07:01:37 compute-0 nova_compute[187185]: 2025-11-29 07:01:37.448 187189 INFO nova.virt.libvirt.driver [-] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Instance destroyed successfully.
Nov 29 07:01:37 compute-0 nova_compute[187185]: 2025-11-29 07:01:37.448 187189 DEBUG nova.objects.instance [None req-956740d3-121a-4c30-ba5a-8a9e2e6d5387 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lazy-loading 'resources' on Instance uuid 5f771a98-65c5-4910-b222-13d29157fdf5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:01:38 compute-0 ovn_controller[95281]: 2025-11-29T07:01:38Z|00132|binding|INFO|Setting lport 6549d57e-4605-44d6-b1cd-d909be8fd972 down in Southbound
Nov 29 07:01:38 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:38.067 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:42:4b:e3 10.100.0.252'], port_security=['fa:16:3e:42:4b:e3 10.100.0.252'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.252/24', 'neutron:device_id': '5f771a98-65c5-4910-b222-13d29157fdf5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-498d9ea4-23cf-4d91-b24d-062d633f08bb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'de73e0af4d994da4a30deaebd1a7e86b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '16743b8e-22ae-4a05-a3ff-6798c8e786c6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c48bf91d-6152-4f5c-bcdc-06e8d04ea73a, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=c0762766-7e10-403f-82f8-da85dbb8bc40) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:01:38 compute-0 nova_compute[187185]: 2025-11-29 07:01:38.077 187189 DEBUG nova.virt.libvirt.vif [None req-956740d3-121a-4c30-ba5a-8a9e2e6d5387 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:01:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-2071189335',display_name='tempest-ServersTestMultiNic-server-2071189335',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-2071189335',id=57,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:01:34Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='de73e0af4d994da4a30deaebd1a7e86b',ramdisk_id='',reservation_id='r-9rmm5zry',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-1778452684',owner_user_name='tempest-ServersTestMultiNic-1778452684-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:01:35Z,user_data=None,user_id='11e9982557a44d40b2ebaf04bf99c371',uuid=5f771a98-65c5-4910-b222-13d29157fdf5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c0762766-7e10-403f-82f8-da85dbb8bc40", "address": "fa:16:3e:42:4b:e3", "network": {"id": "498d9ea4-23cf-4d91-b24d-062d633f08bb", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-266622578", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.252", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0762766-7e", "ovs_interfaceid": "c0762766-7e10-403f-82f8-da85dbb8bc40", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:01:38 compute-0 nova_compute[187185]: 2025-11-29 07:01:38.078 187189 DEBUG nova.network.os_vif_util [None req-956740d3-121a-4c30-ba5a-8a9e2e6d5387 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Converting VIF {"id": "c0762766-7e10-403f-82f8-da85dbb8bc40", "address": "fa:16:3e:42:4b:e3", "network": {"id": "498d9ea4-23cf-4d91-b24d-062d633f08bb", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-266622578", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.252", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0762766-7e", "ovs_interfaceid": "c0762766-7e10-403f-82f8-da85dbb8bc40", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:01:38 compute-0 nova_compute[187185]: 2025-11-29 07:01:38.079 187189 DEBUG nova.network.os_vif_util [None req-956740d3-121a-4c30-ba5a-8a9e2e6d5387 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:42:4b:e3,bridge_name='br-int',has_traffic_filtering=True,id=c0762766-7e10-403f-82f8-da85dbb8bc40,network=Network(498d9ea4-23cf-4d91-b24d-062d633f08bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0762766-7e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:01:38 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:38.079 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:05:05:1c 10.100.1.49'], port_security=['fa:16:3e:05:05:1c 10.100.1.49'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.1.49/24', 'neutron:device_id': '5f771a98-65c5-4910-b222-13d29157fdf5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-97e790ce-fab4-4f38-8ea5-27e8c5836b93', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'de73e0af4d994da4a30deaebd1a7e86b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '16743b8e-22ae-4a05-a3ff-6798c8e786c6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=63e87163-aedc-4ebc-8733-979e39666f7e, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=6549d57e-4605-44d6-b1cd-d909be8fd972) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:01:38 compute-0 nova_compute[187185]: 2025-11-29 07:01:38.079 187189 DEBUG os_vif [None req-956740d3-121a-4c30-ba5a-8a9e2e6d5387 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:42:4b:e3,bridge_name='br-int',has_traffic_filtering=True,id=c0762766-7e10-403f-82f8-da85dbb8bc40,network=Network(498d9ea4-23cf-4d91-b24d-062d633f08bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0762766-7e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:01:38 compute-0 nova_compute[187185]: 2025-11-29 07:01:38.084 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:38 compute-0 nova_compute[187185]: 2025-11-29 07:01:38.084 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc0762766-7e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:01:38 compute-0 nova_compute[187185]: 2025-11-29 07:01:38.087 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:38 compute-0 nova_compute[187185]: 2025-11-29 07:01:38.089 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:01:38 compute-0 nova_compute[187185]: 2025-11-29 07:01:38.092 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:38 compute-0 nova_compute[187185]: 2025-11-29 07:01:38.095 187189 INFO os_vif [None req-956740d3-121a-4c30-ba5a-8a9e2e6d5387 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:42:4b:e3,bridge_name='br-int',has_traffic_filtering=True,id=c0762766-7e10-403f-82f8-da85dbb8bc40,network=Network(498d9ea4-23cf-4d91-b24d-062d633f08bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0762766-7e')
Nov 29 07:01:38 compute-0 nova_compute[187185]: 2025-11-29 07:01:38.097 187189 DEBUG nova.virt.libvirt.vif [None req-956740d3-121a-4c30-ba5a-8a9e2e6d5387 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:01:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-2071189335',display_name='tempest-ServersTestMultiNic-server-2071189335',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-2071189335',id=57,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:01:34Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='de73e0af4d994da4a30deaebd1a7e86b',ramdisk_id='',reservation_id='r-9rmm5zry',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-1778452684',owner_user_name='tempest-ServersTestMultiNic-1778452684-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:01:35Z,user_data=None,user_id='11e9982557a44d40b2ebaf04bf99c371',uuid=5f771a98-65c5-4910-b222-13d29157fdf5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6549d57e-4605-44d6-b1cd-d909be8fd972", "address": "fa:16:3e:05:05:1c", "network": {"id": "97e790ce-fab4-4f38-8ea5-27e8c5836b93", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2050249622", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.49", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6549d57e-46", "ovs_interfaceid": "6549d57e-4605-44d6-b1cd-d909be8fd972", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:01:38 compute-0 nova_compute[187185]: 2025-11-29 07:01:38.097 187189 DEBUG nova.network.os_vif_util [None req-956740d3-121a-4c30-ba5a-8a9e2e6d5387 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Converting VIF {"id": "6549d57e-4605-44d6-b1cd-d909be8fd972", "address": "fa:16:3e:05:05:1c", "network": {"id": "97e790ce-fab4-4f38-8ea5-27e8c5836b93", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2050249622", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.49", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "de73e0af4d994da4a30deaebd1a7e86b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6549d57e-46", "ovs_interfaceid": "6549d57e-4605-44d6-b1cd-d909be8fd972", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:01:38 compute-0 nova_compute[187185]: 2025-11-29 07:01:38.098 187189 DEBUG nova.network.os_vif_util [None req-956740d3-121a-4c30-ba5a-8a9e2e6d5387 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:05:05:1c,bridge_name='br-int',has_traffic_filtering=True,id=6549d57e-4605-44d6-b1cd-d909be8fd972,network=Network(97e790ce-fab4-4f38-8ea5-27e8c5836b93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6549d57e-46') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:01:38 compute-0 nova_compute[187185]: 2025-11-29 07:01:38.099 187189 DEBUG os_vif [None req-956740d3-121a-4c30-ba5a-8a9e2e6d5387 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:05:05:1c,bridge_name='br-int',has_traffic_filtering=True,id=6549d57e-4605-44d6-b1cd-d909be8fd972,network=Network(97e790ce-fab4-4f38-8ea5-27e8c5836b93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6549d57e-46') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:01:38 compute-0 nova_compute[187185]: 2025-11-29 07:01:38.101 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:38 compute-0 nova_compute[187185]: 2025-11-29 07:01:38.101 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6549d57e-46, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:01:38 compute-0 nova_compute[187185]: 2025-11-29 07:01:38.103 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:38 compute-0 nova_compute[187185]: 2025-11-29 07:01:38.105 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:38 compute-0 nova_compute[187185]: 2025-11-29 07:01:38.107 187189 INFO os_vif [None req-956740d3-121a-4c30-ba5a-8a9e2e6d5387 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:05:05:1c,bridge_name='br-int',has_traffic_filtering=True,id=6549d57e-4605-44d6-b1cd-d909be8fd972,network=Network(97e790ce-fab4-4f38-8ea5-27e8c5836b93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6549d57e-46')
Nov 29 07:01:38 compute-0 nova_compute[187185]: 2025-11-29 07:01:38.108 187189 INFO nova.virt.libvirt.driver [None req-956740d3-121a-4c30-ba5a-8a9e2e6d5387 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Deleting instance files /var/lib/nova/instances/5f771a98-65c5-4910-b222-13d29157fdf5_del
Nov 29 07:01:38 compute-0 nova_compute[187185]: 2025-11-29 07:01:38.109 187189 INFO nova.virt.libvirt.driver [None req-956740d3-121a-4c30-ba5a-8a9e2e6d5387 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Deletion of /var/lib/nova/instances/5f771a98-65c5-4910-b222-13d29157fdf5_del complete
Nov 29 07:01:38 compute-0 nova_compute[187185]: 2025-11-29 07:01:38.194 187189 INFO nova.compute.manager [None req-956740d3-121a-4c30-ba5a-8a9e2e6d5387 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Took 1.24 seconds to destroy the instance on the hypervisor.
Nov 29 07:01:38 compute-0 nova_compute[187185]: 2025-11-29 07:01:38.195 187189 DEBUG oslo.service.loopingcall [None req-956740d3-121a-4c30-ba5a-8a9e2e6d5387 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 07:01:38 compute-0 nova_compute[187185]: 2025-11-29 07:01:38.196 187189 DEBUG nova.compute.manager [-] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 07:01:38 compute-0 nova_compute[187185]: 2025-11-29 07:01:38.196 187189 DEBUG nova.network.neutron [-] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 07:01:38 compute-0 podman[221083]: 2025-11-29 07:01:38.278474929 +0000 UTC m=+3.190656441 container create aae579408c3095952976b9e78f0a3e9a5d6f33053da3303b50b1a4a4645eca2c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-498d9ea4-23cf-4d91-b24d-062d633f08bb, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 29 07:01:38 compute-0 systemd[1]: Started libpod-conmon-aae579408c3095952976b9e78f0a3e9a5d6f33053da3303b50b1a4a4645eca2c.scope.
Nov 29 07:01:38 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:01:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6e3ccf7f16949259146b6e012b1b081c74fdf2c1214539a0dbc9a1d2feddd35e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:01:38 compute-0 podman[221142]: 2025-11-29 07:01:38.602898987 +0000 UTC m=+0.260309847 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 07:01:38 compute-0 nova_compute[187185]: 2025-11-29 07:01:38.875 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:38 compute-0 podman[221141]: 2025-11-29 07:01:38.968768307 +0000 UTC m=+0.640613265 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, version=9.6, io.openshift.expose-services=, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, name=ubi9-minimal, config_id=edpm, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc.)
Nov 29 07:01:38 compute-0 podman[221140]: 2025-11-29 07:01:38.968896101 +0000 UTC m=+0.643607220 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 07:01:38 compute-0 podman[221083]: 2025-11-29 07:01:38.969462267 +0000 UTC m=+3.881643789 container init aae579408c3095952976b9e78f0a3e9a5d6f33053da3303b50b1a4a4645eca2c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-498d9ea4-23cf-4d91-b24d-062d633f08bb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:01:38 compute-0 podman[221083]: 2025-11-29 07:01:38.976498936 +0000 UTC m=+3.888680418 container start aae579408c3095952976b9e78f0a3e9a5d6f33053da3303b50b1a4a4645eca2c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-498d9ea4-23cf-4d91-b24d-062d633f08bb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 29 07:01:39 compute-0 neutron-haproxy-ovnmeta-498d9ea4-23cf-4d91-b24d-062d633f08bb[221181]: [NOTICE]   (221207) : New worker (221210) forked
Nov 29 07:01:39 compute-0 neutron-haproxy-ovnmeta-498d9ea4-23cf-4d91-b24d-062d633f08bb[221181]: [NOTICE]   (221207) : Loading success.
Nov 29 07:01:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:39.215 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 6549d57e-4605-44d6-b1cd-d909be8fd972 in datapath 97e790ce-fab4-4f38-8ea5-27e8c5836b93 unbound from our chassis
Nov 29 07:01:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:39.217 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 97e790ce-fab4-4f38-8ea5-27e8c5836b93, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:01:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:39.217 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[24478a73-d788-4fc0-a0c3-901927ee87e2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:39.218 104254 INFO neutron.agent.ovn.metadata.agent [-] Port c0762766-7e10-403f-82f8-da85dbb8bc40 in datapath 498d9ea4-23cf-4d91-b24d-062d633f08bb unbound from our chassis
Nov 29 07:01:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:39.219 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 498d9ea4-23cf-4d91-b24d-062d633f08bb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:01:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:39.220 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6f6d30c2-e3da-4f76-9b9c-1be73c3b7e28]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:39.220 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-498d9ea4-23cf-4d91-b24d-062d633f08bb namespace which is not needed anymore
Nov 29 07:01:40 compute-0 neutron-haproxy-ovnmeta-498d9ea4-23cf-4d91-b24d-062d633f08bb[221181]: [NOTICE]   (221207) : haproxy version is 2.8.14-c23fe91
Nov 29 07:01:40 compute-0 neutron-haproxy-ovnmeta-498d9ea4-23cf-4d91-b24d-062d633f08bb[221181]: [NOTICE]   (221207) : path to executable is /usr/sbin/haproxy
Nov 29 07:01:40 compute-0 neutron-haproxy-ovnmeta-498d9ea4-23cf-4d91-b24d-062d633f08bb[221181]: [WARNING]  (221207) : Exiting Master process...
Nov 29 07:01:40 compute-0 neutron-haproxy-ovnmeta-498d9ea4-23cf-4d91-b24d-062d633f08bb[221181]: [ALERT]    (221207) : Current worker (221210) exited with code 143 (Terminated)
Nov 29 07:01:40 compute-0 neutron-haproxy-ovnmeta-498d9ea4-23cf-4d91-b24d-062d633f08bb[221181]: [WARNING]  (221207) : All workers exited. Exiting... (0)
Nov 29 07:01:40 compute-0 systemd[1]: libpod-aae579408c3095952976b9e78f0a3e9a5d6f33053da3303b50b1a4a4645eca2c.scope: Deactivated successfully.
Nov 29 07:01:40 compute-0 conmon[221181]: conmon aae579408c3095952976 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-aae579408c3095952976b9e78f0a3e9a5d6f33053da3303b50b1a4a4645eca2c.scope/container/memory.events
Nov 29 07:01:40 compute-0 podman[221236]: 2025-11-29 07:01:40.097428333 +0000 UTC m=+0.797178719 container died aae579408c3095952976b9e78f0a3e9a5d6f33053da3303b50b1a4a4645eca2c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-498d9ea4-23cf-4d91-b24d-062d633f08bb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 29 07:01:40 compute-0 nova_compute[187185]: 2025-11-29 07:01:40.137 187189 DEBUG nova.compute.manager [req-a442b317-0cfb-4013-a75c-8c4e1a25d6fe req-1846b30a-b42e-4a8d-b6b6-cb6fad39385b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Received event network-vif-unplugged-6549d57e-4605-44d6-b1cd-d909be8fd972 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:01:40 compute-0 nova_compute[187185]: 2025-11-29 07:01:40.138 187189 DEBUG oslo_concurrency.lockutils [req-a442b317-0cfb-4013-a75c-8c4e1a25d6fe req-1846b30a-b42e-4a8d-b6b6-cb6fad39385b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5f771a98-65c5-4910-b222-13d29157fdf5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:01:40 compute-0 nova_compute[187185]: 2025-11-29 07:01:40.139 187189 DEBUG oslo_concurrency.lockutils [req-a442b317-0cfb-4013-a75c-8c4e1a25d6fe req-1846b30a-b42e-4a8d-b6b6-cb6fad39385b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f771a98-65c5-4910-b222-13d29157fdf5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:01:40 compute-0 nova_compute[187185]: 2025-11-29 07:01:40.139 187189 DEBUG oslo_concurrency.lockutils [req-a442b317-0cfb-4013-a75c-8c4e1a25d6fe req-1846b30a-b42e-4a8d-b6b6-cb6fad39385b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f771a98-65c5-4910-b222-13d29157fdf5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:01:40 compute-0 nova_compute[187185]: 2025-11-29 07:01:40.139 187189 DEBUG nova.compute.manager [req-a442b317-0cfb-4013-a75c-8c4e1a25d6fe req-1846b30a-b42e-4a8d-b6b6-cb6fad39385b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] No waiting events found dispatching network-vif-unplugged-6549d57e-4605-44d6-b1cd-d909be8fd972 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:01:40 compute-0 nova_compute[187185]: 2025-11-29 07:01:40.140 187189 DEBUG nova.compute.manager [req-a442b317-0cfb-4013-a75c-8c4e1a25d6fe req-1846b30a-b42e-4a8d-b6b6-cb6fad39385b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Received event network-vif-unplugged-6549d57e-4605-44d6-b1cd-d909be8fd972 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 07:01:40 compute-0 nova_compute[187185]: 2025-11-29 07:01:40.753 187189 DEBUG nova.network.neutron [-] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:01:40 compute-0 nova_compute[187185]: 2025-11-29 07:01:40.784 187189 INFO nova.compute.manager [-] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Took 2.59 seconds to deallocate network for instance.
Nov 29 07:01:40 compute-0 nova_compute[187185]: 2025-11-29 07:01:40.925 187189 DEBUG oslo_concurrency.lockutils [None req-956740d3-121a-4c30-ba5a-8a9e2e6d5387 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:01:40 compute-0 nova_compute[187185]: 2025-11-29 07:01:40.926 187189 DEBUG oslo_concurrency.lockutils [None req-956740d3-121a-4c30-ba5a-8a9e2e6d5387 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:01:41 compute-0 nova_compute[187185]: 2025-11-29 07:01:41.033 187189 DEBUG nova.compute.provider_tree [None req-956740d3-121a-4c30-ba5a-8a9e2e6d5387 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:01:41 compute-0 nova_compute[187185]: 2025-11-29 07:01:41.051 187189 DEBUG nova.scheduler.client.report [None req-956740d3-121a-4c30-ba5a-8a9e2e6d5387 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:01:41 compute-0 nova_compute[187185]: 2025-11-29 07:01:41.078 187189 DEBUG oslo_concurrency.lockutils [None req-956740d3-121a-4c30-ba5a-8a9e2e6d5387 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:01:41 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-aae579408c3095952976b9e78f0a3e9a5d6f33053da3303b50b1a4a4645eca2c-userdata-shm.mount: Deactivated successfully.
Nov 29 07:01:41 compute-0 systemd[1]: var-lib-containers-storage-overlay-6e3ccf7f16949259146b6e012b1b081c74fdf2c1214539a0dbc9a1d2feddd35e-merged.mount: Deactivated successfully.
Nov 29 07:01:41 compute-0 nova_compute[187185]: 2025-11-29 07:01:41.118 187189 INFO nova.scheduler.client.report [None req-956740d3-121a-4c30-ba5a-8a9e2e6d5387 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Deleted allocations for instance 5f771a98-65c5-4910-b222-13d29157fdf5
Nov 29 07:01:41 compute-0 nova_compute[187185]: 2025-11-29 07:01:41.159 187189 DEBUG nova.compute.manager [req-7ccb0080-bef0-449d-8b51-483f0b3f7551 req-3034f095-b70d-4328-be78-ccc6fa38d608 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Received event network-vif-unplugged-c0762766-7e10-403f-82f8-da85dbb8bc40 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:01:41 compute-0 nova_compute[187185]: 2025-11-29 07:01:41.160 187189 DEBUG oslo_concurrency.lockutils [req-7ccb0080-bef0-449d-8b51-483f0b3f7551 req-3034f095-b70d-4328-be78-ccc6fa38d608 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5f771a98-65c5-4910-b222-13d29157fdf5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:01:41 compute-0 nova_compute[187185]: 2025-11-29 07:01:41.161 187189 DEBUG oslo_concurrency.lockutils [req-7ccb0080-bef0-449d-8b51-483f0b3f7551 req-3034f095-b70d-4328-be78-ccc6fa38d608 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f771a98-65c5-4910-b222-13d29157fdf5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:01:41 compute-0 nova_compute[187185]: 2025-11-29 07:01:41.161 187189 DEBUG oslo_concurrency.lockutils [req-7ccb0080-bef0-449d-8b51-483f0b3f7551 req-3034f095-b70d-4328-be78-ccc6fa38d608 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f771a98-65c5-4910-b222-13d29157fdf5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:01:41 compute-0 nova_compute[187185]: 2025-11-29 07:01:41.162 187189 DEBUG nova.compute.manager [req-7ccb0080-bef0-449d-8b51-483f0b3f7551 req-3034f095-b70d-4328-be78-ccc6fa38d608 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] No waiting events found dispatching network-vif-unplugged-c0762766-7e10-403f-82f8-da85dbb8bc40 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:01:41 compute-0 nova_compute[187185]: 2025-11-29 07:01:41.163 187189 WARNING nova.compute.manager [req-7ccb0080-bef0-449d-8b51-483f0b3f7551 req-3034f095-b70d-4328-be78-ccc6fa38d608 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Received unexpected event network-vif-unplugged-c0762766-7e10-403f-82f8-da85dbb8bc40 for instance with vm_state deleted and task_state None.
Nov 29 07:01:41 compute-0 nova_compute[187185]: 2025-11-29 07:01:41.163 187189 DEBUG nova.compute.manager [req-7ccb0080-bef0-449d-8b51-483f0b3f7551 req-3034f095-b70d-4328-be78-ccc6fa38d608 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Received event network-vif-plugged-c0762766-7e10-403f-82f8-da85dbb8bc40 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:01:41 compute-0 nova_compute[187185]: 2025-11-29 07:01:41.163 187189 DEBUG oslo_concurrency.lockutils [req-7ccb0080-bef0-449d-8b51-483f0b3f7551 req-3034f095-b70d-4328-be78-ccc6fa38d608 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5f771a98-65c5-4910-b222-13d29157fdf5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:01:41 compute-0 nova_compute[187185]: 2025-11-29 07:01:41.163 187189 DEBUG oslo_concurrency.lockutils [req-7ccb0080-bef0-449d-8b51-483f0b3f7551 req-3034f095-b70d-4328-be78-ccc6fa38d608 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f771a98-65c5-4910-b222-13d29157fdf5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:01:41 compute-0 nova_compute[187185]: 2025-11-29 07:01:41.163 187189 DEBUG oslo_concurrency.lockutils [req-7ccb0080-bef0-449d-8b51-483f0b3f7551 req-3034f095-b70d-4328-be78-ccc6fa38d608 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f771a98-65c5-4910-b222-13d29157fdf5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:01:41 compute-0 nova_compute[187185]: 2025-11-29 07:01:41.164 187189 DEBUG nova.compute.manager [req-7ccb0080-bef0-449d-8b51-483f0b3f7551 req-3034f095-b70d-4328-be78-ccc6fa38d608 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] No waiting events found dispatching network-vif-plugged-c0762766-7e10-403f-82f8-da85dbb8bc40 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:01:41 compute-0 nova_compute[187185]: 2025-11-29 07:01:41.164 187189 WARNING nova.compute.manager [req-7ccb0080-bef0-449d-8b51-483f0b3f7551 req-3034f095-b70d-4328-be78-ccc6fa38d608 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Received unexpected event network-vif-plugged-c0762766-7e10-403f-82f8-da85dbb8bc40 for instance with vm_state deleted and task_state None.
Nov 29 07:01:41 compute-0 nova_compute[187185]: 2025-11-29 07:01:41.193 187189 DEBUG oslo_concurrency.lockutils [None req-956740d3-121a-4c30-ba5a-8a9e2e6d5387 11e9982557a44d40b2ebaf04bf99c371 de73e0af4d994da4a30deaebd1a7e86b - - default default] Lock "5f771a98-65c5-4910-b222-13d29157fdf5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.277s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:01:41 compute-0 podman[221236]: 2025-11-29 07:01:41.50422511 +0000 UTC m=+2.203975476 container cleanup aae579408c3095952976b9e78f0a3e9a5d6f33053da3303b50b1a4a4645eca2c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-498d9ea4-23cf-4d91-b24d-062d633f08bb, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 07:01:41 compute-0 systemd[1]: libpod-conmon-aae579408c3095952976b9e78f0a3e9a5d6f33053da3303b50b1a4a4645eca2c.scope: Deactivated successfully.
Nov 29 07:01:41 compute-0 podman[221266]: 2025-11-29 07:01:41.896095535 +0000 UTC m=+0.369808662 container remove aae579408c3095952976b9e78f0a3e9a5d6f33053da3303b50b1a4a4645eca2c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-498d9ea4-23cf-4d91-b24d-062d633f08bb, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:01:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:41.903 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[98bd22a2-81e0-462e-8fdc-4ab1ff8f8200]: (4, ('Sat Nov 29 07:01:39 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-498d9ea4-23cf-4d91-b24d-062d633f08bb (aae579408c3095952976b9e78f0a3e9a5d6f33053da3303b50b1a4a4645eca2c)\naae579408c3095952976b9e78f0a3e9a5d6f33053da3303b50b1a4a4645eca2c\nSat Nov 29 07:01:41 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-498d9ea4-23cf-4d91-b24d-062d633f08bb (aae579408c3095952976b9e78f0a3e9a5d6f33053da3303b50b1a4a4645eca2c)\naae579408c3095952976b9e78f0a3e9a5d6f33053da3303b50b1a4a4645eca2c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:41.906 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[fa71656c-83c3-4380-9878-d423f18719f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:41.907 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap498d9ea4-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:01:41 compute-0 nova_compute[187185]: 2025-11-29 07:01:41.909 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:41 compute-0 kernel: tap498d9ea4-20: left promiscuous mode
Nov 29 07:01:41 compute-0 nova_compute[187185]: 2025-11-29 07:01:41.934 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:41.937 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[b718a599-0ec2-4370-9bc6-4f03f33a473b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:41.960 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[f8f32817-c443-4899-be65-ed6c1322178b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:41.963 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e84fbe80-5159-4d36-b999-7159469736ae]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:41.989 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[b45cbe29-bd12-440f-ab08-49952b94df0e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 514199, 'reachable_time': 35300, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221280, 'error': None, 'target': 'ovnmeta-498d9ea4-23cf-4d91-b24d-062d633f08bb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:41.992 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-498d9ea4-23cf-4d91-b24d-062d633f08bb deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:01:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:41.992 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[316c8758-ed9d-48eb-9270-e42a72e63cb2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:41.993 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 6549d57e-4605-44d6-b1cd-d909be8fd972 in datapath 97e790ce-fab4-4f38-8ea5-27e8c5836b93 unbound from our chassis
Nov 29 07:01:41 compute-0 systemd[1]: run-netns-ovnmeta\x2d498d9ea4\x2d23cf\x2d4d91\x2db24d\x2d062d633f08bb.mount: Deactivated successfully.
Nov 29 07:01:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:41.996 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 97e790ce-fab4-4f38-8ea5-27e8c5836b93, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:01:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:41.997 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[a479d9f6-c318-45e9-ba7b-86ab723db5e9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:42 compute-0 nova_compute[187185]: 2025-11-29 07:01:42.334 187189 DEBUG nova.compute.manager [req-70b71878-180a-47cb-9ce9-e82ffb55dfcd req-36d830ed-0ee8-4e60-a366-6dda6c7b6f19 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Received event network-vif-plugged-6549d57e-4605-44d6-b1cd-d909be8fd972 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:01:42 compute-0 nova_compute[187185]: 2025-11-29 07:01:42.335 187189 DEBUG oslo_concurrency.lockutils [req-70b71878-180a-47cb-9ce9-e82ffb55dfcd req-36d830ed-0ee8-4e60-a366-6dda6c7b6f19 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5f771a98-65c5-4910-b222-13d29157fdf5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:01:42 compute-0 nova_compute[187185]: 2025-11-29 07:01:42.335 187189 DEBUG oslo_concurrency.lockutils [req-70b71878-180a-47cb-9ce9-e82ffb55dfcd req-36d830ed-0ee8-4e60-a366-6dda6c7b6f19 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f771a98-65c5-4910-b222-13d29157fdf5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:01:42 compute-0 nova_compute[187185]: 2025-11-29 07:01:42.335 187189 DEBUG oslo_concurrency.lockutils [req-70b71878-180a-47cb-9ce9-e82ffb55dfcd req-36d830ed-0ee8-4e60-a366-6dda6c7b6f19 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f771a98-65c5-4910-b222-13d29157fdf5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:01:42 compute-0 nova_compute[187185]: 2025-11-29 07:01:42.336 187189 DEBUG nova.compute.manager [req-70b71878-180a-47cb-9ce9-e82ffb55dfcd req-36d830ed-0ee8-4e60-a366-6dda6c7b6f19 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] No waiting events found dispatching network-vif-plugged-6549d57e-4605-44d6-b1cd-d909be8fd972 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:01:42 compute-0 nova_compute[187185]: 2025-11-29 07:01:42.336 187189 WARNING nova.compute.manager [req-70b71878-180a-47cb-9ce9-e82ffb55dfcd req-36d830ed-0ee8-4e60-a366-6dda6c7b6f19 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Received unexpected event network-vif-plugged-6549d57e-4605-44d6-b1cd-d909be8fd972 for instance with vm_state deleted and task_state None.
Nov 29 07:01:42 compute-0 nova_compute[187185]: 2025-11-29 07:01:42.336 187189 DEBUG nova.compute.manager [req-70b71878-180a-47cb-9ce9-e82ffb55dfcd req-36d830ed-0ee8-4e60-a366-6dda6c7b6f19 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Received event network-vif-deleted-6549d57e-4605-44d6-b1cd-d909be8fd972 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:01:42 compute-0 nova_compute[187185]: 2025-11-29 07:01:42.336 187189 DEBUG nova.compute.manager [req-70b71878-180a-47cb-9ce9-e82ffb55dfcd req-36d830ed-0ee8-4e60-a366-6dda6c7b6f19 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Received event network-vif-deleted-c0762766-7e10-403f-82f8-da85dbb8bc40 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:01:43 compute-0 nova_compute[187185]: 2025-11-29 07:01:43.145 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:43 compute-0 nova_compute[187185]: 2025-11-29 07:01:43.232 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:43 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:43.519 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:01:43 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:43.519 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:01:43 compute-0 nova_compute[187185]: 2025-11-29 07:01:43.520 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:43 compute-0 nova_compute[187185]: 2025-11-29 07:01:43.875 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:44 compute-0 nova_compute[187185]: 2025-11-29 07:01:44.285 187189 DEBUG oslo_concurrency.lockutils [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "d7d04d9c-1c42-4708-913f-0607c892c949" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:01:44 compute-0 nova_compute[187185]: 2025-11-29 07:01:44.287 187189 DEBUG oslo_concurrency.lockutils [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "d7d04d9c-1c42-4708-913f-0607c892c949" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:01:44 compute-0 nova_compute[187185]: 2025-11-29 07:01:44.319 187189 DEBUG nova.compute.manager [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 07:01:44 compute-0 nova_compute[187185]: 2025-11-29 07:01:44.411 187189 DEBUG oslo_concurrency.lockutils [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:01:44 compute-0 nova_compute[187185]: 2025-11-29 07:01:44.412 187189 DEBUG oslo_concurrency.lockutils [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:01:44 compute-0 nova_compute[187185]: 2025-11-29 07:01:44.421 187189 DEBUG nova.virt.hardware [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 07:01:44 compute-0 nova_compute[187185]: 2025-11-29 07:01:44.422 187189 INFO nova.compute.claims [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Claim successful on node compute-0.ctlplane.example.com
Nov 29 07:01:44 compute-0 nova_compute[187185]: 2025-11-29 07:01:44.578 187189 DEBUG nova.compute.provider_tree [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:01:44 compute-0 nova_compute[187185]: 2025-11-29 07:01:44.608 187189 DEBUG nova.scheduler.client.report [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:01:44 compute-0 nova_compute[187185]: 2025-11-29 07:01:44.639 187189 DEBUG oslo_concurrency.lockutils [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.227s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:01:44 compute-0 nova_compute[187185]: 2025-11-29 07:01:44.640 187189 DEBUG nova.compute.manager [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 07:01:44 compute-0 nova_compute[187185]: 2025-11-29 07:01:44.708 187189 DEBUG nova.compute.manager [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 07:01:44 compute-0 nova_compute[187185]: 2025-11-29 07:01:44.709 187189 DEBUG nova.network.neutron [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 07:01:44 compute-0 nova_compute[187185]: 2025-11-29 07:01:44.735 187189 INFO nova.virt.libvirt.driver [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 07:01:44 compute-0 nova_compute[187185]: 2025-11-29 07:01:44.756 187189 DEBUG nova.compute.manager [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 07:01:44 compute-0 nova_compute[187185]: 2025-11-29 07:01:44.912 187189 DEBUG nova.compute.manager [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 07:01:44 compute-0 nova_compute[187185]: 2025-11-29 07:01:44.915 187189 DEBUG nova.virt.libvirt.driver [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 07:01:44 compute-0 nova_compute[187185]: 2025-11-29 07:01:44.916 187189 INFO nova.virt.libvirt.driver [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Creating image(s)
Nov 29 07:01:44 compute-0 nova_compute[187185]: 2025-11-29 07:01:44.917 187189 DEBUG oslo_concurrency.lockutils [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "/var/lib/nova/instances/d7d04d9c-1c42-4708-913f-0607c892c949/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:01:44 compute-0 nova_compute[187185]: 2025-11-29 07:01:44.917 187189 DEBUG oslo_concurrency.lockutils [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "/var/lib/nova/instances/d7d04d9c-1c42-4708-913f-0607c892c949/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:01:44 compute-0 nova_compute[187185]: 2025-11-29 07:01:44.919 187189 DEBUG oslo_concurrency.lockutils [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "/var/lib/nova/instances/d7d04d9c-1c42-4708-913f-0607c892c949/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:01:44 compute-0 nova_compute[187185]: 2025-11-29 07:01:44.948 187189 DEBUG oslo_concurrency.processutils [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:01:45 compute-0 nova_compute[187185]: 2025-11-29 07:01:45.010 187189 DEBUG nova.policy [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 07:01:45 compute-0 nova_compute[187185]: 2025-11-29 07:01:45.019 187189 DEBUG oslo_concurrency.processutils [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:01:45 compute-0 nova_compute[187185]: 2025-11-29 07:01:45.020 187189 DEBUG oslo_concurrency.lockutils [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:01:45 compute-0 nova_compute[187185]: 2025-11-29 07:01:45.021 187189 DEBUG oslo_concurrency.lockutils [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:01:45 compute-0 nova_compute[187185]: 2025-11-29 07:01:45.043 187189 DEBUG oslo_concurrency.processutils [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:01:45 compute-0 nova_compute[187185]: 2025-11-29 07:01:45.110 187189 DEBUG oslo_concurrency.processutils [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:01:45 compute-0 nova_compute[187185]: 2025-11-29 07:01:45.112 187189 DEBUG oslo_concurrency.processutils [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/d7d04d9c-1c42-4708-913f-0607c892c949/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:01:45 compute-0 nova_compute[187185]: 2025-11-29 07:01:45.893 187189 DEBUG nova.network.neutron [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Successfully created port: d917aa01-805b-47c4-8cbf-a739d106fe90 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 07:01:46 compute-0 nova_compute[187185]: 2025-11-29 07:01:46.255 187189 DEBUG oslo_concurrency.processutils [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/d7d04d9c-1c42-4708-913f-0607c892c949/disk 1073741824" returned: 0 in 1.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:01:46 compute-0 nova_compute[187185]: 2025-11-29 07:01:46.255 187189 DEBUG oslo_concurrency.lockutils [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 1.235s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:01:46 compute-0 nova_compute[187185]: 2025-11-29 07:01:46.256 187189 DEBUG oslo_concurrency.processutils [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:01:46 compute-0 nova_compute[187185]: 2025-11-29 07:01:46.353 187189 DEBUG oslo_concurrency.processutils [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:01:46 compute-0 nova_compute[187185]: 2025-11-29 07:01:46.354 187189 DEBUG nova.virt.disk.api [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Checking if we can resize image /var/lib/nova/instances/d7d04d9c-1c42-4708-913f-0607c892c949/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:01:46 compute-0 nova_compute[187185]: 2025-11-29 07:01:46.354 187189 DEBUG oslo_concurrency.processutils [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d7d04d9c-1c42-4708-913f-0607c892c949/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:01:46 compute-0 nova_compute[187185]: 2025-11-29 07:01:46.427 187189 DEBUG oslo_concurrency.processutils [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d7d04d9c-1c42-4708-913f-0607c892c949/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:01:46 compute-0 nova_compute[187185]: 2025-11-29 07:01:46.428 187189 DEBUG nova.virt.disk.api [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Cannot resize image /var/lib/nova/instances/d7d04d9c-1c42-4708-913f-0607c892c949/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:01:46 compute-0 nova_compute[187185]: 2025-11-29 07:01:46.429 187189 DEBUG nova.objects.instance [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lazy-loading 'migration_context' on Instance uuid d7d04d9c-1c42-4708-913f-0607c892c949 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:01:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:46.522 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:01:46 compute-0 nova_compute[187185]: 2025-11-29 07:01:46.765 187189 DEBUG nova.virt.libvirt.driver [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 07:01:46 compute-0 nova_compute[187185]: 2025-11-29 07:01:46.766 187189 DEBUG nova.virt.libvirt.driver [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Ensure instance console log exists: /var/lib/nova/instances/d7d04d9c-1c42-4708-913f-0607c892c949/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 07:01:46 compute-0 nova_compute[187185]: 2025-11-29 07:01:46.766 187189 DEBUG oslo_concurrency.lockutils [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:01:46 compute-0 nova_compute[187185]: 2025-11-29 07:01:46.766 187189 DEBUG oslo_concurrency.lockutils [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:01:46 compute-0 nova_compute[187185]: 2025-11-29 07:01:46.767 187189 DEBUG oslo_concurrency.lockutils [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:01:46 compute-0 podman[221300]: 2025-11-29 07:01:46.878647154 +0000 UTC m=+0.134330667 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 07:01:48 compute-0 nova_compute[187185]: 2025-11-29 07:01:48.149 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:48 compute-0 nova_compute[187185]: 2025-11-29 07:01:48.630 187189 DEBUG nova.network.neutron [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Successfully updated port: d917aa01-805b-47c4-8cbf-a739d106fe90 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 07:01:48 compute-0 nova_compute[187185]: 2025-11-29 07:01:48.647 187189 DEBUG oslo_concurrency.lockutils [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "refresh_cache-d7d04d9c-1c42-4708-913f-0607c892c949" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:01:48 compute-0 nova_compute[187185]: 2025-11-29 07:01:48.647 187189 DEBUG oslo_concurrency.lockutils [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquired lock "refresh_cache-d7d04d9c-1c42-4708-913f-0607c892c949" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:01:48 compute-0 nova_compute[187185]: 2025-11-29 07:01:48.647 187189 DEBUG nova.network.neutron [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:01:48 compute-0 nova_compute[187185]: 2025-11-29 07:01:48.804 187189 DEBUG nova.compute.manager [req-2e9d4e45-75aa-48d9-a157-e91d2e12cd10 req-c459f220-f842-4519-9a0c-026e636570a4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Received event network-changed-d917aa01-805b-47c4-8cbf-a739d106fe90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:01:48 compute-0 nova_compute[187185]: 2025-11-29 07:01:48.805 187189 DEBUG nova.compute.manager [req-2e9d4e45-75aa-48d9-a157-e91d2e12cd10 req-c459f220-f842-4519-9a0c-026e636570a4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Refreshing instance network info cache due to event network-changed-d917aa01-805b-47c4-8cbf-a739d106fe90. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:01:48 compute-0 nova_compute[187185]: 2025-11-29 07:01:48.805 187189 DEBUG oslo_concurrency.lockutils [req-2e9d4e45-75aa-48d9-a157-e91d2e12cd10 req-c459f220-f842-4519-9a0c-026e636570a4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-d7d04d9c-1c42-4708-913f-0607c892c949" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:01:48 compute-0 nova_compute[187185]: 2025-11-29 07:01:48.917 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:48 compute-0 nova_compute[187185]: 2025-11-29 07:01:48.935 187189 DEBUG nova.network.neutron [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.142 187189 DEBUG nova.network.neutron [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Updating instance_info_cache with network_info: [{"id": "d917aa01-805b-47c4-8cbf-a739d106fe90", "address": "fa:16:3e:3f:96:b0", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd917aa01-80", "ovs_interfaceid": "d917aa01-805b-47c4-8cbf-a739d106fe90", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.167 187189 DEBUG oslo_concurrency.lockutils [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Releasing lock "refresh_cache-d7d04d9c-1c42-4708-913f-0607c892c949" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.168 187189 DEBUG nova.compute.manager [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Instance network_info: |[{"id": "d917aa01-805b-47c4-8cbf-a739d106fe90", "address": "fa:16:3e:3f:96:b0", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd917aa01-80", "ovs_interfaceid": "d917aa01-805b-47c4-8cbf-a739d106fe90", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.169 187189 DEBUG oslo_concurrency.lockutils [req-2e9d4e45-75aa-48d9-a157-e91d2e12cd10 req-c459f220-f842-4519-9a0c-026e636570a4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-d7d04d9c-1c42-4708-913f-0607c892c949" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.169 187189 DEBUG nova.network.neutron [req-2e9d4e45-75aa-48d9-a157-e91d2e12cd10 req-c459f220-f842-4519-9a0c-026e636570a4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Refreshing network info cache for port d917aa01-805b-47c4-8cbf-a739d106fe90 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.174 187189 DEBUG nova.virt.libvirt.driver [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Start _get_guest_xml network_info=[{"id": "d917aa01-805b-47c4-8cbf-a739d106fe90", "address": "fa:16:3e:3f:96:b0", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd917aa01-80", "ovs_interfaceid": "d917aa01-805b-47c4-8cbf-a739d106fe90", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.183 187189 WARNING nova.virt.libvirt.driver [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.197 187189 DEBUG nova.virt.libvirt.host [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.199 187189 DEBUG nova.virt.libvirt.host [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.204 187189 DEBUG nova.virt.libvirt.host [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.206 187189 DEBUG nova.virt.libvirt.host [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.208 187189 DEBUG nova.virt.libvirt.driver [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.208 187189 DEBUG nova.virt.hardware [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.209 187189 DEBUG nova.virt.hardware [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.210 187189 DEBUG nova.virt.hardware [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.211 187189 DEBUG nova.virt.hardware [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.211 187189 DEBUG nova.virt.hardware [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.211 187189 DEBUG nova.virt.hardware [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.212 187189 DEBUG nova.virt.hardware [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.213 187189 DEBUG nova.virt.hardware [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.213 187189 DEBUG nova.virt.hardware [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.214 187189 DEBUG nova.virt.hardware [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.214 187189 DEBUG nova.virt.hardware [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.220 187189 DEBUG nova.virt.libvirt.vif [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:01:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1485815393',display_name='tempest-DeleteServersTestJSON-server-1485815393',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1485815393',id=59,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98df116965b74e4a9985049062e65162',ramdisk_id='',reservation_id='r-65jj0gif',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-1973671383',owner_user_name='tempest-DeleteServersTestJSON-1973671383-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:01:44Z,user_data=None,user_id='4ecd161098b5422084003b39f0504a8f',uuid=d7d04d9c-1c42-4708-913f-0607c892c949,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d917aa01-805b-47c4-8cbf-a739d106fe90", "address": "fa:16:3e:3f:96:b0", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd917aa01-80", "ovs_interfaceid": "d917aa01-805b-47c4-8cbf-a739d106fe90", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.220 187189 DEBUG nova.network.os_vif_util [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Converting VIF {"id": "d917aa01-805b-47c4-8cbf-a739d106fe90", "address": "fa:16:3e:3f:96:b0", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd917aa01-80", "ovs_interfaceid": "d917aa01-805b-47c4-8cbf-a739d106fe90", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.221 187189 DEBUG nova.network.os_vif_util [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3f:96:b0,bridge_name='br-int',has_traffic_filtering=True,id=d917aa01-805b-47c4-8cbf-a739d106fe90,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd917aa01-80') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.222 187189 DEBUG nova.objects.instance [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lazy-loading 'pci_devices' on Instance uuid d7d04d9c-1c42-4708-913f-0607c892c949 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.237 187189 DEBUG nova.virt.libvirt.driver [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:01:50 compute-0 nova_compute[187185]:   <uuid>d7d04d9c-1c42-4708-913f-0607c892c949</uuid>
Nov 29 07:01:50 compute-0 nova_compute[187185]:   <name>instance-0000003b</name>
Nov 29 07:01:50 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 07:01:50 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:01:50 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:01:50 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:       <nova:name>tempest-DeleteServersTestJSON-server-1485815393</nova:name>
Nov 29 07:01:50 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:01:50</nova:creationTime>
Nov 29 07:01:50 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 07:01:50 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 07:01:50 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:01:50 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:01:50 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:01:50 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:01:50 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:01:50 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:01:50 compute-0 nova_compute[187185]:         <nova:user uuid="4ecd161098b5422084003b39f0504a8f">tempest-DeleteServersTestJSON-1973671383-project-member</nova:user>
Nov 29 07:01:50 compute-0 nova_compute[187185]:         <nova:project uuid="98df116965b74e4a9985049062e65162">tempest-DeleteServersTestJSON-1973671383</nova:project>
Nov 29 07:01:50 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:01:50 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 07:01:50 compute-0 nova_compute[187185]:         <nova:port uuid="d917aa01-805b-47c4-8cbf-a739d106fe90">
Nov 29 07:01:50 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:01:50 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:01:50 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:01:50 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <system>
Nov 29 07:01:50 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:01:50 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:01:50 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:01:50 compute-0 nova_compute[187185]:       <entry name="serial">d7d04d9c-1c42-4708-913f-0607c892c949</entry>
Nov 29 07:01:50 compute-0 nova_compute[187185]:       <entry name="uuid">d7d04d9c-1c42-4708-913f-0607c892c949</entry>
Nov 29 07:01:50 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     </system>
Nov 29 07:01:50 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:01:50 compute-0 nova_compute[187185]:   <os>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:   </os>
Nov 29 07:01:50 compute-0 nova_compute[187185]:   <features>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:   </features>
Nov 29 07:01:50 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:01:50 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:01:50 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:01:50 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/d7d04d9c-1c42-4708-913f-0607c892c949/disk"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:01:50 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/d7d04d9c-1c42-4708-913f-0607c892c949/disk.config"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:01:50 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:3f:96:b0"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:       <target dev="tapd917aa01-80"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:01:50 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/d7d04d9c-1c42-4708-913f-0607c892c949/console.log" append="off"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <video>
Nov 29 07:01:50 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     </video>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:01:50 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:01:50 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:01:50 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:01:50 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:01:50 compute-0 nova_compute[187185]: </domain>
Nov 29 07:01:50 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.238 187189 DEBUG nova.compute.manager [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Preparing to wait for external event network-vif-plugged-d917aa01-805b-47c4-8cbf-a739d106fe90 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.239 187189 DEBUG oslo_concurrency.lockutils [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "d7d04d9c-1c42-4708-913f-0607c892c949-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.239 187189 DEBUG oslo_concurrency.lockutils [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "d7d04d9c-1c42-4708-913f-0607c892c949-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.239 187189 DEBUG oslo_concurrency.lockutils [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "d7d04d9c-1c42-4708-913f-0607c892c949-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.240 187189 DEBUG nova.virt.libvirt.vif [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:01:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1485815393',display_name='tempest-DeleteServersTestJSON-server-1485815393',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1485815393',id=59,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98df116965b74e4a9985049062e65162',ramdisk_id='',reservation_id='r-65jj0gif',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-1973671383',owner_user_name='tempest-DeleteServersTestJSON-1973671383-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:01:44Z,user_data=None,user_id='4ecd161098b5422084003b39f0504a8f',uuid=d7d04d9c-1c42-4708-913f-0607c892c949,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d917aa01-805b-47c4-8cbf-a739d106fe90", "address": "fa:16:3e:3f:96:b0", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd917aa01-80", "ovs_interfaceid": "d917aa01-805b-47c4-8cbf-a739d106fe90", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.240 187189 DEBUG nova.network.os_vif_util [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Converting VIF {"id": "d917aa01-805b-47c4-8cbf-a739d106fe90", "address": "fa:16:3e:3f:96:b0", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd917aa01-80", "ovs_interfaceid": "d917aa01-805b-47c4-8cbf-a739d106fe90", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.241 187189 DEBUG nova.network.os_vif_util [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3f:96:b0,bridge_name='br-int',has_traffic_filtering=True,id=d917aa01-805b-47c4-8cbf-a739d106fe90,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd917aa01-80') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.241 187189 DEBUG os_vif [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3f:96:b0,bridge_name='br-int',has_traffic_filtering=True,id=d917aa01-805b-47c4-8cbf-a739d106fe90,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd917aa01-80') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.242 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.243 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.243 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.247 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.247 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd917aa01-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.248 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd917aa01-80, col_values=(('external_ids', {'iface-id': 'd917aa01-805b-47c4-8cbf-a739d106fe90', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3f:96:b0', 'vm-uuid': 'd7d04d9c-1c42-4708-913f-0607c892c949'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.250 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:50 compute-0 NetworkManager[55227]: <info>  [1764399710.2521] manager: (tapd917aa01-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/75)
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.252 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.261 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.263 187189 INFO os_vif [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3f:96:b0,bridge_name='br-int',has_traffic_filtering=True,id=d917aa01-805b-47c4-8cbf-a739d106fe90,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd917aa01-80')
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.337 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Nov 29 07:01:50 compute-0 nova_compute[187185]: 2025-11-29 07:01:50.337 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 07:01:51 compute-0 nova_compute[187185]: 2025-11-29 07:01:51.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:01:51 compute-0 nova_compute[187185]: 2025-11-29 07:01:51.505 187189 DEBUG nova.network.neutron [req-2e9d4e45-75aa-48d9-a157-e91d2e12cd10 req-c459f220-f842-4519-9a0c-026e636570a4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Updated VIF entry in instance network info cache for port d917aa01-805b-47c4-8cbf-a739d106fe90. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:01:51 compute-0 nova_compute[187185]: 2025-11-29 07:01:51.506 187189 DEBUG nova.network.neutron [req-2e9d4e45-75aa-48d9-a157-e91d2e12cd10 req-c459f220-f842-4519-9a0c-026e636570a4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Updating instance_info_cache with network_info: [{"id": "d917aa01-805b-47c4-8cbf-a739d106fe90", "address": "fa:16:3e:3f:96:b0", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd917aa01-80", "ovs_interfaceid": "d917aa01-805b-47c4-8cbf-a739d106fe90", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:01:51 compute-0 nova_compute[187185]: 2025-11-29 07:01:51.537 187189 DEBUG oslo_concurrency.lockutils [req-2e9d4e45-75aa-48d9-a157-e91d2e12cd10 req-c459f220-f842-4519-9a0c-026e636570a4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-d7d04d9c-1c42-4708-913f-0607c892c949" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:01:51 compute-0 nova_compute[187185]: 2025-11-29 07:01:51.660 187189 DEBUG nova.virt.libvirt.driver [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:01:51 compute-0 nova_compute[187185]: 2025-11-29 07:01:51.660 187189 DEBUG nova.virt.libvirt.driver [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:01:51 compute-0 nova_compute[187185]: 2025-11-29 07:01:51.661 187189 DEBUG nova.virt.libvirt.driver [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] No VIF found with MAC fa:16:3e:3f:96:b0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:01:51 compute-0 nova_compute[187185]: 2025-11-29 07:01:51.662 187189 INFO nova.virt.libvirt.driver [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Using config drive
Nov 29 07:01:52 compute-0 nova_compute[187185]: 2025-11-29 07:01:52.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:01:52 compute-0 nova_compute[187185]: 2025-11-29 07:01:52.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:01:52 compute-0 nova_compute[187185]: 2025-11-29 07:01:52.445 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399697.4438124, 5f771a98-65c5-4910-b222-13d29157fdf5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:01:52 compute-0 nova_compute[187185]: 2025-11-29 07:01:52.446 187189 INFO nova.compute.manager [-] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] VM Stopped (Lifecycle Event)
Nov 29 07:01:52 compute-0 nova_compute[187185]: 2025-11-29 07:01:52.467 187189 DEBUG nova.compute.manager [None req-abd63b1d-c55b-4964-b9ca-de460023f65d - - - - - -] [instance: 5f771a98-65c5-4910-b222-13d29157fdf5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:01:53 compute-0 nova_compute[187185]: 2025-11-29 07:01:53.311 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:01:53 compute-0 nova_compute[187185]: 2025-11-29 07:01:53.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:01:53 compute-0 nova_compute[187185]: 2025-11-29 07:01:53.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:01:53 compute-0 nova_compute[187185]: 2025-11-29 07:01:53.351 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:01:53 compute-0 nova_compute[187185]: 2025-11-29 07:01:53.352 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:01:53 compute-0 nova_compute[187185]: 2025-11-29 07:01:53.352 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:01:53 compute-0 nova_compute[187185]: 2025-11-29 07:01:53.353 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:01:53 compute-0 nova_compute[187185]: 2025-11-29 07:01:53.420 187189 INFO nova.virt.libvirt.driver [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Creating config drive at /var/lib/nova/instances/d7d04d9c-1c42-4708-913f-0607c892c949/disk.config
Nov 29 07:01:53 compute-0 nova_compute[187185]: 2025-11-29 07:01:53.426 187189 DEBUG oslo_concurrency.processutils [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d7d04d9c-1c42-4708-913f-0607c892c949/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8d3wo8gg execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:01:53 compute-0 nova_compute[187185]: 2025-11-29 07:01:53.466 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d7d04d9c-1c42-4708-913f-0607c892c949/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:01:53 compute-0 nova_compute[187185]: 2025-11-29 07:01:53.557 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d7d04d9c-1c42-4708-913f-0607c892c949/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:01:53 compute-0 nova_compute[187185]: 2025-11-29 07:01:53.558 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d7d04d9c-1c42-4708-913f-0607c892c949/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:01:53 compute-0 nova_compute[187185]: 2025-11-29 07:01:53.579 187189 DEBUG oslo_concurrency.processutils [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d7d04d9c-1c42-4708-913f-0607c892c949/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8d3wo8gg" returned: 0 in 0.153s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:01:53 compute-0 nova_compute[187185]: 2025-11-29 07:01:53.626 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d7d04d9c-1c42-4708-913f-0607c892c949/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:01:53 compute-0 kernel: tapd917aa01-80: entered promiscuous mode
Nov 29 07:01:53 compute-0 ovn_controller[95281]: 2025-11-29T07:01:53Z|00133|binding|INFO|Claiming lport d917aa01-805b-47c4-8cbf-a739d106fe90 for this chassis.
Nov 29 07:01:53 compute-0 ovn_controller[95281]: 2025-11-29T07:01:53Z|00134|binding|INFO|d917aa01-805b-47c4-8cbf-a739d106fe90: Claiming fa:16:3e:3f:96:b0 10.100.0.11
Nov 29 07:01:53 compute-0 nova_compute[187185]: 2025-11-29 07:01:53.680 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:53 compute-0 NetworkManager[55227]: <info>  [1764399713.6871] manager: (tapd917aa01-80): new Tun device (/org/freedesktop/NetworkManager/Devices/76)
Nov 29 07:01:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:53.695 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3f:96:b0 10.100.0.11'], port_security=['fa:16:3e:3f:96:b0 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'd7d04d9c-1c42-4708-913f-0607c892c949', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98df116965b74e4a9985049062e65162', 'neutron:revision_number': '2', 'neutron:security_group_ids': '234720a9-9cd1-4b87-9bec-1abfe8ff0514', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e694bb30-a43a-4d18-87fa-e5c0dd8850c2, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=d917aa01-805b-47c4-8cbf-a739d106fe90) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:01:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:53.696 104254 INFO neutron.agent.ovn.metadata.agent [-] Port d917aa01-805b-47c4-8cbf-a739d106fe90 in datapath fd9eb57e-b1f8-4bae-a60f-8e40613556cd bound to our chassis
Nov 29 07:01:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:53.699 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fd9eb57e-b1f8-4bae-a60f-8e40613556cd
Nov 29 07:01:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:53.710 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[787084df-d99a-4c49-95f6-ff356bad2c9b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:53.712 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfd9eb57e-b1 in ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:01:53 compute-0 systemd-machined[153486]: New machine qemu-20-instance-0000003b.
Nov 29 07:01:53 compute-0 systemd-udevd[221394]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:01:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:53.714 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfd9eb57e-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:01:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:53.714 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[cecb658a-52de-48e9-abb4-d0ff700e0f2e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:53.719 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[82a0dde1-b598-44fb-b2ce-8bc3414f87d7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:53.732 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[5b6c26df-7bf5-4ad5-aa32-ee253872214d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:53 compute-0 NetworkManager[55227]: <info>  [1764399713.7351] device (tapd917aa01-80): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:01:53 compute-0 podman[221336]: 2025-11-29 07:01:53.735509233 +0000 UTC m=+0.097455984 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=edpm, managed_by=edpm_ansible)
Nov 29 07:01:53 compute-0 NetworkManager[55227]: <info>  [1764399713.7363] device (tapd917aa01-80): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:01:53 compute-0 podman[221342]: 2025-11-29 07:01:53.7392948 +0000 UTC m=+0.097749163 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 07:01:53 compute-0 nova_compute[187185]: 2025-11-29 07:01:53.739 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:53 compute-0 systemd[1]: Started Virtual Machine qemu-20-instance-0000003b.
Nov 29 07:01:53 compute-0 ovn_controller[95281]: 2025-11-29T07:01:53Z|00135|binding|INFO|Setting lport d917aa01-805b-47c4-8cbf-a739d106fe90 ovn-installed in OVS
Nov 29 07:01:53 compute-0 ovn_controller[95281]: 2025-11-29T07:01:53Z|00136|binding|INFO|Setting lport d917aa01-805b-47c4-8cbf-a739d106fe90 up in Southbound
Nov 29 07:01:53 compute-0 nova_compute[187185]: 2025-11-29 07:01:53.748 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:53.754 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[45b98d3c-63fc-400c-a328-2ed6f444b524]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:53.787 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[916a5e1c-56aa-4ca6-8022-265bcba53c92]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:53 compute-0 NetworkManager[55227]: <info>  [1764399713.7954] manager: (tapfd9eb57e-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/77)
Nov 29 07:01:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:53.794 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d22d7c70-c986-48f6-a481-d5ef4a037425]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:53.830 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[c951f9fd-430f-41e0-b622-178c8798cad9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:53.834 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[e303cb1e-6677-4ab3-aed3-8e4f07a85142]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:53 compute-0 NetworkManager[55227]: <info>  [1764399713.8643] device (tapfd9eb57e-b0): carrier: link connected
Nov 29 07:01:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:53.873 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[dec7f080-1180-4861-9655-7791da8c5e89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:53.893 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[c7a28439-fd2e-4e11-9d71-9084d3e002e1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfd9eb57e-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ec:80:ac'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 45], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 516152, 'reachable_time': 29009, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221429, 'error': None, 'target': 'ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:53 compute-0 nova_compute[187185]: 2025-11-29 07:01:53.896 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:01:53 compute-0 nova_compute[187185]: 2025-11-29 07:01:53.898 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5721MB free_disk=73.33857727050781GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:01:53 compute-0 nova_compute[187185]: 2025-11-29 07:01:53.898 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:01:53 compute-0 nova_compute[187185]: 2025-11-29 07:01:53.898 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:01:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:53.911 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[44384993-0734-4ec2-b8cf-faaa03fdaa5c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feec:80ac'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 516152, 'tstamp': 516152}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221430, 'error': None, 'target': 'ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:53 compute-0 nova_compute[187185]: 2025-11-29 07:01:53.919 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:53.932 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6b0489e9-4979-4bb2-af98-1e7eb333fd51]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfd9eb57e-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ec:80:ac'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 45], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 516152, 'reachable_time': 29009, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221431, 'error': None, 'target': 'ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:53.962 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[f939f234-d3f4-4582-935a-51b3ba99c735]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:54.013 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[86cb3915-d489-440e-9fca-f7cb4c7aa77f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:54.017 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfd9eb57e-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:01:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:54.017 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:01:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:54.018 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfd9eb57e-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:01:54 compute-0 nova_compute[187185]: 2025-11-29 07:01:54.020 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:54 compute-0 NetworkManager[55227]: <info>  [1764399714.0211] manager: (tapfd9eb57e-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/78)
Nov 29 07:01:54 compute-0 kernel: tapfd9eb57e-b0: entered promiscuous mode
Nov 29 07:01:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:54.029 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfd9eb57e-b0, col_values=(('external_ids', {'iface-id': 'e7b4cb4f-cb6d-4f0e-8c8d-34c743671595'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:01:54 compute-0 nova_compute[187185]: 2025-11-29 07:01:54.031 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:54 compute-0 ovn_controller[95281]: 2025-11-29T07:01:54Z|00137|binding|INFO|Releasing lport e7b4cb4f-cb6d-4f0e-8c8d-34c743671595 from this chassis (sb_readonly=0)
Nov 29 07:01:54 compute-0 nova_compute[187185]: 2025-11-29 07:01:54.033 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:54.045 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fd9eb57e-b1f8-4bae-a60f-8e40613556cd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fd9eb57e-b1f8-4bae-a60f-8e40613556cd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:01:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:54.048 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[54c2be3a-20fa-48f8-90cd-95948938702c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:01:54 compute-0 nova_compute[187185]: 2025-11-29 07:01:54.049 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance d7d04d9c-1c42-4708-913f-0607c892c949 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 07:01:54 compute-0 nova_compute[187185]: 2025-11-29 07:01:54.049 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:01:54 compute-0 nova_compute[187185]: 2025-11-29 07:01:54.050 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:01:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:54.050 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:01:54 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:01:54 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:01:54 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-fd9eb57e-b1f8-4bae-a60f-8e40613556cd
Nov 29 07:01:54 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:01:54 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:01:54 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:01:54 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/fd9eb57e-b1f8-4bae-a60f-8e40613556cd.pid.haproxy
Nov 29 07:01:54 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:01:54 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:01:54 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:01:54 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:01:54 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:01:54 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:01:54 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:01:54 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:01:54 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:01:54 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:01:54 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:01:54 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:01:54 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:01:54 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:01:54 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:01:54 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:01:54 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:01:54 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:01:54 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:01:54 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:01:54 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID fd9eb57e-b1f8-4bae-a60f-8e40613556cd
Nov 29 07:01:54 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:01:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:01:54.051 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'env', 'PROCESS_TAG=haproxy-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fd9eb57e-b1f8-4bae-a60f-8e40613556cd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:01:54 compute-0 nova_compute[187185]: 2025-11-29 07:01:54.054 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:54 compute-0 nova_compute[187185]: 2025-11-29 07:01:54.179 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:01:54 compute-0 nova_compute[187185]: 2025-11-29 07:01:54.220 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:01:54 compute-0 nova_compute[187185]: 2025-11-29 07:01:54.263 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399714.2633266, d7d04d9c-1c42-4708-913f-0607c892c949 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:01:54 compute-0 nova_compute[187185]: 2025-11-29 07:01:54.264 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] VM Started (Lifecycle Event)
Nov 29 07:01:54 compute-0 nova_compute[187185]: 2025-11-29 07:01:54.324 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:01:54 compute-0 nova_compute[187185]: 2025-11-29 07:01:54.325 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.426s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:01:54 compute-0 nova_compute[187185]: 2025-11-29 07:01:54.343 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:01:54 compute-0 nova_compute[187185]: 2025-11-29 07:01:54.348 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399714.2643602, d7d04d9c-1c42-4708-913f-0607c892c949 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:01:54 compute-0 nova_compute[187185]: 2025-11-29 07:01:54.349 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] VM Paused (Lifecycle Event)
Nov 29 07:01:54 compute-0 nova_compute[187185]: 2025-11-29 07:01:54.385 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:01:54 compute-0 nova_compute[187185]: 2025-11-29 07:01:54.394 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:01:54 compute-0 nova_compute[187185]: 2025-11-29 07:01:54.429 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:01:54 compute-0 podman[221470]: 2025-11-29 07:01:54.43497433 +0000 UTC m=+0.025313456 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:01:55 compute-0 nova_compute[187185]: 2025-11-29 07:01:55.083 187189 DEBUG nova.compute.manager [req-246c2714-ea3d-476e-a859-38f27ad78ea5 req-af53b199-6589-4d13-82ce-a169b77d1129 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Received event network-vif-plugged-d917aa01-805b-47c4-8cbf-a739d106fe90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:01:55 compute-0 nova_compute[187185]: 2025-11-29 07:01:55.083 187189 DEBUG oslo_concurrency.lockutils [req-246c2714-ea3d-476e-a859-38f27ad78ea5 req-af53b199-6589-4d13-82ce-a169b77d1129 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "d7d04d9c-1c42-4708-913f-0607c892c949-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:01:55 compute-0 nova_compute[187185]: 2025-11-29 07:01:55.084 187189 DEBUG oslo_concurrency.lockutils [req-246c2714-ea3d-476e-a859-38f27ad78ea5 req-af53b199-6589-4d13-82ce-a169b77d1129 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d7d04d9c-1c42-4708-913f-0607c892c949-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:01:55 compute-0 nova_compute[187185]: 2025-11-29 07:01:55.084 187189 DEBUG oslo_concurrency.lockutils [req-246c2714-ea3d-476e-a859-38f27ad78ea5 req-af53b199-6589-4d13-82ce-a169b77d1129 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d7d04d9c-1c42-4708-913f-0607c892c949-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:01:55 compute-0 nova_compute[187185]: 2025-11-29 07:01:55.084 187189 DEBUG nova.compute.manager [req-246c2714-ea3d-476e-a859-38f27ad78ea5 req-af53b199-6589-4d13-82ce-a169b77d1129 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Processing event network-vif-plugged-d917aa01-805b-47c4-8cbf-a739d106fe90 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 07:01:55 compute-0 nova_compute[187185]: 2025-11-29 07:01:55.085 187189 DEBUG nova.compute.manager [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:01:55 compute-0 nova_compute[187185]: 2025-11-29 07:01:55.092 187189 DEBUG nova.virt.libvirt.driver [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 07:01:55 compute-0 nova_compute[187185]: 2025-11-29 07:01:55.093 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399715.092082, d7d04d9c-1c42-4708-913f-0607c892c949 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:01:55 compute-0 nova_compute[187185]: 2025-11-29 07:01:55.094 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] VM Resumed (Lifecycle Event)
Nov 29 07:01:55 compute-0 nova_compute[187185]: 2025-11-29 07:01:55.103 187189 INFO nova.virt.libvirt.driver [-] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Instance spawned successfully.
Nov 29 07:01:55 compute-0 nova_compute[187185]: 2025-11-29 07:01:55.103 187189 DEBUG nova.virt.libvirt.driver [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 07:01:55 compute-0 nova_compute[187185]: 2025-11-29 07:01:55.143 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:01:55 compute-0 nova_compute[187185]: 2025-11-29 07:01:55.152 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:01:55 compute-0 nova_compute[187185]: 2025-11-29 07:01:55.158 187189 DEBUG nova.virt.libvirt.driver [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:01:55 compute-0 nova_compute[187185]: 2025-11-29 07:01:55.159 187189 DEBUG nova.virt.libvirt.driver [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:01:55 compute-0 nova_compute[187185]: 2025-11-29 07:01:55.160 187189 DEBUG nova.virt.libvirt.driver [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:01:55 compute-0 nova_compute[187185]: 2025-11-29 07:01:55.160 187189 DEBUG nova.virt.libvirt.driver [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:01:55 compute-0 nova_compute[187185]: 2025-11-29 07:01:55.161 187189 DEBUG nova.virt.libvirt.driver [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:01:55 compute-0 nova_compute[187185]: 2025-11-29 07:01:55.162 187189 DEBUG nova.virt.libvirt.driver [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:01:55 compute-0 nova_compute[187185]: 2025-11-29 07:01:55.199 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:01:55 compute-0 nova_compute[187185]: 2025-11-29 07:01:55.250 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:55 compute-0 nova_compute[187185]: 2025-11-29 07:01:55.260 187189 INFO nova.compute.manager [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Took 10.35 seconds to spawn the instance on the hypervisor.
Nov 29 07:01:55 compute-0 nova_compute[187185]: 2025-11-29 07:01:55.260 187189 DEBUG nova.compute.manager [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:01:55 compute-0 nova_compute[187185]: 2025-11-29 07:01:55.326 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:01:55 compute-0 nova_compute[187185]: 2025-11-29 07:01:55.327 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:01:55 compute-0 nova_compute[187185]: 2025-11-29 07:01:55.328 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:01:55 compute-0 nova_compute[187185]: 2025-11-29 07:01:55.363 187189 INFO nova.compute.manager [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Took 10.99 seconds to build instance.
Nov 29 07:01:55 compute-0 nova_compute[187185]: 2025-11-29 07:01:55.390 187189 DEBUG oslo_concurrency.lockutils [None req-a9707468-93e1-417c-a4d7-dcd5a0f51a04 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "d7d04d9c-1c42-4708-913f-0607c892c949" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:01:56 compute-0 podman[221470]: 2025-11-29 07:01:56.643061062 +0000 UTC m=+2.233400178 container create e9d5149618b7ff0b267aeb1abd943bef3634ff473b7b3267845337073ee3afb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Nov 29 07:01:57 compute-0 nova_compute[187185]: 2025-11-29 07:01:57.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:01:57 compute-0 nova_compute[187185]: 2025-11-29 07:01:57.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 29 07:01:57 compute-0 systemd[1]: Started libpod-conmon-e9d5149618b7ff0b267aeb1abd943bef3634ff473b7b3267845337073ee3afb7.scope.
Nov 29 07:01:57 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:01:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dfe16bf8cefc9aa330807fc5b347ba66b1f8167ba29c8b38e6473becd73bd40c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:01:57 compute-0 nova_compute[187185]: 2025-11-29 07:01:57.568 187189 DEBUG nova.compute.manager [req-e7197524-edca-44a3-9205-f21b9f108835 req-56b02268-1d47-4121-a6f6-fce3ab398393 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Received event network-vif-plugged-d917aa01-805b-47c4-8cbf-a739d106fe90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:01:57 compute-0 nova_compute[187185]: 2025-11-29 07:01:57.568 187189 DEBUG oslo_concurrency.lockutils [req-e7197524-edca-44a3-9205-f21b9f108835 req-56b02268-1d47-4121-a6f6-fce3ab398393 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "d7d04d9c-1c42-4708-913f-0607c892c949-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:01:57 compute-0 nova_compute[187185]: 2025-11-29 07:01:57.569 187189 DEBUG oslo_concurrency.lockutils [req-e7197524-edca-44a3-9205-f21b9f108835 req-56b02268-1d47-4121-a6f6-fce3ab398393 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d7d04d9c-1c42-4708-913f-0607c892c949-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:01:57 compute-0 nova_compute[187185]: 2025-11-29 07:01:57.569 187189 DEBUG oslo_concurrency.lockutils [req-e7197524-edca-44a3-9205-f21b9f108835 req-56b02268-1d47-4121-a6f6-fce3ab398393 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d7d04d9c-1c42-4708-913f-0607c892c949-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:01:57 compute-0 nova_compute[187185]: 2025-11-29 07:01:57.569 187189 DEBUG nova.compute.manager [req-e7197524-edca-44a3-9205-f21b9f108835 req-56b02268-1d47-4121-a6f6-fce3ab398393 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] No waiting events found dispatching network-vif-plugged-d917aa01-805b-47c4-8cbf-a739d106fe90 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:01:57 compute-0 nova_compute[187185]: 2025-11-29 07:01:57.569 187189 WARNING nova.compute.manager [req-e7197524-edca-44a3-9205-f21b9f108835 req-56b02268-1d47-4121-a6f6-fce3ab398393 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Received unexpected event network-vif-plugged-d917aa01-805b-47c4-8cbf-a739d106fe90 for instance with vm_state active and task_state None.
Nov 29 07:01:57 compute-0 podman[221470]: 2025-11-29 07:01:57.901785374 +0000 UTC m=+3.492124480 container init e9d5149618b7ff0b267aeb1abd943bef3634ff473b7b3267845337073ee3afb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:01:57 compute-0 podman[221470]: 2025-11-29 07:01:57.91333395 +0000 UTC m=+3.503673066 container start e9d5149618b7ff0b267aeb1abd943bef3634ff473b7b3267845337073ee3afb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 29 07:01:57 compute-0 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[221498]: [NOTICE]   (221510) : New worker (221512) forked
Nov 29 07:01:57 compute-0 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[221498]: [NOTICE]   (221510) : Loading success.
Nov 29 07:01:58 compute-0 nova_compute[187185]: 2025-11-29 07:01:58.592 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:01:58 compute-0 podman[221483]: 2025-11-29 07:01:58.654365002 +0000 UTC m=+1.955162204 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 29 07:01:58 compute-0 nova_compute[187185]: 2025-11-29 07:01:58.950 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:01:59 compute-0 nova_compute[187185]: 2025-11-29 07:01:58.998 187189 DEBUG oslo_concurrency.lockutils [None req-db70ea35-5f49-4c9e-84b5-b84489da272f 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "d7d04d9c-1c42-4708-913f-0607c892c949" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:01:59 compute-0 nova_compute[187185]: 2025-11-29 07:01:58.999 187189 DEBUG oslo_concurrency.lockutils [None req-db70ea35-5f49-4c9e-84b5-b84489da272f 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "d7d04d9c-1c42-4708-913f-0607c892c949" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:01:59 compute-0 nova_compute[187185]: 2025-11-29 07:01:59.000 187189 DEBUG nova.compute.manager [None req-db70ea35-5f49-4c9e-84b5-b84489da272f 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:01:59 compute-0 nova_compute[187185]: 2025-11-29 07:01:59.004 187189 DEBUG nova.compute.manager [None req-db70ea35-5f49-4c9e-84b5-b84489da272f 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338
Nov 29 07:01:59 compute-0 nova_compute[187185]: 2025-11-29 07:01:59.005 187189 DEBUG nova.objects.instance [None req-db70ea35-5f49-4c9e-84b5-b84489da272f 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lazy-loading 'flavor' on Instance uuid d7d04d9c-1c42-4708-913f-0607c892c949 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:01:59 compute-0 nova_compute[187185]: 2025-11-29 07:01:59.067 187189 DEBUG nova.objects.instance [None req-db70ea35-5f49-4c9e-84b5-b84489da272f 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lazy-loading 'info_cache' on Instance uuid d7d04d9c-1c42-4708-913f-0607c892c949 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:01:59 compute-0 nova_compute[187185]: 2025-11-29 07:01:59.878 187189 DEBUG nova.virt.libvirt.driver [None req-db70ea35-5f49-4c9e-84b5-b84489da272f 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Nov 29 07:02:00 compute-0 nova_compute[187185]: 2025-11-29 07:02:00.254 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:02:03 compute-0 nova_compute[187185]: 2025-11-29 07:02:03.952 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:02:04 compute-0 nova_compute[187185]: 2025-11-29 07:02:04.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:02:04 compute-0 nova_compute[187185]: 2025-11-29 07:02:04.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 29 07:02:04 compute-0 nova_compute[187185]: 2025-11-29 07:02:04.332 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 29 07:02:04 compute-0 nova_compute[187185]: 2025-11-29 07:02:04.332 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:02:05 compute-0 nova_compute[187185]: 2025-11-29 07:02:05.255 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:02:08 compute-0 podman[221528]: 2025-11-29 07:02:08.845016076 +0000 UTC m=+0.093375230 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 07:02:08 compute-0 nova_compute[187185]: 2025-11-29 07:02:08.957 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:02:09 compute-0 nova_compute[187185]: 2025-11-29 07:02:09.932 187189 DEBUG nova.virt.libvirt.driver [None req-db70ea35-5f49-4c9e-84b5-b84489da272f 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Nov 29 07:02:10 compute-0 podman[221554]: 2025-11-29 07:02:10.10341668 +0000 UTC m=+0.359924933 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 29 07:02:10 compute-0 podman[221555]: 2025-11-29 07:02:10.104393757 +0000 UTC m=+0.352177863 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, vcs-type=git, version=9.6, distribution-scope=public, name=ubi9-minimal, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.openshift.tags=minimal rhel9, architecture=x86_64, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=)
Nov 29 07:02:10 compute-0 nova_compute[187185]: 2025-11-29 07:02:10.259 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:02:10 compute-0 sshd-session[221552]: Invalid user bodega from 103.179.56.44 port 34380
Nov 29 07:02:11 compute-0 sshd-session[221552]: Received disconnect from 103.179.56.44 port 34380:11: Bye Bye [preauth]
Nov 29 07:02:11 compute-0 sshd-session[221552]: Disconnected from invalid user bodega 103.179.56.44 port 34380 [preauth]
Nov 29 07:02:13 compute-0 nova_compute[187185]: 2025-11-29 07:02:13.960 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:02:14 compute-0 ovn_controller[95281]: 2025-11-29T07:02:14Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:3f:96:b0 10.100.0.11
Nov 29 07:02:14 compute-0 ovn_controller[95281]: 2025-11-29T07:02:14Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:3f:96:b0 10.100.0.11
Nov 29 07:02:15 compute-0 nova_compute[187185]: 2025-11-29 07:02:15.262 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:02:17 compute-0 podman[221605]: 2025-11-29 07:02:17.876633245 +0000 UTC m=+0.132536107 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Nov 29 07:02:18 compute-0 nova_compute[187185]: 2025-11-29 07:02:18.962 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:02:20 compute-0 nova_compute[187185]: 2025-11-29 07:02:20.310 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:02:21 compute-0 nova_compute[187185]: 2025-11-29 07:02:21.002 187189 DEBUG nova.virt.libvirt.driver [None req-db70ea35-5f49-4c9e-84b5-b84489da272f 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Instance in state 1 after 21 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Nov 29 07:02:23 compute-0 nova_compute[187185]: 2025-11-29 07:02:23.965 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:02:24 compute-0 podman[221632]: 2025-11-29 07:02:24.813093103 +0000 UTC m=+0.063531707 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 07:02:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:02:24.822 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:02:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:02:24.824 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:02:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:02:24.826 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:02:24 compute-0 podman[221631]: 2025-11-29 07:02:24.842916166 +0000 UTC m=+0.092617629 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 29 07:02:25 compute-0 nova_compute[187185]: 2025-11-29 07:02:25.313 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:02:28 compute-0 podman[221673]: 2025-11-29 07:02:28.812003685 +0000 UTC m=+0.077993115 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 07:02:28 compute-0 nova_compute[187185]: 2025-11-29 07:02:28.966 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:02:30 compute-0 nova_compute[187185]: 2025-11-29 07:02:30.316 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:02:32 compute-0 nova_compute[187185]: 2025-11-29 07:02:32.051 187189 DEBUG nova.virt.libvirt.driver [None req-db70ea35-5f49-4c9e-84b5-b84489da272f 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Instance in state 1 after 32 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Nov 29 07:02:33 compute-0 nova_compute[187185]: 2025-11-29 07:02:33.968 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:02:34 compute-0 nova_compute[187185]: 2025-11-29 07:02:34.603 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:02:34 compute-0 nova_compute[187185]: 2025-11-29 07:02:34.623 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Triggering sync for uuid d7d04d9c-1c42-4708-913f-0607c892c949 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Nov 29 07:02:34 compute-0 nova_compute[187185]: 2025-11-29 07:02:34.624 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "d7d04d9c-1c42-4708-913f-0607c892c949" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:02:35 compute-0 nova_compute[187185]: 2025-11-29 07:02:35.319 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:02:39 compute-0 nova_compute[187185]: 2025-11-29 07:02:39.121 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:02:39 compute-0 podman[221694]: 2025-11-29 07:02:39.812993289 +0000 UTC m=+0.073455356 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 07:02:40 compute-0 nova_compute[187185]: 2025-11-29 07:02:40.324 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:02:40 compute-0 podman[221718]: 2025-11-29 07:02:40.825165103 +0000 UTC m=+0.082155912 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 07:02:40 compute-0 podman[221719]: 2025-11-29 07:02:40.837172353 +0000 UTC m=+0.086381642 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_id=edpm, name=ubi9-minimal, release=1755695350, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, version=9.6, vcs-type=git, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 29 07:02:43 compute-0 nova_compute[187185]: 2025-11-29 07:02:43.143 187189 DEBUG nova.virt.libvirt.driver [None req-db70ea35-5f49-4c9e-84b5-b84489da272f 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Instance in state 1 after 43 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Nov 29 07:02:44 compute-0 nova_compute[187185]: 2025-11-29 07:02:44.123 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:02:45 compute-0 nova_compute[187185]: 2025-11-29 07:02:45.328 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:02:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:47.992 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'd7d04d9c-1c42-4708-913f-0607c892c949', 'name': 'tempest-DeleteServersTestJSON-server-1485815393', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000003b', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '98df116965b74e4a9985049062e65162', 'user_id': '4ecd161098b5422084003b39f0504a8f', 'hostId': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 07:02:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:47.992 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 07:02:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:47.993 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:02:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:47.993 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-DeleteServersTestJSON-server-1485815393>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-DeleteServersTestJSON-server-1485815393>]
Nov 29 07:02:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:47.993 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 07:02:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:47.998 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for d7d04d9c-1c42-4708-913f-0607c892c949 / tapd917aa01-80 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 07:02:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:47.999 12 DEBUG ceilometer.compute.pollsters [-] d7d04d9c-1c42-4708-913f-0607c892c949/network.incoming.packets volume: 10 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bad27115-a62c-44b3-b80b-19c13f858c68', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 10, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'instance-0000003b-d7d04d9c-1c42-4708-913f-0607c892c949-tapd917aa01-80', 'timestamp': '2025-11-29T07:02:47.993775', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1485815393', 'name': 'tapd917aa01-80', 'instance_id': 'd7d04d9c-1c42-4708-913f-0607c892c949', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3f:96:b0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd917aa01-80'}, 'message_id': '69a390fc-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5215.711989215, 'message_signature': 'd1feacc4b61004817ce3a23cbf65c1ebed530ca098fa4f1d691cd6181f3e9865'}]}, 'timestamp': '2025-11-29 07:02:48.000207', '_unique_id': 'e499af6d4a684de4a37f6fe4b8d9d892'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.002 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.003 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.054 12 DEBUG ceilometer.compute.pollsters [-] d7d04d9c-1c42-4708-913f-0607c892c949/disk.device.read.latency volume: 2283345145 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.055 12 DEBUG ceilometer.compute.pollsters [-] d7d04d9c-1c42-4708-913f-0607c892c949/disk.device.read.latency volume: 24220670 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e6e06981-bfd4-422a-9a3e-f791cd74af4d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2283345145, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'd7d04d9c-1c42-4708-913f-0607c892c949-vda', 'timestamp': '2025-11-29T07:02:48.004130', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1485815393', 'name': 'instance-0000003b', 'instance_id': 'd7d04d9c-1c42-4708-913f-0607c892c949', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69ac032c-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5215.722399507, 'message_signature': 'd3608ca9732cb435059da936a43e1775cc3a5194840d2bd58e9eb8d4702b32f0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24220670, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'd7d04d9c-1c42-4708-913f-0607c892c949-sda', 'timestamp': '2025-11-29T07:02:48.004130', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1485815393', 'name': 'instance-0000003b', 'instance_id': 'd7d04d9c-1c42-4708-913f-0607c892c949', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69ac1006-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5215.722399507, 'message_signature': 'ccef0c5e80b1701b2b5ec85ed6863b02ca2f01953790d315373b84c57c7fe2a9'}]}, 'timestamp': '2025-11-29 07:02:48.055556', '_unique_id': '91b019fd77574fa495df463f00de3c57'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.056 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.057 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.057 12 DEBUG ceilometer.compute.pollsters [-] d7d04d9c-1c42-4708-913f-0607c892c949/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1055db38-13fc-4a83-a6d7-800a73437b4d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'instance-0000003b-d7d04d9c-1c42-4708-913f-0607c892c949-tapd917aa01-80', 'timestamp': '2025-11-29T07:02:48.057380', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1485815393', 'name': 'tapd917aa01-80', 'instance_id': 'd7d04d9c-1c42-4708-913f-0607c892c949', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3f:96:b0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd917aa01-80'}, 'message_id': '69ac6312-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5215.711989215, 'message_signature': 'a17f1416763cddc84c0fcb0bcb65a7b62531f4bfe225eb4c00ccb3812b20c1b2'}]}, 'timestamp': '2025-11-29 07:02:48.057641', '_unique_id': 'b94d0ee5a90c4fc5b4061c7553d4da53'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.058 12 DEBUG ceilometer.compute.pollsters [-] d7d04d9c-1c42-4708-913f-0607c892c949/disk.device.write.bytes volume: 72974336 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 DEBUG ceilometer.compute.pollsters [-] d7d04d9c-1c42-4708-913f-0607c892c949/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '92052489-792f-4099-a6be-e59fbd2f50f2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72974336, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'd7d04d9c-1c42-4708-913f-0607c892c949-vda', 'timestamp': '2025-11-29T07:02:48.058774', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1485815393', 'name': 'instance-0000003b', 'instance_id': 'd7d04d9c-1c42-4708-913f-0607c892c949', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69ac997c-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5215.722399507, 'message_signature': '2dcbba74d0d06b1395669cf06ffbd8062dfcb4b7413dfcf9e91aa95abb1b027a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'd7d04d9c-1c42-4708-913f-0607c892c949-sda', 'timestamp': '2025-11-29T07:02:48.058774', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1485815393', 'name': 'instance-0000003b', 'instance_id': 'd7d04d9c-1c42-4708-913f-0607c892c949', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69aca16a-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5215.722399507, 'message_signature': 'f160df978011e01c2f93d8443e3020e8f2b4fb1f6a228612666fb790741cef55'}]}, 'timestamp': '2025-11-29 07:02:48.059234', '_unique_id': '2e66f31617264f42b866ee2641d758a9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.059 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.060 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.081 12 DEBUG ceilometer.compute.pollsters [-] d7d04d9c-1c42-4708-913f-0607c892c949/memory.usage volume: 43.1328125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '122d44a1-95b9-4cfc-83da-ab425f1950af', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 43.1328125, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'd7d04d9c-1c42-4708-913f-0607c892c949', 'timestamp': '2025-11-29T07:02:48.060456', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1485815393', 'name': 'instance-0000003b', 'instance_id': 'd7d04d9c-1c42-4708-913f-0607c892c949', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '69b02ad8-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5215.800053154, 'message_signature': '0062feeaba7385552eddf62fa3f317945d3b2093cd0524edd7ae5b62abeb89b3'}]}, 'timestamp': '2025-11-29 07:02:48.082477', '_unique_id': '0ab7861ac0d64686a0da515249700e1b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.083 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.084 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.084 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.084 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-DeleteServersTestJSON-server-1485815393>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-DeleteServersTestJSON-server-1485815393>]
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.084 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.084 12 DEBUG ceilometer.compute.pollsters [-] d7d04d9c-1c42-4708-913f-0607c892c949/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '499945a5-c4e9-445a-8692-c2483d0ed125', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'instance-0000003b-d7d04d9c-1c42-4708-913f-0607c892c949-tapd917aa01-80', 'timestamp': '2025-11-29T07:02:48.084737', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1485815393', 'name': 'tapd917aa01-80', 'instance_id': 'd7d04d9c-1c42-4708-913f-0607c892c949', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3f:96:b0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd917aa01-80'}, 'message_id': '69b09072-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5215.711989215, 'message_signature': '0e6635d04e780d474c16b0ad618f8b373f0b01945d85dc13618d5def9b73ca74'}]}, 'timestamp': '2025-11-29 07:02:48.085087', '_unique_id': '5d634da8340a43dc90e8395d5d76d8ad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.085 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.086 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.086 12 DEBUG ceilometer.compute.pollsters [-] d7d04d9c-1c42-4708-913f-0607c892c949/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1433eb15-dabc-4bcf-864b-81bb3e08d4a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'instance-0000003b-d7d04d9c-1c42-4708-913f-0607c892c949-tapd917aa01-80', 'timestamp': '2025-11-29T07:02:48.086429', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1485815393', 'name': 'tapd917aa01-80', 'instance_id': 'd7d04d9c-1c42-4708-913f-0607c892c949', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3f:96:b0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd917aa01-80'}, 'message_id': '69b0d168-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5215.711989215, 'message_signature': '1b5a41ef2fd34fa83a289fd3e397a817c3f86dac6a60cc01988832fd00685cb0'}]}, 'timestamp': '2025-11-29 07:02:48.086694', '_unique_id': 'c40a90c0286746c6b1e32c38ddd8c299'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.087 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 DEBUG ceilometer.compute.pollsters [-] d7d04d9c-1c42-4708-913f-0607c892c949/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '970f9f9f-fb5e-4061-a865-49907f2cfc64', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'instance-0000003b-d7d04d9c-1c42-4708-913f-0607c892c949-tapd917aa01-80', 'timestamp': '2025-11-29T07:02:48.088093', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1485815393', 'name': 'tapd917aa01-80', 'instance_id': 'd7d04d9c-1c42-4708-913f-0607c892c949', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3f:96:b0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd917aa01-80'}, 'message_id': '69b11344-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5215.711989215, 'message_signature': '609690e6de6394ad9432ceca964aa6026500623ffde7a939492bc4a28cc875bc'}]}, 'timestamp': '2025-11-29 07:02:48.088373', '_unique_id': '19e815376db84e0eb563fee69690ad7c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.088 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.089 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.089 12 DEBUG ceilometer.compute.pollsters [-] d7d04d9c-1c42-4708-913f-0607c892c949/disk.device.read.requests volume: 1090 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.089 12 DEBUG ceilometer.compute.pollsters [-] d7d04d9c-1c42-4708-913f-0607c892c949/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f6451650-ed14-41b8-8fa3-733defe73608', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1090, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'd7d04d9c-1c42-4708-913f-0607c892c949-vda', 'timestamp': '2025-11-29T07:02:48.089564', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1485815393', 'name': 'instance-0000003b', 'instance_id': 'd7d04d9c-1c42-4708-913f-0607c892c949', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69b14b3e-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5215.722399507, 'message_signature': '2d441345d444644a6ade9c94e710d3bb036103da4065e809640c60e20c3b9bc5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'd7d04d9c-1c42-4708-913f-0607c892c949-sda', 'timestamp': '2025-11-29T07:02:48.089564', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1485815393', 'name': 'instance-0000003b', 'instance_id': 'd7d04d9c-1c42-4708-913f-0607c892c949', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69b153c2-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5215.722399507, 'message_signature': 'a8a56146db619698078ec08dac085a064bc3f6c0b4208ce8513445157d697340'}]}, 'timestamp': '2025-11-29 07:02:48.089996', '_unique_id': 'cec92ff4f23040b89d97e1ffbde6f465'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.090 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.091 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.091 12 DEBUG ceilometer.compute.pollsters [-] d7d04d9c-1c42-4708-913f-0607c892c949/disk.device.read.bytes volume: 30353920 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.091 12 DEBUG ceilometer.compute.pollsters [-] d7d04d9c-1c42-4708-913f-0607c892c949/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ece86c1b-cad9-4a88-9e5f-bde0de704d8e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30353920, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'd7d04d9c-1c42-4708-913f-0607c892c949-vda', 'timestamp': '2025-11-29T07:02:48.091260', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1485815393', 'name': 'instance-0000003b', 'instance_id': 'd7d04d9c-1c42-4708-913f-0607c892c949', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69b18e96-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5215.722399507, 'message_signature': 'fcc0992cd9243559d48593f8bef1e0a9f6dd3790f552324f64a673f6c7f07761'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'd7d04d9c-1c42-4708-913f-0607c892c949-sda', 'timestamp': '2025-11-29T07:02:48.091260', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1485815393', 'name': 'instance-0000003b', 'instance_id': 'd7d04d9c-1c42-4708-913f-0607c892c949', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69b197d8-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5215.722399507, 'message_signature': 'b05ed79bd0cedea3bf6a3355061e12952da1a222738d9eb2023d5f9ad2592f6a'}]}, 'timestamp': '2025-11-29 07:02:48.091739', '_unique_id': 'b9258310b24e4a3bb1d147c7d0ddd7a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.092 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-DeleteServersTestJSON-server-1485815393>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-DeleteServersTestJSON-server-1485815393>]
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 DEBUG ceilometer.compute.pollsters [-] d7d04d9c-1c42-4708-913f-0607c892c949/cpu volume: 11770000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3862aa74-044c-48ed-a512-9265ea777fe1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11770000000, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'd7d04d9c-1c42-4708-913f-0607c892c949', 'timestamp': '2025-11-29T07:02:48.093145', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1485815393', 'name': 'instance-0000003b', 'instance_id': 'd7d04d9c-1c42-4708-913f-0607c892c949', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '69b1d6d0-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5215.800053154, 'message_signature': 'd6fbcc6e970634da1745dbb1393979f5d205d0e38881290fc0c96e30da13e15a'}]}, 'timestamp': '2025-11-29 07:02:48.093366', '_unique_id': 'f997d5f5868a4a4ea0983ba45fb7f999'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.093 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.094 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.111 12 DEBUG ceilometer.compute.pollsters [-] d7d04d9c-1c42-4708-913f-0607c892c949/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.112 12 DEBUG ceilometer.compute.pollsters [-] d7d04d9c-1c42-4708-913f-0607c892c949/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '24e94067-8d08-4c9d-9aff-b4a9ea654ccf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'd7d04d9c-1c42-4708-913f-0607c892c949-vda', 'timestamp': '2025-11-29T07:02:48.094437', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1485815393', 'name': 'instance-0000003b', 'instance_id': 'd7d04d9c-1c42-4708-913f-0607c892c949', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69b4b4b8-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5215.812661667, 'message_signature': 'dc7a1adab8decd907863a94215abfd63d0c31112623e6cdd3d1bbe53ad90527a'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'd7d04d9c-1c42-4708-913f-0607c892c949-sda', 'timestamp': '2025-11-29T07:02:48.094437', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1485815393', 'name': 'instance-0000003b', 'instance_id': 'd7d04d9c-1c42-4708-913f-0607c892c949', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69b4cbd8-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5215.812661667, 'message_signature': '3702fe7f181b44b7099021808d6d6292195b206a3cbe44e6b8bded0f5edd92ec'}]}, 'timestamp': '2025-11-29 07:02:48.112917', '_unique_id': 'c2448885e2ff41288253bdc9b9e8b363'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.114 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.115 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.116 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.116 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-DeleteServersTestJSON-server-1485815393>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-DeleteServersTestJSON-server-1485815393>]
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.116 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.116 12 DEBUG ceilometer.compute.pollsters [-] d7d04d9c-1c42-4708-913f-0607c892c949/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.117 12 DEBUG ceilometer.compute.pollsters [-] d7d04d9c-1c42-4708-913f-0607c892c949/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0c50e9dc-6dc5-4782-b47e-8f6832f4352b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'd7d04d9c-1c42-4708-913f-0607c892c949-vda', 'timestamp': '2025-11-29T07:02:48.116738', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1485815393', 'name': 'instance-0000003b', 'instance_id': 'd7d04d9c-1c42-4708-913f-0607c892c949', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69b5793e-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5215.812661667, 'message_signature': '711067f1bd49097e31c4d556343d7e2be60d5ced2ecc819703381ebcf08e0c34'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'd7d04d9c-1c42-4708-913f-0607c892c949-sda', 'timestamp': '2025-11-29T07:02:48.116738', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1485815393', 'name': 'instance-0000003b', 'instance_id': 'd7d04d9c-1c42-4708-913f-0607c892c949', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69b58c80-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5215.812661667, 'message_signature': '9fe8f7dfaef46aec9e6b0ab3afdd2f9972527ecd55b2dd3611964e85a805658c'}]}, 'timestamp': '2025-11-29 07:02:48.117794', '_unique_id': '68f604cacbc54176b5939b00277f2253'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.118 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.120 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.120 12 DEBUG ceilometer.compute.pollsters [-] d7d04d9c-1c42-4708-913f-0607c892c949/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b3c18492-7eab-4b08-9200-70a3129da91a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'instance-0000003b-d7d04d9c-1c42-4708-913f-0607c892c949-tapd917aa01-80', 'timestamp': '2025-11-29T07:02:48.120501', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1485815393', 'name': 'tapd917aa01-80', 'instance_id': 'd7d04d9c-1c42-4708-913f-0607c892c949', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3f:96:b0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd917aa01-80'}, 'message_id': '69b60a0c-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5215.711989215, 'message_signature': '6905e64e7ad864abb4db5dac32cb1575db59ec8b17a49923a9cb73aa048fc509'}]}, 'timestamp': '2025-11-29 07:02:48.121093', '_unique_id': '8b28b5ac28944ad7aa22ffc113b0bbc3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.122 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.123 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.124 12 DEBUG ceilometer.compute.pollsters [-] d7d04d9c-1c42-4708-913f-0607c892c949/disk.device.write.requests volume: 333 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.124 12 DEBUG ceilometer.compute.pollsters [-] d7d04d9c-1c42-4708-913f-0607c892c949/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f4bced8b-00dd-43f1-b083-0d818e803b1d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 333, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'd7d04d9c-1c42-4708-913f-0607c892c949-vda', 'timestamp': '2025-11-29T07:02:48.124157', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1485815393', 'name': 'instance-0000003b', 'instance_id': 'd7d04d9c-1c42-4708-913f-0607c892c949', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69b699b8-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5215.722399507, 'message_signature': '6c925e8fbfc96106a08013e5c40e694571fb442f669816144147ee408a91495d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'd7d04d9c-1c42-4708-913f-0607c892c949-sda', 'timestamp': '2025-11-29T07:02:48.124157', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1485815393', 'name': 'instance-0000003b', 'instance_id': 'd7d04d9c-1c42-4708-913f-0607c892c949', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69b6afca-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5215.722399507, 'message_signature': '3357482b1afc4d36a4fd3eb26ac19a4ccbb91e7659aa9e664fd8244530b487dc'}]}, 'timestamp': '2025-11-29 07:02:48.125301', '_unique_id': '3076195010b44d18be2eff9b10cd3368'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.126 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.127 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.128 12 DEBUG ceilometer.compute.pollsters [-] d7d04d9c-1c42-4708-913f-0607c892c949/network.outgoing.bytes volume: 2642 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cc57ae9a-b9e4-42fa-a1a6-a65cb2867004', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2642, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'instance-0000003b-d7d04d9c-1c42-4708-913f-0607c892c949-tapd917aa01-80', 'timestamp': '2025-11-29T07:02:48.128115', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1485815393', 'name': 'tapd917aa01-80', 'instance_id': 'd7d04d9c-1c42-4708-913f-0607c892c949', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3f:96:b0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd917aa01-80'}, 'message_id': '69b733e6-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5215.711989215, 'message_signature': '4758807c1f8b9faf94384d9a0472f0520483f16055d7de19ea853f7549b81d4a'}]}, 'timestamp': '2025-11-29 07:02:48.128669', '_unique_id': '91797085e13945cf81f15760db810e17'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.129 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.131 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.131 12 DEBUG ceilometer.compute.pollsters [-] d7d04d9c-1c42-4708-913f-0607c892c949/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.132 12 DEBUG ceilometer.compute.pollsters [-] d7d04d9c-1c42-4708-913f-0607c892c949/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6c211594-1e61-4bf5-941f-06957776a815', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'd7d04d9c-1c42-4708-913f-0607c892c949-vda', 'timestamp': '2025-11-29T07:02:48.131556', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1485815393', 'name': 'instance-0000003b', 'instance_id': 'd7d04d9c-1c42-4708-913f-0607c892c949', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69b7b960-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5215.812661667, 'message_signature': '03b2ac54c390696b9470959c7e3374c2968d548ef5f908348ab893f2f777de04'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'd7d04d9c-1c42-4708-913f-0607c892c949-sda', 'timestamp': '2025-11-29T07:02:48.131556', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1485815393', 'name': 'instance-0000003b', 'instance_id': 'd7d04d9c-1c42-4708-913f-0607c892c949', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69b7cda6-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5215.812661667, 'message_signature': 'f0fea4a42e6bef6396b22f862da0e656c1da7511bc96315e12f309b98ad2da7d'}]}, 'timestamp': '2025-11-29 07:02:48.132554', '_unique_id': 'fb89e7d58ed94d89824182f600b43fc1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.133 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.134 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.135 12 DEBUG ceilometer.compute.pollsters [-] d7d04d9c-1c42-4708-913f-0607c892c949/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6ca1b068-298b-449c-b809-151545f2cbe6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'instance-0000003b-d7d04d9c-1c42-4708-913f-0607c892c949-tapd917aa01-80', 'timestamp': '2025-11-29T07:02:48.135038', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1485815393', 'name': 'tapd917aa01-80', 'instance_id': 'd7d04d9c-1c42-4708-913f-0607c892c949', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3f:96:b0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd917aa01-80'}, 'message_id': '69b84218-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5215.711989215, 'message_signature': '209856fdffd6c2dd3f835757179cc49c7a896df42b948bca02b6a21508e8e1ac'}]}, 'timestamp': '2025-11-29 07:02:48.135595', '_unique_id': '92ffbfb303de4d66ace583c4992dbf03'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.136 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.137 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.137 12 DEBUG ceilometer.compute.pollsters [-] d7d04d9c-1c42-4708-913f-0607c892c949/disk.device.write.latency volume: 53042787932 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.137 12 DEBUG ceilometer.compute.pollsters [-] d7d04d9c-1c42-4708-913f-0607c892c949/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '41bbafa4-9fb6-493a-9377-5c3332bf2d78', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 53042787932, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'd7d04d9c-1c42-4708-913f-0607c892c949-vda', 'timestamp': '2025-11-29T07:02:48.137490', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1485815393', 'name': 'instance-0000003b', 'instance_id': 'd7d04d9c-1c42-4708-913f-0607c892c949', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69b89df8-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5215.722399507, 'message_signature': '9f6d71342c598ea794ddb548bdacd6fd92b4fa074a67a7712a0fbd3f1822bd1a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'd7d04d9c-1c42-4708-913f-0607c892c949-sda', 'timestamp': '2025-11-29T07:02:48.137490', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1485815393', 'name': 'instance-0000003b', 'instance_id': 'd7d04d9c-1c42-4708-913f-0607c892c949', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69b8aaaa-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5215.722399507, 'message_signature': '9d48ac236af3572e2803e1e07bb4764f34fbc44270ecd3e5fb5903fc79144b1a'}]}, 'timestamp': '2025-11-29 07:02:48.138156', '_unique_id': '0aec11fc5b224f609dac55a1c0bb2336'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.138 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.139 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.139 12 DEBUG ceilometer.compute.pollsters [-] d7d04d9c-1c42-4708-913f-0607c892c949/network.incoming.bytes volume: 1394 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3e14659f-ee7b-49c2-b425-e8c6d686d54e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1394, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'instance-0000003b-d7d04d9c-1c42-4708-913f-0607c892c949-tapd917aa01-80', 'timestamp': '2025-11-29T07:02:48.139662', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1485815393', 'name': 'tapd917aa01-80', 'instance_id': 'd7d04d9c-1c42-4708-913f-0607c892c949', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3f:96:b0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd917aa01-80'}, 'message_id': '69b8f1a4-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5215.711989215, 'message_signature': 'b97a28faf7099d9829c2fed42ebdbf092f556ebde48f6a7076c12d89b7f20c45'}]}, 'timestamp': '2025-11-29 07:02:48.139989', '_unique_id': 'efb6626254ca4942a3675a946d1f707d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.140 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.141 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.141 12 DEBUG ceilometer.compute.pollsters [-] d7d04d9c-1c42-4708-913f-0607c892c949/network.outgoing.packets volume: 27 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '17d2e33e-cd8c-4609-bcde-449ffca44dbf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 27, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'instance-0000003b-d7d04d9c-1c42-4708-913f-0607c892c949-tapd917aa01-80', 'timestamp': '2025-11-29T07:02:48.141383', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1485815393', 'name': 'tapd917aa01-80', 'instance_id': 'd7d04d9c-1c42-4708-913f-0607c892c949', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3f:96:b0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd917aa01-80'}, 'message_id': '69b934ca-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5215.711989215, 'message_signature': '1f9ef8fa85fca21fd5732a595c69fecf47318ea47f56fdedb24f1e79e929cb1d'}]}, 'timestamp': '2025-11-29 07:02:48.141683', '_unique_id': 'a887b3b2306c403e9c2afb9ef3ab1954'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:02:48.142 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:02:48 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:02:48.532 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:02:48 compute-0 nova_compute[187185]: 2025-11-29 07:02:48.533 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:02:48 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:02:48.534 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:02:48 compute-0 podman[221755]: 2025-11-29 07:02:48.870232025 +0000 UTC m=+0.118872104 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 29 07:02:49 compute-0 nova_compute[187185]: 2025-11-29 07:02:49.126 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:02:49 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:02:49.536 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:02:50 compute-0 nova_compute[187185]: 2025-11-29 07:02:50.331 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:02:50 compute-0 nova_compute[187185]: 2025-11-29 07:02:50.337 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:02:50 compute-0 nova_compute[187185]: 2025-11-29 07:02:50.337 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:02:50 compute-0 nova_compute[187185]: 2025-11-29 07:02:50.338 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:02:50 compute-0 nova_compute[187185]: 2025-11-29 07:02:50.360 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "refresh_cache-d7d04d9c-1c42-4708-913f-0607c892c949" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:02:50 compute-0 nova_compute[187185]: 2025-11-29 07:02:50.361 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquired lock "refresh_cache-d7d04d9c-1c42-4708-913f-0607c892c949" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:02:50 compute-0 nova_compute[187185]: 2025-11-29 07:02:50.361 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 29 07:02:50 compute-0 nova_compute[187185]: 2025-11-29 07:02:50.362 187189 DEBUG nova.objects.instance [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lazy-loading 'info_cache' on Instance uuid d7d04d9c-1c42-4708-913f-0607c892c949 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:02:52 compute-0 nova_compute[187185]: 2025-11-29 07:02:52.289 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Updating instance_info_cache with network_info: [{"id": "d917aa01-805b-47c4-8cbf-a739d106fe90", "address": "fa:16:3e:3f:96:b0", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd917aa01-80", "ovs_interfaceid": "d917aa01-805b-47c4-8cbf-a739d106fe90", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:02:52 compute-0 nova_compute[187185]: 2025-11-29 07:02:52.312 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Releasing lock "refresh_cache-d7d04d9c-1c42-4708-913f-0607c892c949" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:02:52 compute-0 nova_compute[187185]: 2025-11-29 07:02:52.313 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 29 07:02:52 compute-0 nova_compute[187185]: 2025-11-29 07:02:52.313 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:02:53 compute-0 nova_compute[187185]: 2025-11-29 07:02:53.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:02:53 compute-0 nova_compute[187185]: 2025-11-29 07:02:53.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:02:53 compute-0 nova_compute[187185]: 2025-11-29 07:02:53.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:02:54 compute-0 nova_compute[187185]: 2025-11-29 07:02:54.128 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:02:54 compute-0 nova_compute[187185]: 2025-11-29 07:02:54.198 187189 DEBUG nova.virt.libvirt.driver [None req-db70ea35-5f49-4c9e-84b5-b84489da272f 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Instance in state 1 after 54 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Nov 29 07:02:54 compute-0 nova_compute[187185]: 2025-11-29 07:02:54.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:02:55 compute-0 nova_compute[187185]: 2025-11-29 07:02:55.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:02:55 compute-0 nova_compute[187185]: 2025-11-29 07:02:55.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:02:55 compute-0 nova_compute[187185]: 2025-11-29 07:02:55.335 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:02:55 compute-0 nova_compute[187185]: 2025-11-29 07:02:55.792 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:02:55 compute-0 nova_compute[187185]: 2025-11-29 07:02:55.792 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:02:55 compute-0 nova_compute[187185]: 2025-11-29 07:02:55.793 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:02:55 compute-0 nova_compute[187185]: 2025-11-29 07:02:55.793 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:02:55 compute-0 podman[221783]: 2025-11-29 07:02:55.841348003 +0000 UTC m=+0.096710392 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 07:02:55 compute-0 podman[221782]: 2025-11-29 07:02:55.861477127 +0000 UTC m=+0.119973264 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 07:02:56 compute-0 nova_compute[187185]: 2025-11-29 07:02:55.999 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d7d04d9c-1c42-4708-913f-0607c892c949/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:02:56 compute-0 nova_compute[187185]: 2025-11-29 07:02:56.100 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d7d04d9c-1c42-4708-913f-0607c892c949/disk --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:02:56 compute-0 nova_compute[187185]: 2025-11-29 07:02:56.101 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d7d04d9c-1c42-4708-913f-0607c892c949/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:02:56 compute-0 nova_compute[187185]: 2025-11-29 07:02:56.196 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d7d04d9c-1c42-4708-913f-0607c892c949/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:02:56 compute-0 nova_compute[187185]: 2025-11-29 07:02:56.447 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:02:56 compute-0 nova_compute[187185]: 2025-11-29 07:02:56.451 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5549MB free_disk=73.31011199951172GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:02:56 compute-0 nova_compute[187185]: 2025-11-29 07:02:56.452 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:02:56 compute-0 nova_compute[187185]: 2025-11-29 07:02:56.452 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:02:56 compute-0 nova_compute[187185]: 2025-11-29 07:02:56.997 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance d7d04d9c-1c42-4708-913f-0607c892c949 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 07:02:56 compute-0 nova_compute[187185]: 2025-11-29 07:02:56.997 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:02:56 compute-0 nova_compute[187185]: 2025-11-29 07:02:56.998 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:02:57 compute-0 nova_compute[187185]: 2025-11-29 07:02:57.174 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:02:57 compute-0 nova_compute[187185]: 2025-11-29 07:02:57.292 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:02:57 compute-0 nova_compute[187185]: 2025-11-29 07:02:57.338 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:02:57 compute-0 nova_compute[187185]: 2025-11-29 07:02:57.339 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.887s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:02:58 compute-0 nova_compute[187185]: 2025-11-29 07:02:58.339 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:02:58 compute-0 nova_compute[187185]: 2025-11-29 07:02:58.340 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:02:59 compute-0 nova_compute[187185]: 2025-11-29 07:02:59.132 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:02:59 compute-0 podman[221839]: 2025-11-29 07:02:59.812890382 +0000 UTC m=+0.074676185 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 29 07:03:00 compute-0 nova_compute[187185]: 2025-11-29 07:03:00.225 187189 INFO nova.virt.libvirt.driver [None req-db70ea35-5f49-4c9e-84b5-b84489da272f 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Instance failed to shutdown in 60 seconds.
Nov 29 07:03:00 compute-0 nova_compute[187185]: 2025-11-29 07:03:00.340 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:00 compute-0 kernel: tapd917aa01-80 (unregistering): left promiscuous mode
Nov 29 07:03:00 compute-0 NetworkManager[55227]: <info>  [1764399780.8651] device (tapd917aa01-80): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:03:00 compute-0 nova_compute[187185]: 2025-11-29 07:03:00.877 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:00 compute-0 ovn_controller[95281]: 2025-11-29T07:03:00Z|00138|binding|INFO|Releasing lport d917aa01-805b-47c4-8cbf-a739d106fe90 from this chassis (sb_readonly=0)
Nov 29 07:03:00 compute-0 ovn_controller[95281]: 2025-11-29T07:03:00Z|00139|binding|INFO|Setting lport d917aa01-805b-47c4-8cbf-a739d106fe90 down in Southbound
Nov 29 07:03:00 compute-0 ovn_controller[95281]: 2025-11-29T07:03:00Z|00140|binding|INFO|Removing iface tapd917aa01-80 ovn-installed in OVS
Nov 29 07:03:00 compute-0 nova_compute[187185]: 2025-11-29 07:03:00.883 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:00 compute-0 nova_compute[187185]: 2025-11-29 07:03:00.911 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:00 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000003b.scope: Deactivated successfully.
Nov 29 07:03:00 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000003b.scope: Consumed 15.958s CPU time.
Nov 29 07:03:00 compute-0 systemd-machined[153486]: Machine qemu-20-instance-0000003b terminated.
Nov 29 07:03:01 compute-0 nova_compute[187185]: 2025-11-29 07:03:01.090 187189 INFO nova.virt.libvirt.driver [-] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Instance destroyed successfully.
Nov 29 07:03:01 compute-0 nova_compute[187185]: 2025-11-29 07:03:01.090 187189 DEBUG nova.objects.instance [None req-db70ea35-5f49-4c9e-84b5-b84489da272f 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lazy-loading 'numa_topology' on Instance uuid d7d04d9c-1c42-4708-913f-0607c892c949 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:03:02 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:02.011 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3f:96:b0 10.100.0.11'], port_security=['fa:16:3e:3f:96:b0 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'd7d04d9c-1c42-4708-913f-0607c892c949', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98df116965b74e4a9985049062e65162', 'neutron:revision_number': '4', 'neutron:security_group_ids': '234720a9-9cd1-4b87-9bec-1abfe8ff0514', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e694bb30-a43a-4d18-87fa-e5c0dd8850c2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=d917aa01-805b-47c4-8cbf-a739d106fe90) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:03:02 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:02.013 104254 INFO neutron.agent.ovn.metadata.agent [-] Port d917aa01-805b-47c4-8cbf-a739d106fe90 in datapath fd9eb57e-b1f8-4bae-a60f-8e40613556cd unbound from our chassis
Nov 29 07:03:02 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:02.017 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fd9eb57e-b1f8-4bae-a60f-8e40613556cd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:03:02 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:02.021 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6831261e-9206-4de1-8558-4c5e40df5c56]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:02 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:02.022 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd namespace which is not needed anymore
Nov 29 07:03:02 compute-0 nova_compute[187185]: 2025-11-29 07:03:02.036 187189 DEBUG nova.compute.manager [None req-db70ea35-5f49-4c9e-84b5-b84489da272f 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:03:02 compute-0 nova_compute[187185]: 2025-11-29 07:03:02.326 187189 DEBUG oslo_concurrency.lockutils [None req-db70ea35-5f49-4c9e-84b5-b84489da272f 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "d7d04d9c-1c42-4708-913f-0607c892c949" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 63.327s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:03:02 compute-0 nova_compute[187185]: 2025-11-29 07:03:02.328 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "d7d04d9c-1c42-4708-913f-0607c892c949" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 27.704s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:03:02 compute-0 nova_compute[187185]: 2025-11-29 07:03:02.328 187189 INFO nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] During sync_power_state the instance has a pending task (powering-off). Skip.
Nov 29 07:03:02 compute-0 nova_compute[187185]: 2025-11-29 07:03:02.328 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "d7d04d9c-1c42-4708-913f-0607c892c949" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:03:02 compute-0 nova_compute[187185]: 2025-11-29 07:03:02.561 187189 DEBUG nova.compute.manager [req-3172c0ba-8aee-4548-b8e3-3da581f9c0ac req-2fa61ebf-5775-4bb1-9455-be91d4d1a965 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Received event network-vif-unplugged-d917aa01-805b-47c4-8cbf-a739d106fe90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:03:02 compute-0 nova_compute[187185]: 2025-11-29 07:03:02.562 187189 DEBUG oslo_concurrency.lockutils [req-3172c0ba-8aee-4548-b8e3-3da581f9c0ac req-2fa61ebf-5775-4bb1-9455-be91d4d1a965 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "d7d04d9c-1c42-4708-913f-0607c892c949-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:03:02 compute-0 nova_compute[187185]: 2025-11-29 07:03:02.563 187189 DEBUG oslo_concurrency.lockutils [req-3172c0ba-8aee-4548-b8e3-3da581f9c0ac req-2fa61ebf-5775-4bb1-9455-be91d4d1a965 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d7d04d9c-1c42-4708-913f-0607c892c949-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:03:02 compute-0 nova_compute[187185]: 2025-11-29 07:03:02.564 187189 DEBUG oslo_concurrency.lockutils [req-3172c0ba-8aee-4548-b8e3-3da581f9c0ac req-2fa61ebf-5775-4bb1-9455-be91d4d1a965 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d7d04d9c-1c42-4708-913f-0607c892c949-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:03:02 compute-0 nova_compute[187185]: 2025-11-29 07:03:02.564 187189 DEBUG nova.compute.manager [req-3172c0ba-8aee-4548-b8e3-3da581f9c0ac req-2fa61ebf-5775-4bb1-9455-be91d4d1a965 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] No waiting events found dispatching network-vif-unplugged-d917aa01-805b-47c4-8cbf-a739d106fe90 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:03:02 compute-0 nova_compute[187185]: 2025-11-29 07:03:02.565 187189 WARNING nova.compute.manager [req-3172c0ba-8aee-4548-b8e3-3da581f9c0ac req-2fa61ebf-5775-4bb1-9455-be91d4d1a965 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Received unexpected event network-vif-unplugged-d917aa01-805b-47c4-8cbf-a739d106fe90 for instance with vm_state stopped and task_state None.
Nov 29 07:03:04 compute-0 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[221498]: [NOTICE]   (221510) : haproxy version is 2.8.14-c23fe91
Nov 29 07:03:04 compute-0 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[221498]: [NOTICE]   (221510) : path to executable is /usr/sbin/haproxy
Nov 29 07:03:04 compute-0 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[221498]: [WARNING]  (221510) : Exiting Master process...
Nov 29 07:03:04 compute-0 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[221498]: [WARNING]  (221510) : Exiting Master process...
Nov 29 07:03:04 compute-0 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[221498]: [ALERT]    (221510) : Current worker (221512) exited with code 143 (Terminated)
Nov 29 07:03:04 compute-0 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[221498]: [WARNING]  (221510) : All workers exited. Exiting... (0)
Nov 29 07:03:04 compute-0 systemd[1]: libpod-e9d5149618b7ff0b267aeb1abd943bef3634ff473b7b3267845337073ee3afb7.scope: Deactivated successfully.
Nov 29 07:03:04 compute-0 podman[221900]: 2025-11-29 07:03:04.041781448 +0000 UTC m=+1.845223673 container died e9d5149618b7ff0b267aeb1abd943bef3634ff473b7b3267845337073ee3afb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:03:04 compute-0 nova_compute[187185]: 2025-11-29 07:03:04.134 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:04 compute-0 nova_compute[187185]: 2025-11-29 07:03:04.717 187189 DEBUG nova.compute.manager [req-529fc8f9-8d1e-4264-87ea-2adf815bb388 req-dcba5391-f785-401f-a563-12d8e44669ff 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Received event network-vif-plugged-d917aa01-805b-47c4-8cbf-a739d106fe90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:03:04 compute-0 nova_compute[187185]: 2025-11-29 07:03:04.717 187189 DEBUG oslo_concurrency.lockutils [req-529fc8f9-8d1e-4264-87ea-2adf815bb388 req-dcba5391-f785-401f-a563-12d8e44669ff 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "d7d04d9c-1c42-4708-913f-0607c892c949-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:03:04 compute-0 nova_compute[187185]: 2025-11-29 07:03:04.718 187189 DEBUG oslo_concurrency.lockutils [req-529fc8f9-8d1e-4264-87ea-2adf815bb388 req-dcba5391-f785-401f-a563-12d8e44669ff 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d7d04d9c-1c42-4708-913f-0607c892c949-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:03:04 compute-0 nova_compute[187185]: 2025-11-29 07:03:04.718 187189 DEBUG oslo_concurrency.lockutils [req-529fc8f9-8d1e-4264-87ea-2adf815bb388 req-dcba5391-f785-401f-a563-12d8e44669ff 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d7d04d9c-1c42-4708-913f-0607c892c949-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:03:04 compute-0 nova_compute[187185]: 2025-11-29 07:03:04.719 187189 DEBUG nova.compute.manager [req-529fc8f9-8d1e-4264-87ea-2adf815bb388 req-dcba5391-f785-401f-a563-12d8e44669ff 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] No waiting events found dispatching network-vif-plugged-d917aa01-805b-47c4-8cbf-a739d106fe90 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:03:04 compute-0 nova_compute[187185]: 2025-11-29 07:03:04.720 187189 WARNING nova.compute.manager [req-529fc8f9-8d1e-4264-87ea-2adf815bb388 req-dcba5391-f785-401f-a563-12d8e44669ff 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Received unexpected event network-vif-plugged-d917aa01-805b-47c4-8cbf-a739d106fe90 for instance with vm_state stopped and task_state None.
Nov 29 07:03:04 compute-0 nova_compute[187185]: 2025-11-29 07:03:04.908 187189 DEBUG oslo_concurrency.lockutils [None req-5727660a-dd58-4d11-a092-788af0defda9 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "d7d04d9c-1c42-4708-913f-0607c892c949" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:03:04 compute-0 nova_compute[187185]: 2025-11-29 07:03:04.909 187189 DEBUG oslo_concurrency.lockutils [None req-5727660a-dd58-4d11-a092-788af0defda9 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "d7d04d9c-1c42-4708-913f-0607c892c949" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:03:04 compute-0 nova_compute[187185]: 2025-11-29 07:03:04.909 187189 DEBUG oslo_concurrency.lockutils [None req-5727660a-dd58-4d11-a092-788af0defda9 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "d7d04d9c-1c42-4708-913f-0607c892c949-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:03:04 compute-0 nova_compute[187185]: 2025-11-29 07:03:04.910 187189 DEBUG oslo_concurrency.lockutils [None req-5727660a-dd58-4d11-a092-788af0defda9 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "d7d04d9c-1c42-4708-913f-0607c892c949-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:03:04 compute-0 nova_compute[187185]: 2025-11-29 07:03:04.910 187189 DEBUG oslo_concurrency.lockutils [None req-5727660a-dd58-4d11-a092-788af0defda9 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "d7d04d9c-1c42-4708-913f-0607c892c949-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:03:04 compute-0 nova_compute[187185]: 2025-11-29 07:03:04.925 187189 INFO nova.compute.manager [None req-5727660a-dd58-4d11-a092-788af0defda9 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Terminating instance
Nov 29 07:03:04 compute-0 nova_compute[187185]: 2025-11-29 07:03:04.942 187189 DEBUG nova.compute.manager [None req-5727660a-dd58-4d11-a092-788af0defda9 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 07:03:04 compute-0 nova_compute[187185]: 2025-11-29 07:03:04.952 187189 INFO nova.virt.libvirt.driver [-] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Instance destroyed successfully.
Nov 29 07:03:04 compute-0 nova_compute[187185]: 2025-11-29 07:03:04.952 187189 DEBUG nova.objects.instance [None req-5727660a-dd58-4d11-a092-788af0defda9 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lazy-loading 'resources' on Instance uuid d7d04d9c-1c42-4708-913f-0607c892c949 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:03:04 compute-0 nova_compute[187185]: 2025-11-29 07:03:04.971 187189 DEBUG nova.virt.libvirt.vif [None req-5727660a-dd58-4d11-a092-788af0defda9 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:01:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1485815393',display_name='tempest-DeleteServersTestJSON-server-1485815393',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1485815393',id=59,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:01:55Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='98df116965b74e4a9985049062e65162',ramdisk_id='',reservation_id='r-65jj0gif',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min
_disk='1',image_min_ram='0',owner_project_name='tempest-DeleteServersTestJSON-1973671383',owner_user_name='tempest-DeleteServersTestJSON-1973671383-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:03:02Z,user_data=None,user_id='4ecd161098b5422084003b39f0504a8f',uuid=d7d04d9c-1c42-4708-913f-0607c892c949,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "d917aa01-805b-47c4-8cbf-a739d106fe90", "address": "fa:16:3e:3f:96:b0", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd917aa01-80", "ovs_interfaceid": "d917aa01-805b-47c4-8cbf-a739d106fe90", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:03:04 compute-0 nova_compute[187185]: 2025-11-29 07:03:04.971 187189 DEBUG nova.network.os_vif_util [None req-5727660a-dd58-4d11-a092-788af0defda9 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Converting VIF {"id": "d917aa01-805b-47c4-8cbf-a739d106fe90", "address": "fa:16:3e:3f:96:b0", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd917aa01-80", "ovs_interfaceid": "d917aa01-805b-47c4-8cbf-a739d106fe90", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:03:04 compute-0 nova_compute[187185]: 2025-11-29 07:03:04.975 187189 DEBUG nova.network.os_vif_util [None req-5727660a-dd58-4d11-a092-788af0defda9 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3f:96:b0,bridge_name='br-int',has_traffic_filtering=True,id=d917aa01-805b-47c4-8cbf-a739d106fe90,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd917aa01-80') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:03:04 compute-0 nova_compute[187185]: 2025-11-29 07:03:04.977 187189 DEBUG os_vif [None req-5727660a-dd58-4d11-a092-788af0defda9 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:3f:96:b0,bridge_name='br-int',has_traffic_filtering=True,id=d917aa01-805b-47c4-8cbf-a739d106fe90,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd917aa01-80') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:03:04 compute-0 nova_compute[187185]: 2025-11-29 07:03:04.983 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:04 compute-0 nova_compute[187185]: 2025-11-29 07:03:04.983 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd917aa01-80, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:03:04 compute-0 nova_compute[187185]: 2025-11-29 07:03:04.988 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:04 compute-0 nova_compute[187185]: 2025-11-29 07:03:04.992 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:03:04 compute-0 nova_compute[187185]: 2025-11-29 07:03:04.997 187189 INFO os_vif [None req-5727660a-dd58-4d11-a092-788af0defda9 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:3f:96:b0,bridge_name='br-int',has_traffic_filtering=True,id=d917aa01-805b-47c4-8cbf-a739d106fe90,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd917aa01-80')
Nov 29 07:03:04 compute-0 nova_compute[187185]: 2025-11-29 07:03:04.998 187189 INFO nova.virt.libvirt.driver [None req-5727660a-dd58-4d11-a092-788af0defda9 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Deleting instance files /var/lib/nova/instances/d7d04d9c-1c42-4708-913f-0607c892c949_del
Nov 29 07:03:05 compute-0 nova_compute[187185]: 2025-11-29 07:03:05.000 187189 INFO nova.virt.libvirt.driver [None req-5727660a-dd58-4d11-a092-788af0defda9 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Deletion of /var/lib/nova/instances/d7d04d9c-1c42-4708-913f-0607c892c949_del complete
Nov 29 07:03:05 compute-0 nova_compute[187185]: 2025-11-29 07:03:05.119 187189 INFO nova.compute.manager [None req-5727660a-dd58-4d11-a092-788af0defda9 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Took 0.18 seconds to destroy the instance on the hypervisor.
Nov 29 07:03:05 compute-0 nova_compute[187185]: 2025-11-29 07:03:05.120 187189 DEBUG oslo.service.loopingcall [None req-5727660a-dd58-4d11-a092-788af0defda9 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 07:03:05 compute-0 nova_compute[187185]: 2025-11-29 07:03:05.121 187189 DEBUG nova.compute.manager [-] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 07:03:05 compute-0 nova_compute[187185]: 2025-11-29 07:03:05.121 187189 DEBUG nova.network.neutron [-] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 07:03:06 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e9d5149618b7ff0b267aeb1abd943bef3634ff473b7b3267845337073ee3afb7-userdata-shm.mount: Deactivated successfully.
Nov 29 07:03:06 compute-0 systemd[1]: var-lib-containers-storage-overlay-dfe16bf8cefc9aa330807fc5b347ba66b1f8167ba29c8b38e6473becd73bd40c-merged.mount: Deactivated successfully.
Nov 29 07:03:06 compute-0 nova_compute[187185]: 2025-11-29 07:03:06.936 187189 DEBUG nova.network.neutron [-] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:03:06 compute-0 nova_compute[187185]: 2025-11-29 07:03:06.971 187189 INFO nova.compute.manager [-] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Took 1.85 seconds to deallocate network for instance.
Nov 29 07:03:07 compute-0 podman[221900]: 2025-11-29 07:03:07.064275387 +0000 UTC m=+4.867717562 container cleanup e9d5149618b7ff0b267aeb1abd943bef3634ff473b7b3267845337073ee3afb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 07:03:07 compute-0 nova_compute[187185]: 2025-11-29 07:03:07.066 187189 DEBUG oslo_concurrency.lockutils [None req-5727660a-dd58-4d11-a092-788af0defda9 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:03:07 compute-0 nova_compute[187185]: 2025-11-29 07:03:07.067 187189 DEBUG oslo_concurrency.lockutils [None req-5727660a-dd58-4d11-a092-788af0defda9 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:03:07 compute-0 systemd[1]: libpod-conmon-e9d5149618b7ff0b267aeb1abd943bef3634ff473b7b3267845337073ee3afb7.scope: Deactivated successfully.
Nov 29 07:03:07 compute-0 nova_compute[187185]: 2025-11-29 07:03:07.087 187189 DEBUG nova.compute.manager [req-06193686-73cc-4208-b1a4-fe26d632d969 req-f2b39e12-a780-4692-bfcf-3e83f2cb2c73 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Received event network-vif-deleted-d917aa01-805b-47c4-8cbf-a739d106fe90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:03:07 compute-0 nova_compute[187185]: 2025-11-29 07:03:07.118 187189 DEBUG nova.compute.provider_tree [None req-5727660a-dd58-4d11-a092-788af0defda9 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:03:07 compute-0 nova_compute[187185]: 2025-11-29 07:03:07.136 187189 DEBUG nova.scheduler.client.report [None req-5727660a-dd58-4d11-a092-788af0defda9 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:03:07 compute-0 nova_compute[187185]: 2025-11-29 07:03:07.160 187189 DEBUG oslo_concurrency.lockutils [None req-5727660a-dd58-4d11-a092-788af0defda9 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:03:07 compute-0 nova_compute[187185]: 2025-11-29 07:03:07.190 187189 INFO nova.scheduler.client.report [None req-5727660a-dd58-4d11-a092-788af0defda9 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Deleted allocations for instance d7d04d9c-1c42-4708-913f-0607c892c949
Nov 29 07:03:07 compute-0 nova_compute[187185]: 2025-11-29 07:03:07.273 187189 DEBUG oslo_concurrency.lockutils [None req-5727660a-dd58-4d11-a092-788af0defda9 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "d7d04d9c-1c42-4708-913f-0607c892c949" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.364s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:03:08 compute-0 podman[221931]: 2025-11-29 07:03:08.302027747 +0000 UTC m=+1.203712736 container remove e9d5149618b7ff0b267aeb1abd943bef3634ff473b7b3267845337073ee3afb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 29 07:03:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:08.312 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[de041d58-2ad0-4c20-b37b-5e6ce4523bf8]: (4, ('Sat Nov 29 07:03:02 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd (e9d5149618b7ff0b267aeb1abd943bef3634ff473b7b3267845337073ee3afb7)\ne9d5149618b7ff0b267aeb1abd943bef3634ff473b7b3267845337073ee3afb7\nSat Nov 29 07:03:07 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd (e9d5149618b7ff0b267aeb1abd943bef3634ff473b7b3267845337073ee3afb7)\ne9d5149618b7ff0b267aeb1abd943bef3634ff473b7b3267845337073ee3afb7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:08.314 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[98a08607-e2e9-4924-8f28-06b2adbbe9a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:08.316 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfd9eb57e-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:03:08 compute-0 nova_compute[187185]: 2025-11-29 07:03:08.318 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:08 compute-0 kernel: tapfd9eb57e-b0: left promiscuous mode
Nov 29 07:03:08 compute-0 nova_compute[187185]: 2025-11-29 07:03:08.336 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:08.340 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[931ead83-2a30-45d2-8dc3-72e41a29e3d3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:08.359 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[351af695-8c6d-46b1-9afb-7e45c10687da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:08.361 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d75e5e29-5d56-4df7-bd8e-70c0e3139483]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:08.383 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[b196c099-4368-42c7-8fad-486a58ab439f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 516144, 'reachable_time': 35233, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221945, 'error': None, 'target': 'ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:08 compute-0 systemd[1]: run-netns-ovnmeta\x2dfd9eb57e\x2db1f8\x2d4bae\x2da60f\x2d8e40613556cd.mount: Deactivated successfully.
Nov 29 07:03:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:08.387 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:03:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:08.388 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[70d2aed7-bbb4-45fc-83b2-7e86ed674f8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:09 compute-0 nova_compute[187185]: 2025-11-29 07:03:09.137 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:09 compute-0 nova_compute[187185]: 2025-11-29 07:03:09.988 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:10 compute-0 podman[221950]: 2025-11-29 07:03:10.825036646 +0000 UTC m=+0.088099921 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 07:03:11 compute-0 podman[221975]: 2025-11-29 07:03:11.794430202 +0000 UTC m=+0.062829082 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 29 07:03:11 compute-0 podman[221976]: 2025-11-29 07:03:11.810793751 +0000 UTC m=+0.071696981 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, container_name=openstack_network_exporter, distribution-scope=public, vcs-type=git, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 29 07:03:12 compute-0 nova_compute[187185]: 2025-11-29 07:03:12.079 187189 DEBUG oslo_concurrency.lockutils [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "fb979e7f-7c8f-4c79-846d-44c88985959a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:03:12 compute-0 nova_compute[187185]: 2025-11-29 07:03:12.080 187189 DEBUG oslo_concurrency.lockutils [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "fb979e7f-7c8f-4c79-846d-44c88985959a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:03:12 compute-0 nova_compute[187185]: 2025-11-29 07:03:12.097 187189 DEBUG nova.compute.manager [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 07:03:12 compute-0 nova_compute[187185]: 2025-11-29 07:03:12.275 187189 DEBUG oslo_concurrency.lockutils [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:03:12 compute-0 nova_compute[187185]: 2025-11-29 07:03:12.276 187189 DEBUG oslo_concurrency.lockutils [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:03:12 compute-0 nova_compute[187185]: 2025-11-29 07:03:12.283 187189 DEBUG nova.virt.hardware [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 07:03:12 compute-0 nova_compute[187185]: 2025-11-29 07:03:12.283 187189 INFO nova.compute.claims [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Claim successful on node compute-0.ctlplane.example.com
Nov 29 07:03:12 compute-0 nova_compute[187185]: 2025-11-29 07:03:12.403 187189 DEBUG nova.compute.provider_tree [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:03:12 compute-0 nova_compute[187185]: 2025-11-29 07:03:12.421 187189 DEBUG nova.scheduler.client.report [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:03:12 compute-0 nova_compute[187185]: 2025-11-29 07:03:12.453 187189 DEBUG oslo_concurrency.lockutils [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:03:12 compute-0 nova_compute[187185]: 2025-11-29 07:03:12.455 187189 DEBUG nova.compute.manager [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 07:03:12 compute-0 nova_compute[187185]: 2025-11-29 07:03:12.574 187189 DEBUG nova.compute.manager [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 07:03:12 compute-0 nova_compute[187185]: 2025-11-29 07:03:12.575 187189 DEBUG nova.network.neutron [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 07:03:12 compute-0 nova_compute[187185]: 2025-11-29 07:03:12.607 187189 INFO nova.virt.libvirt.driver [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 07:03:12 compute-0 nova_compute[187185]: 2025-11-29 07:03:12.631 187189 DEBUG nova.compute.manager [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 07:03:12 compute-0 nova_compute[187185]: 2025-11-29 07:03:12.769 187189 DEBUG nova.compute.manager [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 07:03:12 compute-0 nova_compute[187185]: 2025-11-29 07:03:12.771 187189 DEBUG nova.virt.libvirt.driver [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 07:03:12 compute-0 nova_compute[187185]: 2025-11-29 07:03:12.772 187189 INFO nova.virt.libvirt.driver [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Creating image(s)
Nov 29 07:03:12 compute-0 nova_compute[187185]: 2025-11-29 07:03:12.772 187189 DEBUG oslo_concurrency.lockutils [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "/var/lib/nova/instances/fb979e7f-7c8f-4c79-846d-44c88985959a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:03:12 compute-0 nova_compute[187185]: 2025-11-29 07:03:12.773 187189 DEBUG oslo_concurrency.lockutils [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "/var/lib/nova/instances/fb979e7f-7c8f-4c79-846d-44c88985959a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:03:12 compute-0 nova_compute[187185]: 2025-11-29 07:03:12.774 187189 DEBUG oslo_concurrency.lockutils [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "/var/lib/nova/instances/fb979e7f-7c8f-4c79-846d-44c88985959a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:03:12 compute-0 nova_compute[187185]: 2025-11-29 07:03:12.788 187189 DEBUG oslo_concurrency.processutils [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:03:12 compute-0 nova_compute[187185]: 2025-11-29 07:03:12.820 187189 DEBUG nova.policy [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 07:03:12 compute-0 nova_compute[187185]: 2025-11-29 07:03:12.844 187189 DEBUG oslo_concurrency.processutils [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:03:12 compute-0 nova_compute[187185]: 2025-11-29 07:03:12.845 187189 DEBUG oslo_concurrency.lockutils [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:03:12 compute-0 nova_compute[187185]: 2025-11-29 07:03:12.846 187189 DEBUG oslo_concurrency.lockutils [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:03:12 compute-0 nova_compute[187185]: 2025-11-29 07:03:12.860 187189 DEBUG oslo_concurrency.processutils [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:03:12 compute-0 nova_compute[187185]: 2025-11-29 07:03:12.917 187189 DEBUG oslo_concurrency.processutils [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:03:12 compute-0 nova_compute[187185]: 2025-11-29 07:03:12.919 187189 DEBUG oslo_concurrency.processutils [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/fb979e7f-7c8f-4c79-846d-44c88985959a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:03:13 compute-0 nova_compute[187185]: 2025-11-29 07:03:13.608 187189 DEBUG nova.network.neutron [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Successfully created port: 7b49d918-5378-4d9d-b444-afa3837b57c9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 07:03:14 compute-0 nova_compute[187185]: 2025-11-29 07:03:14.139 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:14 compute-0 nova_compute[187185]: 2025-11-29 07:03:14.237 187189 DEBUG oslo_concurrency.processutils [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/fb979e7f-7c8f-4c79-846d-44c88985959a/disk 1073741824" returned: 0 in 1.319s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:03:14 compute-0 nova_compute[187185]: 2025-11-29 07:03:14.238 187189 DEBUG oslo_concurrency.lockutils [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 1.392s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:03:14 compute-0 nova_compute[187185]: 2025-11-29 07:03:14.239 187189 DEBUG oslo_concurrency.processutils [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:03:14 compute-0 nova_compute[187185]: 2025-11-29 07:03:14.307 187189 DEBUG oslo_concurrency.processutils [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:03:14 compute-0 nova_compute[187185]: 2025-11-29 07:03:14.309 187189 DEBUG nova.virt.disk.api [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Checking if we can resize image /var/lib/nova/instances/fb979e7f-7c8f-4c79-846d-44c88985959a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:03:14 compute-0 nova_compute[187185]: 2025-11-29 07:03:14.310 187189 DEBUG oslo_concurrency.processutils [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fb979e7f-7c8f-4c79-846d-44c88985959a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:03:14 compute-0 nova_compute[187185]: 2025-11-29 07:03:14.383 187189 DEBUG oslo_concurrency.processutils [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fb979e7f-7c8f-4c79-846d-44c88985959a/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:03:14 compute-0 nova_compute[187185]: 2025-11-29 07:03:14.384 187189 DEBUG nova.virt.disk.api [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Cannot resize image /var/lib/nova/instances/fb979e7f-7c8f-4c79-846d-44c88985959a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:03:14 compute-0 nova_compute[187185]: 2025-11-29 07:03:14.385 187189 DEBUG nova.objects.instance [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lazy-loading 'migration_context' on Instance uuid fb979e7f-7c8f-4c79-846d-44c88985959a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:03:14 compute-0 nova_compute[187185]: 2025-11-29 07:03:14.407 187189 DEBUG nova.virt.libvirt.driver [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 07:03:14 compute-0 nova_compute[187185]: 2025-11-29 07:03:14.408 187189 DEBUG nova.virt.libvirt.driver [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Ensure instance console log exists: /var/lib/nova/instances/fb979e7f-7c8f-4c79-846d-44c88985959a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 07:03:14 compute-0 nova_compute[187185]: 2025-11-29 07:03:14.408 187189 DEBUG oslo_concurrency.lockutils [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:03:14 compute-0 nova_compute[187185]: 2025-11-29 07:03:14.409 187189 DEBUG oslo_concurrency.lockutils [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:03:14 compute-0 nova_compute[187185]: 2025-11-29 07:03:14.409 187189 DEBUG oslo_concurrency.lockutils [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:03:14 compute-0 nova_compute[187185]: 2025-11-29 07:03:14.511 187189 DEBUG nova.network.neutron [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Successfully updated port: 7b49d918-5378-4d9d-b444-afa3837b57c9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 07:03:14 compute-0 nova_compute[187185]: 2025-11-29 07:03:14.707 187189 DEBUG nova.compute.manager [req-d1ec8032-9545-4536-b895-ae1bdc55b7c8 req-84b3e13a-8e0a-4cdb-ac66-6b9e6f5c9900 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Received event network-changed-7b49d918-5378-4d9d-b444-afa3837b57c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:03:14 compute-0 nova_compute[187185]: 2025-11-29 07:03:14.707 187189 DEBUG nova.compute.manager [req-d1ec8032-9545-4536-b895-ae1bdc55b7c8 req-84b3e13a-8e0a-4cdb-ac66-6b9e6f5c9900 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Refreshing instance network info cache due to event network-changed-7b49d918-5378-4d9d-b444-afa3837b57c9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:03:14 compute-0 nova_compute[187185]: 2025-11-29 07:03:14.708 187189 DEBUG oslo_concurrency.lockutils [req-d1ec8032-9545-4536-b895-ae1bdc55b7c8 req-84b3e13a-8e0a-4cdb-ac66-6b9e6f5c9900 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-fb979e7f-7c8f-4c79-846d-44c88985959a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:03:14 compute-0 nova_compute[187185]: 2025-11-29 07:03:14.708 187189 DEBUG oslo_concurrency.lockutils [req-d1ec8032-9545-4536-b895-ae1bdc55b7c8 req-84b3e13a-8e0a-4cdb-ac66-6b9e6f5c9900 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-fb979e7f-7c8f-4c79-846d-44c88985959a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:03:14 compute-0 nova_compute[187185]: 2025-11-29 07:03:14.709 187189 DEBUG nova.network.neutron [req-d1ec8032-9545-4536-b895-ae1bdc55b7c8 req-84b3e13a-8e0a-4cdb-ac66-6b9e6f5c9900 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Refreshing network info cache for port 7b49d918-5378-4d9d-b444-afa3837b57c9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:03:14 compute-0 nova_compute[187185]: 2025-11-29 07:03:14.712 187189 DEBUG oslo_concurrency.lockutils [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "refresh_cache-fb979e7f-7c8f-4c79-846d-44c88985959a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:03:14 compute-0 nova_compute[187185]: 2025-11-29 07:03:14.991 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:15 compute-0 nova_compute[187185]: 2025-11-29 07:03:15.044 187189 DEBUG nova.network.neutron [req-d1ec8032-9545-4536-b895-ae1bdc55b7c8 req-84b3e13a-8e0a-4cdb-ac66-6b9e6f5c9900 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:03:15 compute-0 nova_compute[187185]: 2025-11-29 07:03:15.490 187189 DEBUG nova.network.neutron [req-d1ec8032-9545-4536-b895-ae1bdc55b7c8 req-84b3e13a-8e0a-4cdb-ac66-6b9e6f5c9900 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:03:15 compute-0 nova_compute[187185]: 2025-11-29 07:03:15.526 187189 DEBUG oslo_concurrency.lockutils [req-d1ec8032-9545-4536-b895-ae1bdc55b7c8 req-84b3e13a-8e0a-4cdb-ac66-6b9e6f5c9900 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-fb979e7f-7c8f-4c79-846d-44c88985959a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:03:15 compute-0 nova_compute[187185]: 2025-11-29 07:03:15.527 187189 DEBUG oslo_concurrency.lockutils [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquired lock "refresh_cache-fb979e7f-7c8f-4c79-846d-44c88985959a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:03:15 compute-0 nova_compute[187185]: 2025-11-29 07:03:15.527 187189 DEBUG nova.network.neutron [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:03:15 compute-0 nova_compute[187185]: 2025-11-29 07:03:15.777 187189 DEBUG nova.network.neutron [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:03:16 compute-0 nova_compute[187185]: 2025-11-29 07:03:16.089 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399781.0883722, d7d04d9c-1c42-4708-913f-0607c892c949 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:03:16 compute-0 nova_compute[187185]: 2025-11-29 07:03:16.089 187189 INFO nova.compute.manager [-] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] VM Stopped (Lifecycle Event)
Nov 29 07:03:16 compute-0 nova_compute[187185]: 2025-11-29 07:03:16.575 187189 DEBUG nova.compute.manager [None req-0fb29329-6cf4-4813-a3e4-094d058a0367 - - - - - -] [instance: d7d04d9c-1c42-4708-913f-0607c892c949] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:03:16 compute-0 nova_compute[187185]: 2025-11-29 07:03:16.962 187189 DEBUG nova.network.neutron [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Updating instance_info_cache with network_info: [{"id": "7b49d918-5378-4d9d-b444-afa3837b57c9", "address": "fa:16:3e:35:f9:cb", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b49d918-53", "ovs_interfaceid": "7b49d918-5378-4d9d-b444-afa3837b57c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.001 187189 DEBUG oslo_concurrency.lockutils [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Releasing lock "refresh_cache-fb979e7f-7c8f-4c79-846d-44c88985959a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.002 187189 DEBUG nova.compute.manager [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Instance network_info: |[{"id": "7b49d918-5378-4d9d-b444-afa3837b57c9", "address": "fa:16:3e:35:f9:cb", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b49d918-53", "ovs_interfaceid": "7b49d918-5378-4d9d-b444-afa3837b57c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.005 187189 DEBUG nova.virt.libvirt.driver [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Start _get_guest_xml network_info=[{"id": "7b49d918-5378-4d9d-b444-afa3837b57c9", "address": "fa:16:3e:35:f9:cb", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b49d918-53", "ovs_interfaceid": "7b49d918-5378-4d9d-b444-afa3837b57c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.011 187189 WARNING nova.virt.libvirt.driver [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.016 187189 DEBUG nova.virt.libvirt.host [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.017 187189 DEBUG nova.virt.libvirt.host [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.020 187189 DEBUG nova.virt.libvirt.host [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.021 187189 DEBUG nova.virt.libvirt.host [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.024 187189 DEBUG nova.virt.libvirt.driver [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.024 187189 DEBUG nova.virt.hardware [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.025 187189 DEBUG nova.virt.hardware [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.025 187189 DEBUG nova.virt.hardware [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.025 187189 DEBUG nova.virt.hardware [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.026 187189 DEBUG nova.virt.hardware [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.026 187189 DEBUG nova.virt.hardware [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.026 187189 DEBUG nova.virt.hardware [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.026 187189 DEBUG nova.virt.hardware [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.027 187189 DEBUG nova.virt.hardware [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.027 187189 DEBUG nova.virt.hardware [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.027 187189 DEBUG nova.virt.hardware [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.032 187189 DEBUG nova.virt.libvirt.vif [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:03:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1053654635',display_name='tempest-DeleteServersTestJSON-server-1053654635',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1053654635',id=61,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98df116965b74e4a9985049062e65162',ramdisk_id='',reservation_id='r-c7vdkx3b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-1973671383',owner_user_name='tempest-DeleteServersTestJSON-
1973671383-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:03:12Z,user_data=None,user_id='4ecd161098b5422084003b39f0504a8f',uuid=fb979e7f-7c8f-4c79-846d-44c88985959a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7b49d918-5378-4d9d-b444-afa3837b57c9", "address": "fa:16:3e:35:f9:cb", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b49d918-53", "ovs_interfaceid": "7b49d918-5378-4d9d-b444-afa3837b57c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.032 187189 DEBUG nova.network.os_vif_util [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Converting VIF {"id": "7b49d918-5378-4d9d-b444-afa3837b57c9", "address": "fa:16:3e:35:f9:cb", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b49d918-53", "ovs_interfaceid": "7b49d918-5378-4d9d-b444-afa3837b57c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.033 187189 DEBUG nova.network.os_vif_util [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:f9:cb,bridge_name='br-int',has_traffic_filtering=True,id=7b49d918-5378-4d9d-b444-afa3837b57c9,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b49d918-53') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.034 187189 DEBUG nova.objects.instance [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lazy-loading 'pci_devices' on Instance uuid fb979e7f-7c8f-4c79-846d-44c88985959a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.056 187189 DEBUG nova.virt.libvirt.driver [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:03:17 compute-0 nova_compute[187185]:   <uuid>fb979e7f-7c8f-4c79-846d-44c88985959a</uuid>
Nov 29 07:03:17 compute-0 nova_compute[187185]:   <name>instance-0000003d</name>
Nov 29 07:03:17 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 07:03:17 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:03:17 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:03:17 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:       <nova:name>tempest-DeleteServersTestJSON-server-1053654635</nova:name>
Nov 29 07:03:17 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:03:17</nova:creationTime>
Nov 29 07:03:17 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 07:03:17 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 07:03:17 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:03:17 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:03:17 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:03:17 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:03:17 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:03:17 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:03:17 compute-0 nova_compute[187185]:         <nova:user uuid="4ecd161098b5422084003b39f0504a8f">tempest-DeleteServersTestJSON-1973671383-project-member</nova:user>
Nov 29 07:03:17 compute-0 nova_compute[187185]:         <nova:project uuid="98df116965b74e4a9985049062e65162">tempest-DeleteServersTestJSON-1973671383</nova:project>
Nov 29 07:03:17 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:03:17 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 07:03:17 compute-0 nova_compute[187185]:         <nova:port uuid="7b49d918-5378-4d9d-b444-afa3837b57c9">
Nov 29 07:03:17 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:03:17 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:03:17 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:03:17 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <system>
Nov 29 07:03:17 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:03:17 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:03:17 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:03:17 compute-0 nova_compute[187185]:       <entry name="serial">fb979e7f-7c8f-4c79-846d-44c88985959a</entry>
Nov 29 07:03:17 compute-0 nova_compute[187185]:       <entry name="uuid">fb979e7f-7c8f-4c79-846d-44c88985959a</entry>
Nov 29 07:03:17 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     </system>
Nov 29 07:03:17 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:03:17 compute-0 nova_compute[187185]:   <os>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:   </os>
Nov 29 07:03:17 compute-0 nova_compute[187185]:   <features>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:   </features>
Nov 29 07:03:17 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:03:17 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:03:17 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:03:17 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/fb979e7f-7c8f-4c79-846d-44c88985959a/disk"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:03:17 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/fb979e7f-7c8f-4c79-846d-44c88985959a/disk.config"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:03:17 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:35:f9:cb"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:       <target dev="tap7b49d918-53"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:03:17 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/fb979e7f-7c8f-4c79-846d-44c88985959a/console.log" append="off"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <video>
Nov 29 07:03:17 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     </video>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:03:17 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:03:17 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:03:17 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:03:17 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:03:17 compute-0 nova_compute[187185]: </domain>
Nov 29 07:03:17 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.057 187189 DEBUG nova.compute.manager [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Preparing to wait for external event network-vif-plugged-7b49d918-5378-4d9d-b444-afa3837b57c9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.058 187189 DEBUG oslo_concurrency.lockutils [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "fb979e7f-7c8f-4c79-846d-44c88985959a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.058 187189 DEBUG oslo_concurrency.lockutils [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "fb979e7f-7c8f-4c79-846d-44c88985959a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.058 187189 DEBUG oslo_concurrency.lockutils [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "fb979e7f-7c8f-4c79-846d-44c88985959a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.059 187189 DEBUG nova.virt.libvirt.vif [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:03:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1053654635',display_name='tempest-DeleteServersTestJSON-server-1053654635',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1053654635',id=61,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98df116965b74e4a9985049062e65162',ramdisk_id='',reservation_id='r-c7vdkx3b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-1973671383',owner_user_name='tempest-DeleteServer
sTestJSON-1973671383-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:03:12Z,user_data=None,user_id='4ecd161098b5422084003b39f0504a8f',uuid=fb979e7f-7c8f-4c79-846d-44c88985959a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7b49d918-5378-4d9d-b444-afa3837b57c9", "address": "fa:16:3e:35:f9:cb", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b49d918-53", "ovs_interfaceid": "7b49d918-5378-4d9d-b444-afa3837b57c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.059 187189 DEBUG nova.network.os_vif_util [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Converting VIF {"id": "7b49d918-5378-4d9d-b444-afa3837b57c9", "address": "fa:16:3e:35:f9:cb", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b49d918-53", "ovs_interfaceid": "7b49d918-5378-4d9d-b444-afa3837b57c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.060 187189 DEBUG nova.network.os_vif_util [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:f9:cb,bridge_name='br-int',has_traffic_filtering=True,id=7b49d918-5378-4d9d-b444-afa3837b57c9,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b49d918-53') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.060 187189 DEBUG os_vif [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:f9:cb,bridge_name='br-int',has_traffic_filtering=True,id=7b49d918-5378-4d9d-b444-afa3837b57c9,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b49d918-53') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.061 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.062 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.062 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.067 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.067 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7b49d918-53, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.068 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7b49d918-53, col_values=(('external_ids', {'iface-id': '7b49d918-5378-4d9d-b444-afa3837b57c9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:35:f9:cb', 'vm-uuid': 'fb979e7f-7c8f-4c79-846d-44c88985959a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.085 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:17 compute-0 NetworkManager[55227]: <info>  [1764399797.0868] manager: (tap7b49d918-53): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/79)
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.089 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.095 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.096 187189 INFO os_vif [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:f9:cb,bridge_name='br-int',has_traffic_filtering=True,id=7b49d918-5378-4d9d-b444-afa3837b57c9,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b49d918-53')
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.190 187189 DEBUG nova.virt.libvirt.driver [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.191 187189 DEBUG nova.virt.libvirt.driver [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.191 187189 DEBUG nova.virt.libvirt.driver [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] No VIF found with MAC fa:16:3e:35:f9:cb, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.192 187189 INFO nova.virt.libvirt.driver [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Using config drive
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.745 187189 INFO nova.virt.libvirt.driver [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Creating config drive at /var/lib/nova/instances/fb979e7f-7c8f-4c79-846d-44c88985959a/disk.config
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.754 187189 DEBUG oslo_concurrency.processutils [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/fb979e7f-7c8f-4c79-846d-44c88985959a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpo5dlpouu execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.882 187189 DEBUG oslo_concurrency.processutils [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/fb979e7f-7c8f-4c79-846d-44c88985959a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpo5dlpouu" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:03:17 compute-0 kernel: tap7b49d918-53: entered promiscuous mode
Nov 29 07:03:17 compute-0 NetworkManager[55227]: <info>  [1764399797.9513] manager: (tap7b49d918-53): new Tun device (/org/freedesktop/NetworkManager/Devices/80)
Nov 29 07:03:17 compute-0 ovn_controller[95281]: 2025-11-29T07:03:17Z|00141|binding|INFO|Claiming lport 7b49d918-5378-4d9d-b444-afa3837b57c9 for this chassis.
Nov 29 07:03:17 compute-0 ovn_controller[95281]: 2025-11-29T07:03:17Z|00142|binding|INFO|7b49d918-5378-4d9d-b444-afa3837b57c9: Claiming fa:16:3e:35:f9:cb 10.100.0.14
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.951 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:17 compute-0 ovn_controller[95281]: 2025-11-29T07:03:17Z|00143|binding|INFO|Setting lport 7b49d918-5378-4d9d-b444-afa3837b57c9 ovn-installed in OVS
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.965 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:17 compute-0 nova_compute[187185]: 2025-11-29 07:03:17.971 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:17 compute-0 systemd-udevd[222047]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:03:17 compute-0 systemd-machined[153486]: New machine qemu-21-instance-0000003d.
Nov 29 07:03:17 compute-0 NetworkManager[55227]: <info>  [1764399797.9917] device (tap7b49d918-53): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:03:17 compute-0 NetworkManager[55227]: <info>  [1764399797.9928] device (tap7b49d918-53): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:03:18 compute-0 systemd[1]: Started Virtual Machine qemu-21-instance-0000003d.
Nov 29 07:03:18 compute-0 ovn_controller[95281]: 2025-11-29T07:03:18Z|00144|binding|INFO|Setting lport 7b49d918-5378-4d9d-b444-afa3837b57c9 up in Southbound
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:18.003 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:35:f9:cb 10.100.0.14'], port_security=['fa:16:3e:35:f9:cb 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'fb979e7f-7c8f-4c79-846d-44c88985959a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98df116965b74e4a9985049062e65162', 'neutron:revision_number': '2', 'neutron:security_group_ids': '234720a9-9cd1-4b87-9bec-1abfe8ff0514', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e694bb30-a43a-4d18-87fa-e5c0dd8850c2, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=7b49d918-5378-4d9d-b444-afa3837b57c9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:18.004 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 7b49d918-5378-4d9d-b444-afa3837b57c9 in datapath fd9eb57e-b1f8-4bae-a60f-8e40613556cd bound to our chassis
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:18.006 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fd9eb57e-b1f8-4bae-a60f-8e40613556cd
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:18.019 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[aeab6ab1-d62a-4268-8865-be490eb8a473]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:18.020 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfd9eb57e-b1 in ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:18.023 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfd9eb57e-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:18.023 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[295cbc0a-5e35-4e1f-834e-bd1339c532ce]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:18.024 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e05c11e2-de47-4c72-aec1-6d603c2771c0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:18.036 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[8b802ead-9edf-457e-a88b-57a6c282e63c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:18.049 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[4e50db57-af61-4199-a43d-15d3548e004c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:18.078 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[b54aa95e-67a7-4a5e-985a-676b8fc62a7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:18.083 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[85a58854-d939-4f55-862f-c971fbca7503]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:18 compute-0 NetworkManager[55227]: <info>  [1764399798.0847] manager: (tapfd9eb57e-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/81)
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:18.112 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[513abbce-3809-44f1-82a0-33d95b5120e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:18.115 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[56406ed2-60d6-4f05-a733-42cd191db3c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:18 compute-0 NetworkManager[55227]: <info>  [1764399798.1354] device (tapfd9eb57e-b0): carrier: link connected
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:18.140 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[10c0792f-fe1f-496f-abf4-b19a76a3aeeb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:18.155 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d2520f5d-bd76-45c4-80c4-f45118041492]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfd9eb57e-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ec:80:ac'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 48], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 524579, 'reachable_time': 44570, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222081, 'error': None, 'target': 'ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:18.169 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[fcedb11f-75f0-4fd7-b2e5-00aaa0a9c989]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feec:80ac'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 524579, 'tstamp': 524579}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222082, 'error': None, 'target': 'ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:18.189 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[c907accb-05b8-42b1-8243-0bf9915edae1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfd9eb57e-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ec:80:ac'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 48], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 524579, 'reachable_time': 44570, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222083, 'error': None, 'target': 'ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:18.220 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[fefcb74c-9457-416c-9fb6-1ba9a5db181f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:18.277 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[f7b77afc-718a-49fc-bc56-93bf85d57638]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:18.279 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfd9eb57e-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:18.280 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:18.280 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfd9eb57e-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:03:18 compute-0 NetworkManager[55227]: <info>  [1764399798.2825] manager: (tapfd9eb57e-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/82)
Nov 29 07:03:18 compute-0 kernel: tapfd9eb57e-b0: entered promiscuous mode
Nov 29 07:03:18 compute-0 nova_compute[187185]: 2025-11-29 07:03:18.286 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:18.287 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfd9eb57e-b0, col_values=(('external_ids', {'iface-id': 'e7b4cb4f-cb6d-4f0e-8c8d-34c743671595'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:03:18 compute-0 ovn_controller[95281]: 2025-11-29T07:03:18Z|00145|binding|INFO|Releasing lport e7b4cb4f-cb6d-4f0e-8c8d-34c743671595 from this chassis (sb_readonly=0)
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:18.291 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fd9eb57e-b1f8-4bae-a60f-8e40613556cd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fd9eb57e-b1f8-4bae-a60f-8e40613556cd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:03:18 compute-0 nova_compute[187185]: 2025-11-29 07:03:18.301 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:18.299 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[c0e8d8b9-462e-4cc7-9d6a-ffc77d040220]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:18.303 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-fd9eb57e-b1f8-4bae-a60f-8e40613556cd
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/fd9eb57e-b1f8-4bae-a60f-8e40613556cd.pid.haproxy
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID fd9eb57e-b1f8-4bae-a60f-8e40613556cd
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:03:18 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:18.304 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'env', 'PROCESS_TAG=haproxy-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fd9eb57e-b1f8-4bae-a60f-8e40613556cd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:03:18 compute-0 nova_compute[187185]: 2025-11-29 07:03:18.476 187189 DEBUG nova.compute.manager [req-0b6d873e-8684-4731-bc61-472c3f15a9ae req-4b5d73ab-c64a-42fa-8842-ff50f9819e70 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Received event network-vif-plugged-7b49d918-5378-4d9d-b444-afa3837b57c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:03:18 compute-0 nova_compute[187185]: 2025-11-29 07:03:18.477 187189 DEBUG oslo_concurrency.lockutils [req-0b6d873e-8684-4731-bc61-472c3f15a9ae req-4b5d73ab-c64a-42fa-8842-ff50f9819e70 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "fb979e7f-7c8f-4c79-846d-44c88985959a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:03:18 compute-0 nova_compute[187185]: 2025-11-29 07:03:18.477 187189 DEBUG oslo_concurrency.lockutils [req-0b6d873e-8684-4731-bc61-472c3f15a9ae req-4b5d73ab-c64a-42fa-8842-ff50f9819e70 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "fb979e7f-7c8f-4c79-846d-44c88985959a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:03:18 compute-0 nova_compute[187185]: 2025-11-29 07:03:18.478 187189 DEBUG oslo_concurrency.lockutils [req-0b6d873e-8684-4731-bc61-472c3f15a9ae req-4b5d73ab-c64a-42fa-8842-ff50f9819e70 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "fb979e7f-7c8f-4c79-846d-44c88985959a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:03:18 compute-0 nova_compute[187185]: 2025-11-29 07:03:18.478 187189 DEBUG nova.compute.manager [req-0b6d873e-8684-4731-bc61-472c3f15a9ae req-4b5d73ab-c64a-42fa-8842-ff50f9819e70 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Processing event network-vif-plugged-7b49d918-5378-4d9d-b444-afa3837b57c9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 07:03:18 compute-0 podman[222115]: 2025-11-29 07:03:18.640732701 +0000 UTC m=+0.023225332 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:03:19 compute-0 nova_compute[187185]: 2025-11-29 07:03:19.027 187189 DEBUG nova.compute.manager [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:03:19 compute-0 nova_compute[187185]: 2025-11-29 07:03:19.028 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399799.0268736, fb979e7f-7c8f-4c79-846d-44c88985959a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:03:19 compute-0 nova_compute[187185]: 2025-11-29 07:03:19.028 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] VM Started (Lifecycle Event)
Nov 29 07:03:19 compute-0 nova_compute[187185]: 2025-11-29 07:03:19.033 187189 DEBUG nova.virt.libvirt.driver [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 07:03:19 compute-0 nova_compute[187185]: 2025-11-29 07:03:19.036 187189 INFO nova.virt.libvirt.driver [-] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Instance spawned successfully.
Nov 29 07:03:19 compute-0 nova_compute[187185]: 2025-11-29 07:03:19.036 187189 DEBUG nova.virt.libvirt.driver [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 07:03:19 compute-0 nova_compute[187185]: 2025-11-29 07:03:19.141 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:19 compute-0 nova_compute[187185]: 2025-11-29 07:03:19.149 187189 DEBUG nova.virt.libvirt.driver [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:03:19 compute-0 nova_compute[187185]: 2025-11-29 07:03:19.149 187189 DEBUG nova.virt.libvirt.driver [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:03:19 compute-0 nova_compute[187185]: 2025-11-29 07:03:19.150 187189 DEBUG nova.virt.libvirt.driver [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:03:19 compute-0 nova_compute[187185]: 2025-11-29 07:03:19.150 187189 DEBUG nova.virt.libvirt.driver [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:03:19 compute-0 nova_compute[187185]: 2025-11-29 07:03:19.151 187189 DEBUG nova.virt.libvirt.driver [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:03:19 compute-0 nova_compute[187185]: 2025-11-29 07:03:19.151 187189 DEBUG nova.virt.libvirt.driver [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:03:19 compute-0 nova_compute[187185]: 2025-11-29 07:03:19.157 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:03:19 compute-0 nova_compute[187185]: 2025-11-29 07:03:19.163 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:03:19 compute-0 nova_compute[187185]: 2025-11-29 07:03:19.233 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:03:19 compute-0 nova_compute[187185]: 2025-11-29 07:03:19.234 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399799.0271845, fb979e7f-7c8f-4c79-846d-44c88985959a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:03:19 compute-0 nova_compute[187185]: 2025-11-29 07:03:19.234 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] VM Paused (Lifecycle Event)
Nov 29 07:03:19 compute-0 podman[222115]: 2025-11-29 07:03:19.369001467 +0000 UTC m=+0.751494098 container create 96f6af69a7bc1cde33a78af108f8e1ee1efbed92d586151f30f0d2bf715251a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Nov 29 07:03:19 compute-0 nova_compute[187185]: 2025-11-29 07:03:19.499 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:03:19 compute-0 nova_compute[187185]: 2025-11-29 07:03:19.503 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399799.032722, fb979e7f-7c8f-4c79-846d-44c88985959a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:03:19 compute-0 nova_compute[187185]: 2025-11-29 07:03:19.503 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] VM Resumed (Lifecycle Event)
Nov 29 07:03:19 compute-0 nova_compute[187185]: 2025-11-29 07:03:19.552 187189 INFO nova.compute.manager [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Took 6.78 seconds to spawn the instance on the hypervisor.
Nov 29 07:03:19 compute-0 nova_compute[187185]: 2025-11-29 07:03:19.552 187189 DEBUG nova.compute.manager [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:03:19 compute-0 nova_compute[187185]: 2025-11-29 07:03:19.558 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:03:19 compute-0 nova_compute[187185]: 2025-11-29 07:03:19.561 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:03:19 compute-0 nova_compute[187185]: 2025-11-29 07:03:19.600 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:03:19 compute-0 systemd[1]: Started libpod-conmon-96f6af69a7bc1cde33a78af108f8e1ee1efbed92d586151f30f0d2bf715251a9.scope.
Nov 29 07:03:19 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:03:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/30656485305ba70d34a8738a3aef2ad248440203fdcd5bc964a8ebd86d3b541d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:03:19 compute-0 nova_compute[187185]: 2025-11-29 07:03:19.672 187189 INFO nova.compute.manager [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Took 7.52 seconds to build instance.
Nov 29 07:03:19 compute-0 nova_compute[187185]: 2025-11-29 07:03:19.702 187189 DEBUG oslo_concurrency.lockutils [None req-0e407b4c-a314-4549-b64d-2ccfff006566 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "fb979e7f-7c8f-4c79-846d-44c88985959a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.622s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:03:19 compute-0 podman[222115]: 2025-11-29 07:03:19.799142495 +0000 UTC m=+1.181635146 container init 96f6af69a7bc1cde33a78af108f8e1ee1efbed92d586151f30f0d2bf715251a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 07:03:19 compute-0 podman[222115]: 2025-11-29 07:03:19.806975594 +0000 UTC m=+1.189468225 container start 96f6af69a7bc1cde33a78af108f8e1ee1efbed92d586151f30f0d2bf715251a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Nov 29 07:03:19 compute-0 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[222154]: [NOTICE]   (222167) : New worker (222169) forked
Nov 29 07:03:19 compute-0 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[222154]: [NOTICE]   (222167) : Loading success.
Nov 29 07:03:20 compute-0 podman[222134]: 2025-11-29 07:03:20.011647222 +0000 UTC m=+0.608501289 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 07:03:20 compute-0 nova_compute[187185]: 2025-11-29 07:03:20.739 187189 DEBUG nova.compute.manager [req-fdeda677-a5bf-4e3a-bc94-cb247cd8fa81 req-57c07b5b-ca6b-4d00-8d7a-0da2fff62d24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Received event network-vif-plugged-7b49d918-5378-4d9d-b444-afa3837b57c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:03:20 compute-0 nova_compute[187185]: 2025-11-29 07:03:20.739 187189 DEBUG oslo_concurrency.lockutils [req-fdeda677-a5bf-4e3a-bc94-cb247cd8fa81 req-57c07b5b-ca6b-4d00-8d7a-0da2fff62d24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "fb979e7f-7c8f-4c79-846d-44c88985959a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:03:20 compute-0 nova_compute[187185]: 2025-11-29 07:03:20.739 187189 DEBUG oslo_concurrency.lockutils [req-fdeda677-a5bf-4e3a-bc94-cb247cd8fa81 req-57c07b5b-ca6b-4d00-8d7a-0da2fff62d24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "fb979e7f-7c8f-4c79-846d-44c88985959a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:03:20 compute-0 nova_compute[187185]: 2025-11-29 07:03:20.740 187189 DEBUG oslo_concurrency.lockutils [req-fdeda677-a5bf-4e3a-bc94-cb247cd8fa81 req-57c07b5b-ca6b-4d00-8d7a-0da2fff62d24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "fb979e7f-7c8f-4c79-846d-44c88985959a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:03:20 compute-0 nova_compute[187185]: 2025-11-29 07:03:20.740 187189 DEBUG nova.compute.manager [req-fdeda677-a5bf-4e3a-bc94-cb247cd8fa81 req-57c07b5b-ca6b-4d00-8d7a-0da2fff62d24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] No waiting events found dispatching network-vif-plugged-7b49d918-5378-4d9d-b444-afa3837b57c9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:03:20 compute-0 nova_compute[187185]: 2025-11-29 07:03:20.740 187189 WARNING nova.compute.manager [req-fdeda677-a5bf-4e3a-bc94-cb247cd8fa81 req-57c07b5b-ca6b-4d00-8d7a-0da2fff62d24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Received unexpected event network-vif-plugged-7b49d918-5378-4d9d-b444-afa3837b57c9 for instance with vm_state active and task_state None.
Nov 29 07:03:22 compute-0 nova_compute[187185]: 2025-11-29 07:03:22.085 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:24 compute-0 nova_compute[187185]: 2025-11-29 07:03:24.143 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:24.823 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:03:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:24.824 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:03:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:24.825 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:03:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:25.722 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:03:25 compute-0 nova_compute[187185]: 2025-11-29 07:03:25.722 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:25.723 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:03:25 compute-0 nova_compute[187185]: 2025-11-29 07:03:25.927 187189 DEBUG nova.objects.instance [None req-613d07dc-caaa-4156-8c8c-61490ba79f3f 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lazy-loading 'pci_devices' on Instance uuid fb979e7f-7c8f-4c79-846d-44c88985959a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:03:25 compute-0 nova_compute[187185]: 2025-11-29 07:03:25.957 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399805.9575708, fb979e7f-7c8f-4c79-846d-44c88985959a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:03:25 compute-0 nova_compute[187185]: 2025-11-29 07:03:25.958 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] VM Paused (Lifecycle Event)
Nov 29 07:03:26 compute-0 nova_compute[187185]: 2025-11-29 07:03:26.095 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:03:26 compute-0 nova_compute[187185]: 2025-11-29 07:03:26.100 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:03:26 compute-0 nova_compute[187185]: 2025-11-29 07:03:26.121 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] During sync_power_state the instance has a pending task (suspending). Skip.
Nov 29 07:03:26 compute-0 podman[222183]: 2025-11-29 07:03:26.807451685 +0000 UTC m=+0.063789278 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 07:03:26 compute-0 podman[222182]: 2025-11-29 07:03:26.807795915 +0000 UTC m=+0.069872919 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 07:03:27 compute-0 nova_compute[187185]: 2025-11-29 07:03:27.132 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:28 compute-0 kernel: tap7b49d918-53 (unregistering): left promiscuous mode
Nov 29 07:03:28 compute-0 NetworkManager[55227]: <info>  [1764399808.9446] device (tap7b49d918-53): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:03:28 compute-0 ovn_controller[95281]: 2025-11-29T07:03:28Z|00146|binding|INFO|Releasing lport 7b49d918-5378-4d9d-b444-afa3837b57c9 from this chassis (sb_readonly=0)
Nov 29 07:03:28 compute-0 ovn_controller[95281]: 2025-11-29T07:03:28Z|00147|binding|INFO|Setting lport 7b49d918-5378-4d9d-b444-afa3837b57c9 down in Southbound
Nov 29 07:03:28 compute-0 nova_compute[187185]: 2025-11-29 07:03:28.959 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:28 compute-0 ovn_controller[95281]: 2025-11-29T07:03:28Z|00148|binding|INFO|Removing iface tap7b49d918-53 ovn-installed in OVS
Nov 29 07:03:28 compute-0 nova_compute[187185]: 2025-11-29 07:03:28.961 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:28 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:28.977 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:35:f9:cb 10.100.0.14'], port_security=['fa:16:3e:35:f9:cb 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'fb979e7f-7c8f-4c79-846d-44c88985959a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98df116965b74e4a9985049062e65162', 'neutron:revision_number': '4', 'neutron:security_group_ids': '234720a9-9cd1-4b87-9bec-1abfe8ff0514', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e694bb30-a43a-4d18-87fa-e5c0dd8850c2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=7b49d918-5378-4d9d-b444-afa3837b57c9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:03:28 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:28.980 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 7b49d918-5378-4d9d-b444-afa3837b57c9 in datapath fd9eb57e-b1f8-4bae-a60f-8e40613556cd unbound from our chassis
Nov 29 07:03:28 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:28.985 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fd9eb57e-b1f8-4bae-a60f-8e40613556cd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:03:28 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:28.986 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[430e4d8a-4891-4c5e-801b-9d059a694cb1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:28 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:28.987 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd namespace which is not needed anymore
Nov 29 07:03:28 compute-0 nova_compute[187185]: 2025-11-29 07:03:28.996 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:29 compute-0 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000003d.scope: Deactivated successfully.
Nov 29 07:03:29 compute-0 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000003d.scope: Consumed 7.967s CPU time.
Nov 29 07:03:29 compute-0 systemd-machined[153486]: Machine qemu-21-instance-0000003d terminated.
Nov 29 07:03:29 compute-0 nova_compute[187185]: 2025-11-29 07:03:29.129 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:29 compute-0 nova_compute[187185]: 2025-11-29 07:03:29.132 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:29 compute-0 nova_compute[187185]: 2025-11-29 07:03:29.145 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:29 compute-0 nova_compute[187185]: 2025-11-29 07:03:29.164 187189 DEBUG nova.compute.manager [None req-613d07dc-caaa-4156-8c8c-61490ba79f3f 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:03:29 compute-0 nova_compute[187185]: 2025-11-29 07:03:29.241 187189 DEBUG nova.compute.manager [req-dca145c8-9ee7-4691-9900-e8affac10a74 req-e9a9eccf-ff5e-4527-bebb-314254e7d54b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Received event network-vif-unplugged-7b49d918-5378-4d9d-b444-afa3837b57c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:03:29 compute-0 nova_compute[187185]: 2025-11-29 07:03:29.242 187189 DEBUG oslo_concurrency.lockutils [req-dca145c8-9ee7-4691-9900-e8affac10a74 req-e9a9eccf-ff5e-4527-bebb-314254e7d54b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "fb979e7f-7c8f-4c79-846d-44c88985959a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:03:29 compute-0 nova_compute[187185]: 2025-11-29 07:03:29.242 187189 DEBUG oslo_concurrency.lockutils [req-dca145c8-9ee7-4691-9900-e8affac10a74 req-e9a9eccf-ff5e-4527-bebb-314254e7d54b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "fb979e7f-7c8f-4c79-846d-44c88985959a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:03:29 compute-0 nova_compute[187185]: 2025-11-29 07:03:29.242 187189 DEBUG oslo_concurrency.lockutils [req-dca145c8-9ee7-4691-9900-e8affac10a74 req-e9a9eccf-ff5e-4527-bebb-314254e7d54b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "fb979e7f-7c8f-4c79-846d-44c88985959a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:03:29 compute-0 nova_compute[187185]: 2025-11-29 07:03:29.242 187189 DEBUG nova.compute.manager [req-dca145c8-9ee7-4691-9900-e8affac10a74 req-e9a9eccf-ff5e-4527-bebb-314254e7d54b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] No waiting events found dispatching network-vif-unplugged-7b49d918-5378-4d9d-b444-afa3837b57c9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:03:29 compute-0 nova_compute[187185]: 2025-11-29 07:03:29.242 187189 WARNING nova.compute.manager [req-dca145c8-9ee7-4691-9900-e8affac10a74 req-e9a9eccf-ff5e-4527-bebb-314254e7d54b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Received unexpected event network-vif-unplugged-7b49d918-5378-4d9d-b444-afa3837b57c9 for instance with vm_state suspended and task_state None.
Nov 29 07:03:29 compute-0 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[222154]: [NOTICE]   (222167) : haproxy version is 2.8.14-c23fe91
Nov 29 07:03:29 compute-0 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[222154]: [NOTICE]   (222167) : path to executable is /usr/sbin/haproxy
Nov 29 07:03:29 compute-0 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[222154]: [WARNING]  (222167) : Exiting Master process...
Nov 29 07:03:29 compute-0 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[222154]: [ALERT]    (222167) : Current worker (222169) exited with code 143 (Terminated)
Nov 29 07:03:29 compute-0 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[222154]: [WARNING]  (222167) : All workers exited. Exiting... (0)
Nov 29 07:03:29 compute-0 systemd[1]: libpod-96f6af69a7bc1cde33a78af108f8e1ee1efbed92d586151f30f0d2bf715251a9.scope: Deactivated successfully.
Nov 29 07:03:29 compute-0 podman[222251]: 2025-11-29 07:03:29.344848569 +0000 UTC m=+0.253573650 container died 96f6af69a7bc1cde33a78af108f8e1ee1efbed92d586151f30f0d2bf715251a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:03:30 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-96f6af69a7bc1cde33a78af108f8e1ee1efbed92d586151f30f0d2bf715251a9-userdata-shm.mount: Deactivated successfully.
Nov 29 07:03:30 compute-0 systemd[1]: var-lib-containers-storage-overlay-30656485305ba70d34a8738a3aef2ad248440203fdcd5bc964a8ebd86d3b541d-merged.mount: Deactivated successfully.
Nov 29 07:03:30 compute-0 podman[222296]: 2025-11-29 07:03:30.298627418 +0000 UTC m=+0.057002899 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:03:30 compute-0 nova_compute[187185]: 2025-11-29 07:03:30.814 187189 DEBUG oslo_concurrency.lockutils [None req-90495342-224d-478b-89a8-606a2a554b9f 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "fb979e7f-7c8f-4c79-846d-44c88985959a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:03:30 compute-0 nova_compute[187185]: 2025-11-29 07:03:30.814 187189 DEBUG oslo_concurrency.lockutils [None req-90495342-224d-478b-89a8-606a2a554b9f 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "fb979e7f-7c8f-4c79-846d-44c88985959a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:03:30 compute-0 nova_compute[187185]: 2025-11-29 07:03:30.815 187189 DEBUG oslo_concurrency.lockutils [None req-90495342-224d-478b-89a8-606a2a554b9f 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "fb979e7f-7c8f-4c79-846d-44c88985959a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:03:30 compute-0 nova_compute[187185]: 2025-11-29 07:03:30.815 187189 DEBUG oslo_concurrency.lockutils [None req-90495342-224d-478b-89a8-606a2a554b9f 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "fb979e7f-7c8f-4c79-846d-44c88985959a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:03:30 compute-0 nova_compute[187185]: 2025-11-29 07:03:30.815 187189 DEBUG oslo_concurrency.lockutils [None req-90495342-224d-478b-89a8-606a2a554b9f 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "fb979e7f-7c8f-4c79-846d-44c88985959a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:03:30 compute-0 nova_compute[187185]: 2025-11-29 07:03:30.830 187189 INFO nova.compute.manager [None req-90495342-224d-478b-89a8-606a2a554b9f 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Terminating instance
Nov 29 07:03:30 compute-0 nova_compute[187185]: 2025-11-29 07:03:30.847 187189 DEBUG nova.compute.manager [None req-90495342-224d-478b-89a8-606a2a554b9f 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 07:03:30 compute-0 nova_compute[187185]: 2025-11-29 07:03:30.854 187189 INFO nova.virt.libvirt.driver [-] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Instance destroyed successfully.
Nov 29 07:03:30 compute-0 nova_compute[187185]: 2025-11-29 07:03:30.854 187189 DEBUG nova.objects.instance [None req-90495342-224d-478b-89a8-606a2a554b9f 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lazy-loading 'resources' on Instance uuid fb979e7f-7c8f-4c79-846d-44c88985959a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:03:30 compute-0 nova_compute[187185]: 2025-11-29 07:03:30.872 187189 DEBUG nova.virt.libvirt.vif [None req-90495342-224d-478b-89a8-606a2a554b9f 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:03:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1053654635',display_name='tempest-DeleteServersTestJSON-server-1053654635',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1053654635',id=61,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:03:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='98df116965b74e4a9985049062e65162',ramdisk_id='',reservation_id='r-c7vdkx3b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-1973671383',owner_user_name='tempest-DeleteServersTestJSON-1973671383-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:03:29Z,user_data=None,user_id='4ecd161098b5422084003b39f0504a8f',uuid=fb979e7f-7c8f-4c79-846d-44c88985959a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "7b49d918-5378-4d9d-b444-afa3837b57c9", "address": "fa:16:3e:35:f9:cb", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b49d918-53", "ovs_interfaceid": "7b49d918-5378-4d9d-b444-afa3837b57c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:03:30 compute-0 nova_compute[187185]: 2025-11-29 07:03:30.872 187189 DEBUG nova.network.os_vif_util [None req-90495342-224d-478b-89a8-606a2a554b9f 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Converting VIF {"id": "7b49d918-5378-4d9d-b444-afa3837b57c9", "address": "fa:16:3e:35:f9:cb", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b49d918-53", "ovs_interfaceid": "7b49d918-5378-4d9d-b444-afa3837b57c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:03:30 compute-0 nova_compute[187185]: 2025-11-29 07:03:30.873 187189 DEBUG nova.network.os_vif_util [None req-90495342-224d-478b-89a8-606a2a554b9f 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:f9:cb,bridge_name='br-int',has_traffic_filtering=True,id=7b49d918-5378-4d9d-b444-afa3837b57c9,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b49d918-53') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:03:30 compute-0 nova_compute[187185]: 2025-11-29 07:03:30.873 187189 DEBUG os_vif [None req-90495342-224d-478b-89a8-606a2a554b9f 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:f9:cb,bridge_name='br-int',has_traffic_filtering=True,id=7b49d918-5378-4d9d-b444-afa3837b57c9,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b49d918-53') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:03:30 compute-0 nova_compute[187185]: 2025-11-29 07:03:30.875 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:30 compute-0 nova_compute[187185]: 2025-11-29 07:03:30.876 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b49d918-53, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:03:30 compute-0 nova_compute[187185]: 2025-11-29 07:03:30.877 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:30 compute-0 nova_compute[187185]: 2025-11-29 07:03:30.878 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:30 compute-0 nova_compute[187185]: 2025-11-29 07:03:30.880 187189 INFO os_vif [None req-90495342-224d-478b-89a8-606a2a554b9f 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:f9:cb,bridge_name='br-int',has_traffic_filtering=True,id=7b49d918-5378-4d9d-b444-afa3837b57c9,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b49d918-53')
Nov 29 07:03:30 compute-0 nova_compute[187185]: 2025-11-29 07:03:30.881 187189 INFO nova.virt.libvirt.driver [None req-90495342-224d-478b-89a8-606a2a554b9f 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Deleting instance files /var/lib/nova/instances/fb979e7f-7c8f-4c79-846d-44c88985959a_del
Nov 29 07:03:30 compute-0 nova_compute[187185]: 2025-11-29 07:03:30.882 187189 INFO nova.virt.libvirt.driver [None req-90495342-224d-478b-89a8-606a2a554b9f 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Deletion of /var/lib/nova/instances/fb979e7f-7c8f-4c79-846d-44c88985959a_del complete
Nov 29 07:03:30 compute-0 podman[222251]: 2025-11-29 07:03:30.966174351 +0000 UTC m=+1.874899432 container cleanup 96f6af69a7bc1cde33a78af108f8e1ee1efbed92d586151f30f0d2bf715251a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 29 07:03:30 compute-0 systemd[1]: libpod-conmon-96f6af69a7bc1cde33a78af108f8e1ee1efbed92d586151f30f0d2bf715251a9.scope: Deactivated successfully.
Nov 29 07:03:31 compute-0 nova_compute[187185]: 2025-11-29 07:03:31.010 187189 INFO nova.compute.manager [None req-90495342-224d-478b-89a8-606a2a554b9f 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Took 0.16 seconds to destroy the instance on the hypervisor.
Nov 29 07:03:31 compute-0 nova_compute[187185]: 2025-11-29 07:03:31.011 187189 DEBUG oslo.service.loopingcall [None req-90495342-224d-478b-89a8-606a2a554b9f 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 07:03:31 compute-0 nova_compute[187185]: 2025-11-29 07:03:31.011 187189 DEBUG nova.compute.manager [-] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 07:03:31 compute-0 nova_compute[187185]: 2025-11-29 07:03:31.011 187189 DEBUG nova.network.neutron [-] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 07:03:31 compute-0 nova_compute[187185]: 2025-11-29 07:03:31.357 187189 DEBUG nova.compute.manager [req-6290b384-ec03-4444-a0b9-7326ae6cab5c req-edfeac11-b6f3-4027-b9c7-6609ec001451 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Received event network-vif-plugged-7b49d918-5378-4d9d-b444-afa3837b57c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:03:31 compute-0 nova_compute[187185]: 2025-11-29 07:03:31.357 187189 DEBUG oslo_concurrency.lockutils [req-6290b384-ec03-4444-a0b9-7326ae6cab5c req-edfeac11-b6f3-4027-b9c7-6609ec001451 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "fb979e7f-7c8f-4c79-846d-44c88985959a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:03:31 compute-0 nova_compute[187185]: 2025-11-29 07:03:31.358 187189 DEBUG oslo_concurrency.lockutils [req-6290b384-ec03-4444-a0b9-7326ae6cab5c req-edfeac11-b6f3-4027-b9c7-6609ec001451 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "fb979e7f-7c8f-4c79-846d-44c88985959a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:03:31 compute-0 nova_compute[187185]: 2025-11-29 07:03:31.358 187189 DEBUG oslo_concurrency.lockutils [req-6290b384-ec03-4444-a0b9-7326ae6cab5c req-edfeac11-b6f3-4027-b9c7-6609ec001451 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "fb979e7f-7c8f-4c79-846d-44c88985959a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:03:31 compute-0 nova_compute[187185]: 2025-11-29 07:03:31.359 187189 DEBUG nova.compute.manager [req-6290b384-ec03-4444-a0b9-7326ae6cab5c req-edfeac11-b6f3-4027-b9c7-6609ec001451 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] No waiting events found dispatching network-vif-plugged-7b49d918-5378-4d9d-b444-afa3837b57c9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:03:31 compute-0 nova_compute[187185]: 2025-11-29 07:03:31.359 187189 WARNING nova.compute.manager [req-6290b384-ec03-4444-a0b9-7326ae6cab5c req-edfeac11-b6f3-4027-b9c7-6609ec001451 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Received unexpected event network-vif-plugged-7b49d918-5378-4d9d-b444-afa3837b57c9 for instance with vm_state suspended and task_state deleting.
Nov 29 07:03:32 compute-0 podman[222318]: 2025-11-29 07:03:32.182978313 +0000 UTC m=+1.192200673 container remove 96f6af69a7bc1cde33a78af108f8e1ee1efbed92d586151f30f0d2bf715251a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 07:03:32 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:32.191 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[7bd82057-c34a-44b4-a27e-4295f8f261fb]: (4, ('Sat Nov 29 07:03:29 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd (96f6af69a7bc1cde33a78af108f8e1ee1efbed92d586151f30f0d2bf715251a9)\n96f6af69a7bc1cde33a78af108f8e1ee1efbed92d586151f30f0d2bf715251a9\nSat Nov 29 07:03:30 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd (96f6af69a7bc1cde33a78af108f8e1ee1efbed92d586151f30f0d2bf715251a9)\n96f6af69a7bc1cde33a78af108f8e1ee1efbed92d586151f30f0d2bf715251a9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:32 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:32.194 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[dc86209c-45ca-4aaf-9a2a-b60cbb0d4826]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:32 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:32.195 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfd9eb57e-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:03:32 compute-0 nova_compute[187185]: 2025-11-29 07:03:32.197 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:32 compute-0 kernel: tapfd9eb57e-b0: left promiscuous mode
Nov 29 07:03:32 compute-0 nova_compute[187185]: 2025-11-29 07:03:32.223 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:32 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:32.226 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[fc55fb51-f480-4893-8e0d-8f927a9c5352]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:32 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:32.247 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6bdce8c9-7c0f-4c4b-b316-2535e8e84b0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:32 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:32.251 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[457aa433-4aa4-4b95-986d-6e2b00cbf1d1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:32 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:32.275 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[91303aa2-2006-4b0e-ab51-a1403258f2e3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 524573, 'reachable_time': 28057, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222330, 'error': None, 'target': 'ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:32 compute-0 systemd[1]: run-netns-ovnmeta\x2dfd9eb57e\x2db1f8\x2d4bae\x2da60f\x2d8e40613556cd.mount: Deactivated successfully.
Nov 29 07:03:32 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:32.280 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:03:32 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:32.281 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[7ee601c8-bc1f-4e98-bd85-7cad59bb1f3c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:32 compute-0 nova_compute[187185]: 2025-11-29 07:03:32.452 187189 DEBUG nova.network.neutron [-] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:03:32 compute-0 nova_compute[187185]: 2025-11-29 07:03:32.476 187189 INFO nova.compute.manager [-] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Took 1.46 seconds to deallocate network for instance.
Nov 29 07:03:32 compute-0 nova_compute[187185]: 2025-11-29 07:03:32.540 187189 DEBUG nova.compute.manager [req-b9654c5c-acc3-4bc2-9826-43fe264d441b req-9dbc7475-a7fa-471d-b158-1a1c0d30a009 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Received event network-vif-deleted-7b49d918-5378-4d9d-b444-afa3837b57c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:03:32 compute-0 nova_compute[187185]: 2025-11-29 07:03:32.600 187189 DEBUG oslo_concurrency.lockutils [None req-90495342-224d-478b-89a8-606a2a554b9f 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:03:32 compute-0 nova_compute[187185]: 2025-11-29 07:03:32.600 187189 DEBUG oslo_concurrency.lockutils [None req-90495342-224d-478b-89a8-606a2a554b9f 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:03:32 compute-0 nova_compute[187185]: 2025-11-29 07:03:32.681 187189 DEBUG nova.compute.provider_tree [None req-90495342-224d-478b-89a8-606a2a554b9f 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:03:32 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:32.725 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:03:32 compute-0 nova_compute[187185]: 2025-11-29 07:03:32.747 187189 DEBUG nova.scheduler.client.report [None req-90495342-224d-478b-89a8-606a2a554b9f 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:03:32 compute-0 nova_compute[187185]: 2025-11-29 07:03:32.775 187189 DEBUG oslo_concurrency.lockutils [None req-90495342-224d-478b-89a8-606a2a554b9f 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.175s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:03:32 compute-0 nova_compute[187185]: 2025-11-29 07:03:32.798 187189 INFO nova.scheduler.client.report [None req-90495342-224d-478b-89a8-606a2a554b9f 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Deleted allocations for instance fb979e7f-7c8f-4c79-846d-44c88985959a
Nov 29 07:03:32 compute-0 nova_compute[187185]: 2025-11-29 07:03:32.894 187189 DEBUG oslo_concurrency.lockutils [None req-90495342-224d-478b-89a8-606a2a554b9f 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "fb979e7f-7c8f-4c79-846d-44c88985959a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.079s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:03:34 compute-0 nova_compute[187185]: 2025-11-29 07:03:34.187 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:35 compute-0 nova_compute[187185]: 2025-11-29 07:03:35.879 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:37 compute-0 nova_compute[187185]: 2025-11-29 07:03:37.082 187189 DEBUG oslo_concurrency.lockutils [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:03:37 compute-0 nova_compute[187185]: 2025-11-29 07:03:37.082 187189 DEBUG oslo_concurrency.lockutils [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:03:37 compute-0 nova_compute[187185]: 2025-11-29 07:03:37.097 187189 DEBUG nova.compute.manager [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 07:03:37 compute-0 nova_compute[187185]: 2025-11-29 07:03:37.210 187189 DEBUG oslo_concurrency.lockutils [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:03:37 compute-0 nova_compute[187185]: 2025-11-29 07:03:37.211 187189 DEBUG oslo_concurrency.lockutils [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:03:37 compute-0 nova_compute[187185]: 2025-11-29 07:03:37.219 187189 DEBUG nova.virt.hardware [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 07:03:37 compute-0 nova_compute[187185]: 2025-11-29 07:03:37.219 187189 INFO nova.compute.claims [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Claim successful on node compute-0.ctlplane.example.com
Nov 29 07:03:37 compute-0 nova_compute[187185]: 2025-11-29 07:03:37.409 187189 DEBUG nova.compute.provider_tree [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:03:37 compute-0 nova_compute[187185]: 2025-11-29 07:03:37.607 187189 DEBUG nova.scheduler.client.report [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:03:37 compute-0 nova_compute[187185]: 2025-11-29 07:03:37.742 187189 DEBUG oslo_concurrency.lockutils [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.531s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:03:37 compute-0 nova_compute[187185]: 2025-11-29 07:03:37.743 187189 DEBUG nova.compute.manager [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 07:03:37 compute-0 nova_compute[187185]: 2025-11-29 07:03:37.791 187189 DEBUG oslo_concurrency.lockutils [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "02b37f0c-3272-417b-9791-48b555f68d56" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:03:37 compute-0 nova_compute[187185]: 2025-11-29 07:03:37.792 187189 DEBUG oslo_concurrency.lockutils [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "02b37f0c-3272-417b-9791-48b555f68d56" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:03:37 compute-0 nova_compute[187185]: 2025-11-29 07:03:37.823 187189 DEBUG nova.compute.manager [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 07:03:37 compute-0 nova_compute[187185]: 2025-11-29 07:03:37.828 187189 DEBUG nova.compute.manager [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 07:03:37 compute-0 nova_compute[187185]: 2025-11-29 07:03:37.829 187189 DEBUG nova.network.neutron [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 07:03:37 compute-0 nova_compute[187185]: 2025-11-29 07:03:37.867 187189 INFO nova.virt.libvirt.driver [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 07:03:37 compute-0 nova_compute[187185]: 2025-11-29 07:03:37.920 187189 DEBUG nova.compute.manager [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 07:03:37 compute-0 nova_compute[187185]: 2025-11-29 07:03:37.993 187189 DEBUG oslo_concurrency.lockutils [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:03:37 compute-0 nova_compute[187185]: 2025-11-29 07:03:37.993 187189 DEBUG oslo_concurrency.lockutils [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:03:38 compute-0 nova_compute[187185]: 2025-11-29 07:03:38.006 187189 DEBUG nova.virt.hardware [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 07:03:38 compute-0 nova_compute[187185]: 2025-11-29 07:03:38.007 187189 INFO nova.compute.claims [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Claim successful on node compute-0.ctlplane.example.com
Nov 29 07:03:38 compute-0 nova_compute[187185]: 2025-11-29 07:03:38.099 187189 DEBUG nova.compute.manager [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 07:03:38 compute-0 nova_compute[187185]: 2025-11-29 07:03:38.101 187189 DEBUG nova.virt.libvirt.driver [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 07:03:38 compute-0 nova_compute[187185]: 2025-11-29 07:03:38.101 187189 INFO nova.virt.libvirt.driver [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Creating image(s)
Nov 29 07:03:38 compute-0 nova_compute[187185]: 2025-11-29 07:03:38.101 187189 DEBUG oslo_concurrency.lockutils [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "/var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:03:38 compute-0 nova_compute[187185]: 2025-11-29 07:03:38.102 187189 DEBUG oslo_concurrency.lockutils [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "/var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:03:38 compute-0 nova_compute[187185]: 2025-11-29 07:03:38.102 187189 DEBUG oslo_concurrency.lockutils [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "/var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:03:38 compute-0 nova_compute[187185]: 2025-11-29 07:03:38.115 187189 DEBUG oslo_concurrency.processutils [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:03:38 compute-0 nova_compute[187185]: 2025-11-29 07:03:38.137 187189 DEBUG nova.policy [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 07:03:38 compute-0 nova_compute[187185]: 2025-11-29 07:03:38.193 187189 DEBUG oslo_concurrency.processutils [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:03:38 compute-0 nova_compute[187185]: 2025-11-29 07:03:38.194 187189 DEBUG oslo_concurrency.lockutils [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:03:38 compute-0 nova_compute[187185]: 2025-11-29 07:03:38.194 187189 DEBUG oslo_concurrency.lockutils [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:03:38 compute-0 nova_compute[187185]: 2025-11-29 07:03:38.211 187189 DEBUG oslo_concurrency.processutils [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:03:38 compute-0 nova_compute[187185]: 2025-11-29 07:03:38.239 187189 DEBUG nova.compute.provider_tree [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:03:38 compute-0 nova_compute[187185]: 2025-11-29 07:03:38.262 187189 DEBUG nova.scheduler.client.report [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:03:38 compute-0 nova_compute[187185]: 2025-11-29 07:03:38.270 187189 DEBUG oslo_concurrency.processutils [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:03:38 compute-0 nova_compute[187185]: 2025-11-29 07:03:38.271 187189 DEBUG oslo_concurrency.processutils [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:03:38 compute-0 nova_compute[187185]: 2025-11-29 07:03:38.298 187189 DEBUG oslo_concurrency.lockutils [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.305s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:03:38 compute-0 nova_compute[187185]: 2025-11-29 07:03:38.300 187189 DEBUG nova.compute.manager [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 07:03:38 compute-0 nova_compute[187185]: 2025-11-29 07:03:38.376 187189 DEBUG nova.compute.manager [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 07:03:38 compute-0 nova_compute[187185]: 2025-11-29 07:03:38.377 187189 DEBUG nova.network.neutron [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 07:03:38 compute-0 nova_compute[187185]: 2025-11-29 07:03:38.423 187189 INFO nova.virt.libvirt.driver [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 07:03:38 compute-0 nova_compute[187185]: 2025-11-29 07:03:38.450 187189 DEBUG nova.compute.manager [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 07:03:38 compute-0 nova_compute[187185]: 2025-11-29 07:03:38.593 187189 DEBUG nova.compute.manager [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 07:03:38 compute-0 nova_compute[187185]: 2025-11-29 07:03:38.597 187189 DEBUG nova.virt.libvirt.driver [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 07:03:38 compute-0 nova_compute[187185]: 2025-11-29 07:03:38.598 187189 INFO nova.virt.libvirt.driver [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Creating image(s)
Nov 29 07:03:38 compute-0 nova_compute[187185]: 2025-11-29 07:03:38.599 187189 DEBUG oslo_concurrency.lockutils [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "/var/lib/nova/instances/02b37f0c-3272-417b-9791-48b555f68d56/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:03:38 compute-0 nova_compute[187185]: 2025-11-29 07:03:38.599 187189 DEBUG oslo_concurrency.lockutils [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "/var/lib/nova/instances/02b37f0c-3272-417b-9791-48b555f68d56/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:03:38 compute-0 nova_compute[187185]: 2025-11-29 07:03:38.601 187189 DEBUG oslo_concurrency.lockutils [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "/var/lib/nova/instances/02b37f0c-3272-417b-9791-48b555f68d56/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:03:38 compute-0 nova_compute[187185]: 2025-11-29 07:03:38.628 187189 DEBUG oslo_concurrency.processutils [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:03:38 compute-0 nova_compute[187185]: 2025-11-29 07:03:38.700 187189 DEBUG oslo_concurrency.processutils [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:03:38 compute-0 nova_compute[187185]: 2025-11-29 07:03:38.702 187189 DEBUG oslo_concurrency.lockutils [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:03:38 compute-0 nova_compute[187185]: 2025-11-29 07:03:38.937 187189 DEBUG nova.policy [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 07:03:38 compute-0 nova_compute[187185]: 2025-11-29 07:03:38.943 187189 DEBUG nova.network.neutron [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Successfully created port: 6a0ff3c3-e368-4504-9884-40716725c901 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 07:03:39 compute-0 nova_compute[187185]: 2025-11-29 07:03:39.191 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:39 compute-0 nova_compute[187185]: 2025-11-29 07:03:39.767 187189 DEBUG oslo_concurrency.processutils [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk 1073741824" returned: 0 in 1.496s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:03:39 compute-0 nova_compute[187185]: 2025-11-29 07:03:39.768 187189 DEBUG oslo_concurrency.lockutils [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 1.573s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:03:39 compute-0 nova_compute[187185]: 2025-11-29 07:03:39.768 187189 DEBUG oslo_concurrency.processutils [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:03:39 compute-0 nova_compute[187185]: 2025-11-29 07:03:39.796 187189 DEBUG oslo_concurrency.lockutils [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 1.095s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:03:39 compute-0 nova_compute[187185]: 2025-11-29 07:03:39.810 187189 DEBUG oslo_concurrency.processutils [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:03:39 compute-0 nova_compute[187185]: 2025-11-29 07:03:39.864 187189 DEBUG oslo_concurrency.processutils [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:03:39 compute-0 nova_compute[187185]: 2025-11-29 07:03:39.865 187189 DEBUG nova.virt.disk.api [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Checking if we can resize image /var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:03:39 compute-0 nova_compute[187185]: 2025-11-29 07:03:39.866 187189 DEBUG oslo_concurrency.processutils [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:03:39 compute-0 nova_compute[187185]: 2025-11-29 07:03:39.885 187189 DEBUG oslo_concurrency.processutils [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:03:39 compute-0 nova_compute[187185]: 2025-11-29 07:03:39.887 187189 DEBUG oslo_concurrency.processutils [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/02b37f0c-3272-417b-9791-48b555f68d56/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:03:39 compute-0 nova_compute[187185]: 2025-11-29 07:03:39.928 187189 DEBUG oslo_concurrency.processutils [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:03:39 compute-0 nova_compute[187185]: 2025-11-29 07:03:39.929 187189 DEBUG nova.virt.disk.api [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Cannot resize image /var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:03:39 compute-0 nova_compute[187185]: 2025-11-29 07:03:39.930 187189 DEBUG nova.objects.instance [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lazy-loading 'migration_context' on Instance uuid 690daf8f-6151-4de9-85f6-b8a9fe51ea02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:03:39 compute-0 nova_compute[187185]: 2025-11-29 07:03:39.950 187189 DEBUG nova.virt.libvirt.driver [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 07:03:39 compute-0 nova_compute[187185]: 2025-11-29 07:03:39.951 187189 DEBUG nova.virt.libvirt.driver [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Ensure instance console log exists: /var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 07:03:39 compute-0 nova_compute[187185]: 2025-11-29 07:03:39.952 187189 DEBUG oslo_concurrency.lockutils [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:03:39 compute-0 nova_compute[187185]: 2025-11-29 07:03:39.952 187189 DEBUG oslo_concurrency.lockutils [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:03:39 compute-0 nova_compute[187185]: 2025-11-29 07:03:39.952 187189 DEBUG oslo_concurrency.lockutils [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:03:40 compute-0 nova_compute[187185]: 2025-11-29 07:03:40.311 187189 DEBUG nova.network.neutron [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Successfully updated port: 6a0ff3c3-e368-4504-9884-40716725c901 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 07:03:40 compute-0 nova_compute[187185]: 2025-11-29 07:03:40.328 187189 DEBUG oslo_concurrency.lockutils [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "refresh_cache-690daf8f-6151-4de9-85f6-b8a9fe51ea02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:03:40 compute-0 nova_compute[187185]: 2025-11-29 07:03:40.328 187189 DEBUG oslo_concurrency.lockutils [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquired lock "refresh_cache-690daf8f-6151-4de9-85f6-b8a9fe51ea02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:03:40 compute-0 nova_compute[187185]: 2025-11-29 07:03:40.328 187189 DEBUG nova.network.neutron [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:03:40 compute-0 nova_compute[187185]: 2025-11-29 07:03:40.445 187189 DEBUG nova.compute.manager [req-1c2a8888-e381-431f-b446-1c203cc7c4a9 req-80cc8fa3-dc30-4d30-a010-ddf12a40f903 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Received event network-changed-6a0ff3c3-e368-4504-9884-40716725c901 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:03:40 compute-0 nova_compute[187185]: 2025-11-29 07:03:40.445 187189 DEBUG nova.compute.manager [req-1c2a8888-e381-431f-b446-1c203cc7c4a9 req-80cc8fa3-dc30-4d30-a010-ddf12a40f903 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Refreshing instance network info cache due to event network-changed-6a0ff3c3-e368-4504-9884-40716725c901. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:03:40 compute-0 nova_compute[187185]: 2025-11-29 07:03:40.445 187189 DEBUG oslo_concurrency.lockutils [req-1c2a8888-e381-431f-b446-1c203cc7c4a9 req-80cc8fa3-dc30-4d30-a010-ddf12a40f903 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-690daf8f-6151-4de9-85f6-b8a9fe51ea02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:03:40 compute-0 nova_compute[187185]: 2025-11-29 07:03:40.565 187189 DEBUG nova.network.neutron [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:03:40 compute-0 nova_compute[187185]: 2025-11-29 07:03:40.594 187189 DEBUG oslo_concurrency.processutils [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/02b37f0c-3272-417b-9791-48b555f68d56/disk 1073741824" returned: 0 in 0.706s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:03:40 compute-0 nova_compute[187185]: 2025-11-29 07:03:40.595 187189 DEBUG oslo_concurrency.lockutils [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.799s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:03:40 compute-0 nova_compute[187185]: 2025-11-29 07:03:40.596 187189 DEBUG oslo_concurrency.processutils [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:03:40 compute-0 nova_compute[187185]: 2025-11-29 07:03:40.655 187189 DEBUG nova.network.neutron [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Successfully created port: 3978dee0-d304-43a8-9478-68840d581d9b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 07:03:40 compute-0 nova_compute[187185]: 2025-11-29 07:03:40.674 187189 DEBUG oslo_concurrency.processutils [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:03:40 compute-0 nova_compute[187185]: 2025-11-29 07:03:40.675 187189 DEBUG nova.virt.disk.api [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Checking if we can resize image /var/lib/nova/instances/02b37f0c-3272-417b-9791-48b555f68d56/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:03:40 compute-0 nova_compute[187185]: 2025-11-29 07:03:40.676 187189 DEBUG oslo_concurrency.processutils [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02b37f0c-3272-417b-9791-48b555f68d56/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:03:40 compute-0 nova_compute[187185]: 2025-11-29 07:03:40.747 187189 DEBUG oslo_concurrency.processutils [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02b37f0c-3272-417b-9791-48b555f68d56/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:03:40 compute-0 nova_compute[187185]: 2025-11-29 07:03:40.748 187189 DEBUG nova.virt.disk.api [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Cannot resize image /var/lib/nova/instances/02b37f0c-3272-417b-9791-48b555f68d56/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:03:40 compute-0 nova_compute[187185]: 2025-11-29 07:03:40.748 187189 DEBUG nova.objects.instance [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lazy-loading 'migration_context' on Instance uuid 02b37f0c-3272-417b-9791-48b555f68d56 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:03:40 compute-0 nova_compute[187185]: 2025-11-29 07:03:40.763 187189 DEBUG nova.virt.libvirt.driver [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 07:03:40 compute-0 nova_compute[187185]: 2025-11-29 07:03:40.764 187189 DEBUG nova.virt.libvirt.driver [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Ensure instance console log exists: /var/lib/nova/instances/02b37f0c-3272-417b-9791-48b555f68d56/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 07:03:40 compute-0 nova_compute[187185]: 2025-11-29 07:03:40.764 187189 DEBUG oslo_concurrency.lockutils [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:03:40 compute-0 nova_compute[187185]: 2025-11-29 07:03:40.765 187189 DEBUG oslo_concurrency.lockutils [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:03:40 compute-0 nova_compute[187185]: 2025-11-29 07:03:40.765 187189 DEBUG oslo_concurrency.lockutils [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:03:40 compute-0 nova_compute[187185]: 2025-11-29 07:03:40.929 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:41 compute-0 podman[222365]: 2025-11-29 07:03:41.796607 +0000 UTC m=+0.060042334 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 07:03:41 compute-0 podman[222389]: 2025-11-29 07:03:41.892665503 +0000 UTC m=+0.054311304 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 29 07:03:41 compute-0 podman[222390]: 2025-11-29 07:03:41.924771383 +0000 UTC m=+0.082918255 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, config_id=edpm, vcs-type=git, io.buildah.version=1.33.7, io.openshift.expose-services=, managed_by=edpm_ansible, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, 
io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., version=9.6)
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.137 187189 DEBUG nova.network.neutron [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Updating instance_info_cache with network_info: [{"id": "6a0ff3c3-e368-4504-9884-40716725c901", "address": "fa:16:3e:15:c1:31", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a0ff3c3-e3", "ovs_interfaceid": "6a0ff3c3-e368-4504-9884-40716725c901", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.161 187189 DEBUG oslo_concurrency.lockutils [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Releasing lock "refresh_cache-690daf8f-6151-4de9-85f6-b8a9fe51ea02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.161 187189 DEBUG nova.compute.manager [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Instance network_info: |[{"id": "6a0ff3c3-e368-4504-9884-40716725c901", "address": "fa:16:3e:15:c1:31", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a0ff3c3-e3", "ovs_interfaceid": "6a0ff3c3-e368-4504-9884-40716725c901", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.162 187189 DEBUG oslo_concurrency.lockutils [req-1c2a8888-e381-431f-b446-1c203cc7c4a9 req-80cc8fa3-dc30-4d30-a010-ddf12a40f903 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-690daf8f-6151-4de9-85f6-b8a9fe51ea02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.162 187189 DEBUG nova.network.neutron [req-1c2a8888-e381-431f-b446-1c203cc7c4a9 req-80cc8fa3-dc30-4d30-a010-ddf12a40f903 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Refreshing network info cache for port 6a0ff3c3-e368-4504-9884-40716725c901 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.165 187189 DEBUG nova.virt.libvirt.driver [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Start _get_guest_xml network_info=[{"id": "6a0ff3c3-e368-4504-9884-40716725c901", "address": "fa:16:3e:15:c1:31", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a0ff3c3-e3", "ovs_interfaceid": "6a0ff3c3-e368-4504-9884-40716725c901", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.184 187189 WARNING nova.virt.libvirt.driver [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.191 187189 DEBUG nova.virt.libvirt.host [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.192 187189 DEBUG nova.virt.libvirt.host [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.196 187189 DEBUG nova.virt.libvirt.host [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.197 187189 DEBUG nova.virt.libvirt.host [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.198 187189 DEBUG nova.virt.libvirt.driver [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.198 187189 DEBUG nova.virt.hardware [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.199 187189 DEBUG nova.virt.hardware [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.199 187189 DEBUG nova.virt.hardware [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.199 187189 DEBUG nova.virt.hardware [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.200 187189 DEBUG nova.virt.hardware [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.200 187189 DEBUG nova.virt.hardware [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.200 187189 DEBUG nova.virt.hardware [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.200 187189 DEBUG nova.virt.hardware [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.200 187189 DEBUG nova.virt.hardware [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.201 187189 DEBUG nova.virt.hardware [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.201 187189 DEBUG nova.virt.hardware [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.204 187189 DEBUG nova.virt.libvirt.vif [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:03:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-660939301',display_name='tempest-DeleteServersTestJSON-server-660939301',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-660939301',id=63,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98df116965b74e4a9985049062e65162',ramdisk_id='',reservation_id='r-0d65al9n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-1973671383',owner_user_name='tempest-DeleteServersTestJSON-197
3671383-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:03:37Z,user_data=None,user_id='4ecd161098b5422084003b39f0504a8f',uuid=690daf8f-6151-4de9-85f6-b8a9fe51ea02,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6a0ff3c3-e368-4504-9884-40716725c901", "address": "fa:16:3e:15:c1:31", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a0ff3c3-e3", "ovs_interfaceid": "6a0ff3c3-e368-4504-9884-40716725c901", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.205 187189 DEBUG nova.network.os_vif_util [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Converting VIF {"id": "6a0ff3c3-e368-4504-9884-40716725c901", "address": "fa:16:3e:15:c1:31", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a0ff3c3-e3", "ovs_interfaceid": "6a0ff3c3-e368-4504-9884-40716725c901", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.205 187189 DEBUG nova.network.os_vif_util [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:15:c1:31,bridge_name='br-int',has_traffic_filtering=True,id=6a0ff3c3-e368-4504-9884-40716725c901,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a0ff3c3-e3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.206 187189 DEBUG nova.objects.instance [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lazy-loading 'pci_devices' on Instance uuid 690daf8f-6151-4de9-85f6-b8a9fe51ea02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.220 187189 DEBUG nova.virt.libvirt.driver [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:03:42 compute-0 nova_compute[187185]:   <uuid>690daf8f-6151-4de9-85f6-b8a9fe51ea02</uuid>
Nov 29 07:03:42 compute-0 nova_compute[187185]:   <name>instance-0000003f</name>
Nov 29 07:03:42 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 07:03:42 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:03:42 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:03:42 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:       <nova:name>tempest-DeleteServersTestJSON-server-660939301</nova:name>
Nov 29 07:03:42 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:03:42</nova:creationTime>
Nov 29 07:03:42 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 07:03:42 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 07:03:42 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:03:42 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:03:42 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:03:42 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:03:42 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:03:42 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:03:42 compute-0 nova_compute[187185]:         <nova:user uuid="4ecd161098b5422084003b39f0504a8f">tempest-DeleteServersTestJSON-1973671383-project-member</nova:user>
Nov 29 07:03:42 compute-0 nova_compute[187185]:         <nova:project uuid="98df116965b74e4a9985049062e65162">tempest-DeleteServersTestJSON-1973671383</nova:project>
Nov 29 07:03:42 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:03:42 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 07:03:42 compute-0 nova_compute[187185]:         <nova:port uuid="6a0ff3c3-e368-4504-9884-40716725c901">
Nov 29 07:03:42 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:03:42 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:03:42 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:03:42 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <system>
Nov 29 07:03:42 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:03:42 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:03:42 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:03:42 compute-0 nova_compute[187185]:       <entry name="serial">690daf8f-6151-4de9-85f6-b8a9fe51ea02</entry>
Nov 29 07:03:42 compute-0 nova_compute[187185]:       <entry name="uuid">690daf8f-6151-4de9-85f6-b8a9fe51ea02</entry>
Nov 29 07:03:42 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     </system>
Nov 29 07:03:42 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:03:42 compute-0 nova_compute[187185]:   <os>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:   </os>
Nov 29 07:03:42 compute-0 nova_compute[187185]:   <features>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:   </features>
Nov 29 07:03:42 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:03:42 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:03:42 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:03:42 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:03:42 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk.config"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:03:42 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:15:c1:31"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:       <target dev="tap6a0ff3c3-e3"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:03:42 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/console.log" append="off"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <video>
Nov 29 07:03:42 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     </video>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:03:42 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:03:42 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:03:42 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:03:42 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:03:42 compute-0 nova_compute[187185]: </domain>
Nov 29 07:03:42 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.221 187189 DEBUG nova.compute.manager [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Preparing to wait for external event network-vif-plugged-6a0ff3c3-e368-4504-9884-40716725c901 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.222 187189 DEBUG oslo_concurrency.lockutils [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.222 187189 DEBUG oslo_concurrency.lockutils [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.222 187189 DEBUG oslo_concurrency.lockutils [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.223 187189 DEBUG nova.virt.libvirt.vif [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:03:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-660939301',display_name='tempest-DeleteServersTestJSON-server-660939301',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-660939301',id=63,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98df116965b74e4a9985049062e65162',ramdisk_id='',reservation_id='r-0d65al9n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-1973671383',owner_user_name='tempest-DeleteServersTe
stJSON-1973671383-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:03:37Z,user_data=None,user_id='4ecd161098b5422084003b39f0504a8f',uuid=690daf8f-6151-4de9-85f6-b8a9fe51ea02,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6a0ff3c3-e368-4504-9884-40716725c901", "address": "fa:16:3e:15:c1:31", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a0ff3c3-e3", "ovs_interfaceid": "6a0ff3c3-e368-4504-9884-40716725c901", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.223 187189 DEBUG nova.network.os_vif_util [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Converting VIF {"id": "6a0ff3c3-e368-4504-9884-40716725c901", "address": "fa:16:3e:15:c1:31", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a0ff3c3-e3", "ovs_interfaceid": "6a0ff3c3-e368-4504-9884-40716725c901", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.224 187189 DEBUG nova.network.os_vif_util [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:15:c1:31,bridge_name='br-int',has_traffic_filtering=True,id=6a0ff3c3-e368-4504-9884-40716725c901,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a0ff3c3-e3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.224 187189 DEBUG os_vif [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:c1:31,bridge_name='br-int',has_traffic_filtering=True,id=6a0ff3c3-e368-4504-9884-40716725c901,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a0ff3c3-e3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.225 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.225 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.226 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.228 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.228 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6a0ff3c3-e3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.229 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6a0ff3c3-e3, col_values=(('external_ids', {'iface-id': '6a0ff3c3-e368-4504-9884-40716725c901', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:15:c1:31', 'vm-uuid': '690daf8f-6151-4de9-85f6-b8a9fe51ea02'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.230 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:42 compute-0 NetworkManager[55227]: <info>  [1764399822.2312] manager: (tap6a0ff3c3-e3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/83)
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.233 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.236 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.236 187189 INFO os_vif [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:c1:31,bridge_name='br-int',has_traffic_filtering=True,id=6a0ff3c3-e368-4504-9884-40716725c901,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a0ff3c3-e3')
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.323 187189 DEBUG nova.virt.libvirt.driver [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.323 187189 DEBUG nova.virt.libvirt.driver [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.323 187189 DEBUG nova.virt.libvirt.driver [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] No VIF found with MAC fa:16:3e:15:c1:31, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.324 187189 INFO nova.virt.libvirt.driver [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Using config drive
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.469 187189 DEBUG nova.network.neutron [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Successfully updated port: 3978dee0-d304-43a8-9478-68840d581d9b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.487 187189 DEBUG oslo_concurrency.lockutils [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "refresh_cache-02b37f0c-3272-417b-9791-48b555f68d56" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.488 187189 DEBUG oslo_concurrency.lockutils [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquired lock "refresh_cache-02b37f0c-3272-417b-9791-48b555f68d56" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.488 187189 DEBUG nova.network.neutron [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.575 187189 DEBUG nova.compute.manager [req-996d3d46-93cb-4ba2-ab06-1d4bd86dcd26 req-9ed7a26a-4fb2-440f-8131-068eb8eaadaa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Received event network-changed-3978dee0-d304-43a8-9478-68840d581d9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.576 187189 DEBUG nova.compute.manager [req-996d3d46-93cb-4ba2-ab06-1d4bd86dcd26 req-9ed7a26a-4fb2-440f-8131-068eb8eaadaa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Refreshing instance network info cache due to event network-changed-3978dee0-d304-43a8-9478-68840d581d9b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:03:42 compute-0 nova_compute[187185]: 2025-11-29 07:03:42.576 187189 DEBUG oslo_concurrency.lockutils [req-996d3d46-93cb-4ba2-ab06-1d4bd86dcd26 req-9ed7a26a-4fb2-440f-8131-068eb8eaadaa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-02b37f0c-3272-417b-9791-48b555f68d56" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:03:43 compute-0 nova_compute[187185]: 2025-11-29 07:03:43.268 187189 DEBUG nova.network.neutron [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:03:43 compute-0 nova_compute[187185]: 2025-11-29 07:03:43.302 187189 INFO nova.virt.libvirt.driver [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Creating config drive at /var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk.config
Nov 29 07:03:43 compute-0 nova_compute[187185]: 2025-11-29 07:03:43.309 187189 DEBUG oslo_concurrency.processutils [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpup12abq0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:03:43 compute-0 nova_compute[187185]: 2025-11-29 07:03:43.443 187189 DEBUG oslo_concurrency.processutils [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpup12abq0" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:03:43 compute-0 kernel: tap6a0ff3c3-e3: entered promiscuous mode
Nov 29 07:03:43 compute-0 NetworkManager[55227]: <info>  [1764399823.5078] manager: (tap6a0ff3c3-e3): new Tun device (/org/freedesktop/NetworkManager/Devices/84)
Nov 29 07:03:43 compute-0 ovn_controller[95281]: 2025-11-29T07:03:43Z|00149|binding|INFO|Claiming lport 6a0ff3c3-e368-4504-9884-40716725c901 for this chassis.
Nov 29 07:03:43 compute-0 nova_compute[187185]: 2025-11-29 07:03:43.509 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:43 compute-0 ovn_controller[95281]: 2025-11-29T07:03:43Z|00150|binding|INFO|6a0ff3c3-e368-4504-9884-40716725c901: Claiming fa:16:3e:15:c1:31 10.100.0.14
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:43.535 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:15:c1:31 10.100.0.14'], port_security=['fa:16:3e:15:c1:31 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98df116965b74e4a9985049062e65162', 'neutron:revision_number': '2', 'neutron:security_group_ids': '234720a9-9cd1-4b87-9bec-1abfe8ff0514', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e694bb30-a43a-4d18-87fa-e5c0dd8850c2, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=6a0ff3c3-e368-4504-9884-40716725c901) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:03:43 compute-0 systemd-udevd[222442]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:03:43 compute-0 ovn_controller[95281]: 2025-11-29T07:03:43Z|00151|binding|INFO|Setting lport 6a0ff3c3-e368-4504-9884-40716725c901 ovn-installed in OVS
Nov 29 07:03:43 compute-0 ovn_controller[95281]: 2025-11-29T07:03:43Z|00152|binding|INFO|Setting lport 6a0ff3c3-e368-4504-9884-40716725c901 up in Southbound
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:43.537 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 6a0ff3c3-e368-4504-9884-40716725c901 in datapath fd9eb57e-b1f8-4bae-a60f-8e40613556cd bound to our chassis
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:43.539 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fd9eb57e-b1f8-4bae-a60f-8e40613556cd
Nov 29 07:03:43 compute-0 nova_compute[187185]: 2025-11-29 07:03:43.539 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:43.550 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[33e08b7a-5fff-4205-8c36-4c180e3294c3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:43.552 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfd9eb57e-b1 in ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:43.553 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfd9eb57e-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:43.553 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[ac081630-6154-44bd-8af8-4f21c0c44660]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:43.554 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[c0bae479-82bd-4d97-8399-e0c7394955e2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:43 compute-0 NetworkManager[55227]: <info>  [1764399823.5584] device (tap6a0ff3c3-e3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:03:43 compute-0 NetworkManager[55227]: <info>  [1764399823.5594] device (tap6a0ff3c3-e3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:03:43 compute-0 systemd-machined[153486]: New machine qemu-22-instance-0000003f.
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:43.570 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[da1d28ff-e32b-40df-9158-d1882c4d2c11]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:43 compute-0 systemd[1]: Started Virtual Machine qemu-22-instance-0000003f.
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:43.597 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[17426e63-2068-4fee-b1e4-0688b380b5c9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:43.633 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[e2d2cc0c-df0f-46dd-9d0c-d26c21551d65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:43 compute-0 systemd-udevd[222449]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:43.639 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[2dc9aa96-1d87-405f-98cf-2c4f10a79642]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:43 compute-0 NetworkManager[55227]: <info>  [1764399823.6405] manager: (tapfd9eb57e-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/85)
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:43.672 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[325cd6dc-97ab-493e-bd8f-2765cf8bcb2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:43.675 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[0dec2321-a334-40a6-b813-99793720dab7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:43 compute-0 NetworkManager[55227]: <info>  [1764399823.6984] device (tapfd9eb57e-b0): carrier: link connected
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:43.703 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[f52a5a3e-d5a4-4feb-8c43-8f8bad656ad2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:43.721 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e15d3e3e-f46e-4636-9022-f29b93e17790]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfd9eb57e-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ec:80:ac'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 527136, 'reachable_time': 25166, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222478, 'error': None, 'target': 'ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:43.736 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d16b973e-540b-4fc9-8744-c57cf0363a85]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feec:80ac'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 527136, 'tstamp': 527136}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222479, 'error': None, 'target': 'ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:43 compute-0 nova_compute[187185]: 2025-11-29 07:03:43.753 187189 DEBUG nova.compute.manager [req-288baa4f-a1e2-4d22-bf3f-1c59a02b0761 req-2ef84880-95f1-4943-acf4-0a9d9ab5a2ee 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Received event network-vif-plugged-6a0ff3c3-e368-4504-9884-40716725c901 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:03:43 compute-0 nova_compute[187185]: 2025-11-29 07:03:43.754 187189 DEBUG oslo_concurrency.lockutils [req-288baa4f-a1e2-4d22-bf3f-1c59a02b0761 req-2ef84880-95f1-4943-acf4-0a9d9ab5a2ee 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:03:43 compute-0 nova_compute[187185]: 2025-11-29 07:03:43.754 187189 DEBUG oslo_concurrency.lockutils [req-288baa4f-a1e2-4d22-bf3f-1c59a02b0761 req-2ef84880-95f1-4943-acf4-0a9d9ab5a2ee 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:03:43 compute-0 nova_compute[187185]: 2025-11-29 07:03:43.754 187189 DEBUG oslo_concurrency.lockutils [req-288baa4f-a1e2-4d22-bf3f-1c59a02b0761 req-2ef84880-95f1-4943-acf4-0a9d9ab5a2ee 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:03:43 compute-0 nova_compute[187185]: 2025-11-29 07:03:43.755 187189 DEBUG nova.compute.manager [req-288baa4f-a1e2-4d22-bf3f-1c59a02b0761 req-2ef84880-95f1-4943-acf4-0a9d9ab5a2ee 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Processing event network-vif-plugged-6a0ff3c3-e368-4504-9884-40716725c901 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:43.758 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d7e9b4f2-2cdd-429b-94b7-2df31edfaffb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfd9eb57e-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ec:80:ac'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 527136, 'reachable_time': 25166, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222480, 'error': None, 'target': 'ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:43.791 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[8e77703a-fe3d-4c3c-b88e-9fc144e22c25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:43.859 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[66c29c83-d050-4d90-a2bf-bc2d9eba13bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:43.861 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfd9eb57e-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:43.861 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:43.862 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfd9eb57e-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:03:43 compute-0 nova_compute[187185]: 2025-11-29 07:03:43.864 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:43 compute-0 NetworkManager[55227]: <info>  [1764399823.8652] manager: (tapfd9eb57e-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/86)
Nov 29 07:03:43 compute-0 kernel: tapfd9eb57e-b0: entered promiscuous mode
Nov 29 07:03:43 compute-0 nova_compute[187185]: 2025-11-29 07:03:43.867 187189 DEBUG nova.network.neutron [req-1c2a8888-e381-431f-b446-1c203cc7c4a9 req-80cc8fa3-dc30-4d30-a010-ddf12a40f903 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Updated VIF entry in instance network info cache for port 6a0ff3c3-e368-4504-9884-40716725c901. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:03:43 compute-0 nova_compute[187185]: 2025-11-29 07:03:43.867 187189 DEBUG nova.network.neutron [req-1c2a8888-e381-431f-b446-1c203cc7c4a9 req-80cc8fa3-dc30-4d30-a010-ddf12a40f903 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Updating instance_info_cache with network_info: [{"id": "6a0ff3c3-e368-4504-9884-40716725c901", "address": "fa:16:3e:15:c1:31", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a0ff3c3-e3", "ovs_interfaceid": "6a0ff3c3-e368-4504-9884-40716725c901", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:43.868 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfd9eb57e-b0, col_values=(('external_ids', {'iface-id': 'e7b4cb4f-cb6d-4f0e-8c8d-34c743671595'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:03:43 compute-0 nova_compute[187185]: 2025-11-29 07:03:43.869 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:43 compute-0 ovn_controller[95281]: 2025-11-29T07:03:43Z|00153|binding|INFO|Releasing lport e7b4cb4f-cb6d-4f0e-8c8d-34c743671595 from this chassis (sb_readonly=0)
Nov 29 07:03:43 compute-0 nova_compute[187185]: 2025-11-29 07:03:43.886 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:43.886 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fd9eb57e-b1f8-4bae-a60f-8e40613556cd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fd9eb57e-b1f8-4bae-a60f-8e40613556cd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:43.887 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[f5eb2590-b41e-4c95-94f1-d7a4b2283d65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:43.888 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-fd9eb57e-b1f8-4bae-a60f-8e40613556cd
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/fd9eb57e-b1f8-4bae-a60f-8e40613556cd.pid.haproxy
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID fd9eb57e-b1f8-4bae-a60f-8e40613556cd
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:03:43 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:43.888 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'env', 'PROCESS_TAG=haproxy-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fd9eb57e-b1f8-4bae-a60f-8e40613556cd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:03:43 compute-0 nova_compute[187185]: 2025-11-29 07:03:43.903 187189 DEBUG oslo_concurrency.lockutils [req-1c2a8888-e381-431f-b446-1c203cc7c4a9 req-80cc8fa3-dc30-4d30-a010-ddf12a40f903 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-690daf8f-6151-4de9-85f6-b8a9fe51ea02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.165 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399809.1638556, fb979e7f-7c8f-4c79-846d-44c88985959a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.165 187189 INFO nova.compute.manager [-] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] VM Stopped (Lifecycle Event)
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.187 187189 DEBUG nova.compute.manager [None req-d9cca572-5b4f-4893-bed5-2f7e32ebcb3a - - - - - -] [instance: fb979e7f-7c8f-4c79-846d-44c88985959a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.192 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:44 compute-0 podman[222511]: 2025-11-29 07:03:44.222594341 +0000 UTC m=+0.027815801 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.386 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399824.3856075, 690daf8f-6151-4de9-85f6-b8a9fe51ea02 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.386 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] VM Started (Lifecycle Event)
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.388 187189 DEBUG nova.compute.manager [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.391 187189 DEBUG nova.virt.libvirt.driver [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.394 187189 INFO nova.virt.libvirt.driver [-] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Instance spawned successfully.
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.394 187189 DEBUG nova.virt.libvirt.driver [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.410 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.417 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.421 187189 DEBUG nova.virt.libvirt.driver [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.422 187189 DEBUG nova.virt.libvirt.driver [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.422 187189 DEBUG nova.virt.libvirt.driver [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.422 187189 DEBUG nova.virt.libvirt.driver [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.423 187189 DEBUG nova.virt.libvirt.driver [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.423 187189 DEBUG nova.virt.libvirt.driver [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.558 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.558 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399824.387972, 690daf8f-6151-4de9-85f6-b8a9fe51ea02 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.559 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] VM Paused (Lifecycle Event)
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.606 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.610 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399824.3910067, 690daf8f-6151-4de9-85f6-b8a9fe51ea02 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.610 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] VM Resumed (Lifecycle Event)
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.642 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.643 187189 INFO nova.compute.manager [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Took 6.54 seconds to spawn the instance on the hypervisor.
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.644 187189 DEBUG nova.compute.manager [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.647 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.656 187189 DEBUG nova.network.neutron [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Updating instance_info_cache with network_info: [{"id": "3978dee0-d304-43a8-9478-68840d581d9b", "address": "fa:16:3e:c9:dd:dd", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3978dee0-d3", "ovs_interfaceid": "3978dee0-d304-43a8-9478-68840d581d9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.686 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.703 187189 DEBUG oslo_concurrency.lockutils [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Releasing lock "refresh_cache-02b37f0c-3272-417b-9791-48b555f68d56" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.703 187189 DEBUG nova.compute.manager [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Instance network_info: |[{"id": "3978dee0-d304-43a8-9478-68840d581d9b", "address": "fa:16:3e:c9:dd:dd", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3978dee0-d3", "ovs_interfaceid": "3978dee0-d304-43a8-9478-68840d581d9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.703 187189 DEBUG oslo_concurrency.lockutils [req-996d3d46-93cb-4ba2-ab06-1d4bd86dcd26 req-9ed7a26a-4fb2-440f-8131-068eb8eaadaa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-02b37f0c-3272-417b-9791-48b555f68d56" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.703 187189 DEBUG nova.network.neutron [req-996d3d46-93cb-4ba2-ab06-1d4bd86dcd26 req-9ed7a26a-4fb2-440f-8131-068eb8eaadaa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Refreshing network info cache for port 3978dee0-d304-43a8-9478-68840d581d9b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.706 187189 DEBUG nova.virt.libvirt.driver [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Start _get_guest_xml network_info=[{"id": "3978dee0-d304-43a8-9478-68840d581d9b", "address": "fa:16:3e:c9:dd:dd", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3978dee0-d3", "ovs_interfaceid": "3978dee0-d304-43a8-9478-68840d581d9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.710 187189 WARNING nova.virt.libvirt.driver [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.717 187189 DEBUG nova.virt.libvirt.host [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.719 187189 DEBUG nova.virt.libvirt.host [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.725 187189 DEBUG nova.virt.libvirt.host [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.726 187189 DEBUG nova.virt.libvirt.host [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.728 187189 DEBUG nova.virt.libvirt.driver [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.728 187189 DEBUG nova.virt.hardware [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.728 187189 DEBUG nova.virt.hardware [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.729 187189 DEBUG nova.virt.hardware [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.729 187189 DEBUG nova.virt.hardware [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.729 187189 DEBUG nova.virt.hardware [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.729 187189 DEBUG nova.virt.hardware [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.729 187189 DEBUG nova.virt.hardware [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.730 187189 DEBUG nova.virt.hardware [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.730 187189 DEBUG nova.virt.hardware [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.730 187189 DEBUG nova.virt.hardware [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.730 187189 DEBUG nova.virt.hardware [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.733 187189 DEBUG nova.virt.libvirt.vif [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:03:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-₡-1325169813',display_name='tempest-₡-1325169813',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest--1325169813',id=64,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1dba9539037a4e9dbf33cba140fe21fe',ramdisk_id='',reservation_id='r-5w7ovqmq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-373958708',owner_user_name='tempest-ServersTestJSON-373958708-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=
None,updated_at=2025-11-29T07:03:38Z,user_data=None,user_id='f2f86d3bd4814a09966b869dd539a6c9',uuid=02b37f0c-3272-417b-9791-48b555f68d56,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3978dee0-d304-43a8-9478-68840d581d9b", "address": "fa:16:3e:c9:dd:dd", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3978dee0-d3", "ovs_interfaceid": "3978dee0-d304-43a8-9478-68840d581d9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.734 187189 DEBUG nova.network.os_vif_util [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Converting VIF {"id": "3978dee0-d304-43a8-9478-68840d581d9b", "address": "fa:16:3e:c9:dd:dd", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3978dee0-d3", "ovs_interfaceid": "3978dee0-d304-43a8-9478-68840d581d9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.734 187189 DEBUG nova.network.os_vif_util [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c9:dd:dd,bridge_name='br-int',has_traffic_filtering=True,id=3978dee0-d304-43a8-9478-68840d581d9b,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3978dee0-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.735 187189 DEBUG nova.objects.instance [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lazy-loading 'pci_devices' on Instance uuid 02b37f0c-3272-417b-9791-48b555f68d56 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.749 187189 DEBUG nova.virt.libvirt.driver [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:03:44 compute-0 nova_compute[187185]:   <uuid>02b37f0c-3272-417b-9791-48b555f68d56</uuid>
Nov 29 07:03:44 compute-0 nova_compute[187185]:   <name>instance-00000040</name>
Nov 29 07:03:44 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 07:03:44 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:03:44 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:03:44 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:       <nova:name>tempest-₡-1325169813</nova:name>
Nov 29 07:03:44 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:03:44</nova:creationTime>
Nov 29 07:03:44 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 07:03:44 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 07:03:44 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:03:44 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:03:44 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:03:44 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:03:44 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:03:44 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:03:44 compute-0 nova_compute[187185]:         <nova:user uuid="f2f86d3bd4814a09966b869dd539a6c9">tempest-ServersTestJSON-373958708-project-member</nova:user>
Nov 29 07:03:44 compute-0 nova_compute[187185]:         <nova:project uuid="1dba9539037a4e9dbf33cba140fe21fe">tempest-ServersTestJSON-373958708</nova:project>
Nov 29 07:03:44 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:03:44 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 07:03:44 compute-0 nova_compute[187185]:         <nova:port uuid="3978dee0-d304-43a8-9478-68840d581d9b">
Nov 29 07:03:44 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:03:44 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:03:44 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:03:44 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <system>
Nov 29 07:03:44 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:03:44 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:03:44 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:03:44 compute-0 nova_compute[187185]:       <entry name="serial">02b37f0c-3272-417b-9791-48b555f68d56</entry>
Nov 29 07:03:44 compute-0 nova_compute[187185]:       <entry name="uuid">02b37f0c-3272-417b-9791-48b555f68d56</entry>
Nov 29 07:03:44 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     </system>
Nov 29 07:03:44 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:03:44 compute-0 nova_compute[187185]:   <os>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:   </os>
Nov 29 07:03:44 compute-0 nova_compute[187185]:   <features>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:   </features>
Nov 29 07:03:44 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:03:44 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:03:44 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:03:44 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/02b37f0c-3272-417b-9791-48b555f68d56/disk"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:03:44 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/02b37f0c-3272-417b-9791-48b555f68d56/disk.config"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:03:44 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:c9:dd:dd"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:       <target dev="tap3978dee0-d3"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:03:44 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/02b37f0c-3272-417b-9791-48b555f68d56/console.log" append="off"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <video>
Nov 29 07:03:44 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     </video>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:03:44 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:03:44 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:03:44 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:03:44 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:03:44 compute-0 nova_compute[187185]: </domain>
Nov 29 07:03:44 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.750 187189 DEBUG nova.compute.manager [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Preparing to wait for external event network-vif-plugged-3978dee0-d304-43a8-9478-68840d581d9b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.751 187189 DEBUG oslo_concurrency.lockutils [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "02b37f0c-3272-417b-9791-48b555f68d56-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.751 187189 DEBUG oslo_concurrency.lockutils [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "02b37f0c-3272-417b-9791-48b555f68d56-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.751 187189 DEBUG oslo_concurrency.lockutils [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "02b37f0c-3272-417b-9791-48b555f68d56-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.752 187189 DEBUG nova.virt.libvirt.vif [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:03:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-₡-1325169813',display_name='tempest-₡-1325169813',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest--1325169813',id=64,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1dba9539037a4e9dbf33cba140fe21fe',ramdisk_id='',reservation_id='r-5w7ovqmq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-373958708',owner_user_name='tempest-ServersTestJSON-373958708-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trus
ted_certs=None,updated_at=2025-11-29T07:03:38Z,user_data=None,user_id='f2f86d3bd4814a09966b869dd539a6c9',uuid=02b37f0c-3272-417b-9791-48b555f68d56,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3978dee0-d304-43a8-9478-68840d581d9b", "address": "fa:16:3e:c9:dd:dd", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3978dee0-d3", "ovs_interfaceid": "3978dee0-d304-43a8-9478-68840d581d9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.752 187189 DEBUG nova.network.os_vif_util [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Converting VIF {"id": "3978dee0-d304-43a8-9478-68840d581d9b", "address": "fa:16:3e:c9:dd:dd", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3978dee0-d3", "ovs_interfaceid": "3978dee0-d304-43a8-9478-68840d581d9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.753 187189 DEBUG nova.network.os_vif_util [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c9:dd:dd,bridge_name='br-int',has_traffic_filtering=True,id=3978dee0-d304-43a8-9478-68840d581d9b,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3978dee0-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.753 187189 DEBUG os_vif [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c9:dd:dd,bridge_name='br-int',has_traffic_filtering=True,id=3978dee0-d304-43a8-9478-68840d581d9b,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3978dee0-d3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.757 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.758 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.758 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.760 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.761 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3978dee0-d3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.761 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3978dee0-d3, col_values=(('external_ids', {'iface-id': '3978dee0-d304-43a8-9478-68840d581d9b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c9:dd:dd', 'vm-uuid': '02b37f0c-3272-417b-9791-48b555f68d56'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.762 187189 INFO nova.compute.manager [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Took 7.60 seconds to build instance.
Nov 29 07:03:44 compute-0 NetworkManager[55227]: <info>  [1764399824.7650] manager: (tap3978dee0-d3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/87)
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.765 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.769 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.773 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.775 187189 INFO os_vif [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c9:dd:dd,bridge_name='br-int',has_traffic_filtering=True,id=3978dee0-d304-43a8-9478-68840d581d9b,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3978dee0-d3')
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.780 187189 DEBUG oslo_concurrency.lockutils [None req-cd36e3b4-2436-4ced-a50d-d44dbe3b0d40 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.698s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.864 187189 DEBUG nova.virt.libvirt.driver [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.865 187189 DEBUG nova.virt.libvirt.driver [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.865 187189 DEBUG nova.virt.libvirt.driver [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] No VIF found with MAC fa:16:3e:c9:dd:dd, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:03:44 compute-0 nova_compute[187185]: 2025-11-29 07:03:44.866 187189 INFO nova.virt.libvirt.driver [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Using config drive
Nov 29 07:03:44 compute-0 podman[222511]: 2025-11-29 07:03:44.904200628 +0000 UTC m=+0.709422058 container create ee627524d984af0e13335db0316324fa2410181628b13f8f12f56ead2fad935d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 07:03:45 compute-0 systemd[1]: Started libpod-conmon-ee627524d984af0e13335db0316324fa2410181628b13f8f12f56ead2fad935d.scope.
Nov 29 07:03:45 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:03:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cddb442fdfa72fb2b2e65745b79bdacfb5514e2592ecad32eee28d5ccce15e22/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:03:45 compute-0 podman[222511]: 2025-11-29 07:03:45.428328362 +0000 UTC m=+1.233549842 container init ee627524d984af0e13335db0316324fa2410181628b13f8f12f56ead2fad935d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 07:03:45 compute-0 podman[222511]: 2025-11-29 07:03:45.435702229 +0000 UTC m=+1.240923669 container start ee627524d984af0e13335db0316324fa2410181628b13f8f12f56ead2fad935d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Nov 29 07:03:45 compute-0 nova_compute[187185]: 2025-11-29 07:03:45.458 187189 INFO nova.virt.libvirt.driver [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Creating config drive at /var/lib/nova/instances/02b37f0c-3272-417b-9791-48b555f68d56/disk.config
Nov 29 07:03:45 compute-0 nova_compute[187185]: 2025-11-29 07:03:45.465 187189 DEBUG oslo_concurrency.processutils [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/02b37f0c-3272-417b-9791-48b555f68d56/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp31fu5brf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:03:45 compute-0 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[222534]: [NOTICE]   (222538) : New worker (222540) forked
Nov 29 07:03:45 compute-0 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[222534]: [NOTICE]   (222538) : Loading success.
Nov 29 07:03:45 compute-0 nova_compute[187185]: 2025-11-29 07:03:45.603 187189 DEBUG oslo_concurrency.processutils [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/02b37f0c-3272-417b-9791-48b555f68d56/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp31fu5brf" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:03:45 compute-0 kernel: tap3978dee0-d3: entered promiscuous mode
Nov 29 07:03:45 compute-0 NetworkManager[55227]: <info>  [1764399825.6803] manager: (tap3978dee0-d3): new Tun device (/org/freedesktop/NetworkManager/Devices/88)
Nov 29 07:03:45 compute-0 systemd-udevd[222467]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:03:45 compute-0 nova_compute[187185]: 2025-11-29 07:03:45.681 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:45 compute-0 ovn_controller[95281]: 2025-11-29T07:03:45Z|00154|binding|INFO|Claiming lport 3978dee0-d304-43a8-9478-68840d581d9b for this chassis.
Nov 29 07:03:45 compute-0 ovn_controller[95281]: 2025-11-29T07:03:45Z|00155|binding|INFO|3978dee0-d304-43a8-9478-68840d581d9b: Claiming fa:16:3e:c9:dd:dd 10.100.0.11
Nov 29 07:03:45 compute-0 NetworkManager[55227]: <info>  [1764399825.6975] device (tap3978dee0-d3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:03:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:45.696 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c9:dd:dd 10.100.0.11'], port_security=['fa:16:3e:c9:dd:dd 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9cf3a513-f54e-430e-b018-befaa643b464', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fc8ab121-ee69-4ab4-9a39-25b26b293132', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a3fd0639-e84a-4389-a7f3-f9ac2c360b5e, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=3978dee0-d304-43a8-9478-68840d581d9b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:03:45 compute-0 NetworkManager[55227]: <info>  [1764399825.7000] device (tap3978dee0-d3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:03:45 compute-0 nova_compute[187185]: 2025-11-29 07:03:45.755 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:45 compute-0 systemd-machined[153486]: New machine qemu-23-instance-00000040.
Nov 29 07:03:45 compute-0 ovn_controller[95281]: 2025-11-29T07:03:45Z|00156|binding|INFO|Setting lport 3978dee0-d304-43a8-9478-68840d581d9b ovn-installed in OVS
Nov 29 07:03:45 compute-0 ovn_controller[95281]: 2025-11-29T07:03:45Z|00157|binding|INFO|Setting lport 3978dee0-d304-43a8-9478-68840d581d9b up in Southbound
Nov 29 07:03:45 compute-0 systemd[1]: Started Virtual Machine qemu-23-instance-00000040.
Nov 29 07:03:45 compute-0 nova_compute[187185]: 2025-11-29 07:03:45.767 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:45.788 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 3978dee0-d304-43a8-9478-68840d581d9b in datapath 9cf3a513-f54e-430e-b018-befaa643b464 unbound from our chassis
Nov 29 07:03:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:45.790 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9cf3a513-f54e-430e-b018-befaa643b464
Nov 29 07:03:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:45.800 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[8a471c6f-6d98-4f40-8ef6-2cc46df4e09e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:45.801 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9cf3a513-f1 in ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:03:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:45.802 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9cf3a513-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:03:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:45.803 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[dab56f90-2989-4fc2-8763-2b94030b59d1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:45.805 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[900d2516-5f4c-46ec-90b6-5ea373a5ec42]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:45.817 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[dd739f77-db1d-44bc-9651-b75b3e132610]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:45 compute-0 nova_compute[187185]: 2025-11-29 07:03:45.851 187189 DEBUG nova.compute.manager [req-c7fb4e28-4b48-4f6f-a1ec-fbb9b5828582 req-4e5733d1-afb8-4524-a1ba-692ee221d6cb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Received event network-vif-plugged-6a0ff3c3-e368-4504-9884-40716725c901 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:03:45 compute-0 nova_compute[187185]: 2025-11-29 07:03:45.852 187189 DEBUG oslo_concurrency.lockutils [req-c7fb4e28-4b48-4f6f-a1ec-fbb9b5828582 req-4e5733d1-afb8-4524-a1ba-692ee221d6cb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:03:45 compute-0 nova_compute[187185]: 2025-11-29 07:03:45.852 187189 DEBUG oslo_concurrency.lockutils [req-c7fb4e28-4b48-4f6f-a1ec-fbb9b5828582 req-4e5733d1-afb8-4524-a1ba-692ee221d6cb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:03:45 compute-0 nova_compute[187185]: 2025-11-29 07:03:45.852 187189 DEBUG oslo_concurrency.lockutils [req-c7fb4e28-4b48-4f6f-a1ec-fbb9b5828582 req-4e5733d1-afb8-4524-a1ba-692ee221d6cb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:03:45 compute-0 nova_compute[187185]: 2025-11-29 07:03:45.853 187189 DEBUG nova.compute.manager [req-c7fb4e28-4b48-4f6f-a1ec-fbb9b5828582 req-4e5733d1-afb8-4524-a1ba-692ee221d6cb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] No waiting events found dispatching network-vif-plugged-6a0ff3c3-e368-4504-9884-40716725c901 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:03:45 compute-0 nova_compute[187185]: 2025-11-29 07:03:45.853 187189 WARNING nova.compute.manager [req-c7fb4e28-4b48-4f6f-a1ec-fbb9b5828582 req-4e5733d1-afb8-4524-a1ba-692ee221d6cb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Received unexpected event network-vif-plugged-6a0ff3c3-e368-4504-9884-40716725c901 for instance with vm_state active and task_state None.
Nov 29 07:03:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:45.846 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[3622c3cb-de96-4aef-87f7-b9dd28667299]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:45.884 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[e06eae0f-fc30-42e3-bd3b-ca7b9efff106]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:45 compute-0 NetworkManager[55227]: <info>  [1764399825.8905] manager: (tap9cf3a513-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/89)
Nov 29 07:03:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:45.889 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[8f3a10bc-555e-462c-adcf-336d5717ce9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:45.926 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[c118aab0-19b9-4082-a4f4-c9bad168f111]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:45.929 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[cf29a3ca-2bad-427c-a2cd-333e7b359311]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:45 compute-0 NetworkManager[55227]: <info>  [1764399825.9513] device (tap9cf3a513-f0): carrier: link connected
Nov 29 07:03:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:45.959 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[0af7b110-594e-410a-8322-80e44c3aef07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:45.976 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[b1e5dfc9-9cf9-4258-bb4f-6efcd9921f57]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9cf3a513-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:40:28:ee'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 527361, 'reachable_time': 29237, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222583, 'error': None, 'target': 'ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:45.992 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[2f50bbe0-6edf-48cf-86cb-f0e5d91099df]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe40:28ee'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 527361, 'tstamp': 527361}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222584, 'error': None, 'target': 'ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:46.008 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[ebc625e6-6a22-4bce-8897-a4461bbc3f17]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9cf3a513-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:40:28:ee'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 527361, 'reachable_time': 29237, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222585, 'error': None, 'target': 'ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:46.038 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[a42e03d4-67f0-45ae-9992-0de9b8f6b1a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:46.095 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[8f8f059d-067c-4a30-b260-3902f973d8c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:46.097 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9cf3a513-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:03:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:46.097 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:03:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:46.098 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9cf3a513-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:03:46 compute-0 kernel: tap9cf3a513-f0: entered promiscuous mode
Nov 29 07:03:46 compute-0 NetworkManager[55227]: <info>  [1764399826.1003] manager: (tap9cf3a513-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/90)
Nov 29 07:03:46 compute-0 nova_compute[187185]: 2025-11-29 07:03:46.101 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:46.117 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9cf3a513-f0, col_values=(('external_ids', {'iface-id': 'ed5aef73-67a0-4ad1-8aea-9c411786c18e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:03:46 compute-0 ovn_controller[95281]: 2025-11-29T07:03:46Z|00158|binding|INFO|Releasing lport ed5aef73-67a0-4ad1-8aea-9c411786c18e from this chassis (sb_readonly=0)
Nov 29 07:03:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:46.120 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9cf3a513-f54e-430e-b018-befaa643b464.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9cf3a513-f54e-430e-b018-befaa643b464.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:03:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:46.120 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[4562ca10-0da6-4d62-8c7a-6970d55b0e8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:03:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:46.121 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:03:46 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:03:46 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:03:46 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-9cf3a513-f54e-430e-b018-befaa643b464
Nov 29 07:03:46 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:03:46 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:03:46 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:03:46 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/9cf3a513-f54e-430e-b018-befaa643b464.pid.haproxy
Nov 29 07:03:46 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:03:46 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:03:46 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:03:46 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:03:46 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:03:46 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:03:46 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:03:46 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:03:46 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:03:46 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:03:46 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:03:46 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:03:46 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:03:46 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:03:46 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:03:46 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:03:46 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:03:46 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:03:46 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:03:46 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:03:46 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID 9cf3a513-f54e-430e-b018-befaa643b464
Nov 29 07:03:46 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:03:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:03:46.121 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464', 'env', 'PROCESS_TAG=haproxy-9cf3a513-f54e-430e-b018-befaa643b464', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9cf3a513-f54e-430e-b018-befaa643b464.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:03:46 compute-0 nova_compute[187185]: 2025-11-29 07:03:46.123 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:46 compute-0 nova_compute[187185]: 2025-11-29 07:03:46.130 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:46 compute-0 nova_compute[187185]: 2025-11-29 07:03:46.401 187189 DEBUG nova.compute.manager [req-0011c5bb-2fb6-4cfa-aea3-5965977d9bd2 req-497181c3-84d4-49bc-ab64-b9443a105948 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Received event network-vif-plugged-3978dee0-d304-43a8-9478-68840d581d9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:03:46 compute-0 nova_compute[187185]: 2025-11-29 07:03:46.402 187189 DEBUG oslo_concurrency.lockutils [req-0011c5bb-2fb6-4cfa-aea3-5965977d9bd2 req-497181c3-84d4-49bc-ab64-b9443a105948 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "02b37f0c-3272-417b-9791-48b555f68d56-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:03:46 compute-0 nova_compute[187185]: 2025-11-29 07:03:46.402 187189 DEBUG oslo_concurrency.lockutils [req-0011c5bb-2fb6-4cfa-aea3-5965977d9bd2 req-497181c3-84d4-49bc-ab64-b9443a105948 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "02b37f0c-3272-417b-9791-48b555f68d56-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:03:46 compute-0 nova_compute[187185]: 2025-11-29 07:03:46.403 187189 DEBUG oslo_concurrency.lockutils [req-0011c5bb-2fb6-4cfa-aea3-5965977d9bd2 req-497181c3-84d4-49bc-ab64-b9443a105948 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "02b37f0c-3272-417b-9791-48b555f68d56-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:03:46 compute-0 nova_compute[187185]: 2025-11-29 07:03:46.403 187189 DEBUG nova.compute.manager [req-0011c5bb-2fb6-4cfa-aea3-5965977d9bd2 req-497181c3-84d4-49bc-ab64-b9443a105948 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Processing event network-vif-plugged-3978dee0-d304-43a8-9478-68840d581d9b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 07:03:46 compute-0 podman[222617]: 2025-11-29 07:03:46.452134764 +0000 UTC m=+0.020472135 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:03:46 compute-0 nova_compute[187185]: 2025-11-29 07:03:46.770 187189 DEBUG nova.compute.manager [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:03:46 compute-0 nova_compute[187185]: 2025-11-29 07:03:46.772 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399826.7712648, 02b37f0c-3272-417b-9791-48b555f68d56 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:03:46 compute-0 nova_compute[187185]: 2025-11-29 07:03:46.773 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] VM Started (Lifecycle Event)
Nov 29 07:03:46 compute-0 nova_compute[187185]: 2025-11-29 07:03:46.781 187189 DEBUG nova.virt.libvirt.driver [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 07:03:46 compute-0 nova_compute[187185]: 2025-11-29 07:03:46.785 187189 INFO nova.virt.libvirt.driver [-] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Instance spawned successfully.
Nov 29 07:03:46 compute-0 nova_compute[187185]: 2025-11-29 07:03:46.786 187189 DEBUG nova.virt.libvirt.driver [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 07:03:46 compute-0 nova_compute[187185]: 2025-11-29 07:03:46.798 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:03:46 compute-0 nova_compute[187185]: 2025-11-29 07:03:46.825 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:03:46 compute-0 nova_compute[187185]: 2025-11-29 07:03:46.834 187189 DEBUG nova.virt.libvirt.driver [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:03:46 compute-0 nova_compute[187185]: 2025-11-29 07:03:46.837 187189 DEBUG nova.virt.libvirt.driver [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:03:46 compute-0 nova_compute[187185]: 2025-11-29 07:03:46.838 187189 DEBUG nova.virt.libvirt.driver [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:03:46 compute-0 nova_compute[187185]: 2025-11-29 07:03:46.839 187189 DEBUG nova.virt.libvirt.driver [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:03:46 compute-0 nova_compute[187185]: 2025-11-29 07:03:46.840 187189 DEBUG nova.virt.libvirt.driver [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:03:46 compute-0 nova_compute[187185]: 2025-11-29 07:03:46.840 187189 DEBUG nova.virt.libvirt.driver [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:03:46 compute-0 nova_compute[187185]: 2025-11-29 07:03:46.888 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:03:46 compute-0 nova_compute[187185]: 2025-11-29 07:03:46.888 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399826.771878, 02b37f0c-3272-417b-9791-48b555f68d56 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:03:46 compute-0 nova_compute[187185]: 2025-11-29 07:03:46.888 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] VM Paused (Lifecycle Event)
Nov 29 07:03:46 compute-0 nova_compute[187185]: 2025-11-29 07:03:46.923 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:03:46 compute-0 nova_compute[187185]: 2025-11-29 07:03:46.928 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399826.7794828, 02b37f0c-3272-417b-9791-48b555f68d56 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:03:46 compute-0 nova_compute[187185]: 2025-11-29 07:03:46.928 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] VM Resumed (Lifecycle Event)
Nov 29 07:03:46 compute-0 nova_compute[187185]: 2025-11-29 07:03:46.957 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:03:46 compute-0 nova_compute[187185]: 2025-11-29 07:03:46.961 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:03:46 compute-0 nova_compute[187185]: 2025-11-29 07:03:46.986 187189 INFO nova.compute.manager [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Took 8.39 seconds to spawn the instance on the hypervisor.
Nov 29 07:03:46 compute-0 nova_compute[187185]: 2025-11-29 07:03:46.987 187189 DEBUG nova.compute.manager [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:03:46 compute-0 nova_compute[187185]: 2025-11-29 07:03:46.994 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:03:47 compute-0 nova_compute[187185]: 2025-11-29 07:03:47.163 187189 INFO nova.compute.manager [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Took 9.22 seconds to build instance.
Nov 29 07:03:47 compute-0 nova_compute[187185]: 2025-11-29 07:03:47.191 187189 DEBUG oslo_concurrency.lockutils [None req-f03bba44-92da-46c0-b0ed-05b9ebe59ead f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "02b37f0c-3272-417b-9791-48b555f68d56" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.400s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:03:47 compute-0 nova_compute[187185]: 2025-11-29 07:03:47.313 187189 DEBUG nova.network.neutron [req-996d3d46-93cb-4ba2-ab06-1d4bd86dcd26 req-9ed7a26a-4fb2-440f-8131-068eb8eaadaa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Updated VIF entry in instance network info cache for port 3978dee0-d304-43a8-9478-68840d581d9b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:03:47 compute-0 nova_compute[187185]: 2025-11-29 07:03:47.313 187189 DEBUG nova.network.neutron [req-996d3d46-93cb-4ba2-ab06-1d4bd86dcd26 req-9ed7a26a-4fb2-440f-8131-068eb8eaadaa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Updating instance_info_cache with network_info: [{"id": "3978dee0-d304-43a8-9478-68840d581d9b", "address": "fa:16:3e:c9:dd:dd", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3978dee0-d3", "ovs_interfaceid": "3978dee0-d304-43a8-9478-68840d581d9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:03:47 compute-0 nova_compute[187185]: 2025-11-29 07:03:47.330 187189 DEBUG oslo_concurrency.lockutils [req-996d3d46-93cb-4ba2-ab06-1d4bd86dcd26 req-9ed7a26a-4fb2-440f-8131-068eb8eaadaa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-02b37f0c-3272-417b-9791-48b555f68d56" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:03:48 compute-0 podman[222617]: 2025-11-29 07:03:48.332711973 +0000 UTC m=+1.901049334 container create 0dfe199bffa38d5ab97d9d4d6a0d9f7a22d102cb1581c02d5eaa39c112b65390 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:03:48 compute-0 systemd[1]: Started libpod-conmon-0dfe199bffa38d5ab97d9d4d6a0d9f7a22d102cb1581c02d5eaa39c112b65390.scope.
Nov 29 07:03:48 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:03:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f2219bc9b7c6d7940c935ac1ce97f304a66274a918469f3454d0d503d892dd54/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:03:48 compute-0 podman[222617]: 2025-11-29 07:03:48.781222047 +0000 UTC m=+2.349559428 container init 0dfe199bffa38d5ab97d9d4d6a0d9f7a22d102cb1581c02d5eaa39c112b65390 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 29 07:03:48 compute-0 podman[222617]: 2025-11-29 07:03:48.786349921 +0000 UTC m=+2.354687272 container start 0dfe199bffa38d5ab97d9d4d6a0d9f7a22d102cb1581c02d5eaa39c112b65390 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 29 07:03:48 compute-0 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[222641]: [NOTICE]   (222645) : New worker (222647) forked
Nov 29 07:03:48 compute-0 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[222641]: [NOTICE]   (222645) : Loading success.
Nov 29 07:03:49 compute-0 nova_compute[187185]: 2025-11-29 07:03:49.195 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:49 compute-0 nova_compute[187185]: 2025-11-29 07:03:49.215 187189 DEBUG nova.compute.manager [req-1b4c0a87-5c5b-4bef-ab07-558789fecc41 req-837141e7-d490-40d5-bb3c-cc4a9c852b2a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Received event network-vif-plugged-3978dee0-d304-43a8-9478-68840d581d9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:03:49 compute-0 nova_compute[187185]: 2025-11-29 07:03:49.217 187189 DEBUG oslo_concurrency.lockutils [req-1b4c0a87-5c5b-4bef-ab07-558789fecc41 req-837141e7-d490-40d5-bb3c-cc4a9c852b2a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "02b37f0c-3272-417b-9791-48b555f68d56-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:03:49 compute-0 nova_compute[187185]: 2025-11-29 07:03:49.218 187189 DEBUG oslo_concurrency.lockutils [req-1b4c0a87-5c5b-4bef-ab07-558789fecc41 req-837141e7-d490-40d5-bb3c-cc4a9c852b2a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "02b37f0c-3272-417b-9791-48b555f68d56-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:03:49 compute-0 nova_compute[187185]: 2025-11-29 07:03:49.218 187189 DEBUG oslo_concurrency.lockutils [req-1b4c0a87-5c5b-4bef-ab07-558789fecc41 req-837141e7-d490-40d5-bb3c-cc4a9c852b2a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "02b37f0c-3272-417b-9791-48b555f68d56-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:03:49 compute-0 nova_compute[187185]: 2025-11-29 07:03:49.219 187189 DEBUG nova.compute.manager [req-1b4c0a87-5c5b-4bef-ab07-558789fecc41 req-837141e7-d490-40d5-bb3c-cc4a9c852b2a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] No waiting events found dispatching network-vif-plugged-3978dee0-d304-43a8-9478-68840d581d9b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:03:49 compute-0 nova_compute[187185]: 2025-11-29 07:03:49.219 187189 WARNING nova.compute.manager [req-1b4c0a87-5c5b-4bef-ab07-558789fecc41 req-837141e7-d490-40d5-bb3c-cc4a9c852b2a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Received unexpected event network-vif-plugged-3978dee0-d304-43a8-9478-68840d581d9b for instance with vm_state active and task_state None.
Nov 29 07:03:49 compute-0 nova_compute[187185]: 2025-11-29 07:03:49.764 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:50 compute-0 nova_compute[187185]: 2025-11-29 07:03:50.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:03:50 compute-0 nova_compute[187185]: 2025-11-29 07:03:50.318 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:03:50 compute-0 nova_compute[187185]: 2025-11-29 07:03:50.318 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:03:50 compute-0 nova_compute[187185]: 2025-11-29 07:03:50.466 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "refresh_cache-690daf8f-6151-4de9-85f6-b8a9fe51ea02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:03:50 compute-0 nova_compute[187185]: 2025-11-29 07:03:50.466 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquired lock "refresh_cache-690daf8f-6151-4de9-85f6-b8a9fe51ea02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:03:50 compute-0 nova_compute[187185]: 2025-11-29 07:03:50.467 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 29 07:03:50 compute-0 nova_compute[187185]: 2025-11-29 07:03:50.467 187189 DEBUG nova.objects.instance [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 690daf8f-6151-4de9-85f6-b8a9fe51ea02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:03:50 compute-0 podman[222656]: 2025-11-29 07:03:50.887458304 +0000 UTC m=+0.137379623 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller)
Nov 29 07:03:54 compute-0 nova_compute[187185]: 2025-11-29 07:03:54.196 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:54 compute-0 nova_compute[187185]: 2025-11-29 07:03:54.767 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:54 compute-0 nova_compute[187185]: 2025-11-29 07:03:54.887 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Updating instance_info_cache with network_info: [{"id": "6a0ff3c3-e368-4504-9884-40716725c901", "address": "fa:16:3e:15:c1:31", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a0ff3c3-e3", "ovs_interfaceid": "6a0ff3c3-e368-4504-9884-40716725c901", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:03:55 compute-0 nova_compute[187185]: 2025-11-29 07:03:55.313 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Releasing lock "refresh_cache-690daf8f-6151-4de9-85f6-b8a9fe51ea02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:03:55 compute-0 nova_compute[187185]: 2025-11-29 07:03:55.315 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 29 07:03:55 compute-0 nova_compute[187185]: 2025-11-29 07:03:55.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:03:55 compute-0 nova_compute[187185]: 2025-11-29 07:03:55.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:03:55 compute-0 nova_compute[187185]: 2025-11-29 07:03:55.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:03:55 compute-0 nova_compute[187185]: 2025-11-29 07:03:55.318 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:03:55 compute-0 nova_compute[187185]: 2025-11-29 07:03:55.319 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:03:55 compute-0 nova_compute[187185]: 2025-11-29 07:03:55.319 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:03:55 compute-0 nova_compute[187185]: 2025-11-29 07:03:55.356 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:03:55 compute-0 nova_compute[187185]: 2025-11-29 07:03:55.356 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:03:55 compute-0 nova_compute[187185]: 2025-11-29 07:03:55.357 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:03:55 compute-0 nova_compute[187185]: 2025-11-29 07:03:55.358 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:03:55 compute-0 nova_compute[187185]: 2025-11-29 07:03:55.503 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02b37f0c-3272-417b-9791-48b555f68d56/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:03:55 compute-0 nova_compute[187185]: 2025-11-29 07:03:55.617 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02b37f0c-3272-417b-9791-48b555f68d56/disk --force-share --output=json" returned: 0 in 0.115s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:03:55 compute-0 nova_compute[187185]: 2025-11-29 07:03:55.619 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02b37f0c-3272-417b-9791-48b555f68d56/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:03:55 compute-0 nova_compute[187185]: 2025-11-29 07:03:55.695 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02b37f0c-3272-417b-9791-48b555f68d56/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:03:55 compute-0 nova_compute[187185]: 2025-11-29 07:03:55.704 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:03:55 compute-0 nova_compute[187185]: 2025-11-29 07:03:55.760 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:03:55 compute-0 nova_compute[187185]: 2025-11-29 07:03:55.762 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:03:55 compute-0 nova_compute[187185]: 2025-11-29 07:03:55.819 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:03:56 compute-0 nova_compute[187185]: 2025-11-29 07:03:56.027 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:03:56 compute-0 nova_compute[187185]: 2025-11-29 07:03:56.029 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5373MB free_disk=73.32915496826172GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:03:56 compute-0 nova_compute[187185]: 2025-11-29 07:03:56.029 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:03:56 compute-0 nova_compute[187185]: 2025-11-29 07:03:56.029 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:03:56 compute-0 nova_compute[187185]: 2025-11-29 07:03:56.347 187189 INFO nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Updating resource usage from migration b2f30ee9-093d-4a50-9511-730851938837
Nov 29 07:03:56 compute-0 nova_compute[187185]: 2025-11-29 07:03:56.395 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance 02b37f0c-3272-417b-9791-48b555f68d56 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 07:03:56 compute-0 nova_compute[187185]: 2025-11-29 07:03:56.396 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Migration b2f30ee9-093d-4a50-9511-730851938837 is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Nov 29 07:03:56 compute-0 nova_compute[187185]: 2025-11-29 07:03:56.397 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:03:56 compute-0 nova_compute[187185]: 2025-11-29 07:03:56.397 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:03:56 compute-0 nova_compute[187185]: 2025-11-29 07:03:56.430 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Refreshing inventories for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 29 07:03:56 compute-0 nova_compute[187185]: 2025-11-29 07:03:56.453 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Updating ProviderTree inventory for provider 4e39a026-df39-4e20-874a-dbb5a40df044 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 29 07:03:56 compute-0 nova_compute[187185]: 2025-11-29 07:03:56.454 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Updating inventory in ProviderTree for provider 4e39a026-df39-4e20-874a-dbb5a40df044 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 29 07:03:56 compute-0 nova_compute[187185]: 2025-11-29 07:03:56.473 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Refreshing aggregate associations for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 29 07:03:56 compute-0 nova_compute[187185]: 2025-11-29 07:03:56.491 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Refreshing trait associations for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044, traits: HW_CPU_X86_SSE,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_TRUSTED_CERTS,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 29 07:03:56 compute-0 nova_compute[187185]: 2025-11-29 07:03:56.573 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:03:56 compute-0 nova_compute[187185]: 2025-11-29 07:03:56.576 187189 DEBUG oslo_concurrency.lockutils [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "refresh_cache-690daf8f-6151-4de9-85f6-b8a9fe51ea02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:03:56 compute-0 nova_compute[187185]: 2025-11-29 07:03:56.577 187189 DEBUG oslo_concurrency.lockutils [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquired lock "refresh_cache-690daf8f-6151-4de9-85f6-b8a9fe51ea02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:03:56 compute-0 nova_compute[187185]: 2025-11-29 07:03:56.577 187189 DEBUG nova.network.neutron [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:03:57 compute-0 nova_compute[187185]: 2025-11-29 07:03:56.999 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:03:57 compute-0 nova_compute[187185]: 2025-11-29 07:03:57.640 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:03:57 compute-0 nova_compute[187185]: 2025-11-29 07:03:57.642 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.612s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:03:57 compute-0 podman[222711]: 2025-11-29 07:03:57.794337962 +0000 UTC m=+0.057097782 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 07:03:57 compute-0 podman[222710]: 2025-11-29 07:03:57.826532454 +0000 UTC m=+0.090570820 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm)
Nov 29 07:03:58 compute-0 nova_compute[187185]: 2025-11-29 07:03:58.977 187189 DEBUG nova.network.neutron [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Updating instance_info_cache with network_info: [{"id": "6a0ff3c3-e368-4504-9884-40716725c901", "address": "fa:16:3e:15:c1:31", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a0ff3c3-e3", "ovs_interfaceid": "6a0ff3c3-e368-4504-9884-40716725c901", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:03:59 compute-0 nova_compute[187185]: 2025-11-29 07:03:59.036 187189 DEBUG oslo_concurrency.lockutils [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Releasing lock "refresh_cache-690daf8f-6151-4de9-85f6-b8a9fe51ea02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:03:59 compute-0 nova_compute[187185]: 2025-11-29 07:03:59.199 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:03:59 compute-0 nova_compute[187185]: 2025-11-29 07:03:59.572 187189 DEBUG nova.virt.libvirt.driver [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Nov 29 07:03:59 compute-0 nova_compute[187185]: 2025-11-29 07:03:59.574 187189 DEBUG nova.virt.libvirt.volume.remotefs [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Creating file /var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/5fcd8bd4a32a45b4a6d724fc6a62b14a.tmp on remote host 192.168.122.101 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Nov 29 07:03:59 compute-0 nova_compute[187185]: 2025-11-29 07:03:59.574 187189 DEBUG oslo_concurrency.processutils [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/5fcd8bd4a32a45b4a6d724fc6a62b14a.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:03:59 compute-0 nova_compute[187185]: 2025-11-29 07:03:59.640 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:03:59 compute-0 nova_compute[187185]: 2025-11-29 07:03:59.641 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:03:59 compute-0 nova_compute[187185]: 2025-11-29 07:03:59.642 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:03:59 compute-0 nova_compute[187185]: 2025-11-29 07:03:59.770 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:04:00 compute-0 nova_compute[187185]: 2025-11-29 07:04:00.014 187189 DEBUG oslo_concurrency.processutils [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/5fcd8bd4a32a45b4a6d724fc6a62b14a.tmp" returned: 1 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:04:00 compute-0 nova_compute[187185]: 2025-11-29 07:04:00.015 187189 DEBUG oslo_concurrency.processutils [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] 'ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/5fcd8bd4a32a45b4a6d724fc6a62b14a.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Nov 29 07:04:00 compute-0 nova_compute[187185]: 2025-11-29 07:04:00.016 187189 DEBUG nova.virt.libvirt.volume.remotefs [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Creating directory /var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02 on remote host 192.168.122.101 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Nov 29 07:04:00 compute-0 nova_compute[187185]: 2025-11-29 07:04:00.016 187189 DEBUG oslo_concurrency.processutils [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:04:00 compute-0 nova_compute[187185]: 2025-11-29 07:04:00.220 187189 DEBUG oslo_concurrency.processutils [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02" returned: 0 in 0.203s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:04:00 compute-0 nova_compute[187185]: 2025-11-29 07:04:00.227 187189 DEBUG nova.virt.libvirt.driver [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Nov 29 07:04:00 compute-0 nova_compute[187185]: 2025-11-29 07:04:00.311 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:04:00 compute-0 ovn_controller[95281]: 2025-11-29T07:04:00Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:15:c1:31 10.100.0.14
Nov 29 07:04:00 compute-0 ovn_controller[95281]: 2025-11-29T07:04:00Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:15:c1:31 10.100.0.14
Nov 29 07:04:00 compute-0 podman[222769]: 2025-11-29 07:04:00.831385042 +0000 UTC m=+0.093386739 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 07:04:04 compute-0 nova_compute[187185]: 2025-11-29 07:04:04.201 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:04:04 compute-0 nova_compute[187185]: 2025-11-29 07:04:04.773 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:04:06 compute-0 ovn_controller[95281]: 2025-11-29T07:04:06Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c9:dd:dd 10.100.0.11
Nov 29 07:04:06 compute-0 ovn_controller[95281]: 2025-11-29T07:04:06Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c9:dd:dd 10.100.0.11
Nov 29 07:04:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:04:09.110 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:04:09 compute-0 nova_compute[187185]: 2025-11-29 07:04:09.111 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:04:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:04:09.111 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:04:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:04:09.112 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:04:09 compute-0 nova_compute[187185]: 2025-11-29 07:04:09.203 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:04:09 compute-0 sshd-session[222794]: Received disconnect from 103.179.56.44 port 50012:11: Bye Bye [preauth]
Nov 29 07:04:09 compute-0 sshd-session[222794]: Disconnected from authenticating user root 103.179.56.44 port 50012 [preauth]
Nov 29 07:04:09 compute-0 nova_compute[187185]: 2025-11-29 07:04:09.776 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:04:10 compute-0 nova_compute[187185]: 2025-11-29 07:04:10.288 187189 DEBUG nova.virt.libvirt.driver [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Nov 29 07:04:12 compute-0 podman[222798]: 2025-11-29 07:04:12.83589071 +0000 UTC m=+0.079614763 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 07:04:12 compute-0 podman[222797]: 2025-11-29 07:04:12.84802684 +0000 UTC m=+0.098207274 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, release=1755695350, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.expose-services=, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, managed_by=edpm_ansible, vendor=Red Hat, Inc., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6)
Nov 29 07:04:12 compute-0 podman[222796]: 2025-11-29 07:04:12.868731491 +0000 UTC m=+0.121221600 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 29 07:04:14 compute-0 nova_compute[187185]: 2025-11-29 07:04:14.206 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:04:14 compute-0 nova_compute[187185]: 2025-11-29 07:04:14.824 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:04:19 compute-0 nova_compute[187185]: 2025-11-29 07:04:19.208 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:04:19 compute-0 nova_compute[187185]: 2025-11-29 07:04:19.827 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:04:21 compute-0 nova_compute[187185]: 2025-11-29 07:04:21.347 187189 DEBUG nova.virt.libvirt.driver [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Instance in state 1 after 21 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Nov 29 07:04:21 compute-0 podman[222860]: 2025-11-29 07:04:21.849991141 +0000 UTC m=+0.105658794 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 07:04:24 compute-0 nova_compute[187185]: 2025-11-29 07:04:24.210 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:04:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:04:24.824 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:04:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:04:24.824 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:04:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:04:24.826 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:04:24 compute-0 nova_compute[187185]: 2025-11-29 07:04:24.869 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:04:28 compute-0 podman[222887]: 2025-11-29 07:04:28.793882375 +0000 UTC m=+0.054550040 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 07:04:28 compute-0 podman[222886]: 2025-11-29 07:04:28.82043749 +0000 UTC m=+0.086738103 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=edpm, 
org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:04:29 compute-0 nova_compute[187185]: 2025-11-29 07:04:29.212 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:04:29 compute-0 nova_compute[187185]: 2025-11-29 07:04:29.871 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:04:31 compute-0 podman[222929]: 2025-11-29 07:04:31.79002159 +0000 UTC m=+0.057944425 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, config_id=multipathd, io.buildah.version=1.41.3)
Nov 29 07:04:32 compute-0 nova_compute[187185]: 2025-11-29 07:04:32.403 187189 DEBUG nova.virt.libvirt.driver [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Instance in state 1 after 32 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Nov 29 07:04:34 compute-0 nova_compute[187185]: 2025-11-29 07:04:34.214 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:04:34 compute-0 nova_compute[187185]: 2025-11-29 07:04:34.873 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:04:39 compute-0 nova_compute[187185]: 2025-11-29 07:04:39.216 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:04:39 compute-0 nova_compute[187185]: 2025-11-29 07:04:39.875 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:04:43 compute-0 nova_compute[187185]: 2025-11-29 07:04:43.449 187189 DEBUG nova.virt.libvirt.driver [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Instance in state 1 after 43 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Nov 29 07:04:43 compute-0 podman[222951]: 2025-11-29 07:04:43.802779636 +0000 UTC m=+0.054270692 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 07:04:43 compute-0 podman[222950]: 2025-11-29 07:04:43.812326894 +0000 UTC m=+0.064809698 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container)
Nov 29 07:04:43 compute-0 podman[222949]: 2025-11-29 07:04:43.818480516 +0000 UTC m=+0.076107164 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible)
Nov 29 07:04:44 compute-0 nova_compute[187185]: 2025-11-29 07:04:44.218 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:04:44 compute-0 nova_compute[187185]: 2025-11-29 07:04:44.877 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:04:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:47.992 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '02b37f0c-3272-417b-9791-48b555f68d56', 'name': 'tempest-₡-1325169813', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000040', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1dba9539037a4e9dbf33cba140fe21fe', 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'hostId': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 07:04:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:47.996 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02', 'name': 'tempest-DeleteServersTestJSON-server-660939301', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000003f', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '98df116965b74e4a9985049062e65162', 'user_id': '4ecd161098b5422084003b39f0504a8f', 'hostId': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 07:04:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:47.996 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.010 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.011 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.026 12 DEBUG ceilometer.compute.pollsters [-] 690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.026 12 DEBUG ceilometer.compute.pollsters [-] 690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '196226ab-34fd-473d-bc63-bf4689d4c380', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': '02b37f0c-3272-417b-9791-48b555f68d56-vda', 'timestamp': '2025-11-29T07:04:47.996988', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'instance-00000040', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b12bd290-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.715255588, 'message_signature': 'c67601f158a565faa45ee60c87e0734d4c5b709a732f0201bc84579d6c95c8a6'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 
'02b37f0c-3272-417b-9791-48b555f68d56-sda', 'timestamp': '2025-11-29T07:04:47.996988', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'instance-00000040', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b12be852-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.715255588, 'message_signature': '46354cc9c657b40ff607cbb7df93f3afa720b00a306f2ae0ff662fc5afb1e3e7'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02-vda', 'timestamp': '2025-11-29T07:04:47.996988', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-660939301', 'name': 'instance-0000003f', 'instance_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b12e34c2-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.729983821, 'message_signature': '3d2817b46e1ef7aae8a376a54f0579be27f3cb70d3f390791ec7283aa23055e6'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02-sda', 'timestamp': '2025-11-29T07:04:47.996988', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-660939301', 'name': 'instance-0000003f', 'instance_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b12e3ee0-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.729983821, 'message_signature': '738edd07cf96c95ec0cd6ee8c25c6e7d0a50e2f5d90d6c561fce6c4c6c0de5ed'}]}, 'timestamp': '2025-11-29 07:04:48.027016', '_unique_id': '9bf6691899e84a348567c27e68096402'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.028 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.029 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.057 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/disk.device.read.bytes volume: 30108160 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.058 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.088 12 DEBUG ceilometer.compute.pollsters [-] 690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk.device.read.bytes volume: 30439936 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.089 12 DEBUG ceilometer.compute.pollsters [-] 690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e728d176-38b9-4d6b-938a-c7e539397789', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30108160, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': '02b37f0c-3272-417b-9791-48b555f68d56-vda', 'timestamp': '2025-11-29T07:04:48.029950', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'instance-00000040', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b132f5de-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.748244622, 'message_signature': '177103310fb1777195037508e1b1182682230f8ad69bd0d197b32358523b7560'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 
'02b37f0c-3272-417b-9791-48b555f68d56-sda', 'timestamp': '2025-11-29T07:04:48.029950', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'instance-00000040', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b13316cc-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.748244622, 'message_signature': 'c94260455579971422a3fda07da74086806fe6eb32a571e1147b91f789dffae2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30439936, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02-vda', 'timestamp': '2025-11-29T07:04:48.029950', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-660939301', 'name': 'instance-0000003f', 'instance_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 
'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b137b57e-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.77704242, 'message_signature': '0e2eb263b3c6973257f04284e6e3957d2ed4601fa5459468e395a4d264e6fff6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02-sda', 'timestamp': '2025-11-29T07:04:48.029950', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-660939301', 'name': 'instance-0000003f', 'instance_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b137c4e2-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.77704242, 'message_signature': '86c6fc7c3c94624356702acb21ee9c9bcbc325f70d5585a155561f7e6ca563e9'}]}, 'timestamp': '2025-11-29 07:04:48.089330', '_unique_id': '176f7dd9678e4c60877f8cc7b8fb9ef6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.090 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.091 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.094 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 02b37f0c-3272-417b-9791-48b555f68d56 / tap3978dee0-d3 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.094 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.096 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 690daf8f-6151-4de9-85f6-b8a9fe51ea02 / tap6a0ff3c3-e3 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.096 12 DEBUG ceilometer.compute.pollsters [-] 690daf8f-6151-4de9-85f6-b8a9fe51ea02/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0b6e6782-a2d3-4dd0-96e9-1478f4f687b4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'instance-00000040-02b37f0c-3272-417b-9791-48b555f68d56-tap3978dee0-d3', 'timestamp': '2025-11-29T07:04:48.091248', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'tap3978dee0-d3', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c9:dd:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3978dee0-d3'}, 'message_id': 'b1389516-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.80949862, 'message_signature': '9ab36a77efeb97230be44376ce0e6e2e62b57eb2f4f3ea5324d86bf7f58e9ade'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'instance-0000003f-690daf8f-6151-4de9-85f6-b8a9fe51ea02-tap6a0ff3c3-e3', 'timestamp': '2025-11-29T07:04:48.091248', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-660939301', 'name': 'tap6a0ff3c3-e3', 'instance_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:15:c1:31', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6a0ff3c3-e3'}, 'message_id': 'b138ebf6-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.812870084, 'message_signature': '088297449cae98e44a4b9641b0b80e27aa5179391d9ed294e19dedfb93bf4e9d'}]}, 'timestamp': '2025-11-29 07:04:48.096928', '_unique_id': 'd2786a97952c42f2b0c9984c135b339a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.097 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.098 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.098 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/network.incoming.bytes volume: 1394 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.099 12 DEBUG ceilometer.compute.pollsters [-] 690daf8f-6151-4de9-85f6-b8a9fe51ea02/network.incoming.bytes volume: 1394 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '70454265-dfb6-49ff-9a15-163ac3600a1c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1394, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'instance-00000040-02b37f0c-3272-417b-9791-48b555f68d56-tap3978dee0-d3', 'timestamp': '2025-11-29T07:04:48.098819', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'tap3978dee0-d3', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c9:dd:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3978dee0-d3'}, 'message_id': 'b13944b6-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.80949862, 'message_signature': 'c1b9805755c996846a260ae7041fc78bc47b785f0d282bad352c01e866a23065'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1394, 'user_id': '4ecd161098b5422084003b39f0504a8f', 
'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'instance-0000003f-690daf8f-6151-4de9-85f6-b8a9fe51ea02-tap6a0ff3c3-e3', 'timestamp': '2025-11-29T07:04:48.098819', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-660939301', 'name': 'tap6a0ff3c3-e3', 'instance_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:15:c1:31', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6a0ff3c3-e3'}, 'message_id': 'b1394ede-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.812870084, 'message_signature': 'd896ba6e84594f170adb44a06b8e24318b77a9b50e3b5cb9dd38a13dbd01ba14'}]}, 'timestamp': '2025-11-29 07:04:48.099433', '_unique_id': '65985e37f024446eb6c9e120c8f6f809'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.100 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.101 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-₡-1325169813>, <NovaLikeServer: tempest-DeleteServersTestJSON-server-660939301>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-₡-1325169813>, <NovaLikeServer: tempest-DeleteServersTestJSON-server-660939301>]
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.101 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.101 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.101 12 DEBUG ceilometer.compute.pollsters [-] 690daf8f-6151-4de9-85f6-b8a9fe51ea02/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'edf85e84-4079-4dde-a02d-11ae133cf1aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'instance-00000040-02b37f0c-3272-417b-9791-48b555f68d56-tap3978dee0-d3', 'timestamp': '2025-11-29T07:04:48.101354', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'tap3978dee0-d3', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c9:dd:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3978dee0-d3'}, 'message_id': 'b139a3de-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.80949862, 'message_signature': '0bb90ae700df6bdcca48bbf58a40fd14648c0f2e3b396ed41e9fc08eb98b59f7'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'instance-0000003f-690daf8f-6151-4de9-85f6-b8a9fe51ea02-tap6a0ff3c3-e3', 'timestamp': '2025-11-29T07:04:48.101354', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-660939301', 'name': 'tap6a0ff3c3-e3', 'instance_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:15:c1:31', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6a0ff3c3-e3'}, 'message_id': 'b139ae56-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.812870084, 'message_signature': '406326f998d086f6dbbc9fa99d65de4e961d32659c8169daa5210fa11da34e9d'}]}, 'timestamp': '2025-11-29 07:04:48.101873', '_unique_id': 'd37957b9394f4becad6386a7814ae81b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.102 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.103 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.103 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/network.incoming.packets volume: 10 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.103 12 DEBUG ceilometer.compute.pollsters [-] 690daf8f-6151-4de9-85f6-b8a9fe51ea02/network.incoming.packets volume: 10 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '58f1e6e4-1bec-42d5-adca-a17a920bb56a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 10, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'instance-00000040-02b37f0c-3272-417b-9791-48b555f68d56-tap3978dee0-d3', 'timestamp': '2025-11-29T07:04:48.103083', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'tap3978dee0-d3', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c9:dd:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3978dee0-d3'}, 'message_id': 'b139e754-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.80949862, 'message_signature': '94561b5e4f400c62fa171c6031c80060a88d52f66f2d85d0e234790e7a9f4c6c'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 10, 'user_id': 
'4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'instance-0000003f-690daf8f-6151-4de9-85f6-b8a9fe51ea02-tap6a0ff3c3-e3', 'timestamp': '2025-11-29T07:04:48.103083', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-660939301', 'name': 'tap6a0ff3c3-e3', 'instance_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:15:c1:31', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6a0ff3c3-e3'}, 'message_id': 'b139f046-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.812870084, 'message_signature': 'e2a67f79531046dbe36656aabef308d21fd3a515076ac96f4aef15f822ebd7d2'}]}, 'timestamp': '2025-11-29 07:04:48.103531', '_unique_id': '7df2684440ea43f990a17999defbb054'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.104 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.105 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/disk.device.write.requests volume: 329 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.105 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.105 12 DEBUG ceilometer.compute.pollsters [-] 690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk.device.write.requests volume: 378 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.105 12 DEBUG ceilometer.compute.pollsters [-] 690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9b342571-6141-47c5-ad28-60a5b1886ec2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 329, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': '02b37f0c-3272-417b-9791-48b555f68d56-vda', 'timestamp': '2025-11-29T07:04:48.105021', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'instance-00000040', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b13a333a-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.748244622, 'message_signature': '438f35ce9d3d29938605ebfee2fe05b394fddfc7771b6f6d1fd295c2e295400c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 
'02b37f0c-3272-417b-9791-48b555f68d56-sda', 'timestamp': '2025-11-29T07:04:48.105021', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'instance-00000040', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b13a3b28-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.748244622, 'message_signature': '26f19485dc3e31cf67a219e20cb38a28e49e4681a52259915d9fbffebc86d4c3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 378, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02-vda', 'timestamp': '2025-11-29T07:04:48.105021', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-660939301', 'name': 'instance-0000003f', 'instance_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 
'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b13a433e-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.77704242, 'message_signature': '1cfa0e7d5b26598306b99dd38b3700b9246e492f14c64161c8e7ab0446a8d728'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02-sda', 'timestamp': '2025-11-29T07:04:48.105021', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-660939301', 'name': 'instance-0000003f', 'instance_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b13a4f46-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.77704242, 'message_signature': '5d798af9015f7f74a0458161174ac2a02e032a6dead7f13d11d24807da197b2a'}]}, 'timestamp': '2025-11-29 07:04:48.105991', '_unique_id': '0b01b4a6cd0b492ca8844cd14b83ff3f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.106 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.107 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.107 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.107 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-₡-1325169813>, <NovaLikeServer: tempest-DeleteServersTestJSON-server-660939301>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-₡-1325169813>, <NovaLikeServer: tempest-DeleteServersTestJSON-server-660939301>]
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.107 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.121 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/memory.usage volume: 46.37109375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.133 12 DEBUG ceilometer.compute.pollsters [-] 690daf8f-6151-4de9-85f6-b8a9fe51ea02/memory.usage volume: 43.15234375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd47fd8ed-93a4-467a-8fd1-6e85d0d9a628', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 46.37109375, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'timestamp': '2025-11-29T07:04:48.107979', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'instance-00000040', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'b13ccabe-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.839893152, 'message_signature': '3ec41bc9787a7dfce2692a604768366502c63b7c0682a76b8bc362c2e7293a65'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 43.15234375, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02', 'timestamp': 
'2025-11-29T07:04:48.107979', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-660939301', 'name': 'instance-0000003f', 'instance_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'b13e99b6-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.851932719, 'message_signature': '4065a9e2f93ca64eb918f81956cf3c6fc297417793d4f7bb7680f118560f5462'}]}, 'timestamp': '2025-11-29 07:04:48.134114', '_unique_id': '702903fbe57c43ebbf5ed9ab87929d5f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.135 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-₡-1325169813>, <NovaLikeServer: tempest-DeleteServersTestJSON-server-660939301>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-₡-1325169813>, <NovaLikeServer: tempest-DeleteServersTestJSON-server-660939301>]
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.136 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.136 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/disk.device.read.latency volume: 1356780113 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.136 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/disk.device.read.latency volume: 256274987 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.136 12 DEBUG ceilometer.compute.pollsters [-] 690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk.device.read.latency volume: 1534283553 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.136 12 DEBUG ceilometer.compute.pollsters [-] 690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk.device.read.latency volume: 24677762 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cac88a39-2f34-48ab-bf3d-0503e5f5a24d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1356780113, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': '02b37f0c-3272-417b-9791-48b555f68d56-vda', 'timestamp': '2025-11-29T07:04:48.136201', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'instance-00000040', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b13ef528-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.748244622, 'message_signature': 'a91c9a05cc6e63fddc02e0c4f9782cb86a235e0dbe5cbc96cc51b45f1925f82a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 256274987, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 
'02b37f0c-3272-417b-9791-48b555f68d56-sda', 'timestamp': '2025-11-29T07:04:48.136201', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'instance-00000040', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b13eff46-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.748244622, 'message_signature': 'b28351d1d4b08bdc47c53b9767f202daad63c0fa62b06b6215034b1062155bc7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1534283553, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02-vda', 'timestamp': '2025-11-29T07:04:48.136201', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-660939301', 'name': 'instance-0000003f', 'instance_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 
'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b13f0720-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.77704242, 'message_signature': '49efcd8f55892953e2aa41ce5742e7479c63ae69928186532aa0d682805fae8f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24677762, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02-sda', 'timestamp': '2025-11-29T07:04:48.136201', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-660939301', 'name': 'instance-0000003f', 'instance_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b13f103a-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.77704242, 'message_signature': 'c8c7222ba0a496d589878b618628850aa46eb5d682ff53670de572a6453125da'}]}, 'timestamp': '2025-11-29 07:04:48.137114', '_unique_id': 'b177d2f9605b4eb5a314395dd8c5b892'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.138 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.138 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/disk.device.read.requests volume: 1079 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.138 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.138 12 DEBUG ceilometer.compute.pollsters [-] 690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk.device.read.requests volume: 1094 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.139 12 DEBUG ceilometer.compute.pollsters [-] 690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9bda81d5-c9ac-4ed6-b37f-ff15be2f9eb7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1079, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': '02b37f0c-3272-417b-9791-48b555f68d56-vda', 'timestamp': '2025-11-29T07:04:48.138363', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'instance-00000040', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b13f494c-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.748244622, 'message_signature': 'a99754ca42e3feab2b97afdb06dfd48e1d758e69fddb20309e8c46d54169090f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 
'02b37f0c-3272-417b-9791-48b555f68d56-sda', 'timestamp': '2025-11-29T07:04:48.138363', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'instance-00000040', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b13f5360-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.748244622, 'message_signature': 'ab14b7affab3838b413c61d8de3d9883834928cb7cfdc7064940a18e44455633'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1094, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02-vda', 'timestamp': '2025-11-29T07:04:48.138363', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-660939301', 'name': 'instance-0000003f', 'instance_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 
'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b13f5d9c-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.77704242, 'message_signature': 'ca564b9fd380527f76fcadb6b57a6e7216315d9326c8fb18f4a5f10a25a67dcd'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02-sda', 'timestamp': '2025-11-29T07:04:48.138363', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-660939301', 'name': 'instance-0000003f', 'instance_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b13f6512-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.77704242, 'message_signature': '48db3c31c4c0fedb692f3cbb7fb5c8728f78b04e02c5ee0119f76a55969a52e8'}]}, 'timestamp': '2025-11-29 07:04:48.139282', '_unique_id': '47852b2654154ce7ae2d8e302ea640da'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-₡-1325169813>, <NovaLikeServer: tempest-DeleteServersTestJSON-server-660939301>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-₡-1325169813>, <NovaLikeServer: tempest-DeleteServersTestJSON-server-660939301>]
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.140 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.141 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/disk.device.write.bytes volume: 72830976 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.141 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.141 12 DEBUG ceilometer.compute.pollsters [-] 690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk.device.write.bytes volume: 73060352 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.141 12 DEBUG ceilometer.compute.pollsters [-] 690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '822d02eb-866e-4b6b-a6e4-0262879612dd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72830976, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': '02b37f0c-3272-417b-9791-48b555f68d56-vda', 'timestamp': '2025-11-29T07:04:48.141011', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'instance-00000040', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b13fb0b2-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.748244622, 'message_signature': 'f4595c9060ab71b1521b3e864a9ab76f12ab097d0a6a584ab9f9a8ad072c9758'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 
'02b37f0c-3272-417b-9791-48b555f68d56-sda', 'timestamp': '2025-11-29T07:04:48.141011', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'instance-00000040', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b13fb878-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.748244622, 'message_signature': '77c2bf2f60ae8f03a7fa5acff1839d78197e6111ecfa77a6e0f05fcff77427ac'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73060352, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02-vda', 'timestamp': '2025-11-29T07:04:48.141011', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-660939301', 'name': 'instance-0000003f', 'instance_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 
'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b13fc11a-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.77704242, 'message_signature': 'c54ec636de300afba2528677aa94de705d60c0343b59079c8dc4d38c85c979a2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02-sda', 'timestamp': '2025-11-29T07:04:48.141011', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-660939301', 'name': 'instance-0000003f', 'instance_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b13fcf48-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.77704242, 'message_signature': '7f1250c8f001601f88d20dfcef76536f07a7f1f7d744aaff51f090614b097a04'}]}, 'timestamp': '2025-11-29 07:04:48.142008', '_unique_id': '49ca6c46ed424bc281dd0865c69d80ba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.142 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.143 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.143 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.143 12 DEBUG ceilometer.compute.pollsters [-] 690daf8f-6151-4de9-85f6-b8a9fe51ea02/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9a86e891-8169-4f15-875f-8b808f049b9e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'instance-00000040-02b37f0c-3272-417b-9791-48b555f68d56-tap3978dee0-d3', 'timestamp': '2025-11-29T07:04:48.143321', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'tap3978dee0-d3', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c9:dd:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3978dee0-d3'}, 'message_id': 'b1400b3e-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.80949862, 'message_signature': '410c6c0b0c4ccbed4dfd8836adbd510c2232b9619a0261e64b261f3db147681f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'instance-0000003f-690daf8f-6151-4de9-85f6-b8a9fe51ea02-tap6a0ff3c3-e3', 'timestamp': '2025-11-29T07:04:48.143321', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-660939301', 'name': 'tap6a0ff3c3-e3', 'instance_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:15:c1:31', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6a0ff3c3-e3'}, 'message_id': 'b1401502-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.812870084, 'message_signature': '5ffff0aae72f37dc7dbfe0e9ef4ff2f1559b4ec52253eafc725cb31beadfaabd'}]}, 'timestamp': '2025-11-29 07:04:48.143800', '_unique_id': 'fb671520e3cb4d219454d942d7896382'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.144 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 DEBUG ceilometer.compute.pollsters [-] 690daf8f-6151-4de9-85f6-b8a9fe51ea02/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c9f63671-b593-4916-824f-2680630de4c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'instance-00000040-02b37f0c-3272-417b-9791-48b555f68d56-tap3978dee0-d3', 'timestamp': '2025-11-29T07:04:48.144966', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'tap3978dee0-d3', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c9:dd:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3978dee0-d3'}, 'message_id': 'b1404b76-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.80949862, 'message_signature': 'a5f6910273a4849b532be90fc5437f0c3d4c4636d3fb55f4f245fdac956910d3'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4ecd161098b5422084003b39f0504a8f', 
'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'instance-0000003f-690daf8f-6151-4de9-85f6-b8a9fe51ea02-tap6a0ff3c3-e3', 'timestamp': '2025-11-29T07:04:48.144966', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-660939301', 'name': 'tap6a0ff3c3-e3', 'instance_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:15:c1:31', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6a0ff3c3-e3'}, 'message_id': 'b14053aa-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.812870084, 'message_signature': 'a2e816e493242e59cb7ce7070ccb4be0ce78d28e51f5c10ab989190c8fa8ac22'}]}, 'timestamp': '2025-11-29 07:04:48.145397', '_unique_id': '8fd6de5ada7d414895f8779b6fbfbf95'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.145 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.146 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.146 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/disk.device.write.latency volume: 64576545638 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.146 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.149 12 DEBUG ceilometer.compute.pollsters [-] 690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk.device.write.latency volume: 49789579158 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.149 12 DEBUG ceilometer.compute.pollsters [-] 690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a1f1a8db-e16e-40c1-8d57-219b47f12582', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 64576545638, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': '02b37f0c-3272-417b-9791-48b555f68d56-vda', 'timestamp': '2025-11-29T07:04:48.146547', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'instance-00000040', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b14088e8-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.748244622, 'message_signature': '2221db0911443c84b0e33770e6733f3a3ea6eed4806bebc15a08bee885574213'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 
'02b37f0c-3272-417b-9791-48b555f68d56-sda', 'timestamp': '2025-11-29T07:04:48.146547', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'instance-00000040', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b14094e6-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.748244622, 'message_signature': '873ffb9e984ab42a154929b4fed1bbc77261093ed479bd6df7a07b5df109c00a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 49789579158, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02-vda', 'timestamp': '2025-11-29T07:04:48.146547', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-660939301', 'name': 'instance-0000003f', 'instance_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 
'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b140fdc8-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.77704242, 'message_signature': '59de3bf7ebd7423d54923ec2a256e44efce71815bb7993208be4b0aec656b4b5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02-sda', 'timestamp': '2025-11-29T07:04:48.146547', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-660939301', 'name': 'instance-0000003f', 'instance_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b14108b8-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.77704242, 'message_signature': 'e802f7f17fee43ff7af13d0330e35f65e98de26d92aa954c145e7fc07cc8e5d0'}]}, 'timestamp': '2025-11-29 07:04:48.150074', '_unique_id': 'f3b3708a49e4464c80b715c2ede53612'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.150 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.151 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.151 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.151 12 DEBUG ceilometer.compute.pollsters [-] 690daf8f-6151-4de9-85f6-b8a9fe51ea02/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cc5aa2fa-fc5f-47d0-9ef8-c1d6c8082553', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'instance-00000040-02b37f0c-3272-417b-9791-48b555f68d56-tap3978dee0-d3', 'timestamp': '2025-11-29T07:04:48.151626', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'tap3978dee0-d3', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c9:dd:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3978dee0-d3'}, 'message_id': 'b1415106-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.80949862, 'message_signature': '7e326e62b66bccdcb385e6525b46ddeedb52dd07defdf1821f5a2bd6dd4f4350'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4ecd161098b5422084003b39f0504a8f', 
'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'instance-0000003f-690daf8f-6151-4de9-85f6-b8a9fe51ea02-tap6a0ff3c3-e3', 'timestamp': '2025-11-29T07:04:48.151626', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-660939301', 'name': 'tap6a0ff3c3-e3', 'instance_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:15:c1:31', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6a0ff3c3-e3'}, 'message_id': 'b1415d72-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.812870084, 'message_signature': '82b3c4b66014e1a031216e3499171b234ee7837f51222b09a6f55a7f399633a9'}]}, 'timestamp': '2025-11-29 07:04:48.152222', '_unique_id': '067ee1211f894804a7ca0288cd02fc54'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.152 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.153 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.154 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/disk.device.usage volume: 29884416 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.154 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.154 12 DEBUG ceilometer.compute.pollsters [-] 690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.154 12 DEBUG ceilometer.compute.pollsters [-] 690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '63101ed9-5c5e-4359-99dd-1187ca916672', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29884416, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': '02b37f0c-3272-417b-9791-48b555f68d56-vda', 'timestamp': '2025-11-29T07:04:48.154053', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'instance-00000040', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b141ae58-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.715255588, 'message_signature': '120dfd393fd25c99d5641744926f6eab5ef3301c3ddd50c1d0058fe49dced9a5'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 
'02b37f0c-3272-417b-9791-48b555f68d56-sda', 'timestamp': '2025-11-29T07:04:48.154053', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'instance-00000040', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b141b600-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.715255588, 'message_signature': '16eac048251f4b17bea2a1a1af27e29fa81da3229f548f168636de7ef10c60c9'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02-vda', 'timestamp': '2025-11-29T07:04:48.154053', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-660939301', 'name': 'instance-0000003f', 'instance_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b141bd62-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.729983821, 'message_signature': '8ff4b6c680bc752c2962de5bf6606cc524ce9d6cbe8f5b503e6cc0c322ddfbb7'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02-sda', 'timestamp': '2025-11-29T07:04:48.154053', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-660939301', 'name': 'instance-0000003f', 'instance_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b141c492-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.729983821, 'message_signature': '9314bdaa015ab11a4574ced419653c5307b9d9fd967a99feb795ce1998bc21e7'}]}, 'timestamp': '2025-11-29 07:04:48.154922', '_unique_id': 'd45c205844e04b92a4493c365152fae7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.155 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.156 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.156 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.156 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.156 12 DEBUG ceilometer.compute.pollsters [-] 690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk.device.allocation volume: 30810112 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.156 12 DEBUG ceilometer.compute.pollsters [-] 690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5aeabec8-f8c1-423c-bcd6-42ec9928fdb1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': '02b37f0c-3272-417b-9791-48b555f68d56-vda', 'timestamp': '2025-11-29T07:04:48.156242', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'instance-00000040', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b14203c6-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.715255588, 'message_signature': 'a7728788cf00b0d6f8d9f101bc567ef77962db06c792b6e4d647fff120749fff'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 
'02b37f0c-3272-417b-9791-48b555f68d56-sda', 'timestamp': '2025-11-29T07:04:48.156242', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'instance-00000040', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b1420c4a-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.715255588, 'message_signature': '2b0f5679b1e2e9f72119742f8d36bf544000b9e0a9f2d9ff333ba6d9cce7a9a9'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30810112, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02-vda', 'timestamp': '2025-11-29T07:04:48.156242', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-660939301', 'name': 'instance-0000003f', 'instance_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b14213a2-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.729983821, 'message_signature': 'a2c7f16d7a11a6cfc89b0d378c8cac633c383c9fe3777a563676ec3353cb30b4'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02-sda', 'timestamp': '2025-11-29T07:04:48.156242', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-660939301', 'name': 'instance-0000003f', 'instance_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b1421f82-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.729983821, 'message_signature': '4d2e2d0e355ae1a493dcbc5ac11c012070891cafe2e83f44ed121d9820079637'}]}, 'timestamp': '2025-11-29 07:04:48.157165', '_unique_id': '60e3964ff897486c8efe07443949f97b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.157 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.158 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.158 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/network.outgoing.bytes volume: 2740 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.158 12 DEBUG ceilometer.compute.pollsters [-] 690daf8f-6151-4de9-85f6-b8a9fe51ea02/network.outgoing.bytes volume: 2740 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ee97c7c4-2989-4ea3-9005-d657bcb0d4af', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2740, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'instance-00000040-02b37f0c-3272-417b-9791-48b555f68d56-tap3978dee0-d3', 'timestamp': '2025-11-29T07:04:48.158339', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'tap3978dee0-d3', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c9:dd:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3978dee0-d3'}, 'message_id': 'b142559c-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.80949862, 'message_signature': '69bbfac10c467fa27c81c1fb6f73eee8e79451d4a33c5d58d83bf958308ffe0d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2740, 'user_id': '4ecd161098b5422084003b39f0504a8f', 
'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'instance-0000003f-690daf8f-6151-4de9-85f6-b8a9fe51ea02-tap6a0ff3c3-e3', 'timestamp': '2025-11-29T07:04:48.158339', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-660939301', 'name': 'tap6a0ff3c3-e3', 'instance_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:15:c1:31', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6a0ff3c3-e3'}, 'message_id': 'b1425d8a-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.812870084, 'message_signature': '2ddec04ff4d09c4a5947d320d1cc845cde8bb867d66ffcdcb523380c6a486ed5'}]}, 'timestamp': '2025-11-29 07:04:48.158759', '_unique_id': 'fab48da318434ff6b6ffbb1e8df988bc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.159 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/network.outgoing.packets volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 DEBUG ceilometer.compute.pollsters [-] 690daf8f-6151-4de9-85f6-b8a9fe51ea02/network.outgoing.packets volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1ce1e14d-114f-4bb5-809f-f9f35af737ef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 28, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'instance-00000040-02b37f0c-3272-417b-9791-48b555f68d56-tap3978dee0-d3', 'timestamp': '2025-11-29T07:04:48.159995', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'tap3978dee0-d3', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c9:dd:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3978dee0-d3'}, 'message_id': 'b1429642-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.80949862, 'message_signature': '9f4ca37f26c3ade78f89151ae0afc86a7f5d4e7b1e4f5259692a1356c6a4fc4f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 28, 'user_id': 
'4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'instance-0000003f-690daf8f-6151-4de9-85f6-b8a9fe51ea02-tap6a0ff3c3-e3', 'timestamp': '2025-11-29T07:04:48.159995', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-660939301', 'name': 'tap6a0ff3c3-e3', 'instance_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:15:c1:31', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6a0ff3c3-e3'}, 'message_id': 'b1429e80-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.812870084, 'message_signature': 'c4db154f419f7b323740d3fe3f30f4a801cfd735e0ba96e4f2ce17e8ef4bcae4'}]}, 'timestamp': '2025-11-29 07:04:48.160420', '_unique_id': '0001557229914388878646448ef4ee33'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.160 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.161 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.161 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/cpu volume: 11910000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.161 12 DEBUG ceilometer.compute.pollsters [-] 690daf8f-6151-4de9-85f6-b8a9fe51ea02/cpu volume: 11880000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f27e9956-c0a5-42b3-a0ac-10e443c868a2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11910000000, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'timestamp': '2025-11-29T07:04:48.161525', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'instance-00000040', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'b142d1f2-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.839893152, 'message_signature': 'd3a7abaaff904f08464695532121a2e945614260c404a8571a678389f16954ae'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11880000000, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02', 'timestamp': 
'2025-11-29T07:04:48.161525', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-660939301', 'name': 'instance-0000003f', 'instance_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'b142da1c-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.851932719, 'message_signature': '638de13d5a727b3fd21522bcb86c6fca4e2fd1f16c0752d67d0d8b6f337968bc'}]}, 'timestamp': '2025-11-29 07:04:48.161985', '_unique_id': '6260f6b8f74b485d80bac9d0e26f337e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.162 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.163 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.163 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.163 12 DEBUG ceilometer.compute.pollsters [-] 690daf8f-6151-4de9-85f6-b8a9fe51ea02/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9a6b376e-5452-4ac1-966c-9c9c843cb3c9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'instance-00000040-02b37f0c-3272-417b-9791-48b555f68d56-tap3978dee0-d3', 'timestamp': '2025-11-29T07:04:48.163254', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'tap3978dee0-d3', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c9:dd:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3978dee0-d3'}, 'message_id': 'b1431586-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.80949862, 'message_signature': '307f82b6652723892a2961badf88f3c818d26bdae654be57771ac471ed42f4bd'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'4ecd161098b5422084003b39f0504a8f', 'user_name': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_name': None, 'resource_id': 'instance-0000003f-690daf8f-6151-4de9-85f6-b8a9fe51ea02-tap6a0ff3c3-e3', 'timestamp': '2025-11-29T07:04:48.163254', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-660939301', 'name': 'tap6a0ff3c3-e3', 'instance_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02', 'instance_type': 'm1.nano', 'host': 'deae7c49b2a6e7e4e533816c7f09bdf080ba57aaef971e57bb01e49e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:15:c1:31', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6a0ff3c3-e3'}, 'message_id': 'b1431d60-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5335.812870084, 'message_signature': 'e525bdc69b8c5fc6c9a7685fe22a074710294e225fb4104f9cbe0b022ef05654'}]}, 'timestamp': '2025-11-29 07:04:48.163667', '_unique_id': '2b5f8e64951f43cc91435dc78d4fad9c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:04:48.164 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:04:49 compute-0 nova_compute[187185]: 2025-11-29 07:04:49.222 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:04:49 compute-0 nova_compute[187185]: 2025-11-29 07:04:49.880 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:04:50 compute-0 nova_compute[187185]: 2025-11-29 07:04:50.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:04:50 compute-0 nova_compute[187185]: 2025-11-29 07:04:50.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:04:50 compute-0 nova_compute[187185]: 2025-11-29 07:04:50.529 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "refresh_cache-02b37f0c-3272-417b-9791-48b555f68d56" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:04:50 compute-0 nova_compute[187185]: 2025-11-29 07:04:50.530 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquired lock "refresh_cache-02b37f0c-3272-417b-9791-48b555f68d56" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:04:50 compute-0 nova_compute[187185]: 2025-11-29 07:04:50.530 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 29 07:04:52 compute-0 nova_compute[187185]: 2025-11-29 07:04:52.289 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Updating instance_info_cache with network_info: [{"id": "3978dee0-d304-43a8-9478-68840d581d9b", "address": "fa:16:3e:c9:dd:dd", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3978dee0-d3", "ovs_interfaceid": "3978dee0-d304-43a8-9478-68840d581d9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:04:52 compute-0 nova_compute[187185]: 2025-11-29 07:04:52.310 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Releasing lock "refresh_cache-02b37f0c-3272-417b-9791-48b555f68d56" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:04:52 compute-0 nova_compute[187185]: 2025-11-29 07:04:52.311 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 29 07:04:52 compute-0 nova_compute[187185]: 2025-11-29 07:04:52.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:04:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:04:52.626 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:04:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:04:52.628 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:04:52 compute-0 nova_compute[187185]: 2025-11-29 07:04:52.630 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:04:52 compute-0 nova_compute[187185]: 2025-11-29 07:04:52.632 187189 DEBUG oslo_concurrency.lockutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "46a3bd91-6f2e-4a64-aeef-fe1b46a95bba" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:04:52 compute-0 nova_compute[187185]: 2025-11-29 07:04:52.633 187189 DEBUG oslo_concurrency.lockutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "46a3bd91-6f2e-4a64-aeef-fe1b46a95bba" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:04:52 compute-0 nova_compute[187185]: 2025-11-29 07:04:52.669 187189 DEBUG nova.compute.manager [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 07:04:52 compute-0 podman[223012]: 2025-11-29 07:04:52.894166668 +0000 UTC m=+0.140290593 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 29 07:04:53 compute-0 nova_compute[187185]: 2025-11-29 07:04:53.051 187189 DEBUG oslo_concurrency.lockutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:04:53 compute-0 nova_compute[187185]: 2025-11-29 07:04:53.053 187189 DEBUG oslo_concurrency.lockutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:04:53 compute-0 nova_compute[187185]: 2025-11-29 07:04:53.064 187189 DEBUG nova.virt.hardware [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 07:04:53 compute-0 nova_compute[187185]: 2025-11-29 07:04:53.064 187189 INFO nova.compute.claims [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] Claim successful on node compute-0.ctlplane.example.com
Nov 29 07:04:53 compute-0 nova_compute[187185]: 2025-11-29 07:04:53.754 187189 DEBUG nova.compute.provider_tree [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:04:53 compute-0 nova_compute[187185]: 2025-11-29 07:04:53.871 187189 DEBUG nova.scheduler.client.report [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:04:53 compute-0 nova_compute[187185]: 2025-11-29 07:04:53.917 187189 DEBUG oslo_concurrency.lockutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.864s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:04:53 compute-0 nova_compute[187185]: 2025-11-29 07:04:53.941 187189 DEBUG oslo_concurrency.lockutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "dbeb75d8-a2d5-4f46-8b67-a79ea432d2c8" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:04:53 compute-0 nova_compute[187185]: 2025-11-29 07:04:53.942 187189 DEBUG oslo_concurrency.lockutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "dbeb75d8-a2d5-4f46-8b67-a79ea432d2c8" acquired by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:04:53 compute-0 nova_compute[187185]: 2025-11-29 07:04:53.966 187189 DEBUG nova.compute.manager [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] No node specified, defaulting to compute-0.ctlplane.example.com _get_nodename /usr/lib/python3.9/site-packages/nova/compute/manager.py:10505
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.015 187189 DEBUG oslo_concurrency.lockutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "dbeb75d8-a2d5-4f46-8b67-a79ea432d2c8" "released" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: held 0.073s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.019 187189 DEBUG nova.compute.manager [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.109 187189 DEBUG nova.compute.manager [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.109 187189 DEBUG nova.network.neutron [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.134 187189 INFO nova.virt.libvirt.driver [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.155 187189 DEBUG nova.compute.manager [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.223 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.311 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.363 187189 DEBUG nova.compute.manager [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.365 187189 DEBUG nova.virt.libvirt.driver [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.366 187189 INFO nova.virt.libvirt.driver [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] Creating image(s)
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.367 187189 DEBUG oslo_concurrency.lockutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "/var/lib/nova/instances/46a3bd91-6f2e-4a64-aeef-fe1b46a95bba/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.368 187189 DEBUG oslo_concurrency.lockutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "/var/lib/nova/instances/46a3bd91-6f2e-4a64-aeef-fe1b46a95bba/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.369 187189 DEBUG oslo_concurrency.lockutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "/var/lib/nova/instances/46a3bd91-6f2e-4a64-aeef-fe1b46a95bba/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.397 187189 DEBUG oslo_concurrency.processutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.451 187189 DEBUG nova.network.neutron [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.451 187189 DEBUG nova.compute.manager [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.459 187189 DEBUG oslo_concurrency.processutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.460 187189 DEBUG oslo_concurrency.lockutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.461 187189 DEBUG oslo_concurrency.lockutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.473 187189 DEBUG oslo_concurrency.processutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.502 187189 DEBUG nova.virt.libvirt.driver [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Instance in state 1 after 54 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.532 187189 DEBUG oslo_concurrency.processutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.532 187189 DEBUG oslo_concurrency.processutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/46a3bd91-6f2e-4a64-aeef-fe1b46a95bba/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.632 187189 DEBUG oslo_concurrency.processutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/46a3bd91-6f2e-4a64-aeef-fe1b46a95bba/disk 1073741824" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.633 187189 DEBUG oslo_concurrency.lockutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.634 187189 DEBUG oslo_concurrency.processutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.709 187189 DEBUG oslo_concurrency.processutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.710 187189 DEBUG nova.virt.disk.api [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Checking if we can resize image /var/lib/nova/instances/46a3bd91-6f2e-4a64-aeef-fe1b46a95bba/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.711 187189 DEBUG oslo_concurrency.processutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46a3bd91-6f2e-4a64-aeef-fe1b46a95bba/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.792 187189 DEBUG oslo_concurrency.processutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46a3bd91-6f2e-4a64-aeef-fe1b46a95bba/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.793 187189 DEBUG nova.virt.disk.api [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Cannot resize image /var/lib/nova/instances/46a3bd91-6f2e-4a64-aeef-fe1b46a95bba/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.794 187189 DEBUG nova.objects.instance [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lazy-loading 'migration_context' on Instance uuid 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.809 187189 DEBUG nova.virt.libvirt.driver [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.810 187189 DEBUG nova.virt.libvirt.driver [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] Ensure instance console log exists: /var/lib/nova/instances/46a3bd91-6f2e-4a64-aeef-fe1b46a95bba/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.810 187189 DEBUG oslo_concurrency.lockutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.810 187189 DEBUG oslo_concurrency.lockutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.811 187189 DEBUG oslo_concurrency.lockutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.812 187189 DEBUG nova.virt.libvirt.driver [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.817 187189 WARNING nova.virt.libvirt.driver [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.820 187189 DEBUG nova.virt.libvirt.host [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.820 187189 DEBUG nova.virt.libvirt.host [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.823 187189 DEBUG nova.virt.libvirt.host [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.823 187189 DEBUG nova.virt.libvirt.host [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.824 187189 DEBUG nova.virt.libvirt.driver [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.825 187189 DEBUG nova.virt.hardware [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.825 187189 DEBUG nova.virt.hardware [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.825 187189 DEBUG nova.virt.hardware [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.825 187189 DEBUG nova.virt.hardware [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.826 187189 DEBUG nova.virt.hardware [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.826 187189 DEBUG nova.virt.hardware [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.826 187189 DEBUG nova.virt.hardware [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.826 187189 DEBUG nova.virt.hardware [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.826 187189 DEBUG nova.virt.hardware [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.827 187189 DEBUG nova.virt.hardware [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.827 187189 DEBUG nova.virt.hardware [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.831 187189 DEBUG nova.objects.instance [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lazy-loading 'pci_devices' on Instance uuid 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.847 187189 DEBUG nova.virt.libvirt.driver [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:04:54 compute-0 nova_compute[187185]:   <uuid>46a3bd91-6f2e-4a64-aeef-fe1b46a95bba</uuid>
Nov 29 07:04:54 compute-0 nova_compute[187185]:   <name>instance-00000047</name>
Nov 29 07:04:54 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 07:04:54 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:04:54 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:04:54 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:       <nova:name>tempest-ServersOnMultiNodesTest-server-710160406-1</nova:name>
Nov 29 07:04:54 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:04:54</nova:creationTime>
Nov 29 07:04:54 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 07:04:54 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 07:04:54 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:04:54 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:04:54 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:04:54 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:04:54 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:04:54 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:04:54 compute-0 nova_compute[187185]:         <nova:user uuid="0c56214d54944034ac2500edac59a239">tempest-ServersOnMultiNodesTest-2086403841-project-member</nova:user>
Nov 29 07:04:54 compute-0 nova_compute[187185]:         <nova:project uuid="d09f64becda14f30b831bdf7371d586b">tempest-ServersOnMultiNodesTest-2086403841</nova:project>
Nov 29 07:04:54 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:04:54 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:       <nova:ports/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:04:54 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:04:54 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:04:54 compute-0 nova_compute[187185]:     <system>
Nov 29 07:04:54 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:04:54 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:04:54 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:04:54 compute-0 nova_compute[187185]:       <entry name="serial">46a3bd91-6f2e-4a64-aeef-fe1b46a95bba</entry>
Nov 29 07:04:54 compute-0 nova_compute[187185]:       <entry name="uuid">46a3bd91-6f2e-4a64-aeef-fe1b46a95bba</entry>
Nov 29 07:04:54 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     </system>
Nov 29 07:04:54 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:04:54 compute-0 nova_compute[187185]:   <os>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:   </os>
Nov 29 07:04:54 compute-0 nova_compute[187185]:   <features>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:   </features>
Nov 29 07:04:54 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:04:54 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:04:54 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:04:54 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:04:54 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:04:54 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/46a3bd91-6f2e-4a64-aeef-fe1b46a95bba/disk"/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:04:54 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/46a3bd91-6f2e-4a64-aeef-fe1b46a95bba/disk.config"/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:04:54 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/46a3bd91-6f2e-4a64-aeef-fe1b46a95bba/console.log" append="off"/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     <video>
Nov 29 07:04:54 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     </video>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:04:54 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:04:54 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:04:54 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:04:54 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:04:54 compute-0 nova_compute[187185]: </domain>
Nov 29 07:04:54 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:04:54 compute-0 nova_compute[187185]: 2025-11-29 07:04:54.882 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:04:55 compute-0 nova_compute[187185]: 2025-11-29 07:04:55.292 187189 DEBUG nova.virt.libvirt.driver [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:04:55 compute-0 nova_compute[187185]: 2025-11-29 07:04:55.292 187189 DEBUG nova.virt.libvirt.driver [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:04:55 compute-0 nova_compute[187185]: 2025-11-29 07:04:55.292 187189 INFO nova.virt.libvirt.driver [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] Using config drive
Nov 29 07:04:55 compute-0 nova_compute[187185]: 2025-11-29 07:04:55.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:04:55 compute-0 nova_compute[187185]: 2025-11-29 07:04:55.471 187189 INFO nova.virt.libvirt.driver [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] Creating config drive at /var/lib/nova/instances/46a3bd91-6f2e-4a64-aeef-fe1b46a95bba/disk.config
Nov 29 07:04:55 compute-0 nova_compute[187185]: 2025-11-29 07:04:55.477 187189 DEBUG oslo_concurrency.processutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/46a3bd91-6f2e-4a64-aeef-fe1b46a95bba/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpu6pj487n execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:04:55 compute-0 nova_compute[187185]: 2025-11-29 07:04:55.620 187189 DEBUG oslo_concurrency.processutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/46a3bd91-6f2e-4a64-aeef-fe1b46a95bba/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpu6pj487n" returned: 0 in 0.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:04:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:04:55.630 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:04:55 compute-0 systemd-machined[153486]: New machine qemu-24-instance-00000047.
Nov 29 07:04:55 compute-0 systemd[1]: Started Virtual Machine qemu-24-instance-00000047.
Nov 29 07:04:56 compute-0 nova_compute[187185]: 2025-11-29 07:04:56.080 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399896.0792701, 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:04:56 compute-0 nova_compute[187185]: 2025-11-29 07:04:56.082 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] VM Resumed (Lifecycle Event)
Nov 29 07:04:56 compute-0 nova_compute[187185]: 2025-11-29 07:04:56.085 187189 DEBUG nova.compute.manager [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:04:56 compute-0 nova_compute[187185]: 2025-11-29 07:04:56.086 187189 DEBUG nova.virt.libvirt.driver [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 07:04:56 compute-0 nova_compute[187185]: 2025-11-29 07:04:56.090 187189 INFO nova.virt.libvirt.driver [-] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] Instance spawned successfully.
Nov 29 07:04:56 compute-0 nova_compute[187185]: 2025-11-29 07:04:56.091 187189 DEBUG nova.virt.libvirt.driver [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 07:04:56 compute-0 nova_compute[187185]: 2025-11-29 07:04:56.192 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:04:56 compute-0 nova_compute[187185]: 2025-11-29 07:04:56.195 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:04:56 compute-0 nova_compute[187185]: 2025-11-29 07:04:56.205 187189 DEBUG nova.virt.libvirt.driver [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:04:56 compute-0 nova_compute[187185]: 2025-11-29 07:04:56.205 187189 DEBUG nova.virt.libvirt.driver [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:04:56 compute-0 nova_compute[187185]: 2025-11-29 07:04:56.206 187189 DEBUG nova.virt.libvirt.driver [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:04:56 compute-0 nova_compute[187185]: 2025-11-29 07:04:56.206 187189 DEBUG nova.virt.libvirt.driver [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:04:56 compute-0 nova_compute[187185]: 2025-11-29 07:04:56.207 187189 DEBUG nova.virt.libvirt.driver [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:04:56 compute-0 nova_compute[187185]: 2025-11-29 07:04:56.207 187189 DEBUG nova.virt.libvirt.driver [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:04:56 compute-0 nova_compute[187185]: 2025-11-29 07:04:56.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:04:56 compute-0 nova_compute[187185]: 2025-11-29 07:04:56.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:04:56 compute-0 nova_compute[187185]: 2025-11-29 07:04:56.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:04:56 compute-0 nova_compute[187185]: 2025-11-29 07:04:56.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:04:56 compute-0 nova_compute[187185]: 2025-11-29 07:04:56.441 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:04:56 compute-0 nova_compute[187185]: 2025-11-29 07:04:56.442 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764399896.0813687, 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:04:56 compute-0 nova_compute[187185]: 2025-11-29 07:04:56.442 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] VM Started (Lifecycle Event)
Nov 29 07:04:56 compute-0 nova_compute[187185]: 2025-11-29 07:04:56.444 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:04:56 compute-0 nova_compute[187185]: 2025-11-29 07:04:56.444 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:04:56 compute-0 nova_compute[187185]: 2025-11-29 07:04:56.445 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:04:56 compute-0 nova_compute[187185]: 2025-11-29 07:04:56.445 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:04:56 compute-0 kernel: tap6a0ff3c3-e3 (unregistering): left promiscuous mode
Nov 29 07:04:56 compute-0 NetworkManager[55227]: <info>  [1764399896.5539] device (tap6a0ff3c3-e3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:04:56 compute-0 ovn_controller[95281]: 2025-11-29T07:04:56Z|00159|binding|INFO|Releasing lport 6a0ff3c3-e368-4504-9884-40716725c901 from this chassis (sb_readonly=0)
Nov 29 07:04:56 compute-0 ovn_controller[95281]: 2025-11-29T07:04:56Z|00160|binding|INFO|Setting lport 6a0ff3c3-e368-4504-9884-40716725c901 down in Southbound
Nov 29 07:04:56 compute-0 ovn_controller[95281]: 2025-11-29T07:04:56Z|00161|binding|INFO|Removing iface tap6a0ff3c3-e3 ovn-installed in OVS
Nov 29 07:04:56 compute-0 nova_compute[187185]: 2025-11-29 07:04:56.568 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:04:56 compute-0 nova_compute[187185]: 2025-11-29 07:04:56.570 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:04:56 compute-0 nova_compute[187185]: 2025-11-29 07:04:56.581 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:04:56 compute-0 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000003f.scope: Deactivated successfully.
Nov 29 07:04:56 compute-0 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000003f.scope: Consumed 16.091s CPU time.
Nov 29 07:04:56 compute-0 systemd-machined[153486]: Machine qemu-22-instance-0000003f terminated.
Nov 29 07:04:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:04:56.675 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:15:c1:31 10.100.0.14'], port_security=['fa:16:3e:15:c1:31 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '690daf8f-6151-4de9-85f6-b8a9fe51ea02', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98df116965b74e4a9985049062e65162', 'neutron:revision_number': '4', 'neutron:security_group_ids': '234720a9-9cd1-4b87-9bec-1abfe8ff0514', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e694bb30-a43a-4d18-87fa-e5c0dd8850c2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=6a0ff3c3-e368-4504-9884-40716725c901) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:04:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:04:56.677 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 6a0ff3c3-e368-4504-9884-40716725c901 in datapath fd9eb57e-b1f8-4bae-a60f-8e40613556cd unbound from our chassis
Nov 29 07:04:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:04:56.679 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fd9eb57e-b1f8-4bae-a60f-8e40613556cd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:04:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:04:56.682 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[aceaf161-30a4-4de6-8a1e-85e3fa285b66]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:04:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:04:56.683 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd namespace which is not needed anymore
Nov 29 07:04:56 compute-0 nova_compute[187185]: 2025-11-29 07:04:56.710 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:04:56 compute-0 nova_compute[187185]: 2025-11-29 07:04:56.714 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:04:56 compute-0 nova_compute[187185]: 2025-11-29 07:04:56.860 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:04:56 compute-0 nova_compute[187185]: 2025-11-29 07:04:56.866 187189 INFO nova.compute.manager [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] Took 2.50 seconds to spawn the instance on the hypervisor.
Nov 29 07:04:56 compute-0 nova_compute[187185]: 2025-11-29 07:04:56.867 187189 DEBUG nova.compute.manager [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:04:56 compute-0 nova_compute[187185]: 2025-11-29 07:04:56.922 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02b37f0c-3272-417b-9791-48b555f68d56/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:04:57 compute-0 nova_compute[187185]: 2025-11-29 07:04:57.006 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02b37f0c-3272-417b-9791-48b555f68d56/disk --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:04:57 compute-0 nova_compute[187185]: 2025-11-29 07:04:57.007 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02b37f0c-3272-417b-9791-48b555f68d56/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:04:57 compute-0 nova_compute[187185]: 2025-11-29 07:04:57.073 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02b37f0c-3272-417b-9791-48b555f68d56/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:04:57 compute-0 nova_compute[187185]: 2025-11-29 07:04:57.081 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:04:57 compute-0 nova_compute[187185]: 2025-11-29 07:04:57.208 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk --force-share --output=json" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:04:57 compute-0 nova_compute[187185]: 2025-11-29 07:04:57.210 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:04:57 compute-0 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[222534]: [NOTICE]   (222538) : haproxy version is 2.8.14-c23fe91
Nov 29 07:04:57 compute-0 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[222534]: [NOTICE]   (222538) : path to executable is /usr/sbin/haproxy
Nov 29 07:04:57 compute-0 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[222534]: [WARNING]  (222538) : Exiting Master process...
Nov 29 07:04:57 compute-0 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[222534]: [WARNING]  (222538) : Exiting Master process...
Nov 29 07:04:57 compute-0 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[222534]: [ALERT]    (222538) : Current worker (222540) exited with code 143 (Terminated)
Nov 29 07:04:57 compute-0 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[222534]: [WARNING]  (222538) : All workers exited. Exiting... (0)
Nov 29 07:04:57 compute-0 systemd[1]: libpod-ee627524d984af0e13335db0316324fa2410181628b13f8f12f56ead2fad935d.scope: Deactivated successfully.
Nov 29 07:04:57 compute-0 podman[223109]: 2025-11-29 07:04:57.252723011 +0000 UTC m=+0.462020189 container died ee627524d984af0e13335db0316324fa2410181628b13f8f12f56ead2fad935d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 07:04:57 compute-0 nova_compute[187185]: 2025-11-29 07:04:57.291 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:04:57 compute-0 nova_compute[187185]: 2025-11-29 07:04:57.297 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46a3bd91-6f2e-4a64-aeef-fe1b46a95bba/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:04:57 compute-0 nova_compute[187185]: 2025-11-29 07:04:57.365 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46a3bd91-6f2e-4a64-aeef-fe1b46a95bba/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:04:57 compute-0 nova_compute[187185]: 2025-11-29 07:04:57.367 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46a3bd91-6f2e-4a64-aeef-fe1b46a95bba/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:04:57 compute-0 nova_compute[187185]: 2025-11-29 07:04:57.434 187189 DEBUG nova.compute.manager [req-7795dd48-0225-495c-9beb-e73f763e9d18 req-5f168ecd-8c73-415e-a546-83696d14a639 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Received event network-vif-unplugged-6a0ff3c3-e368-4504-9884-40716725c901 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:04:57 compute-0 nova_compute[187185]: 2025-11-29 07:04:57.435 187189 DEBUG oslo_concurrency.lockutils [req-7795dd48-0225-495c-9beb-e73f763e9d18 req-5f168ecd-8c73-415e-a546-83696d14a639 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:04:57 compute-0 nova_compute[187185]: 2025-11-29 07:04:57.436 187189 DEBUG oslo_concurrency.lockutils [req-7795dd48-0225-495c-9beb-e73f763e9d18 req-5f168ecd-8c73-415e-a546-83696d14a639 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:04:57 compute-0 nova_compute[187185]: 2025-11-29 07:04:57.436 187189 DEBUG oslo_concurrency.lockutils [req-7795dd48-0225-495c-9beb-e73f763e9d18 req-5f168ecd-8c73-415e-a546-83696d14a639 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:04:57 compute-0 nova_compute[187185]: 2025-11-29 07:04:57.437 187189 DEBUG nova.compute.manager [req-7795dd48-0225-495c-9beb-e73f763e9d18 req-5f168ecd-8c73-415e-a546-83696d14a639 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] No waiting events found dispatching network-vif-unplugged-6a0ff3c3-e368-4504-9884-40716725c901 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:04:57 compute-0 nova_compute[187185]: 2025-11-29 07:04:57.437 187189 WARNING nova.compute.manager [req-7795dd48-0225-495c-9beb-e73f763e9d18 req-5f168ecd-8c73-415e-a546-83696d14a639 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Received unexpected event network-vif-unplugged-6a0ff3c3-e368-4504-9884-40716725c901 for instance with vm_state active and task_state resize_migrating.
Nov 29 07:04:57 compute-0 nova_compute[187185]: 2025-11-29 07:04:57.438 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46a3bd91-6f2e-4a64-aeef-fe1b46a95bba/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:04:57 compute-0 nova_compute[187185]: 2025-11-29 07:04:57.519 187189 INFO nova.virt.libvirt.driver [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Instance shutdown successfully after 57 seconds.
Nov 29 07:04:57 compute-0 nova_compute[187185]: 2025-11-29 07:04:57.526 187189 INFO nova.virt.libvirt.driver [-] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Instance destroyed successfully.
Nov 29 07:04:57 compute-0 nova_compute[187185]: 2025-11-29 07:04:57.527 187189 DEBUG nova.virt.libvirt.vif [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:03:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-660939301',display_name='tempest-DeleteServersTestJSON-server-660939301',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-660939301',id=63,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:03:44Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='98df116965b74e4a9985049062e65162',ramdisk_id='',reservation_id='r-0d65al9n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-1973671383',owner_user_name='tempest-DeleteServersTestJSON-1973671383-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:03:55Z,user_data=None,user_id='4ecd161098b5422084003b39f0504a8f',uuid=690daf8f-6151-4de9-85f6-b8a9fe51ea02,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6a0ff3c3-e368-4504-9884-40716725c901", "address": "fa:16:3e:15:c1:31", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-1503104692-network", "vif_mac": "fa:16:3e:15:c1:31"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a0ff3c3-e3", "ovs_interfaceid": "6a0ff3c3-e368-4504-9884-40716725c901", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:04:57 compute-0 nova_compute[187185]: 2025-11-29 07:04:57.528 187189 DEBUG nova.network.os_vif_util [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Converting VIF {"id": "6a0ff3c3-e368-4504-9884-40716725c901", "address": "fa:16:3e:15:c1:31", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-1503104692-network", "vif_mac": "fa:16:3e:15:c1:31"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a0ff3c3-e3", "ovs_interfaceid": "6a0ff3c3-e368-4504-9884-40716725c901", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:04:57 compute-0 nova_compute[187185]: 2025-11-29 07:04:57.529 187189 DEBUG nova.network.os_vif_util [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:15:c1:31,bridge_name='br-int',has_traffic_filtering=True,id=6a0ff3c3-e368-4504-9884-40716725c901,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a0ff3c3-e3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:04:57 compute-0 nova_compute[187185]: 2025-11-29 07:04:57.529 187189 DEBUG os_vif [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:15:c1:31,bridge_name='br-int',has_traffic_filtering=True,id=6a0ff3c3-e368-4504-9884-40716725c901,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a0ff3c3-e3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:04:57 compute-0 nova_compute[187185]: 2025-11-29 07:04:57.532 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:04:57 compute-0 nova_compute[187185]: 2025-11-29 07:04:57.533 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a0ff3c3-e3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:04:57 compute-0 nova_compute[187185]: 2025-11-29 07:04:57.535 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:04:57 compute-0 nova_compute[187185]: 2025-11-29 07:04:57.538 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:04:57 compute-0 nova_compute[187185]: 2025-11-29 07:04:57.541 187189 INFO os_vif [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:15:c1:31,bridge_name='br-int',has_traffic_filtering=True,id=6a0ff3c3-e368-4504-9884-40716725c901,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a0ff3c3-e3')
Nov 29 07:04:57 compute-0 nova_compute[187185]: 2025-11-29 07:04:57.551 187189 DEBUG oslo_concurrency.processutils [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:04:57 compute-0 nova_compute[187185]: 2025-11-29 07:04:57.604 187189 INFO nova.compute.manager [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] Took 4.86 seconds to build instance.
Nov 29 07:04:57 compute-0 nova_compute[187185]: 2025-11-29 07:04:57.642 187189 DEBUG oslo_concurrency.processutils [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:04:57 compute-0 nova_compute[187185]: 2025-11-29 07:04:57.644 187189 DEBUG oslo_concurrency.processutils [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:04:57 compute-0 nova_compute[187185]: 2025-11-29 07:04:57.719 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:04:57 compute-0 nova_compute[187185]: 2025-11-29 07:04:57.721 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5337MB free_disk=73.27214431762695GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:04:57 compute-0 nova_compute[187185]: 2025-11-29 07:04:57.722 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:04:57 compute-0 nova_compute[187185]: 2025-11-29 07:04:57.722 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:04:57 compute-0 nova_compute[187185]: 2025-11-29 07:04:57.726 187189 DEBUG oslo_concurrency.processutils [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:04:57 compute-0 nova_compute[187185]: 2025-11-29 07:04:57.727 187189 DEBUG nova.virt.libvirt.volume.remotefs [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Copying file /var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02_resize/disk to 192.168.122.101:/var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 29 07:04:57 compute-0 nova_compute[187185]: 2025-11-29 07:04:57.728 187189 DEBUG oslo_concurrency.processutils [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02_resize/disk 192.168.122.101:/var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:04:57 compute-0 nova_compute[187185]: 2025-11-29 07:04:57.958 187189 DEBUG oslo_concurrency.lockutils [None req-e1661564-eedf-41cb-9e9b-465267a3324d 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "46a3bd91-6f2e-4a64-aeef-fe1b46a95bba" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.325s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:04:58 compute-0 nova_compute[187185]: 2025-11-29 07:04:58.594 187189 INFO nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Updating resource usage from migration b2f30ee9-093d-4a50-9511-730851938837
Nov 29 07:04:58 compute-0 nova_compute[187185]: 2025-11-29 07:04:58.617 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance 02b37f0c-3272-417b-9791-48b555f68d56 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 07:04:58 compute-0 nova_compute[187185]: 2025-11-29 07:04:58.618 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Migration b2f30ee9-093d-4a50-9511-730851938837 is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Nov 29 07:04:58 compute-0 nova_compute[187185]: 2025-11-29 07:04:58.618 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 07:04:58 compute-0 nova_compute[187185]: 2025-11-29 07:04:58.619 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:04:58 compute-0 nova_compute[187185]: 2025-11-29 07:04:58.619 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:04:58 compute-0 nova_compute[187185]: 2025-11-29 07:04:58.662 187189 DEBUG oslo_concurrency.processutils [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CMD "scp -r /var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02_resize/disk 192.168.122.101:/var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk" returned: 0 in 0.934s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:04:58 compute-0 nova_compute[187185]: 2025-11-29 07:04:58.663 187189 DEBUG nova.virt.libvirt.volume.remotefs [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Copying file /var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02_resize/disk.config to 192.168.122.101:/var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 29 07:04:58 compute-0 nova_compute[187185]: 2025-11-29 07:04:58.664 187189 DEBUG oslo_concurrency.processutils [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02_resize/disk.config 192.168.122.101:/var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:04:58 compute-0 nova_compute[187185]: 2025-11-29 07:04:58.720 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:04:58 compute-0 nova_compute[187185]: 2025-11-29 07:04:58.758 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:04:58 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ee627524d984af0e13335db0316324fa2410181628b13f8f12f56ead2fad935d-userdata-shm.mount: Deactivated successfully.
Nov 29 07:04:58 compute-0 systemd[1]: var-lib-containers-storage-overlay-cddb442fdfa72fb2b2e65745b79bdacfb5514e2592ecad32eee28d5ccce15e22-merged.mount: Deactivated successfully.
Nov 29 07:04:58 compute-0 nova_compute[187185]: 2025-11-29 07:04:58.801 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:04:58 compute-0 nova_compute[187185]: 2025-11-29 07:04:58.802 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.080s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:04:58 compute-0 podman[223176]: 2025-11-29 07:04:58.887023367 +0000 UTC m=+0.057149535 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 07:04:58 compute-0 nova_compute[187185]: 2025-11-29 07:04:58.912 187189 DEBUG oslo_concurrency.processutils [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CMD "scp -C -r /var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02_resize/disk.config 192.168.122.101:/var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk.config" returned: 0 in 0.248s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:04:58 compute-0 nova_compute[187185]: 2025-11-29 07:04:58.915 187189 DEBUG nova.virt.libvirt.volume.remotefs [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Copying file /var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02_resize/disk.info to 192.168.122.101:/var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 29 07:04:58 compute-0 nova_compute[187185]: 2025-11-29 07:04:58.915 187189 DEBUG oslo_concurrency.processutils [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02_resize/disk.info 192.168.122.101:/var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:04:59 compute-0 nova_compute[187185]: 2025-11-29 07:04:59.225 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:04:59 compute-0 nova_compute[187185]: 2025-11-29 07:04:59.231 187189 DEBUG oslo_concurrency.processutils [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CMD "scp -C -r /var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02_resize/disk.info 192.168.122.101:/var/lib/nova/instances/690daf8f-6151-4de9-85f6-b8a9fe51ea02/disk.info" returned: 0 in 0.316s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:04:59 compute-0 nova_compute[187185]: 2025-11-29 07:04:59.401 187189 DEBUG neutronclient.v2_0.client [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 6a0ff3c3-e368-4504-9884-40716725c901 for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Nov 29 07:04:59 compute-0 nova_compute[187185]: 2025-11-29 07:04:59.582 187189 DEBUG oslo_concurrency.lockutils [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:04:59 compute-0 nova_compute[187185]: 2025-11-29 07:04:59.583 187189 DEBUG oslo_concurrency.lockutils [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:04:59 compute-0 nova_compute[187185]: 2025-11-29 07:04:59.583 187189 DEBUG oslo_concurrency.lockutils [None req-daa740b3-a2bf-4012-9152-765baffb8c42 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:04:59 compute-0 podman[223109]: 2025-11-29 07:04:59.655737417 +0000 UTC m=+2.865034625 container cleanup ee627524d984af0e13335db0316324fa2410181628b13f8f12f56ead2fad935d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 29 07:04:59 compute-0 systemd[1]: libpod-conmon-ee627524d984af0e13335db0316324fa2410181628b13f8f12f56ead2fad935d.scope: Deactivated successfully.
Nov 29 07:04:59 compute-0 nova_compute[187185]: 2025-11-29 07:04:59.802 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:04:59 compute-0 nova_compute[187185]: 2025-11-29 07:04:59.802 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:04:59 compute-0 nova_compute[187185]: 2025-11-29 07:04:59.936 187189 DEBUG nova.compute.manager [req-d587afd1-e93e-4266-9752-483d52dd2f31 req-e264c996-df10-4ca6-8420-1dad930afc6a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Received event network-vif-plugged-6a0ff3c3-e368-4504-9884-40716725c901 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:04:59 compute-0 nova_compute[187185]: 2025-11-29 07:04:59.936 187189 DEBUG oslo_concurrency.lockutils [req-d587afd1-e93e-4266-9752-483d52dd2f31 req-e264c996-df10-4ca6-8420-1dad930afc6a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:04:59 compute-0 nova_compute[187185]: 2025-11-29 07:04:59.937 187189 DEBUG oslo_concurrency.lockutils [req-d587afd1-e93e-4266-9752-483d52dd2f31 req-e264c996-df10-4ca6-8420-1dad930afc6a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:04:59 compute-0 nova_compute[187185]: 2025-11-29 07:04:59.937 187189 DEBUG oslo_concurrency.lockutils [req-d587afd1-e93e-4266-9752-483d52dd2f31 req-e264c996-df10-4ca6-8420-1dad930afc6a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:04:59 compute-0 nova_compute[187185]: 2025-11-29 07:04:59.937 187189 DEBUG nova.compute.manager [req-d587afd1-e93e-4266-9752-483d52dd2f31 req-e264c996-df10-4ca6-8420-1dad930afc6a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] No waiting events found dispatching network-vif-plugged-6a0ff3c3-e368-4504-9884-40716725c901 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:04:59 compute-0 nova_compute[187185]: 2025-11-29 07:04:59.938 187189 WARNING nova.compute.manager [req-d587afd1-e93e-4266-9752-483d52dd2f31 req-e264c996-df10-4ca6-8420-1dad930afc6a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Received unexpected event network-vif-plugged-6a0ff3c3-e368-4504-9884-40716725c901 for instance with vm_state active and task_state resize_migrated.
Nov 29 07:05:00 compute-0 podman[223201]: 2025-11-29 07:05:00.068625067 +0000 UTC m=+1.152970872 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 07:05:00 compute-0 nova_compute[187185]: 2025-11-29 07:05:00.815 187189 DEBUG oslo_concurrency.lockutils [None req-bac1a9c5-cf36-4276-a71e-62acc3ca6169 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "46a3bd91-6f2e-4a64-aeef-fe1b46a95bba" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:05:00 compute-0 nova_compute[187185]: 2025-11-29 07:05:00.816 187189 DEBUG oslo_concurrency.lockutils [None req-bac1a9c5-cf36-4276-a71e-62acc3ca6169 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "46a3bd91-6f2e-4a64-aeef-fe1b46a95bba" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:05:00 compute-0 nova_compute[187185]: 2025-11-29 07:05:00.817 187189 DEBUG oslo_concurrency.lockutils [None req-bac1a9c5-cf36-4276-a71e-62acc3ca6169 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "46a3bd91-6f2e-4a64-aeef-fe1b46a95bba-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:05:00 compute-0 nova_compute[187185]: 2025-11-29 07:05:00.817 187189 DEBUG oslo_concurrency.lockutils [None req-bac1a9c5-cf36-4276-a71e-62acc3ca6169 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "46a3bd91-6f2e-4a64-aeef-fe1b46a95bba-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:05:00 compute-0 nova_compute[187185]: 2025-11-29 07:05:00.817 187189 DEBUG oslo_concurrency.lockutils [None req-bac1a9c5-cf36-4276-a71e-62acc3ca6169 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "46a3bd91-6f2e-4a64-aeef-fe1b46a95bba-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:05:00 compute-0 nova_compute[187185]: 2025-11-29 07:05:00.830 187189 INFO nova.compute.manager [None req-bac1a9c5-cf36-4276-a71e-62acc3ca6169 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] Terminating instance
Nov 29 07:05:00 compute-0 nova_compute[187185]: 2025-11-29 07:05:00.841 187189 DEBUG oslo_concurrency.lockutils [None req-bac1a9c5-cf36-4276-a71e-62acc3ca6169 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "refresh_cache-46a3bd91-6f2e-4a64-aeef-fe1b46a95bba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:05:00 compute-0 nova_compute[187185]: 2025-11-29 07:05:00.841 187189 DEBUG oslo_concurrency.lockutils [None req-bac1a9c5-cf36-4276-a71e-62acc3ca6169 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquired lock "refresh_cache-46a3bd91-6f2e-4a64-aeef-fe1b46a95bba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:05:00 compute-0 nova_compute[187185]: 2025-11-29 07:05:00.841 187189 DEBUG nova.network.neutron [None req-bac1a9c5-cf36-4276-a71e-62acc3ca6169 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:05:00 compute-0 podman[223218]: 2025-11-29 07:05:00.99075526 +0000 UTC m=+1.308313130 container remove ee627524d984af0e13335db0316324fa2410181628b13f8f12f56ead2fad935d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 29 07:05:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:05:00.999 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[cb95ae07-59cd-4b6e-855d-2ba08c4849a9]: (4, ('Sat Nov 29 07:04:56 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd (ee627524d984af0e13335db0316324fa2410181628b13f8f12f56ead2fad935d)\nee627524d984af0e13335db0316324fa2410181628b13f8f12f56ead2fad935d\nSat Nov 29 07:04:59 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd (ee627524d984af0e13335db0316324fa2410181628b13f8f12f56ead2fad935d)\nee627524d984af0e13335db0316324fa2410181628b13f8f12f56ead2fad935d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:05:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:05:01.001 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[20ee8e09-b136-4169-99e9-da09cfb0281f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:05:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:05:01.003 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfd9eb57e-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:05:01 compute-0 nova_compute[187185]: 2025-11-29 07:05:01.005 187189 DEBUG nova.network.neutron [None req-bac1a9c5-cf36-4276-a71e-62acc3ca6169 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:05:01 compute-0 kernel: tapfd9eb57e-b0: left promiscuous mode
Nov 29 07:05:01 compute-0 nova_compute[187185]: 2025-11-29 07:05:01.012 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:05:01 compute-0 nova_compute[187185]: 2025-11-29 07:05:01.020 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:05:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:05:01.023 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[2552ab69-7a6c-4902-8345-e2b9906f300a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:05:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:05:01.043 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[586e76f5-37ca-4c62-a1f6-4098113a203f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:05:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:05:01.046 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[cbfa4988-81cb-4500-b83c-13e689f84586]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:05:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:05:01.067 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[0eec7996-f917-44ec-bc5f-901d7b477b38]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 527129, 'reachable_time': 17620, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223242, 'error': None, 'target': 'ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:05:01 compute-0 systemd[1]: run-netns-ovnmeta\x2dfd9eb57e\x2db1f8\x2d4bae\x2da60f\x2d8e40613556cd.mount: Deactivated successfully.
Nov 29 07:05:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:05:01.080 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:05:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:05:01.081 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[07c47924-807c-4112-b11b-f25bb657b960]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:05:01 compute-0 nova_compute[187185]: 2025-11-29 07:05:01.311 187189 DEBUG nova.network.neutron [None req-bac1a9c5-cf36-4276-a71e-62acc3ca6169 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:05:01 compute-0 nova_compute[187185]: 2025-11-29 07:05:01.333 187189 DEBUG oslo_concurrency.lockutils [None req-bac1a9c5-cf36-4276-a71e-62acc3ca6169 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Releasing lock "refresh_cache-46a3bd91-6f2e-4a64-aeef-fe1b46a95bba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:05:01 compute-0 nova_compute[187185]: 2025-11-29 07:05:01.335 187189 DEBUG nova.compute.manager [None req-bac1a9c5-cf36-4276-a71e-62acc3ca6169 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 07:05:01 compute-0 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000047.scope: Deactivated successfully.
Nov 29 07:05:01 compute-0 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000047.scope: Consumed 5.595s CPU time.
Nov 29 07:05:01 compute-0 systemd-machined[153486]: Machine qemu-24-instance-00000047 terminated.
Nov 29 07:05:01 compute-0 nova_compute[187185]: 2025-11-29 07:05:01.596 187189 INFO nova.virt.libvirt.driver [-] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] Instance destroyed successfully.
Nov 29 07:05:01 compute-0 nova_compute[187185]: 2025-11-29 07:05:01.598 187189 DEBUG nova.objects.instance [None req-bac1a9c5-cf36-4276-a71e-62acc3ca6169 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lazy-loading 'resources' on Instance uuid 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:05:01 compute-0 nova_compute[187185]: 2025-11-29 07:05:01.627 187189 INFO nova.virt.libvirt.driver [None req-bac1a9c5-cf36-4276-a71e-62acc3ca6169 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] Deleting instance files /var/lib/nova/instances/46a3bd91-6f2e-4a64-aeef-fe1b46a95bba_del
Nov 29 07:05:01 compute-0 nova_compute[187185]: 2025-11-29 07:05:01.629 187189 INFO nova.virt.libvirt.driver [None req-bac1a9c5-cf36-4276-a71e-62acc3ca6169 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] Deletion of /var/lib/nova/instances/46a3bd91-6f2e-4a64-aeef-fe1b46a95bba_del complete
Nov 29 07:05:01 compute-0 nova_compute[187185]: 2025-11-29 07:05:01.725 187189 INFO nova.compute.manager [None req-bac1a9c5-cf36-4276-a71e-62acc3ca6169 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] Took 0.39 seconds to destroy the instance on the hypervisor.
Nov 29 07:05:01 compute-0 nova_compute[187185]: 2025-11-29 07:05:01.726 187189 DEBUG oslo.service.loopingcall [None req-bac1a9c5-cf36-4276-a71e-62acc3ca6169 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 07:05:01 compute-0 nova_compute[187185]: 2025-11-29 07:05:01.726 187189 DEBUG nova.compute.manager [-] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 07:05:01 compute-0 nova_compute[187185]: 2025-11-29 07:05:01.727 187189 DEBUG nova.network.neutron [-] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 07:05:01 compute-0 nova_compute[187185]: 2025-11-29 07:05:01.860 187189 DEBUG nova.network.neutron [-] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:05:01 compute-0 nova_compute[187185]: 2025-11-29 07:05:01.884 187189 DEBUG nova.network.neutron [-] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:05:01 compute-0 nova_compute[187185]: 2025-11-29 07:05:01.899 187189 INFO nova.compute.manager [-] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] Took 0.17 seconds to deallocate network for instance.
Nov 29 07:05:02 compute-0 nova_compute[187185]: 2025-11-29 07:05:02.075 187189 DEBUG oslo_concurrency.lockutils [None req-bac1a9c5-cf36-4276-a71e-62acc3ca6169 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:05:02 compute-0 nova_compute[187185]: 2025-11-29 07:05:02.076 187189 DEBUG oslo_concurrency.lockutils [None req-bac1a9c5-cf36-4276-a71e-62acc3ca6169 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:05:02 compute-0 nova_compute[187185]: 2025-11-29 07:05:02.119 187189 DEBUG nova.compute.manager [req-fb5fed23-cb6f-4fa3-a5ed-5c1f718fafdc req-72c18687-8aa5-406c-a107-cb63a2440f9c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Received event network-changed-6a0ff3c3-e368-4504-9884-40716725c901 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:05:02 compute-0 nova_compute[187185]: 2025-11-29 07:05:02.120 187189 DEBUG nova.compute.manager [req-fb5fed23-cb6f-4fa3-a5ed-5c1f718fafdc req-72c18687-8aa5-406c-a107-cb63a2440f9c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Refreshing instance network info cache due to event network-changed-6a0ff3c3-e368-4504-9884-40716725c901. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:05:02 compute-0 nova_compute[187185]: 2025-11-29 07:05:02.120 187189 DEBUG oslo_concurrency.lockutils [req-fb5fed23-cb6f-4fa3-a5ed-5c1f718fafdc req-72c18687-8aa5-406c-a107-cb63a2440f9c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-690daf8f-6151-4de9-85f6-b8a9fe51ea02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:05:02 compute-0 nova_compute[187185]: 2025-11-29 07:05:02.121 187189 DEBUG oslo_concurrency.lockutils [req-fb5fed23-cb6f-4fa3-a5ed-5c1f718fafdc req-72c18687-8aa5-406c-a107-cb63a2440f9c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-690daf8f-6151-4de9-85f6-b8a9fe51ea02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:05:02 compute-0 nova_compute[187185]: 2025-11-29 07:05:02.121 187189 DEBUG nova.network.neutron [req-fb5fed23-cb6f-4fa3-a5ed-5c1f718fafdc req-72c18687-8aa5-406c-a107-cb63a2440f9c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Refreshing network info cache for port 6a0ff3c3-e368-4504-9884-40716725c901 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:05:02 compute-0 nova_compute[187185]: 2025-11-29 07:05:02.210 187189 DEBUG nova.compute.provider_tree [None req-bac1a9c5-cf36-4276-a71e-62acc3ca6169 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:05:02 compute-0 nova_compute[187185]: 2025-11-29 07:05:02.230 187189 DEBUG nova.scheduler.client.report [None req-bac1a9c5-cf36-4276-a71e-62acc3ca6169 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:05:02 compute-0 nova_compute[187185]: 2025-11-29 07:05:02.266 187189 DEBUG oslo_concurrency.lockutils [None req-bac1a9c5-cf36-4276-a71e-62acc3ca6169 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.189s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:05:02 compute-0 nova_compute[187185]: 2025-11-29 07:05:02.323 187189 INFO nova.scheduler.client.report [None req-bac1a9c5-cf36-4276-a71e-62acc3ca6169 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Deleted allocations for instance 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba
Nov 29 07:05:02 compute-0 nova_compute[187185]: 2025-11-29 07:05:02.451 187189 DEBUG oslo_concurrency.lockutils [None req-bac1a9c5-cf36-4276-a71e-62acc3ca6169 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "46a3bd91-6f2e-4a64-aeef-fe1b46a95bba" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.635s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:05:02 compute-0 nova_compute[187185]: 2025-11-29 07:05:02.536 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:05:02 compute-0 podman[223255]: 2025-11-29 07:05:02.872745951 +0000 UTC m=+0.124589850 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:05:04 compute-0 nova_compute[187185]: 2025-11-29 07:05:04.061 187189 DEBUG nova.network.neutron [req-fb5fed23-cb6f-4fa3-a5ed-5c1f718fafdc req-72c18687-8aa5-406c-a107-cb63a2440f9c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Updated VIF entry in instance network info cache for port 6a0ff3c3-e368-4504-9884-40716725c901. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:05:04 compute-0 nova_compute[187185]: 2025-11-29 07:05:04.061 187189 DEBUG nova.network.neutron [req-fb5fed23-cb6f-4fa3-a5ed-5c1f718fafdc req-72c18687-8aa5-406c-a107-cb63a2440f9c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Updating instance_info_cache with network_info: [{"id": "6a0ff3c3-e368-4504-9884-40716725c901", "address": "fa:16:3e:15:c1:31", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a0ff3c3-e3", "ovs_interfaceid": "6a0ff3c3-e368-4504-9884-40716725c901", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:05:04 compute-0 nova_compute[187185]: 2025-11-29 07:05:04.228 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:05:04 compute-0 nova_compute[187185]: 2025-11-29 07:05:04.316 187189 DEBUG oslo_concurrency.lockutils [req-fb5fed23-cb6f-4fa3-a5ed-5c1f718fafdc req-72c18687-8aa5-406c-a107-cb63a2440f9c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-690daf8f-6151-4de9-85f6-b8a9fe51ea02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:05:04 compute-0 nova_compute[187185]: 2025-11-29 07:05:04.593 187189 DEBUG nova.compute.manager [req-cf704190-f656-4fe3-b147-836cc3638873 req-4e4dfacf-eb98-4699-8d10-270b9a81b214 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Received event network-vif-plugged-6a0ff3c3-e368-4504-9884-40716725c901 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:05:04 compute-0 nova_compute[187185]: 2025-11-29 07:05:04.594 187189 DEBUG oslo_concurrency.lockutils [req-cf704190-f656-4fe3-b147-836cc3638873 req-4e4dfacf-eb98-4699-8d10-270b9a81b214 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:05:04 compute-0 nova_compute[187185]: 2025-11-29 07:05:04.594 187189 DEBUG oslo_concurrency.lockutils [req-cf704190-f656-4fe3-b147-836cc3638873 req-4e4dfacf-eb98-4699-8d10-270b9a81b214 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:05:04 compute-0 nova_compute[187185]: 2025-11-29 07:05:04.595 187189 DEBUG oslo_concurrency.lockutils [req-cf704190-f656-4fe3-b147-836cc3638873 req-4e4dfacf-eb98-4699-8d10-270b9a81b214 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:05:04 compute-0 nova_compute[187185]: 2025-11-29 07:05:04.596 187189 DEBUG nova.compute.manager [req-cf704190-f656-4fe3-b147-836cc3638873 req-4e4dfacf-eb98-4699-8d10-270b9a81b214 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] No waiting events found dispatching network-vif-plugged-6a0ff3c3-e368-4504-9884-40716725c901 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:05:04 compute-0 nova_compute[187185]: 2025-11-29 07:05:04.596 187189 WARNING nova.compute.manager [req-cf704190-f656-4fe3-b147-836cc3638873 req-4e4dfacf-eb98-4699-8d10-270b9a81b214 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Received unexpected event network-vif-plugged-6a0ff3c3-e368-4504-9884-40716725c901 for instance with vm_state active and task_state resize_finish.
Nov 29 07:05:05 compute-0 nova_compute[187185]: 2025-11-29 07:05:05.623 187189 DEBUG oslo_concurrency.lockutils [None req-3d0432c1-067f-4bd3-a23b-e8ea8ba560b8 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:05:05 compute-0 nova_compute[187185]: 2025-11-29 07:05:05.624 187189 DEBUG oslo_concurrency.lockutils [None req-3d0432c1-067f-4bd3-a23b-e8ea8ba560b8 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:05:05 compute-0 nova_compute[187185]: 2025-11-29 07:05:05.624 187189 DEBUG nova.compute.manager [None req-3d0432c1-067f-4bd3-a23b-e8ea8ba560b8 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Going to confirm migration 11 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679
Nov 29 07:05:05 compute-0 nova_compute[187185]: 2025-11-29 07:05:05.672 187189 DEBUG nova.objects.instance [None req-3d0432c1-067f-4bd3-a23b-e8ea8ba560b8 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lazy-loading 'info_cache' on Instance uuid 690daf8f-6151-4de9-85f6-b8a9fe51ea02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:05:06 compute-0 nova_compute[187185]: 2025-11-29 07:05:06.089 187189 DEBUG neutronclient.v2_0.client [None req-3d0432c1-067f-4bd3-a23b-e8ea8ba560b8 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 6a0ff3c3-e368-4504-9884-40716725c901 for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Nov 29 07:05:06 compute-0 nova_compute[187185]: 2025-11-29 07:05:06.090 187189 DEBUG oslo_concurrency.lockutils [None req-3d0432c1-067f-4bd3-a23b-e8ea8ba560b8 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "refresh_cache-690daf8f-6151-4de9-85f6-b8a9fe51ea02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:05:06 compute-0 nova_compute[187185]: 2025-11-29 07:05:06.090 187189 DEBUG oslo_concurrency.lockutils [None req-3d0432c1-067f-4bd3-a23b-e8ea8ba560b8 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquired lock "refresh_cache-690daf8f-6151-4de9-85f6-b8a9fe51ea02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:05:06 compute-0 nova_compute[187185]: 2025-11-29 07:05:06.091 187189 DEBUG nova.network.neutron [None req-3d0432c1-067f-4bd3-a23b-e8ea8ba560b8 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:05:06 compute-0 nova_compute[187185]: 2025-11-29 07:05:06.702 187189 DEBUG nova.compute.manager [req-e6e22410-788e-4427-b57d-89aed0cc1178 req-a6c55007-ffc9-4fb7-8e20-99acb9144cac 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Received event network-vif-plugged-6a0ff3c3-e368-4504-9884-40716725c901 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:05:06 compute-0 nova_compute[187185]: 2025-11-29 07:05:06.703 187189 DEBUG oslo_concurrency.lockutils [req-e6e22410-788e-4427-b57d-89aed0cc1178 req-a6c55007-ffc9-4fb7-8e20-99acb9144cac 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:05:06 compute-0 nova_compute[187185]: 2025-11-29 07:05:06.704 187189 DEBUG oslo_concurrency.lockutils [req-e6e22410-788e-4427-b57d-89aed0cc1178 req-a6c55007-ffc9-4fb7-8e20-99acb9144cac 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:05:06 compute-0 nova_compute[187185]: 2025-11-29 07:05:06.704 187189 DEBUG oslo_concurrency.lockutils [req-e6e22410-788e-4427-b57d-89aed0cc1178 req-a6c55007-ffc9-4fb7-8e20-99acb9144cac 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:05:06 compute-0 nova_compute[187185]: 2025-11-29 07:05:06.704 187189 DEBUG nova.compute.manager [req-e6e22410-788e-4427-b57d-89aed0cc1178 req-a6c55007-ffc9-4fb7-8e20-99acb9144cac 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] No waiting events found dispatching network-vif-plugged-6a0ff3c3-e368-4504-9884-40716725c901 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:05:06 compute-0 nova_compute[187185]: 2025-11-29 07:05:06.705 187189 WARNING nova.compute.manager [req-e6e22410-788e-4427-b57d-89aed0cc1178 req-a6c55007-ffc9-4fb7-8e20-99acb9144cac 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Received unexpected event network-vif-plugged-6a0ff3c3-e368-4504-9884-40716725c901 for instance with vm_state resized and task_state deleting.
Nov 29 07:05:07 compute-0 nova_compute[187185]: 2025-11-29 07:05:07.539 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:05:08 compute-0 nova_compute[187185]: 2025-11-29 07:05:08.935 187189 DEBUG nova.network.neutron [None req-3d0432c1-067f-4bd3-a23b-e8ea8ba560b8 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Updating instance_info_cache with network_info: [{"id": "6a0ff3c3-e368-4504-9884-40716725c901", "address": "fa:16:3e:15:c1:31", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a0ff3c3-e3", "ovs_interfaceid": "6a0ff3c3-e368-4504-9884-40716725c901", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:05:09 compute-0 nova_compute[187185]: 2025-11-29 07:05:09.289 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:05:10 compute-0 nova_compute[187185]: 2025-11-29 07:05:10.678 187189 DEBUG oslo_concurrency.lockutils [None req-3d0432c1-067f-4bd3-a23b-e8ea8ba560b8 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Releasing lock "refresh_cache-690daf8f-6151-4de9-85f6-b8a9fe51ea02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:05:10 compute-0 nova_compute[187185]: 2025-11-29 07:05:10.680 187189 DEBUG nova.objects.instance [None req-3d0432c1-067f-4bd3-a23b-e8ea8ba560b8 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lazy-loading 'migration_context' on Instance uuid 690daf8f-6151-4de9-85f6-b8a9fe51ea02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:05:10 compute-0 nova_compute[187185]: 2025-11-29 07:05:10.734 187189 DEBUG nova.virt.libvirt.vif [None req-3d0432c1-067f-4bd3-a23b-e8ea8ba560b8 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:03:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-660939301',display_name='tempest-DeleteServersTestJSON-server-660939301',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-660939301',id=63,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:05:04Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98df116965b74e4a9985049062e65162',ramdisk_id='',reservation_id='r-0d65al9n',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image
_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-1973671383',owner_user_name='tempest-DeleteServersTestJSON-1973671383-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:05:05Z,user_data=None,user_id='4ecd161098b5422084003b39f0504a8f',uuid=690daf8f-6151-4de9-85f6-b8a9fe51ea02,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "6a0ff3c3-e368-4504-9884-40716725c901", "address": "fa:16:3e:15:c1:31", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a0ff3c3-e3", "ovs_interfaceid": "6a0ff3c3-e368-4504-9884-40716725c901", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:05:10 compute-0 nova_compute[187185]: 2025-11-29 07:05:10.736 187189 DEBUG nova.network.os_vif_util [None req-3d0432c1-067f-4bd3-a23b-e8ea8ba560b8 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Converting VIF {"id": "6a0ff3c3-e368-4504-9884-40716725c901", "address": "fa:16:3e:15:c1:31", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a0ff3c3-e3", "ovs_interfaceid": "6a0ff3c3-e368-4504-9884-40716725c901", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:05:10 compute-0 nova_compute[187185]: 2025-11-29 07:05:10.738 187189 DEBUG nova.network.os_vif_util [None req-3d0432c1-067f-4bd3-a23b-e8ea8ba560b8 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:15:c1:31,bridge_name='br-int',has_traffic_filtering=True,id=6a0ff3c3-e368-4504-9884-40716725c901,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a0ff3c3-e3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:05:10 compute-0 nova_compute[187185]: 2025-11-29 07:05:10.739 187189 DEBUG os_vif [None req-3d0432c1-067f-4bd3-a23b-e8ea8ba560b8 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:15:c1:31,bridge_name='br-int',has_traffic_filtering=True,id=6a0ff3c3-e368-4504-9884-40716725c901,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a0ff3c3-e3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:05:10 compute-0 nova_compute[187185]: 2025-11-29 07:05:10.742 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:05:10 compute-0 nova_compute[187185]: 2025-11-29 07:05:10.743 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a0ff3c3-e3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:05:10 compute-0 nova_compute[187185]: 2025-11-29 07:05:10.744 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:05:10 compute-0 nova_compute[187185]: 2025-11-29 07:05:10.748 187189 INFO os_vif [None req-3d0432c1-067f-4bd3-a23b-e8ea8ba560b8 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:15:c1:31,bridge_name='br-int',has_traffic_filtering=True,id=6a0ff3c3-e368-4504-9884-40716725c901,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a0ff3c3-e3')
Nov 29 07:05:10 compute-0 nova_compute[187185]: 2025-11-29 07:05:10.749 187189 DEBUG oslo_concurrency.lockutils [None req-3d0432c1-067f-4bd3-a23b-e8ea8ba560b8 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:05:10 compute-0 nova_compute[187185]: 2025-11-29 07:05:10.750 187189 DEBUG oslo_concurrency.lockutils [None req-3d0432c1-067f-4bd3-a23b-e8ea8ba560b8 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:05:10 compute-0 nova_compute[187185]: 2025-11-29 07:05:10.937 187189 DEBUG nova.compute.provider_tree [None req-3d0432c1-067f-4bd3-a23b-e8ea8ba560b8 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:05:10 compute-0 nova_compute[187185]: 2025-11-29 07:05:10.953 187189 DEBUG nova.scheduler.client.report [None req-3d0432c1-067f-4bd3-a23b-e8ea8ba560b8 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:05:11 compute-0 nova_compute[187185]: 2025-11-29 07:05:11.064 187189 DEBUG oslo_concurrency.lockutils [None req-3d0432c1-067f-4bd3-a23b-e8ea8ba560b8 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.314s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:05:11 compute-0 nova_compute[187185]: 2025-11-29 07:05:11.656 187189 INFO nova.scheduler.client.report [None req-3d0432c1-067f-4bd3-a23b-e8ea8ba560b8 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Deleted allocation for migration b2f30ee9-093d-4a50-9511-730851938837
Nov 29 07:05:11 compute-0 nova_compute[187185]: 2025-11-29 07:05:11.862 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399896.840866, 690daf8f-6151-4de9-85f6-b8a9fe51ea02 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:05:11 compute-0 nova_compute[187185]: 2025-11-29 07:05:11.863 187189 INFO nova.compute.manager [-] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] VM Stopped (Lifecycle Event)
Nov 29 07:05:12 compute-0 nova_compute[187185]: 2025-11-29 07:05:12.542 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:05:14 compute-0 nova_compute[187185]: 2025-11-29 07:05:14.231 187189 DEBUG nova.compute.manager [None req-3a7a6981-de1a-453b-9d03-f78db4e7403c - - - - - -] [instance: 690daf8f-6151-4de9-85f6-b8a9fe51ea02] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:05:14 compute-0 nova_compute[187185]: 2025-11-29 07:05:14.292 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:05:14 compute-0 podman[223275]: 2025-11-29 07:05:14.841089475 +0000 UTC m=+0.092004529 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 29 07:05:14 compute-0 podman[223276]: 2025-11-29 07:05:14.851327184 +0000 UTC m=+0.096589889 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, name=ubi9-minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.expose-services=, vcs-type=git, distribution-scope=public)
Nov 29 07:05:14 compute-0 podman[223277]: 2025-11-29 07:05:14.852712453 +0000 UTC m=+0.087913753 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 07:05:14 compute-0 nova_compute[187185]: 2025-11-29 07:05:14.993 187189 DEBUG oslo_concurrency.lockutils [None req-3d0432c1-067f-4bd3-a23b-e8ea8ba560b8 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "690daf8f-6151-4de9-85f6-b8a9fe51ea02" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 9.369s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:05:16 compute-0 nova_compute[187185]: 2025-11-29 07:05:16.594 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399901.5933514, 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:05:16 compute-0 nova_compute[187185]: 2025-11-29 07:05:16.595 187189 INFO nova.compute.manager [-] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] VM Stopped (Lifecycle Event)
Nov 29 07:05:17 compute-0 nova_compute[187185]: 2025-11-29 07:05:17.040 187189 DEBUG nova.compute.manager [None req-29a9fc9c-1ec1-4c64-a53e-9ab7d41a3385 - - - - - -] [instance: 46a3bd91-6f2e-4a64-aeef-fe1b46a95bba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:05:17 compute-0 nova_compute[187185]: 2025-11-29 07:05:17.545 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:05:19 compute-0 nova_compute[187185]: 2025-11-29 07:05:19.295 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:05:22 compute-0 nova_compute[187185]: 2025-11-29 07:05:22.547 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:05:23 compute-0 podman[223341]: 2025-11-29 07:05:23.879561168 +0000 UTC m=+0.137803593 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller)
Nov 29 07:05:24 compute-0 nova_compute[187185]: 2025-11-29 07:05:24.298 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:05:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:05:24.825 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:05:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:05:24.826 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:05:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:05:24.827 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:05:27 compute-0 nova_compute[187185]: 2025-11-29 07:05:27.551 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:05:27 compute-0 ovn_controller[95281]: 2025-11-29T07:05:27Z|00162|binding|INFO|Releasing lport ed5aef73-67a0-4ad1-8aea-9c411786c18e from this chassis (sb_readonly=0)
Nov 29 07:05:27 compute-0 nova_compute[187185]: 2025-11-29 07:05:27.651 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:05:29 compute-0 nova_compute[187185]: 2025-11-29 07:05:29.301 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:05:29 compute-0 podman[223368]: 2025-11-29 07:05:29.796952884 +0000 UTC m=+0.056695072 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 07:05:31 compute-0 podman[223392]: 2025-11-29 07:05:31.799028637 +0000 UTC m=+0.062854436 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 29 07:05:32 compute-0 nova_compute[187185]: 2025-11-29 07:05:32.554 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:05:33 compute-0 podman[223413]: 2025-11-29 07:05:33.779812217 +0000 UTC m=+0.051714741 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 07:05:34 compute-0 nova_compute[187185]: 2025-11-29 07:05:34.303 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:05:36 compute-0 nova_compute[187185]: 2025-11-29 07:05:36.371 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:05:36 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:05:36.370 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:05:36 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:05:36.373 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:05:37 compute-0 nova_compute[187185]: 2025-11-29 07:05:37.557 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:05:38 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:05:38.376 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:05:39 compute-0 nova_compute[187185]: 2025-11-29 07:05:39.306 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:05:42 compute-0 nova_compute[187185]: 2025-11-29 07:05:42.559 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:05:44 compute-0 nova_compute[187185]: 2025-11-29 07:05:44.309 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:05:45 compute-0 podman[223433]: 2025-11-29 07:05:45.790418458 +0000 UTC m=+0.061415006 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 29 07:05:45 compute-0 podman[223435]: 2025-11-29 07:05:45.820629271 +0000 UTC m=+0.081579145 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 07:05:45 compute-0 podman[223434]: 2025-11-29 07:05:45.87582477 +0000 UTC m=+0.127278976 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_id=edpm, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-type=git, version=9.6, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, build-date=2025-08-20T13:12:41, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc.)
Nov 29 07:05:47 compute-0 nova_compute[187185]: 2025-11-29 07:05:47.562 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:05:49 compute-0 nova_compute[187185]: 2025-11-29 07:05:49.311 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:05:52 compute-0 nova_compute[187185]: 2025-11-29 07:05:52.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:05:52 compute-0 nova_compute[187185]: 2025-11-29 07:05:52.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:05:52 compute-0 nova_compute[187185]: 2025-11-29 07:05:52.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:05:52 compute-0 nova_compute[187185]: 2025-11-29 07:05:52.564 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:05:52 compute-0 nova_compute[187185]: 2025-11-29 07:05:52.799 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "refresh_cache-02b37f0c-3272-417b-9791-48b555f68d56" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:05:52 compute-0 nova_compute[187185]: 2025-11-29 07:05:52.800 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquired lock "refresh_cache-02b37f0c-3272-417b-9791-48b555f68d56" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:05:52 compute-0 nova_compute[187185]: 2025-11-29 07:05:52.800 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 29 07:05:52 compute-0 nova_compute[187185]: 2025-11-29 07:05:52.800 187189 DEBUG nova.objects.instance [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 02b37f0c-3272-417b-9791-48b555f68d56 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:05:54 compute-0 nova_compute[187185]: 2025-11-29 07:05:54.312 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:05:54 compute-0 nova_compute[187185]: 2025-11-29 07:05:54.457 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Updating instance_info_cache with network_info: [{"id": "3978dee0-d304-43a8-9478-68840d581d9b", "address": "fa:16:3e:c9:dd:dd", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3978dee0-d3", "ovs_interfaceid": "3978dee0-d304-43a8-9478-68840d581d9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:05:54 compute-0 nova_compute[187185]: 2025-11-29 07:05:54.725 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Releasing lock "refresh_cache-02b37f0c-3272-417b-9791-48b555f68d56" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:05:54 compute-0 nova_compute[187185]: 2025-11-29 07:05:54.726 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 29 07:05:54 compute-0 nova_compute[187185]: 2025-11-29 07:05:54.726 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:05:54 compute-0 podman[223492]: 2025-11-29 07:05:54.858549366 +0000 UTC m=+0.112839438 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 29 07:05:57 compute-0 nova_compute[187185]: 2025-11-29 07:05:57.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:05:57 compute-0 nova_compute[187185]: 2025-11-29 07:05:57.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:05:57 compute-0 nova_compute[187185]: 2025-11-29 07:05:57.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:05:57 compute-0 nova_compute[187185]: 2025-11-29 07:05:57.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:05:57 compute-0 nova_compute[187185]: 2025-11-29 07:05:57.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:05:57 compute-0 nova_compute[187185]: 2025-11-29 07:05:57.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:05:57 compute-0 nova_compute[187185]: 2025-11-29 07:05:57.567 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:05:58 compute-0 nova_compute[187185]: 2025-11-29 07:05:58.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:05:58 compute-0 nova_compute[187185]: 2025-11-29 07:05:58.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:05:58 compute-0 nova_compute[187185]: 2025-11-29 07:05:58.511 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:05:58 compute-0 nova_compute[187185]: 2025-11-29 07:05:58.511 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:05:58 compute-0 nova_compute[187185]: 2025-11-29 07:05:58.512 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:05:58 compute-0 nova_compute[187185]: 2025-11-29 07:05:58.512 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:05:58 compute-0 nova_compute[187185]: 2025-11-29 07:05:58.744 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02b37f0c-3272-417b-9791-48b555f68d56/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:05:58 compute-0 nova_compute[187185]: 2025-11-29 07:05:58.819 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02b37f0c-3272-417b-9791-48b555f68d56/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:05:58 compute-0 nova_compute[187185]: 2025-11-29 07:05:58.820 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02b37f0c-3272-417b-9791-48b555f68d56/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:05:58 compute-0 nova_compute[187185]: 2025-11-29 07:05:58.879 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02b37f0c-3272-417b-9791-48b555f68d56/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:05:59 compute-0 nova_compute[187185]: 2025-11-29 07:05:59.054 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:05:59 compute-0 nova_compute[187185]: 2025-11-29 07:05:59.056 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5585MB free_disk=73.30220413208008GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:05:59 compute-0 nova_compute[187185]: 2025-11-29 07:05:59.056 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:05:59 compute-0 nova_compute[187185]: 2025-11-29 07:05:59.057 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:05:59 compute-0 nova_compute[187185]: 2025-11-29 07:05:59.315 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:05:59 compute-0 nova_compute[187185]: 2025-11-29 07:05:59.477 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance 02b37f0c-3272-417b-9791-48b555f68d56 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 07:05:59 compute-0 nova_compute[187185]: 2025-11-29 07:05:59.478 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:05:59 compute-0 nova_compute[187185]: 2025-11-29 07:05:59.478 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:05:59 compute-0 nova_compute[187185]: 2025-11-29 07:05:59.542 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:05:59 compute-0 nova_compute[187185]: 2025-11-29 07:05:59.688 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:05:59 compute-0 nova_compute[187185]: 2025-11-29 07:05:59.898 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:05:59 compute-0 nova_compute[187185]: 2025-11-29 07:05:59.899 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.842s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:06:00 compute-0 podman[223525]: 2025-11-29 07:06:00.799127337 +0000 UTC m=+0.059955744 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 07:06:01 compute-0 nova_compute[187185]: 2025-11-29 07:06:01.893 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:06:02 compute-0 nova_compute[187185]: 2025-11-29 07:06:02.569 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:06:02 compute-0 podman[223549]: 2025-11-29 07:06:02.80474178 +0000 UTC m=+0.070000618 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Nov 29 07:06:04 compute-0 nova_compute[187185]: 2025-11-29 07:06:04.317 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:06:04 compute-0 podman[223569]: 2025-11-29 07:06:04.794410081 +0000 UTC m=+0.056427185 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Nov 29 07:06:07 compute-0 nova_compute[187185]: 2025-11-29 07:06:07.571 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:06:09 compute-0 nova_compute[187185]: 2025-11-29 07:06:09.319 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:06:12 compute-0 nova_compute[187185]: 2025-11-29 07:06:12.607 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:06:14 compute-0 nova_compute[187185]: 2025-11-29 07:06:14.323 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:06:16 compute-0 podman[223590]: 2025-11-29 07:06:16.787026932 +0000 UTC m=+0.051346011 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 07:06:16 compute-0 podman[223591]: 2025-11-29 07:06:16.811912955 +0000 UTC m=+0.067185299 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.openshift.tags=minimal rhel9, vcs-type=git, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, managed_by=edpm_ansible, release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, maintainer=Red Hat, Inc., vendor=Red Hat, Inc.)
Nov 29 07:06:16 compute-0 podman[223592]: 2025-11-29 07:06:16.824790818 +0000 UTC m=+0.067229369 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 07:06:17 compute-0 nova_compute[187185]: 2025-11-29 07:06:17.616 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:06:19 compute-0 nova_compute[187185]: 2025-11-29 07:06:19.375 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:06:22 compute-0 nova_compute[187185]: 2025-11-29 07:06:22.662 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:06:24 compute-0 nova_compute[187185]: 2025-11-29 07:06:24.378 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:06:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:06:24.825 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:06:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:06:24.826 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:06:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:06:24.826 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:06:25 compute-0 podman[223654]: 2025-11-29 07:06:25.847506894 +0000 UTC m=+0.119433093 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 29 07:06:27 compute-0 nova_compute[187185]: 2025-11-29 07:06:27.716 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:06:29 compute-0 nova_compute[187185]: 2025-11-29 07:06:29.379 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:06:31 compute-0 podman[223682]: 2025-11-29 07:06:31.787826568 +0000 UTC m=+0.058451772 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 07:06:32 compute-0 nova_compute[187185]: 2025-11-29 07:06:32.720 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:06:33 compute-0 podman[223706]: 2025-11-29 07:06:33.804341227 +0000 UTC m=+0.068233578 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm)
Nov 29 07:06:34 compute-0 nova_compute[187185]: 2025-11-29 07:06:34.382 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:06:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:06:34.427 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:06:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:06:34.428 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:06:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:06:34.430 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:06:34 compute-0 nova_compute[187185]: 2025-11-29 07:06:34.478 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:06:35 compute-0 podman[223727]: 2025-11-29 07:06:35.821893506 +0000 UTC m=+0.072476678 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 29 07:06:37 compute-0 nova_compute[187185]: 2025-11-29 07:06:37.723 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:06:39 compute-0 nova_compute[187185]: 2025-11-29 07:06:39.386 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:06:41 compute-0 nova_compute[187185]: 2025-11-29 07:06:41.062 187189 DEBUG oslo_concurrency.lockutils [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Acquiring lock "3b461587-d91f-4a59-a05a-9a2ae89cfacd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:06:41 compute-0 nova_compute[187185]: 2025-11-29 07:06:41.063 187189 DEBUG oslo_concurrency.lockutils [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Lock "3b461587-d91f-4a59-a05a-9a2ae89cfacd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:06:41 compute-0 nova_compute[187185]: 2025-11-29 07:06:41.082 187189 DEBUG nova.compute.manager [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 07:06:41 compute-0 nova_compute[187185]: 2025-11-29 07:06:41.375 187189 DEBUG oslo_concurrency.lockutils [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:06:41 compute-0 nova_compute[187185]: 2025-11-29 07:06:41.375 187189 DEBUG oslo_concurrency.lockutils [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:06:41 compute-0 nova_compute[187185]: 2025-11-29 07:06:41.384 187189 DEBUG nova.virt.hardware [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 07:06:41 compute-0 nova_compute[187185]: 2025-11-29 07:06:41.385 187189 INFO nova.compute.claims [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Claim successful on node compute-0.ctlplane.example.com
Nov 29 07:06:41 compute-0 nova_compute[187185]: 2025-11-29 07:06:41.541 187189 DEBUG nova.compute.provider_tree [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:06:41 compute-0 nova_compute[187185]: 2025-11-29 07:06:41.575 187189 DEBUG nova.scheduler.client.report [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:06:41 compute-0 nova_compute[187185]: 2025-11-29 07:06:41.632 187189 DEBUG oslo_concurrency.lockutils [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.257s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:06:41 compute-0 nova_compute[187185]: 2025-11-29 07:06:41.633 187189 DEBUG nova.compute.manager [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 07:06:41 compute-0 nova_compute[187185]: 2025-11-29 07:06:41.689 187189 DEBUG nova.compute.manager [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 07:06:41 compute-0 nova_compute[187185]: 2025-11-29 07:06:41.690 187189 DEBUG nova.network.neutron [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 07:06:41 compute-0 nova_compute[187185]: 2025-11-29 07:06:41.712 187189 INFO nova.virt.libvirt.driver [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 07:06:41 compute-0 nova_compute[187185]: 2025-11-29 07:06:41.731 187189 DEBUG nova.compute.manager [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 07:06:41 compute-0 nova_compute[187185]: 2025-11-29 07:06:41.844 187189 DEBUG nova.compute.manager [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 07:06:41 compute-0 nova_compute[187185]: 2025-11-29 07:06:41.846 187189 DEBUG nova.virt.libvirt.driver [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 07:06:41 compute-0 nova_compute[187185]: 2025-11-29 07:06:41.847 187189 INFO nova.virt.libvirt.driver [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Creating image(s)
Nov 29 07:06:41 compute-0 nova_compute[187185]: 2025-11-29 07:06:41.847 187189 DEBUG oslo_concurrency.lockutils [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Acquiring lock "/var/lib/nova/instances/3b461587-d91f-4a59-a05a-9a2ae89cfacd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:06:41 compute-0 nova_compute[187185]: 2025-11-29 07:06:41.848 187189 DEBUG oslo_concurrency.lockutils [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Lock "/var/lib/nova/instances/3b461587-d91f-4a59-a05a-9a2ae89cfacd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:06:41 compute-0 nova_compute[187185]: 2025-11-29 07:06:41.849 187189 DEBUG oslo_concurrency.lockutils [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Lock "/var/lib/nova/instances/3b461587-d91f-4a59-a05a-9a2ae89cfacd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:06:41 compute-0 nova_compute[187185]: 2025-11-29 07:06:41.866 187189 DEBUG oslo_concurrency.processutils [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:06:41 compute-0 nova_compute[187185]: 2025-11-29 07:06:41.921 187189 DEBUG nova.policy [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6e3923c949d649889fe9a955a8f5cff8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'eee8c6e871b948c9bff0d4ee4267ba78', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 07:06:41 compute-0 nova_compute[187185]: 2025-11-29 07:06:41.941 187189 DEBUG oslo_concurrency.processutils [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:06:41 compute-0 nova_compute[187185]: 2025-11-29 07:06:41.942 187189 DEBUG oslo_concurrency.lockutils [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:06:41 compute-0 nova_compute[187185]: 2025-11-29 07:06:41.943 187189 DEBUG oslo_concurrency.lockutils [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:06:41 compute-0 nova_compute[187185]: 2025-11-29 07:06:41.960 187189 DEBUG oslo_concurrency.processutils [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:06:42 compute-0 nova_compute[187185]: 2025-11-29 07:06:42.037 187189 DEBUG oslo_concurrency.processutils [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:06:42 compute-0 nova_compute[187185]: 2025-11-29 07:06:42.038 187189 DEBUG oslo_concurrency.processutils [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/3b461587-d91f-4a59-a05a-9a2ae89cfacd/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:06:42 compute-0 nova_compute[187185]: 2025-11-29 07:06:42.726 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:06:42 compute-0 nova_compute[187185]: 2025-11-29 07:06:42.780 187189 DEBUG oslo_concurrency.processutils [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/3b461587-d91f-4a59-a05a-9a2ae89cfacd/disk 1073741824" returned: 0 in 0.742s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:06:42 compute-0 nova_compute[187185]: 2025-11-29 07:06:42.781 187189 DEBUG oslo_concurrency.lockutils [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.838s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:06:42 compute-0 nova_compute[187185]: 2025-11-29 07:06:42.781 187189 DEBUG oslo_concurrency.processutils [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:06:42 compute-0 nova_compute[187185]: 2025-11-29 07:06:42.836 187189 DEBUG oslo_concurrency.processutils [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:06:42 compute-0 nova_compute[187185]: 2025-11-29 07:06:42.837 187189 DEBUG nova.virt.disk.api [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Checking if we can resize image /var/lib/nova/instances/3b461587-d91f-4a59-a05a-9a2ae89cfacd/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:06:42 compute-0 nova_compute[187185]: 2025-11-29 07:06:42.837 187189 DEBUG oslo_concurrency.processutils [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b461587-d91f-4a59-a05a-9a2ae89cfacd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:06:42 compute-0 nova_compute[187185]: 2025-11-29 07:06:42.900 187189 DEBUG oslo_concurrency.processutils [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b461587-d91f-4a59-a05a-9a2ae89cfacd/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:06:42 compute-0 nova_compute[187185]: 2025-11-29 07:06:42.901 187189 DEBUG nova.virt.disk.api [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Cannot resize image /var/lib/nova/instances/3b461587-d91f-4a59-a05a-9a2ae89cfacd/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:06:42 compute-0 nova_compute[187185]: 2025-11-29 07:06:42.902 187189 DEBUG nova.objects.instance [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Lazy-loading 'migration_context' on Instance uuid 3b461587-d91f-4a59-a05a-9a2ae89cfacd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:06:42 compute-0 nova_compute[187185]: 2025-11-29 07:06:42.937 187189 DEBUG nova.virt.libvirt.driver [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 07:06:42 compute-0 nova_compute[187185]: 2025-11-29 07:06:42.937 187189 DEBUG nova.virt.libvirt.driver [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Ensure instance console log exists: /var/lib/nova/instances/3b461587-d91f-4a59-a05a-9a2ae89cfacd/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 07:06:42 compute-0 nova_compute[187185]: 2025-11-29 07:06:42.937 187189 DEBUG oslo_concurrency.lockutils [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:06:42 compute-0 nova_compute[187185]: 2025-11-29 07:06:42.938 187189 DEBUG oslo_concurrency.lockutils [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:06:42 compute-0 nova_compute[187185]: 2025-11-29 07:06:42.938 187189 DEBUG oslo_concurrency.lockutils [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:06:43 compute-0 nova_compute[187185]: 2025-11-29 07:06:43.487 187189 DEBUG nova.network.neutron [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Successfully created port: cb9181dd-01c0-409c-8102-a94217bf40a8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 07:06:44 compute-0 nova_compute[187185]: 2025-11-29 07:06:44.388 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:06:45 compute-0 nova_compute[187185]: 2025-11-29 07:06:45.368 187189 DEBUG nova.network.neutron [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Successfully updated port: cb9181dd-01c0-409c-8102-a94217bf40a8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 07:06:45 compute-0 nova_compute[187185]: 2025-11-29 07:06:45.385 187189 DEBUG oslo_concurrency.lockutils [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Acquiring lock "refresh_cache-3b461587-d91f-4a59-a05a-9a2ae89cfacd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:06:45 compute-0 nova_compute[187185]: 2025-11-29 07:06:45.386 187189 DEBUG oslo_concurrency.lockutils [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Acquired lock "refresh_cache-3b461587-d91f-4a59-a05a-9a2ae89cfacd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:06:45 compute-0 nova_compute[187185]: 2025-11-29 07:06:45.386 187189 DEBUG nova.network.neutron [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:06:45 compute-0 nova_compute[187185]: 2025-11-29 07:06:45.503 187189 DEBUG nova.network.neutron [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.090 187189 DEBUG nova.network.neutron [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Updating instance_info_cache with network_info: [{"id": "cb9181dd-01c0-409c-8102-a94217bf40a8", "address": "fa:16:3e:80:46:c1", "network": {"id": "ec48bfb6-9f16-422e-b7dc-422060e71ca2", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-39474949-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eee8c6e871b948c9bff0d4ee4267ba78", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcb9181dd-01", "ovs_interfaceid": "cb9181dd-01c0-409c-8102-a94217bf40a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.116 187189 DEBUG oslo_concurrency.lockutils [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Releasing lock "refresh_cache-3b461587-d91f-4a59-a05a-9a2ae89cfacd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.117 187189 DEBUG nova.compute.manager [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Instance network_info: |[{"id": "cb9181dd-01c0-409c-8102-a94217bf40a8", "address": "fa:16:3e:80:46:c1", "network": {"id": "ec48bfb6-9f16-422e-b7dc-422060e71ca2", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-39474949-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eee8c6e871b948c9bff0d4ee4267ba78", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcb9181dd-01", "ovs_interfaceid": "cb9181dd-01c0-409c-8102-a94217bf40a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.122 187189 DEBUG nova.virt.libvirt.driver [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Start _get_guest_xml network_info=[{"id": "cb9181dd-01c0-409c-8102-a94217bf40a8", "address": "fa:16:3e:80:46:c1", "network": {"id": "ec48bfb6-9f16-422e-b7dc-422060e71ca2", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-39474949-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eee8c6e871b948c9bff0d4ee4267ba78", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcb9181dd-01", "ovs_interfaceid": "cb9181dd-01c0-409c-8102-a94217bf40a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.133 187189 WARNING nova.virt.libvirt.driver [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.140 187189 DEBUG nova.virt.libvirt.host [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.141 187189 DEBUG nova.virt.libvirt.host [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.145 187189 DEBUG nova.virt.libvirt.host [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.146 187189 DEBUG nova.virt.libvirt.host [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.148 187189 DEBUG nova.virt.libvirt.driver [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.148 187189 DEBUG nova.virt.hardware [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.149 187189 DEBUG nova.virt.hardware [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.149 187189 DEBUG nova.virt.hardware [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.149 187189 DEBUG nova.virt.hardware [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.150 187189 DEBUG nova.virt.hardware [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.150 187189 DEBUG nova.virt.hardware [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.150 187189 DEBUG nova.virt.hardware [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.151 187189 DEBUG nova.virt.hardware [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.151 187189 DEBUG nova.virt.hardware [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.152 187189 DEBUG nova.virt.hardware [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.152 187189 DEBUG nova.virt.hardware [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.159 187189 DEBUG nova.virt.libvirt.vif [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:06:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationNegativeTestJSON-server-314359009',display_name='tempest-FloatingIPsAssociationNegativeTestJSON-server-314359009',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationnegativetestjson-server-314359009',id=79,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='eee8c6e871b948c9bff0d4ee4267ba78',ramdisk_id='',reservation_id='r-lcsa4a03',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationNegativeTes
tJSON-1992151889',owner_user_name='tempest-FloatingIPsAssociationNegativeTestJSON-1992151889-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:06:41Z,user_data=None,user_id='6e3923c949d649889fe9a955a8f5cff8',uuid=3b461587-d91f-4a59-a05a-9a2ae89cfacd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cb9181dd-01c0-409c-8102-a94217bf40a8", "address": "fa:16:3e:80:46:c1", "network": {"id": "ec48bfb6-9f16-422e-b7dc-422060e71ca2", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-39474949-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eee8c6e871b948c9bff0d4ee4267ba78", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcb9181dd-01", "ovs_interfaceid": "cb9181dd-01c0-409c-8102-a94217bf40a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.159 187189 DEBUG nova.network.os_vif_util [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Converting VIF {"id": "cb9181dd-01c0-409c-8102-a94217bf40a8", "address": "fa:16:3e:80:46:c1", "network": {"id": "ec48bfb6-9f16-422e-b7dc-422060e71ca2", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-39474949-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eee8c6e871b948c9bff0d4ee4267ba78", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcb9181dd-01", "ovs_interfaceid": "cb9181dd-01c0-409c-8102-a94217bf40a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.161 187189 DEBUG nova.network.os_vif_util [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:80:46:c1,bridge_name='br-int',has_traffic_filtering=True,id=cb9181dd-01c0-409c-8102-a94217bf40a8,network=Network(ec48bfb6-9f16-422e-b7dc-422060e71ca2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcb9181dd-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.163 187189 DEBUG nova.objects.instance [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3b461587-d91f-4a59-a05a-9a2ae89cfacd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.182 187189 DEBUG nova.virt.libvirt.driver [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:06:46 compute-0 nova_compute[187185]:   <uuid>3b461587-d91f-4a59-a05a-9a2ae89cfacd</uuid>
Nov 29 07:06:46 compute-0 nova_compute[187185]:   <name>instance-0000004f</name>
Nov 29 07:06:46 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 07:06:46 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:06:46 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:06:46 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:       <nova:name>tempest-FloatingIPsAssociationNegativeTestJSON-server-314359009</nova:name>
Nov 29 07:06:46 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:06:46</nova:creationTime>
Nov 29 07:06:46 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 07:06:46 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 07:06:46 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:06:46 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:06:46 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:06:46 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:06:46 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:06:46 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:06:46 compute-0 nova_compute[187185]:         <nova:user uuid="6e3923c949d649889fe9a955a8f5cff8">tempest-FloatingIPsAssociationNegativeTestJSON-1992151889-project-member</nova:user>
Nov 29 07:06:46 compute-0 nova_compute[187185]:         <nova:project uuid="eee8c6e871b948c9bff0d4ee4267ba78">tempest-FloatingIPsAssociationNegativeTestJSON-1992151889</nova:project>
Nov 29 07:06:46 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:06:46 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 07:06:46 compute-0 nova_compute[187185]:         <nova:port uuid="cb9181dd-01c0-409c-8102-a94217bf40a8">
Nov 29 07:06:46 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:06:46 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:06:46 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:06:46 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <system>
Nov 29 07:06:46 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:06:46 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:06:46 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:06:46 compute-0 nova_compute[187185]:       <entry name="serial">3b461587-d91f-4a59-a05a-9a2ae89cfacd</entry>
Nov 29 07:06:46 compute-0 nova_compute[187185]:       <entry name="uuid">3b461587-d91f-4a59-a05a-9a2ae89cfacd</entry>
Nov 29 07:06:46 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     </system>
Nov 29 07:06:46 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:06:46 compute-0 nova_compute[187185]:   <os>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:   </os>
Nov 29 07:06:46 compute-0 nova_compute[187185]:   <features>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:   </features>
Nov 29 07:06:46 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:06:46 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:06:46 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:06:46 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/3b461587-d91f-4a59-a05a-9a2ae89cfacd/disk"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:06:46 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/3b461587-d91f-4a59-a05a-9a2ae89cfacd/disk.config"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:06:46 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:80:46:c1"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:       <target dev="tapcb9181dd-01"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:06:46 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/3b461587-d91f-4a59-a05a-9a2ae89cfacd/console.log" append="off"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <video>
Nov 29 07:06:46 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     </video>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:06:46 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:06:46 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:06:46 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:06:46 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:06:46 compute-0 nova_compute[187185]: </domain>
Nov 29 07:06:46 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.184 187189 DEBUG nova.compute.manager [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Preparing to wait for external event network-vif-plugged-cb9181dd-01c0-409c-8102-a94217bf40a8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.184 187189 DEBUG oslo_concurrency.lockutils [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Acquiring lock "3b461587-d91f-4a59-a05a-9a2ae89cfacd-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.184 187189 DEBUG oslo_concurrency.lockutils [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Lock "3b461587-d91f-4a59-a05a-9a2ae89cfacd-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.184 187189 DEBUG oslo_concurrency.lockutils [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Lock "3b461587-d91f-4a59-a05a-9a2ae89cfacd-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.185 187189 DEBUG nova.virt.libvirt.vif [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:06:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationNegativeTestJSON-server-314359009',display_name='tempest-FloatingIPsAssociationNegativeTestJSON-server-314359009',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationnegativetestjson-server-314359009',id=79,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='eee8c6e871b948c9bff0d4ee4267ba78',ramdisk_id='',reservation_id='r-lcsa4a03',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationN
egativeTestJSON-1992151889',owner_user_name='tempest-FloatingIPsAssociationNegativeTestJSON-1992151889-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:06:41Z,user_data=None,user_id='6e3923c949d649889fe9a955a8f5cff8',uuid=3b461587-d91f-4a59-a05a-9a2ae89cfacd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cb9181dd-01c0-409c-8102-a94217bf40a8", "address": "fa:16:3e:80:46:c1", "network": {"id": "ec48bfb6-9f16-422e-b7dc-422060e71ca2", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-39474949-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eee8c6e871b948c9bff0d4ee4267ba78", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcb9181dd-01", "ovs_interfaceid": "cb9181dd-01c0-409c-8102-a94217bf40a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.185 187189 DEBUG nova.network.os_vif_util [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Converting VIF {"id": "cb9181dd-01c0-409c-8102-a94217bf40a8", "address": "fa:16:3e:80:46:c1", "network": {"id": "ec48bfb6-9f16-422e-b7dc-422060e71ca2", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-39474949-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eee8c6e871b948c9bff0d4ee4267ba78", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcb9181dd-01", "ovs_interfaceid": "cb9181dd-01c0-409c-8102-a94217bf40a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.186 187189 DEBUG nova.network.os_vif_util [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:80:46:c1,bridge_name='br-int',has_traffic_filtering=True,id=cb9181dd-01c0-409c-8102-a94217bf40a8,network=Network(ec48bfb6-9f16-422e-b7dc-422060e71ca2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcb9181dd-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.186 187189 DEBUG os_vif [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:46:c1,bridge_name='br-int',has_traffic_filtering=True,id=cb9181dd-01c0-409c-8102-a94217bf40a8,network=Network(ec48bfb6-9f16-422e-b7dc-422060e71ca2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcb9181dd-01') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.187 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.187 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.188 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.192 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.192 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcb9181dd-01, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.192 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcb9181dd-01, col_values=(('external_ids', {'iface-id': 'cb9181dd-01c0-409c-8102-a94217bf40a8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:80:46:c1', 'vm-uuid': '3b461587-d91f-4a59-a05a-9a2ae89cfacd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.194 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:06:46 compute-0 NetworkManager[55227]: <info>  [1764400006.1964] manager: (tapcb9181dd-01): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/91)
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.198 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.204 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.205 187189 INFO os_vif [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:80:46:c1,bridge_name='br-int',has_traffic_filtering=True,id=cb9181dd-01c0-409c-8102-a94217bf40a8,network=Network(ec48bfb6-9f16-422e-b7dc-422060e71ca2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcb9181dd-01')
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.365 187189 DEBUG nova.virt.libvirt.driver [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.366 187189 DEBUG nova.virt.libvirt.driver [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.367 187189 DEBUG nova.virt.libvirt.driver [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] No VIF found with MAC fa:16:3e:80:46:c1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:06:46 compute-0 nova_compute[187185]: 2025-11-29 07:06:46.369 187189 INFO nova.virt.libvirt.driver [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Using config drive
Nov 29 07:06:47 compute-0 nova_compute[187185]: 2025-11-29 07:06:47.166 187189 INFO nova.virt.libvirt.driver [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Creating config drive at /var/lib/nova/instances/3b461587-d91f-4a59-a05a-9a2ae89cfacd/disk.config
Nov 29 07:06:47 compute-0 nova_compute[187185]: 2025-11-29 07:06:47.171 187189 DEBUG oslo_concurrency.processutils [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3b461587-d91f-4a59-a05a-9a2ae89cfacd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7sp4fwrj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:06:47 compute-0 nova_compute[187185]: 2025-11-29 07:06:47.245 187189 DEBUG nova.compute.manager [req-5cc85486-154c-41a7-a32a-58e0ed2ba0ac req-ae4cfdf5-bbb4-4152-95be-c29adaa2d61f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Received event network-changed-cb9181dd-01c0-409c-8102-a94217bf40a8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:06:47 compute-0 nova_compute[187185]: 2025-11-29 07:06:47.246 187189 DEBUG nova.compute.manager [req-5cc85486-154c-41a7-a32a-58e0ed2ba0ac req-ae4cfdf5-bbb4-4152-95be-c29adaa2d61f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Refreshing instance network info cache due to event network-changed-cb9181dd-01c0-409c-8102-a94217bf40a8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:06:47 compute-0 nova_compute[187185]: 2025-11-29 07:06:47.246 187189 DEBUG oslo_concurrency.lockutils [req-5cc85486-154c-41a7-a32a-58e0ed2ba0ac req-ae4cfdf5-bbb4-4152-95be-c29adaa2d61f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-3b461587-d91f-4a59-a05a-9a2ae89cfacd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:06:47 compute-0 nova_compute[187185]: 2025-11-29 07:06:47.247 187189 DEBUG oslo_concurrency.lockutils [req-5cc85486-154c-41a7-a32a-58e0ed2ba0ac req-ae4cfdf5-bbb4-4152-95be-c29adaa2d61f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-3b461587-d91f-4a59-a05a-9a2ae89cfacd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:06:47 compute-0 nova_compute[187185]: 2025-11-29 07:06:47.247 187189 DEBUG nova.network.neutron [req-5cc85486-154c-41a7-a32a-58e0ed2ba0ac req-ae4cfdf5-bbb4-4152-95be-c29adaa2d61f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Refreshing network info cache for port cb9181dd-01c0-409c-8102-a94217bf40a8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:06:47 compute-0 nova_compute[187185]: 2025-11-29 07:06:47.308 187189 DEBUG oslo_concurrency.processutils [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3b461587-d91f-4a59-a05a-9a2ae89cfacd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7sp4fwrj" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:06:47 compute-0 kernel: tapcb9181dd-01: entered promiscuous mode
Nov 29 07:06:47 compute-0 NetworkManager[55227]: <info>  [1764400007.4201] manager: (tapcb9181dd-01): new Tun device (/org/freedesktop/NetworkManager/Devices/92)
Nov 29 07:06:47 compute-0 nova_compute[187185]: 2025-11-29 07:06:47.420 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:06:47 compute-0 ovn_controller[95281]: 2025-11-29T07:06:47Z|00163|binding|INFO|Claiming lport cb9181dd-01c0-409c-8102-a94217bf40a8 for this chassis.
Nov 29 07:06:47 compute-0 ovn_controller[95281]: 2025-11-29T07:06:47Z|00164|binding|INFO|cb9181dd-01c0-409c-8102-a94217bf40a8: Claiming fa:16:3e:80:46:c1 10.100.0.6
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:06:47.434 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:80:46:c1 10.100.0.6'], port_security=['fa:16:3e:80:46:c1 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ec48bfb6-9f16-422e-b7dc-422060e71ca2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'eee8c6e871b948c9bff0d4ee4267ba78', 'neutron:revision_number': '2', 'neutron:security_group_ids': '75187edb-4718-43a2-a8b6-d9040cc77fea', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=827861e9-4eef-4c9a-a62c-b7772140c169, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=cb9181dd-01c0-409c-8102-a94217bf40a8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:06:47.436 104254 INFO neutron.agent.ovn.metadata.agent [-] Port cb9181dd-01c0-409c-8102-a94217bf40a8 in datapath ec48bfb6-9f16-422e-b7dc-422060e71ca2 bound to our chassis
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:06:47.438 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ec48bfb6-9f16-422e-b7dc-422060e71ca2
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:06:47.451 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[f83a9cb6-cf33-4f50-9d22-23dcb919e621]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:06:47.453 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapec48bfb6-91 in ovnmeta-ec48bfb6-9f16-422e-b7dc-422060e71ca2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:06:47.455 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapec48bfb6-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:06:47.455 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[43f01b14-decf-44f8-af38-2814a510b01c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:06:47.456 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[24ab482d-4e64-46d2-9516-bf54bef37ffd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:06:47 compute-0 systemd-udevd[223828]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:06:47 compute-0 podman[223775]: 2025-11-29 07:06:47.474685058 +0000 UTC m=+0.081489373 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125)
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:06:47.476 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[25ef39c9-a4b1-40c7-8e1c-440016ab23b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:06:47 compute-0 NetworkManager[55227]: <info>  [1764400007.4869] device (tapcb9181dd-01): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:06:47 compute-0 NetworkManager[55227]: <info>  [1764400007.4894] device (tapcb9181dd-01): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:06:47 compute-0 systemd-machined[153486]: New machine qemu-25-instance-0000004f.
Nov 29 07:06:47 compute-0 nova_compute[187185]: 2025-11-29 07:06:47.490 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:06:47 compute-0 ovn_controller[95281]: 2025-11-29T07:06:47Z|00165|binding|INFO|Setting lport cb9181dd-01c0-409c-8102-a94217bf40a8 ovn-installed in OVS
Nov 29 07:06:47 compute-0 ovn_controller[95281]: 2025-11-29T07:06:47Z|00166|binding|INFO|Setting lport cb9181dd-01c0-409c-8102-a94217bf40a8 up in Southbound
Nov 29 07:06:47 compute-0 nova_compute[187185]: 2025-11-29 07:06:47.495 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:06:47 compute-0 podman[223777]: 2025-11-29 07:06:47.504719536 +0000 UTC m=+0.099228833 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 07:06:47 compute-0 podman[223776]: 2025-11-29 07:06:47.504719206 +0000 UTC m=+0.106357435 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, config_id=edpm, vcs-type=git, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, architecture=x86_64, container_name=openstack_network_exporter, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:06:47.505 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[df217600-dd4f-4953-9805-8f37e8663d85]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:06:47 compute-0 systemd[1]: Started Virtual Machine qemu-25-instance-0000004f.
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:06:47.545 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[0073dfb1-4a73-46cd-9b65-8b7c6dcb3cf6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:06:47.553 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[80bb623d-6706-47bc-be29-2b12011aa5b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:06:47 compute-0 NetworkManager[55227]: <info>  [1764400007.5544] manager: (tapec48bfb6-90): new Veth device (/org/freedesktop/NetworkManager/Devices/93)
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:06:47.594 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[f728ab80-b4c5-4319-b73f-a8fbfe66e8f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:06:47.599 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[8aff5c62-aedf-49d9-ad1e-a8adc73863d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:06:47 compute-0 NetworkManager[55227]: <info>  [1764400007.6274] device (tapec48bfb6-90): carrier: link connected
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:06:47.633 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[8a18e48f-4880-4773-94b2-e21ac8aa3eec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:06:47.653 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[764dbad7-6197-4e87-bc56-467eb1dba3ff]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapec48bfb6-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:25:cf:29'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 545529, 'reachable_time': 16429, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223875, 'error': None, 'target': 'ovnmeta-ec48bfb6-9f16-422e-b7dc-422060e71ca2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:06:47.669 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[ecc34d7e-d3df-4c41-8206-6e4296b6178d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe25:cf29'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 545529, 'tstamp': 545529}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223876, 'error': None, 'target': 'ovnmeta-ec48bfb6-9f16-422e-b7dc-422060e71ca2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:06:47.688 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[0fe731fc-e79d-47e0-bce8-8eac0c734de4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapec48bfb6-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:25:cf:29'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 545529, 'reachable_time': 16429, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223877, 'error': None, 'target': 'ovnmeta-ec48bfb6-9f16-422e-b7dc-422060e71ca2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:06:47.727 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[817fc360-181e-4911-8104-bc9dbe4499b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:06:47.797 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[0dd21f66-5419-428c-8db7-f3fc233fd008]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:06:47.799 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapec48bfb6-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:06:47.800 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:06:47.800 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapec48bfb6-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:06:47 compute-0 nova_compute[187185]: 2025-11-29 07:06:47.802 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:06:47 compute-0 NetworkManager[55227]: <info>  [1764400007.8034] manager: (tapec48bfb6-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/94)
Nov 29 07:06:47 compute-0 kernel: tapec48bfb6-90: entered promiscuous mode
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:06:47.810 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapec48bfb6-90, col_values=(('external_ids', {'iface-id': 'cd784eba-7471-412f-94a4-89c319d0e878'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:06:47 compute-0 ovn_controller[95281]: 2025-11-29T07:06:47Z|00167|binding|INFO|Releasing lport cd784eba-7471-412f-94a4-89c319d0e878 from this chassis (sb_readonly=0)
Nov 29 07:06:47 compute-0 nova_compute[187185]: 2025-11-29 07:06:47.812 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:06:47 compute-0 nova_compute[187185]: 2025-11-29 07:06:47.826 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:06:47.828 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ec48bfb6-9f16-422e-b7dc-422060e71ca2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ec48bfb6-9f16-422e-b7dc-422060e71ca2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:06:47.829 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[06903095-22da-41ac-b0ea-14f3f056ed07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:06:47.830 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-ec48bfb6-9f16-422e-b7dc-422060e71ca2
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/ec48bfb6-9f16-422e-b7dc-422060e71ca2.pid.haproxy
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID ec48bfb6-9f16-422e-b7dc-422060e71ca2
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:06:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:06:47.831 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ec48bfb6-9f16-422e-b7dc-422060e71ca2', 'env', 'PROCESS_TAG=haproxy-ec48bfb6-9f16-422e-b7dc-422060e71ca2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ec48bfb6-9f16-422e-b7dc-422060e71ca2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:06:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:47.993 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '02b37f0c-3272-417b-9791-48b555f68d56', 'name': 'tempest-₡-1325169813', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000040', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1dba9539037a4e9dbf33cba140fe21fe', 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'hostId': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 07:06:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:47.996 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd', 'name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-314359009', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000004f', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'paused', 'tenant_id': 'eee8c6e871b948c9bff0d4ee4267ba78', 'user_id': '6e3923c949d649889fe9a955a8f5cff8', 'hostId': '3dfb537a0d579921fcbd0df7859abe955d0dd24f0ce9700adda2e509', 'status': 'paused', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 07:06:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:47.997 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.000 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/network.incoming.packets volume: 12 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.004 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 3b461587-d91f-4a59-a05a-9a2ae89cfacd / tapcb9181dd-01 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.004 12 DEBUG ceilometer.compute.pollsters [-] 3b461587-d91f-4a59-a05a-9a2ae89cfacd/network.incoming.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '446cce1a-2967-4381-8733-b188f046125a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 12, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'instance-00000040-02b37f0c-3272-417b-9791-48b555f68d56-tap3978dee0-d3', 'timestamp': '2025-11-29T07:06:47.997516', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'tap3978dee0-d3', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c9:dd:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3978dee0-d3'}, 'message_id': 'f8b0eb6e-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.715767866, 'message_signature': '27c0afbd06bd89344a6c8809ebee37ce403a195c53e4cf65d57032064bac9f80'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'6e3923c949d649889fe9a955a8f5cff8', 'user_name': None, 'project_id': 'eee8c6e871b948c9bff0d4ee4267ba78', 'project_name': None, 'resource_id': 'instance-0000004f-3b461587-d91f-4a59-a05a-9a2ae89cfacd-tapcb9181dd-01', 'timestamp': '2025-11-29T07:06:47.997516', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-314359009', 'name': 'tapcb9181dd-01', 'instance_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd', 'instance_type': 'm1.nano', 'host': '3dfb537a0d579921fcbd0df7859abe955d0dd24f0ce9700adda2e509', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:80:46:c1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcb9181dd-01'}, 'message_id': 'f8b1773c-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.719976035, 'message_signature': '8e16a6dd357098079020fd16fcc5c79066d8201170668a27baf93767a0816a06'}]}, 'timestamp': '2025-11-29 07:06:48.005229', '_unique_id': 'a6279e78cf704586a0f12669dba8092d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.007 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.008 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.008 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.008 12 DEBUG ceilometer.compute.pollsters [-] 3b461587-d91f-4a59-a05a-9a2ae89cfacd/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b6073584-6cf0-4836-a026-3aba45d1bf15', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'instance-00000040-02b37f0c-3272-417b-9791-48b555f68d56-tap3978dee0-d3', 'timestamp': '2025-11-29T07:06:48.008551', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'tap3978dee0-d3', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c9:dd:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3978dee0-d3'}, 'message_id': 'f8b208dc-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.715767866, 'message_signature': '40daa832c644ece796dfd522c2a04d9c2d712331ff462d3efc9ec89be9501abb'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'6e3923c949d649889fe9a955a8f5cff8', 'user_name': None, 'project_id': 'eee8c6e871b948c9bff0d4ee4267ba78', 'project_name': None, 'resource_id': 'instance-0000004f-3b461587-d91f-4a59-a05a-9a2ae89cfacd-tapcb9181dd-01', 'timestamp': '2025-11-29T07:06:48.008551', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-314359009', 'name': 'tapcb9181dd-01', 'instance_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd', 'instance_type': 'm1.nano', 'host': '3dfb537a0d579921fcbd0df7859abe955d0dd24f0ce9700adda2e509', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:80:46:c1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcb9181dd-01'}, 'message_id': 'f8b21688-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.719976035, 'message_signature': '48e679b79c12c9df12085af4df8957886db535c8aeed18197835cecc88d60dd5'}]}, 'timestamp': '2025-11-29 07:06:48.009187', '_unique_id': '25c7f450bf4c48d6abfd0247730ad223'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.009 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.010 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.035 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/cpu volume: 12750000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 nova_compute[187185]: 2025-11-29 07:06:48.083 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400008.0823255, 3b461587-d91f-4a59-a05a-9a2ae89cfacd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:06:48 compute-0 nova_compute[187185]: 2025-11-29 07:06:48.084 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] VM Started (Lifecycle Event)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.096 12 DEBUG ceilometer.compute.pollsters [-] 3b461587-d91f-4a59-a05a-9a2ae89cfacd/cpu volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '802abd0a-9504-4a6c-a3e9-022029d1a493', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12750000000, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'timestamp': '2025-11-29T07:06:48.010616', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'instance-00000040', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'f8b6474e-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.753983115, 'message_signature': '1edc3c62bfed3a05155480b2edbfafe53356c22de17f49b49558f0d12d770fdf'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '6e3923c949d649889fe9a955a8f5cff8', 'user_name': None, 'project_id': 'eee8c6e871b948c9bff0d4ee4267ba78', 'project_name': None, 'resource_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd', 'timestamp': 
'2025-11-29T07:06:48.010616', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-314359009', 'name': 'instance-0000004f', 'instance_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd', 'instance_type': 'm1.nano', 'host': '3dfb537a0d579921fcbd0df7859abe955d0dd24f0ce9700adda2e509', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'f8bf7936-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.814371771, 'message_signature': '701a04a0e6962cdecc718cc2ac64cd5cfe03fa7e579f38a8e7165b476c7ee47a'}]}, 'timestamp': '2025-11-29 07:06:48.097061', '_unique_id': '034801384af040589ebab17885b24777'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.098 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.099 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.130 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/disk.device.write.requests volume: 353 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.131 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.159 12 DEBUG ceilometer.compute.pollsters [-] 3b461587-d91f-4a59-a05a-9a2ae89cfacd/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.160 12 DEBUG ceilometer.compute.pollsters [-] 3b461587-d91f-4a59-a05a-9a2ae89cfacd/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '012c33a7-d60a-4be1-9e54-294d93c0abec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 353, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': '02b37f0c-3272-417b-9791-48b555f68d56-vda', 'timestamp': '2025-11-29T07:06:48.099535', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'instance-00000040', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f8c4bdf6-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.817761616, 'message_signature': 'e7b82ffcff77bc08e673ba934b4db820ff4e506f7dd6297e5f36b3ea44bf68e4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 
'02b37f0c-3272-417b-9791-48b555f68d56-sda', 'timestamp': '2025-11-29T07:06:48.099535', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'instance-00000040', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f8c4cbe8-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.817761616, 'message_signature': 'e84d0780722c176631f7930d40f0bedb1f63552fe5d27a1d10a66484bc97e9db'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '6e3923c949d649889fe9a955a8f5cff8', 'user_name': None, 'project_id': 'eee8c6e871b948c9bff0d4ee4267ba78', 'project_name': None, 'resource_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd-vda', 'timestamp': '2025-11-29T07:06:48.099535', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-314359009', 'name': 'instance-0000004f', 'instance_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd', 'instance_type': 'm1.nano', 'host': '3dfb537a0d579921fcbd0df7859abe955d0dd24f0ce9700adda2e509', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f8c92990-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.85009306, 'message_signature': 'd3c6aa1280c792fed43a43a3f77b5549ba70e0d0b282b70580d93f21344a94f3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '6e3923c949d649889fe9a955a8f5cff8', 'user_name': None, 'project_id': 'eee8c6e871b948c9bff0d4ee4267ba78', 'project_name': None, 'resource_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd-sda', 'timestamp': '2025-11-29T07:06:48.099535', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-314359009', 'name': 'instance-0000004f', 'instance_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd', 'instance_type': 'm1.nano', 'host': '3dfb537a0d579921fcbd0df7859abe955d0dd24f0ce9700adda2e509', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f8c934bc-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.85009306, 'message_signature': '67ee7e7b5cc413a38ddddb6bb3a72c3e0d499951d038cce95c74e1f3439348ab'}]}, 'timestamp': '2025-11-29 07:06:48.160693', '_unique_id': '7d07f752a6ef4f9fa51046ce1b056254'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.162 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.172 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.172 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.182 12 DEBUG ceilometer.compute.pollsters [-] 3b461587-d91f-4a59-a05a-9a2ae89cfacd/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.182 12 DEBUG ceilometer.compute.pollsters [-] 3b461587-d91f-4a59-a05a-9a2ae89cfacd/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f1ad13ab-968a-47bd-9897-ba57cddcb601', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': '02b37f0c-3272-417b-9791-48b555f68d56-vda', 'timestamp': '2025-11-29T07:06:48.162507', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'instance-00000040', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f8cb0c1a-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.880745085, 'message_signature': '14371ba1f3644ce5cda53ad804f5fbf85c0b05c8412b99763c479cd855c61153'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 
'02b37f0c-3272-417b-9791-48b555f68d56-sda', 'timestamp': '2025-11-29T07:06:48.162507', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'instance-00000040', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f8cb1aa2-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.880745085, 'message_signature': '7293cae209ddf3156dc710ab942d367dc07220582f7d1d3ceffff99b22c0e1b1'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '6e3923c949d649889fe9a955a8f5cff8', 'user_name': None, 'project_id': 'eee8c6e871b948c9bff0d4ee4267ba78', 'project_name': None, 'resource_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd-vda', 'timestamp': '2025-11-29T07:06:48.162507', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-314359009', 'name': 'instance-0000004f', 'instance_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd', 'instance_type': 'm1.nano', 'host': '3dfb537a0d579921fcbd0df7859abe955d0dd24f0ce9700adda2e509', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 
'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f8cc8d92-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.891371125, 'message_signature': 'b2b09426aac759f1e32b7b82fc48b69e62ddeb303f76729de88a16504d014282'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '6e3923c949d649889fe9a955a8f5cff8', 'user_name': None, 'project_id': 'eee8c6e871b948c9bff0d4ee4267ba78', 'project_name': None, 'resource_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd-sda', 'timestamp': '2025-11-29T07:06:48.162507', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-314359009', 'name': 'instance-0000004f', 'instance_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd', 'instance_type': 'm1.nano', 'host': '3dfb537a0d579921fcbd0df7859abe955d0dd24f0ce9700adda2e509', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f8cc9724-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.891371125, 'message_signature': 'a871a06ad6e88003236fdf439e27414ffd9c3dc2647c27aa5606fac0b0b98e67'}]}, 'timestamp': '2025-11-29 07:06:48.182899', '_unique_id': 'bb6fabeea2924bb487e4d2326d953d58'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.183 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.184 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.184 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.184 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-FloatingIPsAssociationNegativeTestJSON-server-314359009>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-FloatingIPsAssociationNegativeTestJSON-server-314359009>]
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.185 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.185 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.185 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.185 12 DEBUG ceilometer.compute.pollsters [-] 3b461587-d91f-4a59-a05a-9a2ae89cfacd/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.185 12 DEBUG ceilometer.compute.pollsters [-] 3b461587-d91f-4a59-a05a-9a2ae89cfacd/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '997246fb-3d1c-4c47-8c89-d9b41d910d35', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': '02b37f0c-3272-417b-9791-48b555f68d56-vda', 'timestamp': '2025-11-29T07:06:48.185176', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'instance-00000040', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f8ccfa16-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.880745085, 'message_signature': '5afe2cda0e0e25acffaf8c6c8441464309194f18d1ef33adfcbf9d344217186c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 
'02b37f0c-3272-417b-9791-48b555f68d56-sda', 'timestamp': '2025-11-29T07:06:48.185176', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'instance-00000040', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f8cd0254-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.880745085, 'message_signature': 'd0a96316d02cde160d4f7c682583ceb5d71df9373d3c18fb5e3cbde6f6047469'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '6e3923c949d649889fe9a955a8f5cff8', 'user_name': None, 'project_id': 'eee8c6e871b948c9bff0d4ee4267ba78', 'project_name': None, 'resource_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd-vda', 'timestamp': '2025-11-29T07:06:48.185176', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-314359009', 'name': 'instance-0000004f', 'instance_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd', 'instance_type': 'm1.nano', 'host': '3dfb537a0d579921fcbd0df7859abe955d0dd24f0ce9700adda2e509', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 
'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f8cd0baa-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.891371125, 'message_signature': 'd4ff70d26b3d2de74ff2b393219cdea87b3a7e3acf2059ab89143863e8f26093'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '6e3923c949d649889fe9a955a8f5cff8', 'user_name': None, 'project_id': 'eee8c6e871b948c9bff0d4ee4267ba78', 'project_name': None, 'resource_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd-sda', 'timestamp': '2025-11-29T07:06:48.185176', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-314359009', 'name': 'instance-0000004f', 'instance_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd', 'instance_type': 'm1.nano', 'host': '3dfb537a0d579921fcbd0df7859abe955d0dd24f0ce9700adda2e509', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f8cd15b4-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.891371125, 'message_signature': '9d231fce411c881cf561a699691aff4ee91b0b5d414bfd66662a54c5ad129686'}]}, 'timestamp': '2025-11-29 07:06:48.186101', '_unique_id': '60ab3b812f4a4ed0bda797470616daa8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.186 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.187 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.187 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/disk.device.read.latency volume: 1357763391 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.187 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/disk.device.read.latency volume: 256274987 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.187 12 DEBUG ceilometer.compute.pollsters [-] 3b461587-d91f-4a59-a05a-9a2ae89cfacd/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 DEBUG ceilometer.compute.pollsters [-] 3b461587-d91f-4a59-a05a-9a2ae89cfacd/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9914d00d-dadd-41a9-8a97-f0d18cc32b76', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1357763391, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': '02b37f0c-3272-417b-9791-48b555f68d56-vda', 'timestamp': '2025-11-29T07:06:48.187497', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'instance-00000040', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f8cd54d4-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.817761616, 'message_signature': 'e124c907f41032cd09dcce10173b537d50c395b7d1712195b611c7d9c60655c0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 256274987, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 
'02b37f0c-3272-417b-9791-48b555f68d56-sda', 'timestamp': '2025-11-29T07:06:48.187497', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'instance-00000040', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f8cd5cfe-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.817761616, 'message_signature': '52db4bf593364b80cc03342e7bc495859c64d864cc37857786788074f562cb23'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '6e3923c949d649889fe9a955a8f5cff8', 'user_name': None, 'project_id': 'eee8c6e871b948c9bff0d4ee4267ba78', 'project_name': None, 'resource_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd-vda', 'timestamp': '2025-11-29T07:06:48.187497', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-314359009', 'name': 'instance-0000004f', 'instance_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd', 'instance_type': 'm1.nano', 'host': '3dfb537a0d579921fcbd0df7859abe955d0dd24f0ce9700adda2e509', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 
'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f8cd65fa-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.85009306, 'message_signature': '2a4eae8c85558f82b5cd5fb575c440a2bc5c9a4afc07ca813149e7d3266e05c8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '6e3923c949d649889fe9a955a8f5cff8', 'user_name': None, 'project_id': 'eee8c6e871b948c9bff0d4ee4267ba78', 'project_name': None, 'resource_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd-sda', 'timestamp': '2025-11-29T07:06:48.187497', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-314359009', 'name': 'instance-0000004f', 'instance_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd', 'instance_type': 'm1.nano', 'host': '3dfb537a0d579921fcbd0df7859abe955d0dd24f0ce9700adda2e509', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f8cd6d8e-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.85009306, 'message_signature': '02c8aa12814c6ff2d3970d980ecd7f582198dec819d96475f391a3ae079f5a51'}]}, 'timestamp': '2025-11-29 07:06:48.188342', '_unique_id': '54dbd81274364b27a013a36db62c20eb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.188 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.189 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.189 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/disk.device.write.latency volume: 64975981151 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.189 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.189 12 DEBUG ceilometer.compute.pollsters [-] 3b461587-d91f-4a59-a05a-9a2ae89cfacd/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 DEBUG ceilometer.compute.pollsters [-] 3b461587-d91f-4a59-a05a-9a2ae89cfacd/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e95d41b6-4043-4030-9547-e0fa1f5a1690', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 64975981151, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': '02b37f0c-3272-417b-9791-48b555f68d56-vda', 'timestamp': '2025-11-29T07:06:48.189464', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'instance-00000040', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f8cda222-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.817761616, 'message_signature': '0395d379edf0bc21b1c0f2876b60fc25a53e79bffebcf2d588f1ea30dfd359fb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 
'02b37f0c-3272-417b-9791-48b555f68d56-sda', 'timestamp': '2025-11-29T07:06:48.189464', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'instance-00000040', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f8cda9ca-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.817761616, 'message_signature': '2b5314dbd32e5010b732ddfa704fc51d392fc9cb3cf94f8f64a1ee487ae45fad'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '6e3923c949d649889fe9a955a8f5cff8', 'user_name': None, 'project_id': 'eee8c6e871b948c9bff0d4ee4267ba78', 'project_name': None, 'resource_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd-vda', 'timestamp': '2025-11-29T07:06:48.189464', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-314359009', 'name': 'instance-0000004f', 'instance_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd', 'instance_type': 'm1.nano', 'host': '3dfb537a0d579921fcbd0df7859abe955d0dd24f0ce9700adda2e509', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f8cdb2da-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.85009306, 'message_signature': '699b9df79347b1338def8741224da28ecc1b5342e4a44482c763fc4a307e50f5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '6e3923c949d649889fe9a955a8f5cff8', 'user_name': None, 'project_id': 'eee8c6e871b948c9bff0d4ee4267ba78', 'project_name': None, 'resource_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd-sda', 'timestamp': '2025-11-29T07:06:48.189464', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-314359009', 'name': 'instance-0000004f', 'instance_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd', 'instance_type': 'm1.nano', 'host': '3dfb537a0d579921fcbd0df7859abe955d0dd24f0ce9700adda2e509', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f8cdba32-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.85009306, 'message_signature': '506b0635dafdcc52369d95083e52d94939db974fe4667e930e7487565faf1e6c'}]}, 'timestamp': '2025-11-29 07:06:48.190305', '_unique_id': '1c90f9a190964aba89d7e04f939d7ace'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.190 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.191 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.191 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.191 12 DEBUG ceilometer.compute.pollsters [-] 3b461587-d91f-4a59-a05a-9a2ae89cfacd/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f52b3d77-67f6-4a84-8ab6-d26abca75af1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'instance-00000040-02b37f0c-3272-417b-9791-48b555f68d56-tap3978dee0-d3', 'timestamp': '2025-11-29T07:06:48.191436', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'tap3978dee0-d3', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c9:dd:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3978dee0-d3'}, 'message_id': 'f8cdeebc-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.715767866, 'message_signature': 'be5708edf0a3b670149e8fd8e91fea07f3389ac458f1b4ba3565e20d8edad0e0'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '6e3923c949d649889fe9a955a8f5cff8', 
'user_name': None, 'project_id': 'eee8c6e871b948c9bff0d4ee4267ba78', 'project_name': None, 'resource_id': 'instance-0000004f-3b461587-d91f-4a59-a05a-9a2ae89cfacd-tapcb9181dd-01', 'timestamp': '2025-11-29T07:06:48.191436', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-314359009', 'name': 'tapcb9181dd-01', 'instance_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd', 'instance_type': 'm1.nano', 'host': '3dfb537a0d579921fcbd0df7859abe955d0dd24f0ce9700adda2e509', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:80:46:c1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcb9181dd-01'}, 'message_id': 'f8cdf6b4-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.719976035, 'message_signature': 'b71131e1b0bf8b4a767ddf2efacfd44f94f4c1eae0e45b3a985ff775100586d1'}]}, 'timestamp': '2025-11-29 07:06:48.191889', '_unique_id': 'fbd01212bbb8426797a790b783783aa1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.192 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.193 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.193 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.193 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-FloatingIPsAssociationNegativeTestJSON-server-314359009>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-FloatingIPsAssociationNegativeTestJSON-server-314359009>]
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.193 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.193 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/network.outgoing.bytes volume: 2740 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.193 12 DEBUG ceilometer.compute.pollsters [-] 3b461587-d91f-4a59-a05a-9a2ae89cfacd/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ccd20e5e-444a-4202-946d-41aab198dc37', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2740, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'instance-00000040-02b37f0c-3272-417b-9791-48b555f68d56-tap3978dee0-d3', 'timestamp': '2025-11-29T07:06:48.193342', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'tap3978dee0-d3', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c9:dd:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3978dee0-d3'}, 'message_id': 'f8ce38c2-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.715767866, 'message_signature': '985c33b43a5cc36dfa311d8650520cb78b31c4d830f74db7d3b1b3cf4d0fb7e8'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '6e3923c949d649889fe9a955a8f5cff8', 'user_name': None, 'project_id': 'eee8c6e871b948c9bff0d4ee4267ba78', 'project_name': None, 'resource_id': 'instance-0000004f-3b461587-d91f-4a59-a05a-9a2ae89cfacd-tapcb9181dd-01', 'timestamp': '2025-11-29T07:06:48.193342', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-314359009', 'name': 'tapcb9181dd-01', 'instance_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd', 'instance_type': 'm1.nano', 'host': '3dfb537a0d579921fcbd0df7859abe955d0dd24f0ce9700adda2e509', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:80:46:c1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcb9181dd-01'}, 'message_id': 'f8ce4330-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.719976035, 'message_signature': 'd0b2aeb3a71cc5d6c7f5b5e4cc19102b634ea79853aa5a9f69f406654e8498ce'}]}, 'timestamp': '2025-11-29 07:06:48.193862', '_unique_id': '4f1c710c3d3d40ae91c91278f27fc97a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.194 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 DEBUG ceilometer.compute.pollsters [-] 3b461587-d91f-4a59-a05a-9a2ae89cfacd/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1102dcc6-6114-4387-82e8-c4b44508bc5c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'instance-00000040-02b37f0c-3272-417b-9791-48b555f68d56-tap3978dee0-d3', 'timestamp': '2025-11-29T07:06:48.194953', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'tap3978dee0-d3', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c9:dd:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3978dee0-d3'}, 'message_id': 'f8ce77ec-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.715767866, 'message_signature': 'ae7c5117dcbcdef2dedad6f2f7876a16142a2989eb42b450dda831a6328c029b'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '6e3923c949d649889fe9a955a8f5cff8', 'user_name': None, 'project_id': 'eee8c6e871b948c9bff0d4ee4267ba78', 'project_name': None, 'resource_id': 'instance-0000004f-3b461587-d91f-4a59-a05a-9a2ae89cfacd-tapcb9181dd-01', 'timestamp': '2025-11-29T07:06:48.194953', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-314359009', 'name': 'tapcb9181dd-01', 'instance_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd', 'instance_type': 'm1.nano', 'host': '3dfb537a0d579921fcbd0df7859abe955d0dd24f0ce9700adda2e509', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:80:46:c1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcb9181dd-01'}, 'message_id': 'f8ce7fd0-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.719976035, 'message_signature': '848d92c703ed56344fa3b1f42668f25135b456e1432dd226e7465e5671cbd8f2'}]}, 'timestamp': '2025-11-29 07:06:48.195371', '_unique_id': '67d3c42b028b40369f6604785f271e65'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.195 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.196 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.196 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/network.incoming.bytes.delta volume: 84 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.196 12 DEBUG ceilometer.compute.pollsters [-] 3b461587-d91f-4a59-a05a-9a2ae89cfacd/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aff9d929-05d6-4a21-a84e-3ba991333fed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 84, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'instance-00000040-02b37f0c-3272-417b-9791-48b555f68d56-tap3978dee0-d3', 'timestamp': '2025-11-29T07:06:48.196469', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'tap3978dee0-d3', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c9:dd:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3978dee0-d3'}, 'message_id': 'f8ceb31a-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.715767866, 'message_signature': '500a8c6e79081c13ff598c9880b34dde045514b7cd17b5e12d5d594a30a595ea'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '6e3923c949d649889fe9a955a8f5cff8', 
'user_name': None, 'project_id': 'eee8c6e871b948c9bff0d4ee4267ba78', 'project_name': None, 'resource_id': 'instance-0000004f-3b461587-d91f-4a59-a05a-9a2ae89cfacd-tapcb9181dd-01', 'timestamp': '2025-11-29T07:06:48.196469', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-314359009', 'name': 'tapcb9181dd-01', 'instance_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd', 'instance_type': 'm1.nano', 'host': '3dfb537a0d579921fcbd0df7859abe955d0dd24f0ce9700adda2e509', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:80:46:c1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcb9181dd-01'}, 'message_id': 'f8cebafe-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.719976035, 'message_signature': '7bb6859918416d153b166c47b0d11de31dc7f5082d9a698a321adc0de0abefcc'}]}, 'timestamp': '2025-11-29 07:06:48.196929', '_unique_id': 'e7a81169abeb48e3bea280c5bbfa64cd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.197 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/network.incoming.bytes volume: 1478 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 DEBUG ceilometer.compute.pollsters [-] 3b461587-d91f-4a59-a05a-9a2ae89cfacd/network.incoming.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '237bc473-dae4-4313-b987-4874c18878b6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1478, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'instance-00000040-02b37f0c-3272-417b-9791-48b555f68d56-tap3978dee0-d3', 'timestamp': '2025-11-29T07:06:48.198073', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'tap3978dee0-d3', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c9:dd:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3978dee0-d3'}, 'message_id': 'f8cef1b8-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.715767866, 'message_signature': '270244efa041bb1b61bab546adb2819bf6b68f6663d7aaabf22737057a4754f8'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '6e3923c949d649889fe9a955a8f5cff8', 
'user_name': None, 'project_id': 'eee8c6e871b948c9bff0d4ee4267ba78', 'project_name': None, 'resource_id': 'instance-0000004f-3b461587-d91f-4a59-a05a-9a2ae89cfacd-tapcb9181dd-01', 'timestamp': '2025-11-29T07:06:48.198073', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-314359009', 'name': 'tapcb9181dd-01', 'instance_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd', 'instance_type': 'm1.nano', 'host': '3dfb537a0d579921fcbd0df7859abe955d0dd24f0ce9700adda2e509', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:80:46:c1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcb9181dd-01'}, 'message_id': 'f8cef992-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.719976035, 'message_signature': '3defe1b0e85a396d59bde7b7f3227191532ca2cdf3a579f7897c9a09da76e269'}]}, 'timestamp': '2025-11-29 07:06:48.198487', '_unique_id': '44083d500b3f437f8336656701857400'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.198 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.199 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.199 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/disk.device.read.bytes volume: 30116352 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.199 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.200 12 DEBUG ceilometer.compute.pollsters [-] 3b461587-d91f-4a59-a05a-9a2ae89cfacd/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.200 12 DEBUG ceilometer.compute.pollsters [-] 3b461587-d91f-4a59-a05a-9a2ae89cfacd/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '204f39ee-17ff-42ec-87bd-b9bf0fa075be', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30116352, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': '02b37f0c-3272-417b-9791-48b555f68d56-vda', 'timestamp': '2025-11-29T07:06:48.199564', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'instance-00000040', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f8cf2bf6-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.817761616, 'message_signature': '12bbbf6393cca0a89a65fff1bb4af6c66d9080af98b8f8dd51bcea546b450c7a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 
'02b37f0c-3272-417b-9791-48b555f68d56-sda', 'timestamp': '2025-11-29T07:06:48.199564', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'instance-00000040', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f8cf3538-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.817761616, 'message_signature': '35a022c7ee6b3300b8ff8587baa62704a6552f80e1157e178f7c514aff6049b5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '6e3923c949d649889fe9a955a8f5cff8', 'user_name': None, 'project_id': 'eee8c6e871b948c9bff0d4ee4267ba78', 'project_name': None, 'resource_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd-vda', 'timestamp': '2025-11-29T07:06:48.199564', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-314359009', 'name': 'instance-0000004f', 'instance_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd', 'instance_type': 'm1.nano', 'host': '3dfb537a0d579921fcbd0df7859abe955d0dd24f0ce9700adda2e509', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 
'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f8cf3cea-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.85009306, 'message_signature': '1cb80df3fd4a3b5190030f9eae776e468c50afe2777cacdc9a08eb22eab7a3c8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '6e3923c949d649889fe9a955a8f5cff8', 'user_name': None, 'project_id': 'eee8c6e871b948c9bff0d4ee4267ba78', 'project_name': None, 'resource_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd-sda', 'timestamp': '2025-11-29T07:06:48.199564', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-314359009', 'name': 'instance-0000004f', 'instance_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd', 'instance_type': 'm1.nano', 'host': '3dfb537a0d579921fcbd0df7859abe955d0dd24f0ce9700adda2e509', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f8cf442e-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.85009306, 'message_signature': '4405cf5c39af458349ee73aa3b8944421944353ba4ea9d560989087d72c061bc'}]}, 'timestamp': '2025-11-29 07:06:48.200423', '_unique_id': '866630a7275c4067902571a7028d5712'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.201 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.202 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/disk.device.read.requests volume: 1081 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.202 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.202 12 DEBUG ceilometer.compute.pollsters [-] 3b461587-d91f-4a59-a05a-9a2ae89cfacd/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.202 12 DEBUG ceilometer.compute.pollsters [-] 3b461587-d91f-4a59-a05a-9a2ae89cfacd/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '26cd20d9-92f2-498a-a72d-36805cc0f330', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1081, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': '02b37f0c-3272-417b-9791-48b555f68d56-vda', 'timestamp': '2025-11-29T07:06:48.202046', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'instance-00000040', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f8cf8dc6-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.817761616, 'message_signature': '9ccd43517d728052755e0ec733640fb29ed5fe4cc4d31630b73b7af1178f8e5b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 
'02b37f0c-3272-417b-9791-48b555f68d56-sda', 'timestamp': '2025-11-29T07:06:48.202046', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'instance-00000040', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f8cf955a-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.817761616, 'message_signature': '00b7eeca62047f2721723ca9b57de4e32c192d582e1a8018dd8ca2204304f6a4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '6e3923c949d649889fe9a955a8f5cff8', 'user_name': None, 'project_id': 'eee8c6e871b948c9bff0d4ee4267ba78', 'project_name': None, 'resource_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd-vda', 'timestamp': '2025-11-29T07:06:48.202046', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-314359009', 'name': 'instance-0000004f', 'instance_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd', 'instance_type': 'm1.nano', 'host': '3dfb537a0d579921fcbd0df7859abe955d0dd24f0ce9700adda2e509', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f8cf9cb2-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.85009306, 'message_signature': '4d991cf37f20cc0dc29acc71b3fe1208a84147bd340245193641a914dee95391'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '6e3923c949d649889fe9a955a8f5cff8', 'user_name': None, 'project_id': 'eee8c6e871b948c9bff0d4ee4267ba78', 'project_name': None, 'resource_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd-sda', 'timestamp': '2025-11-29T07:06:48.202046', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-314359009', 'name': 'instance-0000004f', 'instance_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd', 'instance_type': 'm1.nano', 'host': '3dfb537a0d579921fcbd0df7859abe955d0dd24f0ce9700adda2e509', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f8cfa50e-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.85009306, 'message_signature': 'e5a99ef6bd7ee999c44b55071ffa2b734ee767e27f10b53cb476b5907f67f1dc'}]}, 'timestamp': '2025-11-29 07:06:48.202903', '_unique_id': '3d5235d84b534c9580b3ea9e5ae3b982'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.203 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.204 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.204 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.204 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-FloatingIPsAssociationNegativeTestJSON-server-314359009>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-FloatingIPsAssociationNegativeTestJSON-server-314359009>]
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.204 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.204 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.204 12 DEBUG ceilometer.compute.pollsters [-] 3b461587-d91f-4a59-a05a-9a2ae89cfacd/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0063c5d9-18dc-4658-8d31-70cdcdb6b3cf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'instance-00000040-02b37f0c-3272-417b-9791-48b555f68d56-tap3978dee0-d3', 'timestamp': '2025-11-29T07:06:48.204549', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'tap3978dee0-d3', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c9:dd:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3978dee0-d3'}, 'message_id': 'f8cfef14-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.715767866, 'message_signature': '833aa89d36d16bf2ac5cc617937ed125fc76ea5a737bcb045d7b7075801482b6'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'6e3923c949d649889fe9a955a8f5cff8', 'user_name': None, 'project_id': 'eee8c6e871b948c9bff0d4ee4267ba78', 'project_name': None, 'resource_id': 'instance-0000004f-3b461587-d91f-4a59-a05a-9a2ae89cfacd-tapcb9181dd-01', 'timestamp': '2025-11-29T07:06:48.204549', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-314359009', 'name': 'tapcb9181dd-01', 'instance_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd', 'instance_type': 'm1.nano', 'host': '3dfb537a0d579921fcbd0df7859abe955d0dd24f0ce9700adda2e509', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:80:46:c1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcb9181dd-01'}, 'message_id': 'f8cff96e-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.719976035, 'message_signature': '6f10cef1044dd19ecab080d2be0e148025491d5fa25df434c7acf02265f37ed1'}]}, 'timestamp': '2025-11-29 07:06:48.205066', '_unique_id': '82966d44bb004db2bdad899ed8995cf1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.205 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.206 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.206 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/memory.usage volume: 42.35546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.206 12 DEBUG ceilometer.compute.pollsters [-] 3b461587-d91f-4a59-a05a-9a2ae89cfacd/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.206 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 3b461587-d91f-4a59-a05a-9a2ae89cfacd: ceilometer.compute.pollsters.NoVolumeException
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c21096fd-c9d1-44cc-a808-231b9d7da348', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.35546875, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'timestamp': '2025-11-29T07:06:48.206255', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'instance-00000040', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'f8d03154-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.753983115, 'message_signature': 'e9a4538c8cdd072dc290df1812a09f568894acad1fa8528e8636f53b53a1d178'}]}, 'timestamp': '2025-11-29 07:06:48.206627', '_unique_id': '12a87aa634e94b3580a64c609f48761f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.207 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.208 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.208 12 DEBUG ceilometer.compute.pollsters [-] 3b461587-d91f-4a59-a05a-9a2ae89cfacd/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.208 12 DEBUG ceilometer.compute.pollsters [-] 3b461587-d91f-4a59-a05a-9a2ae89cfacd/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e38b68ff-489d-4829-a098-9ad2073f2e47', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': '02b37f0c-3272-417b-9791-48b555f68d56-vda', 'timestamp': '2025-11-29T07:06:48.207785', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'instance-00000040', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f8d06f3e-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.880745085, 'message_signature': '98d84ec4febbee0161f1708c41c2311192ea77876d6e23b4ca7cdf97f3bbec47'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 
'02b37f0c-3272-417b-9791-48b555f68d56-sda', 'timestamp': '2025-11-29T07:06:48.207785', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'instance-00000040', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f8d07812-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.880745085, 'message_signature': 'b5d58a4fc4e3b8ad0cb7850c5672938c579c0c90c071795dd789f922564764bd'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '6e3923c949d649889fe9a955a8f5cff8', 'user_name': None, 'project_id': 'eee8c6e871b948c9bff0d4ee4267ba78', 'project_name': None, 'resource_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd-vda', 'timestamp': '2025-11-29T07:06:48.207785', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-314359009', 'name': 'instance-0000004f', 'instance_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd', 'instance_type': 'm1.nano', 'host': '3dfb537a0d579921fcbd0df7859abe955d0dd24f0ce9700adda2e509', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 
'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f8d08046-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.891371125, 'message_signature': '3d44930c1c3855f1b734df66d1fe04d9adb3aa1008379655b3f0f67550e36e4c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '6e3923c949d649889fe9a955a8f5cff8', 'user_name': None, 'project_id': 'eee8c6e871b948c9bff0d4ee4267ba78', 'project_name': None, 'resource_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd-sda', 'timestamp': '2025-11-29T07:06:48.207785', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-314359009', 'name': 'instance-0000004f', 'instance_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd', 'instance_type': 'm1.nano', 'host': '3dfb537a0d579921fcbd0df7859abe955d0dd24f0ce9700adda2e509', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f8d087c6-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.891371125, 'message_signature': '2dfe1f5771e5cb1cdb64bd8916af659adbbf7b9f3cbea52690cce06ccf445b10'}]}, 'timestamp': '2025-11-29 07:06:48.208674', '_unique_id': 'b7b34a3d03a44d5da5213b1b92a3094b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.209 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.210 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.210 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/disk.device.write.bytes volume: 73056256 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.210 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.210 12 DEBUG ceilometer.compute.pollsters [-] 3b461587-d91f-4a59-a05a-9a2ae89cfacd/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.211 12 DEBUG ceilometer.compute.pollsters [-] 3b461587-d91f-4a59-a05a-9a2ae89cfacd/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '46ae5210-10c5-4c1f-bc46-b9d6771e2b90', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73056256, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': '02b37f0c-3272-417b-9791-48b555f68d56-vda', 'timestamp': '2025-11-29T07:06:48.210199', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'instance-00000040', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f8d0ccf4-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.817761616, 'message_signature': '39b1250c68d528c7148f5b5088bd8a5fbd3103a959c8c664738b40a95d401d1f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 
'02b37f0c-3272-417b-9791-48b555f68d56-sda', 'timestamp': '2025-11-29T07:06:48.210199', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'instance-00000040', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f8d0d7d0-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.817761616, 'message_signature': 'dc4fcc2255d0b4866e4e0078fa6b0cb9acbe1ff0a5311c4ad9876ec8beea5a73'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '6e3923c949d649889fe9a955a8f5cff8', 'user_name': None, 'project_id': 'eee8c6e871b948c9bff0d4ee4267ba78', 'project_name': None, 'resource_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd-vda', 'timestamp': '2025-11-29T07:06:48.210199', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-314359009', 'name': 'instance-0000004f', 'instance_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd', 'instance_type': 'm1.nano', 'host': '3dfb537a0d579921fcbd0df7859abe955d0dd24f0ce9700adda2e509', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 
'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f8d0e4dc-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.85009306, 'message_signature': 'e27d8724f5ca9b6701ac29004f7513a24d5eb0a0f9045058f2d48802b6d1f802'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '6e3923c949d649889fe9a955a8f5cff8', 'user_name': None, 'project_id': 'eee8c6e871b948c9bff0d4ee4267ba78', 'project_name': None, 'resource_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd-sda', 'timestamp': '2025-11-29T07:06:48.210199', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-314359009', 'name': 'instance-0000004f', 'instance_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd', 'instance_type': 'm1.nano', 'host': '3dfb537a0d579921fcbd0df7859abe955d0dd24f0ce9700adda2e509', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f8d0efa4-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.85009306, 'message_signature': '5f4ca44ab37b050282b0e4651d70d868e92df6d0e0d49af6a38ddfd723baeaf1'}]}, 'timestamp': '2025-11-29 07:06:48.211363', '_unique_id': '91b6aeb3744440958e5bf3c87f6ee476'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.212 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 DEBUG ceilometer.compute.pollsters [-] 3b461587-d91f-4a59-a05a-9a2ae89cfacd/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '44eac8b1-a85f-446f-889c-ae7eaf7a43b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'instance-00000040-02b37f0c-3272-417b-9791-48b555f68d56-tap3978dee0-d3', 'timestamp': '2025-11-29T07:06:48.212821', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'tap3978dee0-d3', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c9:dd:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3978dee0-d3'}, 'message_id': 'f8d1339c-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.715767866, 'message_signature': 'a10b9ede9f1b9d869c34f3a2bca02475df35dd70a56d4bfa092393b4bdd3d6cb'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '6e3923c949d649889fe9a955a8f5cff8', 'user_name': None, 'project_id': 'eee8c6e871b948c9bff0d4ee4267ba78', 'project_name': None, 'resource_id': 'instance-0000004f-3b461587-d91f-4a59-a05a-9a2ae89cfacd-tapcb9181dd-01', 'timestamp': '2025-11-29T07:06:48.212821', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-314359009', 'name': 'tapcb9181dd-01', 'instance_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd', 'instance_type': 'm1.nano', 'host': '3dfb537a0d579921fcbd0df7859abe955d0dd24f0ce9700adda2e509', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:80:46:c1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcb9181dd-01'}, 'message_id': 'f8d13f0e-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.719976035, 'message_signature': '31719405d320fd456a790323ccd4af377deb958b168a8d19c305c9f881f73d24'}]}, 'timestamp': '2025-11-29 07:06:48.213380', '_unique_id': '02b5c12ba2e347ab8c7d0df2d00daf1c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.213 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.214 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.214 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.214 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-FloatingIPsAssociationNegativeTestJSON-server-314359009>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-FloatingIPsAssociationNegativeTestJSON-server-314359009>]
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.214 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.215 12 DEBUG ceilometer.compute.pollsters [-] 02b37f0c-3272-417b-9791-48b555f68d56/network.outgoing.packets volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.215 12 DEBUG ceilometer.compute.pollsters [-] 3b461587-d91f-4a59-a05a-9a2ae89cfacd/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b2782b0f-483d-41d4-a403-eb864c1bd77f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 28, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'instance-00000040-02b37f0c-3272-417b-9791-48b555f68d56-tap3978dee0-d3', 'timestamp': '2025-11-29T07:06:48.215014', 'resource_metadata': {'display_name': 'tempest-₡-1325169813', 'name': 'tap3978dee0-d3', 'instance_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'instance_type': 'm1.nano', 'host': 'd2b2a26ac8bb4b5a7f8b2ffba0a6838e65339de5dfc5cc4516750a55', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c9:dd:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3978dee0-d3'}, 'message_id': 'f8d187e8-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.715767866, 'message_signature': '4d619f77a91ed84788f748e14948da0c6717c29a855ad00f03c4f3236dd6d856'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '6e3923c949d649889fe9a955a8f5cff8', 'user_name': None, 'project_id': 'eee8c6e871b948c9bff0d4ee4267ba78', 'project_name': None, 'resource_id': 'instance-0000004f-3b461587-d91f-4a59-a05a-9a2ae89cfacd-tapcb9181dd-01', 'timestamp': '2025-11-29T07:06:48.215014', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-314359009', 'name': 'tapcb9181dd-01', 'instance_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd', 'instance_type': 'm1.nano', 'host': '3dfb537a0d579921fcbd0df7859abe955d0dd24f0ce9700adda2e509', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:80:46:c1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcb9181dd-01'}, 'message_id': 'f8d1904e-ccf1-11f0-8f64-fa163e220349', 'monotonic_time': 5455.719976035, 'message_signature': '1bfbb5067561e47d4af9b358cbfc8286ac5180176ebc6f13ab95c56fccbb91aa'}]}, 'timestamp': '2025-11-29 07:06:48.215456', '_unique_id': 'a7b03e37c60648e496a35f1e59e00a3c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:06:48.216 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:06:48 compute-0 nova_compute[187185]: 2025-11-29 07:06:48.275 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:06:48 compute-0 nova_compute[187185]: 2025-11-29 07:06:48.279 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400008.082665, 3b461587-d91f-4a59-a05a-9a2ae89cfacd => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:06:48 compute-0 nova_compute[187185]: 2025-11-29 07:06:48.279 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] VM Paused (Lifecycle Event)
Nov 29 07:06:48 compute-0 nova_compute[187185]: 2025-11-29 07:06:48.308 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:06:48 compute-0 nova_compute[187185]: 2025-11-29 07:06:48.313 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:06:48 compute-0 podman[223916]: 2025-11-29 07:06:48.2413706 +0000 UTC m=+0.031125120 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:06:48 compute-0 nova_compute[187185]: 2025-11-29 07:06:48.339 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:06:48 compute-0 nova_compute[187185]: 2025-11-29 07:06:48.673 187189 DEBUG nova.network.neutron [req-5cc85486-154c-41a7-a32a-58e0ed2ba0ac req-ae4cfdf5-bbb4-4152-95be-c29adaa2d61f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Updated VIF entry in instance network info cache for port cb9181dd-01c0-409c-8102-a94217bf40a8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:06:48 compute-0 nova_compute[187185]: 2025-11-29 07:06:48.673 187189 DEBUG nova.network.neutron [req-5cc85486-154c-41a7-a32a-58e0ed2ba0ac req-ae4cfdf5-bbb4-4152-95be-c29adaa2d61f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Updating instance_info_cache with network_info: [{"id": "cb9181dd-01c0-409c-8102-a94217bf40a8", "address": "fa:16:3e:80:46:c1", "network": {"id": "ec48bfb6-9f16-422e-b7dc-422060e71ca2", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-39474949-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eee8c6e871b948c9bff0d4ee4267ba78", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcb9181dd-01", "ovs_interfaceid": "cb9181dd-01c0-409c-8102-a94217bf40a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:06:48 compute-0 nova_compute[187185]: 2025-11-29 07:06:48.694 187189 DEBUG oslo_concurrency.lockutils [req-5cc85486-154c-41a7-a32a-58e0ed2ba0ac req-ae4cfdf5-bbb4-4152-95be-c29adaa2d61f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-3b461587-d91f-4a59-a05a-9a2ae89cfacd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:06:49 compute-0 podman[223916]: 2025-11-29 07:06:49.235732373 +0000 UTC m=+1.025486873 container create 548cb25cb3b313e9192b4efe26a077da1f43bddf9fa8b430d14e77eb8e34d50e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ec48bfb6-9f16-422e-b7dc-422060e71ca2, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 29 07:06:49 compute-0 nova_compute[187185]: 2025-11-29 07:06:49.329 187189 DEBUG nova.compute.manager [req-c3ad9dbb-dcf0-4e42-bd98-e5424670c731 req-a42a6ad9-e2c3-4a3c-bf25-4f80d77ce6a0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Received event network-vif-plugged-cb9181dd-01c0-409c-8102-a94217bf40a8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:06:49 compute-0 nova_compute[187185]: 2025-11-29 07:06:49.330 187189 DEBUG oslo_concurrency.lockutils [req-c3ad9dbb-dcf0-4e42-bd98-e5424670c731 req-a42a6ad9-e2c3-4a3c-bf25-4f80d77ce6a0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "3b461587-d91f-4a59-a05a-9a2ae89cfacd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:06:49 compute-0 nova_compute[187185]: 2025-11-29 07:06:49.330 187189 DEBUG oslo_concurrency.lockutils [req-c3ad9dbb-dcf0-4e42-bd98-e5424670c731 req-a42a6ad9-e2c3-4a3c-bf25-4f80d77ce6a0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3b461587-d91f-4a59-a05a-9a2ae89cfacd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:06:49 compute-0 nova_compute[187185]: 2025-11-29 07:06:49.331 187189 DEBUG oslo_concurrency.lockutils [req-c3ad9dbb-dcf0-4e42-bd98-e5424670c731 req-a42a6ad9-e2c3-4a3c-bf25-4f80d77ce6a0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3b461587-d91f-4a59-a05a-9a2ae89cfacd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:06:49 compute-0 nova_compute[187185]: 2025-11-29 07:06:49.331 187189 DEBUG nova.compute.manager [req-c3ad9dbb-dcf0-4e42-bd98-e5424670c731 req-a42a6ad9-e2c3-4a3c-bf25-4f80d77ce6a0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Processing event network-vif-plugged-cb9181dd-01c0-409c-8102-a94217bf40a8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 07:06:49 compute-0 nova_compute[187185]: 2025-11-29 07:06:49.331 187189 DEBUG nova.compute.manager [req-c3ad9dbb-dcf0-4e42-bd98-e5424670c731 req-a42a6ad9-e2c3-4a3c-bf25-4f80d77ce6a0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Received event network-vif-plugged-cb9181dd-01c0-409c-8102-a94217bf40a8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:06:49 compute-0 nova_compute[187185]: 2025-11-29 07:06:49.332 187189 DEBUG oslo_concurrency.lockutils [req-c3ad9dbb-dcf0-4e42-bd98-e5424670c731 req-a42a6ad9-e2c3-4a3c-bf25-4f80d77ce6a0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "3b461587-d91f-4a59-a05a-9a2ae89cfacd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:06:49 compute-0 nova_compute[187185]: 2025-11-29 07:06:49.332 187189 DEBUG oslo_concurrency.lockutils [req-c3ad9dbb-dcf0-4e42-bd98-e5424670c731 req-a42a6ad9-e2c3-4a3c-bf25-4f80d77ce6a0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3b461587-d91f-4a59-a05a-9a2ae89cfacd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:06:49 compute-0 nova_compute[187185]: 2025-11-29 07:06:49.333 187189 DEBUG oslo_concurrency.lockutils [req-c3ad9dbb-dcf0-4e42-bd98-e5424670c731 req-a42a6ad9-e2c3-4a3c-bf25-4f80d77ce6a0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3b461587-d91f-4a59-a05a-9a2ae89cfacd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:06:49 compute-0 nova_compute[187185]: 2025-11-29 07:06:49.333 187189 DEBUG nova.compute.manager [req-c3ad9dbb-dcf0-4e42-bd98-e5424670c731 req-a42a6ad9-e2c3-4a3c-bf25-4f80d77ce6a0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] No waiting events found dispatching network-vif-plugged-cb9181dd-01c0-409c-8102-a94217bf40a8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:06:49 compute-0 nova_compute[187185]: 2025-11-29 07:06:49.334 187189 WARNING nova.compute.manager [req-c3ad9dbb-dcf0-4e42-bd98-e5424670c731 req-a42a6ad9-e2c3-4a3c-bf25-4f80d77ce6a0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Received unexpected event network-vif-plugged-cb9181dd-01c0-409c-8102-a94217bf40a8 for instance with vm_state building and task_state spawning.
Nov 29 07:06:49 compute-0 nova_compute[187185]: 2025-11-29 07:06:49.334 187189 DEBUG nova.compute.manager [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:06:49 compute-0 nova_compute[187185]: 2025-11-29 07:06:49.339 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400009.3391483, 3b461587-d91f-4a59-a05a-9a2ae89cfacd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:06:49 compute-0 nova_compute[187185]: 2025-11-29 07:06:49.339 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] VM Resumed (Lifecycle Event)
Nov 29 07:06:49 compute-0 nova_compute[187185]: 2025-11-29 07:06:49.343 187189 DEBUG nova.virt.libvirt.driver [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 07:06:49 compute-0 nova_compute[187185]: 2025-11-29 07:06:49.349 187189 INFO nova.virt.libvirt.driver [-] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Instance spawned successfully.
Nov 29 07:06:49 compute-0 nova_compute[187185]: 2025-11-29 07:06:49.350 187189 DEBUG nova.virt.libvirt.driver [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 07:06:49 compute-0 nova_compute[187185]: 2025-11-29 07:06:49.359 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:06:49 compute-0 nova_compute[187185]: 2025-11-29 07:06:49.371 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:06:49 compute-0 nova_compute[187185]: 2025-11-29 07:06:49.379 187189 DEBUG nova.virt.libvirt.driver [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:06:49 compute-0 nova_compute[187185]: 2025-11-29 07:06:49.380 187189 DEBUG nova.virt.libvirt.driver [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:06:49 compute-0 nova_compute[187185]: 2025-11-29 07:06:49.381 187189 DEBUG nova.virt.libvirt.driver [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:06:49 compute-0 nova_compute[187185]: 2025-11-29 07:06:49.382 187189 DEBUG nova.virt.libvirt.driver [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:06:49 compute-0 nova_compute[187185]: 2025-11-29 07:06:49.382 187189 DEBUG nova.virt.libvirt.driver [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:06:49 compute-0 nova_compute[187185]: 2025-11-29 07:06:49.383 187189 DEBUG nova.virt.libvirt.driver [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:06:49 compute-0 nova_compute[187185]: 2025-11-29 07:06:49.390 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:06:49 compute-0 nova_compute[187185]: 2025-11-29 07:06:49.391 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:06:49 compute-0 nova_compute[187185]: 2025-11-29 07:06:49.449 187189 INFO nova.compute.manager [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Took 7.60 seconds to spawn the instance on the hypervisor.
Nov 29 07:06:49 compute-0 nova_compute[187185]: 2025-11-29 07:06:49.450 187189 DEBUG nova.compute.manager [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:06:49 compute-0 nova_compute[187185]: 2025-11-29 07:06:49.537 187189 INFO nova.compute.manager [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Took 8.40 seconds to build instance.
Nov 29 07:06:49 compute-0 nova_compute[187185]: 2025-11-29 07:06:49.558 187189 DEBUG oslo_concurrency.lockutils [None req-61246a53-1468-43d9-83fb-a8c9f7d279c7 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Lock "3b461587-d91f-4a59-a05a-9a2ae89cfacd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.495s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:06:50 compute-0 systemd[1]: Started libpod-conmon-548cb25cb3b313e9192b4efe26a077da1f43bddf9fa8b430d14e77eb8e34d50e.scope.
Nov 29 07:06:50 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:06:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/af7236fcd411afb3ea3808ba516c7b2183bb969b079cd3b844a4e8182c43df9c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:06:50 compute-0 podman[223916]: 2025-11-29 07:06:50.476179576 +0000 UTC m=+2.265934096 container init 548cb25cb3b313e9192b4efe26a077da1f43bddf9fa8b430d14e77eb8e34d50e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ec48bfb6-9f16-422e-b7dc-422060e71ca2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3)
Nov 29 07:06:50 compute-0 podman[223916]: 2025-11-29 07:06:50.483150913 +0000 UTC m=+2.272905413 container start 548cb25cb3b313e9192b4efe26a077da1f43bddf9fa8b430d14e77eb8e34d50e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ec48bfb6-9f16-422e-b7dc-422060e71ca2, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 29 07:06:50 compute-0 neutron-haproxy-ovnmeta-ec48bfb6-9f16-422e-b7dc-422060e71ca2[223931]: [NOTICE]   (223935) : New worker (223937) forked
Nov 29 07:06:50 compute-0 neutron-haproxy-ovnmeta-ec48bfb6-9f16-422e-b7dc-422060e71ca2[223931]: [NOTICE]   (223935) : Loading success.
Nov 29 07:06:51 compute-0 nova_compute[187185]: 2025-11-29 07:06:51.246 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:06:53 compute-0 nova_compute[187185]: 2025-11-29 07:06:53.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:06:53 compute-0 nova_compute[187185]: 2025-11-29 07:06:53.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:06:53 compute-0 nova_compute[187185]: 2025-11-29 07:06:53.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:06:53 compute-0 nova_compute[187185]: 2025-11-29 07:06:53.498 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "refresh_cache-02b37f0c-3272-417b-9791-48b555f68d56" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:06:53 compute-0 nova_compute[187185]: 2025-11-29 07:06:53.500 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquired lock "refresh_cache-02b37f0c-3272-417b-9791-48b555f68d56" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:06:53 compute-0 nova_compute[187185]: 2025-11-29 07:06:53.500 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 29 07:06:53 compute-0 nova_compute[187185]: 2025-11-29 07:06:53.500 187189 DEBUG nova.objects.instance [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 02b37f0c-3272-417b-9791-48b555f68d56 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:06:54 compute-0 nova_compute[187185]: 2025-11-29 07:06:54.492 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:06:54 compute-0 NetworkManager[55227]: <info>  [1764400014.6259] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/95)
Nov 29 07:06:54 compute-0 NetworkManager[55227]: <info>  [1764400014.6279] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/96)
Nov 29 07:06:54 compute-0 nova_compute[187185]: 2025-11-29 07:06:54.628 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:06:54 compute-0 nova_compute[187185]: 2025-11-29 07:06:54.714 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:06:54 compute-0 ovn_controller[95281]: 2025-11-29T07:06:54Z|00168|binding|INFO|Releasing lport cd784eba-7471-412f-94a4-89c319d0e878 from this chassis (sb_readonly=0)
Nov 29 07:06:54 compute-0 ovn_controller[95281]: 2025-11-29T07:06:54Z|00169|binding|INFO|Releasing lport ed5aef73-67a0-4ad1-8aea-9c411786c18e from this chassis (sb_readonly=0)
Nov 29 07:06:54 compute-0 nova_compute[187185]: 2025-11-29 07:06:54.733 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:06:55 compute-0 ovn_controller[95281]: 2025-11-29T07:06:55Z|00170|binding|INFO|Releasing lport cd784eba-7471-412f-94a4-89c319d0e878 from this chassis (sb_readonly=0)
Nov 29 07:06:55 compute-0 ovn_controller[95281]: 2025-11-29T07:06:55Z|00171|binding|INFO|Releasing lport ed5aef73-67a0-4ad1-8aea-9c411786c18e from this chassis (sb_readonly=0)
Nov 29 07:06:55 compute-0 nova_compute[187185]: 2025-11-29 07:06:55.304 187189 DEBUG nova.compute.manager [req-eb833682-a85f-4b21-96d1-a371c621a46d req-4601602d-1f07-456e-a69c-f62df1ee8928 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Received event network-changed-cb9181dd-01c0-409c-8102-a94217bf40a8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:06:55 compute-0 nova_compute[187185]: 2025-11-29 07:06:55.305 187189 DEBUG nova.compute.manager [req-eb833682-a85f-4b21-96d1-a371c621a46d req-4601602d-1f07-456e-a69c-f62df1ee8928 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Refreshing instance network info cache due to event network-changed-cb9181dd-01c0-409c-8102-a94217bf40a8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:06:55 compute-0 nova_compute[187185]: 2025-11-29 07:06:55.307 187189 DEBUG oslo_concurrency.lockutils [req-eb833682-a85f-4b21-96d1-a371c621a46d req-4601602d-1f07-456e-a69c-f62df1ee8928 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-3b461587-d91f-4a59-a05a-9a2ae89cfacd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:06:55 compute-0 nova_compute[187185]: 2025-11-29 07:06:55.307 187189 DEBUG oslo_concurrency.lockutils [req-eb833682-a85f-4b21-96d1-a371c621a46d req-4601602d-1f07-456e-a69c-f62df1ee8928 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-3b461587-d91f-4a59-a05a-9a2ae89cfacd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:06:55 compute-0 nova_compute[187185]: 2025-11-29 07:06:55.307 187189 DEBUG nova.network.neutron [req-eb833682-a85f-4b21-96d1-a371c621a46d req-4601602d-1f07-456e-a69c-f62df1ee8928 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Refreshing network info cache for port cb9181dd-01c0-409c-8102-a94217bf40a8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:06:55 compute-0 nova_compute[187185]: 2025-11-29 07:06:55.326 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:06:55 compute-0 nova_compute[187185]: 2025-11-29 07:06:55.660 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Updating instance_info_cache with network_info: [{"id": "3978dee0-d304-43a8-9478-68840d581d9b", "address": "fa:16:3e:c9:dd:dd", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3978dee0-d3", "ovs_interfaceid": "3978dee0-d304-43a8-9478-68840d581d9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:06:55 compute-0 nova_compute[187185]: 2025-11-29 07:06:55.799 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Releasing lock "refresh_cache-02b37f0c-3272-417b-9791-48b555f68d56" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:06:55 compute-0 nova_compute[187185]: 2025-11-29 07:06:55.800 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 29 07:06:55 compute-0 nova_compute[187185]: 2025-11-29 07:06:55.800 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:06:56 compute-0 nova_compute[187185]: 2025-11-29 07:06:56.249 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:06:56 compute-0 podman[223947]: 2025-11-29 07:06:56.840398642 +0000 UTC m=+0.104333418 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:06:57 compute-0 nova_compute[187185]: 2025-11-29 07:06:57.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:06:57 compute-0 nova_compute[187185]: 2025-11-29 07:06:57.715 187189 DEBUG nova.network.neutron [req-eb833682-a85f-4b21-96d1-a371c621a46d req-4601602d-1f07-456e-a69c-f62df1ee8928 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Updated VIF entry in instance network info cache for port cb9181dd-01c0-409c-8102-a94217bf40a8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:06:57 compute-0 nova_compute[187185]: 2025-11-29 07:06:57.716 187189 DEBUG nova.network.neutron [req-eb833682-a85f-4b21-96d1-a371c621a46d req-4601602d-1f07-456e-a69c-f62df1ee8928 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Updating instance_info_cache with network_info: [{"id": "cb9181dd-01c0-409c-8102-a94217bf40a8", "address": "fa:16:3e:80:46:c1", "network": {"id": "ec48bfb6-9f16-422e-b7dc-422060e71ca2", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-39474949-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eee8c6e871b948c9bff0d4ee4267ba78", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcb9181dd-01", "ovs_interfaceid": "cb9181dd-01c0-409c-8102-a94217bf40a8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:06:58 compute-0 nova_compute[187185]: 2025-11-29 07:06:58.037 187189 DEBUG oslo_concurrency.lockutils [req-eb833682-a85f-4b21-96d1-a371c621a46d req-4601602d-1f07-456e-a69c-f62df1ee8928 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-3b461587-d91f-4a59-a05a-9a2ae89cfacd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:06:58 compute-0 nova_compute[187185]: 2025-11-29 07:06:58.312 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:06:58 compute-0 nova_compute[187185]: 2025-11-29 07:06:58.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:06:58 compute-0 nova_compute[187185]: 2025-11-29 07:06:58.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:06:59 compute-0 nova_compute[187185]: 2025-11-29 07:06:59.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:06:59 compute-0 nova_compute[187185]: 2025-11-29 07:06:59.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:06:59 compute-0 nova_compute[187185]: 2025-11-29 07:06:59.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:06:59 compute-0 nova_compute[187185]: 2025-11-29 07:06:59.494 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:00 compute-0 nova_compute[187185]: 2025-11-29 07:07:00.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:07:00 compute-0 nova_compute[187185]: 2025-11-29 07:07:00.342 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:07:00 compute-0 nova_compute[187185]: 2025-11-29 07:07:00.342 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:07:00 compute-0 nova_compute[187185]: 2025-11-29 07:07:00.343 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:07:00 compute-0 nova_compute[187185]: 2025-11-29 07:07:00.343 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:07:00 compute-0 nova_compute[187185]: 2025-11-29 07:07:00.434 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02b37f0c-3272-417b-9791-48b555f68d56/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:07:00 compute-0 nova_compute[187185]: 2025-11-29 07:07:00.500 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02b37f0c-3272-417b-9791-48b555f68d56/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:07:00 compute-0 nova_compute[187185]: 2025-11-29 07:07:00.501 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02b37f0c-3272-417b-9791-48b555f68d56/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:07:00 compute-0 nova_compute[187185]: 2025-11-29 07:07:00.559 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02b37f0c-3272-417b-9791-48b555f68d56/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:07:00 compute-0 nova_compute[187185]: 2025-11-29 07:07:00.567 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b461587-d91f-4a59-a05a-9a2ae89cfacd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:07:00 compute-0 nova_compute[187185]: 2025-11-29 07:07:00.633 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b461587-d91f-4a59-a05a-9a2ae89cfacd/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:07:00 compute-0 nova_compute[187185]: 2025-11-29 07:07:00.634 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b461587-d91f-4a59-a05a-9a2ae89cfacd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:07:00 compute-0 nova_compute[187185]: 2025-11-29 07:07:00.701 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b461587-d91f-4a59-a05a-9a2ae89cfacd/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:07:00 compute-0 nova_compute[187185]: 2025-11-29 07:07:00.890 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:07:00 compute-0 nova_compute[187185]: 2025-11-29 07:07:00.893 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5428MB free_disk=73.3013687133789GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:07:00 compute-0 nova_compute[187185]: 2025-11-29 07:07:00.894 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:07:00 compute-0 nova_compute[187185]: 2025-11-29 07:07:00.894 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:07:00 compute-0 nova_compute[187185]: 2025-11-29 07:07:00.984 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance 02b37f0c-3272-417b-9791-48b555f68d56 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 07:07:00 compute-0 nova_compute[187185]: 2025-11-29 07:07:00.985 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance 3b461587-d91f-4a59-a05a-9a2ae89cfacd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 07:07:00 compute-0 nova_compute[187185]: 2025-11-29 07:07:00.985 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:07:00 compute-0 nova_compute[187185]: 2025-11-29 07:07:00.986 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:07:01 compute-0 nova_compute[187185]: 2025-11-29 07:07:01.012 187189 DEBUG oslo_concurrency.lockutils [None req-c956b29a-8182-4be0-8f03-88dfc759751c f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "02b37f0c-3272-417b-9791-48b555f68d56" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:07:01 compute-0 nova_compute[187185]: 2025-11-29 07:07:01.012 187189 DEBUG oslo_concurrency.lockutils [None req-c956b29a-8182-4be0-8f03-88dfc759751c f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "02b37f0c-3272-417b-9791-48b555f68d56" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:07:01 compute-0 nova_compute[187185]: 2025-11-29 07:07:01.013 187189 DEBUG oslo_concurrency.lockutils [None req-c956b29a-8182-4be0-8f03-88dfc759751c f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "02b37f0c-3272-417b-9791-48b555f68d56-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:07:01 compute-0 nova_compute[187185]: 2025-11-29 07:07:01.013 187189 DEBUG oslo_concurrency.lockutils [None req-c956b29a-8182-4be0-8f03-88dfc759751c f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "02b37f0c-3272-417b-9791-48b555f68d56-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:07:01 compute-0 nova_compute[187185]: 2025-11-29 07:07:01.013 187189 DEBUG oslo_concurrency.lockutils [None req-c956b29a-8182-4be0-8f03-88dfc759751c f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "02b37f0c-3272-417b-9791-48b555f68d56-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:07:01 compute-0 nova_compute[187185]: 2025-11-29 07:07:01.028 187189 INFO nova.compute.manager [None req-c956b29a-8182-4be0-8f03-88dfc759751c f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Terminating instance
Nov 29 07:07:01 compute-0 nova_compute[187185]: 2025-11-29 07:07:01.041 187189 DEBUG nova.compute.manager [None req-c956b29a-8182-4be0-8f03-88dfc759751c f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 07:07:01 compute-0 nova_compute[187185]: 2025-11-29 07:07:01.068 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:07:01 compute-0 nova_compute[187185]: 2025-11-29 07:07:01.089 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:07:01 compute-0 nova_compute[187185]: 2025-11-29 07:07:01.118 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:07:01 compute-0 nova_compute[187185]: 2025-11-29 07:07:01.119 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.225s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:07:01 compute-0 nova_compute[187185]: 2025-11-29 07:07:01.253 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:01 compute-0 kernel: tap3978dee0-d3 (unregistering): left promiscuous mode
Nov 29 07:07:01 compute-0 NetworkManager[55227]: <info>  [1764400021.7509] device (tap3978dee0-d3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:07:01 compute-0 ovn_controller[95281]: 2025-11-29T07:07:01Z|00172|binding|INFO|Releasing lport 3978dee0-d304-43a8-9478-68840d581d9b from this chassis (sb_readonly=0)
Nov 29 07:07:01 compute-0 ovn_controller[95281]: 2025-11-29T07:07:01Z|00173|binding|INFO|Setting lport 3978dee0-d304-43a8-9478-68840d581d9b down in Southbound
Nov 29 07:07:01 compute-0 ovn_controller[95281]: 2025-11-29T07:07:01Z|00174|binding|INFO|Removing iface tap3978dee0-d3 ovn-installed in OVS
Nov 29 07:07:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:01.790 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c9:dd:dd 10.100.0.11'], port_security=['fa:16:3e:c9:dd:dd 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '02b37f0c-3272-417b-9791-48b555f68d56', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9cf3a513-f54e-430e-b018-befaa643b464', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fc8ab121-ee69-4ab4-9a39-25b26b293132', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a3fd0639-e84a-4389-a7f3-f9ac2c360b5e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=3978dee0-d304-43a8-9478-68840d581d9b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:07:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:01.792 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 3978dee0-d304-43a8-9478-68840d581d9b in datapath 9cf3a513-f54e-430e-b018-befaa643b464 unbound from our chassis
Nov 29 07:07:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:01.794 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9cf3a513-f54e-430e-b018-befaa643b464, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:07:01 compute-0 nova_compute[187185]: 2025-11-29 07:07:01.794 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:01 compute-0 nova_compute[187185]: 2025-11-29 07:07:01.797 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:01.796 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[db00c9bd-b14e-4ee6-a6a7-dfd83a245434]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:07:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:01.797 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464 namespace which is not needed anymore
Nov 29 07:07:01 compute-0 nova_compute[187185]: 2025-11-29 07:07:01.807 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:01 compute-0 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d00000040.scope: Deactivated successfully.
Nov 29 07:07:01 compute-0 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d00000040.scope: Consumed 22.340s CPU time.
Nov 29 07:07:01 compute-0 systemd-machined[153486]: Machine qemu-23-instance-00000040 terminated.
Nov 29 07:07:01 compute-0 nova_compute[187185]: 2025-11-29 07:07:01.871 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:01 compute-0 nova_compute[187185]: 2025-11-29 07:07:01.880 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:01 compute-0 podman[223991]: 2025-11-29 07:07:01.907137944 +0000 UTC m=+0.078643062 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 07:07:01 compute-0 nova_compute[187185]: 2025-11-29 07:07:01.927 187189 INFO nova.virt.libvirt.driver [-] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Instance destroyed successfully.
Nov 29 07:07:01 compute-0 nova_compute[187185]: 2025-11-29 07:07:01.927 187189 DEBUG nova.objects.instance [None req-c956b29a-8182-4be0-8f03-88dfc759751c f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lazy-loading 'resources' on Instance uuid 02b37f0c-3272-417b-9791-48b555f68d56 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:07:01 compute-0 nova_compute[187185]: 2025-11-29 07:07:01.947 187189 DEBUG nova.virt.libvirt.vif [None req-c956b29a-8182-4be0-8f03-88dfc759751c f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:03:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-₡-1325169813',display_name='tempest-₡-1325169813',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest--1325169813',id=64,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:03:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1dba9539037a4e9dbf33cba140fe21fe',ramdisk_id='',reservation_id='r-5w7ovqmq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-373958708',owner_user_name='tempest-ServersTestJSON-373958708-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:03:47Z,user_data=None,user_id='f2f86d3bd4814a09966b869dd539a6c9',uuid=02b37f0c-3272-417b-9791-48b555f68d56,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3978dee0-d304-43a8-9478-68840d581d9b", "address": "fa:16:3e:c9:dd:dd", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3978dee0-d3", "ovs_interfaceid": "3978dee0-d304-43a8-9478-68840d581d9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:07:01 compute-0 nova_compute[187185]: 2025-11-29 07:07:01.947 187189 DEBUG nova.network.os_vif_util [None req-c956b29a-8182-4be0-8f03-88dfc759751c f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Converting VIF {"id": "3978dee0-d304-43a8-9478-68840d581d9b", "address": "fa:16:3e:c9:dd:dd", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3978dee0-d3", "ovs_interfaceid": "3978dee0-d304-43a8-9478-68840d581d9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:07:01 compute-0 nova_compute[187185]: 2025-11-29 07:07:01.948 187189 DEBUG nova.network.os_vif_util [None req-c956b29a-8182-4be0-8f03-88dfc759751c f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c9:dd:dd,bridge_name='br-int',has_traffic_filtering=True,id=3978dee0-d304-43a8-9478-68840d581d9b,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3978dee0-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:07:01 compute-0 nova_compute[187185]: 2025-11-29 07:07:01.949 187189 DEBUG os_vif [None req-c956b29a-8182-4be0-8f03-88dfc759751c f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c9:dd:dd,bridge_name='br-int',has_traffic_filtering=True,id=3978dee0-d304-43a8-9478-68840d581d9b,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3978dee0-d3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:07:01 compute-0 nova_compute[187185]: 2025-11-29 07:07:01.952 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:01 compute-0 nova_compute[187185]: 2025-11-29 07:07:01.952 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3978dee0-d3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:07:01 compute-0 nova_compute[187185]: 2025-11-29 07:07:01.956 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:07:01 compute-0 nova_compute[187185]: 2025-11-29 07:07:01.957 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:01 compute-0 nova_compute[187185]: 2025-11-29 07:07:01.960 187189 INFO os_vif [None req-c956b29a-8182-4be0-8f03-88dfc759751c f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c9:dd:dd,bridge_name='br-int',has_traffic_filtering=True,id=3978dee0-d304-43a8-9478-68840d581d9b,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3978dee0-d3')
Nov 29 07:07:01 compute-0 nova_compute[187185]: 2025-11-29 07:07:01.961 187189 INFO nova.virt.libvirt.driver [None req-c956b29a-8182-4be0-8f03-88dfc759751c f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Deleting instance files /var/lib/nova/instances/02b37f0c-3272-417b-9791-48b555f68d56_del
Nov 29 07:07:01 compute-0 nova_compute[187185]: 2025-11-29 07:07:01.963 187189 INFO nova.virt.libvirt.driver [None req-c956b29a-8182-4be0-8f03-88dfc759751c f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Deletion of /var/lib/nova/instances/02b37f0c-3272-417b-9791-48b555f68d56_del complete
Nov 29 07:07:02 compute-0 nova_compute[187185]: 2025-11-29 07:07:02.044 187189 INFO nova.compute.manager [None req-c956b29a-8182-4be0-8f03-88dfc759751c f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Took 1.00 seconds to destroy the instance on the hypervisor.
Nov 29 07:07:02 compute-0 nova_compute[187185]: 2025-11-29 07:07:02.045 187189 DEBUG oslo.service.loopingcall [None req-c956b29a-8182-4be0-8f03-88dfc759751c f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 07:07:02 compute-0 nova_compute[187185]: 2025-11-29 07:07:02.045 187189 DEBUG nova.compute.manager [-] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 07:07:02 compute-0 nova_compute[187185]: 2025-11-29 07:07:02.045 187189 DEBUG nova.network.neutron [-] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 07:07:02 compute-0 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[222641]: [NOTICE]   (222645) : haproxy version is 2.8.14-c23fe91
Nov 29 07:07:02 compute-0 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[222641]: [NOTICE]   (222645) : path to executable is /usr/sbin/haproxy
Nov 29 07:07:02 compute-0 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[222641]: [WARNING]  (222645) : Exiting Master process...
Nov 29 07:07:02 compute-0 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[222641]: [ALERT]    (222645) : Current worker (222647) exited with code 143 (Terminated)
Nov 29 07:07:02 compute-0 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[222641]: [WARNING]  (222645) : All workers exited. Exiting... (0)
Nov 29 07:07:02 compute-0 systemd[1]: libpod-0dfe199bffa38d5ab97d9d4d6a0d9f7a22d102cb1581c02d5eaa39c112b65390.scope: Deactivated successfully.
Nov 29 07:07:02 compute-0 podman[224035]: 2025-11-29 07:07:02.776189907 +0000 UTC m=+0.860588335 container died 0dfe199bffa38d5ab97d9d4d6a0d9f7a22d102cb1581c02d5eaa39c112b65390 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 07:07:03 compute-0 systemd[1]: var-lib-containers-storage-overlay-f2219bc9b7c6d7940c935ac1ce97f304a66274a918469f3454d0d503d892dd54-merged.mount: Deactivated successfully.
Nov 29 07:07:03 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0dfe199bffa38d5ab97d9d4d6a0d9f7a22d102cb1581c02d5eaa39c112b65390-userdata-shm.mount: Deactivated successfully.
Nov 29 07:07:03 compute-0 podman[224035]: 2025-11-29 07:07:03.064111407 +0000 UTC m=+1.148509835 container cleanup 0dfe199bffa38d5ab97d9d4d6a0d9f7a22d102cb1581c02d5eaa39c112b65390 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:07:03 compute-0 systemd[1]: libpod-conmon-0dfe199bffa38d5ab97d9d4d6a0d9f7a22d102cb1581c02d5eaa39c112b65390.scope: Deactivated successfully.
Nov 29 07:07:03 compute-0 nova_compute[187185]: 2025-11-29 07:07:03.090 187189 DEBUG nova.network.neutron [-] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:07:03 compute-0 nova_compute[187185]: 2025-11-29 07:07:03.116 187189 INFO nova.compute.manager [-] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Took 1.07 seconds to deallocate network for instance.
Nov 29 07:07:03 compute-0 podman[224084]: 2025-11-29 07:07:03.165666783 +0000 UTC m=+0.075893018 container remove 0dfe199bffa38d5ab97d9d4d6a0d9f7a22d102cb1581c02d5eaa39c112b65390 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:07:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:03.172 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[adb87d76-0f2b-4255-979d-3afc38faf942]: (4, ('Sat Nov 29 07:07:01 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464 (0dfe199bffa38d5ab97d9d4d6a0d9f7a22d102cb1581c02d5eaa39c112b65390)\n0dfe199bffa38d5ab97d9d4d6a0d9f7a22d102cb1581c02d5eaa39c112b65390\nSat Nov 29 07:07:03 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464 (0dfe199bffa38d5ab97d9d4d6a0d9f7a22d102cb1581c02d5eaa39c112b65390)\n0dfe199bffa38d5ab97d9d4d6a0d9f7a22d102cb1581c02d5eaa39c112b65390\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:07:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:03.174 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6102b1a1-b6ff-4983-92f0-e52cff0b0ce5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:07:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:03.175 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9cf3a513-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:07:03 compute-0 nova_compute[187185]: 2025-11-29 07:07:03.177 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:03 compute-0 kernel: tap9cf3a513-f0: left promiscuous mode
Nov 29 07:07:03 compute-0 nova_compute[187185]: 2025-11-29 07:07:03.191 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:03.194 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[99e1e3d5-b457-46c2-b300-e8f907b1f5c9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:07:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:03.214 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[5f4c627c-f65d-4740-876b-db81b35acacd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:07:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:03.215 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[f2b601f0-7c26-4c4d-9473-0421684a0fcc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:07:03 compute-0 nova_compute[187185]: 2025-11-29 07:07:03.219 187189 DEBUG oslo_concurrency.lockutils [None req-c956b29a-8182-4be0-8f03-88dfc759751c f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:07:03 compute-0 nova_compute[187185]: 2025-11-29 07:07:03.220 187189 DEBUG oslo_concurrency.lockutils [None req-c956b29a-8182-4be0-8f03-88dfc759751c f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:07:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:03.232 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6f85e313-8d5c-40d0-babc-d6bff4f25c0d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 527354, 'reachable_time': 31344, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224107, 'error': None, 'target': 'ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:07:03 compute-0 systemd[1]: run-netns-ovnmeta\x2d9cf3a513\x2df54e\x2d430e\x2db018\x2dbefaa643b464.mount: Deactivated successfully.
Nov 29 07:07:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:03.237 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:07:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:03.237 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[a9c7b9b1-66fe-45e7-9a6c-dda1b23c9905]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:07:03 compute-0 nova_compute[187185]: 2025-11-29 07:07:03.252 187189 DEBUG nova.compute.manager [req-2c1e96e0-6c49-49bb-a715-8ebc1a0dc660 req-069de949-1685-4036-b955-bbb4eb5b7992 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Received event network-vif-deleted-3978dee0-d304-43a8-9478-68840d581d9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:07:03 compute-0 nova_compute[187185]: 2025-11-29 07:07:03.256 187189 DEBUG nova.compute.manager [req-234e6dc6-e17c-43f9-ae0b-e50103f43c54 req-cfaa6e1e-b1f7-4e1f-bf93-af22440021f5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Received event network-vif-unplugged-3978dee0-d304-43a8-9478-68840d581d9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:07:03 compute-0 nova_compute[187185]: 2025-11-29 07:07:03.256 187189 DEBUG oslo_concurrency.lockutils [req-234e6dc6-e17c-43f9-ae0b-e50103f43c54 req-cfaa6e1e-b1f7-4e1f-bf93-af22440021f5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "02b37f0c-3272-417b-9791-48b555f68d56-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:07:03 compute-0 nova_compute[187185]: 2025-11-29 07:07:03.256 187189 DEBUG oslo_concurrency.lockutils [req-234e6dc6-e17c-43f9-ae0b-e50103f43c54 req-cfaa6e1e-b1f7-4e1f-bf93-af22440021f5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "02b37f0c-3272-417b-9791-48b555f68d56-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:07:03 compute-0 nova_compute[187185]: 2025-11-29 07:07:03.256 187189 DEBUG oslo_concurrency.lockutils [req-234e6dc6-e17c-43f9-ae0b-e50103f43c54 req-cfaa6e1e-b1f7-4e1f-bf93-af22440021f5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "02b37f0c-3272-417b-9791-48b555f68d56-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:07:03 compute-0 nova_compute[187185]: 2025-11-29 07:07:03.256 187189 DEBUG nova.compute.manager [req-234e6dc6-e17c-43f9-ae0b-e50103f43c54 req-cfaa6e1e-b1f7-4e1f-bf93-af22440021f5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] No waiting events found dispatching network-vif-unplugged-3978dee0-d304-43a8-9478-68840d581d9b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:07:03 compute-0 nova_compute[187185]: 2025-11-29 07:07:03.256 187189 WARNING nova.compute.manager [req-234e6dc6-e17c-43f9-ae0b-e50103f43c54 req-cfaa6e1e-b1f7-4e1f-bf93-af22440021f5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Received unexpected event network-vif-unplugged-3978dee0-d304-43a8-9478-68840d581d9b for instance with vm_state deleted and task_state None.
Nov 29 07:07:03 compute-0 nova_compute[187185]: 2025-11-29 07:07:03.284 187189 DEBUG nova.compute.provider_tree [None req-c956b29a-8182-4be0-8f03-88dfc759751c f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:07:03 compute-0 nova_compute[187185]: 2025-11-29 07:07:03.298 187189 DEBUG nova.scheduler.client.report [None req-c956b29a-8182-4be0-8f03-88dfc759751c f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:07:03 compute-0 nova_compute[187185]: 2025-11-29 07:07:03.338 187189 DEBUG oslo_concurrency.lockutils [None req-c956b29a-8182-4be0-8f03-88dfc759751c f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:07:03 compute-0 nova_compute[187185]: 2025-11-29 07:07:03.362 187189 INFO nova.scheduler.client.report [None req-c956b29a-8182-4be0-8f03-88dfc759751c f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Deleted allocations for instance 02b37f0c-3272-417b-9791-48b555f68d56
Nov 29 07:07:03 compute-0 nova_compute[187185]: 2025-11-29 07:07:03.482 187189 DEBUG oslo_concurrency.lockutils [None req-c956b29a-8182-4be0-8f03-88dfc759751c f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "02b37f0c-3272-417b-9791-48b555f68d56" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.470s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:07:04 compute-0 nova_compute[187185]: 2025-11-29 07:07:04.544 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:04 compute-0 nova_compute[187185]: 2025-11-29 07:07:04.571 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:04 compute-0 ovn_controller[95281]: 2025-11-29T07:07:04Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:80:46:c1 10.100.0.6
Nov 29 07:07:04 compute-0 ovn_controller[95281]: 2025-11-29T07:07:04Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:80:46:c1 10.100.0.6
Nov 29 07:07:04 compute-0 podman[224108]: 2025-11-29 07:07:04.819168047 +0000 UTC m=+0.078184412 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, 
managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 07:07:05 compute-0 nova_compute[187185]: 2025-11-29 07:07:05.284 187189 DEBUG nova.compute.manager [req-345341ff-a782-4419-9286-ce210da38581 req-c5a341af-b9ad-4d45-a469-06284ded10f8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Received event network-changed-cb9181dd-01c0-409c-8102-a94217bf40a8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:07:05 compute-0 nova_compute[187185]: 2025-11-29 07:07:05.285 187189 DEBUG nova.compute.manager [req-345341ff-a782-4419-9286-ce210da38581 req-c5a341af-b9ad-4d45-a469-06284ded10f8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Refreshing instance network info cache due to event network-changed-cb9181dd-01c0-409c-8102-a94217bf40a8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:07:05 compute-0 nova_compute[187185]: 2025-11-29 07:07:05.285 187189 DEBUG oslo_concurrency.lockutils [req-345341ff-a782-4419-9286-ce210da38581 req-c5a341af-b9ad-4d45-a469-06284ded10f8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-3b461587-d91f-4a59-a05a-9a2ae89cfacd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:07:05 compute-0 nova_compute[187185]: 2025-11-29 07:07:05.285 187189 DEBUG oslo_concurrency.lockutils [req-345341ff-a782-4419-9286-ce210da38581 req-c5a341af-b9ad-4d45-a469-06284ded10f8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-3b461587-d91f-4a59-a05a-9a2ae89cfacd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:07:05 compute-0 nova_compute[187185]: 2025-11-29 07:07:05.285 187189 DEBUG nova.network.neutron [req-345341ff-a782-4419-9286-ce210da38581 req-c5a341af-b9ad-4d45-a469-06284ded10f8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Refreshing network info cache for port cb9181dd-01c0-409c-8102-a94217bf40a8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:07:05 compute-0 nova_compute[187185]: 2025-11-29 07:07:05.368 187189 DEBUG nova.compute.manager [req-f83aab89-8f4a-4825-aa34-7b35b0771163 req-eada4cf0-d32d-4cfa-8e8b-1fdb016f7acb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Received event network-vif-plugged-3978dee0-d304-43a8-9478-68840d581d9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:07:05 compute-0 nova_compute[187185]: 2025-11-29 07:07:05.368 187189 DEBUG oslo_concurrency.lockutils [req-f83aab89-8f4a-4825-aa34-7b35b0771163 req-eada4cf0-d32d-4cfa-8e8b-1fdb016f7acb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "02b37f0c-3272-417b-9791-48b555f68d56-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:07:05 compute-0 nova_compute[187185]: 2025-11-29 07:07:05.369 187189 DEBUG oslo_concurrency.lockutils [req-f83aab89-8f4a-4825-aa34-7b35b0771163 req-eada4cf0-d32d-4cfa-8e8b-1fdb016f7acb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "02b37f0c-3272-417b-9791-48b555f68d56-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:07:05 compute-0 nova_compute[187185]: 2025-11-29 07:07:05.369 187189 DEBUG oslo_concurrency.lockutils [req-f83aab89-8f4a-4825-aa34-7b35b0771163 req-eada4cf0-d32d-4cfa-8e8b-1fdb016f7acb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "02b37f0c-3272-417b-9791-48b555f68d56-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:07:05 compute-0 nova_compute[187185]: 2025-11-29 07:07:05.369 187189 DEBUG nova.compute.manager [req-f83aab89-8f4a-4825-aa34-7b35b0771163 req-eada4cf0-d32d-4cfa-8e8b-1fdb016f7acb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] No waiting events found dispatching network-vif-plugged-3978dee0-d304-43a8-9478-68840d581d9b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:07:05 compute-0 nova_compute[187185]: 2025-11-29 07:07:05.369 187189 WARNING nova.compute.manager [req-f83aab89-8f4a-4825-aa34-7b35b0771163 req-eada4cf0-d32d-4cfa-8e8b-1fdb016f7acb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Received unexpected event network-vif-plugged-3978dee0-d304-43a8-9478-68840d581d9b for instance with vm_state deleted and task_state None.
Nov 29 07:07:06 compute-0 nova_compute[187185]: 2025-11-29 07:07:06.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:07:06 compute-0 podman[224128]: 2025-11-29 07:07:06.831782714 +0000 UTC m=+0.080442815 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 07:07:06 compute-0 nova_compute[187185]: 2025-11-29 07:07:06.956 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:07 compute-0 nova_compute[187185]: 2025-11-29 07:07:07.727 187189 DEBUG nova.network.neutron [req-345341ff-a782-4419-9286-ce210da38581 req-c5a341af-b9ad-4d45-a469-06284ded10f8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Updated VIF entry in instance network info cache for port cb9181dd-01c0-409c-8102-a94217bf40a8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:07:07 compute-0 nova_compute[187185]: 2025-11-29 07:07:07.728 187189 DEBUG nova.network.neutron [req-345341ff-a782-4419-9286-ce210da38581 req-c5a341af-b9ad-4d45-a469-06284ded10f8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Updating instance_info_cache with network_info: [{"id": "cb9181dd-01c0-409c-8102-a94217bf40a8", "address": "fa:16:3e:80:46:c1", "network": {"id": "ec48bfb6-9f16-422e-b7dc-422060e71ca2", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-39474949-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eee8c6e871b948c9bff0d4ee4267ba78", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcb9181dd-01", "ovs_interfaceid": "cb9181dd-01c0-409c-8102-a94217bf40a8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:07:07 compute-0 nova_compute[187185]: 2025-11-29 07:07:07.743 187189 DEBUG oslo_concurrency.lockutils [req-345341ff-a782-4419-9286-ce210da38581 req-c5a341af-b9ad-4d45-a469-06284ded10f8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-3b461587-d91f-4a59-a05a-9a2ae89cfacd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:07:07 compute-0 nova_compute[187185]: 2025-11-29 07:07:07.773 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:08 compute-0 ovn_controller[95281]: 2025-11-29T07:07:08Z|00175|binding|INFO|Releasing lport cd784eba-7471-412f-94a4-89c319d0e878 from this chassis (sb_readonly=0)
Nov 29 07:07:08 compute-0 nova_compute[187185]: 2025-11-29 07:07:08.537 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:08 compute-0 ovn_controller[95281]: 2025-11-29T07:07:08Z|00176|binding|INFO|Releasing lport cd784eba-7471-412f-94a4-89c319d0e878 from this chassis (sb_readonly=0)
Nov 29 07:07:08 compute-0 nova_compute[187185]: 2025-11-29 07:07:08.715 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:09 compute-0 nova_compute[187185]: 2025-11-29 07:07:09.547 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:11 compute-0 nova_compute[187185]: 2025-11-29 07:07:11.306 187189 DEBUG oslo_concurrency.lockutils [None req-53f3cc61-14fa-476f-aabc-85439533699d 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Acquiring lock "3b461587-d91f-4a59-a05a-9a2ae89cfacd" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:07:11 compute-0 nova_compute[187185]: 2025-11-29 07:07:11.307 187189 DEBUG oslo_concurrency.lockutils [None req-53f3cc61-14fa-476f-aabc-85439533699d 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Lock "3b461587-d91f-4a59-a05a-9a2ae89cfacd" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:07:11 compute-0 nova_compute[187185]: 2025-11-29 07:07:11.308 187189 DEBUG oslo_concurrency.lockutils [None req-53f3cc61-14fa-476f-aabc-85439533699d 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Acquiring lock "3b461587-d91f-4a59-a05a-9a2ae89cfacd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:07:11 compute-0 nova_compute[187185]: 2025-11-29 07:07:11.308 187189 DEBUG oslo_concurrency.lockutils [None req-53f3cc61-14fa-476f-aabc-85439533699d 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Lock "3b461587-d91f-4a59-a05a-9a2ae89cfacd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:07:11 compute-0 nova_compute[187185]: 2025-11-29 07:07:11.308 187189 DEBUG oslo_concurrency.lockutils [None req-53f3cc61-14fa-476f-aabc-85439533699d 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Lock "3b461587-d91f-4a59-a05a-9a2ae89cfacd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:07:11 compute-0 nova_compute[187185]: 2025-11-29 07:07:11.324 187189 INFO nova.compute.manager [None req-53f3cc61-14fa-476f-aabc-85439533699d 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Terminating instance
Nov 29 07:07:11 compute-0 nova_compute[187185]: 2025-11-29 07:07:11.333 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:07:11 compute-0 nova_compute[187185]: 2025-11-29 07:07:11.334 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 29 07:07:11 compute-0 nova_compute[187185]: 2025-11-29 07:07:11.349 187189 DEBUG nova.compute.manager [None req-53f3cc61-14fa-476f-aabc-85439533699d 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 07:07:11 compute-0 kernel: tapcb9181dd-01 (unregistering): left promiscuous mode
Nov 29 07:07:11 compute-0 NetworkManager[55227]: <info>  [1764400031.3803] device (tapcb9181dd-01): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:07:11 compute-0 nova_compute[187185]: 2025-11-29 07:07:11.398 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:11 compute-0 ovn_controller[95281]: 2025-11-29T07:07:11Z|00177|binding|INFO|Releasing lport cb9181dd-01c0-409c-8102-a94217bf40a8 from this chassis (sb_readonly=0)
Nov 29 07:07:11 compute-0 ovn_controller[95281]: 2025-11-29T07:07:11Z|00178|binding|INFO|Setting lport cb9181dd-01c0-409c-8102-a94217bf40a8 down in Southbound
Nov 29 07:07:11 compute-0 ovn_controller[95281]: 2025-11-29T07:07:11Z|00179|binding|INFO|Removing iface tapcb9181dd-01 ovn-installed in OVS
Nov 29 07:07:11 compute-0 nova_compute[187185]: 2025-11-29 07:07:11.401 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:11 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:11.409 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:80:46:c1 10.100.0.6'], port_security=['fa:16:3e:80:46:c1 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '3b461587-d91f-4a59-a05a-9a2ae89cfacd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ec48bfb6-9f16-422e-b7dc-422060e71ca2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'eee8c6e871b948c9bff0d4ee4267ba78', 'neutron:revision_number': '4', 'neutron:security_group_ids': '75187edb-4718-43a2-a8b6-d9040cc77fea', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=827861e9-4eef-4c9a-a62c-b7772140c169, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=cb9181dd-01c0-409c-8102-a94217bf40a8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:07:11 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:11.410 104254 INFO neutron.agent.ovn.metadata.agent [-] Port cb9181dd-01c0-409c-8102-a94217bf40a8 in datapath ec48bfb6-9f16-422e-b7dc-422060e71ca2 unbound from our chassis
Nov 29 07:07:11 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:11.411 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ec48bfb6-9f16-422e-b7dc-422060e71ca2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:07:11 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:11.412 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[3605e2f6-647d-4b63-bc88-ea1c027611fe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:07:11 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:11.413 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ec48bfb6-9f16-422e-b7dc-422060e71ca2 namespace which is not needed anymore
Nov 29 07:07:11 compute-0 nova_compute[187185]: 2025-11-29 07:07:11.416 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:11 compute-0 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d0000004f.scope: Deactivated successfully.
Nov 29 07:07:11 compute-0 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d0000004f.scope: Consumed 13.910s CPU time.
Nov 29 07:07:11 compute-0 systemd-machined[153486]: Machine qemu-25-instance-0000004f terminated.
Nov 29 07:07:11 compute-0 neutron-haproxy-ovnmeta-ec48bfb6-9f16-422e-b7dc-422060e71ca2[223931]: [NOTICE]   (223935) : haproxy version is 2.8.14-c23fe91
Nov 29 07:07:11 compute-0 neutron-haproxy-ovnmeta-ec48bfb6-9f16-422e-b7dc-422060e71ca2[223931]: [NOTICE]   (223935) : path to executable is /usr/sbin/haproxy
Nov 29 07:07:11 compute-0 neutron-haproxy-ovnmeta-ec48bfb6-9f16-422e-b7dc-422060e71ca2[223931]: [WARNING]  (223935) : Exiting Master process...
Nov 29 07:07:11 compute-0 neutron-haproxy-ovnmeta-ec48bfb6-9f16-422e-b7dc-422060e71ca2[223931]: [ALERT]    (223935) : Current worker (223937) exited with code 143 (Terminated)
Nov 29 07:07:11 compute-0 neutron-haproxy-ovnmeta-ec48bfb6-9f16-422e-b7dc-422060e71ca2[223931]: [WARNING]  (223935) : All workers exited. Exiting... (0)
Nov 29 07:07:11 compute-0 systemd[1]: libpod-548cb25cb3b313e9192b4efe26a077da1f43bddf9fa8b430d14e77eb8e34d50e.scope: Deactivated successfully.
Nov 29 07:07:11 compute-0 podman[224173]: 2025-11-29 07:07:11.627512897 +0000 UTC m=+0.122846494 container died 548cb25cb3b313e9192b4efe26a077da1f43bddf9fa8b430d14e77eb8e34d50e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ec48bfb6-9f16-422e-b7dc-422060e71ca2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:07:11 compute-0 nova_compute[187185]: 2025-11-29 07:07:11.640 187189 INFO nova.virt.libvirt.driver [-] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Instance destroyed successfully.
Nov 29 07:07:11 compute-0 nova_compute[187185]: 2025-11-29 07:07:11.640 187189 DEBUG nova.objects.instance [None req-53f3cc61-14fa-476f-aabc-85439533699d 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Lazy-loading 'resources' on Instance uuid 3b461587-d91f-4a59-a05a-9a2ae89cfacd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:07:11 compute-0 systemd[1]: var-lib-containers-storage-overlay-af7236fcd411afb3ea3808ba516c7b2183bb969b079cd3b844a4e8182c43df9c-merged.mount: Deactivated successfully.
Nov 29 07:07:11 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-548cb25cb3b313e9192b4efe26a077da1f43bddf9fa8b430d14e77eb8e34d50e-userdata-shm.mount: Deactivated successfully.
Nov 29 07:07:11 compute-0 nova_compute[187185]: 2025-11-29 07:07:11.664 187189 DEBUG nova.virt.libvirt.vif [None req-53f3cc61-14fa-476f-aabc-85439533699d 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:06:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-FloatingIPsAssociationNegativeTestJSON-server-314359009',display_name='tempest-FloatingIPsAssociationNegativeTestJSON-server-314359009',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationnegativetestjson-server-314359009',id=79,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:06:49Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='eee8c6e871b948c9bff0d4ee4267ba78',ramdisk_id='',reservation_id='r-lcsa4a03',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIPsAssociationNegativeTestJSON-1992151889',owner_user_name='tempest-FloatingIPsAssociationNegativeTestJSON-1992151889-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:06:49Z,user_data=None,user_id='6e3923c949d649889fe9a955a8f5cff8',uuid=3b461587-d91f-4a59-a05a-9a2ae89cfacd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cb9181dd-01c0-409c-8102-a94217bf40a8", "address": "fa:16:3e:80:46:c1", "network": {"id": "ec48bfb6-9f16-422e-b7dc-422060e71ca2", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-39474949-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eee8c6e871b948c9bff0d4ee4267ba78", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcb9181dd-01", "ovs_interfaceid": "cb9181dd-01c0-409c-8102-a94217bf40a8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:07:11 compute-0 nova_compute[187185]: 2025-11-29 07:07:11.664 187189 DEBUG nova.network.os_vif_util [None req-53f3cc61-14fa-476f-aabc-85439533699d 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Converting VIF {"id": "cb9181dd-01c0-409c-8102-a94217bf40a8", "address": "fa:16:3e:80:46:c1", "network": {"id": "ec48bfb6-9f16-422e-b7dc-422060e71ca2", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-39474949-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eee8c6e871b948c9bff0d4ee4267ba78", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcb9181dd-01", "ovs_interfaceid": "cb9181dd-01c0-409c-8102-a94217bf40a8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:07:11 compute-0 nova_compute[187185]: 2025-11-29 07:07:11.665 187189 DEBUG nova.network.os_vif_util [None req-53f3cc61-14fa-476f-aabc-85439533699d 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:80:46:c1,bridge_name='br-int',has_traffic_filtering=True,id=cb9181dd-01c0-409c-8102-a94217bf40a8,network=Network(ec48bfb6-9f16-422e-b7dc-422060e71ca2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcb9181dd-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:07:11 compute-0 nova_compute[187185]: 2025-11-29 07:07:11.665 187189 DEBUG os_vif [None req-53f3cc61-14fa-476f-aabc-85439533699d 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:80:46:c1,bridge_name='br-int',has_traffic_filtering=True,id=cb9181dd-01c0-409c-8102-a94217bf40a8,network=Network(ec48bfb6-9f16-422e-b7dc-422060e71ca2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcb9181dd-01') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:07:11 compute-0 nova_compute[187185]: 2025-11-29 07:07:11.668 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:11 compute-0 podman[224173]: 2025-11-29 07:07:11.66830054 +0000 UTC m=+0.163634127 container cleanup 548cb25cb3b313e9192b4efe26a077da1f43bddf9fa8b430d14e77eb8e34d50e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ec48bfb6-9f16-422e-b7dc-422060e71ca2, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:07:11 compute-0 nova_compute[187185]: 2025-11-29 07:07:11.668 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcb9181dd-01, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:07:11 compute-0 nova_compute[187185]: 2025-11-29 07:07:11.670 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:11 compute-0 nova_compute[187185]: 2025-11-29 07:07:11.671 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:11 compute-0 nova_compute[187185]: 2025-11-29 07:07:11.673 187189 INFO os_vif [None req-53f3cc61-14fa-476f-aabc-85439533699d 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:80:46:c1,bridge_name='br-int',has_traffic_filtering=True,id=cb9181dd-01c0-409c-8102-a94217bf40a8,network=Network(ec48bfb6-9f16-422e-b7dc-422060e71ca2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcb9181dd-01')
Nov 29 07:07:11 compute-0 nova_compute[187185]: 2025-11-29 07:07:11.674 187189 INFO nova.virt.libvirt.driver [None req-53f3cc61-14fa-476f-aabc-85439533699d 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Deleting instance files /var/lib/nova/instances/3b461587-d91f-4a59-a05a-9a2ae89cfacd_del
Nov 29 07:07:11 compute-0 systemd[1]: libpod-conmon-548cb25cb3b313e9192b4efe26a077da1f43bddf9fa8b430d14e77eb8e34d50e.scope: Deactivated successfully.
Nov 29 07:07:11 compute-0 nova_compute[187185]: 2025-11-29 07:07:11.674 187189 INFO nova.virt.libvirt.driver [None req-53f3cc61-14fa-476f-aabc-85439533699d 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Deletion of /var/lib/nova/instances/3b461587-d91f-4a59-a05a-9a2ae89cfacd_del complete
Nov 29 07:07:11 compute-0 podman[224218]: 2025-11-29 07:07:11.755312989 +0000 UTC m=+0.062279027 container remove 548cb25cb3b313e9192b4efe26a077da1f43bddf9fa8b430d14e77eb8e34d50e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ec48bfb6-9f16-422e-b7dc-422060e71ca2, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:07:11 compute-0 nova_compute[187185]: 2025-11-29 07:07:11.759 187189 INFO nova.compute.manager [None req-53f3cc61-14fa-476f-aabc-85439533699d 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Took 0.41 seconds to destroy the instance on the hypervisor.
Nov 29 07:07:11 compute-0 nova_compute[187185]: 2025-11-29 07:07:11.760 187189 DEBUG oslo.service.loopingcall [None req-53f3cc61-14fa-476f-aabc-85439533699d 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 07:07:11 compute-0 nova_compute[187185]: 2025-11-29 07:07:11.760 187189 DEBUG nova.compute.manager [-] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 07:07:11 compute-0 nova_compute[187185]: 2025-11-29 07:07:11.760 187189 DEBUG nova.network.neutron [-] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 07:07:11 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:11.761 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[c47fade2-d4e1-4617-9078-73d1cc47a0bd]: (4, ('Sat Nov 29 07:07:11 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ec48bfb6-9f16-422e-b7dc-422060e71ca2 (548cb25cb3b313e9192b4efe26a077da1f43bddf9fa8b430d14e77eb8e34d50e)\n548cb25cb3b313e9192b4efe26a077da1f43bddf9fa8b430d14e77eb8e34d50e\nSat Nov 29 07:07:11 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ec48bfb6-9f16-422e-b7dc-422060e71ca2 (548cb25cb3b313e9192b4efe26a077da1f43bddf9fa8b430d14e77eb8e34d50e)\n548cb25cb3b313e9192b4efe26a077da1f43bddf9fa8b430d14e77eb8e34d50e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:07:11 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:11.764 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[69c5feb8-8ffb-49cd-ac62-2bf6ead7c7bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:07:11 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:11.765 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapec48bfb6-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:07:11 compute-0 kernel: tapec48bfb6-90: left promiscuous mode
Nov 29 07:07:11 compute-0 nova_compute[187185]: 2025-11-29 07:07:11.768 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:11 compute-0 nova_compute[187185]: 2025-11-29 07:07:11.778 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:11 compute-0 nova_compute[187185]: 2025-11-29 07:07:11.781 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:11 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:11.782 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[19d73580-7754-4b23-9c7e-fb162c52581b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:07:11 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:11.800 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[fb9e2531-d38c-4e6c-8dd7-31320079fc57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:07:11 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:11.802 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[646cef7e-e0bc-42ae-923c-77c9a6280d26]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:07:11 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:11.820 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[4c7dbf3c-6c9a-42cc-92f9-6e0d53c007d4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 545520, 'reachable_time': 31961, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224233, 'error': None, 'target': 'ovnmeta-ec48bfb6-9f16-422e-b7dc-422060e71ca2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:07:11 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:11.823 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ec48bfb6-9f16-422e-b7dc-422060e71ca2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:07:11 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:11.824 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[0ac21eb0-495e-4542-85d7-ca6ce3a6a68b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:07:11 compute-0 systemd[1]: run-netns-ovnmeta\x2dec48bfb6\x2d9f16\x2d422e\x2db7dc\x2d422060e71ca2.mount: Deactivated successfully.
Nov 29 07:07:11 compute-0 nova_compute[187185]: 2025-11-29 07:07:11.848 187189 DEBUG nova.compute.manager [req-60e3551d-8322-4385-9e81-d40050b40b41 req-5a1962ef-d0f1-4e21-877b-a2071baa2a1d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Received event network-vif-unplugged-cb9181dd-01c0-409c-8102-a94217bf40a8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:07:11 compute-0 nova_compute[187185]: 2025-11-29 07:07:11.848 187189 DEBUG oslo_concurrency.lockutils [req-60e3551d-8322-4385-9e81-d40050b40b41 req-5a1962ef-d0f1-4e21-877b-a2071baa2a1d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "3b461587-d91f-4a59-a05a-9a2ae89cfacd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:07:11 compute-0 nova_compute[187185]: 2025-11-29 07:07:11.849 187189 DEBUG oslo_concurrency.lockutils [req-60e3551d-8322-4385-9e81-d40050b40b41 req-5a1962ef-d0f1-4e21-877b-a2071baa2a1d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3b461587-d91f-4a59-a05a-9a2ae89cfacd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:07:11 compute-0 nova_compute[187185]: 2025-11-29 07:07:11.849 187189 DEBUG oslo_concurrency.lockutils [req-60e3551d-8322-4385-9e81-d40050b40b41 req-5a1962ef-d0f1-4e21-877b-a2071baa2a1d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3b461587-d91f-4a59-a05a-9a2ae89cfacd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:07:11 compute-0 nova_compute[187185]: 2025-11-29 07:07:11.849 187189 DEBUG nova.compute.manager [req-60e3551d-8322-4385-9e81-d40050b40b41 req-5a1962ef-d0f1-4e21-877b-a2071baa2a1d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] No waiting events found dispatching network-vif-unplugged-cb9181dd-01c0-409c-8102-a94217bf40a8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:07:11 compute-0 nova_compute[187185]: 2025-11-29 07:07:11.849 187189 DEBUG nova.compute.manager [req-60e3551d-8322-4385-9e81-d40050b40b41 req-5a1962ef-d0f1-4e21-877b-a2071baa2a1d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Received event network-vif-unplugged-cb9181dd-01c0-409c-8102-a94217bf40a8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 07:07:12 compute-0 nova_compute[187185]: 2025-11-29 07:07:12.094 187189 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Acquiring lock "95b8710a-1d4d-4b62-b920-af874de99431" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:07:12 compute-0 nova_compute[187185]: 2025-11-29 07:07:12.094 187189 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "95b8710a-1d4d-4b62-b920-af874de99431" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:07:12 compute-0 nova_compute[187185]: 2025-11-29 07:07:12.124 187189 DEBUG nova.compute.manager [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 07:07:12 compute-0 nova_compute[187185]: 2025-11-29 07:07:12.270 187189 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:07:12 compute-0 nova_compute[187185]: 2025-11-29 07:07:12.271 187189 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:07:12 compute-0 nova_compute[187185]: 2025-11-29 07:07:12.277 187189 DEBUG nova.virt.hardware [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 07:07:12 compute-0 nova_compute[187185]: 2025-11-29 07:07:12.278 187189 INFO nova.compute.claims [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Claim successful on node compute-0.ctlplane.example.com
Nov 29 07:07:12 compute-0 nova_compute[187185]: 2025-11-29 07:07:12.661 187189 DEBUG nova.network.neutron [-] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:07:12 compute-0 nova_compute[187185]: 2025-11-29 07:07:12.739 187189 INFO nova.compute.manager [-] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Took 0.98 seconds to deallocate network for instance.
Nov 29 07:07:12 compute-0 nova_compute[187185]: 2025-11-29 07:07:12.807 187189 DEBUG nova.compute.provider_tree [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:07:12 compute-0 nova_compute[187185]: 2025-11-29 07:07:12.835 187189 DEBUG nova.scheduler.client.report [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:07:12 compute-0 nova_compute[187185]: 2025-11-29 07:07:12.838 187189 DEBUG oslo_concurrency.lockutils [None req-53f3cc61-14fa-476f-aabc-85439533699d 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:07:12 compute-0 nova_compute[187185]: 2025-11-29 07:07:12.868 187189 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.597s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:07:12 compute-0 nova_compute[187185]: 2025-11-29 07:07:12.868 187189 DEBUG nova.compute.manager [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 07:07:12 compute-0 nova_compute[187185]: 2025-11-29 07:07:12.872 187189 DEBUG oslo_concurrency.lockutils [None req-53f3cc61-14fa-476f-aabc-85439533699d 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.033s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:07:12 compute-0 nova_compute[187185]: 2025-11-29 07:07:12.874 187189 DEBUG nova.compute.manager [req-182980f5-faeb-44da-b1a7-8c9c99ed49f2 req-56569011-c856-49df-9f1d-fc26d40e471c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Received event network-vif-deleted-cb9181dd-01c0-409c-8102-a94217bf40a8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:07:12 compute-0 nova_compute[187185]: 2025-11-29 07:07:12.962 187189 DEBUG nova.compute.manager [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 07:07:12 compute-0 nova_compute[187185]: 2025-11-29 07:07:12.962 187189 DEBUG nova.network.neutron [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 07:07:12 compute-0 nova_compute[187185]: 2025-11-29 07:07:12.993 187189 INFO nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 07:07:13 compute-0 nova_compute[187185]: 2025-11-29 07:07:12.999 187189 DEBUG nova.compute.provider_tree [None req-53f3cc61-14fa-476f-aabc-85439533699d 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:07:13 compute-0 nova_compute[187185]: 2025-11-29 07:07:13.025 187189 DEBUG nova.scheduler.client.report [None req-53f3cc61-14fa-476f-aabc-85439533699d 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:07:13 compute-0 nova_compute[187185]: 2025-11-29 07:07:13.029 187189 DEBUG nova.compute.manager [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 07:07:13 compute-0 nova_compute[187185]: 2025-11-29 07:07:13.057 187189 DEBUG oslo_concurrency.lockutils [None req-53f3cc61-14fa-476f-aabc-85439533699d 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:07:13 compute-0 nova_compute[187185]: 2025-11-29 07:07:13.107 187189 INFO nova.scheduler.client.report [None req-53f3cc61-14fa-476f-aabc-85439533699d 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Deleted allocations for instance 3b461587-d91f-4a59-a05a-9a2ae89cfacd
Nov 29 07:07:13 compute-0 nova_compute[187185]: 2025-11-29 07:07:13.177 187189 DEBUG nova.compute.manager [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 07:07:13 compute-0 nova_compute[187185]: 2025-11-29 07:07:13.178 187189 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 07:07:13 compute-0 nova_compute[187185]: 2025-11-29 07:07:13.179 187189 INFO nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Creating image(s)
Nov 29 07:07:13 compute-0 nova_compute[187185]: 2025-11-29 07:07:13.179 187189 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Acquiring lock "/var/lib/nova/instances/95b8710a-1d4d-4b62-b920-af874de99431/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:07:13 compute-0 nova_compute[187185]: 2025-11-29 07:07:13.179 187189 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "/var/lib/nova/instances/95b8710a-1d4d-4b62-b920-af874de99431/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:07:13 compute-0 nova_compute[187185]: 2025-11-29 07:07:13.180 187189 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "/var/lib/nova/instances/95b8710a-1d4d-4b62-b920-af874de99431/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:07:13 compute-0 nova_compute[187185]: 2025-11-29 07:07:13.193 187189 DEBUG oslo_concurrency.processutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:07:13 compute-0 nova_compute[187185]: 2025-11-29 07:07:13.232 187189 DEBUG oslo_concurrency.lockutils [None req-53f3cc61-14fa-476f-aabc-85439533699d 6e3923c949d649889fe9a955a8f5cff8 eee8c6e871b948c9bff0d4ee4267ba78 - - default default] Lock "3b461587-d91f-4a59-a05a-9a2ae89cfacd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.925s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:07:13 compute-0 nova_compute[187185]: 2025-11-29 07:07:13.262 187189 DEBUG oslo_concurrency.processutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:07:13 compute-0 nova_compute[187185]: 2025-11-29 07:07:13.263 187189 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:07:13 compute-0 nova_compute[187185]: 2025-11-29 07:07:13.264 187189 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:07:13 compute-0 nova_compute[187185]: 2025-11-29 07:07:13.279 187189 DEBUG oslo_concurrency.processutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:07:13 compute-0 nova_compute[187185]: 2025-11-29 07:07:13.349 187189 DEBUG oslo_concurrency.processutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:07:13 compute-0 nova_compute[187185]: 2025-11-29 07:07:13.350 187189 DEBUG oslo_concurrency.processutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/95b8710a-1d4d-4b62-b920-af874de99431/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:07:13 compute-0 nova_compute[187185]: 2025-11-29 07:07:13.382 187189 DEBUG nova.policy [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3c9a3fa9f480479d98f522f6f02870fb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '477b89fb35da42f69c15b3f01054754a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 07:07:13 compute-0 nova_compute[187185]: 2025-11-29 07:07:13.389 187189 DEBUG oslo_concurrency.processutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/95b8710a-1d4d-4b62-b920-af874de99431/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:07:13 compute-0 nova_compute[187185]: 2025-11-29 07:07:13.390 187189 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:07:13 compute-0 nova_compute[187185]: 2025-11-29 07:07:13.390 187189 DEBUG oslo_concurrency.processutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:07:13 compute-0 nova_compute[187185]: 2025-11-29 07:07:13.450 187189 DEBUG oslo_concurrency.processutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:07:13 compute-0 nova_compute[187185]: 2025-11-29 07:07:13.453 187189 DEBUG nova.virt.disk.api [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Checking if we can resize image /var/lib/nova/instances/95b8710a-1d4d-4b62-b920-af874de99431/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:07:13 compute-0 nova_compute[187185]: 2025-11-29 07:07:13.453 187189 DEBUG oslo_concurrency.processutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/95b8710a-1d4d-4b62-b920-af874de99431/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:07:13 compute-0 nova_compute[187185]: 2025-11-29 07:07:13.513 187189 DEBUG oslo_concurrency.processutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/95b8710a-1d4d-4b62-b920-af874de99431/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:07:13 compute-0 nova_compute[187185]: 2025-11-29 07:07:13.514 187189 DEBUG nova.virt.disk.api [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Cannot resize image /var/lib/nova/instances/95b8710a-1d4d-4b62-b920-af874de99431/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:07:13 compute-0 nova_compute[187185]: 2025-11-29 07:07:13.515 187189 DEBUG nova.objects.instance [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lazy-loading 'migration_context' on Instance uuid 95b8710a-1d4d-4b62-b920-af874de99431 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:07:14 compute-0 nova_compute[187185]: 2025-11-29 07:07:14.342 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:07:14 compute-0 nova_compute[187185]: 2025-11-29 07:07:14.343 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 29 07:07:14 compute-0 nova_compute[187185]: 2025-11-29 07:07:14.547 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:16 compute-0 nova_compute[187185]: 2025-11-29 07:07:16.672 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:16 compute-0 nova_compute[187185]: 2025-11-29 07:07:16.924 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400021.9228284, 02b37f0c-3272-417b-9791-48b555f68d56 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:07:16 compute-0 nova_compute[187185]: 2025-11-29 07:07:16.925 187189 INFO nova.compute.manager [-] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] VM Stopped (Lifecycle Event)
Nov 29 07:07:17 compute-0 podman[224251]: 2025-11-29 07:07:17.812172356 +0000 UTC m=+0.064466118 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 07:07:17 compute-0 podman[224249]: 2025-11-29 07:07:17.817056653 +0000 UTC m=+0.068767269 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 07:07:17 compute-0 podman[224250]: 2025-11-29 07:07:17.848264527 +0000 UTC m=+0.100697883 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, distribution-scope=public, managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 29 07:07:18 compute-0 nova_compute[187185]: 2025-11-29 07:07:18.645 187189 DEBUG nova.compute.manager [req-3531d374-af53-4feb-abae-258a46801689 req-cd744d90-3742-4378-bd8e-6d71a9e4fd8a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Received event network-vif-plugged-cb9181dd-01c0-409c-8102-a94217bf40a8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:07:18 compute-0 nova_compute[187185]: 2025-11-29 07:07:18.645 187189 DEBUG oslo_concurrency.lockutils [req-3531d374-af53-4feb-abae-258a46801689 req-cd744d90-3742-4378-bd8e-6d71a9e4fd8a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "3b461587-d91f-4a59-a05a-9a2ae89cfacd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:07:18 compute-0 nova_compute[187185]: 2025-11-29 07:07:18.646 187189 DEBUG oslo_concurrency.lockutils [req-3531d374-af53-4feb-abae-258a46801689 req-cd744d90-3742-4378-bd8e-6d71a9e4fd8a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3b461587-d91f-4a59-a05a-9a2ae89cfacd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:07:18 compute-0 nova_compute[187185]: 2025-11-29 07:07:18.646 187189 DEBUG oslo_concurrency.lockutils [req-3531d374-af53-4feb-abae-258a46801689 req-cd744d90-3742-4378-bd8e-6d71a9e4fd8a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3b461587-d91f-4a59-a05a-9a2ae89cfacd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:07:18 compute-0 nova_compute[187185]: 2025-11-29 07:07:18.647 187189 DEBUG nova.compute.manager [req-3531d374-af53-4feb-abae-258a46801689 req-cd744d90-3742-4378-bd8e-6d71a9e4fd8a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] No waiting events found dispatching network-vif-plugged-cb9181dd-01c0-409c-8102-a94217bf40a8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:07:18 compute-0 nova_compute[187185]: 2025-11-29 07:07:18.647 187189 WARNING nova.compute.manager [req-3531d374-af53-4feb-abae-258a46801689 req-cd744d90-3742-4378-bd8e-6d71a9e4fd8a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Received unexpected event network-vif-plugged-cb9181dd-01c0-409c-8102-a94217bf40a8 for instance with vm_state deleted and task_state None.
Nov 29 07:07:18 compute-0 nova_compute[187185]: 2025-11-29 07:07:18.665 187189 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 07:07:18 compute-0 nova_compute[187185]: 2025-11-29 07:07:18.666 187189 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Ensure instance console log exists: /var/lib/nova/instances/95b8710a-1d4d-4b62-b920-af874de99431/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 07:07:18 compute-0 nova_compute[187185]: 2025-11-29 07:07:18.667 187189 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:07:18 compute-0 nova_compute[187185]: 2025-11-29 07:07:18.667 187189 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:07:18 compute-0 nova_compute[187185]: 2025-11-29 07:07:18.668 187189 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:07:18 compute-0 nova_compute[187185]: 2025-11-29 07:07:18.847 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 29 07:07:18 compute-0 nova_compute[187185]: 2025-11-29 07:07:18.850 187189 DEBUG nova.compute.manager [None req-a402c2a2-a2bb-4501-b17c-84ba1f0df99b - - - - - -] [instance: 02b37f0c-3272-417b-9791-48b555f68d56] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:07:19 compute-0 nova_compute[187185]: 2025-11-29 07:07:19.394 187189 DEBUG nova.network.neutron [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Successfully created port: f051fae4-0fdf-4a7d-831f-f3291755af62 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 07:07:19 compute-0 nova_compute[187185]: 2025-11-29 07:07:19.550 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:20 compute-0 nova_compute[187185]: 2025-11-29 07:07:20.174 187189 DEBUG oslo_concurrency.lockutils [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Acquiring lock "2f35ca71-193f-457f-b09f-78963a176460" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:07:20 compute-0 nova_compute[187185]: 2025-11-29 07:07:20.175 187189 DEBUG oslo_concurrency.lockutils [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Lock "2f35ca71-193f-457f-b09f-78963a176460" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:07:20 compute-0 nova_compute[187185]: 2025-11-29 07:07:20.193 187189 DEBUG nova.compute.manager [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 07:07:20 compute-0 nova_compute[187185]: 2025-11-29 07:07:20.304 187189 DEBUG oslo_concurrency.lockutils [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:07:20 compute-0 nova_compute[187185]: 2025-11-29 07:07:20.305 187189 DEBUG oslo_concurrency.lockutils [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:07:20 compute-0 nova_compute[187185]: 2025-11-29 07:07:20.312 187189 DEBUG nova.virt.hardware [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 07:07:20 compute-0 nova_compute[187185]: 2025-11-29 07:07:20.312 187189 INFO nova.compute.claims [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Claim successful on node compute-0.ctlplane.example.com
Nov 29 07:07:20 compute-0 nova_compute[187185]: 2025-11-29 07:07:20.473 187189 DEBUG nova.compute.provider_tree [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:07:20 compute-0 nova_compute[187185]: 2025-11-29 07:07:20.509 187189 DEBUG nova.scheduler.client.report [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:07:20 compute-0 nova_compute[187185]: 2025-11-29 07:07:20.540 187189 DEBUG oslo_concurrency.lockutils [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.235s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:07:20 compute-0 nova_compute[187185]: 2025-11-29 07:07:20.541 187189 DEBUG nova.compute.manager [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 07:07:20 compute-0 nova_compute[187185]: 2025-11-29 07:07:20.609 187189 DEBUG nova.compute.manager [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948
Nov 29 07:07:20 compute-0 nova_compute[187185]: 2025-11-29 07:07:20.636 187189 INFO nova.virt.libvirt.driver [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 07:07:20 compute-0 nova_compute[187185]: 2025-11-29 07:07:20.659 187189 DEBUG nova.compute.manager [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 07:07:20 compute-0 nova_compute[187185]: 2025-11-29 07:07:20.813 187189 DEBUG nova.network.neutron [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Successfully updated port: f051fae4-0fdf-4a7d-831f-f3291755af62 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.155 187189 DEBUG nova.compute.manager [req-2c668815-8835-465a-a1e8-be188bf97b82 req-e19408d4-f4ab-4e1c-85ff-a002e8a3d2e9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Received event network-changed-f051fae4-0fdf-4a7d-831f-f3291755af62 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.155 187189 DEBUG nova.compute.manager [req-2c668815-8835-465a-a1e8-be188bf97b82 req-e19408d4-f4ab-4e1c-85ff-a002e8a3d2e9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Refreshing instance network info cache due to event network-changed-f051fae4-0fdf-4a7d-831f-f3291755af62. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.156 187189 DEBUG oslo_concurrency.lockutils [req-2c668815-8835-465a-a1e8-be188bf97b82 req-e19408d4-f4ab-4e1c-85ff-a002e8a3d2e9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-95b8710a-1d4d-4b62-b920-af874de99431" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.156 187189 DEBUG oslo_concurrency.lockutils [req-2c668815-8835-465a-a1e8-be188bf97b82 req-e19408d4-f4ab-4e1c-85ff-a002e8a3d2e9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-95b8710a-1d4d-4b62-b920-af874de99431" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.157 187189 DEBUG nova.network.neutron [req-2c668815-8835-465a-a1e8-be188bf97b82 req-e19408d4-f4ab-4e1c-85ff-a002e8a3d2e9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Refreshing network info cache for port f051fae4-0fdf-4a7d-831f-f3291755af62 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.214 187189 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Acquiring lock "refresh_cache-95b8710a-1d4d-4b62-b920-af874de99431" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.267 187189 DEBUG nova.compute.manager [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.269 187189 DEBUG nova.virt.libvirt.driver [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.269 187189 INFO nova.virt.libvirt.driver [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Creating image(s)
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.270 187189 DEBUG oslo_concurrency.lockutils [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Acquiring lock "/var/lib/nova/instances/2f35ca71-193f-457f-b09f-78963a176460/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.270 187189 DEBUG oslo_concurrency.lockutils [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Lock "/var/lib/nova/instances/2f35ca71-193f-457f-b09f-78963a176460/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.271 187189 DEBUG oslo_concurrency.lockutils [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Lock "/var/lib/nova/instances/2f35ca71-193f-457f-b09f-78963a176460/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.288 187189 DEBUG oslo_concurrency.processutils [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.330 187189 DEBUG nova.network.neutron [req-2c668815-8835-465a-a1e8-be188bf97b82 req-e19408d4-f4ab-4e1c-85ff-a002e8a3d2e9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.385 187189 DEBUG oslo_concurrency.processutils [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.386 187189 DEBUG oslo_concurrency.lockutils [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.387 187189 DEBUG oslo_concurrency.lockutils [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.406 187189 DEBUG oslo_concurrency.processutils [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.485 187189 DEBUG oslo_concurrency.processutils [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.487 187189 DEBUG oslo_concurrency.processutils [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/2f35ca71-193f-457f-b09f-78963a176460/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.650 187189 DEBUG nova.network.neutron [req-2c668815-8835-465a-a1e8-be188bf97b82 req-e19408d4-f4ab-4e1c-85ff-a002e8a3d2e9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.678 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.686 187189 DEBUG oslo_concurrency.lockutils [req-2c668815-8835-465a-a1e8-be188bf97b82 req-e19408d4-f4ab-4e1c-85ff-a002e8a3d2e9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-95b8710a-1d4d-4b62-b920-af874de99431" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.687 187189 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Acquired lock "refresh_cache-95b8710a-1d4d-4b62-b920-af874de99431" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.688 187189 DEBUG nova.network.neutron [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.715 187189 DEBUG oslo_concurrency.processutils [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/2f35ca71-193f-457f-b09f-78963a176460/disk 1073741824" returned: 0 in 0.228s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.716 187189 DEBUG oslo_concurrency.lockutils [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.329s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.717 187189 DEBUG oslo_concurrency.processutils [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.788 187189 DEBUG oslo_concurrency.processutils [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.789 187189 DEBUG nova.virt.disk.api [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Checking if we can resize image /var/lib/nova/instances/2f35ca71-193f-457f-b09f-78963a176460/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.789 187189 DEBUG oslo_concurrency.processutils [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2f35ca71-193f-457f-b09f-78963a176460/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.857 187189 DEBUG oslo_concurrency.processutils [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2f35ca71-193f-457f-b09f-78963a176460/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.858 187189 DEBUG nova.virt.disk.api [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Cannot resize image /var/lib/nova/instances/2f35ca71-193f-457f-b09f-78963a176460/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.858 187189 DEBUG nova.objects.instance [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Lazy-loading 'migration_context' on Instance uuid 2f35ca71-193f-457f-b09f-78963a176460 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.905 187189 DEBUG nova.virt.libvirt.driver [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.906 187189 DEBUG nova.virt.libvirt.driver [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Ensure instance console log exists: /var/lib/nova/instances/2f35ca71-193f-457f-b09f-78963a176460/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.906 187189 DEBUG oslo_concurrency.lockutils [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.907 187189 DEBUG oslo_concurrency.lockutils [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.907 187189 DEBUG oslo_concurrency.lockutils [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.909 187189 DEBUG nova.virt.libvirt.driver [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.915 187189 WARNING nova.virt.libvirt.driver [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.920 187189 DEBUG nova.virt.libvirt.host [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.921 187189 DEBUG nova.virt.libvirt.host [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.926 187189 DEBUG nova.virt.libvirt.host [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.927 187189 DEBUG nova.virt.libvirt.host [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.929 187189 DEBUG nova.virt.libvirt.driver [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.930 187189 DEBUG nova.virt.hardware [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.930 187189 DEBUG nova.virt.hardware [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.931 187189 DEBUG nova.virt.hardware [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.931 187189 DEBUG nova.virt.hardware [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.931 187189 DEBUG nova.virt.hardware [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.932 187189 DEBUG nova.virt.hardware [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.932 187189 DEBUG nova.virt.hardware [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.932 187189 DEBUG nova.virt.hardware [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.933 187189 DEBUG nova.virt.hardware [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.933 187189 DEBUG nova.virt.hardware [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.933 187189 DEBUG nova.virt.hardware [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.938 187189 DEBUG nova.objects.instance [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2f35ca71-193f-457f-b09f-78963a176460 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.950 187189 DEBUG nova.network.neutron [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:07:21 compute-0 nova_compute[187185]: 2025-11-29 07:07:21.957 187189 DEBUG nova.virt.libvirt.driver [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:07:21 compute-0 nova_compute[187185]:   <uuid>2f35ca71-193f-457f-b09f-78963a176460</uuid>
Nov 29 07:07:21 compute-0 nova_compute[187185]:   <name>instance-00000054</name>
Nov 29 07:07:21 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 07:07:21 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:07:21 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:07:21 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:       <nova:name>tempest-ServerShowV257Test-server-1416562793</nova:name>
Nov 29 07:07:21 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:07:21</nova:creationTime>
Nov 29 07:07:21 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 07:07:21 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 07:07:21 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:07:21 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:07:21 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:07:21 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:07:21 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:07:21 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:07:21 compute-0 nova_compute[187185]:         <nova:user uuid="7293fa2b633e4b42af3128c6bcbd5176">tempest-ServerShowV257Test-1776381761-project-member</nova:user>
Nov 29 07:07:21 compute-0 nova_compute[187185]:         <nova:project uuid="618890258b5f40d4a3313b98a94795c7">tempest-ServerShowV257Test-1776381761</nova:project>
Nov 29 07:07:21 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:07:21 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:       <nova:ports/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:07:21 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:07:21 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:07:21 compute-0 nova_compute[187185]:     <system>
Nov 29 07:07:21 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:07:21 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:07:21 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:07:21 compute-0 nova_compute[187185]:       <entry name="serial">2f35ca71-193f-457f-b09f-78963a176460</entry>
Nov 29 07:07:21 compute-0 nova_compute[187185]:       <entry name="uuid">2f35ca71-193f-457f-b09f-78963a176460</entry>
Nov 29 07:07:21 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     </system>
Nov 29 07:07:21 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:07:21 compute-0 nova_compute[187185]:   <os>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:   </os>
Nov 29 07:07:21 compute-0 nova_compute[187185]:   <features>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:   </features>
Nov 29 07:07:21 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:07:21 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:07:21 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:07:21 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:07:21 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:07:21 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/2f35ca71-193f-457f-b09f-78963a176460/disk"/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:07:21 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/2f35ca71-193f-457f-b09f-78963a176460/disk.config"/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:07:21 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/2f35ca71-193f-457f-b09f-78963a176460/console.log" append="off"/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     <video>
Nov 29 07:07:21 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     </video>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:07:21 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:07:21 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:07:21 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:07:21 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:07:21 compute-0 nova_compute[187185]: </domain>
Nov 29 07:07:21 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:07:22 compute-0 nova_compute[187185]: 2025-11-29 07:07:22.085 187189 DEBUG nova.virt.libvirt.driver [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:07:22 compute-0 nova_compute[187185]: 2025-11-29 07:07:22.086 187189 DEBUG nova.virt.libvirt.driver [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:07:22 compute-0 nova_compute[187185]: 2025-11-29 07:07:22.086 187189 INFO nova.virt.libvirt.driver [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Using config drive
Nov 29 07:07:22 compute-0 nova_compute[187185]: 2025-11-29 07:07:22.326 187189 INFO nova.virt.libvirt.driver [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Creating config drive at /var/lib/nova/instances/2f35ca71-193f-457f-b09f-78963a176460/disk.config
Nov 29 07:07:22 compute-0 nova_compute[187185]: 2025-11-29 07:07:22.330 187189 DEBUG oslo_concurrency.processutils [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2f35ca71-193f-457f-b09f-78963a176460/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpz8949foi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:07:22 compute-0 nova_compute[187185]: 2025-11-29 07:07:22.468 187189 DEBUG oslo_concurrency.processutils [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2f35ca71-193f-457f-b09f-78963a176460/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpz8949foi" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:07:22 compute-0 systemd-machined[153486]: New machine qemu-26-instance-00000054.
Nov 29 07:07:22 compute-0 systemd[1]: Started Virtual Machine qemu-26-instance-00000054.
Nov 29 07:07:23 compute-0 nova_compute[187185]: 2025-11-29 07:07:23.182 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400043.181498, 2f35ca71-193f-457f-b09f-78963a176460 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:07:23 compute-0 nova_compute[187185]: 2025-11-29 07:07:23.185 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 2f35ca71-193f-457f-b09f-78963a176460] VM Resumed (Lifecycle Event)
Nov 29 07:07:23 compute-0 nova_compute[187185]: 2025-11-29 07:07:23.187 187189 DEBUG nova.compute.manager [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:07:23 compute-0 nova_compute[187185]: 2025-11-29 07:07:23.188 187189 DEBUG nova.virt.libvirt.driver [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 07:07:23 compute-0 nova_compute[187185]: 2025-11-29 07:07:23.191 187189 INFO nova.virt.libvirt.driver [-] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Instance spawned successfully.
Nov 29 07:07:23 compute-0 nova_compute[187185]: 2025-11-29 07:07:23.191 187189 DEBUG nova.virt.libvirt.driver [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 07:07:23 compute-0 nova_compute[187185]: 2025-11-29 07:07:23.591 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:07:23 compute-0 nova_compute[187185]: 2025-11-29 07:07:23.639 187189 DEBUG nova.virt.libvirt.driver [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:07:23 compute-0 nova_compute[187185]: 2025-11-29 07:07:23.640 187189 DEBUG nova.virt.libvirt.driver [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:07:23 compute-0 nova_compute[187185]: 2025-11-29 07:07:23.641 187189 DEBUG nova.virt.libvirt.driver [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:07:23 compute-0 nova_compute[187185]: 2025-11-29 07:07:23.641 187189 DEBUG nova.virt.libvirt.driver [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:07:23 compute-0 nova_compute[187185]: 2025-11-29 07:07:23.642 187189 DEBUG nova.virt.libvirt.driver [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:07:23 compute-0 nova_compute[187185]: 2025-11-29 07:07:23.643 187189 DEBUG nova.virt.libvirt.driver [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:07:23 compute-0 nova_compute[187185]: 2025-11-29 07:07:23.648 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:07:23 compute-0 nova_compute[187185]: 2025-11-29 07:07:23.729 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 2f35ca71-193f-457f-b09f-78963a176460] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:07:23 compute-0 nova_compute[187185]: 2025-11-29 07:07:23.730 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400043.183195, 2f35ca71-193f-457f-b09f-78963a176460 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:07:23 compute-0 nova_compute[187185]: 2025-11-29 07:07:23.730 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 2f35ca71-193f-457f-b09f-78963a176460] VM Started (Lifecycle Event)
Nov 29 07:07:23 compute-0 nova_compute[187185]: 2025-11-29 07:07:23.765 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:07:23 compute-0 nova_compute[187185]: 2025-11-29 07:07:23.769 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:07:23 compute-0 nova_compute[187185]: 2025-11-29 07:07:23.794 187189 INFO nova.compute.manager [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Took 2.53 seconds to spawn the instance on the hypervisor.
Nov 29 07:07:23 compute-0 nova_compute[187185]: 2025-11-29 07:07:23.794 187189 DEBUG nova.compute.manager [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:07:23 compute-0 nova_compute[187185]: 2025-11-29 07:07:23.796 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 2f35ca71-193f-457f-b09f-78963a176460] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:07:23 compute-0 nova_compute[187185]: 2025-11-29 07:07:23.886 187189 INFO nova.compute.manager [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Took 3.63 seconds to build instance.
Nov 29 07:07:23 compute-0 nova_compute[187185]: 2025-11-29 07:07:23.916 187189 DEBUG oslo_concurrency.lockutils [None req-f9db7948-2b50-4517-9719-4848b183454f 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Lock "2f35ca71-193f-457f-b09f-78963a176460" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.741s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:07:24 compute-0 nova_compute[187185]: 2025-11-29 07:07:24.411 187189 DEBUG nova.network.neutron [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Updating instance_info_cache with network_info: [{"id": "f051fae4-0fdf-4a7d-831f-f3291755af62", "address": "fa:16:3e:07:5c:91", "network": {"id": "c8a3c675-42f5-48a4-83d7-2d39dd3304b9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1711984379-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "477b89fb35da42f69c15b3f01054754a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf051fae4-0f", "ovs_interfaceid": "f051fae4-0fdf-4a7d-831f-f3291755af62", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:07:24 compute-0 nova_compute[187185]: 2025-11-29 07:07:24.491 187189 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Releasing lock "refresh_cache-95b8710a-1d4d-4b62-b920-af874de99431" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:07:24 compute-0 nova_compute[187185]: 2025-11-29 07:07:24.492 187189 DEBUG nova.compute.manager [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Instance network_info: |[{"id": "f051fae4-0fdf-4a7d-831f-f3291755af62", "address": "fa:16:3e:07:5c:91", "network": {"id": "c8a3c675-42f5-48a4-83d7-2d39dd3304b9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1711984379-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "477b89fb35da42f69c15b3f01054754a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf051fae4-0f", "ovs_interfaceid": "f051fae4-0fdf-4a7d-831f-f3291755af62", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 07:07:24 compute-0 nova_compute[187185]: 2025-11-29 07:07:24.497 187189 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Start _get_guest_xml network_info=[{"id": "f051fae4-0fdf-4a7d-831f-f3291755af62", "address": "fa:16:3e:07:5c:91", "network": {"id": "c8a3c675-42f5-48a4-83d7-2d39dd3304b9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1711984379-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "477b89fb35da42f69c15b3f01054754a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf051fae4-0f", "ovs_interfaceid": "f051fae4-0fdf-4a7d-831f-f3291755af62", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:07:24 compute-0 nova_compute[187185]: 2025-11-29 07:07:24.501 187189 WARNING nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:07:24 compute-0 nova_compute[187185]: 2025-11-29 07:07:24.507 187189 DEBUG nova.virt.libvirt.host [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:07:24 compute-0 nova_compute[187185]: 2025-11-29 07:07:24.508 187189 DEBUG nova.virt.libvirt.host [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:07:24 compute-0 nova_compute[187185]: 2025-11-29 07:07:24.510 187189 DEBUG nova.virt.libvirt.host [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:07:24 compute-0 nova_compute[187185]: 2025-11-29 07:07:24.511 187189 DEBUG nova.virt.libvirt.host [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:07:24 compute-0 nova_compute[187185]: 2025-11-29 07:07:24.511 187189 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:07:24 compute-0 nova_compute[187185]: 2025-11-29 07:07:24.512 187189 DEBUG nova.virt.hardware [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:07:24 compute-0 nova_compute[187185]: 2025-11-29 07:07:24.512 187189 DEBUG nova.virt.hardware [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:07:24 compute-0 nova_compute[187185]: 2025-11-29 07:07:24.512 187189 DEBUG nova.virt.hardware [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:07:24 compute-0 nova_compute[187185]: 2025-11-29 07:07:24.513 187189 DEBUG nova.virt.hardware [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:07:24 compute-0 nova_compute[187185]: 2025-11-29 07:07:24.513 187189 DEBUG nova.virt.hardware [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:07:24 compute-0 nova_compute[187185]: 2025-11-29 07:07:24.513 187189 DEBUG nova.virt.hardware [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:07:24 compute-0 nova_compute[187185]: 2025-11-29 07:07:24.513 187189 DEBUG nova.virt.hardware [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:07:24 compute-0 nova_compute[187185]: 2025-11-29 07:07:24.513 187189 DEBUG nova.virt.hardware [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:07:24 compute-0 nova_compute[187185]: 2025-11-29 07:07:24.514 187189 DEBUG nova.virt.hardware [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:07:24 compute-0 nova_compute[187185]: 2025-11-29 07:07:24.514 187189 DEBUG nova.virt.hardware [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:07:24 compute-0 nova_compute[187185]: 2025-11-29 07:07:24.514 187189 DEBUG nova.virt.hardware [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:07:24 compute-0 nova_compute[187185]: 2025-11-29 07:07:24.517 187189 DEBUG nova.virt.libvirt.vif [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:07:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-1009215778',display_name='tempest-ListServersNegativeTestJSON-server-1009215778-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-1009215778-2',id=82,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='477b89fb35da42f69c15b3f01054754a',ramdisk_id='',reservation_id='r-fs8srmd1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-316367608',owner_user_name='tempest-ListServersNegativeTestJSON-316367608-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:07:13Z,user_data=None,user_id='3c9a3fa9f480479d98f522f6f02870fb',uuid=95b8710a-1d4d-4b62-b920-af874de99431,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f051fae4-0fdf-4a7d-831f-f3291755af62", "address": "fa:16:3e:07:5c:91", "network": {"id": "c8a3c675-42f5-48a4-83d7-2d39dd3304b9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1711984379-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "477b89fb35da42f69c15b3f01054754a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf051fae4-0f", "ovs_interfaceid": "f051fae4-0fdf-4a7d-831f-f3291755af62", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:07:24 compute-0 nova_compute[187185]: 2025-11-29 07:07:24.517 187189 DEBUG nova.network.os_vif_util [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Converting VIF {"id": "f051fae4-0fdf-4a7d-831f-f3291755af62", "address": "fa:16:3e:07:5c:91", "network": {"id": "c8a3c675-42f5-48a4-83d7-2d39dd3304b9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1711984379-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "477b89fb35da42f69c15b3f01054754a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf051fae4-0f", "ovs_interfaceid": "f051fae4-0fdf-4a7d-831f-f3291755af62", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:07:24 compute-0 nova_compute[187185]: 2025-11-29 07:07:24.518 187189 DEBUG nova.network.os_vif_util [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:07:5c:91,bridge_name='br-int',has_traffic_filtering=True,id=f051fae4-0fdf-4a7d-831f-f3291755af62,network=Network(c8a3c675-42f5-48a4-83d7-2d39dd3304b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf051fae4-0f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:07:24 compute-0 nova_compute[187185]: 2025-11-29 07:07:24.519 187189 DEBUG nova.objects.instance [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lazy-loading 'pci_devices' on Instance uuid 95b8710a-1d4d-4b62-b920-af874de99431 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:07:24 compute-0 nova_compute[187185]: 2025-11-29 07:07:24.535 187189 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:07:24 compute-0 nova_compute[187185]:   <uuid>95b8710a-1d4d-4b62-b920-af874de99431</uuid>
Nov 29 07:07:24 compute-0 nova_compute[187185]:   <name>instance-00000052</name>
Nov 29 07:07:24 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 07:07:24 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:07:24 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:07:24 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:       <nova:name>tempest-ListServersNegativeTestJSON-server-1009215778-2</nova:name>
Nov 29 07:07:24 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:07:24</nova:creationTime>
Nov 29 07:07:24 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 07:07:24 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 07:07:24 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:07:24 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:07:24 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:07:24 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:07:24 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:07:24 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:07:24 compute-0 nova_compute[187185]:         <nova:user uuid="3c9a3fa9f480479d98f522f6f02870fb">tempest-ListServersNegativeTestJSON-316367608-project-member</nova:user>
Nov 29 07:07:24 compute-0 nova_compute[187185]:         <nova:project uuid="477b89fb35da42f69c15b3f01054754a">tempest-ListServersNegativeTestJSON-316367608</nova:project>
Nov 29 07:07:24 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:07:24 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 07:07:24 compute-0 nova_compute[187185]:         <nova:port uuid="f051fae4-0fdf-4a7d-831f-f3291755af62">
Nov 29 07:07:24 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:07:24 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:07:24 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:07:24 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <system>
Nov 29 07:07:24 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:07:24 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:07:24 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:07:24 compute-0 nova_compute[187185]:       <entry name="serial">95b8710a-1d4d-4b62-b920-af874de99431</entry>
Nov 29 07:07:24 compute-0 nova_compute[187185]:       <entry name="uuid">95b8710a-1d4d-4b62-b920-af874de99431</entry>
Nov 29 07:07:24 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     </system>
Nov 29 07:07:24 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:07:24 compute-0 nova_compute[187185]:   <os>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:   </os>
Nov 29 07:07:24 compute-0 nova_compute[187185]:   <features>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:   </features>
Nov 29 07:07:24 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:07:24 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:07:24 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:07:24 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/95b8710a-1d4d-4b62-b920-af874de99431/disk"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:07:24 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/95b8710a-1d4d-4b62-b920-af874de99431/disk.config"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:07:24 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:07:5c:91"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:       <target dev="tapf051fae4-0f"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:07:24 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/95b8710a-1d4d-4b62-b920-af874de99431/console.log" append="off"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <video>
Nov 29 07:07:24 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     </video>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:07:24 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:07:24 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:07:24 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:07:24 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:07:24 compute-0 nova_compute[187185]: </domain>
Nov 29 07:07:24 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:07:24 compute-0 nova_compute[187185]: 2025-11-29 07:07:24.537 187189 DEBUG nova.compute.manager [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Preparing to wait for external event network-vif-plugged-f051fae4-0fdf-4a7d-831f-f3291755af62 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 07:07:24 compute-0 nova_compute[187185]: 2025-11-29 07:07:24.537 187189 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Acquiring lock "95b8710a-1d4d-4b62-b920-af874de99431-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:07:24 compute-0 nova_compute[187185]: 2025-11-29 07:07:24.537 187189 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "95b8710a-1d4d-4b62-b920-af874de99431-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:07:24 compute-0 nova_compute[187185]: 2025-11-29 07:07:24.538 187189 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "95b8710a-1d4d-4b62-b920-af874de99431-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:07:24 compute-0 nova_compute[187185]: 2025-11-29 07:07:24.538 187189 DEBUG nova.virt.libvirt.vif [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:07:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-1009215778',display_name='tempest-ListServersNegativeTestJSON-server-1009215778-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-1009215778-2',id=82,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='477b89fb35da42f69c15b3f01054754a',ramdisk_id='',reservation_id='r-fs8srmd1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-316367608',owner_user
_name='tempest-ListServersNegativeTestJSON-316367608-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:07:13Z,user_data=None,user_id='3c9a3fa9f480479d98f522f6f02870fb',uuid=95b8710a-1d4d-4b62-b920-af874de99431,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f051fae4-0fdf-4a7d-831f-f3291755af62", "address": "fa:16:3e:07:5c:91", "network": {"id": "c8a3c675-42f5-48a4-83d7-2d39dd3304b9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1711984379-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "477b89fb35da42f69c15b3f01054754a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf051fae4-0f", "ovs_interfaceid": "f051fae4-0fdf-4a7d-831f-f3291755af62", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:07:24 compute-0 nova_compute[187185]: 2025-11-29 07:07:24.539 187189 DEBUG nova.network.os_vif_util [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Converting VIF {"id": "f051fae4-0fdf-4a7d-831f-f3291755af62", "address": "fa:16:3e:07:5c:91", "network": {"id": "c8a3c675-42f5-48a4-83d7-2d39dd3304b9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1711984379-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "477b89fb35da42f69c15b3f01054754a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf051fae4-0f", "ovs_interfaceid": "f051fae4-0fdf-4a7d-831f-f3291755af62", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:07:24 compute-0 nova_compute[187185]: 2025-11-29 07:07:24.539 187189 DEBUG nova.network.os_vif_util [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:07:5c:91,bridge_name='br-int',has_traffic_filtering=True,id=f051fae4-0fdf-4a7d-831f-f3291755af62,network=Network(c8a3c675-42f5-48a4-83d7-2d39dd3304b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf051fae4-0f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:07:24 compute-0 nova_compute[187185]: 2025-11-29 07:07:24.540 187189 DEBUG os_vif [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:07:5c:91,bridge_name='br-int',has_traffic_filtering=True,id=f051fae4-0fdf-4a7d-831f-f3291755af62,network=Network(c8a3c675-42f5-48a4-83d7-2d39dd3304b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf051fae4-0f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:07:24 compute-0 nova_compute[187185]: 2025-11-29 07:07:24.540 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:24 compute-0 nova_compute[187185]: 2025-11-29 07:07:24.541 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:07:24 compute-0 nova_compute[187185]: 2025-11-29 07:07:24.541 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:07:24 compute-0 nova_compute[187185]: 2025-11-29 07:07:24.544 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:24 compute-0 nova_compute[187185]: 2025-11-29 07:07:24.545 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf051fae4-0f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:07:24 compute-0 nova_compute[187185]: 2025-11-29 07:07:24.545 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf051fae4-0f, col_values=(('external_ids', {'iface-id': 'f051fae4-0fdf-4a7d-831f-f3291755af62', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:07:5c:91', 'vm-uuid': '95b8710a-1d4d-4b62-b920-af874de99431'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:07:24 compute-0 nova_compute[187185]: 2025-11-29 07:07:24.547 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:24 compute-0 NetworkManager[55227]: <info>  [1764400044.5487] manager: (tapf051fae4-0f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/97)
Nov 29 07:07:24 compute-0 nova_compute[187185]: 2025-11-29 07:07:24.551 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:07:24 compute-0 nova_compute[187185]: 2025-11-29 07:07:24.559 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:24 compute-0 nova_compute[187185]: 2025-11-29 07:07:24.562 187189 INFO os_vif [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:07:5c:91,bridge_name='br-int',has_traffic_filtering=True,id=f051fae4-0fdf-4a7d-831f-f3291755af62,network=Network(c8a3c675-42f5-48a4-83d7-2d39dd3304b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf051fae4-0f')
Nov 29 07:07:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:24.826 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:07:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:24.827 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:07:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:24.828 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:07:25 compute-0 nova_compute[187185]: 2025-11-29 07:07:25.347 187189 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:07:25 compute-0 nova_compute[187185]: 2025-11-29 07:07:25.348 187189 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:07:25 compute-0 nova_compute[187185]: 2025-11-29 07:07:25.348 187189 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] No VIF found with MAC fa:16:3e:07:5c:91, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:07:25 compute-0 nova_compute[187185]: 2025-11-29 07:07:25.348 187189 INFO nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Using config drive
Nov 29 07:07:25 compute-0 nova_compute[187185]: 2025-11-29 07:07:25.742 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:26 compute-0 nova_compute[187185]: 2025-11-29 07:07:26.033 187189 INFO nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Creating config drive at /var/lib/nova/instances/95b8710a-1d4d-4b62-b920-af874de99431/disk.config
Nov 29 07:07:26 compute-0 nova_compute[187185]: 2025-11-29 07:07:26.042 187189 DEBUG oslo_concurrency.processutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/95b8710a-1d4d-4b62-b920-af874de99431/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphpn3awzi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:07:26 compute-0 nova_compute[187185]: 2025-11-29 07:07:26.188 187189 DEBUG oslo_concurrency.processutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/95b8710a-1d4d-4b62-b920-af874de99431/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphpn3awzi" returned: 0 in 0.147s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:07:26 compute-0 kernel: tapf051fae4-0f: entered promiscuous mode
Nov 29 07:07:26 compute-0 NetworkManager[55227]: <info>  [1764400046.2586] manager: (tapf051fae4-0f): new Tun device (/org/freedesktop/NetworkManager/Devices/98)
Nov 29 07:07:26 compute-0 nova_compute[187185]: 2025-11-29 07:07:26.259 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:26 compute-0 ovn_controller[95281]: 2025-11-29T07:07:26Z|00180|binding|INFO|Claiming lport f051fae4-0fdf-4a7d-831f-f3291755af62 for this chassis.
Nov 29 07:07:26 compute-0 ovn_controller[95281]: 2025-11-29T07:07:26Z|00181|binding|INFO|f051fae4-0fdf-4a7d-831f-f3291755af62: Claiming fa:16:3e:07:5c:91 10.100.0.12
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:26.289 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:07:5c:91 10.100.0.12'], port_security=['fa:16:3e:07:5c:91 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '95b8710a-1d4d-4b62-b920-af874de99431', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c8a3c675-42f5-48a4-83d7-2d39dd3304b9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '477b89fb35da42f69c15b3f01054754a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9d87673c-3e8a-46ed-9956-50ea661306ab', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=114ba21f-c978-4c05-97f9-429aa66017b7, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=f051fae4-0fdf-4a7d-831f-f3291755af62) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:26.291 104254 INFO neutron.agent.ovn.metadata.agent [-] Port f051fae4-0fdf-4a7d-831f-f3291755af62 in datapath c8a3c675-42f5-48a4-83d7-2d39dd3304b9 bound to our chassis
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:26.294 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c8a3c675-42f5-48a4-83d7-2d39dd3304b9
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:26.306 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[419f18e2-873f-4df7-8e2b-a5ccb9ed2982]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:26.307 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc8a3c675-41 in ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:26.309 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc8a3c675-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:26.309 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[2be3f0f5-611f-4919-ae54-67fb478b4e3b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:26.310 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[8e74b068-5784-4454-a4de-11d2d7430ffd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:26.323 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[b44031c1-eece-4f72-ad85-1982d9b812e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:07:26 compute-0 ovn_controller[95281]: 2025-11-29T07:07:26Z|00182|binding|INFO|Setting lport f051fae4-0fdf-4a7d-831f-f3291755af62 ovn-installed in OVS
Nov 29 07:07:26 compute-0 ovn_controller[95281]: 2025-11-29T07:07:26Z|00183|binding|INFO|Setting lport f051fae4-0fdf-4a7d-831f-f3291755af62 up in Southbound
Nov 29 07:07:26 compute-0 systemd-machined[153486]: New machine qemu-27-instance-00000052.
Nov 29 07:07:26 compute-0 nova_compute[187185]: 2025-11-29 07:07:26.413 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:26.422 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[f19d8301-bf9a-4575-8543-f5787eb63794]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:07:26 compute-0 systemd[1]: Started Virtual Machine qemu-27-instance-00000052.
Nov 29 07:07:26 compute-0 systemd-udevd[224375]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:07:26 compute-0 NetworkManager[55227]: <info>  [1764400046.4691] device (tapf051fae4-0f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:26.465 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[36172299-dd40-4d9e-bea2-487db00a2277]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:07:26 compute-0 NetworkManager[55227]: <info>  [1764400046.4698] device (tapf051fae4-0f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:07:26 compute-0 NetworkManager[55227]: <info>  [1764400046.4743] manager: (tapc8a3c675-40): new Veth device (/org/freedesktop/NetworkManager/Devices/99)
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:26.473 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[abc61d39-f678-4e16-b28e-08bf2d5c2264]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:26.507 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[8fc7a40c-79be-4430-88ae-52b7fce9bbf6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:26.510 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[ce08bc76-36f2-4253-a827-01c45b11c573]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:07:26 compute-0 NetworkManager[55227]: <info>  [1764400046.5311] device (tapc8a3c675-40): carrier: link connected
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:26.539 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[f6b24903-5fb6-4279-9e04-4f431a6d4a39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:26.554 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[a8b141f3-c011-4550-a186-5d75be5038cd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc8a3c675-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:06:f6:63'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 549419, 'reachable_time': 15395, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224403, 'error': None, 'target': 'ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:26.579 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[80f5530e-b107-4bc4-b33b-964e8d386ffc]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe06:f663'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 549419, 'tstamp': 549419}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224404, 'error': None, 'target': 'ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:26.604 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[1ae2a8c7-0e95-42e4-ae57-f84193ef74a1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc8a3c675-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:06:f6:63'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 549419, 'reachable_time': 15395, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224405, 'error': None, 'target': 'ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:07:26 compute-0 nova_compute[187185]: 2025-11-29 07:07:26.638 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400031.636617, 3b461587-d91f-4a59-a05a-9a2ae89cfacd => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:07:26 compute-0 nova_compute[187185]: 2025-11-29 07:07:26.638 187189 INFO nova.compute.manager [-] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] VM Stopped (Lifecycle Event)
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:26.648 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[845a7123-bf2b-43b2-a90c-5899193ac4d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:07:26 compute-0 nova_compute[187185]: 2025-11-29 07:07:26.675 187189 DEBUG nova.compute.manager [None req-63587e99-ac59-45aa-b149-3d1370020b84 - - - - - -] [instance: 3b461587-d91f-4a59-a05a-9a2ae89cfacd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:07:26 compute-0 nova_compute[187185]: 2025-11-29 07:07:26.698 187189 DEBUG nova.compute.manager [req-1d62a269-7bcb-46a9-a778-c061c18b1d07 req-44c0fa11-7bb8-4cf9-8d36-2846c67e53b3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Received event network-vif-plugged-f051fae4-0fdf-4a7d-831f-f3291755af62 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:07:26 compute-0 nova_compute[187185]: 2025-11-29 07:07:26.698 187189 DEBUG oslo_concurrency.lockutils [req-1d62a269-7bcb-46a9-a778-c061c18b1d07 req-44c0fa11-7bb8-4cf9-8d36-2846c67e53b3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "95b8710a-1d4d-4b62-b920-af874de99431-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:07:26 compute-0 nova_compute[187185]: 2025-11-29 07:07:26.698 187189 DEBUG oslo_concurrency.lockutils [req-1d62a269-7bcb-46a9-a778-c061c18b1d07 req-44c0fa11-7bb8-4cf9-8d36-2846c67e53b3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "95b8710a-1d4d-4b62-b920-af874de99431-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:07:26 compute-0 nova_compute[187185]: 2025-11-29 07:07:26.699 187189 DEBUG oslo_concurrency.lockutils [req-1d62a269-7bcb-46a9-a778-c061c18b1d07 req-44c0fa11-7bb8-4cf9-8d36-2846c67e53b3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "95b8710a-1d4d-4b62-b920-af874de99431-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:07:26 compute-0 nova_compute[187185]: 2025-11-29 07:07:26.699 187189 DEBUG nova.compute.manager [req-1d62a269-7bcb-46a9-a778-c061c18b1d07 req-44c0fa11-7bb8-4cf9-8d36-2846c67e53b3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Processing event network-vif-plugged-f051fae4-0fdf-4a7d-831f-f3291755af62 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:26.718 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e8c50262-dd66-4603-9856-1d3ed3552e68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:26.719 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc8a3c675-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:26.720 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:26.720 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc8a3c675-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:07:26 compute-0 nova_compute[187185]: 2025-11-29 07:07:26.722 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:26 compute-0 kernel: tapc8a3c675-40: entered promiscuous mode
Nov 29 07:07:26 compute-0 NetworkManager[55227]: <info>  [1764400046.7228] manager: (tapc8a3c675-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/100)
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:26.725 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc8a3c675-40, col_values=(('external_ids', {'iface-id': '2a5ced08-2785-4bf9-8fa1-c89240d15794'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:07:26 compute-0 nova_compute[187185]: 2025-11-29 07:07:26.726 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:26 compute-0 ovn_controller[95281]: 2025-11-29T07:07:26Z|00184|binding|INFO|Releasing lport 2a5ced08-2785-4bf9-8fa1-c89240d15794 from this chassis (sb_readonly=0)
Nov 29 07:07:26 compute-0 nova_compute[187185]: 2025-11-29 07:07:26.727 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:26.727 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c8a3c675-42f5-48a4-83d7-2d39dd3304b9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c8a3c675-42f5-48a4-83d7-2d39dd3304b9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:26.738 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d0038809-7610-4ce6-824a-ab857ce24330]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:26.740 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-c8a3c675-42f5-48a4-83d7-2d39dd3304b9
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/c8a3c675-42f5-48a4-83d7-2d39dd3304b9.pid.haproxy
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID c8a3c675-42f5-48a4-83d7-2d39dd3304b9
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:07:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:26.741 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9', 'env', 'PROCESS_TAG=haproxy-c8a3c675-42f5-48a4-83d7-2d39dd3304b9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c8a3c675-42f5-48a4-83d7-2d39dd3304b9.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:07:26 compute-0 nova_compute[187185]: 2025-11-29 07:07:26.741 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:27 compute-0 nova_compute[187185]: 2025-11-29 07:07:27.036 187189 DEBUG nova.compute.manager [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:07:27 compute-0 nova_compute[187185]: 2025-11-29 07:07:27.038 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400047.0360024, 95b8710a-1d4d-4b62-b920-af874de99431 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:07:27 compute-0 nova_compute[187185]: 2025-11-29 07:07:27.038 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] VM Started (Lifecycle Event)
Nov 29 07:07:27 compute-0 nova_compute[187185]: 2025-11-29 07:07:27.050 187189 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 07:07:27 compute-0 nova_compute[187185]: 2025-11-29 07:07:27.054 187189 INFO nova.virt.libvirt.driver [-] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Instance spawned successfully.
Nov 29 07:07:27 compute-0 nova_compute[187185]: 2025-11-29 07:07:27.054 187189 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 07:07:27 compute-0 nova_compute[187185]: 2025-11-29 07:07:27.236 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:07:27 compute-0 nova_compute[187185]: 2025-11-29 07:07:27.241 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:07:27 compute-0 podman[224444]: 2025-11-29 07:07:27.147731636 +0000 UTC m=+0.023776757 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:07:27 compute-0 nova_compute[187185]: 2025-11-29 07:07:27.452 187189 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:07:27 compute-0 nova_compute[187185]: 2025-11-29 07:07:27.453 187189 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:07:27 compute-0 nova_compute[187185]: 2025-11-29 07:07:27.454 187189 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:07:27 compute-0 nova_compute[187185]: 2025-11-29 07:07:27.455 187189 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:07:27 compute-0 nova_compute[187185]: 2025-11-29 07:07:27.455 187189 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:07:27 compute-0 nova_compute[187185]: 2025-11-29 07:07:27.456 187189 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:07:27 compute-0 podman[224444]: 2025-11-29 07:07:27.740027517 +0000 UTC m=+0.616072618 container create 144cfda298b7dd345cf08fe80541316cca11efbece106ca343fb6eb90eea51ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:07:27 compute-0 nova_compute[187185]: 2025-11-29 07:07:27.821 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:07:27 compute-0 nova_compute[187185]: 2025-11-29 07:07:27.822 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400047.0363224, 95b8710a-1d4d-4b62-b920-af874de99431 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:07:27 compute-0 nova_compute[187185]: 2025-11-29 07:07:27.822 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] VM Paused (Lifecycle Event)
Nov 29 07:07:27 compute-0 systemd[1]: Started libpod-conmon-144cfda298b7dd345cf08fe80541316cca11efbece106ca343fb6eb90eea51ee.scope.
Nov 29 07:07:27 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:07:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/65aa7273ab43e6554338123d331af1344e02e0d3973311e8455d424e9fe66be4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:07:28 compute-0 podman[224444]: 2025-11-29 07:07:28.091341854 +0000 UTC m=+0.967386965 container init 144cfda298b7dd345cf08fe80541316cca11efbece106ca343fb6eb90eea51ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Nov 29 07:07:28 compute-0 podman[224444]: 2025-11-29 07:07:28.099968935 +0000 UTC m=+0.976014036 container start 144cfda298b7dd345cf08fe80541316cca11efbece106ca343fb6eb90eea51ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Nov 29 07:07:28 compute-0 neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9[224479]: [NOTICE]   (224491) : New worker (224493) forked
Nov 29 07:07:28 compute-0 neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9[224479]: [NOTICE]   (224491) : Loading success.
Nov 29 07:07:28 compute-0 podman[224456]: 2025-11-29 07:07:28.128854855 +0000 UTC m=+0.389473917 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 29 07:07:28 compute-0 nova_compute[187185]: 2025-11-29 07:07:28.521 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:07:28 compute-0 nova_compute[187185]: 2025-11-29 07:07:28.527 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400047.0410373, 95b8710a-1d4d-4b62-b920-af874de99431 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:07:28 compute-0 nova_compute[187185]: 2025-11-29 07:07:28.527 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] VM Resumed (Lifecycle Event)
Nov 29 07:07:28 compute-0 nova_compute[187185]: 2025-11-29 07:07:28.680 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:07:28 compute-0 nova_compute[187185]: 2025-11-29 07:07:28.685 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:07:28 compute-0 nova_compute[187185]: 2025-11-29 07:07:28.802 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:07:28 compute-0 nova_compute[187185]: 2025-11-29 07:07:28.863 187189 INFO nova.compute.manager [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Took 15.69 seconds to spawn the instance on the hypervisor.
Nov 29 07:07:28 compute-0 nova_compute[187185]: 2025-11-29 07:07:28.864 187189 DEBUG nova.compute.manager [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:07:29 compute-0 nova_compute[187185]: 2025-11-29 07:07:29.548 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:29 compute-0 nova_compute[187185]: 2025-11-29 07:07:29.560 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:29 compute-0 nova_compute[187185]: 2025-11-29 07:07:29.801 187189 INFO nova.compute.manager [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Rebuilding instance
Nov 29 07:07:29 compute-0 nova_compute[187185]: 2025-11-29 07:07:29.806 187189 DEBUG nova.compute.manager [req-bc61dd41-f7d7-4eed-9624-12e6c6c8feaf req-a9b79eca-da82-4166-a48b-aacbd6422a8f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Received event network-vif-plugged-f051fae4-0fdf-4a7d-831f-f3291755af62 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:07:29 compute-0 nova_compute[187185]: 2025-11-29 07:07:29.806 187189 DEBUG oslo_concurrency.lockutils [req-bc61dd41-f7d7-4eed-9624-12e6c6c8feaf req-a9b79eca-da82-4166-a48b-aacbd6422a8f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "95b8710a-1d4d-4b62-b920-af874de99431-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:07:29 compute-0 nova_compute[187185]: 2025-11-29 07:07:29.807 187189 DEBUG oslo_concurrency.lockutils [req-bc61dd41-f7d7-4eed-9624-12e6c6c8feaf req-a9b79eca-da82-4166-a48b-aacbd6422a8f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "95b8710a-1d4d-4b62-b920-af874de99431-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:07:29 compute-0 nova_compute[187185]: 2025-11-29 07:07:29.807 187189 DEBUG oslo_concurrency.lockutils [req-bc61dd41-f7d7-4eed-9624-12e6c6c8feaf req-a9b79eca-da82-4166-a48b-aacbd6422a8f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "95b8710a-1d4d-4b62-b920-af874de99431-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:07:29 compute-0 nova_compute[187185]: 2025-11-29 07:07:29.808 187189 DEBUG nova.compute.manager [req-bc61dd41-f7d7-4eed-9624-12e6c6c8feaf req-a9b79eca-da82-4166-a48b-aacbd6422a8f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] No waiting events found dispatching network-vif-plugged-f051fae4-0fdf-4a7d-831f-f3291755af62 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:07:29 compute-0 nova_compute[187185]: 2025-11-29 07:07:29.808 187189 WARNING nova.compute.manager [req-bc61dd41-f7d7-4eed-9624-12e6c6c8feaf req-a9b79eca-da82-4166-a48b-aacbd6422a8f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Received unexpected event network-vif-plugged-f051fae4-0fdf-4a7d-831f-f3291755af62 for instance with vm_state active and task_state None.
Nov 29 07:07:30 compute-0 nova_compute[187185]: 2025-11-29 07:07:30.901 187189 INFO nova.compute.manager [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Took 18.67 seconds to build instance.
Nov 29 07:07:31 compute-0 nova_compute[187185]: 2025-11-29 07:07:31.504 187189 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "95b8710a-1d4d-4b62-b920-af874de99431" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.409s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:07:31 compute-0 nova_compute[187185]: 2025-11-29 07:07:31.952 187189 DEBUG nova.compute.manager [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:07:32 compute-0 nova_compute[187185]: 2025-11-29 07:07:32.305 187189 DEBUG nova.objects.instance [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Lazy-loading 'pci_requests' on Instance uuid 2f35ca71-193f-457f-b09f-78963a176460 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:07:32 compute-0 nova_compute[187185]: 2025-11-29 07:07:32.587 187189 DEBUG nova.objects.instance [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2f35ca71-193f-457f-b09f-78963a176460 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:07:32 compute-0 nova_compute[187185]: 2025-11-29 07:07:32.609 187189 DEBUG nova.objects.instance [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Lazy-loading 'resources' on Instance uuid 2f35ca71-193f-457f-b09f-78963a176460 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:07:32 compute-0 nova_compute[187185]: 2025-11-29 07:07:32.622 187189 DEBUG nova.objects.instance [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Lazy-loading 'migration_context' on Instance uuid 2f35ca71-193f-457f-b09f-78963a176460 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:07:32 compute-0 podman[224502]: 2025-11-29 07:07:32.788640747 +0000 UTC m=+0.056407752 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 07:07:33 compute-0 nova_compute[187185]: 2025-11-29 07:07:33.268 187189 DEBUG nova.objects.instance [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Nov 29 07:07:33 compute-0 nova_compute[187185]: 2025-11-29 07:07:33.274 187189 DEBUG nova.virt.libvirt.driver [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Nov 29 07:07:34 compute-0 nova_compute[187185]: 2025-11-29 07:07:34.562 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:07:34 compute-0 nova_compute[187185]: 2025-11-29 07:07:34.564 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:07:34 compute-0 nova_compute[187185]: 2025-11-29 07:07:34.565 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 29 07:07:34 compute-0 nova_compute[187185]: 2025-11-29 07:07:34.565 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 29 07:07:34 compute-0 nova_compute[187185]: 2025-11-29 07:07:34.596 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:34 compute-0 nova_compute[187185]: 2025-11-29 07:07:34.597 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 29 07:07:35 compute-0 podman[224538]: 2025-11-29 07:07:35.807303511 +0000 UTC m=+0.067156743 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 29 07:07:37 compute-0 podman[224563]: 2025-11-29 07:07:37.812496621 +0000 UTC m=+0.070622840 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:07:39 compute-0 nova_compute[187185]: 2025-11-29 07:07:39.597 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:39 compute-0 nova_compute[187185]: 2025-11-29 07:07:39.600 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:07:40 compute-0 ovn_controller[95281]: 2025-11-29T07:07:40Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:07:5c:91 10.100.0.12
Nov 29 07:07:40 compute-0 ovn_controller[95281]: 2025-11-29T07:07:40Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:07:5c:91 10.100.0.12
Nov 29 07:07:43 compute-0 nova_compute[187185]: 2025-11-29 07:07:43.324 187189 DEBUG nova.virt.libvirt.driver [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Nov 29 07:07:44 compute-0 nova_compute[187185]: 2025-11-29 07:07:44.600 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:07:45 compute-0 nova_compute[187185]: 2025-11-29 07:07:45.681 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:45.681 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:07:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:45.684 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:07:45 compute-0 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d00000054.scope: Deactivated successfully.
Nov 29 07:07:45 compute-0 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d00000054.scope: Consumed 12.979s CPU time.
Nov 29 07:07:45 compute-0 systemd-machined[153486]: Machine qemu-26-instance-00000054 terminated.
Nov 29 07:07:46 compute-0 nova_compute[187185]: 2025-11-29 07:07:46.341 187189 INFO nova.virt.libvirt.driver [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Instance shutdown successfully after 13 seconds.
Nov 29 07:07:46 compute-0 nova_compute[187185]: 2025-11-29 07:07:46.353 187189 INFO nova.virt.libvirt.driver [-] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Instance destroyed successfully.
Nov 29 07:07:46 compute-0 nova_compute[187185]: 2025-11-29 07:07:46.362 187189 INFO nova.virt.libvirt.driver [-] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Instance destroyed successfully.
Nov 29 07:07:46 compute-0 nova_compute[187185]: 2025-11-29 07:07:46.363 187189 INFO nova.virt.libvirt.driver [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Deleting instance files /var/lib/nova/instances/2f35ca71-193f-457f-b09f-78963a176460_del
Nov 29 07:07:46 compute-0 nova_compute[187185]: 2025-11-29 07:07:46.364 187189 INFO nova.virt.libvirt.driver [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Deletion of /var/lib/nova/instances/2f35ca71-193f-457f-b09f-78963a176460_del complete
Nov 29 07:07:46 compute-0 nova_compute[187185]: 2025-11-29 07:07:46.959 187189 DEBUG nova.virt.libvirt.driver [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 07:07:46 compute-0 nova_compute[187185]: 2025-11-29 07:07:46.960 187189 INFO nova.virt.libvirt.driver [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Creating image(s)
Nov 29 07:07:46 compute-0 nova_compute[187185]: 2025-11-29 07:07:46.960 187189 DEBUG oslo_concurrency.lockutils [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Acquiring lock "/var/lib/nova/instances/2f35ca71-193f-457f-b09f-78963a176460/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:07:46 compute-0 nova_compute[187185]: 2025-11-29 07:07:46.961 187189 DEBUG oslo_concurrency.lockutils [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Lock "/var/lib/nova/instances/2f35ca71-193f-457f-b09f-78963a176460/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:07:46 compute-0 nova_compute[187185]: 2025-11-29 07:07:46.961 187189 DEBUG oslo_concurrency.lockutils [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Lock "/var/lib/nova/instances/2f35ca71-193f-457f-b09f-78963a176460/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:07:46 compute-0 nova_compute[187185]: 2025-11-29 07:07:46.962 187189 DEBUG oslo_concurrency.lockutils [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Acquiring lock "923f30c548f83d073f1130ce28fd6a6debb4b123" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:07:46 compute-0 nova_compute[187185]: 2025-11-29 07:07:46.963 187189 DEBUG oslo_concurrency.lockutils [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Lock "923f30c548f83d073f1130ce28fd6a6debb4b123" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:07:48 compute-0 nova_compute[187185]: 2025-11-29 07:07:48.446 187189 DEBUG oslo_concurrency.processutils [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:07:48 compute-0 nova_compute[187185]: 2025-11-29 07:07:48.529 187189 DEBUG oslo_concurrency.processutils [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123.part --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:07:48 compute-0 nova_compute[187185]: 2025-11-29 07:07:48.530 187189 DEBUG nova.virt.images [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] 3372b7b2-657b-4c4d-9d9d-7c5b771a630a was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Nov 29 07:07:48 compute-0 nova_compute[187185]: 2025-11-29 07:07:48.531 187189 DEBUG nova.privsep.utils [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 29 07:07:48 compute-0 nova_compute[187185]: 2025-11-29 07:07:48.532 187189 DEBUG oslo_concurrency.processutils [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123.part /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:07:48 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:48.688 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:07:48 compute-0 podman[224621]: 2025-11-29 07:07:48.80779733 +0000 UTC m=+0.062026619 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, architecture=x86_64, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Nov 29 07:07:48 compute-0 podman[224622]: 2025-11-29 07:07:48.816226607 +0000 UTC m=+0.063143411 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 07:07:48 compute-0 podman[224620]: 2025-11-29 07:07:48.831944947 +0000 UTC m=+0.087599636 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 07:07:49 compute-0 nova_compute[187185]: 2025-11-29 07:07:49.417 187189 DEBUG oslo_concurrency.processutils [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123.part /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123.converted" returned: 0 in 0.885s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:07:49 compute-0 nova_compute[187185]: 2025-11-29 07:07:49.422 187189 DEBUG oslo_concurrency.processutils [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:07:49 compute-0 nova_compute[187185]: 2025-11-29 07:07:49.541 187189 DEBUG oslo_concurrency.processutils [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123.converted --force-share --output=json" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:07:49 compute-0 nova_compute[187185]: 2025-11-29 07:07:49.542 187189 DEBUG oslo_concurrency.lockutils [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Lock "923f30c548f83d073f1130ce28fd6a6debb4b123" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.579s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:07:49 compute-0 nova_compute[187185]: 2025-11-29 07:07:49.556 187189 DEBUG oslo_concurrency.processutils [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:07:49 compute-0 nova_compute[187185]: 2025-11-29 07:07:49.603 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:49 compute-0 nova_compute[187185]: 2025-11-29 07:07:49.643 187189 DEBUG oslo_concurrency.processutils [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:07:49 compute-0 nova_compute[187185]: 2025-11-29 07:07:49.644 187189 DEBUG oslo_concurrency.lockutils [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Acquiring lock "923f30c548f83d073f1130ce28fd6a6debb4b123" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:07:49 compute-0 nova_compute[187185]: 2025-11-29 07:07:49.645 187189 DEBUG oslo_concurrency.lockutils [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Lock "923f30c548f83d073f1130ce28fd6a6debb4b123" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:07:49 compute-0 nova_compute[187185]: 2025-11-29 07:07:49.656 187189 DEBUG oslo_concurrency.processutils [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:07:49 compute-0 nova_compute[187185]: 2025-11-29 07:07:49.708 187189 DEBUG oslo_concurrency.processutils [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:07:49 compute-0 nova_compute[187185]: 2025-11-29 07:07:49.710 187189 DEBUG oslo_concurrency.processutils [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123,backing_fmt=raw /var/lib/nova/instances/2f35ca71-193f-457f-b09f-78963a176460/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:07:49 compute-0 nova_compute[187185]: 2025-11-29 07:07:49.939 187189 DEBUG oslo_concurrency.processutils [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123,backing_fmt=raw /var/lib/nova/instances/2f35ca71-193f-457f-b09f-78963a176460/disk 1073741824" returned: 0 in 0.230s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:07:49 compute-0 nova_compute[187185]: 2025-11-29 07:07:49.941 187189 DEBUG oslo_concurrency.lockutils [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Lock "923f30c548f83d073f1130ce28fd6a6debb4b123" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.296s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:07:49 compute-0 nova_compute[187185]: 2025-11-29 07:07:49.942 187189 DEBUG oslo_concurrency.processutils [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:07:49 compute-0 nova_compute[187185]: 2025-11-29 07:07:49.998 187189 DEBUG oslo_concurrency.lockutils [None req-7b6b8a42-ec0b-42cc-bc3d-889615309981 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Acquiring lock "95b8710a-1d4d-4b62-b920-af874de99431" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:07:49 compute-0 nova_compute[187185]: 2025-11-29 07:07:49.999 187189 DEBUG oslo_concurrency.lockutils [None req-7b6b8a42-ec0b-42cc-bc3d-889615309981 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "95b8710a-1d4d-4b62-b920-af874de99431" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:49.999 187189 DEBUG oslo_concurrency.lockutils [None req-7b6b8a42-ec0b-42cc-bc3d-889615309981 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Acquiring lock "95b8710a-1d4d-4b62-b920-af874de99431-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.000 187189 DEBUG oslo_concurrency.lockutils [None req-7b6b8a42-ec0b-42cc-bc3d-889615309981 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "95b8710a-1d4d-4b62-b920-af874de99431-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.000 187189 DEBUG oslo_concurrency.lockutils [None req-7b6b8a42-ec0b-42cc-bc3d-889615309981 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "95b8710a-1d4d-4b62-b920-af874de99431-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.017 187189 INFO nova.compute.manager [None req-7b6b8a42-ec0b-42cc-bc3d-889615309981 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Terminating instance
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.034 187189 DEBUG oslo_concurrency.processutils [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.035 187189 DEBUG nova.virt.disk.api [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Checking if we can resize image /var/lib/nova/instances/2f35ca71-193f-457f-b09f-78963a176460/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.035 187189 DEBUG oslo_concurrency.processutils [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2f35ca71-193f-457f-b09f-78963a176460/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.062 187189 DEBUG nova.compute.manager [None req-7b6b8a42-ec0b-42cc-bc3d-889615309981 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 07:07:50 compute-0 kernel: tapf051fae4-0f (unregistering): left promiscuous mode
Nov 29 07:07:50 compute-0 NetworkManager[55227]: <info>  [1764400070.0848] device (tapf051fae4-0f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:07:50 compute-0 ovn_controller[95281]: 2025-11-29T07:07:50Z|00185|binding|INFO|Releasing lport f051fae4-0fdf-4a7d-831f-f3291755af62 from this chassis (sb_readonly=0)
Nov 29 07:07:50 compute-0 ovn_controller[95281]: 2025-11-29T07:07:50Z|00186|binding|INFO|Setting lport f051fae4-0fdf-4a7d-831f-f3291755af62 down in Southbound
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.095 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:50 compute-0 ovn_controller[95281]: 2025-11-29T07:07:50Z|00187|binding|INFO|Removing iface tapf051fae4-0f ovn-installed in OVS
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.115 187189 DEBUG oslo_concurrency.processutils [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2f35ca71-193f-457f-b09f-78963a176460/disk --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.115 187189 DEBUG nova.virt.disk.api [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Cannot resize image /var/lib/nova/instances/2f35ca71-193f-457f-b09f-78963a176460/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.116 187189 DEBUG nova.virt.libvirt.driver [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.116 187189 DEBUG nova.virt.libvirt.driver [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Ensure instance console log exists: /var/lib/nova/instances/2f35ca71-193f-457f-b09f-78963a176460/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.117 187189 DEBUG oslo_concurrency.lockutils [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.117 187189 DEBUG oslo_concurrency.lockutils [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.117 187189 DEBUG oslo_concurrency.lockutils [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:07:50 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:50.119 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:07:5c:91 10.100.0.12'], port_security=['fa:16:3e:07:5c:91 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '95b8710a-1d4d-4b62-b920-af874de99431', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c8a3c675-42f5-48a4-83d7-2d39dd3304b9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '477b89fb35da42f69c15b3f01054754a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9d87673c-3e8a-46ed-9956-50ea661306ab', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=114ba21f-c978-4c05-97f9-429aa66017b7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=f051fae4-0fdf-4a7d-831f-f3291755af62) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.120 187189 DEBUG nova.virt.libvirt.driver [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:09Z,direct_url=<?>,disk_format='qcow2',id=3372b7b2-657b-4c4d-9d9d-7c5b771a630a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:07:50 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:50.120 104254 INFO neutron.agent.ovn.metadata.agent [-] Port f051fae4-0fdf-4a7d-831f-f3291755af62 in datapath c8a3c675-42f5-48a4-83d7-2d39dd3304b9 unbound from our chassis
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.121 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:50 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:50.121 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c8a3c675-42f5-48a4-83d7-2d39dd3304b9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:07:50 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:50.122 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6df5f0c9-3647-4e71-bb0e-356338a599a5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:07:50 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:50.123 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9 namespace which is not needed anymore
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.129 187189 WARNING nova.virt.libvirt.driver [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.140 187189 DEBUG nova.virt.libvirt.host [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.141 187189 DEBUG nova.virt.libvirt.host [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.144 187189 DEBUG nova.virt.libvirt.host [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.144 187189 DEBUG nova.virt.libvirt.host [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.146 187189 DEBUG nova.virt.libvirt.driver [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.146 187189 DEBUG nova.virt.hardware [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:09Z,direct_url=<?>,disk_format='qcow2',id=3372b7b2-657b-4c4d-9d9d-7c5b771a630a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.146 187189 DEBUG nova.virt.hardware [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.146 187189 DEBUG nova.virt.hardware [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.147 187189 DEBUG nova.virt.hardware [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.147 187189 DEBUG nova.virt.hardware [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.147 187189 DEBUG nova.virt.hardware [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.147 187189 DEBUG nova.virt.hardware [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.148 187189 DEBUG nova.virt.hardware [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.148 187189 DEBUG nova.virt.hardware [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.148 187189 DEBUG nova.virt.hardware [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.148 187189 DEBUG nova.virt.hardware [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.149 187189 DEBUG nova.objects.instance [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 2f35ca71-193f-457f-b09f-78963a176460 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:07:50 compute-0 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000052.scope: Deactivated successfully.
Nov 29 07:07:50 compute-0 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000052.scope: Consumed 13.073s CPU time.
Nov 29 07:07:50 compute-0 systemd-machined[153486]: Machine qemu-27-instance-00000052 terminated.
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.175 187189 DEBUG nova.virt.libvirt.driver [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:07:50 compute-0 nova_compute[187185]:   <uuid>2f35ca71-193f-457f-b09f-78963a176460</uuid>
Nov 29 07:07:50 compute-0 nova_compute[187185]:   <name>instance-00000054</name>
Nov 29 07:07:50 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 07:07:50 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:07:50 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:07:50 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:       <nova:name>tempest-ServerShowV257Test-server-1416562793</nova:name>
Nov 29 07:07:50 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:07:50</nova:creationTime>
Nov 29 07:07:50 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 07:07:50 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 07:07:50 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:07:50 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:07:50 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:07:50 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:07:50 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:07:50 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:07:50 compute-0 nova_compute[187185]:         <nova:user uuid="7293fa2b633e4b42af3128c6bcbd5176">tempest-ServerShowV257Test-1776381761-project-member</nova:user>
Nov 29 07:07:50 compute-0 nova_compute[187185]:         <nova:project uuid="618890258b5f40d4a3313b98a94795c7">tempest-ServerShowV257Test-1776381761</nova:project>
Nov 29 07:07:50 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:07:50 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="3372b7b2-657b-4c4d-9d9d-7c5b771a630a"/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:       <nova:ports/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:07:50 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:07:50 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:07:50 compute-0 nova_compute[187185]:     <system>
Nov 29 07:07:50 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:07:50 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:07:50 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:07:50 compute-0 nova_compute[187185]:       <entry name="serial">2f35ca71-193f-457f-b09f-78963a176460</entry>
Nov 29 07:07:50 compute-0 nova_compute[187185]:       <entry name="uuid">2f35ca71-193f-457f-b09f-78963a176460</entry>
Nov 29 07:07:50 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     </system>
Nov 29 07:07:50 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:07:50 compute-0 nova_compute[187185]:   <os>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:   </os>
Nov 29 07:07:50 compute-0 nova_compute[187185]:   <features>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:   </features>
Nov 29 07:07:50 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:07:50 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:07:50 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:07:50 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:07:50 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:07:50 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/2f35ca71-193f-457f-b09f-78963a176460/disk"/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:07:50 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/2f35ca71-193f-457f-b09f-78963a176460/disk.config"/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:07:50 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/2f35ca71-193f-457f-b09f-78963a176460/console.log" append="off"/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     <video>
Nov 29 07:07:50 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     </video>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:07:50 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:07:50 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:07:50 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:07:50 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:07:50 compute-0 nova_compute[187185]: </domain>
Nov 29 07:07:50 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.329 187189 INFO nova.virt.libvirt.driver [-] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Instance destroyed successfully.
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.330 187189 DEBUG nova.objects.instance [None req-7b6b8a42-ec0b-42cc-bc3d-889615309981 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lazy-loading 'resources' on Instance uuid 95b8710a-1d4d-4b62-b920-af874de99431 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.353 187189 DEBUG nova.compute.manager [req-2b77f676-877f-422c-8376-bfe489032ba4 req-939f1cb9-c31d-4093-8306-9339978bc5f8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Received event network-vif-unplugged-f051fae4-0fdf-4a7d-831f-f3291755af62 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.354 187189 DEBUG oslo_concurrency.lockutils [req-2b77f676-877f-422c-8376-bfe489032ba4 req-939f1cb9-c31d-4093-8306-9339978bc5f8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "95b8710a-1d4d-4b62-b920-af874de99431-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.354 187189 DEBUG oslo_concurrency.lockutils [req-2b77f676-877f-422c-8376-bfe489032ba4 req-939f1cb9-c31d-4093-8306-9339978bc5f8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "95b8710a-1d4d-4b62-b920-af874de99431-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.354 187189 DEBUG oslo_concurrency.lockutils [req-2b77f676-877f-422c-8376-bfe489032ba4 req-939f1cb9-c31d-4093-8306-9339978bc5f8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "95b8710a-1d4d-4b62-b920-af874de99431-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.354 187189 DEBUG nova.compute.manager [req-2b77f676-877f-422c-8376-bfe489032ba4 req-939f1cb9-c31d-4093-8306-9339978bc5f8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] No waiting events found dispatching network-vif-unplugged-f051fae4-0fdf-4a7d-831f-f3291755af62 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.354 187189 DEBUG nova.compute.manager [req-2b77f676-877f-422c-8376-bfe489032ba4 req-939f1cb9-c31d-4093-8306-9339978bc5f8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Received event network-vif-unplugged-f051fae4-0fdf-4a7d-831f-f3291755af62 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.358 187189 DEBUG nova.virt.libvirt.vif [None req-7b6b8a42-ec0b-42cc-bc3d-889615309981 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:07:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-1009215778',display_name='tempest-ListServersNegativeTestJSON-server-1009215778-2',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-1009215778-2',id=82,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=1,launched_at=2025-11-29T07:07:28Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='477b89fb35da42f69c15b3f01054754a',ramdisk_id='',reservation_id='r-fs8srmd1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServersNegativeTestJSON-316367608',owner_user_name='tempest-ListServersNegativeTestJSON-316367608-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:07:28Z,user_data=None,user_id='3c9a3fa9f480479d98f522f6f02870fb',uuid=95b8710a-1d4d-4b62-b920-af874de99431,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f051fae4-0fdf-4a7d-831f-f3291755af62", "address": "fa:16:3e:07:5c:91", "network": {"id": "c8a3c675-42f5-48a4-83d7-2d39dd3304b9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1711984379-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "477b89fb35da42f69c15b3f01054754a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf051fae4-0f", "ovs_interfaceid": "f051fae4-0fdf-4a7d-831f-f3291755af62", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.358 187189 DEBUG nova.network.os_vif_util [None req-7b6b8a42-ec0b-42cc-bc3d-889615309981 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Converting VIF {"id": "f051fae4-0fdf-4a7d-831f-f3291755af62", "address": "fa:16:3e:07:5c:91", "network": {"id": "c8a3c675-42f5-48a4-83d7-2d39dd3304b9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1711984379-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "477b89fb35da42f69c15b3f01054754a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf051fae4-0f", "ovs_interfaceid": "f051fae4-0fdf-4a7d-831f-f3291755af62", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.359 187189 DEBUG nova.network.os_vif_util [None req-7b6b8a42-ec0b-42cc-bc3d-889615309981 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:07:5c:91,bridge_name='br-int',has_traffic_filtering=True,id=f051fae4-0fdf-4a7d-831f-f3291755af62,network=Network(c8a3c675-42f5-48a4-83d7-2d39dd3304b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf051fae4-0f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.359 187189 DEBUG os_vif [None req-7b6b8a42-ec0b-42cc-bc3d-889615309981 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:07:5c:91,bridge_name='br-int',has_traffic_filtering=True,id=f051fae4-0fdf-4a7d-831f-f3291755af62,network=Network(c8a3c675-42f5-48a4-83d7-2d39dd3304b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf051fae4-0f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.361 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.361 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf051fae4-0f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.363 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.364 187189 DEBUG nova.virt.libvirt.driver [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.365 187189 DEBUG nova.virt.libvirt.driver [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.365 187189 INFO nova.virt.libvirt.driver [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Using config drive
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.367 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.370 187189 INFO os_vif [None req-7b6b8a42-ec0b-42cc-bc3d-889615309981 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:07:5c:91,bridge_name='br-int',has_traffic_filtering=True,id=f051fae4-0fdf-4a7d-831f-f3291755af62,network=Network(c8a3c675-42f5-48a4-83d7-2d39dd3304b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf051fae4-0f')
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.371 187189 INFO nova.virt.libvirt.driver [None req-7b6b8a42-ec0b-42cc-bc3d-889615309981 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Deleting instance files /var/lib/nova/instances/95b8710a-1d4d-4b62-b920-af874de99431_del
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.372 187189 INFO nova.virt.libvirt.driver [None req-7b6b8a42-ec0b-42cc-bc3d-889615309981 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Deletion of /var/lib/nova/instances/95b8710a-1d4d-4b62-b920-af874de99431_del complete
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.396 187189 DEBUG nova.objects.instance [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 2f35ca71-193f-457f-b09f-78963a176460 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:07:50 compute-0 neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9[224479]: [NOTICE]   (224491) : haproxy version is 2.8.14-c23fe91
Nov 29 07:07:50 compute-0 neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9[224479]: [NOTICE]   (224491) : path to executable is /usr/sbin/haproxy
Nov 29 07:07:50 compute-0 neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9[224479]: [WARNING]  (224491) : Exiting Master process...
Nov 29 07:07:50 compute-0 neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9[224479]: [WARNING]  (224491) : Exiting Master process...
Nov 29 07:07:50 compute-0 neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9[224479]: [ALERT]    (224491) : Current worker (224493) exited with code 143 (Terminated)
Nov 29 07:07:50 compute-0 neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9[224479]: [WARNING]  (224491) : All workers exited. Exiting... (0)
Nov 29 07:07:50 compute-0 systemd[1]: libpod-144cfda298b7dd345cf08fe80541316cca11efbece106ca343fb6eb90eea51ee.scope: Deactivated successfully.
Nov 29 07:07:50 compute-0 podman[224723]: 2025-11-29 07:07:50.446348805 +0000 UTC m=+0.223988969 container died 144cfda298b7dd345cf08fe80541316cca11efbece106ca343fb6eb90eea51ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.446 187189 DEBUG nova.objects.instance [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Lazy-loading 'keypairs' on Instance uuid 2f35ca71-193f-457f-b09f-78963a176460 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.483 187189 INFO nova.compute.manager [None req-7b6b8a42-ec0b-42cc-bc3d-889615309981 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Took 0.42 seconds to destroy the instance on the hypervisor.
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.484 187189 DEBUG oslo.service.loopingcall [None req-7b6b8a42-ec0b-42cc-bc3d-889615309981 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.486 187189 DEBUG nova.compute.manager [-] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.487 187189 DEBUG nova.network.neutron [-] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 07:07:50 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-144cfda298b7dd345cf08fe80541316cca11efbece106ca343fb6eb90eea51ee-userdata-shm.mount: Deactivated successfully.
Nov 29 07:07:50 compute-0 systemd[1]: var-lib-containers-storage-overlay-65aa7273ab43e6554338123d331af1344e02e0d3973311e8455d424e9fe66be4-merged.mount: Deactivated successfully.
Nov 29 07:07:50 compute-0 podman[224723]: 2025-11-29 07:07:50.650344463 +0000 UTC m=+0.427984657 container cleanup 144cfda298b7dd345cf08fe80541316cca11efbece106ca343fb6eb90eea51ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:07:50 compute-0 systemd[1]: libpod-conmon-144cfda298b7dd345cf08fe80541316cca11efbece106ca343fb6eb90eea51ee.scope: Deactivated successfully.
Nov 29 07:07:50 compute-0 podman[224770]: 2025-11-29 07:07:50.755066948 +0000 UTC m=+0.080938640 container remove 144cfda298b7dd345cf08fe80541316cca11efbece106ca343fb6eb90eea51ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 29 07:07:50 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:50.761 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[906af74f-cc8e-44c1-81e1-865da59cd7bc]: (4, ('Sat Nov 29 07:07:50 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9 (144cfda298b7dd345cf08fe80541316cca11efbece106ca343fb6eb90eea51ee)\n144cfda298b7dd345cf08fe80541316cca11efbece106ca343fb6eb90eea51ee\nSat Nov 29 07:07:50 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9 (144cfda298b7dd345cf08fe80541316cca11efbece106ca343fb6eb90eea51ee)\n144cfda298b7dd345cf08fe80541316cca11efbece106ca343fb6eb90eea51ee\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:07:50 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:50.763 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[266255ee-0fa0-4f5a-bace-bca499e26f4a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:07:50 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:50.764 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc8a3c675-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.768 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:50 compute-0 kernel: tapc8a3c675-40: left promiscuous mode
Nov 29 07:07:50 compute-0 nova_compute[187185]: 2025-11-29 07:07:50.793 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:50 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:50.796 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[9fa15331-6f36-49f2-86f4-06603adb52a7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:07:50 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:50.817 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[59c2a937-5de7-4963-bfe1-d7a0504755a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:07:50 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:50.819 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[69b34705-cab8-4b56-b3b6-34b6100ea1a4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:07:50 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:50.833 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[9fda3e86-5fba-4265-8a6a-749ac01f0979]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 549412, 'reachable_time': 30076, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224785, 'error': None, 'target': 'ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:07:50 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:50.836 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:07:50 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:07:50.836 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[0b515f55-d5a0-47ea-a9de-d12582ec54d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:07:50 compute-0 systemd[1]: run-netns-ovnmeta\x2dc8a3c675\x2d42f5\x2d48a4\x2d83d7\x2d2d39dd3304b9.mount: Deactivated successfully.
Nov 29 07:07:51 compute-0 nova_compute[187185]: 2025-11-29 07:07:51.062 187189 INFO nova.virt.libvirt.driver [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Creating config drive at /var/lib/nova/instances/2f35ca71-193f-457f-b09f-78963a176460/disk.config
Nov 29 07:07:51 compute-0 nova_compute[187185]: 2025-11-29 07:07:51.070 187189 DEBUG oslo_concurrency.processutils [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2f35ca71-193f-457f-b09f-78963a176460/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkmr94h9a execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:07:51 compute-0 nova_compute[187185]: 2025-11-29 07:07:51.198 187189 DEBUG oslo_concurrency.processutils [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2f35ca71-193f-457f-b09f-78963a176460/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkmr94h9a" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:07:51 compute-0 systemd-machined[153486]: New machine qemu-28-instance-00000054.
Nov 29 07:07:51 compute-0 systemd[1]: Started Virtual Machine qemu-28-instance-00000054.
Nov 29 07:07:51 compute-0 nova_compute[187185]: 2025-11-29 07:07:51.458 187189 DEBUG nova.network.neutron [-] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:07:51 compute-0 nova_compute[187185]: 2025-11-29 07:07:51.480 187189 INFO nova.compute.manager [-] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Took 0.99 seconds to deallocate network for instance.
Nov 29 07:07:51 compute-0 nova_compute[187185]: 2025-11-29 07:07:51.579 187189 DEBUG oslo_concurrency.lockutils [None req-7b6b8a42-ec0b-42cc-bc3d-889615309981 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:07:51 compute-0 nova_compute[187185]: 2025-11-29 07:07:51.579 187189 DEBUG oslo_concurrency.lockutils [None req-7b6b8a42-ec0b-42cc-bc3d-889615309981 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:07:51 compute-0 nova_compute[187185]: 2025-11-29 07:07:51.649 187189 DEBUG nova.virt.libvirt.host [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Removed pending event for 2f35ca71-193f-457f-b09f-78963a176460 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Nov 29 07:07:51 compute-0 nova_compute[187185]: 2025-11-29 07:07:51.650 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400071.6483536, 2f35ca71-193f-457f-b09f-78963a176460 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:07:51 compute-0 nova_compute[187185]: 2025-11-29 07:07:51.650 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 2f35ca71-193f-457f-b09f-78963a176460] VM Resumed (Lifecycle Event)
Nov 29 07:07:51 compute-0 nova_compute[187185]: 2025-11-29 07:07:51.654 187189 DEBUG nova.compute.manager [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:07:51 compute-0 nova_compute[187185]: 2025-11-29 07:07:51.655 187189 DEBUG nova.virt.libvirt.driver [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 07:07:51 compute-0 nova_compute[187185]: 2025-11-29 07:07:51.662 187189 DEBUG nova.compute.provider_tree [None req-7b6b8a42-ec0b-42cc-bc3d-889615309981 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:07:51 compute-0 nova_compute[187185]: 2025-11-29 07:07:51.667 187189 INFO nova.virt.libvirt.driver [-] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Instance spawned successfully.
Nov 29 07:07:51 compute-0 nova_compute[187185]: 2025-11-29 07:07:51.668 187189 DEBUG nova.virt.libvirt.driver [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 07:07:51 compute-0 nova_compute[187185]: 2025-11-29 07:07:51.695 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:07:51 compute-0 nova_compute[187185]: 2025-11-29 07:07:51.700 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:07:51 compute-0 nova_compute[187185]: 2025-11-29 07:07:51.895 187189 DEBUG nova.scheduler.client.report [None req-7b6b8a42-ec0b-42cc-bc3d-889615309981 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:07:51 compute-0 nova_compute[187185]: 2025-11-29 07:07:51.905 187189 DEBUG nova.virt.libvirt.driver [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:07:51 compute-0 nova_compute[187185]: 2025-11-29 07:07:51.906 187189 DEBUG nova.virt.libvirt.driver [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:07:51 compute-0 nova_compute[187185]: 2025-11-29 07:07:51.906 187189 DEBUG nova.virt.libvirt.driver [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:07:51 compute-0 nova_compute[187185]: 2025-11-29 07:07:51.907 187189 DEBUG nova.virt.libvirt.driver [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:07:51 compute-0 nova_compute[187185]: 2025-11-29 07:07:51.908 187189 DEBUG nova.virt.libvirt.driver [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:07:51 compute-0 nova_compute[187185]: 2025-11-29 07:07:51.908 187189 DEBUG nova.virt.libvirt.driver [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:07:51 compute-0 nova_compute[187185]: 2025-11-29 07:07:51.937 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 2f35ca71-193f-457f-b09f-78963a176460] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Nov 29 07:07:51 compute-0 nova_compute[187185]: 2025-11-29 07:07:51.938 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400071.649943, 2f35ca71-193f-457f-b09f-78963a176460 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:07:51 compute-0 nova_compute[187185]: 2025-11-29 07:07:51.938 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 2f35ca71-193f-457f-b09f-78963a176460] VM Started (Lifecycle Event)
Nov 29 07:07:51 compute-0 nova_compute[187185]: 2025-11-29 07:07:51.963 187189 DEBUG oslo_concurrency.lockutils [None req-7b6b8a42-ec0b-42cc-bc3d-889615309981 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.384s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:07:51 compute-0 nova_compute[187185]: 2025-11-29 07:07:51.982 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:07:51 compute-0 nova_compute[187185]: 2025-11-29 07:07:51.985 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:07:52 compute-0 nova_compute[187185]: 2025-11-29 07:07:52.000 187189 INFO nova.scheduler.client.report [None req-7b6b8a42-ec0b-42cc-bc3d-889615309981 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Deleted allocations for instance 95b8710a-1d4d-4b62-b920-af874de99431
Nov 29 07:07:52 compute-0 nova_compute[187185]: 2025-11-29 07:07:52.030 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 2f35ca71-193f-457f-b09f-78963a176460] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Nov 29 07:07:52 compute-0 nova_compute[187185]: 2025-11-29 07:07:52.047 187189 DEBUG nova.compute.manager [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:07:52 compute-0 nova_compute[187185]: 2025-11-29 07:07:52.054 187189 DEBUG nova.compute.manager [req-61f726c7-ce15-460c-a6df-17d5d3a42076 req-45a71cc5-ab94-4058-9d06-3be7647bed9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Received event network-vif-deleted-f051fae4-0fdf-4a7d-831f-f3291755af62 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:07:52 compute-0 nova_compute[187185]: 2025-11-29 07:07:52.143 187189 DEBUG oslo_concurrency.lockutils [None req-7b6b8a42-ec0b-42cc-bc3d-889615309981 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "95b8710a-1d4d-4b62-b920-af874de99431" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:07:52 compute-0 nova_compute[187185]: 2025-11-29 07:07:52.174 187189 DEBUG oslo_concurrency.lockutils [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:07:52 compute-0 nova_compute[187185]: 2025-11-29 07:07:52.175 187189 DEBUG oslo_concurrency.lockutils [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:07:52 compute-0 nova_compute[187185]: 2025-11-29 07:07:52.175 187189 DEBUG nova.objects.instance [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Nov 29 07:07:52 compute-0 nova_compute[187185]: 2025-11-29 07:07:52.281 187189 DEBUG oslo_concurrency.lockutils [None req-7f0c8488-539f-4c17-bac5-08adf44ffa66 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:07:52 compute-0 nova_compute[187185]: 2025-11-29 07:07:52.444 187189 DEBUG nova.compute.manager [req-9bb8030f-ecf1-4b82-9a2a-17f55e4b1a56 req-1c6f8b75-5cc1-4180-a2b3-b43fe0c67de8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Received event network-vif-plugged-f051fae4-0fdf-4a7d-831f-f3291755af62 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:07:52 compute-0 nova_compute[187185]: 2025-11-29 07:07:52.445 187189 DEBUG oslo_concurrency.lockutils [req-9bb8030f-ecf1-4b82-9a2a-17f55e4b1a56 req-1c6f8b75-5cc1-4180-a2b3-b43fe0c67de8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "95b8710a-1d4d-4b62-b920-af874de99431-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:07:52 compute-0 nova_compute[187185]: 2025-11-29 07:07:52.445 187189 DEBUG oslo_concurrency.lockutils [req-9bb8030f-ecf1-4b82-9a2a-17f55e4b1a56 req-1c6f8b75-5cc1-4180-a2b3-b43fe0c67de8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "95b8710a-1d4d-4b62-b920-af874de99431-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:07:52 compute-0 nova_compute[187185]: 2025-11-29 07:07:52.445 187189 DEBUG oslo_concurrency.lockutils [req-9bb8030f-ecf1-4b82-9a2a-17f55e4b1a56 req-1c6f8b75-5cc1-4180-a2b3-b43fe0c67de8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "95b8710a-1d4d-4b62-b920-af874de99431-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:07:52 compute-0 nova_compute[187185]: 2025-11-29 07:07:52.446 187189 DEBUG nova.compute.manager [req-9bb8030f-ecf1-4b82-9a2a-17f55e4b1a56 req-1c6f8b75-5cc1-4180-a2b3-b43fe0c67de8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] No waiting events found dispatching network-vif-plugged-f051fae4-0fdf-4a7d-831f-f3291755af62 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:07:52 compute-0 nova_compute[187185]: 2025-11-29 07:07:52.446 187189 WARNING nova.compute.manager [req-9bb8030f-ecf1-4b82-9a2a-17f55e4b1a56 req-1c6f8b75-5cc1-4180-a2b3-b43fe0c67de8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Received unexpected event network-vif-plugged-f051fae4-0fdf-4a7d-831f-f3291755af62 for instance with vm_state deleted and task_state None.
Nov 29 07:07:53 compute-0 nova_compute[187185]: 2025-11-29 07:07:53.752 187189 DEBUG oslo_concurrency.lockutils [None req-8bfb0d56-8b80-4cc6-8b27-b4c116edfa11 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Acquiring lock "2f35ca71-193f-457f-b09f-78963a176460" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:07:53 compute-0 nova_compute[187185]: 2025-11-29 07:07:53.752 187189 DEBUG oslo_concurrency.lockutils [None req-8bfb0d56-8b80-4cc6-8b27-b4c116edfa11 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Lock "2f35ca71-193f-457f-b09f-78963a176460" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:07:53 compute-0 nova_compute[187185]: 2025-11-29 07:07:53.753 187189 DEBUG oslo_concurrency.lockutils [None req-8bfb0d56-8b80-4cc6-8b27-b4c116edfa11 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Acquiring lock "2f35ca71-193f-457f-b09f-78963a176460-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:07:53 compute-0 nova_compute[187185]: 2025-11-29 07:07:53.753 187189 DEBUG oslo_concurrency.lockutils [None req-8bfb0d56-8b80-4cc6-8b27-b4c116edfa11 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Lock "2f35ca71-193f-457f-b09f-78963a176460-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:07:53 compute-0 nova_compute[187185]: 2025-11-29 07:07:53.753 187189 DEBUG oslo_concurrency.lockutils [None req-8bfb0d56-8b80-4cc6-8b27-b4c116edfa11 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Lock "2f35ca71-193f-457f-b09f-78963a176460-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:07:53 compute-0 nova_compute[187185]: 2025-11-29 07:07:53.768 187189 INFO nova.compute.manager [None req-8bfb0d56-8b80-4cc6-8b27-b4c116edfa11 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Terminating instance
Nov 29 07:07:53 compute-0 nova_compute[187185]: 2025-11-29 07:07:53.779 187189 DEBUG oslo_concurrency.lockutils [None req-8bfb0d56-8b80-4cc6-8b27-b4c116edfa11 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Acquiring lock "refresh_cache-2f35ca71-193f-457f-b09f-78963a176460" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:07:53 compute-0 nova_compute[187185]: 2025-11-29 07:07:53.780 187189 DEBUG oslo_concurrency.lockutils [None req-8bfb0d56-8b80-4cc6-8b27-b4c116edfa11 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Acquired lock "refresh_cache-2f35ca71-193f-457f-b09f-78963a176460" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:07:53 compute-0 nova_compute[187185]: 2025-11-29 07:07:53.780 187189 DEBUG nova.network.neutron [None req-8bfb0d56-8b80-4cc6-8b27-b4c116edfa11 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:07:54 compute-0 nova_compute[187185]: 2025-11-29 07:07:54.328 187189 DEBUG nova.network.neutron [None req-8bfb0d56-8b80-4cc6-8b27-b4c116edfa11 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:07:54 compute-0 nova_compute[187185]: 2025-11-29 07:07:54.591 187189 DEBUG nova.network.neutron [None req-8bfb0d56-8b80-4cc6-8b27-b4c116edfa11 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:07:54 compute-0 nova_compute[187185]: 2025-11-29 07:07:54.605 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:54 compute-0 nova_compute[187185]: 2025-11-29 07:07:54.618 187189 DEBUG oslo_concurrency.lockutils [None req-8bfb0d56-8b80-4cc6-8b27-b4c116edfa11 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Releasing lock "refresh_cache-2f35ca71-193f-457f-b09f-78963a176460" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:07:54 compute-0 nova_compute[187185]: 2025-11-29 07:07:54.619 187189 DEBUG nova.compute.manager [None req-8bfb0d56-8b80-4cc6-8b27-b4c116edfa11 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 07:07:54 compute-0 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d00000054.scope: Deactivated successfully.
Nov 29 07:07:54 compute-0 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d00000054.scope: Consumed 3.343s CPU time.
Nov 29 07:07:54 compute-0 systemd-machined[153486]: Machine qemu-28-instance-00000054 terminated.
Nov 29 07:07:54 compute-0 nova_compute[187185]: 2025-11-29 07:07:54.882 187189 INFO nova.virt.libvirt.driver [-] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Instance destroyed successfully.
Nov 29 07:07:54 compute-0 nova_compute[187185]: 2025-11-29 07:07:54.882 187189 DEBUG nova.objects.instance [None req-8bfb0d56-8b80-4cc6-8b27-b4c116edfa11 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Lazy-loading 'resources' on Instance uuid 2f35ca71-193f-457f-b09f-78963a176460 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:07:54 compute-0 nova_compute[187185]: 2025-11-29 07:07:54.908 187189 INFO nova.virt.libvirt.driver [None req-8bfb0d56-8b80-4cc6-8b27-b4c116edfa11 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Deleting instance files /var/lib/nova/instances/2f35ca71-193f-457f-b09f-78963a176460_del
Nov 29 07:07:54 compute-0 nova_compute[187185]: 2025-11-29 07:07:54.908 187189 INFO nova.virt.libvirt.driver [None req-8bfb0d56-8b80-4cc6-8b27-b4c116edfa11 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Deletion of /var/lib/nova/instances/2f35ca71-193f-457f-b09f-78963a176460_del complete
Nov 29 07:07:55 compute-0 nova_compute[187185]: 2025-11-29 07:07:55.013 187189 INFO nova.compute.manager [None req-8bfb0d56-8b80-4cc6-8b27-b4c116edfa11 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Took 0.39 seconds to destroy the instance on the hypervisor.
Nov 29 07:07:55 compute-0 nova_compute[187185]: 2025-11-29 07:07:55.013 187189 DEBUG oslo.service.loopingcall [None req-8bfb0d56-8b80-4cc6-8b27-b4c116edfa11 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 07:07:55 compute-0 nova_compute[187185]: 2025-11-29 07:07:55.013 187189 DEBUG nova.compute.manager [-] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 07:07:55 compute-0 nova_compute[187185]: 2025-11-29 07:07:55.014 187189 DEBUG nova.network.neutron [-] [instance: 2f35ca71-193f-457f-b09f-78963a176460] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 07:07:55 compute-0 nova_compute[187185]: 2025-11-29 07:07:55.329 187189 DEBUG nova.network.neutron [-] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:07:55 compute-0 nova_compute[187185]: 2025-11-29 07:07:55.360 187189 DEBUG nova.network.neutron [-] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:07:55 compute-0 nova_compute[187185]: 2025-11-29 07:07:55.364 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:55 compute-0 nova_compute[187185]: 2025-11-29 07:07:55.375 187189 INFO nova.compute.manager [-] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Took 0.36 seconds to deallocate network for instance.
Nov 29 07:07:55 compute-0 nova_compute[187185]: 2025-11-29 07:07:55.477 187189 DEBUG oslo_concurrency.lockutils [None req-8bfb0d56-8b80-4cc6-8b27-b4c116edfa11 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:07:55 compute-0 nova_compute[187185]: 2025-11-29 07:07:55.477 187189 DEBUG oslo_concurrency.lockutils [None req-8bfb0d56-8b80-4cc6-8b27-b4c116edfa11 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:07:55 compute-0 nova_compute[187185]: 2025-11-29 07:07:55.537 187189 DEBUG nova.compute.provider_tree [None req-8bfb0d56-8b80-4cc6-8b27-b4c116edfa11 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:07:55 compute-0 nova_compute[187185]: 2025-11-29 07:07:55.580 187189 DEBUG nova.scheduler.client.report [None req-8bfb0d56-8b80-4cc6-8b27-b4c116edfa11 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:07:55 compute-0 nova_compute[187185]: 2025-11-29 07:07:55.739 187189 DEBUG oslo_concurrency.lockutils [None req-8bfb0d56-8b80-4cc6-8b27-b4c116edfa11 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.261s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:07:55 compute-0 nova_compute[187185]: 2025-11-29 07:07:55.790 187189 INFO nova.scheduler.client.report [None req-8bfb0d56-8b80-4cc6-8b27-b4c116edfa11 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Deleted allocations for instance 2f35ca71-193f-457f-b09f-78963a176460
Nov 29 07:07:56 compute-0 nova_compute[187185]: 2025-11-29 07:07:56.044 187189 DEBUG oslo_concurrency.lockutils [None req-8bfb0d56-8b80-4cc6-8b27-b4c116edfa11 7293fa2b633e4b42af3128c6bcbd5176 618890258b5f40d4a3313b98a94795c7 - - default default] Lock "2f35ca71-193f-457f-b09f-78963a176460" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.292s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:07:57 compute-0 nova_compute[187185]: 2025-11-29 07:07:57.094 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:07:58 compute-0 nova_compute[187185]: 2025-11-29 07:07:58.822 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:07:58 compute-0 nova_compute[187185]: 2025-11-29 07:07:58.823 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:07:58 compute-0 podman[224821]: 2025-11-29 07:07:58.887037093 +0000 UTC m=+0.146341341 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 07:07:58 compute-0 nova_compute[187185]: 2025-11-29 07:07:58.930 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 07:07:58 compute-0 nova_compute[187185]: 2025-11-29 07:07:58.930 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:07:58 compute-0 nova_compute[187185]: 2025-11-29 07:07:58.931 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:07:58 compute-0 nova_compute[187185]: 2025-11-29 07:07:58.931 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:07:59 compute-0 nova_compute[187185]: 2025-11-29 07:07:59.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:07:59 compute-0 nova_compute[187185]: 2025-11-29 07:07:59.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:07:59 compute-0 nova_compute[187185]: 2025-11-29 07:07:59.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:07:59 compute-0 nova_compute[187185]: 2025-11-29 07:07:59.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:07:59 compute-0 nova_compute[187185]: 2025-11-29 07:07:59.607 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:08:00 compute-0 nova_compute[187185]: 2025-11-29 07:08:00.310 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:08:00 compute-0 nova_compute[187185]: 2025-11-29 07:08:00.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:08:00 compute-0 nova_compute[187185]: 2025-11-29 07:08:00.367 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:08:00 compute-0 nova_compute[187185]: 2025-11-29 07:08:00.387 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:08:00 compute-0 nova_compute[187185]: 2025-11-29 07:08:00.388 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:08:00 compute-0 nova_compute[187185]: 2025-11-29 07:08:00.389 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:08:00 compute-0 nova_compute[187185]: 2025-11-29 07:08:00.389 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:08:00 compute-0 nova_compute[187185]: 2025-11-29 07:08:00.571 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:08:00 compute-0 nova_compute[187185]: 2025-11-29 07:08:00.573 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5686MB free_disk=73.29670333862305GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:08:00 compute-0 nova_compute[187185]: 2025-11-29 07:08:00.573 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:08:00 compute-0 nova_compute[187185]: 2025-11-29 07:08:00.574 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:08:01 compute-0 nova_compute[187185]: 2025-11-29 07:08:01.226 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:08:01 compute-0 nova_compute[187185]: 2025-11-29 07:08:01.227 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:08:01 compute-0 nova_compute[187185]: 2025-11-29 07:08:01.443 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:08:01 compute-0 nova_compute[187185]: 2025-11-29 07:08:01.504 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:08:01 compute-0 nova_compute[187185]: 2025-11-29 07:08:01.588 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:08:01 compute-0 nova_compute[187185]: 2025-11-29 07:08:01.589 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.015s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:08:02 compute-0 nova_compute[187185]: 2025-11-29 07:08:02.585 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:08:03 compute-0 podman[224848]: 2025-11-29 07:08:03.816625597 +0000 UTC m=+0.068930993 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 07:08:04 compute-0 nova_compute[187185]: 2025-11-29 07:08:04.609 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:08:05 compute-0 nova_compute[187185]: 2025-11-29 07:08:05.328 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400070.3274357, 95b8710a-1d4d-4b62-b920-af874de99431 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:08:05 compute-0 nova_compute[187185]: 2025-11-29 07:08:05.329 187189 INFO nova.compute.manager [-] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] VM Stopped (Lifecycle Event)
Nov 29 07:08:05 compute-0 nova_compute[187185]: 2025-11-29 07:08:05.392 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:08:05 compute-0 nova_compute[187185]: 2025-11-29 07:08:05.430 187189 DEBUG nova.compute.manager [None req-3812b24a-9e3e-4d1e-a772-5ba2fa6e0b1a - - - - - -] [instance: 95b8710a-1d4d-4b62-b920-af874de99431] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:08:06 compute-0 podman[224872]: 2025-11-29 07:08:06.824910741 +0000 UTC m=+0.087758961 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Nov 29 07:08:08 compute-0 podman[224892]: 2025-11-29 07:08:08.79936517 +0000 UTC m=+0.061314719 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 07:08:09 compute-0 nova_compute[187185]: 2025-11-29 07:08:09.611 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:08:09 compute-0 nova_compute[187185]: 2025-11-29 07:08:09.878 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400074.8769877, 2f35ca71-193f-457f-b09f-78963a176460 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:08:09 compute-0 nova_compute[187185]: 2025-11-29 07:08:09.879 187189 INFO nova.compute.manager [-] [instance: 2f35ca71-193f-457f-b09f-78963a176460] VM Stopped (Lifecycle Event)
Nov 29 07:08:09 compute-0 nova_compute[187185]: 2025-11-29 07:08:09.913 187189 DEBUG nova.compute.manager [None req-882ef736-38d7-401d-ad28-75d650e7798f - - - - - -] [instance: 2f35ca71-193f-457f-b09f-78963a176460] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:08:10 compute-0 nova_compute[187185]: 2025-11-29 07:08:10.394 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:08:14 compute-0 nova_compute[187185]: 2025-11-29 07:08:14.616 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:08:15 compute-0 nova_compute[187185]: 2025-11-29 07:08:15.398 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:08:19 compute-0 nova_compute[187185]: 2025-11-29 07:08:19.617 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:08:19 compute-0 podman[224914]: 2025-11-29 07:08:19.803624978 +0000 UTC m=+0.065139086 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent)
Nov 29 07:08:19 compute-0 podman[224916]: 2025-11-29 07:08:19.805095169 +0000 UTC m=+0.050827245 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 07:08:19 compute-0 podman[224915]: 2025-11-29 07:08:19.820792889 +0000 UTC m=+0.075861987 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, distribution-scope=public, config_id=edpm, vcs-type=git, version=9.6, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter)
Nov 29 07:08:20 compute-0 nova_compute[187185]: 2025-11-29 07:08:20.401 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:08:24 compute-0 nova_compute[187185]: 2025-11-29 07:08:24.619 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:08:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:24.827 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:08:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:24.828 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:08:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:24.828 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:08:25 compute-0 nova_compute[187185]: 2025-11-29 07:08:25.455 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:08:29 compute-0 nova_compute[187185]: 2025-11-29 07:08:29.098 187189 DEBUG oslo_concurrency.lockutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Acquiring lock "5ea1413a-4aa3-4cf3-9d93-21b138ed8739" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:08:29 compute-0 nova_compute[187185]: 2025-11-29 07:08:29.099 187189 DEBUG oslo_concurrency.lockutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "5ea1413a-4aa3-4cf3-9d93-21b138ed8739" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:08:29 compute-0 nova_compute[187185]: 2025-11-29 07:08:29.125 187189 DEBUG nova.compute.manager [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 07:08:29 compute-0 nova_compute[187185]: 2025-11-29 07:08:29.557 187189 DEBUG oslo_concurrency.lockutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:08:29 compute-0 nova_compute[187185]: 2025-11-29 07:08:29.557 187189 DEBUG oslo_concurrency.lockutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:08:29 compute-0 nova_compute[187185]: 2025-11-29 07:08:29.565 187189 DEBUG nova.virt.hardware [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 07:08:29 compute-0 nova_compute[187185]: 2025-11-29 07:08:29.565 187189 INFO nova.compute.claims [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Claim successful on node compute-0.ctlplane.example.com
Nov 29 07:08:29 compute-0 nova_compute[187185]: 2025-11-29 07:08:29.621 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:08:29 compute-0 podman[224975]: 2025-11-29 07:08:29.921640579 +0000 UTC m=+0.180363205 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:08:30 compute-0 nova_compute[187185]: 2025-11-29 07:08:30.415 187189 DEBUG nova.compute.provider_tree [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:08:30 compute-0 nova_compute[187185]: 2025-11-29 07:08:30.439 187189 DEBUG nova.scheduler.client.report [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:08:30 compute-0 nova_compute[187185]: 2025-11-29 07:08:30.458 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:08:30 compute-0 nova_compute[187185]: 2025-11-29 07:08:30.477 187189 DEBUG oslo_concurrency.lockutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.920s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:08:30 compute-0 nova_compute[187185]: 2025-11-29 07:08:30.478 187189 DEBUG nova.compute.manager [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 07:08:30 compute-0 nova_compute[187185]: 2025-11-29 07:08:30.624 187189 DEBUG nova.compute.manager [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 07:08:30 compute-0 nova_compute[187185]: 2025-11-29 07:08:30.624 187189 DEBUG nova.network.neutron [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 07:08:30 compute-0 nova_compute[187185]: 2025-11-29 07:08:30.655 187189 INFO nova.virt.libvirt.driver [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 07:08:30 compute-0 nova_compute[187185]: 2025-11-29 07:08:30.692 187189 DEBUG nova.compute.manager [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 07:08:30 compute-0 nova_compute[187185]: 2025-11-29 07:08:30.987 187189 DEBUG nova.compute.manager [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 07:08:30 compute-0 nova_compute[187185]: 2025-11-29 07:08:30.990 187189 DEBUG nova.virt.libvirt.driver [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 07:08:30 compute-0 nova_compute[187185]: 2025-11-29 07:08:30.990 187189 INFO nova.virt.libvirt.driver [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Creating image(s)
Nov 29 07:08:30 compute-0 nova_compute[187185]: 2025-11-29 07:08:30.992 187189 DEBUG oslo_concurrency.lockutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Acquiring lock "/var/lib/nova/instances/5ea1413a-4aa3-4cf3-9d93-21b138ed8739/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:08:30 compute-0 nova_compute[187185]: 2025-11-29 07:08:30.992 187189 DEBUG oslo_concurrency.lockutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "/var/lib/nova/instances/5ea1413a-4aa3-4cf3-9d93-21b138ed8739/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:08:30 compute-0 nova_compute[187185]: 2025-11-29 07:08:30.994 187189 DEBUG oslo_concurrency.lockutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "/var/lib/nova/instances/5ea1413a-4aa3-4cf3-9d93-21b138ed8739/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:08:31 compute-0 nova_compute[187185]: 2025-11-29 07:08:31.025 187189 DEBUG oslo_concurrency.processutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:08:31 compute-0 nova_compute[187185]: 2025-11-29 07:08:31.083 187189 DEBUG oslo_concurrency.processutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:08:31 compute-0 nova_compute[187185]: 2025-11-29 07:08:31.084 187189 DEBUG oslo_concurrency.lockutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:08:31 compute-0 nova_compute[187185]: 2025-11-29 07:08:31.085 187189 DEBUG oslo_concurrency.lockutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:08:31 compute-0 nova_compute[187185]: 2025-11-29 07:08:31.098 187189 DEBUG oslo_concurrency.processutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:08:31 compute-0 nova_compute[187185]: 2025-11-29 07:08:31.153 187189 DEBUG oslo_concurrency.processutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:08:31 compute-0 nova_compute[187185]: 2025-11-29 07:08:31.155 187189 DEBUG oslo_concurrency.processutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/5ea1413a-4aa3-4cf3-9d93-21b138ed8739/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:08:31 compute-0 nova_compute[187185]: 2025-11-29 07:08:31.188 187189 DEBUG oslo_concurrency.processutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/5ea1413a-4aa3-4cf3-9d93-21b138ed8739/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:08:31 compute-0 nova_compute[187185]: 2025-11-29 07:08:31.189 187189 DEBUG oslo_concurrency.lockutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:08:31 compute-0 nova_compute[187185]: 2025-11-29 07:08:31.189 187189 DEBUG oslo_concurrency.processutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:08:31 compute-0 nova_compute[187185]: 2025-11-29 07:08:31.243 187189 DEBUG oslo_concurrency.processutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:08:31 compute-0 nova_compute[187185]: 2025-11-29 07:08:31.244 187189 DEBUG nova.virt.disk.api [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Checking if we can resize image /var/lib/nova/instances/5ea1413a-4aa3-4cf3-9d93-21b138ed8739/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:08:31 compute-0 nova_compute[187185]: 2025-11-29 07:08:31.244 187189 DEBUG oslo_concurrency.processutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ea1413a-4aa3-4cf3-9d93-21b138ed8739/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:08:31 compute-0 nova_compute[187185]: 2025-11-29 07:08:31.298 187189 DEBUG oslo_concurrency.processutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ea1413a-4aa3-4cf3-9d93-21b138ed8739/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:08:31 compute-0 nova_compute[187185]: 2025-11-29 07:08:31.300 187189 DEBUG nova.virt.disk.api [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Cannot resize image /var/lib/nova/instances/5ea1413a-4aa3-4cf3-9d93-21b138ed8739/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:08:31 compute-0 nova_compute[187185]: 2025-11-29 07:08:31.301 187189 DEBUG nova.objects.instance [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lazy-loading 'migration_context' on Instance uuid 5ea1413a-4aa3-4cf3-9d93-21b138ed8739 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:08:31 compute-0 nova_compute[187185]: 2025-11-29 07:08:31.358 187189 DEBUG nova.virt.libvirt.driver [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 07:08:31 compute-0 nova_compute[187185]: 2025-11-29 07:08:31.359 187189 DEBUG nova.virt.libvirt.driver [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Ensure instance console log exists: /var/lib/nova/instances/5ea1413a-4aa3-4cf3-9d93-21b138ed8739/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 07:08:31 compute-0 nova_compute[187185]: 2025-11-29 07:08:31.360 187189 DEBUG oslo_concurrency.lockutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:08:31 compute-0 nova_compute[187185]: 2025-11-29 07:08:31.360 187189 DEBUG oslo_concurrency.lockutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:08:31 compute-0 nova_compute[187185]: 2025-11-29 07:08:31.360 187189 DEBUG oslo_concurrency.lockutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:08:32 compute-0 nova_compute[187185]: 2025-11-29 07:08:32.278 187189 DEBUG nova.policy [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 07:08:33 compute-0 nova_compute[187185]: 2025-11-29 07:08:33.513 187189 DEBUG nova.network.neutron [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Successfully created port: fbbb56f9-2872-40ed-9660-660a9c9371d6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 07:08:34 compute-0 nova_compute[187185]: 2025-11-29 07:08:34.623 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:08:34 compute-0 podman[225016]: 2025-11-29 07:08:34.787887066 +0000 UTC m=+0.051455753 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 07:08:35 compute-0 nova_compute[187185]: 2025-11-29 07:08:35.514 187189 DEBUG nova.network.neutron [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Successfully updated port: fbbb56f9-2872-40ed-9660-660a9c9371d6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 07:08:35 compute-0 nova_compute[187185]: 2025-11-29 07:08:35.516 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:08:35 compute-0 nova_compute[187185]: 2025-11-29 07:08:35.539 187189 DEBUG oslo_concurrency.lockutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Acquiring lock "refresh_cache-5ea1413a-4aa3-4cf3-9d93-21b138ed8739" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:08:35 compute-0 nova_compute[187185]: 2025-11-29 07:08:35.540 187189 DEBUG oslo_concurrency.lockutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Acquired lock "refresh_cache-5ea1413a-4aa3-4cf3-9d93-21b138ed8739" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:08:35 compute-0 nova_compute[187185]: 2025-11-29 07:08:35.540 187189 DEBUG nova.network.neutron [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:08:35 compute-0 nova_compute[187185]: 2025-11-29 07:08:35.803 187189 DEBUG nova.network.neutron [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.280 187189 DEBUG nova.network.neutron [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Updating instance_info_cache with network_info: [{"id": "fbbb56f9-2872-40ed-9660-660a9c9371d6", "address": "fa:16:3e:e5:00:ab", "network": {"id": "61999b35-f067-478e-ae7d-2c014e39aec6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-668045473-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a16c3c4eb5654a7f9742906d1a6f6698", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbbb56f9-28", "ovs_interfaceid": "fbbb56f9-2872-40ed-9660-660a9c9371d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.341 187189 DEBUG nova.compute.manager [req-df0c6622-4788-45ed-8448-0a87005e5453 req-a72ef882-ce44-48da-8e3f-032e0657cd54 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Received event network-changed-fbbb56f9-2872-40ed-9660-660a9c9371d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.342 187189 DEBUG nova.compute.manager [req-df0c6622-4788-45ed-8448-0a87005e5453 req-a72ef882-ce44-48da-8e3f-032e0657cd54 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Refreshing instance network info cache due to event network-changed-fbbb56f9-2872-40ed-9660-660a9c9371d6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.342 187189 DEBUG oslo_concurrency.lockutils [req-df0c6622-4788-45ed-8448-0a87005e5453 req-a72ef882-ce44-48da-8e3f-032e0657cd54 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-5ea1413a-4aa3-4cf3-9d93-21b138ed8739" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:08:37 compute-0 podman[225041]: 2025-11-29 07:08:37.816067299 +0000 UTC m=+0.072539504 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.894 187189 DEBUG oslo_concurrency.lockutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Releasing lock "refresh_cache-5ea1413a-4aa3-4cf3-9d93-21b138ed8739" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.895 187189 DEBUG nova.compute.manager [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Instance network_info: |[{"id": "fbbb56f9-2872-40ed-9660-660a9c9371d6", "address": "fa:16:3e:e5:00:ab", "network": {"id": "61999b35-f067-478e-ae7d-2c014e39aec6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-668045473-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a16c3c4eb5654a7f9742906d1a6f6698", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbbb56f9-28", "ovs_interfaceid": "fbbb56f9-2872-40ed-9660-660a9c9371d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.895 187189 DEBUG oslo_concurrency.lockutils [req-df0c6622-4788-45ed-8448-0a87005e5453 req-a72ef882-ce44-48da-8e3f-032e0657cd54 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-5ea1413a-4aa3-4cf3-9d93-21b138ed8739" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.895 187189 DEBUG nova.network.neutron [req-df0c6622-4788-45ed-8448-0a87005e5453 req-a72ef882-ce44-48da-8e3f-032e0657cd54 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Refreshing network info cache for port fbbb56f9-2872-40ed-9660-660a9c9371d6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.898 187189 DEBUG nova.virt.libvirt.driver [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Start _get_guest_xml network_info=[{"id": "fbbb56f9-2872-40ed-9660-660a9c9371d6", "address": "fa:16:3e:e5:00:ab", "network": {"id": "61999b35-f067-478e-ae7d-2c014e39aec6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-668045473-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a16c3c4eb5654a7f9742906d1a6f6698", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbbb56f9-28", "ovs_interfaceid": "fbbb56f9-2872-40ed-9660-660a9c9371d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.904 187189 WARNING nova.virt.libvirt.driver [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.909 187189 DEBUG nova.virt.libvirt.host [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.909 187189 DEBUG nova.virt.libvirt.host [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.914 187189 DEBUG nova.virt.libvirt.host [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.914 187189 DEBUG nova.virt.libvirt.host [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.916 187189 DEBUG nova.virt.libvirt.driver [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.916 187189 DEBUG nova.virt.hardware [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.916 187189 DEBUG nova.virt.hardware [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.917 187189 DEBUG nova.virt.hardware [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.917 187189 DEBUG nova.virt.hardware [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.917 187189 DEBUG nova.virt.hardware [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.917 187189 DEBUG nova.virt.hardware [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.918 187189 DEBUG nova.virt.hardware [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.918 187189 DEBUG nova.virt.hardware [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.918 187189 DEBUG nova.virt.hardware [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.918 187189 DEBUG nova.virt.hardware [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.919 187189 DEBUG nova.virt.hardware [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.922 187189 DEBUG nova.virt.libvirt.vif [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:08:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-622119530',display_name='tempest-tempest.common.compute-instance-622119530-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-622119530-1',id=85,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a16c3c4eb5654a7f9742906d1a6f6698',ramdisk_id='',reservation_id='r-u25t76wp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-910974113',owner_user_name='tempest-MultipleCreateTestJSON-910974113-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:08:30Z,user_data=None,user_id='e621c9f314214c7980a4d441f0600e90',uuid=5ea1413a-4aa3-4cf3-9d93-21b138ed8739,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fbbb56f9-2872-40ed-9660-660a9c9371d6", "address": "fa:16:3e:e5:00:ab", "network": {"id": "61999b35-f067-478e-ae7d-2c014e39aec6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-668045473-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a16c3c4eb5654a7f9742906d1a6f6698", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbbb56f9-28", "ovs_interfaceid": "fbbb56f9-2872-40ed-9660-660a9c9371d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.923 187189 DEBUG nova.network.os_vif_util [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Converting VIF {"id": "fbbb56f9-2872-40ed-9660-660a9c9371d6", "address": "fa:16:3e:e5:00:ab", "network": {"id": "61999b35-f067-478e-ae7d-2c014e39aec6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-668045473-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a16c3c4eb5654a7f9742906d1a6f6698", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbbb56f9-28", "ovs_interfaceid": "fbbb56f9-2872-40ed-9660-660a9c9371d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.923 187189 DEBUG nova.network.os_vif_util [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:00:ab,bridge_name='br-int',has_traffic_filtering=True,id=fbbb56f9-2872-40ed-9660-660a9c9371d6,network=Network(61999b35-f067-478e-ae7d-2c014e39aec6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbbb56f9-28') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.924 187189 DEBUG nova.objects.instance [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5ea1413a-4aa3-4cf3-9d93-21b138ed8739 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.949 187189 DEBUG nova.virt.libvirt.driver [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:08:37 compute-0 nova_compute[187185]:   <uuid>5ea1413a-4aa3-4cf3-9d93-21b138ed8739</uuid>
Nov 29 07:08:37 compute-0 nova_compute[187185]:   <name>instance-00000055</name>
Nov 29 07:08:37 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 07:08:37 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:08:37 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:08:37 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:       <nova:name>tempest-tempest.common.compute-instance-622119530-1</nova:name>
Nov 29 07:08:37 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:08:37</nova:creationTime>
Nov 29 07:08:37 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 07:08:37 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 07:08:37 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:08:37 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:08:37 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:08:37 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:08:37 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:08:37 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:08:37 compute-0 nova_compute[187185]:         <nova:user uuid="e621c9f314214c7980a4d441f0600e90">tempest-MultipleCreateTestJSON-910974113-project-member</nova:user>
Nov 29 07:08:37 compute-0 nova_compute[187185]:         <nova:project uuid="a16c3c4eb5654a7f9742906d1a6f6698">tempest-MultipleCreateTestJSON-910974113</nova:project>
Nov 29 07:08:37 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:08:37 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 07:08:37 compute-0 nova_compute[187185]:         <nova:port uuid="fbbb56f9-2872-40ed-9660-660a9c9371d6">
Nov 29 07:08:37 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:08:37 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:08:37 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:08:37 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <system>
Nov 29 07:08:37 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:08:37 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:08:37 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:08:37 compute-0 nova_compute[187185]:       <entry name="serial">5ea1413a-4aa3-4cf3-9d93-21b138ed8739</entry>
Nov 29 07:08:37 compute-0 nova_compute[187185]:       <entry name="uuid">5ea1413a-4aa3-4cf3-9d93-21b138ed8739</entry>
Nov 29 07:08:37 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     </system>
Nov 29 07:08:37 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:08:37 compute-0 nova_compute[187185]:   <os>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:   </os>
Nov 29 07:08:37 compute-0 nova_compute[187185]:   <features>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:   </features>
Nov 29 07:08:37 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:08:37 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:08:37 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:08:37 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/5ea1413a-4aa3-4cf3-9d93-21b138ed8739/disk"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:08:37 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/5ea1413a-4aa3-4cf3-9d93-21b138ed8739/disk.config"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:08:37 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:e5:00:ab"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:       <target dev="tapfbbb56f9-28"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:08:37 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/5ea1413a-4aa3-4cf3-9d93-21b138ed8739/console.log" append="off"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <video>
Nov 29 07:08:37 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     </video>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:08:37 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:08:37 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:08:37 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:08:37 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:08:37 compute-0 nova_compute[187185]: </domain>
Nov 29 07:08:37 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.951 187189 DEBUG nova.compute.manager [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Preparing to wait for external event network-vif-plugged-fbbb56f9-2872-40ed-9660-660a9c9371d6 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.951 187189 DEBUG oslo_concurrency.lockutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Acquiring lock "5ea1413a-4aa3-4cf3-9d93-21b138ed8739-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.952 187189 DEBUG oslo_concurrency.lockutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "5ea1413a-4aa3-4cf3-9d93-21b138ed8739-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.952 187189 DEBUG oslo_concurrency.lockutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "5ea1413a-4aa3-4cf3-9d93-21b138ed8739-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.953 187189 DEBUG nova.virt.libvirt.vif [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:08:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-622119530',display_name='tempest-tempest.common.compute-instance-622119530-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-622119530-1',id=85,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a16c3c4eb5654a7f9742906d1a6f6698',ramdisk_id='',reservation_id='r-u25t76wp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-910974113',owner_user_name='tempest-MultipleCreateTestJSON-910974113-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:08:30Z,user_data=None,user_id='e621c9f314214c7980a4d441f0600e90',uuid=5ea1413a-4aa3-4cf3-9d93-21b138ed8739,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fbbb56f9-2872-40ed-9660-660a9c9371d6", "address": "fa:16:3e:e5:00:ab", "network": {"id": "61999b35-f067-478e-ae7d-2c014e39aec6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-668045473-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a16c3c4eb5654a7f9742906d1a6f6698", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbbb56f9-28", "ovs_interfaceid": "fbbb56f9-2872-40ed-9660-660a9c9371d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.953 187189 DEBUG nova.network.os_vif_util [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Converting VIF {"id": "fbbb56f9-2872-40ed-9660-660a9c9371d6", "address": "fa:16:3e:e5:00:ab", "network": {"id": "61999b35-f067-478e-ae7d-2c014e39aec6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-668045473-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a16c3c4eb5654a7f9742906d1a6f6698", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbbb56f9-28", "ovs_interfaceid": "fbbb56f9-2872-40ed-9660-660a9c9371d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.954 187189 DEBUG nova.network.os_vif_util [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:00:ab,bridge_name='br-int',has_traffic_filtering=True,id=fbbb56f9-2872-40ed-9660-660a9c9371d6,network=Network(61999b35-f067-478e-ae7d-2c014e39aec6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbbb56f9-28') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.954 187189 DEBUG os_vif [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:00:ab,bridge_name='br-int',has_traffic_filtering=True,id=fbbb56f9-2872-40ed-9660-660a9c9371d6,network=Network(61999b35-f067-478e-ae7d-2c014e39aec6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbbb56f9-28') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.955 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.955 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.955 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.959 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.959 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbbb56f9-28, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.960 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfbbb56f9-28, col_values=(('external_ids', {'iface-id': 'fbbb56f9-2872-40ed-9660-660a9c9371d6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e5:00:ab', 'vm-uuid': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.962 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:08:37 compute-0 NetworkManager[55227]: <info>  [1764400117.9634] manager: (tapfbbb56f9-28): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/101)
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.963 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.968 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:08:37 compute-0 nova_compute[187185]: 2025-11-29 07:08:37.970 187189 INFO os_vif [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:00:ab,bridge_name='br-int',has_traffic_filtering=True,id=fbbb56f9-2872-40ed-9660-660a9c9371d6,network=Network(61999b35-f067-478e-ae7d-2c014e39aec6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbbb56f9-28')
Nov 29 07:08:38 compute-0 nova_compute[187185]: 2025-11-29 07:08:38.480 187189 DEBUG nova.virt.libvirt.driver [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:08:38 compute-0 nova_compute[187185]: 2025-11-29 07:08:38.480 187189 DEBUG nova.virt.libvirt.driver [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:08:38 compute-0 nova_compute[187185]: 2025-11-29 07:08:38.480 187189 DEBUG nova.virt.libvirt.driver [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] No VIF found with MAC fa:16:3e:e5:00:ab, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:08:38 compute-0 nova_compute[187185]: 2025-11-29 07:08:38.481 187189 INFO nova.virt.libvirt.driver [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Using config drive
Nov 29 07:08:39 compute-0 nova_compute[187185]: 2025-11-29 07:08:39.289 187189 INFO nova.virt.libvirt.driver [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Creating config drive at /var/lib/nova/instances/5ea1413a-4aa3-4cf3-9d93-21b138ed8739/disk.config
Nov 29 07:08:39 compute-0 nova_compute[187185]: 2025-11-29 07:08:39.295 187189 DEBUG oslo_concurrency.processutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5ea1413a-4aa3-4cf3-9d93-21b138ed8739/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj0te9gmf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:08:39 compute-0 nova_compute[187185]: 2025-11-29 07:08:39.440 187189 DEBUG oslo_concurrency.processutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5ea1413a-4aa3-4cf3-9d93-21b138ed8739/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj0te9gmf" returned: 0 in 0.144s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:08:39 compute-0 kernel: tapfbbb56f9-28: entered promiscuous mode
Nov 29 07:08:39 compute-0 NetworkManager[55227]: <info>  [1764400119.5531] manager: (tapfbbb56f9-28): new Tun device (/org/freedesktop/NetworkManager/Devices/102)
Nov 29 07:08:39 compute-0 nova_compute[187185]: 2025-11-29 07:08:39.554 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:08:39 compute-0 ovn_controller[95281]: 2025-11-29T07:08:39Z|00188|binding|INFO|Claiming lport fbbb56f9-2872-40ed-9660-660a9c9371d6 for this chassis.
Nov 29 07:08:39 compute-0 ovn_controller[95281]: 2025-11-29T07:08:39Z|00189|binding|INFO|fbbb56f9-2872-40ed-9660-660a9c9371d6: Claiming fa:16:3e:e5:00:ab 10.100.0.10
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:39.578 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e5:00:ab 10.100.0.10'], port_security=['fa:16:3e:e5:00:ab 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-61999b35-f067-478e-ae7d-2c014e39aec6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'neutron:revision_number': '2', 'neutron:security_group_ids': '33049197-f5b6-46f9-bb9f-ae10a060cbf4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0493b85c-e95a-459c-8e5e-a22ec09f96c2, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=fbbb56f9-2872-40ed-9660-660a9c9371d6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:39.580 104254 INFO neutron.agent.ovn.metadata.agent [-] Port fbbb56f9-2872-40ed-9660-660a9c9371d6 in datapath 61999b35-f067-478e-ae7d-2c014e39aec6 bound to our chassis
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:39.583 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 61999b35-f067-478e-ae7d-2c014e39aec6
Nov 29 07:08:39 compute-0 systemd-udevd[225090]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:39.597 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[53dd5e3d-4626-4133-8287-38fcf0f8d0df]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:39.598 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap61999b35-f1 in ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:08:39 compute-0 NetworkManager[55227]: <info>  [1764400119.6022] device (tapfbbb56f9-28): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:08:39 compute-0 NetworkManager[55227]: <info>  [1764400119.6032] device (tapfbbb56f9-28): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:39.602 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap61999b35-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:39.602 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[7d3028cf-fa11-4234-b94e-23016f0ce8dc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:39.603 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[63feb576-8eb8-4d1b-85fd-448181fcc538]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:08:39 compute-0 nova_compute[187185]: 2025-11-29 07:08:39.617 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:08:39 compute-0 ovn_controller[95281]: 2025-11-29T07:08:39Z|00190|binding|INFO|Setting lport fbbb56f9-2872-40ed-9660-660a9c9371d6 ovn-installed in OVS
Nov 29 07:08:39 compute-0 ovn_controller[95281]: 2025-11-29T07:08:39Z|00191|binding|INFO|Setting lport fbbb56f9-2872-40ed-9660-660a9c9371d6 up in Southbound
Nov 29 07:08:39 compute-0 nova_compute[187185]: 2025-11-29 07:08:39.624 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:39.621 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[aacf56bc-aec0-4d46-a500-f673f8722b23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:08:39 compute-0 systemd-machined[153486]: New machine qemu-29-instance-00000055.
Nov 29 07:08:39 compute-0 podman[225074]: 2025-11-29 07:08:39.631504381 +0000 UTC m=+0.085137467 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=multipathd, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:39.642 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[aff87fe6-6262-4ecd-82de-6f3983def5f6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:08:39 compute-0 systemd[1]: Started Virtual Machine qemu-29-instance-00000055.
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:39.675 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[1507a9b6-c79e-4985-be15-a5fbdfb4e39b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:39.681 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[9c05070f-20b8-44f5-8b01-fc7a7faf38dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:08:39 compute-0 NetworkManager[55227]: <info>  [1764400119.6824] manager: (tap61999b35-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/103)
Nov 29 07:08:39 compute-0 systemd-udevd[225099]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:39.714 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[37edf14f-51ae-4624-aec5-884361695ec8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:39.718 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[6b7f02d8-9c8c-4200-ab87-df591b4156b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:08:39 compute-0 NetworkManager[55227]: <info>  [1764400119.7464] device (tap61999b35-f0): carrier: link connected
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:39.751 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[bc0e5d7c-adaa-4f09-87b0-64eef2a2176e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:39.769 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[335a262d-f6f8-4ee1-8f1c-8dc986c7a9e3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap61999b35-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e8:e2:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 556740, 'reachable_time': 40289, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225134, 'error': None, 'target': 'ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:39.788 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[ce04c09e-fd2a-4b6a-8add-479323424e26]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee8:e2e6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 556740, 'tstamp': 556740}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225135, 'error': None, 'target': 'ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:39.805 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6496067a-0469-4d36-bc02-d79dec95d9ba]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap61999b35-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e8:e2:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 556740, 'reachable_time': 40289, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225136, 'error': None, 'target': 'ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:39.834 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[37e8d8a6-1d38-4b88-8d1c-191cb13bbd89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:39.905 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[976f19e1-b60f-4c35-8a14-195beec47b21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:39.907 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap61999b35-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:39.907 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:39.908 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap61999b35-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:08:39 compute-0 nova_compute[187185]: 2025-11-29 07:08:39.910 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:08:39 compute-0 NetworkManager[55227]: <info>  [1764400119.9117] manager: (tap61999b35-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/104)
Nov 29 07:08:39 compute-0 kernel: tap61999b35-f0: entered promiscuous mode
Nov 29 07:08:39 compute-0 nova_compute[187185]: 2025-11-29 07:08:39.913 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:39.914 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap61999b35-f0, col_values=(('external_ids', {'iface-id': 'c68228ff-9afd-4bc1-81a6-230bf1aa485f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:08:39 compute-0 nova_compute[187185]: 2025-11-29 07:08:39.916 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:08:39 compute-0 ovn_controller[95281]: 2025-11-29T07:08:39Z|00192|binding|INFO|Releasing lport c68228ff-9afd-4bc1-81a6-230bf1aa485f from this chassis (sb_readonly=0)
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:39.918 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/61999b35-f067-478e-ae7d-2c014e39aec6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/61999b35-f067-478e-ae7d-2c014e39aec6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:39.929 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[474b9a30-a95c-42a7-9263-c971797c329b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:39.930 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-61999b35-f067-478e-ae7d-2c014e39aec6
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/61999b35-f067-478e-ae7d-2c014e39aec6.pid.haproxy
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID 61999b35-f067-478e-ae7d-2c014e39aec6
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:08:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:39.931 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6', 'env', 'PROCESS_TAG=haproxy-61999b35-f067-478e-ae7d-2c014e39aec6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/61999b35-f067-478e-ae7d-2c014e39aec6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:08:39 compute-0 nova_compute[187185]: 2025-11-29 07:08:39.933 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:08:40 compute-0 nova_compute[187185]: 2025-11-29 07:08:40.282 187189 DEBUG nova.network.neutron [req-df0c6622-4788-45ed-8448-0a87005e5453 req-a72ef882-ce44-48da-8e3f-032e0657cd54 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Updated VIF entry in instance network info cache for port fbbb56f9-2872-40ed-9660-660a9c9371d6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:08:40 compute-0 nova_compute[187185]: 2025-11-29 07:08:40.284 187189 DEBUG nova.network.neutron [req-df0c6622-4788-45ed-8448-0a87005e5453 req-a72ef882-ce44-48da-8e3f-032e0657cd54 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Updating instance_info_cache with network_info: [{"id": "fbbb56f9-2872-40ed-9660-660a9c9371d6", "address": "fa:16:3e:e5:00:ab", "network": {"id": "61999b35-f067-478e-ae7d-2c014e39aec6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-668045473-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a16c3c4eb5654a7f9742906d1a6f6698", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbbb56f9-28", "ovs_interfaceid": "fbbb56f9-2872-40ed-9660-660a9c9371d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:08:40 compute-0 podman[225168]: 2025-11-29 07:08:40.316472829 +0000 UTC m=+0.060726203 container create bb8dd9a19d2c2016b1193c4b0c64a355fed4f0a635c10f242a1fa466305926eb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 29 07:08:40 compute-0 nova_compute[187185]: 2025-11-29 07:08:40.338 187189 DEBUG oslo_concurrency.lockutils [req-df0c6622-4788-45ed-8448-0a87005e5453 req-a72ef882-ce44-48da-8e3f-032e0657cd54 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-5ea1413a-4aa3-4cf3-9d93-21b138ed8739" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:08:40 compute-0 systemd[1]: Started libpod-conmon-bb8dd9a19d2c2016b1193c4b0c64a355fed4f0a635c10f242a1fa466305926eb.scope.
Nov 29 07:08:40 compute-0 podman[225168]: 2025-11-29 07:08:40.278551646 +0000 UTC m=+0.022805040 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:08:40 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:08:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/12412b606e9099702dd0831ca9295b1864713d11a84ef0550822745b5f5cce3c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:08:40 compute-0 podman[225168]: 2025-11-29 07:08:40.401648076 +0000 UTC m=+0.145901470 container init bb8dd9a19d2c2016b1193c4b0c64a355fed4f0a635c10f242a1fa466305926eb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Nov 29 07:08:40 compute-0 podman[225168]: 2025-11-29 07:08:40.407235743 +0000 UTC m=+0.151489137 container start bb8dd9a19d2c2016b1193c4b0c64a355fed4f0a635c10f242a1fa466305926eb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:08:40 compute-0 neutron-haproxy-ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6[225183]: [NOTICE]   (225193) : New worker (225195) forked
Nov 29 07:08:40 compute-0 neutron-haproxy-ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6[225183]: [NOTICE]   (225193) : Loading success.
Nov 29 07:08:40 compute-0 nova_compute[187185]: 2025-11-29 07:08:40.485 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400120.4846587, 5ea1413a-4aa3-4cf3-9d93-21b138ed8739 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:08:40 compute-0 nova_compute[187185]: 2025-11-29 07:08:40.486 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] VM Started (Lifecycle Event)
Nov 29 07:08:40 compute-0 nova_compute[187185]: 2025-11-29 07:08:40.545 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:08:40 compute-0 nova_compute[187185]: 2025-11-29 07:08:40.550 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400120.486259, 5ea1413a-4aa3-4cf3-9d93-21b138ed8739 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:08:40 compute-0 nova_compute[187185]: 2025-11-29 07:08:40.551 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] VM Paused (Lifecycle Event)
Nov 29 07:08:40 compute-0 nova_compute[187185]: 2025-11-29 07:08:40.806 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:08:40 compute-0 nova_compute[187185]: 2025-11-29 07:08:40.811 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:08:40 compute-0 nova_compute[187185]: 2025-11-29 07:08:40.938 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:08:42 compute-0 nova_compute[187185]: 2025-11-29 07:08:42.963 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:08:44 compute-0 nova_compute[187185]: 2025-11-29 07:08:44.627 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:08:45 compute-0 nova_compute[187185]: 2025-11-29 07:08:45.197 187189 DEBUG nova.compute.manager [req-6a67930e-049d-4f31-8c73-231945ca8628 req-9d9ea6da-a855-4d46-85a1-2d8d30beaa65 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Received event network-vif-plugged-fbbb56f9-2872-40ed-9660-660a9c9371d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:08:45 compute-0 nova_compute[187185]: 2025-11-29 07:08:45.198 187189 DEBUG oslo_concurrency.lockutils [req-6a67930e-049d-4f31-8c73-231945ca8628 req-9d9ea6da-a855-4d46-85a1-2d8d30beaa65 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5ea1413a-4aa3-4cf3-9d93-21b138ed8739-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:08:45 compute-0 nova_compute[187185]: 2025-11-29 07:08:45.198 187189 DEBUG oslo_concurrency.lockutils [req-6a67930e-049d-4f31-8c73-231945ca8628 req-9d9ea6da-a855-4d46-85a1-2d8d30beaa65 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5ea1413a-4aa3-4cf3-9d93-21b138ed8739-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:08:45 compute-0 nova_compute[187185]: 2025-11-29 07:08:45.198 187189 DEBUG oslo_concurrency.lockutils [req-6a67930e-049d-4f31-8c73-231945ca8628 req-9d9ea6da-a855-4d46-85a1-2d8d30beaa65 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5ea1413a-4aa3-4cf3-9d93-21b138ed8739-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:08:45 compute-0 nova_compute[187185]: 2025-11-29 07:08:45.198 187189 DEBUG nova.compute.manager [req-6a67930e-049d-4f31-8c73-231945ca8628 req-9d9ea6da-a855-4d46-85a1-2d8d30beaa65 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Processing event network-vif-plugged-fbbb56f9-2872-40ed-9660-660a9c9371d6 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 07:08:45 compute-0 nova_compute[187185]: 2025-11-29 07:08:45.199 187189 DEBUG nova.compute.manager [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Instance event wait completed in 4 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:08:45 compute-0 nova_compute[187185]: 2025-11-29 07:08:45.204 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400125.2035828, 5ea1413a-4aa3-4cf3-9d93-21b138ed8739 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:08:45 compute-0 nova_compute[187185]: 2025-11-29 07:08:45.205 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] VM Resumed (Lifecycle Event)
Nov 29 07:08:45 compute-0 nova_compute[187185]: 2025-11-29 07:08:45.208 187189 DEBUG nova.virt.libvirt.driver [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 07:08:45 compute-0 nova_compute[187185]: 2025-11-29 07:08:45.215 187189 INFO nova.virt.libvirt.driver [-] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Instance spawned successfully.
Nov 29 07:08:45 compute-0 nova_compute[187185]: 2025-11-29 07:08:45.216 187189 DEBUG nova.virt.libvirt.driver [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 07:08:45 compute-0 nova_compute[187185]: 2025-11-29 07:08:45.232 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:08:45 compute-0 nova_compute[187185]: 2025-11-29 07:08:45.236 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:08:45 compute-0 nova_compute[187185]: 2025-11-29 07:08:45.259 187189 DEBUG nova.virt.libvirt.driver [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:08:45 compute-0 nova_compute[187185]: 2025-11-29 07:08:45.260 187189 DEBUG nova.virt.libvirt.driver [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:08:45 compute-0 nova_compute[187185]: 2025-11-29 07:08:45.260 187189 DEBUG nova.virt.libvirt.driver [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:08:45 compute-0 nova_compute[187185]: 2025-11-29 07:08:45.261 187189 DEBUG nova.virt.libvirt.driver [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:08:45 compute-0 nova_compute[187185]: 2025-11-29 07:08:45.261 187189 DEBUG nova.virt.libvirt.driver [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:08:45 compute-0 nova_compute[187185]: 2025-11-29 07:08:45.262 187189 DEBUG nova.virt.libvirt.driver [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:08:45 compute-0 nova_compute[187185]: 2025-11-29 07:08:45.281 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:08:45 compute-0 nova_compute[187185]: 2025-11-29 07:08:45.414 187189 INFO nova.compute.manager [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Took 14.43 seconds to spawn the instance on the hypervisor.
Nov 29 07:08:45 compute-0 nova_compute[187185]: 2025-11-29 07:08:45.415 187189 DEBUG nova.compute.manager [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:08:45 compute-0 nova_compute[187185]: 2025-11-29 07:08:45.608 187189 INFO nova.compute.manager [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Took 16.37 seconds to build instance.
Nov 29 07:08:45 compute-0 nova_compute[187185]: 2025-11-29 07:08:45.634 187189 DEBUG oslo_concurrency.lockutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "5ea1413a-4aa3-4cf3-9d93-21b138ed8739" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.535s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:08:47 compute-0 nova_compute[187185]: 2025-11-29 07:08:47.543 187189 DEBUG nova.compute.manager [req-c56acb7d-410f-4a93-b0c9-9a2b7817f25c req-2fc270a4-7eaf-428a-9ec9-14311b960ae2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Received event network-vif-plugged-fbbb56f9-2872-40ed-9660-660a9c9371d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:08:47 compute-0 nova_compute[187185]: 2025-11-29 07:08:47.545 187189 DEBUG oslo_concurrency.lockutils [req-c56acb7d-410f-4a93-b0c9-9a2b7817f25c req-2fc270a4-7eaf-428a-9ec9-14311b960ae2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5ea1413a-4aa3-4cf3-9d93-21b138ed8739-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:08:47 compute-0 nova_compute[187185]: 2025-11-29 07:08:47.545 187189 DEBUG oslo_concurrency.lockutils [req-c56acb7d-410f-4a93-b0c9-9a2b7817f25c req-2fc270a4-7eaf-428a-9ec9-14311b960ae2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5ea1413a-4aa3-4cf3-9d93-21b138ed8739-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:08:47 compute-0 nova_compute[187185]: 2025-11-29 07:08:47.546 187189 DEBUG oslo_concurrency.lockutils [req-c56acb7d-410f-4a93-b0c9-9a2b7817f25c req-2fc270a4-7eaf-428a-9ec9-14311b960ae2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5ea1413a-4aa3-4cf3-9d93-21b138ed8739-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:08:47 compute-0 nova_compute[187185]: 2025-11-29 07:08:47.546 187189 DEBUG nova.compute.manager [req-c56acb7d-410f-4a93-b0c9-9a2b7817f25c req-2fc270a4-7eaf-428a-9ec9-14311b960ae2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] No waiting events found dispatching network-vif-plugged-fbbb56f9-2872-40ed-9660-660a9c9371d6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:08:47 compute-0 nova_compute[187185]: 2025-11-29 07:08:47.547 187189 WARNING nova.compute.manager [req-c56acb7d-410f-4a93-b0c9-9a2b7817f25c req-2fc270a4-7eaf-428a-9ec9-14311b960ae2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Received unexpected event network-vif-plugged-fbbb56f9-2872-40ed-9660-660a9c9371d6 for instance with vm_state active and task_state None.
Nov 29 07:08:47 compute-0 nova_compute[187185]: 2025-11-29 07:08:47.966 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:08:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:47.995 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739', 'name': 'tempest-tempest.common.compute-instance-622119530-1', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000055', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'hostId': '1b3a7f2f8db79a4ce787135fd10449fcf6ee63aeb5f746225d33cd12', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 07:08:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:47.997 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.021 12 DEBUG ceilometer.compute.pollsters [-] 5ea1413a-4aa3-4cf3-9d93-21b138ed8739/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.021 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic is not available for instance 5ea1413a-4aa3-4cf3-9d93-21b138ed8739: ceilometer.compute.pollsters.NoVolumeException
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.021 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.024 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 5ea1413a-4aa3-4cf3-9d93-21b138ed8739 / tapfbbb56f9-28 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.025 12 DEBUG ceilometer.compute.pollsters [-] 5ea1413a-4aa3-4cf3-9d93-21b138ed8739/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b7eae5e2-e63c-48b2-b02b-9cd70b35f681', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': 'instance-00000055-5ea1413a-4aa3-4cf3-9d93-21b138ed8739-tapfbbb56f9-28', 'timestamp': '2025-11-29T07:08:48.022079', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-1', 'name': 'tapfbbb56f9-28', 'instance_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739', 'instance_type': 'm1.nano', 'host': '1b3a7f2f8db79a4ce787135fd10449fcf6ee63aeb5f746225d33cd12', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e5:00:ab', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfbbb56f9-28'}, 'message_id': '403b3534-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5575.740306908, 'message_signature': 'fcf86ff30555645c805861681cc29ef63c182a7d7c6440614a82dd11a27dbf6a'}]}, 'timestamp': '2025-11-29 07:08:48.026298', '_unique_id': '971b9c9d04384f5b8e0ff835fab150f4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.028 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.029 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.029 12 DEBUG ceilometer.compute.pollsters [-] 5ea1413a-4aa3-4cf3-9d93-21b138ed8739/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd9bed723-490d-4093-9b4b-4690e6a4a0e5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': 'instance-00000055-5ea1413a-4aa3-4cf3-9d93-21b138ed8739-tapfbbb56f9-28', 'timestamp': '2025-11-29T07:08:48.029281', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-1', 'name': 'tapfbbb56f9-28', 'instance_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739', 'instance_type': 'm1.nano', 'host': '1b3a7f2f8db79a4ce787135fd10449fcf6ee63aeb5f746225d33cd12', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e5:00:ab', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfbbb56f9-28'}, 'message_id': '403bc166-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5575.740306908, 'message_signature': 'e7f3e666334a989c4aec6d46fdbf3212ac493e9f5979ed894f9c14507de86def'}]}, 'timestamp': '2025-11-29 07:08:48.029710', '_unique_id': 'c11ee1f39f844eeb9c12ca386d00a87e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.030 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.031 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.047 12 DEBUG ceilometer.compute.pollsters [-] 5ea1413a-4aa3-4cf3-9d93-21b138ed8739/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.048 12 DEBUG ceilometer.compute.pollsters [-] 5ea1413a-4aa3-4cf3-9d93-21b138ed8739/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '53baf295-621e-4df2-972d-7740ef4f7096', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739-vda', 'timestamp': '2025-11-29T07:08:48.031566', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-1', 'name': 'instance-00000055', 'instance_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739', 'instance_type': 'm1.nano', 'host': '1b3a7f2f8db79a4ce787135fd10449fcf6ee63aeb5f746225d33cd12', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '403e93b4-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5575.749793814, 'message_signature': '3fce2b30b810b1f78b60b3cb01fb667bc41ede929563d55a72847efc530b6881'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739-sda', 'timestamp': '2025-11-29T07:08:48.031566', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-1', 'name': 'instance-00000055', 'instance_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739', 'instance_type': 'm1.nano', 'host': '1b3a7f2f8db79a4ce787135fd10449fcf6ee63aeb5f746225d33cd12', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '403e9f30-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5575.749793814, 'message_signature': '6313247b03f13254551c649e5b74819a68d0c07b5bf4c0576fb7eda61bb50b09'}]}, 'timestamp': '2025-11-29 07:08:48.048425', '_unique_id': '001a4129bac945cbaef49b6b2971f870'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.049 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.050 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.050 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.050 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-tempest.common.compute-instance-622119530-1>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-tempest.common.compute-instance-622119530-1>]
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.050 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.050 12 DEBUG ceilometer.compute.pollsters [-] 5ea1413a-4aa3-4cf3-9d93-21b138ed8739/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '62dd3344-bf4d-4435-ad14-ed5c78286bde', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': 'instance-00000055-5ea1413a-4aa3-4cf3-9d93-21b138ed8739-tapfbbb56f9-28', 'timestamp': '2025-11-29T07:08:48.050426', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-1', 'name': 'tapfbbb56f9-28', 'instance_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739', 'instance_type': 'm1.nano', 'host': '1b3a7f2f8db79a4ce787135fd10449fcf6ee63aeb5f746225d33cd12', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e5:00:ab', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfbbb56f9-28'}, 'message_id': '403ef872-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5575.740306908, 'message_signature': 'cf6c3392417de60aacacf6ff906121ab8fed89cc55935f575f42b395394229db'}]}, 'timestamp': '2025-11-29 07:08:48.050736', '_unique_id': '23f547b66a6c4f9aa9ead12493231eb0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.051 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.087 12 DEBUG ceilometer.compute.pollsters [-] 5ea1413a-4aa3-4cf3-9d93-21b138ed8739/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.088 12 DEBUG ceilometer.compute.pollsters [-] 5ea1413a-4aa3-4cf3-9d93-21b138ed8739/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '55680110-ca2e-46c2-a19b-b4526cc83fae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739-vda', 'timestamp': '2025-11-29T07:08:48.052060', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-1', 'name': 'instance-00000055', 'instance_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739', 'instance_type': 'm1.nano', 'host': '1b3a7f2f8db79a4ce787135fd10449fcf6ee63aeb5f746225d33cd12', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4044ac36-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5575.770304259, 'message_signature': '9898118b30630ee7913b41c88bfa62e3a482b1b9db51f853c1121e54cd840576'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739-sda', 'timestamp': '2025-11-29T07:08:48.052060', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-1', 'name': 'instance-00000055', 'instance_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739', 'instance_type': 'm1.nano', 'host': '1b3a7f2f8db79a4ce787135fd10449fcf6ee63aeb5f746225d33cd12', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4044bc80-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5575.770304259, 'message_signature': '97680e63f94d7e7063c6c97e8f75231bfcfbd1d0054abe09553bca03e747da0f'}]}, 'timestamp': '2025-11-29 07:08:48.088539', '_unique_id': '1a662bbb3ef24713a98649f603084338'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.089 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.090 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.090 12 DEBUG ceilometer.compute.pollsters [-] 5ea1413a-4aa3-4cf3-9d93-21b138ed8739/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '70a99302-e25d-4fac-9db4-06fdd2ab90bc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': 'instance-00000055-5ea1413a-4aa3-4cf3-9d93-21b138ed8739-tapfbbb56f9-28', 'timestamp': '2025-11-29T07:08:48.090746', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-1', 'name': 'tapfbbb56f9-28', 'instance_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739', 'instance_type': 'm1.nano', 'host': '1b3a7f2f8db79a4ce787135fd10449fcf6ee63aeb5f746225d33cd12', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e5:00:ab', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfbbb56f9-28'}, 'message_id': '404521d4-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5575.740306908, 'message_signature': '7bf20967e194f9d1835b5650a62445156558953b3f9c8583a7cf7b9fa668e947'}]}, 'timestamp': '2025-11-29 07:08:48.091144', '_unique_id': 'a56d828f219a49f38a0cda1e6b4368e8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.091 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.092 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.092 12 DEBUG ceilometer.compute.pollsters [-] 5ea1413a-4aa3-4cf3-9d93-21b138ed8739/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.093 12 DEBUG ceilometer.compute.pollsters [-] 5ea1413a-4aa3-4cf3-9d93-21b138ed8739/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '664d62cb-7574-48c9-9769-db85a6c19fc8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739-vda', 'timestamp': '2025-11-29T07:08:48.092723', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-1', 'name': 'instance-00000055', 'instance_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739', 'instance_type': 'm1.nano', 'host': '1b3a7f2f8db79a4ce787135fd10449fcf6ee63aeb5f746225d33cd12', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '40456d60-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5575.770304259, 'message_signature': '9856c5a0f34ee5a987ba3f38677155861a6001e4d8c479938466d90a6bb26902'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739-sda', 'timestamp': '2025-11-29T07:08:48.092723', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-1', 'name': 'instance-00000055', 'instance_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739', 'instance_type': 'm1.nano', 'host': '1b3a7f2f8db79a4ce787135fd10449fcf6ee63aeb5f746225d33cd12', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '40457a76-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5575.770304259, 'message_signature': '2357f007da2038963bd187dc674f7599974d46f82eaf73ef624ced8d43ca4372'}]}, 'timestamp': '2025-11-29 07:08:48.093357', '_unique_id': '7909922b0ebc42298ced580f252a386a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.094 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.101 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.101 12 DEBUG ceilometer.compute.pollsters [-] 5ea1413a-4aa3-4cf3-9d93-21b138ed8739/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '533cfc22-afe3-4ed7-bfee-72e25ba6e317', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': 'instance-00000055-5ea1413a-4aa3-4cf3-9d93-21b138ed8739-tapfbbb56f9-28', 'timestamp': '2025-11-29T07:08:48.101752', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-1', 'name': 'tapfbbb56f9-28', 'instance_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739', 'instance_type': 'm1.nano', 'host': '1b3a7f2f8db79a4ce787135fd10449fcf6ee63aeb5f746225d33cd12', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e5:00:ab', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfbbb56f9-28'}, 'message_id': '4046d40c-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5575.740306908, 'message_signature': '6c71ffd1f772af29981367d4f7e46d6094726c8c6d73681cfcabe555fa9b23b1'}]}, 'timestamp': '2025-11-29 07:08:48.102339', '_unique_id': 'a6621fa05e5d4f49a1ec6bc1d7d25a76'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.103 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.104 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.105 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.105 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-tempest.common.compute-instance-622119530-1>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-tempest.common.compute-instance-622119530-1>]
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.105 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.105 12 DEBUG ceilometer.compute.pollsters [-] 5ea1413a-4aa3-4cf3-9d93-21b138ed8739/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1f944a64-560f-47d2-bdfb-0a2e56a6af1a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': 'instance-00000055-5ea1413a-4aa3-4cf3-9d93-21b138ed8739-tapfbbb56f9-28', 'timestamp': '2025-11-29T07:08:48.105564', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-1', 'name': 'tapfbbb56f9-28', 'instance_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739', 'instance_type': 'm1.nano', 'host': '1b3a7f2f8db79a4ce787135fd10449fcf6ee63aeb5f746225d33cd12', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e5:00:ab', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfbbb56f9-28'}, 'message_id': '40476372-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5575.740306908, 'message_signature': 'c243f309e8aba5e78de64e226ca633d88edf373ac4a03ce5053311cab03496c6'}]}, 'timestamp': '2025-11-29 07:08:48.106009', '_unique_id': '9a25058176ea4b3ab80c368379ac17b5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.107 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.108 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.108 12 DEBUG ceilometer.compute.pollsters [-] 5ea1413a-4aa3-4cf3-9d93-21b138ed8739/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.108 12 DEBUG ceilometer.compute.pollsters [-] 5ea1413a-4aa3-4cf3-9d93-21b138ed8739/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '96f1b852-d8f5-4fa6-bf53-89291a6acfbc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739-vda', 'timestamp': '2025-11-29T07:08:48.108333', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-1', 'name': 'instance-00000055', 'instance_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739', 'instance_type': 'm1.nano', 'host': '1b3a7f2f8db79a4ce787135fd10449fcf6ee63aeb5f746225d33cd12', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4047cee8-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5575.770304259, 'message_signature': 'c8d630cd3e02f106bf172ca0cf8f92e16af211cd4b11bf5f3008fa8650b41573'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739-sda', 'timestamp': '2025-11-29T07:08:48.108333', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-1', 'name': 'instance-00000055', 'instance_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739', 'instance_type': 'm1.nano', 'host': '1b3a7f2f8db79a4ce787135fd10449fcf6ee63aeb5f746225d33cd12', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4047da3c-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5575.770304259, 'message_signature': '609e340cfb3ee5b413c2ddeb263066c8baff0474c30efcdff9bebb7f547db3a4'}]}, 'timestamp': '2025-11-29 07:08:48.108983', '_unique_id': 'e19e2d8ffe47423c898c130781b6cfff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.109 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.110 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.110 12 DEBUG ceilometer.compute.pollsters [-] 5ea1413a-4aa3-4cf3-9d93-21b138ed8739/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8a78759c-560f-4771-93a4-ddaee7371c89', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': 'instance-00000055-5ea1413a-4aa3-4cf3-9d93-21b138ed8739-tapfbbb56f9-28', 'timestamp': '2025-11-29T07:08:48.110826', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-1', 'name': 'tapfbbb56f9-28', 'instance_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739', 'instance_type': 'm1.nano', 'host': '1b3a7f2f8db79a4ce787135fd10449fcf6ee63aeb5f746225d33cd12', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e5:00:ab', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfbbb56f9-28'}, 'message_id': '4048343c-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5575.740306908, 'message_signature': 'c3ad888d1fd7bf0206317401f43a3e089d34384bc0d3658642c8592af615ddeb'}]}, 'timestamp': '2025-11-29 07:08:48.111268', '_unique_id': 'a28f18c0bfc744f9b6335bcf610664be'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.112 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.113 12 DEBUG ceilometer.compute.pollsters [-] 5ea1413a-4aa3-4cf3-9d93-21b138ed8739/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eea79b22-6c47-43cc-b473-706c66dedc1e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': 'instance-00000055-5ea1413a-4aa3-4cf3-9d93-21b138ed8739-tapfbbb56f9-28', 'timestamp': '2025-11-29T07:08:48.113043', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-1', 'name': 'tapfbbb56f9-28', 'instance_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739', 'instance_type': 'm1.nano', 'host': '1b3a7f2f8db79a4ce787135fd10449fcf6ee63aeb5f746225d33cd12', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e5:00:ab', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfbbb56f9-28'}, 'message_id': '4048869e-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5575.740306908, 'message_signature': 'bc31f22b32a4706279f860213256876f16b0d2f07ffe1b04519762af6a6f05eb'}]}, 'timestamp': '2025-11-29 07:08:48.113371', '_unique_id': '7334b06bd97f423e86e719a6b4e4aa87'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.114 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.115 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.115 12 DEBUG ceilometer.compute.pollsters [-] 5ea1413a-4aa3-4cf3-9d93-21b138ed8739/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.115 12 DEBUG ceilometer.compute.pollsters [-] 5ea1413a-4aa3-4cf3-9d93-21b138ed8739/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd74e78ea-f4c2-4171-9323-94b752a9bb75', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739-vda', 'timestamp': '2025-11-29T07:08:48.115159', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-1', 'name': 'instance-00000055', 'instance_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739', 'instance_type': 'm1.nano', 'host': '1b3a7f2f8db79a4ce787135fd10449fcf6ee63aeb5f746225d33cd12', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4048dd1a-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5575.770304259, 'message_signature': 'ecc756975e8edf97cb0484e84b920cfc97681892bd57f6900010c8515e31b470'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739-sda', 'timestamp': '2025-11-29T07:08:48.115159', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-1', 'name': 'instance-00000055', 'instance_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739', 'instance_type': 'm1.nano', 'host': '1b3a7f2f8db79a4ce787135fd10449fcf6ee63aeb5f746225d33cd12', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4048e9ea-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5575.770304259, 'message_signature': '28c2cf1b811cd688285410680b6b78ac3a55ef4aaaaece29c32f68119ad9911d'}]}, 'timestamp': '2025-11-29 07:08:48.115911', '_unique_id': 'bbb19e9dff4b4684a0c090608de8d810'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.116 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.117 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.118 12 DEBUG ceilometer.compute.pollsters [-] 5ea1413a-4aa3-4cf3-9d93-21b138ed8739/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.118 12 DEBUG ceilometer.compute.pollsters [-] 5ea1413a-4aa3-4cf3-9d93-21b138ed8739/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '323c3593-551d-4b21-ae41-526e126f567a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739-vda', 'timestamp': '2025-11-29T07:08:48.118088', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-1', 'name': 'instance-00000055', 'instance_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739', 'instance_type': 'm1.nano', 'host': '1b3a7f2f8db79a4ce787135fd10449fcf6ee63aeb5f746225d33cd12', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '40494ca0-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5575.749793814, 'message_signature': '193d1e1acf6d143f3e3f41b710b6f5bc03a2eefda7ad7a4ff7cb44b982f9ec2f'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739-sda', 'timestamp': '2025-11-29T07:08:48.118088', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-1', 'name': 'instance-00000055', 'instance_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739', 'instance_type': 'm1.nano', 'host': '1b3a7f2f8db79a4ce787135fd10449fcf6ee63aeb5f746225d33cd12', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '40495a06-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5575.749793814, 'message_signature': 'fa3481856e29745628ebcd2646872cd910fa4fc49b86c49f92de07548923652e'}]}, 'timestamp': '2025-11-29 07:08:48.120291', '_unique_id': '1081e72ca3ba47ce831c7fc01361f026'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.121 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.123 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.123 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.123 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-tempest.common.compute-instance-622119530-1>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-tempest.common.compute-instance-622119530-1>]
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.123 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.123 12 DEBUG ceilometer.compute.pollsters [-] 5ea1413a-4aa3-4cf3-9d93-21b138ed8739/cpu volume: 2670000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '691bb062-c88d-4902-97e6-c207f9bb9461', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2670000000, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739', 'timestamp': '2025-11-29T07:08:48.123855', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-1', 'name': 'instance-00000055', 'instance_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739', 'instance_type': 'm1.nano', 'host': '1b3a7f2f8db79a4ce787135fd10449fcf6ee63aeb5f746225d33cd12', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '404a2c9c-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5575.739534926, 'message_signature': 'c79e65761c6e0f07b4ecc74679c5628a14a6b966e66df37252d10d0a578e03dc'}]}, 'timestamp': '2025-11-29 07:08:48.124173', '_unique_id': 'e69d9a706e02487591940065e9f93c78'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.124 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.125 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.125 12 DEBUG ceilometer.compute.pollsters [-] 5ea1413a-4aa3-4cf3-9d93-21b138ed8739/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.125 12 DEBUG ceilometer.compute.pollsters [-] 5ea1413a-4aa3-4cf3-9d93-21b138ed8739/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '19cde642-dfaa-4e04-a081-b7d5c618a054', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739-vda', 'timestamp': '2025-11-29T07:08:48.125291', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-1', 'name': 'instance-00000055', 'instance_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739', 'instance_type': 'm1.nano', 'host': '1b3a7f2f8db79a4ce787135fd10449fcf6ee63aeb5f746225d33cd12', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '404a6306-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5575.770304259, 'message_signature': 'f7e966ab552629224df21ec9350babbcce85cca7fcab75a66bac9de97b75184b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739-sda', 'timestamp': '2025-11-29T07:08:48.125291', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-1', 'name': 'instance-00000055', 'instance_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739', 'instance_type': 'm1.nano', 'host': '1b3a7f2f8db79a4ce787135fd10449fcf6ee63aeb5f746225d33cd12', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '404a6ab8-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5575.770304259, 'message_signature': 'cd4854612527736cb4d9c98be266dbcf117bc1705d94eaef18090c3a6f2baa6a'}]}, 'timestamp': '2025-11-29 07:08:48.125706', '_unique_id': '66a9c6fdd01c41609afe1484ed750307'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.126 12 DEBUG ceilometer.compute.pollsters [-] 5ea1413a-4aa3-4cf3-9d93-21b138ed8739/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 DEBUG ceilometer.compute.pollsters [-] 5ea1413a-4aa3-4cf3-9d93-21b138ed8739/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '196120e4-01db-4603-bf7c-9f6ed862d46b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739-vda', 'timestamp': '2025-11-29T07:08:48.126802', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-1', 'name': 'instance-00000055', 'instance_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739', 'instance_type': 'm1.nano', 'host': '1b3a7f2f8db79a4ce787135fd10449fcf6ee63aeb5f746225d33cd12', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '404aa1ae-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5575.749793814, 'message_signature': 'ebeafb90844f1ff15056dd46b75f4e4d2385d8c4822fb520a1b83e03feae8d54'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': 
'5ea1413a-4aa3-4cf3-9d93-21b138ed8739-sda', 'timestamp': '2025-11-29T07:08:48.126802', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-1', 'name': 'instance-00000055', 'instance_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739', 'instance_type': 'm1.nano', 'host': '1b3a7f2f8db79a4ce787135fd10449fcf6ee63aeb5f746225d33cd12', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '404aaa1e-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5575.749793814, 'message_signature': '5cc1b4ddf442112d96c97459b39b1d37436fc63064ff00d1d2e4f0cb0c1955ce'}]}, 'timestamp': '2025-11-29 07:08:48.127328', '_unique_id': '2c3ca5921ad9466eabfaa8f40fc7250e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.128 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.128 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.128 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-tempest.common.compute-instance-622119530-1>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-tempest.common.compute-instance-622119530-1>]
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.128 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.128 12 DEBUG ceilometer.compute.pollsters [-] 5ea1413a-4aa3-4cf3-9d93-21b138ed8739/disk.device.read.latency volume: 140273299 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.128 12 DEBUG ceilometer.compute.pollsters [-] 5ea1413a-4aa3-4cf3-9d93-21b138ed8739/disk.device.read.latency volume: 463493 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fac4884f-d0a4-4616-9e04-b6adc303700c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 140273299, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739-vda', 'timestamp': '2025-11-29T07:08:48.128681', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-1', 'name': 'instance-00000055', 'instance_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739', 'instance_type': 'm1.nano', 'host': '1b3a7f2f8db79a4ce787135fd10449fcf6ee63aeb5f746225d33cd12', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '404ae754-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5575.770304259, 'message_signature': '92e8dc46a623d4d78552090954792d20c5603c8c3cbe7e0c2708b1266cffe3ed'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 463493, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': 
None, 'resource_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739-sda', 'timestamp': '2025-11-29T07:08:48.128681', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-1', 'name': 'instance-00000055', 'instance_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739', 'instance_type': 'm1.nano', 'host': '1b3a7f2f8db79a4ce787135fd10449fcf6ee63aeb5f746225d33cd12', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '404aefd8-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5575.770304259, 'message_signature': '34fc4590bff1236dd073a29c67fba4d83c307ee1a1a55077634c7e5a4e084e44'}]}, 'timestamp': '2025-11-29 07:08:48.129114', '_unique_id': '0f99ab4a5a324519a4e22fea2cae6f94'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.129 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.130 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.130 12 DEBUG ceilometer.compute.pollsters [-] 5ea1413a-4aa3-4cf3-9d93-21b138ed8739/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8e898465-4176-4b02-b62a-bdad8fe1859e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': 'instance-00000055-5ea1413a-4aa3-4cf3-9d93-21b138ed8739-tapfbbb56f9-28', 'timestamp': '2025-11-29T07:08:48.130184', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-1', 'name': 'tapfbbb56f9-28', 'instance_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739', 'instance_type': 'm1.nano', 'host': '1b3a7f2f8db79a4ce787135fd10449fcf6ee63aeb5f746225d33cd12', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e5:00:ab', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfbbb56f9-28'}, 'message_id': '404b23a4-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5575.740306908, 'message_signature': '8f0e9f14919a42aca4fe38eb1d65bf84ee443429b8e95899566b6700635141ee'}]}, 'timestamp': '2025-11-29 07:08:48.130485', '_unique_id': '726c366bdfbf45a59165caea453b1381'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.131 12 DEBUG ceilometer.compute.pollsters [-] 5ea1413a-4aa3-4cf3-9d93-21b138ed8739/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '002c8e5f-5160-4655-b954-7c990dda2931', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': 'instance-00000055-5ea1413a-4aa3-4cf3-9d93-21b138ed8739-tapfbbb56f9-28', 'timestamp': '2025-11-29T07:08:48.131698', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-1', 'name': 'tapfbbb56f9-28', 'instance_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739', 'instance_type': 'm1.nano', 'host': '1b3a7f2f8db79a4ce787135fd10449fcf6ee63aeb5f746225d33cd12', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e5:00:ab', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfbbb56f9-28'}, 'message_id': '404b5ee6-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5575.740306908, 'message_signature': 'dad105a44f5f7c71124af53a3dd4f503ba8e7721b8d7f39814e913e8e2c87427'}]}, 'timestamp': '2025-11-29 07:08:48.131989', '_unique_id': '136ce5b6eeed4814b78196fd9860fba1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:08:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:08:48.132 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:08:48 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:48.892 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:08:48 compute-0 nova_compute[187185]: 2025-11-29 07:08:48.894 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:08:48 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:48.895 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:08:48 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:48.897 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:08:49 compute-0 nova_compute[187185]: 2025-11-29 07:08:49.631 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:08:49 compute-0 nova_compute[187185]: 2025-11-29 07:08:49.929 187189 DEBUG oslo_concurrency.lockutils [None req-83da7a0b-7720-4e57-b246-21667647c545 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Acquiring lock "5ea1413a-4aa3-4cf3-9d93-21b138ed8739" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:08:49 compute-0 nova_compute[187185]: 2025-11-29 07:08:49.931 187189 DEBUG oslo_concurrency.lockutils [None req-83da7a0b-7720-4e57-b246-21667647c545 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "5ea1413a-4aa3-4cf3-9d93-21b138ed8739" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:08:49 compute-0 nova_compute[187185]: 2025-11-29 07:08:49.932 187189 DEBUG oslo_concurrency.lockutils [None req-83da7a0b-7720-4e57-b246-21667647c545 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Acquiring lock "5ea1413a-4aa3-4cf3-9d93-21b138ed8739-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:08:49 compute-0 nova_compute[187185]: 2025-11-29 07:08:49.932 187189 DEBUG oslo_concurrency.lockutils [None req-83da7a0b-7720-4e57-b246-21667647c545 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "5ea1413a-4aa3-4cf3-9d93-21b138ed8739-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:08:49 compute-0 nova_compute[187185]: 2025-11-29 07:08:49.932 187189 DEBUG oslo_concurrency.lockutils [None req-83da7a0b-7720-4e57-b246-21667647c545 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "5ea1413a-4aa3-4cf3-9d93-21b138ed8739-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:08:49 compute-0 nova_compute[187185]: 2025-11-29 07:08:49.951 187189 INFO nova.compute.manager [None req-83da7a0b-7720-4e57-b246-21667647c545 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Terminating instance
Nov 29 07:08:49 compute-0 nova_compute[187185]: 2025-11-29 07:08:49.969 187189 DEBUG nova.compute.manager [None req-83da7a0b-7720-4e57-b246-21667647c545 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 07:08:49 compute-0 kernel: tapfbbb56f9-28 (unregistering): left promiscuous mode
Nov 29 07:08:49 compute-0 NetworkManager[55227]: <info>  [1764400129.9870] device (tapfbbb56f9-28): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:08:49 compute-0 ovn_controller[95281]: 2025-11-29T07:08:49Z|00193|binding|INFO|Releasing lport fbbb56f9-2872-40ed-9660-660a9c9371d6 from this chassis (sb_readonly=0)
Nov 29 07:08:49 compute-0 ovn_controller[95281]: 2025-11-29T07:08:49Z|00194|binding|INFO|Setting lport fbbb56f9-2872-40ed-9660-660a9c9371d6 down in Southbound
Nov 29 07:08:49 compute-0 ovn_controller[95281]: 2025-11-29T07:08:49Z|00195|binding|INFO|Removing iface tapfbbb56f9-28 ovn-installed in OVS
Nov 29 07:08:49 compute-0 nova_compute[187185]: 2025-11-29 07:08:49.992 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:08:49 compute-0 nova_compute[187185]: 2025-11-29 07:08:49.994 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:08:50 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:50.003 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e5:00:ab 10.100.0.10'], port_security=['fa:16:3e:e5:00:ab 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '5ea1413a-4aa3-4cf3-9d93-21b138ed8739', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-61999b35-f067-478e-ae7d-2c014e39aec6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'neutron:revision_number': '4', 'neutron:security_group_ids': '33049197-f5b6-46f9-bb9f-ae10a060cbf4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0493b85c-e95a-459c-8e5e-a22ec09f96c2, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=fbbb56f9-2872-40ed-9660-660a9c9371d6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:08:50 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:50.006 104254 INFO neutron.agent.ovn.metadata.agent [-] Port fbbb56f9-2872-40ed-9660-660a9c9371d6 in datapath 61999b35-f067-478e-ae7d-2c014e39aec6 unbound from our chassis
Nov 29 07:08:50 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:50.009 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 61999b35-f067-478e-ae7d-2c014e39aec6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:08:50 compute-0 nova_compute[187185]: 2025-11-29 07:08:50.013 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:08:50 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:50.013 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[98e429ef-0cb2-423f-8c56-109dcedb5c6e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:08:50 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:50.015 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6 namespace which is not needed anymore
Nov 29 07:08:50 compute-0 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d00000055.scope: Deactivated successfully.
Nov 29 07:08:50 compute-0 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d00000055.scope: Consumed 5.674s CPU time.
Nov 29 07:08:50 compute-0 systemd-machined[153486]: Machine qemu-29-instance-00000055 terminated.
Nov 29 07:08:50 compute-0 podman[225207]: 2025-11-29 07:08:50.10200174 +0000 UTC m=+0.076912376 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 29 07:08:50 compute-0 podman[225210]: 2025-11-29 07:08:50.102500514 +0000 UTC m=+0.073451869 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 07:08:50 compute-0 podman[225209]: 2025-11-29 07:08:50.103034089 +0000 UTC m=+0.078931493 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., vcs-type=git, managed_by=edpm_ansible, name=ubi9-minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 29 07:08:50 compute-0 neutron-haproxy-ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6[225183]: [NOTICE]   (225193) : haproxy version is 2.8.14-c23fe91
Nov 29 07:08:50 compute-0 neutron-haproxy-ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6[225183]: [NOTICE]   (225193) : path to executable is /usr/sbin/haproxy
Nov 29 07:08:50 compute-0 neutron-haproxy-ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6[225183]: [WARNING]  (225193) : Exiting Master process...
Nov 29 07:08:50 compute-0 neutron-haproxy-ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6[225183]: [ALERT]    (225193) : Current worker (225195) exited with code 143 (Terminated)
Nov 29 07:08:50 compute-0 neutron-haproxy-ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6[225183]: [WARNING]  (225193) : All workers exited. Exiting... (0)
Nov 29 07:08:50 compute-0 systemd[1]: libpod-bb8dd9a19d2c2016b1193c4b0c64a355fed4f0a635c10f242a1fa466305926eb.scope: Deactivated successfully.
Nov 29 07:08:50 compute-0 podman[225292]: 2025-11-29 07:08:50.169687677 +0000 UTC m=+0.047515552 container died bb8dd9a19d2c2016b1193c4b0c64a355fed4f0a635c10f242a1fa466305926eb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true)
Nov 29 07:08:50 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bb8dd9a19d2c2016b1193c4b0c64a355fed4f0a635c10f242a1fa466305926eb-userdata-shm.mount: Deactivated successfully.
Nov 29 07:08:50 compute-0 nova_compute[187185]: 2025-11-29 07:08:50.198 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:08:50 compute-0 systemd[1]: var-lib-containers-storage-overlay-12412b606e9099702dd0831ca9295b1864713d11a84ef0550822745b5f5cce3c-merged.mount: Deactivated successfully.
Nov 29 07:08:50 compute-0 nova_compute[187185]: 2025-11-29 07:08:50.204 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:08:50 compute-0 podman[225292]: 2025-11-29 07:08:50.219872174 +0000 UTC m=+0.097700049 container cleanup bb8dd9a19d2c2016b1193c4b0c64a355fed4f0a635c10f242a1fa466305926eb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Nov 29 07:08:50 compute-0 systemd[1]: libpod-conmon-bb8dd9a19d2c2016b1193c4b0c64a355fed4f0a635c10f242a1fa466305926eb.scope: Deactivated successfully.
Nov 29 07:08:50 compute-0 nova_compute[187185]: 2025-11-29 07:08:50.249 187189 INFO nova.virt.libvirt.driver [-] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Instance destroyed successfully.
Nov 29 07:08:50 compute-0 nova_compute[187185]: 2025-11-29 07:08:50.250 187189 DEBUG nova.objects.instance [None req-83da7a0b-7720-4e57-b246-21667647c545 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lazy-loading 'resources' on Instance uuid 5ea1413a-4aa3-4cf3-9d93-21b138ed8739 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:08:50 compute-0 nova_compute[187185]: 2025-11-29 07:08:50.264 187189 DEBUG nova.virt.libvirt.vif [None req-83da7a0b-7720-4e57-b246-21667647c545 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:08:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-622119530',display_name='tempest-tempest.common.compute-instance-622119530-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-622119530-1',id=85,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:08:45Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a16c3c4eb5654a7f9742906d1a6f6698',ramdisk_id='',reservation_id='r-u25t76wp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-MultipleCreateTestJSON-910974113',owner_user_name='tempest-MultipleCreateTestJSON-910974113-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:08:45Z,user_data=None,user_id='e621c9f314214c7980a4d441f0600e90',uuid=5ea1413a-4aa3-4cf3-9d93-21b138ed8739,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fbbb56f9-2872-40ed-9660-660a9c9371d6", "address": "fa:16:3e:e5:00:ab", "network": {"id": "61999b35-f067-478e-ae7d-2c014e39aec6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-668045473-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a16c3c4eb5654a7f9742906d1a6f6698", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbbb56f9-28", "ovs_interfaceid": "fbbb56f9-2872-40ed-9660-660a9c9371d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:08:50 compute-0 nova_compute[187185]: 2025-11-29 07:08:50.265 187189 DEBUG nova.network.os_vif_util [None req-83da7a0b-7720-4e57-b246-21667647c545 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Converting VIF {"id": "fbbb56f9-2872-40ed-9660-660a9c9371d6", "address": "fa:16:3e:e5:00:ab", "network": {"id": "61999b35-f067-478e-ae7d-2c014e39aec6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-668045473-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a16c3c4eb5654a7f9742906d1a6f6698", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfbbb56f9-28", "ovs_interfaceid": "fbbb56f9-2872-40ed-9660-660a9c9371d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:08:50 compute-0 nova_compute[187185]: 2025-11-29 07:08:50.266 187189 DEBUG nova.network.os_vif_util [None req-83da7a0b-7720-4e57-b246-21667647c545 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:00:ab,bridge_name='br-int',has_traffic_filtering=True,id=fbbb56f9-2872-40ed-9660-660a9c9371d6,network=Network(61999b35-f067-478e-ae7d-2c014e39aec6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbbb56f9-28') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:08:50 compute-0 nova_compute[187185]: 2025-11-29 07:08:50.266 187189 DEBUG os_vif [None req-83da7a0b-7720-4e57-b246-21667647c545 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:00:ab,bridge_name='br-int',has_traffic_filtering=True,id=fbbb56f9-2872-40ed-9660-660a9c9371d6,network=Network(61999b35-f067-478e-ae7d-2c014e39aec6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbbb56f9-28') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:08:50 compute-0 nova_compute[187185]: 2025-11-29 07:08:50.268 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:08:50 compute-0 nova_compute[187185]: 2025-11-29 07:08:50.268 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbbb56f9-28, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:08:50 compute-0 nova_compute[187185]: 2025-11-29 07:08:50.270 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:08:50 compute-0 nova_compute[187185]: 2025-11-29 07:08:50.297 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:08:50 compute-0 nova_compute[187185]: 2025-11-29 07:08:50.301 187189 INFO os_vif [None req-83da7a0b-7720-4e57-b246-21667647c545 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:00:ab,bridge_name='br-int',has_traffic_filtering=True,id=fbbb56f9-2872-40ed-9660-660a9c9371d6,network=Network(61999b35-f067-478e-ae7d-2c014e39aec6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfbbb56f9-28')
Nov 29 07:08:50 compute-0 nova_compute[187185]: 2025-11-29 07:08:50.302 187189 INFO nova.virt.libvirt.driver [None req-83da7a0b-7720-4e57-b246-21667647c545 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Deleting instance files /var/lib/nova/instances/5ea1413a-4aa3-4cf3-9d93-21b138ed8739_del
Nov 29 07:08:50 compute-0 nova_compute[187185]: 2025-11-29 07:08:50.303 187189 INFO nova.virt.libvirt.driver [None req-83da7a0b-7720-4e57-b246-21667647c545 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Deletion of /var/lib/nova/instances/5ea1413a-4aa3-4cf3-9d93-21b138ed8739_del complete
Nov 29 07:08:50 compute-0 podman[225333]: 2025-11-29 07:08:50.324958399 +0000 UTC m=+0.074922491 container remove bb8dd9a19d2c2016b1193c4b0c64a355fed4f0a635c10f242a1fa466305926eb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:08:50 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:50.330 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[7d126f5d-7d02-4813-a783-cdc5bf6606ad]: (4, ('Sat Nov 29 07:08:50 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6 (bb8dd9a19d2c2016b1193c4b0c64a355fed4f0a635c10f242a1fa466305926eb)\nbb8dd9a19d2c2016b1193c4b0c64a355fed4f0a635c10f242a1fa466305926eb\nSat Nov 29 07:08:50 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6 (bb8dd9a19d2c2016b1193c4b0c64a355fed4f0a635c10f242a1fa466305926eb)\nbb8dd9a19d2c2016b1193c4b0c64a355fed4f0a635c10f242a1fa466305926eb\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:08:50 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:50.333 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[bab19b83-eee7-4f3b-aee1-77480078dd55]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:08:50 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:50.334 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap61999b35-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:08:50 compute-0 nova_compute[187185]: 2025-11-29 07:08:50.335 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:08:50 compute-0 kernel: tap61999b35-f0: left promiscuous mode
Nov 29 07:08:50 compute-0 nova_compute[187185]: 2025-11-29 07:08:50.349 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:08:50 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:50.351 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[1fa721ec-ab23-4425-a39b-099f853b9d6a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:08:50 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:50.370 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e044fb32-2628-4c25-b815-0b5522e73e24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:08:50 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:50.372 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[f35434a7-78a8-4e87-b4b8-49cb0ca3acec]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:08:50 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:50.389 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[2547c5d2-56ce-4dd1-8d71-1f61272dffc6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 556733, 'reachable_time': 18508, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225348, 'error': None, 'target': 'ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:08:50 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:50.394 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:08:50 compute-0 systemd[1]: run-netns-ovnmeta\x2d61999b35\x2df067\x2d478e\x2dae7d\x2d2c014e39aec6.mount: Deactivated successfully.
Nov 29 07:08:50 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:08:50.395 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[0407f2c0-0fdc-445e-a1e6-6177018982cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:08:50 compute-0 nova_compute[187185]: 2025-11-29 07:08:50.519 187189 DEBUG nova.compute.manager [req-8ae46ac4-299b-4c05-ade2-aed2ec31a2d4 req-54d6f21e-58c9-44f5-8626-d7cd19426156 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Received event network-vif-unplugged-fbbb56f9-2872-40ed-9660-660a9c9371d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:08:50 compute-0 nova_compute[187185]: 2025-11-29 07:08:50.519 187189 DEBUG oslo_concurrency.lockutils [req-8ae46ac4-299b-4c05-ade2-aed2ec31a2d4 req-54d6f21e-58c9-44f5-8626-d7cd19426156 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5ea1413a-4aa3-4cf3-9d93-21b138ed8739-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:08:50 compute-0 nova_compute[187185]: 2025-11-29 07:08:50.519 187189 DEBUG oslo_concurrency.lockutils [req-8ae46ac4-299b-4c05-ade2-aed2ec31a2d4 req-54d6f21e-58c9-44f5-8626-d7cd19426156 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5ea1413a-4aa3-4cf3-9d93-21b138ed8739-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:08:50 compute-0 nova_compute[187185]: 2025-11-29 07:08:50.520 187189 DEBUG oslo_concurrency.lockutils [req-8ae46ac4-299b-4c05-ade2-aed2ec31a2d4 req-54d6f21e-58c9-44f5-8626-d7cd19426156 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5ea1413a-4aa3-4cf3-9d93-21b138ed8739-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:08:50 compute-0 nova_compute[187185]: 2025-11-29 07:08:50.520 187189 DEBUG nova.compute.manager [req-8ae46ac4-299b-4c05-ade2-aed2ec31a2d4 req-54d6f21e-58c9-44f5-8626-d7cd19426156 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] No waiting events found dispatching network-vif-unplugged-fbbb56f9-2872-40ed-9660-660a9c9371d6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:08:50 compute-0 nova_compute[187185]: 2025-11-29 07:08:50.520 187189 DEBUG nova.compute.manager [req-8ae46ac4-299b-4c05-ade2-aed2ec31a2d4 req-54d6f21e-58c9-44f5-8626-d7cd19426156 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Received event network-vif-unplugged-fbbb56f9-2872-40ed-9660-660a9c9371d6 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 07:08:50 compute-0 nova_compute[187185]: 2025-11-29 07:08:50.591 187189 INFO nova.compute.manager [None req-83da7a0b-7720-4e57-b246-21667647c545 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Took 0.62 seconds to destroy the instance on the hypervisor.
Nov 29 07:08:50 compute-0 nova_compute[187185]: 2025-11-29 07:08:50.592 187189 DEBUG oslo.service.loopingcall [None req-83da7a0b-7720-4e57-b246-21667647c545 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 07:08:50 compute-0 nova_compute[187185]: 2025-11-29 07:08:50.592 187189 DEBUG nova.compute.manager [-] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 07:08:50 compute-0 nova_compute[187185]: 2025-11-29 07:08:50.592 187189 DEBUG nova.network.neutron [-] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 07:08:51 compute-0 nova_compute[187185]: 2025-11-29 07:08:51.613 187189 DEBUG nova.network.neutron [-] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:08:51 compute-0 nova_compute[187185]: 2025-11-29 07:08:51.631 187189 INFO nova.compute.manager [-] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Took 1.04 seconds to deallocate network for instance.
Nov 29 07:08:52 compute-0 nova_compute[187185]: 2025-11-29 07:08:52.261 187189 DEBUG oslo_concurrency.lockutils [None req-83da7a0b-7720-4e57-b246-21667647c545 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:08:52 compute-0 nova_compute[187185]: 2025-11-29 07:08:52.262 187189 DEBUG oslo_concurrency.lockutils [None req-83da7a0b-7720-4e57-b246-21667647c545 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:08:52 compute-0 nova_compute[187185]: 2025-11-29 07:08:52.344 187189 DEBUG nova.compute.provider_tree [None req-83da7a0b-7720-4e57-b246-21667647c545 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:08:52 compute-0 nova_compute[187185]: 2025-11-29 07:08:52.581 187189 DEBUG nova.scheduler.client.report [None req-83da7a0b-7720-4e57-b246-21667647c545 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:08:52 compute-0 nova_compute[187185]: 2025-11-29 07:08:52.951 187189 DEBUG nova.compute.manager [req-79e51812-5d86-425e-9b77-96f35ca8a18b req-43aeb9a5-22f8-4cf7-8e4d-7bc75f3fcb01 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Received event network-vif-plugged-fbbb56f9-2872-40ed-9660-660a9c9371d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:08:52 compute-0 nova_compute[187185]: 2025-11-29 07:08:52.952 187189 DEBUG oslo_concurrency.lockutils [req-79e51812-5d86-425e-9b77-96f35ca8a18b req-43aeb9a5-22f8-4cf7-8e4d-7bc75f3fcb01 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5ea1413a-4aa3-4cf3-9d93-21b138ed8739-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:08:52 compute-0 nova_compute[187185]: 2025-11-29 07:08:52.953 187189 DEBUG oslo_concurrency.lockutils [req-79e51812-5d86-425e-9b77-96f35ca8a18b req-43aeb9a5-22f8-4cf7-8e4d-7bc75f3fcb01 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5ea1413a-4aa3-4cf3-9d93-21b138ed8739-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:08:52 compute-0 nova_compute[187185]: 2025-11-29 07:08:52.953 187189 DEBUG oslo_concurrency.lockutils [req-79e51812-5d86-425e-9b77-96f35ca8a18b req-43aeb9a5-22f8-4cf7-8e4d-7bc75f3fcb01 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5ea1413a-4aa3-4cf3-9d93-21b138ed8739-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:08:52 compute-0 nova_compute[187185]: 2025-11-29 07:08:52.954 187189 DEBUG nova.compute.manager [req-79e51812-5d86-425e-9b77-96f35ca8a18b req-43aeb9a5-22f8-4cf7-8e4d-7bc75f3fcb01 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] No waiting events found dispatching network-vif-plugged-fbbb56f9-2872-40ed-9660-660a9c9371d6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:08:52 compute-0 nova_compute[187185]: 2025-11-29 07:08:52.954 187189 WARNING nova.compute.manager [req-79e51812-5d86-425e-9b77-96f35ca8a18b req-43aeb9a5-22f8-4cf7-8e4d-7bc75f3fcb01 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Received unexpected event network-vif-plugged-fbbb56f9-2872-40ed-9660-660a9c9371d6 for instance with vm_state deleted and task_state None.
Nov 29 07:08:52 compute-0 nova_compute[187185]: 2025-11-29 07:08:52.955 187189 DEBUG nova.compute.manager [req-79e51812-5d86-425e-9b77-96f35ca8a18b req-43aeb9a5-22f8-4cf7-8e4d-7bc75f3fcb01 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Received event network-vif-deleted-fbbb56f9-2872-40ed-9660-660a9c9371d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:08:53 compute-0 nova_compute[187185]: 2025-11-29 07:08:53.032 187189 DEBUG oslo_concurrency.lockutils [None req-83da7a0b-7720-4e57-b246-21667647c545 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.770s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:08:53 compute-0 nova_compute[187185]: 2025-11-29 07:08:53.348 187189 INFO nova.scheduler.client.report [None req-83da7a0b-7720-4e57-b246-21667647c545 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Deleted allocations for instance 5ea1413a-4aa3-4cf3-9d93-21b138ed8739
Nov 29 07:08:54 compute-0 nova_compute[187185]: 2025-11-29 07:08:54.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:08:54 compute-0 nova_compute[187185]: 2025-11-29 07:08:54.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:08:54 compute-0 nova_compute[187185]: 2025-11-29 07:08:54.318 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:08:54 compute-0 nova_compute[187185]: 2025-11-29 07:08:54.440 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 07:08:54 compute-0 nova_compute[187185]: 2025-11-29 07:08:54.634 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:08:54 compute-0 nova_compute[187185]: 2025-11-29 07:08:54.654 187189 DEBUG oslo_concurrency.lockutils [None req-83da7a0b-7720-4e57-b246-21667647c545 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "5ea1413a-4aa3-4cf3-9d93-21b138ed8739" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.723s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:08:55 compute-0 nova_compute[187185]: 2025-11-29 07:08:55.272 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:08:56 compute-0 nova_compute[187185]: 2025-11-29 07:08:56.725 187189 DEBUG oslo_concurrency.lockutils [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Acquiring lock "0b894b20-2de8-4b79-99d5-1fbd9057f1d2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:08:56 compute-0 nova_compute[187185]: 2025-11-29 07:08:56.725 187189 DEBUG oslo_concurrency.lockutils [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Lock "0b894b20-2de8-4b79-99d5-1fbd9057f1d2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:08:56 compute-0 nova_compute[187185]: 2025-11-29 07:08:56.840 187189 DEBUG nova.compute.manager [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 07:08:56 compute-0 nova_compute[187185]: 2025-11-29 07:08:56.947 187189 DEBUG oslo_concurrency.lockutils [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:08:56 compute-0 nova_compute[187185]: 2025-11-29 07:08:56.948 187189 DEBUG oslo_concurrency.lockutils [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:08:57 compute-0 nova_compute[187185]: 2025-11-29 07:08:57.029 187189 DEBUG nova.compute.manager [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 07:08:57 compute-0 nova_compute[187185]: 2025-11-29 07:08:57.300 187189 DEBUG oslo_concurrency.lockutils [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:08:57 compute-0 nova_compute[187185]: 2025-11-29 07:08:57.301 187189 DEBUG oslo_concurrency.lockutils [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:08:57 compute-0 nova_compute[187185]: 2025-11-29 07:08:57.311 187189 DEBUG nova.virt.hardware [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 07:08:57 compute-0 nova_compute[187185]: 2025-11-29 07:08:57.311 187189 INFO nova.compute.claims [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Claim successful on node compute-0.ctlplane.example.com
Nov 29 07:08:57 compute-0 nova_compute[187185]: 2025-11-29 07:08:57.335 187189 DEBUG oslo_concurrency.lockutils [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:08:57 compute-0 nova_compute[187185]: 2025-11-29 07:08:57.408 187189 DEBUG nova.scheduler.client.report [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Refreshing inventories for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 29 07:08:57 compute-0 nova_compute[187185]: 2025-11-29 07:08:57.427 187189 DEBUG nova.scheduler.client.report [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Updating ProviderTree inventory for provider 4e39a026-df39-4e20-874a-dbb5a40df044 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 29 07:08:57 compute-0 nova_compute[187185]: 2025-11-29 07:08:57.427 187189 DEBUG nova.compute.provider_tree [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Updating inventory in ProviderTree for provider 4e39a026-df39-4e20-874a-dbb5a40df044 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 29 07:08:57 compute-0 nova_compute[187185]: 2025-11-29 07:08:57.446 187189 DEBUG nova.scheduler.client.report [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Refreshing aggregate associations for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 29 07:08:57 compute-0 nova_compute[187185]: 2025-11-29 07:08:57.469 187189 DEBUG nova.scheduler.client.report [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Refreshing trait associations for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044, traits: HW_CPU_X86_SSE,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_TRUSTED_CERTS,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 29 07:08:57 compute-0 nova_compute[187185]: 2025-11-29 07:08:57.525 187189 DEBUG nova.compute.provider_tree [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:08:57 compute-0 nova_compute[187185]: 2025-11-29 07:08:57.548 187189 DEBUG nova.scheduler.client.report [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:08:57 compute-0 nova_compute[187185]: 2025-11-29 07:08:57.570 187189 DEBUG oslo_concurrency.lockutils [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.270s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:08:57 compute-0 nova_compute[187185]: 2025-11-29 07:08:57.572 187189 DEBUG nova.compute.manager [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 07:08:57 compute-0 nova_compute[187185]: 2025-11-29 07:08:57.574 187189 DEBUG oslo_concurrency.lockutils [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.239s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:08:57 compute-0 nova_compute[187185]: 2025-11-29 07:08:57.580 187189 DEBUG nova.virt.hardware [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 07:08:57 compute-0 nova_compute[187185]: 2025-11-29 07:08:57.580 187189 INFO nova.compute.claims [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Claim successful on node compute-0.ctlplane.example.com
Nov 29 07:08:57 compute-0 nova_compute[187185]: 2025-11-29 07:08:57.650 187189 DEBUG nova.compute.manager [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 07:08:57 compute-0 nova_compute[187185]: 2025-11-29 07:08:57.651 187189 DEBUG nova.network.neutron [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 07:08:57 compute-0 nova_compute[187185]: 2025-11-29 07:08:57.685 187189 INFO nova.virt.libvirt.driver [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 07:08:57 compute-0 nova_compute[187185]: 2025-11-29 07:08:57.708 187189 DEBUG nova.compute.manager [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 07:08:57 compute-0 nova_compute[187185]: 2025-11-29 07:08:57.753 187189 DEBUG nova.compute.provider_tree [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:08:57 compute-0 nova_compute[187185]: 2025-11-29 07:08:57.787 187189 DEBUG nova.scheduler.client.report [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:08:57 compute-0 nova_compute[187185]: 2025-11-29 07:08:57.835 187189 DEBUG oslo_concurrency.lockutils [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.261s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:08:57 compute-0 nova_compute[187185]: 2025-11-29 07:08:57.836 187189 DEBUG nova.compute.manager [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 07:08:57 compute-0 nova_compute[187185]: 2025-11-29 07:08:57.888 187189 DEBUG nova.compute.manager [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 07:08:57 compute-0 nova_compute[187185]: 2025-11-29 07:08:57.889 187189 DEBUG nova.virt.libvirt.driver [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 07:08:57 compute-0 nova_compute[187185]: 2025-11-29 07:08:57.890 187189 INFO nova.virt.libvirt.driver [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Creating image(s)
Nov 29 07:08:57 compute-0 nova_compute[187185]: 2025-11-29 07:08:57.891 187189 DEBUG oslo_concurrency.lockutils [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Acquiring lock "/var/lib/nova/instances/0b894b20-2de8-4b79-99d5-1fbd9057f1d2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:08:57 compute-0 nova_compute[187185]: 2025-11-29 07:08:57.891 187189 DEBUG oslo_concurrency.lockutils [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Lock "/var/lib/nova/instances/0b894b20-2de8-4b79-99d5-1fbd9057f1d2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:08:57 compute-0 nova_compute[187185]: 2025-11-29 07:08:57.892 187189 DEBUG oslo_concurrency.lockutils [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Lock "/var/lib/nova/instances/0b894b20-2de8-4b79-99d5-1fbd9057f1d2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:08:57 compute-0 nova_compute[187185]: 2025-11-29 07:08:57.904 187189 DEBUG oslo_concurrency.processutils [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:08:57 compute-0 nova_compute[187185]: 2025-11-29 07:08:57.936 187189 DEBUG nova.compute.manager [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 07:08:57 compute-0 nova_compute[187185]: 2025-11-29 07:08:57.937 187189 DEBUG nova.network.neutron [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 07:08:57 compute-0 nova_compute[187185]: 2025-11-29 07:08:57.978 187189 INFO nova.virt.libvirt.driver [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:57.999 187189 DEBUG oslo_concurrency.processutils [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.000 187189 DEBUG oslo_concurrency.lockutils [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.001 187189 DEBUG oslo_concurrency.lockutils [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.012 187189 DEBUG oslo_concurrency.processutils [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.037 187189 DEBUG nova.compute.manager [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.082 187189 DEBUG oslo_concurrency.processutils [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.083 187189 DEBUG oslo_concurrency.processutils [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/0b894b20-2de8-4b79-99d5-1fbd9057f1d2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.209 187189 DEBUG oslo_concurrency.processutils [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/0b894b20-2de8-4b79-99d5-1fbd9057f1d2/disk 1073741824" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.211 187189 DEBUG oslo_concurrency.lockutils [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.210s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.211 187189 DEBUG oslo_concurrency.processutils [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.252 187189 DEBUG nova.compute.manager [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.254 187189 DEBUG nova.virt.libvirt.driver [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.255 187189 INFO nova.virt.libvirt.driver [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Creating image(s)
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.255 187189 DEBUG oslo_concurrency.lockutils [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "/var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.257 187189 DEBUG oslo_concurrency.lockutils [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "/var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.257 187189 DEBUG oslo_concurrency.lockutils [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "/var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.275 187189 DEBUG nova.policy [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b9c54ea5b1e14b5d9d20f3ef82014170', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e238c36ac0d649759e01ee6569916035', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.279 187189 DEBUG oslo_concurrency.processutils [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.304 187189 DEBUG nova.policy [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.308 187189 DEBUG oslo_concurrency.processutils [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.309 187189 DEBUG nova.virt.disk.api [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Checking if we can resize image /var/lib/nova/instances/0b894b20-2de8-4b79-99d5-1fbd9057f1d2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.310 187189 DEBUG oslo_concurrency.processutils [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0b894b20-2de8-4b79-99d5-1fbd9057f1d2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.330 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.331 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.348 187189 DEBUG oslo_concurrency.processutils [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.348 187189 DEBUG oslo_concurrency.lockutils [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.349 187189 DEBUG oslo_concurrency.lockutils [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.363 187189 DEBUG oslo_concurrency.processutils [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.382 187189 DEBUG oslo_concurrency.processutils [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0b894b20-2de8-4b79-99d5-1fbd9057f1d2/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.383 187189 DEBUG nova.virt.disk.api [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Cannot resize image /var/lib/nova/instances/0b894b20-2de8-4b79-99d5-1fbd9057f1d2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.384 187189 DEBUG nova.objects.instance [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Lazy-loading 'migration_context' on Instance uuid 0b894b20-2de8-4b79-99d5-1fbd9057f1d2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.420 187189 DEBUG oslo_concurrency.processutils [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.421 187189 DEBUG oslo_concurrency.processutils [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.449 187189 DEBUG nova.virt.libvirt.driver [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.450 187189 DEBUG nova.virt.libvirt.driver [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Ensure instance console log exists: /var/lib/nova/instances/0b894b20-2de8-4b79-99d5-1fbd9057f1d2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.450 187189 DEBUG oslo_concurrency.lockutils [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.451 187189 DEBUG oslo_concurrency.lockutils [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.451 187189 DEBUG oslo_concurrency.lockutils [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.456 187189 DEBUG oslo_concurrency.processutils [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.457 187189 DEBUG oslo_concurrency.lockutils [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.458 187189 DEBUG oslo_concurrency.processutils [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.529 187189 DEBUG oslo_concurrency.processutils [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.530 187189 DEBUG nova.virt.disk.api [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Checking if we can resize image /var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.530 187189 DEBUG oslo_concurrency.processutils [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.593 187189 DEBUG oslo_concurrency.processutils [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.595 187189 DEBUG nova.virt.disk.api [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Cannot resize image /var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.595 187189 DEBUG nova.objects.instance [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'migration_context' on Instance uuid 6d4e9a0c-c91c-45a4-911d-7526b420a8a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.609 187189 DEBUG nova.virt.libvirt.driver [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.609 187189 DEBUG nova.virt.libvirt.driver [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Ensure instance console log exists: /var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.610 187189 DEBUG oslo_concurrency.lockutils [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.610 187189 DEBUG oslo_concurrency.lockutils [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:08:58 compute-0 nova_compute[187185]: 2025-11-29 07:08:58.610 187189 DEBUG oslo_concurrency.lockutils [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:08:59 compute-0 nova_compute[187185]: 2025-11-29 07:08:59.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:08:59 compute-0 nova_compute[187185]: 2025-11-29 07:08:59.636 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:00 compute-0 nova_compute[187185]: 2025-11-29 07:09:00.262 187189 DEBUG nova.network.neutron [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Successfully created port: 9aaa10f7-2af7-4567-82e0-f80c26727f34 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 07:09:00 compute-0 nova_compute[187185]: 2025-11-29 07:09:00.274 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:00 compute-0 podman[225379]: 2025-11-29 07:09:00.877813188 +0000 UTC m=+0.128682497 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:09:00 compute-0 nova_compute[187185]: 2025-11-29 07:09:00.942 187189 DEBUG nova.network.neutron [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Successfully created port: 95792ac7-cbc8-4bad-903e-600bb3d09fce _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 07:09:01 compute-0 nova_compute[187185]: 2025-11-29 07:09:01.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:09:01 compute-0 nova_compute[187185]: 2025-11-29 07:09:01.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:09:01 compute-0 nova_compute[187185]: 2025-11-29 07:09:01.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:09:01 compute-0 nova_compute[187185]: 2025-11-29 07:09:01.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:09:01 compute-0 nova_compute[187185]: 2025-11-29 07:09:01.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:09:01 compute-0 nova_compute[187185]: 2025-11-29 07:09:01.389 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:09:01 compute-0 nova_compute[187185]: 2025-11-29 07:09:01.390 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:09:01 compute-0 nova_compute[187185]: 2025-11-29 07:09:01.390 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:09:01 compute-0 nova_compute[187185]: 2025-11-29 07:09:01.391 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:09:01 compute-0 nova_compute[187185]: 2025-11-29 07:09:01.605 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:09:01 compute-0 nova_compute[187185]: 2025-11-29 07:09:01.607 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5729MB free_disk=73.29632568359375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:09:01 compute-0 nova_compute[187185]: 2025-11-29 07:09:01.608 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:09:01 compute-0 nova_compute[187185]: 2025-11-29 07:09:01.608 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:09:01 compute-0 nova_compute[187185]: 2025-11-29 07:09:01.722 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance 0b894b20-2de8-4b79-99d5-1fbd9057f1d2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 07:09:01 compute-0 nova_compute[187185]: 2025-11-29 07:09:01.723 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance 6d4e9a0c-c91c-45a4-911d-7526b420a8a9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 07:09:01 compute-0 nova_compute[187185]: 2025-11-29 07:09:01.723 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:09:01 compute-0 nova_compute[187185]: 2025-11-29 07:09:01.723 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:09:01 compute-0 nova_compute[187185]: 2025-11-29 07:09:01.780 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:09:01 compute-0 nova_compute[187185]: 2025-11-29 07:09:01.799 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:09:01 compute-0 nova_compute[187185]: 2025-11-29 07:09:01.829 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:09:01 compute-0 nova_compute[187185]: 2025-11-29 07:09:01.829 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.221s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:09:02 compute-0 nova_compute[187185]: 2025-11-29 07:09:02.133 187189 DEBUG nova.network.neutron [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Successfully updated port: 9aaa10f7-2af7-4567-82e0-f80c26727f34 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 07:09:02 compute-0 nova_compute[187185]: 2025-11-29 07:09:02.163 187189 DEBUG oslo_concurrency.lockutils [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Acquiring lock "refresh_cache-0b894b20-2de8-4b79-99d5-1fbd9057f1d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:09:02 compute-0 nova_compute[187185]: 2025-11-29 07:09:02.163 187189 DEBUG oslo_concurrency.lockutils [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Acquired lock "refresh_cache-0b894b20-2de8-4b79-99d5-1fbd9057f1d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:09:02 compute-0 nova_compute[187185]: 2025-11-29 07:09:02.163 187189 DEBUG nova.network.neutron [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:09:02 compute-0 nova_compute[187185]: 2025-11-29 07:09:02.305 187189 DEBUG nova.compute.manager [req-b77a70f0-3d9b-45a4-a794-c971a66b2d6d req-ccb04517-9547-4194-bdc6-cfa313b2c2d8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Received event network-changed-9aaa10f7-2af7-4567-82e0-f80c26727f34 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:09:02 compute-0 nova_compute[187185]: 2025-11-29 07:09:02.305 187189 DEBUG nova.compute.manager [req-b77a70f0-3d9b-45a4-a794-c971a66b2d6d req-ccb04517-9547-4194-bdc6-cfa313b2c2d8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Refreshing instance network info cache due to event network-changed-9aaa10f7-2af7-4567-82e0-f80c26727f34. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:09:02 compute-0 nova_compute[187185]: 2025-11-29 07:09:02.306 187189 DEBUG oslo_concurrency.lockutils [req-b77a70f0-3d9b-45a4-a794-c971a66b2d6d req-ccb04517-9547-4194-bdc6-cfa313b2c2d8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-0b894b20-2de8-4b79-99d5-1fbd9057f1d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:09:02 compute-0 nova_compute[187185]: 2025-11-29 07:09:02.382 187189 DEBUG nova.network.neutron [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:09:02 compute-0 nova_compute[187185]: 2025-11-29 07:09:02.824 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:09:02 compute-0 nova_compute[187185]: 2025-11-29 07:09:02.880 187189 DEBUG nova.network.neutron [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Successfully updated port: 95792ac7-cbc8-4bad-903e-600bb3d09fce _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 07:09:02 compute-0 nova_compute[187185]: 2025-11-29 07:09:02.903 187189 DEBUG oslo_concurrency.lockutils [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "refresh_cache-6d4e9a0c-c91c-45a4-911d-7526b420a8a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:09:02 compute-0 nova_compute[187185]: 2025-11-29 07:09:02.904 187189 DEBUG oslo_concurrency.lockutils [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquired lock "refresh_cache-6d4e9a0c-c91c-45a4-911d-7526b420a8a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:09:02 compute-0 nova_compute[187185]: 2025-11-29 07:09:02.905 187189 DEBUG nova.network.neutron [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.162 187189 DEBUG nova.network.neutron [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.483 187189 DEBUG nova.network.neutron [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Updating instance_info_cache with network_info: [{"id": "9aaa10f7-2af7-4567-82e0-f80c26727f34", "address": "fa:16:3e:4a:21:16", "network": {"id": "509f7b77-378f-48db-9510-854bfa33858f", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-319376736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e238c36ac0d649759e01ee6569916035", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9aaa10f7-2a", "ovs_interfaceid": "9aaa10f7-2af7-4567-82e0-f80c26727f34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.529 187189 DEBUG oslo_concurrency.lockutils [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Releasing lock "refresh_cache-0b894b20-2de8-4b79-99d5-1fbd9057f1d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.530 187189 DEBUG nova.compute.manager [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Instance network_info: |[{"id": "9aaa10f7-2af7-4567-82e0-f80c26727f34", "address": "fa:16:3e:4a:21:16", "network": {"id": "509f7b77-378f-48db-9510-854bfa33858f", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-319376736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e238c36ac0d649759e01ee6569916035", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9aaa10f7-2a", "ovs_interfaceid": "9aaa10f7-2af7-4567-82e0-f80c26727f34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.530 187189 DEBUG oslo_concurrency.lockutils [req-b77a70f0-3d9b-45a4-a794-c971a66b2d6d req-ccb04517-9547-4194-bdc6-cfa313b2c2d8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-0b894b20-2de8-4b79-99d5-1fbd9057f1d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.531 187189 DEBUG nova.network.neutron [req-b77a70f0-3d9b-45a4-a794-c971a66b2d6d req-ccb04517-9547-4194-bdc6-cfa313b2c2d8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Refreshing network info cache for port 9aaa10f7-2af7-4567-82e0-f80c26727f34 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.534 187189 DEBUG nova.virt.libvirt.driver [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Start _get_guest_xml network_info=[{"id": "9aaa10f7-2af7-4567-82e0-f80c26727f34", "address": "fa:16:3e:4a:21:16", "network": {"id": "509f7b77-378f-48db-9510-854bfa33858f", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-319376736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e238c36ac0d649759e01ee6569916035", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9aaa10f7-2a", "ovs_interfaceid": "9aaa10f7-2af7-4567-82e0-f80c26727f34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.540 187189 WARNING nova.virt.libvirt.driver [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.547 187189 DEBUG nova.virt.libvirt.host [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.547 187189 DEBUG nova.virt.libvirt.host [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.551 187189 DEBUG nova.virt.libvirt.host [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.552 187189 DEBUG nova.virt.libvirt.host [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.553 187189 DEBUG nova.virt.libvirt.driver [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.554 187189 DEBUG nova.virt.hardware [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.554 187189 DEBUG nova.virt.hardware [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.554 187189 DEBUG nova.virt.hardware [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.555 187189 DEBUG nova.virt.hardware [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.555 187189 DEBUG nova.virt.hardware [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.555 187189 DEBUG nova.virt.hardware [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.556 187189 DEBUG nova.virt.hardware [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.556 187189 DEBUG nova.virt.hardware [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.556 187189 DEBUG nova.virt.hardware [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.556 187189 DEBUG nova.virt.hardware [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.557 187189 DEBUG nova.virt.hardware [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.561 187189 DEBUG nova.virt.libvirt.vif [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:08:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesNegativeTestJSON-server-22491651',display_name='tempest-ImagesNegativeTestJSON-server-22491651',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesnegativetestjson-server-22491651',id=88,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e238c36ac0d649759e01ee6569916035',ramdisk_id='',reservation_id='r-6mf3o7mo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesNegativeTestJSON-2093773463',owner_user_name='tempest-ImagesNegativeTestJSON-2
093773463-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:08:57Z,user_data=None,user_id='b9c54ea5b1e14b5d9d20f3ef82014170',uuid=0b894b20-2de8-4b79-99d5-1fbd9057f1d2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9aaa10f7-2af7-4567-82e0-f80c26727f34", "address": "fa:16:3e:4a:21:16", "network": {"id": "509f7b77-378f-48db-9510-854bfa33858f", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-319376736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e238c36ac0d649759e01ee6569916035", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9aaa10f7-2a", "ovs_interfaceid": "9aaa10f7-2af7-4567-82e0-f80c26727f34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.561 187189 DEBUG nova.network.os_vif_util [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Converting VIF {"id": "9aaa10f7-2af7-4567-82e0-f80c26727f34", "address": "fa:16:3e:4a:21:16", "network": {"id": "509f7b77-378f-48db-9510-854bfa33858f", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-319376736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e238c36ac0d649759e01ee6569916035", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9aaa10f7-2a", "ovs_interfaceid": "9aaa10f7-2af7-4567-82e0-f80c26727f34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.563 187189 DEBUG nova.network.os_vif_util [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4a:21:16,bridge_name='br-int',has_traffic_filtering=True,id=9aaa10f7-2af7-4567-82e0-f80c26727f34,network=Network(509f7b77-378f-48db-9510-854bfa33858f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9aaa10f7-2a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.564 187189 DEBUG nova.objects.instance [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0b894b20-2de8-4b79-99d5-1fbd9057f1d2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.583 187189 DEBUG nova.virt.libvirt.driver [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:09:03 compute-0 nova_compute[187185]:   <uuid>0b894b20-2de8-4b79-99d5-1fbd9057f1d2</uuid>
Nov 29 07:09:03 compute-0 nova_compute[187185]:   <name>instance-00000058</name>
Nov 29 07:09:03 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 07:09:03 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:09:03 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:09:03 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:       <nova:name>tempest-ImagesNegativeTestJSON-server-22491651</nova:name>
Nov 29 07:09:03 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:09:03</nova:creationTime>
Nov 29 07:09:03 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 07:09:03 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 07:09:03 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:09:03 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:09:03 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:09:03 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:09:03 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:09:03 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:09:03 compute-0 nova_compute[187185]:         <nova:user uuid="b9c54ea5b1e14b5d9d20f3ef82014170">tempest-ImagesNegativeTestJSON-2093773463-project-member</nova:user>
Nov 29 07:09:03 compute-0 nova_compute[187185]:         <nova:project uuid="e238c36ac0d649759e01ee6569916035">tempest-ImagesNegativeTestJSON-2093773463</nova:project>
Nov 29 07:09:03 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:09:03 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 07:09:03 compute-0 nova_compute[187185]:         <nova:port uuid="9aaa10f7-2af7-4567-82e0-f80c26727f34">
Nov 29 07:09:03 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:09:03 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:09:03 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:09:03 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <system>
Nov 29 07:09:03 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:09:03 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:09:03 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:09:03 compute-0 nova_compute[187185]:       <entry name="serial">0b894b20-2de8-4b79-99d5-1fbd9057f1d2</entry>
Nov 29 07:09:03 compute-0 nova_compute[187185]:       <entry name="uuid">0b894b20-2de8-4b79-99d5-1fbd9057f1d2</entry>
Nov 29 07:09:03 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     </system>
Nov 29 07:09:03 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:09:03 compute-0 nova_compute[187185]:   <os>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:   </os>
Nov 29 07:09:03 compute-0 nova_compute[187185]:   <features>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:   </features>
Nov 29 07:09:03 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:09:03 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:09:03 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:09:03 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/0b894b20-2de8-4b79-99d5-1fbd9057f1d2/disk"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:09:03 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/0b894b20-2de8-4b79-99d5-1fbd9057f1d2/disk.config"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:09:03 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:4a:21:16"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:       <target dev="tap9aaa10f7-2a"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:09:03 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/0b894b20-2de8-4b79-99d5-1fbd9057f1d2/console.log" append="off"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <video>
Nov 29 07:09:03 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     </video>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:09:03 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:09:03 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:09:03 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:09:03 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:09:03 compute-0 nova_compute[187185]: </domain>
Nov 29 07:09:03 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.585 187189 DEBUG nova.compute.manager [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Preparing to wait for external event network-vif-plugged-9aaa10f7-2af7-4567-82e0-f80c26727f34 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.585 187189 DEBUG oslo_concurrency.lockutils [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Acquiring lock "0b894b20-2de8-4b79-99d5-1fbd9057f1d2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.586 187189 DEBUG oslo_concurrency.lockutils [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Lock "0b894b20-2de8-4b79-99d5-1fbd9057f1d2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.586 187189 DEBUG oslo_concurrency.lockutils [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Lock "0b894b20-2de8-4b79-99d5-1fbd9057f1d2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.586 187189 DEBUG nova.virt.libvirt.vif [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:08:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesNegativeTestJSON-server-22491651',display_name='tempest-ImagesNegativeTestJSON-server-22491651',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesnegativetestjson-server-22491651',id=88,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e238c36ac0d649759e01ee6569916035',ramdisk_id='',reservation_id='r-6mf3o7mo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesNegativeTestJSON-2093773463',owner_user_name='tempest-ImagesNegative
TestJSON-2093773463-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:08:57Z,user_data=None,user_id='b9c54ea5b1e14b5d9d20f3ef82014170',uuid=0b894b20-2de8-4b79-99d5-1fbd9057f1d2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9aaa10f7-2af7-4567-82e0-f80c26727f34", "address": "fa:16:3e:4a:21:16", "network": {"id": "509f7b77-378f-48db-9510-854bfa33858f", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-319376736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e238c36ac0d649759e01ee6569916035", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9aaa10f7-2a", "ovs_interfaceid": "9aaa10f7-2af7-4567-82e0-f80c26727f34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.587 187189 DEBUG nova.network.os_vif_util [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Converting VIF {"id": "9aaa10f7-2af7-4567-82e0-f80c26727f34", "address": "fa:16:3e:4a:21:16", "network": {"id": "509f7b77-378f-48db-9510-854bfa33858f", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-319376736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e238c36ac0d649759e01ee6569916035", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9aaa10f7-2a", "ovs_interfaceid": "9aaa10f7-2af7-4567-82e0-f80c26727f34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.587 187189 DEBUG nova.network.os_vif_util [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4a:21:16,bridge_name='br-int',has_traffic_filtering=True,id=9aaa10f7-2af7-4567-82e0-f80c26727f34,network=Network(509f7b77-378f-48db-9510-854bfa33858f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9aaa10f7-2a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.588 187189 DEBUG os_vif [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4a:21:16,bridge_name='br-int',has_traffic_filtering=True,id=9aaa10f7-2af7-4567-82e0-f80c26727f34,network=Network(509f7b77-378f-48db-9510-854bfa33858f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9aaa10f7-2a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.588 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.588 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.589 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.591 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.591 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9aaa10f7-2a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.592 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9aaa10f7-2a, col_values=(('external_ids', {'iface-id': '9aaa10f7-2af7-4567-82e0-f80c26727f34', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4a:21:16', 'vm-uuid': '0b894b20-2de8-4b79-99d5-1fbd9057f1d2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.594 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:03 compute-0 NetworkManager[55227]: <info>  [1764400143.5961] manager: (tap9aaa10f7-2a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/105)
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.596 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.600 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.601 187189 INFO os_vif [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4a:21:16,bridge_name='br-int',has_traffic_filtering=True,id=9aaa10f7-2af7-4567-82e0-f80c26727f34,network=Network(509f7b77-378f-48db-9510-854bfa33858f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9aaa10f7-2a')
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.904 187189 DEBUG nova.virt.libvirt.driver [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.905 187189 DEBUG nova.virt.libvirt.driver [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.906 187189 DEBUG nova.virt.libvirt.driver [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] No VIF found with MAC fa:16:3e:4a:21:16, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:09:03 compute-0 nova_compute[187185]: 2025-11-29 07:09:03.907 187189 INFO nova.virt.libvirt.driver [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Using config drive
Nov 29 07:09:04 compute-0 nova_compute[187185]: 2025-11-29 07:09:04.406 187189 DEBUG nova.compute.manager [req-bdfade11-e78d-41ed-9a99-d8526e779c2d req-d174b096-bf2e-4d4e-abd3-9f500c0acf9f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Received event network-changed-95792ac7-cbc8-4bad-903e-600bb3d09fce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:09:04 compute-0 nova_compute[187185]: 2025-11-29 07:09:04.407 187189 DEBUG nova.compute.manager [req-bdfade11-e78d-41ed-9a99-d8526e779c2d req-d174b096-bf2e-4d4e-abd3-9f500c0acf9f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Refreshing instance network info cache due to event network-changed-95792ac7-cbc8-4bad-903e-600bb3d09fce. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:09:04 compute-0 nova_compute[187185]: 2025-11-29 07:09:04.407 187189 DEBUG oslo_concurrency.lockutils [req-bdfade11-e78d-41ed-9a99-d8526e779c2d req-d174b096-bf2e-4d4e-abd3-9f500c0acf9f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-6d4e9a0c-c91c-45a4-911d-7526b420a8a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:09:04 compute-0 nova_compute[187185]: 2025-11-29 07:09:04.684 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.244 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400130.2435348, 5ea1413a-4aa3-4cf3-9d93-21b138ed8739 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.245 187189 INFO nova.compute.manager [-] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] VM Stopped (Lifecycle Event)
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.270 187189 DEBUG nova.compute.manager [None req-173ad13b-ec0a-4689-8edc-713ca528f458 - - - - - -] [instance: 5ea1413a-4aa3-4cf3-9d93-21b138ed8739] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.402 187189 INFO nova.virt.libvirt.driver [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Creating config drive at /var/lib/nova/instances/0b894b20-2de8-4b79-99d5-1fbd9057f1d2/disk.config
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.407 187189 DEBUG oslo_concurrency.processutils [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0b894b20-2de8-4b79-99d5-1fbd9057f1d2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr11dk8_e execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.465 187189 DEBUG nova.network.neutron [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Updating instance_info_cache with network_info: [{"id": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "address": "fa:16:3e:a1:a1:8f", "network": {"id": "af9d1967-d1a9-4382-82b7-d9db26a40cb7", "bridge": "br-int", "label": "tempest-network-smoke--1326373200", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95792ac7-cb", "ovs_interfaceid": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.494 187189 DEBUG oslo_concurrency.lockutils [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Releasing lock "refresh_cache-6d4e9a0c-c91c-45a4-911d-7526b420a8a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.495 187189 DEBUG nova.compute.manager [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Instance network_info: |[{"id": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "address": "fa:16:3e:a1:a1:8f", "network": {"id": "af9d1967-d1a9-4382-82b7-d9db26a40cb7", "bridge": "br-int", "label": "tempest-network-smoke--1326373200", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95792ac7-cb", "ovs_interfaceid": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.495 187189 DEBUG oslo_concurrency.lockutils [req-bdfade11-e78d-41ed-9a99-d8526e779c2d req-d174b096-bf2e-4d4e-abd3-9f500c0acf9f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-6d4e9a0c-c91c-45a4-911d-7526b420a8a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.496 187189 DEBUG nova.network.neutron [req-bdfade11-e78d-41ed-9a99-d8526e779c2d req-d174b096-bf2e-4d4e-abd3-9f500c0acf9f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Refreshing network info cache for port 95792ac7-cbc8-4bad-903e-600bb3d09fce _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.499 187189 DEBUG nova.virt.libvirt.driver [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Start _get_guest_xml network_info=[{"id": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "address": "fa:16:3e:a1:a1:8f", "network": {"id": "af9d1967-d1a9-4382-82b7-d9db26a40cb7", "bridge": "br-int", "label": "tempest-network-smoke--1326373200", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95792ac7-cb", "ovs_interfaceid": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.503 187189 WARNING nova.virt.libvirt.driver [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.507 187189 DEBUG nova.virt.libvirt.host [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.508 187189 DEBUG nova.virt.libvirt.host [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.511 187189 DEBUG nova.virt.libvirt.host [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.511 187189 DEBUG nova.virt.libvirt.host [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.512 187189 DEBUG nova.virt.libvirt.driver [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.512 187189 DEBUG nova.virt.hardware [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.513 187189 DEBUG nova.virt.hardware [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.513 187189 DEBUG nova.virt.hardware [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.513 187189 DEBUG nova.virt.hardware [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.513 187189 DEBUG nova.virt.hardware [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.514 187189 DEBUG nova.virt.hardware [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.514 187189 DEBUG nova.virt.hardware [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.514 187189 DEBUG nova.virt.hardware [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.514 187189 DEBUG nova.virt.hardware [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.515 187189 DEBUG nova.virt.hardware [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.515 187189 DEBUG nova.virt.hardware [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.518 187189 DEBUG nova.virt.libvirt.vif [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:08:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1405928271',display_name='tempest-TestNetworkAdvancedServerOps-server-1405928271',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1405928271',id=89,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJzX+cYphgzFb/LmLSqgC4l/EgTLaDqQRgz2oIoLmiT9pJmbbaoOE/h8lTp9y4P6Lqu0yte5POR0cnSIwuT6ICbf/J95VY/pQuT7Mh/Rw0RaK2X3rgSaxQ5jqSeZ2XDRaw==',key_name='tempest-TestNetworkAdvancedServerOps-1604525815',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-q0t03bzu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:08:58Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=6d4e9a0c-c91c-45a4-911d-7526b420a8a9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "address": "fa:16:3e:a1:a1:8f", "network": {"id": "af9d1967-d1a9-4382-82b7-d9db26a40cb7", "bridge": "br-int", "label": "tempest-network-smoke--1326373200", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95792ac7-cb", "ovs_interfaceid": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.518 187189 DEBUG nova.network.os_vif_util [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converting VIF {"id": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "address": "fa:16:3e:a1:a1:8f", "network": {"id": "af9d1967-d1a9-4382-82b7-d9db26a40cb7", "bridge": "br-int", "label": "tempest-network-smoke--1326373200", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95792ac7-cb", "ovs_interfaceid": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.519 187189 DEBUG nova.network.os_vif_util [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a1:a1:8f,bridge_name='br-int',has_traffic_filtering=True,id=95792ac7-cbc8-4bad-903e-600bb3d09fce,network=Network(af9d1967-d1a9-4382-82b7-d9db26a40cb7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap95792ac7-cb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.520 187189 DEBUG nova.objects.instance [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6d4e9a0c-c91c-45a4-911d-7526b420a8a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.535 187189 DEBUG nova.virt.libvirt.driver [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:09:05 compute-0 nova_compute[187185]:   <uuid>6d4e9a0c-c91c-45a4-911d-7526b420a8a9</uuid>
Nov 29 07:09:05 compute-0 nova_compute[187185]:   <name>instance-00000059</name>
Nov 29 07:09:05 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 07:09:05 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:09:05 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:09:05 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-1405928271</nova:name>
Nov 29 07:09:05 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:09:05</nova:creationTime>
Nov 29 07:09:05 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 07:09:05 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 07:09:05 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:09:05 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:09:05 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:09:05 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:09:05 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:09:05 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:09:05 compute-0 nova_compute[187185]:         <nova:user uuid="bfd2024670594b10941cec8a59d2573f">tempest-TestNetworkAdvancedServerOps-1380683659-project-member</nova:user>
Nov 29 07:09:05 compute-0 nova_compute[187185]:         <nova:project uuid="c231e63624d44fc19e0989abfb1afb22">tempest-TestNetworkAdvancedServerOps-1380683659</nova:project>
Nov 29 07:09:05 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:09:05 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 07:09:05 compute-0 nova_compute[187185]:         <nova:port uuid="95792ac7-cbc8-4bad-903e-600bb3d09fce">
Nov 29 07:09:05 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:09:05 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:09:05 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:09:05 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <system>
Nov 29 07:09:05 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:09:05 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:09:05 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:09:05 compute-0 nova_compute[187185]:       <entry name="serial">6d4e9a0c-c91c-45a4-911d-7526b420a8a9</entry>
Nov 29 07:09:05 compute-0 nova_compute[187185]:       <entry name="uuid">6d4e9a0c-c91c-45a4-911d-7526b420a8a9</entry>
Nov 29 07:09:05 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     </system>
Nov 29 07:09:05 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:09:05 compute-0 nova_compute[187185]:   <os>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:   </os>
Nov 29 07:09:05 compute-0 nova_compute[187185]:   <features>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:   </features>
Nov 29 07:09:05 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:09:05 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:09:05 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:09:05 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/disk"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:09:05 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/disk.config"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:09:05 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:a1:a1:8f"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:       <target dev="tap95792ac7-cb"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:09:05 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/console.log" append="off"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <video>
Nov 29 07:09:05 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     </video>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:09:05 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:09:05 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:09:05 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:09:05 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:09:05 compute-0 nova_compute[187185]: </domain>
Nov 29 07:09:05 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.536 187189 DEBUG nova.compute.manager [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Preparing to wait for external event network-vif-plugged-95792ac7-cbc8-4bad-903e-600bb3d09fce prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.537 187189 DEBUG oslo_concurrency.lockutils [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.537 187189 DEBUG oslo_concurrency.lockutils [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.537 187189 DEBUG oslo_concurrency.lockutils [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.538 187189 DEBUG nova.virt.libvirt.vif [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:08:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1405928271',display_name='tempest-TestNetworkAdvancedServerOps-server-1405928271',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1405928271',id=89,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJzX+cYphgzFb/LmLSqgC4l/EgTLaDqQRgz2oIoLmiT9pJmbbaoOE/h8lTp9y4P6Lqu0yte5POR0cnSIwuT6ICbf/J95VY/pQuT7Mh/Rw0RaK2X3rgSaxQ5jqSeZ2XDRaw==',key_name='tempest-TestNetworkAdvancedServerOps-1604525815',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-q0t03bzu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:08:58Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=6d4e9a0c-c91c-45a4-911d-7526b420a8a9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "address": "fa:16:3e:a1:a1:8f", "network": {"id": "af9d1967-d1a9-4382-82b7-d9db26a40cb7", "bridge": "br-int", "label": "tempest-network-smoke--1326373200", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95792ac7-cb", "ovs_interfaceid": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.538 187189 DEBUG nova.network.os_vif_util [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converting VIF {"id": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "address": "fa:16:3e:a1:a1:8f", "network": {"id": "af9d1967-d1a9-4382-82b7-d9db26a40cb7", "bridge": "br-int", "label": "tempest-network-smoke--1326373200", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95792ac7-cb", "ovs_interfaceid": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.538 187189 DEBUG nova.network.os_vif_util [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a1:a1:8f,bridge_name='br-int',has_traffic_filtering=True,id=95792ac7-cbc8-4bad-903e-600bb3d09fce,network=Network(af9d1967-d1a9-4382-82b7-d9db26a40cb7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap95792ac7-cb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.539 187189 DEBUG os_vif [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a1:a1:8f,bridge_name='br-int',has_traffic_filtering=True,id=95792ac7-cbc8-4bad-903e-600bb3d09fce,network=Network(af9d1967-d1a9-4382-82b7-d9db26a40cb7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap95792ac7-cb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.539 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.540 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.540 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.541 187189 DEBUG oslo_concurrency.processutils [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0b894b20-2de8-4b79-99d5-1fbd9057f1d2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr11dk8_e" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.544 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.544 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap95792ac7-cb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.545 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap95792ac7-cb, col_values=(('external_ids', {'iface-id': '95792ac7-cbc8-4bad-903e-600bb3d09fce', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a1:a1:8f', 'vm-uuid': '6d4e9a0c-c91c-45a4-911d-7526b420a8a9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.547 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:05 compute-0 NetworkManager[55227]: <info>  [1764400145.5477] manager: (tap95792ac7-cb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/106)
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.549 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.554 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.555 187189 INFO os_vif [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a1:a1:8f,bridge_name='br-int',has_traffic_filtering=True,id=95792ac7-cbc8-4bad-903e-600bb3d09fce,network=Network(af9d1967-d1a9-4382-82b7-d9db26a40cb7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap95792ac7-cb')
Nov 29 07:09:05 compute-0 NetworkManager[55227]: <info>  [1764400145.6188] manager: (tap9aaa10f7-2a): new Tun device (/org/freedesktop/NetworkManager/Devices/107)
Nov 29 07:09:05 compute-0 kernel: tap9aaa10f7-2a: entered promiscuous mode
Nov 29 07:09:05 compute-0 ovn_controller[95281]: 2025-11-29T07:09:05Z|00196|binding|INFO|Claiming lport 9aaa10f7-2af7-4567-82e0-f80c26727f34 for this chassis.
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.626 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:05 compute-0 ovn_controller[95281]: 2025-11-29T07:09:05Z|00197|binding|INFO|9aaa10f7-2af7-4567-82e0-f80c26727f34: Claiming fa:16:3e:4a:21:16 10.100.0.12
Nov 29 07:09:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:05.639 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4a:21:16 10.100.0.12'], port_security=['fa:16:3e:4a:21:16 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '0b894b20-2de8-4b79-99d5-1fbd9057f1d2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-509f7b77-378f-48db-9510-854bfa33858f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e238c36ac0d649759e01ee6569916035', 'neutron:revision_number': '2', 'neutron:security_group_ids': '746756c3-25aa-4b29-8e5f-471c63400935', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b50c6820-25b5-4f1e-9bb5-b4b77a019b0d, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=9aaa10f7-2af7-4567-82e0-f80c26727f34) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:09:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:05.640 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 9aaa10f7-2af7-4567-82e0-f80c26727f34 in datapath 509f7b77-378f-48db-9510-854bfa33858f bound to our chassis
Nov 29 07:09:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:05.643 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 509f7b77-378f-48db-9510-854bfa33858f
Nov 29 07:09:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:05.655 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[c249cb0e-569d-4fd6-98ac-a8cc5f1372b8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:05.657 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap509f7b77-31 in ovnmeta-509f7b77-378f-48db-9510-854bfa33858f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:09:05 compute-0 systemd-udevd[225438]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:09:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:05.665 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap509f7b77-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:09:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:05.665 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[90c28754-deba-4953-824f-73fce4aefffe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:05.667 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[716a3c4a-744a-47ac-a0d7-fff85296c926]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:05 compute-0 NetworkManager[55227]: <info>  [1764400145.6715] device (tap9aaa10f7-2a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:09:05 compute-0 NetworkManager[55227]: <info>  [1764400145.6725] device (tap9aaa10f7-2a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:09:05 compute-0 systemd-machined[153486]: New machine qemu-30-instance-00000058.
Nov 29 07:09:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:05.679 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[8244cf52-325d-4470-9f6a-073d19636d38]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:05 compute-0 systemd[1]: Started Virtual Machine qemu-30-instance-00000058.
Nov 29 07:09:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:05.695 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[a666da90-06fe-4a88-8fd3-757c0b95bc61]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.709 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:05 compute-0 ovn_controller[95281]: 2025-11-29T07:09:05Z|00198|binding|INFO|Setting lport 9aaa10f7-2af7-4567-82e0-f80c26727f34 ovn-installed in OVS
Nov 29 07:09:05 compute-0 ovn_controller[95281]: 2025-11-29T07:09:05Z|00199|binding|INFO|Setting lport 9aaa10f7-2af7-4567-82e0-f80c26727f34 up in Southbound
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.714 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:05.732 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[990a1963-dc0c-4ed6-a3e3-3ddd14ff32ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:05 compute-0 systemd-udevd[225443]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:09:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:05.740 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[4a9caa8d-978e-4034-8dd9-b6e7ac1f30b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:05 compute-0 NetworkManager[55227]: <info>  [1764400145.7419] manager: (tap509f7b77-30): new Veth device (/org/freedesktop/NetworkManager/Devices/108)
Nov 29 07:09:05 compute-0 podman[225421]: 2025-11-29 07:09:05.74492768 +0000 UTC m=+0.126468955 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.777 187189 DEBUG nova.virt.libvirt.driver [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.777 187189 DEBUG nova.virt.libvirt.driver [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.777 187189 DEBUG nova.virt.libvirt.driver [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] No VIF found with MAC fa:16:3e:a1:a1:8f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:09:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:05.776 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[9c3aba3c-82af-4b17-949c-025cae4d3378]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.778 187189 INFO nova.virt.libvirt.driver [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Using config drive
Nov 29 07:09:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:05.781 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[e932adf1-ea73-4c9a-8163-9ca8cec0c7f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:05 compute-0 NetworkManager[55227]: <info>  [1764400145.8086] device (tap509f7b77-30): carrier: link connected
Nov 29 07:09:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:05.815 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[18d789fa-eb91-41dc-801b-8bed0999ec84]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:05.840 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[cb1f6004-6b64-47a3-b5be-82aae421a913]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap509f7b77-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2a:ed:5f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 559347, 'reachable_time': 20445, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225488, 'error': None, 'target': 'ovnmeta-509f7b77-378f-48db-9510-854bfa33858f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:05.861 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[13c19086-5776-4367-8b77-cfa7073ac354]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2a:ed5f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 559347, 'tstamp': 559347}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225489, 'error': None, 'target': 'ovnmeta-509f7b77-378f-48db-9510-854bfa33858f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:05.877 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[ee73687a-828e-4695-a68c-b110398598e9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap509f7b77-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2a:ed:5f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 559347, 'reachable_time': 20445, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225490, 'error': None, 'target': 'ovnmeta-509f7b77-378f-48db-9510-854bfa33858f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:05.912 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[cb532d47-2331-480f-a617-b91895819d43]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:05.989 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[4d1ac243-b23c-4d8a-935f-2c94716ab91c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:05.991 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap509f7b77-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:09:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:05.992 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:09:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:05.993 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap509f7b77-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:09:05 compute-0 nova_compute[187185]: 2025-11-29 07:09:05.997 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:05 compute-0 kernel: tap509f7b77-30: entered promiscuous mode
Nov 29 07:09:05 compute-0 NetworkManager[55227]: <info>  [1764400145.9981] manager: (tap509f7b77-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/109)
Nov 29 07:09:06 compute-0 nova_compute[187185]: 2025-11-29 07:09:06.002 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:06.004 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap509f7b77-30, col_values=(('external_ids', {'iface-id': '8f308f4a-d247-458f-940c-c0a5d51a9ea1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:09:06 compute-0 nova_compute[187185]: 2025-11-29 07:09:06.006 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:06 compute-0 ovn_controller[95281]: 2025-11-29T07:09:06Z|00200|binding|INFO|Releasing lport 8f308f4a-d247-458f-940c-c0a5d51a9ea1 from this chassis (sb_readonly=0)
Nov 29 07:09:06 compute-0 nova_compute[187185]: 2025-11-29 07:09:06.008 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400146.006873, 0b894b20-2de8-4b79-99d5-1fbd9057f1d2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:09:06 compute-0 nova_compute[187185]: 2025-11-29 07:09:06.008 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] VM Started (Lifecycle Event)
Nov 29 07:09:06 compute-0 nova_compute[187185]: 2025-11-29 07:09:06.034 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:09:06 compute-0 nova_compute[187185]: 2025-11-29 07:09:06.035 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:06 compute-0 nova_compute[187185]: 2025-11-29 07:09:06.040 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400146.0085773, 0b894b20-2de8-4b79-99d5-1fbd9057f1d2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:09:06 compute-0 nova_compute[187185]: 2025-11-29 07:09:06.041 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] VM Paused (Lifecycle Event)
Nov 29 07:09:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:06.042 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/509f7b77-378f-48db-9510-854bfa33858f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/509f7b77-378f-48db-9510-854bfa33858f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:09:06 compute-0 nova_compute[187185]: 2025-11-29 07:09:06.043 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:06.043 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[10d6985d-110e-45aa-8285-40eb77e2c36e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:06.043 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:09:06 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:09:06 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:09:06 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-509f7b77-378f-48db-9510-854bfa33858f
Nov 29 07:09:06 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:09:06 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:09:06 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:09:06 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/509f7b77-378f-48db-9510-854bfa33858f.pid.haproxy
Nov 29 07:09:06 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:09:06 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:09:06 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:09:06 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:09:06 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:09:06 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:09:06 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:09:06 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:09:06 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:09:06 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:09:06 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:09:06 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:09:06 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:09:06 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:09:06 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:09:06 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:09:06 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:09:06 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:09:06 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:09:06 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:09:06 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID 509f7b77-378f-48db-9510-854bfa33858f
Nov 29 07:09:06 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:09:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:06.044 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-509f7b77-378f-48db-9510-854bfa33858f', 'env', 'PROCESS_TAG=haproxy-509f7b77-378f-48db-9510-854bfa33858f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/509f7b77-378f-48db-9510-854bfa33858f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:09:06 compute-0 nova_compute[187185]: 2025-11-29 07:09:06.060 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:09:06 compute-0 nova_compute[187185]: 2025-11-29 07:09:06.065 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:09:06 compute-0 nova_compute[187185]: 2025-11-29 07:09:06.083 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:09:06 compute-0 podman[225534]: 2025-11-29 07:09:06.440751972 +0000 UTC m=+0.023090768 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:09:06 compute-0 nova_compute[187185]: 2025-11-29 07:09:06.582 187189 DEBUG nova.compute.manager [req-3902aea0-4363-4221-bc01-c8a3218384e8 req-5dd58b44-b5b2-4f18-bafe-a8506580ef7f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Received event network-vif-plugged-9aaa10f7-2af7-4567-82e0-f80c26727f34 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:09:06 compute-0 nova_compute[187185]: 2025-11-29 07:09:06.584 187189 DEBUG oslo_concurrency.lockutils [req-3902aea0-4363-4221-bc01-c8a3218384e8 req-5dd58b44-b5b2-4f18-bafe-a8506580ef7f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "0b894b20-2de8-4b79-99d5-1fbd9057f1d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:09:06 compute-0 nova_compute[187185]: 2025-11-29 07:09:06.585 187189 DEBUG oslo_concurrency.lockutils [req-3902aea0-4363-4221-bc01-c8a3218384e8 req-5dd58b44-b5b2-4f18-bafe-a8506580ef7f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0b894b20-2de8-4b79-99d5-1fbd9057f1d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:09:06 compute-0 nova_compute[187185]: 2025-11-29 07:09:06.585 187189 DEBUG oslo_concurrency.lockutils [req-3902aea0-4363-4221-bc01-c8a3218384e8 req-5dd58b44-b5b2-4f18-bafe-a8506580ef7f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0b894b20-2de8-4b79-99d5-1fbd9057f1d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:09:06 compute-0 nova_compute[187185]: 2025-11-29 07:09:06.585 187189 DEBUG nova.compute.manager [req-3902aea0-4363-4221-bc01-c8a3218384e8 req-5dd58b44-b5b2-4f18-bafe-a8506580ef7f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Processing event network-vif-plugged-9aaa10f7-2af7-4567-82e0-f80c26727f34 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 07:09:06 compute-0 nova_compute[187185]: 2025-11-29 07:09:06.586 187189 DEBUG nova.compute.manager [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:09:06 compute-0 nova_compute[187185]: 2025-11-29 07:09:06.589 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400146.5895228, 0b894b20-2de8-4b79-99d5-1fbd9057f1d2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:09:06 compute-0 nova_compute[187185]: 2025-11-29 07:09:06.590 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] VM Resumed (Lifecycle Event)
Nov 29 07:09:06 compute-0 nova_compute[187185]: 2025-11-29 07:09:06.591 187189 DEBUG nova.virt.libvirt.driver [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 07:09:06 compute-0 nova_compute[187185]: 2025-11-29 07:09:06.594 187189 INFO nova.virt.libvirt.driver [-] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Instance spawned successfully.
Nov 29 07:09:06 compute-0 nova_compute[187185]: 2025-11-29 07:09:06.594 187189 DEBUG nova.virt.libvirt.driver [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 07:09:06 compute-0 nova_compute[187185]: 2025-11-29 07:09:06.615 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:09:06 compute-0 nova_compute[187185]: 2025-11-29 07:09:06.619 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:09:06 compute-0 nova_compute[187185]: 2025-11-29 07:09:06.630 187189 DEBUG nova.virt.libvirt.driver [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:09:06 compute-0 nova_compute[187185]: 2025-11-29 07:09:06.631 187189 DEBUG nova.virt.libvirt.driver [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:09:06 compute-0 nova_compute[187185]: 2025-11-29 07:09:06.632 187189 DEBUG nova.virt.libvirt.driver [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:09:06 compute-0 nova_compute[187185]: 2025-11-29 07:09:06.632 187189 DEBUG nova.virt.libvirt.driver [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:09:06 compute-0 nova_compute[187185]: 2025-11-29 07:09:06.633 187189 DEBUG nova.virt.libvirt.driver [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:09:06 compute-0 nova_compute[187185]: 2025-11-29 07:09:06.633 187189 DEBUG nova.virt.libvirt.driver [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:09:06 compute-0 nova_compute[187185]: 2025-11-29 07:09:06.649 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:09:06 compute-0 nova_compute[187185]: 2025-11-29 07:09:06.736 187189 INFO nova.compute.manager [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Took 8.85 seconds to spawn the instance on the hypervisor.
Nov 29 07:09:06 compute-0 nova_compute[187185]: 2025-11-29 07:09:06.737 187189 DEBUG nova.compute.manager [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:09:06 compute-0 nova_compute[187185]: 2025-11-29 07:09:06.823 187189 INFO nova.compute.manager [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Took 9.75 seconds to build instance.
Nov 29 07:09:06 compute-0 nova_compute[187185]: 2025-11-29 07:09:06.838 187189 DEBUG oslo_concurrency.lockutils [None req-10d172c0-f044-4b8e-b16c-66afae97501c b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Lock "0b894b20-2de8-4b79-99d5-1fbd9057f1d2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:09:07 compute-0 nova_compute[187185]: 2025-11-29 07:09:07.075 187189 INFO nova.virt.libvirt.driver [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Creating config drive at /var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/disk.config
Nov 29 07:09:07 compute-0 nova_compute[187185]: 2025-11-29 07:09:07.084 187189 DEBUG oslo_concurrency.processutils [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpw3vb17j4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:09:07 compute-0 nova_compute[187185]: 2025-11-29 07:09:07.220 187189 DEBUG oslo_concurrency.processutils [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpw3vb17j4" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:09:07 compute-0 NetworkManager[55227]: <info>  [1764400147.3109] manager: (tap95792ac7-cb): new Tun device (/org/freedesktop/NetworkManager/Devices/110)
Nov 29 07:09:07 compute-0 kernel: tap95792ac7-cb: entered promiscuous mode
Nov 29 07:09:07 compute-0 systemd-udevd[225482]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:09:07 compute-0 nova_compute[187185]: 2025-11-29 07:09:07.317 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:07 compute-0 ovn_controller[95281]: 2025-11-29T07:09:07Z|00201|binding|INFO|Claiming lport 95792ac7-cbc8-4bad-903e-600bb3d09fce for this chassis.
Nov 29 07:09:07 compute-0 ovn_controller[95281]: 2025-11-29T07:09:07Z|00202|binding|INFO|95792ac7-cbc8-4bad-903e-600bb3d09fce: Claiming fa:16:3e:a1:a1:8f 10.100.0.8
Nov 29 07:09:07 compute-0 nova_compute[187185]: 2025-11-29 07:09:07.321 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:07 compute-0 nova_compute[187185]: 2025-11-29 07:09:07.325 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:07 compute-0 NetworkManager[55227]: <info>  [1764400147.3387] device (tap95792ac7-cb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:09:07 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:07.337 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a1:a1:8f 10.100.0.8'], port_security=['fa:16:3e:a1:a1:8f 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '6d4e9a0c-c91c-45a4-911d-7526b420a8a9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-af9d1967-d1a9-4382-82b7-d9db26a40cb7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c231e63624d44fc19e0989abfb1afb22', 'neutron:revision_number': '2', 'neutron:security_group_ids': '376a466b-335f-4204-8812-ec229fd4d3b3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2abd3f5a-1a92-4bfd-a631-54a420dbc598, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=95792ac7-cbc8-4bad-903e-600bb3d09fce) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:09:07 compute-0 NetworkManager[55227]: <info>  [1764400147.3403] device (tap95792ac7-cb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:09:07 compute-0 systemd-machined[153486]: New machine qemu-31-instance-00000059.
Nov 29 07:09:07 compute-0 ovn_controller[95281]: 2025-11-29T07:09:07Z|00203|binding|INFO|Setting lport 95792ac7-cbc8-4bad-903e-600bb3d09fce ovn-installed in OVS
Nov 29 07:09:07 compute-0 ovn_controller[95281]: 2025-11-29T07:09:07Z|00204|binding|INFO|Setting lport 95792ac7-cbc8-4bad-903e-600bb3d09fce up in Southbound
Nov 29 07:09:07 compute-0 nova_compute[187185]: 2025-11-29 07:09:07.389 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:07 compute-0 systemd[1]: Started Virtual Machine qemu-31-instance-00000059.
Nov 29 07:09:07 compute-0 nova_compute[187185]: 2025-11-29 07:09:07.788 187189 DEBUG nova.network.neutron [req-b77a70f0-3d9b-45a4-a794-c971a66b2d6d req-ccb04517-9547-4194-bdc6-cfa313b2c2d8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Updated VIF entry in instance network info cache for port 9aaa10f7-2af7-4567-82e0-f80c26727f34. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:09:07 compute-0 nova_compute[187185]: 2025-11-29 07:09:07.789 187189 DEBUG nova.network.neutron [req-b77a70f0-3d9b-45a4-a794-c971a66b2d6d req-ccb04517-9547-4194-bdc6-cfa313b2c2d8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Updating instance_info_cache with network_info: [{"id": "9aaa10f7-2af7-4567-82e0-f80c26727f34", "address": "fa:16:3e:4a:21:16", "network": {"id": "509f7b77-378f-48db-9510-854bfa33858f", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-319376736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e238c36ac0d649759e01ee6569916035", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9aaa10f7-2a", "ovs_interfaceid": "9aaa10f7-2af7-4567-82e0-f80c26727f34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:09:07 compute-0 nova_compute[187185]: 2025-11-29 07:09:07.814 187189 DEBUG oslo_concurrency.lockutils [req-b77a70f0-3d9b-45a4-a794-c971a66b2d6d req-ccb04517-9547-4194-bdc6-cfa313b2c2d8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-0b894b20-2de8-4b79-99d5-1fbd9057f1d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.088 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400148.0877304, 6d4e9a0c-c91c-45a4-911d-7526b420a8a9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.088 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] VM Started (Lifecycle Event)
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.115 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.121 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400148.0879776, 6d4e9a0c-c91c-45a4-911d-7526b420a8a9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.121 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] VM Paused (Lifecycle Event)
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.140 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.146 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.175 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:09:08 compute-0 podman[225534]: 2025-11-29 07:09:08.519690909 +0000 UTC m=+2.102029715 container create 5245ac16a6cded310c9ec54b615a4fa6302bf752431c832cff7bc64a1ffda784 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-509f7b77-378f-48db-9510-854bfa33858f, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.629 187189 DEBUG nova.network.neutron [req-bdfade11-e78d-41ed-9a99-d8526e779c2d req-d174b096-bf2e-4d4e-abd3-9f500c0acf9f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Updated VIF entry in instance network info cache for port 95792ac7-cbc8-4bad-903e-600bb3d09fce. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.631 187189 DEBUG nova.network.neutron [req-bdfade11-e78d-41ed-9a99-d8526e779c2d req-d174b096-bf2e-4d4e-abd3-9f500c0acf9f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Updating instance_info_cache with network_info: [{"id": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "address": "fa:16:3e:a1:a1:8f", "network": {"id": "af9d1967-d1a9-4382-82b7-d9db26a40cb7", "bridge": "br-int", "label": "tempest-network-smoke--1326373200", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95792ac7-cb", "ovs_interfaceid": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.651 187189 DEBUG oslo_concurrency.lockutils [req-bdfade11-e78d-41ed-9a99-d8526e779c2d req-d174b096-bf2e-4d4e-abd3-9f500c0acf9f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-6d4e9a0c-c91c-45a4-911d-7526b420a8a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.732 187189 DEBUG nova.compute.manager [req-6ea7eba2-96d8-4d52-9b31-f6ce6281d8c0 req-835cb968-4737-41be-a21c-f18ddcbfe009 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Received event network-vif-plugged-9aaa10f7-2af7-4567-82e0-f80c26727f34 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.734 187189 DEBUG oslo_concurrency.lockutils [req-6ea7eba2-96d8-4d52-9b31-f6ce6281d8c0 req-835cb968-4737-41be-a21c-f18ddcbfe009 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "0b894b20-2de8-4b79-99d5-1fbd9057f1d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.734 187189 DEBUG oslo_concurrency.lockutils [req-6ea7eba2-96d8-4d52-9b31-f6ce6281d8c0 req-835cb968-4737-41be-a21c-f18ddcbfe009 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0b894b20-2de8-4b79-99d5-1fbd9057f1d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.734 187189 DEBUG oslo_concurrency.lockutils [req-6ea7eba2-96d8-4d52-9b31-f6ce6281d8c0 req-835cb968-4737-41be-a21c-f18ddcbfe009 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0b894b20-2de8-4b79-99d5-1fbd9057f1d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.734 187189 DEBUG nova.compute.manager [req-6ea7eba2-96d8-4d52-9b31-f6ce6281d8c0 req-835cb968-4737-41be-a21c-f18ddcbfe009 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] No waiting events found dispatching network-vif-plugged-9aaa10f7-2af7-4567-82e0-f80c26727f34 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.734 187189 WARNING nova.compute.manager [req-6ea7eba2-96d8-4d52-9b31-f6ce6281d8c0 req-835cb968-4737-41be-a21c-f18ddcbfe009 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Received unexpected event network-vif-plugged-9aaa10f7-2af7-4567-82e0-f80c26727f34 for instance with vm_state active and task_state None.
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.735 187189 DEBUG nova.compute.manager [req-6ea7eba2-96d8-4d52-9b31-f6ce6281d8c0 req-835cb968-4737-41be-a21c-f18ddcbfe009 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Received event network-vif-plugged-95792ac7-cbc8-4bad-903e-600bb3d09fce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.735 187189 DEBUG oslo_concurrency.lockutils [req-6ea7eba2-96d8-4d52-9b31-f6ce6281d8c0 req-835cb968-4737-41be-a21c-f18ddcbfe009 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.735 187189 DEBUG oslo_concurrency.lockutils [req-6ea7eba2-96d8-4d52-9b31-f6ce6281d8c0 req-835cb968-4737-41be-a21c-f18ddcbfe009 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.735 187189 DEBUG oslo_concurrency.lockutils [req-6ea7eba2-96d8-4d52-9b31-f6ce6281d8c0 req-835cb968-4737-41be-a21c-f18ddcbfe009 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.735 187189 DEBUG nova.compute.manager [req-6ea7eba2-96d8-4d52-9b31-f6ce6281d8c0 req-835cb968-4737-41be-a21c-f18ddcbfe009 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Processing event network-vif-plugged-95792ac7-cbc8-4bad-903e-600bb3d09fce _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.735 187189 DEBUG nova.compute.manager [req-6ea7eba2-96d8-4d52-9b31-f6ce6281d8c0 req-835cb968-4737-41be-a21c-f18ddcbfe009 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Received event network-vif-plugged-95792ac7-cbc8-4bad-903e-600bb3d09fce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.736 187189 DEBUG oslo_concurrency.lockutils [req-6ea7eba2-96d8-4d52-9b31-f6ce6281d8c0 req-835cb968-4737-41be-a21c-f18ddcbfe009 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.736 187189 DEBUG oslo_concurrency.lockutils [req-6ea7eba2-96d8-4d52-9b31-f6ce6281d8c0 req-835cb968-4737-41be-a21c-f18ddcbfe009 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.736 187189 DEBUG oslo_concurrency.lockutils [req-6ea7eba2-96d8-4d52-9b31-f6ce6281d8c0 req-835cb968-4737-41be-a21c-f18ddcbfe009 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.736 187189 DEBUG nova.compute.manager [req-6ea7eba2-96d8-4d52-9b31-f6ce6281d8c0 req-835cb968-4737-41be-a21c-f18ddcbfe009 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] No waiting events found dispatching network-vif-plugged-95792ac7-cbc8-4bad-903e-600bb3d09fce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.736 187189 WARNING nova.compute.manager [req-6ea7eba2-96d8-4d52-9b31-f6ce6281d8c0 req-835cb968-4737-41be-a21c-f18ddcbfe009 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Received unexpected event network-vif-plugged-95792ac7-cbc8-4bad-903e-600bb3d09fce for instance with vm_state building and task_state spawning.
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.737 187189 DEBUG nova.compute.manager [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.759 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400148.7480536, 6d4e9a0c-c91c-45a4-911d-7526b420a8a9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.760 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] VM Resumed (Lifecycle Event)
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.762 187189 DEBUG nova.virt.libvirt.driver [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.774 187189 INFO nova.virt.libvirt.driver [-] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Instance spawned successfully.
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.775 187189 DEBUG nova.virt.libvirt.driver [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.780 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.790 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.803 187189 DEBUG nova.virt.libvirt.driver [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.804 187189 DEBUG nova.virt.libvirt.driver [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.804 187189 DEBUG nova.virt.libvirt.driver [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.805 187189 DEBUG nova.virt.libvirt.driver [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.806 187189 DEBUG nova.virt.libvirt.driver [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.807 187189 DEBUG nova.virt.libvirt.driver [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.816 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.861 187189 DEBUG oslo_concurrency.lockutils [None req-65f53ff3-1170-41fe-bf69-21605a3cc660 b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Acquiring lock "0b894b20-2de8-4b79-99d5-1fbd9057f1d2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.862 187189 DEBUG oslo_concurrency.lockutils [None req-65f53ff3-1170-41fe-bf69-21605a3cc660 b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Lock "0b894b20-2de8-4b79-99d5-1fbd9057f1d2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.862 187189 DEBUG oslo_concurrency.lockutils [None req-65f53ff3-1170-41fe-bf69-21605a3cc660 b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Acquiring lock "0b894b20-2de8-4b79-99d5-1fbd9057f1d2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.863 187189 DEBUG oslo_concurrency.lockutils [None req-65f53ff3-1170-41fe-bf69-21605a3cc660 b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Lock "0b894b20-2de8-4b79-99d5-1fbd9057f1d2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.863 187189 DEBUG oslo_concurrency.lockutils [None req-65f53ff3-1170-41fe-bf69-21605a3cc660 b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Lock "0b894b20-2de8-4b79-99d5-1fbd9057f1d2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.882 187189 INFO nova.compute.manager [None req-65f53ff3-1170-41fe-bf69-21605a3cc660 b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Terminating instance
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.887 187189 INFO nova.compute.manager [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Took 10.63 seconds to spawn the instance on the hypervisor.
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.888 187189 DEBUG nova.compute.manager [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:09:08 compute-0 systemd[1]: Started libpod-conmon-5245ac16a6cded310c9ec54b615a4fa6302bf752431c832cff7bc64a1ffda784.scope.
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.898 187189 DEBUG nova.compute.manager [None req-65f53ff3-1170-41fe-bf69-21605a3cc660 b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 07:09:08 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:09:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7c173566b5568ff640bb98e94656c6fd49e874a1f49db1685a6ea4cabb33b470/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:09:08 compute-0 kernel: tap9aaa10f7-2a (unregistering): left promiscuous mode
Nov 29 07:09:08 compute-0 NetworkManager[55227]: <info>  [1764400148.9474] device (tap9aaa10f7-2a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.958 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:08 compute-0 ovn_controller[95281]: 2025-11-29T07:09:08Z|00205|binding|INFO|Releasing lport 9aaa10f7-2af7-4567-82e0-f80c26727f34 from this chassis (sb_readonly=0)
Nov 29 07:09:08 compute-0 ovn_controller[95281]: 2025-11-29T07:09:08Z|00206|binding|INFO|Setting lport 9aaa10f7-2af7-4567-82e0-f80c26727f34 down in Southbound
Nov 29 07:09:08 compute-0 ovn_controller[95281]: 2025-11-29T07:09:08Z|00207|binding|INFO|Removing iface tap9aaa10f7-2a ovn-installed in OVS
Nov 29 07:09:08 compute-0 nova_compute[187185]: 2025-11-29 07:09:08.963 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:08.972 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4a:21:16 10.100.0.12'], port_security=['fa:16:3e:4a:21:16 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '0b894b20-2de8-4b79-99d5-1fbd9057f1d2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-509f7b77-378f-48db-9510-854bfa33858f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e238c36ac0d649759e01ee6569916035', 'neutron:revision_number': '4', 'neutron:security_group_ids': '746756c3-25aa-4b29-8e5f-471c63400935', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b50c6820-25b5-4f1e-9bb5-b4b77a019b0d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=9aaa10f7-2af7-4567-82e0-f80c26727f34) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:09:09 compute-0 nova_compute[187185]: 2025-11-29 07:09:09.046 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:09 compute-0 nova_compute[187185]: 2025-11-29 07:09:09.054 187189 INFO nova.compute.manager [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Took 11.76 seconds to build instance.
Nov 29 07:09:09 compute-0 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d00000058.scope: Deactivated successfully.
Nov 29 07:09:09 compute-0 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d00000058.scope: Consumed 2.643s CPU time.
Nov 29 07:09:09 compute-0 systemd-machined[153486]: Machine qemu-30-instance-00000058 terminated.
Nov 29 07:09:09 compute-0 nova_compute[187185]: 2025-11-29 07:09:09.075 187189 DEBUG oslo_concurrency.lockutils [None req-356aefa0-dad0-4ce2-9007-3ccfa614ddf2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:09:09 compute-0 podman[225534]: 2025-11-29 07:09:09.079602852 +0000 UTC m=+2.661941648 container init 5245ac16a6cded310c9ec54b615a4fa6302bf752431c832cff7bc64a1ffda784 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-509f7b77-378f-48db-9510-854bfa33858f, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 29 07:09:09 compute-0 podman[225580]: 2025-11-29 07:09:09.083495881 +0000 UTC m=+0.519782359 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:09:09 compute-0 podman[225534]: 2025-11-29 07:09:09.087343619 +0000 UTC m=+2.669682385 container start 5245ac16a6cded310c9ec54b615a4fa6302bf752431c832cff7bc64a1ffda784 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-509f7b77-378f-48db-9510-854bfa33858f, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 07:09:09 compute-0 neutron-haproxy-ovnmeta-509f7b77-378f-48db-9510-854bfa33858f[225594]: [NOTICE]   (225612) : New worker (225614) forked
Nov 29 07:09:09 compute-0 neutron-haproxy-ovnmeta-509f7b77-378f-48db-9510-854bfa33858f[225594]: [NOTICE]   (225612) : Loading success.
Nov 29 07:09:09 compute-0 nova_compute[187185]: 2025-11-29 07:09:09.186 187189 INFO nova.virt.libvirt.driver [-] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Instance destroyed successfully.
Nov 29 07:09:09 compute-0 nova_compute[187185]: 2025-11-29 07:09:09.187 187189 DEBUG nova.objects.instance [None req-65f53ff3-1170-41fe-bf69-21605a3cc660 b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Lazy-loading 'resources' on Instance uuid 0b894b20-2de8-4b79-99d5-1fbd9057f1d2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:09:09 compute-0 nova_compute[187185]: 2025-11-29 07:09:09.219 187189 DEBUG nova.virt.libvirt.vif [None req-65f53ff3-1170-41fe-bf69-21605a3cc660 b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:08:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesNegativeTestJSON-server-22491651',display_name='tempest-ImagesNegativeTestJSON-server-22491651',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesnegativetestjson-server-22491651',id=88,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:09:06Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e238c36ac0d649759e01ee6569916035',ramdisk_id='',reservation_id='r-6mf3o7mo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesNegativeTestJSON-2093773463',owner_user_name='tempest-ImagesNegativeTestJSON-2093773463-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:09:06Z,user_data=None,user_id='b9c54ea5b1e14b5d9d20f3ef82014170',uuid=0b894b20-2de8-4b79-99d5-1fbd9057f1d2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9aaa10f7-2af7-4567-82e0-f80c26727f34", "address": "fa:16:3e:4a:21:16", "network": {"id": "509f7b77-378f-48db-9510-854bfa33858f", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-319376736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e238c36ac0d649759e01ee6569916035", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9aaa10f7-2a", "ovs_interfaceid": "9aaa10f7-2af7-4567-82e0-f80c26727f34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:09:09 compute-0 nova_compute[187185]: 2025-11-29 07:09:09.219 187189 DEBUG nova.network.os_vif_util [None req-65f53ff3-1170-41fe-bf69-21605a3cc660 b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Converting VIF {"id": "9aaa10f7-2af7-4567-82e0-f80c26727f34", "address": "fa:16:3e:4a:21:16", "network": {"id": "509f7b77-378f-48db-9510-854bfa33858f", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-319376736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e238c36ac0d649759e01ee6569916035", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9aaa10f7-2a", "ovs_interfaceid": "9aaa10f7-2af7-4567-82e0-f80c26727f34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:09:09 compute-0 nova_compute[187185]: 2025-11-29 07:09:09.220 187189 DEBUG nova.network.os_vif_util [None req-65f53ff3-1170-41fe-bf69-21605a3cc660 b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4a:21:16,bridge_name='br-int',has_traffic_filtering=True,id=9aaa10f7-2af7-4567-82e0-f80c26727f34,network=Network(509f7b77-378f-48db-9510-854bfa33858f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9aaa10f7-2a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:09:09 compute-0 nova_compute[187185]: 2025-11-29 07:09:09.220 187189 DEBUG os_vif [None req-65f53ff3-1170-41fe-bf69-21605a3cc660 b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4a:21:16,bridge_name='br-int',has_traffic_filtering=True,id=9aaa10f7-2af7-4567-82e0-f80c26727f34,network=Network(509f7b77-378f-48db-9510-854bfa33858f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9aaa10f7-2a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:09:09 compute-0 nova_compute[187185]: 2025-11-29 07:09:09.222 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:09 compute-0 nova_compute[187185]: 2025-11-29 07:09:09.223 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9aaa10f7-2a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:09:09 compute-0 nova_compute[187185]: 2025-11-29 07:09:09.224 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:09 compute-0 nova_compute[187185]: 2025-11-29 07:09:09.226 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:09 compute-0 nova_compute[187185]: 2025-11-29 07:09:09.230 187189 INFO os_vif [None req-65f53ff3-1170-41fe-bf69-21605a3cc660 b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4a:21:16,bridge_name='br-int',has_traffic_filtering=True,id=9aaa10f7-2af7-4567-82e0-f80c26727f34,network=Network(509f7b77-378f-48db-9510-854bfa33858f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9aaa10f7-2a')
Nov 29 07:09:09 compute-0 nova_compute[187185]: 2025-11-29 07:09:09.231 187189 INFO nova.virt.libvirt.driver [None req-65f53ff3-1170-41fe-bf69-21605a3cc660 b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Deleting instance files /var/lib/nova/instances/0b894b20-2de8-4b79-99d5-1fbd9057f1d2_del
Nov 29 07:09:09 compute-0 nova_compute[187185]: 2025-11-29 07:09:09.232 187189 INFO nova.virt.libvirt.driver [None req-65f53ff3-1170-41fe-bf69-21605a3cc660 b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Deletion of /var/lib/nova/instances/0b894b20-2de8-4b79-99d5-1fbd9057f1d2_del complete
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:09.246 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 95792ac7-cbc8-4bad-903e-600bb3d09fce in datapath af9d1967-d1a9-4382-82b7-d9db26a40cb7 unbound from our chassis
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:09.247 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network af9d1967-d1a9-4382-82b7-d9db26a40cb7
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:09.260 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[76b73e0a-2c4e-4d3b-8e8d-75572971a38c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:09.261 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapaf9d1967-d1 in ovnmeta-af9d1967-d1a9-4382-82b7-d9db26a40cb7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:09.263 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapaf9d1967-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:09.263 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[91b1e207-66b0-4069-9b70-44f1fedf49e5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:09.264 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e7147db7-16fa-4d4f-84b5-c100ab7a3d44]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:09.279 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[99e7bfb3-efb7-42ca-9506-23eb85b329aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:09.295 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[14460817-3f5a-45b7-bf1b-fd02aeea3d16]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:09 compute-0 nova_compute[187185]: 2025-11-29 07:09:09.317 187189 INFO nova.compute.manager [None req-65f53ff3-1170-41fe-bf69-21605a3cc660 b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Took 0.42 seconds to destroy the instance on the hypervisor.
Nov 29 07:09:09 compute-0 nova_compute[187185]: 2025-11-29 07:09:09.318 187189 DEBUG oslo.service.loopingcall [None req-65f53ff3-1170-41fe-bf69-21605a3cc660 b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 07:09:09 compute-0 nova_compute[187185]: 2025-11-29 07:09:09.318 187189 DEBUG nova.compute.manager [-] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 07:09:09 compute-0 nova_compute[187185]: 2025-11-29 07:09:09.318 187189 DEBUG nova.network.neutron [-] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:09.334 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[4adb5b21-daa0-44e9-a560-5d2fd092411a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:09 compute-0 systemd-udevd[225607]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:09.342 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[2fc68489-933e-45a9-b974-330a3253582a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:09 compute-0 NetworkManager[55227]: <info>  [1764400149.3444] manager: (tapaf9d1967-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/111)
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:09.393 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[0edeccdc-6b3c-4a41-8722-8b9989e084aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:09.400 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[7ee6bd34-5afb-42aa-a6c4-692519c7cf0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:09 compute-0 NetworkManager[55227]: <info>  [1764400149.4247] device (tapaf9d1967-d0): carrier: link connected
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:09.429 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[d00241ec-a1fb-4a71-9edf-291086889ed2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:09.445 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[74263b90-fe30-4b50-8949-a92a738106d5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaf9d1967-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:21:78:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 69], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 559708, 'reachable_time': 27703, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225664, 'error': None, 'target': 'ovnmeta-af9d1967-d1a9-4382-82b7-d9db26a40cb7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:09.460 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[a4b5e0e2-c13e-4bbc-be07-51e871c34109]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe21:78da'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 559708, 'tstamp': 559708}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225665, 'error': None, 'target': 'ovnmeta-af9d1967-d1a9-4382-82b7-d9db26a40cb7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:09.475 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d38cbefc-e3d3-470f-880c-85849f107115]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaf9d1967-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:21:78:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 69], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 559708, 'reachable_time': 27703, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225666, 'error': None, 'target': 'ovnmeta-af9d1967-d1a9-4382-82b7-d9db26a40cb7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:09.498 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[52e91793-d3d3-4208-a9aa-eacedf770cc8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:09.543 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[cfb74eaa-7d8b-42b9-b05d-c8d498297b00]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:09.545 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaf9d1967-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:09.545 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:09.545 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaf9d1967-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:09:09 compute-0 nova_compute[187185]: 2025-11-29 07:09:09.547 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:09 compute-0 NetworkManager[55227]: <info>  [1764400149.5476] manager: (tapaf9d1967-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/112)
Nov 29 07:09:09 compute-0 kernel: tapaf9d1967-d0: entered promiscuous mode
Nov 29 07:09:09 compute-0 nova_compute[187185]: 2025-11-29 07:09:09.549 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:09.549 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapaf9d1967-d0, col_values=(('external_ids', {'iface-id': '0801dae7-0304-45c2-9288-7005217fa4a8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:09:09 compute-0 nova_compute[187185]: 2025-11-29 07:09:09.550 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:09 compute-0 nova_compute[187185]: 2025-11-29 07:09:09.552 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:09.553 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/af9d1967-d1a9-4382-82b7-d9db26a40cb7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/af9d1967-d1a9-4382-82b7-d9db26a40cb7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:09.554 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[a4ce94c3-a1e2-4d57-acfc-0e1e483453c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:09.554 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-af9d1967-d1a9-4382-82b7-d9db26a40cb7
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/af9d1967-d1a9-4382-82b7-d9db26a40cb7.pid.haproxy
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID af9d1967-d1a9-4382-82b7-d9db26a40cb7
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:09:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:09.556 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-af9d1967-d1a9-4382-82b7-d9db26a40cb7', 'env', 'PROCESS_TAG=haproxy-af9d1967-d1a9-4382-82b7-d9db26a40cb7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/af9d1967-d1a9-4382-82b7-d9db26a40cb7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:09:09 compute-0 ovn_controller[95281]: 2025-11-29T07:09:09Z|00208|binding|INFO|Releasing lport 0801dae7-0304-45c2-9288-7005217fa4a8 from this chassis (sb_readonly=0)
Nov 29 07:09:09 compute-0 nova_compute[187185]: 2025-11-29 07:09:09.576 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:09 compute-0 nova_compute[187185]: 2025-11-29 07:09:09.686 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:09 compute-0 podman[225676]: 2025-11-29 07:09:09.800044744 +0000 UTC m=+0.060029933 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 29 07:09:09 compute-0 podman[225718]: 2025-11-29 07:09:09.896967401 +0000 UTC m=+0.026739121 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:09:10 compute-0 nova_compute[187185]: 2025-11-29 07:09:10.755 187189 DEBUG nova.network.neutron [-] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:09:10 compute-0 nova_compute[187185]: 2025-11-29 07:09:10.776 187189 INFO nova.compute.manager [-] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Took 1.46 seconds to deallocate network for instance.
Nov 29 07:09:10 compute-0 nova_compute[187185]: 2025-11-29 07:09:10.826 187189 DEBUG nova.compute.manager [req-94a04529-41f0-48d5-a5f6-d0caae3b7cd3 req-9c6acb84-daa2-4d0d-a33a-deb7f4ef58d6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Received event network-vif-deleted-9aaa10f7-2af7-4567-82e0-f80c26727f34 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:09:10 compute-0 nova_compute[187185]: 2025-11-29 07:09:10.892 187189 DEBUG oslo_concurrency.lockutils [None req-65f53ff3-1170-41fe-bf69-21605a3cc660 b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:09:10 compute-0 nova_compute[187185]: 2025-11-29 07:09:10.892 187189 DEBUG oslo_concurrency.lockutils [None req-65f53ff3-1170-41fe-bf69-21605a3cc660 b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:09:10 compute-0 nova_compute[187185]: 2025-11-29 07:09:10.981 187189 DEBUG nova.compute.provider_tree [None req-65f53ff3-1170-41fe-bf69-21605a3cc660 b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:09:11 compute-0 nova_compute[187185]: 2025-11-29 07:09:11.010 187189 DEBUG nova.scheduler.client.report [None req-65f53ff3-1170-41fe-bf69-21605a3cc660 b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:09:11 compute-0 nova_compute[187185]: 2025-11-29 07:09:11.041 187189 DEBUG oslo_concurrency.lockutils [None req-65f53ff3-1170-41fe-bf69-21605a3cc660 b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:09:11 compute-0 nova_compute[187185]: 2025-11-29 07:09:11.074 187189 INFO nova.scheduler.client.report [None req-65f53ff3-1170-41fe-bf69-21605a3cc660 b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Deleted allocations for instance 0b894b20-2de8-4b79-99d5-1fbd9057f1d2
Nov 29 07:09:11 compute-0 nova_compute[187185]: 2025-11-29 07:09:11.188 187189 DEBUG oslo_concurrency.lockutils [None req-65f53ff3-1170-41fe-bf69-21605a3cc660 b9c54ea5b1e14b5d9d20f3ef82014170 e238c36ac0d649759e01ee6569916035 - - default default] Lock "0b894b20-2de8-4b79-99d5-1fbd9057f1d2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.326s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:09:11 compute-0 podman[225718]: 2025-11-29 07:09:11.98907794 +0000 UTC m=+2.118849640 container create e5341b8812ca666dca97f396d4e7b10a9e84d028ce1648b94ede9125ce589339 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-af9d1967-d1a9-4382-82b7-d9db26a40cb7, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Nov 29 07:09:12 compute-0 systemd[1]: Started libpod-conmon-e5341b8812ca666dca97f396d4e7b10a9e84d028ce1648b94ede9125ce589339.scope.
Nov 29 07:09:12 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:09:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a08024df7e89767ec95c0384589ca417e6d51860832044e81ae1144023c31102/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:09:12 compute-0 podman[225718]: 2025-11-29 07:09:12.724096076 +0000 UTC m=+2.853867866 container init e5341b8812ca666dca97f396d4e7b10a9e84d028ce1648b94ede9125ce589339 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-af9d1967-d1a9-4382-82b7-d9db26a40cb7, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 29 07:09:12 compute-0 podman[225718]: 2025-11-29 07:09:12.73521521 +0000 UTC m=+2.864986910 container start e5341b8812ca666dca97f396d4e7b10a9e84d028ce1648b94ede9125ce589339 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-af9d1967-d1a9-4382-82b7-d9db26a40cb7, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 29 07:09:12 compute-0 neutron-haproxy-ovnmeta-af9d1967-d1a9-4382-82b7-d9db26a40cb7[225734]: [NOTICE]   (225738) : New worker (225740) forked
Nov 29 07:09:12 compute-0 neutron-haproxy-ovnmeta-af9d1967-d1a9-4382-82b7-d9db26a40cb7[225734]: [NOTICE]   (225738) : Loading success.
Nov 29 07:09:12 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:12.830 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 9aaa10f7-2af7-4567-82e0-f80c26727f34 in datapath 509f7b77-378f-48db-9510-854bfa33858f unbound from our chassis
Nov 29 07:09:12 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:12.833 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 509f7b77-378f-48db-9510-854bfa33858f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:09:12 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:12.835 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[48f0d42b-a07d-44c3-8ce2-08ce31a4c475]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:12 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:12.836 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-509f7b77-378f-48db-9510-854bfa33858f namespace which is not needed anymore
Nov 29 07:09:13 compute-0 neutron-haproxy-ovnmeta-509f7b77-378f-48db-9510-854bfa33858f[225594]: [NOTICE]   (225612) : haproxy version is 2.8.14-c23fe91
Nov 29 07:09:13 compute-0 neutron-haproxy-ovnmeta-509f7b77-378f-48db-9510-854bfa33858f[225594]: [NOTICE]   (225612) : path to executable is /usr/sbin/haproxy
Nov 29 07:09:13 compute-0 neutron-haproxy-ovnmeta-509f7b77-378f-48db-9510-854bfa33858f[225594]: [WARNING]  (225612) : Exiting Master process...
Nov 29 07:09:13 compute-0 neutron-haproxy-ovnmeta-509f7b77-378f-48db-9510-854bfa33858f[225594]: [ALERT]    (225612) : Current worker (225614) exited with code 143 (Terminated)
Nov 29 07:09:13 compute-0 neutron-haproxy-ovnmeta-509f7b77-378f-48db-9510-854bfa33858f[225594]: [WARNING]  (225612) : All workers exited. Exiting... (0)
Nov 29 07:09:13 compute-0 systemd[1]: libpod-5245ac16a6cded310c9ec54b615a4fa6302bf752431c832cff7bc64a1ffda784.scope: Deactivated successfully.
Nov 29 07:09:13 compute-0 conmon[225594]: conmon 5245ac16a6cded310c9e <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-5245ac16a6cded310c9ec54b615a4fa6302bf752431c832cff7bc64a1ffda784.scope/container/memory.events
Nov 29 07:09:13 compute-0 podman[225766]: 2025-11-29 07:09:13.470536855 +0000 UTC m=+0.539449250 container died 5245ac16a6cded310c9ec54b615a4fa6302bf752431c832cff7bc64a1ffda784 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-509f7b77-378f-48db-9510-854bfa33858f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 07:09:13 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5245ac16a6cded310c9ec54b615a4fa6302bf752431c832cff7bc64a1ffda784-userdata-shm.mount: Deactivated successfully.
Nov 29 07:09:13 compute-0 systemd[1]: var-lib-containers-storage-overlay-7c173566b5568ff640bb98e94656c6fd49e874a1f49db1685a6ea4cabb33b470-merged.mount: Deactivated successfully.
Nov 29 07:09:13 compute-0 podman[225766]: 2025-11-29 07:09:13.6220866 +0000 UTC m=+0.690998985 container cleanup 5245ac16a6cded310c9ec54b615a4fa6302bf752431c832cff7bc64a1ffda784 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-509f7b77-378f-48db-9510-854bfa33858f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 07:09:13 compute-0 systemd[1]: libpod-conmon-5245ac16a6cded310c9ec54b615a4fa6302bf752431c832cff7bc64a1ffda784.scope: Deactivated successfully.
Nov 29 07:09:13 compute-0 nova_compute[187185]: 2025-11-29 07:09:13.777 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:13 compute-0 NetworkManager[55227]: <info>  [1764400153.7790] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/113)
Nov 29 07:09:13 compute-0 NetworkManager[55227]: <info>  [1764400153.7807] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/114)
Nov 29 07:09:13 compute-0 nova_compute[187185]: 2025-11-29 07:09:13.842 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:13 compute-0 ovn_controller[95281]: 2025-11-29T07:09:13Z|00209|binding|INFO|Releasing lport 8f308f4a-d247-458f-940c-c0a5d51a9ea1 from this chassis (sb_readonly=0)
Nov 29 07:09:13 compute-0 ovn_controller[95281]: 2025-11-29T07:09:13Z|00210|binding|INFO|Releasing lport 0801dae7-0304-45c2-9288-7005217fa4a8 from this chassis (sb_readonly=0)
Nov 29 07:09:13 compute-0 nova_compute[187185]: 2025-11-29 07:09:13.861 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:13 compute-0 podman[225795]: 2025-11-29 07:09:13.87801636 +0000 UTC m=+0.228359413 container remove 5245ac16a6cded310c9ec54b615a4fa6302bf752431c832cff7bc64a1ffda784 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-509f7b77-378f-48db-9510-854bfa33858f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:09:13 compute-0 nova_compute[187185]: 2025-11-29 07:09:13.882 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:13 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:13.883 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[fd9c6d54-1b39-44fd-959b-3065a48175dd]: (4, ('Sat Nov 29 07:09:12 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-509f7b77-378f-48db-9510-854bfa33858f (5245ac16a6cded310c9ec54b615a4fa6302bf752431c832cff7bc64a1ffda784)\n5245ac16a6cded310c9ec54b615a4fa6302bf752431c832cff7bc64a1ffda784\nSat Nov 29 07:09:13 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-509f7b77-378f-48db-9510-854bfa33858f (5245ac16a6cded310c9ec54b615a4fa6302bf752431c832cff7bc64a1ffda784)\n5245ac16a6cded310c9ec54b615a4fa6302bf752431c832cff7bc64a1ffda784\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:13 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:13.885 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[dca94c80-c98f-418d-87f2-61055a238d5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:13 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:13.886 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap509f7b77-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:09:13 compute-0 nova_compute[187185]: 2025-11-29 07:09:13.888 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:13 compute-0 kernel: tap509f7b77-30: left promiscuous mode
Nov 29 07:09:13 compute-0 nova_compute[187185]: 2025-11-29 07:09:13.900 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:13 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:13.902 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d6f807bc-202f-4100-8300-f33d0807d9d7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:13 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:13.920 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[de439d4a-78e5-42fa-9605-cbadd626d9b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:13 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:13.922 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[34d52be3-6ab0-461b-a387-184ea6c0ca3e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:13 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:13.940 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[a55b716b-b49b-4fc7-9507-c11d1f6041b5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 559339, 'reachable_time': 37353, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225814, 'error': None, 'target': 'ovnmeta-509f7b77-378f-48db-9510-854bfa33858f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:13 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:13.945 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-509f7b77-378f-48db-9510-854bfa33858f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:09:13 compute-0 systemd[1]: run-netns-ovnmeta\x2d509f7b77\x2d378f\x2d48db\x2d9510\x2d854bfa33858f.mount: Deactivated successfully.
Nov 29 07:09:13 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:13.945 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[e27660c2-9914-40e0-b46a-a4cd8dfadae0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:14 compute-0 nova_compute[187185]: 2025-11-29 07:09:14.201 187189 DEBUG nova.compute.manager [req-aa064030-3a51-4216-a730-309c300ed1e5 req-307f9050-0f86-47eb-b8a6-d616736b1db1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Received event network-changed-95792ac7-cbc8-4bad-903e-600bb3d09fce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:09:14 compute-0 nova_compute[187185]: 2025-11-29 07:09:14.201 187189 DEBUG nova.compute.manager [req-aa064030-3a51-4216-a730-309c300ed1e5 req-307f9050-0f86-47eb-b8a6-d616736b1db1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Refreshing instance network info cache due to event network-changed-95792ac7-cbc8-4bad-903e-600bb3d09fce. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:09:14 compute-0 nova_compute[187185]: 2025-11-29 07:09:14.201 187189 DEBUG oslo_concurrency.lockutils [req-aa064030-3a51-4216-a730-309c300ed1e5 req-307f9050-0f86-47eb-b8a6-d616736b1db1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-6d4e9a0c-c91c-45a4-911d-7526b420a8a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:09:14 compute-0 nova_compute[187185]: 2025-11-29 07:09:14.201 187189 DEBUG oslo_concurrency.lockutils [req-aa064030-3a51-4216-a730-309c300ed1e5 req-307f9050-0f86-47eb-b8a6-d616736b1db1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-6d4e9a0c-c91c-45a4-911d-7526b420a8a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:09:14 compute-0 nova_compute[187185]: 2025-11-29 07:09:14.202 187189 DEBUG nova.network.neutron [req-aa064030-3a51-4216-a730-309c300ed1e5 req-307f9050-0f86-47eb-b8a6-d616736b1db1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Refreshing network info cache for port 95792ac7-cbc8-4bad-903e-600bb3d09fce _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:09:14 compute-0 nova_compute[187185]: 2025-11-29 07:09:14.226 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:14 compute-0 nova_compute[187185]: 2025-11-29 07:09:14.688 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:15 compute-0 sshd-session[225819]: Connection closed by 5.101.64.6 port 60023
Nov 29 07:09:16 compute-0 nova_compute[187185]: 2025-11-29 07:09:16.736 187189 DEBUG nova.network.neutron [req-aa064030-3a51-4216-a730-309c300ed1e5 req-307f9050-0f86-47eb-b8a6-d616736b1db1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Updated VIF entry in instance network info cache for port 95792ac7-cbc8-4bad-903e-600bb3d09fce. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:09:16 compute-0 nova_compute[187185]: 2025-11-29 07:09:16.736 187189 DEBUG nova.network.neutron [req-aa064030-3a51-4216-a730-309c300ed1e5 req-307f9050-0f86-47eb-b8a6-d616736b1db1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Updating instance_info_cache with network_info: [{"id": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "address": "fa:16:3e:a1:a1:8f", "network": {"id": "af9d1967-d1a9-4382-82b7-d9db26a40cb7", "bridge": "br-int", "label": "tempest-network-smoke--1326373200", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95792ac7-cb", "ovs_interfaceid": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:09:16 compute-0 nova_compute[187185]: 2025-11-29 07:09:16.766 187189 DEBUG oslo_concurrency.lockutils [req-aa064030-3a51-4216-a730-309c300ed1e5 req-307f9050-0f86-47eb-b8a6-d616736b1db1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-6d4e9a0c-c91c-45a4-911d-7526b420a8a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:09:16 compute-0 ovn_controller[95281]: 2025-11-29T07:09:16Z|00211|binding|INFO|Releasing lport 0801dae7-0304-45c2-9288-7005217fa4a8 from this chassis (sb_readonly=0)
Nov 29 07:09:16 compute-0 nova_compute[187185]: 2025-11-29 07:09:16.919 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:19 compute-0 nova_compute[187185]: 2025-11-29 07:09:19.228 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:19 compute-0 nova_compute[187185]: 2025-11-29 07:09:19.692 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:20 compute-0 podman[225822]: 2025-11-29 07:09:20.848093178 +0000 UTC m=+0.093209010 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Nov 29 07:09:20 compute-0 podman[225823]: 2025-11-29 07:09:20.857247456 +0000 UTC m=+0.096920335 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1755695350, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, build-date=2025-08-20T13:12:41, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 29 07:09:20 compute-0 podman[225824]: 2025-11-29 07:09:20.857546115 +0000 UTC m=+0.094604750 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 07:09:23 compute-0 ovn_controller[95281]: 2025-11-29T07:09:23Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a1:a1:8f 10.100.0.8
Nov 29 07:09:23 compute-0 ovn_controller[95281]: 2025-11-29T07:09:23Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a1:a1:8f 10.100.0.8
Nov 29 07:09:24 compute-0 nova_compute[187185]: 2025-11-29 07:09:24.184 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400149.1829998, 0b894b20-2de8-4b79-99d5-1fbd9057f1d2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:09:24 compute-0 nova_compute[187185]: 2025-11-29 07:09:24.184 187189 INFO nova.compute.manager [-] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] VM Stopped (Lifecycle Event)
Nov 29 07:09:24 compute-0 nova_compute[187185]: 2025-11-29 07:09:24.231 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:24 compute-0 nova_compute[187185]: 2025-11-29 07:09:24.621 187189 DEBUG nova.compute.manager [None req-8d2dadb5-a79e-4d37-95bc-1a9c4a4595a6 - - - - - -] [instance: 0b894b20-2de8-4b79-99d5-1fbd9057f1d2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:09:24 compute-0 nova_compute[187185]: 2025-11-29 07:09:24.695 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:24.828 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:09:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:24.829 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:09:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:24.830 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:09:26 compute-0 ovn_controller[95281]: 2025-11-29T07:09:26Z|00212|binding|INFO|Releasing lport 0801dae7-0304-45c2-9288-7005217fa4a8 from this chassis (sb_readonly=0)
Nov 29 07:09:26 compute-0 nova_compute[187185]: 2025-11-29 07:09:26.620 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:29 compute-0 nova_compute[187185]: 2025-11-29 07:09:29.234 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:29 compute-0 nova_compute[187185]: 2025-11-29 07:09:29.698 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:31 compute-0 nova_compute[187185]: 2025-11-29 07:09:31.401 187189 INFO nova.compute.manager [None req-d4254def-f2e0-4418-a3b1-90c3b54217d7 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Get console output
Nov 29 07:09:31 compute-0 nova_compute[187185]: 2025-11-29 07:09:31.540 213758 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 29 07:09:31 compute-0 podman[225907]: 2025-11-29 07:09:31.863883051 +0000 UTC m=+0.108613934 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:09:34 compute-0 nova_compute[187185]: 2025-11-29 07:09:34.236 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:34 compute-0 nova_compute[187185]: 2025-11-29 07:09:34.701 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:35 compute-0 nova_compute[187185]: 2025-11-29 07:09:35.815 187189 INFO nova.compute.manager [None req-c2aec038-b0cb-427b-98e0-0d8038caffa6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Get console output
Nov 29 07:09:35 compute-0 nova_compute[187185]: 2025-11-29 07:09:35.820 213758 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 29 07:09:36 compute-0 nova_compute[187185]: 2025-11-29 07:09:36.503 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:36 compute-0 podman[225933]: 2025-11-29 07:09:36.799284882 +0000 UTC m=+0.065371806 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 07:09:39 compute-0 nova_compute[187185]: 2025-11-29 07:09:39.238 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:39 compute-0 nova_compute[187185]: 2025-11-29 07:09:39.705 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:39 compute-0 podman[225959]: 2025-11-29 07:09:39.794576731 +0000 UTC m=+0.065987282 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:09:40 compute-0 podman[225981]: 2025-11-29 07:09:40.80018531 +0000 UTC m=+0.064906592 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 07:09:40 compute-0 sshd-session[225982]: Connection closed by 5.101.64.6 port 51610
Nov 29 07:09:40 compute-0 nova_compute[187185]: 2025-11-29 07:09:40.884 187189 DEBUG oslo_concurrency.lockutils [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Acquiring lock "refresh_cache-6d4e9a0c-c91c-45a4-911d-7526b420a8a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:09:40 compute-0 nova_compute[187185]: 2025-11-29 07:09:40.885 187189 DEBUG oslo_concurrency.lockutils [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Acquired lock "refresh_cache-6d4e9a0c-c91c-45a4-911d-7526b420a8a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:09:40 compute-0 nova_compute[187185]: 2025-11-29 07:09:40.885 187189 DEBUG nova.network.neutron [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:09:41 compute-0 sshd-session[226002]: Unable to negotiate with 5.101.64.6 port 51616: no matching host key type found. Their offer: ssh-rsa,ssh-dss [preauth]
Nov 29 07:09:42 compute-0 nova_compute[187185]: 2025-11-29 07:09:42.444 187189 DEBUG nova.network.neutron [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Updating instance_info_cache with network_info: [{"id": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "address": "fa:16:3e:a1:a1:8f", "network": {"id": "af9d1967-d1a9-4382-82b7-d9db26a40cb7", "bridge": "br-int", "label": "tempest-network-smoke--1326373200", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95792ac7-cb", "ovs_interfaceid": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:09:42 compute-0 nova_compute[187185]: 2025-11-29 07:09:42.469 187189 DEBUG oslo_concurrency.lockutils [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Releasing lock "refresh_cache-6d4e9a0c-c91c-45a4-911d-7526b420a8a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:09:42 compute-0 nova_compute[187185]: 2025-11-29 07:09:42.626 187189 DEBUG nova.virt.libvirt.driver [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Nov 29 07:09:42 compute-0 nova_compute[187185]: 2025-11-29 07:09:42.627 187189 DEBUG nova.virt.libvirt.volume.remotefs [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Creating file /var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/f52e0e1c9ca846fc8d5a757058046438.tmp on remote host 192.168.122.102 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Nov 29 07:09:42 compute-0 nova_compute[187185]: 2025-11-29 07:09:42.628 187189 DEBUG oslo_concurrency.processutils [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/f52e0e1c9ca846fc8d5a757058046438.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:09:42 compute-0 nova_compute[187185]: 2025-11-29 07:09:42.798 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:42 compute-0 nova_compute[187185]: 2025-11-29 07:09:42.862 187189 DEBUG oslo_concurrency.processutils [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/f52e0e1c9ca846fc8d5a757058046438.tmp" returned: 1 in 0.234s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:09:42 compute-0 nova_compute[187185]: 2025-11-29 07:09:42.863 187189 DEBUG oslo_concurrency.processutils [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] 'ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/f52e0e1c9ca846fc8d5a757058046438.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Nov 29 07:09:42 compute-0 nova_compute[187185]: 2025-11-29 07:09:42.863 187189 DEBUG nova.virt.libvirt.volume.remotefs [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Creating directory /var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9 on remote host 192.168.122.102 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Nov 29 07:09:42 compute-0 nova_compute[187185]: 2025-11-29 07:09:42.864 187189 DEBUG oslo_concurrency.processutils [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:09:43 compute-0 nova_compute[187185]: 2025-11-29 07:09:43.070 187189 DEBUG oslo_concurrency.processutils [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9" returned: 0 in 0.206s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:09:43 compute-0 nova_compute[187185]: 2025-11-29 07:09:43.082 187189 DEBUG nova.virt.libvirt.driver [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Nov 29 07:09:43 compute-0 nova_compute[187185]: 2025-11-29 07:09:43.469 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:44 compute-0 nova_compute[187185]: 2025-11-29 07:09:44.240 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:44 compute-0 nova_compute[187185]: 2025-11-29 07:09:44.707 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:45 compute-0 kernel: tap95792ac7-cb (unregistering): left promiscuous mode
Nov 29 07:09:45 compute-0 NetworkManager[55227]: <info>  [1764400185.3505] device (tap95792ac7-cb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:09:45 compute-0 ovn_controller[95281]: 2025-11-29T07:09:45Z|00213|binding|INFO|Releasing lport 95792ac7-cbc8-4bad-903e-600bb3d09fce from this chassis (sb_readonly=0)
Nov 29 07:09:45 compute-0 ovn_controller[95281]: 2025-11-29T07:09:45Z|00214|binding|INFO|Setting lport 95792ac7-cbc8-4bad-903e-600bb3d09fce down in Southbound
Nov 29 07:09:45 compute-0 nova_compute[187185]: 2025-11-29 07:09:45.360 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:45 compute-0 ovn_controller[95281]: 2025-11-29T07:09:45Z|00215|binding|INFO|Removing iface tap95792ac7-cb ovn-installed in OVS
Nov 29 07:09:45 compute-0 nova_compute[187185]: 2025-11-29 07:09:45.363 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:45 compute-0 nova_compute[187185]: 2025-11-29 07:09:45.386 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:45 compute-0 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d00000059.scope: Deactivated successfully.
Nov 29 07:09:45 compute-0 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d00000059.scope: Consumed 14.687s CPU time.
Nov 29 07:09:45 compute-0 systemd-machined[153486]: Machine qemu-31-instance-00000059 terminated.
Nov 29 07:09:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:45.434 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a1:a1:8f 10.100.0.8'], port_security=['fa:16:3e:a1:a1:8f 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '6d4e9a0c-c91c-45a4-911d-7526b420a8a9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-af9d1967-d1a9-4382-82b7-d9db26a40cb7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c231e63624d44fc19e0989abfb1afb22', 'neutron:revision_number': '4', 'neutron:security_group_ids': '376a466b-335f-4204-8812-ec229fd4d3b3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.208'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2abd3f5a-1a92-4bfd-a631-54a420dbc598, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=95792ac7-cbc8-4bad-903e-600bb3d09fce) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:09:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:45.438 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 95792ac7-cbc8-4bad-903e-600bb3d09fce in datapath af9d1967-d1a9-4382-82b7-d9db26a40cb7 unbound from our chassis
Nov 29 07:09:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:45.440 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network af9d1967-d1a9-4382-82b7-d9db26a40cb7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:09:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:45.442 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6d1baedd-eb46-4879-ad71-887fc0e7ad22]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:45.442 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-af9d1967-d1a9-4382-82b7-d9db26a40cb7 namespace which is not needed anymore
Nov 29 07:09:45 compute-0 nova_compute[187185]: 2025-11-29 07:09:45.585 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:45 compute-0 nova_compute[187185]: 2025-11-29 07:09:45.590 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:45 compute-0 neutron-haproxy-ovnmeta-af9d1967-d1a9-4382-82b7-d9db26a40cb7[225734]: [NOTICE]   (225738) : haproxy version is 2.8.14-c23fe91
Nov 29 07:09:45 compute-0 neutron-haproxy-ovnmeta-af9d1967-d1a9-4382-82b7-d9db26a40cb7[225734]: [NOTICE]   (225738) : path to executable is /usr/sbin/haproxy
Nov 29 07:09:45 compute-0 neutron-haproxy-ovnmeta-af9d1967-d1a9-4382-82b7-d9db26a40cb7[225734]: [WARNING]  (225738) : Exiting Master process...
Nov 29 07:09:45 compute-0 neutron-haproxy-ovnmeta-af9d1967-d1a9-4382-82b7-d9db26a40cb7[225734]: [WARNING]  (225738) : Exiting Master process...
Nov 29 07:09:45 compute-0 neutron-haproxy-ovnmeta-af9d1967-d1a9-4382-82b7-d9db26a40cb7[225734]: [ALERT]    (225738) : Current worker (225740) exited with code 143 (Terminated)
Nov 29 07:09:45 compute-0 neutron-haproxy-ovnmeta-af9d1967-d1a9-4382-82b7-d9db26a40cb7[225734]: [WARNING]  (225738) : All workers exited. Exiting... (0)
Nov 29 07:09:45 compute-0 systemd[1]: libpod-e5341b8812ca666dca97f396d4e7b10a9e84d028ce1648b94ede9125ce589339.scope: Deactivated successfully.
Nov 29 07:09:45 compute-0 podman[226031]: 2025-11-29 07:09:45.831791324 +0000 UTC m=+0.289657552 container died e5341b8812ca666dca97f396d4e7b10a9e84d028ce1648b94ede9125ce589339 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-af9d1967-d1a9-4382-82b7-d9db26a40cb7, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 07:09:46 compute-0 nova_compute[187185]: 2025-11-29 07:09:46.101 187189 INFO nova.virt.libvirt.driver [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Instance shutdown successfully after 3 seconds.
Nov 29 07:09:46 compute-0 nova_compute[187185]: 2025-11-29 07:09:46.109 187189 INFO nova.virt.libvirt.driver [-] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Instance destroyed successfully.
Nov 29 07:09:46 compute-0 nova_compute[187185]: 2025-11-29 07:09:46.110 187189 DEBUG nova.virt.libvirt.vif [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:08:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1405928271',display_name='tempest-TestNetworkAdvancedServerOps-server-1405928271',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1405928271',id=89,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJzX+cYphgzFb/LmLSqgC4l/EgTLaDqQRgz2oIoLmiT9pJmbbaoOE/h8lTp9y4P6Lqu0yte5POR0cnSIwuT6ICbf/J95VY/pQuT7Mh/Rw0RaK2X3rgSaxQ5jqSeZ2XDRaw==',key_name='tempest-TestNetworkAdvancedServerOps-1604525815',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:09:08Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-q0t03bzu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:09:40Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=6d4e9a0c-c91c-45a4-911d-7526b420a8a9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "address": "fa:16:3e:a1:a1:8f", "network": {"id": "af9d1967-d1a9-4382-82b7-d9db26a40cb7", "bridge": "br-int", "label": "tempest-network-smoke--1326373200", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1326373200", "vif_mac": "fa:16:3e:a1:a1:8f"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95792ac7-cb", "ovs_interfaceid": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:09:46 compute-0 nova_compute[187185]: 2025-11-29 07:09:46.111 187189 DEBUG nova.network.os_vif_util [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Converting VIF {"id": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "address": "fa:16:3e:a1:a1:8f", "network": {"id": "af9d1967-d1a9-4382-82b7-d9db26a40cb7", "bridge": "br-int", "label": "tempest-network-smoke--1326373200", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1326373200", "vif_mac": "fa:16:3e:a1:a1:8f"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95792ac7-cb", "ovs_interfaceid": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:09:46 compute-0 nova_compute[187185]: 2025-11-29 07:09:46.112 187189 DEBUG nova.network.os_vif_util [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a1:a1:8f,bridge_name='br-int',has_traffic_filtering=True,id=95792ac7-cbc8-4bad-903e-600bb3d09fce,network=Network(af9d1967-d1a9-4382-82b7-d9db26a40cb7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap95792ac7-cb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:09:46 compute-0 nova_compute[187185]: 2025-11-29 07:09:46.113 187189 DEBUG os_vif [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a1:a1:8f,bridge_name='br-int',has_traffic_filtering=True,id=95792ac7-cbc8-4bad-903e-600bb3d09fce,network=Network(af9d1967-d1a9-4382-82b7-d9db26a40cb7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap95792ac7-cb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:09:46 compute-0 nova_compute[187185]: 2025-11-29 07:09:46.115 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:46 compute-0 nova_compute[187185]: 2025-11-29 07:09:46.115 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap95792ac7-cb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:09:46 compute-0 nova_compute[187185]: 2025-11-29 07:09:46.155 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:46 compute-0 nova_compute[187185]: 2025-11-29 07:09:46.158 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:09:46 compute-0 nova_compute[187185]: 2025-11-29 07:09:46.162 187189 INFO os_vif [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a1:a1:8f,bridge_name='br-int',has_traffic_filtering=True,id=95792ac7-cbc8-4bad-903e-600bb3d09fce,network=Network(af9d1967-d1a9-4382-82b7-d9db26a40cb7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap95792ac7-cb')
Nov 29 07:09:46 compute-0 nova_compute[187185]: 2025-11-29 07:09:46.169 187189 DEBUG oslo_concurrency.processutils [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:09:46 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e5341b8812ca666dca97f396d4e7b10a9e84d028ce1648b94ede9125ce589339-userdata-shm.mount: Deactivated successfully.
Nov 29 07:09:46 compute-0 systemd[1]: var-lib-containers-storage-overlay-a08024df7e89767ec95c0384589ca417e6d51860832044e81ae1144023c31102-merged.mount: Deactivated successfully.
Nov 29 07:09:46 compute-0 nova_compute[187185]: 2025-11-29 07:09:46.279 187189 DEBUG oslo_concurrency.processutils [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/disk --force-share --output=json" returned: 0 in 0.111s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:09:46 compute-0 nova_compute[187185]: 2025-11-29 07:09:46.281 187189 DEBUG oslo_concurrency.processutils [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:09:46 compute-0 podman[226031]: 2025-11-29 07:09:46.283886877 +0000 UTC m=+0.741753115 container cleanup e5341b8812ca666dca97f396d4e7b10a9e84d028ce1648b94ede9125ce589339 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-af9d1967-d1a9-4382-82b7-d9db26a40cb7, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Nov 29 07:09:46 compute-0 systemd[1]: libpod-conmon-e5341b8812ca666dca97f396d4e7b10a9e84d028ce1648b94ede9125ce589339.scope: Deactivated successfully.
Nov 29 07:09:46 compute-0 nova_compute[187185]: 2025-11-29 07:09:46.352 187189 DEBUG oslo_concurrency.processutils [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:09:46 compute-0 nova_compute[187185]: 2025-11-29 07:09:46.355 187189 DEBUG nova.virt.libvirt.volume.remotefs [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Copying file /var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9_resize/disk to 192.168.122.102:/var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 29 07:09:46 compute-0 nova_compute[187185]: 2025-11-29 07:09:46.355 187189 DEBUG oslo_concurrency.processutils [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9_resize/disk 192.168.122.102:/var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:09:46 compute-0 podman[226082]: 2025-11-29 07:09:46.369342248 +0000 UTC m=+0.059018166 container remove e5341b8812ca666dca97f396d4e7b10a9e84d028ce1648b94ede9125ce589339 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-af9d1967-d1a9-4382-82b7-d9db26a40cb7, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 07:09:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:46.376 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[ed00f1d5-8e84-4d51-92de-df6d7a572a94]: (4, ('Sat Nov 29 07:09:45 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-af9d1967-d1a9-4382-82b7-d9db26a40cb7 (e5341b8812ca666dca97f396d4e7b10a9e84d028ce1648b94ede9125ce589339)\ne5341b8812ca666dca97f396d4e7b10a9e84d028ce1648b94ede9125ce589339\nSat Nov 29 07:09:46 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-af9d1967-d1a9-4382-82b7-d9db26a40cb7 (e5341b8812ca666dca97f396d4e7b10a9e84d028ce1648b94ede9125ce589339)\ne5341b8812ca666dca97f396d4e7b10a9e84d028ce1648b94ede9125ce589339\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:46.380 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[27e3bc61-458c-4f41-8326-7b1882ad9a2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:46.381 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaf9d1967-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:09:46 compute-0 nova_compute[187185]: 2025-11-29 07:09:46.384 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:46 compute-0 kernel: tapaf9d1967-d0: left promiscuous mode
Nov 29 07:09:46 compute-0 nova_compute[187185]: 2025-11-29 07:09:46.396 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:46.400 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[2ecc5081-43f3-4744-b614-08e064851a3a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:46.418 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[813acec9-d446-4b2b-825e-d2e66176d2d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:46.423 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[c1c3ac69-6f1b-4d3f-9e1f-639e4836baed]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:46.446 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[054397bf-b266-4982-aab3-9d2d7b4e7c3f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 559698, 'reachable_time': 28972, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226101, 'error': None, 'target': 'ovnmeta-af9d1967-d1a9-4382-82b7-d9db26a40cb7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:46 compute-0 systemd[1]: run-netns-ovnmeta\x2daf9d1967\x2dd1a9\x2d4382\x2d82b7\x2dd9db26a40cb7.mount: Deactivated successfully.
Nov 29 07:09:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:46.450 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-af9d1967-d1a9-4382-82b7-d9db26a40cb7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:09:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:46.451 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[10b6fc45-ef1e-4658-aabc-30a8b1704b2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:09:46 compute-0 nova_compute[187185]: 2025-11-29 07:09:46.976 187189 DEBUG oslo_concurrency.processutils [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] CMD "scp -r /var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9_resize/disk 192.168.122.102:/var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/disk" returned: 0 in 0.621s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:09:46 compute-0 nova_compute[187185]: 2025-11-29 07:09:46.977 187189 DEBUG nova.virt.libvirt.volume.remotefs [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Copying file /var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9_resize/disk.config to 192.168.122.102:/var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 29 07:09:46 compute-0 nova_compute[187185]: 2025-11-29 07:09:46.977 187189 DEBUG oslo_concurrency.processutils [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9_resize/disk.config 192.168.122.102:/var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:09:47 compute-0 nova_compute[187185]: 2025-11-29 07:09:47.246 187189 DEBUG oslo_concurrency.processutils [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] CMD "scp -C -r /var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9_resize/disk.config 192.168.122.102:/var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/disk.config" returned: 0 in 0.268s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:09:47 compute-0 nova_compute[187185]: 2025-11-29 07:09:47.248 187189 DEBUG nova.virt.libvirt.volume.remotefs [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Copying file /var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9_resize/disk.info to 192.168.122.102:/var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 29 07:09:47 compute-0 nova_compute[187185]: 2025-11-29 07:09:47.248 187189 DEBUG oslo_concurrency.processutils [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9_resize/disk.info 192.168.122.102:/var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:09:47 compute-0 nova_compute[187185]: 2025-11-29 07:09:47.463 187189 DEBUG oslo_concurrency.processutils [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] CMD "scp -C -r /var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9_resize/disk.info 192.168.122.102:/var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/disk.info" returned: 0 in 0.215s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:09:48 compute-0 nova_compute[187185]: 2025-11-29 07:09:48.024 187189 DEBUG neutronclient.v2_0.client [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 95792ac7-cbc8-4bad-903e-600bb3d09fce for host compute-2.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Nov 29 07:09:48 compute-0 nova_compute[187185]: 2025-11-29 07:09:48.235 187189 DEBUG oslo_concurrency.lockutils [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Acquiring lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:09:48 compute-0 nova_compute[187185]: 2025-11-29 07:09:48.236 187189 DEBUG oslo_concurrency.lockutils [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:09:48 compute-0 nova_compute[187185]: 2025-11-29 07:09:48.236 187189 DEBUG oslo_concurrency.lockutils [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:09:49 compute-0 nova_compute[187185]: 2025-11-29 07:09:49.209 187189 DEBUG nova.compute.manager [req-8939c2c2-3fad-44be-a352-d53eeefb843d req-a74c8ecf-2f1a-4e10-932e-24bb06b8a411 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Received event network-vif-unplugged-95792ac7-cbc8-4bad-903e-600bb3d09fce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:09:49 compute-0 nova_compute[187185]: 2025-11-29 07:09:49.209 187189 DEBUG oslo_concurrency.lockutils [req-8939c2c2-3fad-44be-a352-d53eeefb843d req-a74c8ecf-2f1a-4e10-932e-24bb06b8a411 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:09:49 compute-0 nova_compute[187185]: 2025-11-29 07:09:49.210 187189 DEBUG oslo_concurrency.lockutils [req-8939c2c2-3fad-44be-a352-d53eeefb843d req-a74c8ecf-2f1a-4e10-932e-24bb06b8a411 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:09:49 compute-0 nova_compute[187185]: 2025-11-29 07:09:49.210 187189 DEBUG oslo_concurrency.lockutils [req-8939c2c2-3fad-44be-a352-d53eeefb843d req-a74c8ecf-2f1a-4e10-932e-24bb06b8a411 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:09:49 compute-0 nova_compute[187185]: 2025-11-29 07:09:49.210 187189 DEBUG nova.compute.manager [req-8939c2c2-3fad-44be-a352-d53eeefb843d req-a74c8ecf-2f1a-4e10-932e-24bb06b8a411 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] No waiting events found dispatching network-vif-unplugged-95792ac7-cbc8-4bad-903e-600bb3d09fce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:09:49 compute-0 nova_compute[187185]: 2025-11-29 07:09:49.210 187189 WARNING nova.compute.manager [req-8939c2c2-3fad-44be-a352-d53eeefb843d req-a74c8ecf-2f1a-4e10-932e-24bb06b8a411 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Received unexpected event network-vif-unplugged-95792ac7-cbc8-4bad-903e-600bb3d09fce for instance with vm_state active and task_state resize_migrated.
Nov 29 07:09:49 compute-0 nova_compute[187185]: 2025-11-29 07:09:49.709 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:51 compute-0 nova_compute[187185]: 2025-11-29 07:09:51.156 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:51 compute-0 podman[226106]: 2025-11-29 07:09:51.792328854 +0000 UTC m=+0.057473632 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent)
Nov 29 07:09:51 compute-0 podman[226108]: 2025-11-29 07:09:51.803971293 +0000 UTC m=+0.064209723 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 07:09:51 compute-0 podman[226107]: 2025-11-29 07:09:51.810132557 +0000 UTC m=+0.073311160 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_id=edpm, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, release=1755695350, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64)
Nov 29 07:09:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:52.646 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:09:52 compute-0 nova_compute[187185]: 2025-11-29 07:09:52.647 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:52.648 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:09:52 compute-0 nova_compute[187185]: 2025-11-29 07:09:52.729 187189 DEBUG nova.compute.manager [req-59060158-a93a-47ac-8509-85aaf6077025 req-71a5f9f7-9566-49d7-8101-6cd3dc4dc493 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Received event network-vif-plugged-95792ac7-cbc8-4bad-903e-600bb3d09fce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:09:52 compute-0 nova_compute[187185]: 2025-11-29 07:09:52.731 187189 DEBUG oslo_concurrency.lockutils [req-59060158-a93a-47ac-8509-85aaf6077025 req-71a5f9f7-9566-49d7-8101-6cd3dc4dc493 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:09:52 compute-0 nova_compute[187185]: 2025-11-29 07:09:52.731 187189 DEBUG oslo_concurrency.lockutils [req-59060158-a93a-47ac-8509-85aaf6077025 req-71a5f9f7-9566-49d7-8101-6cd3dc4dc493 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:09:52 compute-0 nova_compute[187185]: 2025-11-29 07:09:52.732 187189 DEBUG oslo_concurrency.lockutils [req-59060158-a93a-47ac-8509-85aaf6077025 req-71a5f9f7-9566-49d7-8101-6cd3dc4dc493 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:09:52 compute-0 nova_compute[187185]: 2025-11-29 07:09:52.733 187189 DEBUG nova.compute.manager [req-59060158-a93a-47ac-8509-85aaf6077025 req-71a5f9f7-9566-49d7-8101-6cd3dc4dc493 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] No waiting events found dispatching network-vif-plugged-95792ac7-cbc8-4bad-903e-600bb3d09fce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:09:52 compute-0 nova_compute[187185]: 2025-11-29 07:09:52.733 187189 WARNING nova.compute.manager [req-59060158-a93a-47ac-8509-85aaf6077025 req-71a5f9f7-9566-49d7-8101-6cd3dc4dc493 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Received unexpected event network-vif-plugged-95792ac7-cbc8-4bad-903e-600bb3d09fce for instance with vm_state active and task_state resize_migrated.
Nov 29 07:09:54 compute-0 nova_compute[187185]: 2025-11-29 07:09:54.711 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:55 compute-0 nova_compute[187185]: 2025-11-29 07:09:55.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:09:55 compute-0 nova_compute[187185]: 2025-11-29 07:09:55.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:09:55 compute-0 nova_compute[187185]: 2025-11-29 07:09:55.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:09:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:09:55.652 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:09:55 compute-0 nova_compute[187185]: 2025-11-29 07:09:55.823 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 07:09:56 compute-0 nova_compute[187185]: 2025-11-29 07:09:56.160 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:56 compute-0 nova_compute[187185]: 2025-11-29 07:09:56.865 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:57 compute-0 nova_compute[187185]: 2025-11-29 07:09:57.402 187189 DEBUG nova.compute.manager [req-ec5561d3-ca08-4e8e-a88e-5300649d44a9 req-01148284-35dd-48c3-9f45-5cb68f24a53b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Received event network-changed-95792ac7-cbc8-4bad-903e-600bb3d09fce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:09:57 compute-0 nova_compute[187185]: 2025-11-29 07:09:57.402 187189 DEBUG nova.compute.manager [req-ec5561d3-ca08-4e8e-a88e-5300649d44a9 req-01148284-35dd-48c3-9f45-5cb68f24a53b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Refreshing instance network info cache due to event network-changed-95792ac7-cbc8-4bad-903e-600bb3d09fce. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:09:57 compute-0 nova_compute[187185]: 2025-11-29 07:09:57.403 187189 DEBUG oslo_concurrency.lockutils [req-ec5561d3-ca08-4e8e-a88e-5300649d44a9 req-01148284-35dd-48c3-9f45-5cb68f24a53b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-6d4e9a0c-c91c-45a4-911d-7526b420a8a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:09:57 compute-0 nova_compute[187185]: 2025-11-29 07:09:57.403 187189 DEBUG oslo_concurrency.lockutils [req-ec5561d3-ca08-4e8e-a88e-5300649d44a9 req-01148284-35dd-48c3-9f45-5cb68f24a53b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-6d4e9a0c-c91c-45a4-911d-7526b420a8a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:09:57 compute-0 nova_compute[187185]: 2025-11-29 07:09:57.403 187189 DEBUG nova.network.neutron [req-ec5561d3-ca08-4e8e-a88e-5300649d44a9 req-01148284-35dd-48c3-9f45-5cb68f24a53b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Refreshing network info cache for port 95792ac7-cbc8-4bad-903e-600bb3d09fce _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:09:59 compute-0 nova_compute[187185]: 2025-11-29 07:09:59.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:09:59 compute-0 nova_compute[187185]: 2025-11-29 07:09:59.713 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:09:59 compute-0 nova_compute[187185]: 2025-11-29 07:09:59.783 187189 DEBUG nova.network.neutron [req-ec5561d3-ca08-4e8e-a88e-5300649d44a9 req-01148284-35dd-48c3-9f45-5cb68f24a53b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Updated VIF entry in instance network info cache for port 95792ac7-cbc8-4bad-903e-600bb3d09fce. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:09:59 compute-0 nova_compute[187185]: 2025-11-29 07:09:59.783 187189 DEBUG nova.network.neutron [req-ec5561d3-ca08-4e8e-a88e-5300649d44a9 req-01148284-35dd-48c3-9f45-5cb68f24a53b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Updating instance_info_cache with network_info: [{"id": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "address": "fa:16:3e:a1:a1:8f", "network": {"id": "af9d1967-d1a9-4382-82b7-d9db26a40cb7", "bridge": "br-int", "label": "tempest-network-smoke--1326373200", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95792ac7-cb", "ovs_interfaceid": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:09:59 compute-0 nova_compute[187185]: 2025-11-29 07:09:59.843 187189 DEBUG oslo_concurrency.lockutils [req-ec5561d3-ca08-4e8e-a88e-5300649d44a9 req-01148284-35dd-48c3-9f45-5cb68f24a53b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-6d4e9a0c-c91c-45a4-911d-7526b420a8a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:10:00 compute-0 nova_compute[187185]: 2025-11-29 07:10:00.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:10:00 compute-0 nova_compute[187185]: 2025-11-29 07:10:00.625 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400185.6234837, 6d4e9a0c-c91c-45a4-911d-7526b420a8a9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:10:00 compute-0 nova_compute[187185]: 2025-11-29 07:10:00.625 187189 INFO nova.compute.manager [-] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] VM Stopped (Lifecycle Event)
Nov 29 07:10:00 compute-0 nova_compute[187185]: 2025-11-29 07:10:00.649 187189 DEBUG nova.compute.manager [None req-f6be007e-fcc0-402d-aeba-c7861ac88116 - - - - - -] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:10:00 compute-0 nova_compute[187185]: 2025-11-29 07:10:00.656 187189 DEBUG nova.compute.manager [None req-f6be007e-fcc0-402d-aeba-c7861ac88116 - - - - - -] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:10:00 compute-0 nova_compute[187185]: 2025-11-29 07:10:00.682 187189 INFO nova.compute.manager [None req-f6be007e-fcc0-402d-aeba-c7861ac88116 - - - - - -] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] During the sync_power process the instance has moved from host compute-2.ctlplane.example.com to host compute-0.ctlplane.example.com
Nov 29 07:10:00 compute-0 nova_compute[187185]: 2025-11-29 07:10:00.823 187189 DEBUG nova.compute.manager [req-6cc39d73-b29c-4086-83bd-31279b3d4699 req-f98b323e-c5d1-4813-814d-40ed4d6bb1ae 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Received event network-vif-plugged-95792ac7-cbc8-4bad-903e-600bb3d09fce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:10:00 compute-0 nova_compute[187185]: 2025-11-29 07:10:00.824 187189 DEBUG oslo_concurrency.lockutils [req-6cc39d73-b29c-4086-83bd-31279b3d4699 req-f98b323e-c5d1-4813-814d-40ed4d6bb1ae 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:10:00 compute-0 nova_compute[187185]: 2025-11-29 07:10:00.825 187189 DEBUG oslo_concurrency.lockutils [req-6cc39d73-b29c-4086-83bd-31279b3d4699 req-f98b323e-c5d1-4813-814d-40ed4d6bb1ae 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:10:00 compute-0 nova_compute[187185]: 2025-11-29 07:10:00.825 187189 DEBUG oslo_concurrency.lockutils [req-6cc39d73-b29c-4086-83bd-31279b3d4699 req-f98b323e-c5d1-4813-814d-40ed4d6bb1ae 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:10:00 compute-0 nova_compute[187185]: 2025-11-29 07:10:00.825 187189 DEBUG nova.compute.manager [req-6cc39d73-b29c-4086-83bd-31279b3d4699 req-f98b323e-c5d1-4813-814d-40ed4d6bb1ae 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] No waiting events found dispatching network-vif-plugged-95792ac7-cbc8-4bad-903e-600bb3d09fce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:10:00 compute-0 nova_compute[187185]: 2025-11-29 07:10:00.826 187189 WARNING nova.compute.manager [req-6cc39d73-b29c-4086-83bd-31279b3d4699 req-f98b323e-c5d1-4813-814d-40ed4d6bb1ae 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Received unexpected event network-vif-plugged-95792ac7-cbc8-4bad-903e-600bb3d09fce for instance with vm_state active and task_state resize_finish.
Nov 29 07:10:01 compute-0 nova_compute[187185]: 2025-11-29 07:10:01.230 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:10:01 compute-0 nova_compute[187185]: 2025-11-29 07:10:01.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:10:01 compute-0 nova_compute[187185]: 2025-11-29 07:10:01.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:10:01 compute-0 nova_compute[187185]: 2025-11-29 07:10:01.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:10:02 compute-0 nova_compute[187185]: 2025-11-29 07:10:02.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:10:02 compute-0 nova_compute[187185]: 2025-11-29 07:10:02.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:10:02 compute-0 nova_compute[187185]: 2025-11-29 07:10:02.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:10:02 compute-0 nova_compute[187185]: 2025-11-29 07:10:02.392 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:10:02 compute-0 nova_compute[187185]: 2025-11-29 07:10:02.393 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:10:02 compute-0 nova_compute[187185]: 2025-11-29 07:10:02.393 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:10:02 compute-0 nova_compute[187185]: 2025-11-29 07:10:02.393 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:10:02 compute-0 podman[226165]: 2025-11-29 07:10:02.545611951 +0000 UTC m=+0.099729005 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 29 07:10:02 compute-0 nova_compute[187185]: 2025-11-29 07:10:02.571 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Periodic task is updating the host stats, it is trying to get disk info for instance-00000059, but the backing disk storage was removed by a concurrent operation such as resize. Error: No disk at /var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/disk: nova.exception.DiskNotFound: No disk at /var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/disk
Nov 29 07:10:02 compute-0 nova_compute[187185]: 2025-11-29 07:10:02.773 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:10:02 compute-0 nova_compute[187185]: 2025-11-29 07:10:02.775 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5733MB free_disk=73.26802444458008GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:10:02 compute-0 nova_compute[187185]: 2025-11-29 07:10:02.775 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:10:02 compute-0 nova_compute[187185]: 2025-11-29 07:10:02.775 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:10:02 compute-0 nova_compute[187185]: 2025-11-29 07:10:02.911 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Migration for instance 6d4e9a0c-c91c-45a4-911d-7526b420a8a9 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Nov 29 07:10:03 compute-0 nova_compute[187185]: 2025-11-29 07:10:03.024 187189 DEBUG nova.compute.manager [req-2ca69e98-9b42-4917-85d8-b3bef9ebf4c1 req-36f16fb5-6033-4d27-8117-2916afb3c022 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Received event network-vif-plugged-95792ac7-cbc8-4bad-903e-600bb3d09fce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:10:03 compute-0 nova_compute[187185]: 2025-11-29 07:10:03.025 187189 DEBUG oslo_concurrency.lockutils [req-2ca69e98-9b42-4917-85d8-b3bef9ebf4c1 req-36f16fb5-6033-4d27-8117-2916afb3c022 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:10:03 compute-0 nova_compute[187185]: 2025-11-29 07:10:03.025 187189 DEBUG oslo_concurrency.lockutils [req-2ca69e98-9b42-4917-85d8-b3bef9ebf4c1 req-36f16fb5-6033-4d27-8117-2916afb3c022 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:10:03 compute-0 nova_compute[187185]: 2025-11-29 07:10:03.026 187189 DEBUG oslo_concurrency.lockutils [req-2ca69e98-9b42-4917-85d8-b3bef9ebf4c1 req-36f16fb5-6033-4d27-8117-2916afb3c022 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:10:03 compute-0 nova_compute[187185]: 2025-11-29 07:10:03.026 187189 DEBUG nova.compute.manager [req-2ca69e98-9b42-4917-85d8-b3bef9ebf4c1 req-36f16fb5-6033-4d27-8117-2916afb3c022 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] No waiting events found dispatching network-vif-plugged-95792ac7-cbc8-4bad-903e-600bb3d09fce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:10:03 compute-0 nova_compute[187185]: 2025-11-29 07:10:03.027 187189 WARNING nova.compute.manager [req-2ca69e98-9b42-4917-85d8-b3bef9ebf4c1 req-36f16fb5-6033-4d27-8117-2916afb3c022 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Received unexpected event network-vif-plugged-95792ac7-cbc8-4bad-903e-600bb3d09fce for instance with vm_state resized and task_state None.
Nov 29 07:10:03 compute-0 nova_compute[187185]: 2025-11-29 07:10:03.066 187189 INFO nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Updating resource usage from migration 26ffe11d-ec78-48cf-95c2-9fbc8ddb9126
Nov 29 07:10:03 compute-0 nova_compute[187185]: 2025-11-29 07:10:03.067 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Starting to track outgoing migration 26ffe11d-ec78-48cf-95c2-9fbc8ddb9126 with flavor 1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1444
Nov 29 07:10:03 compute-0 nova_compute[187185]: 2025-11-29 07:10:03.109 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Migration 26ffe11d-ec78-48cf-95c2-9fbc8ddb9126 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Nov 29 07:10:03 compute-0 nova_compute[187185]: 2025-11-29 07:10:03.109 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:10:03 compute-0 nova_compute[187185]: 2025-11-29 07:10:03.109 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:10:03 compute-0 nova_compute[187185]: 2025-11-29 07:10:03.162 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:10:03 compute-0 nova_compute[187185]: 2025-11-29 07:10:03.211 187189 DEBUG oslo_concurrency.lockutils [None req-318da833-aa0d-4026-b4a4-e7431abaaaa3 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:10:03 compute-0 nova_compute[187185]: 2025-11-29 07:10:03.211 187189 DEBUG oslo_concurrency.lockutils [None req-318da833-aa0d-4026-b4a4-e7431abaaaa3 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:10:03 compute-0 nova_compute[187185]: 2025-11-29 07:10:03.211 187189 DEBUG nova.compute.manager [None req-318da833-aa0d-4026-b4a4-e7431abaaaa3 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Going to confirm migration 14 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679
Nov 29 07:10:03 compute-0 nova_compute[187185]: 2025-11-29 07:10:03.303 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:10:03 compute-0 nova_compute[187185]: 2025-11-29 07:10:03.322 187189 DEBUG nova.objects.instance [None req-318da833-aa0d-4026-b4a4-e7431abaaaa3 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'info_cache' on Instance uuid 6d4e9a0c-c91c-45a4-911d-7526b420a8a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:10:03 compute-0 nova_compute[187185]: 2025-11-29 07:10:03.353 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:10:03 compute-0 nova_compute[187185]: 2025-11-29 07:10:03.354 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.578s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:10:04 compute-0 nova_compute[187185]: 2025-11-29 07:10:04.349 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:10:04 compute-0 nova_compute[187185]: 2025-11-29 07:10:04.511 187189 DEBUG neutronclient.v2_0.client [None req-318da833-aa0d-4026-b4a4-e7431abaaaa3 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 95792ac7-cbc8-4bad-903e-600bb3d09fce for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Nov 29 07:10:04 compute-0 nova_compute[187185]: 2025-11-29 07:10:04.512 187189 DEBUG oslo_concurrency.lockutils [None req-318da833-aa0d-4026-b4a4-e7431abaaaa3 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "refresh_cache-6d4e9a0c-c91c-45a4-911d-7526b420a8a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:10:04 compute-0 nova_compute[187185]: 2025-11-29 07:10:04.513 187189 DEBUG oslo_concurrency.lockutils [None req-318da833-aa0d-4026-b4a4-e7431abaaaa3 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquired lock "refresh_cache-6d4e9a0c-c91c-45a4-911d-7526b420a8a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:10:04 compute-0 nova_compute[187185]: 2025-11-29 07:10:04.513 187189 DEBUG nova.network.neutron [None req-318da833-aa0d-4026-b4a4-e7431abaaaa3 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:10:04 compute-0 nova_compute[187185]: 2025-11-29 07:10:04.715 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:10:06 compute-0 nova_compute[187185]: 2025-11-29 07:10:06.233 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:10:06 compute-0 nova_compute[187185]: 2025-11-29 07:10:06.310 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:10:07 compute-0 nova_compute[187185]: 2025-11-29 07:10:07.103 187189 DEBUG nova.network.neutron [None req-318da833-aa0d-4026-b4a4-e7431abaaaa3 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Updating instance_info_cache with network_info: [{"id": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "address": "fa:16:3e:a1:a1:8f", "network": {"id": "af9d1967-d1a9-4382-82b7-d9db26a40cb7", "bridge": "br-int", "label": "tempest-network-smoke--1326373200", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95792ac7-cb", "ovs_interfaceid": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:10:07 compute-0 nova_compute[187185]: 2025-11-29 07:10:07.284 187189 DEBUG oslo_concurrency.lockutils [None req-318da833-aa0d-4026-b4a4-e7431abaaaa3 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Releasing lock "refresh_cache-6d4e9a0c-c91c-45a4-911d-7526b420a8a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:10:07 compute-0 nova_compute[187185]: 2025-11-29 07:10:07.285 187189 DEBUG nova.objects.instance [None req-318da833-aa0d-4026-b4a4-e7431abaaaa3 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'migration_context' on Instance uuid 6d4e9a0c-c91c-45a4-911d-7526b420a8a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:10:07 compute-0 nova_compute[187185]: 2025-11-29 07:10:07.700 187189 DEBUG nova.virt.libvirt.vif [None req-318da833-aa0d-4026-b4a4-e7431abaaaa3 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:08:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1405928271',display_name='tempest-TestNetworkAdvancedServerOps-server-1405928271',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1405928271',id=89,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJzX+cYphgzFb/LmLSqgC4l/EgTLaDqQRgz2oIoLmiT9pJmbbaoOE/h8lTp9y4P6Lqu0yte5POR0cnSIwuT6ICbf/J95VY/pQuT7Mh/Rw0RaK2X3rgSaxQ5jqSeZ2XDRaw==',key_name='tempest-TestNetworkAdvancedServerOps-1604525815',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:10:00Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-q0t03bzu',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:10:00Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=6d4e9a0c-c91c-45a4-911d-7526b420a8a9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "address": "fa:16:3e:a1:a1:8f", "network": {"id": "af9d1967-d1a9-4382-82b7-d9db26a40cb7", "bridge": "br-int", "label": "tempest-network-smoke--1326373200", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95792ac7-cb", "ovs_interfaceid": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:10:07 compute-0 nova_compute[187185]: 2025-11-29 07:10:07.701 187189 DEBUG nova.network.os_vif_util [None req-318da833-aa0d-4026-b4a4-e7431abaaaa3 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converting VIF {"id": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "address": "fa:16:3e:a1:a1:8f", "network": {"id": "af9d1967-d1a9-4382-82b7-d9db26a40cb7", "bridge": "br-int", "label": "tempest-network-smoke--1326373200", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95792ac7-cb", "ovs_interfaceid": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:10:07 compute-0 nova_compute[187185]: 2025-11-29 07:10:07.701 187189 DEBUG nova.network.os_vif_util [None req-318da833-aa0d-4026-b4a4-e7431abaaaa3 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a1:a1:8f,bridge_name='br-int',has_traffic_filtering=True,id=95792ac7-cbc8-4bad-903e-600bb3d09fce,network=Network(af9d1967-d1a9-4382-82b7-d9db26a40cb7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap95792ac7-cb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:10:07 compute-0 nova_compute[187185]: 2025-11-29 07:10:07.702 187189 DEBUG os_vif [None req-318da833-aa0d-4026-b4a4-e7431abaaaa3 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a1:a1:8f,bridge_name='br-int',has_traffic_filtering=True,id=95792ac7-cbc8-4bad-903e-600bb3d09fce,network=Network(af9d1967-d1a9-4382-82b7-d9db26a40cb7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap95792ac7-cb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:10:07 compute-0 nova_compute[187185]: 2025-11-29 07:10:07.704 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:10:07 compute-0 nova_compute[187185]: 2025-11-29 07:10:07.704 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap95792ac7-cb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:10:07 compute-0 nova_compute[187185]: 2025-11-29 07:10:07.705 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:10:07 compute-0 nova_compute[187185]: 2025-11-29 07:10:07.708 187189 INFO os_vif [None req-318da833-aa0d-4026-b4a4-e7431abaaaa3 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a1:a1:8f,bridge_name='br-int',has_traffic_filtering=True,id=95792ac7-cbc8-4bad-903e-600bb3d09fce,network=Network(af9d1967-d1a9-4382-82b7-d9db26a40cb7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap95792ac7-cb')
Nov 29 07:10:07 compute-0 nova_compute[187185]: 2025-11-29 07:10:07.709 187189 DEBUG oslo_concurrency.lockutils [None req-318da833-aa0d-4026-b4a4-e7431abaaaa3 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:10:07 compute-0 nova_compute[187185]: 2025-11-29 07:10:07.710 187189 DEBUG oslo_concurrency.lockutils [None req-318da833-aa0d-4026-b4a4-e7431abaaaa3 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:10:07 compute-0 systemd[1]: Starting dnf makecache...
Nov 29 07:10:07 compute-0 podman[226194]: 2025-11-29 07:10:07.805418802 +0000 UTC m=+0.069949624 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 07:10:07 compute-0 dnf[226195]: Metadata cache refreshed recently.
Nov 29 07:10:08 compute-0 systemd[1]: dnf-makecache.service: Deactivated successfully.
Nov 29 07:10:08 compute-0 systemd[1]: Finished dnf makecache.
Nov 29 07:10:08 compute-0 nova_compute[187185]: 2025-11-29 07:10:08.219 187189 DEBUG nova.compute.provider_tree [None req-318da833-aa0d-4026-b4a4-e7431abaaaa3 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:10:08 compute-0 nova_compute[187185]: 2025-11-29 07:10:08.288 187189 DEBUG nova.scheduler.client.report [None req-318da833-aa0d-4026-b4a4-e7431abaaaa3 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:10:08 compute-0 nova_compute[187185]: 2025-11-29 07:10:08.383 187189 DEBUG oslo_concurrency.lockutils [None req-318da833-aa0d-4026-b4a4-e7431abaaaa3 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.673s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:10:08 compute-0 nova_compute[187185]: 2025-11-29 07:10:08.815 187189 INFO nova.scheduler.client.report [None req-318da833-aa0d-4026-b4a4-e7431abaaaa3 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Deleted allocation for migration 26ffe11d-ec78-48cf-95c2-9fbc8ddb9126
Nov 29 07:10:09 compute-0 nova_compute[187185]: 2025-11-29 07:10:09.251 187189 DEBUG oslo_concurrency.lockutils [None req-318da833-aa0d-4026-b4a4-e7431abaaaa3 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 6.040s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:10:09 compute-0 nova_compute[187185]: 2025-11-29 07:10:09.718 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:10:10 compute-0 podman[226217]: 2025-11-29 07:10:10.796397431 +0000 UTC m=+0.067625078 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 29 07:10:11 compute-0 nova_compute[187185]: 2025-11-29 07:10:11.237 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:10:11 compute-0 podman[226237]: 2025-11-29 07:10:11.826733057 +0000 UTC m=+0.083038213 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 07:10:14 compute-0 nova_compute[187185]: 2025-11-29 07:10:14.720 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:10:16 compute-0 nova_compute[187185]: 2025-11-29 07:10:16.241 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:10:19 compute-0 nova_compute[187185]: 2025-11-29 07:10:19.723 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:10:21 compute-0 nova_compute[187185]: 2025-11-29 07:10:21.244 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:10:22 compute-0 podman[226257]: 2025-11-29 07:10:22.809797468 +0000 UTC m=+0.056776963 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent)
Nov 29 07:10:22 compute-0 podman[226259]: 2025-11-29 07:10:22.811195337 +0000 UTC m=+0.051863604 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 07:10:22 compute-0 podman[226258]: 2025-11-29 07:10:22.820480189 +0000 UTC m=+0.068820072 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, build-date=2025-08-20T13:12:41, vcs-type=git, distribution-scope=public, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, config_id=edpm, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 29 07:10:24 compute-0 nova_compute[187185]: 2025-11-29 07:10:24.724 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:10:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:10:24.829 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:10:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:10:24.830 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:10:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:10:24.830 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:10:26 compute-0 nova_compute[187185]: 2025-11-29 07:10:26.300 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:10:28 compute-0 nova_compute[187185]: 2025-11-29 07:10:28.307 187189 DEBUG oslo_concurrency.lockutils [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:10:28 compute-0 nova_compute[187185]: 2025-11-29 07:10:28.308 187189 DEBUG oslo_concurrency.lockutils [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:10:28 compute-0 nova_compute[187185]: 2025-11-29 07:10:28.324 187189 DEBUG nova.compute.manager [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 07:10:28 compute-0 nova_compute[187185]: 2025-11-29 07:10:28.459 187189 DEBUG oslo_concurrency.lockutils [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:10:28 compute-0 nova_compute[187185]: 2025-11-29 07:10:28.460 187189 DEBUG oslo_concurrency.lockutils [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:10:28 compute-0 nova_compute[187185]: 2025-11-29 07:10:28.468 187189 DEBUG nova.virt.hardware [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 07:10:28 compute-0 nova_compute[187185]: 2025-11-29 07:10:28.468 187189 INFO nova.compute.claims [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Claim successful on node compute-0.ctlplane.example.com
Nov 29 07:10:28 compute-0 nova_compute[187185]: 2025-11-29 07:10:28.901 187189 DEBUG nova.compute.provider_tree [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:10:28 compute-0 nova_compute[187185]: 2025-11-29 07:10:28.922 187189 DEBUG nova.scheduler.client.report [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:10:28 compute-0 nova_compute[187185]: 2025-11-29 07:10:28.945 187189 DEBUG oslo_concurrency.lockutils [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.485s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:10:28 compute-0 nova_compute[187185]: 2025-11-29 07:10:28.947 187189 DEBUG nova.compute.manager [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 07:10:29 compute-0 nova_compute[187185]: 2025-11-29 07:10:29.009 187189 DEBUG nova.compute.manager [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 07:10:29 compute-0 nova_compute[187185]: 2025-11-29 07:10:29.009 187189 DEBUG nova.network.neutron [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 07:10:29 compute-0 nova_compute[187185]: 2025-11-29 07:10:29.031 187189 INFO nova.virt.libvirt.driver [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 07:10:29 compute-0 nova_compute[187185]: 2025-11-29 07:10:29.053 187189 DEBUG nova.compute.manager [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 07:10:29 compute-0 nova_compute[187185]: 2025-11-29 07:10:29.160 187189 DEBUG nova.compute.manager [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 07:10:29 compute-0 nova_compute[187185]: 2025-11-29 07:10:29.163 187189 DEBUG nova.virt.libvirt.driver [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 07:10:29 compute-0 nova_compute[187185]: 2025-11-29 07:10:29.164 187189 INFO nova.virt.libvirt.driver [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Creating image(s)
Nov 29 07:10:29 compute-0 nova_compute[187185]: 2025-11-29 07:10:29.165 187189 DEBUG oslo_concurrency.lockutils [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "/var/lib/nova/instances/5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:10:29 compute-0 nova_compute[187185]: 2025-11-29 07:10:29.165 187189 DEBUG oslo_concurrency.lockutils [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "/var/lib/nova/instances/5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:10:29 compute-0 nova_compute[187185]: 2025-11-29 07:10:29.167 187189 DEBUG oslo_concurrency.lockutils [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "/var/lib/nova/instances/5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:10:29 compute-0 nova_compute[187185]: 2025-11-29 07:10:29.179 187189 DEBUG oslo_concurrency.processutils [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:10:29 compute-0 nova_compute[187185]: 2025-11-29 07:10:29.242 187189 DEBUG oslo_concurrency.processutils [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:10:29 compute-0 nova_compute[187185]: 2025-11-29 07:10:29.243 187189 DEBUG oslo_concurrency.lockutils [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:10:29 compute-0 nova_compute[187185]: 2025-11-29 07:10:29.244 187189 DEBUG oslo_concurrency.lockutils [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:10:29 compute-0 nova_compute[187185]: 2025-11-29 07:10:29.255 187189 DEBUG oslo_concurrency.processutils [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:10:29 compute-0 nova_compute[187185]: 2025-11-29 07:10:29.313 187189 DEBUG oslo_concurrency.processutils [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:10:29 compute-0 nova_compute[187185]: 2025-11-29 07:10:29.314 187189 DEBUG oslo_concurrency.processutils [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:10:29 compute-0 nova_compute[187185]: 2025-11-29 07:10:29.582 187189 DEBUG oslo_concurrency.processutils [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk 1073741824" returned: 0 in 0.268s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:10:29 compute-0 nova_compute[187185]: 2025-11-29 07:10:29.583 187189 DEBUG oslo_concurrency.lockutils [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.339s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:10:29 compute-0 nova_compute[187185]: 2025-11-29 07:10:29.583 187189 DEBUG oslo_concurrency.processutils [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:10:29 compute-0 nova_compute[187185]: 2025-11-29 07:10:29.652 187189 DEBUG oslo_concurrency.processutils [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:10:29 compute-0 nova_compute[187185]: 2025-11-29 07:10:29.655 187189 DEBUG nova.virt.disk.api [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Checking if we can resize image /var/lib/nova/instances/5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:10:29 compute-0 nova_compute[187185]: 2025-11-29 07:10:29.656 187189 DEBUG oslo_concurrency.processutils [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:10:29 compute-0 nova_compute[187185]: 2025-11-29 07:10:29.719 187189 DEBUG oslo_concurrency.processutils [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:10:29 compute-0 nova_compute[187185]: 2025-11-29 07:10:29.721 187189 DEBUG nova.virt.disk.api [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Cannot resize image /var/lib/nova/instances/5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:10:29 compute-0 nova_compute[187185]: 2025-11-29 07:10:29.722 187189 DEBUG nova.objects.instance [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'migration_context' on Instance uuid 5896e8b0-25a2-4075-8ebf-5458b5ed9234 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:10:29 compute-0 nova_compute[187185]: 2025-11-29 07:10:29.744 187189 DEBUG nova.virt.libvirt.driver [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 07:10:29 compute-0 nova_compute[187185]: 2025-11-29 07:10:29.745 187189 DEBUG nova.virt.libvirt.driver [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Ensure instance console log exists: /var/lib/nova/instances/5896e8b0-25a2-4075-8ebf-5458b5ed9234/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 07:10:29 compute-0 nova_compute[187185]: 2025-11-29 07:10:29.745 187189 DEBUG oslo_concurrency.lockutils [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:10:29 compute-0 nova_compute[187185]: 2025-11-29 07:10:29.746 187189 DEBUG oslo_concurrency.lockutils [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:10:29 compute-0 nova_compute[187185]: 2025-11-29 07:10:29.746 187189 DEBUG oslo_concurrency.lockutils [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:10:29 compute-0 nova_compute[187185]: 2025-11-29 07:10:29.751 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:10:29 compute-0 nova_compute[187185]: 2025-11-29 07:10:29.796 187189 DEBUG nova.policy [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 07:10:30 compute-0 nova_compute[187185]: 2025-11-29 07:10:30.905 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:10:31 compute-0 nova_compute[187185]: 2025-11-29 07:10:31.124 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:10:31 compute-0 nova_compute[187185]: 2025-11-29 07:10:31.303 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:10:32 compute-0 nova_compute[187185]: 2025-11-29 07:10:32.335 187189 DEBUG nova.network.neutron [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Successfully created port: 69f2ab13-2311-4137-9c26-e256f33759e5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 07:10:32 compute-0 podman[226336]: 2025-11-29 07:10:32.849687459 +0000 UTC m=+0.115453768 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller)
Nov 29 07:10:33 compute-0 nova_compute[187185]: 2025-11-29 07:10:33.825 187189 DEBUG nova.network.neutron [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Successfully updated port: 69f2ab13-2311-4137-9c26-e256f33759e5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 07:10:33 compute-0 nova_compute[187185]: 2025-11-29 07:10:33.846 187189 DEBUG oslo_concurrency.lockutils [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "refresh_cache-5896e8b0-25a2-4075-8ebf-5458b5ed9234" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:10:33 compute-0 nova_compute[187185]: 2025-11-29 07:10:33.846 187189 DEBUG oslo_concurrency.lockutils [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquired lock "refresh_cache-5896e8b0-25a2-4075-8ebf-5458b5ed9234" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:10:33 compute-0 nova_compute[187185]: 2025-11-29 07:10:33.846 187189 DEBUG nova.network.neutron [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:10:33 compute-0 nova_compute[187185]: 2025-11-29 07:10:33.964 187189 DEBUG nova.compute.manager [req-223003e0-ae97-4bb1-88c5-96999d286df6 req-f2afd9b4-cec8-4130-a03d-9ba9839f558d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Received event network-changed-69f2ab13-2311-4137-9c26-e256f33759e5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:10:33 compute-0 nova_compute[187185]: 2025-11-29 07:10:33.964 187189 DEBUG nova.compute.manager [req-223003e0-ae97-4bb1-88c5-96999d286df6 req-f2afd9b4-cec8-4130-a03d-9ba9839f558d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Refreshing instance network info cache due to event network-changed-69f2ab13-2311-4137-9c26-e256f33759e5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:10:33 compute-0 nova_compute[187185]: 2025-11-29 07:10:33.965 187189 DEBUG oslo_concurrency.lockutils [req-223003e0-ae97-4bb1-88c5-96999d286df6 req-f2afd9b4-cec8-4130-a03d-9ba9839f558d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-5896e8b0-25a2-4075-8ebf-5458b5ed9234" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:10:34 compute-0 nova_compute[187185]: 2025-11-29 07:10:34.460 187189 DEBUG nova.network.neutron [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:10:34 compute-0 nova_compute[187185]: 2025-11-29 07:10:34.753 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.489 187189 DEBUG nova.network.neutron [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Updating instance_info_cache with network_info: [{"id": "69f2ab13-2311-4137-9c26-e256f33759e5", "address": "fa:16:3e:cf:f0:da", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69f2ab13-23", "ovs_interfaceid": "69f2ab13-2311-4137-9c26-e256f33759e5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.508 187189 DEBUG oslo_concurrency.lockutils [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Releasing lock "refresh_cache-5896e8b0-25a2-4075-8ebf-5458b5ed9234" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.509 187189 DEBUG nova.compute.manager [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Instance network_info: |[{"id": "69f2ab13-2311-4137-9c26-e256f33759e5", "address": "fa:16:3e:cf:f0:da", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69f2ab13-23", "ovs_interfaceid": "69f2ab13-2311-4137-9c26-e256f33759e5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.510 187189 DEBUG oslo_concurrency.lockutils [req-223003e0-ae97-4bb1-88c5-96999d286df6 req-f2afd9b4-cec8-4130-a03d-9ba9839f558d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-5896e8b0-25a2-4075-8ebf-5458b5ed9234" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.510 187189 DEBUG nova.network.neutron [req-223003e0-ae97-4bb1-88c5-96999d286df6 req-f2afd9b4-cec8-4130-a03d-9ba9839f558d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Refreshing network info cache for port 69f2ab13-2311-4137-9c26-e256f33759e5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.514 187189 DEBUG nova.virt.libvirt.driver [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Start _get_guest_xml network_info=[{"id": "69f2ab13-2311-4137-9c26-e256f33759e5", "address": "fa:16:3e:cf:f0:da", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69f2ab13-23", "ovs_interfaceid": "69f2ab13-2311-4137-9c26-e256f33759e5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.519 187189 WARNING nova.virt.libvirt.driver [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.524 187189 DEBUG nova.virt.libvirt.host [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.525 187189 DEBUG nova.virt.libvirt.host [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.528 187189 DEBUG nova.virt.libvirt.host [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.529 187189 DEBUG nova.virt.libvirt.host [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.530 187189 DEBUG nova.virt.libvirt.driver [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.530 187189 DEBUG nova.virt.hardware [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.531 187189 DEBUG nova.virt.hardware [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.531 187189 DEBUG nova.virt.hardware [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.531 187189 DEBUG nova.virt.hardware [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.532 187189 DEBUG nova.virt.hardware [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.532 187189 DEBUG nova.virt.hardware [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.532 187189 DEBUG nova.virt.hardware [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.532 187189 DEBUG nova.virt.hardware [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.533 187189 DEBUG nova.virt.hardware [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.533 187189 DEBUG nova.virt.hardware [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.533 187189 DEBUG nova.virt.hardware [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.538 187189 DEBUG nova.virt.libvirt.vif [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:10:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-878694992',display_name='tempest-ServerActionsTestJSON-server-878694992',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-878694992',id=94,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCMk4ml8b3fuOLAahNu+FCDl9v+SWDrsolrsZ2S6Gt/612rFnCoMRUdrLWExAZcapVyOr30SCUoxNsWGg9cONHUEcP39CbtnDaooW7qgHPKQnqdGyUHB5iGAkkvgw7SerQ==',key_name='tempest-keypair-11305483',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6e6c366001df43fb91731faf7a9578fc',ramdisk_id='',reservation_id='r-j3wprymf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-157226036',owner_user_name='tempest-ServerActionsTestJSON-157226036-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:10:29Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e1b8fbcc8caa4d94b69570f233c56d18',uuid=5896e8b0-25a2-4075-8ebf-5458b5ed9234,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "69f2ab13-2311-4137-9c26-e256f33759e5", "address": "fa:16:3e:cf:f0:da", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69f2ab13-23", "ovs_interfaceid": "69f2ab13-2311-4137-9c26-e256f33759e5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.538 187189 DEBUG nova.network.os_vif_util [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converting VIF {"id": "69f2ab13-2311-4137-9c26-e256f33759e5", "address": "fa:16:3e:cf:f0:da", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69f2ab13-23", "ovs_interfaceid": "69f2ab13-2311-4137-9c26-e256f33759e5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.539 187189 DEBUG nova.network.os_vif_util [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cf:f0:da,bridge_name='br-int',has_traffic_filtering=True,id=69f2ab13-2311-4137-9c26-e256f33759e5,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69f2ab13-23') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.540 187189 DEBUG nova.objects.instance [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'pci_devices' on Instance uuid 5896e8b0-25a2-4075-8ebf-5458b5ed9234 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.554 187189 DEBUG nova.virt.libvirt.driver [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:10:35 compute-0 nova_compute[187185]:   <uuid>5896e8b0-25a2-4075-8ebf-5458b5ed9234</uuid>
Nov 29 07:10:35 compute-0 nova_compute[187185]:   <name>instance-0000005e</name>
Nov 29 07:10:35 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 07:10:35 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:10:35 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:10:35 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:       <nova:name>tempest-ServerActionsTestJSON-server-878694992</nova:name>
Nov 29 07:10:35 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:10:35</nova:creationTime>
Nov 29 07:10:35 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 07:10:35 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 07:10:35 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:10:35 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:10:35 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:10:35 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:10:35 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:10:35 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:10:35 compute-0 nova_compute[187185]:         <nova:user uuid="e1b8fbcc8caa4d94b69570f233c56d18">tempest-ServerActionsTestJSON-157226036-project-member</nova:user>
Nov 29 07:10:35 compute-0 nova_compute[187185]:         <nova:project uuid="6e6c366001df43fb91731faf7a9578fc">tempest-ServerActionsTestJSON-157226036</nova:project>
Nov 29 07:10:35 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:10:35 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 07:10:35 compute-0 nova_compute[187185]:         <nova:port uuid="69f2ab13-2311-4137-9c26-e256f33759e5">
Nov 29 07:10:35 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:10:35 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:10:35 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:10:35 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <system>
Nov 29 07:10:35 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:10:35 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:10:35 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:10:35 compute-0 nova_compute[187185]:       <entry name="serial">5896e8b0-25a2-4075-8ebf-5458b5ed9234</entry>
Nov 29 07:10:35 compute-0 nova_compute[187185]:       <entry name="uuid">5896e8b0-25a2-4075-8ebf-5458b5ed9234</entry>
Nov 29 07:10:35 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     </system>
Nov 29 07:10:35 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:10:35 compute-0 nova_compute[187185]:   <os>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:   </os>
Nov 29 07:10:35 compute-0 nova_compute[187185]:   <features>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:   </features>
Nov 29 07:10:35 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:10:35 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:10:35 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:10:35 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:10:35 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk.config"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:10:35 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:cf:f0:da"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:       <target dev="tap69f2ab13-23"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:10:35 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/5896e8b0-25a2-4075-8ebf-5458b5ed9234/console.log" append="off"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <video>
Nov 29 07:10:35 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     </video>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:10:35 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:10:35 compute-0 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:10:35 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:10:35 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:10:35 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:10:35 compute-0 nova_compute[187185]: </domain>
Nov 29 07:10:35 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.555 187189 DEBUG nova.compute.manager [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Preparing to wait for external event network-vif-plugged-69f2ab13-2311-4137-9c26-e256f33759e5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.556 187189 DEBUG oslo_concurrency.lockutils [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.556 187189 DEBUG oslo_concurrency.lockutils [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.557 187189 DEBUG oslo_concurrency.lockutils [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.558 187189 DEBUG nova.virt.libvirt.vif [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:10:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-878694992',display_name='tempest-ServerActionsTestJSON-server-878694992',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-878694992',id=94,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCMk4ml8b3fuOLAahNu+FCDl9v+SWDrsolrsZ2S6Gt/612rFnCoMRUdrLWExAZcapVyOr30SCUoxNsWGg9cONHUEcP39CbtnDaooW7qgHPKQnqdGyUHB5iGAkkvgw7SerQ==',key_name='tempest-keypair-11305483',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6e6c366001df43fb91731faf7a9578fc',ramdisk_id='',reservation_id='r-j3wprymf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-157226036',owner_user_name='tempest-ServerActionsTestJSON-157226036-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:10:29Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e1b8fbcc8caa4d94b69570f233c56d18',uuid=5896e8b0-25a2-4075-8ebf-5458b5ed9234,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "69f2ab13-2311-4137-9c26-e256f33759e5", "address": "fa:16:3e:cf:f0:da", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69f2ab13-23", "ovs_interfaceid": "69f2ab13-2311-4137-9c26-e256f33759e5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.558 187189 DEBUG nova.network.os_vif_util [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converting VIF {"id": "69f2ab13-2311-4137-9c26-e256f33759e5", "address": "fa:16:3e:cf:f0:da", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69f2ab13-23", "ovs_interfaceid": "69f2ab13-2311-4137-9c26-e256f33759e5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.559 187189 DEBUG nova.network.os_vif_util [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cf:f0:da,bridge_name='br-int',has_traffic_filtering=True,id=69f2ab13-2311-4137-9c26-e256f33759e5,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69f2ab13-23') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.561 187189 DEBUG os_vif [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cf:f0:da,bridge_name='br-int',has_traffic_filtering=True,id=69f2ab13-2311-4137-9c26-e256f33759e5,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69f2ab13-23') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.561 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.562 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.562 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.566 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.566 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap69f2ab13-23, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.567 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap69f2ab13-23, col_values=(('external_ids', {'iface-id': '69f2ab13-2311-4137-9c26-e256f33759e5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cf:f0:da', 'vm-uuid': '5896e8b0-25a2-4075-8ebf-5458b5ed9234'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.569 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:10:35 compute-0 NetworkManager[55227]: <info>  [1764400235.5710] manager: (tap69f2ab13-23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/115)
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.572 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:10:35 compute-0 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.579 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.580 187189 INFO os_vif [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cf:f0:da,bridge_name='br-int',has_traffic_filtering=True,id=69f2ab13-2311-4137-9c26-e256f33759e5,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69f2ab13-23')
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.646 187189 DEBUG nova.virt.libvirt.driver [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.646 187189 DEBUG nova.virt.libvirt.driver [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.646 187189 DEBUG nova.virt.libvirt.driver [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] No VIF found with MAC fa:16:3e:cf:f0:da, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:10:35 compute-0 nova_compute[187185]: 2025-11-29 07:10:35.647 187189 INFO nova.virt.libvirt.driver [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Using config drive
Nov 29 07:10:36 compute-0 nova_compute[187185]: 2025-11-29 07:10:36.049 187189 INFO nova.virt.libvirt.driver [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Creating config drive at /var/lib/nova/instances/5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk.config
Nov 29 07:10:36 compute-0 nova_compute[187185]: 2025-11-29 07:10:36.056 187189 DEBUG oslo_concurrency.processutils [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprdd7odti execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:10:36 compute-0 nova_compute[187185]: 2025-11-29 07:10:36.198 187189 DEBUG oslo_concurrency.processutils [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprdd7odti" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:10:36 compute-0 kernel: tap69f2ab13-23: entered promiscuous mode
Nov 29 07:10:36 compute-0 NetworkManager[55227]: <info>  [1764400236.2843] manager: (tap69f2ab13-23): new Tun device (/org/freedesktop/NetworkManager/Devices/116)
Nov 29 07:10:36 compute-0 ovn_controller[95281]: 2025-11-29T07:10:36Z|00216|binding|INFO|Claiming lport 69f2ab13-2311-4137-9c26-e256f33759e5 for this chassis.
Nov 29 07:10:36 compute-0 ovn_controller[95281]: 2025-11-29T07:10:36Z|00217|binding|INFO|69f2ab13-2311-4137-9c26-e256f33759e5: Claiming fa:16:3e:cf:f0:da 10.100.0.6
Nov 29 07:10:36 compute-0 nova_compute[187185]: 2025-11-29 07:10:36.285 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:10:36 compute-0 nova_compute[187185]: 2025-11-29 07:10:36.292 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:10:36.300 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cf:f0:da 10.100.0.6'], port_security=['fa:16:3e:cf:f0:da 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9226dea3-6355-4dd9-9441-d093c1f1a399', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e6c366001df43fb91731faf7a9578fc', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1ab550ec-f542-4b91-8354-868da408e26d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d2b9cdf8-2516-48a0-81c5-85c2327075d5, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=69f2ab13-2311-4137-9c26-e256f33759e5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:10:36.302 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 69f2ab13-2311-4137-9c26-e256f33759e5 in datapath 9226dea3-6355-4dd9-9441-d093c1f1a399 bound to our chassis
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:10:36.303 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9226dea3-6355-4dd9-9441-d093c1f1a399
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:10:36.320 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[310f32fc-af67-457b-9113-61a5aca2c55d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:10:36.321 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9226dea3-61 in ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:10:36.324 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9226dea3-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:10:36.324 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[b46c5896-83b9-44b7-b2e8-eac2ff138623]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:10:36.325 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[ceb9729b-ca89-4689-a93a-0908194d403a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:10:36 compute-0 systemd-udevd[226382]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:10:36 compute-0 systemd-machined[153486]: New machine qemu-32-instance-0000005e.
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:10:36.341 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[d24b141f-0c5a-4234-b9c9-1f81e638d131]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:10:36 compute-0 NetworkManager[55227]: <info>  [1764400236.3456] device (tap69f2ab13-23): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:10:36 compute-0 NetworkManager[55227]: <info>  [1764400236.3463] device (tap69f2ab13-23): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:10:36 compute-0 nova_compute[187185]: 2025-11-29 07:10:36.347 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:10:36 compute-0 ovn_controller[95281]: 2025-11-29T07:10:36Z|00218|binding|INFO|Setting lport 69f2ab13-2311-4137-9c26-e256f33759e5 ovn-installed in OVS
Nov 29 07:10:36 compute-0 ovn_controller[95281]: 2025-11-29T07:10:36Z|00219|binding|INFO|Setting lport 69f2ab13-2311-4137-9c26-e256f33759e5 up in Southbound
Nov 29 07:10:36 compute-0 nova_compute[187185]: 2025-11-29 07:10:36.353 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:10:36 compute-0 systemd[1]: Started Virtual Machine qemu-32-instance-0000005e.
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:10:36.375 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[dcb3bbe8-87f4-4b43-bb7c-d797ac545649]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:10:36.409 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[5393d877-241b-4831-829b-5bfa91aa1a62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:10:36 compute-0 systemd-udevd[226386]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:10:36.415 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[558f6965-9f3d-4d44-a1d8-00d1b4ea6fa4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:10:36 compute-0 NetworkManager[55227]: <info>  [1764400236.4161] manager: (tap9226dea3-60): new Veth device (/org/freedesktop/NetworkManager/Devices/117)
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:10:36.450 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[ae382f24-4996-47e2-a22f-c1de99858ee3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:10:36.452 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[a95bfbe6-4213-4847-8fac-203a75587888]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:10:36 compute-0 NetworkManager[55227]: <info>  [1764400236.4733] device (tap9226dea3-60): carrier: link connected
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:10:36.479 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[51ea5a9b-49df-4074-ac6a-1438dab3473e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:10:36.500 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[b247e403-5b4f-4028-9325-e66539549a3a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9226dea3-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cb:49:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 568413, 'reachable_time': 24151, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226415, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:10:36.521 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[a80aad16-68cc-43fb-abb1-337d219abf10]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecb:493d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 568413, 'tstamp': 568413}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226416, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:10:36.541 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[66856129-49c9-4e43-8b81-02a02ec32b58]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9226dea3-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cb:49:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 568413, 'reachable_time': 24151, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226417, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:10:36.580 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[37187918-2e99-46ee-82cc-c58d25e0ec0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:10:36.651 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[db813f7e-3163-4511-a9b3-fd398194c07b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:10:36.653 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9226dea3-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:10:36.654 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:10:36.654 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9226dea3-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:10:36 compute-0 nova_compute[187185]: 2025-11-29 07:10:36.656 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:10:36 compute-0 NetworkManager[55227]: <info>  [1764400236.6576] manager: (tap9226dea3-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/118)
Nov 29 07:10:36 compute-0 kernel: tap9226dea3-60: entered promiscuous mode
Nov 29 07:10:36 compute-0 nova_compute[187185]: 2025-11-29 07:10:36.659 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:10:36.667 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9226dea3-60, col_values=(('external_ids', {'iface-id': 'e99fae54-9bf0-4a59-8b06-7a4b6ecf1479'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:10:36 compute-0 nova_compute[187185]: 2025-11-29 07:10:36.668 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:10:36 compute-0 ovn_controller[95281]: 2025-11-29T07:10:36Z|00220|binding|INFO|Releasing lport e99fae54-9bf0-4a59-8b06-7a4b6ecf1479 from this chassis (sb_readonly=0)
Nov 29 07:10:36 compute-0 nova_compute[187185]: 2025-11-29 07:10:36.669 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:10:36 compute-0 nova_compute[187185]: 2025-11-29 07:10:36.687 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:10:36.689 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9226dea3-6355-4dd9-9441-d093c1f1a399.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9226dea3-6355-4dd9-9441-d093c1f1a399.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:10:36.691 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[b6a00566-47d8-4736-a249-1de94fbfc3b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:10:36.692 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-9226dea3-6355-4dd9-9441-d093c1f1a399
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/9226dea3-6355-4dd9-9441-d093c1f1a399.pid.haproxy
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID 9226dea3-6355-4dd9-9441-d093c1f1a399
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:10:36 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:10:36.693 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'env', 'PROCESS_TAG=haproxy-9226dea3-6355-4dd9-9441-d093c1f1a399', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9226dea3-6355-4dd9-9441-d093c1f1a399.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:10:36 compute-0 nova_compute[187185]: 2025-11-29 07:10:36.968 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400236.967417, 5896e8b0-25a2-4075-8ebf-5458b5ed9234 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:10:36 compute-0 nova_compute[187185]: 2025-11-29 07:10:36.969 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] VM Started (Lifecycle Event)
Nov 29 07:10:36 compute-0 nova_compute[187185]: 2025-11-29 07:10:36.988 187189 DEBUG nova.network.neutron [req-223003e0-ae97-4bb1-88c5-96999d286df6 req-f2afd9b4-cec8-4130-a03d-9ba9839f558d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Updated VIF entry in instance network info cache for port 69f2ab13-2311-4137-9c26-e256f33759e5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:10:36 compute-0 nova_compute[187185]: 2025-11-29 07:10:36.989 187189 DEBUG nova.network.neutron [req-223003e0-ae97-4bb1-88c5-96999d286df6 req-f2afd9b4-cec8-4130-a03d-9ba9839f558d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Updating instance_info_cache with network_info: [{"id": "69f2ab13-2311-4137-9c26-e256f33759e5", "address": "fa:16:3e:cf:f0:da", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69f2ab13-23", "ovs_interfaceid": "69f2ab13-2311-4137-9c26-e256f33759e5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:10:36 compute-0 nova_compute[187185]: 2025-11-29 07:10:36.991 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:10:36 compute-0 nova_compute[187185]: 2025-11-29 07:10:36.996 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400236.9685047, 5896e8b0-25a2-4075-8ebf-5458b5ed9234 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:10:36 compute-0 nova_compute[187185]: 2025-11-29 07:10:36.996 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] VM Paused (Lifecycle Event)
Nov 29 07:10:37 compute-0 nova_compute[187185]: 2025-11-29 07:10:37.017 187189 DEBUG oslo_concurrency.lockutils [req-223003e0-ae97-4bb1-88c5-96999d286df6 req-f2afd9b4-cec8-4130-a03d-9ba9839f558d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-5896e8b0-25a2-4075-8ebf-5458b5ed9234" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:10:37 compute-0 nova_compute[187185]: 2025-11-29 07:10:37.027 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:10:37 compute-0 nova_compute[187185]: 2025-11-29 07:10:37.032 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:10:37 compute-0 nova_compute[187185]: 2025-11-29 07:10:37.050 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:10:37 compute-0 podman[226455]: 2025-11-29 07:10:37.094059146 +0000 UTC m=+0.054845019 container create 381532344d12d9ac807746ff3f6956802f3686583b18abc4e35373c93c1323e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:10:37 compute-0 systemd[1]: Started libpod-conmon-381532344d12d9ac807746ff3f6956802f3686583b18abc4e35373c93c1323e6.scope.
Nov 29 07:10:37 compute-0 podman[226455]: 2025-11-29 07:10:37.065719976 +0000 UTC m=+0.026505869 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:10:37 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:10:37 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a5f15b90c3583df15e605da6f64b73a36f9ef28cfa74f5da88d8db45a6cfce6a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:10:37 compute-0 podman[226455]: 2025-11-29 07:10:37.185994749 +0000 UTC m=+0.146780652 container init 381532344d12d9ac807746ff3f6956802f3686583b18abc4e35373c93c1323e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 07:10:37 compute-0 podman[226455]: 2025-11-29 07:10:37.197078662 +0000 UTC m=+0.157864525 container start 381532344d12d9ac807746ff3f6956802f3686583b18abc4e35373c93c1323e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 29 07:10:37 compute-0 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[226470]: [NOTICE]   (226474) : New worker (226476) forked
Nov 29 07:10:37 compute-0 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[226470]: [NOTICE]   (226474) : Loading success.
Nov 29 07:10:38 compute-0 podman[226485]: 2025-11-29 07:10:38.807867133 +0000 UTC m=+0.069377288 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 07:10:39 compute-0 nova_compute[187185]: 2025-11-29 07:10:39.754 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:10:39 compute-0 nova_compute[187185]: 2025-11-29 07:10:39.822 187189 DEBUG nova.compute.manager [req-f5380f32-60c0-455a-8d0b-dc85a9065ba5 req-6a6db3f3-68ff-4e17-a5f3-a3581a7321b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Received event network-vif-plugged-69f2ab13-2311-4137-9c26-e256f33759e5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:10:39 compute-0 nova_compute[187185]: 2025-11-29 07:10:39.822 187189 DEBUG oslo_concurrency.lockutils [req-f5380f32-60c0-455a-8d0b-dc85a9065ba5 req-6a6db3f3-68ff-4e17-a5f3-a3581a7321b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:10:39 compute-0 nova_compute[187185]: 2025-11-29 07:10:39.823 187189 DEBUG oslo_concurrency.lockutils [req-f5380f32-60c0-455a-8d0b-dc85a9065ba5 req-6a6db3f3-68ff-4e17-a5f3-a3581a7321b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:10:39 compute-0 nova_compute[187185]: 2025-11-29 07:10:39.823 187189 DEBUG oslo_concurrency.lockutils [req-f5380f32-60c0-455a-8d0b-dc85a9065ba5 req-6a6db3f3-68ff-4e17-a5f3-a3581a7321b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:10:39 compute-0 nova_compute[187185]: 2025-11-29 07:10:39.823 187189 DEBUG nova.compute.manager [req-f5380f32-60c0-455a-8d0b-dc85a9065ba5 req-6a6db3f3-68ff-4e17-a5f3-a3581a7321b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Processing event network-vif-plugged-69f2ab13-2311-4137-9c26-e256f33759e5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 07:10:39 compute-0 nova_compute[187185]: 2025-11-29 07:10:39.824 187189 DEBUG nova.compute.manager [req-f5380f32-60c0-455a-8d0b-dc85a9065ba5 req-6a6db3f3-68ff-4e17-a5f3-a3581a7321b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Received event network-vif-plugged-69f2ab13-2311-4137-9c26-e256f33759e5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:10:39 compute-0 nova_compute[187185]: 2025-11-29 07:10:39.824 187189 DEBUG oslo_concurrency.lockutils [req-f5380f32-60c0-455a-8d0b-dc85a9065ba5 req-6a6db3f3-68ff-4e17-a5f3-a3581a7321b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:10:39 compute-0 nova_compute[187185]: 2025-11-29 07:10:39.824 187189 DEBUG oslo_concurrency.lockutils [req-f5380f32-60c0-455a-8d0b-dc85a9065ba5 req-6a6db3f3-68ff-4e17-a5f3-a3581a7321b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:10:39 compute-0 nova_compute[187185]: 2025-11-29 07:10:39.824 187189 DEBUG oslo_concurrency.lockutils [req-f5380f32-60c0-455a-8d0b-dc85a9065ba5 req-6a6db3f3-68ff-4e17-a5f3-a3581a7321b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:10:39 compute-0 nova_compute[187185]: 2025-11-29 07:10:39.825 187189 DEBUG nova.compute.manager [req-f5380f32-60c0-455a-8d0b-dc85a9065ba5 req-6a6db3f3-68ff-4e17-a5f3-a3581a7321b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] No waiting events found dispatching network-vif-plugged-69f2ab13-2311-4137-9c26-e256f33759e5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:10:39 compute-0 nova_compute[187185]: 2025-11-29 07:10:39.825 187189 WARNING nova.compute.manager [req-f5380f32-60c0-455a-8d0b-dc85a9065ba5 req-6a6db3f3-68ff-4e17-a5f3-a3581a7321b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Received unexpected event network-vif-plugged-69f2ab13-2311-4137-9c26-e256f33759e5 for instance with vm_state building and task_state spawning.
Nov 29 07:10:39 compute-0 nova_compute[187185]: 2025-11-29 07:10:39.826 187189 DEBUG nova.compute.manager [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:10:39 compute-0 nova_compute[187185]: 2025-11-29 07:10:39.830 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400239.8300781, 5896e8b0-25a2-4075-8ebf-5458b5ed9234 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:10:39 compute-0 nova_compute[187185]: 2025-11-29 07:10:39.830 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] VM Resumed (Lifecycle Event)
Nov 29 07:10:39 compute-0 nova_compute[187185]: 2025-11-29 07:10:39.833 187189 DEBUG nova.virt.libvirt.driver [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 07:10:39 compute-0 nova_compute[187185]: 2025-11-29 07:10:39.837 187189 INFO nova.virt.libvirt.driver [-] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Instance spawned successfully.
Nov 29 07:10:39 compute-0 nova_compute[187185]: 2025-11-29 07:10:39.837 187189 DEBUG nova.virt.libvirt.driver [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 07:10:39 compute-0 nova_compute[187185]: 2025-11-29 07:10:39.916 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:10:39 compute-0 nova_compute[187185]: 2025-11-29 07:10:39.921 187189 DEBUG nova.virt.libvirt.driver [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:10:39 compute-0 nova_compute[187185]: 2025-11-29 07:10:39.921 187189 DEBUG nova.virt.libvirt.driver [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:10:39 compute-0 nova_compute[187185]: 2025-11-29 07:10:39.922 187189 DEBUG nova.virt.libvirt.driver [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:10:39 compute-0 nova_compute[187185]: 2025-11-29 07:10:39.922 187189 DEBUG nova.virt.libvirt.driver [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:10:39 compute-0 nova_compute[187185]: 2025-11-29 07:10:39.922 187189 DEBUG nova.virt.libvirt.driver [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:10:39 compute-0 nova_compute[187185]: 2025-11-29 07:10:39.923 187189 DEBUG nova.virt.libvirt.driver [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:10:39 compute-0 nova_compute[187185]: 2025-11-29 07:10:39.926 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:10:40 compute-0 nova_compute[187185]: 2025-11-29 07:10:40.013 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:10:40 compute-0 nova_compute[187185]: 2025-11-29 07:10:40.051 187189 INFO nova.compute.manager [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Took 10.89 seconds to spawn the instance on the hypervisor.
Nov 29 07:10:40 compute-0 nova_compute[187185]: 2025-11-29 07:10:40.052 187189 DEBUG nova.compute.manager [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:10:40 compute-0 nova_compute[187185]: 2025-11-29 07:10:40.571 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:10:41 compute-0 nova_compute[187185]: 2025-11-29 07:10:41.291 187189 INFO nova.compute.manager [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Took 12.87 seconds to build instance.
Nov 29 07:10:41 compute-0 nova_compute[187185]: 2025-11-29 07:10:41.315 187189 DEBUG oslo_concurrency.lockutils [None req-aca27a91-fc9c-4c93-a918-7794871d81d0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.007s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:10:41 compute-0 podman[226509]: 2025-11-29 07:10:41.829220527 +0000 UTC m=+0.090433902 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 07:10:41 compute-0 podman[226529]: 2025-11-29 07:10:41.952129635 +0000 UTC m=+0.074454252 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 29 07:10:43 compute-0 NetworkManager[55227]: <info>  [1764400243.6090] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/119)
Nov 29 07:10:43 compute-0 NetworkManager[55227]: <info>  [1764400243.6101] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/120)
Nov 29 07:10:43 compute-0 nova_compute[187185]: 2025-11-29 07:10:43.607 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:10:43 compute-0 ovn_controller[95281]: 2025-11-29T07:10:43Z|00221|binding|INFO|Releasing lport e99fae54-9bf0-4a59-8b06-7a4b6ecf1479 from this chassis (sb_readonly=0)
Nov 29 07:10:43 compute-0 nova_compute[187185]: 2025-11-29 07:10:43.665 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:10:43 compute-0 nova_compute[187185]: 2025-11-29 07:10:43.678 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:10:44 compute-0 nova_compute[187185]: 2025-11-29 07:10:44.058 187189 DEBUG nova.compute.manager [req-12451815-98c8-4c46-b303-b4824b34b54b req-58107535-c2d8-4eb6-bc7f-d0fee304e707 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Received event network-changed-69f2ab13-2311-4137-9c26-e256f33759e5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:10:44 compute-0 nova_compute[187185]: 2025-11-29 07:10:44.059 187189 DEBUG nova.compute.manager [req-12451815-98c8-4c46-b303-b4824b34b54b req-58107535-c2d8-4eb6-bc7f-d0fee304e707 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Refreshing instance network info cache due to event network-changed-69f2ab13-2311-4137-9c26-e256f33759e5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:10:44 compute-0 nova_compute[187185]: 2025-11-29 07:10:44.059 187189 DEBUG oslo_concurrency.lockutils [req-12451815-98c8-4c46-b303-b4824b34b54b req-58107535-c2d8-4eb6-bc7f-d0fee304e707 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-5896e8b0-25a2-4075-8ebf-5458b5ed9234" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:10:44 compute-0 nova_compute[187185]: 2025-11-29 07:10:44.059 187189 DEBUG oslo_concurrency.lockutils [req-12451815-98c8-4c46-b303-b4824b34b54b req-58107535-c2d8-4eb6-bc7f-d0fee304e707 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-5896e8b0-25a2-4075-8ebf-5458b5ed9234" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:10:44 compute-0 nova_compute[187185]: 2025-11-29 07:10:44.060 187189 DEBUG nova.network.neutron [req-12451815-98c8-4c46-b303-b4824b34b54b req-58107535-c2d8-4eb6-bc7f-d0fee304e707 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Refreshing network info cache for port 69f2ab13-2311-4137-9c26-e256f33759e5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:10:44 compute-0 nova_compute[187185]: 2025-11-29 07:10:44.646 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:10:44 compute-0 nova_compute[187185]: 2025-11-29 07:10:44.756 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:10:45 compute-0 nova_compute[187185]: 2025-11-29 07:10:45.573 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:10:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:47.996 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234', 'name': 'tempest-ServerActionsTestJSON-server-878694992', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000005e', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '6e6c366001df43fb91731faf7a9578fc', 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'hostId': '5640fa721172a4d7bf83648244233a9823df778da5b3ab8028ef92fd', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 07:10:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:47.997 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.003 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 5896e8b0-25a2-4075-8ebf-5458b5ed9234 / tap69f2ab13-23 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.003 12 DEBUG ceilometer.compute.pollsters [-] 5896e8b0-25a2-4075-8ebf-5458b5ed9234/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '155ef1a1-bb1e-4688-951e-090707448f3f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': 'instance-0000005e-5896e8b0-25a2-4075-8ebf-5458b5ed9234-tap69f2ab13-23', 'timestamp': '2025-11-29T07:10:47.997996', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-878694992', 'name': 'tap69f2ab13-23', 'instance_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234', 'instance_type': 'm1.nano', 'host': '5640fa721172a4d7bf83648244233a9823df778da5b3ab8028ef92fd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:cf:f0:da', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap69f2ab13-23'}, 'message_id': '87be6a70-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5695.716251634, 'message_signature': 'c7c729a1981faacc32def2d7f75a48be31031faaf4c824e0cb6a5b6b60ee607e'}]}, 'timestamp': '2025-11-29 07:10:48.004425', '_unique_id': '01bf534b52ec420cbc856cdc81d8cf76'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.005 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.007 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.007 12 DEBUG ceilometer.compute.pollsters [-] 5896e8b0-25a2-4075-8ebf-5458b5ed9234/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '51839874-7490-4e7c-b297-4d15ce7cfef3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': 'instance-0000005e-5896e8b0-25a2-4075-8ebf-5458b5ed9234-tap69f2ab13-23', 'timestamp': '2025-11-29T07:10:48.007108', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-878694992', 'name': 'tap69f2ab13-23', 'instance_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234', 'instance_type': 'm1.nano', 'host': '5640fa721172a4d7bf83648244233a9823df778da5b3ab8028ef92fd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:cf:f0:da', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap69f2ab13-23'}, 'message_id': '87bee84c-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5695.716251634, 'message_signature': 'd5264c84822f9f8a4271ed3fdef55a60688fdf935b96b2537697cdedcdd9bd6a'}]}, 'timestamp': '2025-11-29 07:10:48.007431', '_unique_id': 'c3f4665a047349678255072fe84618b1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.008 12 DEBUG ceilometer.compute.pollsters [-] 5896e8b0-25a2-4075-8ebf-5458b5ed9234/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'da0028c4-ec62-4a4f-90b9-1bb2863531c1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': 'instance-0000005e-5896e8b0-25a2-4075-8ebf-5458b5ed9234-tap69f2ab13-23', 'timestamp': '2025-11-29T07:10:48.008894', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-878694992', 'name': 'tap69f2ab13-23', 'instance_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234', 'instance_type': 'm1.nano', 'host': '5640fa721172a4d7bf83648244233a9823df778da5b3ab8028ef92fd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:cf:f0:da', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap69f2ab13-23'}, 'message_id': '87bf2dac-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5695.716251634, 'message_signature': '63208ab45f791c6ced0f05a6bce7f8edbcf1ff2373a9bd454d02b7cbe8d404a7'}]}, 'timestamp': '2025-11-29 07:10:48.009201', '_unique_id': '84d33fd68ae74a668bfd7a191fdc79fe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.009 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.010 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.010 12 DEBUG ceilometer.compute.pollsters [-] 5896e8b0-25a2-4075-8ebf-5458b5ed9234/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '999a790b-2189-4a06-9e26-bcbc43ced772', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': 'instance-0000005e-5896e8b0-25a2-4075-8ebf-5458b5ed9234-tap69f2ab13-23', 'timestamp': '2025-11-29T07:10:48.010616', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-878694992', 'name': 'tap69f2ab13-23', 'instance_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234', 'instance_type': 'm1.nano', 'host': '5640fa721172a4d7bf83648244233a9823df778da5b3ab8028ef92fd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:cf:f0:da', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap69f2ab13-23'}, 'message_id': '87bf7118-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5695.716251634, 'message_signature': '72ca2fabfa96460ad8423b31ac5d70f5a085090162da254be34945d3075c1460'}]}, 'timestamp': '2025-11-29 07:10:48.010947', '_unique_id': '41950e9d36044647b9fbc749bf3d3d07'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.011 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.012 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.012 12 DEBUG ceilometer.compute.pollsters [-] 5896e8b0-25a2-4075-8ebf-5458b5ed9234/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c45eb54c-aaea-4d67-93c9-ab6dabf6cfe8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': 'instance-0000005e-5896e8b0-25a2-4075-8ebf-5458b5ed9234-tap69f2ab13-23', 'timestamp': '2025-11-29T07:10:48.012570', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-878694992', 'name': 'tap69f2ab13-23', 'instance_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234', 'instance_type': 'm1.nano', 'host': '5640fa721172a4d7bf83648244233a9823df778da5b3ab8028ef92fd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:cf:f0:da', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap69f2ab13-23'}, 'message_id': '87bfbbbe-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5695.716251634, 'message_signature': '9fd8a80d06ca6d9030a43a41eb99202b34f911a6407db499245400dc2fc8f8d2'}]}, 'timestamp': '2025-11-29 07:10:48.012815', '_unique_id': 'c5d18583ced9446fb8b7c5eb7e59c2a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.013 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.040 12 DEBUG ceilometer.compute.pollsters [-] 5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.040 12 DEBUG ceilometer.compute.pollsters [-] 5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c1a761fa-2e14-425f-b90e-4ad853e881f3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234-vda', 'timestamp': '2025-11-29T07:10:48.014011', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-878694992', 'name': 'instance-0000005e', 'instance_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234', 'instance_type': 'm1.nano', 'host': '5640fa721172a4d7bf83648244233a9823df778da5b3ab8028ef92fd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '87c40192-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5695.732252456, 'message_signature': 'a115e4a904e4f0d59facc8e1b6edc80be74284fe6dfe88756a47b7cb072b0fe9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': 
'5896e8b0-25a2-4075-8ebf-5458b5ed9234-sda', 'timestamp': '2025-11-29T07:10:48.014011', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-878694992', 'name': 'instance-0000005e', 'instance_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234', 'instance_type': 'm1.nano', 'host': '5640fa721172a4d7bf83648244233a9823df778da5b3ab8028ef92fd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '87c40e3a-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5695.732252456, 'message_signature': 'b9ad94ce3bc5e136b2163c88101f27cfd2b6d5acc345f2537819c04b97e33ffd'}]}, 'timestamp': '2025-11-29 07:10:48.041128', '_unique_id': 'b729783b737e432c8823aac0efef082d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.042 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.053 12 DEBUG ceilometer.compute.pollsters [-] 5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.053 12 DEBUG ceilometer.compute.pollsters [-] 5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5541b391-ac0e-4f3e-9ba8-d8fb3cf4d628', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234-vda', 'timestamp': '2025-11-29T07:10:48.042917', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-878694992', 'name': 'instance-0000005e', 'instance_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234', 'instance_type': 'm1.nano', 'host': '5640fa721172a4d7bf83648244233a9823df778da5b3ab8028ef92fd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '87c60096-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5695.761157151, 'message_signature': '4276af1d7dd5bd893dde2befd2e8478a1fa56c60e3f68762b1296d7dce0b924b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': 
'5896e8b0-25a2-4075-8ebf-5458b5ed9234-sda', 'timestamp': '2025-11-29T07:10:48.042917', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-878694992', 'name': 'instance-0000005e', 'instance_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234', 'instance_type': 'm1.nano', 'host': '5640fa721172a4d7bf83648244233a9823df778da5b3ab8028ef92fd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '87c60adc-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5695.761157151, 'message_signature': '3d5d04bb235713dc9ece10c0f5b555009ee7fe17676879c16c078b7832fcdaef'}]}, 'timestamp': '2025-11-29 07:10:48.054140', '_unique_id': '75acb160971548d2898cb29928f9873e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.054 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.055 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.055 12 DEBUG ceilometer.compute.pollsters [-] 5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk.device.read.latency volume: 209732975 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.055 12 DEBUG ceilometer.compute.pollsters [-] 5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk.device.read.latency volume: 740641 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '341e970a-70c1-42a5-8d6e-53fe93446eb9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 209732975, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234-vda', 'timestamp': '2025-11-29T07:10:48.055601', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-878694992', 'name': 'instance-0000005e', 'instance_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234', 'instance_type': 'm1.nano', 'host': '5640fa721172a4d7bf83648244233a9823df778da5b3ab8028ef92fd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '87c64c86-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5695.732252456, 'message_signature': 'e12fc1a4fd32117bddb5dabb6b34ff86c50ca92669f034e18ab0b1b5f6f9a389'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 740641, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234-sda', 'timestamp': '2025-11-29T07:10:48.055601', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-878694992', 'name': 'instance-0000005e', 'instance_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234', 'instance_type': 'm1.nano', 'host': '5640fa721172a4d7bf83648244233a9823df778da5b3ab8028ef92fd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '87c65500-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5695.732252456, 'message_signature': '6eabced7d79d71102cc0ba57c5beddf6f4f6eb686de11a07491bd4ed0d326d1f'}]}, 'timestamp': '2025-11-29 07:10:48.056029', '_unique_id': '70e2f0d577534f419dff819760496fa7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.056 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.057 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.057 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.057 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerActionsTestJSON-server-878694992>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestJSON-server-878694992>]
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.057 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.057 12 DEBUG ceilometer.compute.pollsters [-] 5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.057 12 DEBUG ceilometer.compute.pollsters [-] 5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3b7cad75-351a-4b5e-bbdb-a0f000ede8ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234-vda', 'timestamp': '2025-11-29T07:10:48.057508', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-878694992', 'name': 'instance-0000005e', 'instance_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234', 'instance_type': 'm1.nano', 'host': '5640fa721172a4d7bf83648244233a9823df778da5b3ab8028ef92fd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '87c696be-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5695.761157151, 'message_signature': '8b03257055d6e26d0788be6d665af11e8a681458323a6d8b365b992cd0905b1b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234-sda', 'timestamp': '2025-11-29T07:10:48.057508', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-878694992', 'name': 'instance-0000005e', 'instance_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234', 'instance_type': 'm1.nano', 'host': '5640fa721172a4d7bf83648244233a9823df778da5b3ab8028ef92fd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '87c69e5c-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5695.761157151, 'message_signature': 'ab6ced6d3490bc62a4e56e03fe25e10528af27fd9d374f2a272c6925fb1fdb21'}]}, 'timestamp': '2025-11-29 07:10:48.057921', '_unique_id': 'fb77cd848ac44d3dace15aaaa678310f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.058 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestJSON-server-878694992>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestJSON-server-878694992>]
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 DEBUG ceilometer.compute.pollsters [-] 5896e8b0-25a2-4075-8ebf-5458b5ed9234/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eaa9b192-c021-4cd6-bd08-1c4c99bd09aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': 'instance-0000005e-5896e8b0-25a2-4075-8ebf-5458b5ed9234-tap69f2ab13-23', 'timestamp': '2025-11-29T07:10:48.059259', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-878694992', 'name': 'tap69f2ab13-23', 'instance_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234', 'instance_type': 'm1.nano', 'host': '5640fa721172a4d7bf83648244233a9823df778da5b3ab8028ef92fd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:cf:f0:da', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap69f2ab13-23'}, 'message_id': '87c6db88-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5695.716251634, 'message_signature': '30c0142e6ba67592fb3536640ae4f9a48ac53fce3af5dc6e53e24688812808e7'}]}, 'timestamp': '2025-11-29 07:10:48.059485', '_unique_id': '1d9a336eb88b4f73b8f3717a80750737'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.059 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.060 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.060 12 DEBUG ceilometer.compute.pollsters [-] 5896e8b0-25a2-4075-8ebf-5458b5ed9234/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'feae0abc-2a3d-4cb4-a0fc-168f260596bc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': 'instance-0000005e-5896e8b0-25a2-4075-8ebf-5458b5ed9234-tap69f2ab13-23', 'timestamp': '2025-11-29T07:10:48.060564', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-878694992', 'name': 'tap69f2ab13-23', 'instance_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234', 'instance_type': 'm1.nano', 'host': '5640fa721172a4d7bf83648244233a9823df778da5b3ab8028ef92fd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:cf:f0:da', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap69f2ab13-23'}, 'message_id': '87c70e14-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5695.716251634, 'message_signature': 'f608193b6c1f0d0fe4bda525e7fe55a08797485a08087b65b3d6a4d38e0596d8'}]}, 'timestamp': '2025-11-29 07:10:48.060781', '_unique_id': '65a9d13b3747402bb143d49d7f2f6bec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.061 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.081 12 DEBUG ceilometer.compute.pollsters [-] 5896e8b0-25a2-4075-8ebf-5458b5ed9234/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.081 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 5896e8b0-25a2-4075-8ebf-5458b5ed9234: ceilometer.compute.pollsters.NoVolumeException
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.081 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.081 12 DEBUG ceilometer.compute.pollsters [-] 5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.082 12 DEBUG ceilometer.compute.pollsters [-] 5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '15090780-942d-412e-8d9a-d50aa934eb95', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234-vda', 'timestamp': '2025-11-29T07:10:48.081720', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-878694992', 'name': 'instance-0000005e', 'instance_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234', 'instance_type': 'm1.nano', 'host': '5640fa721172a4d7bf83648244233a9823df778da5b3ab8028ef92fd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '87ca4f5c-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5695.732252456, 'message_signature': '27290a88af1f1c539976100a8dbecd766c3a896c372dafd4590241c68f87a2d2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234-sda', 'timestamp': '2025-11-29T07:10:48.081720', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-878694992', 'name': 'instance-0000005e', 'instance_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234', 'instance_type': 'm1.nano', 'host': '5640fa721172a4d7bf83648244233a9823df778da5b3ab8028ef92fd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '87ca60aa-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5695.732252456, 'message_signature': 'e286eedab0b8cd6f69c034cac05f95bb2a00ca4c6f60a80c97f3fe4776becf96'}]}, 'timestamp': '2025-11-29 07:10:48.082651', '_unique_id': '432d54ada37d40a7a89b2b9feaf89b0e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.083 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.084 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.085 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.085 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerActionsTestJSON-server-878694992>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestJSON-server-878694992>]
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.085 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.085 12 DEBUG ceilometer.compute.pollsters [-] 5896e8b0-25a2-4075-8ebf-5458b5ed9234/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f57c85d9-d2c8-4518-9294-9a367234dd67', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': 'instance-0000005e-5896e8b0-25a2-4075-8ebf-5458b5ed9234-tap69f2ab13-23', 'timestamp': '2025-11-29T07:10:48.085772', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-878694992', 'name': 'tap69f2ab13-23', 'instance_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234', 'instance_type': 'm1.nano', 'host': '5640fa721172a4d7bf83648244233a9823df778da5b3ab8028ef92fd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:cf:f0:da', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap69f2ab13-23'}, 'message_id': '87caebec-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5695.716251634, 'message_signature': 'fee1d0db7ff135eb8925586ef5bac4e6dcf2ddde2d0a6881d36a0d5b1f135225'}]}, 'timestamp': '2025-11-29 07:10:48.086174', '_unique_id': '29350fb1f1c54c41b668b6911cc8a0a3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.086 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.087 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.087 12 DEBUG ceilometer.compute.pollsters [-] 5896e8b0-25a2-4075-8ebf-5458b5ed9234/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cb5a46a4-59ba-496d-8ad8-ce0a9fa60ea0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': 'instance-0000005e-5896e8b0-25a2-4075-8ebf-5458b5ed9234-tap69f2ab13-23', 'timestamp': '2025-11-29T07:10:48.087672', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-878694992', 'name': 'tap69f2ab13-23', 'instance_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234', 'instance_type': 'm1.nano', 'host': '5640fa721172a4d7bf83648244233a9823df778da5b3ab8028ef92fd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:cf:f0:da', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap69f2ab13-23'}, 'message_id': '87cb3408-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5695.716251634, 'message_signature': '1147aafc59ee8f0234f552268d082da2782aa9ed22bf4f622d92dbbc6797a87b'}]}, 'timestamp': '2025-11-29 07:10:48.088036', '_unique_id': 'bbd638275134454dac8f3e02458abf56'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.088 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.089 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.089 12 DEBUG ceilometer.compute.pollsters [-] 5896e8b0-25a2-4075-8ebf-5458b5ed9234/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a9e7e6f1-b562-4da1-bfa3-434fb6702994', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': 'instance-0000005e-5896e8b0-25a2-4075-8ebf-5458b5ed9234-tap69f2ab13-23', 'timestamp': '2025-11-29T07:10:48.089783', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-878694992', 'name': 'tap69f2ab13-23', 'instance_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234', 'instance_type': 'm1.nano', 'host': '5640fa721172a4d7bf83648244233a9823df778da5b3ab8028ef92fd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:cf:f0:da', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap69f2ab13-23'}, 'message_id': '87cb8714-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5695.716251634, 'message_signature': '1a7a7d733e1075a8d1e48d50c94894cb393e03114c6f629094923a8b9f5358d8'}]}, 'timestamp': '2025-11-29 07:10:48.090142', '_unique_id': '0d8ebe4b98ac48b09e64e198351b479a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.090 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.091 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.091 12 DEBUG ceilometer.compute.pollsters [-] 5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.092 12 DEBUG ceilometer.compute.pollsters [-] 5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cf1e44c0-e173-49d2-abfb-80dc012106b1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234-vda', 'timestamp': '2025-11-29T07:10:48.091663', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-878694992', 'name': 'instance-0000005e', 'instance_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234', 'instance_type': 'm1.nano', 'host': '5640fa721172a4d7bf83648244233a9823df778da5b3ab8028ef92fd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '87cbd07a-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5695.732252456, 'message_signature': 'c161a37e5e2d04bd9d89852f80e55ed45c30765a23fef1c630fd8f01f27171ea'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234-sda', 'timestamp': '2025-11-29T07:10:48.091663', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-878694992', 'name': 'instance-0000005e', 'instance_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234', 'instance_type': 'm1.nano', 'host': '5640fa721172a4d7bf83648244233a9823df778da5b3ab8028ef92fd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '87cbdcd2-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5695.732252456, 'message_signature': '4239ff1d46577f985bffc60536e330c34307f38a58b0f16c9e72b364165c6a5c'}]}, 'timestamp': '2025-11-29 07:10:48.092318', '_unique_id': 'c4a532acfe3b48ae9d2c5602e54a751c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.093 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 DEBUG ceilometer.compute.pollsters [-] 5896e8b0-25a2-4075-8ebf-5458b5ed9234/cpu volume: 7990000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'af97a4ec-a828-4f55-b65a-fb28d49ae603', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 7990000000, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234', 'timestamp': '2025-11-29T07:10:48.094041', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-878694992', 'name': 'instance-0000005e', 'instance_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234', 'instance_type': 'm1.nano', 'host': '5640fa721172a4d7bf83648244233a9823df778da5b3ab8028ef92fd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '87cc2bc4-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5695.799265956, 'message_signature': 'c6e5d63bae4a68264ddbde9c81360364ba5848749739662f3c687af79ed3c7fd'}]}, 'timestamp': '2025-11-29 07:10:48.094346', '_unique_id': '9d5f680abad544b69f913cb3429707d5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.094 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.095 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.095 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.095 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestJSON-server-878694992>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestJSON-server-878694992>]
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.096 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.096 12 DEBUG ceilometer.compute.pollsters [-] 5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.096 12 DEBUG ceilometer.compute.pollsters [-] 5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '51075a8c-7ce2-4933-9daa-937ad9295c8b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234-vda', 'timestamp': '2025-11-29T07:10:48.096247', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-878694992', 'name': 'instance-0000005e', 'instance_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234', 'instance_type': 'm1.nano', 'host': '5640fa721172a4d7bf83648244233a9823df778da5b3ab8028ef92fd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '87cc81aa-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5695.732252456, 'message_signature': 'dc35ed7d288dab32fc29013c3ada6b228ac6ff7c944ab728f49250ebd833ce8b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 
'resource_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234-sda', 'timestamp': '2025-11-29T07:10:48.096247', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-878694992', 'name': 'instance-0000005e', 'instance_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234', 'instance_type': 'm1.nano', 'host': '5640fa721172a4d7bf83648244233a9823df778da5b3ab8028ef92fd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '87cc8c9a-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5695.732252456, 'message_signature': '3d56d80d3470d5908c1ea2a1e1a07a85184b06614f19af5419074f9fd7ef2e46'}]}, 'timestamp': '2025-11-29 07:10:48.096813', '_unique_id': '0ce255beedd04cb696188bce6f563f5b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.097 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.098 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.098 12 DEBUG ceilometer.compute.pollsters [-] 5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.098 12 DEBUG ceilometer.compute.pollsters [-] 5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cb7c8fb0-ab08-427c-9dfd-b75706fa5a7b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234-vda', 'timestamp': '2025-11-29T07:10:48.098403', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-878694992', 'name': 'instance-0000005e', 'instance_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234', 'instance_type': 'm1.nano', 'host': '5640fa721172a4d7bf83648244233a9823df778da5b3ab8028ef92fd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '87ccd61e-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5695.761157151, 'message_signature': 'b43a6376f68d8e60d48fe07679be29f5b101a998e2e424ea6fa6c7e40460c267'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': 
'5896e8b0-25a2-4075-8ebf-5458b5ed9234-sda', 'timestamp': '2025-11-29T07:10:48.098403', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-878694992', 'name': 'instance-0000005e', 'instance_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234', 'instance_type': 'm1.nano', 'host': '5640fa721172a4d7bf83648244233a9823df778da5b3ab8028ef92fd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '87cce1e0-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5695.761157151, 'message_signature': '46fc7e683b3996f89b4d698b342fce7b40be178ca284939135e158bca92066b4'}]}, 'timestamp': '2025-11-29 07:10:48.099000', '_unique_id': '6e3f9cae8a0d4574994ab89b4cc61316'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.099 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.100 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.100 12 DEBUG ceilometer.compute.pollsters [-] 5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.100 12 DEBUG ceilometer.compute.pollsters [-] 5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'af30b0df-01a5-41a1-9844-66875c27975d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234-vda', 'timestamp': '2025-11-29T07:10:48.100507', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-878694992', 'name': 'instance-0000005e', 'instance_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234', 'instance_type': 'm1.nano', 'host': '5640fa721172a4d7bf83648244233a9823df778da5b3ab8028ef92fd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '87cd2826-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5695.732252456, 'message_signature': '7329c18cdd0632159f5065e3c1ea008013e675e0b2e12e9326d838481dd06e2b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_name': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_name': None, 'resource_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234-sda', 'timestamp': '2025-11-29T07:10:48.100507', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestJSON-server-878694992', 'name': 'instance-0000005e', 'instance_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234', 'instance_type': 'm1.nano', 'host': '5640fa721172a4d7bf83648244233a9823df778da5b3ab8028ef92fd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '87cd33e8-ccf2-11f0-8f64-fa163e220349', 'monotonic_time': 5695.732252456, 'message_signature': '8b1cd9a878cbe34465adfc7e73705f6256334b4d6b5ee322e083554dc77250a2'}]}, 'timestamp': '2025-11-29 07:10:48.101097', '_unique_id': '0a71aa44140c40bdb13bcc44be2ee397'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:10:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:10:48.101 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:10:48 compute-0 nova_compute[187185]: 2025-11-29 07:10:48.605 187189 DEBUG nova.network.neutron [req-12451815-98c8-4c46-b303-b4824b34b54b req-58107535-c2d8-4eb6-bc7f-d0fee304e707 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Updated VIF entry in instance network info cache for port 69f2ab13-2311-4137-9c26-e256f33759e5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:10:48 compute-0 nova_compute[187185]: 2025-11-29 07:10:48.605 187189 DEBUG nova.network.neutron [req-12451815-98c8-4c46-b303-b4824b34b54b req-58107535-c2d8-4eb6-bc7f-d0fee304e707 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Updating instance_info_cache with network_info: [{"id": "69f2ab13-2311-4137-9c26-e256f33759e5", "address": "fa:16:3e:cf:f0:da", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69f2ab13-23", "ovs_interfaceid": "69f2ab13-2311-4137-9c26-e256f33759e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:10:48 compute-0 nova_compute[187185]: 2025-11-29 07:10:48.639 187189 DEBUG oslo_concurrency.lockutils [req-12451815-98c8-4c46-b303-b4824b34b54b req-58107535-c2d8-4eb6-bc7f-d0fee304e707 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-5896e8b0-25a2-4075-8ebf-5458b5ed9234" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:10:49 compute-0 nova_compute[187185]: 2025-11-29 07:10:49.759 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:10:50 compute-0 nova_compute[187185]: 2025-11-29 07:10:50.576 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:10:53 compute-0 ovn_controller[95281]: 2025-11-29T07:10:53Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:cf:f0:da 10.100.0.6
Nov 29 07:10:53 compute-0 ovn_controller[95281]: 2025-11-29T07:10:53Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:cf:f0:da 10.100.0.6
Nov 29 07:10:53 compute-0 nova_compute[187185]: 2025-11-29 07:10:53.667 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:10:53 compute-0 podman[226568]: 2025-11-29 07:10:53.804786257 +0000 UTC m=+0.065575211 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true)
Nov 29 07:10:53 compute-0 podman[226569]: 2025-11-29 07:10:53.810091266 +0000 UTC m=+0.065889729 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.openshift.expose-services=, version=9.6, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350)
Nov 29 07:10:53 compute-0 podman[226570]: 2025-11-29 07:10:53.812925376 +0000 UTC m=+0.060392514 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 07:10:54 compute-0 nova_compute[187185]: 2025-11-29 07:10:54.762 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:10:55 compute-0 nova_compute[187185]: 2025-11-29 07:10:55.580 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:10:57 compute-0 nova_compute[187185]: 2025-11-29 07:10:57.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:10:57 compute-0 nova_compute[187185]: 2025-11-29 07:10:57.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:10:57 compute-0 nova_compute[187185]: 2025-11-29 07:10:57.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:10:58 compute-0 nova_compute[187185]: 2025-11-29 07:10:58.923 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "refresh_cache-5896e8b0-25a2-4075-8ebf-5458b5ed9234" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:10:58 compute-0 nova_compute[187185]: 2025-11-29 07:10:58.924 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquired lock "refresh_cache-5896e8b0-25a2-4075-8ebf-5458b5ed9234" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:10:58 compute-0 nova_compute[187185]: 2025-11-29 07:10:58.924 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 29 07:10:58 compute-0 nova_compute[187185]: 2025-11-29 07:10:58.924 187189 DEBUG nova.objects.instance [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 5896e8b0-25a2-4075-8ebf-5458b5ed9234 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:10:59 compute-0 nova_compute[187185]: 2025-11-29 07:10:59.765 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:11:00 compute-0 nova_compute[187185]: 2025-11-29 07:11:00.582 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:11:03 compute-0 podman[226632]: 2025-11-29 07:11:03.871761033 +0000 UTC m=+0.131538491 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 29 07:11:04 compute-0 nova_compute[187185]: 2025-11-29 07:11:04.192 187189 DEBUG oslo_concurrency.lockutils [None req-04f1088e-92ac-4dc5-b7b1-d924ea2ffc9e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:11:04 compute-0 nova_compute[187185]: 2025-11-29 07:11:04.192 187189 DEBUG oslo_concurrency.lockutils [None req-04f1088e-92ac-4dc5-b7b1-d924ea2ffc9e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:11:04 compute-0 nova_compute[187185]: 2025-11-29 07:11:04.193 187189 DEBUG nova.compute.manager [None req-04f1088e-92ac-4dc5-b7b1-d924ea2ffc9e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:11:04 compute-0 nova_compute[187185]: 2025-11-29 07:11:04.197 187189 DEBUG nova.compute.manager [None req-04f1088e-92ac-4dc5-b7b1-d924ea2ffc9e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338
Nov 29 07:11:04 compute-0 nova_compute[187185]: 2025-11-29 07:11:04.198 187189 DEBUG nova.objects.instance [None req-04f1088e-92ac-4dc5-b7b1-d924ea2ffc9e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'flavor' on Instance uuid 5896e8b0-25a2-4075-8ebf-5458b5ed9234 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:11:04 compute-0 nova_compute[187185]: 2025-11-29 07:11:04.252 187189 DEBUG nova.objects.instance [None req-04f1088e-92ac-4dc5-b7b1-d924ea2ffc9e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'info_cache' on Instance uuid 5896e8b0-25a2-4075-8ebf-5458b5ed9234 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:11:04 compute-0 nova_compute[187185]: 2025-11-29 07:11:04.333 187189 DEBUG nova.virt.libvirt.driver [None req-04f1088e-92ac-4dc5-b7b1-d924ea2ffc9e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Nov 29 07:11:04 compute-0 nova_compute[187185]: 2025-11-29 07:11:04.768 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:11:05 compute-0 nova_compute[187185]: 2025-11-29 07:11:05.585 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:11:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:05.742 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:11:05 compute-0 nova_compute[187185]: 2025-11-29 07:11:05.745 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:11:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:05.746 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:11:06 compute-0 kernel: tap69f2ab13-23 (unregistering): left promiscuous mode
Nov 29 07:11:06 compute-0 NetworkManager[55227]: <info>  [1764400266.5352] device (tap69f2ab13-23): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:11:06 compute-0 ovn_controller[95281]: 2025-11-29T07:11:06Z|00222|binding|INFO|Releasing lport 69f2ab13-2311-4137-9c26-e256f33759e5 from this chassis (sb_readonly=0)
Nov 29 07:11:06 compute-0 ovn_controller[95281]: 2025-11-29T07:11:06Z|00223|binding|INFO|Setting lport 69f2ab13-2311-4137-9c26-e256f33759e5 down in Southbound
Nov 29 07:11:06 compute-0 ovn_controller[95281]: 2025-11-29T07:11:06Z|00224|binding|INFO|Removing iface tap69f2ab13-23 ovn-installed in OVS
Nov 29 07:11:06 compute-0 nova_compute[187185]: 2025-11-29 07:11:06.553 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:11:06 compute-0 nova_compute[187185]: 2025-11-29 07:11:06.557 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:11:06 compute-0 nova_compute[187185]: 2025-11-29 07:11:06.567 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:11:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:06.576 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cf:f0:da 10.100.0.6'], port_security=['fa:16:3e:cf:f0:da 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9226dea3-6355-4dd9-9441-d093c1f1a399', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e6c366001df43fb91731faf7a9578fc', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1ab550ec-f542-4b91-8354-868da408e26d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.225'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d2b9cdf8-2516-48a0-81c5-85c2327075d5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=69f2ab13-2311-4137-9c26-e256f33759e5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:11:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:06.578 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 69f2ab13-2311-4137-9c26-e256f33759e5 in datapath 9226dea3-6355-4dd9-9441-d093c1f1a399 unbound from our chassis
Nov 29 07:11:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:06.580 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9226dea3-6355-4dd9-9441-d093c1f1a399, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:11:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:06.582 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[3104ee6a-6bae-40f5-b2c7-f8acad7cea8d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:11:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:06.583 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 namespace which is not needed anymore
Nov 29 07:11:06 compute-0 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d0000005e.scope: Deactivated successfully.
Nov 29 07:11:06 compute-0 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d0000005e.scope: Consumed 13.976s CPU time.
Nov 29 07:11:06 compute-0 systemd-machined[153486]: Machine qemu-32-instance-0000005e terminated.
Nov 29 07:11:06 compute-0 nova_compute[187185]: 2025-11-29 07:11:06.791 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:11:06 compute-0 nova_compute[187185]: 2025-11-29 07:11:06.797 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:11:06 compute-0 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[226470]: [NOTICE]   (226474) : haproxy version is 2.8.14-c23fe91
Nov 29 07:11:06 compute-0 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[226470]: [NOTICE]   (226474) : path to executable is /usr/sbin/haproxy
Nov 29 07:11:06 compute-0 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[226470]: [WARNING]  (226474) : Exiting Master process...
Nov 29 07:11:06 compute-0 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[226470]: [ALERT]    (226474) : Current worker (226476) exited with code 143 (Terminated)
Nov 29 07:11:06 compute-0 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[226470]: [WARNING]  (226474) : All workers exited. Exiting... (0)
Nov 29 07:11:06 compute-0 systemd[1]: libpod-381532344d12d9ac807746ff3f6956802f3686583b18abc4e35373c93c1323e6.scope: Deactivated successfully.
Nov 29 07:11:06 compute-0 podman[226688]: 2025-11-29 07:11:06.811041573 +0000 UTC m=+0.111798035 container died 381532344d12d9ac807746ff3f6956802f3686583b18abc4e35373c93c1323e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 29 07:11:06 compute-0 nova_compute[187185]: 2025-11-29 07:11:06.823 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Updating instance_info_cache with network_info: [{"id": "69f2ab13-2311-4137-9c26-e256f33759e5", "address": "fa:16:3e:cf:f0:da", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69f2ab13-23", "ovs_interfaceid": "69f2ab13-2311-4137-9c26-e256f33759e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:11:06 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-381532344d12d9ac807746ff3f6956802f3686583b18abc4e35373c93c1323e6-userdata-shm.mount: Deactivated successfully.
Nov 29 07:11:06 compute-0 systemd[1]: var-lib-containers-storage-overlay-a5f15b90c3583df15e605da6f64b73a36f9ef28cfa74f5da88d8db45a6cfce6a-merged.mount: Deactivated successfully.
Nov 29 07:11:06 compute-0 podman[226688]: 2025-11-29 07:11:06.86198253 +0000 UTC m=+0.162738962 container cleanup 381532344d12d9ac807746ff3f6956802f3686583b18abc4e35373c93c1323e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 07:11:06 compute-0 systemd[1]: libpod-conmon-381532344d12d9ac807746ff3f6956802f3686583b18abc4e35373c93c1323e6.scope: Deactivated successfully.
Nov 29 07:11:06 compute-0 podman[226735]: 2025-11-29 07:11:06.936534553 +0000 UTC m=+0.044982290 container remove 381532344d12d9ac807746ff3f6956802f3686583b18abc4e35373c93c1323e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:11:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:06.944 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6ed8cef6-3c54-45fc-8534-75d17098e658]: (4, ('Sat Nov 29 07:11:06 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 (381532344d12d9ac807746ff3f6956802f3686583b18abc4e35373c93c1323e6)\n381532344d12d9ac807746ff3f6956802f3686583b18abc4e35373c93c1323e6\nSat Nov 29 07:11:06 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 (381532344d12d9ac807746ff3f6956802f3686583b18abc4e35373c93c1323e6)\n381532344d12d9ac807746ff3f6956802f3686583b18abc4e35373c93c1323e6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:11:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:06.947 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[f8f77984-90a2-4f61-8af7-938d55ec7f1e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:11:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:06.948 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9226dea3-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:11:06 compute-0 nova_compute[187185]: 2025-11-29 07:11:06.950 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:11:06 compute-0 kernel: tap9226dea3-60: left promiscuous mode
Nov 29 07:11:06 compute-0 nova_compute[187185]: 2025-11-29 07:11:06.969 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:11:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:06.974 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[22ebceef-145a-437f-94dc-098e67910283]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:11:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:06.997 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[7c154681-e61a-474d-8e0b-ccbdcf286dd3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:11:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:06.999 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[9b477362-59f8-460f-8289-6c0dd3591b01]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:11:07 compute-0 nova_compute[187185]: 2025-11-29 07:11:07.011 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Releasing lock "refresh_cache-5896e8b0-25a2-4075-8ebf-5458b5ed9234" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:11:07 compute-0 nova_compute[187185]: 2025-11-29 07:11:07.012 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 29 07:11:07 compute-0 nova_compute[187185]: 2025-11-29 07:11:07.013 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:11:07 compute-0 nova_compute[187185]: 2025-11-29 07:11:07.013 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:11:07 compute-0 nova_compute[187185]: 2025-11-29 07:11:07.014 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:11:07 compute-0 nova_compute[187185]: 2025-11-29 07:11:07.014 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:11:07 compute-0 nova_compute[187185]: 2025-11-29 07:11:07.015 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:11:07 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:07.014 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[25ddf191-21b6-40de-bd13-7cccf12e8763]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 568406, 'reachable_time': 38820, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226754, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:11:07 compute-0 nova_compute[187185]: 2025-11-29 07:11:07.015 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:11:07 compute-0 nova_compute[187185]: 2025-11-29 07:11:07.016 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:11:07 compute-0 nova_compute[187185]: 2025-11-29 07:11:07.016 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:11:07 compute-0 systemd[1]: run-netns-ovnmeta\x2d9226dea3\x2d6355\x2d4dd9\x2d9441\x2dd093c1f1a399.mount: Deactivated successfully.
Nov 29 07:11:07 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:07.020 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:11:07 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:07.020 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[7e0a366f-3bdd-4f9e-aada-738d9b4bb960]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:11:07 compute-0 nova_compute[187185]: 2025-11-29 07:11:07.176 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:11:07 compute-0 nova_compute[187185]: 2025-11-29 07:11:07.177 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:11:07 compute-0 nova_compute[187185]: 2025-11-29 07:11:07.178 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:11:07 compute-0 nova_compute[187185]: 2025-11-29 07:11:07.178 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:11:07 compute-0 nova_compute[187185]: 2025-11-29 07:11:07.352 187189 INFO nova.virt.libvirt.driver [None req-04f1088e-92ac-4dc5-b7b1-d924ea2ffc9e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Instance shutdown successfully after 3 seconds.
Nov 29 07:11:07 compute-0 nova_compute[187185]: 2025-11-29 07:11:07.358 187189 INFO nova.virt.libvirt.driver [-] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Instance destroyed successfully.
Nov 29 07:11:07 compute-0 nova_compute[187185]: 2025-11-29 07:11:07.359 187189 DEBUG nova.objects.instance [None req-04f1088e-92ac-4dc5-b7b1-d924ea2ffc9e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'numa_topology' on Instance uuid 5896e8b0-25a2-4075-8ebf-5458b5ed9234 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:11:07 compute-0 nova_compute[187185]: 2025-11-29 07:11:07.389 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:11:07 compute-0 nova_compute[187185]: 2025-11-29 07:11:07.414 187189 DEBUG nova.compute.manager [None req-04f1088e-92ac-4dc5-b7b1-d924ea2ffc9e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:11:07 compute-0 nova_compute[187185]: 2025-11-29 07:11:07.458 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:11:07 compute-0 nova_compute[187185]: 2025-11-29 07:11:07.459 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:11:07 compute-0 nova_compute[187185]: 2025-11-29 07:11:07.533 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:11:07 compute-0 nova_compute[187185]: 2025-11-29 07:11:07.580 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:11:07 compute-0 nova_compute[187185]: 2025-11-29 07:11:07.698 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:11:07 compute-0 nova_compute[187185]: 2025-11-29 07:11:07.701 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5733MB free_disk=73.26745986938477GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:11:07 compute-0 nova_compute[187185]: 2025-11-29 07:11:07.701 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:11:07 compute-0 nova_compute[187185]: 2025-11-29 07:11:07.701 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:11:07 compute-0 nova_compute[187185]: 2025-11-29 07:11:07.834 187189 DEBUG oslo_concurrency.lockutils [None req-04f1088e-92ac-4dc5-b7b1-d924ea2ffc9e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 3.641s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:11:07 compute-0 nova_compute[187185]: 2025-11-29 07:11:07.954 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance 5896e8b0-25a2-4075-8ebf-5458b5ed9234 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 07:11:07 compute-0 nova_compute[187185]: 2025-11-29 07:11:07.955 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:11:07 compute-0 nova_compute[187185]: 2025-11-29 07:11:07.955 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:11:08 compute-0 nova_compute[187185]: 2025-11-29 07:11:08.059 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:11:08 compute-0 nova_compute[187185]: 2025-11-29 07:11:08.102 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:11:08 compute-0 nova_compute[187185]: 2025-11-29 07:11:08.134 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:11:08 compute-0 nova_compute[187185]: 2025-11-29 07:11:08.135 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.433s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:11:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:08.748 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:11:09 compute-0 nova_compute[187185]: 2025-11-29 07:11:09.770 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:11:09 compute-0 podman[226762]: 2025-11-29 07:11:09.8367487 +0000 UTC m=+0.093333734 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 07:11:10 compute-0 nova_compute[187185]: 2025-11-29 07:11:10.587 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:11:12 compute-0 podman[226790]: 2025-11-29 07:11:12.810260505 +0000 UTC m=+0.063215245 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute)
Nov 29 07:11:12 compute-0 podman[226789]: 2025-11-29 07:11:12.828733956 +0000 UTC m=+0.085559305 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:11:13 compute-0 nova_compute[187185]: 2025-11-29 07:11:13.154 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:11:14 compute-0 nova_compute[187185]: 2025-11-29 07:11:14.772 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:11:15 compute-0 nova_compute[187185]: 2025-11-29 07:11:15.128 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:11:15 compute-0 nova_compute[187185]: 2025-11-29 07:11:15.589 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:11:19 compute-0 nova_compute[187185]: 2025-11-29 07:11:19.775 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:11:20 compute-0 nova_compute[187185]: 2025-11-29 07:11:20.593 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:11:20 compute-0 nova_compute[187185]: 2025-11-29 07:11:20.961 187189 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 0.05 sec
Nov 29 07:11:20 compute-0 nova_compute[187185]: 2025-11-29 07:11:20.989 187189 DEBUG nova.objects.instance [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'flavor' on Instance uuid 5896e8b0-25a2-4075-8ebf-5458b5ed9234 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:11:21 compute-0 nova_compute[187185]: 2025-11-29 07:11:21.032 187189 DEBUG nova.objects.instance [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'info_cache' on Instance uuid 5896e8b0-25a2-4075-8ebf-5458b5ed9234 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:11:21 compute-0 nova_compute[187185]: 2025-11-29 07:11:21.095 187189 DEBUG oslo_concurrency.lockutils [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "refresh_cache-5896e8b0-25a2-4075-8ebf-5458b5ed9234" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:11:21 compute-0 nova_compute[187185]: 2025-11-29 07:11:21.095 187189 DEBUG oslo_concurrency.lockutils [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquired lock "refresh_cache-5896e8b0-25a2-4075-8ebf-5458b5ed9234" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:11:21 compute-0 nova_compute[187185]: 2025-11-29 07:11:21.096 187189 DEBUG nova.network.neutron [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:11:21 compute-0 nova_compute[187185]: 2025-11-29 07:11:21.849 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400266.8485065, 5896e8b0-25a2-4075-8ebf-5458b5ed9234 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:11:21 compute-0 nova_compute[187185]: 2025-11-29 07:11:21.850 187189 INFO nova.compute.manager [-] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] VM Stopped (Lifecycle Event)
Nov 29 07:11:21 compute-0 nova_compute[187185]: 2025-11-29 07:11:21.889 187189 DEBUG nova.compute.manager [None req-79436e28-3073-4a4b-8b6c-6061f1b36e9a - - - - - -] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:11:21 compute-0 nova_compute[187185]: 2025-11-29 07:11:21.892 187189 DEBUG nova.compute.manager [None req-79436e28-3073-4a4b-8b6c-6061f1b36e9a - - - - - -] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:11:21 compute-0 nova_compute[187185]: 2025-11-29 07:11:21.960 187189 INFO nova.compute.manager [None req-79436e28-3073-4a4b-8b6c-6061f1b36e9a - - - - - -] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] During sync_power_state the instance has a pending task (powering-on). Skip.
Nov 29 07:11:22 compute-0 nova_compute[187185]: 2025-11-29 07:11:22.753 187189 DEBUG nova.compute.manager [req-622fd30e-f7de-46fe-b131-21b5bbaffc6a req-e9e0189a-6326-44bc-b2c3-73c03fd12d58 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Received event network-vif-unplugged-69f2ab13-2311-4137-9c26-e256f33759e5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:11:22 compute-0 nova_compute[187185]: 2025-11-29 07:11:22.754 187189 DEBUG oslo_concurrency.lockutils [req-622fd30e-f7de-46fe-b131-21b5bbaffc6a req-e9e0189a-6326-44bc-b2c3-73c03fd12d58 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:11:22 compute-0 nova_compute[187185]: 2025-11-29 07:11:22.755 187189 DEBUG oslo_concurrency.lockutils [req-622fd30e-f7de-46fe-b131-21b5bbaffc6a req-e9e0189a-6326-44bc-b2c3-73c03fd12d58 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:11:22 compute-0 nova_compute[187185]: 2025-11-29 07:11:22.755 187189 DEBUG oslo_concurrency.lockutils [req-622fd30e-f7de-46fe-b131-21b5bbaffc6a req-e9e0189a-6326-44bc-b2c3-73c03fd12d58 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:11:22 compute-0 nova_compute[187185]: 2025-11-29 07:11:22.756 187189 DEBUG nova.compute.manager [req-622fd30e-f7de-46fe-b131-21b5bbaffc6a req-e9e0189a-6326-44bc-b2c3-73c03fd12d58 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] No waiting events found dispatching network-vif-unplugged-69f2ab13-2311-4137-9c26-e256f33759e5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:11:22 compute-0 nova_compute[187185]: 2025-11-29 07:11:22.756 187189 WARNING nova.compute.manager [req-622fd30e-f7de-46fe-b131-21b5bbaffc6a req-e9e0189a-6326-44bc-b2c3-73c03fd12d58 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Received unexpected event network-vif-unplugged-69f2ab13-2311-4137-9c26-e256f33759e5 for instance with vm_state stopped and task_state powering-on.
Nov 29 07:11:24 compute-0 nova_compute[187185]: 2025-11-29 07:11:24.777 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:11:24 compute-0 podman[226829]: 2025-11-29 07:11:24.799773585 +0000 UTC m=+0.060434770 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 29 07:11:24 compute-0 podman[226831]: 2025-11-29 07:11:24.820763038 +0000 UTC m=+0.070873635 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 07:11:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:24.830 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:11:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:24.831 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:11:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:24.831 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:11:24 compute-0 podman[226830]: 2025-11-29 07:11:24.845352694 +0000 UTC m=+0.096238993 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.33.7, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, vendor=Red Hat, Inc., version=9.6, io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, release=1755695350, container_name=openstack_network_exporter, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container)
Nov 29 07:11:25 compute-0 nova_compute[187185]: 2025-11-29 07:11:25.595 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:11:27 compute-0 nova_compute[187185]: 2025-11-29 07:11:27.252 187189 DEBUG nova.network.neutron [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Updating instance_info_cache with network_info: [{"id": "69f2ab13-2311-4137-9c26-e256f33759e5", "address": "fa:16:3e:cf:f0:da", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69f2ab13-23", "ovs_interfaceid": "69f2ab13-2311-4137-9c26-e256f33759e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:11:28 compute-0 nova_compute[187185]: 2025-11-29 07:11:28.029 187189 DEBUG oslo_concurrency.lockutils [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Releasing lock "refresh_cache-5896e8b0-25a2-4075-8ebf-5458b5ed9234" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:11:28 compute-0 nova_compute[187185]: 2025-11-29 07:11:28.277 187189 DEBUG nova.compute.manager [req-4902ae29-713a-4359-9984-2a3afbf6e469 req-9cd0d0e5-2b1f-4a14-a090-75a9132563a6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Received event network-vif-plugged-69f2ab13-2311-4137-9c26-e256f33759e5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:11:28 compute-0 nova_compute[187185]: 2025-11-29 07:11:28.277 187189 DEBUG oslo_concurrency.lockutils [req-4902ae29-713a-4359-9984-2a3afbf6e469 req-9cd0d0e5-2b1f-4a14-a090-75a9132563a6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:11:28 compute-0 nova_compute[187185]: 2025-11-29 07:11:28.278 187189 DEBUG oslo_concurrency.lockutils [req-4902ae29-713a-4359-9984-2a3afbf6e469 req-9cd0d0e5-2b1f-4a14-a090-75a9132563a6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:11:28 compute-0 nova_compute[187185]: 2025-11-29 07:11:28.278 187189 DEBUG oslo_concurrency.lockutils [req-4902ae29-713a-4359-9984-2a3afbf6e469 req-9cd0d0e5-2b1f-4a14-a090-75a9132563a6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:11:28 compute-0 nova_compute[187185]: 2025-11-29 07:11:28.278 187189 DEBUG nova.compute.manager [req-4902ae29-713a-4359-9984-2a3afbf6e469 req-9cd0d0e5-2b1f-4a14-a090-75a9132563a6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] No waiting events found dispatching network-vif-plugged-69f2ab13-2311-4137-9c26-e256f33759e5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:11:28 compute-0 nova_compute[187185]: 2025-11-29 07:11:28.278 187189 WARNING nova.compute.manager [req-4902ae29-713a-4359-9984-2a3afbf6e469 req-9cd0d0e5-2b1f-4a14-a090-75a9132563a6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Received unexpected event network-vif-plugged-69f2ab13-2311-4137-9c26-e256f33759e5 for instance with vm_state stopped and task_state powering-on.
Nov 29 07:11:28 compute-0 nova_compute[187185]: 2025-11-29 07:11:28.976 187189 INFO nova.virt.libvirt.driver [-] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Instance destroyed successfully.
Nov 29 07:11:28 compute-0 nova_compute[187185]: 2025-11-29 07:11:28.977 187189 DEBUG nova.objects.instance [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'numa_topology' on Instance uuid 5896e8b0-25a2-4075-8ebf-5458b5ed9234 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:11:29 compute-0 nova_compute[187185]: 2025-11-29 07:11:29.780 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:11:30 compute-0 nova_compute[187185]: 2025-11-29 07:11:30.598 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:11:32 compute-0 nova_compute[187185]: 2025-11-29 07:11:32.005 187189 DEBUG nova.objects.instance [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'resources' on Instance uuid 5896e8b0-25a2-4075-8ebf-5458b5ed9234 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:11:34 compute-0 nova_compute[187185]: 2025-11-29 07:11:34.468 187189 DEBUG nova.virt.libvirt.vif [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:10:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-878694992',display_name='tempest-ServerActionsTestJSON-server-878694992',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-878694992',id=94,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCMk4ml8b3fuOLAahNu+FCDl9v+SWDrsolrsZ2S6Gt/612rFnCoMRUdrLWExAZcapVyOr30SCUoxNsWGg9cONHUEcP39CbtnDaooW7qgHPKQnqdGyUHB5iGAkkvgw7SerQ==',key_name='tempest-keypair-11305483',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:10:40Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='6e6c366001df43fb91731faf7a9578fc',ramdisk_id='',reservation_id='r-j3wprymf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-157226036',owner_user_name='tempest-ServerActionsTestJSON-157226036-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:11:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e1b8fbcc8caa4d94b69570f233c56d18',uuid=5896e8b0-25a2-4075-8ebf-5458b5ed9234,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "69f2ab13-2311-4137-9c26-e256f33759e5", "address": "fa:16:3e:cf:f0:da", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69f2ab13-23", "ovs_interfaceid": "69f2ab13-2311-4137-9c26-e256f33759e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:11:34 compute-0 nova_compute[187185]: 2025-11-29 07:11:34.469 187189 DEBUG nova.network.os_vif_util [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converting VIF {"id": "69f2ab13-2311-4137-9c26-e256f33759e5", "address": "fa:16:3e:cf:f0:da", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69f2ab13-23", "ovs_interfaceid": "69f2ab13-2311-4137-9c26-e256f33759e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:11:34 compute-0 nova_compute[187185]: 2025-11-29 07:11:34.470 187189 DEBUG nova.network.os_vif_util [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:cf:f0:da,bridge_name='br-int',has_traffic_filtering=True,id=69f2ab13-2311-4137-9c26-e256f33759e5,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69f2ab13-23') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:11:34 compute-0 nova_compute[187185]: 2025-11-29 07:11:34.470 187189 DEBUG os_vif [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:cf:f0:da,bridge_name='br-int',has_traffic_filtering=True,id=69f2ab13-2311-4137-9c26-e256f33759e5,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69f2ab13-23') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:11:34 compute-0 nova_compute[187185]: 2025-11-29 07:11:34.472 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:11:34 compute-0 nova_compute[187185]: 2025-11-29 07:11:34.472 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap69f2ab13-23, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:11:34 compute-0 nova_compute[187185]: 2025-11-29 07:11:34.474 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:11:34 compute-0 nova_compute[187185]: 2025-11-29 07:11:34.476 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:11:34 compute-0 nova_compute[187185]: 2025-11-29 07:11:34.477 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:11:34 compute-0 nova_compute[187185]: 2025-11-29 07:11:34.480 187189 INFO os_vif [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:cf:f0:da,bridge_name='br-int',has_traffic_filtering=True,id=69f2ab13-2311-4137-9c26-e256f33759e5,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69f2ab13-23')
Nov 29 07:11:34 compute-0 nova_compute[187185]: 2025-11-29 07:11:34.488 187189 DEBUG nova.virt.libvirt.driver [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Start _get_guest_xml network_info=[{"id": "69f2ab13-2311-4137-9c26-e256f33759e5", "address": "fa:16:3e:cf:f0:da", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69f2ab13-23", "ovs_interfaceid": "69f2ab13-2311-4137-9c26-e256f33759e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:11:34 compute-0 nova_compute[187185]: 2025-11-29 07:11:34.492 187189 WARNING nova.virt.libvirt.driver [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:11:34 compute-0 nova_compute[187185]: 2025-11-29 07:11:34.503 187189 DEBUG nova.virt.libvirt.host [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:11:34 compute-0 nova_compute[187185]: 2025-11-29 07:11:34.504 187189 DEBUG nova.virt.libvirt.host [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:11:34 compute-0 nova_compute[187185]: 2025-11-29 07:11:34.507 187189 DEBUG nova.virt.libvirt.host [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:11:34 compute-0 nova_compute[187185]: 2025-11-29 07:11:34.508 187189 DEBUG nova.virt.libvirt.host [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:11:34 compute-0 nova_compute[187185]: 2025-11-29 07:11:34.509 187189 DEBUG nova.virt.libvirt.driver [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:11:34 compute-0 nova_compute[187185]: 2025-11-29 07:11:34.509 187189 DEBUG nova.virt.hardware [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:11:34 compute-0 nova_compute[187185]: 2025-11-29 07:11:34.510 187189 DEBUG nova.virt.hardware [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:11:34 compute-0 nova_compute[187185]: 2025-11-29 07:11:34.510 187189 DEBUG nova.virt.hardware [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:11:34 compute-0 nova_compute[187185]: 2025-11-29 07:11:34.510 187189 DEBUG nova.virt.hardware [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:11:34 compute-0 nova_compute[187185]: 2025-11-29 07:11:34.510 187189 DEBUG nova.virt.hardware [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:11:34 compute-0 nova_compute[187185]: 2025-11-29 07:11:34.510 187189 DEBUG nova.virt.hardware [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:11:34 compute-0 nova_compute[187185]: 2025-11-29 07:11:34.511 187189 DEBUG nova.virt.hardware [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:11:34 compute-0 nova_compute[187185]: 2025-11-29 07:11:34.511 187189 DEBUG nova.virt.hardware [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:11:34 compute-0 nova_compute[187185]: 2025-11-29 07:11:34.511 187189 DEBUG nova.virt.hardware [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:11:34 compute-0 nova_compute[187185]: 2025-11-29 07:11:34.511 187189 DEBUG nova.virt.hardware [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:11:34 compute-0 nova_compute[187185]: 2025-11-29 07:11:34.511 187189 DEBUG nova.virt.hardware [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:11:34 compute-0 nova_compute[187185]: 2025-11-29 07:11:34.512 187189 DEBUG nova.objects.instance [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'vcpu_model' on Instance uuid 5896e8b0-25a2-4075-8ebf-5458b5ed9234 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:11:34 compute-0 nova_compute[187185]: 2025-11-29 07:11:34.783 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:11:34 compute-0 podman[226892]: 2025-11-29 07:11:34.873360479 +0000 UTC m=+0.126946361 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 07:11:36 compute-0 nova_compute[187185]: 2025-11-29 07:11:36.710 187189 DEBUG oslo_concurrency.processutils [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:11:36 compute-0 nova_compute[187185]: 2025-11-29 07:11:36.801 187189 DEBUG oslo_concurrency.processutils [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk.config --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:11:36 compute-0 nova_compute[187185]: 2025-11-29 07:11:36.803 187189 DEBUG oslo_concurrency.lockutils [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "/var/lib/nova/instances/5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:11:36 compute-0 nova_compute[187185]: 2025-11-29 07:11:36.805 187189 DEBUG oslo_concurrency.lockutils [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "/var/lib/nova/instances/5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:11:36 compute-0 nova_compute[187185]: 2025-11-29 07:11:36.807 187189 DEBUG oslo_concurrency.lockutils [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "/var/lib/nova/instances/5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:11:36 compute-0 nova_compute[187185]: 2025-11-29 07:11:36.808 187189 DEBUG nova.virt.libvirt.vif [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:10:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-878694992',display_name='tempest-ServerActionsTestJSON-server-878694992',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-878694992',id=94,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCMk4ml8b3fuOLAahNu+FCDl9v+SWDrsolrsZ2S6Gt/612rFnCoMRUdrLWExAZcapVyOr30SCUoxNsWGg9cONHUEcP39CbtnDaooW7qgHPKQnqdGyUHB5iGAkkvgw7SerQ==',key_name='tempest-keypair-11305483',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:10:40Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='6e6c366001df43fb91731faf7a9578fc',ramdisk_id='',reservation_id='r-j3wprymf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-157226036',owner_user_name='tempest-ServerActionsTestJSON-157226036-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:11:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e1b8fbcc8caa4d94b69570f233c56d18',uuid=5896e8b0-25a2-4075-8ebf-5458b5ed9234,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "69f2ab13-2311-4137-9c26-e256f33759e5", "address": "fa:16:3e:cf:f0:da", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69f2ab13-23", "ovs_interfaceid": "69f2ab13-2311-4137-9c26-e256f33759e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:11:36 compute-0 nova_compute[187185]: 2025-11-29 07:11:36.808 187189 DEBUG nova.network.os_vif_util [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converting VIF {"id": "69f2ab13-2311-4137-9c26-e256f33759e5", "address": "fa:16:3e:cf:f0:da", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69f2ab13-23", "ovs_interfaceid": "69f2ab13-2311-4137-9c26-e256f33759e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:11:36 compute-0 nova_compute[187185]: 2025-11-29 07:11:36.809 187189 DEBUG nova.network.os_vif_util [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:cf:f0:da,bridge_name='br-int',has_traffic_filtering=True,id=69f2ab13-2311-4137-9c26-e256f33759e5,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69f2ab13-23') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:11:36 compute-0 nova_compute[187185]: 2025-11-29 07:11:36.811 187189 DEBUG nova.objects.instance [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'pci_devices' on Instance uuid 5896e8b0-25a2-4075-8ebf-5458b5ed9234 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:11:39 compute-0 nova_compute[187185]: 2025-11-29 07:11:39.478 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:11:39 compute-0 nova_compute[187185]: 2025-11-29 07:11:39.785 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:11:40 compute-0 nova_compute[187185]: 2025-11-29 07:11:40.047 187189 DEBUG nova.virt.libvirt.driver [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:11:40 compute-0 nova_compute[187185]:   <uuid>5896e8b0-25a2-4075-8ebf-5458b5ed9234</uuid>
Nov 29 07:11:40 compute-0 nova_compute[187185]:   <name>instance-0000005e</name>
Nov 29 07:11:40 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 07:11:40 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:11:40 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:11:40 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:       <nova:name>tempest-ServerActionsTestJSON-server-878694992</nova:name>
Nov 29 07:11:40 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:11:34</nova:creationTime>
Nov 29 07:11:40 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 07:11:40 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 07:11:40 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:11:40 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:11:40 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:11:40 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:11:40 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:11:40 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:11:40 compute-0 nova_compute[187185]:         <nova:user uuid="e1b8fbcc8caa4d94b69570f233c56d18">tempest-ServerActionsTestJSON-157226036-project-member</nova:user>
Nov 29 07:11:40 compute-0 nova_compute[187185]:         <nova:project uuid="6e6c366001df43fb91731faf7a9578fc">tempest-ServerActionsTestJSON-157226036</nova:project>
Nov 29 07:11:40 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:11:40 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 07:11:40 compute-0 nova_compute[187185]:         <nova:port uuid="69f2ab13-2311-4137-9c26-e256f33759e5">
Nov 29 07:11:40 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:11:40 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:11:40 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:11:40 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <system>
Nov 29 07:11:40 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:11:40 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:11:40 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:11:40 compute-0 nova_compute[187185]:       <entry name="serial">5896e8b0-25a2-4075-8ebf-5458b5ed9234</entry>
Nov 29 07:11:40 compute-0 nova_compute[187185]:       <entry name="uuid">5896e8b0-25a2-4075-8ebf-5458b5ed9234</entry>
Nov 29 07:11:40 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     </system>
Nov 29 07:11:40 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:11:40 compute-0 nova_compute[187185]:   <os>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:   </os>
Nov 29 07:11:40 compute-0 nova_compute[187185]:   <features>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:   </features>
Nov 29 07:11:40 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:11:40 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:11:40 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:11:40 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:11:40 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk.config"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:11:40 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:cf:f0:da"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:       <target dev="tap69f2ab13-23"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:11:40 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/5896e8b0-25a2-4075-8ebf-5458b5ed9234/console.log" append="off"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <video>
Nov 29 07:11:40 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     </video>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <input type="keyboard" bus="usb"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:11:40 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:11:40 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:11:40 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:11:40 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:11:40 compute-0 nova_compute[187185]: </domain>
Nov 29 07:11:40 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:11:40 compute-0 nova_compute[187185]: 2025-11-29 07:11:40.049 187189 DEBUG oslo_concurrency.processutils [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:11:40 compute-0 nova_compute[187185]: 2025-11-29 07:11:40.127 187189 DEBUG oslo_concurrency.processutils [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:11:40 compute-0 nova_compute[187185]: 2025-11-29 07:11:40.128 187189 DEBUG oslo_concurrency.processutils [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:11:40 compute-0 nova_compute[187185]: 2025-11-29 07:11:40.200 187189 DEBUG oslo_concurrency.processutils [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:11:40 compute-0 nova_compute[187185]: 2025-11-29 07:11:40.202 187189 DEBUG nova.objects.instance [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'trusted_certs' on Instance uuid 5896e8b0-25a2-4075-8ebf-5458b5ed9234 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:11:40 compute-0 nova_compute[187185]: 2025-11-29 07:11:40.342 187189 DEBUG oslo_concurrency.processutils [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:11:40 compute-0 nova_compute[187185]: 2025-11-29 07:11:40.435 187189 DEBUG oslo_concurrency.processutils [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:11:40 compute-0 nova_compute[187185]: 2025-11-29 07:11:40.436 187189 DEBUG nova.virt.disk.api [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Checking if we can resize image /var/lib/nova/instances/5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:11:40 compute-0 nova_compute[187185]: 2025-11-29 07:11:40.437 187189 DEBUG oslo_concurrency.processutils [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:11:40 compute-0 nova_compute[187185]: 2025-11-29 07:11:40.502 187189 DEBUG oslo_concurrency.processutils [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:11:40 compute-0 nova_compute[187185]: 2025-11-29 07:11:40.503 187189 DEBUG nova.virt.disk.api [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Cannot resize image /var/lib/nova/instances/5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:11:40 compute-0 nova_compute[187185]: 2025-11-29 07:11:40.504 187189 DEBUG nova.objects.instance [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'migration_context' on Instance uuid 5896e8b0-25a2-4075-8ebf-5458b5ed9234 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:11:40 compute-0 nova_compute[187185]: 2025-11-29 07:11:40.719 187189 DEBUG nova.virt.libvirt.vif [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:10:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-878694992',display_name='tempest-ServerActionsTestJSON-server-878694992',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-878694992',id=94,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCMk4ml8b3fuOLAahNu+FCDl9v+SWDrsolrsZ2S6Gt/612rFnCoMRUdrLWExAZcapVyOr30SCUoxNsWGg9cONHUEcP39CbtnDaooW7qgHPKQnqdGyUHB5iGAkkvgw7SerQ==',key_name='tempest-keypair-11305483',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:10:40Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='6e6c366001df43fb91731faf7a9578fc',ramdisk_id='',reservation_id='r-j3wprymf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-157226036',owner_user_name='tempest-ServerActionsTestJSON-157226036-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:11:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e1b8fbcc8caa4d94b69570f233c56d18',uuid=5896e8b0-25a2-4075-8ebf-5458b5ed9234,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "69f2ab13-2311-4137-9c26-e256f33759e5", "address": "fa:16:3e:cf:f0:da", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69f2ab13-23", "ovs_interfaceid": "69f2ab13-2311-4137-9c26-e256f33759e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:11:40 compute-0 nova_compute[187185]: 2025-11-29 07:11:40.720 187189 DEBUG nova.network.os_vif_util [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converting VIF {"id": "69f2ab13-2311-4137-9c26-e256f33759e5", "address": "fa:16:3e:cf:f0:da", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69f2ab13-23", "ovs_interfaceid": "69f2ab13-2311-4137-9c26-e256f33759e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:11:40 compute-0 nova_compute[187185]: 2025-11-29 07:11:40.721 187189 DEBUG nova.network.os_vif_util [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:cf:f0:da,bridge_name='br-int',has_traffic_filtering=True,id=69f2ab13-2311-4137-9c26-e256f33759e5,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69f2ab13-23') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:11:40 compute-0 nova_compute[187185]: 2025-11-29 07:11:40.721 187189 DEBUG os_vif [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:cf:f0:da,bridge_name='br-int',has_traffic_filtering=True,id=69f2ab13-2311-4137-9c26-e256f33759e5,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69f2ab13-23') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:11:40 compute-0 nova_compute[187185]: 2025-11-29 07:11:40.722 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:11:40 compute-0 nova_compute[187185]: 2025-11-29 07:11:40.723 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:11:40 compute-0 nova_compute[187185]: 2025-11-29 07:11:40.724 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:11:40 compute-0 nova_compute[187185]: 2025-11-29 07:11:40.727 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:11:40 compute-0 nova_compute[187185]: 2025-11-29 07:11:40.727 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap69f2ab13-23, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:11:40 compute-0 nova_compute[187185]: 2025-11-29 07:11:40.728 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap69f2ab13-23, col_values=(('external_ids', {'iface-id': '69f2ab13-2311-4137-9c26-e256f33759e5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cf:f0:da', 'vm-uuid': '5896e8b0-25a2-4075-8ebf-5458b5ed9234'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:11:40 compute-0 NetworkManager[55227]: <info>  [1764400300.7321] manager: (tap69f2ab13-23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/121)
Nov 29 07:11:40 compute-0 nova_compute[187185]: 2025-11-29 07:11:40.733 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:11:40 compute-0 nova_compute[187185]: 2025-11-29 07:11:40.738 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:11:40 compute-0 nova_compute[187185]: 2025-11-29 07:11:40.739 187189 INFO os_vif [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:cf:f0:da,bridge_name='br-int',has_traffic_filtering=True,id=69f2ab13-2311-4137-9c26-e256f33759e5,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69f2ab13-23')
Nov 29 07:11:40 compute-0 podman[226935]: 2025-11-29 07:11:40.801030142 +0000 UTC m=+0.061735776 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 07:11:40 compute-0 kernel: tap69f2ab13-23: entered promiscuous mode
Nov 29 07:11:40 compute-0 ovn_controller[95281]: 2025-11-29T07:11:40Z|00225|binding|INFO|Claiming lport 69f2ab13-2311-4137-9c26-e256f33759e5 for this chassis.
Nov 29 07:11:40 compute-0 ovn_controller[95281]: 2025-11-29T07:11:40Z|00226|binding|INFO|69f2ab13-2311-4137-9c26-e256f33759e5: Claiming fa:16:3e:cf:f0:da 10.100.0.6
Nov 29 07:11:40 compute-0 nova_compute[187185]: 2025-11-29 07:11:40.835 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:11:40 compute-0 NetworkManager[55227]: <info>  [1764400300.8370] manager: (tap69f2ab13-23): new Tun device (/org/freedesktop/NetworkManager/Devices/122)
Nov 29 07:11:40 compute-0 ovn_controller[95281]: 2025-11-29T07:11:40Z|00227|binding|INFO|Setting lport 69f2ab13-2311-4137-9c26-e256f33759e5 ovn-installed in OVS
Nov 29 07:11:40 compute-0 nova_compute[187185]: 2025-11-29 07:11:40.849 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:11:40 compute-0 nova_compute[187185]: 2025-11-29 07:11:40.851 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:11:40 compute-0 systemd-machined[153486]: New machine qemu-33-instance-0000005e.
Nov 29 07:11:40 compute-0 systemd-udevd[226975]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:11:40 compute-0 systemd[1]: Started Virtual Machine qemu-33-instance-0000005e.
Nov 29 07:11:40 compute-0 NetworkManager[55227]: <info>  [1764400300.9002] device (tap69f2ab13-23): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:11:40 compute-0 NetworkManager[55227]: <info>  [1764400300.9013] device (tap69f2ab13-23): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:11:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:40.945 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cf:f0:da 10.100.0.6'], port_security=['fa:16:3e:cf:f0:da 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9226dea3-6355-4dd9-9441-d093c1f1a399', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e6c366001df43fb91731faf7a9578fc', 'neutron:revision_number': '5', 'neutron:security_group_ids': '1ab550ec-f542-4b91-8354-868da408e26d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.225'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d2b9cdf8-2516-48a0-81c5-85c2327075d5, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=69f2ab13-2311-4137-9c26-e256f33759e5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:11:40 compute-0 ovn_controller[95281]: 2025-11-29T07:11:40Z|00228|binding|INFO|Setting lport 69f2ab13-2311-4137-9c26-e256f33759e5 up in Southbound
Nov 29 07:11:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:40.946 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 69f2ab13-2311-4137-9c26-e256f33759e5 in datapath 9226dea3-6355-4dd9-9441-d093c1f1a399 bound to our chassis
Nov 29 07:11:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:40.948 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9226dea3-6355-4dd9-9441-d093c1f1a399
Nov 29 07:11:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:40.960 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[5abb4eff-38aa-467e-9218-6010aab69efb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:11:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:40.962 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9226dea3-61 in ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:11:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:40.964 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9226dea3-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:11:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:40.964 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[2cedd5cd-e5e2-4cae-a1fa-006558bf19ff]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:11:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:40.965 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[a863c23c-31d5-47a8-b458-6fadf66b53b0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:11:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:40.978 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[42f81de9-8b1d-4e90-9d5e-6b49d909adda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:40.999 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[26891709-b72b-4fe6-b75e-d380b7d3f08e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:41.029 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[54b51035-6b72-43fe-8e4a-5c9179289650]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:11:41 compute-0 systemd-udevd[226977]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:41.035 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[9a485044-193e-4c0c-a9ff-9c4832319a5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:11:41 compute-0 NetworkManager[55227]: <info>  [1764400301.0366] manager: (tap9226dea3-60): new Veth device (/org/freedesktop/NetworkManager/Devices/123)
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:41.071 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[505e3fab-dfaf-48b6-973f-975f41044935]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:41.075 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[32c9275a-484c-49c2-95db-8589d7d1d739]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:11:41 compute-0 NetworkManager[55227]: <info>  [1764400301.0971] device (tap9226dea3-60): carrier: link connected
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:41.100 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[21177bbd-bff9-439c-8b02-e019aed81595]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:41.116 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[8c830376-b0d1-4c73-8d88-018bdcfcafd9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9226dea3-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cb:49:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 75], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 574876, 'reachable_time': 36375, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227014, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:41.131 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[61fce370-cd00-4fdc-ab85-ef137aad116d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecb:493d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 574876, 'tstamp': 574876}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227016, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:11:41 compute-0 nova_compute[187185]: 2025-11-29 07:11:41.140 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400301.1393187, 5896e8b0-25a2-4075-8ebf-5458b5ed9234 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:11:41 compute-0 nova_compute[187185]: 2025-11-29 07:11:41.141 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] VM Resumed (Lifecycle Event)
Nov 29 07:11:41 compute-0 nova_compute[187185]: 2025-11-29 07:11:41.144 187189 DEBUG nova.compute.manager [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:11:41 compute-0 nova_compute[187185]: 2025-11-29 07:11:41.149 187189 INFO nova.virt.libvirt.driver [-] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Instance rebooted successfully.
Nov 29 07:11:41 compute-0 nova_compute[187185]: 2025-11-29 07:11:41.150 187189 DEBUG nova.compute.manager [None req-1b639fa1-ed86-46eb-9b81-75930774b341 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:41.153 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[fdcb645c-deb7-40fa-9418-da8af723107d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9226dea3-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cb:49:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 75], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 574876, 'reachable_time': 36375, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227017, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:41.186 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[2571a2d4-9b8e-4907-9771-54a8298d81bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:41.239 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[804a7dcf-df5e-491d-9699-b91b643f12ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:41.241 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9226dea3-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:41.242 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:41.242 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9226dea3-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:11:41 compute-0 nova_compute[187185]: 2025-11-29 07:11:41.244 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:11:41 compute-0 kernel: tap9226dea3-60: entered promiscuous mode
Nov 29 07:11:41 compute-0 NetworkManager[55227]: <info>  [1764400301.2453] manager: (tap9226dea3-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/124)
Nov 29 07:11:41 compute-0 nova_compute[187185]: 2025-11-29 07:11:41.247 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:41.250 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9226dea3-60, col_values=(('external_ids', {'iface-id': 'e99fae54-9bf0-4a59-8b06-7a4b6ecf1479'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:11:41 compute-0 nova_compute[187185]: 2025-11-29 07:11:41.251 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:11:41 compute-0 ovn_controller[95281]: 2025-11-29T07:11:41Z|00229|binding|INFO|Releasing lport e99fae54-9bf0-4a59-8b06-7a4b6ecf1479 from this chassis (sb_readonly=1)
Nov 29 07:11:41 compute-0 nova_compute[187185]: 2025-11-29 07:11:41.252 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:41.254 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9226dea3-6355-4dd9-9441-d093c1f1a399.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9226dea3-6355-4dd9-9441-d093c1f1a399.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:41.255 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[1b966c5b-99bc-4438-8ffc-90942bdb1c34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:41.255 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-9226dea3-6355-4dd9-9441-d093c1f1a399
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/9226dea3-6355-4dd9-9441-d093c1f1a399.pid.haproxy
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID 9226dea3-6355-4dd9-9441-d093c1f1a399
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:11:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:11:41.256 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'env', 'PROCESS_TAG=haproxy-9226dea3-6355-4dd9-9441-d093c1f1a399', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9226dea3-6355-4dd9-9441-d093c1f1a399.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:11:41 compute-0 nova_compute[187185]: 2025-11-29 07:11:41.264 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:11:41 compute-0 podman[227049]: 2025-11-29 07:11:41.653668032 +0000 UTC m=+0.063491126 container create 13c6b60b768007e6ee720972f88461b1ab36e5fd2f0740dd0ff7bfcada0ffbab (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 29 07:11:41 compute-0 systemd[1]: Started libpod-conmon-13c6b60b768007e6ee720972f88461b1ab36e5fd2f0740dd0ff7bfcada0ffbab.scope.
Nov 29 07:11:41 compute-0 podman[227049]: 2025-11-29 07:11:41.611290244 +0000 UTC m=+0.021113358 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:11:41 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:11:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef54d49b48c0bfcebf5e25a7190995aaa8d84a1a7a372ae1d53ab3c58ae28761/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:11:41 compute-0 podman[227049]: 2025-11-29 07:11:41.746625141 +0000 UTC m=+0.156448255 container init 13c6b60b768007e6ee720972f88461b1ab36e5fd2f0740dd0ff7bfcada0ffbab (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:11:41 compute-0 podman[227049]: 2025-11-29 07:11:41.751663033 +0000 UTC m=+0.161486127 container start 13c6b60b768007e6ee720972f88461b1ab36e5fd2f0740dd0ff7bfcada0ffbab (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 07:11:41 compute-0 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[227066]: [NOTICE]   (227070) : New worker (227072) forked
Nov 29 07:11:41 compute-0 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[227066]: [NOTICE]   (227070) : Loading success.
Nov 29 07:11:42 compute-0 nova_compute[187185]: 2025-11-29 07:11:42.836 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:11:42 compute-0 nova_compute[187185]: 2025-11-29 07:11:42.841 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:11:43 compute-0 podman[227081]: 2025-11-29 07:11:43.78959339 +0000 UTC m=+0.059570946 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd)
Nov 29 07:11:43 compute-0 podman[227082]: 2025-11-29 07:11:43.790180106 +0000 UTC m=+0.058103694 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 07:11:44 compute-0 nova_compute[187185]: 2025-11-29 07:11:44.157 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400301.1410706, 5896e8b0-25a2-4075-8ebf-5458b5ed9234 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:11:44 compute-0 nova_compute[187185]: 2025-11-29 07:11:44.158 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] VM Started (Lifecycle Event)
Nov 29 07:11:44 compute-0 nova_compute[187185]: 2025-11-29 07:11:44.789 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:11:45 compute-0 nova_compute[187185]: 2025-11-29 07:11:45.731 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:11:49 compute-0 nova_compute[187185]: 2025-11-29 07:11:49.791 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:11:50 compute-0 nova_compute[187185]: 2025-11-29 07:11:50.735 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:11:52 compute-0 ovn_controller[95281]: 2025-11-29T07:11:52Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:cf:f0:da 10.100.0.6
Nov 29 07:11:54 compute-0 nova_compute[187185]: 2025-11-29 07:11:54.640 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:11:54 compute-0 nova_compute[187185]: 2025-11-29 07:11:54.645 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:11:54 compute-0 nova_compute[187185]: 2025-11-29 07:11:54.792 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:11:55 compute-0 nova_compute[187185]: 2025-11-29 07:11:55.738 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:11:55 compute-0 podman[227131]: 2025-11-29 07:11:55.832699197 +0000 UTC m=+0.079063427 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 07:11:55 compute-0 podman[227132]: 2025-11-29 07:11:55.83279063 +0000 UTC m=+0.081444774 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, maintainer=Red Hat, Inc., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.openshift.expose-services=, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7)
Nov 29 07:11:55 compute-0 podman[227133]: 2025-11-29 07:11:55.837627946 +0000 UTC m=+0.083085330 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 07:11:58 compute-0 nova_compute[187185]: 2025-11-29 07:11:58.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:11:58 compute-0 nova_compute[187185]: 2025-11-29 07:11:58.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:11:58 compute-0 nova_compute[187185]: 2025-11-29 07:11:58.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:11:59 compute-0 nova_compute[187185]: 2025-11-29 07:11:59.794 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:12:00 compute-0 nova_compute[187185]: 2025-11-29 07:12:00.627 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "refresh_cache-5896e8b0-25a2-4075-8ebf-5458b5ed9234" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:12:00 compute-0 nova_compute[187185]: 2025-11-29 07:12:00.628 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquired lock "refresh_cache-5896e8b0-25a2-4075-8ebf-5458b5ed9234" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:12:00 compute-0 nova_compute[187185]: 2025-11-29 07:12:00.628 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 29 07:12:00 compute-0 nova_compute[187185]: 2025-11-29 07:12:00.628 187189 DEBUG nova.objects.instance [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 5896e8b0-25a2-4075-8ebf-5458b5ed9234 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:12:00 compute-0 nova_compute[187185]: 2025-11-29 07:12:00.741 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:12:03 compute-0 nova_compute[187185]: 2025-11-29 07:12:03.900 187189 DEBUG nova.compute.manager [req-fe7b168a-aa50-4302-ab06-58364b7498ee req-00fdaa53-a1e7-41c2-8eb1-8f1feff71022 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Received event network-vif-plugged-69f2ab13-2311-4137-9c26-e256f33759e5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:12:03 compute-0 nova_compute[187185]: 2025-11-29 07:12:03.901 187189 DEBUG oslo_concurrency.lockutils [req-fe7b168a-aa50-4302-ab06-58364b7498ee req-00fdaa53-a1e7-41c2-8eb1-8f1feff71022 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:12:03 compute-0 nova_compute[187185]: 2025-11-29 07:12:03.901 187189 DEBUG oslo_concurrency.lockutils [req-fe7b168a-aa50-4302-ab06-58364b7498ee req-00fdaa53-a1e7-41c2-8eb1-8f1feff71022 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:12:03 compute-0 nova_compute[187185]: 2025-11-29 07:12:03.901 187189 DEBUG oslo_concurrency.lockutils [req-fe7b168a-aa50-4302-ab06-58364b7498ee req-00fdaa53-a1e7-41c2-8eb1-8f1feff71022 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:12:03 compute-0 nova_compute[187185]: 2025-11-29 07:12:03.901 187189 DEBUG nova.compute.manager [req-fe7b168a-aa50-4302-ab06-58364b7498ee req-00fdaa53-a1e7-41c2-8eb1-8f1feff71022 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] No waiting events found dispatching network-vif-plugged-69f2ab13-2311-4137-9c26-e256f33759e5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:12:03 compute-0 nova_compute[187185]: 2025-11-29 07:12:03.902 187189 WARNING nova.compute.manager [req-fe7b168a-aa50-4302-ab06-58364b7498ee req-00fdaa53-a1e7-41c2-8eb1-8f1feff71022 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Received unexpected event network-vif-plugged-69f2ab13-2311-4137-9c26-e256f33759e5 for instance with vm_state active and task_state None.
Nov 29 07:12:04 compute-0 nova_compute[187185]: 2025-11-29 07:12:04.796 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:12:05 compute-0 nova_compute[187185]: 2025-11-29 07:12:05.773 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:12:05 compute-0 podman[227190]: 2025-11-29 07:12:05.875322236 +0000 UTC m=+0.142553440 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, managed_by=edpm_ansible)
Nov 29 07:12:09 compute-0 nova_compute[187185]: 2025-11-29 07:12:09.798 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:12:10 compute-0 nova_compute[187185]: 2025-11-29 07:12:10.775 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:12:11 compute-0 podman[227217]: 2025-11-29 07:12:11.844066193 +0000 UTC m=+0.095467231 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 07:12:14 compute-0 ovn_controller[95281]: 2025-11-29T07:12:14Z|00230|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Nov 29 07:12:14 compute-0 nova_compute[187185]: 2025-11-29 07:12:14.800 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:12:14 compute-0 podman[227242]: 2025-11-29 07:12:14.812715256 +0000 UTC m=+0.066565484 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:12:14 compute-0 podman[227241]: 2025-11-29 07:12:14.843611129 +0000 UTC m=+0.101467270 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, io.buildah.version=1.41.3)
Nov 29 07:12:15 compute-0 nova_compute[187185]: 2025-11-29 07:12:15.807 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:12:19 compute-0 nova_compute[187185]: 2025-11-29 07:12:19.803 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:12:20 compute-0 nova_compute[187185]: 2025-11-29 07:12:20.810 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:12:24 compute-0 nova_compute[187185]: 2025-11-29 07:12:24.060 187189 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 3.09 sec
Nov 29 07:12:24 compute-0 nova_compute[187185]: 2025-11-29 07:12:24.763 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:12:24 compute-0 nova_compute[187185]: 2025-11-29 07:12:24.805 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:12:25 compute-0 nova_compute[187185]: 2025-11-29 07:12:25.176 187189 DEBUG nova.compute.manager [req-76885f0a-cb6d-4e57-9e4e-951e8d35abc7 req-962b8af2-261a-4f4d-abfc-281f0f5ce50f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Received event network-vif-plugged-69f2ab13-2311-4137-9c26-e256f33759e5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:12:25 compute-0 nova_compute[187185]: 2025-11-29 07:12:25.176 187189 DEBUG oslo_concurrency.lockutils [req-76885f0a-cb6d-4e57-9e4e-951e8d35abc7 req-962b8af2-261a-4f4d-abfc-281f0f5ce50f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:12:25 compute-0 nova_compute[187185]: 2025-11-29 07:12:25.177 187189 DEBUG oslo_concurrency.lockutils [req-76885f0a-cb6d-4e57-9e4e-951e8d35abc7 req-962b8af2-261a-4f4d-abfc-281f0f5ce50f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:12:25 compute-0 nova_compute[187185]: 2025-11-29 07:12:25.177 187189 DEBUG oslo_concurrency.lockutils [req-76885f0a-cb6d-4e57-9e4e-951e8d35abc7 req-962b8af2-261a-4f4d-abfc-281f0f5ce50f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:12:25 compute-0 nova_compute[187185]: 2025-11-29 07:12:25.177 187189 DEBUG nova.compute.manager [req-76885f0a-cb6d-4e57-9e4e-951e8d35abc7 req-962b8af2-261a-4f4d-abfc-281f0f5ce50f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] No waiting events found dispatching network-vif-plugged-69f2ab13-2311-4137-9c26-e256f33759e5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:12:25 compute-0 nova_compute[187185]: 2025-11-29 07:12:25.178 187189 WARNING nova.compute.manager [req-76885f0a-cb6d-4e57-9e4e-951e8d35abc7 req-962b8af2-261a-4f4d-abfc-281f0f5ce50f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Received unexpected event network-vif-plugged-69f2ab13-2311-4137-9c26-e256f33759e5 for instance with vm_state active and task_state None.
Nov 29 07:12:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:12:25.502 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:12:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:12:25.503 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:12:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:12:25.504 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:12:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:12:25.505 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:12:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:12:25.505 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:12:25 compute-0 nova_compute[187185]: 2025-11-29 07:12:25.812 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:12:26 compute-0 podman[227285]: 2025-11-29 07:12:26.805820171 +0000 UTC m=+0.064602098 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_id=edpm, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., name=ubi9-minimal, architecture=x86_64, io.buildah.version=1.33.7)
Nov 29 07:12:26 compute-0 podman[227286]: 2025-11-29 07:12:26.810708219 +0000 UTC m=+0.064228027 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 07:12:26 compute-0 podman[227284]: 2025-11-29 07:12:26.81110783 +0000 UTC m=+0.075591678 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:12:29 compute-0 nova_compute[187185]: 2025-11-29 07:12:29.807 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:12:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:12:30.508 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:12:30 compute-0 nova_compute[187185]: 2025-11-29 07:12:30.855 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:12:32 compute-0 nova_compute[187185]: 2025-11-29 07:12:32.157 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Updating instance_info_cache with network_info: [{"id": "69f2ab13-2311-4137-9c26-e256f33759e5", "address": "fa:16:3e:cf:f0:da", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69f2ab13-23", "ovs_interfaceid": "69f2ab13-2311-4137-9c26-e256f33759e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:12:32 compute-0 nova_compute[187185]: 2025-11-29 07:12:32.850 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Releasing lock "refresh_cache-5896e8b0-25a2-4075-8ebf-5458b5ed9234" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:12:32 compute-0 nova_compute[187185]: 2025-11-29 07:12:32.850 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 29 07:12:32 compute-0 nova_compute[187185]: 2025-11-29 07:12:32.850 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:12:32 compute-0 nova_compute[187185]: 2025-11-29 07:12:32.851 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:12:32 compute-0 nova_compute[187185]: 2025-11-29 07:12:32.851 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:12:32 compute-0 nova_compute[187185]: 2025-11-29 07:12:32.851 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:12:32 compute-0 nova_compute[187185]: 2025-11-29 07:12:32.851 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:12:32 compute-0 nova_compute[187185]: 2025-11-29 07:12:32.851 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:12:32 compute-0 nova_compute[187185]: 2025-11-29 07:12:32.851 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:12:32 compute-0 nova_compute[187185]: 2025-11-29 07:12:32.851 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:12:33 compute-0 nova_compute[187185]: 2025-11-29 07:12:33.010 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:12:33 compute-0 nova_compute[187185]: 2025-11-29 07:12:33.011 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:12:33 compute-0 nova_compute[187185]: 2025-11-29 07:12:33.011 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:12:33 compute-0 nova_compute[187185]: 2025-11-29 07:12:33.012 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:12:34 compute-0 nova_compute[187185]: 2025-11-29 07:12:34.124 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:12:34 compute-0 nova_compute[187185]: 2025-11-29 07:12:34.213 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:12:34 compute-0 nova_compute[187185]: 2025-11-29 07:12:34.214 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:12:34 compute-0 nova_compute[187185]: 2025-11-29 07:12:34.275 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:12:34 compute-0 nova_compute[187185]: 2025-11-29 07:12:34.436 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:12:34 compute-0 nova_compute[187185]: 2025-11-29 07:12:34.438 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5554MB free_disk=73.26740646362305GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:12:34 compute-0 nova_compute[187185]: 2025-11-29 07:12:34.438 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:12:34 compute-0 nova_compute[187185]: 2025-11-29 07:12:34.438 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:12:34 compute-0 nova_compute[187185]: 2025-11-29 07:12:34.598 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance 5896e8b0-25a2-4075-8ebf-5458b5ed9234 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 07:12:34 compute-0 nova_compute[187185]: 2025-11-29 07:12:34.599 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:12:34 compute-0 nova_compute[187185]: 2025-11-29 07:12:34.599 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:12:34 compute-0 nova_compute[187185]: 2025-11-29 07:12:34.719 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:12:34 compute-0 nova_compute[187185]: 2025-11-29 07:12:34.746 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:12:34 compute-0 nova_compute[187185]: 2025-11-29 07:12:34.778 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:12:34 compute-0 nova_compute[187185]: 2025-11-29 07:12:34.779 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.341s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:12:34 compute-0 nova_compute[187185]: 2025-11-29 07:12:34.780 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:12:34 compute-0 nova_compute[187185]: 2025-11-29 07:12:34.809 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:12:34 compute-0 nova_compute[187185]: 2025-11-29 07:12:34.814 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:12:34 compute-0 nova_compute[187185]: 2025-11-29 07:12:34.814 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 29 07:12:34 compute-0 nova_compute[187185]: 2025-11-29 07:12:34.834 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 29 07:12:34 compute-0 nova_compute[187185]: 2025-11-29 07:12:34.835 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:12:34 compute-0 nova_compute[187185]: 2025-11-29 07:12:34.835 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 29 07:12:34 compute-0 nova_compute[187185]: 2025-11-29 07:12:34.862 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:12:35 compute-0 nova_compute[187185]: 2025-11-29 07:12:35.859 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:12:36 compute-0 podman[227351]: 2025-11-29 07:12:36.877879606 +0000 UTC m=+0.135138652 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 07:12:37 compute-0 nova_compute[187185]: 2025-11-29 07:12:37.805 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:12:39 compute-0 nova_compute[187185]: 2025-11-29 07:12:39.812 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:12:40 compute-0 nova_compute[187185]: 2025-11-29 07:12:40.862 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:12:40 compute-0 nova_compute[187185]: 2025-11-29 07:12:40.888 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:12:40 compute-0 nova_compute[187185]: 2025-11-29 07:12:40.889 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:12:42 compute-0 podman[227379]: 2025-11-29 07:12:42.828415077 +0000 UTC m=+0.081292390 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 07:12:42 compute-0 nova_compute[187185]: 2025-11-29 07:12:42.900 187189 DEBUG nova.objects.instance [None req-fabcd282-d3eb-4dbe-bc57-477614573d79 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'pci_devices' on Instance uuid 5896e8b0-25a2-4075-8ebf-5458b5ed9234 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:12:43 compute-0 nova_compute[187185]: 2025-11-29 07:12:43.075 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400363.0746467, 5896e8b0-25a2-4075-8ebf-5458b5ed9234 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:12:43 compute-0 nova_compute[187185]: 2025-11-29 07:12:43.075 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] VM Paused (Lifecycle Event)
Nov 29 07:12:43 compute-0 nova_compute[187185]: 2025-11-29 07:12:43.555 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:12:43 compute-0 nova_compute[187185]: 2025-11-29 07:12:43.558 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:12:43 compute-0 nova_compute[187185]: 2025-11-29 07:12:43.944 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] During sync_power_state the instance has a pending task (suspending). Skip.
Nov 29 07:12:44 compute-0 kernel: tap69f2ab13-23 (unregistering): left promiscuous mode
Nov 29 07:12:44 compute-0 NetworkManager[55227]: <info>  [1764400364.4530] device (tap69f2ab13-23): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:12:44 compute-0 ovn_controller[95281]: 2025-11-29T07:12:44Z|00231|binding|INFO|Releasing lport 69f2ab13-2311-4137-9c26-e256f33759e5 from this chassis (sb_readonly=0)
Nov 29 07:12:44 compute-0 ovn_controller[95281]: 2025-11-29T07:12:44Z|00232|binding|INFO|Setting lport 69f2ab13-2311-4137-9c26-e256f33759e5 down in Southbound
Nov 29 07:12:44 compute-0 ovn_controller[95281]: 2025-11-29T07:12:44Z|00233|binding|INFO|Removing iface tap69f2ab13-23 ovn-installed in OVS
Nov 29 07:12:44 compute-0 nova_compute[187185]: 2025-11-29 07:12:44.489 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:12:44 compute-0 nova_compute[187185]: 2025-11-29 07:12:44.505 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:12:44 compute-0 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d0000005e.scope: Deactivated successfully.
Nov 29 07:12:44 compute-0 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d0000005e.scope: Consumed 14.370s CPU time.
Nov 29 07:12:44 compute-0 systemd-machined[153486]: Machine qemu-33-instance-0000005e terminated.
Nov 29 07:12:44 compute-0 nova_compute[187185]: 2025-11-29 07:12:44.576 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:12:44 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:12:44.602 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cf:f0:da 10.100.0.6'], port_security=['fa:16:3e:cf:f0:da 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9226dea3-6355-4dd9-9441-d093c1f1a399', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e6c366001df43fb91731faf7a9578fc', 'neutron:revision_number': '6', 'neutron:security_group_ids': '1ab550ec-f542-4b91-8354-868da408e26d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.225', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d2b9cdf8-2516-48a0-81c5-85c2327075d5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=69f2ab13-2311-4137-9c26-e256f33759e5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:12:44 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:12:44.605 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 69f2ab13-2311-4137-9c26-e256f33759e5 in datapath 9226dea3-6355-4dd9-9441-d093c1f1a399 unbound from our chassis
Nov 29 07:12:44 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:12:44.606 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9226dea3-6355-4dd9-9441-d093c1f1a399, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:12:44 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:12:44.609 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[8f1593f3-2974-4755-98e8-447ffdfe7344]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:12:44 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:12:44.609 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 namespace which is not needed anymore
Nov 29 07:12:44 compute-0 nova_compute[187185]: 2025-11-29 07:12:44.714 187189 DEBUG nova.compute.manager [None req-fabcd282-d3eb-4dbe-bc57-477614573d79 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:12:44 compute-0 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[227066]: [NOTICE]   (227070) : haproxy version is 2.8.14-c23fe91
Nov 29 07:12:44 compute-0 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[227066]: [NOTICE]   (227070) : path to executable is /usr/sbin/haproxy
Nov 29 07:12:44 compute-0 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[227066]: [WARNING]  (227070) : Exiting Master process...
Nov 29 07:12:44 compute-0 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[227066]: [ALERT]    (227070) : Current worker (227072) exited with code 143 (Terminated)
Nov 29 07:12:44 compute-0 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[227066]: [WARNING]  (227070) : All workers exited. Exiting... (0)
Nov 29 07:12:44 compute-0 systemd[1]: libpod-13c6b60b768007e6ee720972f88461b1ab36e5fd2f0740dd0ff7bfcada0ffbab.scope: Deactivated successfully.
Nov 29 07:12:44 compute-0 podman[227444]: 2025-11-29 07:12:44.800051608 +0000 UTC m=+0.052559897 container died 13c6b60b768007e6ee720972f88461b1ab36e5fd2f0740dd0ff7bfcada0ffbab (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:12:44 compute-0 nova_compute[187185]: 2025-11-29 07:12:44.814 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:12:44 compute-0 systemd[1]: var-lib-containers-storage-overlay-ef54d49b48c0bfcebf5e25a7190995aaa8d84a1a7a372ae1d53ab3c58ae28761-merged.mount: Deactivated successfully.
Nov 29 07:12:44 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-13c6b60b768007e6ee720972f88461b1ab36e5fd2f0740dd0ff7bfcada0ffbab-userdata-shm.mount: Deactivated successfully.
Nov 29 07:12:44 compute-0 podman[227444]: 2025-11-29 07:12:44.853113558 +0000 UTC m=+0.105621847 container cleanup 13c6b60b768007e6ee720972f88461b1ab36e5fd2f0740dd0ff7bfcada0ffbab (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 29 07:12:44 compute-0 systemd[1]: libpod-conmon-13c6b60b768007e6ee720972f88461b1ab36e5fd2f0740dd0ff7bfcada0ffbab.scope: Deactivated successfully.
Nov 29 07:12:44 compute-0 nova_compute[187185]: 2025-11-29 07:12:44.889 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Triggering sync for uuid 5896e8b0-25a2-4075-8ebf-5458b5ed9234 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Nov 29 07:12:44 compute-0 nova_compute[187185]: 2025-11-29 07:12:44.889 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:12:44 compute-0 nova_compute[187185]: 2025-11-29 07:12:44.890 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:12:44 compute-0 nova_compute[187185]: 2025-11-29 07:12:44.890 187189 INFO nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] During sync_power_state the instance has a pending task (suspending). Skip.
Nov 29 07:12:44 compute-0 nova_compute[187185]: 2025-11-29 07:12:44.891 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:12:44 compute-0 podman[227482]: 2025-11-29 07:12:44.930085235 +0000 UTC m=+0.053014750 container remove 13c6b60b768007e6ee720972f88461b1ab36e5fd2f0740dd0ff7bfcada0ffbab (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 07:12:44 compute-0 podman[227475]: 2025-11-29 07:12:44.931122964 +0000 UTC m=+0.063726733 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 07:12:44 compute-0 podman[227473]: 2025-11-29 07:12:44.933686027 +0000 UTC m=+0.075438745 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 29 07:12:44 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:12:44.935 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[8ae3a28a-b990-4ffd-911e-07875a74c893]: (4, ('Sat Nov 29 07:12:44 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 (13c6b60b768007e6ee720972f88461b1ab36e5fd2f0740dd0ff7bfcada0ffbab)\n13c6b60b768007e6ee720972f88461b1ab36e5fd2f0740dd0ff7bfcada0ffbab\nSat Nov 29 07:12:44 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 (13c6b60b768007e6ee720972f88461b1ab36e5fd2f0740dd0ff7bfcada0ffbab)\n13c6b60b768007e6ee720972f88461b1ab36e5fd2f0740dd0ff7bfcada0ffbab\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:12:44 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:12:44.937 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[2477c31a-32e9-4962-ad64-4fddb9bdcc40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:12:44 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:12:44.940 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9226dea3-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:12:44 compute-0 nova_compute[187185]: 2025-11-29 07:12:44.942 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:12:44 compute-0 kernel: tap9226dea3-60: left promiscuous mode
Nov 29 07:12:44 compute-0 nova_compute[187185]: 2025-11-29 07:12:44.964 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:12:44 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:12:44.968 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[93472c1e-a3c0-44f6-951d-59d7fd44b6a6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:12:44 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:12:44.985 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[0bf131eb-9611-4cd6-85af-7787f6039008]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:12:44 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:12:44.986 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[1b41768b-71a4-4310-a100-6f7e3a49f13c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:12:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:12:45.003 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[b27f2da6-2151-4fb2-8779-4c131adb3c80]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 574868, 'reachable_time': 23901, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227530, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:12:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:12:45.007 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:12:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:12:45.008 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[a4bd2ec1-3f4e-4d6d-b531-55eeb2d51c11]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:12:45 compute-0 systemd[1]: run-netns-ovnmeta\x2d9226dea3\x2d6355\x2d4dd9\x2d9441\x2dd093c1f1a399.mount: Deactivated successfully.
Nov 29 07:12:45 compute-0 nova_compute[187185]: 2025-11-29 07:12:45.864 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:12:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:12:47.999 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234', 'name': 'tempest-ServerActionsTestJSON-server-878694992', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000005e', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': '6e6c366001df43fb91731faf7a9578fc', 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'hostId': '5640fa721172a4d7bf83648244233a9823df778da5b3ab8028ef92fd', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 07:12:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:12:48.000 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 07:12:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:12:48.002 12 DEBUG ceilometer.compute.pollsters [-] Instance 5896e8b0-25a2-4075-8ebf-5458b5ed9234 was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-0000005e, id=5896e8b0-25a2-4075-8ebf-5458b5ed9234>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 07:12:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:12:48.002 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 07:12:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:12:48.003 12 DEBUG ceilometer.compute.pollsters [-] Instance 5896e8b0-25a2-4075-8ebf-5458b5ed9234 was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-0000005e, id=5896e8b0-25a2-4075-8ebf-5458b5ed9234>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 07:12:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:12:48.004 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 07:12:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:12:48.005 12 DEBUG ceilometer.compute.pollsters [-] Instance 5896e8b0-25a2-4075-8ebf-5458b5ed9234 was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-0000005e, id=5896e8b0-25a2-4075-8ebf-5458b5ed9234>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 07:12:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:12:48.005 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 07:12:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:12:48.006 12 DEBUG ceilometer.compute.pollsters [-] Instance 5896e8b0-25a2-4075-8ebf-5458b5ed9234 was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-0000005e, id=5896e8b0-25a2-4075-8ebf-5458b5ed9234>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 07:12:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:12:48.007 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 07:12:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:12:48.008 12 DEBUG ceilometer.compute.pollsters [-] Instance 5896e8b0-25a2-4075-8ebf-5458b5ed9234 was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-0000005e, id=5896e8b0-25a2-4075-8ebf-5458b5ed9234>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 07:12:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:12:48.008 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 07:12:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:12:48.010 12 DEBUG ceilometer.compute.pollsters [-] Instance 5896e8b0-25a2-4075-8ebf-5458b5ed9234 was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-0000005e, id=5896e8b0-25a2-4075-8ebf-5458b5ed9234>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 07:12:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:12:48.010 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 07:12:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:12:48.012 12 DEBUG ceilometer.compute.pollsters [-] Instance 5896e8b0-25a2-4075-8ebf-5458b5ed9234 was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-0000005e, id=5896e8b0-25a2-4075-8ebf-5458b5ed9234>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 07:12:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:12:48.013 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 07:12:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:12:48.014 12 DEBUG ceilometer.compute.pollsters [-] Instance 5896e8b0-25a2-4075-8ebf-5458b5ed9234 was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-0000005e, id=5896e8b0-25a2-4075-8ebf-5458b5ed9234>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 07:12:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:12:48.014 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 07:12:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:12:48.015 12 DEBUG ceilometer.compute.pollsters [-] Instance 5896e8b0-25a2-4075-8ebf-5458b5ed9234 was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-0000005e, id=5896e8b0-25a2-4075-8ebf-5458b5ed9234>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 07:12:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:12:48.015 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 07:12:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:12:48.016 12 DEBUG ceilometer.compute.pollsters [-] Instance 5896e8b0-25a2-4075-8ebf-5458b5ed9234 was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-0000005e, id=5896e8b0-25a2-4075-8ebf-5458b5ed9234>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 07:12:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:12:48.016 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 07:12:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:12:48.017 12 DEBUG ceilometer.compute.pollsters [-] Instance 5896e8b0-25a2-4075-8ebf-5458b5ed9234 was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-0000005e, id=5896e8b0-25a2-4075-8ebf-5458b5ed9234>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 07:12:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:12:48.018 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 07:12:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:12:48.019 12 DEBUG ceilometer.compute.pollsters [-] Instance 5896e8b0-25a2-4075-8ebf-5458b5ed9234 was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-0000005e, id=5896e8b0-25a2-4075-8ebf-5458b5ed9234>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 07:12:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:12:48.019 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 07:12:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:12:48.020 12 DEBUG ceilometer.compute.pollsters [-] Instance 5896e8b0-25a2-4075-8ebf-5458b5ed9234 was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-0000005e, id=5896e8b0-25a2-4075-8ebf-5458b5ed9234>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 07:12:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:12:48.020 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 07:12:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:12:48.021 12 DEBUG ceilometer.compute.pollsters [-] Instance 5896e8b0-25a2-4075-8ebf-5458b5ed9234 was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-0000005e, id=5896e8b0-25a2-4075-8ebf-5458b5ed9234>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 07:12:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:12:48.021 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 07:12:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:12:48.022 12 DEBUG ceilometer.compute.pollsters [-] Instance 5896e8b0-25a2-4075-8ebf-5458b5ed9234 was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-0000005e, id=5896e8b0-25a2-4075-8ebf-5458b5ed9234>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 07:12:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:12:48.022 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:12:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:12:48.023 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 07:12:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:12:48.024 12 DEBUG ceilometer.compute.pollsters [-] Instance 5896e8b0-25a2-4075-8ebf-5458b5ed9234 was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-0000005e, id=5896e8b0-25a2-4075-8ebf-5458b5ed9234>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 07:12:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:12:48.024 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:12:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:12:48.024 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 07:12:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:12:48.025 12 DEBUG ceilometer.compute.pollsters [-] Instance 5896e8b0-25a2-4075-8ebf-5458b5ed9234 was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-0000005e, id=5896e8b0-25a2-4075-8ebf-5458b5ed9234>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 07:12:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:12:48.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:12:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:12:48.025 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 07:12:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:12:48.026 12 DEBUG ceilometer.compute.pollsters [-] Instance 5896e8b0-25a2-4075-8ebf-5458b5ed9234 was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-0000005e, id=5896e8b0-25a2-4075-8ebf-5458b5ed9234>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 07:12:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:12:48.027 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 07:12:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:12:48.028 12 DEBUG ceilometer.compute.pollsters [-] Instance 5896e8b0-25a2-4075-8ebf-5458b5ed9234 was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-0000005e, id=5896e8b0-25a2-4075-8ebf-5458b5ed9234>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 07:12:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:12:48.028 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:12:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:12:48.028 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 07:12:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:12:48.029 12 DEBUG ceilometer.compute.pollsters [-] Instance 5896e8b0-25a2-4075-8ebf-5458b5ed9234 was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-0000005e, id=5896e8b0-25a2-4075-8ebf-5458b5ed9234>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 07:12:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:12:48.029 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 07:12:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:12:48.030 12 DEBUG ceilometer.compute.pollsters [-] Instance 5896e8b0-25a2-4075-8ebf-5458b5ed9234 was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-0000005e, id=5896e8b0-25a2-4075-8ebf-5458b5ed9234>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 07:12:48 compute-0 nova_compute[187185]: 2025-11-29 07:12:48.791 187189 DEBUG nova.compute.manager [req-8e29eae5-b3eb-453a-9d97-b71eb42a909c req-2f7afa1c-fc19-49cb-923a-1598af599089 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Received event network-vif-unplugged-69f2ab13-2311-4137-9c26-e256f33759e5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:12:48 compute-0 nova_compute[187185]: 2025-11-29 07:12:48.792 187189 DEBUG oslo_concurrency.lockutils [req-8e29eae5-b3eb-453a-9d97-b71eb42a909c req-2f7afa1c-fc19-49cb-923a-1598af599089 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:12:48 compute-0 nova_compute[187185]: 2025-11-29 07:12:48.793 187189 DEBUG oslo_concurrency.lockutils [req-8e29eae5-b3eb-453a-9d97-b71eb42a909c req-2f7afa1c-fc19-49cb-923a-1598af599089 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:12:48 compute-0 nova_compute[187185]: 2025-11-29 07:12:48.793 187189 DEBUG oslo_concurrency.lockutils [req-8e29eae5-b3eb-453a-9d97-b71eb42a909c req-2f7afa1c-fc19-49cb-923a-1598af599089 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:12:48 compute-0 nova_compute[187185]: 2025-11-29 07:12:48.794 187189 DEBUG nova.compute.manager [req-8e29eae5-b3eb-453a-9d97-b71eb42a909c req-2f7afa1c-fc19-49cb-923a-1598af599089 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] No waiting events found dispatching network-vif-unplugged-69f2ab13-2311-4137-9c26-e256f33759e5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:12:48 compute-0 nova_compute[187185]: 2025-11-29 07:12:48.794 187189 WARNING nova.compute.manager [req-8e29eae5-b3eb-453a-9d97-b71eb42a909c req-2f7afa1c-fc19-49cb-923a-1598af599089 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Received unexpected event network-vif-unplugged-69f2ab13-2311-4137-9c26-e256f33759e5 for instance with vm_state suspended and task_state None.
Nov 29 07:12:49 compute-0 nova_compute[187185]: 2025-11-29 07:12:49.817 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:12:50 compute-0 nova_compute[187185]: 2025-11-29 07:12:50.900 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:12:51 compute-0 nova_compute[187185]: 2025-11-29 07:12:51.558 187189 DEBUG nova.compute.manager [req-fc73a997-381a-4204-9b72-344bf626485f req-db70b0b8-9967-4c10-853c-1731c55b5bd1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Received event network-vif-plugged-69f2ab13-2311-4137-9c26-e256f33759e5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:12:51 compute-0 nova_compute[187185]: 2025-11-29 07:12:51.558 187189 DEBUG oslo_concurrency.lockutils [req-fc73a997-381a-4204-9b72-344bf626485f req-db70b0b8-9967-4c10-853c-1731c55b5bd1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:12:51 compute-0 nova_compute[187185]: 2025-11-29 07:12:51.559 187189 DEBUG oslo_concurrency.lockutils [req-fc73a997-381a-4204-9b72-344bf626485f req-db70b0b8-9967-4c10-853c-1731c55b5bd1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:12:51 compute-0 nova_compute[187185]: 2025-11-29 07:12:51.559 187189 DEBUG oslo_concurrency.lockutils [req-fc73a997-381a-4204-9b72-344bf626485f req-db70b0b8-9967-4c10-853c-1731c55b5bd1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:12:51 compute-0 nova_compute[187185]: 2025-11-29 07:12:51.559 187189 DEBUG nova.compute.manager [req-fc73a997-381a-4204-9b72-344bf626485f req-db70b0b8-9967-4c10-853c-1731c55b5bd1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] No waiting events found dispatching network-vif-plugged-69f2ab13-2311-4137-9c26-e256f33759e5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:12:51 compute-0 nova_compute[187185]: 2025-11-29 07:12:51.559 187189 WARNING nova.compute.manager [req-fc73a997-381a-4204-9b72-344bf626485f req-db70b0b8-9967-4c10-853c-1731c55b5bd1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Received unexpected event network-vif-plugged-69f2ab13-2311-4137-9c26-e256f33759e5 for instance with vm_state suspended and task_state resuming.
Nov 29 07:12:51 compute-0 nova_compute[187185]: 2025-11-29 07:12:51.561 187189 INFO nova.compute.manager [None req-8c4bfe24-57ea-4052-be4a-f5ee5f45cfca e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Resuming
Nov 29 07:12:51 compute-0 nova_compute[187185]: 2025-11-29 07:12:51.562 187189 DEBUG nova.objects.instance [None req-8c4bfe24-57ea-4052-be4a-f5ee5f45cfca e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'flavor' on Instance uuid 5896e8b0-25a2-4075-8ebf-5458b5ed9234 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:12:52 compute-0 nova_compute[187185]: 2025-11-29 07:12:52.885 187189 DEBUG oslo_concurrency.lockutils [None req-8c4bfe24-57ea-4052-be4a-f5ee5f45cfca e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "refresh_cache-5896e8b0-25a2-4075-8ebf-5458b5ed9234" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:12:52 compute-0 nova_compute[187185]: 2025-11-29 07:12:52.886 187189 DEBUG oslo_concurrency.lockutils [None req-8c4bfe24-57ea-4052-be4a-f5ee5f45cfca e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquired lock "refresh_cache-5896e8b0-25a2-4075-8ebf-5458b5ed9234" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:12:52 compute-0 nova_compute[187185]: 2025-11-29 07:12:52.887 187189 DEBUG nova.network.neutron [None req-8c4bfe24-57ea-4052-be4a-f5ee5f45cfca e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:12:54 compute-0 nova_compute[187185]: 2025-11-29 07:12:54.819 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:12:55 compute-0 nova_compute[187185]: 2025-11-29 07:12:55.902 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:12:57 compute-0 nova_compute[187185]: 2025-11-29 07:12:57.669 187189 DEBUG nova.network.neutron [None req-8c4bfe24-57ea-4052-be4a-f5ee5f45cfca e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Updating instance_info_cache with network_info: [{"id": "69f2ab13-2311-4137-9c26-e256f33759e5", "address": "fa:16:3e:cf:f0:da", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69f2ab13-23", "ovs_interfaceid": "69f2ab13-2311-4137-9c26-e256f33759e5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:12:57 compute-0 podman[227531]: 2025-11-29 07:12:57.813442013 +0000 UTC m=+0.070723681 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Nov 29 07:12:57 compute-0 podman[227533]: 2025-11-29 07:12:57.830101904 +0000 UTC m=+0.072758358 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 07:12:57 compute-0 podman[227532]: 2025-11-29 07:12:57.850963884 +0000 UTC m=+0.105165035 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, version=9.6, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9)
Nov 29 07:12:59 compute-0 nova_compute[187185]: 2025-11-29 07:12:59.360 187189 DEBUG oslo_concurrency.lockutils [None req-8c4bfe24-57ea-4052-be4a-f5ee5f45cfca e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Releasing lock "refresh_cache-5896e8b0-25a2-4075-8ebf-5458b5ed9234" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:12:59 compute-0 nova_compute[187185]: 2025-11-29 07:12:59.365 187189 DEBUG nova.virt.libvirt.vif [None req-8c4bfe24-57ea-4052-be4a-f5ee5f45cfca e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:10:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-878694992',display_name='tempest-ServerActionsTestJSON-server-878694992',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-878694992',id=94,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCMk4ml8b3fuOLAahNu+FCDl9v+SWDrsolrsZ2S6Gt/612rFnCoMRUdrLWExAZcapVyOr30SCUoxNsWGg9cONHUEcP39CbtnDaooW7qgHPKQnqdGyUHB5iGAkkvgw7SerQ==',key_name='tempest-keypair-11305483',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:10:40Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='6e6c366001df43fb91731faf7a9578fc',ramdisk_id='',reservation_id='r-j3wprymf',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-157226036',owner_user_name='tempest-ServerActionsTestJSON-157226036-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:12:44Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e1b8fbcc8caa4d94b69570f233c56d18',uuid=5896e8b0-25a2-4075-8ebf-5458b5ed9234,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "69f2ab13-2311-4137-9c26-e256f33759e5", "address": "fa:16:3e:cf:f0:da", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69f2ab13-23", "ovs_interfaceid": "69f2ab13-2311-4137-9c26-e256f33759e5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:12:59 compute-0 nova_compute[187185]: 2025-11-29 07:12:59.366 187189 DEBUG nova.network.os_vif_util [None req-8c4bfe24-57ea-4052-be4a-f5ee5f45cfca e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converting VIF {"id": "69f2ab13-2311-4137-9c26-e256f33759e5", "address": "fa:16:3e:cf:f0:da", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69f2ab13-23", "ovs_interfaceid": "69f2ab13-2311-4137-9c26-e256f33759e5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:12:59 compute-0 nova_compute[187185]: 2025-11-29 07:12:59.366 187189 DEBUG nova.network.os_vif_util [None req-8c4bfe24-57ea-4052-be4a-f5ee5f45cfca e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cf:f0:da,bridge_name='br-int',has_traffic_filtering=True,id=69f2ab13-2311-4137-9c26-e256f33759e5,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69f2ab13-23') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:12:59 compute-0 nova_compute[187185]: 2025-11-29 07:12:59.366 187189 DEBUG os_vif [None req-8c4bfe24-57ea-4052-be4a-f5ee5f45cfca e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cf:f0:da,bridge_name='br-int',has_traffic_filtering=True,id=69f2ab13-2311-4137-9c26-e256f33759e5,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69f2ab13-23') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:12:59 compute-0 nova_compute[187185]: 2025-11-29 07:12:59.367 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:12:59 compute-0 nova_compute[187185]: 2025-11-29 07:12:59.367 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:12:59 compute-0 nova_compute[187185]: 2025-11-29 07:12:59.368 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:12:59 compute-0 nova_compute[187185]: 2025-11-29 07:12:59.372 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:12:59 compute-0 nova_compute[187185]: 2025-11-29 07:12:59.372 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap69f2ab13-23, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:12:59 compute-0 nova_compute[187185]: 2025-11-29 07:12:59.372 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap69f2ab13-23, col_values=(('external_ids', {'iface-id': '69f2ab13-2311-4137-9c26-e256f33759e5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cf:f0:da', 'vm-uuid': '5896e8b0-25a2-4075-8ebf-5458b5ed9234'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:12:59 compute-0 nova_compute[187185]: 2025-11-29 07:12:59.373 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:12:59 compute-0 nova_compute[187185]: 2025-11-29 07:12:59.373 187189 INFO os_vif [None req-8c4bfe24-57ea-4052-be4a-f5ee5f45cfca e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cf:f0:da,bridge_name='br-int',has_traffic_filtering=True,id=69f2ab13-2311-4137-9c26-e256f33759e5,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69f2ab13-23')
Nov 29 07:12:59 compute-0 nova_compute[187185]: 2025-11-29 07:12:59.409 187189 DEBUG nova.objects.instance [None req-8c4bfe24-57ea-4052-be4a-f5ee5f45cfca e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'numa_topology' on Instance uuid 5896e8b0-25a2-4075-8ebf-5458b5ed9234 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:12:59 compute-0 nova_compute[187185]: 2025-11-29 07:12:59.716 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400364.7137756, 5896e8b0-25a2-4075-8ebf-5458b5ed9234 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:12:59 compute-0 nova_compute[187185]: 2025-11-29 07:12:59.717 187189 INFO nova.compute.manager [-] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] VM Stopped (Lifecycle Event)
Nov 29 07:12:59 compute-0 nova_compute[187185]: 2025-11-29 07:12:59.822 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:12:59 compute-0 nova_compute[187185]: 2025-11-29 07:12:59.977 187189 DEBUG nova.compute.manager [None req-4d3e057d-8943-4fe7-aee9-4735b86a8da2 - - - - - -] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:13:00 compute-0 kernel: tap69f2ab13-23: entered promiscuous mode
Nov 29 07:13:00 compute-0 NetworkManager[55227]: <info>  [1764400380.0438] manager: (tap69f2ab13-23): new Tun device (/org/freedesktop/NetworkManager/Devices/125)
Nov 29 07:13:00 compute-0 ovn_controller[95281]: 2025-11-29T07:13:00Z|00234|binding|INFO|Claiming lport 69f2ab13-2311-4137-9c26-e256f33759e5 for this chassis.
Nov 29 07:13:00 compute-0 nova_compute[187185]: 2025-11-29 07:13:00.043 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:13:00 compute-0 ovn_controller[95281]: 2025-11-29T07:13:00Z|00235|binding|INFO|69f2ab13-2311-4137-9c26-e256f33759e5: Claiming fa:16:3e:cf:f0:da 10.100.0.6
Nov 29 07:13:00 compute-0 nova_compute[187185]: 2025-11-29 07:13:00.047 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:13:00 compute-0 ovn_controller[95281]: 2025-11-29T07:13:00Z|00236|binding|INFO|Setting lport 69f2ab13-2311-4137-9c26-e256f33759e5 ovn-installed in OVS
Nov 29 07:13:00 compute-0 nova_compute[187185]: 2025-11-29 07:13:00.057 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:13:00 compute-0 ovn_controller[95281]: 2025-11-29T07:13:00Z|00237|binding|INFO|Setting lport 69f2ab13-2311-4137-9c26-e256f33759e5 up in Southbound
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:00.064 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cf:f0:da 10.100.0.6'], port_security=['fa:16:3e:cf:f0:da 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9226dea3-6355-4dd9-9441-d093c1f1a399', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e6c366001df43fb91731faf7a9578fc', 'neutron:revision_number': '7', 'neutron:security_group_ids': '1ab550ec-f542-4b91-8354-868da408e26d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.225'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d2b9cdf8-2516-48a0-81c5-85c2327075d5, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=69f2ab13-2311-4137-9c26-e256f33759e5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:00.065 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 69f2ab13-2311-4137-9c26-e256f33759e5 in datapath 9226dea3-6355-4dd9-9441-d093c1f1a399 bound to our chassis
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:00.067 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9226dea3-6355-4dd9-9441-d093c1f1a399
Nov 29 07:13:00 compute-0 systemd-udevd[227609]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:00.079 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[0630a48b-e379-41e6-a139-21d7ea4d2439]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:00.080 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9226dea3-61 in ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:00.083 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9226dea3-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:00.083 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6b693eff-c976-4b4c-a36c-14d85e95e5de]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:00.084 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[432206e5-11a2-43ee-8311-d68af2c676a1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:13:00 compute-0 NetworkManager[55227]: <info>  [1764400380.0933] device (tap69f2ab13-23): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:13:00 compute-0 NetworkManager[55227]: <info>  [1764400380.0942] device (tap69f2ab13-23): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:13:00 compute-0 systemd-machined[153486]: New machine qemu-34-instance-0000005e.
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:00.098 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[c4c5ccd8-43ee-4c26-a161-7023f1dda403]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:13:00 compute-0 systemd[1]: Started Virtual Machine qemu-34-instance-0000005e.
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:00.116 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[4475b150-0a94-4eed-aad0-f0fcfc1c6433]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:00.146 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[877a4323-063f-40a3-b50d-8038f8522b82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:00.152 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[9b30564f-bb14-49a0-8e91-5a07d8e50c98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:13:00 compute-0 NetworkManager[55227]: <info>  [1764400380.1538] manager: (tap9226dea3-60): new Veth device (/org/freedesktop/NetworkManager/Devices/126)
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:00.201 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[8a4ae2cf-fcf6-472c-87e6-1882b8420b17]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:00.204 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[ad2c16cb-ed41-4736-95e7-bde830cbd13f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:13:00 compute-0 NetworkManager[55227]: <info>  [1764400380.2295] device (tap9226dea3-60): carrier: link connected
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:00.240 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[3c148e79-58b8-4e9e-95a8-1156ea81cf9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:00.262 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[53c2c2db-baf2-4864-b22d-756abeb710d7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9226dea3-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cb:49:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 78], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 582789, 'reachable_time': 19115, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227644, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:00.281 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6d600702-e82f-4e20-8d82-571dfe6d3db2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecb:493d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 582789, 'tstamp': 582789}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227645, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:00.298 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[dca0ebe9-aa24-451d-aaad-ba854bba77df]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9226dea3-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cb:49:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 78], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 582789, 'reachable_time': 19115, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227646, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:00.339 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d789812c-5eaa-4647-821f-19371896184a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:00.423 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[b2d22397-de28-432a-ace5-09a6e8fd1e8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:00.425 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9226dea3-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:00.425 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:00.426 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9226dea3-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:13:00 compute-0 kernel: tap9226dea3-60: entered promiscuous mode
Nov 29 07:13:00 compute-0 NetworkManager[55227]: <info>  [1764400380.4303] manager: (tap9226dea3-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/127)
Nov 29 07:13:00 compute-0 nova_compute[187185]: 2025-11-29 07:13:00.429 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:00.435 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9226dea3-60, col_values=(('external_ids', {'iface-id': 'e99fae54-9bf0-4a59-8b06-7a4b6ecf1479'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:13:00 compute-0 ovn_controller[95281]: 2025-11-29T07:13:00Z|00238|binding|INFO|Releasing lport e99fae54-9bf0-4a59-8b06-7a4b6ecf1479 from this chassis (sb_readonly=0)
Nov 29 07:13:00 compute-0 nova_compute[187185]: 2025-11-29 07:13:00.437 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:13:00 compute-0 nova_compute[187185]: 2025-11-29 07:13:00.458 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:00.460 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9226dea3-6355-4dd9-9441-d093c1f1a399.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9226dea3-6355-4dd9-9441-d093c1f1a399.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:00.461 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[3c6d5ecb-9c19-4f96-9822-d15b2b26f46e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:00.462 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-9226dea3-6355-4dd9-9441-d093c1f1a399
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/9226dea3-6355-4dd9-9441-d093c1f1a399.pid.haproxy
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID 9226dea3-6355-4dd9-9441-d093c1f1a399
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:13:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:00.463 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'env', 'PROCESS_TAG=haproxy-9226dea3-6355-4dd9-9441-d093c1f1a399', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9226dea3-6355-4dd9-9441-d093c1f1a399.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:13:00 compute-0 nova_compute[187185]: 2025-11-29 07:13:00.631 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:13:00 compute-0 nova_compute[187185]: 2025-11-29 07:13:00.631 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:13:00 compute-0 nova_compute[187185]: 2025-11-29 07:13:00.631 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:13:00 compute-0 nova_compute[187185]: 2025-11-29 07:13:00.748 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400380.7482839, 5896e8b0-25a2-4075-8ebf-5458b5ed9234 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:13:00 compute-0 nova_compute[187185]: 2025-11-29 07:13:00.749 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] VM Started (Lifecycle Event)
Nov 29 07:13:00 compute-0 nova_compute[187185]: 2025-11-29 07:13:00.798 187189 DEBUG nova.compute.manager [None req-8c4bfe24-57ea-4052-be4a-f5ee5f45cfca e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:13:00 compute-0 nova_compute[187185]: 2025-11-29 07:13:00.799 187189 DEBUG nova.objects.instance [None req-8c4bfe24-57ea-4052-be4a-f5ee5f45cfca e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'pci_devices' on Instance uuid 5896e8b0-25a2-4075-8ebf-5458b5ed9234 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:13:00 compute-0 podman[227684]: 2025-11-29 07:13:00.909738676 +0000 UTC m=+0.066533582 container create 8adc41f14d4e8664c81c06d653104c7d8330c263a3bd24dc6bd7c213ad0ac122 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:13:00 compute-0 nova_compute[187185]: 2025-11-29 07:13:00.946 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:13:00 compute-0 podman[227684]: 2025-11-29 07:13:00.876076474 +0000 UTC m=+0.032871390 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:13:00 compute-0 systemd[1]: Started libpod-conmon-8adc41f14d4e8664c81c06d653104c7d8330c263a3bd24dc6bd7c213ad0ac122.scope.
Nov 29 07:13:01 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:13:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7cdae94394fddf26cb2f05c9a725d31b07f6af5edc3ebb23ebcc768047cd6ebb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:13:01 compute-0 podman[227684]: 2025-11-29 07:13:01.040049291 +0000 UTC m=+0.196844207 container init 8adc41f14d4e8664c81c06d653104c7d8330c263a3bd24dc6bd7c213ad0ac122 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 07:13:01 compute-0 podman[227684]: 2025-11-29 07:13:01.050263969 +0000 UTC m=+0.207058865 container start 8adc41f14d4e8664c81c06d653104c7d8330c263a3bd24dc6bd7c213ad0ac122 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 29 07:13:01 compute-0 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[227699]: [NOTICE]   (227703) : New worker (227705) forked
Nov 29 07:13:01 compute-0 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[227699]: [NOTICE]   (227703) : Loading success.
Nov 29 07:13:01 compute-0 nova_compute[187185]: 2025-11-29 07:13:01.154 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "refresh_cache-5896e8b0-25a2-4075-8ebf-5458b5ed9234" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:13:01 compute-0 nova_compute[187185]: 2025-11-29 07:13:01.155 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquired lock "refresh_cache-5896e8b0-25a2-4075-8ebf-5458b5ed9234" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:13:01 compute-0 nova_compute[187185]: 2025-11-29 07:13:01.155 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 29 07:13:01 compute-0 nova_compute[187185]: 2025-11-29 07:13:01.156 187189 DEBUG nova.objects.instance [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 5896e8b0-25a2-4075-8ebf-5458b5ed9234 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:13:01 compute-0 nova_compute[187185]: 2025-11-29 07:13:01.253 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:13:01 compute-0 nova_compute[187185]: 2025-11-29 07:13:01.262 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:13:01 compute-0 nova_compute[187185]: 2025-11-29 07:13:01.266 187189 INFO nova.virt.libvirt.driver [-] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Instance running successfully.
Nov 29 07:13:01 compute-0 virtqemud[186729]: argument unsupported: QEMU guest agent is not configured
Nov 29 07:13:01 compute-0 nova_compute[187185]: 2025-11-29 07:13:01.269 187189 DEBUG nova.virt.libvirt.guest [None req-8c4bfe24-57ea-4052-be4a-f5ee5f45cfca e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Nov 29 07:13:01 compute-0 nova_compute[187185]: 2025-11-29 07:13:01.270 187189 DEBUG nova.compute.manager [None req-8c4bfe24-57ea-4052-be4a-f5ee5f45cfca e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:13:01 compute-0 nova_compute[187185]: 2025-11-29 07:13:01.585 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] During sync_power_state the instance has a pending task (resuming). Skip.
Nov 29 07:13:01 compute-0 nova_compute[187185]: 2025-11-29 07:13:01.586 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400380.7554889, 5896e8b0-25a2-4075-8ebf-5458b5ed9234 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:13:01 compute-0 nova_compute[187185]: 2025-11-29 07:13:01.586 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] VM Resumed (Lifecycle Event)
Nov 29 07:13:02 compute-0 nova_compute[187185]: 2025-11-29 07:13:02.412 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:13:02 compute-0 nova_compute[187185]: 2025-11-29 07:13:02.418 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:13:04 compute-0 nova_compute[187185]: 2025-11-29 07:13:04.162 187189 DEBUG nova.compute.manager [req-609de4f6-655b-4046-b144-d7edf732d601 req-eeb72eb1-ee32-45d5-9b11-ef0a5de717b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Received event network-vif-plugged-69f2ab13-2311-4137-9c26-e256f33759e5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:13:04 compute-0 nova_compute[187185]: 2025-11-29 07:13:04.163 187189 DEBUG oslo_concurrency.lockutils [req-609de4f6-655b-4046-b144-d7edf732d601 req-eeb72eb1-ee32-45d5-9b11-ef0a5de717b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:13:04 compute-0 nova_compute[187185]: 2025-11-29 07:13:04.163 187189 DEBUG oslo_concurrency.lockutils [req-609de4f6-655b-4046-b144-d7edf732d601 req-eeb72eb1-ee32-45d5-9b11-ef0a5de717b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:13:04 compute-0 nova_compute[187185]: 2025-11-29 07:13:04.164 187189 DEBUG oslo_concurrency.lockutils [req-609de4f6-655b-4046-b144-d7edf732d601 req-eeb72eb1-ee32-45d5-9b11-ef0a5de717b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:13:04 compute-0 nova_compute[187185]: 2025-11-29 07:13:04.164 187189 DEBUG nova.compute.manager [req-609de4f6-655b-4046-b144-d7edf732d601 req-eeb72eb1-ee32-45d5-9b11-ef0a5de717b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] No waiting events found dispatching network-vif-plugged-69f2ab13-2311-4137-9c26-e256f33759e5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:13:04 compute-0 nova_compute[187185]: 2025-11-29 07:13:04.165 187189 WARNING nova.compute.manager [req-609de4f6-655b-4046-b144-d7edf732d601 req-eeb72eb1-ee32-45d5-9b11-ef0a5de717b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Received unexpected event network-vif-plugged-69f2ab13-2311-4137-9c26-e256f33759e5 for instance with vm_state active and task_state None.
Nov 29 07:13:04 compute-0 nova_compute[187185]: 2025-11-29 07:13:04.825 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:13:05 compute-0 nova_compute[187185]: 2025-11-29 07:13:05.951 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:13:07 compute-0 ovn_controller[95281]: 2025-11-29T07:13:07Z|00239|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Nov 29 07:13:07 compute-0 podman[227714]: 2025-11-29 07:13:07.87646227 +0000 UTC m=+0.128774753 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251125)
Nov 29 07:13:09 compute-0 nova_compute[187185]: 2025-11-29 07:13:09.828 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:13:10 compute-0 nova_compute[187185]: 2025-11-29 07:13:10.953 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:13:13 compute-0 podman[227740]: 2025-11-29 07:13:13.8163956 +0000 UTC m=+0.078108899 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 07:13:14 compute-0 nova_compute[187185]: 2025-11-29 07:13:14.829 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:13:15 compute-0 podman[227764]: 2025-11-29 07:13:15.849487219 +0000 UTC m=+0.104097394 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3)
Nov 29 07:13:15 compute-0 podman[227765]: 2025-11-29 07:13:15.849585952 +0000 UTC m=+0.100893954 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 29 07:13:15 compute-0 nova_compute[187185]: 2025-11-29 07:13:15.957 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:13:18 compute-0 nova_compute[187185]: 2025-11-29 07:13:18.225 187189 DEBUG nova.compute.manager [req-2e165ea7-838a-41c0-9660-83123d097a05 req-f46ad872-a343-4360-9808-fe64847997dc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Received event network-vif-plugged-69f2ab13-2311-4137-9c26-e256f33759e5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:13:18 compute-0 nova_compute[187185]: 2025-11-29 07:13:18.226 187189 DEBUG oslo_concurrency.lockutils [req-2e165ea7-838a-41c0-9660-83123d097a05 req-f46ad872-a343-4360-9808-fe64847997dc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:13:18 compute-0 nova_compute[187185]: 2025-11-29 07:13:18.226 187189 DEBUG oslo_concurrency.lockutils [req-2e165ea7-838a-41c0-9660-83123d097a05 req-f46ad872-a343-4360-9808-fe64847997dc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:13:18 compute-0 nova_compute[187185]: 2025-11-29 07:13:18.226 187189 DEBUG oslo_concurrency.lockutils [req-2e165ea7-838a-41c0-9660-83123d097a05 req-f46ad872-a343-4360-9808-fe64847997dc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:13:18 compute-0 nova_compute[187185]: 2025-11-29 07:13:18.227 187189 DEBUG nova.compute.manager [req-2e165ea7-838a-41c0-9660-83123d097a05 req-f46ad872-a343-4360-9808-fe64847997dc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] No waiting events found dispatching network-vif-plugged-69f2ab13-2311-4137-9c26-e256f33759e5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:13:18 compute-0 nova_compute[187185]: 2025-11-29 07:13:18.227 187189 WARNING nova.compute.manager [req-2e165ea7-838a-41c0-9660-83123d097a05 req-f46ad872-a343-4360-9808-fe64847997dc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Received unexpected event network-vif-plugged-69f2ab13-2311-4137-9c26-e256f33759e5 for instance with vm_state active and task_state None.
Nov 29 07:13:19 compute-0 nova_compute[187185]: 2025-11-29 07:13:19.294 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Updating instance_info_cache with network_info: [{"id": "69f2ab13-2311-4137-9c26-e256f33759e5", "address": "fa:16:3e:cf:f0:da", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69f2ab13-23", "ovs_interfaceid": "69f2ab13-2311-4137-9c26-e256f33759e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:13:19 compute-0 nova_compute[187185]: 2025-11-29 07:13:19.832 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:13:20 compute-0 nova_compute[187185]: 2025-11-29 07:13:20.437 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Releasing lock "refresh_cache-5896e8b0-25a2-4075-8ebf-5458b5ed9234" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:13:20 compute-0 nova_compute[187185]: 2025-11-29 07:13:20.437 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 29 07:13:20 compute-0 nova_compute[187185]: 2025-11-29 07:13:20.438 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:13:20 compute-0 nova_compute[187185]: 2025-11-29 07:13:20.438 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:13:20 compute-0 nova_compute[187185]: 2025-11-29 07:13:20.438 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:13:20 compute-0 nova_compute[187185]: 2025-11-29 07:13:20.438 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:13:20 compute-0 nova_compute[187185]: 2025-11-29 07:13:20.439 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:13:20 compute-0 nova_compute[187185]: 2025-11-29 07:13:20.439 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:13:20 compute-0 nova_compute[187185]: 2025-11-29 07:13:20.439 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:13:20 compute-0 nova_compute[187185]: 2025-11-29 07:13:20.439 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:13:20 compute-0 nova_compute[187185]: 2025-11-29 07:13:20.484 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:13:20 compute-0 nova_compute[187185]: 2025-11-29 07:13:20.485 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:13:20 compute-0 nova_compute[187185]: 2025-11-29 07:13:20.485 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:13:20 compute-0 nova_compute[187185]: 2025-11-29 07:13:20.485 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:13:20 compute-0 nova_compute[187185]: 2025-11-29 07:13:20.599 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:13:20 compute-0 nova_compute[187185]: 2025-11-29 07:13:20.654 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:13:20 compute-0 nova_compute[187185]: 2025-11-29 07:13:20.655 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:13:20 compute-0 nova_compute[187185]: 2025-11-29 07:13:20.713 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5896e8b0-25a2-4075-8ebf-5458b5ed9234/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:13:20 compute-0 nova_compute[187185]: 2025-11-29 07:13:20.851 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:13:20 compute-0 nova_compute[187185]: 2025-11-29 07:13:20.852 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5511MB free_disk=73.26745223999023GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:13:20 compute-0 nova_compute[187185]: 2025-11-29 07:13:20.852 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:13:20 compute-0 nova_compute[187185]: 2025-11-29 07:13:20.853 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:13:20 compute-0 nova_compute[187185]: 2025-11-29 07:13:20.960 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:13:21 compute-0 nova_compute[187185]: 2025-11-29 07:13:21.104 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance 5896e8b0-25a2-4075-8ebf-5458b5ed9234 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 07:13:21 compute-0 nova_compute[187185]: 2025-11-29 07:13:21.104 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:13:21 compute-0 nova_compute[187185]: 2025-11-29 07:13:21.105 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:13:21 compute-0 nova_compute[187185]: 2025-11-29 07:13:21.190 187189 DEBUG oslo_concurrency.lockutils [None req-53b9bcf0-cc91-4286-88be-8590aa0ccf4e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:13:21 compute-0 nova_compute[187185]: 2025-11-29 07:13:21.190 187189 DEBUG oslo_concurrency.lockutils [None req-53b9bcf0-cc91-4286-88be-8590aa0ccf4e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:13:21 compute-0 nova_compute[187185]: 2025-11-29 07:13:21.191 187189 DEBUG oslo_concurrency.lockutils [None req-53b9bcf0-cc91-4286-88be-8590aa0ccf4e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:13:21 compute-0 nova_compute[187185]: 2025-11-29 07:13:21.191 187189 DEBUG oslo_concurrency.lockutils [None req-53b9bcf0-cc91-4286-88be-8590aa0ccf4e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:13:21 compute-0 nova_compute[187185]: 2025-11-29 07:13:21.192 187189 DEBUG oslo_concurrency.lockutils [None req-53b9bcf0-cc91-4286-88be-8590aa0ccf4e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:13:21 compute-0 nova_compute[187185]: 2025-11-29 07:13:21.209 187189 INFO nova.compute.manager [None req-53b9bcf0-cc91-4286-88be-8590aa0ccf4e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Terminating instance
Nov 29 07:13:21 compute-0 nova_compute[187185]: 2025-11-29 07:13:21.240 187189 DEBUG nova.compute.manager [None req-53b9bcf0-cc91-4286-88be-8590aa0ccf4e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 07:13:21 compute-0 kernel: tap69f2ab13-23 (unregistering): left promiscuous mode
Nov 29 07:13:21 compute-0 NetworkManager[55227]: <info>  [1764400401.2649] device (tap69f2ab13-23): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:13:21 compute-0 nova_compute[187185]: 2025-11-29 07:13:21.277 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:13:21 compute-0 ovn_controller[95281]: 2025-11-29T07:13:21Z|00240|binding|INFO|Releasing lport 69f2ab13-2311-4137-9c26-e256f33759e5 from this chassis (sb_readonly=0)
Nov 29 07:13:21 compute-0 ovn_controller[95281]: 2025-11-29T07:13:21Z|00241|binding|INFO|Setting lport 69f2ab13-2311-4137-9c26-e256f33759e5 down in Southbound
Nov 29 07:13:21 compute-0 ovn_controller[95281]: 2025-11-29T07:13:21Z|00242|binding|INFO|Removing iface tap69f2ab13-23 ovn-installed in OVS
Nov 29 07:13:21 compute-0 nova_compute[187185]: 2025-11-29 07:13:21.280 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:13:21 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:21.286 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cf:f0:da 10.100.0.6'], port_security=['fa:16:3e:cf:f0:da 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '5896e8b0-25a2-4075-8ebf-5458b5ed9234', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9226dea3-6355-4dd9-9441-d093c1f1a399', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e6c366001df43fb91731faf7a9578fc', 'neutron:revision_number': '8', 'neutron:security_group_ids': '1ab550ec-f542-4b91-8354-868da408e26d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.225', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d2b9cdf8-2516-48a0-81c5-85c2327075d5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=69f2ab13-2311-4137-9c26-e256f33759e5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:13:21 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:21.288 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 69f2ab13-2311-4137-9c26-e256f33759e5 in datapath 9226dea3-6355-4dd9-9441-d093c1f1a399 unbound from our chassis
Nov 29 07:13:21 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:21.291 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9226dea3-6355-4dd9-9441-d093c1f1a399, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:13:21 compute-0 nova_compute[187185]: 2025-11-29 07:13:21.294 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:13:21 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:21.293 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[a318c06d-762e-4ac1-97cb-9f620dc6fe86]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:13:21 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:21.294 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 namespace which is not needed anymore
Nov 29 07:13:21 compute-0 nova_compute[187185]: 2025-11-29 07:13:21.303 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:13:21 compute-0 nova_compute[187185]: 2025-11-29 07:13:21.323 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:13:21 compute-0 nova_compute[187185]: 2025-11-29 07:13:21.325 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:13:21 compute-0 nova_compute[187185]: 2025-11-29 07:13:21.325 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.472s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:13:21 compute-0 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d0000005e.scope: Deactivated successfully.
Nov 29 07:13:21 compute-0 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d0000005e.scope: Consumed 1.656s CPU time.
Nov 29 07:13:21 compute-0 systemd-machined[153486]: Machine qemu-34-instance-0000005e terminated.
Nov 29 07:13:21 compute-0 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[227699]: [NOTICE]   (227703) : haproxy version is 2.8.14-c23fe91
Nov 29 07:13:21 compute-0 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[227699]: [NOTICE]   (227703) : path to executable is /usr/sbin/haproxy
Nov 29 07:13:21 compute-0 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[227699]: [WARNING]  (227703) : Exiting Master process...
Nov 29 07:13:21 compute-0 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[227699]: [ALERT]    (227703) : Current worker (227705) exited with code 143 (Terminated)
Nov 29 07:13:21 compute-0 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[227699]: [WARNING]  (227703) : All workers exited. Exiting... (0)
Nov 29 07:13:21 compute-0 systemd[1]: libpod-8adc41f14d4e8664c81c06d653104c7d8330c263a3bd24dc6bd7c213ad0ac122.scope: Deactivated successfully.
Nov 29 07:13:21 compute-0 nova_compute[187185]: 2025-11-29 07:13:21.471 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:13:21 compute-0 podman[227837]: 2025-11-29 07:13:21.472318074 +0000 UTC m=+0.069575018 container died 8adc41f14d4e8664c81c06d653104c7d8330c263a3bd24dc6bd7c213ad0ac122 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:13:21 compute-0 nova_compute[187185]: 2025-11-29 07:13:21.481 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:13:21 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8adc41f14d4e8664c81c06d653104c7d8330c263a3bd24dc6bd7c213ad0ac122-userdata-shm.mount: Deactivated successfully.
Nov 29 07:13:21 compute-0 nova_compute[187185]: 2025-11-29 07:13:21.520 187189 INFO nova.virt.libvirt.driver [-] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Instance destroyed successfully.
Nov 29 07:13:21 compute-0 nova_compute[187185]: 2025-11-29 07:13:21.521 187189 DEBUG nova.objects.instance [None req-53b9bcf0-cc91-4286-88be-8590aa0ccf4e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'resources' on Instance uuid 5896e8b0-25a2-4075-8ebf-5458b5ed9234 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:13:21 compute-0 systemd[1]: var-lib-containers-storage-overlay-7cdae94394fddf26cb2f05c9a725d31b07f6af5edc3ebb23ebcc768047cd6ebb-merged.mount: Deactivated successfully.
Nov 29 07:13:21 compute-0 podman[227837]: 2025-11-29 07:13:21.535564213 +0000 UTC m=+0.132821097 container cleanup 8adc41f14d4e8664c81c06d653104c7d8330c263a3bd24dc6bd7c213ad0ac122 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 29 07:13:21 compute-0 systemd[1]: libpod-conmon-8adc41f14d4e8664c81c06d653104c7d8330c263a3bd24dc6bd7c213ad0ac122.scope: Deactivated successfully.
Nov 29 07:13:21 compute-0 nova_compute[187185]: 2025-11-29 07:13:21.546 187189 DEBUG nova.virt.libvirt.vif [None req-53b9bcf0-cc91-4286-88be-8590aa0ccf4e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:10:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-878694992',display_name='tempest-ServerActionsTestJSON-server-878694992',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-878694992',id=94,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCMk4ml8b3fuOLAahNu+FCDl9v+SWDrsolrsZ2S6Gt/612rFnCoMRUdrLWExAZcapVyOr30SCUoxNsWGg9cONHUEcP39CbtnDaooW7qgHPKQnqdGyUHB5iGAkkvgw7SerQ==',key_name='tempest-keypair-11305483',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:10:40Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6e6c366001df43fb91731faf7a9578fc',ramdisk_id='',reservation_id='r-j3wprymf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-157226036',owner_user_name='tempest-ServerActionsTestJSON-157226036-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:13:02Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e1b8fbcc8caa4d94b69570f233c56d18',uuid=5896e8b0-25a2-4075-8ebf-5458b5ed9234,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "69f2ab13-2311-4137-9c26-e256f33759e5", "address": "fa:16:3e:cf:f0:da", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69f2ab13-23", "ovs_interfaceid": "69f2ab13-2311-4137-9c26-e256f33759e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:13:21 compute-0 nova_compute[187185]: 2025-11-29 07:13:21.546 187189 DEBUG nova.network.os_vif_util [None req-53b9bcf0-cc91-4286-88be-8590aa0ccf4e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converting VIF {"id": "69f2ab13-2311-4137-9c26-e256f33759e5", "address": "fa:16:3e:cf:f0:da", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69f2ab13-23", "ovs_interfaceid": "69f2ab13-2311-4137-9c26-e256f33759e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:13:21 compute-0 nova_compute[187185]: 2025-11-29 07:13:21.547 187189 DEBUG nova.network.os_vif_util [None req-53b9bcf0-cc91-4286-88be-8590aa0ccf4e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:cf:f0:da,bridge_name='br-int',has_traffic_filtering=True,id=69f2ab13-2311-4137-9c26-e256f33759e5,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69f2ab13-23') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:13:21 compute-0 nova_compute[187185]: 2025-11-29 07:13:21.548 187189 DEBUG os_vif [None req-53b9bcf0-cc91-4286-88be-8590aa0ccf4e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:cf:f0:da,bridge_name='br-int',has_traffic_filtering=True,id=69f2ab13-2311-4137-9c26-e256f33759e5,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69f2ab13-23') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:13:21 compute-0 nova_compute[187185]: 2025-11-29 07:13:21.550 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:13:21 compute-0 nova_compute[187185]: 2025-11-29 07:13:21.550 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap69f2ab13-23, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:13:21 compute-0 nova_compute[187185]: 2025-11-29 07:13:21.552 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:13:21 compute-0 nova_compute[187185]: 2025-11-29 07:13:21.554 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:13:21 compute-0 nova_compute[187185]: 2025-11-29 07:13:21.557 187189 INFO os_vif [None req-53b9bcf0-cc91-4286-88be-8590aa0ccf4e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:cf:f0:da,bridge_name='br-int',has_traffic_filtering=True,id=69f2ab13-2311-4137-9c26-e256f33759e5,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69f2ab13-23')
Nov 29 07:13:21 compute-0 nova_compute[187185]: 2025-11-29 07:13:21.557 187189 INFO nova.virt.libvirt.driver [None req-53b9bcf0-cc91-4286-88be-8590aa0ccf4e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Deleting instance files /var/lib/nova/instances/5896e8b0-25a2-4075-8ebf-5458b5ed9234_del
Nov 29 07:13:21 compute-0 nova_compute[187185]: 2025-11-29 07:13:21.558 187189 INFO nova.virt.libvirt.driver [None req-53b9bcf0-cc91-4286-88be-8590aa0ccf4e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Deletion of /var/lib/nova/instances/5896e8b0-25a2-4075-8ebf-5458b5ed9234_del complete
Nov 29 07:13:21 compute-0 podman[227880]: 2025-11-29 07:13:21.629430057 +0000 UTC m=+0.067467349 container remove 8adc41f14d4e8664c81c06d653104c7d8330c263a3bd24dc6bd7c213ad0ac122 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Nov 29 07:13:21 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:21.634 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[51bf51ff-cb0e-4056-a510-deeab9106e8b]: (4, ('Sat Nov 29 07:13:21 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 (8adc41f14d4e8664c81c06d653104c7d8330c263a3bd24dc6bd7c213ad0ac122)\n8adc41f14d4e8664c81c06d653104c7d8330c263a3bd24dc6bd7c213ad0ac122\nSat Nov 29 07:13:21 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 (8adc41f14d4e8664c81c06d653104c7d8330c263a3bd24dc6bd7c213ad0ac122)\n8adc41f14d4e8664c81c06d653104c7d8330c263a3bd24dc6bd7c213ad0ac122\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:13:21 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:21.637 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e7a9a673-a947-4725-a812-4522d4054b9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:13:21 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:21.638 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9226dea3-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:13:21 compute-0 nova_compute[187185]: 2025-11-29 07:13:21.641 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:13:21 compute-0 kernel: tap9226dea3-60: left promiscuous mode
Nov 29 07:13:21 compute-0 nova_compute[187185]: 2025-11-29 07:13:21.652 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:13:21 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:21.655 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[c0e46087-a597-4393-a6eb-d24d0b83ebeb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:13:21 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:21.669 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6655c826-cbca-456f-bd4e-5ace4a6eada0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:13:21 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:21.670 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[32d134b6-afc5-4ccd-aef2-f54c164b6558]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:13:21 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:21.691 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[a2bedb3b-9b3c-4adf-a3f1-034b518b7091]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 582780, 'reachable_time': 39004, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227895, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:13:21 compute-0 systemd[1]: run-netns-ovnmeta\x2d9226dea3\x2d6355\x2d4dd9\x2d9441\x2dd093c1f1a399.mount: Deactivated successfully.
Nov 29 07:13:21 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:21.694 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:13:21 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:21.694 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[0b849a10-aaa9-42bb-9917-d8b77d8a9e21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:13:21 compute-0 nova_compute[187185]: 2025-11-29 07:13:21.768 187189 INFO nova.compute.manager [None req-53b9bcf0-cc91-4286-88be-8590aa0ccf4e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Took 0.53 seconds to destroy the instance on the hypervisor.
Nov 29 07:13:21 compute-0 nova_compute[187185]: 2025-11-29 07:13:21.768 187189 DEBUG oslo.service.loopingcall [None req-53b9bcf0-cc91-4286-88be-8590aa0ccf4e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 07:13:21 compute-0 nova_compute[187185]: 2025-11-29 07:13:21.770 187189 DEBUG nova.compute.manager [-] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 07:13:21 compute-0 nova_compute[187185]: 2025-11-29 07:13:21.770 187189 DEBUG nova.network.neutron [-] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 07:13:22 compute-0 nova_compute[187185]: 2025-11-29 07:13:22.123 187189 DEBUG nova.compute.manager [req-f7ddba13-f0d6-44d5-93f5-7c0c7dccce12 req-79eb2135-43ea-451d-bc78-3c9da6d4dbba 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Received event network-vif-unplugged-69f2ab13-2311-4137-9c26-e256f33759e5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:13:22 compute-0 nova_compute[187185]: 2025-11-29 07:13:22.123 187189 DEBUG oslo_concurrency.lockutils [req-f7ddba13-f0d6-44d5-93f5-7c0c7dccce12 req-79eb2135-43ea-451d-bc78-3c9da6d4dbba 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:13:22 compute-0 nova_compute[187185]: 2025-11-29 07:13:22.124 187189 DEBUG oslo_concurrency.lockutils [req-f7ddba13-f0d6-44d5-93f5-7c0c7dccce12 req-79eb2135-43ea-451d-bc78-3c9da6d4dbba 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:13:22 compute-0 nova_compute[187185]: 2025-11-29 07:13:22.124 187189 DEBUG oslo_concurrency.lockutils [req-f7ddba13-f0d6-44d5-93f5-7c0c7dccce12 req-79eb2135-43ea-451d-bc78-3c9da6d4dbba 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:13:22 compute-0 nova_compute[187185]: 2025-11-29 07:13:22.124 187189 DEBUG nova.compute.manager [req-f7ddba13-f0d6-44d5-93f5-7c0c7dccce12 req-79eb2135-43ea-451d-bc78-3c9da6d4dbba 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] No waiting events found dispatching network-vif-unplugged-69f2ab13-2311-4137-9c26-e256f33759e5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:13:22 compute-0 nova_compute[187185]: 2025-11-29 07:13:22.124 187189 DEBUG nova.compute.manager [req-f7ddba13-f0d6-44d5-93f5-7c0c7dccce12 req-79eb2135-43ea-451d-bc78-3c9da6d4dbba 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Received event network-vif-unplugged-69f2ab13-2311-4137-9c26-e256f33759e5 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 07:13:24 compute-0 nova_compute[187185]: 2025-11-29 07:13:24.472 187189 DEBUG nova.compute.manager [req-c4f81cce-6568-453a-b32c-15ed168dad73 req-2a06a205-44cb-4cff-972d-4ecc3e7ce42b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Received event network-vif-plugged-69f2ab13-2311-4137-9c26-e256f33759e5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:13:24 compute-0 nova_compute[187185]: 2025-11-29 07:13:24.473 187189 DEBUG oslo_concurrency.lockutils [req-c4f81cce-6568-453a-b32c-15ed168dad73 req-2a06a205-44cb-4cff-972d-4ecc3e7ce42b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:13:24 compute-0 nova_compute[187185]: 2025-11-29 07:13:24.474 187189 DEBUG oslo_concurrency.lockutils [req-c4f81cce-6568-453a-b32c-15ed168dad73 req-2a06a205-44cb-4cff-972d-4ecc3e7ce42b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:13:24 compute-0 nova_compute[187185]: 2025-11-29 07:13:24.474 187189 DEBUG oslo_concurrency.lockutils [req-c4f81cce-6568-453a-b32c-15ed168dad73 req-2a06a205-44cb-4cff-972d-4ecc3e7ce42b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:13:24 compute-0 nova_compute[187185]: 2025-11-29 07:13:24.474 187189 DEBUG nova.compute.manager [req-c4f81cce-6568-453a-b32c-15ed168dad73 req-2a06a205-44cb-4cff-972d-4ecc3e7ce42b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] No waiting events found dispatching network-vif-plugged-69f2ab13-2311-4137-9c26-e256f33759e5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:13:24 compute-0 nova_compute[187185]: 2025-11-29 07:13:24.475 187189 WARNING nova.compute.manager [req-c4f81cce-6568-453a-b32c-15ed168dad73 req-2a06a205-44cb-4cff-972d-4ecc3e7ce42b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Received unexpected event network-vif-plugged-69f2ab13-2311-4137-9c26-e256f33759e5 for instance with vm_state active and task_state deleting.
Nov 29 07:13:24 compute-0 nova_compute[187185]: 2025-11-29 07:13:24.644 187189 DEBUG nova.network.neutron [-] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:13:24 compute-0 nova_compute[187185]: 2025-11-29 07:13:24.738 187189 INFO nova.compute.manager [-] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Took 2.97 seconds to deallocate network for instance.
Nov 29 07:13:24 compute-0 nova_compute[187185]: 2025-11-29 07:13:24.834 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:13:25 compute-0 nova_compute[187185]: 2025-11-29 07:13:25.000 187189 DEBUG oslo_concurrency.lockutils [None req-53b9bcf0-cc91-4286-88be-8590aa0ccf4e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:13:25 compute-0 nova_compute[187185]: 2025-11-29 07:13:25.000 187189 DEBUG oslo_concurrency.lockutils [None req-53b9bcf0-cc91-4286-88be-8590aa0ccf4e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:13:25 compute-0 nova_compute[187185]: 2025-11-29 07:13:25.050 187189 DEBUG nova.compute.manager [req-27a35a2f-72d6-498b-8ca1-603e62a22aac req-0c6c3b7b-5dab-474b-9e2d-ee1bf7a8b8f3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Received event network-vif-deleted-69f2ab13-2311-4137-9c26-e256f33759e5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:13:25 compute-0 nova_compute[187185]: 2025-11-29 07:13:25.164 187189 DEBUG nova.compute.provider_tree [None req-53b9bcf0-cc91-4286-88be-8590aa0ccf4e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:13:25 compute-0 nova_compute[187185]: 2025-11-29 07:13:25.209 187189 DEBUG nova.scheduler.client.report [None req-53b9bcf0-cc91-4286-88be-8590aa0ccf4e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:13:25 compute-0 nova_compute[187185]: 2025-11-29 07:13:25.305 187189 DEBUG oslo_concurrency.lockutils [None req-53b9bcf0-cc91-4286-88be-8590aa0ccf4e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.304s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:13:25 compute-0 nova_compute[187185]: 2025-11-29 07:13:25.368 187189 INFO nova.scheduler.client.report [None req-53b9bcf0-cc91-4286-88be-8590aa0ccf4e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Deleted allocations for instance 5896e8b0-25a2-4075-8ebf-5458b5ed9234
Nov 29 07:13:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:25.502 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:13:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:25.503 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:13:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:25.503 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:13:25 compute-0 nova_compute[187185]: 2025-11-29 07:13:25.527 187189 DEBUG oslo_concurrency.lockutils [None req-53b9bcf0-cc91-4286-88be-8590aa0ccf4e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "5896e8b0-25a2-4075-8ebf-5458b5ed9234" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.336s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:13:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:25.712 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:13:25 compute-0 nova_compute[187185]: 2025-11-29 07:13:25.713 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:13:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:25.714 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:13:26 compute-0 nova_compute[187185]: 2025-11-29 07:13:26.005 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:13:26 compute-0 nova_compute[187185]: 2025-11-29 07:13:26.553 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:13:28 compute-0 podman[227898]: 2025-11-29 07:13:28.811924565 +0000 UTC m=+0.062403816 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, release=1755695350, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, version=9.6, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=minimal rhel9)
Nov 29 07:13:28 compute-0 podman[227897]: 2025-11-29 07:13:28.83227167 +0000 UTC m=+0.083130751 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 29 07:13:28 compute-0 podman[227899]: 2025-11-29 07:13:28.832654851 +0000 UTC m=+0.073579412 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 07:13:29 compute-0 nova_compute[187185]: 2025-11-29 07:13:29.866 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:13:31 compute-0 nova_compute[187185]: 2025-11-29 07:13:31.556 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:13:31 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:13:31.717 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:13:34 compute-0 nova_compute[187185]: 2025-11-29 07:13:34.870 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:13:36 compute-0 nova_compute[187185]: 2025-11-29 07:13:36.519 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400401.517266, 5896e8b0-25a2-4075-8ebf-5458b5ed9234 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:13:36 compute-0 nova_compute[187185]: 2025-11-29 07:13:36.519 187189 INFO nova.compute.manager [-] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] VM Stopped (Lifecycle Event)
Nov 29 07:13:36 compute-0 nova_compute[187185]: 2025-11-29 07:13:36.559 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:13:36 compute-0 nova_compute[187185]: 2025-11-29 07:13:36.598 187189 DEBUG nova.compute.manager [None req-05429e01-96c4-46d1-a6b6-c10b48a68f70 - - - - - -] [instance: 5896e8b0-25a2-4075-8ebf-5458b5ed9234] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:13:37 compute-0 nova_compute[187185]: 2025-11-29 07:13:37.749 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:13:38 compute-0 podman[227960]: 2025-11-29 07:13:38.882106505 +0000 UTC m=+0.138483266 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 07:13:39 compute-0 nova_compute[187185]: 2025-11-29 07:13:39.924 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:13:41 compute-0 nova_compute[187185]: 2025-11-29 07:13:41.562 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:13:44 compute-0 podman[227984]: 2025-11-29 07:13:44.781762158 +0000 UTC m=+0.048693668 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 07:13:44 compute-0 nova_compute[187185]: 2025-11-29 07:13:44.929 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:13:46 compute-0 nova_compute[187185]: 2025-11-29 07:13:46.569 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:13:46 compute-0 podman[228009]: 2025-11-29 07:13:46.802441085 +0000 UTC m=+0.068443696 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 07:13:46 compute-0 podman[228008]: 2025-11-29 07:13:46.832818554 +0000 UTC m=+0.099760362 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 07:13:48 compute-0 nova_compute[187185]: 2025-11-29 07:13:48.830 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:13:49 compute-0 nova_compute[187185]: 2025-11-29 07:13:49.009 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:13:49 compute-0 nova_compute[187185]: 2025-11-29 07:13:49.937 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:13:51 compute-0 nova_compute[187185]: 2025-11-29 07:13:51.573 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:13:54 compute-0 nova_compute[187185]: 2025-11-29 07:13:54.980 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:13:56 compute-0 nova_compute[187185]: 2025-11-29 07:13:56.577 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:13:59 compute-0 podman[228048]: 2025-11-29 07:13:59.806972226 +0000 UTC m=+0.067861490 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, distribution-scope=public, vcs-type=git, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_id=edpm, name=ubi9-minimal, vendor=Red Hat, Inc.)
Nov 29 07:13:59 compute-0 podman[228047]: 2025-11-29 07:13:59.807374858 +0000 UTC m=+0.070904416 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Nov 29 07:13:59 compute-0 podman[228049]: 2025-11-29 07:13:59.809068856 +0000 UTC m=+0.058959048 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 07:13:59 compute-0 nova_compute[187185]: 2025-11-29 07:13:59.983 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:14:01 compute-0 nova_compute[187185]: 2025-11-29 07:14:01.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:14:01 compute-0 nova_compute[187185]: 2025-11-29 07:14:01.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:14:01 compute-0 nova_compute[187185]: 2025-11-29 07:14:01.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:14:01 compute-0 nova_compute[187185]: 2025-11-29 07:14:01.336 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 07:14:01 compute-0 nova_compute[187185]: 2025-11-29 07:14:01.580 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:14:02 compute-0 nova_compute[187185]: 2025-11-29 07:14:02.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:14:03 compute-0 nova_compute[187185]: 2025-11-29 07:14:03.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:14:03 compute-0 nova_compute[187185]: 2025-11-29 07:14:03.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:14:04 compute-0 nova_compute[187185]: 2025-11-29 07:14:04.983 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:14:05 compute-0 nova_compute[187185]: 2025-11-29 07:14:05.310 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:14:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:14:05.472 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:14:05 compute-0 nova_compute[187185]: 2025-11-29 07:14:05.473 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:14:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:14:05.473 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:14:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:14:05.474 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:14:06 compute-0 nova_compute[187185]: 2025-11-29 07:14:06.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:14:06 compute-0 nova_compute[187185]: 2025-11-29 07:14:06.583 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:14:07 compute-0 nova_compute[187185]: 2025-11-29 07:14:07.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:14:08 compute-0 nova_compute[187185]: 2025-11-29 07:14:08.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:14:08 compute-0 nova_compute[187185]: 2025-11-29 07:14:08.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:14:08 compute-0 nova_compute[187185]: 2025-11-29 07:14:08.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:14:08 compute-0 nova_compute[187185]: 2025-11-29 07:14:08.378 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:14:08 compute-0 nova_compute[187185]: 2025-11-29 07:14:08.379 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:14:08 compute-0 nova_compute[187185]: 2025-11-29 07:14:08.380 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:14:08 compute-0 nova_compute[187185]: 2025-11-29 07:14:08.380 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:14:08 compute-0 nova_compute[187185]: 2025-11-29 07:14:08.565 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:14:08 compute-0 nova_compute[187185]: 2025-11-29 07:14:08.566 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5710MB free_disk=73.29655456542969GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:14:08 compute-0 nova_compute[187185]: 2025-11-29 07:14:08.566 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:14:08 compute-0 nova_compute[187185]: 2025-11-29 07:14:08.566 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:14:09 compute-0 nova_compute[187185]: 2025-11-29 07:14:09.052 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance 8ca0969d-04fe-43b8-8f05-68f183f888a9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1692
Nov 29 07:14:09 compute-0 nova_compute[187185]: 2025-11-29 07:14:09.053 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:14:09 compute-0 nova_compute[187185]: 2025-11-29 07:14:09.053 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:14:09 compute-0 nova_compute[187185]: 2025-11-29 07:14:09.085 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Refreshing inventories for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 29 07:14:09 compute-0 nova_compute[187185]: 2025-11-29 07:14:09.139 187189 DEBUG oslo_concurrency.lockutils [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "8ca0969d-04fe-43b8-8f05-68f183f888a9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:14:09 compute-0 nova_compute[187185]: 2025-11-29 07:14:09.140 187189 DEBUG oslo_concurrency.lockutils [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "8ca0969d-04fe-43b8-8f05-68f183f888a9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:14:09 compute-0 nova_compute[187185]: 2025-11-29 07:14:09.221 187189 DEBUG nova.compute.manager [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 07:14:09 compute-0 nova_compute[187185]: 2025-11-29 07:14:09.417 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Updating ProviderTree inventory for provider 4e39a026-df39-4e20-874a-dbb5a40df044 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 29 07:14:09 compute-0 nova_compute[187185]: 2025-11-29 07:14:09.418 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Updating inventory in ProviderTree for provider 4e39a026-df39-4e20-874a-dbb5a40df044 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 29 07:14:09 compute-0 nova_compute[187185]: 2025-11-29 07:14:09.439 187189 DEBUG oslo_concurrency.lockutils [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:14:09 compute-0 nova_compute[187185]: 2025-11-29 07:14:09.443 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Refreshing aggregate associations for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 29 07:14:09 compute-0 nova_compute[187185]: 2025-11-29 07:14:09.484 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Refreshing trait associations for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044, traits: HW_CPU_X86_SSE,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_TRUSTED_CERTS,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 29 07:14:09 compute-0 nova_compute[187185]: 2025-11-29 07:14:09.577 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:14:09 compute-0 nova_compute[187185]: 2025-11-29 07:14:09.619 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:14:09 compute-0 podman[228107]: 2025-11-29 07:14:09.820729251 +0000 UTC m=+0.080863518 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:14:09 compute-0 nova_compute[187185]: 2025-11-29 07:14:09.985 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:14:10 compute-0 nova_compute[187185]: 2025-11-29 07:14:10.189 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:14:10 compute-0 nova_compute[187185]: 2025-11-29 07:14:10.190 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.623s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:14:10 compute-0 nova_compute[187185]: 2025-11-29 07:14:10.190 187189 DEBUG oslo_concurrency.lockutils [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.751s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:14:10 compute-0 nova_compute[187185]: 2025-11-29 07:14:10.198 187189 DEBUG nova.virt.hardware [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 07:14:10 compute-0 nova_compute[187185]: 2025-11-29 07:14:10.199 187189 INFO nova.compute.claims [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Claim successful on node compute-0.ctlplane.example.com
Nov 29 07:14:10 compute-0 nova_compute[187185]: 2025-11-29 07:14:10.556 187189 DEBUG nova.compute.provider_tree [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:14:10 compute-0 nova_compute[187185]: 2025-11-29 07:14:10.666 187189 DEBUG nova.scheduler.client.report [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:14:10 compute-0 nova_compute[187185]: 2025-11-29 07:14:10.706 187189 DEBUG oslo_concurrency.lockutils [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.516s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:14:10 compute-0 nova_compute[187185]: 2025-11-29 07:14:10.707 187189 DEBUG nova.compute.manager [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 07:14:11 compute-0 nova_compute[187185]: 2025-11-29 07:14:11.333 187189 DEBUG nova.compute.manager [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 07:14:11 compute-0 nova_compute[187185]: 2025-11-29 07:14:11.333 187189 DEBUG nova.network.neutron [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 07:14:11 compute-0 nova_compute[187185]: 2025-11-29 07:14:11.586 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:14:11 compute-0 nova_compute[187185]: 2025-11-29 07:14:11.687 187189 INFO nova.virt.libvirt.driver [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 07:14:11 compute-0 nova_compute[187185]: 2025-11-29 07:14:11.720 187189 DEBUG nova.compute.manager [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 07:14:11 compute-0 nova_compute[187185]: 2025-11-29 07:14:11.998 187189 DEBUG nova.compute.manager [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 07:14:12 compute-0 nova_compute[187185]: 2025-11-29 07:14:12.000 187189 DEBUG nova.virt.libvirt.driver [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 07:14:12 compute-0 nova_compute[187185]: 2025-11-29 07:14:12.000 187189 INFO nova.virt.libvirt.driver [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Creating image(s)
Nov 29 07:14:12 compute-0 nova_compute[187185]: 2025-11-29 07:14:12.001 187189 DEBUG oslo_concurrency.lockutils [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "/var/lib/nova/instances/8ca0969d-04fe-43b8-8f05-68f183f888a9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:14:12 compute-0 nova_compute[187185]: 2025-11-29 07:14:12.001 187189 DEBUG oslo_concurrency.lockutils [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "/var/lib/nova/instances/8ca0969d-04fe-43b8-8f05-68f183f888a9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:14:12 compute-0 nova_compute[187185]: 2025-11-29 07:14:12.002 187189 DEBUG oslo_concurrency.lockutils [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "/var/lib/nova/instances/8ca0969d-04fe-43b8-8f05-68f183f888a9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:14:12 compute-0 nova_compute[187185]: 2025-11-29 07:14:12.014 187189 DEBUG oslo_concurrency.processutils [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:14:12 compute-0 nova_compute[187185]: 2025-11-29 07:14:12.071 187189 DEBUG oslo_concurrency.processutils [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:14:12 compute-0 nova_compute[187185]: 2025-11-29 07:14:12.072 187189 DEBUG oslo_concurrency.lockutils [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:14:12 compute-0 nova_compute[187185]: 2025-11-29 07:14:12.073 187189 DEBUG oslo_concurrency.lockutils [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:14:12 compute-0 nova_compute[187185]: 2025-11-29 07:14:12.085 187189 DEBUG oslo_concurrency.processutils [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:14:12 compute-0 nova_compute[187185]: 2025-11-29 07:14:12.150 187189 DEBUG oslo_concurrency.processutils [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:14:12 compute-0 nova_compute[187185]: 2025-11-29 07:14:12.151 187189 DEBUG oslo_concurrency.processutils [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/8ca0969d-04fe-43b8-8f05-68f183f888a9/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:14:12 compute-0 nova_compute[187185]: 2025-11-29 07:14:12.188 187189 DEBUG oslo_concurrency.processutils [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/8ca0969d-04fe-43b8-8f05-68f183f888a9/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:14:12 compute-0 nova_compute[187185]: 2025-11-29 07:14:12.190 187189 DEBUG oslo_concurrency.lockutils [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:14:12 compute-0 nova_compute[187185]: 2025-11-29 07:14:12.190 187189 DEBUG oslo_concurrency.processutils [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:14:12 compute-0 nova_compute[187185]: 2025-11-29 07:14:12.249 187189 DEBUG oslo_concurrency.processutils [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:14:12 compute-0 nova_compute[187185]: 2025-11-29 07:14:12.250 187189 DEBUG nova.virt.disk.api [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Checking if we can resize image /var/lib/nova/instances/8ca0969d-04fe-43b8-8f05-68f183f888a9/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:14:12 compute-0 nova_compute[187185]: 2025-11-29 07:14:12.250 187189 DEBUG oslo_concurrency.processutils [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8ca0969d-04fe-43b8-8f05-68f183f888a9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:14:12 compute-0 nova_compute[187185]: 2025-11-29 07:14:12.305 187189 DEBUG oslo_concurrency.processutils [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8ca0969d-04fe-43b8-8f05-68f183f888a9/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:14:12 compute-0 nova_compute[187185]: 2025-11-29 07:14:12.306 187189 DEBUG nova.virt.disk.api [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Cannot resize image /var/lib/nova/instances/8ca0969d-04fe-43b8-8f05-68f183f888a9/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:14:12 compute-0 nova_compute[187185]: 2025-11-29 07:14:12.306 187189 DEBUG nova.objects.instance [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lazy-loading 'migration_context' on Instance uuid 8ca0969d-04fe-43b8-8f05-68f183f888a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:14:12 compute-0 nova_compute[187185]: 2025-11-29 07:14:12.350 187189 DEBUG nova.virt.libvirt.driver [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 07:14:12 compute-0 nova_compute[187185]: 2025-11-29 07:14:12.351 187189 DEBUG nova.virt.libvirt.driver [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Ensure instance console log exists: /var/lib/nova/instances/8ca0969d-04fe-43b8-8f05-68f183f888a9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 07:14:12 compute-0 nova_compute[187185]: 2025-11-29 07:14:12.351 187189 DEBUG oslo_concurrency.lockutils [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:14:12 compute-0 nova_compute[187185]: 2025-11-29 07:14:12.351 187189 DEBUG oslo_concurrency.lockutils [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:14:12 compute-0 nova_compute[187185]: 2025-11-29 07:14:12.352 187189 DEBUG oslo_concurrency.lockutils [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:14:13 compute-0 nova_compute[187185]: 2025-11-29 07:14:13.107 187189 DEBUG nova.policy [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 07:14:14 compute-0 nova_compute[187185]: 2025-11-29 07:14:14.987 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:14:15 compute-0 podman[228148]: 2025-11-29 07:14:15.792334708 +0000 UTC m=+0.054091830 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 07:14:16 compute-0 nova_compute[187185]: 2025-11-29 07:14:16.590 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:14:17 compute-0 podman[228174]: 2025-11-29 07:14:17.811809702 +0000 UTC m=+0.061839139 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 29 07:14:17 compute-0 podman[228173]: 2025-11-29 07:14:17.840147404 +0000 UTC m=+0.094214715 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 07:14:17 compute-0 nova_compute[187185]: 2025-11-29 07:14:17.995 187189 DEBUG nova.network.neutron [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Successfully created port: ef96623e-7b97-4117-b30e-902c4b9be2ae _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 07:14:18 compute-0 nova_compute[187185]: 2025-11-29 07:14:18.186 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:14:19 compute-0 nova_compute[187185]: 2025-11-29 07:14:19.990 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:14:21 compute-0 nova_compute[187185]: 2025-11-29 07:14:21.593 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:14:22 compute-0 nova_compute[187185]: 2025-11-29 07:14:22.030 187189 DEBUG nova.network.neutron [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Successfully updated port: ef96623e-7b97-4117-b30e-902c4b9be2ae _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 07:14:22 compute-0 nova_compute[187185]: 2025-11-29 07:14:22.063 187189 DEBUG oslo_concurrency.lockutils [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "refresh_cache-8ca0969d-04fe-43b8-8f05-68f183f888a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:14:22 compute-0 nova_compute[187185]: 2025-11-29 07:14:22.063 187189 DEBUG oslo_concurrency.lockutils [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquired lock "refresh_cache-8ca0969d-04fe-43b8-8f05-68f183f888a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:14:22 compute-0 nova_compute[187185]: 2025-11-29 07:14:22.063 187189 DEBUG nova.network.neutron [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:14:22 compute-0 nova_compute[187185]: 2025-11-29 07:14:22.150 187189 DEBUG nova.compute.manager [req-b9bb8483-bb76-43c4-8229-ddbb540b6a48 req-2fc282b4-0812-43d9-9d09-fd099224c445 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Received event network-changed-ef96623e-7b97-4117-b30e-902c4b9be2ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:14:22 compute-0 nova_compute[187185]: 2025-11-29 07:14:22.151 187189 DEBUG nova.compute.manager [req-b9bb8483-bb76-43c4-8229-ddbb540b6a48 req-2fc282b4-0812-43d9-9d09-fd099224c445 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Refreshing instance network info cache due to event network-changed-ef96623e-7b97-4117-b30e-902c4b9be2ae. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:14:22 compute-0 nova_compute[187185]: 2025-11-29 07:14:22.151 187189 DEBUG oslo_concurrency.lockutils [req-b9bb8483-bb76-43c4-8229-ddbb540b6a48 req-2fc282b4-0812-43d9-9d09-fd099224c445 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-8ca0969d-04fe-43b8-8f05-68f183f888a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:14:22 compute-0 nova_compute[187185]: 2025-11-29 07:14:22.711 187189 DEBUG nova.network.neutron [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:14:24 compute-0 nova_compute[187185]: 2025-11-29 07:14:24.731 187189 DEBUG nova.network.neutron [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Updating instance_info_cache with network_info: [{"id": "ef96623e-7b97-4117-b30e-902c4b9be2ae", "address": "fa:16:3e:97:62:f8", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef96623e-7b", "ovs_interfaceid": "ef96623e-7b97-4117-b30e-902c4b9be2ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:14:24 compute-0 nova_compute[187185]: 2025-11-29 07:14:24.836 187189 DEBUG oslo_concurrency.lockutils [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Releasing lock "refresh_cache-8ca0969d-04fe-43b8-8f05-68f183f888a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:14:24 compute-0 nova_compute[187185]: 2025-11-29 07:14:24.837 187189 DEBUG nova.compute.manager [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Instance network_info: |[{"id": "ef96623e-7b97-4117-b30e-902c4b9be2ae", "address": "fa:16:3e:97:62:f8", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef96623e-7b", "ovs_interfaceid": "ef96623e-7b97-4117-b30e-902c4b9be2ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 07:14:24 compute-0 nova_compute[187185]: 2025-11-29 07:14:24.837 187189 DEBUG oslo_concurrency.lockutils [req-b9bb8483-bb76-43c4-8229-ddbb540b6a48 req-2fc282b4-0812-43d9-9d09-fd099224c445 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-8ca0969d-04fe-43b8-8f05-68f183f888a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:14:24 compute-0 nova_compute[187185]: 2025-11-29 07:14:24.838 187189 DEBUG nova.network.neutron [req-b9bb8483-bb76-43c4-8229-ddbb540b6a48 req-2fc282b4-0812-43d9-9d09-fd099224c445 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Refreshing network info cache for port ef96623e-7b97-4117-b30e-902c4b9be2ae _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:14:24 compute-0 nova_compute[187185]: 2025-11-29 07:14:24.840 187189 DEBUG nova.virt.libvirt.driver [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Start _get_guest_xml network_info=[{"id": "ef96623e-7b97-4117-b30e-902c4b9be2ae", "address": "fa:16:3e:97:62:f8", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef96623e-7b", "ovs_interfaceid": "ef96623e-7b97-4117-b30e-902c4b9be2ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:14:24 compute-0 nova_compute[187185]: 2025-11-29 07:14:24.847 187189 WARNING nova.virt.libvirt.driver [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:14:24 compute-0 nova_compute[187185]: 2025-11-29 07:14:24.864 187189 DEBUG nova.virt.libvirt.host [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:14:24 compute-0 nova_compute[187185]: 2025-11-29 07:14:24.865 187189 DEBUG nova.virt.libvirt.host [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:14:24 compute-0 nova_compute[187185]: 2025-11-29 07:14:24.868 187189 DEBUG nova.virt.libvirt.host [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:14:24 compute-0 nova_compute[187185]: 2025-11-29 07:14:24.869 187189 DEBUG nova.virt.libvirt.host [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:14:24 compute-0 nova_compute[187185]: 2025-11-29 07:14:24.870 187189 DEBUG nova.virt.libvirt.driver [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:14:24 compute-0 nova_compute[187185]: 2025-11-29 07:14:24.870 187189 DEBUG nova.virt.hardware [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:14:24 compute-0 nova_compute[187185]: 2025-11-29 07:14:24.871 187189 DEBUG nova.virt.hardware [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:14:24 compute-0 nova_compute[187185]: 2025-11-29 07:14:24.871 187189 DEBUG nova.virt.hardware [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:14:24 compute-0 nova_compute[187185]: 2025-11-29 07:14:24.871 187189 DEBUG nova.virt.hardware [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:14:24 compute-0 nova_compute[187185]: 2025-11-29 07:14:24.872 187189 DEBUG nova.virt.hardware [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:14:24 compute-0 nova_compute[187185]: 2025-11-29 07:14:24.872 187189 DEBUG nova.virt.hardware [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:14:24 compute-0 nova_compute[187185]: 2025-11-29 07:14:24.872 187189 DEBUG nova.virt.hardware [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:14:24 compute-0 nova_compute[187185]: 2025-11-29 07:14:24.872 187189 DEBUG nova.virt.hardware [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:14:24 compute-0 nova_compute[187185]: 2025-11-29 07:14:24.872 187189 DEBUG nova.virt.hardware [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:14:24 compute-0 nova_compute[187185]: 2025-11-29 07:14:24.873 187189 DEBUG nova.virt.hardware [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:14:24 compute-0 nova_compute[187185]: 2025-11-29 07:14:24.873 187189 DEBUG nova.virt.hardware [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:14:24 compute-0 nova_compute[187185]: 2025-11-29 07:14:24.877 187189 DEBUG nova.virt.libvirt.vif [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:14:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-575694004',display_name='tempest-AttachInterfacesTestJSON-server-575694004',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-575694004',id=97,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNgGd4Iacs7wY+oKmY26nBMFcX8qOmRS8JGNl6lyYXdVDbe/o53pDlvPJRfgkkFQQMuZj1PL9g22BJFBRRmej3Uv85Ig8WM1/1T7okGrP88moNmlmJpygv+50mQ4/2U61Q==',key_name='tempest-keypair-1636541378',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='16d7af1670ea460db3d0422f176b6f98',ramdisk_id='',reservation_id='r-ysw8l28l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-844604773',owner_user_name='tempest-AttachInterfacesTestJSON-844604773-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:14:11Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9a13f011f5b74a6f94a2d2c8e9104f4a',uuid=8ca0969d-04fe-43b8-8f05-68f183f888a9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ef96623e-7b97-4117-b30e-902c4b9be2ae", "address": "fa:16:3e:97:62:f8", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef96623e-7b", "ovs_interfaceid": "ef96623e-7b97-4117-b30e-902c4b9be2ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:14:24 compute-0 nova_compute[187185]: 2025-11-29 07:14:24.877 187189 DEBUG nova.network.os_vif_util [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converting VIF {"id": "ef96623e-7b97-4117-b30e-902c4b9be2ae", "address": "fa:16:3e:97:62:f8", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef96623e-7b", "ovs_interfaceid": "ef96623e-7b97-4117-b30e-902c4b9be2ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:14:24 compute-0 nova_compute[187185]: 2025-11-29 07:14:24.878 187189 DEBUG nova.network.os_vif_util [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:62:f8,bridge_name='br-int',has_traffic_filtering=True,id=ef96623e-7b97-4117-b30e-902c4b9be2ae,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef96623e-7b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:14:24 compute-0 nova_compute[187185]: 2025-11-29 07:14:24.879 187189 DEBUG nova.objects.instance [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lazy-loading 'pci_devices' on Instance uuid 8ca0969d-04fe-43b8-8f05-68f183f888a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:14:24 compute-0 nova_compute[187185]: 2025-11-29 07:14:24.942 187189 DEBUG nova.virt.libvirt.driver [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:14:24 compute-0 nova_compute[187185]:   <uuid>8ca0969d-04fe-43b8-8f05-68f183f888a9</uuid>
Nov 29 07:14:24 compute-0 nova_compute[187185]:   <name>instance-00000061</name>
Nov 29 07:14:24 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 07:14:24 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:14:24 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:14:24 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:       <nova:name>tempest-AttachInterfacesTestJSON-server-575694004</nova:name>
Nov 29 07:14:24 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:14:24</nova:creationTime>
Nov 29 07:14:24 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 07:14:24 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 07:14:24 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:14:24 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:14:24 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:14:24 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:14:24 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:14:24 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:14:24 compute-0 nova_compute[187185]:         <nova:user uuid="9a13f011f5b74a6f94a2d2c8e9104f4a">tempest-AttachInterfacesTestJSON-844604773-project-member</nova:user>
Nov 29 07:14:24 compute-0 nova_compute[187185]:         <nova:project uuid="16d7af1670ea460db3d0422f176b6f98">tempest-AttachInterfacesTestJSON-844604773</nova:project>
Nov 29 07:14:24 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:14:24 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 07:14:24 compute-0 nova_compute[187185]:         <nova:port uuid="ef96623e-7b97-4117-b30e-902c4b9be2ae">
Nov 29 07:14:24 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:14:24 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:14:24 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:14:24 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <system>
Nov 29 07:14:24 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:14:24 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:14:24 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:14:24 compute-0 nova_compute[187185]:       <entry name="serial">8ca0969d-04fe-43b8-8f05-68f183f888a9</entry>
Nov 29 07:14:24 compute-0 nova_compute[187185]:       <entry name="uuid">8ca0969d-04fe-43b8-8f05-68f183f888a9</entry>
Nov 29 07:14:24 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     </system>
Nov 29 07:14:24 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:14:24 compute-0 nova_compute[187185]:   <os>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:   </os>
Nov 29 07:14:24 compute-0 nova_compute[187185]:   <features>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:   </features>
Nov 29 07:14:24 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:14:24 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:14:24 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:14:24 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/8ca0969d-04fe-43b8-8f05-68f183f888a9/disk"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:14:24 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/8ca0969d-04fe-43b8-8f05-68f183f888a9/disk.config"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:14:24 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:97:62:f8"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:       <target dev="tapef96623e-7b"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:14:24 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/8ca0969d-04fe-43b8-8f05-68f183f888a9/console.log" append="off"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <video>
Nov 29 07:14:24 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     </video>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:14:24 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:14:24 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:14:24 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:14:24 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:14:24 compute-0 nova_compute[187185]: </domain>
Nov 29 07:14:24 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:14:24 compute-0 nova_compute[187185]: 2025-11-29 07:14:24.944 187189 DEBUG nova.compute.manager [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Preparing to wait for external event network-vif-plugged-ef96623e-7b97-4117-b30e-902c4b9be2ae prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 07:14:24 compute-0 nova_compute[187185]: 2025-11-29 07:14:24.944 187189 DEBUG oslo_concurrency.lockutils [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "8ca0969d-04fe-43b8-8f05-68f183f888a9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:14:24 compute-0 nova_compute[187185]: 2025-11-29 07:14:24.945 187189 DEBUG oslo_concurrency.lockutils [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "8ca0969d-04fe-43b8-8f05-68f183f888a9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:14:24 compute-0 nova_compute[187185]: 2025-11-29 07:14:24.945 187189 DEBUG oslo_concurrency.lockutils [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "8ca0969d-04fe-43b8-8f05-68f183f888a9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:14:24 compute-0 nova_compute[187185]: 2025-11-29 07:14:24.946 187189 DEBUG nova.virt.libvirt.vif [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:14:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-575694004',display_name='tempest-AttachInterfacesTestJSON-server-575694004',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-575694004',id=97,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNgGd4Iacs7wY+oKmY26nBMFcX8qOmRS8JGNl6lyYXdVDbe/o53pDlvPJRfgkkFQQMuZj1PL9g22BJFBRRmej3Uv85Ig8WM1/1T7okGrP88moNmlmJpygv+50mQ4/2U61Q==',key_name='tempest-keypair-1636541378',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='16d7af1670ea460db3d0422f176b6f98',ramdisk_id='',reservation_id='r-ysw8l28l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-844604773',owner_user_name='tempest-AttachInterfacesTestJSON-844604773-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:14:11Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9a13f011f5b74a6f94a2d2c8e9104f4a',uuid=8ca0969d-04fe-43b8-8f05-68f183f888a9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ef96623e-7b97-4117-b30e-902c4b9be2ae", "address": "fa:16:3e:97:62:f8", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef96623e-7b", "ovs_interfaceid": "ef96623e-7b97-4117-b30e-902c4b9be2ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:14:24 compute-0 nova_compute[187185]: 2025-11-29 07:14:24.946 187189 DEBUG nova.network.os_vif_util [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converting VIF {"id": "ef96623e-7b97-4117-b30e-902c4b9be2ae", "address": "fa:16:3e:97:62:f8", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef96623e-7b", "ovs_interfaceid": "ef96623e-7b97-4117-b30e-902c4b9be2ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:14:24 compute-0 nova_compute[187185]: 2025-11-29 07:14:24.947 187189 DEBUG nova.network.os_vif_util [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:62:f8,bridge_name='br-int',has_traffic_filtering=True,id=ef96623e-7b97-4117-b30e-902c4b9be2ae,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef96623e-7b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:14:24 compute-0 nova_compute[187185]: 2025-11-29 07:14:24.948 187189 DEBUG os_vif [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:62:f8,bridge_name='br-int',has_traffic_filtering=True,id=ef96623e-7b97-4117-b30e-902c4b9be2ae,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef96623e-7b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:14:24 compute-0 nova_compute[187185]: 2025-11-29 07:14:24.948 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:14:24 compute-0 nova_compute[187185]: 2025-11-29 07:14:24.949 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:14:24 compute-0 nova_compute[187185]: 2025-11-29 07:14:24.949 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:14:24 compute-0 nova_compute[187185]: 2025-11-29 07:14:24.951 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:14:24 compute-0 nova_compute[187185]: 2025-11-29 07:14:24.952 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapef96623e-7b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:14:24 compute-0 nova_compute[187185]: 2025-11-29 07:14:24.952 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapef96623e-7b, col_values=(('external_ids', {'iface-id': 'ef96623e-7b97-4117-b30e-902c4b9be2ae', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:97:62:f8', 'vm-uuid': '8ca0969d-04fe-43b8-8f05-68f183f888a9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:14:24 compute-0 nova_compute[187185]: 2025-11-29 07:14:24.954 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:14:24 compute-0 NetworkManager[55227]: <info>  [1764400464.9551] manager: (tapef96623e-7b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/128)
Nov 29 07:14:24 compute-0 nova_compute[187185]: 2025-11-29 07:14:24.956 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:14:24 compute-0 nova_compute[187185]: 2025-11-29 07:14:24.961 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:14:24 compute-0 nova_compute[187185]: 2025-11-29 07:14:24.962 187189 INFO os_vif [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:62:f8,bridge_name='br-int',has_traffic_filtering=True,id=ef96623e-7b97-4117-b30e-902c4b9be2ae,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef96623e-7b')
Nov 29 07:14:24 compute-0 nova_compute[187185]: 2025-11-29 07:14:24.990 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:14:25 compute-0 nova_compute[187185]: 2025-11-29 07:14:25.343 187189 DEBUG nova.virt.libvirt.driver [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:14:25 compute-0 nova_compute[187185]: 2025-11-29 07:14:25.344 187189 DEBUG nova.virt.libvirt.driver [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:14:25 compute-0 nova_compute[187185]: 2025-11-29 07:14:25.345 187189 DEBUG nova.virt.libvirt.driver [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] No VIF found with MAC fa:16:3e:97:62:f8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:14:25 compute-0 nova_compute[187185]: 2025-11-29 07:14:25.345 187189 INFO nova.virt.libvirt.driver [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Using config drive
Nov 29 07:14:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:14:25.504 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:14:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:14:25.504 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:14:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:14:25.504 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:14:26 compute-0 nova_compute[187185]: 2025-11-29 07:14:26.272 187189 INFO nova.virt.libvirt.driver [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Creating config drive at /var/lib/nova/instances/8ca0969d-04fe-43b8-8f05-68f183f888a9/disk.config
Nov 29 07:14:26 compute-0 nova_compute[187185]: 2025-11-29 07:14:26.281 187189 DEBUG oslo_concurrency.processutils [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8ca0969d-04fe-43b8-8f05-68f183f888a9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4tcrh9oi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:14:26 compute-0 nova_compute[187185]: 2025-11-29 07:14:26.415 187189 DEBUG oslo_concurrency.processutils [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8ca0969d-04fe-43b8-8f05-68f183f888a9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4tcrh9oi" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:14:26 compute-0 kernel: tapef96623e-7b: entered promiscuous mode
Nov 29 07:14:26 compute-0 NetworkManager[55227]: <info>  [1764400466.4878] manager: (tapef96623e-7b): new Tun device (/org/freedesktop/NetworkManager/Devices/129)
Nov 29 07:14:26 compute-0 nova_compute[187185]: 2025-11-29 07:14:26.489 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:14:26 compute-0 ovn_controller[95281]: 2025-11-29T07:14:26Z|00243|binding|INFO|Claiming lport ef96623e-7b97-4117-b30e-902c4b9be2ae for this chassis.
Nov 29 07:14:26 compute-0 ovn_controller[95281]: 2025-11-29T07:14:26Z|00244|binding|INFO|ef96623e-7b97-4117-b30e-902c4b9be2ae: Claiming fa:16:3e:97:62:f8 10.100.0.4
Nov 29 07:14:26 compute-0 nova_compute[187185]: 2025-11-29 07:14:26.495 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:14:26 compute-0 nova_compute[187185]: 2025-11-29 07:14:26.499 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:14:26 compute-0 systemd-udevd[228233]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:14:26 compute-0 systemd-machined[153486]: New machine qemu-35-instance-00000061.
Nov 29 07:14:26 compute-0 NetworkManager[55227]: <info>  [1764400466.5455] device (tapef96623e-7b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:14:26 compute-0 NetworkManager[55227]: <info>  [1764400466.5467] device (tapef96623e-7b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:14:26 compute-0 systemd[1]: Started Virtual Machine qemu-35-instance-00000061.
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:14:26.573 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:97:62:f8 10.100.0.4'], port_security=['fa:16:3e:97:62:f8 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-90812230-35cb-4e21-b16b-75b900100d8b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '16d7af1670ea460db3d0422f176b6f98', 'neutron:revision_number': '2', 'neutron:security_group_ids': '751306cd-ebe9-4aad-803e-19aae2b7594e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=41b9bfbf-a9b3-4bdb-9144-e5db6a660517, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=ef96623e-7b97-4117-b30e-902c4b9be2ae) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:14:26.575 104254 INFO neutron.agent.ovn.metadata.agent [-] Port ef96623e-7b97-4117-b30e-902c4b9be2ae in datapath 90812230-35cb-4e21-b16b-75b900100d8b bound to our chassis
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:14:26.577 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 90812230-35cb-4e21-b16b-75b900100d8b
Nov 29 07:14:26 compute-0 nova_compute[187185]: 2025-11-29 07:14:26.580 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:14:26 compute-0 ovn_controller[95281]: 2025-11-29T07:14:26Z|00245|binding|INFO|Setting lport ef96623e-7b97-4117-b30e-902c4b9be2ae ovn-installed in OVS
Nov 29 07:14:26 compute-0 ovn_controller[95281]: 2025-11-29T07:14:26Z|00246|binding|INFO|Setting lport ef96623e-7b97-4117-b30e-902c4b9be2ae up in Southbound
Nov 29 07:14:26 compute-0 nova_compute[187185]: 2025-11-29 07:14:26.586 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:14:26.591 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[9aead324-8579-49c5-b063-30404fbae510]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:14:26.592 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap90812230-31 in ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:14:26.595 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap90812230-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:14:26.595 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[4665406d-3f0b-4ddf-9c59-e947ff408622]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:14:26.596 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e2ae054d-e9af-4697-aee7-8ac65d446034]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:14:26.609 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[29a569f2-13df-4375-87e2-3699adecfcba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:14:26.634 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[87cc9653-5443-4d86-a841-77a4154b1d0b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:14:26.671 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[7fefe02c-2b85-4029-a70c-03a6fdcef5e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:14:26 compute-0 NetworkManager[55227]: <info>  [1764400466.6786] manager: (tap90812230-30): new Veth device (/org/freedesktop/NetworkManager/Devices/130)
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:14:26.677 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[bd8e0332-aa6c-437f-bca2-f2292240c79b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:14:26.722 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[d30ef93a-3bdc-489b-9166-172044df6f0a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:14:26.727 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[3f74f801-2df3-47a4-a54a-4b3cb270cf7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:14:26 compute-0 NetworkManager[55227]: <info>  [1764400466.7579] device (tap90812230-30): carrier: link connected
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:14:26.764 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[ed5fd552-2c6a-4b02-9eb2-cfe2eee87d51]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:14:26.788 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[a8d50122-5642-49ef-963c-477a02a3cd96]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap90812230-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e4:5f:07'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 81], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 591442, 'reachable_time': 18293, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228267, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:14:26.807 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[cafa24eb-845a-45d4-a92d-4f4b0e5cba4d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee4:5f07'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 591442, 'tstamp': 591442}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228268, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:14:26.830 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d9f3ab8c-6311-4083-8213-403aef0914f3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap90812230-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e4:5f:07'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 81], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 591442, 'reachable_time': 18293, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228269, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:14:26.862 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e51322d4-7d83-4ff0-9f11-ab360dd981d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:14:26.930 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d6521d30-1145-4ffb-8277-b1f72c677994]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:14:26.932 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap90812230-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:14:26.932 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:14:26.933 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap90812230-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:14:26 compute-0 nova_compute[187185]: 2025-11-29 07:14:26.935 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:14:26 compute-0 NetworkManager[55227]: <info>  [1764400466.9362] manager: (tap90812230-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/131)
Nov 29 07:14:26 compute-0 kernel: tap90812230-30: entered promiscuous mode
Nov 29 07:14:26 compute-0 nova_compute[187185]: 2025-11-29 07:14:26.938 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:14:26.945 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap90812230-30, col_values=(('external_ids', {'iface-id': '71b1ea47-55d6-453c-a181-e6370c4f7968'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:14:26 compute-0 nova_compute[187185]: 2025-11-29 07:14:26.946 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:14:26 compute-0 ovn_controller[95281]: 2025-11-29T07:14:26Z|00247|binding|INFO|Releasing lport 71b1ea47-55d6-453c-a181-e6370c4f7968 from this chassis (sb_readonly=0)
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:14:26.950 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/90812230-35cb-4e21-b16b-75b900100d8b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/90812230-35cb-4e21-b16b-75b900100d8b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:14:26.956 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[797287c7-4c5f-4b3e-8fb6-f9a3af9f9ce7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:14:26 compute-0 nova_compute[187185]: 2025-11-29 07:14:26.957 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:14:26.958 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-90812230-35cb-4e21-b16b-75b900100d8b
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/90812230-35cb-4e21-b16b-75b900100d8b.pid.haproxy
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID 90812230-35cb-4e21-b16b-75b900100d8b
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:14:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:14:26.959 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'env', 'PROCESS_TAG=haproxy-90812230-35cb-4e21-b16b-75b900100d8b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/90812230-35cb-4e21-b16b-75b900100d8b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:14:27 compute-0 nova_compute[187185]: 2025-11-29 07:14:27.152 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400467.1505241, 8ca0969d-04fe-43b8-8f05-68f183f888a9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:14:27 compute-0 nova_compute[187185]: 2025-11-29 07:14:27.153 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] VM Started (Lifecycle Event)
Nov 29 07:14:27 compute-0 nova_compute[187185]: 2025-11-29 07:14:27.158 187189 DEBUG nova.network.neutron [req-b9bb8483-bb76-43c4-8229-ddbb540b6a48 req-2fc282b4-0812-43d9-9d09-fd099224c445 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Updated VIF entry in instance network info cache for port ef96623e-7b97-4117-b30e-902c4b9be2ae. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:14:27 compute-0 nova_compute[187185]: 2025-11-29 07:14:27.159 187189 DEBUG nova.network.neutron [req-b9bb8483-bb76-43c4-8229-ddbb540b6a48 req-2fc282b4-0812-43d9-9d09-fd099224c445 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Updating instance_info_cache with network_info: [{"id": "ef96623e-7b97-4117-b30e-902c4b9be2ae", "address": "fa:16:3e:97:62:f8", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef96623e-7b", "ovs_interfaceid": "ef96623e-7b97-4117-b30e-902c4b9be2ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:14:27 compute-0 nova_compute[187185]: 2025-11-29 07:14:27.236 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:14:27 compute-0 nova_compute[187185]: 2025-11-29 07:14:27.242 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400467.1510038, 8ca0969d-04fe-43b8-8f05-68f183f888a9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:14:27 compute-0 nova_compute[187185]: 2025-11-29 07:14:27.243 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] VM Paused (Lifecycle Event)
Nov 29 07:14:27 compute-0 nova_compute[187185]: 2025-11-29 07:14:27.248 187189 DEBUG oslo_concurrency.lockutils [req-b9bb8483-bb76-43c4-8229-ddbb540b6a48 req-2fc282b4-0812-43d9-9d09-fd099224c445 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-8ca0969d-04fe-43b8-8f05-68f183f888a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:14:27 compute-0 nova_compute[187185]: 2025-11-29 07:14:27.349 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:14:27 compute-0 nova_compute[187185]: 2025-11-29 07:14:27.356 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:14:27 compute-0 nova_compute[187185]: 2025-11-29 07:14:27.382 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:14:27 compute-0 podman[228308]: 2025-11-29 07:14:27.458110386 +0000 UTC m=+0.072424879 container create 6468a1df4bc8cf685f4cd7d65dafc022a082d67fb850e968c687a979106f425f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 07:14:27 compute-0 systemd[1]: Started libpod-conmon-6468a1df4bc8cf685f4cd7d65dafc022a082d67fb850e968c687a979106f425f.scope.
Nov 29 07:14:27 compute-0 podman[228308]: 2025-11-29 07:14:27.422747556 +0000 UTC m=+0.037062079 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:14:27 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:14:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d980344e8eb69823235e0cc58f5056209472fe6ba8eeb1fe84e65b7dd9493a55/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:14:27 compute-0 podman[228308]: 2025-11-29 07:14:27.562855188 +0000 UTC m=+0.177169701 container init 6468a1df4bc8cf685f4cd7d65dafc022a082d67fb850e968c687a979106f425f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 29 07:14:27 compute-0 podman[228308]: 2025-11-29 07:14:27.56965553 +0000 UTC m=+0.183970013 container start 6468a1df4bc8cf685f4cd7d65dafc022a082d67fb850e968c687a979106f425f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:14:27 compute-0 neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b[228324]: [NOTICE]   (228328) : New worker (228330) forked
Nov 29 07:14:27 compute-0 neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b[228324]: [NOTICE]   (228328) : Loading success.
Nov 29 07:14:28 compute-0 nova_compute[187185]: 2025-11-29 07:14:28.128 187189 DEBUG nova.compute.manager [req-1c6831fa-fc73-4820-84c8-57d2dfed76a0 req-440e58ed-1832-4e78-b485-31e754577304 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Received event network-vif-plugged-ef96623e-7b97-4117-b30e-902c4b9be2ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:14:28 compute-0 nova_compute[187185]: 2025-11-29 07:14:28.129 187189 DEBUG oslo_concurrency.lockutils [req-1c6831fa-fc73-4820-84c8-57d2dfed76a0 req-440e58ed-1832-4e78-b485-31e754577304 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "8ca0969d-04fe-43b8-8f05-68f183f888a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:14:28 compute-0 nova_compute[187185]: 2025-11-29 07:14:28.129 187189 DEBUG oslo_concurrency.lockutils [req-1c6831fa-fc73-4820-84c8-57d2dfed76a0 req-440e58ed-1832-4e78-b485-31e754577304 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "8ca0969d-04fe-43b8-8f05-68f183f888a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:14:28 compute-0 nova_compute[187185]: 2025-11-29 07:14:28.129 187189 DEBUG oslo_concurrency.lockutils [req-1c6831fa-fc73-4820-84c8-57d2dfed76a0 req-440e58ed-1832-4e78-b485-31e754577304 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "8ca0969d-04fe-43b8-8f05-68f183f888a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:14:28 compute-0 nova_compute[187185]: 2025-11-29 07:14:28.130 187189 DEBUG nova.compute.manager [req-1c6831fa-fc73-4820-84c8-57d2dfed76a0 req-440e58ed-1832-4e78-b485-31e754577304 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Processing event network-vif-plugged-ef96623e-7b97-4117-b30e-902c4b9be2ae _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 07:14:28 compute-0 nova_compute[187185]: 2025-11-29 07:14:28.131 187189 DEBUG nova.compute.manager [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:14:28 compute-0 nova_compute[187185]: 2025-11-29 07:14:28.136 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400468.135991, 8ca0969d-04fe-43b8-8f05-68f183f888a9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:14:28 compute-0 nova_compute[187185]: 2025-11-29 07:14:28.136 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] VM Resumed (Lifecycle Event)
Nov 29 07:14:28 compute-0 nova_compute[187185]: 2025-11-29 07:14:28.138 187189 DEBUG nova.virt.libvirt.driver [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 07:14:28 compute-0 nova_compute[187185]: 2025-11-29 07:14:28.142 187189 INFO nova.virt.libvirt.driver [-] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Instance spawned successfully.
Nov 29 07:14:28 compute-0 nova_compute[187185]: 2025-11-29 07:14:28.142 187189 DEBUG nova.virt.libvirt.driver [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 07:14:28 compute-0 nova_compute[187185]: 2025-11-29 07:14:28.574 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:14:28 compute-0 nova_compute[187185]: 2025-11-29 07:14:28.581 187189 DEBUG nova.virt.libvirt.driver [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:14:28 compute-0 nova_compute[187185]: 2025-11-29 07:14:28.582 187189 DEBUG nova.virt.libvirt.driver [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:14:28 compute-0 nova_compute[187185]: 2025-11-29 07:14:28.582 187189 DEBUG nova.virt.libvirt.driver [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:14:28 compute-0 nova_compute[187185]: 2025-11-29 07:14:28.583 187189 DEBUG nova.virt.libvirt.driver [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:14:28 compute-0 nova_compute[187185]: 2025-11-29 07:14:28.583 187189 DEBUG nova.virt.libvirt.driver [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:14:28 compute-0 nova_compute[187185]: 2025-11-29 07:14:28.584 187189 DEBUG nova.virt.libvirt.driver [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:14:28 compute-0 nova_compute[187185]: 2025-11-29 07:14:28.592 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:14:28 compute-0 nova_compute[187185]: 2025-11-29 07:14:28.751 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:14:28 compute-0 nova_compute[187185]: 2025-11-29 07:14:28.952 187189 INFO nova.compute.manager [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Took 16.95 seconds to spawn the instance on the hypervisor.
Nov 29 07:14:28 compute-0 nova_compute[187185]: 2025-11-29 07:14:28.953 187189 DEBUG nova.compute.manager [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:14:29 compute-0 nova_compute[187185]: 2025-11-29 07:14:29.111 187189 INFO nova.compute.manager [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Took 19.73 seconds to build instance.
Nov 29 07:14:29 compute-0 nova_compute[187185]: 2025-11-29 07:14:29.134 187189 DEBUG oslo_concurrency.lockutils [None req-2745cbec-2bfb-4914-9022-7e3e1d4373c6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "8ca0969d-04fe-43b8-8f05-68f183f888a9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.994s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:14:29 compute-0 nova_compute[187185]: 2025-11-29 07:14:29.956 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:14:29 compute-0 nova_compute[187185]: 2025-11-29 07:14:29.994 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:14:30 compute-0 podman[228341]: 2025-11-29 07:14:30.810192441 +0000 UTC m=+0.062062646 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 07:14:30 compute-0 podman[228340]: 2025-11-29 07:14:30.822967702 +0000 UTC m=+0.078491980 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=edpm, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal)
Nov 29 07:14:30 compute-0 podman[228339]: 2025-11-29 07:14:30.833966343 +0000 UTC m=+0.092374653 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 07:14:30 compute-0 nova_compute[187185]: 2025-11-29 07:14:30.944 187189 DEBUG nova.compute.manager [req-58c1c5a6-b7d2-4551-8dad-b8c73aa679d6 req-188ddc9c-cca8-4ed6-b555-95163af41c22 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Received event network-vif-plugged-ef96623e-7b97-4117-b30e-902c4b9be2ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:14:30 compute-0 nova_compute[187185]: 2025-11-29 07:14:30.944 187189 DEBUG oslo_concurrency.lockutils [req-58c1c5a6-b7d2-4551-8dad-b8c73aa679d6 req-188ddc9c-cca8-4ed6-b555-95163af41c22 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "8ca0969d-04fe-43b8-8f05-68f183f888a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:14:30 compute-0 nova_compute[187185]: 2025-11-29 07:14:30.944 187189 DEBUG oslo_concurrency.lockutils [req-58c1c5a6-b7d2-4551-8dad-b8c73aa679d6 req-188ddc9c-cca8-4ed6-b555-95163af41c22 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "8ca0969d-04fe-43b8-8f05-68f183f888a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:14:30 compute-0 nova_compute[187185]: 2025-11-29 07:14:30.945 187189 DEBUG oslo_concurrency.lockutils [req-58c1c5a6-b7d2-4551-8dad-b8c73aa679d6 req-188ddc9c-cca8-4ed6-b555-95163af41c22 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "8ca0969d-04fe-43b8-8f05-68f183f888a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:14:30 compute-0 nova_compute[187185]: 2025-11-29 07:14:30.945 187189 DEBUG nova.compute.manager [req-58c1c5a6-b7d2-4551-8dad-b8c73aa679d6 req-188ddc9c-cca8-4ed6-b555-95163af41c22 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] No waiting events found dispatching network-vif-plugged-ef96623e-7b97-4117-b30e-902c4b9be2ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:14:30 compute-0 nova_compute[187185]: 2025-11-29 07:14:30.945 187189 WARNING nova.compute.manager [req-58c1c5a6-b7d2-4551-8dad-b8c73aa679d6 req-188ddc9c-cca8-4ed6-b555-95163af41c22 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Received unexpected event network-vif-plugged-ef96623e-7b97-4117-b30e-902c4b9be2ae for instance with vm_state active and task_state None.
Nov 29 07:14:34 compute-0 NetworkManager[55227]: <info>  [1764400474.8411] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/132)
Nov 29 07:14:34 compute-0 NetworkManager[55227]: <info>  [1764400474.8421] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/133)
Nov 29 07:14:34 compute-0 nova_compute[187185]: 2025-11-29 07:14:34.842 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:14:35 compute-0 nova_compute[187185]: 2025-11-29 07:14:35.000 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:14:35 compute-0 nova_compute[187185]: 2025-11-29 07:14:35.029 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:14:35 compute-0 ovn_controller[95281]: 2025-11-29T07:14:35Z|00248|binding|INFO|Releasing lport 71b1ea47-55d6-453c-a181-e6370c4f7968 from this chassis (sb_readonly=0)
Nov 29 07:14:35 compute-0 nova_compute[187185]: 2025-11-29 07:14:35.063 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:14:35 compute-0 nova_compute[187185]: 2025-11-29 07:14:35.498 187189 DEBUG nova.compute.manager [req-1d55ec4e-f213-4d01-8282-8a38df371137 req-4dc49098-3da1-47a8-abe8-e19375a5bb35 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Received event network-changed-ef96623e-7b97-4117-b30e-902c4b9be2ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:14:35 compute-0 nova_compute[187185]: 2025-11-29 07:14:35.499 187189 DEBUG nova.compute.manager [req-1d55ec4e-f213-4d01-8282-8a38df371137 req-4dc49098-3da1-47a8-abe8-e19375a5bb35 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Refreshing instance network info cache due to event network-changed-ef96623e-7b97-4117-b30e-902c4b9be2ae. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:14:35 compute-0 nova_compute[187185]: 2025-11-29 07:14:35.499 187189 DEBUG oslo_concurrency.lockutils [req-1d55ec4e-f213-4d01-8282-8a38df371137 req-4dc49098-3da1-47a8-abe8-e19375a5bb35 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-8ca0969d-04fe-43b8-8f05-68f183f888a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:14:35 compute-0 nova_compute[187185]: 2025-11-29 07:14:35.500 187189 DEBUG oslo_concurrency.lockutils [req-1d55ec4e-f213-4d01-8282-8a38df371137 req-4dc49098-3da1-47a8-abe8-e19375a5bb35 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-8ca0969d-04fe-43b8-8f05-68f183f888a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:14:35 compute-0 nova_compute[187185]: 2025-11-29 07:14:35.500 187189 DEBUG nova.network.neutron [req-1d55ec4e-f213-4d01-8282-8a38df371137 req-4dc49098-3da1-47a8-abe8-e19375a5bb35 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Refreshing network info cache for port ef96623e-7b97-4117-b30e-902c4b9be2ae _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:14:38 compute-0 nova_compute[187185]: 2025-11-29 07:14:38.826 187189 DEBUG nova.network.neutron [req-1d55ec4e-f213-4d01-8282-8a38df371137 req-4dc49098-3da1-47a8-abe8-e19375a5bb35 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Updated VIF entry in instance network info cache for port ef96623e-7b97-4117-b30e-902c4b9be2ae. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:14:38 compute-0 nova_compute[187185]: 2025-11-29 07:14:38.828 187189 DEBUG nova.network.neutron [req-1d55ec4e-f213-4d01-8282-8a38df371137 req-4dc49098-3da1-47a8-abe8-e19375a5bb35 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Updating instance_info_cache with network_info: [{"id": "ef96623e-7b97-4117-b30e-902c4b9be2ae", "address": "fa:16:3e:97:62:f8", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef96623e-7b", "ovs_interfaceid": "ef96623e-7b97-4117-b30e-902c4b9be2ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:14:38 compute-0 nova_compute[187185]: 2025-11-29 07:14:38.860 187189 DEBUG oslo_concurrency.lockutils [req-1d55ec4e-f213-4d01-8282-8a38df371137 req-4dc49098-3da1-47a8-abe8-e19375a5bb35 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-8ca0969d-04fe-43b8-8f05-68f183f888a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:14:40 compute-0 nova_compute[187185]: 2025-11-29 07:14:40.004 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:14:40 compute-0 nova_compute[187185]: 2025-11-29 07:14:40.032 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:14:40 compute-0 podman[228426]: 2025-11-29 07:14:40.837020045 +0000 UTC m=+0.104623849 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:14:41 compute-0 ovn_controller[95281]: 2025-11-29T07:14:41Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:97:62:f8 10.100.0.4
Nov 29 07:14:41 compute-0 ovn_controller[95281]: 2025-11-29T07:14:41Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:97:62:f8 10.100.0.4
Nov 29 07:14:45 compute-0 nova_compute[187185]: 2025-11-29 07:14:45.009 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:14:45 compute-0 nova_compute[187185]: 2025-11-29 07:14:45.034 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:14:46 compute-0 podman[228454]: 2025-11-29 07:14:46.828381319 +0000 UTC m=+0.079748305 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 07:14:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:47.996 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '8ca0969d-04fe-43b8-8f05-68f183f888a9', 'name': 'tempest-AttachInterfacesTestJSON-server-575694004', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000061', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '16d7af1670ea460db3d0422f176b6f98', 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'hostId': '957f2340bb87d7939335b7c5e04a65312f0f7b00c99bcc21b496eaee', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 07:14:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:47.998 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.001 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 8ca0969d-04fe-43b8-8f05-68f183f888a9 / tapef96623e-7b inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.002 12 DEBUG ceilometer.compute.pollsters [-] 8ca0969d-04fe-43b8-8f05-68f183f888a9/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '128ebea9-99c3-4c02-a3d8-30619304a0f9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-00000061-8ca0969d-04fe-43b8-8f05-68f183f888a9-tapef96623e-7b', 'timestamp': '2025-11-29T07:14:47.998518', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-575694004', 'name': 'tapef96623e-7b', 'instance_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9', 'instance_type': 'm1.nano', 'host': '957f2340bb87d7939335b7c5e04a65312f0f7b00c99bcc21b496eaee', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:97:62:f8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapef96623e-7b'}, 'message_id': '16cb4e7c-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 5935.716765069, 'message_signature': 'dd0e419ae72edcfed65f045a00c6d3f51f3dfd875496eca75e67e1573ce0603e'}]}, 'timestamp': '2025-11-29 07:14:48.003130', '_unique_id': 'af382228b2924f0081212d662dddaedf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.005 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.007 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.042 12 DEBUG ceilometer.compute.pollsters [-] 8ca0969d-04fe-43b8-8f05-68f183f888a9/disk.device.read.requests volume: 1105 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.042 12 DEBUG ceilometer.compute.pollsters [-] 8ca0969d-04fe-43b8-8f05-68f183f888a9/disk.device.read.requests volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e65321fd-ac4d-48ee-a3a4-7d8bdbeded73', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1105, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9-vda', 'timestamp': '2025-11-29T07:14:48.007674', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-575694004', 'name': 'instance-00000061', 'instance_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9', 'instance_type': 'm1.nano', 'host': '957f2340bb87d7939335b7c5e04a65312f0f7b00c99bcc21b496eaee', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '16d1658c-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 5935.725954989, 'message_signature': '2c5e74e80612189c6bf84609d5d67119bc510fb82da21ea1acdf4b7a47a40fe5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 120, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9-sda', 'timestamp': '2025-11-29T07:14:48.007674', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-575694004', 'name': 'instance-00000061', 'instance_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9', 'instance_type': 'm1.nano', 'host': '957f2340bb87d7939335b7c5e04a65312f0f7b00c99bcc21b496eaee', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '16d17432-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 5935.725954989, 'message_signature': '6dfc4ed3519b3c910e327cffcd765af1cac8623d7929ff7326ecf54008ee56a6'}]}, 'timestamp': '2025-11-29 07:14:48.043142', '_unique_id': '4100cc161f2046c0b8f3c1baa3d29991'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.044 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.045 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.045 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.045 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-AttachInterfacesTestJSON-server-575694004>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-AttachInterfacesTestJSON-server-575694004>]
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.045 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.046 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.046 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-AttachInterfacesTestJSON-server-575694004>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-AttachInterfacesTestJSON-server-575694004>]
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.046 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.046 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.046 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-AttachInterfacesTestJSON-server-575694004>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-AttachInterfacesTestJSON-server-575694004>]
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.046 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.059 12 DEBUG ceilometer.compute.pollsters [-] 8ca0969d-04fe-43b8-8f05-68f183f888a9/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.060 12 DEBUG ceilometer.compute.pollsters [-] 8ca0969d-04fe-43b8-8f05-68f183f888a9/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2d5086b4-2c1c-4939-a34c-d52ab0d17085', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9-vda', 'timestamp': '2025-11-29T07:14:48.046851', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-575694004', 'name': 'instance-00000061', 'instance_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9', 'instance_type': 'm1.nano', 'host': '957f2340bb87d7939335b7c5e04a65312f0f7b00c99bcc21b496eaee', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '16d416f6-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 5935.765084146, 'message_signature': 'e73904d58813078f7b265c85cfde631f286fb13e859ec2ffe16cf3a2583eb627'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9-sda', 'timestamp': '2025-11-29T07:14:48.046851', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-575694004', 'name': 'instance-00000061', 'instance_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9', 'instance_type': 'm1.nano', 'host': '957f2340bb87d7939335b7c5e04a65312f0f7b00c99bcc21b496eaee', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '16d422fe-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 5935.765084146, 'message_signature': 'd776fd3fc4c30091a99c81b5e0aeda6620e9690df383cac980ba8da8a4915441'}]}, 'timestamp': '2025-11-29 07:14:48.060707', '_unique_id': '2c6d59290e1547e39e2ac2cf0702a308'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.061 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.062 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.062 12 DEBUG ceilometer.compute.pollsters [-] 8ca0969d-04fe-43b8-8f05-68f183f888a9/disk.device.write.bytes volume: 72916992 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.062 12 DEBUG ceilometer.compute.pollsters [-] 8ca0969d-04fe-43b8-8f05-68f183f888a9/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a92c83ab-7270-4919-b0a8-32a892fabde5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72916992, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9-vda', 'timestamp': '2025-11-29T07:14:48.062627', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-575694004', 'name': 'instance-00000061', 'instance_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9', 'instance_type': 'm1.nano', 'host': '957f2340bb87d7939335b7c5e04a65312f0f7b00c99bcc21b496eaee', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '16d479de-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 5935.725954989, 'message_signature': 'e96eeceb3484bdab5479224fc33274f205012fd908fd9c361efa32ae81a6c0de'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 
'resource_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9-sda', 'timestamp': '2025-11-29T07:14:48.062627', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-575694004', 'name': 'instance-00000061', 'instance_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9', 'instance_type': 'm1.nano', 'host': '957f2340bb87d7939335b7c5e04a65312f0f7b00c99bcc21b496eaee', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '16d48410-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 5935.725954989, 'message_signature': 'c68f84a6a80761cb805b99c1ac1e594b0e28e818d223e9b21d905f8d8d5dda1e'}]}, 'timestamp': '2025-11-29 07:14:48.063181', '_unique_id': '7264594ce8844477b1171ae91464de35'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.063 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.064 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.064 12 DEBUG ceilometer.compute.pollsters [-] 8ca0969d-04fe-43b8-8f05-68f183f888a9/disk.device.read.latency volume: 207160168 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.064 12 DEBUG ceilometer.compute.pollsters [-] 8ca0969d-04fe-43b8-8f05-68f183f888a9/disk.device.read.latency volume: 27417669 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c56c9410-da0b-499f-9e5f-96241e78becd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 207160168, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9-vda', 'timestamp': '2025-11-29T07:14:48.064654', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-575694004', 'name': 'instance-00000061', 'instance_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9', 'instance_type': 'm1.nano', 'host': '957f2340bb87d7939335b7c5e04a65312f0f7b00c99bcc21b496eaee', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '16d4c7fe-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 5935.725954989, 'message_signature': '33a22f8d07069d747953fa4c25ac5420c43b5872f899e8ec441e8572d98fa895'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 27417669, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': 
None, 'resource_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9-sda', 'timestamp': '2025-11-29T07:14:48.064654', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-575694004', 'name': 'instance-00000061', 'instance_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9', 'instance_type': 'm1.nano', 'host': '957f2340bb87d7939335b7c5e04a65312f0f7b00c99bcc21b496eaee', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '16d4d212-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 5935.725954989, 'message_signature': 'd38e68505f29f8b892cbd5d65cfc93dfadc176ce60884d718faf0cbc9c2024b2'}]}, 'timestamp': '2025-11-29 07:14:48.065173', '_unique_id': '4c08ece7642d4b2c94a463c3d26a9b13'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.065 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.066 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.066 12 DEBUG ceilometer.compute.pollsters [-] 8ca0969d-04fe-43b8-8f05-68f183f888a9/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b6a2bab8-23e5-49a6-a3a8-a43a22eb76b0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-00000061-8ca0969d-04fe-43b8-8f05-68f183f888a9-tapef96623e-7b', 'timestamp': '2025-11-29T07:14:48.066646', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-575694004', 'name': 'tapef96623e-7b', 'instance_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9', 'instance_type': 'm1.nano', 'host': '957f2340bb87d7939335b7c5e04a65312f0f7b00c99bcc21b496eaee', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:97:62:f8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapef96623e-7b'}, 'message_id': '16d5159c-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 5935.716765069, 'message_signature': '3267cdf98d24608c01055dff2d82f0a0c28b653ba2ed3fef2ad80973d06de0e8'}]}, 'timestamp': '2025-11-29 07:14:48.066934', '_unique_id': 'b52935a8712541f49452b6e094428cb5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.067 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.068 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.068 12 DEBUG ceilometer.compute.pollsters [-] 8ca0969d-04fe-43b8-8f05-68f183f888a9/network.incoming.bytes volume: 1648 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8599e5da-fb56-4b4f-99c3-5502d37d1e8c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1648, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-00000061-8ca0969d-04fe-43b8-8f05-68f183f888a9-tapef96623e-7b', 'timestamp': '2025-11-29T07:14:48.068326', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-575694004', 'name': 'tapef96623e-7b', 'instance_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9', 'instance_type': 'm1.nano', 'host': '957f2340bb87d7939335b7c5e04a65312f0f7b00c99bcc21b496eaee', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:97:62:f8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapef96623e-7b'}, 'message_id': '16d5571e-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 5935.716765069, 'message_signature': 'cc96ccf714e5366f76ff27f5b5cd5b475fb2c85be61558ebb20039c89e8eb1e3'}]}, 'timestamp': '2025-11-29 07:14:48.068591', '_unique_id': '619af4014efb4e1cb67bc119aebe32da'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.069 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.088 12 DEBUG ceilometer.compute.pollsters [-] 8ca0969d-04fe-43b8-8f05-68f183f888a9/cpu volume: 11850000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '243da0e9-343c-489d-b0bf-8eaf473915d1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11850000000, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9', 'timestamp': '2025-11-29T07:14:48.070061', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-575694004', 'name': 'instance-00000061', 'instance_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9', 'instance_type': 'm1.nano', 'host': '957f2340bb87d7939335b7c5e04a65312f0f7b00c99bcc21b496eaee', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '16d882c2-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 5935.806940249, 'message_signature': '9ce7336d6eb2f89091c142ef0d50b700b1b78a1c17249928b6e615b99785e5e7'}]}, 'timestamp': '2025-11-29 07:14:48.089442', '_unique_id': '1fe00937b8c146b9861925daf4281484'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.090 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.091 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.091 12 DEBUG ceilometer.compute.pollsters [-] 8ca0969d-04fe-43b8-8f05-68f183f888a9/disk.device.write.requests volume: 299 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.091 12 DEBUG ceilometer.compute.pollsters [-] 8ca0969d-04fe-43b8-8f05-68f183f888a9/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '41d8269c-c3e5-42c9-b0f4-9610368ebc07', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 299, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9-vda', 'timestamp': '2025-11-29T07:14:48.091439', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-575694004', 'name': 'instance-00000061', 'instance_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9', 'instance_type': 'm1.nano', 'host': '957f2340bb87d7939335b7c5e04a65312f0f7b00c99bcc21b496eaee', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '16d8deac-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 5935.725954989, 'message_signature': '6adb313f64b3717e53ea5fd94c6cf56e5ce9bc0ff6c307b02d18640f0b8875d3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9-sda', 'timestamp': '2025-11-29T07:14:48.091439', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-575694004', 'name': 'instance-00000061', 'instance_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9', 'instance_type': 'm1.nano', 'host': '957f2340bb87d7939335b7c5e04a65312f0f7b00c99bcc21b496eaee', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '16d8e974-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 5935.725954989, 'message_signature': 'de084f47d3c41e53e362c4210c7649a484b611f9034e9b1e1a33b43a8f836c36'}]}, 'timestamp': '2025-11-29 07:14:48.091989', '_unique_id': 'e5c52769a99542c1b2cc04569ca68c8e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.092 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.093 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.093 12 DEBUG ceilometer.compute.pollsters [-] 8ca0969d-04fe-43b8-8f05-68f183f888a9/memory.usage volume: 40.37890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '14e8c86a-ab48-4add-9ef2-a27d642c357c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.37890625, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9', 'timestamp': '2025-11-29T07:14:48.093466', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-575694004', 'name': 'instance-00000061', 'instance_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9', 'instance_type': 'm1.nano', 'host': '957f2340bb87d7939335b7c5e04a65312f0f7b00c99bcc21b496eaee', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '16d92d80-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 5935.806940249, 'message_signature': '28fbfc17f8d9864144067f7d176222649ff98881c11259190159ff310ef395c2'}]}, 'timestamp': '2025-11-29 07:14:48.093738', '_unique_id': '04c534f70dd1455d89c1296f5786c74b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.094 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.095 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.095 12 DEBUG ceilometer.compute.pollsters [-] 8ca0969d-04fe-43b8-8f05-68f183f888a9/network.incoming.packets volume: 13 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8cfe9c01-11fa-4d23-8d59-50b4f290bb86', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 13, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-00000061-8ca0969d-04fe-43b8-8f05-68f183f888a9-tapef96623e-7b', 'timestamp': '2025-11-29T07:14:48.095728', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-575694004', 'name': 'tapef96623e-7b', 'instance_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9', 'instance_type': 'm1.nano', 'host': '957f2340bb87d7939335b7c5e04a65312f0f7b00c99bcc21b496eaee', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:97:62:f8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapef96623e-7b'}, 'message_id': '16d98f00-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 5935.716765069, 'message_signature': '985d4f43d992c74d2f2a27c55a3117669ea6e7232998ff17e4b2564b8349dab6'}]}, 'timestamp': '2025-11-29 07:14:48.096444', '_unique_id': '7a664c72491c47e29607bf2d89f3112d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.097 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.099 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.099 12 DEBUG ceilometer.compute.pollsters [-] 8ca0969d-04fe-43b8-8f05-68f183f888a9/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '73491c1e-f690-4187-967e-a7b759957396', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-00000061-8ca0969d-04fe-43b8-8f05-68f183f888a9-tapef96623e-7b', 'timestamp': '2025-11-29T07:14:48.099865', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-575694004', 'name': 'tapef96623e-7b', 'instance_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9', 'instance_type': 'm1.nano', 'host': '957f2340bb87d7939335b7c5e04a65312f0f7b00c99bcc21b496eaee', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:97:62:f8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapef96623e-7b'}, 'message_id': '16da2faa-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 5935.716765069, 'message_signature': '2ae96654142bc9fc76cbd5c6f23a9df977ea6c3f111d4bbdc7782dfbd3562a73'}]}, 'timestamp': '2025-11-29 07:14:48.100500', '_unique_id': 'cc12c4c16cb54d8a93505700e04e43d0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.101 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.102 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.103 12 DEBUG ceilometer.compute.pollsters [-] 8ca0969d-04fe-43b8-8f05-68f183f888a9/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.103 12 DEBUG ceilometer.compute.pollsters [-] 8ca0969d-04fe-43b8-8f05-68f183f888a9/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9df5d5c6-53f3-4b37-87dd-0e77ad9f5f64', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9-vda', 'timestamp': '2025-11-29T07:14:48.103057', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-575694004', 'name': 'instance-00000061', 'instance_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9', 'instance_type': 'm1.nano', 'host': '957f2340bb87d7939335b7c5e04a65312f0f7b00c99bcc21b496eaee', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '16daa804-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 5935.765084146, 'message_signature': '1a5b6ce889e2709a04fefbf73fb6f723beeea4b11b012e34d8f9115f4ee34767'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9-sda', 'timestamp': '2025-11-29T07:14:48.103057', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-575694004', 'name': 'instance-00000061', 'instance_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9', 'instance_type': 'm1.nano', 'host': '957f2340bb87d7939335b7c5e04a65312f0f7b00c99bcc21b496eaee', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '16dab9fc-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 5935.765084146, 'message_signature': '3e1c5315a2cfac657458ea338bf68e9ff6b7873c51e96da840ee38062b5537df'}]}, 'timestamp': '2025-11-29 07:14:48.104043', '_unique_id': 'c5f42a8f5bb94129af9e2f815649a3aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.105 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.106 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.106 12 DEBUG ceilometer.compute.pollsters [-] 8ca0969d-04fe-43b8-8f05-68f183f888a9/network.outgoing.packets volume: 13 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '241afe79-c447-4de5-b3fd-dfaaeba54b6f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 13, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-00000061-8ca0969d-04fe-43b8-8f05-68f183f888a9-tapef96623e-7b', 'timestamp': '2025-11-29T07:14:48.106564', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-575694004', 'name': 'tapef96623e-7b', 'instance_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9', 'instance_type': 'm1.nano', 'host': '957f2340bb87d7939335b7c5e04a65312f0f7b00c99bcc21b496eaee', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:97:62:f8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapef96623e-7b'}, 'message_id': '16db3274-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 5935.716765069, 'message_signature': 'f1b5304f451c38e98cce57b9c5bba0013d8e4bcfa047abd613785bb7f5d61384'}]}, 'timestamp': '2025-11-29 07:14:48.107120', '_unique_id': '16f980343bc6426b8de29de0b8df11cc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.108 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.109 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.110 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.110 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-AttachInterfacesTestJSON-server-575694004>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-AttachInterfacesTestJSON-server-575694004>]
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.110 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.111 12 DEBUG ceilometer.compute.pollsters [-] 8ca0969d-04fe-43b8-8f05-68f183f888a9/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0695fa3f-4cab-4ca4-8ffd-5872ad7eb3c4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-00000061-8ca0969d-04fe-43b8-8f05-68f183f888a9-tapef96623e-7b', 'timestamp': '2025-11-29T07:14:48.111019', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-575694004', 'name': 'tapef96623e-7b', 'instance_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9', 'instance_type': 'm1.nano', 'host': '957f2340bb87d7939335b7c5e04a65312f0f7b00c99bcc21b496eaee', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:97:62:f8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapef96623e-7b'}, 'message_id': '16dbe1ba-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 5935.716765069, 'message_signature': '0987d7307fff03bed3bd95394f4fd150ba5e521d8a097fea185a1ce44e1d603e'}]}, 'timestamp': '2025-11-29 07:14:48.111823', '_unique_id': '4f4b3fed77564577b73b47c4235fc32a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.113 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.115 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.115 12 DEBUG ceilometer.compute.pollsters [-] 8ca0969d-04fe-43b8-8f05-68f183f888a9/disk.device.write.latency volume: 6481297835 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.116 12 DEBUG ceilometer.compute.pollsters [-] 8ca0969d-04fe-43b8-8f05-68f183f888a9/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '910acd44-8d4e-434b-a22d-7a8cfc70c786', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6481297835, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9-vda', 'timestamp': '2025-11-29T07:14:48.115440', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-575694004', 'name': 'instance-00000061', 'instance_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9', 'instance_type': 'm1.nano', 'host': '957f2340bb87d7939335b7c5e04a65312f0f7b00c99bcc21b496eaee', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '16dc8f34-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 5935.725954989, 'message_signature': '936e0eadad70a129afb27fc7967d1847653478c6ec708bc40acb7b8383397003'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 
'resource_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9-sda', 'timestamp': '2025-11-29T07:14:48.115440', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-575694004', 'name': 'instance-00000061', 'instance_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9', 'instance_type': 'm1.nano', 'host': '957f2340bb87d7939335b7c5e04a65312f0f7b00c99bcc21b496eaee', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '16dcab22-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 5935.725954989, 'message_signature': '6ba6b7bd1fdd95ddb2454dfa8f820273f62d3f3ba0277d077ec927de3a70ec24'}]}, 'timestamp': '2025-11-29 07:14:48.116869', '_unique_id': '5e24b4a91b594a1380c7e84875d02b1d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.118 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.120 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.120 12 DEBUG ceilometer.compute.pollsters [-] 8ca0969d-04fe-43b8-8f05-68f183f888a9/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e26d494a-ffb9-4213-866e-3921752422a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-00000061-8ca0969d-04fe-43b8-8f05-68f183f888a9-tapef96623e-7b', 'timestamp': '2025-11-29T07:14:48.120215', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-575694004', 'name': 'tapef96623e-7b', 'instance_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9', 'instance_type': 'm1.nano', 'host': '957f2340bb87d7939335b7c5e04a65312f0f7b00c99bcc21b496eaee', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:97:62:f8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapef96623e-7b'}, 'message_id': '16dd496a-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 5935.716765069, 'message_signature': '067e1f71dbae3468f7f34f6dc16ffec27a6c8f45f96f24b43d45f101fda80ccc'}]}, 'timestamp': '2025-11-29 07:14:48.120943', '_unique_id': 'cd8523b82a0f485aa49eb0ad055a6123'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.122 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.123 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.124 12 DEBUG ceilometer.compute.pollsters [-] 8ca0969d-04fe-43b8-8f05-68f183f888a9/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.124 12 DEBUG ceilometer.compute.pollsters [-] 8ca0969d-04fe-43b8-8f05-68f183f888a9/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5b6ae8db-5229-4dc3-91ba-a6bfaff6828c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9-vda', 'timestamp': '2025-11-29T07:14:48.124095', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-575694004', 'name': 'instance-00000061', 'instance_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9', 'instance_type': 'm1.nano', 'host': '957f2340bb87d7939335b7c5e04a65312f0f7b00c99bcc21b496eaee', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '16dde03c-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 5935.765084146, 'message_signature': 'acb8ea325e280f9877aeff930a106aaa8c97e9a44c46e50ba08eca3619e09d8d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9-sda', 'timestamp': '2025-11-29T07:14:48.124095', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-575694004', 'name': 'instance-00000061', 'instance_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9', 'instance_type': 'm1.nano', 'host': '957f2340bb87d7939335b7c5e04a65312f0f7b00c99bcc21b496eaee', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '16ddf90a-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 5935.765084146, 'message_signature': 'af2d7cc3ae38fddbbc85d45a6d832b26d5d21931e830bdb2821b89eb89abdccc'}]}, 'timestamp': '2025-11-29 07:14:48.125359', '_unique_id': '696c4b8718a74568b8e6ec6ef398a96b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.126 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.128 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.128 12 DEBUG ceilometer.compute.pollsters [-] 8ca0969d-04fe-43b8-8f05-68f183f888a9/network.outgoing.bytes volume: 1438 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0563028b-4343-455c-b368-b0e575bb985f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1438, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-00000061-8ca0969d-04fe-43b8-8f05-68f183f888a9-tapef96623e-7b', 'timestamp': '2025-11-29T07:14:48.128605', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-575694004', 'name': 'tapef96623e-7b', 'instance_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9', 'instance_type': 'm1.nano', 'host': '957f2340bb87d7939335b7c5e04a65312f0f7b00c99bcc21b496eaee', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:97:62:f8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapef96623e-7b'}, 'message_id': '16de9338-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 5935.716765069, 'message_signature': 'd1a5ce891d45a8eb82fc01b8cc5981423b7ea80e5b32ae551625570e63902af9'}]}, 'timestamp': '2025-11-29 07:14:48.129345', '_unique_id': '18a53544f269450db4ed1e1428821a2f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.130 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.132 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.132 12 DEBUG ceilometer.compute.pollsters [-] 8ca0969d-04fe-43b8-8f05-68f183f888a9/disk.device.read.bytes volume: 30525952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.133 12 DEBUG ceilometer.compute.pollsters [-] 8ca0969d-04fe-43b8-8f05-68f183f888a9/disk.device.read.bytes volume: 299326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aeccc1be-6107-4380-a6b5-fc2a98806611', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30525952, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9-vda', 'timestamp': '2025-11-29T07:14:48.132491', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-575694004', 'name': 'instance-00000061', 'instance_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9', 'instance_type': 'm1.nano', 'host': '957f2340bb87d7939335b7c5e04a65312f0f7b00c99bcc21b496eaee', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '16df2a5a-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 5935.725954989, 'message_signature': 'd3124454cfbc69b15fb4e8435c106f7d3b8097bf2ee1e100cc5418f48b8efedc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 299326, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9-sda', 'timestamp': '2025-11-29T07:14:48.132491', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-575694004', 'name': 'instance-00000061', 'instance_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9', 'instance_type': 'm1.nano', 'host': '957f2340bb87d7939335b7c5e04a65312f0f7b00c99bcc21b496eaee', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '16df4224-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 5935.725954989, 'message_signature': '99eb88dabb2b2823a02bf99a4606fee6d06e34805be31c03ca5664a1aa838aad'}]}, 'timestamp': '2025-11-29 07:14:48.133786', '_unique_id': '338f86496811434fa773b81342412dd1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.135 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.136 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.137 12 DEBUG ceilometer.compute.pollsters [-] 8ca0969d-04fe-43b8-8f05-68f183f888a9/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7485f718-a6b4-4946-8119-daa77820c9f5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-00000061-8ca0969d-04fe-43b8-8f05-68f183f888a9-tapef96623e-7b', 'timestamp': '2025-11-29T07:14:48.137053', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-575694004', 'name': 'tapef96623e-7b', 'instance_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9', 'instance_type': 'm1.nano', 'host': '957f2340bb87d7939335b7c5e04a65312f0f7b00c99bcc21b496eaee', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:97:62:f8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapef96623e-7b'}, 'message_id': '16dfdb6c-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 5935.716765069, 'message_signature': '316a4e8230f5272cc9e4ba0f480f0f775b87e73d0dc58f07bcd49ccaceecb50a'}]}, 'timestamp': '2025-11-29 07:14:48.137740', '_unique_id': '37b1e80707ea411fa9a6dda906b285a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:14:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:14:48.139 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:14:48 compute-0 podman[228479]: 2025-11-29 07:14:48.816490627 +0000 UTC m=+0.069280120 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:14:48 compute-0 podman[228478]: 2025-11-29 07:14:48.817356281 +0000 UTC m=+0.079679624 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:14:50 compute-0 nova_compute[187185]: 2025-11-29 07:14:50.012 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:14:50 compute-0 nova_compute[187185]: 2025-11-29 07:14:50.037 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:14:55 compute-0 nova_compute[187185]: 2025-11-29 07:14:55.016 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:14:55 compute-0 nova_compute[187185]: 2025-11-29 07:14:55.040 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:14:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:14:55.632 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:14:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:14:55.634 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:14:55 compute-0 nova_compute[187185]: 2025-11-29 07:14:55.635 187189 DEBUG oslo_concurrency.lockutils [None req-aff7f1f2-1a94-4753-a88b-964bfa75231f 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "interface-8ca0969d-04fe-43b8-8f05-68f183f888a9-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:14:55 compute-0 nova_compute[187185]: 2025-11-29 07:14:55.636 187189 DEBUG oslo_concurrency.lockutils [None req-aff7f1f2-1a94-4753-a88b-964bfa75231f 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "interface-8ca0969d-04fe-43b8-8f05-68f183f888a9-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:14:55 compute-0 nova_compute[187185]: 2025-11-29 07:14:55.637 187189 DEBUG nova.objects.instance [None req-aff7f1f2-1a94-4753-a88b-964bfa75231f 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lazy-loading 'flavor' on Instance uuid 8ca0969d-04fe-43b8-8f05-68f183f888a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:14:55 compute-0 nova_compute[187185]: 2025-11-29 07:14:55.640 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:14:56 compute-0 nova_compute[187185]: 2025-11-29 07:14:56.000 187189 DEBUG nova.objects.instance [None req-aff7f1f2-1a94-4753-a88b-964bfa75231f 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lazy-loading 'pci_requests' on Instance uuid 8ca0969d-04fe-43b8-8f05-68f183f888a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:14:56 compute-0 nova_compute[187185]: 2025-11-29 07:14:56.301 187189 DEBUG nova.network.neutron [None req-aff7f1f2-1a94-4753-a88b-964bfa75231f 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 07:14:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:14:56.637 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:14:56 compute-0 nova_compute[187185]: 2025-11-29 07:14:56.830 187189 DEBUG nova.policy [None req-aff7f1f2-1a94-4753-a88b-964bfa75231f 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 07:14:58 compute-0 nova_compute[187185]: 2025-11-29 07:14:58.633 187189 DEBUG nova.network.neutron [None req-aff7f1f2-1a94-4753-a88b-964bfa75231f 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Successfully created port: 3669ef72-0983-4ed3-b52c-6891cbc3edc2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 07:15:00 compute-0 nova_compute[187185]: 2025-11-29 07:15:00.065 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:00 compute-0 nova_compute[187185]: 2025-11-29 07:15:00.534 187189 DEBUG nova.network.neutron [None req-aff7f1f2-1a94-4753-a88b-964bfa75231f 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Successfully updated port: 3669ef72-0983-4ed3-b52c-6891cbc3edc2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 07:15:00 compute-0 nova_compute[187185]: 2025-11-29 07:15:00.551 187189 DEBUG oslo_concurrency.lockutils [None req-aff7f1f2-1a94-4753-a88b-964bfa75231f 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "refresh_cache-8ca0969d-04fe-43b8-8f05-68f183f888a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:15:00 compute-0 nova_compute[187185]: 2025-11-29 07:15:00.552 187189 DEBUG oslo_concurrency.lockutils [None req-aff7f1f2-1a94-4753-a88b-964bfa75231f 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquired lock "refresh_cache-8ca0969d-04fe-43b8-8f05-68f183f888a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:15:00 compute-0 nova_compute[187185]: 2025-11-29 07:15:00.552 187189 DEBUG nova.network.neutron [None req-aff7f1f2-1a94-4753-a88b-964bfa75231f 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:15:00 compute-0 nova_compute[187185]: 2025-11-29 07:15:00.760 187189 WARNING nova.network.neutron [None req-aff7f1f2-1a94-4753-a88b-964bfa75231f 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] 90812230-35cb-4e21-b16b-75b900100d8b already exists in list: networks containing: ['90812230-35cb-4e21-b16b-75b900100d8b']. ignoring it
Nov 29 07:15:00 compute-0 nova_compute[187185]: 2025-11-29 07:15:00.935 187189 DEBUG nova.compute.manager [req-a4231bc9-701b-4084-bdfc-09ef2aeec931 req-d64a9c3b-00fd-4aa7-be39-f795bd49f885 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Received event network-changed-3669ef72-0983-4ed3-b52c-6891cbc3edc2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:15:00 compute-0 nova_compute[187185]: 2025-11-29 07:15:00.936 187189 DEBUG nova.compute.manager [req-a4231bc9-701b-4084-bdfc-09ef2aeec931 req-d64a9c3b-00fd-4aa7-be39-f795bd49f885 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Refreshing instance network info cache due to event network-changed-3669ef72-0983-4ed3-b52c-6891cbc3edc2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:15:00 compute-0 nova_compute[187185]: 2025-11-29 07:15:00.937 187189 DEBUG oslo_concurrency.lockutils [req-a4231bc9-701b-4084-bdfc-09ef2aeec931 req-d64a9c3b-00fd-4aa7-be39-f795bd49f885 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-8ca0969d-04fe-43b8-8f05-68f183f888a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:15:01 compute-0 podman[228520]: 2025-11-29 07:15:01.807324754 +0000 UTC m=+0.053917276 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 07:15:01 compute-0 podman[228518]: 2025-11-29 07:15:01.838715632 +0000 UTC m=+0.092926119 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 29 07:15:01 compute-0 podman[228519]: 2025-11-29 07:15:01.845420571 +0000 UTC m=+0.088879314 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.expose-services=, release=1755695350, maintainer=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, name=ubi9-minimal, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 29 07:15:03 compute-0 nova_compute[187185]: 2025-11-29 07:15:03.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:15:03 compute-0 nova_compute[187185]: 2025-11-29 07:15:03.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:15:03 compute-0 nova_compute[187185]: 2025-11-29 07:15:03.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:15:03 compute-0 nova_compute[187185]: 2025-11-29 07:15:03.562 187189 DEBUG nova.network.neutron [None req-aff7f1f2-1a94-4753-a88b-964bfa75231f 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Updating instance_info_cache with network_info: [{"id": "ef96623e-7b97-4117-b30e-902c4b9be2ae", "address": "fa:16:3e:97:62:f8", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef96623e-7b", "ovs_interfaceid": "ef96623e-7b97-4117-b30e-902c4b9be2ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "3669ef72-0983-4ed3-b52c-6891cbc3edc2", "address": "fa:16:3e:96:9f:2d", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3669ef72-09", "ovs_interfaceid": "3669ef72-0983-4ed3-b52c-6891cbc3edc2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:15:03 compute-0 ovn_controller[95281]: 2025-11-29T07:15:03Z|00249|binding|INFO|Releasing lport 71b1ea47-55d6-453c-a181-e6370c4f7968 from this chassis (sb_readonly=0)
Nov 29 07:15:03 compute-0 nova_compute[187185]: 2025-11-29 07:15:03.689 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:03 compute-0 nova_compute[187185]: 2025-11-29 07:15:03.717 187189 DEBUG oslo_concurrency.lockutils [None req-aff7f1f2-1a94-4753-a88b-964bfa75231f 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Releasing lock "refresh_cache-8ca0969d-04fe-43b8-8f05-68f183f888a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:15:03 compute-0 nova_compute[187185]: 2025-11-29 07:15:03.719 187189 DEBUG oslo_concurrency.lockutils [req-a4231bc9-701b-4084-bdfc-09ef2aeec931 req-d64a9c3b-00fd-4aa7-be39-f795bd49f885 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-8ca0969d-04fe-43b8-8f05-68f183f888a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:15:03 compute-0 nova_compute[187185]: 2025-11-29 07:15:03.719 187189 DEBUG nova.network.neutron [req-a4231bc9-701b-4084-bdfc-09ef2aeec931 req-d64a9c3b-00fd-4aa7-be39-f795bd49f885 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Refreshing network info cache for port 3669ef72-0983-4ed3-b52c-6891cbc3edc2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:15:03 compute-0 nova_compute[187185]: 2025-11-29 07:15:03.725 187189 DEBUG nova.virt.libvirt.vif [None req-aff7f1f2-1a94-4753-a88b-964bfa75231f 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:14:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-575694004',display_name='tempest-AttachInterfacesTestJSON-server-575694004',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-575694004',id=97,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNgGd4Iacs7wY+oKmY26nBMFcX8qOmRS8JGNl6lyYXdVDbe/o53pDlvPJRfgkkFQQMuZj1PL9g22BJFBRRmej3Uv85Ig8WM1/1T7okGrP88moNmlmJpygv+50mQ4/2U61Q==',key_name='tempest-keypair-1636541378',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:14:28Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='16d7af1670ea460db3d0422f176b6f98',ramdisk_id='',reservation_id='r-ysw8l28l',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-844604773',owner_user_name='tempest-AttachInterfacesTestJSON-844604773-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:14:29Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9a13f011f5b74a6f94a2d2c8e9104f4a',uuid=8ca0969d-04fe-43b8-8f05-68f183f888a9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3669ef72-0983-4ed3-b52c-6891cbc3edc2", "address": "fa:16:3e:96:9f:2d", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3669ef72-09", "ovs_interfaceid": "3669ef72-0983-4ed3-b52c-6891cbc3edc2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:15:03 compute-0 nova_compute[187185]: 2025-11-29 07:15:03.725 187189 DEBUG nova.network.os_vif_util [None req-aff7f1f2-1a94-4753-a88b-964bfa75231f 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converting VIF {"id": "3669ef72-0983-4ed3-b52c-6891cbc3edc2", "address": "fa:16:3e:96:9f:2d", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3669ef72-09", "ovs_interfaceid": "3669ef72-0983-4ed3-b52c-6891cbc3edc2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:15:03 compute-0 nova_compute[187185]: 2025-11-29 07:15:03.727 187189 DEBUG nova.network.os_vif_util [None req-aff7f1f2-1a94-4753-a88b-964bfa75231f 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:96:9f:2d,bridge_name='br-int',has_traffic_filtering=True,id=3669ef72-0983-4ed3-b52c-6891cbc3edc2,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3669ef72-09') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:15:03 compute-0 nova_compute[187185]: 2025-11-29 07:15:03.727 187189 DEBUG os_vif [None req-aff7f1f2-1a94-4753-a88b-964bfa75231f 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:96:9f:2d,bridge_name='br-int',has_traffic_filtering=True,id=3669ef72-0983-4ed3-b52c-6891cbc3edc2,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3669ef72-09') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:15:03 compute-0 nova_compute[187185]: 2025-11-29 07:15:03.728 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:03 compute-0 nova_compute[187185]: 2025-11-29 07:15:03.729 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:15:03 compute-0 nova_compute[187185]: 2025-11-29 07:15:03.730 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:15:03 compute-0 nova_compute[187185]: 2025-11-29 07:15:03.736 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:03 compute-0 nova_compute[187185]: 2025-11-29 07:15:03.737 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3669ef72-09, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:15:03 compute-0 nova_compute[187185]: 2025-11-29 07:15:03.737 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3669ef72-09, col_values=(('external_ids', {'iface-id': '3669ef72-0983-4ed3-b52c-6891cbc3edc2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:96:9f:2d', 'vm-uuid': '8ca0969d-04fe-43b8-8f05-68f183f888a9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:15:03 compute-0 nova_compute[187185]: 2025-11-29 07:15:03.739 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:03 compute-0 NetworkManager[55227]: <info>  [1764400503.7407] manager: (tap3669ef72-09): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/134)
Nov 29 07:15:03 compute-0 nova_compute[187185]: 2025-11-29 07:15:03.742 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:15:03 compute-0 nova_compute[187185]: 2025-11-29 07:15:03.746 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:03 compute-0 nova_compute[187185]: 2025-11-29 07:15:03.747 187189 INFO os_vif [None req-aff7f1f2-1a94-4753-a88b-964bfa75231f 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:96:9f:2d,bridge_name='br-int',has_traffic_filtering=True,id=3669ef72-0983-4ed3-b52c-6891cbc3edc2,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3669ef72-09')
Nov 29 07:15:03 compute-0 nova_compute[187185]: 2025-11-29 07:15:03.749 187189 DEBUG nova.virt.libvirt.vif [None req-aff7f1f2-1a94-4753-a88b-964bfa75231f 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:14:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-575694004',display_name='tempest-AttachInterfacesTestJSON-server-575694004',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-575694004',id=97,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNgGd4Iacs7wY+oKmY26nBMFcX8qOmRS8JGNl6lyYXdVDbe/o53pDlvPJRfgkkFQQMuZj1PL9g22BJFBRRmej3Uv85Ig8WM1/1T7okGrP88moNmlmJpygv+50mQ4/2U61Q==',key_name='tempest-keypair-1636541378',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:14:28Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='16d7af1670ea460db3d0422f176b6f98',ramdisk_id='',reservation_id='r-ysw8l28l',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-844604773',owner_user_name='tempest-AttachInterfacesTestJSON-844604773-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:14:29Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9a13f011f5b74a6f94a2d2c8e9104f4a',uuid=8ca0969d-04fe-43b8-8f05-68f183f888a9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3669ef72-0983-4ed3-b52c-6891cbc3edc2", "address": "fa:16:3e:96:9f:2d", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3669ef72-09", "ovs_interfaceid": "3669ef72-0983-4ed3-b52c-6891cbc3edc2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:15:03 compute-0 nova_compute[187185]: 2025-11-29 07:15:03.750 187189 DEBUG nova.network.os_vif_util [None req-aff7f1f2-1a94-4753-a88b-964bfa75231f 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converting VIF {"id": "3669ef72-0983-4ed3-b52c-6891cbc3edc2", "address": "fa:16:3e:96:9f:2d", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3669ef72-09", "ovs_interfaceid": "3669ef72-0983-4ed3-b52c-6891cbc3edc2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:15:03 compute-0 nova_compute[187185]: 2025-11-29 07:15:03.751 187189 DEBUG nova.network.os_vif_util [None req-aff7f1f2-1a94-4753-a88b-964bfa75231f 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:96:9f:2d,bridge_name='br-int',has_traffic_filtering=True,id=3669ef72-0983-4ed3-b52c-6891cbc3edc2,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3669ef72-09') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:15:03 compute-0 nova_compute[187185]: 2025-11-29 07:15:03.754 187189 DEBUG nova.virt.libvirt.guest [None req-aff7f1f2-1a94-4753-a88b-964bfa75231f 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] attach device xml: <interface type="ethernet">
Nov 29 07:15:03 compute-0 nova_compute[187185]:   <mac address="fa:16:3e:96:9f:2d"/>
Nov 29 07:15:03 compute-0 nova_compute[187185]:   <model type="virtio"/>
Nov 29 07:15:03 compute-0 nova_compute[187185]:   <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:15:03 compute-0 nova_compute[187185]:   <mtu size="1442"/>
Nov 29 07:15:03 compute-0 nova_compute[187185]:   <target dev="tap3669ef72-09"/>
Nov 29 07:15:03 compute-0 nova_compute[187185]: </interface>
Nov 29 07:15:03 compute-0 nova_compute[187185]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Nov 29 07:15:03 compute-0 NetworkManager[55227]: <info>  [1764400503.7729] manager: (tap3669ef72-09): new Tun device (/org/freedesktop/NetworkManager/Devices/135)
Nov 29 07:15:03 compute-0 kernel: tap3669ef72-09: entered promiscuous mode
Nov 29 07:15:03 compute-0 ovn_controller[95281]: 2025-11-29T07:15:03Z|00250|binding|INFO|Claiming lport 3669ef72-0983-4ed3-b52c-6891cbc3edc2 for this chassis.
Nov 29 07:15:03 compute-0 ovn_controller[95281]: 2025-11-29T07:15:03Z|00251|binding|INFO|3669ef72-0983-4ed3-b52c-6891cbc3edc2: Claiming fa:16:3e:96:9f:2d 10.100.0.14
Nov 29 07:15:03 compute-0 nova_compute[187185]: 2025-11-29 07:15:03.777 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:03.797 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:96:9f:2d 10.100.0.14'], port_security=['fa:16:3e:96:9f:2d 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-90812230-35cb-4e21-b16b-75b900100d8b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '16d7af1670ea460db3d0422f176b6f98', 'neutron:revision_number': '2', 'neutron:security_group_ids': '026dfe19-5964-4af9-9b69-58d89d9181a8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=41b9bfbf-a9b3-4bdb-9144-e5db6a660517, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=3669ef72-0983-4ed3-b52c-6891cbc3edc2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:15:03 compute-0 nova_compute[187185]: 2025-11-29 07:15:03.799 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:03 compute-0 nova_compute[187185]: 2025-11-29 07:15:03.801 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:03 compute-0 ovn_controller[95281]: 2025-11-29T07:15:03Z|00252|binding|INFO|Setting lport 3669ef72-0983-4ed3-b52c-6891cbc3edc2 ovn-installed in OVS
Nov 29 07:15:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:03.801 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 3669ef72-0983-4ed3-b52c-6891cbc3edc2 in datapath 90812230-35cb-4e21-b16b-75b900100d8b bound to our chassis
Nov 29 07:15:03 compute-0 ovn_controller[95281]: 2025-11-29T07:15:03Z|00253|binding|INFO|Setting lport 3669ef72-0983-4ed3-b52c-6891cbc3edc2 up in Southbound
Nov 29 07:15:03 compute-0 nova_compute[187185]: 2025-11-29 07:15:03.803 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:03.807 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 90812230-35cb-4e21-b16b-75b900100d8b
Nov 29 07:15:03 compute-0 nova_compute[187185]: 2025-11-29 07:15:03.815 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "refresh_cache-8ca0969d-04fe-43b8-8f05-68f183f888a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:15:03 compute-0 systemd-udevd[228584]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:15:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:03.832 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[9774e74d-e57a-4793-b416-8d7c10462cb7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:15:03 compute-0 NetworkManager[55227]: <info>  [1764400503.8427] device (tap3669ef72-09): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:15:03 compute-0 NetworkManager[55227]: <info>  [1764400503.8439] device (tap3669ef72-09): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:15:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:03.874 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[3e6e609c-a9e5-43bc-bfc5-e528df1edc13]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:15:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:03.879 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[f1c72506-4d99-4cf7-b045-145246b53567]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:15:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:03.906 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[5624d35a-6401-4d67-be69-030c4a9ebfa3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:15:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:03.930 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[7e33cea1-68b2-485b-b2f0-5f8c96d28975]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap90812230-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e4:5f:07'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 81], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 591442, 'reachable_time': 18293, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228591, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:15:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:03.955 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[52cb62ca-d66e-4625-9b9e-c2b31acb5854]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap90812230-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 591455, 'tstamp': 591455}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228592, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap90812230-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 591458, 'tstamp': 591458}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228592, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:15:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:03.958 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap90812230-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:15:03 compute-0 nova_compute[187185]: 2025-11-29 07:15:03.960 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:03 compute-0 nova_compute[187185]: 2025-11-29 07:15:03.962 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:03.962 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap90812230-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:15:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:03.962 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:15:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:03.963 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap90812230-30, col_values=(('external_ids', {'iface-id': '71b1ea47-55d6-453c-a181-e6370c4f7968'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:15:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:03.963 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:15:04 compute-0 nova_compute[187185]: 2025-11-29 07:15:04.029 187189 DEBUG nova.virt.libvirt.driver [None req-aff7f1f2-1a94-4753-a88b-964bfa75231f 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:15:04 compute-0 nova_compute[187185]: 2025-11-29 07:15:04.030 187189 DEBUG nova.virt.libvirt.driver [None req-aff7f1f2-1a94-4753-a88b-964bfa75231f 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:15:04 compute-0 nova_compute[187185]: 2025-11-29 07:15:04.030 187189 DEBUG nova.virt.libvirt.driver [None req-aff7f1f2-1a94-4753-a88b-964bfa75231f 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] No VIF found with MAC fa:16:3e:97:62:f8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:15:04 compute-0 nova_compute[187185]: 2025-11-29 07:15:04.030 187189 DEBUG nova.virt.libvirt.driver [None req-aff7f1f2-1a94-4753-a88b-964bfa75231f 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] No VIF found with MAC fa:16:3e:96:9f:2d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:15:04 compute-0 nova_compute[187185]: 2025-11-29 07:15:04.070 187189 DEBUG nova.virt.libvirt.guest [None req-aff7f1f2-1a94-4753-a88b-964bfa75231f 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:15:04 compute-0 nova_compute[187185]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:15:04 compute-0 nova_compute[187185]:   <nova:name>tempest-AttachInterfacesTestJSON-server-575694004</nova:name>
Nov 29 07:15:04 compute-0 nova_compute[187185]:   <nova:creationTime>2025-11-29 07:15:04</nova:creationTime>
Nov 29 07:15:04 compute-0 nova_compute[187185]:   <nova:flavor name="m1.nano">
Nov 29 07:15:04 compute-0 nova_compute[187185]:     <nova:memory>128</nova:memory>
Nov 29 07:15:04 compute-0 nova_compute[187185]:     <nova:disk>1</nova:disk>
Nov 29 07:15:04 compute-0 nova_compute[187185]:     <nova:swap>0</nova:swap>
Nov 29 07:15:04 compute-0 nova_compute[187185]:     <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:15:04 compute-0 nova_compute[187185]:     <nova:vcpus>1</nova:vcpus>
Nov 29 07:15:04 compute-0 nova_compute[187185]:   </nova:flavor>
Nov 29 07:15:04 compute-0 nova_compute[187185]:   <nova:owner>
Nov 29 07:15:04 compute-0 nova_compute[187185]:     <nova:user uuid="9a13f011f5b74a6f94a2d2c8e9104f4a">tempest-AttachInterfacesTestJSON-844604773-project-member</nova:user>
Nov 29 07:15:04 compute-0 nova_compute[187185]:     <nova:project uuid="16d7af1670ea460db3d0422f176b6f98">tempest-AttachInterfacesTestJSON-844604773</nova:project>
Nov 29 07:15:04 compute-0 nova_compute[187185]:   </nova:owner>
Nov 29 07:15:04 compute-0 nova_compute[187185]:   <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:15:04 compute-0 nova_compute[187185]:   <nova:ports>
Nov 29 07:15:04 compute-0 nova_compute[187185]:     <nova:port uuid="ef96623e-7b97-4117-b30e-902c4b9be2ae">
Nov 29 07:15:04 compute-0 nova_compute[187185]:       <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 29 07:15:04 compute-0 nova_compute[187185]:     </nova:port>
Nov 29 07:15:04 compute-0 nova_compute[187185]:     <nova:port uuid="3669ef72-0983-4ed3-b52c-6891cbc3edc2">
Nov 29 07:15:04 compute-0 nova_compute[187185]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 29 07:15:04 compute-0 nova_compute[187185]:     </nova:port>
Nov 29 07:15:04 compute-0 nova_compute[187185]:   </nova:ports>
Nov 29 07:15:04 compute-0 nova_compute[187185]: </nova:instance>
Nov 29 07:15:04 compute-0 nova_compute[187185]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Nov 29 07:15:04 compute-0 nova_compute[187185]: 2025-11-29 07:15:04.137 187189 DEBUG oslo_concurrency.lockutils [None req-aff7f1f2-1a94-4753-a88b-964bfa75231f 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "interface-8ca0969d-04fe-43b8-8f05-68f183f888a9-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 8.502s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:15:04 compute-0 nova_compute[187185]: 2025-11-29 07:15:04.205 187189 DEBUG nova.compute.manager [req-7d8160b6-6d08-4e19-bf1c-29ef3e0082e7 req-9a258538-1ae3-431e-b0d6-274647cd2cea 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Received event network-vif-plugged-3669ef72-0983-4ed3-b52c-6891cbc3edc2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:15:04 compute-0 nova_compute[187185]: 2025-11-29 07:15:04.206 187189 DEBUG oslo_concurrency.lockutils [req-7d8160b6-6d08-4e19-bf1c-29ef3e0082e7 req-9a258538-1ae3-431e-b0d6-274647cd2cea 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "8ca0969d-04fe-43b8-8f05-68f183f888a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:15:04 compute-0 nova_compute[187185]: 2025-11-29 07:15:04.206 187189 DEBUG oslo_concurrency.lockutils [req-7d8160b6-6d08-4e19-bf1c-29ef3e0082e7 req-9a258538-1ae3-431e-b0d6-274647cd2cea 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "8ca0969d-04fe-43b8-8f05-68f183f888a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:15:04 compute-0 nova_compute[187185]: 2025-11-29 07:15:04.207 187189 DEBUG oslo_concurrency.lockutils [req-7d8160b6-6d08-4e19-bf1c-29ef3e0082e7 req-9a258538-1ae3-431e-b0d6-274647cd2cea 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "8ca0969d-04fe-43b8-8f05-68f183f888a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:15:04 compute-0 nova_compute[187185]: 2025-11-29 07:15:04.207 187189 DEBUG nova.compute.manager [req-7d8160b6-6d08-4e19-bf1c-29ef3e0082e7 req-9a258538-1ae3-431e-b0d6-274647cd2cea 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] No waiting events found dispatching network-vif-plugged-3669ef72-0983-4ed3-b52c-6891cbc3edc2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:15:04 compute-0 nova_compute[187185]: 2025-11-29 07:15:04.207 187189 WARNING nova.compute.manager [req-7d8160b6-6d08-4e19-bf1c-29ef3e0082e7 req-9a258538-1ae3-431e-b0d6-274647cd2cea 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Received unexpected event network-vif-plugged-3669ef72-0983-4ed3-b52c-6891cbc3edc2 for instance with vm_state active and task_state None.
Nov 29 07:15:04 compute-0 ovn_controller[95281]: 2025-11-29T07:15:04Z|00033|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:96:9f:2d 10.100.0.14
Nov 29 07:15:04 compute-0 ovn_controller[95281]: 2025-11-29T07:15:04Z|00034|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:96:9f:2d 10.100.0.14
Nov 29 07:15:05 compute-0 nova_compute[187185]: 2025-11-29 07:15:05.069 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:05 compute-0 nova_compute[187185]: 2025-11-29 07:15:05.898 187189 DEBUG nova.network.neutron [req-a4231bc9-701b-4084-bdfc-09ef2aeec931 req-d64a9c3b-00fd-4aa7-be39-f795bd49f885 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Updated VIF entry in instance network info cache for port 3669ef72-0983-4ed3-b52c-6891cbc3edc2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:15:05 compute-0 nova_compute[187185]: 2025-11-29 07:15:05.899 187189 DEBUG nova.network.neutron [req-a4231bc9-701b-4084-bdfc-09ef2aeec931 req-d64a9c3b-00fd-4aa7-be39-f795bd49f885 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Updating instance_info_cache with network_info: [{"id": "ef96623e-7b97-4117-b30e-902c4b9be2ae", "address": "fa:16:3e:97:62:f8", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef96623e-7b", "ovs_interfaceid": "ef96623e-7b97-4117-b30e-902c4b9be2ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "3669ef72-0983-4ed3-b52c-6891cbc3edc2", "address": "fa:16:3e:96:9f:2d", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3669ef72-09", "ovs_interfaceid": "3669ef72-0983-4ed3-b52c-6891cbc3edc2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:15:05 compute-0 nova_compute[187185]: 2025-11-29 07:15:05.920 187189 DEBUG oslo_concurrency.lockutils [req-a4231bc9-701b-4084-bdfc-09ef2aeec931 req-d64a9c3b-00fd-4aa7-be39-f795bd49f885 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-8ca0969d-04fe-43b8-8f05-68f183f888a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:15:05 compute-0 nova_compute[187185]: 2025-11-29 07:15:05.921 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquired lock "refresh_cache-8ca0969d-04fe-43b8-8f05-68f183f888a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:15:05 compute-0 nova_compute[187185]: 2025-11-29 07:15:05.921 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 29 07:15:05 compute-0 nova_compute[187185]: 2025-11-29 07:15:05.921 187189 DEBUG nova.objects.instance [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 8ca0969d-04fe-43b8-8f05-68f183f888a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:15:06 compute-0 nova_compute[187185]: 2025-11-29 07:15:06.231 187189 DEBUG oslo_concurrency.lockutils [None req-8ece8784-e0a1-4f81-8baf-90572bc0ca31 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "interface-8ca0969d-04fe-43b8-8f05-68f183f888a9-3669ef72-0983-4ed3-b52c-6891cbc3edc2" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:15:06 compute-0 nova_compute[187185]: 2025-11-29 07:15:06.232 187189 DEBUG oslo_concurrency.lockutils [None req-8ece8784-e0a1-4f81-8baf-90572bc0ca31 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "interface-8ca0969d-04fe-43b8-8f05-68f183f888a9-3669ef72-0983-4ed3-b52c-6891cbc3edc2" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:15:06 compute-0 nova_compute[187185]: 2025-11-29 07:15:06.254 187189 DEBUG nova.objects.instance [None req-8ece8784-e0a1-4f81-8baf-90572bc0ca31 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lazy-loading 'flavor' on Instance uuid 8ca0969d-04fe-43b8-8f05-68f183f888a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:15:06 compute-0 nova_compute[187185]: 2025-11-29 07:15:06.775 187189 DEBUG nova.virt.libvirt.vif [None req-8ece8784-e0a1-4f81-8baf-90572bc0ca31 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:14:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-575694004',display_name='tempest-AttachInterfacesTestJSON-server-575694004',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-575694004',id=97,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNgGd4Iacs7wY+oKmY26nBMFcX8qOmRS8JGNl6lyYXdVDbe/o53pDlvPJRfgkkFQQMuZj1PL9g22BJFBRRmej3Uv85Ig8WM1/1T7okGrP88moNmlmJpygv+50mQ4/2U61Q==',key_name='tempest-keypair-1636541378',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:14:28Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='16d7af1670ea460db3d0422f176b6f98',ramdisk_id='',reservation_id='r-ysw8l28l',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-844604773',owner_user_name='tempest-AttachInterfacesTestJSON-844604773-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:14:29Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9a13f011f5b74a6f94a2d2c8e9104f4a',uuid=8ca0969d-04fe-43b8-8f05-68f183f888a9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3669ef72-0983-4ed3-b52c-6891cbc3edc2", "address": "fa:16:3e:96:9f:2d", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3669ef72-09", "ovs_interfaceid": "3669ef72-0983-4ed3-b52c-6891cbc3edc2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:15:06 compute-0 nova_compute[187185]: 2025-11-29 07:15:06.775 187189 DEBUG nova.network.os_vif_util [None req-8ece8784-e0a1-4f81-8baf-90572bc0ca31 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converting VIF {"id": "3669ef72-0983-4ed3-b52c-6891cbc3edc2", "address": "fa:16:3e:96:9f:2d", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3669ef72-09", "ovs_interfaceid": "3669ef72-0983-4ed3-b52c-6891cbc3edc2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:15:06 compute-0 nova_compute[187185]: 2025-11-29 07:15:06.776 187189 DEBUG nova.network.os_vif_util [None req-8ece8784-e0a1-4f81-8baf-90572bc0ca31 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:96:9f:2d,bridge_name='br-int',has_traffic_filtering=True,id=3669ef72-0983-4ed3-b52c-6891cbc3edc2,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3669ef72-09') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:15:06 compute-0 nova_compute[187185]: 2025-11-29 07:15:06.780 187189 DEBUG nova.virt.libvirt.guest [None req-8ece8784-e0a1-4f81-8baf-90572bc0ca31 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:96:9f:2d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3669ef72-09"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 29 07:15:06 compute-0 nova_compute[187185]: 2025-11-29 07:15:06.783 187189 DEBUG nova.virt.libvirt.guest [None req-8ece8784-e0a1-4f81-8baf-90572bc0ca31 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:96:9f:2d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3669ef72-09"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 29 07:15:06 compute-0 nova_compute[187185]: 2025-11-29 07:15:06.786 187189 DEBUG nova.virt.libvirt.driver [None req-8ece8784-e0a1-4f81-8baf-90572bc0ca31 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Attempting to detach device tap3669ef72-09 from instance 8ca0969d-04fe-43b8-8f05-68f183f888a9 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Nov 29 07:15:06 compute-0 nova_compute[187185]: 2025-11-29 07:15:06.786 187189 DEBUG nova.virt.libvirt.guest [None req-8ece8784-e0a1-4f81-8baf-90572bc0ca31 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] detach device xml: <interface type="ethernet">
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <mac address="fa:16:3e:96:9f:2d"/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <model type="virtio"/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <mtu size="1442"/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <target dev="tap3669ef72-09"/>
Nov 29 07:15:06 compute-0 nova_compute[187185]: </interface>
Nov 29 07:15:06 compute-0 nova_compute[187185]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Nov 29 07:15:06 compute-0 nova_compute[187185]: 2025-11-29 07:15:06.792 187189 DEBUG nova.virt.libvirt.guest [None req-8ece8784-e0a1-4f81-8baf-90572bc0ca31 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:96:9f:2d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3669ef72-09"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 29 07:15:06 compute-0 nova_compute[187185]: 2025-11-29 07:15:06.796 187189 DEBUG nova.virt.libvirt.guest [None req-8ece8784-e0a1-4f81-8baf-90572bc0ca31 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:96:9f:2d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3669ef72-09"/></interface>not found in domain: <domain type='kvm' id='35'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <name>instance-00000061</name>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <uuid>8ca0969d-04fe-43b8-8f05-68f183f888a9</uuid>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <nova:name>tempest-AttachInterfacesTestJSON-server-575694004</nova:name>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <nova:creationTime>2025-11-29 07:15:04</nova:creationTime>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <nova:flavor name="m1.nano">
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <nova:memory>128</nova:memory>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <nova:disk>1</nova:disk>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <nova:swap>0</nova:swap>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <nova:vcpus>1</nova:vcpus>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   </nova:flavor>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <nova:owner>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <nova:user uuid="9a13f011f5b74a6f94a2d2c8e9104f4a">tempest-AttachInterfacesTestJSON-844604773-project-member</nova:user>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <nova:project uuid="16d7af1670ea460db3d0422f176b6f98">tempest-AttachInterfacesTestJSON-844604773</nova:project>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   </nova:owner>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <nova:ports>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <nova:port uuid="ef96623e-7b97-4117-b30e-902c4b9be2ae">
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </nova:port>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <nova:port uuid="3669ef72-0983-4ed3-b52c-6891cbc3edc2">
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </nova:port>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   </nova:ports>
Nov 29 07:15:06 compute-0 nova_compute[187185]: </nova:instance>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <memory unit='KiB'>131072</memory>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <currentMemory unit='KiB'>131072</currentMemory>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <vcpu placement='static'>1</vcpu>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <resource>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <partition>/machine</partition>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   </resource>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <sysinfo type='smbios'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <system>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <entry name='manufacturer'>RDO</entry>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <entry name='product'>OpenStack Compute</entry>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <entry name='serial'>8ca0969d-04fe-43b8-8f05-68f183f888a9</entry>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <entry name='uuid'>8ca0969d-04fe-43b8-8f05-68f183f888a9</entry>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <entry name='family'>Virtual Machine</entry>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </system>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <os>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <boot dev='hd'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <smbios mode='sysinfo'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   </os>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <features>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <vmcoreinfo state='on'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   </features>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <cpu mode='custom' match='exact' check='full'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <model fallback='forbid'>Nehalem</model>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <feature policy='require' name='x2apic'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <feature policy='require' name='hypervisor'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <feature policy='require' name='vme'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <clock offset='utc'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <timer name='pit' tickpolicy='delay'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <timer name='rtc' tickpolicy='catchup'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <timer name='hpet' present='no'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <on_poweroff>destroy</on_poweroff>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <on_reboot>restart</on_reboot>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <on_crash>destroy</on_crash>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <disk type='file' device='disk'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <driver name='qemu' type='qcow2' cache='none'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <source file='/var/lib/nova/instances/8ca0969d-04fe-43b8-8f05-68f183f888a9/disk' index='2'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <backingStore type='file' index='3'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:         <format type='raw'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:         <source file='/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:         <backingStore/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       </backingStore>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target dev='vda' bus='virtio'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='virtio-disk0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <disk type='file' device='cdrom'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <driver name='qemu' type='raw' cache='none'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <source file='/var/lib/nova/instances/8ca0969d-04fe-43b8-8f05-68f183f888a9/disk.config' index='1'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <backingStore/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target dev='sda' bus='sata'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <readonly/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='sata0-0-0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='0' model='pcie-root'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pcie.0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='1' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='1' port='0x10'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.1'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='2' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='2' port='0x11'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.2'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='3' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='3' port='0x12'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.3'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='4' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='4' port='0x13'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.4'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='5' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='5' port='0x14'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.5'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='6' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='6' port='0x15'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.6'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='7' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='7' port='0x16'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.7'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='8' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='8' port='0x17'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.8'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='9' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='9' port='0x18'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.9'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='10' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='10' port='0x19'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.10'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='11' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='11' port='0x1a'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.11'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='12' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='12' port='0x1b'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.12'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='13' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='13' port='0x1c'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.13'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='14' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='14' port='0x1d'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.14'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='15' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='15' port='0x1e'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.15'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='16' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='16' port='0x1f'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.16'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='17' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='17' port='0x20'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.17'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='18' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='18' port='0x21'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.18'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='19' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='19' port='0x22'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.19'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='20' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='20' port='0x23'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.20'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='21' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='21' port='0x24'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.21'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='22' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='22' port='0x25'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.22'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='23' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='23' port='0x26'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.23'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='24' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='24' port='0x27'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.24'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='25' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='25' port='0x28'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.25'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-pci-bridge'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.26'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='usb' index='0' model='piix3-uhci'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='usb'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='sata' index='0'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='ide'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <interface type='ethernet'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <mac address='fa:16:3e:97:62:f8'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target dev='tapef96623e-7b'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model type='virtio'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <driver name='vhost' rx_queue_size='512'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <mtu size='1442'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='net0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <interface type='ethernet'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <mac address='fa:16:3e:96:9f:2d'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target dev='tap3669ef72-09'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model type='virtio'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <driver name='vhost' rx_queue_size='512'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <mtu size='1442'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='net1'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <serial type='pty'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <source path='/dev/pts/0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <log file='/var/lib/nova/instances/8ca0969d-04fe-43b8-8f05-68f183f888a9/console.log' append='off'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target type='isa-serial' port='0'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:         <model name='isa-serial'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       </target>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='serial0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <console type='pty' tty='/dev/pts/0'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <source path='/dev/pts/0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <log file='/var/lib/nova/instances/8ca0969d-04fe-43b8-8f05-68f183f888a9/console.log' append='off'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target type='serial' port='0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='serial0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </console>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <input type='tablet' bus='usb'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='input0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='usb' bus='0' port='1'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </input>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <input type='mouse' bus='ps2'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='input1'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </input>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <input type='keyboard' bus='ps2'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='input2'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </input>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <listen type='address' address='::0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </graphics>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <audio id='1' type='none'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <video>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model type='virtio' heads='1' primary='yes'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='video0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </video>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <watchdog model='itco' action='reset'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='watchdog0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </watchdog>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <memballoon model='virtio'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <stats period='10'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='balloon0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <rng model='virtio'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <backend model='random'>/dev/urandom</backend>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='rng0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <label>system_u:system_r:svirt_t:s0:c600,c749</label>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c600,c749</imagelabel>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   </seclabel>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <label>+107:+107</label>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <imagelabel>+107:+107</imagelabel>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   </seclabel>
Nov 29 07:15:06 compute-0 nova_compute[187185]: </domain>
Nov 29 07:15:06 compute-0 nova_compute[187185]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 29 07:15:06 compute-0 nova_compute[187185]: 2025-11-29 07:15:06.797 187189 INFO nova.virt.libvirt.driver [None req-8ece8784-e0a1-4f81-8baf-90572bc0ca31 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Successfully detached device tap3669ef72-09 from instance 8ca0969d-04fe-43b8-8f05-68f183f888a9 from the persistent domain config.
Nov 29 07:15:06 compute-0 nova_compute[187185]: 2025-11-29 07:15:06.797 187189 DEBUG nova.virt.libvirt.driver [None req-8ece8784-e0a1-4f81-8baf-90572bc0ca31 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] (1/8): Attempting to detach device tap3669ef72-09 with device alias net1 from instance 8ca0969d-04fe-43b8-8f05-68f183f888a9 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Nov 29 07:15:06 compute-0 nova_compute[187185]: 2025-11-29 07:15:06.797 187189 DEBUG nova.virt.libvirt.guest [None req-8ece8784-e0a1-4f81-8baf-90572bc0ca31 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] detach device xml: <interface type="ethernet">
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <mac address="fa:16:3e:96:9f:2d"/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <model type="virtio"/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <mtu size="1442"/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <target dev="tap3669ef72-09"/>
Nov 29 07:15:06 compute-0 nova_compute[187185]: </interface>
Nov 29 07:15:06 compute-0 nova_compute[187185]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Nov 29 07:15:06 compute-0 nova_compute[187185]: 2025-11-29 07:15:06.876 187189 DEBUG nova.compute.manager [req-6df67b85-b365-4f3e-aad1-34bdb3bcb5fa req-426fbe8b-4bc4-4c45-88d1-60ef4640ef67 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Received event network-vif-plugged-3669ef72-0983-4ed3-b52c-6891cbc3edc2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:15:06 compute-0 nova_compute[187185]: 2025-11-29 07:15:06.877 187189 DEBUG oslo_concurrency.lockutils [req-6df67b85-b365-4f3e-aad1-34bdb3bcb5fa req-426fbe8b-4bc4-4c45-88d1-60ef4640ef67 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "8ca0969d-04fe-43b8-8f05-68f183f888a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:15:06 compute-0 nova_compute[187185]: 2025-11-29 07:15:06.877 187189 DEBUG oslo_concurrency.lockutils [req-6df67b85-b365-4f3e-aad1-34bdb3bcb5fa req-426fbe8b-4bc4-4c45-88d1-60ef4640ef67 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "8ca0969d-04fe-43b8-8f05-68f183f888a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:15:06 compute-0 nova_compute[187185]: 2025-11-29 07:15:06.878 187189 DEBUG oslo_concurrency.lockutils [req-6df67b85-b365-4f3e-aad1-34bdb3bcb5fa req-426fbe8b-4bc4-4c45-88d1-60ef4640ef67 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "8ca0969d-04fe-43b8-8f05-68f183f888a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:15:06 compute-0 nova_compute[187185]: 2025-11-29 07:15:06.878 187189 DEBUG nova.compute.manager [req-6df67b85-b365-4f3e-aad1-34bdb3bcb5fa req-426fbe8b-4bc4-4c45-88d1-60ef4640ef67 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] No waiting events found dispatching network-vif-plugged-3669ef72-0983-4ed3-b52c-6891cbc3edc2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:15:06 compute-0 nova_compute[187185]: 2025-11-29 07:15:06.878 187189 WARNING nova.compute.manager [req-6df67b85-b365-4f3e-aad1-34bdb3bcb5fa req-426fbe8b-4bc4-4c45-88d1-60ef4640ef67 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Received unexpected event network-vif-plugged-3669ef72-0983-4ed3-b52c-6891cbc3edc2 for instance with vm_state active and task_state None.
Nov 29 07:15:06 compute-0 kernel: tap3669ef72-09 (unregistering): left promiscuous mode
Nov 29 07:15:06 compute-0 NetworkManager[55227]: <info>  [1764400506.8983] device (tap3669ef72-09): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:15:06 compute-0 ovn_controller[95281]: 2025-11-29T07:15:06Z|00254|binding|INFO|Releasing lport 3669ef72-0983-4ed3-b52c-6891cbc3edc2 from this chassis (sb_readonly=0)
Nov 29 07:15:06 compute-0 ovn_controller[95281]: 2025-11-29T07:15:06Z|00255|binding|INFO|Setting lport 3669ef72-0983-4ed3-b52c-6891cbc3edc2 down in Southbound
Nov 29 07:15:06 compute-0 nova_compute[187185]: 2025-11-29 07:15:06.909 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:06 compute-0 ovn_controller[95281]: 2025-11-29T07:15:06Z|00256|binding|INFO|Removing iface tap3669ef72-09 ovn-installed in OVS
Nov 29 07:15:06 compute-0 nova_compute[187185]: 2025-11-29 07:15:06.913 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:06 compute-0 nova_compute[187185]: 2025-11-29 07:15:06.915 187189 DEBUG nova.virt.libvirt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Received event <DeviceRemovedEvent: 1764400506.9133654, 8ca0969d-04fe-43b8-8f05-68f183f888a9 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Nov 29 07:15:06 compute-0 nova_compute[187185]: 2025-11-29 07:15:06.917 187189 DEBUG nova.virt.libvirt.driver [None req-8ece8784-e0a1-4f81-8baf-90572bc0ca31 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Start waiting for the detach event from libvirt for device tap3669ef72-09 with device alias net1 for instance 8ca0969d-04fe-43b8-8f05-68f183f888a9 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Nov 29 07:15:06 compute-0 nova_compute[187185]: 2025-11-29 07:15:06.918 187189 DEBUG nova.virt.libvirt.guest [None req-8ece8784-e0a1-4f81-8baf-90572bc0ca31 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:96:9f:2d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3669ef72-09"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 29 07:15:06 compute-0 nova_compute[187185]: 2025-11-29 07:15:06.923 187189 DEBUG nova.virt.libvirt.guest [None req-8ece8784-e0a1-4f81-8baf-90572bc0ca31 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:96:9f:2d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3669ef72-09"/></interface>not found in domain: <domain type='kvm' id='35'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <name>instance-00000061</name>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <uuid>8ca0969d-04fe-43b8-8f05-68f183f888a9</uuid>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <nova:name>tempest-AttachInterfacesTestJSON-server-575694004</nova:name>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <nova:creationTime>2025-11-29 07:15:04</nova:creationTime>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <nova:flavor name="m1.nano">
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <nova:memory>128</nova:memory>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <nova:disk>1</nova:disk>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <nova:swap>0</nova:swap>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <nova:vcpus>1</nova:vcpus>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   </nova:flavor>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <nova:owner>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <nova:user uuid="9a13f011f5b74a6f94a2d2c8e9104f4a">tempest-AttachInterfacesTestJSON-844604773-project-member</nova:user>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <nova:project uuid="16d7af1670ea460db3d0422f176b6f98">tempest-AttachInterfacesTestJSON-844604773</nova:project>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   </nova:owner>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <nova:ports>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <nova:port uuid="ef96623e-7b97-4117-b30e-902c4b9be2ae">
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </nova:port>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <nova:port uuid="3669ef72-0983-4ed3-b52c-6891cbc3edc2">
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </nova:port>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   </nova:ports>
Nov 29 07:15:06 compute-0 nova_compute[187185]: </nova:instance>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <memory unit='KiB'>131072</memory>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <currentMemory unit='KiB'>131072</currentMemory>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <vcpu placement='static'>1</vcpu>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <resource>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <partition>/machine</partition>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   </resource>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <sysinfo type='smbios'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <system>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <entry name='manufacturer'>RDO</entry>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <entry name='product'>OpenStack Compute</entry>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <entry name='serial'>8ca0969d-04fe-43b8-8f05-68f183f888a9</entry>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <entry name='uuid'>8ca0969d-04fe-43b8-8f05-68f183f888a9</entry>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <entry name='family'>Virtual Machine</entry>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </system>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <os>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <boot dev='hd'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <smbios mode='sysinfo'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   </os>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <features>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <vmcoreinfo state='on'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   </features>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <cpu mode='custom' match='exact' check='full'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <model fallback='forbid'>Nehalem</model>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <feature policy='require' name='x2apic'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <feature policy='require' name='hypervisor'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <feature policy='require' name='vme'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <clock offset='utc'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <timer name='pit' tickpolicy='delay'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <timer name='rtc' tickpolicy='catchup'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <timer name='hpet' present='no'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <on_poweroff>destroy</on_poweroff>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <on_reboot>restart</on_reboot>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <on_crash>destroy</on_crash>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <disk type='file' device='disk'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <driver name='qemu' type='qcow2' cache='none'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <source file='/var/lib/nova/instances/8ca0969d-04fe-43b8-8f05-68f183f888a9/disk' index='2'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <backingStore type='file' index='3'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:         <format type='raw'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:         <source file='/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:         <backingStore/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       </backingStore>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target dev='vda' bus='virtio'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='virtio-disk0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <disk type='file' device='cdrom'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <driver name='qemu' type='raw' cache='none'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <source file='/var/lib/nova/instances/8ca0969d-04fe-43b8-8f05-68f183f888a9/disk.config' index='1'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <backingStore/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target dev='sda' bus='sata'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <readonly/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='sata0-0-0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='0' model='pcie-root'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pcie.0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='1' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='1' port='0x10'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.1'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='2' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='2' port='0x11'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.2'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='3' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='3' port='0x12'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.3'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='4' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='4' port='0x13'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.4'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='5' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='5' port='0x14'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.5'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='6' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='6' port='0x15'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.6'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='7' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='7' port='0x16'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.7'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='8' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='8' port='0x17'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.8'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='9' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='9' port='0x18'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.9'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='10' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='10' port='0x19'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.10'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='11' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='11' port='0x1a'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.11'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='12' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='12' port='0x1b'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.12'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='13' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='13' port='0x1c'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.13'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='14' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='14' port='0x1d'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.14'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='15' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='15' port='0x1e'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.15'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='16' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='16' port='0x1f'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.16'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='17' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='17' port='0x20'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.17'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='18' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='18' port='0x21'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.18'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='19' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='19' port='0x22'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.19'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='20' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='20' port='0x23'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.20'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='21' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='21' port='0x24'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.21'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='22' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='22' port='0x25'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.22'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='23' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='23' port='0x26'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.23'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='24' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='24' port='0x27'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.24'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='25' model='pcie-root-port'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target chassis='25' port='0x28'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.25'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model name='pcie-pci-bridge'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='pci.26'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='usb' index='0' model='piix3-uhci'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='usb'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <controller type='sata' index='0'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='ide'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <interface type='ethernet'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <mac address='fa:16:3e:97:62:f8'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target dev='tapef96623e-7b'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model type='virtio'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <driver name='vhost' rx_queue_size='512'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <mtu size='1442'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='net0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <serial type='pty'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <source path='/dev/pts/0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <log file='/var/lib/nova/instances/8ca0969d-04fe-43b8-8f05-68f183f888a9/console.log' append='off'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target type='isa-serial' port='0'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:         <model name='isa-serial'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       </target>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='serial0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <console type='pty' tty='/dev/pts/0'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <source path='/dev/pts/0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <log file='/var/lib/nova/instances/8ca0969d-04fe-43b8-8f05-68f183f888a9/console.log' append='off'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <target type='serial' port='0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='serial0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </console>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <input type='tablet' bus='usb'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='input0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='usb' bus='0' port='1'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </input>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <input type='mouse' bus='ps2'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='input1'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </input>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <input type='keyboard' bus='ps2'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='input2'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </input>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <listen type='address' address='::0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </graphics>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <audio id='1' type='none'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <video>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <model type='virtio' heads='1' primary='yes'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='video0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </video>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <watchdog model='itco' action='reset'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='watchdog0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </watchdog>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <memballoon model='virtio'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <stats period='10'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='balloon0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <rng model='virtio'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <backend model='random'>/dev/urandom</backend>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <alias name='rng0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <label>system_u:system_r:svirt_t:s0:c600,c749</label>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c600,c749</imagelabel>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   </seclabel>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <label>+107:+107</label>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <imagelabel>+107:+107</imagelabel>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   </seclabel>
Nov 29 07:15:06 compute-0 nova_compute[187185]: </domain>
Nov 29 07:15:06 compute-0 nova_compute[187185]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 29 07:15:06 compute-0 nova_compute[187185]: 2025-11-29 07:15:06.923 187189 INFO nova.virt.libvirt.driver [None req-8ece8784-e0a1-4f81-8baf-90572bc0ca31 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Successfully detached device tap3669ef72-09 from instance 8ca0969d-04fe-43b8-8f05-68f183f888a9 from the live domain config.
Nov 29 07:15:06 compute-0 nova_compute[187185]: 2025-11-29 07:15:06.925 187189 DEBUG nova.virt.libvirt.vif [None req-8ece8784-e0a1-4f81-8baf-90572bc0ca31 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:14:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-575694004',display_name='tempest-AttachInterfacesTestJSON-server-575694004',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-575694004',id=97,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNgGd4Iacs7wY+oKmY26nBMFcX8qOmRS8JGNl6lyYXdVDbe/o53pDlvPJRfgkkFQQMuZj1PL9g22BJFBRRmej3Uv85Ig8WM1/1T7okGrP88moNmlmJpygv+50mQ4/2U61Q==',key_name='tempest-keypair-1636541378',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:14:28Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='16d7af1670ea460db3d0422f176b6f98',ramdisk_id='',reservation_id='r-ysw8l28l',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-844604773',owner_user_name='tempest-AttachInterfacesTestJSON-844604773-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:14:29Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9a13f011f5b74a6f94a2d2c8e9104f4a',uuid=8ca0969d-04fe-43b8-8f05-68f183f888a9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3669ef72-0983-4ed3-b52c-6891cbc3edc2", "address": "fa:16:3e:96:9f:2d", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3669ef72-09", "ovs_interfaceid": "3669ef72-0983-4ed3-b52c-6891cbc3edc2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:15:06 compute-0 nova_compute[187185]: 2025-11-29 07:15:06.926 187189 DEBUG nova.network.os_vif_util [None req-8ece8784-e0a1-4f81-8baf-90572bc0ca31 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converting VIF {"id": "3669ef72-0983-4ed3-b52c-6891cbc3edc2", "address": "fa:16:3e:96:9f:2d", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3669ef72-09", "ovs_interfaceid": "3669ef72-0983-4ed3-b52c-6891cbc3edc2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:15:06 compute-0 nova_compute[187185]: 2025-11-29 07:15:06.926 187189 DEBUG nova.network.os_vif_util [None req-8ece8784-e0a1-4f81-8baf-90572bc0ca31 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:96:9f:2d,bridge_name='br-int',has_traffic_filtering=True,id=3669ef72-0983-4ed3-b52c-6891cbc3edc2,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3669ef72-09') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:15:06 compute-0 nova_compute[187185]: 2025-11-29 07:15:06.927 187189 DEBUG os_vif [None req-8ece8784-e0a1-4f81-8baf-90572bc0ca31 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:96:9f:2d,bridge_name='br-int',has_traffic_filtering=True,id=3669ef72-0983-4ed3-b52c-6891cbc3edc2,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3669ef72-09') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:15:06 compute-0 nova_compute[187185]: 2025-11-29 07:15:06.928 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:06 compute-0 nova_compute[187185]: 2025-11-29 07:15:06.929 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3669ef72-09, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:15:06 compute-0 nova_compute[187185]: 2025-11-29 07:15:06.930 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:06 compute-0 nova_compute[187185]: 2025-11-29 07:15:06.932 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:15:06 compute-0 nova_compute[187185]: 2025-11-29 07:15:06.936 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:06 compute-0 nova_compute[187185]: 2025-11-29 07:15:06.939 187189 INFO os_vif [None req-8ece8784-e0a1-4f81-8baf-90572bc0ca31 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:96:9f:2d,bridge_name='br-int',has_traffic_filtering=True,id=3669ef72-0983-4ed3-b52c-6891cbc3edc2,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3669ef72-09')
Nov 29 07:15:06 compute-0 nova_compute[187185]: 2025-11-29 07:15:06.940 187189 DEBUG nova.virt.libvirt.guest [None req-8ece8784-e0a1-4f81-8baf-90572bc0ca31 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <nova:name>tempest-AttachInterfacesTestJSON-server-575694004</nova:name>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <nova:creationTime>2025-11-29 07:15:06</nova:creationTime>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <nova:flavor name="m1.nano">
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <nova:memory>128</nova:memory>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <nova:disk>1</nova:disk>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <nova:swap>0</nova:swap>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <nova:vcpus>1</nova:vcpus>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   </nova:flavor>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <nova:owner>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <nova:user uuid="9a13f011f5b74a6f94a2d2c8e9104f4a">tempest-AttachInterfacesTestJSON-844604773-project-member</nova:user>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <nova:project uuid="16d7af1670ea460db3d0422f176b6f98">tempest-AttachInterfacesTestJSON-844604773</nova:project>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   </nova:owner>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   <nova:ports>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     <nova:port uuid="ef96623e-7b97-4117-b30e-902c4b9be2ae">
Nov 29 07:15:06 compute-0 nova_compute[187185]:       <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 29 07:15:06 compute-0 nova_compute[187185]:     </nova:port>
Nov 29 07:15:06 compute-0 nova_compute[187185]:   </nova:ports>
Nov 29 07:15:06 compute-0 nova_compute[187185]: </nova:instance>
Nov 29 07:15:06 compute-0 nova_compute[187185]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Nov 29 07:15:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:06.952 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:96:9f:2d 10.100.0.14'], port_security=['fa:16:3e:96:9f:2d 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-90812230-35cb-4e21-b16b-75b900100d8b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '16d7af1670ea460db3d0422f176b6f98', 'neutron:revision_number': '4', 'neutron:security_group_ids': '026dfe19-5964-4af9-9b69-58d89d9181a8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=41b9bfbf-a9b3-4bdb-9144-e5db6a660517, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=3669ef72-0983-4ed3-b52c-6891cbc3edc2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:15:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:06.954 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 3669ef72-0983-4ed3-b52c-6891cbc3edc2 in datapath 90812230-35cb-4e21-b16b-75b900100d8b unbound from our chassis
Nov 29 07:15:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:06.956 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 90812230-35cb-4e21-b16b-75b900100d8b
Nov 29 07:15:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:06.973 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[719716a9-4b88-472e-a274-e716ce514f8e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:15:07 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:07.010 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[fc9f670a-3bb0-4f8b-a226-0533d2929349]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:15:07 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:07.013 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[182cd6d0-622b-4991-9d5d-8928f1858837]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:15:07 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:07.045 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[f2f10d27-a9ee-44fa-a36a-f6387762f947]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:15:07 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:07.073 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[705e014d-529c-4506-b567-9e421b425f02]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap90812230-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e4:5f:07'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 81], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 591442, 'reachable_time': 18293, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228604, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:15:07 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:07.102 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[cd292caf-fc0e-4508-a3df-d30b574cc8e7]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap90812230-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 591455, 'tstamp': 591455}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228605, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap90812230-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 591458, 'tstamp': 591458}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228605, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:15:07 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:07.105 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap90812230-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:15:07 compute-0 nova_compute[187185]: 2025-11-29 07:15:07.142 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:07 compute-0 nova_compute[187185]: 2025-11-29 07:15:07.144 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:07 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:07.145 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap90812230-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:15:07 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:07.145 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:15:07 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:07.146 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap90812230-30, col_values=(('external_ids', {'iface-id': '71b1ea47-55d6-453c-a181-e6370c4f7968'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:15:07 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:07.146 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:15:08 compute-0 nova_compute[187185]: 2025-11-29 07:15:08.228 187189 DEBUG nova.compute.manager [req-3dfb386e-7b26-469a-807a-93499800f6e1 req-9a7ba6db-7231-45c4-a12d-d35ead5aaa89 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Received event network-vif-deleted-3669ef72-0983-4ed3-b52c-6891cbc3edc2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:15:08 compute-0 nova_compute[187185]: 2025-11-29 07:15:08.229 187189 INFO nova.compute.manager [req-3dfb386e-7b26-469a-807a-93499800f6e1 req-9a7ba6db-7231-45c4-a12d-d35ead5aaa89 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Neutron deleted interface 3669ef72-0983-4ed3-b52c-6891cbc3edc2; detaching it from the instance and deleting it from the info cache
Nov 29 07:15:08 compute-0 nova_compute[187185]: 2025-11-29 07:15:08.229 187189 DEBUG nova.network.neutron [req-3dfb386e-7b26-469a-807a-93499800f6e1 req-9a7ba6db-7231-45c4-a12d-d35ead5aaa89 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Updating instance_info_cache with network_info: [{"id": "ef96623e-7b97-4117-b30e-902c4b9be2ae", "address": "fa:16:3e:97:62:f8", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef96623e-7b", "ovs_interfaceid": "ef96623e-7b97-4117-b30e-902c4b9be2ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:15:08 compute-0 nova_compute[187185]: 2025-11-29 07:15:08.281 187189 DEBUG oslo_concurrency.lockutils [None req-8ece8784-e0a1-4f81-8baf-90572bc0ca31 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "refresh_cache-8ca0969d-04fe-43b8-8f05-68f183f888a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:15:08 compute-0 nova_compute[187185]: 2025-11-29 07:15:08.389 187189 DEBUG nova.objects.instance [req-3dfb386e-7b26-469a-807a-93499800f6e1 req-9a7ba6db-7231-45c4-a12d-d35ead5aaa89 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lazy-loading 'system_metadata' on Instance uuid 8ca0969d-04fe-43b8-8f05-68f183f888a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:15:08 compute-0 nova_compute[187185]: 2025-11-29 07:15:08.450 187189 DEBUG nova.objects.instance [req-3dfb386e-7b26-469a-807a-93499800f6e1 req-9a7ba6db-7231-45c4-a12d-d35ead5aaa89 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lazy-loading 'flavor' on Instance uuid 8ca0969d-04fe-43b8-8f05-68f183f888a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:15:08 compute-0 nova_compute[187185]: 2025-11-29 07:15:08.533 187189 DEBUG nova.virt.libvirt.vif [req-3dfb386e-7b26-469a-807a-93499800f6e1 req-9a7ba6db-7231-45c4-a12d-d35ead5aaa89 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:14:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-575694004',display_name='tempest-AttachInterfacesTestJSON-server-575694004',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-575694004',id=97,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNgGd4Iacs7wY+oKmY26nBMFcX8qOmRS8JGNl6lyYXdVDbe/o53pDlvPJRfgkkFQQMuZj1PL9g22BJFBRRmej3Uv85Ig8WM1/1T7okGrP88moNmlmJpygv+50mQ4/2U61Q==',key_name='tempest-keypair-1636541378',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:14:28Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='16d7af1670ea460db3d0422f176b6f98',ramdisk_id='',reservation_id='r-ysw8l28l',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-844604773',owner_user_name='tempest-AttachInterfacesTestJSON-844604773-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:14:29Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9a13f011f5b74a6f94a2d2c8e9104f4a',uuid=8ca0969d-04fe-43b8-8f05-68f183f888a9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3669ef72-0983-4ed3-b52c-6891cbc3edc2", "address": "fa:16:3e:96:9f:2d", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3669ef72-09", "ovs_interfaceid": "3669ef72-0983-4ed3-b52c-6891cbc3edc2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:15:08 compute-0 nova_compute[187185]: 2025-11-29 07:15:08.534 187189 DEBUG nova.network.os_vif_util [req-3dfb386e-7b26-469a-807a-93499800f6e1 req-9a7ba6db-7231-45c4-a12d-d35ead5aaa89 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Converting VIF {"id": "3669ef72-0983-4ed3-b52c-6891cbc3edc2", "address": "fa:16:3e:96:9f:2d", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3669ef72-09", "ovs_interfaceid": "3669ef72-0983-4ed3-b52c-6891cbc3edc2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:15:08 compute-0 nova_compute[187185]: 2025-11-29 07:15:08.536 187189 DEBUG nova.network.os_vif_util [req-3dfb386e-7b26-469a-807a-93499800f6e1 req-9a7ba6db-7231-45c4-a12d-d35ead5aaa89 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:96:9f:2d,bridge_name='br-int',has_traffic_filtering=True,id=3669ef72-0983-4ed3-b52c-6891cbc3edc2,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3669ef72-09') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:15:08 compute-0 nova_compute[187185]: 2025-11-29 07:15:08.539 187189 DEBUG nova.virt.libvirt.guest [req-3dfb386e-7b26-469a-807a-93499800f6e1 req-9a7ba6db-7231-45c4-a12d-d35ead5aaa89 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:96:9f:2d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3669ef72-09"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 29 07:15:08 compute-0 nova_compute[187185]: 2025-11-29 07:15:08.542 187189 DEBUG nova.virt.libvirt.guest [req-3dfb386e-7b26-469a-807a-93499800f6e1 req-9a7ba6db-7231-45c4-a12d-d35ead5aaa89 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:96:9f:2d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3669ef72-09"/></interface>not found in domain: <domain type='kvm' id='35'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <name>instance-00000061</name>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <uuid>8ca0969d-04fe-43b8-8f05-68f183f888a9</uuid>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <nova:name>tempest-AttachInterfacesTestJSON-server-575694004</nova:name>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <nova:creationTime>2025-11-29 07:15:06</nova:creationTime>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <nova:flavor name="m1.nano">
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <nova:memory>128</nova:memory>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <nova:disk>1</nova:disk>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <nova:swap>0</nova:swap>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <nova:vcpus>1</nova:vcpus>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   </nova:flavor>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <nova:owner>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <nova:user uuid="9a13f011f5b74a6f94a2d2c8e9104f4a">tempest-AttachInterfacesTestJSON-844604773-project-member</nova:user>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <nova:project uuid="16d7af1670ea460db3d0422f176b6f98">tempest-AttachInterfacesTestJSON-844604773</nova:project>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   </nova:owner>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <nova:ports>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <nova:port uuid="ef96623e-7b97-4117-b30e-902c4b9be2ae">
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </nova:port>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   </nova:ports>
Nov 29 07:15:08 compute-0 nova_compute[187185]: </nova:instance>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <memory unit='KiB'>131072</memory>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <currentMemory unit='KiB'>131072</currentMemory>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <vcpu placement='static'>1</vcpu>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <resource>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <partition>/machine</partition>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   </resource>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <sysinfo type='smbios'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <system>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <entry name='manufacturer'>RDO</entry>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <entry name='product'>OpenStack Compute</entry>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <entry name='serial'>8ca0969d-04fe-43b8-8f05-68f183f888a9</entry>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <entry name='uuid'>8ca0969d-04fe-43b8-8f05-68f183f888a9</entry>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <entry name='family'>Virtual Machine</entry>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </system>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <os>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <boot dev='hd'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <smbios mode='sysinfo'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   </os>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <features>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <vmcoreinfo state='on'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   </features>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <cpu mode='custom' match='exact' check='full'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <model fallback='forbid'>Nehalem</model>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <feature policy='require' name='x2apic'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <feature policy='require' name='hypervisor'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <feature policy='require' name='vme'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <clock offset='utc'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <timer name='pit' tickpolicy='delay'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <timer name='rtc' tickpolicy='catchup'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <timer name='hpet' present='no'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <on_poweroff>destroy</on_poweroff>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <on_reboot>restart</on_reboot>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <on_crash>destroy</on_crash>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <disk type='file' device='disk'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <driver name='qemu' type='qcow2' cache='none'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <source file='/var/lib/nova/instances/8ca0969d-04fe-43b8-8f05-68f183f888a9/disk' index='2'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <backingStore type='file' index='3'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:         <format type='raw'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:         <source file='/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:         <backingStore/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       </backingStore>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target dev='vda' bus='virtio'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='virtio-disk0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <disk type='file' device='cdrom'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <driver name='qemu' type='raw' cache='none'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <source file='/var/lib/nova/instances/8ca0969d-04fe-43b8-8f05-68f183f888a9/disk.config' index='1'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <backingStore/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target dev='sda' bus='sata'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <readonly/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='sata0-0-0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='0' model='pcie-root'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pcie.0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='1' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='1' port='0x10'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.1'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='2' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='2' port='0x11'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.2'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='3' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='3' port='0x12'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.3'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='4' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='4' port='0x13'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.4'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='5' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='5' port='0x14'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.5'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='6' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='6' port='0x15'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.6'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='7' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='7' port='0x16'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.7'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='8' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='8' port='0x17'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.8'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='9' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='9' port='0x18'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.9'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='10' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='10' port='0x19'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.10'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='11' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='11' port='0x1a'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.11'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='12' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='12' port='0x1b'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.12'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='13' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='13' port='0x1c'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.13'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='14' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='14' port='0x1d'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.14'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='15' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='15' port='0x1e'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.15'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='16' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='16' port='0x1f'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.16'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='17' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='17' port='0x20'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.17'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='18' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='18' port='0x21'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.18'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='19' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='19' port='0x22'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.19'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='20' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='20' port='0x23'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.20'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='21' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='21' port='0x24'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.21'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='22' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='22' port='0x25'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.22'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='23' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='23' port='0x26'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.23'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='24' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='24' port='0x27'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.24'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='25' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='25' port='0x28'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.25'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-pci-bridge'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.26'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='usb' index='0' model='piix3-uhci'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='usb'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='sata' index='0'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='ide'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <interface type='ethernet'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <mac address='fa:16:3e:97:62:f8'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target dev='tapef96623e-7b'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model type='virtio'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <driver name='vhost' rx_queue_size='512'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <mtu size='1442'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='net0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <serial type='pty'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <source path='/dev/pts/0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <log file='/var/lib/nova/instances/8ca0969d-04fe-43b8-8f05-68f183f888a9/console.log' append='off'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target type='isa-serial' port='0'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:         <model name='isa-serial'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       </target>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='serial0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <console type='pty' tty='/dev/pts/0'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <source path='/dev/pts/0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <log file='/var/lib/nova/instances/8ca0969d-04fe-43b8-8f05-68f183f888a9/console.log' append='off'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target type='serial' port='0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='serial0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </console>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <input type='tablet' bus='usb'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='input0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='usb' bus='0' port='1'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </input>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <input type='mouse' bus='ps2'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='input1'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </input>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <input type='keyboard' bus='ps2'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='input2'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </input>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <listen type='address' address='::0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </graphics>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <audio id='1' type='none'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <video>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model type='virtio' heads='1' primary='yes'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='video0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </video>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <watchdog model='itco' action='reset'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='watchdog0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </watchdog>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <memballoon model='virtio'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <stats period='10'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='balloon0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <rng model='virtio'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <backend model='random'>/dev/urandom</backend>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='rng0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <label>system_u:system_r:svirt_t:s0:c600,c749</label>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c600,c749</imagelabel>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   </seclabel>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <label>+107:+107</label>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <imagelabel>+107:+107</imagelabel>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   </seclabel>
Nov 29 07:15:08 compute-0 nova_compute[187185]: </domain>
Nov 29 07:15:08 compute-0 nova_compute[187185]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 29 07:15:08 compute-0 nova_compute[187185]: 2025-11-29 07:15:08.543 187189 DEBUG nova.virt.libvirt.guest [req-3dfb386e-7b26-469a-807a-93499800f6e1 req-9a7ba6db-7231-45c4-a12d-d35ead5aaa89 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:96:9f:2d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3669ef72-09"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 29 07:15:08 compute-0 nova_compute[187185]: 2025-11-29 07:15:08.548 187189 DEBUG nova.virt.libvirt.guest [req-3dfb386e-7b26-469a-807a-93499800f6e1 req-9a7ba6db-7231-45c4-a12d-d35ead5aaa89 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:96:9f:2d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3669ef72-09"/></interface>not found in domain: <domain type='kvm' id='35'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <name>instance-00000061</name>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <uuid>8ca0969d-04fe-43b8-8f05-68f183f888a9</uuid>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <nova:name>tempest-AttachInterfacesTestJSON-server-575694004</nova:name>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <nova:creationTime>2025-11-29 07:15:06</nova:creationTime>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <nova:flavor name="m1.nano">
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <nova:memory>128</nova:memory>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <nova:disk>1</nova:disk>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <nova:swap>0</nova:swap>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <nova:vcpus>1</nova:vcpus>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   </nova:flavor>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <nova:owner>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <nova:user uuid="9a13f011f5b74a6f94a2d2c8e9104f4a">tempest-AttachInterfacesTestJSON-844604773-project-member</nova:user>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <nova:project uuid="16d7af1670ea460db3d0422f176b6f98">tempest-AttachInterfacesTestJSON-844604773</nova:project>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   </nova:owner>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <nova:ports>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <nova:port uuid="ef96623e-7b97-4117-b30e-902c4b9be2ae">
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </nova:port>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   </nova:ports>
Nov 29 07:15:08 compute-0 nova_compute[187185]: </nova:instance>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <memory unit='KiB'>131072</memory>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <currentMemory unit='KiB'>131072</currentMemory>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <vcpu placement='static'>1</vcpu>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <resource>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <partition>/machine</partition>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   </resource>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <sysinfo type='smbios'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <system>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <entry name='manufacturer'>RDO</entry>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <entry name='product'>OpenStack Compute</entry>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <entry name='serial'>8ca0969d-04fe-43b8-8f05-68f183f888a9</entry>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <entry name='uuid'>8ca0969d-04fe-43b8-8f05-68f183f888a9</entry>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <entry name='family'>Virtual Machine</entry>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </system>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <os>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <boot dev='hd'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <smbios mode='sysinfo'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   </os>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <features>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <vmcoreinfo state='on'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   </features>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <cpu mode='custom' match='exact' check='full'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <model fallback='forbid'>Nehalem</model>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <feature policy='require' name='x2apic'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <feature policy='require' name='hypervisor'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <feature policy='require' name='vme'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <clock offset='utc'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <timer name='pit' tickpolicy='delay'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <timer name='rtc' tickpolicy='catchup'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <timer name='hpet' present='no'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <on_poweroff>destroy</on_poweroff>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <on_reboot>restart</on_reboot>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <on_crash>destroy</on_crash>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <disk type='file' device='disk'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <driver name='qemu' type='qcow2' cache='none'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <source file='/var/lib/nova/instances/8ca0969d-04fe-43b8-8f05-68f183f888a9/disk' index='2'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <backingStore type='file' index='3'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:         <format type='raw'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:         <source file='/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:         <backingStore/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       </backingStore>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target dev='vda' bus='virtio'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='virtio-disk0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <disk type='file' device='cdrom'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <driver name='qemu' type='raw' cache='none'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <source file='/var/lib/nova/instances/8ca0969d-04fe-43b8-8f05-68f183f888a9/disk.config' index='1'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <backingStore/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target dev='sda' bus='sata'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <readonly/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='sata0-0-0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='0' model='pcie-root'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pcie.0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='1' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='1' port='0x10'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.1'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='2' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='2' port='0x11'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.2'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='3' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='3' port='0x12'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.3'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='4' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='4' port='0x13'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.4'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='5' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='5' port='0x14'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.5'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='6' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='6' port='0x15'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.6'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='7' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='7' port='0x16'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.7'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='8' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='8' port='0x17'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.8'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='9' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='9' port='0x18'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.9'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='10' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='10' port='0x19'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.10'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='11' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='11' port='0x1a'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.11'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='12' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='12' port='0x1b'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.12'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='13' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='13' port='0x1c'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.13'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='14' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='14' port='0x1d'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.14'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='15' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='15' port='0x1e'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.15'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='16' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='16' port='0x1f'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.16'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='17' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='17' port='0x20'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.17'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='18' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='18' port='0x21'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.18'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='19' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='19' port='0x22'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.19'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='20' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='20' port='0x23'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.20'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='21' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='21' port='0x24'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.21'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='22' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='22' port='0x25'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.22'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='23' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='23' port='0x26'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.23'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='24' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='24' port='0x27'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.24'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='25' model='pcie-root-port'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-root-port'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target chassis='25' port='0x28'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.25'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model name='pcie-pci-bridge'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='pci.26'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='usb' index='0' model='piix3-uhci'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='usb'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <controller type='sata' index='0'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='ide'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </controller>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <interface type='ethernet'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <mac address='fa:16:3e:97:62:f8'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target dev='tapef96623e-7b'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model type='virtio'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <driver name='vhost' rx_queue_size='512'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <mtu size='1442'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='net0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <serial type='pty'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <source path='/dev/pts/0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <log file='/var/lib/nova/instances/8ca0969d-04fe-43b8-8f05-68f183f888a9/console.log' append='off'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target type='isa-serial' port='0'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:         <model name='isa-serial'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       </target>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='serial0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <console type='pty' tty='/dev/pts/0'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <source path='/dev/pts/0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <log file='/var/lib/nova/instances/8ca0969d-04fe-43b8-8f05-68f183f888a9/console.log' append='off'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <target type='serial' port='0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='serial0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </console>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <input type='tablet' bus='usb'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='input0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='usb' bus='0' port='1'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </input>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <input type='mouse' bus='ps2'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='input1'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </input>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <input type='keyboard' bus='ps2'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='input2'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </input>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <listen type='address' address='::0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </graphics>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <audio id='1' type='none'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <video>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <model type='virtio' heads='1' primary='yes'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='video0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </video>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <watchdog model='itco' action='reset'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='watchdog0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </watchdog>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <memballoon model='virtio'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <stats period='10'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='balloon0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <rng model='virtio'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <backend model='random'>/dev/urandom</backend>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <alias name='rng0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <label>system_u:system_r:svirt_t:s0:c600,c749</label>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c600,c749</imagelabel>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   </seclabel>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <label>+107:+107</label>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <imagelabel>+107:+107</imagelabel>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   </seclabel>
Nov 29 07:15:08 compute-0 nova_compute[187185]: </domain>
Nov 29 07:15:08 compute-0 nova_compute[187185]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 29 07:15:08 compute-0 nova_compute[187185]: 2025-11-29 07:15:08.549 187189 WARNING nova.virt.libvirt.driver [req-3dfb386e-7b26-469a-807a-93499800f6e1 req-9a7ba6db-7231-45c4-a12d-d35ead5aaa89 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Detaching interface fa:16:3e:96:9f:2d failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tap3669ef72-09' not found.
Nov 29 07:15:08 compute-0 nova_compute[187185]: 2025-11-29 07:15:08.550 187189 DEBUG nova.virt.libvirt.vif [req-3dfb386e-7b26-469a-807a-93499800f6e1 req-9a7ba6db-7231-45c4-a12d-d35ead5aaa89 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:14:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-575694004',display_name='tempest-AttachInterfacesTestJSON-server-575694004',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-575694004',id=97,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNgGd4Iacs7wY+oKmY26nBMFcX8qOmRS8JGNl6lyYXdVDbe/o53pDlvPJRfgkkFQQMuZj1PL9g22BJFBRRmej3Uv85Ig8WM1/1T7okGrP88moNmlmJpygv+50mQ4/2U61Q==',key_name='tempest-keypair-1636541378',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:14:28Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='16d7af1670ea460db3d0422f176b6f98',ramdisk_id='',reservation_id='r-ysw8l28l',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-844604773',owner_user_name='tempest-AttachInterfacesTestJSON-844604773-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:14:29Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9a13f011f5b74a6f94a2d2c8e9104f4a',uuid=8ca0969d-04fe-43b8-8f05-68f183f888a9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3669ef72-0983-4ed3-b52c-6891cbc3edc2", "address": "fa:16:3e:96:9f:2d", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3669ef72-09", "ovs_interfaceid": "3669ef72-0983-4ed3-b52c-6891cbc3edc2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:15:08 compute-0 nova_compute[187185]: 2025-11-29 07:15:08.552 187189 DEBUG nova.network.os_vif_util [req-3dfb386e-7b26-469a-807a-93499800f6e1 req-9a7ba6db-7231-45c4-a12d-d35ead5aaa89 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Converting VIF {"id": "3669ef72-0983-4ed3-b52c-6891cbc3edc2", "address": "fa:16:3e:96:9f:2d", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3669ef72-09", "ovs_interfaceid": "3669ef72-0983-4ed3-b52c-6891cbc3edc2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:15:08 compute-0 nova_compute[187185]: 2025-11-29 07:15:08.553 187189 DEBUG nova.network.os_vif_util [req-3dfb386e-7b26-469a-807a-93499800f6e1 req-9a7ba6db-7231-45c4-a12d-d35ead5aaa89 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:96:9f:2d,bridge_name='br-int',has_traffic_filtering=True,id=3669ef72-0983-4ed3-b52c-6891cbc3edc2,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3669ef72-09') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:15:08 compute-0 nova_compute[187185]: 2025-11-29 07:15:08.553 187189 DEBUG os_vif [req-3dfb386e-7b26-469a-807a-93499800f6e1 req-9a7ba6db-7231-45c4-a12d-d35ead5aaa89 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:96:9f:2d,bridge_name='br-int',has_traffic_filtering=True,id=3669ef72-0983-4ed3-b52c-6891cbc3edc2,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3669ef72-09') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:15:08 compute-0 nova_compute[187185]: 2025-11-29 07:15:08.555 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:08 compute-0 nova_compute[187185]: 2025-11-29 07:15:08.555 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3669ef72-09, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:15:08 compute-0 nova_compute[187185]: 2025-11-29 07:15:08.556 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:15:08 compute-0 nova_compute[187185]: 2025-11-29 07:15:08.558 187189 INFO os_vif [req-3dfb386e-7b26-469a-807a-93499800f6e1 req-9a7ba6db-7231-45c4-a12d-d35ead5aaa89 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:96:9f:2d,bridge_name='br-int',has_traffic_filtering=True,id=3669ef72-0983-4ed3-b52c-6891cbc3edc2,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3669ef72-09')
Nov 29 07:15:08 compute-0 nova_compute[187185]: 2025-11-29 07:15:08.558 187189 DEBUG nova.virt.libvirt.guest [req-3dfb386e-7b26-469a-807a-93499800f6e1 req-9a7ba6db-7231-45c4-a12d-d35ead5aaa89 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <nova:name>tempest-AttachInterfacesTestJSON-server-575694004</nova:name>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <nova:creationTime>2025-11-29 07:15:08</nova:creationTime>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <nova:flavor name="m1.nano">
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <nova:memory>128</nova:memory>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <nova:disk>1</nova:disk>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <nova:swap>0</nova:swap>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <nova:vcpus>1</nova:vcpus>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   </nova:flavor>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <nova:owner>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <nova:user uuid="9a13f011f5b74a6f94a2d2c8e9104f4a">tempest-AttachInterfacesTestJSON-844604773-project-member</nova:user>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <nova:project uuid="16d7af1670ea460db3d0422f176b6f98">tempest-AttachInterfacesTestJSON-844604773</nova:project>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   </nova:owner>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   <nova:ports>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     <nova:port uuid="ef96623e-7b97-4117-b30e-902c4b9be2ae">
Nov 29 07:15:08 compute-0 nova_compute[187185]:       <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 29 07:15:08 compute-0 nova_compute[187185]:     </nova:port>
Nov 29 07:15:08 compute-0 nova_compute[187185]:   </nova:ports>
Nov 29 07:15:08 compute-0 nova_compute[187185]: </nova:instance>
Nov 29 07:15:08 compute-0 nova_compute[187185]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Nov 29 07:15:09 compute-0 nova_compute[187185]: 2025-11-29 07:15:09.223 187189 DEBUG nova.compute.manager [req-15ccd4f8-45ce-4c78-9d81-0ded558db835 req-f3621fe0-afa0-404d-b294-a15453dc211f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Received event network-vif-unplugged-3669ef72-0983-4ed3-b52c-6891cbc3edc2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:15:09 compute-0 nova_compute[187185]: 2025-11-29 07:15:09.224 187189 DEBUG oslo_concurrency.lockutils [req-15ccd4f8-45ce-4c78-9d81-0ded558db835 req-f3621fe0-afa0-404d-b294-a15453dc211f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "8ca0969d-04fe-43b8-8f05-68f183f888a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:15:09 compute-0 nova_compute[187185]: 2025-11-29 07:15:09.224 187189 DEBUG oslo_concurrency.lockutils [req-15ccd4f8-45ce-4c78-9d81-0ded558db835 req-f3621fe0-afa0-404d-b294-a15453dc211f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "8ca0969d-04fe-43b8-8f05-68f183f888a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:15:09 compute-0 nova_compute[187185]: 2025-11-29 07:15:09.224 187189 DEBUG oslo_concurrency.lockutils [req-15ccd4f8-45ce-4c78-9d81-0ded558db835 req-f3621fe0-afa0-404d-b294-a15453dc211f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "8ca0969d-04fe-43b8-8f05-68f183f888a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:15:09 compute-0 nova_compute[187185]: 2025-11-29 07:15:09.224 187189 DEBUG nova.compute.manager [req-15ccd4f8-45ce-4c78-9d81-0ded558db835 req-f3621fe0-afa0-404d-b294-a15453dc211f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] No waiting events found dispatching network-vif-unplugged-3669ef72-0983-4ed3-b52c-6891cbc3edc2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:15:09 compute-0 nova_compute[187185]: 2025-11-29 07:15:09.225 187189 DEBUG nova.compute.manager [req-15ccd4f8-45ce-4c78-9d81-0ded558db835 req-f3621fe0-afa0-404d-b294-a15453dc211f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Received event network-vif-unplugged-3669ef72-0983-4ed3-b52c-6891cbc3edc2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 07:15:09 compute-0 nova_compute[187185]: 2025-11-29 07:15:09.225 187189 DEBUG nova.compute.manager [req-15ccd4f8-45ce-4c78-9d81-0ded558db835 req-f3621fe0-afa0-404d-b294-a15453dc211f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Received event network-vif-plugged-3669ef72-0983-4ed3-b52c-6891cbc3edc2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:15:09 compute-0 nova_compute[187185]: 2025-11-29 07:15:09.225 187189 DEBUG oslo_concurrency.lockutils [req-15ccd4f8-45ce-4c78-9d81-0ded558db835 req-f3621fe0-afa0-404d-b294-a15453dc211f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "8ca0969d-04fe-43b8-8f05-68f183f888a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:15:09 compute-0 nova_compute[187185]: 2025-11-29 07:15:09.225 187189 DEBUG oslo_concurrency.lockutils [req-15ccd4f8-45ce-4c78-9d81-0ded558db835 req-f3621fe0-afa0-404d-b294-a15453dc211f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "8ca0969d-04fe-43b8-8f05-68f183f888a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:15:09 compute-0 nova_compute[187185]: 2025-11-29 07:15:09.225 187189 DEBUG oslo_concurrency.lockutils [req-15ccd4f8-45ce-4c78-9d81-0ded558db835 req-f3621fe0-afa0-404d-b294-a15453dc211f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "8ca0969d-04fe-43b8-8f05-68f183f888a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:15:09 compute-0 nova_compute[187185]: 2025-11-29 07:15:09.226 187189 DEBUG nova.compute.manager [req-15ccd4f8-45ce-4c78-9d81-0ded558db835 req-f3621fe0-afa0-404d-b294-a15453dc211f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] No waiting events found dispatching network-vif-plugged-3669ef72-0983-4ed3-b52c-6891cbc3edc2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:15:09 compute-0 nova_compute[187185]: 2025-11-29 07:15:09.226 187189 WARNING nova.compute.manager [req-15ccd4f8-45ce-4c78-9d81-0ded558db835 req-f3621fe0-afa0-404d-b294-a15453dc211f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Received unexpected event network-vif-plugged-3669ef72-0983-4ed3-b52c-6891cbc3edc2 for instance with vm_state active and task_state deleting.
Nov 29 07:15:09 compute-0 nova_compute[187185]: 2025-11-29 07:15:09.335 187189 DEBUG oslo_concurrency.lockutils [None req-b928c091-cc24-48f8-9a7d-c5cc8ce22b20 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "8ca0969d-04fe-43b8-8f05-68f183f888a9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:15:09 compute-0 nova_compute[187185]: 2025-11-29 07:15:09.335 187189 DEBUG oslo_concurrency.lockutils [None req-b928c091-cc24-48f8-9a7d-c5cc8ce22b20 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "8ca0969d-04fe-43b8-8f05-68f183f888a9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:15:09 compute-0 nova_compute[187185]: 2025-11-29 07:15:09.336 187189 DEBUG oslo_concurrency.lockutils [None req-b928c091-cc24-48f8-9a7d-c5cc8ce22b20 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "8ca0969d-04fe-43b8-8f05-68f183f888a9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:15:09 compute-0 nova_compute[187185]: 2025-11-29 07:15:09.336 187189 DEBUG oslo_concurrency.lockutils [None req-b928c091-cc24-48f8-9a7d-c5cc8ce22b20 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "8ca0969d-04fe-43b8-8f05-68f183f888a9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:15:09 compute-0 nova_compute[187185]: 2025-11-29 07:15:09.336 187189 DEBUG oslo_concurrency.lockutils [None req-b928c091-cc24-48f8-9a7d-c5cc8ce22b20 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "8ca0969d-04fe-43b8-8f05-68f183f888a9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:15:09 compute-0 nova_compute[187185]: 2025-11-29 07:15:09.351 187189 INFO nova.compute.manager [None req-b928c091-cc24-48f8-9a7d-c5cc8ce22b20 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Terminating instance
Nov 29 07:15:09 compute-0 nova_compute[187185]: 2025-11-29 07:15:09.398 187189 DEBUG nova.compute.manager [None req-b928c091-cc24-48f8-9a7d-c5cc8ce22b20 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 07:15:09 compute-0 kernel: tapef96623e-7b (unregistering): left promiscuous mode
Nov 29 07:15:09 compute-0 NetworkManager[55227]: <info>  [1764400509.4258] device (tapef96623e-7b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:15:09 compute-0 ovn_controller[95281]: 2025-11-29T07:15:09Z|00257|binding|INFO|Releasing lport ef96623e-7b97-4117-b30e-902c4b9be2ae from this chassis (sb_readonly=0)
Nov 29 07:15:09 compute-0 ovn_controller[95281]: 2025-11-29T07:15:09Z|00258|binding|INFO|Setting lport ef96623e-7b97-4117-b30e-902c4b9be2ae down in Southbound
Nov 29 07:15:09 compute-0 nova_compute[187185]: 2025-11-29 07:15:09.437 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:09 compute-0 ovn_controller[95281]: 2025-11-29T07:15:09Z|00259|binding|INFO|Removing iface tapef96623e-7b ovn-installed in OVS
Nov 29 07:15:09 compute-0 nova_compute[187185]: 2025-11-29 07:15:09.441 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:09.455 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:97:62:f8 10.100.0.4'], port_security=['fa:16:3e:97:62:f8 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '8ca0969d-04fe-43b8-8f05-68f183f888a9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-90812230-35cb-4e21-b16b-75b900100d8b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '16d7af1670ea460db3d0422f176b6f98', 'neutron:revision_number': '4', 'neutron:security_group_ids': '751306cd-ebe9-4aad-803e-19aae2b7594e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.224'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=41b9bfbf-a9b3-4bdb-9144-e5db6a660517, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=ef96623e-7b97-4117-b30e-902c4b9be2ae) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:15:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:09.456 104254 INFO neutron.agent.ovn.metadata.agent [-] Port ef96623e-7b97-4117-b30e-902c4b9be2ae in datapath 90812230-35cb-4e21-b16b-75b900100d8b unbound from our chassis
Nov 29 07:15:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:09.458 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 90812230-35cb-4e21-b16b-75b900100d8b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:15:09 compute-0 nova_compute[187185]: 2025-11-29 07:15:09.462 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:09.459 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[93b27e66-bb14-44b3-83ca-c7b7ceee3a34]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:15:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:09.463 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b namespace which is not needed anymore
Nov 29 07:15:09 compute-0 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d00000061.scope: Deactivated successfully.
Nov 29 07:15:09 compute-0 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d00000061.scope: Consumed 14.764s CPU time.
Nov 29 07:15:09 compute-0 systemd-machined[153486]: Machine qemu-35-instance-00000061 terminated.
Nov 29 07:15:09 compute-0 NetworkManager[55227]: <info>  [1764400509.6235] manager: (tapef96623e-7b): new Tun device (/org/freedesktop/NetworkManager/Devices/136)
Nov 29 07:15:09 compute-0 neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b[228324]: [NOTICE]   (228328) : haproxy version is 2.8.14-c23fe91
Nov 29 07:15:09 compute-0 neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b[228324]: [NOTICE]   (228328) : path to executable is /usr/sbin/haproxy
Nov 29 07:15:09 compute-0 neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b[228324]: [WARNING]  (228328) : Exiting Master process...
Nov 29 07:15:09 compute-0 neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b[228324]: [WARNING]  (228328) : Exiting Master process...
Nov 29 07:15:09 compute-0 neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b[228324]: [ALERT]    (228328) : Current worker (228330) exited with code 143 (Terminated)
Nov 29 07:15:09 compute-0 neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b[228324]: [WARNING]  (228328) : All workers exited. Exiting... (0)
Nov 29 07:15:09 compute-0 systemd[1]: libpod-6468a1df4bc8cf685f4cd7d65dafc022a082d67fb850e968c687a979106f425f.scope: Deactivated successfully.
Nov 29 07:15:09 compute-0 conmon[228324]: conmon 6468a1df4bc8cf685f4c <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-6468a1df4bc8cf685f4cd7d65dafc022a082d67fb850e968c687a979106f425f.scope/container/memory.events
Nov 29 07:15:09 compute-0 podman[228628]: 2025-11-29 07:15:09.646963513 +0000 UTC m=+0.056398286 container died 6468a1df4bc8cf685f4cd7d65dafc022a082d67fb850e968c687a979106f425f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 07:15:09 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6468a1df4bc8cf685f4cd7d65dafc022a082d67fb850e968c687a979106f425f-userdata-shm.mount: Deactivated successfully.
Nov 29 07:15:09 compute-0 systemd[1]: var-lib-containers-storage-overlay-d980344e8eb69823235e0cc58f5056209472fe6ba8eeb1fe84e65b7dd9493a55-merged.mount: Deactivated successfully.
Nov 29 07:15:09 compute-0 nova_compute[187185]: 2025-11-29 07:15:09.684 187189 INFO nova.virt.libvirt.driver [-] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Instance destroyed successfully.
Nov 29 07:15:09 compute-0 nova_compute[187185]: 2025-11-29 07:15:09.684 187189 DEBUG nova.objects.instance [None req-b928c091-cc24-48f8-9a7d-c5cc8ce22b20 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lazy-loading 'resources' on Instance uuid 8ca0969d-04fe-43b8-8f05-68f183f888a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:15:09 compute-0 podman[228628]: 2025-11-29 07:15:09.693739166 +0000 UTC m=+0.103173939 container cleanup 6468a1df4bc8cf685f4cd7d65dafc022a082d67fb850e968c687a979106f425f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 07:15:09 compute-0 nova_compute[187185]: 2025-11-29 07:15:09.707 187189 DEBUG nova.virt.libvirt.vif [None req-b928c091-cc24-48f8-9a7d-c5cc8ce22b20 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:14:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-575694004',display_name='tempest-AttachInterfacesTestJSON-server-575694004',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-575694004',id=97,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNgGd4Iacs7wY+oKmY26nBMFcX8qOmRS8JGNl6lyYXdVDbe/o53pDlvPJRfgkkFQQMuZj1PL9g22BJFBRRmej3Uv85Ig8WM1/1T7okGrP88moNmlmJpygv+50mQ4/2U61Q==',key_name='tempest-keypair-1636541378',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:14:28Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='16d7af1670ea460db3d0422f176b6f98',ramdisk_id='',reservation_id='r-ysw8l28l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-844604773',owner_user_name='tempest-AttachInterfacesTestJSON-844604773-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:14:29Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9a13f011f5b74a6f94a2d2c8e9104f4a',uuid=8ca0969d-04fe-43b8-8f05-68f183f888a9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ef96623e-7b97-4117-b30e-902c4b9be2ae", "address": "fa:16:3e:97:62:f8", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef96623e-7b", "ovs_interfaceid": "ef96623e-7b97-4117-b30e-902c4b9be2ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:15:09 compute-0 nova_compute[187185]: 2025-11-29 07:15:09.707 187189 DEBUG nova.network.os_vif_util [None req-b928c091-cc24-48f8-9a7d-c5cc8ce22b20 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converting VIF {"id": "ef96623e-7b97-4117-b30e-902c4b9be2ae", "address": "fa:16:3e:97:62:f8", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef96623e-7b", "ovs_interfaceid": "ef96623e-7b97-4117-b30e-902c4b9be2ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:15:09 compute-0 nova_compute[187185]: 2025-11-29 07:15:09.708 187189 DEBUG nova.network.os_vif_util [None req-b928c091-cc24-48f8-9a7d-c5cc8ce22b20 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:97:62:f8,bridge_name='br-int',has_traffic_filtering=True,id=ef96623e-7b97-4117-b30e-902c4b9be2ae,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef96623e-7b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:15:09 compute-0 nova_compute[187185]: 2025-11-29 07:15:09.708 187189 DEBUG os_vif [None req-b928c091-cc24-48f8-9a7d-c5cc8ce22b20 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:97:62:f8,bridge_name='br-int',has_traffic_filtering=True,id=ef96623e-7b97-4117-b30e-902c4b9be2ae,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef96623e-7b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:15:09 compute-0 nova_compute[187185]: 2025-11-29 07:15:09.709 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:09 compute-0 nova_compute[187185]: 2025-11-29 07:15:09.709 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapef96623e-7b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:15:09 compute-0 nova_compute[187185]: 2025-11-29 07:15:09.712 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:09 compute-0 nova_compute[187185]: 2025-11-29 07:15:09.716 187189 INFO os_vif [None req-b928c091-cc24-48f8-9a7d-c5cc8ce22b20 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:97:62:f8,bridge_name='br-int',has_traffic_filtering=True,id=ef96623e-7b97-4117-b30e-902c4b9be2ae,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef96623e-7b')
Nov 29 07:15:09 compute-0 nova_compute[187185]: 2025-11-29 07:15:09.717 187189 INFO nova.virt.libvirt.driver [None req-b928c091-cc24-48f8-9a7d-c5cc8ce22b20 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Deleting instance files /var/lib/nova/instances/8ca0969d-04fe-43b8-8f05-68f183f888a9_del
Nov 29 07:15:09 compute-0 nova_compute[187185]: 2025-11-29 07:15:09.718 187189 INFO nova.virt.libvirt.driver [None req-b928c091-cc24-48f8-9a7d-c5cc8ce22b20 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Deletion of /var/lib/nova/instances/8ca0969d-04fe-43b8-8f05-68f183f888a9_del complete
Nov 29 07:15:09 compute-0 systemd[1]: libpod-conmon-6468a1df4bc8cf685f4cd7d65dafc022a082d67fb850e968c687a979106f425f.scope: Deactivated successfully.
Nov 29 07:15:09 compute-0 podman[228673]: 2025-11-29 07:15:09.776179227 +0000 UTC m=+0.056701015 container remove 6468a1df4bc8cf685f4cd7d65dafc022a082d67fb850e968c687a979106f425f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:15:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:09.785 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[88642826-35e8-45eb-8e05-065e07e13b54]: (4, ('Sat Nov 29 07:15:09 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b (6468a1df4bc8cf685f4cd7d65dafc022a082d67fb850e968c687a979106f425f)\n6468a1df4bc8cf685f4cd7d65dafc022a082d67fb850e968c687a979106f425f\nSat Nov 29 07:15:09 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b (6468a1df4bc8cf685f4cd7d65dafc022a082d67fb850e968c687a979106f425f)\n6468a1df4bc8cf685f4cd7d65dafc022a082d67fb850e968c687a979106f425f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:15:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:09.788 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[ea7413d8-235c-4595-a194-6e8280cf8c98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:15:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:09.789 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap90812230-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:15:09 compute-0 nova_compute[187185]: 2025-11-29 07:15:09.792 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:09 compute-0 kernel: tap90812230-30: left promiscuous mode
Nov 29 07:15:09 compute-0 nova_compute[187185]: 2025-11-29 07:15:09.795 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:09.798 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[964ecb12-277d-420b-9382-f90044e34894]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:15:09 compute-0 nova_compute[187185]: 2025-11-29 07:15:09.811 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:09.816 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[049b8d2b-9727-48a4-8870-716b9278f306]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:15:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:09.818 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[00429a10-aa65-4086-99a2-e9d748d765d1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:15:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:09.842 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[a4bf807e-aba2-4b81-b35b-d5b3da0ac43d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 591432, 'reachable_time': 38882, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228689, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:15:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:09.847 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:15:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:09.847 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[ddb1ccda-00e6-42b9-a5df-b5edd6d91f66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:15:09 compute-0 systemd[1]: run-netns-ovnmeta\x2d90812230\x2d35cb\x2d4e21\x2db16b\x2d75b900100d8b.mount: Deactivated successfully.
Nov 29 07:15:10 compute-0 nova_compute[187185]: 2025-11-29 07:15:10.071 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:10 compute-0 nova_compute[187185]: 2025-11-29 07:15:10.246 187189 INFO nova.compute.manager [None req-b928c091-cc24-48f8-9a7d-c5cc8ce22b20 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Took 0.85 seconds to destroy the instance on the hypervisor.
Nov 29 07:15:10 compute-0 nova_compute[187185]: 2025-11-29 07:15:10.248 187189 DEBUG oslo.service.loopingcall [None req-b928c091-cc24-48f8-9a7d-c5cc8ce22b20 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 07:15:10 compute-0 nova_compute[187185]: 2025-11-29 07:15:10.248 187189 DEBUG nova.compute.manager [-] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 07:15:10 compute-0 nova_compute[187185]: 2025-11-29 07:15:10.248 187189 DEBUG nova.network.neutron [-] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 07:15:11 compute-0 nova_compute[187185]: 2025-11-29 07:15:11.224 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:11 compute-0 nova_compute[187185]: 2025-11-29 07:15:11.437 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:11 compute-0 nova_compute[187185]: 2025-11-29 07:15:11.861 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Updating instance_info_cache with network_info: [{"id": "ef96623e-7b97-4117-b30e-902c4b9be2ae", "address": "fa:16:3e:97:62:f8", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef96623e-7b", "ovs_interfaceid": "ef96623e-7b97-4117-b30e-902c4b9be2ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "3669ef72-0983-4ed3-b52c-6891cbc3edc2", "address": "fa:16:3e:96:9f:2d", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3669ef72-09", "ovs_interfaceid": "3669ef72-0983-4ed3-b52c-6891cbc3edc2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:15:11 compute-0 podman[228691]: 2025-11-29 07:15:11.890521263 +0000 UTC m=+0.146392289 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 29 07:15:11 compute-0 nova_compute[187185]: 2025-11-29 07:15:11.972 187189 DEBUG nova.compute.manager [req-67bd1701-74e4-402b-8612-34c7616dc946 req-710fd8c7-1e85-4533-bdc2-89099fa959fe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Received event network-vif-unplugged-ef96623e-7b97-4117-b30e-902c4b9be2ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:15:11 compute-0 nova_compute[187185]: 2025-11-29 07:15:11.972 187189 DEBUG oslo_concurrency.lockutils [req-67bd1701-74e4-402b-8612-34c7616dc946 req-710fd8c7-1e85-4533-bdc2-89099fa959fe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "8ca0969d-04fe-43b8-8f05-68f183f888a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:15:11 compute-0 nova_compute[187185]: 2025-11-29 07:15:11.973 187189 DEBUG oslo_concurrency.lockutils [req-67bd1701-74e4-402b-8612-34c7616dc946 req-710fd8c7-1e85-4533-bdc2-89099fa959fe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "8ca0969d-04fe-43b8-8f05-68f183f888a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:15:11 compute-0 nova_compute[187185]: 2025-11-29 07:15:11.973 187189 DEBUG oslo_concurrency.lockutils [req-67bd1701-74e4-402b-8612-34c7616dc946 req-710fd8c7-1e85-4533-bdc2-89099fa959fe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "8ca0969d-04fe-43b8-8f05-68f183f888a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:15:11 compute-0 nova_compute[187185]: 2025-11-29 07:15:11.973 187189 DEBUG nova.compute.manager [req-67bd1701-74e4-402b-8612-34c7616dc946 req-710fd8c7-1e85-4533-bdc2-89099fa959fe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] No waiting events found dispatching network-vif-unplugged-ef96623e-7b97-4117-b30e-902c4b9be2ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:15:11 compute-0 nova_compute[187185]: 2025-11-29 07:15:11.973 187189 DEBUG nova.compute.manager [req-67bd1701-74e4-402b-8612-34c7616dc946 req-710fd8c7-1e85-4533-bdc2-89099fa959fe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Received event network-vif-unplugged-ef96623e-7b97-4117-b30e-902c4b9be2ae for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 07:15:12 compute-0 nova_compute[187185]: 2025-11-29 07:15:12.174 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Releasing lock "refresh_cache-8ca0969d-04fe-43b8-8f05-68f183f888a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:15:12 compute-0 nova_compute[187185]: 2025-11-29 07:15:12.175 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 29 07:15:12 compute-0 nova_compute[187185]: 2025-11-29 07:15:12.177 187189 DEBUG oslo_concurrency.lockutils [None req-8ece8784-e0a1-4f81-8baf-90572bc0ca31 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquired lock "refresh_cache-8ca0969d-04fe-43b8-8f05-68f183f888a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:15:12 compute-0 nova_compute[187185]: 2025-11-29 07:15:12.177 187189 DEBUG nova.network.neutron [None req-8ece8784-e0a1-4f81-8baf-90572bc0ca31 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:15:12 compute-0 nova_compute[187185]: 2025-11-29 07:15:12.179 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:15:12 compute-0 nova_compute[187185]: 2025-11-29 07:15:12.179 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:15:12 compute-0 nova_compute[187185]: 2025-11-29 07:15:12.180 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:15:12 compute-0 nova_compute[187185]: 2025-11-29 07:15:12.180 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:15:12 compute-0 nova_compute[187185]: 2025-11-29 07:15:12.181 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:15:12 compute-0 nova_compute[187185]: 2025-11-29 07:15:12.181 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:15:12 compute-0 nova_compute[187185]: 2025-11-29 07:15:12.181 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:15:12 compute-0 nova_compute[187185]: 2025-11-29 07:15:12.181 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:15:12 compute-0 nova_compute[187185]: 2025-11-29 07:15:12.325 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:15:12 compute-0 nova_compute[187185]: 2025-11-29 07:15:12.325 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:15:12 compute-0 nova_compute[187185]: 2025-11-29 07:15:12.326 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:15:12 compute-0 nova_compute[187185]: 2025-11-29 07:15:12.326 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:15:12 compute-0 nova_compute[187185]: 2025-11-29 07:15:12.526 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:15:12 compute-0 nova_compute[187185]: 2025-11-29 07:15:12.527 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5716MB free_disk=73.29471588134766GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:15:12 compute-0 nova_compute[187185]: 2025-11-29 07:15:12.527 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:15:12 compute-0 nova_compute[187185]: 2025-11-29 07:15:12.528 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:15:12 compute-0 nova_compute[187185]: 2025-11-29 07:15:12.726 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance 8ca0969d-04fe-43b8-8f05-68f183f888a9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 07:15:12 compute-0 nova_compute[187185]: 2025-11-29 07:15:12.726 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:15:12 compute-0 nova_compute[187185]: 2025-11-29 07:15:12.726 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:15:12 compute-0 nova_compute[187185]: 2025-11-29 07:15:12.767 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:15:12 compute-0 nova_compute[187185]: 2025-11-29 07:15:12.840 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:15:12 compute-0 nova_compute[187185]: 2025-11-29 07:15:12.937 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:15:12 compute-0 nova_compute[187185]: 2025-11-29 07:15:12.938 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.411s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:15:13 compute-0 nova_compute[187185]: 2025-11-29 07:15:13.949 187189 DEBUG nova.network.neutron [-] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:15:13 compute-0 nova_compute[187185]: 2025-11-29 07:15:13.978 187189 INFO nova.network.neutron [None req-8ece8784-e0a1-4f81-8baf-90572bc0ca31 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Port 3669ef72-0983-4ed3-b52c-6891cbc3edc2 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Nov 29 07:15:13 compute-0 nova_compute[187185]: 2025-11-29 07:15:13.979 187189 DEBUG nova.network.neutron [None req-8ece8784-e0a1-4f81-8baf-90572bc0ca31 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Updating instance_info_cache with network_info: [{"id": "ef96623e-7b97-4117-b30e-902c4b9be2ae", "address": "fa:16:3e:97:62:f8", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef96623e-7b", "ovs_interfaceid": "ef96623e-7b97-4117-b30e-902c4b9be2ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:15:14 compute-0 nova_compute[187185]: 2025-11-29 07:15:14.021 187189 INFO nova.compute.manager [-] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Took 3.77 seconds to deallocate network for instance.
Nov 29 07:15:14 compute-0 nova_compute[187185]: 2025-11-29 07:15:14.088 187189 DEBUG oslo_concurrency.lockutils [None req-8ece8784-e0a1-4f81-8baf-90572bc0ca31 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Releasing lock "refresh_cache-8ca0969d-04fe-43b8-8f05-68f183f888a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:15:14 compute-0 nova_compute[187185]: 2025-11-29 07:15:14.174 187189 DEBUG nova.compute.manager [req-f62dd806-d753-434f-9d1f-81f954cab176 req-c3d74e6f-09f8-45f3-bea9-97ae3a33a7d8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Received event network-vif-plugged-ef96623e-7b97-4117-b30e-902c4b9be2ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:15:14 compute-0 nova_compute[187185]: 2025-11-29 07:15:14.175 187189 DEBUG oslo_concurrency.lockutils [req-f62dd806-d753-434f-9d1f-81f954cab176 req-c3d74e6f-09f8-45f3-bea9-97ae3a33a7d8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "8ca0969d-04fe-43b8-8f05-68f183f888a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:15:14 compute-0 nova_compute[187185]: 2025-11-29 07:15:14.175 187189 DEBUG oslo_concurrency.lockutils [req-f62dd806-d753-434f-9d1f-81f954cab176 req-c3d74e6f-09f8-45f3-bea9-97ae3a33a7d8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "8ca0969d-04fe-43b8-8f05-68f183f888a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:15:14 compute-0 nova_compute[187185]: 2025-11-29 07:15:14.175 187189 DEBUG oslo_concurrency.lockutils [req-f62dd806-d753-434f-9d1f-81f954cab176 req-c3d74e6f-09f8-45f3-bea9-97ae3a33a7d8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "8ca0969d-04fe-43b8-8f05-68f183f888a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:15:14 compute-0 nova_compute[187185]: 2025-11-29 07:15:14.175 187189 DEBUG nova.compute.manager [req-f62dd806-d753-434f-9d1f-81f954cab176 req-c3d74e6f-09f8-45f3-bea9-97ae3a33a7d8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] No waiting events found dispatching network-vif-plugged-ef96623e-7b97-4117-b30e-902c4b9be2ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:15:14 compute-0 nova_compute[187185]: 2025-11-29 07:15:14.176 187189 WARNING nova.compute.manager [req-f62dd806-d753-434f-9d1f-81f954cab176 req-c3d74e6f-09f8-45f3-bea9-97ae3a33a7d8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Received unexpected event network-vif-plugged-ef96623e-7b97-4117-b30e-902c4b9be2ae for instance with vm_state active and task_state deleting.
Nov 29 07:15:14 compute-0 nova_compute[187185]: 2025-11-29 07:15:14.176 187189 DEBUG nova.compute.manager [req-f62dd806-d753-434f-9d1f-81f954cab176 req-c3d74e6f-09f8-45f3-bea9-97ae3a33a7d8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Received event network-vif-deleted-ef96623e-7b97-4117-b30e-902c4b9be2ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:15:14 compute-0 nova_compute[187185]: 2025-11-29 07:15:14.176 187189 INFO nova.compute.manager [req-f62dd806-d753-434f-9d1f-81f954cab176 req-c3d74e6f-09f8-45f3-bea9-97ae3a33a7d8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Neutron deleted interface ef96623e-7b97-4117-b30e-902c4b9be2ae; detaching it from the instance and deleting it from the info cache
Nov 29 07:15:14 compute-0 nova_compute[187185]: 2025-11-29 07:15:14.177 187189 DEBUG nova.network.neutron [req-f62dd806-d753-434f-9d1f-81f954cab176 req-c3d74e6f-09f8-45f3-bea9-97ae3a33a7d8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:15:14 compute-0 nova_compute[187185]: 2025-11-29 07:15:14.178 187189 DEBUG oslo_concurrency.lockutils [None req-8ece8784-e0a1-4f81-8baf-90572bc0ca31 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "interface-8ca0969d-04fe-43b8-8f05-68f183f888a9-3669ef72-0983-4ed3-b52c-6891cbc3edc2" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 7.946s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:15:14 compute-0 nova_compute[187185]: 2025-11-29 07:15:14.383 187189 DEBUG nova.compute.manager [req-f62dd806-d753-434f-9d1f-81f954cab176 req-c3d74e6f-09f8-45f3-bea9-97ae3a33a7d8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Detach interface failed, port_id=ef96623e-7b97-4117-b30e-902c4b9be2ae, reason: Instance 8ca0969d-04fe-43b8-8f05-68f183f888a9 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Nov 29 07:15:14 compute-0 nova_compute[187185]: 2025-11-29 07:15:14.388 187189 DEBUG oslo_concurrency.lockutils [None req-b928c091-cc24-48f8-9a7d-c5cc8ce22b20 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:15:14 compute-0 nova_compute[187185]: 2025-11-29 07:15:14.389 187189 DEBUG oslo_concurrency.lockutils [None req-b928c091-cc24-48f8-9a7d-c5cc8ce22b20 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:15:14 compute-0 nova_compute[187185]: 2025-11-29 07:15:14.451 187189 DEBUG nova.compute.provider_tree [None req-b928c091-cc24-48f8-9a7d-c5cc8ce22b20 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:15:14 compute-0 nova_compute[187185]: 2025-11-29 07:15:14.712 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:14 compute-0 nova_compute[187185]: 2025-11-29 07:15:14.814 187189 DEBUG nova.scheduler.client.report [None req-b928c091-cc24-48f8-9a7d-c5cc8ce22b20 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:15:14 compute-0 nova_compute[187185]: 2025-11-29 07:15:14.961 187189 DEBUG oslo_concurrency.lockutils [None req-b928c091-cc24-48f8-9a7d-c5cc8ce22b20 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.572s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:15:15 compute-0 nova_compute[187185]: 2025-11-29 07:15:15.058 187189 INFO nova.scheduler.client.report [None req-b928c091-cc24-48f8-9a7d-c5cc8ce22b20 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Deleted allocations for instance 8ca0969d-04fe-43b8-8f05-68f183f888a9
Nov 29 07:15:15 compute-0 nova_compute[187185]: 2025-11-29 07:15:15.118 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:15 compute-0 nova_compute[187185]: 2025-11-29 07:15:15.287 187189 DEBUG oslo_concurrency.lockutils [None req-b928c091-cc24-48f8-9a7d-c5cc8ce22b20 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "8ca0969d-04fe-43b8-8f05-68f183f888a9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.952s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:15:15 compute-0 nova_compute[187185]: 2025-11-29 07:15:15.590 187189 DEBUG nova.compute.manager [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Stashing vm_state: stopped _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Nov 29 07:15:15 compute-0 nova_compute[187185]: 2025-11-29 07:15:15.751 187189 DEBUG oslo_concurrency.lockutils [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:15:15 compute-0 nova_compute[187185]: 2025-11-29 07:15:15.752 187189 DEBUG oslo_concurrency.lockutils [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:15:15 compute-0 nova_compute[187185]: 2025-11-29 07:15:15.804 187189 DEBUG nova.objects.instance [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lazy-loading 'pci_requests' on Instance uuid 084a0f8e-19b7-4b24-a503-c015b26addbc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:15:15 compute-0 nova_compute[187185]: 2025-11-29 07:15:15.829 187189 DEBUG nova.virt.hardware [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 07:15:15 compute-0 nova_compute[187185]: 2025-11-29 07:15:15.829 187189 INFO nova.compute.claims [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Claim successful on node compute-0.ctlplane.example.com
Nov 29 07:15:15 compute-0 nova_compute[187185]: 2025-11-29 07:15:15.830 187189 DEBUG nova.objects.instance [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lazy-loading 'resources' on Instance uuid 084a0f8e-19b7-4b24-a503-c015b26addbc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:15:15 compute-0 nova_compute[187185]: 2025-11-29 07:15:15.846 187189 DEBUG nova.objects.instance [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lazy-loading 'pci_devices' on Instance uuid 084a0f8e-19b7-4b24-a503-c015b26addbc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:15:15 compute-0 nova_compute[187185]: 2025-11-29 07:15:15.899 187189 INFO nova.compute.resource_tracker [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Updating resource usage from migration 851ef7a8-9a43-4d1f-809d-562a326079bb
Nov 29 07:15:15 compute-0 nova_compute[187185]: 2025-11-29 07:15:15.899 187189 DEBUG nova.compute.resource_tracker [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Starting to track incoming migration 851ef7a8-9a43-4d1f-809d-562a326079bb with flavor e29df891-dca5-4a1c-9258-dc512a46956f _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Nov 29 07:15:15 compute-0 nova_compute[187185]: 2025-11-29 07:15:15.961 187189 DEBUG nova.compute.provider_tree [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:15:16 compute-0 nova_compute[187185]: 2025-11-29 07:15:16.259 187189 DEBUG nova.scheduler.client.report [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:15:16 compute-0 nova_compute[187185]: 2025-11-29 07:15:16.464 187189 DEBUG oslo_concurrency.lockutils [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.712s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:15:16 compute-0 nova_compute[187185]: 2025-11-29 07:15:16.464 187189 INFO nova.compute.manager [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Migrating
Nov 29 07:15:16 compute-0 nova_compute[187185]: 2025-11-29 07:15:16.934 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:15:17 compute-0 podman[228718]: 2025-11-29 07:15:17.80196929 +0000 UTC m=+0.062948590 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 07:15:19 compute-0 nova_compute[187185]: 2025-11-29 07:15:19.716 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:19 compute-0 podman[228742]: 2025-11-29 07:15:19.851643737 +0000 UTC m=+0.104533797 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd)
Nov 29 07:15:19 compute-0 podman[228743]: 2025-11-29 07:15:19.851540614 +0000 UTC m=+0.094846323 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 07:15:20 compute-0 nova_compute[187185]: 2025-11-29 07:15:20.121 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:24 compute-0 sshd-session[228781]: Accepted publickey for nova from 192.168.122.101 port 59436 ssh2: ECDSA SHA256:TRxHqyAgYthLa8Amy2/P9EvhpVYcaH8++okwzvcHmaY
Nov 29 07:15:24 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Nov 29 07:15:24 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 29 07:15:24 compute-0 systemd-logind[788]: New session 28 of user nova.
Nov 29 07:15:24 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 29 07:15:24 compute-0 systemd[1]: Starting User Manager for UID 42436...
Nov 29 07:15:24 compute-0 systemd[228785]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 29 07:15:24 compute-0 systemd[228785]: Queued start job for default target Main User Target.
Nov 29 07:15:24 compute-0 systemd[228785]: Created slice User Application Slice.
Nov 29 07:15:24 compute-0 systemd[228785]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 29 07:15:24 compute-0 systemd[228785]: Started Daily Cleanup of User's Temporary Directories.
Nov 29 07:15:24 compute-0 systemd[228785]: Reached target Paths.
Nov 29 07:15:24 compute-0 systemd[228785]: Reached target Timers.
Nov 29 07:15:24 compute-0 systemd[228785]: Starting D-Bus User Message Bus Socket...
Nov 29 07:15:24 compute-0 systemd[228785]: Starting Create User's Volatile Files and Directories...
Nov 29 07:15:24 compute-0 systemd[228785]: Finished Create User's Volatile Files and Directories.
Nov 29 07:15:24 compute-0 systemd[228785]: Listening on D-Bus User Message Bus Socket.
Nov 29 07:15:24 compute-0 systemd[228785]: Reached target Sockets.
Nov 29 07:15:24 compute-0 systemd[228785]: Reached target Basic System.
Nov 29 07:15:24 compute-0 systemd[228785]: Reached target Main User Target.
Nov 29 07:15:24 compute-0 systemd[228785]: Startup finished in 200ms.
Nov 29 07:15:24 compute-0 systemd[1]: Started User Manager for UID 42436.
Nov 29 07:15:24 compute-0 systemd[1]: Started Session 28 of User nova.
Nov 29 07:15:24 compute-0 sshd-session[228781]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 29 07:15:24 compute-0 sshd-session[228800]: Received disconnect from 192.168.122.101 port 59436:11: disconnected by user
Nov 29 07:15:24 compute-0 sshd-session[228800]: Disconnected from user nova 192.168.122.101 port 59436
Nov 29 07:15:24 compute-0 sshd-session[228781]: pam_unix(sshd:session): session closed for user nova
Nov 29 07:15:24 compute-0 systemd[1]: session-28.scope: Deactivated successfully.
Nov 29 07:15:24 compute-0 systemd-logind[788]: Session 28 logged out. Waiting for processes to exit.
Nov 29 07:15:24 compute-0 systemd-logind[788]: Removed session 28.
Nov 29 07:15:24 compute-0 nova_compute[187185]: 2025-11-29 07:15:24.675 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400509.6741748, 8ca0969d-04fe-43b8-8f05-68f183f888a9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:15:24 compute-0 nova_compute[187185]: 2025-11-29 07:15:24.677 187189 INFO nova.compute.manager [-] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] VM Stopped (Lifecycle Event)
Nov 29 07:15:24 compute-0 nova_compute[187185]: 2025-11-29 07:15:24.719 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:24 compute-0 sshd-session[228802]: Accepted publickey for nova from 192.168.122.101 port 59442 ssh2: ECDSA SHA256:TRxHqyAgYthLa8Amy2/P9EvhpVYcaH8++okwzvcHmaY
Nov 29 07:15:24 compute-0 systemd-logind[788]: New session 30 of user nova.
Nov 29 07:15:24 compute-0 systemd[1]: Started Session 30 of User nova.
Nov 29 07:15:24 compute-0 sshd-session[228802]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 29 07:15:24 compute-0 sshd-session[228805]: Received disconnect from 192.168.122.101 port 59442:11: disconnected by user
Nov 29 07:15:24 compute-0 sshd-session[228805]: Disconnected from user nova 192.168.122.101 port 59442
Nov 29 07:15:24 compute-0 sshd-session[228802]: pam_unix(sshd:session): session closed for user nova
Nov 29 07:15:24 compute-0 systemd[1]: session-30.scope: Deactivated successfully.
Nov 29 07:15:24 compute-0 systemd-logind[788]: Session 30 logged out. Waiting for processes to exit.
Nov 29 07:15:24 compute-0 systemd-logind[788]: Removed session 30.
Nov 29 07:15:24 compute-0 nova_compute[187185]: 2025-11-29 07:15:24.946 187189 DEBUG nova.compute.manager [None req-a61ef30b-f94e-49ce-90e4-3f593786cf91 - - - - - -] [instance: 8ca0969d-04fe-43b8-8f05-68f183f888a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:15:25 compute-0 nova_compute[187185]: 2025-11-29 07:15:25.123 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:25 compute-0 sshd-session[228807]: Accepted publickey for nova from 192.168.122.101 port 59444 ssh2: ECDSA SHA256:TRxHqyAgYthLa8Amy2/P9EvhpVYcaH8++okwzvcHmaY
Nov 29 07:15:25 compute-0 systemd-logind[788]: New session 31 of user nova.
Nov 29 07:15:25 compute-0 systemd[1]: Started Session 31 of User nova.
Nov 29 07:15:25 compute-0 sshd-session[228807]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 29 07:15:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:25.504 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:15:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:25.505 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:15:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:25.505 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:15:25 compute-0 sshd-session[228810]: Received disconnect from 192.168.122.101 port 59444:11: disconnected by user
Nov 29 07:15:25 compute-0 sshd-session[228810]: Disconnected from user nova 192.168.122.101 port 59444
Nov 29 07:15:25 compute-0 sshd-session[228807]: pam_unix(sshd:session): session closed for user nova
Nov 29 07:15:25 compute-0 systemd[1]: session-31.scope: Deactivated successfully.
Nov 29 07:15:25 compute-0 systemd-logind[788]: Session 31 logged out. Waiting for processes to exit.
Nov 29 07:15:25 compute-0 systemd-logind[788]: Removed session 31.
Nov 29 07:15:25 compute-0 sshd-session[228812]: Accepted publickey for nova from 192.168.122.101 port 59460 ssh2: ECDSA SHA256:TRxHqyAgYthLa8Amy2/P9EvhpVYcaH8++okwzvcHmaY
Nov 29 07:15:25 compute-0 systemd-logind[788]: New session 32 of user nova.
Nov 29 07:15:25 compute-0 systemd[1]: Started Session 32 of User nova.
Nov 29 07:15:25 compute-0 sshd-session[228812]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 29 07:15:25 compute-0 sshd-session[228816]: Received disconnect from 192.168.122.101 port 59460:11: disconnected by user
Nov 29 07:15:25 compute-0 sshd-session[228816]: Disconnected from user nova 192.168.122.101 port 59460
Nov 29 07:15:25 compute-0 sshd-session[228812]: pam_unix(sshd:session): session closed for user nova
Nov 29 07:15:25 compute-0 systemd[1]: session-32.scope: Deactivated successfully.
Nov 29 07:15:25 compute-0 systemd-logind[788]: Session 32 logged out. Waiting for processes to exit.
Nov 29 07:15:25 compute-0 systemd-logind[788]: Removed session 32.
Nov 29 07:15:26 compute-0 sshd-session[228818]: Accepted publickey for nova from 192.168.122.101 port 59472 ssh2: ECDSA SHA256:TRxHqyAgYthLa8Amy2/P9EvhpVYcaH8++okwzvcHmaY
Nov 29 07:15:26 compute-0 systemd-logind[788]: New session 33 of user nova.
Nov 29 07:15:26 compute-0 systemd[1]: Started Session 33 of User nova.
Nov 29 07:15:26 compute-0 sshd-session[228818]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 29 07:15:26 compute-0 sshd-session[228821]: Received disconnect from 192.168.122.101 port 59472:11: disconnected by user
Nov 29 07:15:26 compute-0 sshd-session[228821]: Disconnected from user nova 192.168.122.101 port 59472
Nov 29 07:15:26 compute-0 sshd-session[228818]: pam_unix(sshd:session): session closed for user nova
Nov 29 07:15:26 compute-0 systemd[1]: session-33.scope: Deactivated successfully.
Nov 29 07:15:26 compute-0 systemd-logind[788]: Session 33 logged out. Waiting for processes to exit.
Nov 29 07:15:26 compute-0 systemd-logind[788]: Removed session 33.
Nov 29 07:15:28 compute-0 nova_compute[187185]: 2025-11-29 07:15:28.354 187189 INFO nova.network.neutron [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Updating port 60943dec-d420-449f-abc3-233df163ebed with attributes {'binding:host_id': 'compute-0.ctlplane.example.com', 'device_owner': 'compute:nova'}
Nov 29 07:15:29 compute-0 nova_compute[187185]: 2025-11-29 07:15:29.722 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:30 compute-0 nova_compute[187185]: 2025-11-29 07:15:30.162 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:30 compute-0 nova_compute[187185]: 2025-11-29 07:15:30.188 187189 DEBUG oslo_concurrency.lockutils [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "refresh_cache-084a0f8e-19b7-4b24-a503-c015b26addbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:15:30 compute-0 nova_compute[187185]: 2025-11-29 07:15:30.188 187189 DEBUG oslo_concurrency.lockutils [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquired lock "refresh_cache-084a0f8e-19b7-4b24-a503-c015b26addbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:15:30 compute-0 nova_compute[187185]: 2025-11-29 07:15:30.189 187189 DEBUG nova.network.neutron [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:15:30 compute-0 nova_compute[187185]: 2025-11-29 07:15:30.401 187189 DEBUG nova.compute.manager [req-f5d43963-5577-4a95-9cde-8effa35a8502 req-e18777f2-8199-4795-925c-76f7489013ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Received event network-changed-60943dec-d420-449f-abc3-233df163ebed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:15:30 compute-0 nova_compute[187185]: 2025-11-29 07:15:30.402 187189 DEBUG nova.compute.manager [req-f5d43963-5577-4a95-9cde-8effa35a8502 req-e18777f2-8199-4795-925c-76f7489013ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Refreshing instance network info cache due to event network-changed-60943dec-d420-449f-abc3-233df163ebed. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:15:30 compute-0 nova_compute[187185]: 2025-11-29 07:15:30.402 187189 DEBUG oslo_concurrency.lockutils [req-f5d43963-5577-4a95-9cde-8effa35a8502 req-e18777f2-8199-4795-925c-76f7489013ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-084a0f8e-19b7-4b24-a503-c015b26addbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.028 187189 DEBUG nova.network.neutron [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Updating instance_info_cache with network_info: [{"id": "60943dec-d420-449f-abc3-233df163ebed", "address": "fa:16:3e:04:06:9e", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60943dec-d4", "ovs_interfaceid": "60943dec-d420-449f-abc3-233df163ebed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.045 187189 DEBUG oslo_concurrency.lockutils [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Releasing lock "refresh_cache-084a0f8e-19b7-4b24-a503-c015b26addbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.049 187189 DEBUG oslo_concurrency.lockutils [req-f5d43963-5577-4a95-9cde-8effa35a8502 req-e18777f2-8199-4795-925c-76f7489013ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-084a0f8e-19b7-4b24-a503-c015b26addbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.049 187189 DEBUG nova.network.neutron [req-f5d43963-5577-4a95-9cde-8effa35a8502 req-e18777f2-8199-4795-925c-76f7489013ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Refreshing network info cache for port 60943dec-d420-449f-abc3-233df163ebed _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.192 187189 DEBUG nova.virt.libvirt.driver [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.196 187189 DEBUG nova.virt.libvirt.driver [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.197 187189 INFO nova.virt.libvirt.driver [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Creating image(s)
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.199 187189 DEBUG nova.objects.instance [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lazy-loading 'trusted_certs' on Instance uuid 084a0f8e-19b7-4b24-a503-c015b26addbc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.212 187189 DEBUG oslo_concurrency.processutils [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.285 187189 DEBUG oslo_concurrency.processutils [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.286 187189 DEBUG nova.virt.disk.api [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Checking if we can resize image /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.287 187189 DEBUG oslo_concurrency.processutils [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.350 187189 DEBUG oslo_concurrency.processutils [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.351 187189 DEBUG nova.virt.disk.api [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Cannot resize image /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.374 187189 DEBUG nova.virt.libvirt.driver [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.375 187189 DEBUG nova.virt.libvirt.driver [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Ensure instance console log exists: /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.376 187189 DEBUG oslo_concurrency.lockutils [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.377 187189 DEBUG oslo_concurrency.lockutils [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.377 187189 DEBUG oslo_concurrency.lockutils [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.383 187189 DEBUG nova.virt.libvirt.driver [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Start _get_guest_xml network_info=[{"id": "60943dec-d420-449f-abc3-233df163ebed", "address": "fa:16:3e:04:06:9e", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-1072835336-network", "vif_mac": "fa:16:3e:04:06:9e"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60943dec-d4", "ovs_interfaceid": "60943dec-d420-449f-abc3-233df163ebed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.389 187189 WARNING nova.virt.libvirt.driver [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.402 187189 DEBUG nova.virt.libvirt.host [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.403 187189 DEBUG nova.virt.libvirt.host [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.409 187189 DEBUG nova.virt.libvirt.host [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.410 187189 DEBUG nova.virt.libvirt.host [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.413 187189 DEBUG nova.virt.libvirt.driver [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.413 187189 DEBUG nova.virt.hardware [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e29df891-dca5-4a1c-9258-dc512a46956f',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.414 187189 DEBUG nova.virt.hardware [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.415 187189 DEBUG nova.virt.hardware [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.415 187189 DEBUG nova.virt.hardware [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.416 187189 DEBUG nova.virt.hardware [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.416 187189 DEBUG nova.virt.hardware [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.416 187189 DEBUG nova.virt.hardware [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.417 187189 DEBUG nova.virt.hardware [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.417 187189 DEBUG nova.virt.hardware [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.418 187189 DEBUG nova.virt.hardware [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.418 187189 DEBUG nova.virt.hardware [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.419 187189 DEBUG nova.objects.instance [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lazy-loading 'vcpu_model' on Instance uuid 084a0f8e-19b7-4b24-a503-c015b26addbc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.444 187189 DEBUG oslo_concurrency.processutils [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.515 187189 DEBUG oslo_concurrency.processutils [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk.config --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.516 187189 DEBUG oslo_concurrency.lockutils [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "/var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.517 187189 DEBUG oslo_concurrency.lockutils [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "/var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.518 187189 DEBUG oslo_concurrency.lockutils [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "/var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.519 187189 DEBUG nova.virt.libvirt.vif [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:11:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-734207825',display_name='tempest-ServerActionsTestOtherB-server-734207825',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-734207825',id=96,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJEYU7KNgNpvYMhWLcgNKb4JeWm+l16ttLKZ2We4gLp8YMbZFLJD2i4RZSQXciBvCLn4uXa9U2Zxsdygka87gys3pZZ16d1VbC25mryAsCgbm8dp7GriXd9FfJytMY+M+Q==',key_name='tempest-keypair-1534024740',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:12:56Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='32e51e3a9a8f4a1ca6e022735ebf5f7b',ramdisk_id='',reservation_id='r-4lueok56',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='stopped',owner_project_name='tempest-ServerActionsTestOtherB-1538648925',owner_user_name='tempest-ServerActionsTestOtherB-1538648925-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:15:27Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ee2d4931cb504b13b92a2f52c95c05ce',uuid=084a0f8e-19b7-4b24-a503-c015b26addbc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "60943dec-d420-449f-abc3-233df163ebed", "address": "fa:16:3e:04:06:9e", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": 
"tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-1072835336-network", "vif_mac": "fa:16:3e:04:06:9e"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60943dec-d4", "ovs_interfaceid": "60943dec-d420-449f-abc3-233df163ebed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.520 187189 DEBUG nova.network.os_vif_util [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Converting VIF {"id": "60943dec-d420-449f-abc3-233df163ebed", "address": "fa:16:3e:04:06:9e", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-1072835336-network", "vif_mac": "fa:16:3e:04:06:9e"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60943dec-d4", "ovs_interfaceid": "60943dec-d420-449f-abc3-233df163ebed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.521 187189 DEBUG nova.network.os_vif_util [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:06:9e,bridge_name='br-int',has_traffic_filtering=True,id=60943dec-d420-449f-abc3-233df163ebed,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60943dec-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.524 187189 DEBUG nova.virt.libvirt.driver [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:15:32 compute-0 nova_compute[187185]:   <uuid>084a0f8e-19b7-4b24-a503-c015b26addbc</uuid>
Nov 29 07:15:32 compute-0 nova_compute[187185]:   <name>instance-00000060</name>
Nov 29 07:15:32 compute-0 nova_compute[187185]:   <memory>196608</memory>
Nov 29 07:15:32 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:15:32 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:15:32 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:       <nova:name>tempest-ServerActionsTestOtherB-server-734207825</nova:name>
Nov 29 07:15:32 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:15:32</nova:creationTime>
Nov 29 07:15:32 compute-0 nova_compute[187185]:       <nova:flavor name="m1.micro">
Nov 29 07:15:32 compute-0 nova_compute[187185]:         <nova:memory>192</nova:memory>
Nov 29 07:15:32 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:15:32 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:15:32 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:15:32 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:15:32 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:15:32 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:15:32 compute-0 nova_compute[187185]:         <nova:user uuid="ee2d4931cb504b13b92a2f52c95c05ce">tempest-ServerActionsTestOtherB-1538648925-project-member</nova:user>
Nov 29 07:15:32 compute-0 nova_compute[187185]:         <nova:project uuid="32e51e3a9a8f4a1ca6e022735ebf5f7b">tempest-ServerActionsTestOtherB-1538648925</nova:project>
Nov 29 07:15:32 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:15:32 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 07:15:32 compute-0 nova_compute[187185]:         <nova:port uuid="60943dec-d420-449f-abc3-233df163ebed">
Nov 29 07:15:32 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:15:32 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:15:32 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:15:32 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <system>
Nov 29 07:15:32 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:15:32 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:15:32 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:15:32 compute-0 nova_compute[187185]:       <entry name="serial">084a0f8e-19b7-4b24-a503-c015b26addbc</entry>
Nov 29 07:15:32 compute-0 nova_compute[187185]:       <entry name="uuid">084a0f8e-19b7-4b24-a503-c015b26addbc</entry>
Nov 29 07:15:32 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     </system>
Nov 29 07:15:32 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:15:32 compute-0 nova_compute[187185]:   <os>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:   </os>
Nov 29 07:15:32 compute-0 nova_compute[187185]:   <features>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:   </features>
Nov 29 07:15:32 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:15:32 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:15:32 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:15:32 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:15:32 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk.config"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:15:32 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:04:06:9e"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:       <target dev="tap60943dec-d4"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:15:32 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/console.log" append="off"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <video>
Nov 29 07:15:32 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     </video>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:15:32 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:15:32 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:15:32 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:15:32 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:15:32 compute-0 nova_compute[187185]: </domain>
Nov 29 07:15:32 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.526 187189 DEBUG nova.virt.libvirt.vif [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:11:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-734207825',display_name='tempest-ServerActionsTestOtherB-server-734207825',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-734207825',id=96,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJEYU7KNgNpvYMhWLcgNKb4JeWm+l16ttLKZ2We4gLp8YMbZFLJD2i4RZSQXciBvCLn4uXa9U2Zxsdygka87gys3pZZ16d1VbC25mryAsCgbm8dp7GriXd9FfJytMY+M+Q==',key_name='tempest-keypair-1534024740',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:12:56Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='32e51e3a9a8f4a1ca6e022735ebf5f7b',ramdisk_id='',reservation_id='r-4lueok56',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='stopped',owner_project_name='tempest-ServerActionsTestOtherB-1538648925',owner_user_name='tempest-ServerActionsTestOtherB-1538648925-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:15:27Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ee2d4931cb504b13b92a2f52c95c05ce',uuid=084a0f8e-19b7-4b24-a503-c015b26addbc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "60943dec-d420-449f-abc3-233df163ebed", "address": "fa:16:3e:04:06:9e", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": 
"tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-1072835336-network", "vif_mac": "fa:16:3e:04:06:9e"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60943dec-d4", "ovs_interfaceid": "60943dec-d420-449f-abc3-233df163ebed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.526 187189 DEBUG nova.network.os_vif_util [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Converting VIF {"id": "60943dec-d420-449f-abc3-233df163ebed", "address": "fa:16:3e:04:06:9e", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-1072835336-network", "vif_mac": "fa:16:3e:04:06:9e"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60943dec-d4", "ovs_interfaceid": "60943dec-d420-449f-abc3-233df163ebed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.527 187189 DEBUG nova.network.os_vif_util [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:06:9e,bridge_name='br-int',has_traffic_filtering=True,id=60943dec-d420-449f-abc3-233df163ebed,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60943dec-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.528 187189 DEBUG os_vif [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:06:9e,bridge_name='br-int',has_traffic_filtering=True,id=60943dec-d420-449f-abc3-233df163ebed,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60943dec-d4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.528 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.529 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.530 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.533 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.533 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap60943dec-d4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.534 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap60943dec-d4, col_values=(('external_ids', {'iface-id': '60943dec-d420-449f-abc3-233df163ebed', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:04:06:9e', 'vm-uuid': '084a0f8e-19b7-4b24-a503-c015b26addbc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.536 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:32 compute-0 NetworkManager[55227]: <info>  [1764400532.5373] manager: (tap60943dec-d4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/137)
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.540 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.549 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.551 187189 INFO os_vif [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:06:9e,bridge_name='br-int',has_traffic_filtering=True,id=60943dec-d420-449f-abc3-233df163ebed,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60943dec-d4')
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.640 187189 DEBUG nova.virt.libvirt.driver [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.641 187189 DEBUG nova.virt.libvirt.driver [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.641 187189 DEBUG nova.virt.libvirt.driver [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] No VIF found with MAC fa:16:3e:04:06:9e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.642 187189 INFO nova.virt.libvirt.driver [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Using config drive
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.642 187189 DEBUG nova.compute.manager [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:15:32 compute-0 nova_compute[187185]: 2025-11-29 07:15:32.643 187189 DEBUG nova.virt.libvirt.driver [None req-ca3716b1-6e81-418b-83d4-2ab4092839bc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Nov 29 07:15:32 compute-0 podman[228836]: 2025-11-29 07:15:32.833967953 +0000 UTC m=+0.073926282 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 07:15:32 compute-0 podman[228834]: 2025-11-29 07:15:32.835324761 +0000 UTC m=+0.090019777 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 07:15:32 compute-0 podman[228835]: 2025-11-29 07:15:32.847886526 +0000 UTC m=+0.096465869 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, managed_by=edpm_ansible, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, distribution-scope=public)
Nov 29 07:15:34 compute-0 nova_compute[187185]: 2025-11-29 07:15:34.118 187189 DEBUG nova.network.neutron [req-f5d43963-5577-4a95-9cde-8effa35a8502 req-e18777f2-8199-4795-925c-76f7489013ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Updated VIF entry in instance network info cache for port 60943dec-d420-449f-abc3-233df163ebed. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:15:34 compute-0 nova_compute[187185]: 2025-11-29 07:15:34.119 187189 DEBUG nova.network.neutron [req-f5d43963-5577-4a95-9cde-8effa35a8502 req-e18777f2-8199-4795-925c-76f7489013ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Updating instance_info_cache with network_info: [{"id": "60943dec-d420-449f-abc3-233df163ebed", "address": "fa:16:3e:04:06:9e", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60943dec-d4", "ovs_interfaceid": "60943dec-d420-449f-abc3-233df163ebed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:15:34 compute-0 nova_compute[187185]: 2025-11-29 07:15:34.140 187189 DEBUG oslo_concurrency.lockutils [req-f5d43963-5577-4a95-9cde-8effa35a8502 req-e18777f2-8199-4795-925c-76f7489013ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-084a0f8e-19b7-4b24-a503-c015b26addbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:15:35 compute-0 nova_compute[187185]: 2025-11-29 07:15:35.164 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:36 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Nov 29 07:15:36 compute-0 systemd[228785]: Activating special unit Exit the Session...
Nov 29 07:15:36 compute-0 systemd[228785]: Stopped target Main User Target.
Nov 29 07:15:36 compute-0 systemd[228785]: Stopped target Basic System.
Nov 29 07:15:36 compute-0 systemd[228785]: Stopped target Paths.
Nov 29 07:15:36 compute-0 systemd[228785]: Stopped target Sockets.
Nov 29 07:15:36 compute-0 systemd[228785]: Stopped target Timers.
Nov 29 07:15:36 compute-0 systemd[228785]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 29 07:15:36 compute-0 systemd[228785]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 29 07:15:36 compute-0 systemd[228785]: Closed D-Bus User Message Bus Socket.
Nov 29 07:15:36 compute-0 systemd[228785]: Stopped Create User's Volatile Files and Directories.
Nov 29 07:15:36 compute-0 systemd[228785]: Removed slice User Application Slice.
Nov 29 07:15:36 compute-0 systemd[228785]: Reached target Shutdown.
Nov 29 07:15:36 compute-0 systemd[228785]: Finished Exit the Session.
Nov 29 07:15:36 compute-0 systemd[228785]: Reached target Exit the Session.
Nov 29 07:15:36 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Nov 29 07:15:36 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Nov 29 07:15:36 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 29 07:15:36 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 29 07:15:36 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 29 07:15:36 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 29 07:15:36 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Nov 29 07:15:37 compute-0 nova_compute[187185]: 2025-11-29 07:15:37.539 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:40 compute-0 nova_compute[187185]: 2025-11-29 07:15:40.219 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:42 compute-0 nova_compute[187185]: 2025-11-29 07:15:42.544 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:42 compute-0 podman[228899]: 2025-11-29 07:15:42.83704086 +0000 UTC m=+0.099168084 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:15:45 compute-0 nova_compute[187185]: 2025-11-29 07:15:45.223 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:47 compute-0 nova_compute[187185]: 2025-11-29 07:15:47.547 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:48 compute-0 podman[228925]: 2025-11-29 07:15:48.814993356 +0000 UTC m=+0.073352232 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 07:15:49 compute-0 nova_compute[187185]: 2025-11-29 07:15:49.644 187189 DEBUG nova.objects.instance [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lazy-loading 'flavor' on Instance uuid 084a0f8e-19b7-4b24-a503-c015b26addbc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:15:49 compute-0 nova_compute[187185]: 2025-11-29 07:15:49.843 187189 DEBUG nova.objects.instance [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lazy-loading 'info_cache' on Instance uuid 084a0f8e-19b7-4b24-a503-c015b26addbc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:15:49 compute-0 nova_compute[187185]: 2025-11-29 07:15:49.962 187189 DEBUG oslo_concurrency.lockutils [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "refresh_cache-084a0f8e-19b7-4b24-a503-c015b26addbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:15:49 compute-0 nova_compute[187185]: 2025-11-29 07:15:49.962 187189 DEBUG oslo_concurrency.lockutils [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquired lock "refresh_cache-084a0f8e-19b7-4b24-a503-c015b26addbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:15:49 compute-0 nova_compute[187185]: 2025-11-29 07:15:49.963 187189 DEBUG nova.network.neutron [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:15:50 compute-0 nova_compute[187185]: 2025-11-29 07:15:50.225 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:50 compute-0 nova_compute[187185]: 2025-11-29 07:15:50.515 187189 DEBUG oslo_concurrency.lockutils [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "704c4aa7-3239-4ecc-bfdc-c72642678363" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:15:50 compute-0 nova_compute[187185]: 2025-11-29 07:15:50.516 187189 DEBUG oslo_concurrency.lockutils [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "704c4aa7-3239-4ecc-bfdc-c72642678363" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:15:50 compute-0 nova_compute[187185]: 2025-11-29 07:15:50.585 187189 DEBUG nova.compute.manager [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 07:15:50 compute-0 podman[228950]: 2025-11-29 07:15:50.801093262 +0000 UTC m=+0.065874009 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:15:50 compute-0 podman[228949]: 2025-11-29 07:15:50.806937328 +0000 UTC m=+0.079107685 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 29 07:15:51 compute-0 nova_compute[187185]: 2025-11-29 07:15:51.049 187189 DEBUG oslo_concurrency.lockutils [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:15:51 compute-0 nova_compute[187185]: 2025-11-29 07:15:51.049 187189 DEBUG oslo_concurrency.lockutils [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:15:51 compute-0 nova_compute[187185]: 2025-11-29 07:15:51.060 187189 DEBUG nova.virt.hardware [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 07:15:51 compute-0 nova_compute[187185]: 2025-11-29 07:15:51.061 187189 INFO nova.compute.claims [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Claim successful on node compute-0.ctlplane.example.com
Nov 29 07:15:51 compute-0 nova_compute[187185]: 2025-11-29 07:15:51.469 187189 DEBUG nova.compute.provider_tree [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:15:51 compute-0 nova_compute[187185]: 2025-11-29 07:15:51.893 187189 DEBUG nova.scheduler.client.report [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:15:52 compute-0 nova_compute[187185]: 2025-11-29 07:15:52.028 187189 DEBUG oslo_concurrency.lockutils [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.978s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:15:52 compute-0 nova_compute[187185]: 2025-11-29 07:15:52.029 187189 DEBUG nova.compute.manager [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 07:15:52 compute-0 nova_compute[187185]: 2025-11-29 07:15:52.202 187189 DEBUG nova.compute.manager [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 07:15:52 compute-0 nova_compute[187185]: 2025-11-29 07:15:52.203 187189 DEBUG nova.network.neutron [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 07:15:52 compute-0 nova_compute[187185]: 2025-11-29 07:15:52.259 187189 INFO nova.virt.libvirt.driver [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 07:15:52 compute-0 nova_compute[187185]: 2025-11-29 07:15:52.310 187189 DEBUG nova.compute.manager [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 07:15:52 compute-0 nova_compute[187185]: 2025-11-29 07:15:52.703 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:52 compute-0 nova_compute[187185]: 2025-11-29 07:15:52.706 187189 DEBUG nova.compute.manager [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 07:15:52 compute-0 nova_compute[187185]: 2025-11-29 07:15:52.707 187189 DEBUG nova.virt.libvirt.driver [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 07:15:52 compute-0 nova_compute[187185]: 2025-11-29 07:15:52.707 187189 INFO nova.virt.libvirt.driver [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Creating image(s)
Nov 29 07:15:52 compute-0 nova_compute[187185]: 2025-11-29 07:15:52.707 187189 DEBUG oslo_concurrency.lockutils [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "/var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:15:52 compute-0 nova_compute[187185]: 2025-11-29 07:15:52.708 187189 DEBUG oslo_concurrency.lockutils [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "/var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:15:52 compute-0 nova_compute[187185]: 2025-11-29 07:15:52.708 187189 DEBUG oslo_concurrency.lockutils [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "/var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:15:52 compute-0 nova_compute[187185]: 2025-11-29 07:15:52.719 187189 DEBUG oslo_concurrency.processutils [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:15:52 compute-0 nova_compute[187185]: 2025-11-29 07:15:52.776 187189 DEBUG oslo_concurrency.processutils [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:15:52 compute-0 nova_compute[187185]: 2025-11-29 07:15:52.778 187189 DEBUG oslo_concurrency.lockutils [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:15:52 compute-0 nova_compute[187185]: 2025-11-29 07:15:52.780 187189 DEBUG oslo_concurrency.lockutils [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:15:52 compute-0 nova_compute[187185]: 2025-11-29 07:15:52.801 187189 DEBUG oslo_concurrency.processutils [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:15:52 compute-0 nova_compute[187185]: 2025-11-29 07:15:52.894 187189 DEBUG oslo_concurrency.processutils [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:15:52 compute-0 nova_compute[187185]: 2025-11-29 07:15:52.895 187189 DEBUG oslo_concurrency.processutils [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:15:52 compute-0 nova_compute[187185]: 2025-11-29 07:15:52.938 187189 DEBUG nova.policy [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 07:15:52 compute-0 nova_compute[187185]: 2025-11-29 07:15:52.941 187189 DEBUG oslo_concurrency.processutils [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363/disk 1073741824" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:15:52 compute-0 nova_compute[187185]: 2025-11-29 07:15:52.942 187189 DEBUG oslo_concurrency.lockutils [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:15:52 compute-0 nova_compute[187185]: 2025-11-29 07:15:52.942 187189 DEBUG oslo_concurrency.processutils [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:15:53 compute-0 nova_compute[187185]: 2025-11-29 07:15:53.016 187189 DEBUG oslo_concurrency.processutils [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:15:53 compute-0 nova_compute[187185]: 2025-11-29 07:15:53.017 187189 DEBUG nova.virt.disk.api [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Checking if we can resize image /var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:15:53 compute-0 nova_compute[187185]: 2025-11-29 07:15:53.018 187189 DEBUG oslo_concurrency.processutils [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:15:53 compute-0 nova_compute[187185]: 2025-11-29 07:15:53.078 187189 DEBUG oslo_concurrency.processutils [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:15:53 compute-0 nova_compute[187185]: 2025-11-29 07:15:53.080 187189 DEBUG nova.virt.disk.api [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Cannot resize image /var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:15:53 compute-0 nova_compute[187185]: 2025-11-29 07:15:53.081 187189 DEBUG nova.objects.instance [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'migration_context' on Instance uuid 704c4aa7-3239-4ecc-bfdc-c72642678363 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:15:53 compute-0 nova_compute[187185]: 2025-11-29 07:15:53.139 187189 DEBUG nova.virt.libvirt.driver [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 07:15:53 compute-0 nova_compute[187185]: 2025-11-29 07:15:53.140 187189 DEBUG nova.virt.libvirt.driver [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Ensure instance console log exists: /var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 07:15:53 compute-0 nova_compute[187185]: 2025-11-29 07:15:53.141 187189 DEBUG oslo_concurrency.lockutils [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:15:53 compute-0 nova_compute[187185]: 2025-11-29 07:15:53.142 187189 DEBUG oslo_concurrency.lockutils [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:15:53 compute-0 nova_compute[187185]: 2025-11-29 07:15:53.143 187189 DEBUG oslo_concurrency.lockutils [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:15:53 compute-0 nova_compute[187185]: 2025-11-29 07:15:53.970 187189 DEBUG nova.network.neutron [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Updating instance_info_cache with network_info: [{"id": "60943dec-d420-449f-abc3-233df163ebed", "address": "fa:16:3e:04:06:9e", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60943dec-d4", "ovs_interfaceid": "60943dec-d420-449f-abc3-233df163ebed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.087 187189 DEBUG oslo_concurrency.lockutils [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Releasing lock "refresh_cache-084a0f8e-19b7-4b24-a503-c015b26addbc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.250 187189 INFO nova.virt.libvirt.driver [-] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Instance destroyed successfully.
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.251 187189 DEBUG nova.objects.instance [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lazy-loading 'numa_topology' on Instance uuid 084a0f8e-19b7-4b24-a503-c015b26addbc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.346 187189 DEBUG nova.objects.instance [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lazy-loading 'resources' on Instance uuid 084a0f8e-19b7-4b24-a503-c015b26addbc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.483 187189 DEBUG nova.virt.libvirt.vif [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:11:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-734207825',display_name='tempest-ServerActionsTestOtherB-server-734207825',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-734207825',id=96,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJEYU7KNgNpvYMhWLcgNKb4JeWm+l16ttLKZ2We4gLp8YMbZFLJD2i4RZSQXciBvCLn4uXa9U2Zxsdygka87gys3pZZ16d1VbC25mryAsCgbm8dp7GriXd9FfJytMY+M+Q==',key_name='tempest-keypair-1534024740',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:15:32Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='32e51e3a9a8f4a1ca6e022735ebf5f7b',ramdisk_id='',reservation_id='r-4lueok56',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1538648925',owner_user_name='tempest-ServerActionsTestOtherB-1538648925-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:15:43Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ee2d4931cb504b13b92a2f52c95c05ce',uuid=084a0f8e-19b7-4b24-a503-c015b26addbc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "60943dec-d420-449f-abc3-233df163ebed", "address": "fa:16:3e:04:06:9e", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60943dec-d4", "ovs_interfaceid": "60943dec-d420-449f-abc3-233df163ebed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.484 187189 DEBUG nova.network.os_vif_util [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Converting VIF {"id": "60943dec-d420-449f-abc3-233df163ebed", "address": "fa:16:3e:04:06:9e", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60943dec-d4", "ovs_interfaceid": "60943dec-d420-449f-abc3-233df163ebed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.486 187189 DEBUG nova.network.os_vif_util [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:06:9e,bridge_name='br-int',has_traffic_filtering=True,id=60943dec-d420-449f-abc3-233df163ebed,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60943dec-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.487 187189 DEBUG os_vif [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:06:9e,bridge_name='br-int',has_traffic_filtering=True,id=60943dec-d420-449f-abc3-233df163ebed,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60943dec-d4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.491 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.491 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap60943dec-d4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.494 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.497 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.502 187189 INFO os_vif [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:06:9e,bridge_name='br-int',has_traffic_filtering=True,id=60943dec-d420-449f-abc3-233df163ebed,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60943dec-d4')
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.511 187189 DEBUG nova.virt.libvirt.driver [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Start _get_guest_xml network_info=[{"id": "60943dec-d420-449f-abc3-233df163ebed", "address": "fa:16:3e:04:06:9e", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60943dec-d4", "ovs_interfaceid": "60943dec-d420-449f-abc3-233df163ebed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.516 187189 WARNING nova.virt.libvirt.driver [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.531 187189 DEBUG nova.virt.libvirt.host [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.531 187189 DEBUG nova.virt.libvirt.host [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.535 187189 DEBUG nova.virt.libvirt.host [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.535 187189 DEBUG nova.virt.libvirt.host [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.536 187189 DEBUG nova.virt.libvirt.driver [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.536 187189 DEBUG nova.virt.hardware [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e29df891-dca5-4a1c-9258-dc512a46956f',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.537 187189 DEBUG nova.virt.hardware [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.537 187189 DEBUG nova.virt.hardware [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.537 187189 DEBUG nova.virt.hardware [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.537 187189 DEBUG nova.virt.hardware [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.538 187189 DEBUG nova.virt.hardware [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.538 187189 DEBUG nova.virt.hardware [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.538 187189 DEBUG nova.virt.hardware [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.538 187189 DEBUG nova.virt.hardware [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.538 187189 DEBUG nova.virt.hardware [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.539 187189 DEBUG nova.virt.hardware [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.539 187189 DEBUG nova.objects.instance [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lazy-loading 'vcpu_model' on Instance uuid 084a0f8e-19b7-4b24-a503-c015b26addbc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.587 187189 DEBUG nova.virt.libvirt.vif [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:11:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-734207825',display_name='tempest-ServerActionsTestOtherB-server-734207825',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-734207825',id=96,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJEYU7KNgNpvYMhWLcgNKb4JeWm+l16ttLKZ2We4gLp8YMbZFLJD2i4RZSQXciBvCLn4uXa9U2Zxsdygka87gys3pZZ16d1VbC25mryAsCgbm8dp7GriXd9FfJytMY+M+Q==',key_name='tempest-keypair-1534024740',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:15:32Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='32e51e3a9a8f4a1ca6e022735ebf5f7b',ramdisk_id='',reservation_id='r-4lueok56',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1538648925',owner_user_name='tempest-ServerActionsTestOtherB-1538648925-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:15:43Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ee2d4931cb504b13b92a2f52c95c05ce',uuid=084a0f8e-19b7-4b24-a503-c015b26addbc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "60943dec-d420-449f-abc3-233df163ebed", "address": "fa:16:3e:04:06:9e", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60943dec-d4", "ovs_interfaceid": "60943dec-d420-449f-abc3-233df163ebed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.587 187189 DEBUG nova.network.os_vif_util [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Converting VIF {"id": "60943dec-d420-449f-abc3-233df163ebed", "address": "fa:16:3e:04:06:9e", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60943dec-d4", "ovs_interfaceid": "60943dec-d420-449f-abc3-233df163ebed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.588 187189 DEBUG nova.network.os_vif_util [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:06:9e,bridge_name='br-int',has_traffic_filtering=True,id=60943dec-d420-449f-abc3-233df163ebed,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60943dec-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.590 187189 DEBUG nova.objects.instance [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lazy-loading 'pci_devices' on Instance uuid 084a0f8e-19b7-4b24-a503-c015b26addbc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.643 187189 DEBUG nova.virt.libvirt.driver [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:15:54 compute-0 nova_compute[187185]:   <uuid>084a0f8e-19b7-4b24-a503-c015b26addbc</uuid>
Nov 29 07:15:54 compute-0 nova_compute[187185]:   <name>instance-00000060</name>
Nov 29 07:15:54 compute-0 nova_compute[187185]:   <memory>196608</memory>
Nov 29 07:15:54 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:15:54 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:15:54 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:       <nova:name>tempest-ServerActionsTestOtherB-server-734207825</nova:name>
Nov 29 07:15:54 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:15:54</nova:creationTime>
Nov 29 07:15:54 compute-0 nova_compute[187185]:       <nova:flavor name="m1.micro">
Nov 29 07:15:54 compute-0 nova_compute[187185]:         <nova:memory>192</nova:memory>
Nov 29 07:15:54 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:15:54 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:15:54 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:15:54 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:15:54 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:15:54 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:15:54 compute-0 nova_compute[187185]:         <nova:user uuid="ee2d4931cb504b13b92a2f52c95c05ce">tempest-ServerActionsTestOtherB-1538648925-project-member</nova:user>
Nov 29 07:15:54 compute-0 nova_compute[187185]:         <nova:project uuid="32e51e3a9a8f4a1ca6e022735ebf5f7b">tempest-ServerActionsTestOtherB-1538648925</nova:project>
Nov 29 07:15:54 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:15:54 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 07:15:54 compute-0 nova_compute[187185]:         <nova:port uuid="60943dec-d420-449f-abc3-233df163ebed">
Nov 29 07:15:54 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:15:54 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:15:54 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:15:54 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <system>
Nov 29 07:15:54 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:15:54 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:15:54 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:15:54 compute-0 nova_compute[187185]:       <entry name="serial">084a0f8e-19b7-4b24-a503-c015b26addbc</entry>
Nov 29 07:15:54 compute-0 nova_compute[187185]:       <entry name="uuid">084a0f8e-19b7-4b24-a503-c015b26addbc</entry>
Nov 29 07:15:54 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     </system>
Nov 29 07:15:54 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:15:54 compute-0 nova_compute[187185]:   <os>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:   </os>
Nov 29 07:15:54 compute-0 nova_compute[187185]:   <features>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:   </features>
Nov 29 07:15:54 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:15:54 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:15:54 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:15:54 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:15:54 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk.config"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:15:54 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:04:06:9e"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:       <target dev="tap60943dec-d4"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:15:54 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/console.log" append="off"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <video>
Nov 29 07:15:54 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     </video>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <input type="keyboard" bus="usb"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:15:54 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:15:54 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:15:54 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:15:54 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:15:54 compute-0 nova_compute[187185]: </domain>
Nov 29 07:15:54 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.646 187189 DEBUG oslo_concurrency.processutils [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.720 187189 DEBUG oslo_concurrency.processutils [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.721 187189 DEBUG oslo_concurrency.processutils [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.779 187189 DEBUG oslo_concurrency.processutils [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.781 187189 DEBUG nova.objects.instance [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lazy-loading 'trusted_certs' on Instance uuid 084a0f8e-19b7-4b24-a503-c015b26addbc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.847 187189 DEBUG oslo_concurrency.processutils [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.918 187189 DEBUG oslo_concurrency.processutils [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.919 187189 DEBUG nova.virt.disk.api [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Checking if we can resize image /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.919 187189 DEBUG oslo_concurrency.processutils [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.990 187189 DEBUG oslo_concurrency.processutils [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.991 187189 DEBUG nova.virt.disk.api [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Cannot resize image /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:15:54 compute-0 nova_compute[187185]: 2025-11-29 07:15:54.991 187189 DEBUG nova.objects.instance [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lazy-loading 'migration_context' on Instance uuid 084a0f8e-19b7-4b24-a503-c015b26addbc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:15:55 compute-0 nova_compute[187185]: 2025-11-29 07:15:55.024 187189 DEBUG nova.virt.libvirt.vif [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:11:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-734207825',display_name='tempest-ServerActionsTestOtherB-server-734207825',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-734207825',id=96,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJEYU7KNgNpvYMhWLcgNKb4JeWm+l16ttLKZ2We4gLp8YMbZFLJD2i4RZSQXciBvCLn4uXa9U2Zxsdygka87gys3pZZ16d1VbC25mryAsCgbm8dp7GriXd9FfJytMY+M+Q==',key_name='tempest-keypair-1534024740',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:15:32Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='32e51e3a9a8f4a1ca6e022735ebf5f7b',ramdisk_id='',reservation_id='r-4lueok56',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1538648925',owner_user_name='tempest-ServerActionsTestOtherB-1538648925-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:15:43Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ee2d4931cb504b13b92a2f52c95c05ce',uuid=084a0f8e-19b7-4b24-a503-c015b26addbc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "60943dec-d420-449f-abc3-233df163ebed", "address": "fa:16:3e:04:06:9e", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60943dec-d4", "ovs_interfaceid": "60943dec-d420-449f-abc3-233df163ebed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:15:55 compute-0 nova_compute[187185]: 2025-11-29 07:15:55.025 187189 DEBUG nova.network.os_vif_util [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Converting VIF {"id": "60943dec-d420-449f-abc3-233df163ebed", "address": "fa:16:3e:04:06:9e", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60943dec-d4", "ovs_interfaceid": "60943dec-d420-449f-abc3-233df163ebed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:15:55 compute-0 nova_compute[187185]: 2025-11-29 07:15:55.025 187189 DEBUG nova.network.os_vif_util [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:06:9e,bridge_name='br-int',has_traffic_filtering=True,id=60943dec-d420-449f-abc3-233df163ebed,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60943dec-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:15:55 compute-0 nova_compute[187185]: 2025-11-29 07:15:55.026 187189 DEBUG os_vif [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:06:9e,bridge_name='br-int',has_traffic_filtering=True,id=60943dec-d420-449f-abc3-233df163ebed,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60943dec-d4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:15:55 compute-0 nova_compute[187185]: 2025-11-29 07:15:55.026 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:55 compute-0 nova_compute[187185]: 2025-11-29 07:15:55.027 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:15:55 compute-0 nova_compute[187185]: 2025-11-29 07:15:55.027 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:15:55 compute-0 nova_compute[187185]: 2025-11-29 07:15:55.029 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:55 compute-0 nova_compute[187185]: 2025-11-29 07:15:55.030 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap60943dec-d4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:15:55 compute-0 nova_compute[187185]: 2025-11-29 07:15:55.030 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap60943dec-d4, col_values=(('external_ids', {'iface-id': '60943dec-d420-449f-abc3-233df163ebed', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:04:06:9e', 'vm-uuid': '084a0f8e-19b7-4b24-a503-c015b26addbc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:15:55 compute-0 nova_compute[187185]: 2025-11-29 07:15:55.032 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:55 compute-0 NetworkManager[55227]: <info>  [1764400555.0329] manager: (tap60943dec-d4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/138)
Nov 29 07:15:55 compute-0 nova_compute[187185]: 2025-11-29 07:15:55.034 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:15:55 compute-0 nova_compute[187185]: 2025-11-29 07:15:55.038 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:55 compute-0 nova_compute[187185]: 2025-11-29 07:15:55.038 187189 INFO os_vif [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:06:9e,bridge_name='br-int',has_traffic_filtering=True,id=60943dec-d420-449f-abc3-233df163ebed,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60943dec-d4')
Nov 29 07:15:55 compute-0 kernel: tap60943dec-d4: entered promiscuous mode
Nov 29 07:15:55 compute-0 nova_compute[187185]: 2025-11-29 07:15:55.140 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:55 compute-0 ovn_controller[95281]: 2025-11-29T07:15:55Z|00260|binding|INFO|Claiming lport 60943dec-d420-449f-abc3-233df163ebed for this chassis.
Nov 29 07:15:55 compute-0 ovn_controller[95281]: 2025-11-29T07:15:55Z|00261|binding|INFO|60943dec-d420-449f-abc3-233df163ebed: Claiming fa:16:3e:04:06:9e 10.100.0.9
Nov 29 07:15:55 compute-0 NetworkManager[55227]: <info>  [1764400555.1408] manager: (tap60943dec-d4): new Tun device (/org/freedesktop/NetworkManager/Devices/139)
Nov 29 07:15:55 compute-0 nova_compute[187185]: 2025-11-29 07:15:55.146 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:55 compute-0 nova_compute[187185]: 2025-11-29 07:15:55.153 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:55 compute-0 nova_compute[187185]: 2025-11-29 07:15:55.159 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:55 compute-0 nova_compute[187185]: 2025-11-29 07:15:55.167 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:55 compute-0 NetworkManager[55227]: <info>  [1764400555.1685] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/140)
Nov 29 07:15:55 compute-0 NetworkManager[55227]: <info>  [1764400555.1691] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/141)
Nov 29 07:15:55 compute-0 systemd-udevd[229034]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:15:55 compute-0 NetworkManager[55227]: <info>  [1764400555.1893] device (tap60943dec-d4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:15:55 compute-0 NetworkManager[55227]: <info>  [1764400555.1900] device (tap60943dec-d4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:15:55 compute-0 systemd-machined[153486]: New machine qemu-36-instance-00000060.
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:55.224 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:06:9e 10.100.0.9'], port_security=['fa:16:3e:04:06:9e 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '084a0f8e-19b7-4b24-a503-c015b26addbc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'neutron:revision_number': '6', 'neutron:security_group_ids': '490b426d-026a-4a21-8c41-f013fe0c1458', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.195'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=04b58113-8105-402c-a103-4692d3989228, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=60943dec-d420-449f-abc3-233df163ebed) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:55.226 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 60943dec-d420-449f-abc3-233df163ebed in datapath df7cfc35-3f76-45b2-b70c-e4525d38f410 bound to our chassis
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:55.229 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network df7cfc35-3f76-45b2-b70c-e4525d38f410
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:55.241 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e467354f-3b15-4412-8a0b-27bef005f6b3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:55.242 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdf7cfc35-31 in ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:55.244 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdf7cfc35-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:55.245 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[ad73ff36-3318-47e2-9370-3862d0a7067e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:55.245 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[4e07a514-95ff-41aa-bd60-252cef7f8908]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:55.258 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[4dd2f792-0987-4ff6-9c8a-14a5d0036259]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:15:55 compute-0 systemd[1]: Started Virtual Machine qemu-36-instance-00000060.
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:55.291 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[00437af2-de73-4943-b9a5-e02c1760d2c4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:55.334 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[84d54c43-d917-4859-91c0-135b64ecd3cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:55.360 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[c7e72ba5-e4a7-44e0-a868-af6868fae93d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:15:55 compute-0 NetworkManager[55227]: <info>  [1764400555.3623] manager: (tapdf7cfc35-30): new Veth device (/org/freedesktop/NetworkManager/Devices/142)
Nov 29 07:15:55 compute-0 nova_compute[187185]: 2025-11-29 07:15:55.380 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:55.405 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[9576e16b-c930-49e8-b2fd-251603a3fccf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:55.408 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[5f1d102a-5a52-4dd7-8a81-8ee0350e253e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:15:55 compute-0 nova_compute[187185]: 2025-11-29 07:15:55.423 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:55 compute-0 ovn_controller[95281]: 2025-11-29T07:15:55Z|00262|binding|INFO|Setting lport 60943dec-d420-449f-abc3-233df163ebed ovn-installed in OVS
Nov 29 07:15:55 compute-0 ovn_controller[95281]: 2025-11-29T07:15:55Z|00263|binding|INFO|Setting lport 60943dec-d420-449f-abc3-233df163ebed up in Southbound
Nov 29 07:15:55 compute-0 nova_compute[187185]: 2025-11-29 07:15:55.432 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:55 compute-0 NetworkManager[55227]: <info>  [1764400555.4338] device (tapdf7cfc35-30): carrier: link connected
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:55.442 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[c729a2a9-fa3d-4625-ad8d-289160b949c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:55.466 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e116d094-7a5c-4987-ac10-cf89fb746439]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdf7cfc35-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:ae:b6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 85], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600309, 'reachable_time': 41806, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229068, 'error': None, 'target': 'ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:55.484 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[8901729a-04d3-4b8b-b0cc-d4d64892211f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe84:aeb6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 600309, 'tstamp': 600309}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229069, 'error': None, 'target': 'ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:55.506 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[2d4db120-da49-4237-9994-0a1ce8afccd9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdf7cfc35-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:ae:b6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 85], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600309, 'reachable_time': 41806, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229070, 'error': None, 'target': 'ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:55.536 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[74569944-2f27-4bfe-b334-b36c478dfb9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:55.603 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[ae9a91e1-571a-4d88-8821-08238588178d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:55.605 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdf7cfc35-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:55.606 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:55.606 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdf7cfc35-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:15:55 compute-0 nova_compute[187185]: 2025-11-29 07:15:55.610 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:55 compute-0 NetworkManager[55227]: <info>  [1764400555.6115] manager: (tapdf7cfc35-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/143)
Nov 29 07:15:55 compute-0 kernel: tapdf7cfc35-30: entered promiscuous mode
Nov 29 07:15:55 compute-0 nova_compute[187185]: 2025-11-29 07:15:55.615 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:55.616 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdf7cfc35-30, col_values=(('external_ids', {'iface-id': 'cab31803-36dd-4107-bb9e-3d36862142c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:15:55 compute-0 nova_compute[187185]: 2025-11-29 07:15:55.617 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:55 compute-0 ovn_controller[95281]: 2025-11-29T07:15:55Z|00264|binding|INFO|Releasing lport cab31803-36dd-4107-bb9e-3d36862142c0 from this chassis (sb_readonly=0)
Nov 29 07:15:55 compute-0 nova_compute[187185]: 2025-11-29 07:15:55.618 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:55.618 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/df7cfc35-3f76-45b2-b70c-e4525d38f410.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/df7cfc35-3f76-45b2-b70c-e4525d38f410.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:55.619 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6020457b-b3be-41f5-abab-79ab0a300c7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:55.620 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-df7cfc35-3f76-45b2-b70c-e4525d38f410
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/df7cfc35-3f76-45b2-b70c-e4525d38f410.pid.haproxy
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID df7cfc35-3f76-45b2-b70c-e4525d38f410
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:55.622 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'env', 'PROCESS_TAG=haproxy-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/df7cfc35-3f76-45b2-b70c-e4525d38f410.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:15:55 compute-0 nova_compute[187185]: 2025-11-29 07:15:55.628 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:55 compute-0 nova_compute[187185]: 2025-11-29 07:15:55.738 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400555.7370622, 084a0f8e-19b7-4b24-a503-c015b26addbc => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:15:55 compute-0 nova_compute[187185]: 2025-11-29 07:15:55.738 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] VM Resumed (Lifecycle Event)
Nov 29 07:15:55 compute-0 nova_compute[187185]: 2025-11-29 07:15:55.742 187189 DEBUG nova.compute.manager [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:15:55 compute-0 nova_compute[187185]: 2025-11-29 07:15:55.746 187189 INFO nova.virt.libvirt.driver [-] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Instance rebooted successfully.
Nov 29 07:15:55 compute-0 nova_compute[187185]: 2025-11-29 07:15:55.747 187189 DEBUG nova.compute.manager [None req-546b6200-846a-4e80-ac8b-035bec4071fb ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:15:55 compute-0 nova_compute[187185]: 2025-11-29 07:15:55.831 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:15:55 compute-0 nova_compute[187185]: 2025-11-29 07:15:55.835 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:15:55 compute-0 nova_compute[187185]: 2025-11-29 07:15:55.873 187189 DEBUG nova.compute.manager [req-34f4edea-caef-4be0-8d4d-e571f2d8747c req-e1114391-74cb-402c-a781-db68d27b99c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Received event network-vif-plugged-60943dec-d420-449f-abc3-233df163ebed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:15:55 compute-0 nova_compute[187185]: 2025-11-29 07:15:55.873 187189 DEBUG oslo_concurrency.lockutils [req-34f4edea-caef-4be0-8d4d-e571f2d8747c req-e1114391-74cb-402c-a781-db68d27b99c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "084a0f8e-19b7-4b24-a503-c015b26addbc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:15:55 compute-0 nova_compute[187185]: 2025-11-29 07:15:55.874 187189 DEBUG oslo_concurrency.lockutils [req-34f4edea-caef-4be0-8d4d-e571f2d8747c req-e1114391-74cb-402c-a781-db68d27b99c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "084a0f8e-19b7-4b24-a503-c015b26addbc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:15:55 compute-0 nova_compute[187185]: 2025-11-29 07:15:55.874 187189 DEBUG oslo_concurrency.lockutils [req-34f4edea-caef-4be0-8d4d-e571f2d8747c req-e1114391-74cb-402c-a781-db68d27b99c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "084a0f8e-19b7-4b24-a503-c015b26addbc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:15:55 compute-0 nova_compute[187185]: 2025-11-29 07:15:55.874 187189 DEBUG nova.compute.manager [req-34f4edea-caef-4be0-8d4d-e571f2d8747c req-e1114391-74cb-402c-a781-db68d27b99c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] No waiting events found dispatching network-vif-plugged-60943dec-d420-449f-abc3-233df163ebed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:15:55 compute-0 nova_compute[187185]: 2025-11-29 07:15:55.874 187189 WARNING nova.compute.manager [req-34f4edea-caef-4be0-8d4d-e571f2d8747c req-e1114391-74cb-402c-a781-db68d27b99c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Received unexpected event network-vif-plugged-60943dec-d420-449f-abc3-233df163ebed for instance with vm_state stopped and task_state powering-on.
Nov 29 07:15:55 compute-0 nova_compute[187185]: 2025-11-29 07:15:55.917 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] During sync_power_state the instance has a pending task (powering-on). Skip.
Nov 29 07:15:55 compute-0 nova_compute[187185]: 2025-11-29 07:15:55.918 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400555.737529, 084a0f8e-19b7-4b24-a503-c015b26addbc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:15:55 compute-0 nova_compute[187185]: 2025-11-29 07:15:55.918 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] VM Started (Lifecycle Event)
Nov 29 07:15:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:55.935 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:15:55 compute-0 nova_compute[187185]: 2025-11-29 07:15:55.937 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:56 compute-0 podman[229109]: 2025-11-29 07:15:56.019748331 +0000 UTC m=+0.061379583 container create dad1afe97ac104d2aaea3bd1a74ef01de62d50ada7eaf30be58211baf30678b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 29 07:15:56 compute-0 nova_compute[187185]: 2025-11-29 07:15:56.038 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:15:56 compute-0 nova_compute[187185]: 2025-11-29 07:15:56.042 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:15:56 compute-0 systemd[1]: Started libpod-conmon-dad1afe97ac104d2aaea3bd1a74ef01de62d50ada7eaf30be58211baf30678b3.scope.
Nov 29 07:15:56 compute-0 podman[229109]: 2025-11-29 07:15:55.986770045 +0000 UTC m=+0.028401327 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:15:56 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:15:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9ffaf390974ad077192156f1e2b6ced2f28367966d31e0ba645c7e1190db86e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:15:56 compute-0 podman[229109]: 2025-11-29 07:15:56.107779417 +0000 UTC m=+0.149410699 container init dad1afe97ac104d2aaea3bd1a74ef01de62d50ada7eaf30be58211baf30678b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 29 07:15:56 compute-0 podman[229109]: 2025-11-29 07:15:56.114635682 +0000 UTC m=+0.156266914 container start dad1afe97ac104d2aaea3bd1a74ef01de62d50ada7eaf30be58211baf30678b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:15:56 compute-0 neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410[229125]: [NOTICE]   (229129) : New worker (229131) forked
Nov 29 07:15:56 compute-0 neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410[229125]: [NOTICE]   (229129) : Loading success.
Nov 29 07:15:56 compute-0 nova_compute[187185]: 2025-11-29 07:15:56.172 187189 DEBUG nova.network.neutron [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Successfully created port: 29881f52-aa42-4a78-a87b-06e906811ff2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 07:15:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:56.181 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:15:57 compute-0 nova_compute[187185]: 2025-11-29 07:15:57.586 187189 DEBUG oslo_concurrency.lockutils [None req-a6197097-3d4e-4b0e-8b11-da5ea9e3d9cc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "084a0f8e-19b7-4b24-a503-c015b26addbc" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:15:57 compute-0 nova_compute[187185]: 2025-11-29 07:15:57.586 187189 DEBUG oslo_concurrency.lockutils [None req-a6197097-3d4e-4b0e-8b11-da5ea9e3d9cc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "084a0f8e-19b7-4b24-a503-c015b26addbc" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:15:57 compute-0 nova_compute[187185]: 2025-11-29 07:15:57.586 187189 DEBUG oslo_concurrency.lockutils [None req-a6197097-3d4e-4b0e-8b11-da5ea9e3d9cc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "084a0f8e-19b7-4b24-a503-c015b26addbc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:15:57 compute-0 nova_compute[187185]: 2025-11-29 07:15:57.587 187189 DEBUG oslo_concurrency.lockutils [None req-a6197097-3d4e-4b0e-8b11-da5ea9e3d9cc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "084a0f8e-19b7-4b24-a503-c015b26addbc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:15:57 compute-0 nova_compute[187185]: 2025-11-29 07:15:57.587 187189 DEBUG oslo_concurrency.lockutils [None req-a6197097-3d4e-4b0e-8b11-da5ea9e3d9cc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "084a0f8e-19b7-4b24-a503-c015b26addbc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:15:57 compute-0 nova_compute[187185]: 2025-11-29 07:15:57.760 187189 INFO nova.compute.manager [None req-a6197097-3d4e-4b0e-8b11-da5ea9e3d9cc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Terminating instance
Nov 29 07:15:57 compute-0 nova_compute[187185]: 2025-11-29 07:15:57.897 187189 DEBUG nova.compute.manager [None req-a6197097-3d4e-4b0e-8b11-da5ea9e3d9cc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 07:15:57 compute-0 kernel: tap60943dec-d4 (unregistering): left promiscuous mode
Nov 29 07:15:57 compute-0 NetworkManager[55227]: <info>  [1764400557.9196] device (tap60943dec-d4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:15:57 compute-0 nova_compute[187185]: 2025-11-29 07:15:57.936 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:57 compute-0 ovn_controller[95281]: 2025-11-29T07:15:57Z|00265|binding|INFO|Releasing lport 60943dec-d420-449f-abc3-233df163ebed from this chassis (sb_readonly=0)
Nov 29 07:15:57 compute-0 ovn_controller[95281]: 2025-11-29T07:15:57Z|00266|binding|INFO|Setting lport 60943dec-d420-449f-abc3-233df163ebed down in Southbound
Nov 29 07:15:57 compute-0 ovn_controller[95281]: 2025-11-29T07:15:57Z|00267|binding|INFO|Removing iface tap60943dec-d4 ovn-installed in OVS
Nov 29 07:15:57 compute-0 nova_compute[187185]: 2025-11-29 07:15:57.939 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:57 compute-0 nova_compute[187185]: 2025-11-29 07:15:57.951 187189 DEBUG nova.network.neutron [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Successfully updated port: 29881f52-aa42-4a78-a87b-06e906811ff2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 07:15:57 compute-0 nova_compute[187185]: 2025-11-29 07:15:57.953 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:57.969 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:06:9e 10.100.0.9'], port_security=['fa:16:3e:04:06:9e 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '084a0f8e-19b7-4b24-a503-c015b26addbc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'neutron:revision_number': '8', 'neutron:security_group_ids': '490b426d-026a-4a21-8c41-f013fe0c1458', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.195', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=04b58113-8105-402c-a103-4692d3989228, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=60943dec-d420-449f-abc3-233df163ebed) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:15:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:57.972 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 60943dec-d420-449f-abc3-233df163ebed in datapath df7cfc35-3f76-45b2-b70c-e4525d38f410 unbound from our chassis
Nov 29 07:15:57 compute-0 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d00000060.scope: Deactivated successfully.
Nov 29 07:15:57 compute-0 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d00000060.scope: Consumed 2.587s CPU time.
Nov 29 07:15:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:57.976 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network df7cfc35-3f76-45b2-b70c-e4525d38f410, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:15:57 compute-0 systemd-machined[153486]: Machine qemu-36-instance-00000060 terminated.
Nov 29 07:15:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:57.979 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[fb0a85de-2fc2-4445-8ca2-dc7301e0168f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:15:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:57.980 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410 namespace which is not needed anymore
Nov 29 07:15:58 compute-0 nova_compute[187185]: 2025-11-29 07:15:58.101 187189 DEBUG oslo_concurrency.lockutils [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "refresh_cache-704c4aa7-3239-4ecc-bfdc-c72642678363" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:15:58 compute-0 nova_compute[187185]: 2025-11-29 07:15:58.102 187189 DEBUG oslo_concurrency.lockutils [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquired lock "refresh_cache-704c4aa7-3239-4ecc-bfdc-c72642678363" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:15:58 compute-0 nova_compute[187185]: 2025-11-29 07:15:58.102 187189 DEBUG nova.network.neutron [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:15:58 compute-0 NetworkManager[55227]: <info>  [1764400558.1191] manager: (tap60943dec-d4): new Tun device (/org/freedesktop/NetworkManager/Devices/144)
Nov 29 07:15:58 compute-0 nova_compute[187185]: 2025-11-29 07:15:58.121 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:58 compute-0 neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410[229125]: [NOTICE]   (229129) : haproxy version is 2.8.14-c23fe91
Nov 29 07:15:58 compute-0 neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410[229125]: [NOTICE]   (229129) : path to executable is /usr/sbin/haproxy
Nov 29 07:15:58 compute-0 neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410[229125]: [WARNING]  (229129) : Exiting Master process...
Nov 29 07:15:58 compute-0 neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410[229125]: [ALERT]    (229129) : Current worker (229131) exited with code 143 (Terminated)
Nov 29 07:15:58 compute-0 neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410[229125]: [WARNING]  (229129) : All workers exited. Exiting... (0)
Nov 29 07:15:58 compute-0 nova_compute[187185]: 2025-11-29 07:15:58.128 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:58 compute-0 systemd[1]: libpod-dad1afe97ac104d2aaea3bd1a74ef01de62d50ada7eaf30be58211baf30678b3.scope: Deactivated successfully.
Nov 29 07:15:58 compute-0 podman[229162]: 2025-11-29 07:15:58.137225954 +0000 UTC m=+0.047440257 container died dad1afe97ac104d2aaea3bd1a74ef01de62d50ada7eaf30be58211baf30678b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:15:58 compute-0 nova_compute[187185]: 2025-11-29 07:15:58.169 187189 INFO nova.virt.libvirt.driver [-] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Instance destroyed successfully.
Nov 29 07:15:58 compute-0 nova_compute[187185]: 2025-11-29 07:15:58.170 187189 DEBUG nova.objects.instance [None req-a6197097-3d4e-4b0e-8b11-da5ea9e3d9cc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lazy-loading 'resources' on Instance uuid 084a0f8e-19b7-4b24-a503-c015b26addbc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:15:58 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-dad1afe97ac104d2aaea3bd1a74ef01de62d50ada7eaf30be58211baf30678b3-userdata-shm.mount: Deactivated successfully.
Nov 29 07:15:58 compute-0 systemd[1]: var-lib-containers-storage-overlay-f9ffaf390974ad077192156f1e2b6ced2f28367966d31e0ba645c7e1190db86e-merged.mount: Deactivated successfully.
Nov 29 07:15:58 compute-0 podman[229162]: 2025-11-29 07:15:58.206078587 +0000 UTC m=+0.116292900 container cleanup dad1afe97ac104d2aaea3bd1a74ef01de62d50ada7eaf30be58211baf30678b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 07:15:58 compute-0 systemd[1]: libpod-conmon-dad1afe97ac104d2aaea3bd1a74ef01de62d50ada7eaf30be58211baf30678b3.scope: Deactivated successfully.
Nov 29 07:15:58 compute-0 podman[229209]: 2025-11-29 07:15:58.270659999 +0000 UTC m=+0.043740902 container remove dad1afe97ac104d2aaea3bd1a74ef01de62d50ada7eaf30be58211baf30678b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:15:58 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:58.276 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[a3843cdd-cff0-47de-bc3d-c45c83a938b1]: (4, ('Sat Nov 29 07:15:58 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410 (dad1afe97ac104d2aaea3bd1a74ef01de62d50ada7eaf30be58211baf30678b3)\ndad1afe97ac104d2aaea3bd1a74ef01de62d50ada7eaf30be58211baf30678b3\nSat Nov 29 07:15:58 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410 (dad1afe97ac104d2aaea3bd1a74ef01de62d50ada7eaf30be58211baf30678b3)\ndad1afe97ac104d2aaea3bd1a74ef01de62d50ada7eaf30be58211baf30678b3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:15:58 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:58.278 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[4c62af4a-4cfa-4143-aecd-b1445558d568]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:15:58 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:58.280 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdf7cfc35-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:15:58 compute-0 nova_compute[187185]: 2025-11-29 07:15:58.282 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:58 compute-0 kernel: tapdf7cfc35-30: left promiscuous mode
Nov 29 07:15:58 compute-0 nova_compute[187185]: 2025-11-29 07:15:58.297 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:58 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:58.301 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[b8abc358-2a59-4966-822a-1a09ff8be54c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:15:58 compute-0 nova_compute[187185]: 2025-11-29 07:15:58.308 187189 DEBUG nova.compute.manager [req-67cbadd7-a3eb-4b2b-89b2-e688dd0c9e9f req-e09172c3-8bf7-42ea-bc37-5ec9912852b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Received event network-vif-plugged-60943dec-d420-449f-abc3-233df163ebed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:15:58 compute-0 nova_compute[187185]: 2025-11-29 07:15:58.308 187189 DEBUG oslo_concurrency.lockutils [req-67cbadd7-a3eb-4b2b-89b2-e688dd0c9e9f req-e09172c3-8bf7-42ea-bc37-5ec9912852b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "084a0f8e-19b7-4b24-a503-c015b26addbc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:15:58 compute-0 nova_compute[187185]: 2025-11-29 07:15:58.309 187189 DEBUG oslo_concurrency.lockutils [req-67cbadd7-a3eb-4b2b-89b2-e688dd0c9e9f req-e09172c3-8bf7-42ea-bc37-5ec9912852b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "084a0f8e-19b7-4b24-a503-c015b26addbc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:15:58 compute-0 nova_compute[187185]: 2025-11-29 07:15:58.309 187189 DEBUG oslo_concurrency.lockutils [req-67cbadd7-a3eb-4b2b-89b2-e688dd0c9e9f req-e09172c3-8bf7-42ea-bc37-5ec9912852b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "084a0f8e-19b7-4b24-a503-c015b26addbc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:15:58 compute-0 nova_compute[187185]: 2025-11-29 07:15:58.309 187189 DEBUG nova.compute.manager [req-67cbadd7-a3eb-4b2b-89b2-e688dd0c9e9f req-e09172c3-8bf7-42ea-bc37-5ec9912852b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] No waiting events found dispatching network-vif-plugged-60943dec-d420-449f-abc3-233df163ebed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:15:58 compute-0 nova_compute[187185]: 2025-11-29 07:15:58.309 187189 WARNING nova.compute.manager [req-67cbadd7-a3eb-4b2b-89b2-e688dd0c9e9f req-e09172c3-8bf7-42ea-bc37-5ec9912852b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Received unexpected event network-vif-plugged-60943dec-d420-449f-abc3-233df163ebed for instance with vm_state active and task_state deleting.
Nov 29 07:15:58 compute-0 nova_compute[187185]: 2025-11-29 07:15:58.319 187189 DEBUG nova.virt.libvirt.vif [None req-a6197097-3d4e-4b0e-8b11-da5ea9e3d9cc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:11:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-734207825',display_name='tempest-ServerActionsTestOtherB-server-734207825',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-734207825',id=96,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJEYU7KNgNpvYMhWLcgNKb4JeWm+l16ttLKZ2We4gLp8YMbZFLJD2i4RZSQXciBvCLn4uXa9U2Zxsdygka87gys3pZZ16d1VbC25mryAsCgbm8dp7GriXd9FfJytMY+M+Q==',key_name='tempest-keypair-1534024740',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:15:32Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='32e51e3a9a8f4a1ca6e022735ebf5f7b',ramdisk_id='',reservation_id='r-4lueok56',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1538648925',owner_user_name='tempest-ServerActionsTestOtherB-1538648925-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:15:55Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ee2d4931cb504b13b92a2f52c95c05ce',uuid=084a0f8e-19b7-4b24-a503-c015b26addbc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "60943dec-d420-449f-abc3-233df163ebed", "address": "fa:16:3e:04:06:9e", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60943dec-d4", "ovs_interfaceid": "60943dec-d420-449f-abc3-233df163ebed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:15:58 compute-0 nova_compute[187185]: 2025-11-29 07:15:58.320 187189 DEBUG nova.network.os_vif_util [None req-a6197097-3d4e-4b0e-8b11-da5ea9e3d9cc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Converting VIF {"id": "60943dec-d420-449f-abc3-233df163ebed", "address": "fa:16:3e:04:06:9e", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60943dec-d4", "ovs_interfaceid": "60943dec-d420-449f-abc3-233df163ebed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:15:58 compute-0 nova_compute[187185]: 2025-11-29 07:15:58.321 187189 DEBUG nova.network.os_vif_util [None req-a6197097-3d4e-4b0e-8b11-da5ea9e3d9cc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:06:9e,bridge_name='br-int',has_traffic_filtering=True,id=60943dec-d420-449f-abc3-233df163ebed,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60943dec-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:15:58 compute-0 nova_compute[187185]: 2025-11-29 07:15:58.321 187189 DEBUG os_vif [None req-a6197097-3d4e-4b0e-8b11-da5ea9e3d9cc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:06:9e,bridge_name='br-int',has_traffic_filtering=True,id=60943dec-d420-449f-abc3-233df163ebed,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60943dec-d4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:15:58 compute-0 nova_compute[187185]: 2025-11-29 07:15:58.323 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:58 compute-0 nova_compute[187185]: 2025-11-29 07:15:58.323 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap60943dec-d4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:15:58 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:58.324 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[694c3ad8-6dbd-4d94-ad28-a87c4ac42117]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:15:58 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:58.325 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[16b59bbf-a2a1-460d-8378-15bb11430963]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:15:58 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:58.351 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[371e06c3-3083-47c4-b69a-c8efd8272956]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600298, 'reachable_time': 28587, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229228, 'error': None, 'target': 'ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:15:58 compute-0 systemd[1]: run-netns-ovnmeta\x2ddf7cfc35\x2d3f76\x2d45b2\x2db70c\x2de4525d38f410.mount: Deactivated successfully.
Nov 29 07:15:58 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:58.369 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:15:58 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:15:58.369 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[082b2a29-1cc8-4b51-84b2-a6a71c9328b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:15:58 compute-0 nova_compute[187185]: 2025-11-29 07:15:58.369 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:15:58 compute-0 nova_compute[187185]: 2025-11-29 07:15:58.372 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:15:58 compute-0 nova_compute[187185]: 2025-11-29 07:15:58.375 187189 INFO os_vif [None req-a6197097-3d4e-4b0e-8b11-da5ea9e3d9cc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:06:9e,bridge_name='br-int',has_traffic_filtering=True,id=60943dec-d420-449f-abc3-233df163ebed,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60943dec-d4')
Nov 29 07:15:58 compute-0 nova_compute[187185]: 2025-11-29 07:15:58.375 187189 INFO nova.virt.libvirt.driver [None req-a6197097-3d4e-4b0e-8b11-da5ea9e3d9cc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Deleting instance files /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc_del
Nov 29 07:15:58 compute-0 nova_compute[187185]: 2025-11-29 07:15:58.384 187189 INFO nova.virt.libvirt.driver [None req-a6197097-3d4e-4b0e-8b11-da5ea9e3d9cc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Deletion of /var/lib/nova/instances/084a0f8e-19b7-4b24-a503-c015b26addbc_del complete
Nov 29 07:15:58 compute-0 nova_compute[187185]: 2025-11-29 07:15:58.447 187189 DEBUG nova.network.neutron [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:15:58 compute-0 nova_compute[187185]: 2025-11-29 07:15:58.713 187189 INFO nova.compute.manager [None req-a6197097-3d4e-4b0e-8b11-da5ea9e3d9cc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Took 0.82 seconds to destroy the instance on the hypervisor.
Nov 29 07:15:58 compute-0 nova_compute[187185]: 2025-11-29 07:15:58.714 187189 DEBUG oslo.service.loopingcall [None req-a6197097-3d4e-4b0e-8b11-da5ea9e3d9cc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 07:15:58 compute-0 nova_compute[187185]: 2025-11-29 07:15:58.714 187189 DEBUG nova.compute.manager [-] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 07:15:58 compute-0 nova_compute[187185]: 2025-11-29 07:15:58.714 187189 DEBUG nova.network.neutron [-] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 07:15:59 compute-0 nova_compute[187185]: 2025-11-29 07:15:59.773 187189 DEBUG nova.network.neutron [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Updating instance_info_cache with network_info: [{"id": "29881f52-aa42-4a78-a87b-06e906811ff2", "address": "fa:16:3e:44:9d:fe", "network": {"id": "f8ce59f3-d777-4899-bf5b-171901097199", "bridge": "br-int", "label": "tempest-network-smoke--1303008278", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29881f52-aa", "ovs_interfaceid": "29881f52-aa42-4a78-a87b-06e906811ff2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:15:59 compute-0 nova_compute[187185]: 2025-11-29 07:15:59.916 187189 DEBUG oslo_concurrency.lockutils [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Releasing lock "refresh_cache-704c4aa7-3239-4ecc-bfdc-c72642678363" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:15:59 compute-0 nova_compute[187185]: 2025-11-29 07:15:59.917 187189 DEBUG nova.compute.manager [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Instance network_info: |[{"id": "29881f52-aa42-4a78-a87b-06e906811ff2", "address": "fa:16:3e:44:9d:fe", "network": {"id": "f8ce59f3-d777-4899-bf5b-171901097199", "bridge": "br-int", "label": "tempest-network-smoke--1303008278", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29881f52-aa", "ovs_interfaceid": "29881f52-aa42-4a78-a87b-06e906811ff2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 07:15:59 compute-0 nova_compute[187185]: 2025-11-29 07:15:59.921 187189 DEBUG nova.virt.libvirt.driver [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Start _get_guest_xml network_info=[{"id": "29881f52-aa42-4a78-a87b-06e906811ff2", "address": "fa:16:3e:44:9d:fe", "network": {"id": "f8ce59f3-d777-4899-bf5b-171901097199", "bridge": "br-int", "label": "tempest-network-smoke--1303008278", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29881f52-aa", "ovs_interfaceid": "29881f52-aa42-4a78-a87b-06e906811ff2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:15:59 compute-0 nova_compute[187185]: 2025-11-29 07:15:59.926 187189 WARNING nova.virt.libvirt.driver [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:15:59 compute-0 nova_compute[187185]: 2025-11-29 07:15:59.934 187189 DEBUG nova.virt.libvirt.host [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:15:59 compute-0 nova_compute[187185]: 2025-11-29 07:15:59.935 187189 DEBUG nova.virt.libvirt.host [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:15:59 compute-0 nova_compute[187185]: 2025-11-29 07:15:59.937 187189 DEBUG nova.network.neutron [-] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:15:59 compute-0 nova_compute[187185]: 2025-11-29 07:15:59.942 187189 DEBUG nova.virt.libvirt.host [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:15:59 compute-0 nova_compute[187185]: 2025-11-29 07:15:59.942 187189 DEBUG nova.virt.libvirt.host [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:15:59 compute-0 nova_compute[187185]: 2025-11-29 07:15:59.944 187189 DEBUG nova.virt.libvirt.driver [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:15:59 compute-0 nova_compute[187185]: 2025-11-29 07:15:59.944 187189 DEBUG nova.virt.hardware [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:15:59 compute-0 nova_compute[187185]: 2025-11-29 07:15:59.945 187189 DEBUG nova.virt.hardware [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:15:59 compute-0 nova_compute[187185]: 2025-11-29 07:15:59.945 187189 DEBUG nova.virt.hardware [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:15:59 compute-0 nova_compute[187185]: 2025-11-29 07:15:59.945 187189 DEBUG nova.virt.hardware [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:15:59 compute-0 nova_compute[187185]: 2025-11-29 07:15:59.946 187189 DEBUG nova.virt.hardware [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:15:59 compute-0 nova_compute[187185]: 2025-11-29 07:15:59.946 187189 DEBUG nova.virt.hardware [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:15:59 compute-0 nova_compute[187185]: 2025-11-29 07:15:59.946 187189 DEBUG nova.virt.hardware [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:15:59 compute-0 nova_compute[187185]: 2025-11-29 07:15:59.947 187189 DEBUG nova.virt.hardware [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:15:59 compute-0 nova_compute[187185]: 2025-11-29 07:15:59.947 187189 DEBUG nova.virt.hardware [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:15:59 compute-0 nova_compute[187185]: 2025-11-29 07:15:59.947 187189 DEBUG nova.virt.hardware [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:15:59 compute-0 nova_compute[187185]: 2025-11-29 07:15:59.947 187189 DEBUG nova.virt.hardware [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:15:59 compute-0 nova_compute[187185]: 2025-11-29 07:15:59.951 187189 DEBUG nova.virt.libvirt.vif [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:15:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1683200929',display_name='tempest-TestNetworkAdvancedServerOps-server-1683200929',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1683200929',id=102,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJpBdlQTrwm1jTLhsIWvBArp7FJbNV/DmsxpavKG+fSfuJYeopMQPEBt+TLRsvwJz1i5TrgMP98T/zGS4tH40QimuRAQV56ulySp5fCUrK73vauhbVZ7xUa0c5MPUYrHZg==',key_name='tempest-TestNetworkAdvancedServerOps-1740209866',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-v4kh9s3g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:15:52Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=704c4aa7-3239-4ecc-bfdc-c72642678363,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "29881f52-aa42-4a78-a87b-06e906811ff2", "address": "fa:16:3e:44:9d:fe", "network": {"id": "f8ce59f3-d777-4899-bf5b-171901097199", "bridge": "br-int", "label": "tempest-network-smoke--1303008278", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29881f52-aa", "ovs_interfaceid": "29881f52-aa42-4a78-a87b-06e906811ff2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:15:59 compute-0 nova_compute[187185]: 2025-11-29 07:15:59.952 187189 DEBUG nova.network.os_vif_util [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converting VIF {"id": "29881f52-aa42-4a78-a87b-06e906811ff2", "address": "fa:16:3e:44:9d:fe", "network": {"id": "f8ce59f3-d777-4899-bf5b-171901097199", "bridge": "br-int", "label": "tempest-network-smoke--1303008278", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29881f52-aa", "ovs_interfaceid": "29881f52-aa42-4a78-a87b-06e906811ff2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:15:59 compute-0 nova_compute[187185]: 2025-11-29 07:15:59.952 187189 DEBUG nova.network.os_vif_util [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:44:9d:fe,bridge_name='br-int',has_traffic_filtering=True,id=29881f52-aa42-4a78-a87b-06e906811ff2,network=Network(f8ce59f3-d777-4899-bf5b-171901097199),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29881f52-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:15:59 compute-0 nova_compute[187185]: 2025-11-29 07:15:59.953 187189 DEBUG nova.objects.instance [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'pci_devices' on Instance uuid 704c4aa7-3239-4ecc-bfdc-c72642678363 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:15:59 compute-0 nova_compute[187185]: 2025-11-29 07:15:59.958 187189 DEBUG nova.compute.manager [req-c476a685-8697-4b88-81cc-5e310ec03ebe req-14428faa-332d-49ac-a6d4-290db5195a82 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Received event network-vif-deleted-60943dec-d420-449f-abc3-233df163ebed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:15:59 compute-0 nova_compute[187185]: 2025-11-29 07:15:59.958 187189 INFO nova.compute.manager [req-c476a685-8697-4b88-81cc-5e310ec03ebe req-14428faa-332d-49ac-a6d4-290db5195a82 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Neutron deleted interface 60943dec-d420-449f-abc3-233df163ebed; detaching it from the instance and deleting it from the info cache
Nov 29 07:15:59 compute-0 nova_compute[187185]: 2025-11-29 07:15:59.959 187189 DEBUG nova.network.neutron [req-c476a685-8697-4b88-81cc-5e310ec03ebe req-14428faa-332d-49ac-a6d4-290db5195a82 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:16:00 compute-0 nova_compute[187185]: 2025-11-29 07:16:00.046 187189 DEBUG nova.virt.libvirt.driver [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:16:00 compute-0 nova_compute[187185]:   <uuid>704c4aa7-3239-4ecc-bfdc-c72642678363</uuid>
Nov 29 07:16:00 compute-0 nova_compute[187185]:   <name>instance-00000066</name>
Nov 29 07:16:00 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 07:16:00 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:16:00 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:16:00 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-1683200929</nova:name>
Nov 29 07:16:00 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:15:59</nova:creationTime>
Nov 29 07:16:00 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 07:16:00 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 07:16:00 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:16:00 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:16:00 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:16:00 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:16:00 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:16:00 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:16:00 compute-0 nova_compute[187185]:         <nova:user uuid="bfd2024670594b10941cec8a59d2573f">tempest-TestNetworkAdvancedServerOps-1380683659-project-member</nova:user>
Nov 29 07:16:00 compute-0 nova_compute[187185]:         <nova:project uuid="c231e63624d44fc19e0989abfb1afb22">tempest-TestNetworkAdvancedServerOps-1380683659</nova:project>
Nov 29 07:16:00 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:16:00 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 07:16:00 compute-0 nova_compute[187185]:         <nova:port uuid="29881f52-aa42-4a78-a87b-06e906811ff2">
Nov 29 07:16:00 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:16:00 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:16:00 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:16:00 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <system>
Nov 29 07:16:00 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:16:00 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:16:00 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:16:00 compute-0 nova_compute[187185]:       <entry name="serial">704c4aa7-3239-4ecc-bfdc-c72642678363</entry>
Nov 29 07:16:00 compute-0 nova_compute[187185]:       <entry name="uuid">704c4aa7-3239-4ecc-bfdc-c72642678363</entry>
Nov 29 07:16:00 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     </system>
Nov 29 07:16:00 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:16:00 compute-0 nova_compute[187185]:   <os>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:   </os>
Nov 29 07:16:00 compute-0 nova_compute[187185]:   <features>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:   </features>
Nov 29 07:16:00 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:16:00 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:16:00 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:16:00 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363/disk"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:16:00 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363/disk.config"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:16:00 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:44:9d:fe"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:       <target dev="tap29881f52-aa"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:16:00 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363/console.log" append="off"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <video>
Nov 29 07:16:00 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     </video>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:16:00 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:16:00 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:16:00 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:16:00 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:16:00 compute-0 nova_compute[187185]: </domain>
Nov 29 07:16:00 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:16:00 compute-0 nova_compute[187185]: 2025-11-29 07:16:00.047 187189 DEBUG nova.compute.manager [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Preparing to wait for external event network-vif-plugged-29881f52-aa42-4a78-a87b-06e906811ff2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 07:16:00 compute-0 nova_compute[187185]: 2025-11-29 07:16:00.047 187189 DEBUG oslo_concurrency.lockutils [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "704c4aa7-3239-4ecc-bfdc-c72642678363-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:16:00 compute-0 nova_compute[187185]: 2025-11-29 07:16:00.048 187189 DEBUG oslo_concurrency.lockutils [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "704c4aa7-3239-4ecc-bfdc-c72642678363-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:16:00 compute-0 nova_compute[187185]: 2025-11-29 07:16:00.049 187189 DEBUG oslo_concurrency.lockutils [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "704c4aa7-3239-4ecc-bfdc-c72642678363-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:16:00 compute-0 nova_compute[187185]: 2025-11-29 07:16:00.050 187189 DEBUG nova.virt.libvirt.vif [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:15:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1683200929',display_name='tempest-TestNetworkAdvancedServerOps-server-1683200929',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1683200929',id=102,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJpBdlQTrwm1jTLhsIWvBArp7FJbNV/DmsxpavKG+fSfuJYeopMQPEBt+TLRsvwJz1i5TrgMP98T/zGS4tH40QimuRAQV56ulySp5fCUrK73vauhbVZ7xUa0c5MPUYrHZg==',key_name='tempest-TestNetworkAdvancedServerOps-1740209866',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-v4kh9s3g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:15:52Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=704c4aa7-3239-4ecc-bfdc-c72642678363,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "29881f52-aa42-4a78-a87b-06e906811ff2", "address": "fa:16:3e:44:9d:fe", "network": {"id": "f8ce59f3-d777-4899-bf5b-171901097199", "bridge": "br-int", "label": "tempest-network-smoke--1303008278", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29881f52-aa", "ovs_interfaceid": "29881f52-aa42-4a78-a87b-06e906811ff2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:16:00 compute-0 nova_compute[187185]: 2025-11-29 07:16:00.050 187189 DEBUG nova.network.os_vif_util [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converting VIF {"id": "29881f52-aa42-4a78-a87b-06e906811ff2", "address": "fa:16:3e:44:9d:fe", "network": {"id": "f8ce59f3-d777-4899-bf5b-171901097199", "bridge": "br-int", "label": "tempest-network-smoke--1303008278", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29881f52-aa", "ovs_interfaceid": "29881f52-aa42-4a78-a87b-06e906811ff2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:16:00 compute-0 nova_compute[187185]: 2025-11-29 07:16:00.051 187189 DEBUG nova.network.os_vif_util [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:44:9d:fe,bridge_name='br-int',has_traffic_filtering=True,id=29881f52-aa42-4a78-a87b-06e906811ff2,network=Network(f8ce59f3-d777-4899-bf5b-171901097199),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29881f52-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:16:00 compute-0 nova_compute[187185]: 2025-11-29 07:16:00.052 187189 DEBUG os_vif [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:9d:fe,bridge_name='br-int',has_traffic_filtering=True,id=29881f52-aa42-4a78-a87b-06e906811ff2,network=Network(f8ce59f3-d777-4899-bf5b-171901097199),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29881f52-aa') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:16:00 compute-0 nova_compute[187185]: 2025-11-29 07:16:00.052 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:16:00 compute-0 nova_compute[187185]: 2025-11-29 07:16:00.053 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:16:00 compute-0 nova_compute[187185]: 2025-11-29 07:16:00.053 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:16:00 compute-0 nova_compute[187185]: 2025-11-29 07:16:00.056 187189 INFO nova.compute.manager [-] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Took 1.34 seconds to deallocate network for instance.
Nov 29 07:16:00 compute-0 nova_compute[187185]: 2025-11-29 07:16:00.059 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:16:00 compute-0 nova_compute[187185]: 2025-11-29 07:16:00.059 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap29881f52-aa, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:16:00 compute-0 nova_compute[187185]: 2025-11-29 07:16:00.060 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap29881f52-aa, col_values=(('external_ids', {'iface-id': '29881f52-aa42-4a78-a87b-06e906811ff2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:44:9d:fe', 'vm-uuid': '704c4aa7-3239-4ecc-bfdc-c72642678363'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:16:00 compute-0 NetworkManager[55227]: <info>  [1764400560.0638] manager: (tap29881f52-aa): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/145)
Nov 29 07:16:00 compute-0 nova_compute[187185]: 2025-11-29 07:16:00.064 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:16:00 compute-0 nova_compute[187185]: 2025-11-29 07:16:00.068 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:16:00 compute-0 nova_compute[187185]: 2025-11-29 07:16:00.069 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:16:00 compute-0 nova_compute[187185]: 2025-11-29 07:16:00.070 187189 INFO os_vif [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:9d:fe,bridge_name='br-int',has_traffic_filtering=True,id=29881f52-aa42-4a78-a87b-06e906811ff2,network=Network(f8ce59f3-d777-4899-bf5b-171901097199),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29881f52-aa')
Nov 29 07:16:00 compute-0 nova_compute[187185]: 2025-11-29 07:16:00.122 187189 DEBUG nova.compute.manager [req-c476a685-8697-4b88-81cc-5e310ec03ebe req-14428faa-332d-49ac-a6d4-290db5195a82 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Detach interface failed, port_id=60943dec-d420-449f-abc3-233df163ebed, reason: Instance 084a0f8e-19b7-4b24-a503-c015b26addbc could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Nov 29 07:16:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:00.183 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:16:00 compute-0 nova_compute[187185]: 2025-11-29 07:16:00.391 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:16:00 compute-0 nova_compute[187185]: 2025-11-29 07:16:00.424 187189 DEBUG nova.compute.manager [req-c69bd32f-11f8-4817-a65c-128a91004984 req-948694af-cc72-4208-8984-a6f2775df16e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Received event network-changed-29881f52-aa42-4a78-a87b-06e906811ff2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:16:00 compute-0 nova_compute[187185]: 2025-11-29 07:16:00.425 187189 DEBUG nova.compute.manager [req-c69bd32f-11f8-4817-a65c-128a91004984 req-948694af-cc72-4208-8984-a6f2775df16e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Refreshing instance network info cache due to event network-changed-29881f52-aa42-4a78-a87b-06e906811ff2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:16:00 compute-0 nova_compute[187185]: 2025-11-29 07:16:00.425 187189 DEBUG oslo_concurrency.lockutils [req-c69bd32f-11f8-4817-a65c-128a91004984 req-948694af-cc72-4208-8984-a6f2775df16e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-704c4aa7-3239-4ecc-bfdc-c72642678363" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:16:00 compute-0 nova_compute[187185]: 2025-11-29 07:16:00.426 187189 DEBUG oslo_concurrency.lockutils [req-c69bd32f-11f8-4817-a65c-128a91004984 req-948694af-cc72-4208-8984-a6f2775df16e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-704c4aa7-3239-4ecc-bfdc-c72642678363" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:16:00 compute-0 nova_compute[187185]: 2025-11-29 07:16:00.426 187189 DEBUG nova.network.neutron [req-c69bd32f-11f8-4817-a65c-128a91004984 req-948694af-cc72-4208-8984-a6f2775df16e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Refreshing network info cache for port 29881f52-aa42-4a78-a87b-06e906811ff2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:16:00 compute-0 nova_compute[187185]: 2025-11-29 07:16:00.451 187189 DEBUG nova.virt.libvirt.driver [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:16:00 compute-0 nova_compute[187185]: 2025-11-29 07:16:00.451 187189 DEBUG nova.virt.libvirt.driver [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:16:00 compute-0 nova_compute[187185]: 2025-11-29 07:16:00.452 187189 DEBUG nova.virt.libvirt.driver [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] No VIF found with MAC fa:16:3e:44:9d:fe, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:16:00 compute-0 nova_compute[187185]: 2025-11-29 07:16:00.453 187189 INFO nova.virt.libvirt.driver [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Using config drive
Nov 29 07:16:00 compute-0 nova_compute[187185]: 2025-11-29 07:16:00.747 187189 DEBUG nova.compute.manager [req-e5049d3e-06fb-4011-856d-095fd75567a5 req-53582a80-0b56-409c-bdf9-436831be87eb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Received event network-vif-unplugged-60943dec-d420-449f-abc3-233df163ebed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:16:00 compute-0 nova_compute[187185]: 2025-11-29 07:16:00.748 187189 DEBUG oslo_concurrency.lockutils [req-e5049d3e-06fb-4011-856d-095fd75567a5 req-53582a80-0b56-409c-bdf9-436831be87eb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "084a0f8e-19b7-4b24-a503-c015b26addbc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:16:00 compute-0 nova_compute[187185]: 2025-11-29 07:16:00.748 187189 DEBUG oslo_concurrency.lockutils [req-e5049d3e-06fb-4011-856d-095fd75567a5 req-53582a80-0b56-409c-bdf9-436831be87eb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "084a0f8e-19b7-4b24-a503-c015b26addbc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:16:00 compute-0 nova_compute[187185]: 2025-11-29 07:16:00.748 187189 DEBUG oslo_concurrency.lockutils [req-e5049d3e-06fb-4011-856d-095fd75567a5 req-53582a80-0b56-409c-bdf9-436831be87eb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "084a0f8e-19b7-4b24-a503-c015b26addbc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:16:00 compute-0 nova_compute[187185]: 2025-11-29 07:16:00.749 187189 DEBUG nova.compute.manager [req-e5049d3e-06fb-4011-856d-095fd75567a5 req-53582a80-0b56-409c-bdf9-436831be87eb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] No waiting events found dispatching network-vif-unplugged-60943dec-d420-449f-abc3-233df163ebed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:16:00 compute-0 nova_compute[187185]: 2025-11-29 07:16:00.749 187189 DEBUG nova.compute.manager [req-e5049d3e-06fb-4011-856d-095fd75567a5 req-53582a80-0b56-409c-bdf9-436831be87eb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Received event network-vif-unplugged-60943dec-d420-449f-abc3-233df163ebed for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 07:16:00 compute-0 nova_compute[187185]: 2025-11-29 07:16:00.749 187189 DEBUG nova.compute.manager [req-e5049d3e-06fb-4011-856d-095fd75567a5 req-53582a80-0b56-409c-bdf9-436831be87eb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Received event network-vif-plugged-60943dec-d420-449f-abc3-233df163ebed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:16:00 compute-0 nova_compute[187185]: 2025-11-29 07:16:00.750 187189 DEBUG oslo_concurrency.lockutils [req-e5049d3e-06fb-4011-856d-095fd75567a5 req-53582a80-0b56-409c-bdf9-436831be87eb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "084a0f8e-19b7-4b24-a503-c015b26addbc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:16:00 compute-0 nova_compute[187185]: 2025-11-29 07:16:00.750 187189 DEBUG oslo_concurrency.lockutils [req-e5049d3e-06fb-4011-856d-095fd75567a5 req-53582a80-0b56-409c-bdf9-436831be87eb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "084a0f8e-19b7-4b24-a503-c015b26addbc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:16:00 compute-0 nova_compute[187185]: 2025-11-29 07:16:00.750 187189 DEBUG oslo_concurrency.lockutils [req-e5049d3e-06fb-4011-856d-095fd75567a5 req-53582a80-0b56-409c-bdf9-436831be87eb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "084a0f8e-19b7-4b24-a503-c015b26addbc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:16:00 compute-0 nova_compute[187185]: 2025-11-29 07:16:00.751 187189 DEBUG nova.compute.manager [req-e5049d3e-06fb-4011-856d-095fd75567a5 req-53582a80-0b56-409c-bdf9-436831be87eb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] No waiting events found dispatching network-vif-plugged-60943dec-d420-449f-abc3-233df163ebed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:16:00 compute-0 nova_compute[187185]: 2025-11-29 07:16:00.751 187189 WARNING nova.compute.manager [req-e5049d3e-06fb-4011-856d-095fd75567a5 req-53582a80-0b56-409c-bdf9-436831be87eb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Received unexpected event network-vif-plugged-60943dec-d420-449f-abc3-233df163ebed for instance with vm_state active and task_state deleting.
Nov 29 07:16:01 compute-0 nova_compute[187185]: 2025-11-29 07:16:01.054 187189 DEBUG oslo_concurrency.lockutils [None req-a6197097-3d4e-4b0e-8b11-da5ea9e3d9cc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:16:01 compute-0 nova_compute[187185]: 2025-11-29 07:16:01.055 187189 DEBUG oslo_concurrency.lockutils [None req-a6197097-3d4e-4b0e-8b11-da5ea9e3d9cc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:16:01 compute-0 nova_compute[187185]: 2025-11-29 07:16:01.061 187189 DEBUG oslo_concurrency.lockutils [None req-a6197097-3d4e-4b0e-8b11-da5ea9e3d9cc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:16:01 compute-0 nova_compute[187185]: 2025-11-29 07:16:01.149 187189 INFO nova.scheduler.client.report [None req-a6197097-3d4e-4b0e-8b11-da5ea9e3d9cc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Deleted allocations for instance 084a0f8e-19b7-4b24-a503-c015b26addbc
Nov 29 07:16:01 compute-0 nova_compute[187185]: 2025-11-29 07:16:01.238 187189 INFO nova.virt.libvirt.driver [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Creating config drive at /var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363/disk.config
Nov 29 07:16:01 compute-0 nova_compute[187185]: 2025-11-29 07:16:01.245 187189 DEBUG oslo_concurrency.processutils [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxz6v9x61 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:16:01 compute-0 nova_compute[187185]: 2025-11-29 07:16:01.376 187189 DEBUG oslo_concurrency.processutils [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxz6v9x61" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:16:01 compute-0 kernel: tap29881f52-aa: entered promiscuous mode
Nov 29 07:16:01 compute-0 NetworkManager[55227]: <info>  [1764400561.4422] manager: (tap29881f52-aa): new Tun device (/org/freedesktop/NetworkManager/Devices/146)
Nov 29 07:16:01 compute-0 ovn_controller[95281]: 2025-11-29T07:16:01Z|00268|binding|INFO|Claiming lport 29881f52-aa42-4a78-a87b-06e906811ff2 for this chassis.
Nov 29 07:16:01 compute-0 ovn_controller[95281]: 2025-11-29T07:16:01Z|00269|binding|INFO|29881f52-aa42-4a78-a87b-06e906811ff2: Claiming fa:16:3e:44:9d:fe 10.100.0.6
Nov 29 07:16:01 compute-0 nova_compute[187185]: 2025-11-29 07:16:01.446 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:16:01 compute-0 ovn_controller[95281]: 2025-11-29T07:16:01Z|00270|binding|INFO|Setting lport 29881f52-aa42-4a78-a87b-06e906811ff2 ovn-installed in OVS
Nov 29 07:16:01 compute-0 nova_compute[187185]: 2025-11-29 07:16:01.457 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:16:01 compute-0 nova_compute[187185]: 2025-11-29 07:16:01.458 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:16:01 compute-0 systemd-udevd[229249]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:16:01 compute-0 NetworkManager[55227]: <info>  [1764400561.4869] device (tap29881f52-aa): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:16:01 compute-0 NetworkManager[55227]: <info>  [1764400561.4880] device (tap29881f52-aa): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:16:01 compute-0 systemd-machined[153486]: New machine qemu-37-instance-00000066.
Nov 29 07:16:01 compute-0 systemd[1]: Started Virtual Machine qemu-37-instance-00000066.
Nov 29 07:16:01 compute-0 ovn_controller[95281]: 2025-11-29T07:16:01Z|00271|binding|INFO|Setting lport 29881f52-aa42-4a78-a87b-06e906811ff2 up in Southbound
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:01.541 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:44:9d:fe 10.100.0.6'], port_security=['fa:16:3e:44:9d:fe 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '704c4aa7-3239-4ecc-bfdc-c72642678363', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f8ce59f3-d777-4899-bf5b-171901097199', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c231e63624d44fc19e0989abfb1afb22', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0aded770-2a08-4693-9d94-82fba33c50bb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c8856130-4f24-493d-8324-579a0d608efb, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=29881f52-aa42-4a78-a87b-06e906811ff2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:01.542 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 29881f52-aa42-4a78-a87b-06e906811ff2 in datapath f8ce59f3-d777-4899-bf5b-171901097199 bound to our chassis
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:01.544 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f8ce59f3-d777-4899-bf5b-171901097199
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:01.557 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[3e08144e-df60-4182-9394-e8fe6442d485]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:01.558 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf8ce59f3-d1 in ovnmeta-f8ce59f3-d777-4899-bf5b-171901097199 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:01.559 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf8ce59f3-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:01.559 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[f411205d-8a47-4ec7-8079-387558003b35]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:01.560 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e08e7c1e-4371-43e9-9a52-728f14934690]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:01.570 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[0e767e7d-d57b-42e7-8b79-f3ddba8a178b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:01.592 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[a2ad7afe-ab09-4711-9dfb-e29e05e725de]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:01.622 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[8edad47c-b040-422d-9b46-4f6d4660352b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:16:01 compute-0 NetworkManager[55227]: <info>  [1764400561.6298] manager: (tapf8ce59f3-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/147)
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:01.629 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[efcde56d-7745-4291-9bfd-ef564dcd2c42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:16:01 compute-0 nova_compute[187185]: 2025-11-29 07:16:01.654 187189 DEBUG oslo_concurrency.lockutils [None req-a6197097-3d4e-4b0e-8b11-da5ea9e3d9cc ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "084a0f8e-19b7-4b24-a503-c015b26addbc" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.068s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:01.665 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[e4cebd8b-8317-481e-bc20-14cc569c62be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:01.670 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[a8b406a2-8c19-4377-8663-abc096a649e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:16:01 compute-0 NetworkManager[55227]: <info>  [1764400561.6911] device (tapf8ce59f3-d0): carrier: link connected
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:01.699 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[617d52b9-0db4-478b-8644-8e84e964b179]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:01.717 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[53451d3d-5da7-49ab-89ed-9238564df478]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf8ce59f3-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f7:b0:20'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 88], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600935, 'reachable_time': 31755, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229283, 'error': None, 'target': 'ovnmeta-f8ce59f3-d777-4899-bf5b-171901097199', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:01.737 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[ee9cd3b1-b041-48a4-96ce-50532263d882]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef7:b020'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 600935, 'tstamp': 600935}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229284, 'error': None, 'target': 'ovnmeta-f8ce59f3-d777-4899-bf5b-171901097199', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:01.754 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[2c6ee51a-8bce-42a7-944f-c90610b92b44]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf8ce59f3-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f7:b0:20'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 88], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600935, 'reachable_time': 31755, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229285, 'error': None, 'target': 'ovnmeta-f8ce59f3-d777-4899-bf5b-171901097199', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:01.787 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[ff66211b-ad1a-412f-8d41-6c2f1d2010d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:01.855 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[888da475-e017-45eb-8386-3d27916b1937]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:01.857 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf8ce59f3-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:01.857 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:01.858 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf8ce59f3-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:16:01 compute-0 kernel: tapf8ce59f3-d0: entered promiscuous mode
Nov 29 07:16:01 compute-0 NetworkManager[55227]: <info>  [1764400561.8618] manager: (tapf8ce59f3-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/148)
Nov 29 07:16:01 compute-0 nova_compute[187185]: 2025-11-29 07:16:01.861 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:01.863 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf8ce59f3-d0, col_values=(('external_ids', {'iface-id': 'c10b6573-55b8-4259-8949-c467435d65c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:16:01 compute-0 ovn_controller[95281]: 2025-11-29T07:16:01Z|00272|binding|INFO|Releasing lport c10b6573-55b8-4259-8949-c467435d65c0 from this chassis (sb_readonly=0)
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:01.866 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f8ce59f3-d777-4899-bf5b-171901097199.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f8ce59f3-d777-4899-bf5b-171901097199.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:01.876 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[3ee236ae-c502-4a8a-a503-9cdc3b9b2ec4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:16:01 compute-0 nova_compute[187185]: 2025-11-29 07:16:01.876 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:01.877 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-f8ce59f3-d777-4899-bf5b-171901097199
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/f8ce59f3-d777-4899-bf5b-171901097199.pid.haproxy
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID f8ce59f3-d777-4899-bf5b-171901097199
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:16:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:01.878 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f8ce59f3-d777-4899-bf5b-171901097199', 'env', 'PROCESS_TAG=haproxy-f8ce59f3-d777-4899-bf5b-171901097199', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f8ce59f3-d777-4899-bf5b-171901097199.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:16:02 compute-0 nova_compute[187185]: 2025-11-29 07:16:02.045 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400562.0444396, 704c4aa7-3239-4ecc-bfdc-c72642678363 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:16:02 compute-0 nova_compute[187185]: 2025-11-29 07:16:02.045 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] VM Started (Lifecycle Event)
Nov 29 07:16:02 compute-0 nova_compute[187185]: 2025-11-29 07:16:02.091 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:16:02 compute-0 nova_compute[187185]: 2025-11-29 07:16:02.102 187189 DEBUG nova.compute.manager [req-edb855a8-cf45-453d-a4db-5b25e095190d req-a2d0182f-1903-4cf7-8ad9-0618fc8de3c1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Received event network-vif-plugged-29881f52-aa42-4a78-a87b-06e906811ff2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:16:02 compute-0 nova_compute[187185]: 2025-11-29 07:16:02.103 187189 DEBUG oslo_concurrency.lockutils [req-edb855a8-cf45-453d-a4db-5b25e095190d req-a2d0182f-1903-4cf7-8ad9-0618fc8de3c1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "704c4aa7-3239-4ecc-bfdc-c72642678363-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:16:02 compute-0 nova_compute[187185]: 2025-11-29 07:16:02.103 187189 DEBUG oslo_concurrency.lockutils [req-edb855a8-cf45-453d-a4db-5b25e095190d req-a2d0182f-1903-4cf7-8ad9-0618fc8de3c1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "704c4aa7-3239-4ecc-bfdc-c72642678363-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:16:02 compute-0 nova_compute[187185]: 2025-11-29 07:16:02.104 187189 DEBUG oslo_concurrency.lockutils [req-edb855a8-cf45-453d-a4db-5b25e095190d req-a2d0182f-1903-4cf7-8ad9-0618fc8de3c1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "704c4aa7-3239-4ecc-bfdc-c72642678363-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:16:02 compute-0 nova_compute[187185]: 2025-11-29 07:16:02.104 187189 DEBUG nova.compute.manager [req-edb855a8-cf45-453d-a4db-5b25e095190d req-a2d0182f-1903-4cf7-8ad9-0618fc8de3c1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Processing event network-vif-plugged-29881f52-aa42-4a78-a87b-06e906811ff2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 07:16:02 compute-0 nova_compute[187185]: 2025-11-29 07:16:02.105 187189 DEBUG nova.compute.manager [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:16:02 compute-0 nova_compute[187185]: 2025-11-29 07:16:02.105 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400562.0446763, 704c4aa7-3239-4ecc-bfdc-c72642678363 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:16:02 compute-0 nova_compute[187185]: 2025-11-29 07:16:02.106 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] VM Paused (Lifecycle Event)
Nov 29 07:16:02 compute-0 nova_compute[187185]: 2025-11-29 07:16:02.111 187189 DEBUG nova.virt.libvirt.driver [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 07:16:02 compute-0 nova_compute[187185]: 2025-11-29 07:16:02.114 187189 INFO nova.virt.libvirt.driver [-] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Instance spawned successfully.
Nov 29 07:16:02 compute-0 nova_compute[187185]: 2025-11-29 07:16:02.114 187189 DEBUG nova.virt.libvirt.driver [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 07:16:02 compute-0 nova_compute[187185]: 2025-11-29 07:16:02.217 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:16:02 compute-0 nova_compute[187185]: 2025-11-29 07:16:02.225 187189 DEBUG nova.virt.libvirt.driver [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:16:02 compute-0 nova_compute[187185]: 2025-11-29 07:16:02.225 187189 DEBUG nova.virt.libvirt.driver [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:16:02 compute-0 nova_compute[187185]: 2025-11-29 07:16:02.226 187189 DEBUG nova.virt.libvirt.driver [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:16:02 compute-0 nova_compute[187185]: 2025-11-29 07:16:02.227 187189 DEBUG nova.virt.libvirt.driver [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:16:02 compute-0 nova_compute[187185]: 2025-11-29 07:16:02.227 187189 DEBUG nova.virt.libvirt.driver [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:16:02 compute-0 nova_compute[187185]: 2025-11-29 07:16:02.228 187189 DEBUG nova.virt.libvirt.driver [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:16:02 compute-0 nova_compute[187185]: 2025-11-29 07:16:02.236 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400562.1097085, 704c4aa7-3239-4ecc-bfdc-c72642678363 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:16:02 compute-0 nova_compute[187185]: 2025-11-29 07:16:02.237 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] VM Resumed (Lifecycle Event)
Nov 29 07:16:02 compute-0 podman[229324]: 2025-11-29 07:16:02.266615714 +0000 UTC m=+0.078599800 container create 58bf4e59539b7fcd8b376f3cf337b5e0a574a1fb0cce529ef3e974dcbc08450c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f8ce59f3-d777-4899-bf5b-171901097199, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2)
Nov 29 07:16:02 compute-0 podman[229324]: 2025-11-29 07:16:02.224443718 +0000 UTC m=+0.036427814 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:16:02 compute-0 systemd[1]: Started libpod-conmon-58bf4e59539b7fcd8b376f3cf337b5e0a574a1fb0cce529ef3e974dcbc08450c.scope.
Nov 29 07:16:02 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:16:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6471ecca0f8a443025464549e15742f540bc87fb1faf7ea59887c3e65e61b607/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:16:02 compute-0 podman[229324]: 2025-11-29 07:16:02.364017477 +0000 UTC m=+0.176001553 container init 58bf4e59539b7fcd8b376f3cf337b5e0a574a1fb0cce529ef3e974dcbc08450c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f8ce59f3-d777-4899-bf5b-171901097199, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 29 07:16:02 compute-0 podman[229324]: 2025-11-29 07:16:02.372119847 +0000 UTC m=+0.184103903 container start 58bf4e59539b7fcd8b376f3cf337b5e0a574a1fb0cce529ef3e974dcbc08450c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f8ce59f3-d777-4899-bf5b-171901097199, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 07:16:02 compute-0 neutron-haproxy-ovnmeta-f8ce59f3-d777-4899-bf5b-171901097199[229339]: [NOTICE]   (229343) : New worker (229345) forked
Nov 29 07:16:02 compute-0 neutron-haproxy-ovnmeta-f8ce59f3-d777-4899-bf5b-171901097199[229339]: [NOTICE]   (229343) : Loading success.
Nov 29 07:16:02 compute-0 nova_compute[187185]: 2025-11-29 07:16:02.453 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:16:02 compute-0 nova_compute[187185]: 2025-11-29 07:16:02.459 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:16:02 compute-0 nova_compute[187185]: 2025-11-29 07:16:02.604 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:16:02 compute-0 nova_compute[187185]: 2025-11-29 07:16:02.720 187189 INFO nova.compute.manager [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Took 10.01 seconds to spawn the instance on the hypervisor.
Nov 29 07:16:02 compute-0 nova_compute[187185]: 2025-11-29 07:16:02.720 187189 DEBUG nova.compute.manager [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:16:02 compute-0 nova_compute[187185]: 2025-11-29 07:16:02.877 187189 DEBUG nova.network.neutron [req-c69bd32f-11f8-4817-a65c-128a91004984 req-948694af-cc72-4208-8984-a6f2775df16e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Updated VIF entry in instance network info cache for port 29881f52-aa42-4a78-a87b-06e906811ff2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:16:02 compute-0 nova_compute[187185]: 2025-11-29 07:16:02.878 187189 DEBUG nova.network.neutron [req-c69bd32f-11f8-4817-a65c-128a91004984 req-948694af-cc72-4208-8984-a6f2775df16e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Updating instance_info_cache with network_info: [{"id": "29881f52-aa42-4a78-a87b-06e906811ff2", "address": "fa:16:3e:44:9d:fe", "network": {"id": "f8ce59f3-d777-4899-bf5b-171901097199", "bridge": "br-int", "label": "tempest-network-smoke--1303008278", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29881f52-aa", "ovs_interfaceid": "29881f52-aa42-4a78-a87b-06e906811ff2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:16:02 compute-0 nova_compute[187185]: 2025-11-29 07:16:02.957 187189 DEBUG oslo_concurrency.lockutils [req-c69bd32f-11f8-4817-a65c-128a91004984 req-948694af-cc72-4208-8984-a6f2775df16e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-704c4aa7-3239-4ecc-bfdc-c72642678363" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:16:03 compute-0 nova_compute[187185]: 2025-11-29 07:16:03.145 187189 INFO nova.compute.manager [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Took 12.29 seconds to build instance.
Nov 29 07:16:03 compute-0 nova_compute[187185]: 2025-11-29 07:16:03.207 187189 DEBUG oslo_concurrency.lockutils [None req-d697f017-b1ca-4507-a8c6-1eb82df976b1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "704c4aa7-3239-4ecc-bfdc-c72642678363" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.692s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:16:03 compute-0 podman[229354]: 2025-11-29 07:16:03.82436523 +0000 UTC m=+0.083689295 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:16:03 compute-0 podman[229355]: 2025-11-29 07:16:03.838553752 +0000 UTC m=+0.092256817 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.openshift.expose-services=, version=9.6, io.buildah.version=1.33.7, name=ubi9-minimal, maintainer=Red Hat, Inc., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 29 07:16:03 compute-0 podman[229356]: 2025-11-29 07:16:03.859007013 +0000 UTC m=+0.100740619 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 07:16:04 compute-0 nova_compute[187185]: 2025-11-29 07:16:04.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:16:04 compute-0 nova_compute[187185]: 2025-11-29 07:16:04.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:16:04 compute-0 nova_compute[187185]: 2025-11-29 07:16:04.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:16:04 compute-0 nova_compute[187185]: 2025-11-29 07:16:04.387 187189 DEBUG nova.compute.manager [req-e3738fe0-cc2c-47c8-9937-a17d96255e2c req-3a8a4436-5fde-4ea6-99ab-7decdd983b4d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Received event network-vif-plugged-29881f52-aa42-4a78-a87b-06e906811ff2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:16:04 compute-0 nova_compute[187185]: 2025-11-29 07:16:04.388 187189 DEBUG oslo_concurrency.lockutils [req-e3738fe0-cc2c-47c8-9937-a17d96255e2c req-3a8a4436-5fde-4ea6-99ab-7decdd983b4d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "704c4aa7-3239-4ecc-bfdc-c72642678363-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:16:04 compute-0 nova_compute[187185]: 2025-11-29 07:16:04.388 187189 DEBUG oslo_concurrency.lockutils [req-e3738fe0-cc2c-47c8-9937-a17d96255e2c req-3a8a4436-5fde-4ea6-99ab-7decdd983b4d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "704c4aa7-3239-4ecc-bfdc-c72642678363-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:16:04 compute-0 nova_compute[187185]: 2025-11-29 07:16:04.389 187189 DEBUG oslo_concurrency.lockutils [req-e3738fe0-cc2c-47c8-9937-a17d96255e2c req-3a8a4436-5fde-4ea6-99ab-7decdd983b4d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "704c4aa7-3239-4ecc-bfdc-c72642678363-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:16:04 compute-0 nova_compute[187185]: 2025-11-29 07:16:04.389 187189 DEBUG nova.compute.manager [req-e3738fe0-cc2c-47c8-9937-a17d96255e2c req-3a8a4436-5fde-4ea6-99ab-7decdd983b4d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] No waiting events found dispatching network-vif-plugged-29881f52-aa42-4a78-a87b-06e906811ff2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:16:04 compute-0 nova_compute[187185]: 2025-11-29 07:16:04.390 187189 WARNING nova.compute.manager [req-e3738fe0-cc2c-47c8-9937-a17d96255e2c req-3a8a4436-5fde-4ea6-99ab-7decdd983b4d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Received unexpected event network-vif-plugged-29881f52-aa42-4a78-a87b-06e906811ff2 for instance with vm_state active and task_state None.
Nov 29 07:16:04 compute-0 nova_compute[187185]: 2025-11-29 07:16:04.484 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "refresh_cache-704c4aa7-3239-4ecc-bfdc-c72642678363" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:16:04 compute-0 nova_compute[187185]: 2025-11-29 07:16:04.484 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquired lock "refresh_cache-704c4aa7-3239-4ecc-bfdc-c72642678363" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:16:04 compute-0 nova_compute[187185]: 2025-11-29 07:16:04.485 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 29 07:16:04 compute-0 nova_compute[187185]: 2025-11-29 07:16:04.485 187189 DEBUG nova.objects.instance [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 704c4aa7-3239-4ecc-bfdc-c72642678363 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:16:05 compute-0 nova_compute[187185]: 2025-11-29 07:16:05.064 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:16:05 compute-0 nova_compute[187185]: 2025-11-29 07:16:05.395 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:16:07 compute-0 nova_compute[187185]: 2025-11-29 07:16:07.068 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Updating instance_info_cache with network_info: [{"id": "29881f52-aa42-4a78-a87b-06e906811ff2", "address": "fa:16:3e:44:9d:fe", "network": {"id": "f8ce59f3-d777-4899-bf5b-171901097199", "bridge": "br-int", "label": "tempest-network-smoke--1303008278", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29881f52-aa", "ovs_interfaceid": "29881f52-aa42-4a78-a87b-06e906811ff2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:16:07 compute-0 nova_compute[187185]: 2025-11-29 07:16:07.171 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Releasing lock "refresh_cache-704c4aa7-3239-4ecc-bfdc-c72642678363" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:16:07 compute-0 nova_compute[187185]: 2025-11-29 07:16:07.171 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 29 07:16:07 compute-0 nova_compute[187185]: 2025-11-29 07:16:07.171 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:16:07 compute-0 nova_compute[187185]: 2025-11-29 07:16:07.172 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:16:07 compute-0 nova_compute[187185]: 2025-11-29 07:16:07.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:16:07 compute-0 nova_compute[187185]: 2025-11-29 07:16:07.649 187189 DEBUG nova.compute.manager [req-6b111dfb-ed5f-4ebe-b572-57e52e5aec0d req-f519f7ae-57f9-41f9-995f-df33763f2391 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Received event network-changed-29881f52-aa42-4a78-a87b-06e906811ff2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:16:07 compute-0 nova_compute[187185]: 2025-11-29 07:16:07.650 187189 DEBUG nova.compute.manager [req-6b111dfb-ed5f-4ebe-b572-57e52e5aec0d req-f519f7ae-57f9-41f9-995f-df33763f2391 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Refreshing instance network info cache due to event network-changed-29881f52-aa42-4a78-a87b-06e906811ff2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:16:07 compute-0 nova_compute[187185]: 2025-11-29 07:16:07.650 187189 DEBUG oslo_concurrency.lockutils [req-6b111dfb-ed5f-4ebe-b572-57e52e5aec0d req-f519f7ae-57f9-41f9-995f-df33763f2391 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-704c4aa7-3239-4ecc-bfdc-c72642678363" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:16:07 compute-0 nova_compute[187185]: 2025-11-29 07:16:07.650 187189 DEBUG oslo_concurrency.lockutils [req-6b111dfb-ed5f-4ebe-b572-57e52e5aec0d req-f519f7ae-57f9-41f9-995f-df33763f2391 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-704c4aa7-3239-4ecc-bfdc-c72642678363" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:16:07 compute-0 nova_compute[187185]: 2025-11-29 07:16:07.650 187189 DEBUG nova.network.neutron [req-6b111dfb-ed5f-4ebe-b572-57e52e5aec0d req-f519f7ae-57f9-41f9-995f-df33763f2391 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Refreshing network info cache for port 29881f52-aa42-4a78-a87b-06e906811ff2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:16:08 compute-0 nova_compute[187185]: 2025-11-29 07:16:08.310 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:16:08 compute-0 nova_compute[187185]: 2025-11-29 07:16:08.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:16:08 compute-0 nova_compute[187185]: 2025-11-29 07:16:08.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:16:08 compute-0 nova_compute[187185]: 2025-11-29 07:16:08.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:16:08 compute-0 nova_compute[187185]: 2025-11-29 07:16:08.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:16:10 compute-0 nova_compute[187185]: 2025-11-29 07:16:10.067 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:16:10 compute-0 nova_compute[187185]: 2025-11-29 07:16:10.203 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:16:10 compute-0 nova_compute[187185]: 2025-11-29 07:16:10.203 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:16:10 compute-0 nova_compute[187185]: 2025-11-29 07:16:10.204 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:16:10 compute-0 nova_compute[187185]: 2025-11-29 07:16:10.204 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:16:10 compute-0 nova_compute[187185]: 2025-11-29 07:16:10.396 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:16:10 compute-0 nova_compute[187185]: 2025-11-29 07:16:10.530 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:16:10 compute-0 nova_compute[187185]: 2025-11-29 07:16:10.624 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:16:10 compute-0 nova_compute[187185]: 2025-11-29 07:16:10.625 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:16:10 compute-0 nova_compute[187185]: 2025-11-29 07:16:10.711 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:16:10 compute-0 nova_compute[187185]: 2025-11-29 07:16:10.888 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:16:10 compute-0 nova_compute[187185]: 2025-11-29 07:16:10.889 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5580MB free_disk=73.29388046264648GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:16:10 compute-0 nova_compute[187185]: 2025-11-29 07:16:10.889 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:16:10 compute-0 nova_compute[187185]: 2025-11-29 07:16:10.890 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:16:10 compute-0 nova_compute[187185]: 2025-11-29 07:16:10.959 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance 704c4aa7-3239-4ecc-bfdc-c72642678363 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 07:16:10 compute-0 nova_compute[187185]: 2025-11-29 07:16:10.960 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:16:10 compute-0 nova_compute[187185]: 2025-11-29 07:16:10.960 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:16:10 compute-0 nova_compute[187185]: 2025-11-29 07:16:10.972 187189 DEBUG nova.network.neutron [req-6b111dfb-ed5f-4ebe-b572-57e52e5aec0d req-f519f7ae-57f9-41f9-995f-df33763f2391 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Updated VIF entry in instance network info cache for port 29881f52-aa42-4a78-a87b-06e906811ff2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:16:10 compute-0 nova_compute[187185]: 2025-11-29 07:16:10.974 187189 DEBUG nova.network.neutron [req-6b111dfb-ed5f-4ebe-b572-57e52e5aec0d req-f519f7ae-57f9-41f9-995f-df33763f2391 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Updating instance_info_cache with network_info: [{"id": "29881f52-aa42-4a78-a87b-06e906811ff2", "address": "fa:16:3e:44:9d:fe", "network": {"id": "f8ce59f3-d777-4899-bf5b-171901097199", "bridge": "br-int", "label": "tempest-network-smoke--1303008278", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29881f52-aa", "ovs_interfaceid": "29881f52-aa42-4a78-a87b-06e906811ff2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:16:10 compute-0 nova_compute[187185]: 2025-11-29 07:16:10.995 187189 DEBUG oslo_concurrency.lockutils [req-6b111dfb-ed5f-4ebe-b572-57e52e5aec0d req-f519f7ae-57f9-41f9-995f-df33763f2391 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-704c4aa7-3239-4ecc-bfdc-c72642678363" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:16:11 compute-0 nova_compute[187185]: 2025-11-29 07:16:11.002 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:16:11 compute-0 nova_compute[187185]: 2025-11-29 07:16:11.036 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:16:11 compute-0 nova_compute[187185]: 2025-11-29 07:16:11.058 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:16:11 compute-0 nova_compute[187185]: 2025-11-29 07:16:11.059 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:16:12 compute-0 nova_compute[187185]: 2025-11-29 07:16:12.060 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:16:13 compute-0 nova_compute[187185]: 2025-11-29 07:16:13.167 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400558.1662555, 084a0f8e-19b7-4b24-a503-c015b26addbc => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:16:13 compute-0 nova_compute[187185]: 2025-11-29 07:16:13.168 187189 INFO nova.compute.manager [-] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] VM Stopped (Lifecycle Event)
Nov 29 07:16:13 compute-0 nova_compute[187185]: 2025-11-29 07:16:13.191 187189 DEBUG nova.compute.manager [None req-80d48ec2-fa35-4f0d-8ecc-ecf163527995 - - - - - -] [instance: 084a0f8e-19b7-4b24-a503-c015b26addbc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:16:13 compute-0 podman[229438]: 2025-11-29 07:16:13.836577828 +0000 UTC m=+0.102169219 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2)
Nov 29 07:16:14 compute-0 ovn_controller[95281]: 2025-11-29T07:16:14Z|00035|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:44:9d:fe 10.100.0.6
Nov 29 07:16:14 compute-0 ovn_controller[95281]: 2025-11-29T07:16:14Z|00036|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:44:9d:fe 10.100.0.6
Nov 29 07:16:15 compute-0 nova_compute[187185]: 2025-11-29 07:16:15.070 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:16:15 compute-0 nova_compute[187185]: 2025-11-29 07:16:15.399 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:16:17 compute-0 ovn_controller[95281]: 2025-11-29T07:16:17Z|00273|binding|INFO|Releasing lport c10b6573-55b8-4259-8949-c467435d65c0 from this chassis (sb_readonly=0)
Nov 29 07:16:18 compute-0 nova_compute[187185]: 2025-11-29 07:16:18.019 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:16:19 compute-0 podman[229465]: 2025-11-29 07:16:19.808289857 +0000 UTC m=+0.075719559 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 07:16:20 compute-0 nova_compute[187185]: 2025-11-29 07:16:20.099 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:16:20 compute-0 nova_compute[187185]: 2025-11-29 07:16:20.312 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:16:20 compute-0 nova_compute[187185]: 2025-11-29 07:16:20.401 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:16:20 compute-0 nova_compute[187185]: 2025-11-29 07:16:20.918 187189 INFO nova.compute.manager [None req-9f88b92a-5a15-4d97-8452-ed91ace21497 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Get console output
Nov 29 07:16:20 compute-0 nova_compute[187185]: 2025-11-29 07:16:20.925 213758 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 29 07:16:21 compute-0 podman[229490]: 2025-11-29 07:16:21.817798477 +0000 UTC m=+0.078219410 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 07:16:21 compute-0 podman[229491]: 2025-11-29 07:16:21.820324339 +0000 UTC m=+0.076227093 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 29 07:16:22 compute-0 nova_compute[187185]: 2025-11-29 07:16:22.489 187189 INFO nova.compute.manager [None req-fef1c6bb-6100-4a23-b4e2-31e85a3b7812 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Get console output
Nov 29 07:16:22 compute-0 nova_compute[187185]: 2025-11-29 07:16:22.496 213758 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 29 07:16:25 compute-0 nova_compute[187185]: 2025-11-29 07:16:25.102 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:16:25 compute-0 nova_compute[187185]: 2025-11-29 07:16:25.404 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:16:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:25.505 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:16:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:25.507 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:16:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:25.507 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:16:28 compute-0 nova_compute[187185]: 2025-11-29 07:16:28.555 187189 DEBUG nova.virt.libvirt.driver [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Check if temp file /var/lib/nova/instances/tmpgy45ripl exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Nov 29 07:16:28 compute-0 nova_compute[187185]: 2025-11-29 07:16:28.560 187189 DEBUG oslo_concurrency.processutils [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:16:28 compute-0 nova_compute[187185]: 2025-11-29 07:16:28.640 187189 DEBUG oslo_concurrency.processutils [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363/disk --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:16:28 compute-0 nova_compute[187185]: 2025-11-29 07:16:28.641 187189 DEBUG oslo_concurrency.processutils [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:16:28 compute-0 nova_compute[187185]: 2025-11-29 07:16:28.694 187189 DEBUG oslo_concurrency.processutils [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:16:28 compute-0 nova_compute[187185]: 2025-11-29 07:16:28.695 187189 DEBUG nova.compute.manager [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpgy45ripl',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='704c4aa7-3239-4ecc-bfdc-c72642678363',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Nov 29 07:16:29 compute-0 nova_compute[187185]: 2025-11-29 07:16:29.386 187189 DEBUG oslo_concurrency.processutils [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:16:29 compute-0 nova_compute[187185]: 2025-11-29 07:16:29.463 187189 DEBUG oslo_concurrency.processutils [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:16:29 compute-0 nova_compute[187185]: 2025-11-29 07:16:29.464 187189 DEBUG oslo_concurrency.processutils [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:16:29 compute-0 nova_compute[187185]: 2025-11-29 07:16:29.518 187189 DEBUG oslo_concurrency.processutils [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:16:30 compute-0 nova_compute[187185]: 2025-11-29 07:16:30.106 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:16:30 compute-0 nova_compute[187185]: 2025-11-29 07:16:30.406 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:16:31 compute-0 sshd-session[229543]: Accepted publickey for nova from 192.168.122.102 port 51376 ssh2: ECDSA SHA256:TRxHqyAgYthLa8Amy2/P9EvhpVYcaH8++okwzvcHmaY
Nov 29 07:16:31 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Nov 29 07:16:32 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 29 07:16:32 compute-0 systemd-logind[788]: New session 34 of user nova.
Nov 29 07:16:32 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 29 07:16:32 compute-0 systemd[1]: Starting User Manager for UID 42436...
Nov 29 07:16:32 compute-0 systemd[229547]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 29 07:16:32 compute-0 systemd[229547]: Queued start job for default target Main User Target.
Nov 29 07:16:32 compute-0 systemd[229547]: Created slice User Application Slice.
Nov 29 07:16:32 compute-0 systemd[229547]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 29 07:16:32 compute-0 systemd[229547]: Started Daily Cleanup of User's Temporary Directories.
Nov 29 07:16:32 compute-0 systemd[229547]: Reached target Paths.
Nov 29 07:16:32 compute-0 systemd[229547]: Reached target Timers.
Nov 29 07:16:32 compute-0 systemd[229547]: Starting D-Bus User Message Bus Socket...
Nov 29 07:16:32 compute-0 systemd[229547]: Starting Create User's Volatile Files and Directories...
Nov 29 07:16:32 compute-0 systemd[229547]: Finished Create User's Volatile Files and Directories.
Nov 29 07:16:32 compute-0 systemd[229547]: Listening on D-Bus User Message Bus Socket.
Nov 29 07:16:32 compute-0 systemd[229547]: Reached target Sockets.
Nov 29 07:16:32 compute-0 systemd[229547]: Reached target Basic System.
Nov 29 07:16:32 compute-0 systemd[229547]: Reached target Main User Target.
Nov 29 07:16:32 compute-0 systemd[229547]: Startup finished in 157ms.
Nov 29 07:16:32 compute-0 systemd[1]: Started User Manager for UID 42436.
Nov 29 07:16:32 compute-0 systemd[1]: Started Session 34 of User nova.
Nov 29 07:16:32 compute-0 sshd-session[229543]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 29 07:16:32 compute-0 sshd-session[229562]: Received disconnect from 192.168.122.102 port 51376:11: disconnected by user
Nov 29 07:16:32 compute-0 sshd-session[229562]: Disconnected from user nova 192.168.122.102 port 51376
Nov 29 07:16:32 compute-0 sshd-session[229543]: pam_unix(sshd:session): session closed for user nova
Nov 29 07:16:32 compute-0 systemd[1]: session-34.scope: Deactivated successfully.
Nov 29 07:16:32 compute-0 systemd-logind[788]: Session 34 logged out. Waiting for processes to exit.
Nov 29 07:16:32 compute-0 systemd-logind[788]: Removed session 34.
Nov 29 07:16:34 compute-0 nova_compute[187185]: 2025-11-29 07:16:34.317 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:16:34 compute-0 nova_compute[187185]: 2025-11-29 07:16:34.354 187189 DEBUG nova.compute.manager [req-2298b7db-363e-42fb-9338-e20c3c74ece8 req-5b2b235e-3d68-452d-904f-1b210ab15504 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Received event network-vif-unplugged-29881f52-aa42-4a78-a87b-06e906811ff2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:16:34 compute-0 nova_compute[187185]: 2025-11-29 07:16:34.354 187189 DEBUG oslo_concurrency.lockutils [req-2298b7db-363e-42fb-9338-e20c3c74ece8 req-5b2b235e-3d68-452d-904f-1b210ab15504 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "704c4aa7-3239-4ecc-bfdc-c72642678363-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:16:34 compute-0 nova_compute[187185]: 2025-11-29 07:16:34.355 187189 DEBUG oslo_concurrency.lockutils [req-2298b7db-363e-42fb-9338-e20c3c74ece8 req-5b2b235e-3d68-452d-904f-1b210ab15504 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "704c4aa7-3239-4ecc-bfdc-c72642678363-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:16:34 compute-0 nova_compute[187185]: 2025-11-29 07:16:34.355 187189 DEBUG oslo_concurrency.lockutils [req-2298b7db-363e-42fb-9338-e20c3c74ece8 req-5b2b235e-3d68-452d-904f-1b210ab15504 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "704c4aa7-3239-4ecc-bfdc-c72642678363-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:16:34 compute-0 nova_compute[187185]: 2025-11-29 07:16:34.355 187189 DEBUG nova.compute.manager [req-2298b7db-363e-42fb-9338-e20c3c74ece8 req-5b2b235e-3d68-452d-904f-1b210ab15504 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] No waiting events found dispatching network-vif-unplugged-29881f52-aa42-4a78-a87b-06e906811ff2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:16:34 compute-0 nova_compute[187185]: 2025-11-29 07:16:34.355 187189 DEBUG nova.compute.manager [req-2298b7db-363e-42fb-9338-e20c3c74ece8 req-5b2b235e-3d68-452d-904f-1b210ab15504 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Received event network-vif-unplugged-29881f52-aa42-4a78-a87b-06e906811ff2 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 07:16:34 compute-0 podman[229564]: 2025-11-29 07:16:34.809971643 +0000 UTC m=+0.067008402 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 07:16:34 compute-0 podman[229566]: 2025-11-29 07:16:34.814981365 +0000 UTC m=+0.061431924 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 07:16:34 compute-0 podman[229565]: 2025-11-29 07:16:34.818979808 +0000 UTC m=+0.075104851 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, name=ubi9-minimal, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vcs-type=git, version=9.6, com.redhat.component=ubi9-minimal-container)
Nov 29 07:16:35 compute-0 nova_compute[187185]: 2025-11-29 07:16:35.153 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:16:35 compute-0 nova_compute[187185]: 2025-11-29 07:16:35.409 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:16:36 compute-0 nova_compute[187185]: 2025-11-29 07:16:36.577 187189 DEBUG nova.compute.manager [req-c1deb6e4-b5b2-4e63-b3aa-eed3e3fa47a4 req-87bf4bcf-31d0-448f-8592-f7125a5663f5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Received event network-vif-plugged-29881f52-aa42-4a78-a87b-06e906811ff2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:16:36 compute-0 nova_compute[187185]: 2025-11-29 07:16:36.578 187189 DEBUG oslo_concurrency.lockutils [req-c1deb6e4-b5b2-4e63-b3aa-eed3e3fa47a4 req-87bf4bcf-31d0-448f-8592-f7125a5663f5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "704c4aa7-3239-4ecc-bfdc-c72642678363-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:16:36 compute-0 nova_compute[187185]: 2025-11-29 07:16:36.578 187189 DEBUG oslo_concurrency.lockutils [req-c1deb6e4-b5b2-4e63-b3aa-eed3e3fa47a4 req-87bf4bcf-31d0-448f-8592-f7125a5663f5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "704c4aa7-3239-4ecc-bfdc-c72642678363-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:16:36 compute-0 nova_compute[187185]: 2025-11-29 07:16:36.578 187189 DEBUG oslo_concurrency.lockutils [req-c1deb6e4-b5b2-4e63-b3aa-eed3e3fa47a4 req-87bf4bcf-31d0-448f-8592-f7125a5663f5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "704c4aa7-3239-4ecc-bfdc-c72642678363-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:16:36 compute-0 nova_compute[187185]: 2025-11-29 07:16:36.579 187189 DEBUG nova.compute.manager [req-c1deb6e4-b5b2-4e63-b3aa-eed3e3fa47a4 req-87bf4bcf-31d0-448f-8592-f7125a5663f5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] No waiting events found dispatching network-vif-plugged-29881f52-aa42-4a78-a87b-06e906811ff2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:16:36 compute-0 nova_compute[187185]: 2025-11-29 07:16:36.579 187189 WARNING nova.compute.manager [req-c1deb6e4-b5b2-4e63-b3aa-eed3e3fa47a4 req-87bf4bcf-31d0-448f-8592-f7125a5663f5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Received unexpected event network-vif-plugged-29881f52-aa42-4a78-a87b-06e906811ff2 for instance with vm_state active and task_state migrating.
Nov 29 07:16:37 compute-0 nova_compute[187185]: 2025-11-29 07:16:37.330 187189 INFO nova.compute.manager [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Took 7.81 seconds for pre_live_migration on destination host compute-2.ctlplane.example.com.
Nov 29 07:16:37 compute-0 nova_compute[187185]: 2025-11-29 07:16:37.331 187189 DEBUG nova.compute.manager [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:16:37 compute-0 nova_compute[187185]: 2025-11-29 07:16:37.362 187189 DEBUG nova.compute.manager [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpgy45ripl',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='704c4aa7-3239-4ecc-bfdc-c72642678363',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(d11e3672-6cff-4636-96b9-c3cf40816fbe),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Nov 29 07:16:37 compute-0 nova_compute[187185]: 2025-11-29 07:16:37.399 187189 DEBUG nova.objects.instance [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Lazy-loading 'migration_context' on Instance uuid 704c4aa7-3239-4ecc-bfdc-c72642678363 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:16:37 compute-0 nova_compute[187185]: 2025-11-29 07:16:37.401 187189 DEBUG nova.virt.libvirt.driver [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Nov 29 07:16:37 compute-0 nova_compute[187185]: 2025-11-29 07:16:37.403 187189 DEBUG nova.virt.libvirt.driver [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Nov 29 07:16:37 compute-0 nova_compute[187185]: 2025-11-29 07:16:37.403 187189 DEBUG nova.virt.libvirt.driver [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Nov 29 07:16:37 compute-0 nova_compute[187185]: 2025-11-29 07:16:37.420 187189 DEBUG nova.virt.libvirt.vif [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:15:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1683200929',display_name='tempest-TestNetworkAdvancedServerOps-server-1683200929',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1683200929',id=102,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJpBdlQTrwm1jTLhsIWvBArp7FJbNV/DmsxpavKG+fSfuJYeopMQPEBt+TLRsvwJz1i5TrgMP98T/zGS4tH40QimuRAQV56ulySp5fCUrK73vauhbVZ7xUa0c5MPUYrHZg==',key_name='tempest-TestNetworkAdvancedServerOps-1740209866',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:16:02Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-v4kh9s3g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:16:02Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=704c4aa7-3239-4ecc-bfdc-c72642678363,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "29881f52-aa42-4a78-a87b-06e906811ff2", "address": "fa:16:3e:44:9d:fe", "network": {"id": "f8ce59f3-d777-4899-bf5b-171901097199", "bridge": "br-int", "label": "tempest-network-smoke--1303008278", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap29881f52-aa", "ovs_interfaceid": "29881f52-aa42-4a78-a87b-06e906811ff2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:16:37 compute-0 nova_compute[187185]: 2025-11-29 07:16:37.421 187189 DEBUG nova.network.os_vif_util [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Converting VIF {"id": "29881f52-aa42-4a78-a87b-06e906811ff2", "address": "fa:16:3e:44:9d:fe", "network": {"id": "f8ce59f3-d777-4899-bf5b-171901097199", "bridge": "br-int", "label": "tempest-network-smoke--1303008278", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap29881f52-aa", "ovs_interfaceid": "29881f52-aa42-4a78-a87b-06e906811ff2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:16:37 compute-0 nova_compute[187185]: 2025-11-29 07:16:37.422 187189 DEBUG nova.network.os_vif_util [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:44:9d:fe,bridge_name='br-int',has_traffic_filtering=True,id=29881f52-aa42-4a78-a87b-06e906811ff2,network=Network(f8ce59f3-d777-4899-bf5b-171901097199),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29881f52-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:16:37 compute-0 nova_compute[187185]: 2025-11-29 07:16:37.423 187189 DEBUG nova.virt.libvirt.migration [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Updating guest XML with vif config: <interface type="ethernet">
Nov 29 07:16:37 compute-0 nova_compute[187185]:   <mac address="fa:16:3e:44:9d:fe"/>
Nov 29 07:16:37 compute-0 nova_compute[187185]:   <model type="virtio"/>
Nov 29 07:16:37 compute-0 nova_compute[187185]:   <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:16:37 compute-0 nova_compute[187185]:   <mtu size="1442"/>
Nov 29 07:16:37 compute-0 nova_compute[187185]:   <target dev="tap29881f52-aa"/>
Nov 29 07:16:37 compute-0 nova_compute[187185]: </interface>
Nov 29 07:16:37 compute-0 nova_compute[187185]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Nov 29 07:16:37 compute-0 nova_compute[187185]: 2025-11-29 07:16:37.424 187189 DEBUG nova.virt.libvirt.driver [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Nov 29 07:16:37 compute-0 nova_compute[187185]: 2025-11-29 07:16:37.907 187189 DEBUG nova.virt.libvirt.migration [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 29 07:16:37 compute-0 nova_compute[187185]: 2025-11-29 07:16:37.908 187189 INFO nova.virt.libvirt.migration [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Increasing downtime to 50 ms after 0 sec elapsed time
Nov 29 07:16:37 compute-0 nova_compute[187185]: 2025-11-29 07:16:37.994 187189 INFO nova.virt.libvirt.driver [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Nov 29 07:16:38 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:38.301 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=33, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=32) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:16:38 compute-0 nova_compute[187185]: 2025-11-29 07:16:38.302 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:16:38 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:38.303 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:16:38 compute-0 nova_compute[187185]: 2025-11-29 07:16:38.497 187189 DEBUG nova.virt.libvirt.migration [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 29 07:16:38 compute-0 nova_compute[187185]: 2025-11-29 07:16:38.497 187189 DEBUG nova.virt.libvirt.migration [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 29 07:16:38 compute-0 nova_compute[187185]: 2025-11-29 07:16:38.736 187189 DEBUG nova.compute.manager [req-6749d7a8-4018-4fb2-bd70-45d4ba0769d5 req-67747558-8cbe-4729-8cfc-140df8f782ba 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Received event network-changed-29881f52-aa42-4a78-a87b-06e906811ff2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:16:38 compute-0 nova_compute[187185]: 2025-11-29 07:16:38.737 187189 DEBUG nova.compute.manager [req-6749d7a8-4018-4fb2-bd70-45d4ba0769d5 req-67747558-8cbe-4729-8cfc-140df8f782ba 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Refreshing instance network info cache due to event network-changed-29881f52-aa42-4a78-a87b-06e906811ff2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:16:38 compute-0 nova_compute[187185]: 2025-11-29 07:16:38.737 187189 DEBUG oslo_concurrency.lockutils [req-6749d7a8-4018-4fb2-bd70-45d4ba0769d5 req-67747558-8cbe-4729-8cfc-140df8f782ba 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-704c4aa7-3239-4ecc-bfdc-c72642678363" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:16:38 compute-0 nova_compute[187185]: 2025-11-29 07:16:38.737 187189 DEBUG oslo_concurrency.lockutils [req-6749d7a8-4018-4fb2-bd70-45d4ba0769d5 req-67747558-8cbe-4729-8cfc-140df8f782ba 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-704c4aa7-3239-4ecc-bfdc-c72642678363" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:16:38 compute-0 nova_compute[187185]: 2025-11-29 07:16:38.738 187189 DEBUG nova.network.neutron [req-6749d7a8-4018-4fb2-bd70-45d4ba0769d5 req-67747558-8cbe-4729-8cfc-140df8f782ba 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Refreshing network info cache for port 29881f52-aa42-4a78-a87b-06e906811ff2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:16:39 compute-0 nova_compute[187185]: 2025-11-29 07:16:39.002 187189 DEBUG nova.virt.libvirt.migration [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 29 07:16:39 compute-0 nova_compute[187185]: 2025-11-29 07:16:39.003 187189 DEBUG nova.virt.libvirt.migration [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 29 07:16:39 compute-0 nova_compute[187185]: 2025-11-29 07:16:39.447 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400599.4471998, 704c4aa7-3239-4ecc-bfdc-c72642678363 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:16:39 compute-0 nova_compute[187185]: 2025-11-29 07:16:39.448 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] VM Paused (Lifecycle Event)
Nov 29 07:16:39 compute-0 nova_compute[187185]: 2025-11-29 07:16:39.469 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:16:39 compute-0 nova_compute[187185]: 2025-11-29 07:16:39.475 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:16:39 compute-0 nova_compute[187185]: 2025-11-29 07:16:39.500 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] During sync_power_state the instance has a pending task (migrating). Skip.
Nov 29 07:16:39 compute-0 nova_compute[187185]: 2025-11-29 07:16:39.506 187189 DEBUG nova.virt.libvirt.migration [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 29 07:16:39 compute-0 nova_compute[187185]: 2025-11-29 07:16:39.507 187189 DEBUG nova.virt.libvirt.migration [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 29 07:16:39 compute-0 kernel: tap29881f52-aa (unregistering): left promiscuous mode
Nov 29 07:16:39 compute-0 NetworkManager[55227]: <info>  [1764400599.5936] device (tap29881f52-aa): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:16:39 compute-0 nova_compute[187185]: 2025-11-29 07:16:39.621 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:16:39 compute-0 ovn_controller[95281]: 2025-11-29T07:16:39Z|00274|binding|INFO|Releasing lport 29881f52-aa42-4a78-a87b-06e906811ff2 from this chassis (sb_readonly=0)
Nov 29 07:16:39 compute-0 ovn_controller[95281]: 2025-11-29T07:16:39Z|00275|binding|INFO|Setting lport 29881f52-aa42-4a78-a87b-06e906811ff2 down in Southbound
Nov 29 07:16:39 compute-0 ovn_controller[95281]: 2025-11-29T07:16:39Z|00276|binding|INFO|Removing iface tap29881f52-aa ovn-installed in OVS
Nov 29 07:16:39 compute-0 nova_compute[187185]: 2025-11-29 07:16:39.627 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:16:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:39.632 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:44:9d:fe 10.100.0.6'], port_security=['fa:16:3e:44:9d:fe 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-2.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '704c4aa7-3239-4ecc-bfdc-c72642678363', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f8ce59f3-d777-4899-bf5b-171901097199', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c231e63624d44fc19e0989abfb1afb22', 'neutron:revision_number': '8', 'neutron:security_group_ids': '0aded770-2a08-4693-9d94-82fba33c50bb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.225'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c8856130-4f24-493d-8324-579a0d608efb, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=29881f52-aa42-4a78-a87b-06e906811ff2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:16:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:39.633 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 29881f52-aa42-4a78-a87b-06e906811ff2 in datapath f8ce59f3-d777-4899-bf5b-171901097199 unbound from our chassis
Nov 29 07:16:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:39.635 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f8ce59f3-d777-4899-bf5b-171901097199, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:16:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:39.638 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[7fe66cf9-7381-4848-a337-cfb7c1c856e0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:16:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:39.639 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f8ce59f3-d777-4899-bf5b-171901097199 namespace which is not needed anymore
Nov 29 07:16:39 compute-0 nova_compute[187185]: 2025-11-29 07:16:39.640 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:16:39 compute-0 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d00000066.scope: Deactivated successfully.
Nov 29 07:16:39 compute-0 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d00000066.scope: Consumed 14.592s CPU time.
Nov 29 07:16:39 compute-0 systemd-machined[153486]: Machine qemu-37-instance-00000066 terminated.
Nov 29 07:16:39 compute-0 neutron-haproxy-ovnmeta-f8ce59f3-d777-4899-bf5b-171901097199[229339]: [NOTICE]   (229343) : haproxy version is 2.8.14-c23fe91
Nov 29 07:16:39 compute-0 neutron-haproxy-ovnmeta-f8ce59f3-d777-4899-bf5b-171901097199[229339]: [NOTICE]   (229343) : path to executable is /usr/sbin/haproxy
Nov 29 07:16:39 compute-0 neutron-haproxy-ovnmeta-f8ce59f3-d777-4899-bf5b-171901097199[229339]: [WARNING]  (229343) : Exiting Master process...
Nov 29 07:16:39 compute-0 neutron-haproxy-ovnmeta-f8ce59f3-d777-4899-bf5b-171901097199[229339]: [ALERT]    (229343) : Current worker (229345) exited with code 143 (Terminated)
Nov 29 07:16:39 compute-0 neutron-haproxy-ovnmeta-f8ce59f3-d777-4899-bf5b-171901097199[229339]: [WARNING]  (229343) : All workers exited. Exiting... (0)
Nov 29 07:16:39 compute-0 systemd[1]: libpod-58bf4e59539b7fcd8b376f3cf337b5e0a574a1fb0cce529ef3e974dcbc08450c.scope: Deactivated successfully.
Nov 29 07:16:39 compute-0 nova_compute[187185]: 2025-11-29 07:16:39.787 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:16:39 compute-0 podman[229658]: 2025-11-29 07:16:39.790699462 +0000 UTC m=+0.050482253 container died 58bf4e59539b7fcd8b376f3cf337b5e0a574a1fb0cce529ef3e974dcbc08450c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f8ce59f3-d777-4899-bf5b-171901097199, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 07:16:39 compute-0 nova_compute[187185]: 2025-11-29 07:16:39.791 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:16:39 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-58bf4e59539b7fcd8b376f3cf337b5e0a574a1fb0cce529ef3e974dcbc08450c-userdata-shm.mount: Deactivated successfully.
Nov 29 07:16:39 compute-0 systemd[1]: var-lib-containers-storage-overlay-6471ecca0f8a443025464549e15742f540bc87fb1faf7ea59887c3e65e61b607-merged.mount: Deactivated successfully.
Nov 29 07:16:39 compute-0 podman[229658]: 2025-11-29 07:16:39.831598492 +0000 UTC m=+0.091381283 container cleanup 58bf4e59539b7fcd8b376f3cf337b5e0a574a1fb0cce529ef3e974dcbc08450c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f8ce59f3-d777-4899-bf5b-171901097199, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 07:16:39 compute-0 nova_compute[187185]: 2025-11-29 07:16:39.837 187189 DEBUG nova.virt.libvirt.driver [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Nov 29 07:16:39 compute-0 nova_compute[187185]: 2025-11-29 07:16:39.837 187189 DEBUG nova.virt.libvirt.driver [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Nov 29 07:16:39 compute-0 nova_compute[187185]: 2025-11-29 07:16:39.837 187189 DEBUG nova.virt.libvirt.driver [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Nov 29 07:16:39 compute-0 systemd[1]: libpod-conmon-58bf4e59539b7fcd8b376f3cf337b5e0a574a1fb0cce529ef3e974dcbc08450c.scope: Deactivated successfully.
Nov 29 07:16:39 compute-0 podman[229702]: 2025-11-29 07:16:39.903984195 +0000 UTC m=+0.047416786 container remove 58bf4e59539b7fcd8b376f3cf337b5e0a574a1fb0cce529ef3e974dcbc08450c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f8ce59f3-d777-4899-bf5b-171901097199, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 07:16:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:39.910 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[3bd14f9a-78c1-48cb-aa8a-b351c7d35ec9]: (4, ('Sat Nov 29 07:16:39 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f8ce59f3-d777-4899-bf5b-171901097199 (58bf4e59539b7fcd8b376f3cf337b5e0a574a1fb0cce529ef3e974dcbc08450c)\n58bf4e59539b7fcd8b376f3cf337b5e0a574a1fb0cce529ef3e974dcbc08450c\nSat Nov 29 07:16:39 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f8ce59f3-d777-4899-bf5b-171901097199 (58bf4e59539b7fcd8b376f3cf337b5e0a574a1fb0cce529ef3e974dcbc08450c)\n58bf4e59539b7fcd8b376f3cf337b5e0a574a1fb0cce529ef3e974dcbc08450c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:16:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:39.912 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[2d5ccb79-bf69-4b16-8868-8f410a841f2e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:16:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:39.914 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf8ce59f3-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:16:39 compute-0 nova_compute[187185]: 2025-11-29 07:16:39.918 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:16:39 compute-0 kernel: tapf8ce59f3-d0: left promiscuous mode
Nov 29 07:16:39 compute-0 nova_compute[187185]: 2025-11-29 07:16:39.946 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:16:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:39.952 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[8411c987-95be-46f3-83a0-cca06719449b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:16:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:39.972 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6b45f53a-bb0b-4837-ad8a-ad1a4a20769b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:16:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:39.974 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[fdae30fc-f180-4614-aef0-5673db68836e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:16:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:39.996 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[79d5a0be-10f7-4fec-94ba-041547811e19]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600928, 'reachable_time': 25155, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229721, 'error': None, 'target': 'ovnmeta-f8ce59f3-d777-4899-bf5b-171901097199', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:16:40 compute-0 systemd[1]: run-netns-ovnmeta\x2df8ce59f3\x2dd777\x2d4899\x2dbf5b\x2d171901097199.mount: Deactivated successfully.
Nov 29 07:16:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:40.001 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f8ce59f3-d777-4899-bf5b-171901097199 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:16:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:40.002 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[4b65d59b-0316-46b5-8bc3-d9d391812498]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:16:40 compute-0 nova_compute[187185]: 2025-11-29 07:16:40.009 187189 DEBUG nova.virt.libvirt.guest [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '704c4aa7-3239-4ecc-bfdc-c72642678363' (instance-00000066) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Nov 29 07:16:40 compute-0 nova_compute[187185]: 2025-11-29 07:16:40.010 187189 INFO nova.virt.libvirt.driver [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Migration operation has completed
Nov 29 07:16:40 compute-0 nova_compute[187185]: 2025-11-29 07:16:40.010 187189 INFO nova.compute.manager [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] _post_live_migration() is started..
Nov 29 07:16:40 compute-0 nova_compute[187185]: 2025-11-29 07:16:40.154 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:16:40 compute-0 nova_compute[187185]: 2025-11-29 07:16:40.413 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:16:42 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Nov 29 07:16:42 compute-0 systemd[229547]: Activating special unit Exit the Session...
Nov 29 07:16:42 compute-0 systemd[229547]: Stopped target Main User Target.
Nov 29 07:16:42 compute-0 systemd[229547]: Stopped target Basic System.
Nov 29 07:16:42 compute-0 systemd[229547]: Stopped target Paths.
Nov 29 07:16:42 compute-0 systemd[229547]: Stopped target Sockets.
Nov 29 07:16:42 compute-0 systemd[229547]: Stopped target Timers.
Nov 29 07:16:42 compute-0 systemd[229547]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 29 07:16:42 compute-0 systemd[229547]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 29 07:16:42 compute-0 systemd[229547]: Closed D-Bus User Message Bus Socket.
Nov 29 07:16:42 compute-0 systemd[229547]: Stopped Create User's Volatile Files and Directories.
Nov 29 07:16:42 compute-0 systemd[229547]: Removed slice User Application Slice.
Nov 29 07:16:42 compute-0 systemd[229547]: Reached target Shutdown.
Nov 29 07:16:42 compute-0 systemd[229547]: Finished Exit the Session.
Nov 29 07:16:42 compute-0 systemd[229547]: Reached target Exit the Session.
Nov 29 07:16:42 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Nov 29 07:16:42 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Nov 29 07:16:42 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 29 07:16:42 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 29 07:16:42 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 29 07:16:42 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 29 07:16:42 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Nov 29 07:16:44 compute-0 nova_compute[187185]: 2025-11-29 07:16:44.130 187189 DEBUG nova.compute.manager [req-150a9bb6-4be9-46f7-b80c-2fc0ac89a154 req-f6923fff-e4f2-45e2-9c99-c0900cc60000 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Received event network-vif-unplugged-29881f52-aa42-4a78-a87b-06e906811ff2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:16:44 compute-0 nova_compute[187185]: 2025-11-29 07:16:44.133 187189 DEBUG oslo_concurrency.lockutils [req-150a9bb6-4be9-46f7-b80c-2fc0ac89a154 req-f6923fff-e4f2-45e2-9c99-c0900cc60000 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "704c4aa7-3239-4ecc-bfdc-c72642678363-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:16:44 compute-0 nova_compute[187185]: 2025-11-29 07:16:44.134 187189 DEBUG oslo_concurrency.lockutils [req-150a9bb6-4be9-46f7-b80c-2fc0ac89a154 req-f6923fff-e4f2-45e2-9c99-c0900cc60000 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "704c4aa7-3239-4ecc-bfdc-c72642678363-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:16:44 compute-0 nova_compute[187185]: 2025-11-29 07:16:44.134 187189 DEBUG oslo_concurrency.lockutils [req-150a9bb6-4be9-46f7-b80c-2fc0ac89a154 req-f6923fff-e4f2-45e2-9c99-c0900cc60000 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "704c4aa7-3239-4ecc-bfdc-c72642678363-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:16:44 compute-0 nova_compute[187185]: 2025-11-29 07:16:44.135 187189 DEBUG nova.compute.manager [req-150a9bb6-4be9-46f7-b80c-2fc0ac89a154 req-f6923fff-e4f2-45e2-9c99-c0900cc60000 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] No waiting events found dispatching network-vif-unplugged-29881f52-aa42-4a78-a87b-06e906811ff2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:16:44 compute-0 nova_compute[187185]: 2025-11-29 07:16:44.135 187189 DEBUG nova.compute.manager [req-150a9bb6-4be9-46f7-b80c-2fc0ac89a154 req-f6923fff-e4f2-45e2-9c99-c0900cc60000 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Received event network-vif-unplugged-29881f52-aa42-4a78-a87b-06e906811ff2 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 07:16:44 compute-0 nova_compute[187185]: 2025-11-29 07:16:44.210 187189 DEBUG nova.network.neutron [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Activated binding for port 29881f52-aa42-4a78-a87b-06e906811ff2 and host compute-2.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Nov 29 07:16:44 compute-0 nova_compute[187185]: 2025-11-29 07:16:44.212 187189 DEBUG nova.compute.manager [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "29881f52-aa42-4a78-a87b-06e906811ff2", "address": "fa:16:3e:44:9d:fe", "network": {"id": "f8ce59f3-d777-4899-bf5b-171901097199", "bridge": "br-int", "label": "tempest-network-smoke--1303008278", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29881f52-aa", "ovs_interfaceid": "29881f52-aa42-4a78-a87b-06e906811ff2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Nov 29 07:16:44 compute-0 nova_compute[187185]: 2025-11-29 07:16:44.214 187189 DEBUG nova.virt.libvirt.vif [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:15:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1683200929',display_name='tempest-TestNetworkAdvancedServerOps-server-1683200929',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1683200929',id=102,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJpBdlQTrwm1jTLhsIWvBArp7FJbNV/DmsxpavKG+fSfuJYeopMQPEBt+TLRsvwJz1i5TrgMP98T/zGS4tH40QimuRAQV56ulySp5fCUrK73vauhbVZ7xUa0c5MPUYrHZg==',key_name='tempest-TestNetworkAdvancedServerOps-1740209866',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:16:02Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-v4kh9s3g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:16:25Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=704c4aa7-3239-4ecc-bfdc-c72642678363,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "29881f52-aa42-4a78-a87b-06e906811ff2", "address": "fa:16:3e:44:9d:fe", "network": {"id": "f8ce59f3-d777-4899-bf5b-171901097199", "bridge": "br-int", "label": "tempest-network-smoke--1303008278", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29881f52-aa", "ovs_interfaceid": "29881f52-aa42-4a78-a87b-06e906811ff2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:16:44 compute-0 nova_compute[187185]: 2025-11-29 07:16:44.215 187189 DEBUG nova.network.os_vif_util [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Converting VIF {"id": "29881f52-aa42-4a78-a87b-06e906811ff2", "address": "fa:16:3e:44:9d:fe", "network": {"id": "f8ce59f3-d777-4899-bf5b-171901097199", "bridge": "br-int", "label": "tempest-network-smoke--1303008278", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29881f52-aa", "ovs_interfaceid": "29881f52-aa42-4a78-a87b-06e906811ff2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:16:44 compute-0 nova_compute[187185]: 2025-11-29 07:16:44.217 187189 DEBUG nova.network.os_vif_util [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:44:9d:fe,bridge_name='br-int',has_traffic_filtering=True,id=29881f52-aa42-4a78-a87b-06e906811ff2,network=Network(f8ce59f3-d777-4899-bf5b-171901097199),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29881f52-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:16:44 compute-0 nova_compute[187185]: 2025-11-29 07:16:44.217 187189 DEBUG os_vif [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:44:9d:fe,bridge_name='br-int',has_traffic_filtering=True,id=29881f52-aa42-4a78-a87b-06e906811ff2,network=Network(f8ce59f3-d777-4899-bf5b-171901097199),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29881f52-aa') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:16:44 compute-0 nova_compute[187185]: 2025-11-29 07:16:44.221 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:16:44 compute-0 nova_compute[187185]: 2025-11-29 07:16:44.222 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap29881f52-aa, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:16:44 compute-0 nova_compute[187185]: 2025-11-29 07:16:44.224 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:16:44 compute-0 nova_compute[187185]: 2025-11-29 07:16:44.227 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:16:44 compute-0 nova_compute[187185]: 2025-11-29 07:16:44.227 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:16:44 compute-0 nova_compute[187185]: 2025-11-29 07:16:44.231 187189 INFO os_vif [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:44:9d:fe,bridge_name='br-int',has_traffic_filtering=True,id=29881f52-aa42-4a78-a87b-06e906811ff2,network=Network(f8ce59f3-d777-4899-bf5b-171901097199),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29881f52-aa')
Nov 29 07:16:44 compute-0 nova_compute[187185]: 2025-11-29 07:16:44.231 187189 DEBUG oslo_concurrency.lockutils [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:16:44 compute-0 nova_compute[187185]: 2025-11-29 07:16:44.232 187189 DEBUG oslo_concurrency.lockutils [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:16:44 compute-0 nova_compute[187185]: 2025-11-29 07:16:44.232 187189 DEBUG oslo_concurrency.lockutils [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:16:44 compute-0 nova_compute[187185]: 2025-11-29 07:16:44.232 187189 DEBUG nova.compute.manager [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Nov 29 07:16:44 compute-0 nova_compute[187185]: 2025-11-29 07:16:44.233 187189 INFO nova.virt.libvirt.driver [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Deleting instance files /var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363_del
Nov 29 07:16:44 compute-0 nova_compute[187185]: 2025-11-29 07:16:44.234 187189 INFO nova.virt.libvirt.driver [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Deletion of /var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363_del complete
Nov 29 07:16:44 compute-0 nova_compute[187185]: 2025-11-29 07:16:44.485 187189 DEBUG nova.network.neutron [req-6749d7a8-4018-4fb2-bd70-45d4ba0769d5 req-67747558-8cbe-4729-8cfc-140df8f782ba 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Updated VIF entry in instance network info cache for port 29881f52-aa42-4a78-a87b-06e906811ff2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:16:44 compute-0 nova_compute[187185]: 2025-11-29 07:16:44.486 187189 DEBUG nova.network.neutron [req-6749d7a8-4018-4fb2-bd70-45d4ba0769d5 req-67747558-8cbe-4729-8cfc-140df8f782ba 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Updating instance_info_cache with network_info: [{"id": "29881f52-aa42-4a78-a87b-06e906811ff2", "address": "fa:16:3e:44:9d:fe", "network": {"id": "f8ce59f3-d777-4899-bf5b-171901097199", "bridge": "br-int", "label": "tempest-network-smoke--1303008278", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29881f52-aa", "ovs_interfaceid": "29881f52-aa42-4a78-a87b-06e906811ff2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-2.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:16:44 compute-0 nova_compute[187185]: 2025-11-29 07:16:44.537 187189 DEBUG oslo_concurrency.lockutils [req-6749d7a8-4018-4fb2-bd70-45d4ba0769d5 req-67747558-8cbe-4729-8cfc-140df8f782ba 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-704c4aa7-3239-4ecc-bfdc-c72642678363" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:16:44 compute-0 nova_compute[187185]: 2025-11-29 07:16:44.789 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:16:44 compute-0 podman[229723]: 2025-11-29 07:16:44.87075892 +0000 UTC m=+0.128403043 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 07:16:45 compute-0 nova_compute[187185]: 2025-11-29 07:16:45.448 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:16:47 compute-0 nova_compute[187185]: 2025-11-29 07:16:47.062 187189 DEBUG nova.compute.manager [req-9ed5df97-d8bd-467a-ae06-7e479256f194 req-92ea969f-780e-41e7-b205-be79b135c270 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Received event network-vif-plugged-29881f52-aa42-4a78-a87b-06e906811ff2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:16:47 compute-0 nova_compute[187185]: 2025-11-29 07:16:47.062 187189 DEBUG oslo_concurrency.lockutils [req-9ed5df97-d8bd-467a-ae06-7e479256f194 req-92ea969f-780e-41e7-b205-be79b135c270 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "704c4aa7-3239-4ecc-bfdc-c72642678363-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:16:47 compute-0 nova_compute[187185]: 2025-11-29 07:16:47.063 187189 DEBUG oslo_concurrency.lockutils [req-9ed5df97-d8bd-467a-ae06-7e479256f194 req-92ea969f-780e-41e7-b205-be79b135c270 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "704c4aa7-3239-4ecc-bfdc-c72642678363-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:16:47 compute-0 nova_compute[187185]: 2025-11-29 07:16:47.063 187189 DEBUG oslo_concurrency.lockutils [req-9ed5df97-d8bd-467a-ae06-7e479256f194 req-92ea969f-780e-41e7-b205-be79b135c270 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "704c4aa7-3239-4ecc-bfdc-c72642678363-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:16:47 compute-0 nova_compute[187185]: 2025-11-29 07:16:47.064 187189 DEBUG nova.compute.manager [req-9ed5df97-d8bd-467a-ae06-7e479256f194 req-92ea969f-780e-41e7-b205-be79b135c270 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] No waiting events found dispatching network-vif-plugged-29881f52-aa42-4a78-a87b-06e906811ff2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:16:47 compute-0 nova_compute[187185]: 2025-11-29 07:16:47.064 187189 WARNING nova.compute.manager [req-9ed5df97-d8bd-467a-ae06-7e479256f194 req-92ea969f-780e-41e7-b205-be79b135c270 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Received unexpected event network-vif-plugged-29881f52-aa42-4a78-a87b-06e906811ff2 for instance with vm_state active and task_state migrating.
Nov 29 07:16:47 compute-0 nova_compute[187185]: 2025-11-29 07:16:47.064 187189 DEBUG nova.compute.manager [req-9ed5df97-d8bd-467a-ae06-7e479256f194 req-92ea969f-780e-41e7-b205-be79b135c270 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Received event network-vif-plugged-29881f52-aa42-4a78-a87b-06e906811ff2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:16:47 compute-0 nova_compute[187185]: 2025-11-29 07:16:47.065 187189 DEBUG oslo_concurrency.lockutils [req-9ed5df97-d8bd-467a-ae06-7e479256f194 req-92ea969f-780e-41e7-b205-be79b135c270 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "704c4aa7-3239-4ecc-bfdc-c72642678363-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:16:47 compute-0 nova_compute[187185]: 2025-11-29 07:16:47.065 187189 DEBUG oslo_concurrency.lockutils [req-9ed5df97-d8bd-467a-ae06-7e479256f194 req-92ea969f-780e-41e7-b205-be79b135c270 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "704c4aa7-3239-4ecc-bfdc-c72642678363-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:16:47 compute-0 nova_compute[187185]: 2025-11-29 07:16:47.066 187189 DEBUG oslo_concurrency.lockutils [req-9ed5df97-d8bd-467a-ae06-7e479256f194 req-92ea969f-780e-41e7-b205-be79b135c270 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "704c4aa7-3239-4ecc-bfdc-c72642678363-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:16:47 compute-0 nova_compute[187185]: 2025-11-29 07:16:47.066 187189 DEBUG nova.compute.manager [req-9ed5df97-d8bd-467a-ae06-7e479256f194 req-92ea969f-780e-41e7-b205-be79b135c270 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] No waiting events found dispatching network-vif-plugged-29881f52-aa42-4a78-a87b-06e906811ff2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:16:47 compute-0 nova_compute[187185]: 2025-11-29 07:16:47.066 187189 WARNING nova.compute.manager [req-9ed5df97-d8bd-467a-ae06-7e479256f194 req-92ea969f-780e-41e7-b205-be79b135c270 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Received unexpected event network-vif-plugged-29881f52-aa42-4a78-a87b-06e906811ff2 for instance with vm_state active and task_state migrating.
Nov 29 07:16:47 compute-0 nova_compute[187185]: 2025-11-29 07:16:47.067 187189 DEBUG nova.compute.manager [req-9ed5df97-d8bd-467a-ae06-7e479256f194 req-92ea969f-780e-41e7-b205-be79b135c270 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Received event network-vif-plugged-29881f52-aa42-4a78-a87b-06e906811ff2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:16:47 compute-0 nova_compute[187185]: 2025-11-29 07:16:47.067 187189 DEBUG oslo_concurrency.lockutils [req-9ed5df97-d8bd-467a-ae06-7e479256f194 req-92ea969f-780e-41e7-b205-be79b135c270 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "704c4aa7-3239-4ecc-bfdc-c72642678363-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:16:47 compute-0 nova_compute[187185]: 2025-11-29 07:16:47.067 187189 DEBUG oslo_concurrency.lockutils [req-9ed5df97-d8bd-467a-ae06-7e479256f194 req-92ea969f-780e-41e7-b205-be79b135c270 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "704c4aa7-3239-4ecc-bfdc-c72642678363-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:16:47 compute-0 nova_compute[187185]: 2025-11-29 07:16:47.068 187189 DEBUG oslo_concurrency.lockutils [req-9ed5df97-d8bd-467a-ae06-7e479256f194 req-92ea969f-780e-41e7-b205-be79b135c270 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "704c4aa7-3239-4ecc-bfdc-c72642678363-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:16:47 compute-0 nova_compute[187185]: 2025-11-29 07:16:47.068 187189 DEBUG nova.compute.manager [req-9ed5df97-d8bd-467a-ae06-7e479256f194 req-92ea969f-780e-41e7-b205-be79b135c270 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] No waiting events found dispatching network-vif-plugged-29881f52-aa42-4a78-a87b-06e906811ff2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:16:47 compute-0 nova_compute[187185]: 2025-11-29 07:16:47.068 187189 WARNING nova.compute.manager [req-9ed5df97-d8bd-467a-ae06-7e479256f194 req-92ea969f-780e-41e7-b205-be79b135c270 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Received unexpected event network-vif-plugged-29881f52-aa42-4a78-a87b-06e906811ff2 for instance with vm_state active and task_state migrating.
Nov 29 07:16:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:16:47.995 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:16:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:16:47.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:16:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:16:47.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:16:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:16:47.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:16:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:16:47.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:16:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:16:47.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:16:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:16:47.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:16:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:16:47.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:16:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:16:47.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:16:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:16:47.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:16:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:16:47.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:16:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:16:47.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:16:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:16:47.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:16:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:16:47.997 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:16:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:16:47.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:16:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:16:47.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:16:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:16:47.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:16:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:16:47.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:16:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:16:47.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:16:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:16:47.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:16:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:16:47.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:16:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:16:47.998 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:16:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:16:47.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:16:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:16:47.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:16:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:16:47.999 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:16:48 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:16:48.306 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '33'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:16:49 compute-0 nova_compute[187185]: 2025-11-29 07:16:49.226 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:16:50 compute-0 nova_compute[187185]: 2025-11-29 07:16:50.451 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:16:50 compute-0 podman[229750]: 2025-11-29 07:16:50.819864728 +0000 UTC m=+0.070454860 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 07:16:51 compute-0 nova_compute[187185]: 2025-11-29 07:16:51.146 187189 DEBUG oslo_concurrency.lockutils [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Acquiring lock "704c4aa7-3239-4ecc-bfdc-c72642678363-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:16:51 compute-0 nova_compute[187185]: 2025-11-29 07:16:51.147 187189 DEBUG oslo_concurrency.lockutils [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Lock "704c4aa7-3239-4ecc-bfdc-c72642678363-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:16:51 compute-0 nova_compute[187185]: 2025-11-29 07:16:51.147 187189 DEBUG oslo_concurrency.lockutils [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Lock "704c4aa7-3239-4ecc-bfdc-c72642678363-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:16:51 compute-0 nova_compute[187185]: 2025-11-29 07:16:51.173 187189 DEBUG oslo_concurrency.lockutils [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:16:51 compute-0 nova_compute[187185]: 2025-11-29 07:16:51.173 187189 DEBUG oslo_concurrency.lockutils [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:16:51 compute-0 nova_compute[187185]: 2025-11-29 07:16:51.174 187189 DEBUG oslo_concurrency.lockutils [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:16:51 compute-0 nova_compute[187185]: 2025-11-29 07:16:51.174 187189 DEBUG nova.compute.resource_tracker [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:16:51 compute-0 nova_compute[187185]: 2025-11-29 07:16:51.357 187189 WARNING nova.virt.libvirt.driver [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:16:51 compute-0 nova_compute[187185]: 2025-11-29 07:16:51.358 187189 DEBUG nova.compute.resource_tracker [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5743MB free_disk=73.29463958740234GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:16:51 compute-0 nova_compute[187185]: 2025-11-29 07:16:51.358 187189 DEBUG oslo_concurrency.lockutils [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:16:51 compute-0 nova_compute[187185]: 2025-11-29 07:16:51.358 187189 DEBUG oslo_concurrency.lockutils [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:16:51 compute-0 nova_compute[187185]: 2025-11-29 07:16:51.836 187189 DEBUG nova.compute.resource_tracker [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Migration for instance 704c4aa7-3239-4ecc-bfdc-c72642678363 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Nov 29 07:16:52 compute-0 nova_compute[187185]: 2025-11-29 07:16:52.363 187189 DEBUG nova.compute.resource_tracker [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Nov 29 07:16:52 compute-0 nova_compute[187185]: 2025-11-29 07:16:52.386 187189 DEBUG nova.compute.resource_tracker [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Migration d11e3672-6cff-4636-96b9-c3cf40816fbe is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Nov 29 07:16:52 compute-0 nova_compute[187185]: 2025-11-29 07:16:52.386 187189 DEBUG nova.compute.resource_tracker [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:16:52 compute-0 nova_compute[187185]: 2025-11-29 07:16:52.387 187189 DEBUG nova.compute.resource_tracker [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:16:52 compute-0 nova_compute[187185]: 2025-11-29 07:16:52.429 187189 DEBUG nova.compute.provider_tree [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:16:52 compute-0 nova_compute[187185]: 2025-11-29 07:16:52.760 187189 DEBUG nova.scheduler.client.report [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:16:52 compute-0 podman[229775]: 2025-11-29 07:16:52.791565474 +0000 UTC m=+0.055267689 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 07:16:52 compute-0 nova_compute[187185]: 2025-11-29 07:16:52.796 187189 DEBUG nova.compute.resource_tracker [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:16:52 compute-0 nova_compute[187185]: 2025-11-29 07:16:52.797 187189 DEBUG oslo_concurrency.lockutils [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.438s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:16:52 compute-0 podman[229776]: 2025-11-29 07:16:52.804167642 +0000 UTC m=+0.061152846 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:16:52 compute-0 nova_compute[187185]: 2025-11-29 07:16:52.812 187189 INFO nova.compute.manager [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Migrating instance to compute-2.ctlplane.example.com finished successfully.
Nov 29 07:16:52 compute-0 nova_compute[187185]: 2025-11-29 07:16:52.904 187189 INFO nova.scheduler.client.report [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Deleted allocation for migration d11e3672-6cff-4636-96b9-c3cf40816fbe
Nov 29 07:16:52 compute-0 nova_compute[187185]: 2025-11-29 07:16:52.905 187189 DEBUG nova.virt.libvirt.driver [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Nov 29 07:16:54 compute-0 nova_compute[187185]: 2025-11-29 07:16:54.228 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:16:54 compute-0 nova_compute[187185]: 2025-11-29 07:16:54.834 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400599.8327742, 704c4aa7-3239-4ecc-bfdc-c72642678363 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:16:54 compute-0 nova_compute[187185]: 2025-11-29 07:16:54.835 187189 INFO nova.compute.manager [-] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] VM Stopped (Lifecycle Event)
Nov 29 07:16:54 compute-0 nova_compute[187185]: 2025-11-29 07:16:54.877 187189 DEBUG nova.compute.manager [None req-e607acd7-bfa1-4b52-a242-051fbe99afe2 - - - - - -] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:16:55 compute-0 nova_compute[187185]: 2025-11-29 07:16:55.453 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:16:59 compute-0 nova_compute[187185]: 2025-11-29 07:16:59.230 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:00 compute-0 nova_compute[187185]: 2025-11-29 07:17:00.455 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:01 compute-0 anacron[29976]: Job `cron.monthly' started
Nov 29 07:17:01 compute-0 anacron[29976]: Job `cron.monthly' terminated
Nov 29 07:17:01 compute-0 anacron[29976]: Normal exit (3 jobs run)
Nov 29 07:17:04 compute-0 nova_compute[187185]: 2025-11-29 07:17:04.233 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:04 compute-0 nova_compute[187185]: 2025-11-29 07:17:04.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:17:04 compute-0 nova_compute[187185]: 2025-11-29 07:17:04.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:17:04 compute-0 nova_compute[187185]: 2025-11-29 07:17:04.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:17:04 compute-0 nova_compute[187185]: 2025-11-29 07:17:04.344 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 07:17:04 compute-0 nova_compute[187185]: 2025-11-29 07:17:04.472 187189 DEBUG oslo_concurrency.lockutils [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "bb1bd9c2-1ccf-4021-b983-63a50858328f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:17:04 compute-0 nova_compute[187185]: 2025-11-29 07:17:04.473 187189 DEBUG oslo_concurrency.lockutils [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "bb1bd9c2-1ccf-4021-b983-63a50858328f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:17:04 compute-0 nova_compute[187185]: 2025-11-29 07:17:04.514 187189 DEBUG nova.compute.manager [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 07:17:04 compute-0 nova_compute[187185]: 2025-11-29 07:17:04.652 187189 DEBUG oslo_concurrency.lockutils [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:17:04 compute-0 nova_compute[187185]: 2025-11-29 07:17:04.652 187189 DEBUG oslo_concurrency.lockutils [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:17:04 compute-0 nova_compute[187185]: 2025-11-29 07:17:04.662 187189 DEBUG nova.virt.hardware [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 07:17:04 compute-0 nova_compute[187185]: 2025-11-29 07:17:04.662 187189 INFO nova.compute.claims [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Claim successful on node compute-0.ctlplane.example.com
Nov 29 07:17:04 compute-0 nova_compute[187185]: 2025-11-29 07:17:04.846 187189 DEBUG nova.compute.provider_tree [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:17:04 compute-0 nova_compute[187185]: 2025-11-29 07:17:04.866 187189 DEBUG nova.scheduler.client.report [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:17:04 compute-0 nova_compute[187185]: 2025-11-29 07:17:04.895 187189 DEBUG oslo_concurrency.lockutils [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.242s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:17:04 compute-0 nova_compute[187185]: 2025-11-29 07:17:04.896 187189 DEBUG nova.compute.manager [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 07:17:05 compute-0 nova_compute[187185]: 2025-11-29 07:17:05.004 187189 DEBUG nova.compute.manager [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 07:17:05 compute-0 nova_compute[187185]: 2025-11-29 07:17:05.005 187189 DEBUG nova.network.neutron [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 07:17:05 compute-0 nova_compute[187185]: 2025-11-29 07:17:05.038 187189 INFO nova.virt.libvirt.driver [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 07:17:05 compute-0 nova_compute[187185]: 2025-11-29 07:17:05.060 187189 DEBUG nova.compute.manager [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 07:17:05 compute-0 nova_compute[187185]: 2025-11-29 07:17:05.203 187189 DEBUG nova.compute.manager [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 07:17:05 compute-0 nova_compute[187185]: 2025-11-29 07:17:05.205 187189 DEBUG nova.virt.libvirt.driver [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 07:17:05 compute-0 nova_compute[187185]: 2025-11-29 07:17:05.205 187189 INFO nova.virt.libvirt.driver [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Creating image(s)
Nov 29 07:17:05 compute-0 nova_compute[187185]: 2025-11-29 07:17:05.207 187189 DEBUG oslo_concurrency.lockutils [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "/var/lib/nova/instances/bb1bd9c2-1ccf-4021-b983-63a50858328f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:17:05 compute-0 nova_compute[187185]: 2025-11-29 07:17:05.207 187189 DEBUG oslo_concurrency.lockutils [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "/var/lib/nova/instances/bb1bd9c2-1ccf-4021-b983-63a50858328f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:17:05 compute-0 nova_compute[187185]: 2025-11-29 07:17:05.208 187189 DEBUG oslo_concurrency.lockutils [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "/var/lib/nova/instances/bb1bd9c2-1ccf-4021-b983-63a50858328f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:17:05 compute-0 nova_compute[187185]: 2025-11-29 07:17:05.236 187189 DEBUG nova.policy [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ee2d4931cb504b13b92a2f52c95c05ce', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 07:17:05 compute-0 nova_compute[187185]: 2025-11-29 07:17:05.241 187189 DEBUG oslo_concurrency.processutils [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:17:05 compute-0 nova_compute[187185]: 2025-11-29 07:17:05.301 187189 DEBUG oslo_concurrency.processutils [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:17:05 compute-0 nova_compute[187185]: 2025-11-29 07:17:05.303 187189 DEBUG oslo_concurrency.lockutils [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:17:05 compute-0 nova_compute[187185]: 2025-11-29 07:17:05.304 187189 DEBUG oslo_concurrency.lockutils [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:17:05 compute-0 nova_compute[187185]: 2025-11-29 07:17:05.328 187189 DEBUG oslo_concurrency.processutils [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:17:05 compute-0 nova_compute[187185]: 2025-11-29 07:17:05.401 187189 DEBUG oslo_concurrency.processutils [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:17:05 compute-0 nova_compute[187185]: 2025-11-29 07:17:05.403 187189 DEBUG oslo_concurrency.processutils [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/bb1bd9c2-1ccf-4021-b983-63a50858328f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:17:05 compute-0 nova_compute[187185]: 2025-11-29 07:17:05.459 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:05 compute-0 nova_compute[187185]: 2025-11-29 07:17:05.464 187189 DEBUG oslo_concurrency.processutils [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/bb1bd9c2-1ccf-4021-b983-63a50858328f/disk 1073741824" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:17:05 compute-0 nova_compute[187185]: 2025-11-29 07:17:05.465 187189 DEBUG oslo_concurrency.lockutils [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.161s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:17:05 compute-0 nova_compute[187185]: 2025-11-29 07:17:05.466 187189 DEBUG oslo_concurrency.processutils [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:17:05 compute-0 nova_compute[187185]: 2025-11-29 07:17:05.556 187189 DEBUG oslo_concurrency.processutils [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:17:05 compute-0 nova_compute[187185]: 2025-11-29 07:17:05.558 187189 DEBUG nova.virt.disk.api [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Checking if we can resize image /var/lib/nova/instances/bb1bd9c2-1ccf-4021-b983-63a50858328f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:17:05 compute-0 nova_compute[187185]: 2025-11-29 07:17:05.558 187189 DEBUG oslo_concurrency.processutils [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bb1bd9c2-1ccf-4021-b983-63a50858328f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:17:05 compute-0 nova_compute[187185]: 2025-11-29 07:17:05.623 187189 DEBUG oslo_concurrency.processutils [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bb1bd9c2-1ccf-4021-b983-63a50858328f/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:17:05 compute-0 nova_compute[187185]: 2025-11-29 07:17:05.624 187189 DEBUG nova.virt.disk.api [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Cannot resize image /var/lib/nova/instances/bb1bd9c2-1ccf-4021-b983-63a50858328f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:17:05 compute-0 nova_compute[187185]: 2025-11-29 07:17:05.625 187189 DEBUG nova.objects.instance [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lazy-loading 'migration_context' on Instance uuid bb1bd9c2-1ccf-4021-b983-63a50858328f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:17:05 compute-0 podman[229832]: 2025-11-29 07:17:05.823349055 +0000 UTC m=+0.070257164 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 07:17:05 compute-0 podman[229830]: 2025-11-29 07:17:05.827687158 +0000 UTC m=+0.080384421 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 29 07:17:05 compute-0 podman[229831]: 2025-11-29 07:17:05.841646434 +0000 UTC m=+0.090943561 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, config_id=edpm, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, distribution-scope=public)
Nov 29 07:17:06 compute-0 nova_compute[187185]: 2025-11-29 07:17:06.075 187189 DEBUG nova.virt.libvirt.driver [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 07:17:06 compute-0 nova_compute[187185]: 2025-11-29 07:17:06.076 187189 DEBUG nova.virt.libvirt.driver [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Ensure instance console log exists: /var/lib/nova/instances/bb1bd9c2-1ccf-4021-b983-63a50858328f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 07:17:06 compute-0 nova_compute[187185]: 2025-11-29 07:17:06.077 187189 DEBUG oslo_concurrency.lockutils [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:17:06 compute-0 nova_compute[187185]: 2025-11-29 07:17:06.078 187189 DEBUG oslo_concurrency.lockutils [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:17:06 compute-0 nova_compute[187185]: 2025-11-29 07:17:06.078 187189 DEBUG oslo_concurrency.lockutils [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:17:06 compute-0 nova_compute[187185]: 2025-11-29 07:17:06.300 187189 DEBUG nova.network.neutron [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Successfully created port: 4138daf0-53ec-4cf3-ad1f-cb966c2e96a3 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 07:17:07 compute-0 nova_compute[187185]: 2025-11-29 07:17:07.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:17:07 compute-0 nova_compute[187185]: 2025-11-29 07:17:07.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:17:07 compute-0 nova_compute[187185]: 2025-11-29 07:17:07.664 187189 DEBUG nova.network.neutron [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Successfully updated port: 4138daf0-53ec-4cf3-ad1f-cb966c2e96a3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 07:17:08 compute-0 nova_compute[187185]: 2025-11-29 07:17:08.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:17:08 compute-0 nova_compute[187185]: 2025-11-29 07:17:08.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:17:08 compute-0 nova_compute[187185]: 2025-11-29 07:17:08.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:17:08 compute-0 nova_compute[187185]: 2025-11-29 07:17:08.380 187189 DEBUG oslo_concurrency.lockutils [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "refresh_cache-bb1bd9c2-1ccf-4021-b983-63a50858328f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:17:08 compute-0 nova_compute[187185]: 2025-11-29 07:17:08.380 187189 DEBUG oslo_concurrency.lockutils [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquired lock "refresh_cache-bb1bd9c2-1ccf-4021-b983-63a50858328f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:17:08 compute-0 nova_compute[187185]: 2025-11-29 07:17:08.381 187189 DEBUG nova.network.neutron [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:17:08 compute-0 nova_compute[187185]: 2025-11-29 07:17:08.448 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:17:08 compute-0 nova_compute[187185]: 2025-11-29 07:17:08.449 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:17:08 compute-0 nova_compute[187185]: 2025-11-29 07:17:08.450 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:17:08 compute-0 nova_compute[187185]: 2025-11-29 07:17:08.450 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:17:08 compute-0 nova_compute[187185]: 2025-11-29 07:17:08.578 187189 DEBUG nova.compute.manager [req-2c236dab-c0fa-4be3-a54f-8c7c810df2d5 req-e858235e-36d0-42b2-ab5c-d7a40c86de78 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Received event network-changed-4138daf0-53ec-4cf3-ad1f-cb966c2e96a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:17:08 compute-0 nova_compute[187185]: 2025-11-29 07:17:08.579 187189 DEBUG nova.compute.manager [req-2c236dab-c0fa-4be3-a54f-8c7c810df2d5 req-e858235e-36d0-42b2-ab5c-d7a40c86de78 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Refreshing instance network info cache due to event network-changed-4138daf0-53ec-4cf3-ad1f-cb966c2e96a3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:17:08 compute-0 nova_compute[187185]: 2025-11-29 07:17:08.580 187189 DEBUG oslo_concurrency.lockutils [req-2c236dab-c0fa-4be3-a54f-8c7c810df2d5 req-e858235e-36d0-42b2-ab5c-d7a40c86de78 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-bb1bd9c2-1ccf-4021-b983-63a50858328f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:17:08 compute-0 nova_compute[187185]: 2025-11-29 07:17:08.687 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:17:08 compute-0 nova_compute[187185]: 2025-11-29 07:17:08.688 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5741MB free_disk=73.29442596435547GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:17:08 compute-0 nova_compute[187185]: 2025-11-29 07:17:08.688 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:17:08 compute-0 nova_compute[187185]: 2025-11-29 07:17:08.688 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:17:08 compute-0 nova_compute[187185]: 2025-11-29 07:17:08.862 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance bb1bd9c2-1ccf-4021-b983-63a50858328f actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 07:17:08 compute-0 nova_compute[187185]: 2025-11-29 07:17:08.862 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:17:08 compute-0 nova_compute[187185]: 2025-11-29 07:17:08.863 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:17:08 compute-0 nova_compute[187185]: 2025-11-29 07:17:08.954 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:17:08 compute-0 nova_compute[187185]: 2025-11-29 07:17:08.958 187189 DEBUG nova.network.neutron [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:17:08 compute-0 nova_compute[187185]: 2025-11-29 07:17:08.991 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:17:09 compute-0 nova_compute[187185]: 2025-11-29 07:17:09.045 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:17:09 compute-0 nova_compute[187185]: 2025-11-29 07:17:09.045 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.357s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:17:09 compute-0 nova_compute[187185]: 2025-11-29 07:17:09.236 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.040 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.041 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.041 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.042 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.422 187189 DEBUG nova.network.neutron [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Updating instance_info_cache with network_info: [{"id": "4138daf0-53ec-4cf3-ad1f-cb966c2e96a3", "address": "fa:16:3e:bd:1e:b0", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4138daf0-53", "ovs_interfaceid": "4138daf0-53ec-4cf3-ad1f-cb966c2e96a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.461 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.497 187189 DEBUG oslo_concurrency.lockutils [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Releasing lock "refresh_cache-bb1bd9c2-1ccf-4021-b983-63a50858328f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.498 187189 DEBUG nova.compute.manager [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Instance network_info: |[{"id": "4138daf0-53ec-4cf3-ad1f-cb966c2e96a3", "address": "fa:16:3e:bd:1e:b0", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4138daf0-53", "ovs_interfaceid": "4138daf0-53ec-4cf3-ad1f-cb966c2e96a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.498 187189 DEBUG oslo_concurrency.lockutils [req-2c236dab-c0fa-4be3-a54f-8c7c810df2d5 req-e858235e-36d0-42b2-ab5c-d7a40c86de78 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-bb1bd9c2-1ccf-4021-b983-63a50858328f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.499 187189 DEBUG nova.network.neutron [req-2c236dab-c0fa-4be3-a54f-8c7c810df2d5 req-e858235e-36d0-42b2-ab5c-d7a40c86de78 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Refreshing network info cache for port 4138daf0-53ec-4cf3-ad1f-cb966c2e96a3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.505 187189 DEBUG nova.virt.libvirt.driver [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Start _get_guest_xml network_info=[{"id": "4138daf0-53ec-4cf3-ad1f-cb966c2e96a3", "address": "fa:16:3e:bd:1e:b0", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4138daf0-53", "ovs_interfaceid": "4138daf0-53ec-4cf3-ad1f-cb966c2e96a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.512 187189 WARNING nova.virt.libvirt.driver [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.517 187189 DEBUG nova.virt.libvirt.host [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.517 187189 DEBUG nova.virt.libvirt.host [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.523 187189 DEBUG nova.virt.libvirt.host [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.523 187189 DEBUG nova.virt.libvirt.host [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.525 187189 DEBUG nova.virt.libvirt.driver [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.525 187189 DEBUG nova.virt.hardware [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.525 187189 DEBUG nova.virt.hardware [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.525 187189 DEBUG nova.virt.hardware [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.526 187189 DEBUG nova.virt.hardware [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.526 187189 DEBUG nova.virt.hardware [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.526 187189 DEBUG nova.virt.hardware [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.526 187189 DEBUG nova.virt.hardware [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.526 187189 DEBUG nova.virt.hardware [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.527 187189 DEBUG nova.virt.hardware [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.527 187189 DEBUG nova.virt.hardware [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.527 187189 DEBUG nova.virt.hardware [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.532 187189 DEBUG nova.virt.libvirt.vif [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:17:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1640649360',display_name='tempest-ServerActionsTestOtherB-server-1640649360',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1640649360',id=105,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='32e51e3a9a8f4a1ca6e022735ebf5f7b',ramdisk_id='',reservation_id='r-nwwrzkpv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1538648925',owner_user_name='tempest-ServerActionsTestOtherB-1538648925-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:17:05Z,user_data=None,user_id='ee2d4931cb504b13b92a2f52c95c05ce',uuid=bb1bd9c2-1ccf-4021-b983-63a50858328f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4138daf0-53ec-4cf3-ad1f-cb966c2e96a3", "address": "fa:16:3e:bd:1e:b0", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4138daf0-53", "ovs_interfaceid": "4138daf0-53ec-4cf3-ad1f-cb966c2e96a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.532 187189 DEBUG nova.network.os_vif_util [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Converting VIF {"id": "4138daf0-53ec-4cf3-ad1f-cb966c2e96a3", "address": "fa:16:3e:bd:1e:b0", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4138daf0-53", "ovs_interfaceid": "4138daf0-53ec-4cf3-ad1f-cb966c2e96a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.533 187189 DEBUG nova.network.os_vif_util [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bd:1e:b0,bridge_name='br-int',has_traffic_filtering=True,id=4138daf0-53ec-4cf3-ad1f-cb966c2e96a3,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4138daf0-53') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.534 187189 DEBUG nova.objects.instance [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lazy-loading 'pci_devices' on Instance uuid bb1bd9c2-1ccf-4021-b983-63a50858328f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.555 187189 DEBUG nova.virt.libvirt.driver [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:17:10 compute-0 nova_compute[187185]:   <uuid>bb1bd9c2-1ccf-4021-b983-63a50858328f</uuid>
Nov 29 07:17:10 compute-0 nova_compute[187185]:   <name>instance-00000069</name>
Nov 29 07:17:10 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 07:17:10 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:17:10 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:17:10 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:       <nova:name>tempest-ServerActionsTestOtherB-server-1640649360</nova:name>
Nov 29 07:17:10 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:17:10</nova:creationTime>
Nov 29 07:17:10 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 07:17:10 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 07:17:10 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:17:10 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:17:10 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:17:10 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:17:10 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:17:10 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:17:10 compute-0 nova_compute[187185]:         <nova:user uuid="ee2d4931cb504b13b92a2f52c95c05ce">tempest-ServerActionsTestOtherB-1538648925-project-member</nova:user>
Nov 29 07:17:10 compute-0 nova_compute[187185]:         <nova:project uuid="32e51e3a9a8f4a1ca6e022735ebf5f7b">tempest-ServerActionsTestOtherB-1538648925</nova:project>
Nov 29 07:17:10 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:17:10 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 07:17:10 compute-0 nova_compute[187185]:         <nova:port uuid="4138daf0-53ec-4cf3-ad1f-cb966c2e96a3">
Nov 29 07:17:10 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:17:10 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:17:10 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:17:10 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <system>
Nov 29 07:17:10 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:17:10 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:17:10 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:17:10 compute-0 nova_compute[187185]:       <entry name="serial">bb1bd9c2-1ccf-4021-b983-63a50858328f</entry>
Nov 29 07:17:10 compute-0 nova_compute[187185]:       <entry name="uuid">bb1bd9c2-1ccf-4021-b983-63a50858328f</entry>
Nov 29 07:17:10 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     </system>
Nov 29 07:17:10 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:17:10 compute-0 nova_compute[187185]:   <os>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:   </os>
Nov 29 07:17:10 compute-0 nova_compute[187185]:   <features>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:   </features>
Nov 29 07:17:10 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:17:10 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:17:10 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:17:10 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/bb1bd9c2-1ccf-4021-b983-63a50858328f/disk"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:17:10 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/bb1bd9c2-1ccf-4021-b983-63a50858328f/disk.config"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:17:10 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:bd:1e:b0"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:       <target dev="tap4138daf0-53"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:17:10 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/bb1bd9c2-1ccf-4021-b983-63a50858328f/console.log" append="off"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <video>
Nov 29 07:17:10 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     </video>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:17:10 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:17:10 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:17:10 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:17:10 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:17:10 compute-0 nova_compute[187185]: </domain>
Nov 29 07:17:10 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.557 187189 DEBUG nova.compute.manager [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Preparing to wait for external event network-vif-plugged-4138daf0-53ec-4cf3-ad1f-cb966c2e96a3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.558 187189 DEBUG oslo_concurrency.lockutils [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "bb1bd9c2-1ccf-4021-b983-63a50858328f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.558 187189 DEBUG oslo_concurrency.lockutils [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "bb1bd9c2-1ccf-4021-b983-63a50858328f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.559 187189 DEBUG oslo_concurrency.lockutils [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "bb1bd9c2-1ccf-4021-b983-63a50858328f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.560 187189 DEBUG nova.virt.libvirt.vif [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:17:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1640649360',display_name='tempest-ServerActionsTestOtherB-server-1640649360',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1640649360',id=105,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='32e51e3a9a8f4a1ca6e022735ebf5f7b',ramdisk_id='',reservation_id='r-nwwrzkpv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1538648925',owner_user_name='tempest-ServerActionsTestOtherB-1538648925-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:17:05Z,user_data=None,user_id='ee2d4931cb504b13b92a2f52c95c05ce',uuid=bb1bd9c2-1ccf-4021-b983-63a50858328f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4138daf0-53ec-4cf3-ad1f-cb966c2e96a3", "address": "fa:16:3e:bd:1e:b0", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4138daf0-53", "ovs_interfaceid": "4138daf0-53ec-4cf3-ad1f-cb966c2e96a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.561 187189 DEBUG nova.network.os_vif_util [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Converting VIF {"id": "4138daf0-53ec-4cf3-ad1f-cb966c2e96a3", "address": "fa:16:3e:bd:1e:b0", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4138daf0-53", "ovs_interfaceid": "4138daf0-53ec-4cf3-ad1f-cb966c2e96a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.562 187189 DEBUG nova.network.os_vif_util [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bd:1e:b0,bridge_name='br-int',has_traffic_filtering=True,id=4138daf0-53ec-4cf3-ad1f-cb966c2e96a3,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4138daf0-53') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.562 187189 DEBUG os_vif [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bd:1e:b0,bridge_name='br-int',has_traffic_filtering=True,id=4138daf0-53ec-4cf3-ad1f-cb966c2e96a3,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4138daf0-53') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.563 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.564 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.565 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.569 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.570 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4138daf0-53, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.571 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4138daf0-53, col_values=(('external_ids', {'iface-id': '4138daf0-53ec-4cf3-ad1f-cb966c2e96a3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bd:1e:b0', 'vm-uuid': 'bb1bd9c2-1ccf-4021-b983-63a50858328f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:17:10 compute-0 NetworkManager[55227]: <info>  [1764400630.5753] manager: (tap4138daf0-53): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/149)
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.577 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.583 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.585 187189 INFO os_vif [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bd:1e:b0,bridge_name='br-int',has_traffic_filtering=True,id=4138daf0-53ec-4cf3-ad1f-cb966c2e96a3,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4138daf0-53')
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.766 187189 DEBUG oslo_concurrency.lockutils [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Acquiring lock "1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.767 187189 DEBUG oslo_concurrency.lockutils [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.780 187189 DEBUG nova.virt.libvirt.driver [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.781 187189 DEBUG nova.virt.libvirt.driver [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.781 187189 DEBUG nova.virt.libvirt.driver [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] No VIF found with MAC fa:16:3e:bd:1e:b0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.782 187189 INFO nova.virt.libvirt.driver [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Using config drive
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.784 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:10 compute-0 nova_compute[187185]: 2025-11-29 07:17:10.921 187189 DEBUG nova.compute.manager [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 07:17:11 compute-0 nova_compute[187185]: 2025-11-29 07:17:11.034 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:11 compute-0 nova_compute[187185]: 2025-11-29 07:17:11.335 187189 DEBUG oslo_concurrency.lockutils [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:17:11 compute-0 nova_compute[187185]: 2025-11-29 07:17:11.335 187189 DEBUG oslo_concurrency.lockutils [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:17:11 compute-0 nova_compute[187185]: 2025-11-29 07:17:11.342 187189 DEBUG nova.virt.hardware [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 07:17:11 compute-0 nova_compute[187185]: 2025-11-29 07:17:11.342 187189 INFO nova.compute.claims [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Claim successful on node compute-0.ctlplane.example.com
Nov 29 07:17:11 compute-0 nova_compute[187185]: 2025-11-29 07:17:11.810 187189 DEBUG nova.compute.provider_tree [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:17:11 compute-0 nova_compute[187185]: 2025-11-29 07:17:11.934 187189 INFO nova.virt.libvirt.driver [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Creating config drive at /var/lib/nova/instances/bb1bd9c2-1ccf-4021-b983-63a50858328f/disk.config
Nov 29 07:17:11 compute-0 nova_compute[187185]: 2025-11-29 07:17:11.940 187189 DEBUG oslo_concurrency.processutils [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bb1bd9c2-1ccf-4021-b983-63a50858328f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2jfr17tz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:17:12 compute-0 nova_compute[187185]: 2025-11-29 07:17:12.080 187189 DEBUG oslo_concurrency.processutils [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bb1bd9c2-1ccf-4021-b983-63a50858328f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2jfr17tz" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:17:12 compute-0 kernel: tap4138daf0-53: entered promiscuous mode
Nov 29 07:17:12 compute-0 NetworkManager[55227]: <info>  [1764400632.1753] manager: (tap4138daf0-53): new Tun device (/org/freedesktop/NetworkManager/Devices/150)
Nov 29 07:17:12 compute-0 nova_compute[187185]: 2025-11-29 07:17:12.177 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:12 compute-0 ovn_controller[95281]: 2025-11-29T07:17:12Z|00277|binding|INFO|Claiming lport 4138daf0-53ec-4cf3-ad1f-cb966c2e96a3 for this chassis.
Nov 29 07:17:12 compute-0 ovn_controller[95281]: 2025-11-29T07:17:12Z|00278|binding|INFO|4138daf0-53ec-4cf3-ad1f-cb966c2e96a3: Claiming fa:16:3e:bd:1e:b0 10.100.0.6
Nov 29 07:17:12 compute-0 nova_compute[187185]: 2025-11-29 07:17:12.182 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:12 compute-0 nova_compute[187185]: 2025-11-29 07:17:12.187 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:12 compute-0 nova_compute[187185]: 2025-11-29 07:17:12.194 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:12 compute-0 nova_compute[187185]: 2025-11-29 07:17:12.200 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:12 compute-0 NetworkManager[55227]: <info>  [1764400632.2013] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/151)
Nov 29 07:17:12 compute-0 NetworkManager[55227]: <info>  [1764400632.2020] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/152)
Nov 29 07:17:12 compute-0 systemd-udevd[229910]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:17:12 compute-0 NetworkManager[55227]: <info>  [1764400632.2279] device (tap4138daf0-53): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:17:12 compute-0 NetworkManager[55227]: <info>  [1764400632.2296] device (tap4138daf0-53): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:17:12 compute-0 systemd-machined[153486]: New machine qemu-38-instance-00000069.
Nov 29 07:17:12 compute-0 systemd[1]: Started Virtual Machine qemu-38-instance-00000069.
Nov 29 07:17:12 compute-0 nova_compute[187185]: 2025-11-29 07:17:12.389 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:12 compute-0 nova_compute[187185]: 2025-11-29 07:17:12.402 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:12 compute-0 nova_compute[187185]: 2025-11-29 07:17:12.540 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400632.5402467, bb1bd9c2-1ccf-4021-b983-63a50858328f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:17:12 compute-0 nova_compute[187185]: 2025-11-29 07:17:12.541 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] VM Started (Lifecycle Event)
Nov 29 07:17:13 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:13.738 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bd:1e:b0 10.100.0.6'], port_security=['fa:16:3e:bd:1e:b0 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'bb1bd9c2-1ccf-4021-b983-63a50858328f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8547e4c2-e200-4173-9eba-476619f06150', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=04b58113-8105-402c-a103-4692d3989228, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=4138daf0-53ec-4cf3-ad1f-cb966c2e96a3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:17:13 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:13.740 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 4138daf0-53ec-4cf3-ad1f-cb966c2e96a3 in datapath df7cfc35-3f76-45b2-b70c-e4525d38f410 bound to our chassis
Nov 29 07:17:13 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:13.742 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network df7cfc35-3f76-45b2-b70c-e4525d38f410
Nov 29 07:17:13 compute-0 nova_compute[187185]: 2025-11-29 07:17:13.750 187189 DEBUG nova.scheduler.client.report [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:17:13 compute-0 ovn_controller[95281]: 2025-11-29T07:17:13Z|00279|binding|INFO|Setting lport 4138daf0-53ec-4cf3-ad1f-cb966c2e96a3 ovn-installed in OVS
Nov 29 07:17:13 compute-0 ovn_controller[95281]: 2025-11-29T07:17:13Z|00280|binding|INFO|Setting lport 4138daf0-53ec-4cf3-ad1f-cb966c2e96a3 up in Southbound
Nov 29 07:17:13 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:13.772 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[ff889a97-78a6-471b-a17b-8f2033bc3ddb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:13 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:13.773 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdf7cfc35-31 in ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:17:13 compute-0 nova_compute[187185]: 2025-11-29 07:17:13.774 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:13 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:13.779 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdf7cfc35-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:17:13 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:13.779 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6df21a5d-6890-4ee6-920e-2d6a5df245e1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:13 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:13.780 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[f73dac46-5322-487d-88bf-be954958bbc9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:13 compute-0 nova_compute[187185]: 2025-11-29 07:17:13.782 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:13 compute-0 nova_compute[187185]: 2025-11-29 07:17:13.797 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:17:13 compute-0 nova_compute[187185]: 2025-11-29 07:17:13.803 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400632.5404081, bb1bd9c2-1ccf-4021-b983-63a50858328f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:17:13 compute-0 nova_compute[187185]: 2025-11-29 07:17:13.803 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] VM Paused (Lifecycle Event)
Nov 29 07:17:13 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:13.804 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[c40d0bc0-3830-45b8-b312-a97b5a42ea82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:13 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:13.822 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d1510465-5a3b-4f63-9217-b15bbc87dc26]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:13 compute-0 nova_compute[187185]: 2025-11-29 07:17:13.833 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:17:13 compute-0 nova_compute[187185]: 2025-11-29 07:17:13.836 187189 DEBUG oslo_concurrency.lockutils [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.501s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:17:13 compute-0 nova_compute[187185]: 2025-11-29 07:17:13.837 187189 DEBUG nova.compute.manager [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 07:17:13 compute-0 nova_compute[187185]: 2025-11-29 07:17:13.841 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:17:13 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:13.871 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[279abd89-61dd-4ba4-9f08-85b5be1bb8b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:13 compute-0 NetworkManager[55227]: <info>  [1764400633.8844] manager: (tapdf7cfc35-30): new Veth device (/org/freedesktop/NetworkManager/Devices/153)
Nov 29 07:17:13 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:13.884 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[79e66c0a-8abc-4698-94b9-43973f702027]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:13 compute-0 nova_compute[187185]: 2025-11-29 07:17:13.894 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:17:13 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:13.935 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[43b41890-de7e-444a-9575-a5529b2f48aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:13 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:13.940 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[f166dfe9-f098-4db2-a467-94e53d1f7d90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:13 compute-0 NetworkManager[55227]: <info>  [1764400633.9702] device (tapdf7cfc35-30): carrier: link connected
Nov 29 07:17:13 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:13.979 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[797be8bb-f64b-4223-8ed6-140fddc7b8a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:14.006 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[51995c58-5c68-4e6c-936c-212334979b67]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdf7cfc35-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:ae:b6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 91], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 608163, 'reachable_time': 44292, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229951, 'error': None, 'target': 'ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:14.029 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[b5bb5398-8040-49a6-87f5-96efbeae539b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe84:aeb6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 608163, 'tstamp': 608163}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229952, 'error': None, 'target': 'ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:14.048 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[f5b73b21-b41e-4b19-bed5-fea51669b701]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdf7cfc35-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:ae:b6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 91], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 608163, 'reachable_time': 44292, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229953, 'error': None, 'target': 'ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:14.085 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[998fdef9-5529-4d5e-9d04-f87bf03cb67a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:14.148 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[44cbccd0-e8da-4975-8036-bb66088b19a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:14.150 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdf7cfc35-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:17:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:14.150 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:17:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:14.150 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdf7cfc35-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:17:14 compute-0 nova_compute[187185]: 2025-11-29 07:17:14.160 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:14 compute-0 NetworkManager[55227]: <info>  [1764400634.1605] manager: (tapdf7cfc35-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/154)
Nov 29 07:17:14 compute-0 kernel: tapdf7cfc35-30: entered promiscuous mode
Nov 29 07:17:14 compute-0 nova_compute[187185]: 2025-11-29 07:17:14.164 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:14.165 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdf7cfc35-30, col_values=(('external_ids', {'iface-id': 'cab31803-36dd-4107-bb9e-3d36862142c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:17:14 compute-0 nova_compute[187185]: 2025-11-29 07:17:14.167 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:14 compute-0 ovn_controller[95281]: 2025-11-29T07:17:14Z|00281|binding|INFO|Releasing lport cab31803-36dd-4107-bb9e-3d36862142c0 from this chassis (sb_readonly=0)
Nov 29 07:17:14 compute-0 nova_compute[187185]: 2025-11-29 07:17:14.192 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:14.194 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/df7cfc35-3f76-45b2-b70c-e4525d38f410.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/df7cfc35-3f76-45b2-b70c-e4525d38f410.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:17:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:14.195 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[4c6335db-b6d8-41a9-a424-3b2ae82f8746]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:14.196 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:17:14 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:17:14 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:17:14 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-df7cfc35-3f76-45b2-b70c-e4525d38f410
Nov 29 07:17:14 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:17:14 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:17:14 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:17:14 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/df7cfc35-3f76-45b2-b70c-e4525d38f410.pid.haproxy
Nov 29 07:17:14 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:17:14 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:17:14 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:17:14 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:17:14 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:17:14 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:17:14 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:17:14 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:17:14 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:17:14 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:17:14 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:17:14 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:17:14 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:17:14 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:17:14 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:17:14 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:17:14 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:17:14 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:17:14 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:17:14 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:17:14 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID df7cfc35-3f76-45b2-b70c-e4525d38f410
Nov 29 07:17:14 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:17:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:14.197 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'env', 'PROCESS_TAG=haproxy-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/df7cfc35-3f76-45b2-b70c-e4525d38f410.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:17:14 compute-0 nova_compute[187185]: 2025-11-29 07:17:14.741 187189 DEBUG nova.compute.manager [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 07:17:14 compute-0 nova_compute[187185]: 2025-11-29 07:17:14.742 187189 DEBUG nova.network.neutron [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 07:17:14 compute-0 podman[229986]: 2025-11-29 07:17:14.654208443 +0000 UTC m=+0.043024971 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:17:14 compute-0 nova_compute[187185]: 2025-11-29 07:17:14.898 187189 INFO nova.virt.libvirt.driver [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 07:17:15 compute-0 nova_compute[187185]: 2025-11-29 07:17:15.077 187189 DEBUG nova.compute.manager [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 07:17:15 compute-0 nova_compute[187185]: 2025-11-29 07:17:15.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:17:15 compute-0 nova_compute[187185]: 2025-11-29 07:17:15.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 29 07:17:15 compute-0 podman[229986]: 2025-11-29 07:17:15.403624761 +0000 UTC m=+0.792441239 container create 492bff5a1daf62aee2db25bff574607a65c0608b85be8e28144e56a420882f0f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 07:17:15 compute-0 systemd[1]: Started libpod-conmon-492bff5a1daf62aee2db25bff574607a65c0608b85be8e28144e56a420882f0f.scope.
Nov 29 07:17:15 compute-0 nova_compute[187185]: 2025-11-29 07:17:15.464 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:15 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:17:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe44fed11c53a7866f82dad4ba35621650c9dd0591eaa4101ce56b3e6890c80f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:17:15 compute-0 podman[229986]: 2025-11-29 07:17:15.50546911 +0000 UTC m=+0.894285638 container init 492bff5a1daf62aee2db25bff574607a65c0608b85be8e28144e56a420882f0f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:17:15 compute-0 podman[229986]: 2025-11-29 07:17:15.516619416 +0000 UTC m=+0.905435904 container start 492bff5a1daf62aee2db25bff574607a65c0608b85be8e28144e56a420882f0f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Nov 29 07:17:15 compute-0 neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410[230003]: [NOTICE]   (230017) : New worker (230025) forked
Nov 29 07:17:15 compute-0 neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410[230003]: [NOTICE]   (230017) : Loading success.
Nov 29 07:17:15 compute-0 nova_compute[187185]: 2025-11-29 07:17:15.574 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:15 compute-0 podman[230000]: 2025-11-29 07:17:15.612972369 +0000 UTC m=+0.155887682 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:17:15 compute-0 nova_compute[187185]: 2025-11-29 07:17:15.768 187189 DEBUG nova.compute.manager [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 07:17:15 compute-0 nova_compute[187185]: 2025-11-29 07:17:15.770 187189 DEBUG nova.virt.libvirt.driver [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 07:17:15 compute-0 nova_compute[187185]: 2025-11-29 07:17:15.770 187189 INFO nova.virt.libvirt.driver [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Creating image(s)
Nov 29 07:17:15 compute-0 nova_compute[187185]: 2025-11-29 07:17:15.771 187189 DEBUG oslo_concurrency.lockutils [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Acquiring lock "/var/lib/nova/instances/1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:17:15 compute-0 nova_compute[187185]: 2025-11-29 07:17:15.772 187189 DEBUG oslo_concurrency.lockutils [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "/var/lib/nova/instances/1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:17:15 compute-0 nova_compute[187185]: 2025-11-29 07:17:15.773 187189 DEBUG oslo_concurrency.lockutils [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "/var/lib/nova/instances/1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:17:15 compute-0 nova_compute[187185]: 2025-11-29 07:17:15.803 187189 DEBUG oslo_concurrency.processutils [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:17:15 compute-0 nova_compute[187185]: 2025-11-29 07:17:15.896 187189 DEBUG oslo_concurrency.processutils [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:17:15 compute-0 nova_compute[187185]: 2025-11-29 07:17:15.898 187189 DEBUG oslo_concurrency.lockutils [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:17:15 compute-0 nova_compute[187185]: 2025-11-29 07:17:15.899 187189 DEBUG oslo_concurrency.lockutils [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:17:15 compute-0 nova_compute[187185]: 2025-11-29 07:17:15.921 187189 DEBUG oslo_concurrency.processutils [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:17:15 compute-0 nova_compute[187185]: 2025-11-29 07:17:15.942 187189 DEBUG nova.policy [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 07:17:15 compute-0 nova_compute[187185]: 2025-11-29 07:17:15.977 187189 DEBUG oslo_concurrency.processutils [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:17:15 compute-0 nova_compute[187185]: 2025-11-29 07:17:15.978 187189 DEBUG oslo_concurrency.processutils [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:17:16 compute-0 nova_compute[187185]: 2025-11-29 07:17:16.027 187189 DEBUG oslo_concurrency.processutils [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d/disk 1073741824" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:17:16 compute-0 nova_compute[187185]: 2025-11-29 07:17:16.029 187189 DEBUG oslo_concurrency.lockutils [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:17:16 compute-0 nova_compute[187185]: 2025-11-29 07:17:16.030 187189 DEBUG oslo_concurrency.processutils [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:17:16 compute-0 nova_compute[187185]: 2025-11-29 07:17:16.124 187189 DEBUG oslo_concurrency.processutils [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:17:16 compute-0 nova_compute[187185]: 2025-11-29 07:17:16.125 187189 DEBUG nova.virt.disk.api [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Checking if we can resize image /var/lib/nova/instances/1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:17:16 compute-0 nova_compute[187185]: 2025-11-29 07:17:16.126 187189 DEBUG oslo_concurrency.processutils [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:17:16 compute-0 nova_compute[187185]: 2025-11-29 07:17:16.203 187189 DEBUG oslo_concurrency.processutils [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:17:16 compute-0 nova_compute[187185]: 2025-11-29 07:17:16.204 187189 DEBUG nova.virt.disk.api [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Cannot resize image /var/lib/nova/instances/1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:17:16 compute-0 nova_compute[187185]: 2025-11-29 07:17:16.205 187189 DEBUG nova.objects.instance [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lazy-loading 'migration_context' on Instance uuid 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:17:16 compute-0 nova_compute[187185]: 2025-11-29 07:17:16.228 187189 DEBUG nova.virt.libvirt.driver [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 07:17:16 compute-0 nova_compute[187185]: 2025-11-29 07:17:16.228 187189 DEBUG nova.virt.libvirt.driver [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Ensure instance console log exists: /var/lib/nova/instances/1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 07:17:16 compute-0 nova_compute[187185]: 2025-11-29 07:17:16.229 187189 DEBUG oslo_concurrency.lockutils [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:17:16 compute-0 nova_compute[187185]: 2025-11-29 07:17:16.229 187189 DEBUG oslo_concurrency.lockutils [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:17:16 compute-0 nova_compute[187185]: 2025-11-29 07:17:16.229 187189 DEBUG oslo_concurrency.lockutils [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:17:17 compute-0 nova_compute[187185]: 2025-11-29 07:17:17.272 187189 DEBUG nova.network.neutron [req-2c236dab-c0fa-4be3-a54f-8c7c810df2d5 req-e858235e-36d0-42b2-ab5c-d7a40c86de78 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Updated VIF entry in instance network info cache for port 4138daf0-53ec-4cf3-ad1f-cb966c2e96a3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:17:17 compute-0 nova_compute[187185]: 2025-11-29 07:17:17.273 187189 DEBUG nova.network.neutron [req-2c236dab-c0fa-4be3-a54f-8c7c810df2d5 req-e858235e-36d0-42b2-ab5c-d7a40c86de78 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Updating instance_info_cache with network_info: [{"id": "4138daf0-53ec-4cf3-ad1f-cb966c2e96a3", "address": "fa:16:3e:bd:1e:b0", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4138daf0-53", "ovs_interfaceid": "4138daf0-53ec-4cf3-ad1f-cb966c2e96a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:17:17 compute-0 nova_compute[187185]: 2025-11-29 07:17:17.295 187189 DEBUG nova.compute.manager [req-65ee9d0c-1564-457c-b6a6-0873098242c0 req-762bd9ee-91ff-4dfe-bea1-74be2f87dd49 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Received event network-vif-plugged-4138daf0-53ec-4cf3-ad1f-cb966c2e96a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:17:17 compute-0 nova_compute[187185]: 2025-11-29 07:17:17.296 187189 DEBUG oslo_concurrency.lockutils [req-65ee9d0c-1564-457c-b6a6-0873098242c0 req-762bd9ee-91ff-4dfe-bea1-74be2f87dd49 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "bb1bd9c2-1ccf-4021-b983-63a50858328f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:17:17 compute-0 nova_compute[187185]: 2025-11-29 07:17:17.296 187189 DEBUG oslo_concurrency.lockutils [req-65ee9d0c-1564-457c-b6a6-0873098242c0 req-762bd9ee-91ff-4dfe-bea1-74be2f87dd49 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "bb1bd9c2-1ccf-4021-b983-63a50858328f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:17:17 compute-0 nova_compute[187185]: 2025-11-29 07:17:17.296 187189 DEBUG oslo_concurrency.lockutils [req-65ee9d0c-1564-457c-b6a6-0873098242c0 req-762bd9ee-91ff-4dfe-bea1-74be2f87dd49 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "bb1bd9c2-1ccf-4021-b983-63a50858328f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:17:17 compute-0 nova_compute[187185]: 2025-11-29 07:17:17.296 187189 DEBUG nova.compute.manager [req-65ee9d0c-1564-457c-b6a6-0873098242c0 req-762bd9ee-91ff-4dfe-bea1-74be2f87dd49 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Processing event network-vif-plugged-4138daf0-53ec-4cf3-ad1f-cb966c2e96a3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 07:17:17 compute-0 nova_compute[187185]: 2025-11-29 07:17:17.298 187189 DEBUG nova.compute.manager [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Instance event wait completed in 4 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:17:17 compute-0 nova_compute[187185]: 2025-11-29 07:17:17.298 187189 DEBUG oslo_concurrency.lockutils [req-2c236dab-c0fa-4be3-a54f-8c7c810df2d5 req-e858235e-36d0-42b2-ab5c-d7a40c86de78 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-bb1bd9c2-1ccf-4021-b983-63a50858328f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:17:17 compute-0 nova_compute[187185]: 2025-11-29 07:17:17.303 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400637.3036003, bb1bd9c2-1ccf-4021-b983-63a50858328f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:17:17 compute-0 nova_compute[187185]: 2025-11-29 07:17:17.304 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] VM Resumed (Lifecycle Event)
Nov 29 07:17:17 compute-0 nova_compute[187185]: 2025-11-29 07:17:17.306 187189 DEBUG nova.virt.libvirt.driver [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 07:17:17 compute-0 nova_compute[187185]: 2025-11-29 07:17:17.309 187189 INFO nova.virt.libvirt.driver [-] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Instance spawned successfully.
Nov 29 07:17:17 compute-0 nova_compute[187185]: 2025-11-29 07:17:17.310 187189 DEBUG nova.virt.libvirt.driver [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 07:17:17 compute-0 nova_compute[187185]: 2025-11-29 07:17:17.328 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:17:17 compute-0 nova_compute[187185]: 2025-11-29 07:17:17.337 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:17:17 compute-0 nova_compute[187185]: 2025-11-29 07:17:17.342 187189 DEBUG nova.virt.libvirt.driver [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:17:17 compute-0 nova_compute[187185]: 2025-11-29 07:17:17.342 187189 DEBUG nova.virt.libvirt.driver [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:17:17 compute-0 nova_compute[187185]: 2025-11-29 07:17:17.343 187189 DEBUG nova.virt.libvirt.driver [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:17:17 compute-0 nova_compute[187185]: 2025-11-29 07:17:17.344 187189 DEBUG nova.virt.libvirt.driver [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:17:17 compute-0 nova_compute[187185]: 2025-11-29 07:17:17.344 187189 DEBUG nova.virt.libvirt.driver [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:17:17 compute-0 nova_compute[187185]: 2025-11-29 07:17:17.345 187189 DEBUG nova.virt.libvirt.driver [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:17:17 compute-0 nova_compute[187185]: 2025-11-29 07:17:17.358 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:17:17 compute-0 nova_compute[187185]: 2025-11-29 07:17:17.412 187189 INFO nova.compute.manager [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Took 12.21 seconds to spawn the instance on the hypervisor.
Nov 29 07:17:17 compute-0 nova_compute[187185]: 2025-11-29 07:17:17.413 187189 DEBUG nova.compute.manager [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:17:17 compute-0 nova_compute[187185]: 2025-11-29 07:17:17.563 187189 INFO nova.compute.manager [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Took 12.96 seconds to build instance.
Nov 29 07:17:17 compute-0 nova_compute[187185]: 2025-11-29 07:17:17.589 187189 DEBUG oslo_concurrency.lockutils [None req-453d71a3-cbb4-4df2-a666-1b3dbd54aec2 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "bb1bd9c2-1ccf-4021-b983-63a50858328f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:17:18 compute-0 nova_compute[187185]: 2025-11-29 07:17:18.791 187189 DEBUG nova.network.neutron [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Successfully created port: 0ad86a88-0ccb-498d-a1e4-43aef563961d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 07:17:20 compute-0 nova_compute[187185]: 2025-11-29 07:17:20.029 187189 DEBUG nova.compute.manager [req-9b161218-1693-4317-b113-670039937534 req-7d25068f-e13f-409c-9144-af069e7bd558 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Received event network-vif-plugged-4138daf0-53ec-4cf3-ad1f-cb966c2e96a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:17:20 compute-0 nova_compute[187185]: 2025-11-29 07:17:20.030 187189 DEBUG oslo_concurrency.lockutils [req-9b161218-1693-4317-b113-670039937534 req-7d25068f-e13f-409c-9144-af069e7bd558 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "bb1bd9c2-1ccf-4021-b983-63a50858328f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:17:20 compute-0 nova_compute[187185]: 2025-11-29 07:17:20.030 187189 DEBUG oslo_concurrency.lockutils [req-9b161218-1693-4317-b113-670039937534 req-7d25068f-e13f-409c-9144-af069e7bd558 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "bb1bd9c2-1ccf-4021-b983-63a50858328f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:17:20 compute-0 nova_compute[187185]: 2025-11-29 07:17:20.031 187189 DEBUG oslo_concurrency.lockutils [req-9b161218-1693-4317-b113-670039937534 req-7d25068f-e13f-409c-9144-af069e7bd558 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "bb1bd9c2-1ccf-4021-b983-63a50858328f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:17:20 compute-0 nova_compute[187185]: 2025-11-29 07:17:20.031 187189 DEBUG nova.compute.manager [req-9b161218-1693-4317-b113-670039937534 req-7d25068f-e13f-409c-9144-af069e7bd558 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] No waiting events found dispatching network-vif-plugged-4138daf0-53ec-4cf3-ad1f-cb966c2e96a3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:17:20 compute-0 nova_compute[187185]: 2025-11-29 07:17:20.032 187189 WARNING nova.compute.manager [req-9b161218-1693-4317-b113-670039937534 req-7d25068f-e13f-409c-9144-af069e7bd558 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Received unexpected event network-vif-plugged-4138daf0-53ec-4cf3-ad1f-cb966c2e96a3 for instance with vm_state active and task_state None.
Nov 29 07:17:20 compute-0 nova_compute[187185]: 2025-11-29 07:17:20.161 187189 INFO nova.compute.manager [None req-e1774bb2-38e0-47e9-8f70-649ca2ba7a32 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Pausing
Nov 29 07:17:20 compute-0 nova_compute[187185]: 2025-11-29 07:17:20.163 187189 DEBUG nova.objects.instance [None req-e1774bb2-38e0-47e9-8f70-649ca2ba7a32 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lazy-loading 'flavor' on Instance uuid bb1bd9c2-1ccf-4021-b983-63a50858328f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:17:20 compute-0 nova_compute[187185]: 2025-11-29 07:17:20.208 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400640.208003, bb1bd9c2-1ccf-4021-b983-63a50858328f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:17:20 compute-0 nova_compute[187185]: 2025-11-29 07:17:20.209 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] VM Paused (Lifecycle Event)
Nov 29 07:17:20 compute-0 nova_compute[187185]: 2025-11-29 07:17:20.214 187189 DEBUG nova.compute.manager [None req-e1774bb2-38e0-47e9-8f70-649ca2ba7a32 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:17:20 compute-0 nova_compute[187185]: 2025-11-29 07:17:20.239 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:17:20 compute-0 nova_compute[187185]: 2025-11-29 07:17:20.245 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:17:20 compute-0 nova_compute[187185]: 2025-11-29 07:17:20.270 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] During sync_power_state the instance has a pending task (pausing). Skip.
Nov 29 07:17:20 compute-0 nova_compute[187185]: 2025-11-29 07:17:20.467 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:20 compute-0 nova_compute[187185]: 2025-11-29 07:17:20.575 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:21 compute-0 nova_compute[187185]: 2025-11-29 07:17:21.619 187189 DEBUG nova.network.neutron [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Successfully updated port: 0ad86a88-0ccb-498d-a1e4-43aef563961d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 07:17:21 compute-0 nova_compute[187185]: 2025-11-29 07:17:21.636 187189 DEBUG oslo_concurrency.lockutils [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Acquiring lock "refresh_cache-1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:17:21 compute-0 nova_compute[187185]: 2025-11-29 07:17:21.636 187189 DEBUG oslo_concurrency.lockutils [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Acquired lock "refresh_cache-1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:17:21 compute-0 nova_compute[187185]: 2025-11-29 07:17:21.637 187189 DEBUG nova.network.neutron [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:17:21 compute-0 nova_compute[187185]: 2025-11-29 07:17:21.775 187189 DEBUG nova.network.neutron [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:17:21 compute-0 podman[230059]: 2025-11-29 07:17:21.829233385 +0000 UTC m=+0.082610864 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 07:17:22 compute-0 nova_compute[187185]: 2025-11-29 07:17:22.339 187189 DEBUG nova.compute.manager [req-d36497fa-6cd0-4d47-8f7a-0fd3e86205d1 req-cbb5870f-c3d9-472c-bfed-dacd1fcdac44 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Received event network-changed-0ad86a88-0ccb-498d-a1e4-43aef563961d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:17:22 compute-0 nova_compute[187185]: 2025-11-29 07:17:22.340 187189 DEBUG nova.compute.manager [req-d36497fa-6cd0-4d47-8f7a-0fd3e86205d1 req-cbb5870f-c3d9-472c-bfed-dacd1fcdac44 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Refreshing instance network info cache due to event network-changed-0ad86a88-0ccb-498d-a1e4-43aef563961d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:17:22 compute-0 nova_compute[187185]: 2025-11-29 07:17:22.340 187189 DEBUG oslo_concurrency.lockutils [req-d36497fa-6cd0-4d47-8f7a-0fd3e86205d1 req-cbb5870f-c3d9-472c-bfed-dacd1fcdac44 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.251 187189 DEBUG nova.network.neutron [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Updating instance_info_cache with network_info: [{"id": "0ad86a88-0ccb-498d-a1e4-43aef563961d", "address": "fa:16:3e:31:f6:04", "network": {"id": "14d61e69-b152-4adc-a95c-58748969e299", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-2003556983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "329bbbdd41424742b3045e77150a498e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ad86a88-0c", "ovs_interfaceid": "0ad86a88-0ccb-498d-a1e4-43aef563961d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.272 187189 DEBUG oslo_concurrency.lockutils [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Releasing lock "refresh_cache-1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.273 187189 DEBUG nova.compute.manager [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Instance network_info: |[{"id": "0ad86a88-0ccb-498d-a1e4-43aef563961d", "address": "fa:16:3e:31:f6:04", "network": {"id": "14d61e69-b152-4adc-a95c-58748969e299", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-2003556983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "329bbbdd41424742b3045e77150a498e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ad86a88-0c", "ovs_interfaceid": "0ad86a88-0ccb-498d-a1e4-43aef563961d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.274 187189 DEBUG oslo_concurrency.lockutils [req-d36497fa-6cd0-4d47-8f7a-0fd3e86205d1 req-cbb5870f-c3d9-472c-bfed-dacd1fcdac44 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.274 187189 DEBUG nova.network.neutron [req-d36497fa-6cd0-4d47-8f7a-0fd3e86205d1 req-cbb5870f-c3d9-472c-bfed-dacd1fcdac44 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Refreshing network info cache for port 0ad86a88-0ccb-498d-a1e4-43aef563961d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.279 187189 DEBUG nova.virt.libvirt.driver [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Start _get_guest_xml network_info=[{"id": "0ad86a88-0ccb-498d-a1e4-43aef563961d", "address": "fa:16:3e:31:f6:04", "network": {"id": "14d61e69-b152-4adc-a95c-58748969e299", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-2003556983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "329bbbdd41424742b3045e77150a498e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ad86a88-0c", "ovs_interfaceid": "0ad86a88-0ccb-498d-a1e4-43aef563961d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.288 187189 WARNING nova.virt.libvirt.driver [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.301 187189 DEBUG nova.virt.libvirt.host [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.303 187189 DEBUG nova.virt.libvirt.host [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.318 187189 DEBUG nova.virt.libvirt.host [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.319 187189 DEBUG nova.virt.libvirt.host [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.321 187189 DEBUG nova.virt.libvirt.driver [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.321 187189 DEBUG nova.virt.hardware [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.322 187189 DEBUG nova.virt.hardware [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.323 187189 DEBUG nova.virt.hardware [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.323 187189 DEBUG nova.virt.hardware [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.324 187189 DEBUG nova.virt.hardware [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.324 187189 DEBUG nova.virt.hardware [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.325 187189 DEBUG nova.virt.hardware [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.326 187189 DEBUG nova.virt.hardware [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.326 187189 DEBUG nova.virt.hardware [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.326 187189 DEBUG nova.virt.hardware [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.327 187189 DEBUG nova.virt.hardware [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.332 187189 DEBUG nova.virt.libvirt.vif [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:17:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-138799320',display_name='tempest-ServersNegativeTestJSON-server-138799320',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-138799320',id=106,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='329bbbdd41424742b3045e77150a498e',ramdisk_id='',reservation_id='r-8zzi07jm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1191192320',owner_user_name='tempest-ServersNegativeT
estJSON-1191192320-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:17:15Z,user_data=None,user_id='2647a3e4fc214b4a85db1283eb7ef117',uuid=1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0ad86a88-0ccb-498d-a1e4-43aef563961d", "address": "fa:16:3e:31:f6:04", "network": {"id": "14d61e69-b152-4adc-a95c-58748969e299", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-2003556983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "329bbbdd41424742b3045e77150a498e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ad86a88-0c", "ovs_interfaceid": "0ad86a88-0ccb-498d-a1e4-43aef563961d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.333 187189 DEBUG nova.network.os_vif_util [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Converting VIF {"id": "0ad86a88-0ccb-498d-a1e4-43aef563961d", "address": "fa:16:3e:31:f6:04", "network": {"id": "14d61e69-b152-4adc-a95c-58748969e299", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-2003556983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "329bbbdd41424742b3045e77150a498e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ad86a88-0c", "ovs_interfaceid": "0ad86a88-0ccb-498d-a1e4-43aef563961d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.334 187189 DEBUG nova.network.os_vif_util [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:31:f6:04,bridge_name='br-int',has_traffic_filtering=True,id=0ad86a88-0ccb-498d-a1e4-43aef563961d,network=Network(14d61e69-b152-4adc-a95c-58748969e299),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ad86a88-0c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.335 187189 DEBUG nova.objects.instance [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lazy-loading 'pci_devices' on Instance uuid 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.358 187189 DEBUG nova.virt.libvirt.driver [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:17:23 compute-0 nova_compute[187185]:   <uuid>1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d</uuid>
Nov 29 07:17:23 compute-0 nova_compute[187185]:   <name>instance-0000006a</name>
Nov 29 07:17:23 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 07:17:23 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:17:23 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:17:23 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:       <nova:name>tempest-ServersNegativeTestJSON-server-138799320</nova:name>
Nov 29 07:17:23 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:17:23</nova:creationTime>
Nov 29 07:17:23 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 07:17:23 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 07:17:23 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:17:23 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:17:23 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:17:23 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:17:23 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:17:23 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:17:23 compute-0 nova_compute[187185]:         <nova:user uuid="2647a3e4fc214b4a85db1283eb7ef117">tempest-ServersNegativeTestJSON-1191192320-project-member</nova:user>
Nov 29 07:17:23 compute-0 nova_compute[187185]:         <nova:project uuid="329bbbdd41424742b3045e77150a498e">tempest-ServersNegativeTestJSON-1191192320</nova:project>
Nov 29 07:17:23 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:17:23 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 07:17:23 compute-0 nova_compute[187185]:         <nova:port uuid="0ad86a88-0ccb-498d-a1e4-43aef563961d">
Nov 29 07:17:23 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:17:23 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:17:23 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:17:23 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <system>
Nov 29 07:17:23 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:17:23 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:17:23 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:17:23 compute-0 nova_compute[187185]:       <entry name="serial">1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d</entry>
Nov 29 07:17:23 compute-0 nova_compute[187185]:       <entry name="uuid">1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d</entry>
Nov 29 07:17:23 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     </system>
Nov 29 07:17:23 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:17:23 compute-0 nova_compute[187185]:   <os>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:   </os>
Nov 29 07:17:23 compute-0 nova_compute[187185]:   <features>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:   </features>
Nov 29 07:17:23 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:17:23 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:17:23 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:17:23 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d/disk"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:17:23 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d/disk.config"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:17:23 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:31:f6:04"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:       <target dev="tap0ad86a88-0c"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:17:23 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d/console.log" append="off"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <video>
Nov 29 07:17:23 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     </video>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:17:23 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:17:23 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:17:23 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:17:23 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:17:23 compute-0 nova_compute[187185]: </domain>
Nov 29 07:17:23 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.360 187189 DEBUG nova.compute.manager [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Preparing to wait for external event network-vif-plugged-0ad86a88-0ccb-498d-a1e4-43aef563961d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.360 187189 DEBUG oslo_concurrency.lockutils [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Acquiring lock "1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.361 187189 DEBUG oslo_concurrency.lockutils [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.361 187189 DEBUG oslo_concurrency.lockutils [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.361 187189 DEBUG nova.virt.libvirt.vif [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:17:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-138799320',display_name='tempest-ServersNegativeTestJSON-server-138799320',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-138799320',id=106,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='329bbbdd41424742b3045e77150a498e',ramdisk_id='',reservation_id='r-8zzi07jm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1191192320',owner_user_name='tempest-Server
sNegativeTestJSON-1191192320-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:17:15Z,user_data=None,user_id='2647a3e4fc214b4a85db1283eb7ef117',uuid=1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0ad86a88-0ccb-498d-a1e4-43aef563961d", "address": "fa:16:3e:31:f6:04", "network": {"id": "14d61e69-b152-4adc-a95c-58748969e299", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-2003556983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "329bbbdd41424742b3045e77150a498e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ad86a88-0c", "ovs_interfaceid": "0ad86a88-0ccb-498d-a1e4-43aef563961d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.362 187189 DEBUG nova.network.os_vif_util [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Converting VIF {"id": "0ad86a88-0ccb-498d-a1e4-43aef563961d", "address": "fa:16:3e:31:f6:04", "network": {"id": "14d61e69-b152-4adc-a95c-58748969e299", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-2003556983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "329bbbdd41424742b3045e77150a498e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ad86a88-0c", "ovs_interfaceid": "0ad86a88-0ccb-498d-a1e4-43aef563961d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.363 187189 DEBUG nova.network.os_vif_util [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:31:f6:04,bridge_name='br-int',has_traffic_filtering=True,id=0ad86a88-0ccb-498d-a1e4-43aef563961d,network=Network(14d61e69-b152-4adc-a95c-58748969e299),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ad86a88-0c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.363 187189 DEBUG os_vif [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:f6:04,bridge_name='br-int',has_traffic_filtering=True,id=0ad86a88-0ccb-498d-a1e4-43aef563961d,network=Network(14d61e69-b152-4adc-a95c-58748969e299),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ad86a88-0c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.364 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.364 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.365 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.369 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.370 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0ad86a88-0c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.370 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0ad86a88-0c, col_values=(('external_ids', {'iface-id': '0ad86a88-0ccb-498d-a1e4-43aef563961d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:31:f6:04', 'vm-uuid': '1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.372 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:23 compute-0 NetworkManager[55227]: <info>  [1764400643.3741] manager: (tap0ad86a88-0c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/155)
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.375 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.385 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.386 187189 INFO os_vif [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:f6:04,bridge_name='br-int',has_traffic_filtering=True,id=0ad86a88-0ccb-498d-a1e4-43aef563961d,network=Network(14d61e69-b152-4adc-a95c-58748969e299),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ad86a88-0c')
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.962 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:23 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:23.962 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=34, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=33) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:17:23 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:23.965 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.979 187189 DEBUG oslo_concurrency.lockutils [None req-f1995a4f-5e4a-4927-b891-b40186b5caab ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "bb1bd9c2-1ccf-4021-b983-63a50858328f" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.979 187189 DEBUG oslo_concurrency.lockutils [None req-f1995a4f-5e4a-4927-b891-b40186b5caab ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "bb1bd9c2-1ccf-4021-b983-63a50858328f" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.980 187189 INFO nova.compute.manager [None req-f1995a4f-5e4a-4927-b891-b40186b5caab ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Shelving
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.995 187189 DEBUG nova.virt.libvirt.driver [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.995 187189 DEBUG nova.virt.libvirt.driver [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.996 187189 DEBUG nova.virt.libvirt.driver [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] No VIF found with MAC fa:16:3e:31:f6:04, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:17:23 compute-0 nova_compute[187185]: 2025-11-29 07:17:23.997 187189 INFO nova.virt.libvirt.driver [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Using config drive
Nov 29 07:17:24 compute-0 podman[230085]: 2025-11-29 07:17:24.014992525 +0000 UTC m=+0.082764079 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 07:17:24 compute-0 podman[230086]: 2025-11-29 07:17:24.029791725 +0000 UTC m=+0.098400493 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:17:24 compute-0 kernel: tap4138daf0-53 (unregistering): left promiscuous mode
Nov 29 07:17:24 compute-0 NetworkManager[55227]: <info>  [1764400644.2013] device (tap4138daf0-53): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:17:24 compute-0 nova_compute[187185]: 2025-11-29 07:17:24.218 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:24 compute-0 ovn_controller[95281]: 2025-11-29T07:17:24Z|00282|binding|INFO|Releasing lport 4138daf0-53ec-4cf3-ad1f-cb966c2e96a3 from this chassis (sb_readonly=0)
Nov 29 07:17:24 compute-0 ovn_controller[95281]: 2025-11-29T07:17:24Z|00283|binding|INFO|Setting lport 4138daf0-53ec-4cf3-ad1f-cb966c2e96a3 down in Southbound
Nov 29 07:17:24 compute-0 ovn_controller[95281]: 2025-11-29T07:17:24Z|00284|binding|INFO|Removing iface tap4138daf0-53 ovn-installed in OVS
Nov 29 07:17:24 compute-0 nova_compute[187185]: 2025-11-29 07:17:24.221 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:24.228 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bd:1e:b0 10.100.0.6'], port_security=['fa:16:3e:bd:1e:b0 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'bb1bd9c2-1ccf-4021-b983-63a50858328f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8547e4c2-e200-4173-9eba-476619f06150', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=04b58113-8105-402c-a103-4692d3989228, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=4138daf0-53ec-4cf3-ad1f-cb966c2e96a3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:17:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:24.230 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 4138daf0-53ec-4cf3-ad1f-cb966c2e96a3 in datapath df7cfc35-3f76-45b2-b70c-e4525d38f410 unbound from our chassis
Nov 29 07:17:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:24.233 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network df7cfc35-3f76-45b2-b70c-e4525d38f410, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:17:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:24.234 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[8a6bc00a-444d-46ca-87b1-b4b3c7aeaaff]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:24.235 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410 namespace which is not needed anymore
Nov 29 07:17:24 compute-0 nova_compute[187185]: 2025-11-29 07:17:24.248 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:24 compute-0 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d00000069.scope: Deactivated successfully.
Nov 29 07:17:24 compute-0 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d00000069.scope: Consumed 3.240s CPU time.
Nov 29 07:17:24 compute-0 systemd-machined[153486]: Machine qemu-38-instance-00000069 terminated.
Nov 29 07:17:24 compute-0 nova_compute[187185]: 2025-11-29 07:17:24.389 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:24 compute-0 nova_compute[187185]: 2025-11-29 07:17:24.403 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:24 compute-0 neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410[230003]: [NOTICE]   (230017) : haproxy version is 2.8.14-c23fe91
Nov 29 07:17:24 compute-0 neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410[230003]: [NOTICE]   (230017) : path to executable is /usr/sbin/haproxy
Nov 29 07:17:24 compute-0 neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410[230003]: [WARNING]  (230017) : Exiting Master process...
Nov 29 07:17:24 compute-0 neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410[230003]: [WARNING]  (230017) : Exiting Master process...
Nov 29 07:17:24 compute-0 neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410[230003]: [ALERT]    (230017) : Current worker (230025) exited with code 143 (Terminated)
Nov 29 07:17:24 compute-0 neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410[230003]: [WARNING]  (230017) : All workers exited. Exiting... (0)
Nov 29 07:17:24 compute-0 systemd[1]: libpod-492bff5a1daf62aee2db25bff574607a65c0608b85be8e28144e56a420882f0f.scope: Deactivated successfully.
Nov 29 07:17:24 compute-0 conmon[230003]: conmon 492bff5a1daf62aee2db <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-492bff5a1daf62aee2db25bff574607a65c0608b85be8e28144e56a420882f0f.scope/container/memory.events
Nov 29 07:17:24 compute-0 podman[230154]: 2025-11-29 07:17:24.432454235 +0000 UTC m=+0.068740921 container died 492bff5a1daf62aee2db25bff574607a65c0608b85be8e28144e56a420882f0f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 29 07:17:24 compute-0 nova_compute[187185]: 2025-11-29 07:17:24.449 187189 INFO nova.virt.libvirt.driver [-] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Instance destroyed successfully.
Nov 29 07:17:24 compute-0 nova_compute[187185]: 2025-11-29 07:17:24.449 187189 DEBUG nova.objects.instance [None req-f1995a4f-5e4a-4927-b891-b40186b5caab ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lazy-loading 'numa_topology' on Instance uuid bb1bd9c2-1ccf-4021-b983-63a50858328f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:17:24 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-492bff5a1daf62aee2db25bff574607a65c0608b85be8e28144e56a420882f0f-userdata-shm.mount: Deactivated successfully.
Nov 29 07:17:24 compute-0 systemd[1]: var-lib-containers-storage-overlay-fe44fed11c53a7866f82dad4ba35621650c9dd0591eaa4101ce56b3e6890c80f-merged.mount: Deactivated successfully.
Nov 29 07:17:24 compute-0 podman[230154]: 2025-11-29 07:17:24.489174074 +0000 UTC m=+0.125460730 container cleanup 492bff5a1daf62aee2db25bff574607a65c0608b85be8e28144e56a420882f0f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 07:17:24 compute-0 systemd[1]: libpod-conmon-492bff5a1daf62aee2db25bff574607a65c0608b85be8e28144e56a420882f0f.scope: Deactivated successfully.
Nov 29 07:17:24 compute-0 podman[230203]: 2025-11-29 07:17:24.56941664 +0000 UTC m=+0.054032173 container remove 492bff5a1daf62aee2db25bff574607a65c0608b85be8e28144e56a420882f0f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 07:17:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:24.576 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[fc8ffdc8-8684-41df-800d-c00e41edfbb4]: (4, ('Sat Nov 29 07:17:24 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410 (492bff5a1daf62aee2db25bff574607a65c0608b85be8e28144e56a420882f0f)\n492bff5a1daf62aee2db25bff574607a65c0608b85be8e28144e56a420882f0f\nSat Nov 29 07:17:24 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410 (492bff5a1daf62aee2db25bff574607a65c0608b85be8e28144e56a420882f0f)\n492bff5a1daf62aee2db25bff574607a65c0608b85be8e28144e56a420882f0f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:24.578 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[36c55a6b-ec2d-4e40-9c6e-7ce1cb5918a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:24.579 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdf7cfc35-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:17:24 compute-0 nova_compute[187185]: 2025-11-29 07:17:24.582 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:24 compute-0 kernel: tapdf7cfc35-30: left promiscuous mode
Nov 29 07:17:24 compute-0 nova_compute[187185]: 2025-11-29 07:17:24.617 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:24.621 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[0b188341-1980-45f0-9112-188a57e1b17a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:24.645 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[f51a64e1-85f5-466f-a402-7a3928b65e8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:24.646 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[a04667b9-5b21-40c0-b811-6d2886511da9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:24.668 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6ea9cc18-0434-4025-9198-526d7cd86aa6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 608152, 'reachable_time': 33291, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230225, 'error': None, 'target': 'ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:24.671 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:17:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:24.671 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[ec06491d-1dae-40b8-9dd3-1a7413990e8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:24 compute-0 systemd[1]: run-netns-ovnmeta\x2ddf7cfc35\x2d3f76\x2d45b2\x2db70c\x2de4525d38f410.mount: Deactivated successfully.
Nov 29 07:17:24 compute-0 nova_compute[187185]: 2025-11-29 07:17:24.760 187189 INFO nova.virt.libvirt.driver [None req-f1995a4f-5e4a-4927-b891-b40186b5caab ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Beginning cold snapshot process
Nov 29 07:17:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:24.968 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '34'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:17:25 compute-0 nova_compute[187185]: 2025-11-29 07:17:25.045 187189 INFO nova.virt.libvirt.driver [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Creating config drive at /var/lib/nova/instances/1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d/disk.config
Nov 29 07:17:25 compute-0 nova_compute[187185]: 2025-11-29 07:17:25.054 187189 DEBUG oslo_concurrency.processutils [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppeuwni1n execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:17:25 compute-0 nova_compute[187185]: 2025-11-29 07:17:25.197 187189 DEBUG oslo_concurrency.processutils [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppeuwni1n" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:17:25 compute-0 nova_compute[187185]: 2025-11-29 07:17:25.218 187189 DEBUG nova.compute.manager [req-063dcffa-52a3-4eee-86b3-8b4a06e219ac req-9a229c8f-b616-4e7c-86c8-bf5157b2e257 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Received event network-vif-unplugged-4138daf0-53ec-4cf3-ad1f-cb966c2e96a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:17:25 compute-0 nova_compute[187185]: 2025-11-29 07:17:25.218 187189 DEBUG oslo_concurrency.lockutils [req-063dcffa-52a3-4eee-86b3-8b4a06e219ac req-9a229c8f-b616-4e7c-86c8-bf5157b2e257 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "bb1bd9c2-1ccf-4021-b983-63a50858328f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:17:25 compute-0 nova_compute[187185]: 2025-11-29 07:17:25.219 187189 DEBUG oslo_concurrency.lockutils [req-063dcffa-52a3-4eee-86b3-8b4a06e219ac req-9a229c8f-b616-4e7c-86c8-bf5157b2e257 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "bb1bd9c2-1ccf-4021-b983-63a50858328f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:17:25 compute-0 nova_compute[187185]: 2025-11-29 07:17:25.219 187189 DEBUG oslo_concurrency.lockutils [req-063dcffa-52a3-4eee-86b3-8b4a06e219ac req-9a229c8f-b616-4e7c-86c8-bf5157b2e257 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "bb1bd9c2-1ccf-4021-b983-63a50858328f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:17:25 compute-0 nova_compute[187185]: 2025-11-29 07:17:25.220 187189 DEBUG nova.compute.manager [req-063dcffa-52a3-4eee-86b3-8b4a06e219ac req-9a229c8f-b616-4e7c-86c8-bf5157b2e257 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] No waiting events found dispatching network-vif-unplugged-4138daf0-53ec-4cf3-ad1f-cb966c2e96a3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:17:25 compute-0 nova_compute[187185]: 2025-11-29 07:17:25.220 187189 WARNING nova.compute.manager [req-063dcffa-52a3-4eee-86b3-8b4a06e219ac req-9a229c8f-b616-4e7c-86c8-bf5157b2e257 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Received unexpected event network-vif-unplugged-4138daf0-53ec-4cf3-ad1f-cb966c2e96a3 for instance with vm_state paused and task_state shelving_image_uploading.
Nov 29 07:17:25 compute-0 systemd-udevd[230132]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:17:25 compute-0 NetworkManager[55227]: <info>  [1764400645.2824] manager: (tap0ad86a88-0c): new Tun device (/org/freedesktop/NetworkManager/Devices/156)
Nov 29 07:17:25 compute-0 kernel: tap0ad86a88-0c: entered promiscuous mode
Nov 29 07:17:25 compute-0 NetworkManager[55227]: <info>  [1764400645.2952] device (tap0ad86a88-0c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:17:25 compute-0 NetworkManager[55227]: <info>  [1764400645.2962] device (tap0ad86a88-0c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:17:25 compute-0 nova_compute[187185]: 2025-11-29 07:17:25.292 187189 DEBUG nova.privsep.utils [None req-f1995a4f-5e4a-4927-b891-b40186b5caab ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 29 07:17:25 compute-0 nova_compute[187185]: 2025-11-29 07:17:25.293 187189 DEBUG oslo_concurrency.processutils [None req-f1995a4f-5e4a-4927-b891-b40186b5caab ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/bb1bd9c2-1ccf-4021-b983-63a50858328f/disk /var/lib/nova/instances/snapshots/tmp4zr075_e/b2ccb452b68243f4be91eff5d01df1c7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:17:25 compute-0 ovn_controller[95281]: 2025-11-29T07:17:25Z|00285|binding|INFO|Claiming lport 0ad86a88-0ccb-498d-a1e4-43aef563961d for this chassis.
Nov 29 07:17:25 compute-0 ovn_controller[95281]: 2025-11-29T07:17:25Z|00286|binding|INFO|0ad86a88-0ccb-498d-a1e4-43aef563961d: Claiming fa:16:3e:31:f6:04 10.100.0.8
Nov 29 07:17:25 compute-0 nova_compute[187185]: 2025-11-29 07:17:25.333 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:25.339 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:31:f6:04 10.100.0.8'], port_security=['fa:16:3e:31:f6:04 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-14d61e69-b152-4adc-a95c-58748969e299', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '329bbbdd41424742b3045e77150a498e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '24db58f8-235a-4b76-869f-efe13404b22a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=61c05e4b-7426-41e7-9cd6-8f37a87e832e, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=0ad86a88-0ccb-498d-a1e4-43aef563961d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:25.341 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 0ad86a88-0ccb-498d-a1e4-43aef563961d in datapath 14d61e69-b152-4adc-a95c-58748969e299 bound to our chassis
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:25.344 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 14d61e69-b152-4adc-a95c-58748969e299
Nov 29 07:17:25 compute-0 ovn_controller[95281]: 2025-11-29T07:17:25Z|00287|binding|INFO|Setting lport 0ad86a88-0ccb-498d-a1e4-43aef563961d ovn-installed in OVS
Nov 29 07:17:25 compute-0 ovn_controller[95281]: 2025-11-29T07:17:25Z|00288|binding|INFO|Setting lport 0ad86a88-0ccb-498d-a1e4-43aef563961d up in Southbound
Nov 29 07:17:25 compute-0 nova_compute[187185]: 2025-11-29 07:17:25.351 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:25.357 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[4808caeb-7fad-4d57-8b6e-0d2d8568c647]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:25.358 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap14d61e69-b1 in ovnmeta-14d61e69-b152-4adc-a95c-58748969e299 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:17:25 compute-0 nova_compute[187185]: 2025-11-29 07:17:25.360 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:25.364 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap14d61e69-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:25.364 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[29eac6c1-a1c4-4123-adf7-606c93d1ed99]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:25 compute-0 systemd-machined[153486]: New machine qemu-39-instance-0000006a.
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:25.366 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[51b8b098-aedf-4a09-9bb1-64a702b8136b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:25 compute-0 systemd[1]: Started Virtual Machine qemu-39-instance-0000006a.
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:25.383 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[800b0e96-db53-4bbf-94ac-a5968c4d37ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:25.416 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[4d86387e-34e6-4b62-b3b5-baa4bc0aba29]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:25.463 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[343723e7-ec94-4e5e-837f-304feb055912]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:25 compute-0 NetworkManager[55227]: <info>  [1764400645.4739] manager: (tap14d61e69-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/157)
Nov 29 07:17:25 compute-0 nova_compute[187185]: 2025-11-29 07:17:25.473 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:25.473 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[9b83d361-0834-472c-9f8b-ef775bac9006]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:25.506 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:25.507 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:25.507 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:25.509 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[53d2a291-abd2-4abf-9233-19d34eb6ac0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:25.512 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[f9a01bf9-0804-4f78-b8f8-480b5007eae5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:25 compute-0 NetworkManager[55227]: <info>  [1764400645.5422] device (tap14d61e69-b0): carrier: link connected
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:25.549 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[840ce3b4-a5df-4cbe-b5fc-6d0bb977741b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:25.570 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[8c0defb2-3752-4c6e-9c5c-2fa455d04a1c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap14d61e69-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:42:d7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 94], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 609320, 'reachable_time': 15400, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230289, 'error': None, 'target': 'ovnmeta-14d61e69-b152-4adc-a95c-58748969e299', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:25.588 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[68bb401e-76f5-4672-9926-b1fc2c1b88ec]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed1:42d7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 609320, 'tstamp': 609320}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230290, 'error': None, 'target': 'ovnmeta-14d61e69-b152-4adc-a95c-58748969e299', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:25 compute-0 nova_compute[187185]: 2025-11-29 07:17:25.593 187189 DEBUG oslo_concurrency.processutils [None req-f1995a4f-5e4a-4927-b891-b40186b5caab ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/bb1bd9c2-1ccf-4021-b983-63a50858328f/disk /var/lib/nova/instances/snapshots/tmp4zr075_e/b2ccb452b68243f4be91eff5d01df1c7" returned: 0 in 0.300s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:17:25 compute-0 nova_compute[187185]: 2025-11-29 07:17:25.593 187189 INFO nova.virt.libvirt.driver [None req-f1995a4f-5e4a-4927-b891-b40186b5caab ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Snapshot extracted, beginning image upload
Nov 29 07:17:25 compute-0 nova_compute[187185]: 2025-11-29 07:17:25.607 187189 DEBUG nova.network.neutron [req-d36497fa-6cd0-4d47-8f7a-0fd3e86205d1 req-cbb5870f-c3d9-472c-bfed-dacd1fcdac44 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Updated VIF entry in instance network info cache for port 0ad86a88-0ccb-498d-a1e4-43aef563961d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:17:25 compute-0 nova_compute[187185]: 2025-11-29 07:17:25.608 187189 DEBUG nova.network.neutron [req-d36497fa-6cd0-4d47-8f7a-0fd3e86205d1 req-cbb5870f-c3d9-472c-bfed-dacd1fcdac44 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Updating instance_info_cache with network_info: [{"id": "0ad86a88-0ccb-498d-a1e4-43aef563961d", "address": "fa:16:3e:31:f6:04", "network": {"id": "14d61e69-b152-4adc-a95c-58748969e299", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-2003556983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "329bbbdd41424742b3045e77150a498e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ad86a88-0c", "ovs_interfaceid": "0ad86a88-0ccb-498d-a1e4-43aef563961d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:25.609 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[a22029e5-c5dc-4864-9320-bd11c932fe23]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap14d61e69-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:42:d7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 94], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 609320, 'reachable_time': 15400, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230291, 'error': None, 'target': 'ovnmeta-14d61e69-b152-4adc-a95c-58748969e299', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:25 compute-0 nova_compute[187185]: 2025-11-29 07:17:25.635 187189 DEBUG oslo_concurrency.lockutils [req-d36497fa-6cd0-4d47-8f7a-0fd3e86205d1 req-cbb5870f-c3d9-472c-bfed-dacd1fcdac44 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:25.645 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[c51ce1c8-5170-4d53-8896-01ce4fa05203]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:25.728 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[b4489fe2-dd46-4098-a7ec-1b12a90afa60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:25.730 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap14d61e69-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:25.730 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:25.730 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap14d61e69-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:17:25 compute-0 NetworkManager[55227]: <info>  [1764400645.7332] manager: (tap14d61e69-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/158)
Nov 29 07:17:25 compute-0 nova_compute[187185]: 2025-11-29 07:17:25.732 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:25 compute-0 kernel: tap14d61e69-b0: entered promiscuous mode
Nov 29 07:17:25 compute-0 nova_compute[187185]: 2025-11-29 07:17:25.737 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:25.738 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap14d61e69-b0, col_values=(('external_ids', {'iface-id': '17905b79-5cd7-4b55-9191-5d935325b1f0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:17:25 compute-0 nova_compute[187185]: 2025-11-29 07:17:25.739 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:25 compute-0 ovn_controller[95281]: 2025-11-29T07:17:25Z|00289|binding|INFO|Releasing lport 17905b79-5cd7-4b55-9191-5d935325b1f0 from this chassis (sb_readonly=0)
Nov 29 07:17:25 compute-0 nova_compute[187185]: 2025-11-29 07:17:25.751 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:25 compute-0 nova_compute[187185]: 2025-11-29 07:17:25.758 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:25.758 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/14d61e69-b152-4adc-a95c-58748969e299.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/14d61e69-b152-4adc-a95c-58748969e299.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:25.759 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[3acc0f79-9454-4385-a255-c4e711b2e790]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:25.760 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-14d61e69-b152-4adc-a95c-58748969e299
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/14d61e69-b152-4adc-a95c-58748969e299.pid.haproxy
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID 14d61e69-b152-4adc-a95c-58748969e299
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:17:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:25.761 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-14d61e69-b152-4adc-a95c-58748969e299', 'env', 'PROCESS_TAG=haproxy-14d61e69-b152-4adc-a95c-58748969e299', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/14d61e69-b152-4adc-a95c-58748969e299.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:17:25 compute-0 nova_compute[187185]: 2025-11-29 07:17:25.846 187189 DEBUG nova.compute.manager [req-eca93614-7899-4b5e-b5fb-587037e7df77 req-53495974-fff3-4372-a55b-2ca9511e9493 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Received event network-vif-plugged-0ad86a88-0ccb-498d-a1e4-43aef563961d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:17:25 compute-0 nova_compute[187185]: 2025-11-29 07:17:25.857 187189 DEBUG oslo_concurrency.lockutils [req-eca93614-7899-4b5e-b5fb-587037e7df77 req-53495974-fff3-4372-a55b-2ca9511e9493 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:17:25 compute-0 nova_compute[187185]: 2025-11-29 07:17:25.857 187189 DEBUG oslo_concurrency.lockutils [req-eca93614-7899-4b5e-b5fb-587037e7df77 req-53495974-fff3-4372-a55b-2ca9511e9493 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.011s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:17:25 compute-0 nova_compute[187185]: 2025-11-29 07:17:25.858 187189 DEBUG oslo_concurrency.lockutils [req-eca93614-7899-4b5e-b5fb-587037e7df77 req-53495974-fff3-4372-a55b-2ca9511e9493 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:17:25 compute-0 nova_compute[187185]: 2025-11-29 07:17:25.858 187189 DEBUG nova.compute.manager [req-eca93614-7899-4b5e-b5fb-587037e7df77 req-53495974-fff3-4372-a55b-2ca9511e9493 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Processing event network-vif-plugged-0ad86a88-0ccb-498d-a1e4-43aef563961d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 07:17:26 compute-0 nova_compute[187185]: 2025-11-29 07:17:26.052 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400646.0520422, 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:17:26 compute-0 nova_compute[187185]: 2025-11-29 07:17:26.053 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] VM Started (Lifecycle Event)
Nov 29 07:17:26 compute-0 nova_compute[187185]: 2025-11-29 07:17:26.057 187189 DEBUG nova.compute.manager [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:17:26 compute-0 nova_compute[187185]: 2025-11-29 07:17:26.062 187189 DEBUG nova.virt.libvirt.driver [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 07:17:26 compute-0 nova_compute[187185]: 2025-11-29 07:17:26.067 187189 INFO nova.virt.libvirt.driver [-] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Instance spawned successfully.
Nov 29 07:17:26 compute-0 nova_compute[187185]: 2025-11-29 07:17:26.067 187189 DEBUG nova.virt.libvirt.driver [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 07:17:26 compute-0 nova_compute[187185]: 2025-11-29 07:17:26.073 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:17:26 compute-0 nova_compute[187185]: 2025-11-29 07:17:26.079 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:17:26 compute-0 nova_compute[187185]: 2025-11-29 07:17:26.135 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:17:26 compute-0 nova_compute[187185]: 2025-11-29 07:17:26.136 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400646.0523236, 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:17:26 compute-0 nova_compute[187185]: 2025-11-29 07:17:26.136 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] VM Paused (Lifecycle Event)
Nov 29 07:17:26 compute-0 nova_compute[187185]: 2025-11-29 07:17:26.143 187189 DEBUG nova.virt.libvirt.driver [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:17:26 compute-0 nova_compute[187185]: 2025-11-29 07:17:26.143 187189 DEBUG nova.virt.libvirt.driver [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:17:26 compute-0 nova_compute[187185]: 2025-11-29 07:17:26.144 187189 DEBUG nova.virt.libvirt.driver [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:17:26 compute-0 nova_compute[187185]: 2025-11-29 07:17:26.145 187189 DEBUG nova.virt.libvirt.driver [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:17:26 compute-0 nova_compute[187185]: 2025-11-29 07:17:26.146 187189 DEBUG nova.virt.libvirt.driver [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:17:26 compute-0 nova_compute[187185]: 2025-11-29 07:17:26.146 187189 DEBUG nova.virt.libvirt.driver [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:17:26 compute-0 nova_compute[187185]: 2025-11-29 07:17:26.155 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:17:26 compute-0 nova_compute[187185]: 2025-11-29 07:17:26.159 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400646.0615525, 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:17:26 compute-0 nova_compute[187185]: 2025-11-29 07:17:26.159 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] VM Resumed (Lifecycle Event)
Nov 29 07:17:26 compute-0 nova_compute[187185]: 2025-11-29 07:17:26.178 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:17:26 compute-0 nova_compute[187185]: 2025-11-29 07:17:26.182 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:17:26 compute-0 nova_compute[187185]: 2025-11-29 07:17:26.205 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:17:26 compute-0 nova_compute[187185]: 2025-11-29 07:17:26.223 187189 INFO nova.compute.manager [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Took 10.45 seconds to spawn the instance on the hypervisor.
Nov 29 07:17:26 compute-0 nova_compute[187185]: 2025-11-29 07:17:26.224 187189 DEBUG nova.compute.manager [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:17:26 compute-0 podman[230331]: 2025-11-29 07:17:26.18403414 +0000 UTC m=+0.036909958 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:17:26 compute-0 nova_compute[187185]: 2025-11-29 07:17:26.309 187189 INFO nova.compute.manager [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Took 15.15 seconds to build instance.
Nov 29 07:17:26 compute-0 nova_compute[187185]: 2025-11-29 07:17:26.327 187189 DEBUG oslo_concurrency.lockutils [None req-384ba36f-0997-4bda-9965-90ed2af71de5 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.560s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:17:26 compute-0 podman[230331]: 2025-11-29 07:17:26.431342905 +0000 UTC m=+0.284218673 container create ccde70cb9364257c603d1fe4da9389a4053586fec00fbbbcef5816a5e6a351dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS)
Nov 29 07:17:26 compute-0 systemd[1]: Started libpod-conmon-ccde70cb9364257c603d1fe4da9389a4053586fec00fbbbcef5816a5e6a351dd.scope.
Nov 29 07:17:26 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:17:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3d83ce4b69558a10623432d37fce3c4fe68ca8fed21ac0c1281554ff06d199f8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:17:26 compute-0 podman[230331]: 2025-11-29 07:17:26.603801817 +0000 UTC m=+0.456677635 container init ccde70cb9364257c603d1fe4da9389a4053586fec00fbbbcef5816a5e6a351dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:17:26 compute-0 nova_compute[187185]: 2025-11-29 07:17:26.608 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:26 compute-0 podman[230331]: 2025-11-29 07:17:26.612566205 +0000 UTC m=+0.465441983 container start ccde70cb9364257c603d1fe4da9389a4053586fec00fbbbcef5816a5e6a351dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:17:26 compute-0 neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299[230346]: [NOTICE]   (230350) : New worker (230352) forked
Nov 29 07:17:26 compute-0 neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299[230346]: [NOTICE]   (230350) : Loading success.
Nov 29 07:17:27 compute-0 nova_compute[187185]: 2025-11-29 07:17:27.095 187189 DEBUG oslo_concurrency.lockutils [None req-640c24c5-b030-4f40-906c-615d34eecf0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Acquiring lock "1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:17:27 compute-0 nova_compute[187185]: 2025-11-29 07:17:27.095 187189 DEBUG oslo_concurrency.lockutils [None req-640c24c5-b030-4f40-906c-615d34eecf0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:17:27 compute-0 nova_compute[187185]: 2025-11-29 07:17:27.096 187189 DEBUG oslo_concurrency.lockutils [None req-640c24c5-b030-4f40-906c-615d34eecf0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Acquiring lock "1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:17:27 compute-0 nova_compute[187185]: 2025-11-29 07:17:27.096 187189 DEBUG oslo_concurrency.lockutils [None req-640c24c5-b030-4f40-906c-615d34eecf0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:17:27 compute-0 nova_compute[187185]: 2025-11-29 07:17:27.097 187189 DEBUG oslo_concurrency.lockutils [None req-640c24c5-b030-4f40-906c-615d34eecf0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:17:27 compute-0 nova_compute[187185]: 2025-11-29 07:17:27.110 187189 INFO nova.compute.manager [None req-640c24c5-b030-4f40-906c-615d34eecf0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Terminating instance
Nov 29 07:17:27 compute-0 nova_compute[187185]: 2025-11-29 07:17:27.123 187189 DEBUG nova.compute.manager [None req-640c24c5-b030-4f40-906c-615d34eecf0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 07:17:27 compute-0 kernel: tap0ad86a88-0c (unregistering): left promiscuous mode
Nov 29 07:17:27 compute-0 NetworkManager[55227]: <info>  [1764400647.1496] device (tap0ad86a88-0c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:17:27 compute-0 ovn_controller[95281]: 2025-11-29T07:17:27Z|00290|binding|INFO|Releasing lport 0ad86a88-0ccb-498d-a1e4-43aef563961d from this chassis (sb_readonly=0)
Nov 29 07:17:27 compute-0 ovn_controller[95281]: 2025-11-29T07:17:27Z|00291|binding|INFO|Setting lport 0ad86a88-0ccb-498d-a1e4-43aef563961d down in Southbound
Nov 29 07:17:27 compute-0 ovn_controller[95281]: 2025-11-29T07:17:27Z|00292|binding|INFO|Removing iface tap0ad86a88-0c ovn-installed in OVS
Nov 29 07:17:27 compute-0 nova_compute[187185]: 2025-11-29 07:17:27.165 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:27 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:27.174 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:31:f6:04 10.100.0.8'], port_security=['fa:16:3e:31:f6:04 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-14d61e69-b152-4adc-a95c-58748969e299', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '329bbbdd41424742b3045e77150a498e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '24db58f8-235a-4b76-869f-efe13404b22a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=61c05e4b-7426-41e7-9cd6-8f37a87e832e, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=0ad86a88-0ccb-498d-a1e4-43aef563961d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:17:27 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:27.177 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 0ad86a88-0ccb-498d-a1e4-43aef563961d in datapath 14d61e69-b152-4adc-a95c-58748969e299 unbound from our chassis
Nov 29 07:17:27 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:27.179 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 14d61e69-b152-4adc-a95c-58748969e299, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:17:27 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:27.180 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[dcdbf6a6-1888-407c-b87d-251043803b20]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:27 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:27.181 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-14d61e69-b152-4adc-a95c-58748969e299 namespace which is not needed anymore
Nov 29 07:17:27 compute-0 nova_compute[187185]: 2025-11-29 07:17:27.191 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:27 compute-0 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d0000006a.scope: Deactivated successfully.
Nov 29 07:17:27 compute-0 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d0000006a.scope: Consumed 1.732s CPU time.
Nov 29 07:17:27 compute-0 systemd-machined[153486]: Machine qemu-39-instance-0000006a terminated.
Nov 29 07:17:27 compute-0 nova_compute[187185]: 2025-11-29 07:17:27.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:17:27 compute-0 NetworkManager[55227]: <info>  [1764400647.3427] manager: (tap0ad86a88-0c): new Tun device (/org/freedesktop/NetworkManager/Devices/159)
Nov 29 07:17:27 compute-0 nova_compute[187185]: 2025-11-29 07:17:27.359 187189 DEBUG nova.compute.manager [req-d07024ce-897c-4b9f-947b-da3e381875f1 req-2e433bc4-1c3c-42a6-aff6-53ee027cc936 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Received event network-vif-plugged-4138daf0-53ec-4cf3-ad1f-cb966c2e96a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:17:27 compute-0 neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299[230346]: [NOTICE]   (230350) : haproxy version is 2.8.14-c23fe91
Nov 29 07:17:27 compute-0 neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299[230346]: [NOTICE]   (230350) : path to executable is /usr/sbin/haproxy
Nov 29 07:17:27 compute-0 neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299[230346]: [WARNING]  (230350) : Exiting Master process...
Nov 29 07:17:27 compute-0 neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299[230346]: [WARNING]  (230350) : Exiting Master process...
Nov 29 07:17:27 compute-0 nova_compute[187185]: 2025-11-29 07:17:27.359 187189 DEBUG oslo_concurrency.lockutils [req-d07024ce-897c-4b9f-947b-da3e381875f1 req-2e433bc4-1c3c-42a6-aff6-53ee027cc936 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "bb1bd9c2-1ccf-4021-b983-63a50858328f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:17:27 compute-0 nova_compute[187185]: 2025-11-29 07:17:27.360 187189 DEBUG oslo_concurrency.lockutils [req-d07024ce-897c-4b9f-947b-da3e381875f1 req-2e433bc4-1c3c-42a6-aff6-53ee027cc936 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "bb1bd9c2-1ccf-4021-b983-63a50858328f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:17:27 compute-0 nova_compute[187185]: 2025-11-29 07:17:27.360 187189 DEBUG oslo_concurrency.lockutils [req-d07024ce-897c-4b9f-947b-da3e381875f1 req-2e433bc4-1c3c-42a6-aff6-53ee027cc936 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "bb1bd9c2-1ccf-4021-b983-63a50858328f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:17:27 compute-0 nova_compute[187185]: 2025-11-29 07:17:27.360 187189 DEBUG nova.compute.manager [req-d07024ce-897c-4b9f-947b-da3e381875f1 req-2e433bc4-1c3c-42a6-aff6-53ee027cc936 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] No waiting events found dispatching network-vif-plugged-4138daf0-53ec-4cf3-ad1f-cb966c2e96a3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:17:27 compute-0 nova_compute[187185]: 2025-11-29 07:17:27.360 187189 WARNING nova.compute.manager [req-d07024ce-897c-4b9f-947b-da3e381875f1 req-2e433bc4-1c3c-42a6-aff6-53ee027cc936 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Received unexpected event network-vif-plugged-4138daf0-53ec-4cf3-ad1f-cb966c2e96a3 for instance with vm_state paused and task_state shelving_image_uploading.
Nov 29 07:17:27 compute-0 neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299[230346]: [ALERT]    (230350) : Current worker (230352) exited with code 143 (Terminated)
Nov 29 07:17:27 compute-0 neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299[230346]: [WARNING]  (230350) : All workers exited. Exiting... (0)
Nov 29 07:17:27 compute-0 systemd[1]: libpod-ccde70cb9364257c603d1fe4da9389a4053586fec00fbbbcef5816a5e6a351dd.scope: Deactivated successfully.
Nov 29 07:17:27 compute-0 podman[230385]: 2025-11-29 07:17:27.375052093 +0000 UTC m=+0.063650136 container died ccde70cb9364257c603d1fe4da9389a4053586fec00fbbbcef5816a5e6a351dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 07:17:27 compute-0 nova_compute[187185]: 2025-11-29 07:17:27.398 187189 INFO nova.virt.libvirt.driver [-] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Instance destroyed successfully.
Nov 29 07:17:27 compute-0 nova_compute[187185]: 2025-11-29 07:17:27.399 187189 DEBUG nova.objects.instance [None req-640c24c5-b030-4f40-906c-615d34eecf0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lazy-loading 'resources' on Instance uuid 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:17:27 compute-0 nova_compute[187185]: 2025-11-29 07:17:27.412 187189 DEBUG nova.virt.libvirt.vif [None req-640c24c5-b030-4f40-906c-615d34eecf0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:17:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-138799320',display_name='tempest-ServersNegativeTestJSON-server-138799320',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-138799320',id=106,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:17:26Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='329bbbdd41424742b3045e77150a498e',ramdisk_id='',reservation_id='r-8zzi07jm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-1191192320',owner_user_name='tempest-ServersNegativeTestJSON-1191192320-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:17:26Z,user_data=None,user_id='2647a3e4fc214b4a85db1283eb7ef117',uuid=1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0ad86a88-0ccb-498d-a1e4-43aef563961d", "address": "fa:16:3e:31:f6:04", "network": {"id": "14d61e69-b152-4adc-a95c-58748969e299", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-2003556983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "329bbbdd41424742b3045e77150a498e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ad86a88-0c", "ovs_interfaceid": "0ad86a88-0ccb-498d-a1e4-43aef563961d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:17:27 compute-0 nova_compute[187185]: 2025-11-29 07:17:27.413 187189 DEBUG nova.network.os_vif_util [None req-640c24c5-b030-4f40-906c-615d34eecf0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Converting VIF {"id": "0ad86a88-0ccb-498d-a1e4-43aef563961d", "address": "fa:16:3e:31:f6:04", "network": {"id": "14d61e69-b152-4adc-a95c-58748969e299", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-2003556983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "329bbbdd41424742b3045e77150a498e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ad86a88-0c", "ovs_interfaceid": "0ad86a88-0ccb-498d-a1e4-43aef563961d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:17:27 compute-0 nova_compute[187185]: 2025-11-29 07:17:27.414 187189 DEBUG nova.network.os_vif_util [None req-640c24c5-b030-4f40-906c-615d34eecf0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:31:f6:04,bridge_name='br-int',has_traffic_filtering=True,id=0ad86a88-0ccb-498d-a1e4-43aef563961d,network=Network(14d61e69-b152-4adc-a95c-58748969e299),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ad86a88-0c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:17:27 compute-0 nova_compute[187185]: 2025-11-29 07:17:27.414 187189 DEBUG os_vif [None req-640c24c5-b030-4f40-906c-615d34eecf0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:f6:04,bridge_name='br-int',has_traffic_filtering=True,id=0ad86a88-0ccb-498d-a1e4-43aef563961d,network=Network(14d61e69-b152-4adc-a95c-58748969e299),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ad86a88-0c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:17:27 compute-0 nova_compute[187185]: 2025-11-29 07:17:27.416 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:27 compute-0 nova_compute[187185]: 2025-11-29 07:17:27.416 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0ad86a88-0c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:17:27 compute-0 nova_compute[187185]: 2025-11-29 07:17:27.417 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:27 compute-0 nova_compute[187185]: 2025-11-29 07:17:27.419 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:17:27 compute-0 nova_compute[187185]: 2025-11-29 07:17:27.424 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:27 compute-0 nova_compute[187185]: 2025-11-29 07:17:27.428 187189 INFO os_vif [None req-640c24c5-b030-4f40-906c-615d34eecf0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:f6:04,bridge_name='br-int',has_traffic_filtering=True,id=0ad86a88-0ccb-498d-a1e4-43aef563961d,network=Network(14d61e69-b152-4adc-a95c-58748969e299),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ad86a88-0c')
Nov 29 07:17:27 compute-0 nova_compute[187185]: 2025-11-29 07:17:27.428 187189 INFO nova.virt.libvirt.driver [None req-640c24c5-b030-4f40-906c-615d34eecf0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Deleting instance files /var/lib/nova/instances/1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d_del
Nov 29 07:17:27 compute-0 nova_compute[187185]: 2025-11-29 07:17:27.429 187189 INFO nova.virt.libvirt.driver [None req-640c24c5-b030-4f40-906c-615d34eecf0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Deletion of /var/lib/nova/instances/1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d_del complete
Nov 29 07:17:27 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ccde70cb9364257c603d1fe4da9389a4053586fec00fbbbcef5816a5e6a351dd-userdata-shm.mount: Deactivated successfully.
Nov 29 07:17:27 compute-0 systemd[1]: var-lib-containers-storage-overlay-3d83ce4b69558a10623432d37fce3c4fe68ca8fed21ac0c1281554ff06d199f8-merged.mount: Deactivated successfully.
Nov 29 07:17:27 compute-0 podman[230385]: 2025-11-29 07:17:27.51350406 +0000 UTC m=+0.202102103 container cleanup ccde70cb9364257c603d1fe4da9389a4053586fec00fbbbcef5816a5e6a351dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:17:27 compute-0 nova_compute[187185]: 2025-11-29 07:17:27.515 187189 INFO nova.compute.manager [None req-640c24c5-b030-4f40-906c-615d34eecf0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Took 0.39 seconds to destroy the instance on the hypervisor.
Nov 29 07:17:27 compute-0 nova_compute[187185]: 2025-11-29 07:17:27.516 187189 DEBUG oslo.service.loopingcall [None req-640c24c5-b030-4f40-906c-615d34eecf0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 07:17:27 compute-0 nova_compute[187185]: 2025-11-29 07:17:27.517 187189 DEBUG nova.compute.manager [-] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 07:17:27 compute-0 nova_compute[187185]: 2025-11-29 07:17:27.517 187189 DEBUG nova.network.neutron [-] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 07:17:27 compute-0 systemd[1]: libpod-conmon-ccde70cb9364257c603d1fe4da9389a4053586fec00fbbbcef5816a5e6a351dd.scope: Deactivated successfully.
Nov 29 07:17:27 compute-0 podman[230435]: 2025-11-29 07:17:27.658549175 +0000 UTC m=+0.119752858 container remove ccde70cb9364257c603d1fe4da9389a4053586fec00fbbbcef5816a5e6a351dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 29 07:17:27 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:27.668 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[7a69293c-ab8f-42d7-a502-7ed3e5de4975]: (4, ('Sat Nov 29 07:17:27 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299 (ccde70cb9364257c603d1fe4da9389a4053586fec00fbbbcef5816a5e6a351dd)\nccde70cb9364257c603d1fe4da9389a4053586fec00fbbbcef5816a5e6a351dd\nSat Nov 29 07:17:27 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299 (ccde70cb9364257c603d1fe4da9389a4053586fec00fbbbcef5816a5e6a351dd)\nccde70cb9364257c603d1fe4da9389a4053586fec00fbbbcef5816a5e6a351dd\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:27 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:27.670 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[70a210c2-d4a4-4c91-b273-f8f9bd1c7d86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:27 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:27.672 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap14d61e69-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:17:27 compute-0 nova_compute[187185]: 2025-11-29 07:17:27.675 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:27 compute-0 kernel: tap14d61e69-b0: left promiscuous mode
Nov 29 07:17:27 compute-0 nova_compute[187185]: 2025-11-29 07:17:27.690 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:27 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:27.695 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[4ff393ce-a80d-43d8-90fd-2418543e8deb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:27 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:27.708 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[a8c94fd5-a81d-4208-9f92-accdc6cd9dcd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:27 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:27.709 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[90c6f503-16fd-46f4-94fd-b4db50ed7986]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:27 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:27.729 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[fda5301c-70b5-42b6-8ed9-670ec7b73b18]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 609311, 'reachable_time': 44079, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230454, 'error': None, 'target': 'ovnmeta-14d61e69-b152-4adc-a95c-58748969e299', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:27 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:27.733 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-14d61e69-b152-4adc-a95c-58748969e299 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:17:27 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:17:27.734 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[b9f680a8-6ea6-4b63-ba00-c560206bdc3f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:17:27 compute-0 systemd[1]: run-netns-ovnmeta\x2d14d61e69\x2db152\x2d4adc\x2da95c\x2d58748969e299.mount: Deactivated successfully.
Nov 29 07:17:29 compute-0 nova_compute[187185]: 2025-11-29 07:17:29.071 187189 DEBUG nova.compute.manager [req-8eea2cff-dbda-43b9-82c6-acde6efab0de req-301cb3d9-31f2-43c2-b1e7-78fc63032b86 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Received event network-vif-plugged-0ad86a88-0ccb-498d-a1e4-43aef563961d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:17:29 compute-0 nova_compute[187185]: 2025-11-29 07:17:29.071 187189 DEBUG oslo_concurrency.lockutils [req-8eea2cff-dbda-43b9-82c6-acde6efab0de req-301cb3d9-31f2-43c2-b1e7-78fc63032b86 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:17:29 compute-0 nova_compute[187185]: 2025-11-29 07:17:29.072 187189 DEBUG oslo_concurrency.lockutils [req-8eea2cff-dbda-43b9-82c6-acde6efab0de req-301cb3d9-31f2-43c2-b1e7-78fc63032b86 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:17:29 compute-0 nova_compute[187185]: 2025-11-29 07:17:29.072 187189 DEBUG oslo_concurrency.lockutils [req-8eea2cff-dbda-43b9-82c6-acde6efab0de req-301cb3d9-31f2-43c2-b1e7-78fc63032b86 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:17:29 compute-0 nova_compute[187185]: 2025-11-29 07:17:29.073 187189 DEBUG nova.compute.manager [req-8eea2cff-dbda-43b9-82c6-acde6efab0de req-301cb3d9-31f2-43c2-b1e7-78fc63032b86 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] No waiting events found dispatching network-vif-plugged-0ad86a88-0ccb-498d-a1e4-43aef563961d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:17:29 compute-0 nova_compute[187185]: 2025-11-29 07:17:29.073 187189 WARNING nova.compute.manager [req-8eea2cff-dbda-43b9-82c6-acde6efab0de req-301cb3d9-31f2-43c2-b1e7-78fc63032b86 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Received unexpected event network-vif-plugged-0ad86a88-0ccb-498d-a1e4-43aef563961d for instance with vm_state active and task_state deleting.
Nov 29 07:17:29 compute-0 nova_compute[187185]: 2025-11-29 07:17:29.073 187189 DEBUG nova.compute.manager [req-8eea2cff-dbda-43b9-82c6-acde6efab0de req-301cb3d9-31f2-43c2-b1e7-78fc63032b86 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Received event network-vif-unplugged-0ad86a88-0ccb-498d-a1e4-43aef563961d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:17:29 compute-0 nova_compute[187185]: 2025-11-29 07:17:29.074 187189 DEBUG oslo_concurrency.lockutils [req-8eea2cff-dbda-43b9-82c6-acde6efab0de req-301cb3d9-31f2-43c2-b1e7-78fc63032b86 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:17:29 compute-0 nova_compute[187185]: 2025-11-29 07:17:29.074 187189 DEBUG oslo_concurrency.lockutils [req-8eea2cff-dbda-43b9-82c6-acde6efab0de req-301cb3d9-31f2-43c2-b1e7-78fc63032b86 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:17:29 compute-0 nova_compute[187185]: 2025-11-29 07:17:29.075 187189 DEBUG oslo_concurrency.lockutils [req-8eea2cff-dbda-43b9-82c6-acde6efab0de req-301cb3d9-31f2-43c2-b1e7-78fc63032b86 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:17:29 compute-0 nova_compute[187185]: 2025-11-29 07:17:29.075 187189 DEBUG nova.compute.manager [req-8eea2cff-dbda-43b9-82c6-acde6efab0de req-301cb3d9-31f2-43c2-b1e7-78fc63032b86 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] No waiting events found dispatching network-vif-unplugged-0ad86a88-0ccb-498d-a1e4-43aef563961d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:17:29 compute-0 nova_compute[187185]: 2025-11-29 07:17:29.075 187189 DEBUG nova.compute.manager [req-8eea2cff-dbda-43b9-82c6-acde6efab0de req-301cb3d9-31f2-43c2-b1e7-78fc63032b86 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Received event network-vif-unplugged-0ad86a88-0ccb-498d-a1e4-43aef563961d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 07:17:29 compute-0 nova_compute[187185]: 2025-11-29 07:17:29.963 187189 INFO nova.virt.libvirt.driver [None req-f1995a4f-5e4a-4927-b891-b40186b5caab ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Snapshot image upload complete
Nov 29 07:17:29 compute-0 nova_compute[187185]: 2025-11-29 07:17:29.963 187189 DEBUG nova.compute.manager [None req-f1995a4f-5e4a-4927-b891-b40186b5caab ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:17:30 compute-0 nova_compute[187185]: 2025-11-29 07:17:30.479 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:30 compute-0 nova_compute[187185]: 2025-11-29 07:17:30.794 187189 DEBUG nova.network.neutron [-] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:17:30 compute-0 nova_compute[187185]: 2025-11-29 07:17:30.807 187189 INFO nova.compute.manager [None req-f1995a4f-5e4a-4927-b891-b40186b5caab ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Shelve offloading
Nov 29 07:17:30 compute-0 nova_compute[187185]: 2025-11-29 07:17:30.819 187189 INFO nova.compute.manager [-] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Took 3.30 seconds to deallocate network for instance.
Nov 29 07:17:30 compute-0 nova_compute[187185]: 2025-11-29 07:17:30.843 187189 INFO nova.virt.libvirt.driver [-] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Instance destroyed successfully.
Nov 29 07:17:30 compute-0 nova_compute[187185]: 2025-11-29 07:17:30.844 187189 DEBUG nova.compute.manager [None req-f1995a4f-5e4a-4927-b891-b40186b5caab ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:17:30 compute-0 nova_compute[187185]: 2025-11-29 07:17:30.847 187189 DEBUG oslo_concurrency.lockutils [None req-f1995a4f-5e4a-4927-b891-b40186b5caab ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "refresh_cache-bb1bd9c2-1ccf-4021-b983-63a50858328f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:17:30 compute-0 nova_compute[187185]: 2025-11-29 07:17:30.847 187189 DEBUG oslo_concurrency.lockutils [None req-f1995a4f-5e4a-4927-b891-b40186b5caab ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquired lock "refresh_cache-bb1bd9c2-1ccf-4021-b983-63a50858328f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:17:30 compute-0 nova_compute[187185]: 2025-11-29 07:17:30.848 187189 DEBUG nova.network.neutron [None req-f1995a4f-5e4a-4927-b891-b40186b5caab ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:17:30 compute-0 nova_compute[187185]: 2025-11-29 07:17:30.892 187189 DEBUG nova.compute.manager [req-7c781cf3-7892-4990-9991-dc5b28e96865 req-49068c30-3831-47fd-903c-1498da2c84a7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Received event network-vif-deleted-0ad86a88-0ccb-498d-a1e4-43aef563961d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:17:30 compute-0 nova_compute[187185]: 2025-11-29 07:17:30.902 187189 DEBUG oslo_concurrency.lockutils [None req-640c24c5-b030-4f40-906c-615d34eecf0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:17:30 compute-0 nova_compute[187185]: 2025-11-29 07:17:30.902 187189 DEBUG oslo_concurrency.lockutils [None req-640c24c5-b030-4f40-906c-615d34eecf0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:17:30 compute-0 nova_compute[187185]: 2025-11-29 07:17:30.991 187189 DEBUG nova.compute.provider_tree [None req-640c24c5-b030-4f40-906c-615d34eecf0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:17:31 compute-0 nova_compute[187185]: 2025-11-29 07:17:31.038 187189 DEBUG nova.scheduler.client.report [None req-640c24c5-b030-4f40-906c-615d34eecf0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:17:31 compute-0 nova_compute[187185]: 2025-11-29 07:17:31.077 187189 DEBUG oslo_concurrency.lockutils [None req-640c24c5-b030-4f40-906c-615d34eecf0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.175s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:17:31 compute-0 nova_compute[187185]: 2025-11-29 07:17:31.119 187189 INFO nova.scheduler.client.report [None req-640c24c5-b030-4f40-906c-615d34eecf0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Deleted allocations for instance 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d
Nov 29 07:17:31 compute-0 nova_compute[187185]: 2025-11-29 07:17:31.171 187189 DEBUG nova.compute.manager [req-0ccee920-dc3f-4b18-9690-ef93ed9309f0 req-c5a05709-bbca-4f54-b76e-ba9484130791 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Received event network-vif-plugged-0ad86a88-0ccb-498d-a1e4-43aef563961d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:17:31 compute-0 nova_compute[187185]: 2025-11-29 07:17:31.172 187189 DEBUG oslo_concurrency.lockutils [req-0ccee920-dc3f-4b18-9690-ef93ed9309f0 req-c5a05709-bbca-4f54-b76e-ba9484130791 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:17:31 compute-0 nova_compute[187185]: 2025-11-29 07:17:31.172 187189 DEBUG oslo_concurrency.lockutils [req-0ccee920-dc3f-4b18-9690-ef93ed9309f0 req-c5a05709-bbca-4f54-b76e-ba9484130791 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:17:31 compute-0 nova_compute[187185]: 2025-11-29 07:17:31.173 187189 DEBUG oslo_concurrency.lockutils [req-0ccee920-dc3f-4b18-9690-ef93ed9309f0 req-c5a05709-bbca-4f54-b76e-ba9484130791 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:17:31 compute-0 nova_compute[187185]: 2025-11-29 07:17:31.173 187189 DEBUG nova.compute.manager [req-0ccee920-dc3f-4b18-9690-ef93ed9309f0 req-c5a05709-bbca-4f54-b76e-ba9484130791 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] No waiting events found dispatching network-vif-plugged-0ad86a88-0ccb-498d-a1e4-43aef563961d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:17:31 compute-0 nova_compute[187185]: 2025-11-29 07:17:31.173 187189 WARNING nova.compute.manager [req-0ccee920-dc3f-4b18-9690-ef93ed9309f0 req-c5a05709-bbca-4f54-b76e-ba9484130791 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Received unexpected event network-vif-plugged-0ad86a88-0ccb-498d-a1e4-43aef563961d for instance with vm_state deleted and task_state None.
Nov 29 07:17:31 compute-0 nova_compute[187185]: 2025-11-29 07:17:31.203 187189 DEBUG oslo_concurrency.lockutils [None req-640c24c5-b030-4f40-906c-615d34eecf0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:17:32 compute-0 nova_compute[187185]: 2025-11-29 07:17:32.419 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:32 compute-0 nova_compute[187185]: 2025-11-29 07:17:32.974 187189 DEBUG nova.network.neutron [None req-f1995a4f-5e4a-4927-b891-b40186b5caab ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Updating instance_info_cache with network_info: [{"id": "4138daf0-53ec-4cf3-ad1f-cb966c2e96a3", "address": "fa:16:3e:bd:1e:b0", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4138daf0-53", "ovs_interfaceid": "4138daf0-53ec-4cf3-ad1f-cb966c2e96a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:17:33 compute-0 nova_compute[187185]: 2025-11-29 07:17:33.256 187189 DEBUG oslo_concurrency.lockutils [None req-f1995a4f-5e4a-4927-b891-b40186b5caab ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Releasing lock "refresh_cache-bb1bd9c2-1ccf-4021-b983-63a50858328f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:17:35 compute-0 nova_compute[187185]: 2025-11-29 07:17:35.481 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:36 compute-0 podman[230455]: 2025-11-29 07:17:36.805602072 +0000 UTC m=+0.064120180 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:17:36 compute-0 podman[230457]: 2025-11-29 07:17:36.815318538 +0000 UTC m=+0.067916368 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 07:17:36 compute-0 podman[230456]: 2025-11-29 07:17:36.820504135 +0000 UTC m=+0.076825731 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., name=ubi9-minimal, config_id=edpm, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, container_name=openstack_network_exporter, architecture=x86_64)
Nov 29 07:17:36 compute-0 nova_compute[187185]: 2025-11-29 07:17:36.926 187189 INFO nova.virt.libvirt.driver [-] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Instance destroyed successfully.
Nov 29 07:17:36 compute-0 nova_compute[187185]: 2025-11-29 07:17:36.926 187189 DEBUG nova.objects.instance [None req-f1995a4f-5e4a-4927-b891-b40186b5caab ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lazy-loading 'resources' on Instance uuid bb1bd9c2-1ccf-4021-b983-63a50858328f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:17:36 compute-0 nova_compute[187185]: 2025-11-29 07:17:36.945 187189 DEBUG nova.virt.libvirt.vif [None req-f1995a4f-5e4a-4927-b891-b40186b5caab ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:17:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1640649360',display_name='tempest-ServerActionsTestOtherB-server-1640649360',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1640649360',id=105,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:17:17Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='32e51e3a9a8f4a1ca6e022735ebf5f7b',ramdisk_id='',reservation_id='r-nwwrzkpv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1538648925',owner_user_name='tempest-ServerActionsTestOtherB-1538648925-project-member',shelved_at='2025-11-29T07:17:29.963725',shelved_host='compute-0.ctlplane.example.com',shelved_image_id='c6c2714c-f58c-4ee4-8e98-2614dbfcac37'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:17:25Z,user_data=None,user_id='ee2d4931cb504b13b92a2f52c95c05ce',uuid=bb1bd9c2-1ccf-4021-b983-63a50858328f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "4138daf0-53ec-4cf3-ad1f-cb966c2e96a3", "address": "fa:16:3e:bd:1e:b0", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4138daf0-53", "ovs_interfaceid": "4138daf0-53ec-4cf3-ad1f-cb966c2e96a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:17:36 compute-0 nova_compute[187185]: 2025-11-29 07:17:36.946 187189 DEBUG nova.network.os_vif_util [None req-f1995a4f-5e4a-4927-b891-b40186b5caab ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Converting VIF {"id": "4138daf0-53ec-4cf3-ad1f-cb966c2e96a3", "address": "fa:16:3e:bd:1e:b0", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4138daf0-53", "ovs_interfaceid": "4138daf0-53ec-4cf3-ad1f-cb966c2e96a3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:17:36 compute-0 nova_compute[187185]: 2025-11-29 07:17:36.947 187189 DEBUG nova.network.os_vif_util [None req-f1995a4f-5e4a-4927-b891-b40186b5caab ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bd:1e:b0,bridge_name='br-int',has_traffic_filtering=True,id=4138daf0-53ec-4cf3-ad1f-cb966c2e96a3,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4138daf0-53') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:17:36 compute-0 nova_compute[187185]: 2025-11-29 07:17:36.948 187189 DEBUG os_vif [None req-f1995a4f-5e4a-4927-b891-b40186b5caab ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bd:1e:b0,bridge_name='br-int',has_traffic_filtering=True,id=4138daf0-53ec-4cf3-ad1f-cb966c2e96a3,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4138daf0-53') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:17:36 compute-0 nova_compute[187185]: 2025-11-29 07:17:36.950 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:36 compute-0 nova_compute[187185]: 2025-11-29 07:17:36.950 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4138daf0-53, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:17:36 compute-0 nova_compute[187185]: 2025-11-29 07:17:36.952 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:36 compute-0 nova_compute[187185]: 2025-11-29 07:17:36.955 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:17:36 compute-0 nova_compute[187185]: 2025-11-29 07:17:36.958 187189 INFO os_vif [None req-f1995a4f-5e4a-4927-b891-b40186b5caab ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bd:1e:b0,bridge_name='br-int',has_traffic_filtering=True,id=4138daf0-53ec-4cf3-ad1f-cb966c2e96a3,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4138daf0-53')
Nov 29 07:17:36 compute-0 nova_compute[187185]: 2025-11-29 07:17:36.959 187189 INFO nova.virt.libvirt.driver [None req-f1995a4f-5e4a-4927-b891-b40186b5caab ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Deleting instance files /var/lib/nova/instances/bb1bd9c2-1ccf-4021-b983-63a50858328f_del
Nov 29 07:17:36 compute-0 nova_compute[187185]: 2025-11-29 07:17:36.960 187189 INFO nova.virt.libvirt.driver [None req-f1995a4f-5e4a-4927-b891-b40186b5caab ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Deletion of /var/lib/nova/instances/bb1bd9c2-1ccf-4021-b983-63a50858328f_del complete
Nov 29 07:17:37 compute-0 nova_compute[187185]: 2025-11-29 07:17:37.487 187189 DEBUG nova.compute.manager [req-4e550334-d66b-463c-84b1-7afd78cdb6e7 req-3a4030a0-96fa-4da8-b662-4ee9bc95b82e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Received event network-changed-4138daf0-53ec-4cf3-ad1f-cb966c2e96a3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:17:37 compute-0 nova_compute[187185]: 2025-11-29 07:17:37.488 187189 DEBUG nova.compute.manager [req-4e550334-d66b-463c-84b1-7afd78cdb6e7 req-3a4030a0-96fa-4da8-b662-4ee9bc95b82e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Refreshing instance network info cache due to event network-changed-4138daf0-53ec-4cf3-ad1f-cb966c2e96a3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:17:37 compute-0 nova_compute[187185]: 2025-11-29 07:17:37.488 187189 DEBUG oslo_concurrency.lockutils [req-4e550334-d66b-463c-84b1-7afd78cdb6e7 req-3a4030a0-96fa-4da8-b662-4ee9bc95b82e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-bb1bd9c2-1ccf-4021-b983-63a50858328f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:17:37 compute-0 nova_compute[187185]: 2025-11-29 07:17:37.489 187189 DEBUG oslo_concurrency.lockutils [req-4e550334-d66b-463c-84b1-7afd78cdb6e7 req-3a4030a0-96fa-4da8-b662-4ee9bc95b82e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-bb1bd9c2-1ccf-4021-b983-63a50858328f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:17:37 compute-0 nova_compute[187185]: 2025-11-29 07:17:37.489 187189 DEBUG nova.network.neutron [req-4e550334-d66b-463c-84b1-7afd78cdb6e7 req-3a4030a0-96fa-4da8-b662-4ee9bc95b82e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Refreshing network info cache for port 4138daf0-53ec-4cf3-ad1f-cb966c2e96a3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:17:38 compute-0 nova_compute[187185]: 2025-11-29 07:17:38.109 187189 INFO nova.scheduler.client.report [None req-f1995a4f-5e4a-4927-b891-b40186b5caab ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Deleted allocations for instance bb1bd9c2-1ccf-4021-b983-63a50858328f
Nov 29 07:17:38 compute-0 nova_compute[187185]: 2025-11-29 07:17:38.330 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:17:38 compute-0 nova_compute[187185]: 2025-11-29 07:17:38.330 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 29 07:17:38 compute-0 nova_compute[187185]: 2025-11-29 07:17:38.448 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 29 07:17:38 compute-0 nova_compute[187185]: 2025-11-29 07:17:38.463 187189 DEBUG oslo_concurrency.lockutils [None req-f1995a4f-5e4a-4927-b891-b40186b5caab ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:17:38 compute-0 nova_compute[187185]: 2025-11-29 07:17:38.463 187189 DEBUG oslo_concurrency.lockutils [None req-f1995a4f-5e4a-4927-b891-b40186b5caab ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:17:38 compute-0 nova_compute[187185]: 2025-11-29 07:17:38.503 187189 DEBUG nova.compute.provider_tree [None req-f1995a4f-5e4a-4927-b891-b40186b5caab ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:17:38 compute-0 nova_compute[187185]: 2025-11-29 07:17:38.562 187189 DEBUG nova.scheduler.client.report [None req-f1995a4f-5e4a-4927-b891-b40186b5caab ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:17:38 compute-0 nova_compute[187185]: 2025-11-29 07:17:38.710 187189 DEBUG oslo_concurrency.lockutils [None req-f1995a4f-5e4a-4927-b891-b40186b5caab ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.247s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:17:39 compute-0 nova_compute[187185]: 2025-11-29 07:17:39.400 187189 DEBUG oslo_concurrency.lockutils [None req-f1995a4f-5e4a-4927-b891-b40186b5caab ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "bb1bd9c2-1ccf-4021-b983-63a50858328f" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 15.420s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:17:39 compute-0 nova_compute[187185]: 2025-11-29 07:17:39.447 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400644.4461982, bb1bd9c2-1ccf-4021-b983-63a50858328f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:17:39 compute-0 nova_compute[187185]: 2025-11-29 07:17:39.448 187189 INFO nova.compute.manager [-] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] VM Stopped (Lifecycle Event)
Nov 29 07:17:39 compute-0 nova_compute[187185]: 2025-11-29 07:17:39.651 187189 DEBUG nova.compute.manager [None req-c16f9151-09a4-4874-8e79-99e7dd9f478a - - - - - -] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:17:40 compute-0 nova_compute[187185]: 2025-11-29 07:17:40.483 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:41 compute-0 nova_compute[187185]: 2025-11-29 07:17:41.199 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:41 compute-0 nova_compute[187185]: 2025-11-29 07:17:41.953 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:42 compute-0 nova_compute[187185]: 2025-11-29 07:17:42.397 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400647.396097, 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:17:42 compute-0 nova_compute[187185]: 2025-11-29 07:17:42.398 187189 INFO nova.compute.manager [-] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] VM Stopped (Lifecycle Event)
Nov 29 07:17:42 compute-0 nova_compute[187185]: 2025-11-29 07:17:42.477 187189 DEBUG nova.compute.manager [None req-67280ccc-00a3-438c-83f6-81712af82004 - - - - - -] [instance: 1d412eda-3ff5-4b5f-bd54-dcc7959ffc2d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:17:42 compute-0 nova_compute[187185]: 2025-11-29 07:17:42.910 187189 DEBUG nova.network.neutron [req-4e550334-d66b-463c-84b1-7afd78cdb6e7 req-3a4030a0-96fa-4da8-b662-4ee9bc95b82e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Updated VIF entry in instance network info cache for port 4138daf0-53ec-4cf3-ad1f-cb966c2e96a3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:17:42 compute-0 nova_compute[187185]: 2025-11-29 07:17:42.911 187189 DEBUG nova.network.neutron [req-4e550334-d66b-463c-84b1-7afd78cdb6e7 req-3a4030a0-96fa-4da8-b662-4ee9bc95b82e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bb1bd9c2-1ccf-4021-b983-63a50858328f] Updating instance_info_cache with network_info: [{"id": "4138daf0-53ec-4cf3-ad1f-cb966c2e96a3", "address": "fa:16:3e:bd:1e:b0", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": null, "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tap4138daf0-53", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:17:43 compute-0 nova_compute[187185]: 2025-11-29 07:17:43.323 187189 DEBUG oslo_concurrency.lockutils [req-4e550334-d66b-463c-84b1-7afd78cdb6e7 req-3a4030a0-96fa-4da8-b662-4ee9bc95b82e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-bb1bd9c2-1ccf-4021-b983-63a50858328f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:17:45 compute-0 nova_compute[187185]: 2025-11-29 07:17:45.485 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:45 compute-0 podman[230516]: 2025-11-29 07:17:45.869586882 +0000 UTC m=+0.099911115 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 07:17:47 compute-0 nova_compute[187185]: 2025-11-29 07:17:47.004 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:47 compute-0 nova_compute[187185]: 2025-11-29 07:17:47.857 187189 DEBUG oslo_concurrency.lockutils [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "18ed7c04-c274-458a-9bcf-da56ec34bd62" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:17:47 compute-0 nova_compute[187185]: 2025-11-29 07:17:47.858 187189 DEBUG oslo_concurrency.lockutils [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "18ed7c04-c274-458a-9bcf-da56ec34bd62" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:17:47 compute-0 nova_compute[187185]: 2025-11-29 07:17:47.978 187189 DEBUG nova.compute.manager [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 07:17:48 compute-0 nova_compute[187185]: 2025-11-29 07:17:48.267 187189 DEBUG oslo_concurrency.lockutils [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:17:48 compute-0 nova_compute[187185]: 2025-11-29 07:17:48.267 187189 DEBUG oslo_concurrency.lockutils [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:17:48 compute-0 nova_compute[187185]: 2025-11-29 07:17:48.274 187189 DEBUG nova.virt.hardware [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 07:17:48 compute-0 nova_compute[187185]: 2025-11-29 07:17:48.275 187189 INFO nova.compute.claims [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Claim successful on node compute-0.ctlplane.example.com
Nov 29 07:17:48 compute-0 nova_compute[187185]: 2025-11-29 07:17:48.633 187189 DEBUG nova.compute.provider_tree [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:17:48 compute-0 nova_compute[187185]: 2025-11-29 07:17:48.741 187189 DEBUG nova.scheduler.client.report [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:17:48 compute-0 nova_compute[187185]: 2025-11-29 07:17:48.819 187189 DEBUG oslo_concurrency.lockutils [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.552s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:17:48 compute-0 nova_compute[187185]: 2025-11-29 07:17:48.820 187189 DEBUG nova.compute.manager [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 07:17:49 compute-0 nova_compute[187185]: 2025-11-29 07:17:49.074 187189 DEBUG nova.compute.manager [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 07:17:49 compute-0 nova_compute[187185]: 2025-11-29 07:17:49.074 187189 DEBUG nova.network.neutron [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 07:17:49 compute-0 nova_compute[187185]: 2025-11-29 07:17:49.227 187189 INFO nova.virt.libvirt.driver [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 07:17:49 compute-0 nova_compute[187185]: 2025-11-29 07:17:49.267 187189 DEBUG nova.policy [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 07:17:49 compute-0 nova_compute[187185]: 2025-11-29 07:17:49.349 187189 DEBUG nova.compute.manager [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 07:17:50 compute-0 nova_compute[187185]: 2025-11-29 07:17:50.487 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:51 compute-0 nova_compute[187185]: 2025-11-29 07:17:51.664 187189 DEBUG nova.compute.manager [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 07:17:51 compute-0 nova_compute[187185]: 2025-11-29 07:17:51.666 187189 DEBUG nova.virt.libvirt.driver [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 07:17:51 compute-0 nova_compute[187185]: 2025-11-29 07:17:51.667 187189 INFO nova.virt.libvirt.driver [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Creating image(s)
Nov 29 07:17:51 compute-0 nova_compute[187185]: 2025-11-29 07:17:51.668 187189 DEBUG oslo_concurrency.lockutils [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "/var/lib/nova/instances/18ed7c04-c274-458a-9bcf-da56ec34bd62/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:17:51 compute-0 nova_compute[187185]: 2025-11-29 07:17:51.669 187189 DEBUG oslo_concurrency.lockutils [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "/var/lib/nova/instances/18ed7c04-c274-458a-9bcf-da56ec34bd62/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:17:51 compute-0 nova_compute[187185]: 2025-11-29 07:17:51.670 187189 DEBUG oslo_concurrency.lockutils [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "/var/lib/nova/instances/18ed7c04-c274-458a-9bcf-da56ec34bd62/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:17:51 compute-0 nova_compute[187185]: 2025-11-29 07:17:51.697 187189 DEBUG oslo_concurrency.processutils [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:17:51 compute-0 nova_compute[187185]: 2025-11-29 07:17:51.770 187189 DEBUG oslo_concurrency.processutils [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:17:51 compute-0 nova_compute[187185]: 2025-11-29 07:17:51.772 187189 DEBUG oslo_concurrency.lockutils [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:17:51 compute-0 nova_compute[187185]: 2025-11-29 07:17:51.772 187189 DEBUG oslo_concurrency.lockutils [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:17:51 compute-0 nova_compute[187185]: 2025-11-29 07:17:51.789 187189 DEBUG oslo_concurrency.processutils [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:17:51 compute-0 nova_compute[187185]: 2025-11-29 07:17:51.872 187189 DEBUG oslo_concurrency.processutils [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:17:51 compute-0 nova_compute[187185]: 2025-11-29 07:17:51.874 187189 DEBUG oslo_concurrency.processutils [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/18ed7c04-c274-458a-9bcf-da56ec34bd62/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:17:51 compute-0 nova_compute[187185]: 2025-11-29 07:17:51.998 187189 DEBUG oslo_concurrency.processutils [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/18ed7c04-c274-458a-9bcf-da56ec34bd62/disk 1073741824" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:17:52 compute-0 nova_compute[187185]: 2025-11-29 07:17:51.999 187189 DEBUG oslo_concurrency.lockutils [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.227s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:17:52 compute-0 nova_compute[187185]: 2025-11-29 07:17:52.000 187189 DEBUG oslo_concurrency.processutils [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:17:52 compute-0 nova_compute[187185]: 2025-11-29 07:17:52.019 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:52 compute-0 nova_compute[187185]: 2025-11-29 07:17:52.059 187189 DEBUG oslo_concurrency.processutils [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:17:52 compute-0 nova_compute[187185]: 2025-11-29 07:17:52.060 187189 DEBUG nova.virt.disk.api [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Checking if we can resize image /var/lib/nova/instances/18ed7c04-c274-458a-9bcf-da56ec34bd62/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:17:52 compute-0 nova_compute[187185]: 2025-11-29 07:17:52.060 187189 DEBUG oslo_concurrency.processutils [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/18ed7c04-c274-458a-9bcf-da56ec34bd62/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:17:52 compute-0 nova_compute[187185]: 2025-11-29 07:17:52.114 187189 DEBUG oslo_concurrency.processutils [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/18ed7c04-c274-458a-9bcf-da56ec34bd62/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:17:52 compute-0 nova_compute[187185]: 2025-11-29 07:17:52.115 187189 DEBUG nova.virt.disk.api [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Cannot resize image /var/lib/nova/instances/18ed7c04-c274-458a-9bcf-da56ec34bd62/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:17:52 compute-0 nova_compute[187185]: 2025-11-29 07:17:52.115 187189 DEBUG nova.objects.instance [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'migration_context' on Instance uuid 18ed7c04-c274-458a-9bcf-da56ec34bd62 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:17:52 compute-0 nova_compute[187185]: 2025-11-29 07:17:52.353 187189 DEBUG nova.virt.libvirt.driver [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 07:17:52 compute-0 nova_compute[187185]: 2025-11-29 07:17:52.354 187189 DEBUG nova.virt.libvirt.driver [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Ensure instance console log exists: /var/lib/nova/instances/18ed7c04-c274-458a-9bcf-da56ec34bd62/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 07:17:52 compute-0 nova_compute[187185]: 2025-11-29 07:17:52.355 187189 DEBUG oslo_concurrency.lockutils [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:17:52 compute-0 nova_compute[187185]: 2025-11-29 07:17:52.356 187189 DEBUG oslo_concurrency.lockutils [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:17:52 compute-0 nova_compute[187185]: 2025-11-29 07:17:52.356 187189 DEBUG oslo_concurrency.lockutils [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:17:52 compute-0 podman[230557]: 2025-11-29 07:17:52.799362569 +0000 UTC m=+0.063823852 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 07:17:53 compute-0 nova_compute[187185]: 2025-11-29 07:17:53.157 187189 DEBUG nova.network.neutron [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Successfully created port: 038c98cb-23f6-48a9-b2fd-2dbf4c409143 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 07:17:54 compute-0 podman[230582]: 2025-11-29 07:17:54.841825362 +0000 UTC m=+0.089423278 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 29 07:17:54 compute-0 podman[230581]: 2025-11-29 07:17:54.853549815 +0000 UTC m=+0.101456059 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd)
Nov 29 07:17:55 compute-0 nova_compute[187185]: 2025-11-29 07:17:55.524 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:57 compute-0 nova_compute[187185]: 2025-11-29 07:17:57.023 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:17:57 compute-0 nova_compute[187185]: 2025-11-29 07:17:57.569 187189 DEBUG nova.network.neutron [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Successfully updated port: 038c98cb-23f6-48a9-b2fd-2dbf4c409143 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 07:17:58 compute-0 nova_compute[187185]: 2025-11-29 07:17:58.387 187189 DEBUG nova.compute.manager [req-9e66e90f-ca6b-4668-b5b1-c822fb8dcbcb req-042b1284-5bcd-4381-802f-8af807837fc8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Received event network-changed-038c98cb-23f6-48a9-b2fd-2dbf4c409143 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:17:58 compute-0 nova_compute[187185]: 2025-11-29 07:17:58.387 187189 DEBUG nova.compute.manager [req-9e66e90f-ca6b-4668-b5b1-c822fb8dcbcb req-042b1284-5bcd-4381-802f-8af807837fc8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Refreshing instance network info cache due to event network-changed-038c98cb-23f6-48a9-b2fd-2dbf4c409143. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:17:58 compute-0 nova_compute[187185]: 2025-11-29 07:17:58.387 187189 DEBUG oslo_concurrency.lockutils [req-9e66e90f-ca6b-4668-b5b1-c822fb8dcbcb req-042b1284-5bcd-4381-802f-8af807837fc8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-18ed7c04-c274-458a-9bcf-da56ec34bd62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:17:58 compute-0 nova_compute[187185]: 2025-11-29 07:17:58.387 187189 DEBUG oslo_concurrency.lockutils [req-9e66e90f-ca6b-4668-b5b1-c822fb8dcbcb req-042b1284-5bcd-4381-802f-8af807837fc8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-18ed7c04-c274-458a-9bcf-da56ec34bd62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:17:58 compute-0 nova_compute[187185]: 2025-11-29 07:17:58.387 187189 DEBUG nova.network.neutron [req-9e66e90f-ca6b-4668-b5b1-c822fb8dcbcb req-042b1284-5bcd-4381-802f-8af807837fc8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Refreshing network info cache for port 038c98cb-23f6-48a9-b2fd-2dbf4c409143 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:17:58 compute-0 nova_compute[187185]: 2025-11-29 07:17:58.389 187189 DEBUG oslo_concurrency.lockutils [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "refresh_cache-18ed7c04-c274-458a-9bcf-da56ec34bd62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:17:59 compute-0 nova_compute[187185]: 2025-11-29 07:17:59.972 187189 DEBUG nova.network.neutron [req-9e66e90f-ca6b-4668-b5b1-c822fb8dcbcb req-042b1284-5bcd-4381-802f-8af807837fc8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:18:00 compute-0 nova_compute[187185]: 2025-11-29 07:18:00.526 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:18:00 compute-0 nova_compute[187185]: 2025-11-29 07:18:00.902 187189 DEBUG nova.network.neutron [req-9e66e90f-ca6b-4668-b5b1-c822fb8dcbcb req-042b1284-5bcd-4381-802f-8af807837fc8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:18:01 compute-0 nova_compute[187185]: 2025-11-29 07:18:01.455 187189 DEBUG oslo_concurrency.lockutils [req-9e66e90f-ca6b-4668-b5b1-c822fb8dcbcb req-042b1284-5bcd-4381-802f-8af807837fc8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-18ed7c04-c274-458a-9bcf-da56ec34bd62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:18:01 compute-0 nova_compute[187185]: 2025-11-29 07:18:01.456 187189 DEBUG oslo_concurrency.lockutils [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquired lock "refresh_cache-18ed7c04-c274-458a-9bcf-da56ec34bd62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:18:01 compute-0 nova_compute[187185]: 2025-11-29 07:18:01.456 187189 DEBUG nova.network.neutron [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:18:02 compute-0 nova_compute[187185]: 2025-11-29 07:18:02.026 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:18:03 compute-0 nova_compute[187185]: 2025-11-29 07:18:03.523 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:18:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:18:03.522 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=35, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=34) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:18:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:18:03.526 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:18:03 compute-0 nova_compute[187185]: 2025-11-29 07:18:03.833 187189 DEBUG nova.network.neutron [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:18:05 compute-0 nova_compute[187185]: 2025-11-29 07:18:05.435 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:18:05 compute-0 nova_compute[187185]: 2025-11-29 07:18:05.435 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:18:05 compute-0 nova_compute[187185]: 2025-11-29 07:18:05.435 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:18:05 compute-0 nova_compute[187185]: 2025-11-29 07:18:05.530 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:18:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:18:06.530 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '35'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:18:06 compute-0 nova_compute[187185]: 2025-11-29 07:18:06.727 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Nov 29 07:18:06 compute-0 nova_compute[187185]: 2025-11-29 07:18:06.728 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 07:18:06 compute-0 nova_compute[187185]: 2025-11-29 07:18:06.989 187189 DEBUG nova.network.neutron [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Updating instance_info_cache with network_info: [{"id": "038c98cb-23f6-48a9-b2fd-2dbf4c409143", "address": "fa:16:3e:44:d2:ef", "network": {"id": "3ac75f3c-d209-4778-86b5-e99a50d91b8d", "bridge": "br-int", "label": "tempest-network-smoke--1784468808", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap038c98cb-23", "ovs_interfaceid": "038c98cb-23f6-48a9-b2fd-2dbf4c409143", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:18:07 compute-0 nova_compute[187185]: 2025-11-29 07:18:07.059 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:18:07 compute-0 podman[230623]: 2025-11-29 07:18:07.830437598 +0000 UTC m=+0.057833092 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 07:18:07 compute-0 podman[230622]: 2025-11-29 07:18:07.848827829 +0000 UTC m=+0.095805738 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, config_id=edpm, vcs-type=git, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 29 07:18:07 compute-0 nova_compute[187185]: 2025-11-29 07:18:07.873 187189 DEBUG oslo_concurrency.lockutils [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Releasing lock "refresh_cache-18ed7c04-c274-458a-9bcf-da56ec34bd62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:18:07 compute-0 nova_compute[187185]: 2025-11-29 07:18:07.875 187189 DEBUG nova.compute.manager [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Instance network_info: |[{"id": "038c98cb-23f6-48a9-b2fd-2dbf4c409143", "address": "fa:16:3e:44:d2:ef", "network": {"id": "3ac75f3c-d209-4778-86b5-e99a50d91b8d", "bridge": "br-int", "label": "tempest-network-smoke--1784468808", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap038c98cb-23", "ovs_interfaceid": "038c98cb-23f6-48a9-b2fd-2dbf4c409143", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 07:18:07 compute-0 nova_compute[187185]: 2025-11-29 07:18:07.881 187189 DEBUG nova.virt.libvirt.driver [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Start _get_guest_xml network_info=[{"id": "038c98cb-23f6-48a9-b2fd-2dbf4c409143", "address": "fa:16:3e:44:d2:ef", "network": {"id": "3ac75f3c-d209-4778-86b5-e99a50d91b8d", "bridge": "br-int", "label": "tempest-network-smoke--1784468808", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap038c98cb-23", "ovs_interfaceid": "038c98cb-23f6-48a9-b2fd-2dbf4c409143", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:18:07 compute-0 podman[230621]: 2025-11-29 07:18:07.883862283 +0000 UTC m=+0.130456371 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:18:07 compute-0 nova_compute[187185]: 2025-11-29 07:18:07.889 187189 WARNING nova.virt.libvirt.driver [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:18:07 compute-0 nova_compute[187185]: 2025-11-29 07:18:07.898 187189 DEBUG nova.virt.libvirt.host [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:18:07 compute-0 nova_compute[187185]: 2025-11-29 07:18:07.898 187189 DEBUG nova.virt.libvirt.host [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:18:07 compute-0 nova_compute[187185]: 2025-11-29 07:18:07.901 187189 DEBUG nova.virt.libvirt.host [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:18:07 compute-0 nova_compute[187185]: 2025-11-29 07:18:07.902 187189 DEBUG nova.virt.libvirt.host [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:18:07 compute-0 nova_compute[187185]: 2025-11-29 07:18:07.903 187189 DEBUG nova.virt.libvirt.driver [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:18:07 compute-0 nova_compute[187185]: 2025-11-29 07:18:07.903 187189 DEBUG nova.virt.hardware [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:18:07 compute-0 nova_compute[187185]: 2025-11-29 07:18:07.904 187189 DEBUG nova.virt.hardware [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:18:07 compute-0 nova_compute[187185]: 2025-11-29 07:18:07.904 187189 DEBUG nova.virt.hardware [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:18:07 compute-0 nova_compute[187185]: 2025-11-29 07:18:07.904 187189 DEBUG nova.virt.hardware [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:18:07 compute-0 nova_compute[187185]: 2025-11-29 07:18:07.904 187189 DEBUG nova.virt.hardware [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:18:07 compute-0 nova_compute[187185]: 2025-11-29 07:18:07.905 187189 DEBUG nova.virt.hardware [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:18:07 compute-0 nova_compute[187185]: 2025-11-29 07:18:07.905 187189 DEBUG nova.virt.hardware [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:18:07 compute-0 nova_compute[187185]: 2025-11-29 07:18:07.905 187189 DEBUG nova.virt.hardware [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:18:07 compute-0 nova_compute[187185]: 2025-11-29 07:18:07.905 187189 DEBUG nova.virt.hardware [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:18:07 compute-0 nova_compute[187185]: 2025-11-29 07:18:07.905 187189 DEBUG nova.virt.hardware [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:18:07 compute-0 nova_compute[187185]: 2025-11-29 07:18:07.906 187189 DEBUG nova.virt.hardware [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:18:07 compute-0 nova_compute[187185]: 2025-11-29 07:18:07.911 187189 DEBUG nova.virt.libvirt.vif [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:17:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1716255593',display_name='tempest-TestNetworkAdvancedServerOps-server-1716255593',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1716255593',id=108,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIavjyz1OvfroRFOIbOeRi+rHfxFQBrSW+Ld/D6LfbYvLBX02KQOODf0uKY/ADuzicnqMPwaohmsFg2wFh++jvC+svictzajGD2N3biFjV8neBCmyTW4WlcmDbgS9W7F8g==',key_name='tempest-TestNetworkAdvancedServerOps-536242546',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-qkbt03s8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:17:49Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=18ed7c04-c274-458a-9bcf-da56ec34bd62,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "038c98cb-23f6-48a9-b2fd-2dbf4c409143", "address": "fa:16:3e:44:d2:ef", "network": {"id": "3ac75f3c-d209-4778-86b5-e99a50d91b8d", "bridge": "br-int", "label": "tempest-network-smoke--1784468808", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap038c98cb-23", "ovs_interfaceid": "038c98cb-23f6-48a9-b2fd-2dbf4c409143", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:18:07 compute-0 nova_compute[187185]: 2025-11-29 07:18:07.912 187189 DEBUG nova.network.os_vif_util [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converting VIF {"id": "038c98cb-23f6-48a9-b2fd-2dbf4c409143", "address": "fa:16:3e:44:d2:ef", "network": {"id": "3ac75f3c-d209-4778-86b5-e99a50d91b8d", "bridge": "br-int", "label": "tempest-network-smoke--1784468808", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap038c98cb-23", "ovs_interfaceid": "038c98cb-23f6-48a9-b2fd-2dbf4c409143", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:18:07 compute-0 nova_compute[187185]: 2025-11-29 07:18:07.913 187189 DEBUG nova.network.os_vif_util [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:44:d2:ef,bridge_name='br-int',has_traffic_filtering=True,id=038c98cb-23f6-48a9-b2fd-2dbf4c409143,network=Network(3ac75f3c-d209-4778-86b5-e99a50d91b8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap038c98cb-23') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:18:07 compute-0 nova_compute[187185]: 2025-11-29 07:18:07.914 187189 DEBUG nova.objects.instance [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'pci_devices' on Instance uuid 18ed7c04-c274-458a-9bcf-da56ec34bd62 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:18:08 compute-0 nova_compute[187185]: 2025-11-29 07:18:08.204 187189 DEBUG nova.virt.libvirt.driver [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:18:08 compute-0 nova_compute[187185]:   <uuid>18ed7c04-c274-458a-9bcf-da56ec34bd62</uuid>
Nov 29 07:18:08 compute-0 nova_compute[187185]:   <name>instance-0000006c</name>
Nov 29 07:18:08 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 07:18:08 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:18:08 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:18:08 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-1716255593</nova:name>
Nov 29 07:18:08 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:18:07</nova:creationTime>
Nov 29 07:18:08 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 07:18:08 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 07:18:08 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:18:08 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:18:08 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:18:08 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:18:08 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:18:08 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:18:08 compute-0 nova_compute[187185]:         <nova:user uuid="bfd2024670594b10941cec8a59d2573f">tempest-TestNetworkAdvancedServerOps-1380683659-project-member</nova:user>
Nov 29 07:18:08 compute-0 nova_compute[187185]:         <nova:project uuid="c231e63624d44fc19e0989abfb1afb22">tempest-TestNetworkAdvancedServerOps-1380683659</nova:project>
Nov 29 07:18:08 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:18:08 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 07:18:08 compute-0 nova_compute[187185]:         <nova:port uuid="038c98cb-23f6-48a9-b2fd-2dbf4c409143">
Nov 29 07:18:08 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:18:08 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:18:08 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:18:08 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <system>
Nov 29 07:18:08 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:18:08 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:18:08 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:18:08 compute-0 nova_compute[187185]:       <entry name="serial">18ed7c04-c274-458a-9bcf-da56ec34bd62</entry>
Nov 29 07:18:08 compute-0 nova_compute[187185]:       <entry name="uuid">18ed7c04-c274-458a-9bcf-da56ec34bd62</entry>
Nov 29 07:18:08 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     </system>
Nov 29 07:18:08 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:18:08 compute-0 nova_compute[187185]:   <os>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:   </os>
Nov 29 07:18:08 compute-0 nova_compute[187185]:   <features>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:   </features>
Nov 29 07:18:08 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:18:08 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:18:08 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:18:08 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/18ed7c04-c274-458a-9bcf-da56ec34bd62/disk"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:18:08 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/18ed7c04-c274-458a-9bcf-da56ec34bd62/disk.config"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:18:08 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:44:d2:ef"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:       <target dev="tap038c98cb-23"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:18:08 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/18ed7c04-c274-458a-9bcf-da56ec34bd62/console.log" append="off"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <video>
Nov 29 07:18:08 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     </video>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:18:08 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:18:08 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:18:08 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:18:08 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:18:08 compute-0 nova_compute[187185]: </domain>
Nov 29 07:18:08 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:18:08 compute-0 nova_compute[187185]: 2025-11-29 07:18:08.205 187189 DEBUG nova.compute.manager [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Preparing to wait for external event network-vif-plugged-038c98cb-23f6-48a9-b2fd-2dbf4c409143 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 07:18:08 compute-0 nova_compute[187185]: 2025-11-29 07:18:08.206 187189 DEBUG oslo_concurrency.lockutils [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "18ed7c04-c274-458a-9bcf-da56ec34bd62-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:18:08 compute-0 nova_compute[187185]: 2025-11-29 07:18:08.207 187189 DEBUG oslo_concurrency.lockutils [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "18ed7c04-c274-458a-9bcf-da56ec34bd62-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:18:08 compute-0 nova_compute[187185]: 2025-11-29 07:18:08.207 187189 DEBUG oslo_concurrency.lockutils [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "18ed7c04-c274-458a-9bcf-da56ec34bd62-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:18:08 compute-0 nova_compute[187185]: 2025-11-29 07:18:08.208 187189 DEBUG nova.virt.libvirt.vif [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:17:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1716255593',display_name='tempest-TestNetworkAdvancedServerOps-server-1716255593',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1716255593',id=108,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIavjyz1OvfroRFOIbOeRi+rHfxFQBrSW+Ld/D6LfbYvLBX02KQOODf0uKY/ADuzicnqMPwaohmsFg2wFh++jvC+svictzajGD2N3biFjV8neBCmyTW4WlcmDbgS9W7F8g==',key_name='tempest-TestNetworkAdvancedServerOps-536242546',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-qkbt03s8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:17:49Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=18ed7c04-c274-458a-9bcf-da56ec34bd62,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "038c98cb-23f6-48a9-b2fd-2dbf4c409143", "address": "fa:16:3e:44:d2:ef", "network": {"id": "3ac75f3c-d209-4778-86b5-e99a50d91b8d", "bridge": "br-int", "label": "tempest-network-smoke--1784468808", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap038c98cb-23", "ovs_interfaceid": "038c98cb-23f6-48a9-b2fd-2dbf4c409143", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:18:08 compute-0 nova_compute[187185]: 2025-11-29 07:18:08.209 187189 DEBUG nova.network.os_vif_util [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converting VIF {"id": "038c98cb-23f6-48a9-b2fd-2dbf4c409143", "address": "fa:16:3e:44:d2:ef", "network": {"id": "3ac75f3c-d209-4778-86b5-e99a50d91b8d", "bridge": "br-int", "label": "tempest-network-smoke--1784468808", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap038c98cb-23", "ovs_interfaceid": "038c98cb-23f6-48a9-b2fd-2dbf4c409143", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:18:08 compute-0 nova_compute[187185]: 2025-11-29 07:18:08.210 187189 DEBUG nova.network.os_vif_util [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:44:d2:ef,bridge_name='br-int',has_traffic_filtering=True,id=038c98cb-23f6-48a9-b2fd-2dbf4c409143,network=Network(3ac75f3c-d209-4778-86b5-e99a50d91b8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap038c98cb-23') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:18:08 compute-0 nova_compute[187185]: 2025-11-29 07:18:08.211 187189 DEBUG os_vif [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:d2:ef,bridge_name='br-int',has_traffic_filtering=True,id=038c98cb-23f6-48a9-b2fd-2dbf4c409143,network=Network(3ac75f3c-d209-4778-86b5-e99a50d91b8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap038c98cb-23') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:18:08 compute-0 nova_compute[187185]: 2025-11-29 07:18:08.211 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:18:08 compute-0 nova_compute[187185]: 2025-11-29 07:18:08.212 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:18:08 compute-0 nova_compute[187185]: 2025-11-29 07:18:08.213 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:18:08 compute-0 nova_compute[187185]: 2025-11-29 07:18:08.220 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:18:08 compute-0 nova_compute[187185]: 2025-11-29 07:18:08.221 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap038c98cb-23, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:18:08 compute-0 nova_compute[187185]: 2025-11-29 07:18:08.222 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap038c98cb-23, col_values=(('external_ids', {'iface-id': '038c98cb-23f6-48a9-b2fd-2dbf4c409143', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:44:d2:ef', 'vm-uuid': '18ed7c04-c274-458a-9bcf-da56ec34bd62'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:18:08 compute-0 NetworkManager[55227]: <info>  [1764400688.2559] manager: (tap038c98cb-23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/160)
Nov 29 07:18:08 compute-0 nova_compute[187185]: 2025-11-29 07:18:08.254 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:18:08 compute-0 nova_compute[187185]: 2025-11-29 07:18:08.258 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:18:08 compute-0 nova_compute[187185]: 2025-11-29 07:18:08.266 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:18:08 compute-0 nova_compute[187185]: 2025-11-29 07:18:08.268 187189 INFO os_vif [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:d2:ef,bridge_name='br-int',has_traffic_filtering=True,id=038c98cb-23f6-48a9-b2fd-2dbf4c409143,network=Network(3ac75f3c-d209-4778-86b5-e99a50d91b8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap038c98cb-23')
Nov 29 07:18:09 compute-0 nova_compute[187185]: 2025-11-29 07:18:09.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:18:09 compute-0 nova_compute[187185]: 2025-11-29 07:18:09.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:18:09 compute-0 nova_compute[187185]: 2025-11-29 07:18:09.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:18:09 compute-0 nova_compute[187185]: 2025-11-29 07:18:09.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:18:09 compute-0 nova_compute[187185]: 2025-11-29 07:18:09.318 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:18:09 compute-0 nova_compute[187185]: 2025-11-29 07:18:09.318 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:18:09 compute-0 nova_compute[187185]: 2025-11-29 07:18:09.990 187189 DEBUG nova.virt.libvirt.driver [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:18:09 compute-0 nova_compute[187185]: 2025-11-29 07:18:09.990 187189 DEBUG nova.virt.libvirt.driver [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:18:09 compute-0 nova_compute[187185]: 2025-11-29 07:18:09.991 187189 DEBUG nova.virt.libvirt.driver [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] No VIF found with MAC fa:16:3e:44:d2:ef, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:18:09 compute-0 nova_compute[187185]: 2025-11-29 07:18:09.991 187189 INFO nova.virt.libvirt.driver [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Using config drive
Nov 29 07:18:10 compute-0 nova_compute[187185]: 2025-11-29 07:18:10.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:18:10 compute-0 nova_compute[187185]: 2025-11-29 07:18:10.507 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:18:10 compute-0 nova_compute[187185]: 2025-11-29 07:18:10.507 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:18:10 compute-0 nova_compute[187185]: 2025-11-29 07:18:10.508 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:18:10 compute-0 nova_compute[187185]: 2025-11-29 07:18:10.508 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:18:10 compute-0 nova_compute[187185]: 2025-11-29 07:18:10.569 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:18:12 compute-0 nova_compute[187185]: 2025-11-29 07:18:12.873 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/18ed7c04-c274-458a-9bcf-da56ec34bd62/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:18:12 compute-0 nova_compute[187185]: 2025-11-29 07:18:12.957 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/18ed7c04-c274-458a-9bcf-da56ec34bd62/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:18:12 compute-0 nova_compute[187185]: 2025-11-29 07:18:12.958 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/18ed7c04-c274-458a-9bcf-da56ec34bd62/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:18:13 compute-0 nova_compute[187185]: 2025-11-29 07:18:13.025 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/18ed7c04-c274-458a-9bcf-da56ec34bd62/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:18:13 compute-0 nova_compute[187185]: 2025-11-29 07:18:13.027 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Periodic task is updating the host stat, it is trying to get disk instance-0000006c, but disk file was removed by concurrent operations such as resize.: FileNotFoundError: [Errno 2] No such file or directory: '/var/lib/nova/instances/18ed7c04-c274-458a-9bcf-da56ec34bd62/disk.config'
Nov 29 07:18:13 compute-0 nova_compute[187185]: 2025-11-29 07:18:13.186 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:18:13 compute-0 nova_compute[187185]: 2025-11-29 07:18:13.189 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5746MB free_disk=73.2944107055664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:18:13 compute-0 nova_compute[187185]: 2025-11-29 07:18:13.189 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:18:13 compute-0 nova_compute[187185]: 2025-11-29 07:18:13.190 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:18:13 compute-0 nova_compute[187185]: 2025-11-29 07:18:13.255 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:18:13 compute-0 nova_compute[187185]: 2025-11-29 07:18:13.738 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance 18ed7c04-c274-458a-9bcf-da56ec34bd62 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 07:18:13 compute-0 nova_compute[187185]: 2025-11-29 07:18:13.739 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:18:13 compute-0 nova_compute[187185]: 2025-11-29 07:18:13.739 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:18:13 compute-0 nova_compute[187185]: 2025-11-29 07:18:13.785 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:18:13 compute-0 nova_compute[187185]: 2025-11-29 07:18:13.955 187189 INFO nova.virt.libvirt.driver [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Creating config drive at /var/lib/nova/instances/18ed7c04-c274-458a-9bcf-da56ec34bd62/disk.config
Nov 29 07:18:13 compute-0 nova_compute[187185]: 2025-11-29 07:18:13.962 187189 DEBUG oslo_concurrency.processutils [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/18ed7c04-c274-458a-9bcf-da56ec34bd62/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7h60z6la execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:18:14 compute-0 nova_compute[187185]: 2025-11-29 07:18:14.089 187189 DEBUG oslo_concurrency.processutils [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/18ed7c04-c274-458a-9bcf-da56ec34bd62/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7h60z6la" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:18:14 compute-0 kernel: tap038c98cb-23: entered promiscuous mode
Nov 29 07:18:14 compute-0 NetworkManager[55227]: <info>  [1764400694.1763] manager: (tap038c98cb-23): new Tun device (/org/freedesktop/NetworkManager/Devices/161)
Nov 29 07:18:14 compute-0 ovn_controller[95281]: 2025-11-29T07:18:14Z|00293|binding|INFO|Claiming lport 038c98cb-23f6-48a9-b2fd-2dbf4c409143 for this chassis.
Nov 29 07:18:14 compute-0 ovn_controller[95281]: 2025-11-29T07:18:14Z|00294|binding|INFO|038c98cb-23f6-48a9-b2fd-2dbf4c409143: Claiming fa:16:3e:44:d2:ef 10.100.0.6
Nov 29 07:18:14 compute-0 nova_compute[187185]: 2025-11-29 07:18:14.182 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:18:14 compute-0 nova_compute[187185]: 2025-11-29 07:18:14.185 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:18:14 compute-0 ovn_controller[95281]: 2025-11-29T07:18:14Z|00295|binding|INFO|Setting lport 038c98cb-23f6-48a9-b2fd-2dbf4c409143 ovn-installed in OVS
Nov 29 07:18:14 compute-0 nova_compute[187185]: 2025-11-29 07:18:14.208 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:18:14 compute-0 nova_compute[187185]: 2025-11-29 07:18:14.211 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:18:14 compute-0 systemd-udevd[230710]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:18:14 compute-0 systemd-machined[153486]: New machine qemu-40-instance-0000006c.
Nov 29 07:18:14 compute-0 NetworkManager[55227]: <info>  [1764400694.2478] device (tap038c98cb-23): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:18:14 compute-0 NetworkManager[55227]: <info>  [1764400694.2505] device (tap038c98cb-23): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:18:14 compute-0 systemd[1]: Started Virtual Machine qemu-40-instance-0000006c.
Nov 29 07:18:14 compute-0 ovn_controller[95281]: 2025-11-29T07:18:14Z|00296|binding|INFO|Setting lport 038c98cb-23f6-48a9-b2fd-2dbf4c409143 up in Southbound
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:18:14.516 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:44:d2:ef 10.100.0.6'], port_security=['fa:16:3e:44:d2:ef 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3ac75f3c-d209-4778-86b5-e99a50d91b8d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c231e63624d44fc19e0989abfb1afb22', 'neutron:revision_number': '2', 'neutron:security_group_ids': '11768807-c865-49c1-8b66-4622c035d7da', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=86d0dade-0fd3-4b3f-a097-a83258b59083, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=038c98cb-23f6-48a9-b2fd-2dbf4c409143) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:18:14.519 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 038c98cb-23f6-48a9-b2fd-2dbf4c409143 in datapath 3ac75f3c-d209-4778-86b5-e99a50d91b8d bound to our chassis
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:18:14.521 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3ac75f3c-d209-4778-86b5-e99a50d91b8d
Nov 29 07:18:14 compute-0 nova_compute[187185]: 2025-11-29 07:18:14.525 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:18:14.536 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[a7cae7f6-c8c6-4e6e-8977-59d1eaa8d43a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:18:14.537 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3ac75f3c-d1 in ovnmeta-3ac75f3c-d209-4778-86b5-e99a50d91b8d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:18:14.541 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3ac75f3c-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:18:14.541 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[976f0951-52d8-4ee7-a2da-0db445ac15d1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:18:14.542 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[b2d5a822-97dd-404b-b989-4df16501b2a5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:18:14.558 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[d8979ee5-1ffd-473d-a958-7708c3986852]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:18:14.575 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[a4fadb53-fece-48d3-b24d-a5f58a884673]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:18:14.608 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[989b2a5f-0da0-4c34-a017-e975669f895b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:18:14 compute-0 NetworkManager[55227]: <info>  [1764400694.6144] manager: (tap3ac75f3c-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/162)
Nov 29 07:18:14 compute-0 systemd-udevd[230714]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:18:14.615 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[f5a3881d-55e0-4e0c-9420-e205f11a27f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:18:14.674 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[e2e6df2d-be2e-479e-a779-f4430e19ce3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:18:14.677 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[040242a4-afa7-48fc-8091-aa4fa47b2bc8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:18:14 compute-0 NetworkManager[55227]: <info>  [1764400694.7071] device (tap3ac75f3c-d0): carrier: link connected
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:18:14.715 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[f6734ce1-2974-4a1c-8cfd-c675c8a38319]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:18:14 compute-0 nova_compute[187185]: 2025-11-29 07:18:14.738 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:18:14 compute-0 nova_compute[187185]: 2025-11-29 07:18:14.739 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.549s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:18:14.741 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[21391642-635f-4710-8d38-1d8550b207bf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3ac75f3c-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:72:7c:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 97], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 614237, 'reachable_time': 44497, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230745, 'error': None, 'target': 'ovnmeta-3ac75f3c-d209-4778-86b5-e99a50d91b8d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:18:14.762 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[14c7df4f-4989-4924-b8a9-43c1734b0eae]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe72:7c02'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 614237, 'tstamp': 614237}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230750, 'error': None, 'target': 'ovnmeta-3ac75f3c-d209-4778-86b5-e99a50d91b8d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:18:14.785 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[5bd4ec3f-677e-4595-84b3-50352378a3a4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3ac75f3c-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:72:7c:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 97], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 614237, 'reachable_time': 44497, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230753, 'error': None, 'target': 'ovnmeta-3ac75f3c-d209-4778-86b5-e99a50d91b8d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:18:14.822 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[45966b87-ec5a-4354-a842-6ab6bae94893]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:18:14 compute-0 nova_compute[187185]: 2025-11-29 07:18:14.847 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400694.8469489, 18ed7c04-c274-458a-9bcf-da56ec34bd62 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:18:14 compute-0 nova_compute[187185]: 2025-11-29 07:18:14.847 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] VM Started (Lifecycle Event)
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:18:14.900 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[8dcefa6b-cfd4-4323-9c47-308ac40cc60e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:18:14.901 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3ac75f3c-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:18:14.902 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:18:14.902 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3ac75f3c-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:18:14 compute-0 nova_compute[187185]: 2025-11-29 07:18:14.931 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:18:14 compute-0 NetworkManager[55227]: <info>  [1764400694.9321] manager: (tap3ac75f3c-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/163)
Nov 29 07:18:14 compute-0 kernel: tap3ac75f3c-d0: entered promiscuous mode
Nov 29 07:18:14 compute-0 nova_compute[187185]: 2025-11-29 07:18:14.934 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:18:14.937 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3ac75f3c-d0, col_values=(('external_ids', {'iface-id': '834ef8b9-4dd9-4718-9d7f-ea3eb8cecb7d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:18:14 compute-0 ovn_controller[95281]: 2025-11-29T07:18:14Z|00297|binding|INFO|Releasing lport 834ef8b9-4dd9-4718-9d7f-ea3eb8cecb7d from this chassis (sb_readonly=0)
Nov 29 07:18:14 compute-0 nova_compute[187185]: 2025-11-29 07:18:14.939 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:18:14 compute-0 nova_compute[187185]: 2025-11-29 07:18:14.940 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:18:14.942 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3ac75f3c-d209-4778-86b5-e99a50d91b8d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3ac75f3c-d209-4778-86b5-e99a50d91b8d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:18:14.943 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[be52d193-04ba-4dec-a4f2-0dc2fe3e743d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:18:14.945 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-3ac75f3c-d209-4778-86b5-e99a50d91b8d
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/3ac75f3c-d209-4778-86b5-e99a50d91b8d.pid.haproxy
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID 3ac75f3c-d209-4778-86b5-e99a50d91b8d
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:18:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:18:14.946 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3ac75f3c-d209-4778-86b5-e99a50d91b8d', 'env', 'PROCESS_TAG=haproxy-3ac75f3c-d209-4778-86b5-e99a50d91b8d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3ac75f3c-d209-4778-86b5-e99a50d91b8d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:18:14 compute-0 nova_compute[187185]: 2025-11-29 07:18:14.956 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:18:15 compute-0 podman[230786]: 2025-11-29 07:18:15.439105489 +0000 UTC m=+0.081717669 container create 3ce608646acacaddf192de43ab5fd681bb0ddf092a2a33139d7667488ab4b9de (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3ac75f3c-d209-4778-86b5-e99a50d91b8d, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 29 07:18:15 compute-0 podman[230786]: 2025-11-29 07:18:15.398436675 +0000 UTC m=+0.041048925 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:18:15 compute-0 systemd[1]: Started libpod-conmon-3ce608646acacaddf192de43ab5fd681bb0ddf092a2a33139d7667488ab4b9de.scope.
Nov 29 07:18:15 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:18:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4cf5f324c83e0e329004dbe96b12a8efb6cdd593ebbf8431d32489d24008cbe6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:18:15 compute-0 nova_compute[187185]: 2025-11-29 07:18:15.572 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:18:15 compute-0 podman[230786]: 2025-11-29 07:18:15.573598874 +0000 UTC m=+0.216211054 container init 3ce608646acacaddf192de43ab5fd681bb0ddf092a2a33139d7667488ab4b9de (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3ac75f3c-d209-4778-86b5-e99a50d91b8d, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 07:18:15 compute-0 podman[230786]: 2025-11-29 07:18:15.580689375 +0000 UTC m=+0.223301525 container start 3ce608646acacaddf192de43ab5fd681bb0ddf092a2a33139d7667488ab4b9de (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3ac75f3c-d209-4778-86b5-e99a50d91b8d, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 29 07:18:15 compute-0 neutron-haproxy-ovnmeta-3ac75f3c-d209-4778-86b5-e99a50d91b8d[230801]: [NOTICE]   (230805) : New worker (230807) forked
Nov 29 07:18:15 compute-0 neutron-haproxy-ovnmeta-3ac75f3c-d209-4778-86b5-e99a50d91b8d[230801]: [NOTICE]   (230805) : Loading success.
Nov 29 07:18:15 compute-0 nova_compute[187185]: 2025-11-29 07:18:15.734 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:18:15 compute-0 nova_compute[187185]: 2025-11-29 07:18:15.735 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:18:16 compute-0 nova_compute[187185]: 2025-11-29 07:18:16.111 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:18:16 compute-0 nova_compute[187185]: 2025-11-29 07:18:16.118 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400694.849983, 18ed7c04-c274-458a-9bcf-da56ec34bd62 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:18:16 compute-0 nova_compute[187185]: 2025-11-29 07:18:16.119 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] VM Paused (Lifecycle Event)
Nov 29 07:18:16 compute-0 nova_compute[187185]: 2025-11-29 07:18:16.217 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:18:16 compute-0 nova_compute[187185]: 2025-11-29 07:18:16.221 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:18:16 compute-0 nova_compute[187185]: 2025-11-29 07:18:16.371 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:18:16 compute-0 podman[230816]: 2025-11-29 07:18:16.88772977 +0000 UTC m=+0.140466427 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller)
Nov 29 07:18:18 compute-0 nova_compute[187185]: 2025-11-29 07:18:18.257 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:18:18 compute-0 nova_compute[187185]: 2025-11-29 07:18:18.683 187189 DEBUG nova.compute.manager [req-f4bfea77-c8c4-4f8b-89c8-c901a23fedf3 req-b4685d49-4fc1-4c44-a94b-4f280984eb7a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Received event network-vif-plugged-038c98cb-23f6-48a9-b2fd-2dbf4c409143 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:18:18 compute-0 nova_compute[187185]: 2025-11-29 07:18:18.684 187189 DEBUG oslo_concurrency.lockutils [req-f4bfea77-c8c4-4f8b-89c8-c901a23fedf3 req-b4685d49-4fc1-4c44-a94b-4f280984eb7a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "18ed7c04-c274-458a-9bcf-da56ec34bd62-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:18:18 compute-0 nova_compute[187185]: 2025-11-29 07:18:18.685 187189 DEBUG oslo_concurrency.lockutils [req-f4bfea77-c8c4-4f8b-89c8-c901a23fedf3 req-b4685d49-4fc1-4c44-a94b-4f280984eb7a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "18ed7c04-c274-458a-9bcf-da56ec34bd62-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:18:18 compute-0 nova_compute[187185]: 2025-11-29 07:18:18.685 187189 DEBUG oslo_concurrency.lockutils [req-f4bfea77-c8c4-4f8b-89c8-c901a23fedf3 req-b4685d49-4fc1-4c44-a94b-4f280984eb7a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "18ed7c04-c274-458a-9bcf-da56ec34bd62-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:18:18 compute-0 nova_compute[187185]: 2025-11-29 07:18:18.686 187189 DEBUG nova.compute.manager [req-f4bfea77-c8c4-4f8b-89c8-c901a23fedf3 req-b4685d49-4fc1-4c44-a94b-4f280984eb7a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Processing event network-vif-plugged-038c98cb-23f6-48a9-b2fd-2dbf4c409143 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 07:18:18 compute-0 nova_compute[187185]: 2025-11-29 07:18:18.687 187189 DEBUG nova.compute.manager [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:18:18 compute-0 nova_compute[187185]: 2025-11-29 07:18:18.693 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400698.6931455, 18ed7c04-c274-458a-9bcf-da56ec34bd62 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:18:18 compute-0 nova_compute[187185]: 2025-11-29 07:18:18.693 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] VM Resumed (Lifecycle Event)
Nov 29 07:18:18 compute-0 nova_compute[187185]: 2025-11-29 07:18:18.697 187189 DEBUG nova.virt.libvirt.driver [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 07:18:18 compute-0 nova_compute[187185]: 2025-11-29 07:18:18.702 187189 INFO nova.virt.libvirt.driver [-] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Instance spawned successfully.
Nov 29 07:18:18 compute-0 nova_compute[187185]: 2025-11-29 07:18:18.703 187189 DEBUG nova.virt.libvirt.driver [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 07:18:19 compute-0 nova_compute[187185]: 2025-11-29 07:18:19.071 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:18:19 compute-0 nova_compute[187185]: 2025-11-29 07:18:19.080 187189 DEBUG nova.virt.libvirt.driver [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:18:19 compute-0 nova_compute[187185]: 2025-11-29 07:18:19.080 187189 DEBUG nova.virt.libvirt.driver [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:18:19 compute-0 nova_compute[187185]: 2025-11-29 07:18:19.081 187189 DEBUG nova.virt.libvirt.driver [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:18:19 compute-0 nova_compute[187185]: 2025-11-29 07:18:19.081 187189 DEBUG nova.virt.libvirt.driver [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:18:19 compute-0 nova_compute[187185]: 2025-11-29 07:18:19.082 187189 DEBUG nova.virt.libvirt.driver [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:18:19 compute-0 nova_compute[187185]: 2025-11-29 07:18:19.082 187189 DEBUG nova.virt.libvirt.driver [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:18:19 compute-0 nova_compute[187185]: 2025-11-29 07:18:19.085 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:18:19 compute-0 nova_compute[187185]: 2025-11-29 07:18:19.179 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:18:19 compute-0 nova_compute[187185]: 2025-11-29 07:18:19.285 187189 INFO nova.compute.manager [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Took 27.62 seconds to spawn the instance on the hypervisor.
Nov 29 07:18:19 compute-0 nova_compute[187185]: 2025-11-29 07:18:19.286 187189 DEBUG nova.compute.manager [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:18:19 compute-0 nova_compute[187185]: 2025-11-29 07:18:19.614 187189 INFO nova.compute.manager [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Took 31.39 seconds to build instance.
Nov 29 07:18:20 compute-0 nova_compute[187185]: 2025-11-29 07:18:20.052 187189 DEBUG oslo_concurrency.lockutils [None req-0b004589-8fc3-41b8-9651-805a8ec457b9 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "18ed7c04-c274-458a-9bcf-da56ec34bd62" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 32.194s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:18:20 compute-0 nova_compute[187185]: 2025-11-29 07:18:20.574 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:18:21 compute-0 nova_compute[187185]: 2025-11-29 07:18:21.209 187189 DEBUG nova.compute.manager [req-3eb3f06e-c212-49b2-b9a5-2a98664837c0 req-7d209403-3b8f-4a95-b3db-81d42198fe10 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Received event network-vif-plugged-038c98cb-23f6-48a9-b2fd-2dbf4c409143 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:18:21 compute-0 nova_compute[187185]: 2025-11-29 07:18:21.210 187189 DEBUG oslo_concurrency.lockutils [req-3eb3f06e-c212-49b2-b9a5-2a98664837c0 req-7d209403-3b8f-4a95-b3db-81d42198fe10 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "18ed7c04-c274-458a-9bcf-da56ec34bd62-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:18:21 compute-0 nova_compute[187185]: 2025-11-29 07:18:21.210 187189 DEBUG oslo_concurrency.lockutils [req-3eb3f06e-c212-49b2-b9a5-2a98664837c0 req-7d209403-3b8f-4a95-b3db-81d42198fe10 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "18ed7c04-c274-458a-9bcf-da56ec34bd62-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:18:21 compute-0 nova_compute[187185]: 2025-11-29 07:18:21.211 187189 DEBUG oslo_concurrency.lockutils [req-3eb3f06e-c212-49b2-b9a5-2a98664837c0 req-7d209403-3b8f-4a95-b3db-81d42198fe10 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "18ed7c04-c274-458a-9bcf-da56ec34bd62-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:18:21 compute-0 nova_compute[187185]: 2025-11-29 07:18:21.212 187189 DEBUG nova.compute.manager [req-3eb3f06e-c212-49b2-b9a5-2a98664837c0 req-7d209403-3b8f-4a95-b3db-81d42198fe10 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] No waiting events found dispatching network-vif-plugged-038c98cb-23f6-48a9-b2fd-2dbf4c409143 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:18:21 compute-0 nova_compute[187185]: 2025-11-29 07:18:21.212 187189 WARNING nova.compute.manager [req-3eb3f06e-c212-49b2-b9a5-2a98664837c0 req-7d209403-3b8f-4a95-b3db-81d42198fe10 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Received unexpected event network-vif-plugged-038c98cb-23f6-48a9-b2fd-2dbf4c409143 for instance with vm_state active and task_state None.
Nov 29 07:18:23 compute-0 nova_compute[187185]: 2025-11-29 07:18:23.261 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:18:23 compute-0 podman[230842]: 2025-11-29 07:18:23.812332708 +0000 UTC m=+0.072767675 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 07:18:25 compute-0 nova_compute[187185]: 2025-11-29 07:18:25.310 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:18:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:18:25.507 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:18:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:18:25.508 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:18:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:18:25.509 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:18:25 compute-0 nova_compute[187185]: 2025-11-29 07:18:25.578 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:18:25 compute-0 podman[230869]: 2025-11-29 07:18:25.828810725 +0000 UTC m=+0.095827599 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 07:18:25 compute-0 podman[230868]: 2025-11-29 07:18:25.828814255 +0000 UTC m=+0.096367094 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:18:28 compute-0 nova_compute[187185]: 2025-11-29 07:18:28.266 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:18:30 compute-0 nova_compute[187185]: 2025-11-29 07:18:30.580 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:18:32 compute-0 ovn_controller[95281]: 2025-11-29T07:18:32Z|00037|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:44:d2:ef 10.100.0.6
Nov 29 07:18:32 compute-0 ovn_controller[95281]: 2025-11-29T07:18:32Z|00038|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:44:d2:ef 10.100.0.6
Nov 29 07:18:33 compute-0 nova_compute[187185]: 2025-11-29 07:18:33.270 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:18:35 compute-0 nova_compute[187185]: 2025-11-29 07:18:35.582 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:18:38 compute-0 nova_compute[187185]: 2025-11-29 07:18:38.274 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:18:38 compute-0 podman[230926]: 2025-11-29 07:18:38.799616945 +0000 UTC m=+0.067597578 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 07:18:38 compute-0 podman[230933]: 2025-11-29 07:18:38.81880044 +0000 UTC m=+0.064448180 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 07:18:38 compute-0 podman[230927]: 2025-11-29 07:18:38.838035015 +0000 UTC m=+0.093577985 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, build-date=2025-08-20T13:12:41, version=9.6, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9)
Nov 29 07:18:40 compute-0 nova_compute[187185]: 2025-11-29 07:18:40.585 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:18:43 compute-0 nova_compute[187185]: 2025-11-29 07:18:43.279 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:18:45 compute-0 nova_compute[187185]: 2025-11-29 07:18:45.616 187189 DEBUG nova.compute.manager [req-4e931299-6498-4a9c-bece-65dd03e60288 req-0fd9688b-1d8f-4062-869a-80dbfd07b390 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Received event network-changed-038c98cb-23f6-48a9-b2fd-2dbf4c409143 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:18:45 compute-0 nova_compute[187185]: 2025-11-29 07:18:45.616 187189 DEBUG nova.compute.manager [req-4e931299-6498-4a9c-bece-65dd03e60288 req-0fd9688b-1d8f-4062-869a-80dbfd07b390 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Refreshing instance network info cache due to event network-changed-038c98cb-23f6-48a9-b2fd-2dbf4c409143. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:18:45 compute-0 nova_compute[187185]: 2025-11-29 07:18:45.617 187189 DEBUG oslo_concurrency.lockutils [req-4e931299-6498-4a9c-bece-65dd03e60288 req-0fd9688b-1d8f-4062-869a-80dbfd07b390 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-18ed7c04-c274-458a-9bcf-da56ec34bd62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:18:45 compute-0 nova_compute[187185]: 2025-11-29 07:18:45.617 187189 DEBUG oslo_concurrency.lockutils [req-4e931299-6498-4a9c-bece-65dd03e60288 req-0fd9688b-1d8f-4062-869a-80dbfd07b390 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-18ed7c04-c274-458a-9bcf-da56ec34bd62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:18:45 compute-0 nova_compute[187185]: 2025-11-29 07:18:45.618 187189 DEBUG nova.network.neutron [req-4e931299-6498-4a9c-bece-65dd03e60288 req-0fd9688b-1d8f-4062-869a-80dbfd07b390 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Refreshing network info cache for port 038c98cb-23f6-48a9-b2fd-2dbf4c409143 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:18:45 compute-0 nova_compute[187185]: 2025-11-29 07:18:45.621 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:18:47 compute-0 nova_compute[187185]: 2025-11-29 07:18:47.201 187189 INFO nova.compute.manager [None req-30d0cb69-f1ae-4437-8810-f2cf1b7b6c9d bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Get console output
Nov 29 07:18:47 compute-0 nova_compute[187185]: 2025-11-29 07:18:47.208 213758 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 29 07:18:47 compute-0 podman[230991]: 2025-11-29 07:18:47.851677098 +0000 UTC m=+0.106760879 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2)
Nov 29 07:18:47 compute-0 nova_compute[187185]: 2025-11-29 07:18:47.983 187189 DEBUG nova.network.neutron [req-4e931299-6498-4a9c-bece-65dd03e60288 req-0fd9688b-1d8f-4062-869a-80dbfd07b390 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Updated VIF entry in instance network info cache for port 038c98cb-23f6-48a9-b2fd-2dbf4c409143. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:18:47 compute-0 nova_compute[187185]: 2025-11-29 07:18:47.984 187189 DEBUG nova.network.neutron [req-4e931299-6498-4a9c-bece-65dd03e60288 req-0fd9688b-1d8f-4062-869a-80dbfd07b390 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Updating instance_info_cache with network_info: [{"id": "038c98cb-23f6-48a9-b2fd-2dbf4c409143", "address": "fa:16:3e:44:d2:ef", "network": {"id": "3ac75f3c-d209-4778-86b5-e99a50d91b8d", "bridge": "br-int", "label": "tempest-network-smoke--1784468808", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap038c98cb-23", "ovs_interfaceid": "038c98cb-23f6-48a9-b2fd-2dbf4c409143", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:18:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:47.997 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '18ed7c04-c274-458a-9bcf-da56ec34bd62', 'name': 'tempest-TestNetworkAdvancedServerOps-server-1716255593', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000006c', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c231e63624d44fc19e0989abfb1afb22', 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'hostId': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 07:18:47 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:47.998 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.031 12 DEBUG ceilometer.compute.pollsters [-] 18ed7c04-c274-458a-9bcf-da56ec34bd62/disk.device.read.requests volume: 1090 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.031 12 DEBUG ceilometer.compute.pollsters [-] 18ed7c04-c274-458a-9bcf-da56ec34bd62/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f3a044e1-e321-4908-99d2-51ceb4104bb2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1090, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62-vda', 'timestamp': '2025-11-29T07:18:47.998488', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1716255593', 'name': 'instance-0000006c', 'instance_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a5dcc712-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6175.716726865, 'message_signature': 'e4ceece4b020104e1e0744211d459e38969d7996dccd29e75eb6ce11f1ce7b77'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62-sda', 'timestamp': '2025-11-29T07:18:47.998488', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1716255593', 'name': 'instance-0000006c', 'instance_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a5dcd34c-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6175.716726865, 'message_signature': '1e14b312c63003dc42a15497e66712e7d5b915603fae1473932496dc29bdbb42'}]}, 'timestamp': '2025-11-29 07:18:48.031926', '_unique_id': '69243784e8cf40b99fee58b79d200393'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.033 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.035 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.038 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 18ed7c04-c274-458a-9bcf-da56ec34bd62 / tap038c98cb-23 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.039 12 DEBUG ceilometer.compute.pollsters [-] 18ed7c04-c274-458a-9bcf-da56ec34bd62/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8005e9c6-179f-46db-a6a6-ebe922f1eb44', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-0000006c-18ed7c04-c274-458a-9bcf-da56ec34bd62-tap038c98cb-23', 'timestamp': '2025-11-29T07:18:48.035255', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1716255593', 'name': 'tap038c98cb-23', 'instance_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:44:d2:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap038c98cb-23'}, 'message_id': 'a5de006e-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6175.753506018, 'message_signature': '9bc2c1245aa305e1a93b5ce4d05db96d0c5996c065532cfcd4f447c2b2884031'}]}, 'timestamp': '2025-11-29 07:18:48.039618', '_unique_id': 'ea262c850f0a460cb595c3c3b0ee3a1a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.040 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.041 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.041 12 DEBUG ceilometer.compute.pollsters [-] 18ed7c04-c274-458a-9bcf-da56ec34bd62/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '17df3733-3dfe-4090-96b5-8dba958d613f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-0000006c-18ed7c04-c274-458a-9bcf-da56ec34bd62-tap038c98cb-23', 'timestamp': '2025-11-29T07:18:48.041940', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1716255593', 'name': 'tap038c98cb-23', 'instance_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:44:d2:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap038c98cb-23'}, 'message_id': 'a5de6c2a-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6175.753506018, 'message_signature': '952900d86f8342f765e341ca15e3a938700fc4c1ea22073f47448029d75e31fc'}]}, 'timestamp': '2025-11-29 07:18:48.042419', '_unique_id': 'e333ee84d06140dba26829c038d138c8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.043 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.044 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.044 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.045 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1716255593>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1716255593>]
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.045 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.045 12 DEBUG ceilometer.compute.pollsters [-] 18ed7c04-c274-458a-9bcf-da56ec34bd62/disk.device.read.latency volume: 217122753 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.045 12 DEBUG ceilometer.compute.pollsters [-] 18ed7c04-c274-458a-9bcf-da56ec34bd62/disk.device.read.latency volume: 23349665 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6a8ebcb5-bd78-44bb-817a-e2d453cc4807', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 217122753, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62-vda', 'timestamp': '2025-11-29T07:18:48.045469', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1716255593', 'name': 'instance-0000006c', 'instance_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a5def5c8-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6175.716726865, 'message_signature': '1e38675d9b6bb66c5b3e1f0eafae41e056ef7c7ba76c9f26441f24282c42ab94'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 23349665, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62-sda', 'timestamp': '2025-11-29T07:18:48.045469', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1716255593', 'name': 'instance-0000006c', 'instance_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a5df057c-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6175.716726865, 'message_signature': '45f54ef37b8c672771970407c9bfde55a5f602f3956c57e56a556d3582d6d6a9'}]}, 'timestamp': '2025-11-29 07:18:48.046269', '_unique_id': '39c6d560bec24c0da1f2e6c129bea5de'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.047 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.048 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.048 12 DEBUG ceilometer.compute.pollsters [-] 18ed7c04-c274-458a-9bcf-da56ec34bd62/network.incoming.bytes volume: 3997 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '387436b0-dda6-45b2-8e94-3fb21ca14e49', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3997, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-0000006c-18ed7c04-c274-458a-9bcf-da56ec34bd62-tap038c98cb-23', 'timestamp': '2025-11-29T07:18:48.048509', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1716255593', 'name': 'tap038c98cb-23', 'instance_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:44:d2:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap038c98cb-23'}, 'message_id': 'a5df6c1a-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6175.753506018, 'message_signature': '282509c5ae3e86f40d999f3ac2447434ecf4d10485001c514d3b50435434bf82'}]}, 'timestamp': '2025-11-29 07:18:48.048945', '_unique_id': '6f4a51bf473841038f2f065c3319e2fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.049 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.050 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.051 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.051 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1716255593>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1716255593>]
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.051 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.051 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.051 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1716255593>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1716255593>]
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.052 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.052 12 DEBUG ceilometer.compute.pollsters [-] 18ed7c04-c274-458a-9bcf-da56ec34bd62/network.outgoing.packets volume: 29 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '181b3323-5050-4a09-956f-5ac4dacfd191', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 29, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-0000006c-18ed7c04-c274-458a-9bcf-da56ec34bd62-tap038c98cb-23', 'timestamp': '2025-11-29T07:18:48.052240', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1716255593', 'name': 'tap038c98cb-23', 'instance_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:44:d2:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap038c98cb-23'}, 'message_id': 'a5dffe32-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6175.753506018, 'message_signature': '608f4ac85cb2330f3c6b9d7cb7eee44a79616786f5ef9f66f161322939706562'}]}, 'timestamp': '2025-11-29 07:18:48.052678', '_unique_id': '58a5ef4d5b774106b3596bf8187439ba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.053 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.054 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.054 12 DEBUG ceilometer.compute.pollsters [-] 18ed7c04-c274-458a-9bcf-da56ec34bd62/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '36f97701-6a88-46be-8ed2-7a7d7a56d149', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-0000006c-18ed7c04-c274-458a-9bcf-da56ec34bd62-tap038c98cb-23', 'timestamp': '2025-11-29T07:18:48.054713', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1716255593', 'name': 'tap038c98cb-23', 'instance_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:44:d2:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap038c98cb-23'}, 'message_id': 'a5e05f08-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6175.753506018, 'message_signature': 'c014b3b5b80de0c4c92cf76e3e7f006da8433dd55882b9ca42d71357b88c29f2'}]}, 'timestamp': '2025-11-29 07:18:48.055137', '_unique_id': 'bd31186e07174825823f17308a0942d6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.055 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.057 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.075 12 DEBUG ceilometer.compute.pollsters [-] 18ed7c04-c274-458a-9bcf-da56ec34bd62/cpu volume: 11620000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bcde0b64-7935-4572-8831-1d2a058da3de', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11620000000, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62', 'timestamp': '2025-11-29T07:18:48.057176', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1716255593', 'name': 'instance-0000006c', 'instance_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'a5e3858e-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6175.793241325, 'message_signature': 'f9aeb082daa1c701db072a12c36845ce3352fe6cc0a251109ff45b533f65c3f2'}]}, 'timestamp': '2025-11-29 07:18:48.075857', '_unique_id': '492ff02ca57b45538865e8d73adf3863'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.076 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.078 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.078 12 DEBUG ceilometer.compute.pollsters [-] 18ed7c04-c274-458a-9bcf-da56ec34bd62/memory.usage volume: 42.625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '99a90ea3-de8e-4694-bc07-0e8a918687fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.625, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62', 'timestamp': '2025-11-29T07:18:48.078345', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1716255593', 'name': 'instance-0000006c', 'instance_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'a5e3f9e2-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6175.793241325, 'message_signature': 'e13a6f0cf3bda351421ba722d8fdf28a271a72ab7193973046e57c09acc03914'}]}, 'timestamp': '2025-11-29 07:18:48.078730', '_unique_id': '96332a53d7e64564b4c4c59de6da5d68'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.079 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.080 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.080 12 DEBUG ceilometer.compute.pollsters [-] 18ed7c04-c274-458a-9bcf-da56ec34bd62/disk.device.read.bytes volume: 30304768 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.080 12 DEBUG ceilometer.compute.pollsters [-] 18ed7c04-c274-458a-9bcf-da56ec34bd62/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e9378862-b755-445d-8b68-941f954b4bf8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30304768, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62-vda', 'timestamp': '2025-11-29T07:18:48.080632', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1716255593', 'name': 'instance-0000006c', 'instance_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a5e45126-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6175.716726865, 'message_signature': '32d270870ce66346635e48005984bc5ddc7c473c90641d43493792a4b8035da7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62-sda', 'timestamp': '2025-11-29T07:18:48.080632', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1716255593', 'name': 'instance-0000006c', 'instance_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a5e45e3c-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6175.716726865, 'message_signature': '94de153f825ca3a09ce42f0fb9e396ae59e7f8da9a2285a6291813a18a1e5ef3'}]}, 'timestamp': '2025-11-29 07:18:48.081286', '_unique_id': 'f1c195af6ddb4110b36f264d7dcc54cd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.081 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.083 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.083 12 DEBUG ceilometer.compute.pollsters [-] 18ed7c04-c274-458a-9bcf-da56ec34bd62/disk.device.write.latency volume: 8091310254 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.083 12 DEBUG ceilometer.compute.pollsters [-] 18ed7c04-c274-458a-9bcf-da56ec34bd62/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '83d0acb6-b5ae-4718-940f-eca487f9e762', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 8091310254, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62-vda', 'timestamp': '2025-11-29T07:18:48.083161', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1716255593', 'name': 'instance-0000006c', 'instance_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a5e4b3e6-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6175.716726865, 'message_signature': '11b5c829ad936678ca91c92ae5445b349a4cb7ddf95c32d0ef465fa591131499'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62-sda', 'timestamp': '2025-11-29T07:18:48.083161', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1716255593', 'name': 'instance-0000006c', 'instance_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a5e4bff8-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6175.716726865, 'message_signature': '5228bd723f0f7beace1839c125596306b8a1e2b9a45dd5517cd830da5ceea873'}]}, 'timestamp': '2025-11-29 07:18:48.083784', '_unique_id': 'fac4bcfb43614659b2d1d9208911e4a2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.084 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.086 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.086 12 DEBUG ceilometer.compute.pollsters [-] 18ed7c04-c274-458a-9bcf-da56ec34bd62/network.incoming.packets volume: 23 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b32e4a3a-37c3-4fd0-8daf-fa9c89956196', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 23, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-0000006c-18ed7c04-c274-458a-9bcf-da56ec34bd62-tap038c98cb-23', 'timestamp': '2025-11-29T07:18:48.086489', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1716255593', 'name': 'tap038c98cb-23', 'instance_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:44:d2:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap038c98cb-23'}, 'message_id': 'a5e53af0-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6175.753506018, 'message_signature': 'fb467d57de24f63af69d0a91b5c581994443f0614ece87f98bc17fe2fe7f9be1'}]}, 'timestamp': '2025-11-29 07:18:48.087067', '_unique_id': '6b7116c34a91484baef5d6d8b95c14ef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.088 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.101 12 DEBUG ceilometer.compute.pollsters [-] 18ed7c04-c274-458a-9bcf-da56ec34bd62/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.102 12 DEBUG ceilometer.compute.pollsters [-] 18ed7c04-c274-458a-9bcf-da56ec34bd62/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0ee67502-ce07-4df5-8896-65b79b251680', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62-vda', 'timestamp': '2025-11-29T07:18:48.088982', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1716255593', 'name': 'instance-0000006c', 'instance_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a5e7853a-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6175.807215461, 'message_signature': '590c085922989a9af665f990ca17287dc730ff23b39d248296c4858f9c4be337'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62-sda', 'timestamp': '2025-11-29T07:18:48.088982', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1716255593', 'name': 'instance-0000006c', 'instance_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a5e7948a-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6175.807215461, 'message_signature': '42e0e777bc5f995634476e79bf0d073303d27f91974e8fb7bc3be8333c12ee82'}]}, 'timestamp': '2025-11-29 07:18:48.102367', '_unique_id': 'a2987df67def43a2b15515fb53369b92'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.103 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.104 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.104 12 DEBUG ceilometer.compute.pollsters [-] 18ed7c04-c274-458a-9bcf-da56ec34bd62/disk.device.write.bytes volume: 72937472 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:18:48 compute-0 nova_compute[187185]: 2025-11-29 07:18:48.104 187189 DEBUG oslo_concurrency.lockutils [req-4e931299-6498-4a9c-bece-65dd03e60288 req-0fd9688b-1d8f-4062-869a-80dbfd07b390 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-18ed7c04-c274-458a-9bcf-da56ec34bd62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.104 12 DEBUG ceilometer.compute.pollsters [-] 18ed7c04-c274-458a-9bcf-da56ec34bd62/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '865eec81-eba2-47cc-aabf-78277a29f419', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72937472, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62-vda', 'timestamp': '2025-11-29T07:18:48.104395', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1716255593', 'name': 'instance-0000006c', 'instance_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a5e7f0e2-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6175.716726865, 'message_signature': 'ed4434a058cd80881dcec10e4024273225e43d60af95a3633fd48e0a71aeb5bf'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62-sda', 'timestamp': '2025-11-29T07:18:48.104395', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1716255593', 'name': 'instance-0000006c', 'instance_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a5e7fd44-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6175.716726865, 'message_signature': '5b942217aa9a542eb5ec0367a0f3322da19d6aa5ebf8b6c5af1098c2b05f2b2d'}]}, 'timestamp': '2025-11-29 07:18:48.105033', '_unique_id': 'e148bab227894973955578a63ce7e321'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.105 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.106 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.106 12 DEBUG ceilometer.compute.pollsters [-] 18ed7c04-c274-458a-9bcf-da56ec34bd62/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '50cdee95-966e-4cf2-9313-99d8c590441c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-0000006c-18ed7c04-c274-458a-9bcf-da56ec34bd62-tap038c98cb-23', 'timestamp': '2025-11-29T07:18:48.106709', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1716255593', 'name': 'tap038c98cb-23', 'instance_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:44:d2:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap038c98cb-23'}, 'message_id': 'a5e84c40-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6175.753506018, 'message_signature': '99e3b486c5db23f3d3a418da0f69e46c750f03e6217f7277c95d2c8cabb1f368'}]}, 'timestamp': '2025-11-29 07:18:48.107083', '_unique_id': 'd96c50bb5044462dae30223b6626dbdd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.107 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.108 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.108 12 DEBUG ceilometer.compute.pollsters [-] 18ed7c04-c274-458a-9bcf-da56ec34bd62/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 DEBUG ceilometer.compute.pollsters [-] 18ed7c04-c274-458a-9bcf-da56ec34bd62/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fdfaf823-2ebf-4667-90d3-4a5cbe9ca6af', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62-vda', 'timestamp': '2025-11-29T07:18:48.108703', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1716255593', 'name': 'instance-0000006c', 'instance_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a5e899de-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6175.807215461, 'message_signature': '2730c241d834dc52ec89b137becb1b926d8f0626f13d52f36c8166bab6fb8bed'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62-sda', 'timestamp': '2025-11-29T07:18:48.108703', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1716255593', 'name': 'instance-0000006c', 'instance_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a5e8a564-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6175.807215461, 'message_signature': '3fb74632607ddf11e59842b14358ce6013a632954decb9922c360885e17dceb0'}]}, 'timestamp': '2025-11-29 07:18:48.109321', '_unique_id': '36f003de6d014f73bf7584efd274ee4d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.109 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.110 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.110 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.110 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1716255593>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1716255593>]
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.110 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 DEBUG ceilometer.compute.pollsters [-] 18ed7c04-c274-458a-9bcf-da56ec34bd62/network.outgoing.bytes volume: 3488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3a8b3f0e-2620-4925-83e7-5b00190ac986', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3488, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-0000006c-18ed7c04-c274-458a-9bcf-da56ec34bd62-tap038c98cb-23', 'timestamp': '2025-11-29T07:18:48.110984', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1716255593', 'name': 'tap038c98cb-23', 'instance_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:44:d2:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap038c98cb-23'}, 'message_id': 'a5e8f000-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6175.753506018, 'message_signature': '8ae9209123627f65240b52e073c00d04fa12d432f35fc8b5f12f48b97908c1e2'}]}, 'timestamp': '2025-11-29 07:18:48.111212', '_unique_id': '3e17bdb3f2924999b35b7198a584463e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.111 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.112 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.112 12 DEBUG ceilometer.compute.pollsters [-] 18ed7c04-c274-458a-9bcf-da56ec34bd62/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9ee58508-6b26-422e-beca-c42da1295400', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-0000006c-18ed7c04-c274-458a-9bcf-da56ec34bd62-tap038c98cb-23', 'timestamp': '2025-11-29T07:18:48.112471', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1716255593', 'name': 'tap038c98cb-23', 'instance_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:44:d2:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap038c98cb-23'}, 'message_id': 'a5e929e4-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6175.753506018, 'message_signature': '731afbd2c3c9ad5fe1cdc66f02e845121fd394430872da0ebdd4aaa323cdc486'}]}, 'timestamp': '2025-11-29 07:18:48.112694', '_unique_id': '3502cd51fa6843399b5549d81c5ab6cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.113 12 DEBUG ceilometer.compute.pollsters [-] 18ed7c04-c274-458a-9bcf-da56ec34bd62/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '18a61cd1-cc23-49c4-863a-84b6a4c38e36', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-0000006c-18ed7c04-c274-458a-9bcf-da56ec34bd62-tap038c98cb-23', 'timestamp': '2025-11-29T07:18:48.113750', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1716255593', 'name': 'tap038c98cb-23', 'instance_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:44:d2:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap038c98cb-23'}, 'message_id': 'a5e95cac-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6175.753506018, 'message_signature': 'fc317bd006b5fb10be6580d0e0eed416575aeb8221b20d6d3a7db0c00de80a17'}]}, 'timestamp': '2025-11-29 07:18:48.113994', '_unique_id': '9685720db7cd475c9298e0f54376f6bd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.114 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 DEBUG ceilometer.compute.pollsters [-] 18ed7c04-c274-458a-9bcf-da56ec34bd62/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 DEBUG ceilometer.compute.pollsters [-] 18ed7c04-c274-458a-9bcf-da56ec34bd62/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '79506dbf-0924-4e17-8df0-40ccca2bd0a6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62-vda', 'timestamp': '2025-11-29T07:18:48.115070', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1716255593', 'name': 'instance-0000006c', 'instance_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a5e98f6a-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6175.807215461, 'message_signature': 'a86fbc4fdf9f87bfeec2ddd448e1d8f603de7d2c3353b5c63f4e66962d7cd9fa'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 
'18ed7c04-c274-458a-9bcf-da56ec34bd62-sda', 'timestamp': '2025-11-29T07:18:48.115070', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1716255593', 'name': 'instance-0000006c', 'instance_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a5e9976c-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6175.807215461, 'message_signature': '072e434cbef8e2c36a78dc8c2384c803ab0622b3a4aea08f7a9ca61e114b4b48'}]}, 'timestamp': '2025-11-29 07:18:48.115499', '_unique_id': '153597ee362d49b2914d744200f403e2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.115 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.116 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.116 12 DEBUG ceilometer.compute.pollsters [-] 18ed7c04-c274-458a-9bcf-da56ec34bd62/disk.device.write.requests volume: 314 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.116 12 DEBUG ceilometer.compute.pollsters [-] 18ed7c04-c274-458a-9bcf-da56ec34bd62/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2f2d1679-1526-4abe-ac20-3f84aeaf8806', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 314, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62-vda', 'timestamp': '2025-11-29T07:18:48.116565', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1716255593', 'name': 'instance-0000006c', 'instance_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a5e9ca2a-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6175.716726865, 'message_signature': '1a9d951b7b3f1ef27c68390a59c73da7cd38e21d8034dc26eae399808a87d4da'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 
'project_name': None, 'resource_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62-sda', 'timestamp': '2025-11-29T07:18:48.116565', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1716255593', 'name': 'instance-0000006c', 'instance_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a5e9d27c-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6175.716726865, 'message_signature': 'cebd35148e2bab3508fe0355499f5495a081cce1a40253838c95b4927a263950'}]}, 'timestamp': '2025-11-29 07:18:48.116996', '_unique_id': '92d1b911b5c548fb92cd29c788acf47b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:18:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:18:48.117 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:18:48 compute-0 nova_compute[187185]: 2025-11-29 07:18:48.231 187189 INFO nova.compute.manager [None req-f4499ff1-7d62-473a-80dc-4b38f80c2206 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Pausing
Nov 29 07:18:48 compute-0 nova_compute[187185]: 2025-11-29 07:18:48.232 187189 DEBUG nova.objects.instance [None req-f4499ff1-7d62-473a-80dc-4b38f80c2206 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'flavor' on Instance uuid 18ed7c04-c274-458a-9bcf-da56ec34bd62 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:18:48 compute-0 nova_compute[187185]: 2025-11-29 07:18:48.280 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:18:48 compute-0 nova_compute[187185]: 2025-11-29 07:18:48.315 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400728.3150358, 18ed7c04-c274-458a-9bcf-da56ec34bd62 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:18:48 compute-0 nova_compute[187185]: 2025-11-29 07:18:48.315 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] VM Paused (Lifecycle Event)
Nov 29 07:18:48 compute-0 nova_compute[187185]: 2025-11-29 07:18:48.317 187189 DEBUG nova.compute.manager [None req-f4499ff1-7d62-473a-80dc-4b38f80c2206 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:18:48 compute-0 nova_compute[187185]: 2025-11-29 07:18:48.357 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:18:48 compute-0 nova_compute[187185]: 2025-11-29 07:18:48.362 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:18:48 compute-0 nova_compute[187185]: 2025-11-29 07:18:48.561 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] During sync_power_state the instance has a pending task (pausing). Skip.
Nov 29 07:18:50 compute-0 nova_compute[187185]: 2025-11-29 07:18:50.623 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:18:53 compute-0 nova_compute[187185]: 2025-11-29 07:18:53.285 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:18:53 compute-0 nova_compute[187185]: 2025-11-29 07:18:53.919 187189 INFO nova.compute.manager [None req-1945857d-bf74-409d-9e3d-cf14f8a38bd2 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Get console output
Nov 29 07:18:53 compute-0 nova_compute[187185]: 2025-11-29 07:18:53.926 213758 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 29 07:18:54 compute-0 podman[231017]: 2025-11-29 07:18:54.806687868 +0000 UTC m=+0.062184715 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 07:18:55 compute-0 nova_compute[187185]: 2025-11-29 07:18:55.626 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:18:55 compute-0 nova_compute[187185]: 2025-11-29 07:18:55.999 187189 INFO nova.compute.manager [None req-d6df44e9-33a7-4c59-aeee-31340261fdaf bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Unpausing
Nov 29 07:18:56 compute-0 nova_compute[187185]: 2025-11-29 07:18:56.000 187189 DEBUG nova.objects.instance [None req-d6df44e9-33a7-4c59-aeee-31340261fdaf bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'flavor' on Instance uuid 18ed7c04-c274-458a-9bcf-da56ec34bd62 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:18:56 compute-0 podman[231041]: 2025-11-29 07:18:56.786749934 +0000 UTC m=+0.057753190 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd)
Nov 29 07:18:56 compute-0 podman[231042]: 2025-11-29 07:18:56.79368409 +0000 UTC m=+0.059125668 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 29 07:18:57 compute-0 nova_compute[187185]: 2025-11-29 07:18:57.655 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400737.6556618, 18ed7c04-c274-458a-9bcf-da56ec34bd62 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:18:57 compute-0 nova_compute[187185]: 2025-11-29 07:18:57.656 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] VM Resumed (Lifecycle Event)
Nov 29 07:18:57 compute-0 virtqemud[186729]: argument unsupported: QEMU guest agent is not configured
Nov 29 07:18:57 compute-0 nova_compute[187185]: 2025-11-29 07:18:57.659 187189 DEBUG nova.virt.libvirt.guest [None req-d6df44e9-33a7-4c59-aeee-31340261fdaf bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Nov 29 07:18:57 compute-0 nova_compute[187185]: 2025-11-29 07:18:57.660 187189 DEBUG nova.compute.manager [None req-d6df44e9-33a7-4c59-aeee-31340261fdaf bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:18:57 compute-0 nova_compute[187185]: 2025-11-29 07:18:57.692 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:18:57 compute-0 nova_compute[187185]: 2025-11-29 07:18:57.695 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: paused, current task_state: unpausing, current DB power_state: 3, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:18:58 compute-0 nova_compute[187185]: 2025-11-29 07:18:58.337 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:18:59 compute-0 nova_compute[187185]: 2025-11-29 07:18:59.238 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] During sync_power_state the instance has a pending task (unpausing). Skip.
Nov 29 07:19:00 compute-0 nova_compute[187185]: 2025-11-29 07:19:00.671 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:19:03 compute-0 nova_compute[187185]: 2025-11-29 07:19:03.341 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:19:05 compute-0 nova_compute[187185]: 2025-11-29 07:19:05.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:19:05 compute-0 nova_compute[187185]: 2025-11-29 07:19:05.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:19:05 compute-0 nova_compute[187185]: 2025-11-29 07:19:05.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:19:05 compute-0 nova_compute[187185]: 2025-11-29 07:19:05.637 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "refresh_cache-18ed7c04-c274-458a-9bcf-da56ec34bd62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:19:05 compute-0 nova_compute[187185]: 2025-11-29 07:19:05.637 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquired lock "refresh_cache-18ed7c04-c274-458a-9bcf-da56ec34bd62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:19:05 compute-0 nova_compute[187185]: 2025-11-29 07:19:05.637 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 29 07:19:05 compute-0 nova_compute[187185]: 2025-11-29 07:19:05.638 187189 DEBUG nova.objects.instance [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 18ed7c04-c274-458a-9bcf-da56ec34bd62 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:19:05 compute-0 nova_compute[187185]: 2025-11-29 07:19:05.674 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:19:06 compute-0 nova_compute[187185]: 2025-11-29 07:19:06.422 187189 INFO nova.compute.manager [None req-f1c61a2f-f1e8-44b3-ab97-91b740b78fae bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Get console output
Nov 29 07:19:06 compute-0 nova_compute[187185]: 2025-11-29 07:19:06.427 213758 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 29 07:19:07 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:19:07.558 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=36, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=35) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:19:07 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:19:07.559 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:19:07 compute-0 nova_compute[187185]: 2025-11-29 07:19:07.600 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:19:08 compute-0 nova_compute[187185]: 2025-11-29 07:19:08.345 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:19:08 compute-0 nova_compute[187185]: 2025-11-29 07:19:08.347 187189 DEBUG oslo_concurrency.lockutils [None req-df2bee22-dda1-48b4-904f-4da543ce08c5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "18ed7c04-c274-458a-9bcf-da56ec34bd62" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:19:08 compute-0 nova_compute[187185]: 2025-11-29 07:19:08.348 187189 DEBUG oslo_concurrency.lockutils [None req-df2bee22-dda1-48b4-904f-4da543ce08c5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "18ed7c04-c274-458a-9bcf-da56ec34bd62" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:19:08 compute-0 nova_compute[187185]: 2025-11-29 07:19:08.348 187189 DEBUG oslo_concurrency.lockutils [None req-df2bee22-dda1-48b4-904f-4da543ce08c5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "18ed7c04-c274-458a-9bcf-da56ec34bd62-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:19:08 compute-0 nova_compute[187185]: 2025-11-29 07:19:08.349 187189 DEBUG oslo_concurrency.lockutils [None req-df2bee22-dda1-48b4-904f-4da543ce08c5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "18ed7c04-c274-458a-9bcf-da56ec34bd62-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:19:08 compute-0 nova_compute[187185]: 2025-11-29 07:19:08.349 187189 DEBUG oslo_concurrency.lockutils [None req-df2bee22-dda1-48b4-904f-4da543ce08c5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "18ed7c04-c274-458a-9bcf-da56ec34bd62-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:19:08 compute-0 nova_compute[187185]: 2025-11-29 07:19:08.364 187189 INFO nova.compute.manager [None req-df2bee22-dda1-48b4-904f-4da543ce08c5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Terminating instance
Nov 29 07:19:08 compute-0 nova_compute[187185]: 2025-11-29 07:19:08.381 187189 DEBUG nova.compute.manager [None req-df2bee22-dda1-48b4-904f-4da543ce08c5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 07:19:08 compute-0 nova_compute[187185]: 2025-11-29 07:19:08.396 187189 DEBUG nova.compute.manager [req-f1bcc6bd-4ef6-4eb9-9667-ef11f2f3fbc1 req-71d3e7d7-fa38-420a-820a-0c8442ea9c91 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Received event network-changed-038c98cb-23f6-48a9-b2fd-2dbf4c409143 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:19:08 compute-0 nova_compute[187185]: 2025-11-29 07:19:08.396 187189 DEBUG nova.compute.manager [req-f1bcc6bd-4ef6-4eb9-9667-ef11f2f3fbc1 req-71d3e7d7-fa38-420a-820a-0c8442ea9c91 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Refreshing instance network info cache due to event network-changed-038c98cb-23f6-48a9-b2fd-2dbf4c409143. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:19:08 compute-0 nova_compute[187185]: 2025-11-29 07:19:08.397 187189 DEBUG oslo_concurrency.lockutils [req-f1bcc6bd-4ef6-4eb9-9667-ef11f2f3fbc1 req-71d3e7d7-fa38-420a-820a-0c8442ea9c91 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-18ed7c04-c274-458a-9bcf-da56ec34bd62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:19:08 compute-0 kernel: tap038c98cb-23 (unregistering): left promiscuous mode
Nov 29 07:19:08 compute-0 NetworkManager[55227]: <info>  [1764400748.4047] device (tap038c98cb-23): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:19:08 compute-0 nova_compute[187185]: 2025-11-29 07:19:08.407 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:19:08 compute-0 ovn_controller[95281]: 2025-11-29T07:19:08Z|00298|binding|INFO|Releasing lport 038c98cb-23f6-48a9-b2fd-2dbf4c409143 from this chassis (sb_readonly=0)
Nov 29 07:19:08 compute-0 ovn_controller[95281]: 2025-11-29T07:19:08Z|00299|binding|INFO|Setting lport 038c98cb-23f6-48a9-b2fd-2dbf4c409143 down in Southbound
Nov 29 07:19:08 compute-0 ovn_controller[95281]: 2025-11-29T07:19:08Z|00300|binding|INFO|Removing iface tap038c98cb-23 ovn-installed in OVS
Nov 29 07:19:08 compute-0 nova_compute[187185]: 2025-11-29 07:19:08.411 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:19:08 compute-0 nova_compute[187185]: 2025-11-29 07:19:08.425 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:19:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:19:08.442 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:44:d2:ef 10.100.0.6'], port_security=['fa:16:3e:44:d2:ef 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '18ed7c04-c274-458a-9bcf-da56ec34bd62', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3ac75f3c-d209-4778-86b5-e99a50d91b8d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c231e63624d44fc19e0989abfb1afb22', 'neutron:revision_number': '4', 'neutron:security_group_ids': '11768807-c865-49c1-8b66-4622c035d7da', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=86d0dade-0fd3-4b3f-a097-a83258b59083, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=038c98cb-23f6-48a9-b2fd-2dbf4c409143) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:19:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:19:08.443 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 038c98cb-23f6-48a9-b2fd-2dbf4c409143 in datapath 3ac75f3c-d209-4778-86b5-e99a50d91b8d unbound from our chassis
Nov 29 07:19:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:19:08.446 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3ac75f3c-d209-4778-86b5-e99a50d91b8d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:19:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:19:08.447 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[11580649-3e36-418d-86f3-6c85f86e420c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:19:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:19:08.448 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3ac75f3c-d209-4778-86b5-e99a50d91b8d namespace which is not needed anymore
Nov 29 07:19:08 compute-0 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d0000006c.scope: Deactivated successfully.
Nov 29 07:19:08 compute-0 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d0000006c.scope: Consumed 14.209s CPU time.
Nov 29 07:19:08 compute-0 systemd-machined[153486]: Machine qemu-40-instance-0000006c terminated.
Nov 29 07:19:08 compute-0 nova_compute[187185]: 2025-11-29 07:19:08.499 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Updating instance_info_cache with network_info: [{"id": "038c98cb-23f6-48a9-b2fd-2dbf4c409143", "address": "fa:16:3e:44:d2:ef", "network": {"id": "3ac75f3c-d209-4778-86b5-e99a50d91b8d", "bridge": "br-int", "label": "tempest-network-smoke--1784468808", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap038c98cb-23", "ovs_interfaceid": "038c98cb-23f6-48a9-b2fd-2dbf4c409143", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:19:08 compute-0 nova_compute[187185]: 2025-11-29 07:19:08.514 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Releasing lock "refresh_cache-18ed7c04-c274-458a-9bcf-da56ec34bd62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:19:08 compute-0 nova_compute[187185]: 2025-11-29 07:19:08.514 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 29 07:19:08 compute-0 nova_compute[187185]: 2025-11-29 07:19:08.515 187189 DEBUG oslo_concurrency.lockutils [req-f1bcc6bd-4ef6-4eb9-9667-ef11f2f3fbc1 req-71d3e7d7-fa38-420a-820a-0c8442ea9c91 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-18ed7c04-c274-458a-9bcf-da56ec34bd62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:19:08 compute-0 nova_compute[187185]: 2025-11-29 07:19:08.515 187189 DEBUG nova.network.neutron [req-f1bcc6bd-4ef6-4eb9-9667-ef11f2f3fbc1 req-71d3e7d7-fa38-420a-820a-0c8442ea9c91 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Refreshing network info cache for port 038c98cb-23f6-48a9-b2fd-2dbf4c409143 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:19:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:19:08.561 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '36'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:19:08 compute-0 neutron-haproxy-ovnmeta-3ac75f3c-d209-4778-86b5-e99a50d91b8d[230801]: [NOTICE]   (230805) : haproxy version is 2.8.14-c23fe91
Nov 29 07:19:08 compute-0 neutron-haproxy-ovnmeta-3ac75f3c-d209-4778-86b5-e99a50d91b8d[230801]: [NOTICE]   (230805) : path to executable is /usr/sbin/haproxy
Nov 29 07:19:08 compute-0 neutron-haproxy-ovnmeta-3ac75f3c-d209-4778-86b5-e99a50d91b8d[230801]: [WARNING]  (230805) : Exiting Master process...
Nov 29 07:19:08 compute-0 neutron-haproxy-ovnmeta-3ac75f3c-d209-4778-86b5-e99a50d91b8d[230801]: [ALERT]    (230805) : Current worker (230807) exited with code 143 (Terminated)
Nov 29 07:19:08 compute-0 neutron-haproxy-ovnmeta-3ac75f3c-d209-4778-86b5-e99a50d91b8d[230801]: [WARNING]  (230805) : All workers exited. Exiting... (0)
Nov 29 07:19:08 compute-0 systemd[1]: libpod-3ce608646acacaddf192de43ab5fd681bb0ddf092a2a33139d7667488ab4b9de.scope: Deactivated successfully.
Nov 29 07:19:08 compute-0 conmon[230801]: conmon 3ce608646acacaddf192 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-3ce608646acacaddf192de43ab5fd681bb0ddf092a2a33139d7667488ab4b9de.scope/container/memory.events
Nov 29 07:19:08 compute-0 podman[231106]: 2025-11-29 07:19:08.581604296 +0000 UTC m=+0.046765247 container died 3ce608646acacaddf192de43ab5fd681bb0ddf092a2a33139d7667488ab4b9de (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3ac75f3c-d209-4778-86b5-e99a50d91b8d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:19:08 compute-0 nova_compute[187185]: 2025-11-29 07:19:08.644 187189 INFO nova.virt.libvirt.driver [-] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Instance destroyed successfully.
Nov 29 07:19:08 compute-0 nova_compute[187185]: 2025-11-29 07:19:08.645 187189 DEBUG nova.objects.instance [None req-df2bee22-dda1-48b4-904f-4da543ce08c5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'resources' on Instance uuid 18ed7c04-c274-458a-9bcf-da56ec34bd62 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:19:08 compute-0 nova_compute[187185]: 2025-11-29 07:19:08.668 187189 DEBUG nova.virt.libvirt.vif [None req-df2bee22-dda1-48b4-904f-4da543ce08c5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:17:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1716255593',display_name='tempest-TestNetworkAdvancedServerOps-server-1716255593',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1716255593',id=108,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIavjyz1OvfroRFOIbOeRi+rHfxFQBrSW+Ld/D6LfbYvLBX02KQOODf0uKY/ADuzicnqMPwaohmsFg2wFh++jvC+svictzajGD2N3biFjV8neBCmyTW4WlcmDbgS9W7F8g==',key_name='tempest-TestNetworkAdvancedServerOps-536242546',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:18:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-qkbt03s8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:18:59Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=18ed7c04-c274-458a-9bcf-da56ec34bd62,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "038c98cb-23f6-48a9-b2fd-2dbf4c409143", "address": "fa:16:3e:44:d2:ef", "network": {"id": "3ac75f3c-d209-4778-86b5-e99a50d91b8d", "bridge": "br-int", "label": "tempest-network-smoke--1784468808", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap038c98cb-23", "ovs_interfaceid": "038c98cb-23f6-48a9-b2fd-2dbf4c409143", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:19:08 compute-0 nova_compute[187185]: 2025-11-29 07:19:08.669 187189 DEBUG nova.network.os_vif_util [None req-df2bee22-dda1-48b4-904f-4da543ce08c5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converting VIF {"id": "038c98cb-23f6-48a9-b2fd-2dbf4c409143", "address": "fa:16:3e:44:d2:ef", "network": {"id": "3ac75f3c-d209-4778-86b5-e99a50d91b8d", "bridge": "br-int", "label": "tempest-network-smoke--1784468808", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap038c98cb-23", "ovs_interfaceid": "038c98cb-23f6-48a9-b2fd-2dbf4c409143", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:19:08 compute-0 nova_compute[187185]: 2025-11-29 07:19:08.670 187189 DEBUG nova.network.os_vif_util [None req-df2bee22-dda1-48b4-904f-4da543ce08c5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:44:d2:ef,bridge_name='br-int',has_traffic_filtering=True,id=038c98cb-23f6-48a9-b2fd-2dbf4c409143,network=Network(3ac75f3c-d209-4778-86b5-e99a50d91b8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap038c98cb-23') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:19:08 compute-0 nova_compute[187185]: 2025-11-29 07:19:08.670 187189 DEBUG os_vif [None req-df2bee22-dda1-48b4-904f-4da543ce08c5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:44:d2:ef,bridge_name='br-int',has_traffic_filtering=True,id=038c98cb-23f6-48a9-b2fd-2dbf4c409143,network=Network(3ac75f3c-d209-4778-86b5-e99a50d91b8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap038c98cb-23') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:19:08 compute-0 nova_compute[187185]: 2025-11-29 07:19:08.672 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:19:08 compute-0 nova_compute[187185]: 2025-11-29 07:19:08.672 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap038c98cb-23, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:19:08 compute-0 nova_compute[187185]: 2025-11-29 07:19:08.732 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:19:08 compute-0 nova_compute[187185]: 2025-11-29 07:19:08.734 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:19:08 compute-0 nova_compute[187185]: 2025-11-29 07:19:08.737 187189 INFO os_vif [None req-df2bee22-dda1-48b4-904f-4da543ce08c5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:44:d2:ef,bridge_name='br-int',has_traffic_filtering=True,id=038c98cb-23f6-48a9-b2fd-2dbf4c409143,network=Network(3ac75f3c-d209-4778-86b5-e99a50d91b8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap038c98cb-23')
Nov 29 07:19:08 compute-0 nova_compute[187185]: 2025-11-29 07:19:08.738 187189 INFO nova.virt.libvirt.driver [None req-df2bee22-dda1-48b4-904f-4da543ce08c5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Deleting instance files /var/lib/nova/instances/18ed7c04-c274-458a-9bcf-da56ec34bd62_del
Nov 29 07:19:08 compute-0 nova_compute[187185]: 2025-11-29 07:19:08.738 187189 INFO nova.virt.libvirt.driver [None req-df2bee22-dda1-48b4-904f-4da543ce08c5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Deletion of /var/lib/nova/instances/18ed7c04-c274-458a-9bcf-da56ec34bd62_del complete
Nov 29 07:19:08 compute-0 nova_compute[187185]: 2025-11-29 07:19:08.821 187189 INFO nova.compute.manager [None req-df2bee22-dda1-48b4-904f-4da543ce08c5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Took 0.44 seconds to destroy the instance on the hypervisor.
Nov 29 07:19:08 compute-0 nova_compute[187185]: 2025-11-29 07:19:08.822 187189 DEBUG oslo.service.loopingcall [None req-df2bee22-dda1-48b4-904f-4da543ce08c5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 07:19:08 compute-0 nova_compute[187185]: 2025-11-29 07:19:08.822 187189 DEBUG nova.compute.manager [-] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 07:19:08 compute-0 nova_compute[187185]: 2025-11-29 07:19:08.822 187189 DEBUG nova.network.neutron [-] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 07:19:08 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3ce608646acacaddf192de43ab5fd681bb0ddf092a2a33139d7667488ab4b9de-userdata-shm.mount: Deactivated successfully.
Nov 29 07:19:08 compute-0 systemd[1]: var-lib-containers-storage-overlay-4cf5f324c83e0e329004dbe96b12a8efb6cdd593ebbf8431d32489d24008cbe6-merged.mount: Deactivated successfully.
Nov 29 07:19:08 compute-0 podman[231106]: 2025-11-29 07:19:08.967250385 +0000 UTC m=+0.432411336 container cleanup 3ce608646acacaddf192de43ab5fd681bb0ddf092a2a33139d7667488ab4b9de (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3ac75f3c-d209-4778-86b5-e99a50d91b8d, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 29 07:19:08 compute-0 systemd[1]: libpod-conmon-3ce608646acacaddf192de43ab5fd681bb0ddf092a2a33139d7667488ab4b9de.scope: Deactivated successfully.
Nov 29 07:19:08 compute-0 podman[231152]: 2025-11-29 07:19:08.999208362 +0000 UTC m=+0.120361235 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, config_id=edpm, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.buildah.version=1.33.7, release=1755695350, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 29 07:19:09 compute-0 podman[231151]: 2025-11-29 07:19:09.010592015 +0000 UTC m=+0.139367344 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 29 07:19:09 compute-0 podman[231153]: 2025-11-29 07:19:09.032812025 +0000 UTC m=+0.146283400 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 07:19:09 compute-0 podman[231194]: 2025-11-29 07:19:09.053488752 +0000 UTC m=+0.056583346 container remove 3ce608646acacaddf192de43ab5fd681bb0ddf092a2a33139d7667488ab4b9de (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3ac75f3c-d209-4778-86b5-e99a50d91b8d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:19:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:19:09.058 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d0a4846f-76e6-47f3-a36f-18310632c585]: (4, ('Sat Nov 29 07:19:08 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-3ac75f3c-d209-4778-86b5-e99a50d91b8d (3ce608646acacaddf192de43ab5fd681bb0ddf092a2a33139d7667488ab4b9de)\n3ce608646acacaddf192de43ab5fd681bb0ddf092a2a33139d7667488ab4b9de\nSat Nov 29 07:19:08 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-3ac75f3c-d209-4778-86b5-e99a50d91b8d (3ce608646acacaddf192de43ab5fd681bb0ddf092a2a33139d7667488ab4b9de)\n3ce608646acacaddf192de43ab5fd681bb0ddf092a2a33139d7667488ab4b9de\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:19:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:19:09.060 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[745e1a55-266c-4e74-8c2c-a463c3e69cb4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:19:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:19:09.061 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3ac75f3c-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:19:09 compute-0 nova_compute[187185]: 2025-11-29 07:19:09.063 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:19:09 compute-0 kernel: tap3ac75f3c-d0: left promiscuous mode
Nov 29 07:19:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:19:09.068 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[7ced5c73-8c81-4550-b834-091c31eba9c3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:19:09 compute-0 nova_compute[187185]: 2025-11-29 07:19:09.075 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:19:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:19:09.082 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[a525cbdb-a4d4-4171-8b60-a54a5f1170cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:19:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:19:09.083 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6d460bcd-dfcd-4b9c-9585-4704967d8ab2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:19:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:19:09.101 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[7369eeb4-921e-4dc7-858b-e33bfc7abab9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 614226, 'reachable_time': 40365, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231228, 'error': None, 'target': 'ovnmeta-3ac75f3c-d209-4778-86b5-e99a50d91b8d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:19:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:19:09.104 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3ac75f3c-d209-4778-86b5-e99a50d91b8d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:19:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:19:09.104 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[6aa38e62-7294-436c-b4d6-7a5e708bc835]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:19:09 compute-0 systemd[1]: run-netns-ovnmeta\x2d3ac75f3c\x2dd209\x2d4778\x2d86b5\x2de99a50d91b8d.mount: Deactivated successfully.
Nov 29 07:19:09 compute-0 nova_compute[187185]: 2025-11-29 07:19:09.161 187189 DEBUG nova.compute.manager [req-81cbc089-50f6-4a88-aa02-4f84697f9efc req-eb90c3f3-6661-4da2-849c-b42a0fe0b47a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Received event network-vif-unplugged-038c98cb-23f6-48a9-b2fd-2dbf4c409143 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:19:09 compute-0 nova_compute[187185]: 2025-11-29 07:19:09.162 187189 DEBUG oslo_concurrency.lockutils [req-81cbc089-50f6-4a88-aa02-4f84697f9efc req-eb90c3f3-6661-4da2-849c-b42a0fe0b47a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "18ed7c04-c274-458a-9bcf-da56ec34bd62-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:19:09 compute-0 nova_compute[187185]: 2025-11-29 07:19:09.162 187189 DEBUG oslo_concurrency.lockutils [req-81cbc089-50f6-4a88-aa02-4f84697f9efc req-eb90c3f3-6661-4da2-849c-b42a0fe0b47a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "18ed7c04-c274-458a-9bcf-da56ec34bd62-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:19:09 compute-0 nova_compute[187185]: 2025-11-29 07:19:09.162 187189 DEBUG oslo_concurrency.lockutils [req-81cbc089-50f6-4a88-aa02-4f84697f9efc req-eb90c3f3-6661-4da2-849c-b42a0fe0b47a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "18ed7c04-c274-458a-9bcf-da56ec34bd62-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:19:09 compute-0 nova_compute[187185]: 2025-11-29 07:19:09.163 187189 DEBUG nova.compute.manager [req-81cbc089-50f6-4a88-aa02-4f84697f9efc req-eb90c3f3-6661-4da2-849c-b42a0fe0b47a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] No waiting events found dispatching network-vif-unplugged-038c98cb-23f6-48a9-b2fd-2dbf4c409143 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:19:09 compute-0 nova_compute[187185]: 2025-11-29 07:19:09.163 187189 DEBUG nova.compute.manager [req-81cbc089-50f6-4a88-aa02-4f84697f9efc req-eb90c3f3-6661-4da2-849c-b42a0fe0b47a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Received event network-vif-unplugged-038c98cb-23f6-48a9-b2fd-2dbf4c409143 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 07:19:09 compute-0 nova_compute[187185]: 2025-11-29 07:19:09.631 187189 DEBUG nova.network.neutron [-] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:19:09 compute-0 nova_compute[187185]: 2025-11-29 07:19:09.653 187189 INFO nova.compute.manager [-] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Took 0.83 seconds to deallocate network for instance.
Nov 29 07:19:09 compute-0 nova_compute[187185]: 2025-11-29 07:19:09.754 187189 DEBUG nova.compute.manager [req-d06caedf-5897-4d09-a9d7-bf83d1d4f156 req-f35c02ed-a5aa-472f-a422-327e47880e81 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Received event network-vif-deleted-038c98cb-23f6-48a9-b2fd-2dbf4c409143 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:19:09 compute-0 nova_compute[187185]: 2025-11-29 07:19:09.889 187189 DEBUG oslo_concurrency.lockutils [None req-df2bee22-dda1-48b4-904f-4da543ce08c5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:19:09 compute-0 nova_compute[187185]: 2025-11-29 07:19:09.890 187189 DEBUG oslo_concurrency.lockutils [None req-df2bee22-dda1-48b4-904f-4da543ce08c5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:19:10 compute-0 nova_compute[187185]: 2025-11-29 07:19:10.024 187189 DEBUG nova.scheduler.client.report [None req-df2bee22-dda1-48b4-904f-4da543ce08c5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Refreshing inventories for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 29 07:19:10 compute-0 nova_compute[187185]: 2025-11-29 07:19:10.103 187189 DEBUG nova.scheduler.client.report [None req-df2bee22-dda1-48b4-904f-4da543ce08c5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Updating ProviderTree inventory for provider 4e39a026-df39-4e20-874a-dbb5a40df044 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 29 07:19:10 compute-0 nova_compute[187185]: 2025-11-29 07:19:10.105 187189 DEBUG nova.compute.provider_tree [None req-df2bee22-dda1-48b4-904f-4da543ce08c5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Updating inventory in ProviderTree for provider 4e39a026-df39-4e20-874a-dbb5a40df044 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 29 07:19:10 compute-0 nova_compute[187185]: 2025-11-29 07:19:10.213 187189 DEBUG nova.scheduler.client.report [None req-df2bee22-dda1-48b4-904f-4da543ce08c5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Refreshing aggregate associations for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 29 07:19:10 compute-0 nova_compute[187185]: 2025-11-29 07:19:10.237 187189 DEBUG nova.scheduler.client.report [None req-df2bee22-dda1-48b4-904f-4da543ce08c5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Refreshing trait associations for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044, traits: HW_CPU_X86_SSE,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_TRUSTED_CERTS,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 29 07:19:10 compute-0 nova_compute[187185]: 2025-11-29 07:19:10.284 187189 DEBUG nova.compute.provider_tree [None req-df2bee22-dda1-48b4-904f-4da543ce08c5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:19:10 compute-0 nova_compute[187185]: 2025-11-29 07:19:10.303 187189 DEBUG nova.scheduler.client.report [None req-df2bee22-dda1-48b4-904f-4da543ce08c5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:19:10 compute-0 nova_compute[187185]: 2025-11-29 07:19:10.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:19:10 compute-0 nova_compute[187185]: 2025-11-29 07:19:10.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:19:10 compute-0 nova_compute[187185]: 2025-11-29 07:19:10.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:19:10 compute-0 nova_compute[187185]: 2025-11-29 07:19:10.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:19:10 compute-0 nova_compute[187185]: 2025-11-29 07:19:10.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:19:10 compute-0 nova_compute[187185]: 2025-11-29 07:19:10.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:19:10 compute-0 nova_compute[187185]: 2025-11-29 07:19:10.325 187189 DEBUG oslo_concurrency.lockutils [None req-df2bee22-dda1-48b4-904f-4da543ce08c5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.435s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:19:10 compute-0 nova_compute[187185]: 2025-11-29 07:19:10.378 187189 INFO nova.scheduler.client.report [None req-df2bee22-dda1-48b4-904f-4da543ce08c5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Deleted allocations for instance 18ed7c04-c274-458a-9bcf-da56ec34bd62
Nov 29 07:19:10 compute-0 nova_compute[187185]: 2025-11-29 07:19:10.467 187189 DEBUG oslo_concurrency.lockutils [None req-df2bee22-dda1-48b4-904f-4da543ce08c5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "18ed7c04-c274-458a-9bcf-da56ec34bd62" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:19:10 compute-0 nova_compute[187185]: 2025-11-29 07:19:10.678 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:19:11 compute-0 nova_compute[187185]: 2025-11-29 07:19:11.151 187189 DEBUG nova.network.neutron [req-f1bcc6bd-4ef6-4eb9-9667-ef11f2f3fbc1 req-71d3e7d7-fa38-420a-820a-0c8442ea9c91 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Updated VIF entry in instance network info cache for port 038c98cb-23f6-48a9-b2fd-2dbf4c409143. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:19:11 compute-0 nova_compute[187185]: 2025-11-29 07:19:11.151 187189 DEBUG nova.network.neutron [req-f1bcc6bd-4ef6-4eb9-9667-ef11f2f3fbc1 req-71d3e7d7-fa38-420a-820a-0c8442ea9c91 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Updating instance_info_cache with network_info: [{"id": "038c98cb-23f6-48a9-b2fd-2dbf4c409143", "address": "fa:16:3e:44:d2:ef", "network": {"id": "3ac75f3c-d209-4778-86b5-e99a50d91b8d", "bridge": "br-int", "label": "tempest-network-smoke--1784468808", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap038c98cb-23", "ovs_interfaceid": "038c98cb-23f6-48a9-b2fd-2dbf4c409143", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:19:11 compute-0 nova_compute[187185]: 2025-11-29 07:19:11.184 187189 DEBUG oslo_concurrency.lockutils [req-f1bcc6bd-4ef6-4eb9-9667-ef11f2f3fbc1 req-71d3e7d7-fa38-420a-820a-0c8442ea9c91 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-18ed7c04-c274-458a-9bcf-da56ec34bd62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:19:11 compute-0 nova_compute[187185]: 2025-11-29 07:19:11.238 187189 DEBUG nova.compute.manager [req-37a5ac53-de49-4ded-991c-128cdd28b3d4 req-d947792f-03d8-4702-8f23-7afe982348dc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Received event network-vif-plugged-038c98cb-23f6-48a9-b2fd-2dbf4c409143 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:19:11 compute-0 nova_compute[187185]: 2025-11-29 07:19:11.238 187189 DEBUG oslo_concurrency.lockutils [req-37a5ac53-de49-4ded-991c-128cdd28b3d4 req-d947792f-03d8-4702-8f23-7afe982348dc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "18ed7c04-c274-458a-9bcf-da56ec34bd62-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:19:11 compute-0 nova_compute[187185]: 2025-11-29 07:19:11.239 187189 DEBUG oslo_concurrency.lockutils [req-37a5ac53-de49-4ded-991c-128cdd28b3d4 req-d947792f-03d8-4702-8f23-7afe982348dc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "18ed7c04-c274-458a-9bcf-da56ec34bd62-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:19:11 compute-0 nova_compute[187185]: 2025-11-29 07:19:11.239 187189 DEBUG oslo_concurrency.lockutils [req-37a5ac53-de49-4ded-991c-128cdd28b3d4 req-d947792f-03d8-4702-8f23-7afe982348dc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "18ed7c04-c274-458a-9bcf-da56ec34bd62-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:19:11 compute-0 nova_compute[187185]: 2025-11-29 07:19:11.239 187189 DEBUG nova.compute.manager [req-37a5ac53-de49-4ded-991c-128cdd28b3d4 req-d947792f-03d8-4702-8f23-7afe982348dc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] No waiting events found dispatching network-vif-plugged-038c98cb-23f6-48a9-b2fd-2dbf4c409143 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:19:11 compute-0 nova_compute[187185]: 2025-11-29 07:19:11.239 187189 WARNING nova.compute.manager [req-37a5ac53-de49-4ded-991c-128cdd28b3d4 req-d947792f-03d8-4702-8f23-7afe982348dc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Received unexpected event network-vif-plugged-038c98cb-23f6-48a9-b2fd-2dbf4c409143 for instance with vm_state deleted and task_state None.
Nov 29 07:19:11 compute-0 nova_compute[187185]: 2025-11-29 07:19:11.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:19:11 compute-0 nova_compute[187185]: 2025-11-29 07:19:11.434 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:19:11 compute-0 nova_compute[187185]: 2025-11-29 07:19:11.435 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:19:11 compute-0 nova_compute[187185]: 2025-11-29 07:19:11.435 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:19:11 compute-0 nova_compute[187185]: 2025-11-29 07:19:11.435 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:19:11 compute-0 nova_compute[187185]: 2025-11-29 07:19:11.618 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:19:11 compute-0 nova_compute[187185]: 2025-11-29 07:19:11.619 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5728MB free_disk=73.29460525512695GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:19:11 compute-0 nova_compute[187185]: 2025-11-29 07:19:11.619 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:19:11 compute-0 nova_compute[187185]: 2025-11-29 07:19:11.620 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:19:11 compute-0 nova_compute[187185]: 2025-11-29 07:19:11.714 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:19:11 compute-0 nova_compute[187185]: 2025-11-29 07:19:11.715 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:19:11 compute-0 nova_compute[187185]: 2025-11-29 07:19:11.740 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:19:11 compute-0 nova_compute[187185]: 2025-11-29 07:19:11.771 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:19:11 compute-0 nova_compute[187185]: 2025-11-29 07:19:11.802 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:19:11 compute-0 nova_compute[187185]: 2025-11-29 07:19:11.803 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.183s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:19:12 compute-0 nova_compute[187185]: 2025-11-29 07:19:12.807 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:19:13 compute-0 nova_compute[187185]: 2025-11-29 07:19:13.311 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:19:13 compute-0 nova_compute[187185]: 2025-11-29 07:19:13.733 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:19:15 compute-0 nova_compute[187185]: 2025-11-29 07:19:15.505 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:19:15 compute-0 nova_compute[187185]: 2025-11-29 07:19:15.699 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:19:15 compute-0 nova_compute[187185]: 2025-11-29 07:19:15.713 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:19:18 compute-0 nova_compute[187185]: 2025-11-29 07:19:18.736 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:19:18 compute-0 podman[231231]: 2025-11-29 07:19:18.829243382 +0000 UTC m=+0.102072687 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 29 07:19:20 compute-0 nova_compute[187185]: 2025-11-29 07:19:20.703 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:19:23 compute-0 nova_compute[187185]: 2025-11-29 07:19:23.643 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400748.6420784, 18ed7c04-c274-458a-9bcf-da56ec34bd62 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:19:23 compute-0 nova_compute[187185]: 2025-11-29 07:19:23.643 187189 INFO nova.compute.manager [-] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] VM Stopped (Lifecycle Event)
Nov 29 07:19:23 compute-0 nova_compute[187185]: 2025-11-29 07:19:23.737 187189 DEBUG nova.compute.manager [None req-046fed48-d11c-4c93-b777-72d2468d9f7a - - - - - -] [instance: 18ed7c04-c274-458a-9bcf-da56ec34bd62] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:19:23 compute-0 nova_compute[187185]: 2025-11-29 07:19:23.738 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:19:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:19:25.508 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:19:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:19:25.508 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:19:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:19:25.508 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:19:25 compute-0 nova_compute[187185]: 2025-11-29 07:19:25.705 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:19:25 compute-0 podman[231259]: 2025-11-29 07:19:25.813986356 +0000 UTC m=+0.081318337 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 07:19:27 compute-0 podman[231285]: 2025-11-29 07:19:27.785642222 +0000 UTC m=+0.052273554 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 07:19:27 compute-0 podman[231286]: 2025-11-29 07:19:27.817749923 +0000 UTC m=+0.076396488 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 07:19:28 compute-0 nova_compute[187185]: 2025-11-29 07:19:28.740 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:19:30 compute-0 nova_compute[187185]: 2025-11-29 07:19:30.708 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:19:33 compute-0 nova_compute[187185]: 2025-11-29 07:19:33.744 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:19:35 compute-0 nova_compute[187185]: 2025-11-29 07:19:35.710 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:19:38 compute-0 nova_compute[187185]: 2025-11-29 07:19:38.746 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:19:39 compute-0 podman[231324]: 2025-11-29 07:19:39.812083235 +0000 UTC m=+0.063640237 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 07:19:39 compute-0 podman[231323]: 2025-11-29 07:19:39.819084683 +0000 UTC m=+0.070605464 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, version=9.6, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, release=1755695350, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64)
Nov 29 07:19:39 compute-0 podman[231322]: 2025-11-29 07:19:39.835958412 +0000 UTC m=+0.093633237 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Nov 29 07:19:40 compute-0 nova_compute[187185]: 2025-11-29 07:19:40.713 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:19:43 compute-0 nova_compute[187185]: 2025-11-29 07:19:43.749 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:19:45 compute-0 nova_compute[187185]: 2025-11-29 07:19:45.714 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:19:48 compute-0 nova_compute[187185]: 2025-11-29 07:19:48.751 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:19:49 compute-0 podman[231386]: 2025-11-29 07:19:49.82595464 +0000 UTC m=+0.094096550 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 07:19:50 compute-0 nova_compute[187185]: 2025-11-29 07:19:50.716 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:19:53 compute-0 nova_compute[187185]: 2025-11-29 07:19:53.754 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:19:55 compute-0 nova_compute[187185]: 2025-11-29 07:19:55.781 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:19:56 compute-0 podman[231412]: 2025-11-29 07:19:56.779800317 +0000 UTC m=+0.048393603 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 07:19:58 compute-0 nova_compute[187185]: 2025-11-29 07:19:58.756 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:19:58 compute-0 podman[231436]: 2025-11-29 07:19:58.806456133 +0000 UTC m=+0.072562959 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 29 07:19:58 compute-0 podman[231437]: 2025-11-29 07:19:58.8313787 +0000 UTC m=+0.090515129 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:20:00 compute-0 nova_compute[187185]: 2025-11-29 07:20:00.784 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:03 compute-0 nova_compute[187185]: 2025-11-29 07:20:03.758 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:05 compute-0 nova_compute[187185]: 2025-11-29 07:20:05.786 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:06 compute-0 nova_compute[187185]: 2025-11-29 07:20:06.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:20:06 compute-0 nova_compute[187185]: 2025-11-29 07:20:06.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:20:06 compute-0 nova_compute[187185]: 2025-11-29 07:20:06.318 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:20:06 compute-0 nova_compute[187185]: 2025-11-29 07:20:06.339 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 07:20:07 compute-0 nova_compute[187185]: 2025-11-29 07:20:07.352 187189 DEBUG oslo_concurrency.lockutils [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "385b61e0-d06f-45d5-833f-956226dbe647" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:20:07 compute-0 nova_compute[187185]: 2025-11-29 07:20:07.353 187189 DEBUG oslo_concurrency.lockutils [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "385b61e0-d06f-45d5-833f-956226dbe647" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:20:07 compute-0 nova_compute[187185]: 2025-11-29 07:20:07.371 187189 DEBUG nova.compute.manager [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 07:20:07 compute-0 nova_compute[187185]: 2025-11-29 07:20:07.484 187189 DEBUG oslo_concurrency.lockutils [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:20:07 compute-0 nova_compute[187185]: 2025-11-29 07:20:07.484 187189 DEBUG oslo_concurrency.lockutils [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:20:07 compute-0 nova_compute[187185]: 2025-11-29 07:20:07.491 187189 DEBUG nova.virt.hardware [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 07:20:07 compute-0 nova_compute[187185]: 2025-11-29 07:20:07.491 187189 INFO nova.compute.claims [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Claim successful on node compute-0.ctlplane.example.com
Nov 29 07:20:07 compute-0 nova_compute[187185]: 2025-11-29 07:20:07.693 187189 DEBUG nova.compute.provider_tree [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:20:07 compute-0 nova_compute[187185]: 2025-11-29 07:20:07.724 187189 DEBUG nova.scheduler.client.report [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:20:07 compute-0 nova_compute[187185]: 2025-11-29 07:20:07.780 187189 DEBUG oslo_concurrency.lockutils [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.295s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:20:07 compute-0 nova_compute[187185]: 2025-11-29 07:20:07.780 187189 DEBUG nova.compute.manager [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 07:20:07 compute-0 nova_compute[187185]: 2025-11-29 07:20:07.839 187189 DEBUG nova.compute.manager [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 07:20:07 compute-0 nova_compute[187185]: 2025-11-29 07:20:07.839 187189 DEBUG nova.network.neutron [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 07:20:07 compute-0 nova_compute[187185]: 2025-11-29 07:20:07.930 187189 INFO nova.virt.libvirt.driver [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 07:20:07 compute-0 nova_compute[187185]: 2025-11-29 07:20:07.978 187189 DEBUG nova.compute.manager [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 07:20:08 compute-0 nova_compute[187185]: 2025-11-29 07:20:08.123 187189 DEBUG nova.compute.manager [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 07:20:08 compute-0 nova_compute[187185]: 2025-11-29 07:20:08.125 187189 DEBUG nova.virt.libvirt.driver [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 07:20:08 compute-0 nova_compute[187185]: 2025-11-29 07:20:08.126 187189 INFO nova.virt.libvirt.driver [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Creating image(s)
Nov 29 07:20:08 compute-0 nova_compute[187185]: 2025-11-29 07:20:08.127 187189 DEBUG oslo_concurrency.lockutils [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "/var/lib/nova/instances/385b61e0-d06f-45d5-833f-956226dbe647/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:20:08 compute-0 nova_compute[187185]: 2025-11-29 07:20:08.128 187189 DEBUG oslo_concurrency.lockutils [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "/var/lib/nova/instances/385b61e0-d06f-45d5-833f-956226dbe647/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:20:08 compute-0 nova_compute[187185]: 2025-11-29 07:20:08.129 187189 DEBUG oslo_concurrency.lockutils [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "/var/lib/nova/instances/385b61e0-d06f-45d5-833f-956226dbe647/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:20:08 compute-0 nova_compute[187185]: 2025-11-29 07:20:08.161 187189 DEBUG nova.policy [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 07:20:08 compute-0 nova_compute[187185]: 2025-11-29 07:20:08.167 187189 DEBUG oslo_concurrency.processutils [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:20:08 compute-0 nova_compute[187185]: 2025-11-29 07:20:08.250 187189 DEBUG oslo_concurrency.processutils [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:20:08 compute-0 nova_compute[187185]: 2025-11-29 07:20:08.251 187189 DEBUG oslo_concurrency.lockutils [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:20:08 compute-0 nova_compute[187185]: 2025-11-29 07:20:08.251 187189 DEBUG oslo_concurrency.lockutils [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:20:08 compute-0 nova_compute[187185]: 2025-11-29 07:20:08.263 187189 DEBUG oslo_concurrency.processutils [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:20:08 compute-0 nova_compute[187185]: 2025-11-29 07:20:08.343 187189 DEBUG oslo_concurrency.processutils [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:20:08 compute-0 nova_compute[187185]: 2025-11-29 07:20:08.344 187189 DEBUG oslo_concurrency.processutils [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/385b61e0-d06f-45d5-833f-956226dbe647/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:20:08 compute-0 nova_compute[187185]: 2025-11-29 07:20:08.445 187189 DEBUG oslo_concurrency.processutils [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/385b61e0-d06f-45d5-833f-956226dbe647/disk 1073741824" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:20:08 compute-0 nova_compute[187185]: 2025-11-29 07:20:08.448 187189 DEBUG oslo_concurrency.lockutils [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.196s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:20:08 compute-0 nova_compute[187185]: 2025-11-29 07:20:08.448 187189 DEBUG oslo_concurrency.processutils [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:20:08 compute-0 nova_compute[187185]: 2025-11-29 07:20:08.515 187189 DEBUG oslo_concurrency.processutils [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:20:08 compute-0 nova_compute[187185]: 2025-11-29 07:20:08.516 187189 DEBUG nova.virt.disk.api [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Checking if we can resize image /var/lib/nova/instances/385b61e0-d06f-45d5-833f-956226dbe647/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:20:08 compute-0 nova_compute[187185]: 2025-11-29 07:20:08.517 187189 DEBUG oslo_concurrency.processutils [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/385b61e0-d06f-45d5-833f-956226dbe647/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:20:08 compute-0 nova_compute[187185]: 2025-11-29 07:20:08.578 187189 DEBUG oslo_concurrency.processutils [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/385b61e0-d06f-45d5-833f-956226dbe647/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:20:08 compute-0 nova_compute[187185]: 2025-11-29 07:20:08.579 187189 DEBUG nova.virt.disk.api [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Cannot resize image /var/lib/nova/instances/385b61e0-d06f-45d5-833f-956226dbe647/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:20:08 compute-0 nova_compute[187185]: 2025-11-29 07:20:08.580 187189 DEBUG nova.objects.instance [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'migration_context' on Instance uuid 385b61e0-d06f-45d5-833f-956226dbe647 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:20:08 compute-0 nova_compute[187185]: 2025-11-29 07:20:08.598 187189 DEBUG nova.virt.libvirt.driver [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 07:20:08 compute-0 nova_compute[187185]: 2025-11-29 07:20:08.599 187189 DEBUG nova.virt.libvirt.driver [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Ensure instance console log exists: /var/lib/nova/instances/385b61e0-d06f-45d5-833f-956226dbe647/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 07:20:08 compute-0 nova_compute[187185]: 2025-11-29 07:20:08.599 187189 DEBUG oslo_concurrency.lockutils [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:20:08 compute-0 nova_compute[187185]: 2025-11-29 07:20:08.600 187189 DEBUG oslo_concurrency.lockutils [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:20:08 compute-0 nova_compute[187185]: 2025-11-29 07:20:08.600 187189 DEBUG oslo_concurrency.lockutils [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:20:08 compute-0 nova_compute[187185]: 2025-11-29 07:20:08.760 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:08.796 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=37, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=36) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:20:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:08.798 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:20:08 compute-0 nova_compute[187185]: 2025-11-29 07:20:08.798 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:09 compute-0 nova_compute[187185]: 2025-11-29 07:20:09.609 187189 DEBUG nova.network.neutron [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Successfully created port: d69c5dfd-952c-44e7-9e26-18e9807fcaf6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 07:20:10 compute-0 nova_compute[187185]: 2025-11-29 07:20:10.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:20:10 compute-0 nova_compute[187185]: 2025-11-29 07:20:10.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:20:10 compute-0 nova_compute[187185]: 2025-11-29 07:20:10.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:20:10 compute-0 nova_compute[187185]: 2025-11-29 07:20:10.789 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:10 compute-0 podman[231492]: 2025-11-29 07:20:10.792739101 +0000 UTC m=+0.051283729 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 07:20:10 compute-0 podman[231490]: 2025-11-29 07:20:10.793578675 +0000 UTC m=+0.061777986 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 07:20:10 compute-0 podman[231491]: 2025-11-29 07:20:10.79377893 +0000 UTC m=+0.060905890 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc.)
Nov 29 07:20:11 compute-0 nova_compute[187185]: 2025-11-29 07:20:11.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:20:11 compute-0 nova_compute[187185]: 2025-11-29 07:20:11.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:20:11 compute-0 nova_compute[187185]: 2025-11-29 07:20:11.743 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:20:11 compute-0 nova_compute[187185]: 2025-11-29 07:20:11.744 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:20:11 compute-0 nova_compute[187185]: 2025-11-29 07:20:11.744 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:20:11 compute-0 nova_compute[187185]: 2025-11-29 07:20:11.744 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:20:11 compute-0 nova_compute[187185]: 2025-11-29 07:20:11.912 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:20:11 compute-0 nova_compute[187185]: 2025-11-29 07:20:11.913 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5745MB free_disk=73.29447937011719GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:20:11 compute-0 nova_compute[187185]: 2025-11-29 07:20:11.913 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:20:11 compute-0 nova_compute[187185]: 2025-11-29 07:20:11.913 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:20:11 compute-0 nova_compute[187185]: 2025-11-29 07:20:11.996 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance 385b61e0-d06f-45d5-833f-956226dbe647 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 07:20:11 compute-0 nova_compute[187185]: 2025-11-29 07:20:11.996 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:20:11 compute-0 nova_compute[187185]: 2025-11-29 07:20:11.997 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:20:12 compute-0 nova_compute[187185]: 2025-11-29 07:20:12.062 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:20:12 compute-0 nova_compute[187185]: 2025-11-29 07:20:12.076 187189 DEBUG nova.network.neutron [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Successfully updated port: d69c5dfd-952c-44e7-9e26-18e9807fcaf6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 07:20:12 compute-0 nova_compute[187185]: 2025-11-29 07:20:12.107 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:20:12 compute-0 nova_compute[187185]: 2025-11-29 07:20:12.112 187189 DEBUG oslo_concurrency.lockutils [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "refresh_cache-385b61e0-d06f-45d5-833f-956226dbe647" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:20:12 compute-0 nova_compute[187185]: 2025-11-29 07:20:12.113 187189 DEBUG oslo_concurrency.lockutils [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquired lock "refresh_cache-385b61e0-d06f-45d5-833f-956226dbe647" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:20:12 compute-0 nova_compute[187185]: 2025-11-29 07:20:12.113 187189 DEBUG nova.network.neutron [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:20:12 compute-0 nova_compute[187185]: 2025-11-29 07:20:12.141 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:20:12 compute-0 nova_compute[187185]: 2025-11-29 07:20:12.141 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.228s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:20:12 compute-0 nova_compute[187185]: 2025-11-29 07:20:12.273 187189 DEBUG nova.compute.manager [req-3d529326-3009-4021-aad5-b88c03846b7e req-4883fafb-4459-4d7a-a52c-255c0c9a9f07 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Received event network-changed-d69c5dfd-952c-44e7-9e26-18e9807fcaf6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:20:12 compute-0 nova_compute[187185]: 2025-11-29 07:20:12.274 187189 DEBUG nova.compute.manager [req-3d529326-3009-4021-aad5-b88c03846b7e req-4883fafb-4459-4d7a-a52c-255c0c9a9f07 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Refreshing instance network info cache due to event network-changed-d69c5dfd-952c-44e7-9e26-18e9807fcaf6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:20:12 compute-0 nova_compute[187185]: 2025-11-29 07:20:12.274 187189 DEBUG oslo_concurrency.lockutils [req-3d529326-3009-4021-aad5-b88c03846b7e req-4883fafb-4459-4d7a-a52c-255c0c9a9f07 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-385b61e0-d06f-45d5-833f-956226dbe647" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:20:12 compute-0 nova_compute[187185]: 2025-11-29 07:20:12.309 187189 DEBUG nova.network.neutron [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:20:13 compute-0 nova_compute[187185]: 2025-11-29 07:20:13.142 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:20:13 compute-0 nova_compute[187185]: 2025-11-29 07:20:13.142 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:20:13 compute-0 nova_compute[187185]: 2025-11-29 07:20:13.143 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:20:13 compute-0 nova_compute[187185]: 2025-11-29 07:20:13.762 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:13 compute-0 nova_compute[187185]: 2025-11-29 07:20:13.960 187189 DEBUG nova.network.neutron [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Updating instance_info_cache with network_info: [{"id": "d69c5dfd-952c-44e7-9e26-18e9807fcaf6", "address": "fa:16:3e:ff:90:f9", "network": {"id": "b1af4918-cc03-490f-9e76-dc4f5fd7f840", "bridge": "br-int", "label": "tempest-network-smoke--713941231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd69c5dfd-95", "ovs_interfaceid": "d69c5dfd-952c-44e7-9e26-18e9807fcaf6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:20:13 compute-0 nova_compute[187185]: 2025-11-29 07:20:13.986 187189 DEBUG oslo_concurrency.lockutils [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Releasing lock "refresh_cache-385b61e0-d06f-45d5-833f-956226dbe647" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:20:13 compute-0 nova_compute[187185]: 2025-11-29 07:20:13.986 187189 DEBUG nova.compute.manager [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Instance network_info: |[{"id": "d69c5dfd-952c-44e7-9e26-18e9807fcaf6", "address": "fa:16:3e:ff:90:f9", "network": {"id": "b1af4918-cc03-490f-9e76-dc4f5fd7f840", "bridge": "br-int", "label": "tempest-network-smoke--713941231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd69c5dfd-95", "ovs_interfaceid": "d69c5dfd-952c-44e7-9e26-18e9807fcaf6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 07:20:13 compute-0 nova_compute[187185]: 2025-11-29 07:20:13.987 187189 DEBUG oslo_concurrency.lockutils [req-3d529326-3009-4021-aad5-b88c03846b7e req-4883fafb-4459-4d7a-a52c-255c0c9a9f07 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-385b61e0-d06f-45d5-833f-956226dbe647" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:20:13 compute-0 nova_compute[187185]: 2025-11-29 07:20:13.987 187189 DEBUG nova.network.neutron [req-3d529326-3009-4021-aad5-b88c03846b7e req-4883fafb-4459-4d7a-a52c-255c0c9a9f07 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Refreshing network info cache for port d69c5dfd-952c-44e7-9e26-18e9807fcaf6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:20:13 compute-0 nova_compute[187185]: 2025-11-29 07:20:13.991 187189 DEBUG nova.virt.libvirt.driver [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Start _get_guest_xml network_info=[{"id": "d69c5dfd-952c-44e7-9e26-18e9807fcaf6", "address": "fa:16:3e:ff:90:f9", "network": {"id": "b1af4918-cc03-490f-9e76-dc4f5fd7f840", "bridge": "br-int", "label": "tempest-network-smoke--713941231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd69c5dfd-95", "ovs_interfaceid": "d69c5dfd-952c-44e7-9e26-18e9807fcaf6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:20:13 compute-0 nova_compute[187185]: 2025-11-29 07:20:13.996 187189 WARNING nova.virt.libvirt.driver [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.000 187189 DEBUG nova.virt.libvirt.host [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.001 187189 DEBUG nova.virt.libvirt.host [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.005 187189 DEBUG nova.virt.libvirt.host [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.006 187189 DEBUG nova.virt.libvirt.host [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.009 187189 DEBUG nova.virt.libvirt.driver [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.009 187189 DEBUG nova.virt.hardware [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.010 187189 DEBUG nova.virt.hardware [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.010 187189 DEBUG nova.virt.hardware [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.011 187189 DEBUG nova.virt.hardware [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.011 187189 DEBUG nova.virt.hardware [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.011 187189 DEBUG nova.virt.hardware [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.012 187189 DEBUG nova.virt.hardware [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.012 187189 DEBUG nova.virt.hardware [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.013 187189 DEBUG nova.virt.hardware [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.013 187189 DEBUG nova.virt.hardware [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.013 187189 DEBUG nova.virt.hardware [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.020 187189 DEBUG nova.virt.libvirt.vif [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:20:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1185749356',display_name='tempest-TestNetworkAdvancedServerOps-server-1185749356',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1185749356',id=110,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGSwWnxtNlh6nwv52DSkLcXTayESRz1TKsVEdkgArNOlSLUsPbKJSA3vkHHPtiLgyTWyPt3OWUciI+4jP4we7rtLJMxd1AlvhHoYZGKDwql7lK2v6bgvKYVGR4xaItQ7lg==',key_name='tempest-TestNetworkAdvancedServerOps-167732049',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-ymcbmpe0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:20:08Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=385b61e0-d06f-45d5-833f-956226dbe647,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d69c5dfd-952c-44e7-9e26-18e9807fcaf6", "address": "fa:16:3e:ff:90:f9", "network": {"id": "b1af4918-cc03-490f-9e76-dc4f5fd7f840", "bridge": "br-int", "label": "tempest-network-smoke--713941231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd69c5dfd-95", "ovs_interfaceid": "d69c5dfd-952c-44e7-9e26-18e9807fcaf6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.021 187189 DEBUG nova.network.os_vif_util [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converting VIF {"id": "d69c5dfd-952c-44e7-9e26-18e9807fcaf6", "address": "fa:16:3e:ff:90:f9", "network": {"id": "b1af4918-cc03-490f-9e76-dc4f5fd7f840", "bridge": "br-int", "label": "tempest-network-smoke--713941231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd69c5dfd-95", "ovs_interfaceid": "d69c5dfd-952c-44e7-9e26-18e9807fcaf6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.023 187189 DEBUG nova.network.os_vif_util [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:90:f9,bridge_name='br-int',has_traffic_filtering=True,id=d69c5dfd-952c-44e7-9e26-18e9807fcaf6,network=Network(b1af4918-cc03-490f-9e76-dc4f5fd7f840),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd69c5dfd-95') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.024 187189 DEBUG nova.objects.instance [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'pci_devices' on Instance uuid 385b61e0-d06f-45d5-833f-956226dbe647 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.054 187189 DEBUG nova.virt.libvirt.driver [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:20:14 compute-0 nova_compute[187185]:   <uuid>385b61e0-d06f-45d5-833f-956226dbe647</uuid>
Nov 29 07:20:14 compute-0 nova_compute[187185]:   <name>instance-0000006e</name>
Nov 29 07:20:14 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 07:20:14 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:20:14 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:20:14 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-1185749356</nova:name>
Nov 29 07:20:14 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:20:13</nova:creationTime>
Nov 29 07:20:14 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 07:20:14 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 07:20:14 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:20:14 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:20:14 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:20:14 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:20:14 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:20:14 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:20:14 compute-0 nova_compute[187185]:         <nova:user uuid="bfd2024670594b10941cec8a59d2573f">tempest-TestNetworkAdvancedServerOps-1380683659-project-member</nova:user>
Nov 29 07:20:14 compute-0 nova_compute[187185]:         <nova:project uuid="c231e63624d44fc19e0989abfb1afb22">tempest-TestNetworkAdvancedServerOps-1380683659</nova:project>
Nov 29 07:20:14 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:20:14 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 07:20:14 compute-0 nova_compute[187185]:         <nova:port uuid="d69c5dfd-952c-44e7-9e26-18e9807fcaf6">
Nov 29 07:20:14 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:20:14 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:20:14 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:20:14 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <system>
Nov 29 07:20:14 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:20:14 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:20:14 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:20:14 compute-0 nova_compute[187185]:       <entry name="serial">385b61e0-d06f-45d5-833f-956226dbe647</entry>
Nov 29 07:20:14 compute-0 nova_compute[187185]:       <entry name="uuid">385b61e0-d06f-45d5-833f-956226dbe647</entry>
Nov 29 07:20:14 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     </system>
Nov 29 07:20:14 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:20:14 compute-0 nova_compute[187185]:   <os>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:   </os>
Nov 29 07:20:14 compute-0 nova_compute[187185]:   <features>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:   </features>
Nov 29 07:20:14 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:20:14 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:20:14 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:20:14 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/385b61e0-d06f-45d5-833f-956226dbe647/disk"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:20:14 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/385b61e0-d06f-45d5-833f-956226dbe647/disk.config"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:20:14 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:ff:90:f9"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:       <target dev="tapd69c5dfd-95"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:20:14 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/385b61e0-d06f-45d5-833f-956226dbe647/console.log" append="off"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <video>
Nov 29 07:20:14 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     </video>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:20:14 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:20:14 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:20:14 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:20:14 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:20:14 compute-0 nova_compute[187185]: </domain>
Nov 29 07:20:14 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.055 187189 DEBUG nova.compute.manager [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Preparing to wait for external event network-vif-plugged-d69c5dfd-952c-44e7-9e26-18e9807fcaf6 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.055 187189 DEBUG oslo_concurrency.lockutils [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "385b61e0-d06f-45d5-833f-956226dbe647-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.056 187189 DEBUG oslo_concurrency.lockutils [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "385b61e0-d06f-45d5-833f-956226dbe647-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.056 187189 DEBUG oslo_concurrency.lockutils [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "385b61e0-d06f-45d5-833f-956226dbe647-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.057 187189 DEBUG nova.virt.libvirt.vif [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:20:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1185749356',display_name='tempest-TestNetworkAdvancedServerOps-server-1185749356',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1185749356',id=110,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGSwWnxtNlh6nwv52DSkLcXTayESRz1TKsVEdkgArNOlSLUsPbKJSA3vkHHPtiLgyTWyPt3OWUciI+4jP4we7rtLJMxd1AlvhHoYZGKDwql7lK2v6bgvKYVGR4xaItQ7lg==',key_name='tempest-TestNetworkAdvancedServerOps-167732049',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-ymcbmpe0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:20:08Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=385b61e0-d06f-45d5-833f-956226dbe647,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d69c5dfd-952c-44e7-9e26-18e9807fcaf6", "address": "fa:16:3e:ff:90:f9", "network": {"id": "b1af4918-cc03-490f-9e76-dc4f5fd7f840", "bridge": "br-int", "label": "tempest-network-smoke--713941231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd69c5dfd-95", "ovs_interfaceid": "d69c5dfd-952c-44e7-9e26-18e9807fcaf6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.058 187189 DEBUG nova.network.os_vif_util [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converting VIF {"id": "d69c5dfd-952c-44e7-9e26-18e9807fcaf6", "address": "fa:16:3e:ff:90:f9", "network": {"id": "b1af4918-cc03-490f-9e76-dc4f5fd7f840", "bridge": "br-int", "label": "tempest-network-smoke--713941231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd69c5dfd-95", "ovs_interfaceid": "d69c5dfd-952c-44e7-9e26-18e9807fcaf6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.059 187189 DEBUG nova.network.os_vif_util [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:90:f9,bridge_name='br-int',has_traffic_filtering=True,id=d69c5dfd-952c-44e7-9e26-18e9807fcaf6,network=Network(b1af4918-cc03-490f-9e76-dc4f5fd7f840),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd69c5dfd-95') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.059 187189 DEBUG os_vif [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:90:f9,bridge_name='br-int',has_traffic_filtering=True,id=d69c5dfd-952c-44e7-9e26-18e9807fcaf6,network=Network(b1af4918-cc03-490f-9e76-dc4f5fd7f840),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd69c5dfd-95') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.060 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.061 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.061 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.066 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.066 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd69c5dfd-95, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.067 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd69c5dfd-95, col_values=(('external_ids', {'iface-id': 'd69c5dfd-952c-44e7-9e26-18e9807fcaf6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ff:90:f9', 'vm-uuid': '385b61e0-d06f-45d5-833f-956226dbe647'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.069 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:14 compute-0 NetworkManager[55227]: <info>  [1764400814.0704] manager: (tapd69c5dfd-95): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/164)
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.072 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.078 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.079 187189 INFO os_vif [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:90:f9,bridge_name='br-int',has_traffic_filtering=True,id=d69c5dfd-952c-44e7-9e26-18e9807fcaf6,network=Network(b1af4918-cc03-490f-9e76-dc4f5fd7f840),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd69c5dfd-95')
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.144 187189 DEBUG nova.virt.libvirt.driver [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.145 187189 DEBUG nova.virt.libvirt.driver [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.145 187189 DEBUG nova.virt.libvirt.driver [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] No VIF found with MAC fa:16:3e:ff:90:f9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.146 187189 INFO nova.virt.libvirt.driver [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Using config drive
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.311 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.601 187189 INFO nova.virt.libvirt.driver [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Creating config drive at /var/lib/nova/instances/385b61e0-d06f-45d5-833f-956226dbe647/disk.config
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.612 187189 DEBUG oslo_concurrency.processutils [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/385b61e0-d06f-45d5-833f-956226dbe647/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8b44rhxi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.760 187189 DEBUG oslo_concurrency.processutils [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/385b61e0-d06f-45d5-833f-956226dbe647/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8b44rhxi" returned: 0 in 0.148s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:20:14 compute-0 kernel: tapd69c5dfd-95: entered promiscuous mode
Nov 29 07:20:14 compute-0 NetworkManager[55227]: <info>  [1764400814.8636] manager: (tapd69c5dfd-95): new Tun device (/org/freedesktop/NetworkManager/Devices/165)
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.864 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:14 compute-0 ovn_controller[95281]: 2025-11-29T07:20:14Z|00301|binding|INFO|Claiming lport d69c5dfd-952c-44e7-9e26-18e9807fcaf6 for this chassis.
Nov 29 07:20:14 compute-0 ovn_controller[95281]: 2025-11-29T07:20:14Z|00302|binding|INFO|d69c5dfd-952c-44e7-9e26-18e9807fcaf6: Claiming fa:16:3e:ff:90:f9 10.100.0.5
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.873 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:14.895 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ff:90:f9 10.100.0.5'], port_security=['fa:16:3e:ff:90:f9 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '385b61e0-d06f-45d5-833f-956226dbe647', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b1af4918-cc03-490f-9e76-dc4f5fd7f840', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c231e63624d44fc19e0989abfb1afb22', 'neutron:revision_number': '2', 'neutron:security_group_ids': '64c144d3-0e65-4786-8bd8-0434ea854658', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=578489e5-bc3a-4682-96b0-942be7815ce6, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=d69c5dfd-952c-44e7-9e26-18e9807fcaf6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:20:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:14.897 104254 INFO neutron.agent.ovn.metadata.agent [-] Port d69c5dfd-952c-44e7-9e26-18e9807fcaf6 in datapath b1af4918-cc03-490f-9e76-dc4f5fd7f840 bound to our chassis
Nov 29 07:20:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:14.900 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b1af4918-cc03-490f-9e76-dc4f5fd7f840
Nov 29 07:20:14 compute-0 systemd-udevd[231571]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:20:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:14.912 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e38d5c85-e860-4671-9d32-788e5e3f914e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:20:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:14.913 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb1af4918-c1 in ovnmeta-b1af4918-cc03-490f-9e76-dc4f5fd7f840 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:20:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:14.915 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb1af4918-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:20:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:14.916 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[20107b21-3fc9-472f-9132-7a637f6ccbc2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:20:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:14.917 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[795a0d5e-c05f-4183-8899-3e9b77291750]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:20:14 compute-0 NetworkManager[55227]: <info>  [1764400814.9213] device (tapd69c5dfd-95): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:20:14 compute-0 NetworkManager[55227]: <info>  [1764400814.9222] device (tapd69c5dfd-95): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:20:14 compute-0 systemd-machined[153486]: New machine qemu-41-instance-0000006e.
Nov 29 07:20:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:14.934 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[e8273012-8354-4979-9b25-f2e761fb6d7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.957 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:14.958 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[219e76fc-5d1e-404b-b5e8-ec386dca8e50]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.961 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:14 compute-0 ovn_controller[95281]: 2025-11-29T07:20:14Z|00303|binding|INFO|Setting lport d69c5dfd-952c-44e7-9e26-18e9807fcaf6 ovn-installed in OVS
Nov 29 07:20:14 compute-0 ovn_controller[95281]: 2025-11-29T07:20:14Z|00304|binding|INFO|Setting lport d69c5dfd-952c-44e7-9e26-18e9807fcaf6 up in Southbound
Nov 29 07:20:14 compute-0 nova_compute[187185]: 2025-11-29 07:20:14.966 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:14 compute-0 systemd[1]: Started Virtual Machine qemu-41-instance-0000006e.
Nov 29 07:20:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:14.996 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[f23c3797-7f88-43cd-a8d2-3592032aa6b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:20:15 compute-0 systemd-udevd[231576]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:20:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:15.001 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[1a761941-21be-4093-980a-c5996cb62338]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:20:15 compute-0 NetworkManager[55227]: <info>  [1764400815.0032] manager: (tapb1af4918-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/166)
Nov 29 07:20:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:15.036 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[744974fa-bb9f-43cb-b0aa-4e557c18eb3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:20:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:15.039 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[d638bd75-7ee5-42db-b33b-cc4d800511f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:20:15 compute-0 NetworkManager[55227]: <info>  [1764400815.0634] device (tapb1af4918-c0): carrier: link connected
Nov 29 07:20:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:15.070 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[c2a591d6-1853-4a8a-bddd-01eee28aa01c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:20:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:15.087 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[9a0d5289-92d7-4a20-b507-f292daa7e7ef]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb1af4918-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:15:17:c1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 100], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 626272, 'reachable_time': 40381, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231605, 'error': None, 'target': 'ovnmeta-b1af4918-cc03-490f-9e76-dc4f5fd7f840', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:20:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:15.103 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[928f9c06-7412-49d2-8655-ff358c8dd046]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe15:17c1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 626272, 'tstamp': 626272}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231606, 'error': None, 'target': 'ovnmeta-b1af4918-cc03-490f-9e76-dc4f5fd7f840', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:20:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:15.117 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6c54f251-7073-451a-8138-53279a9e0ee0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb1af4918-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:15:17:c1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 100], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 626272, 'reachable_time': 40381, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231607, 'error': None, 'target': 'ovnmeta-b1af4918-cc03-490f-9e76-dc4f5fd7f840', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:20:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:15.145 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d964eb8c-cb2b-43fa-9f09-48032b125514]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:20:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:15.215 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e3455450-4c33-47db-bb36-5c33246f7067]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:20:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:15.217 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb1af4918-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:20:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:15.217 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:20:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:15.218 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb1af4918-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:20:15 compute-0 nova_compute[187185]: 2025-11-29 07:20:15.219 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:15 compute-0 NetworkManager[55227]: <info>  [1764400815.2201] manager: (tapb1af4918-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/167)
Nov 29 07:20:15 compute-0 kernel: tapb1af4918-c0: entered promiscuous mode
Nov 29 07:20:15 compute-0 nova_compute[187185]: 2025-11-29 07:20:15.222 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:15.223 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb1af4918-c0, col_values=(('external_ids', {'iface-id': '38692e46-2719-40b3-95b5-e8d28c9319d3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:20:15 compute-0 nova_compute[187185]: 2025-11-29 07:20:15.223 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:15 compute-0 ovn_controller[95281]: 2025-11-29T07:20:15Z|00305|binding|INFO|Releasing lport 38692e46-2719-40b3-95b5-e8d28c9319d3 from this chassis (sb_readonly=0)
Nov 29 07:20:15 compute-0 nova_compute[187185]: 2025-11-29 07:20:15.246 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:15.248 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b1af4918-cc03-490f-9e76-dc4f5fd7f840.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b1af4918-cc03-490f-9e76-dc4f5fd7f840.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:20:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:15.249 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[57971afa-601e-4676-8a0d-b76e75961131]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:20:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:15.250 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:20:15 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:20:15 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:20:15 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-b1af4918-cc03-490f-9e76-dc4f5fd7f840
Nov 29 07:20:15 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:20:15 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:20:15 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:20:15 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/b1af4918-cc03-490f-9e76-dc4f5fd7f840.pid.haproxy
Nov 29 07:20:15 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:20:15 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:20:15 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:20:15 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:20:15 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:20:15 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:20:15 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:20:15 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:20:15 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:20:15 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:20:15 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:20:15 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:20:15 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:20:15 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:20:15 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:20:15 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:20:15 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:20:15 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:20:15 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:20:15 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:20:15 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID b1af4918-cc03-490f-9e76-dc4f5fd7f840
Nov 29 07:20:15 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:20:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:15.252 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b1af4918-cc03-490f-9e76-dc4f5fd7f840', 'env', 'PROCESS_TAG=haproxy-b1af4918-cc03-490f-9e76-dc4f5fd7f840', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b1af4918-cc03-490f-9e76-dc4f5fd7f840.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:20:15 compute-0 nova_compute[187185]: 2025-11-29 07:20:15.422 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400815.4213493, 385b61e0-d06f-45d5-833f-956226dbe647 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:20:15 compute-0 nova_compute[187185]: 2025-11-29 07:20:15.423 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] VM Started (Lifecycle Event)
Nov 29 07:20:15 compute-0 nova_compute[187185]: 2025-11-29 07:20:15.452 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:20:15 compute-0 nova_compute[187185]: 2025-11-29 07:20:15.457 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400815.421525, 385b61e0-d06f-45d5-833f-956226dbe647 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:20:15 compute-0 nova_compute[187185]: 2025-11-29 07:20:15.458 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] VM Paused (Lifecycle Event)
Nov 29 07:20:15 compute-0 nova_compute[187185]: 2025-11-29 07:20:15.477 187189 DEBUG nova.network.neutron [req-3d529326-3009-4021-aad5-b88c03846b7e req-4883fafb-4459-4d7a-a52c-255c0c9a9f07 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Updated VIF entry in instance network info cache for port d69c5dfd-952c-44e7-9e26-18e9807fcaf6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:20:15 compute-0 nova_compute[187185]: 2025-11-29 07:20:15.478 187189 DEBUG nova.network.neutron [req-3d529326-3009-4021-aad5-b88c03846b7e req-4883fafb-4459-4d7a-a52c-255c0c9a9f07 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Updating instance_info_cache with network_info: [{"id": "d69c5dfd-952c-44e7-9e26-18e9807fcaf6", "address": "fa:16:3e:ff:90:f9", "network": {"id": "b1af4918-cc03-490f-9e76-dc4f5fd7f840", "bridge": "br-int", "label": "tempest-network-smoke--713941231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd69c5dfd-95", "ovs_interfaceid": "d69c5dfd-952c-44e7-9e26-18e9807fcaf6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:20:15 compute-0 nova_compute[187185]: 2025-11-29 07:20:15.482 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:20:15 compute-0 nova_compute[187185]: 2025-11-29 07:20:15.487 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:20:15 compute-0 nova_compute[187185]: 2025-11-29 07:20:15.495 187189 DEBUG oslo_concurrency.lockutils [req-3d529326-3009-4021-aad5-b88c03846b7e req-4883fafb-4459-4d7a-a52c-255c0c9a9f07 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-385b61e0-d06f-45d5-833f-956226dbe647" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:20:15 compute-0 nova_compute[187185]: 2025-11-29 07:20:15.525 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:20:15 compute-0 podman[231646]: 2025-11-29 07:20:15.651157503 +0000 UTC m=+0.062354011 container create 2c97e8bf0be4164737450f388112fac40cdb4499975b7b41d52487e53db64edf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b1af4918-cc03-490f-9e76-dc4f5fd7f840, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 29 07:20:15 compute-0 systemd[1]: Started libpod-conmon-2c97e8bf0be4164737450f388112fac40cdb4499975b7b41d52487e53db64edf.scope.
Nov 29 07:20:15 compute-0 podman[231646]: 2025-11-29 07:20:15.614888929 +0000 UTC m=+0.026085457 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:20:15 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:20:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/894d588e214ecf69d3f49372baeb587d43fc6dd4d7f5e21f66b1d5afea862e1b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:20:15 compute-0 podman[231646]: 2025-11-29 07:20:15.757045923 +0000 UTC m=+0.168242471 container init 2c97e8bf0be4164737450f388112fac40cdb4499975b7b41d52487e53db64edf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b1af4918-cc03-490f-9e76-dc4f5fd7f840, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 29 07:20:15 compute-0 podman[231646]: 2025-11-29 07:20:15.7661572 +0000 UTC m=+0.177353718 container start 2c97e8bf0be4164737450f388112fac40cdb4499975b7b41d52487e53db64edf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b1af4918-cc03-490f-9e76-dc4f5fd7f840, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:20:15 compute-0 nova_compute[187185]: 2025-11-29 07:20:15.792 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:15 compute-0 neutron-haproxy-ovnmeta-b1af4918-cc03-490f-9e76-dc4f5fd7f840[231661]: [NOTICE]   (231665) : New worker (231667) forked
Nov 29 07:20:15 compute-0 neutron-haproxy-ovnmeta-b1af4918-cc03-490f-9e76-dc4f5fd7f840[231661]: [NOTICE]   (231665) : Loading success.
Nov 29 07:20:16 compute-0 nova_compute[187185]: 2025-11-29 07:20:16.112 187189 DEBUG nova.compute.manager [req-4c5e9f7b-bc4a-4565-8284-db089c72a9bc req-43ccc6bb-0217-4dc8-9956-0ffe3669c2a2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Received event network-vif-plugged-d69c5dfd-952c-44e7-9e26-18e9807fcaf6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:20:16 compute-0 nova_compute[187185]: 2025-11-29 07:20:16.114 187189 DEBUG oslo_concurrency.lockutils [req-4c5e9f7b-bc4a-4565-8284-db089c72a9bc req-43ccc6bb-0217-4dc8-9956-0ffe3669c2a2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "385b61e0-d06f-45d5-833f-956226dbe647-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:20:16 compute-0 nova_compute[187185]: 2025-11-29 07:20:16.115 187189 DEBUG oslo_concurrency.lockutils [req-4c5e9f7b-bc4a-4565-8284-db089c72a9bc req-43ccc6bb-0217-4dc8-9956-0ffe3669c2a2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "385b61e0-d06f-45d5-833f-956226dbe647-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:20:16 compute-0 nova_compute[187185]: 2025-11-29 07:20:16.116 187189 DEBUG oslo_concurrency.lockutils [req-4c5e9f7b-bc4a-4565-8284-db089c72a9bc req-43ccc6bb-0217-4dc8-9956-0ffe3669c2a2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "385b61e0-d06f-45d5-833f-956226dbe647-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:20:16 compute-0 nova_compute[187185]: 2025-11-29 07:20:16.117 187189 DEBUG nova.compute.manager [req-4c5e9f7b-bc4a-4565-8284-db089c72a9bc req-43ccc6bb-0217-4dc8-9956-0ffe3669c2a2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Processing event network-vif-plugged-d69c5dfd-952c-44e7-9e26-18e9807fcaf6 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 07:20:16 compute-0 nova_compute[187185]: 2025-11-29 07:20:16.117 187189 DEBUG nova.compute.manager [req-4c5e9f7b-bc4a-4565-8284-db089c72a9bc req-43ccc6bb-0217-4dc8-9956-0ffe3669c2a2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Received event network-vif-plugged-d69c5dfd-952c-44e7-9e26-18e9807fcaf6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:20:16 compute-0 nova_compute[187185]: 2025-11-29 07:20:16.117 187189 DEBUG oslo_concurrency.lockutils [req-4c5e9f7b-bc4a-4565-8284-db089c72a9bc req-43ccc6bb-0217-4dc8-9956-0ffe3669c2a2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "385b61e0-d06f-45d5-833f-956226dbe647-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:20:16 compute-0 nova_compute[187185]: 2025-11-29 07:20:16.117 187189 DEBUG oslo_concurrency.lockutils [req-4c5e9f7b-bc4a-4565-8284-db089c72a9bc req-43ccc6bb-0217-4dc8-9956-0ffe3669c2a2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "385b61e0-d06f-45d5-833f-956226dbe647-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:20:16 compute-0 nova_compute[187185]: 2025-11-29 07:20:16.118 187189 DEBUG oslo_concurrency.lockutils [req-4c5e9f7b-bc4a-4565-8284-db089c72a9bc req-43ccc6bb-0217-4dc8-9956-0ffe3669c2a2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "385b61e0-d06f-45d5-833f-956226dbe647-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:20:16 compute-0 nova_compute[187185]: 2025-11-29 07:20:16.118 187189 DEBUG nova.compute.manager [req-4c5e9f7b-bc4a-4565-8284-db089c72a9bc req-43ccc6bb-0217-4dc8-9956-0ffe3669c2a2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] No waiting events found dispatching network-vif-plugged-d69c5dfd-952c-44e7-9e26-18e9807fcaf6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:20:16 compute-0 nova_compute[187185]: 2025-11-29 07:20:16.118 187189 WARNING nova.compute.manager [req-4c5e9f7b-bc4a-4565-8284-db089c72a9bc req-43ccc6bb-0217-4dc8-9956-0ffe3669c2a2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Received unexpected event network-vif-plugged-d69c5dfd-952c-44e7-9e26-18e9807fcaf6 for instance with vm_state building and task_state spawning.
Nov 29 07:20:16 compute-0 nova_compute[187185]: 2025-11-29 07:20:16.120 187189 DEBUG nova.compute.manager [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:20:16 compute-0 nova_compute[187185]: 2025-11-29 07:20:16.126 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400816.1259744, 385b61e0-d06f-45d5-833f-956226dbe647 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:20:16 compute-0 nova_compute[187185]: 2025-11-29 07:20:16.127 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] VM Resumed (Lifecycle Event)
Nov 29 07:20:16 compute-0 nova_compute[187185]: 2025-11-29 07:20:16.131 187189 DEBUG nova.virt.libvirt.driver [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 07:20:16 compute-0 nova_compute[187185]: 2025-11-29 07:20:16.136 187189 INFO nova.virt.libvirt.driver [-] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Instance spawned successfully.
Nov 29 07:20:16 compute-0 nova_compute[187185]: 2025-11-29 07:20:16.137 187189 DEBUG nova.virt.libvirt.driver [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 07:20:16 compute-0 nova_compute[187185]: 2025-11-29 07:20:16.157 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:20:16 compute-0 nova_compute[187185]: 2025-11-29 07:20:16.168 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:20:16 compute-0 nova_compute[187185]: 2025-11-29 07:20:16.172 187189 DEBUG nova.virt.libvirt.driver [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:20:16 compute-0 nova_compute[187185]: 2025-11-29 07:20:16.172 187189 DEBUG nova.virt.libvirt.driver [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:20:16 compute-0 nova_compute[187185]: 2025-11-29 07:20:16.173 187189 DEBUG nova.virt.libvirt.driver [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:20:16 compute-0 nova_compute[187185]: 2025-11-29 07:20:16.174 187189 DEBUG nova.virt.libvirt.driver [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:20:16 compute-0 nova_compute[187185]: 2025-11-29 07:20:16.175 187189 DEBUG nova.virt.libvirt.driver [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:20:16 compute-0 nova_compute[187185]: 2025-11-29 07:20:16.175 187189 DEBUG nova.virt.libvirt.driver [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:20:16 compute-0 nova_compute[187185]: 2025-11-29 07:20:16.230 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:20:16 compute-0 nova_compute[187185]: 2025-11-29 07:20:16.397 187189 INFO nova.compute.manager [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Took 8.27 seconds to spawn the instance on the hypervisor.
Nov 29 07:20:16 compute-0 nova_compute[187185]: 2025-11-29 07:20:16.398 187189 DEBUG nova.compute.manager [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:20:16 compute-0 nova_compute[187185]: 2025-11-29 07:20:16.516 187189 INFO nova.compute.manager [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Took 9.09 seconds to build instance.
Nov 29 07:20:16 compute-0 nova_compute[187185]: 2025-11-29 07:20:16.546 187189 DEBUG oslo_concurrency.lockutils [None req-3339f796-d946-482f-a2ba-9d9d1041dda1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "385b61e0-d06f-45d5-833f-956226dbe647" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.193s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:20:17 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:17.801 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '37'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:20:19 compute-0 nova_compute[187185]: 2025-11-29 07:20:19.070 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:20 compute-0 nova_compute[187185]: 2025-11-29 07:20:20.847 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:20 compute-0 podman[231676]: 2025-11-29 07:20:20.867764067 +0000 UTC m=+0.131939016 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 07:20:21 compute-0 nova_compute[187185]: 2025-11-29 07:20:21.281 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:21 compute-0 NetworkManager[55227]: <info>  [1764400821.2826] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/168)
Nov 29 07:20:21 compute-0 NetworkManager[55227]: <info>  [1764400821.2832] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/169)
Nov 29 07:20:21 compute-0 nova_compute[187185]: 2025-11-29 07:20:21.417 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:21 compute-0 ovn_controller[95281]: 2025-11-29T07:20:21Z|00306|binding|INFO|Releasing lport 38692e46-2719-40b3-95b5-e8d28c9319d3 from this chassis (sb_readonly=0)
Nov 29 07:20:21 compute-0 nova_compute[187185]: 2025-11-29 07:20:21.442 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:22 compute-0 nova_compute[187185]: 2025-11-29 07:20:22.718 187189 DEBUG nova.compute.manager [req-27d2cdf5-0082-47ab-8a48-f0f575f87d10 req-fb306d27-65cb-4e5c-a1cb-0c40029c5cfe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Received event network-changed-d69c5dfd-952c-44e7-9e26-18e9807fcaf6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:20:22 compute-0 nova_compute[187185]: 2025-11-29 07:20:22.719 187189 DEBUG nova.compute.manager [req-27d2cdf5-0082-47ab-8a48-f0f575f87d10 req-fb306d27-65cb-4e5c-a1cb-0c40029c5cfe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Refreshing instance network info cache due to event network-changed-d69c5dfd-952c-44e7-9e26-18e9807fcaf6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:20:22 compute-0 nova_compute[187185]: 2025-11-29 07:20:22.720 187189 DEBUG oslo_concurrency.lockutils [req-27d2cdf5-0082-47ab-8a48-f0f575f87d10 req-fb306d27-65cb-4e5c-a1cb-0c40029c5cfe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-385b61e0-d06f-45d5-833f-956226dbe647" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:20:22 compute-0 nova_compute[187185]: 2025-11-29 07:20:22.720 187189 DEBUG oslo_concurrency.lockutils [req-27d2cdf5-0082-47ab-8a48-f0f575f87d10 req-fb306d27-65cb-4e5c-a1cb-0c40029c5cfe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-385b61e0-d06f-45d5-833f-956226dbe647" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:20:22 compute-0 nova_compute[187185]: 2025-11-29 07:20:22.721 187189 DEBUG nova.network.neutron [req-27d2cdf5-0082-47ab-8a48-f0f575f87d10 req-fb306d27-65cb-4e5c-a1cb-0c40029c5cfe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Refreshing network info cache for port d69c5dfd-952c-44e7-9e26-18e9807fcaf6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:20:24 compute-0 nova_compute[187185]: 2025-11-29 07:20:24.073 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:25.509 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:20:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:25.510 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:20:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:25.512 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:20:25 compute-0 nova_compute[187185]: 2025-11-29 07:20:25.564 187189 DEBUG nova.network.neutron [req-27d2cdf5-0082-47ab-8a48-f0f575f87d10 req-fb306d27-65cb-4e5c-a1cb-0c40029c5cfe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Updated VIF entry in instance network info cache for port d69c5dfd-952c-44e7-9e26-18e9807fcaf6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:20:25 compute-0 nova_compute[187185]: 2025-11-29 07:20:25.566 187189 DEBUG nova.network.neutron [req-27d2cdf5-0082-47ab-8a48-f0f575f87d10 req-fb306d27-65cb-4e5c-a1cb-0c40029c5cfe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Updating instance_info_cache with network_info: [{"id": "d69c5dfd-952c-44e7-9e26-18e9807fcaf6", "address": "fa:16:3e:ff:90:f9", "network": {"id": "b1af4918-cc03-490f-9e76-dc4f5fd7f840", "bridge": "br-int", "label": "tempest-network-smoke--713941231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd69c5dfd-95", "ovs_interfaceid": "d69c5dfd-952c-44e7-9e26-18e9807fcaf6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:20:25 compute-0 nova_compute[187185]: 2025-11-29 07:20:25.594 187189 DEBUG oslo_concurrency.lockutils [req-27d2cdf5-0082-47ab-8a48-f0f575f87d10 req-fb306d27-65cb-4e5c-a1cb-0c40029c5cfe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-385b61e0-d06f-45d5-833f-956226dbe647" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:20:25 compute-0 nova_compute[187185]: 2025-11-29 07:20:25.850 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:27 compute-0 podman[231706]: 2025-11-29 07:20:27.792334951 +0000 UTC m=+0.053190743 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 07:20:29 compute-0 ovn_controller[95281]: 2025-11-29T07:20:29Z|00039|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ff:90:f9 10.100.0.5
Nov 29 07:20:29 compute-0 ovn_controller[95281]: 2025-11-29T07:20:29Z|00040|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ff:90:f9 10.100.0.5
Nov 29 07:20:29 compute-0 nova_compute[187185]: 2025-11-29 07:20:29.075 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:29 compute-0 nova_compute[187185]: 2025-11-29 07:20:29.311 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:20:29 compute-0 podman[231748]: 2025-11-29 07:20:29.829451894 +0000 UTC m=+0.087143252 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 07:20:29 compute-0 podman[231749]: 2025-11-29 07:20:29.847315128 +0000 UTC m=+0.099333505 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, managed_by=edpm_ansible, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 07:20:30 compute-0 nova_compute[187185]: 2025-11-29 07:20:30.891 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:31 compute-0 ovn_controller[95281]: 2025-11-29T07:20:31Z|00307|binding|INFO|Releasing lport 38692e46-2719-40b3-95b5-e8d28c9319d3 from this chassis (sb_readonly=0)
Nov 29 07:20:31 compute-0 nova_compute[187185]: 2025-11-29 07:20:31.460 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:34 compute-0 nova_compute[187185]: 2025-11-29 07:20:34.079 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:35 compute-0 nova_compute[187185]: 2025-11-29 07:20:35.891 187189 INFO nova.compute.manager [None req-c6d692c0-7df9-40bd-aa33-28814329c806 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Get console output
Nov 29 07:20:35 compute-0 nova_compute[187185]: 2025-11-29 07:20:35.893 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:35 compute-0 nova_compute[187185]: 2025-11-29 07:20:35.900 213758 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 29 07:20:36 compute-0 nova_compute[187185]: 2025-11-29 07:20:36.288 187189 DEBUG oslo_concurrency.lockutils [None req-702da04b-33ab-40dd-a453-7031f2f5bfa8 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "385b61e0-d06f-45d5-833f-956226dbe647" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:20:36 compute-0 nova_compute[187185]: 2025-11-29 07:20:36.289 187189 DEBUG oslo_concurrency.lockutils [None req-702da04b-33ab-40dd-a453-7031f2f5bfa8 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "385b61e0-d06f-45d5-833f-956226dbe647" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:20:36 compute-0 nova_compute[187185]: 2025-11-29 07:20:36.290 187189 INFO nova.compute.manager [None req-702da04b-33ab-40dd-a453-7031f2f5bfa8 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Rebooting instance
Nov 29 07:20:36 compute-0 nova_compute[187185]: 2025-11-29 07:20:36.312 187189 DEBUG oslo_concurrency.lockutils [None req-702da04b-33ab-40dd-a453-7031f2f5bfa8 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "refresh_cache-385b61e0-d06f-45d5-833f-956226dbe647" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:20:36 compute-0 nova_compute[187185]: 2025-11-29 07:20:36.312 187189 DEBUG oslo_concurrency.lockutils [None req-702da04b-33ab-40dd-a453-7031f2f5bfa8 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquired lock "refresh_cache-385b61e0-d06f-45d5-833f-956226dbe647" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:20:36 compute-0 nova_compute[187185]: 2025-11-29 07:20:36.313 187189 DEBUG nova.network.neutron [None req-702da04b-33ab-40dd-a453-7031f2f5bfa8 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:20:37 compute-0 nova_compute[187185]: 2025-11-29 07:20:37.332 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:38 compute-0 nova_compute[187185]: 2025-11-29 07:20:38.812 187189 DEBUG nova.network.neutron [None req-702da04b-33ab-40dd-a453-7031f2f5bfa8 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Updating instance_info_cache with network_info: [{"id": "d69c5dfd-952c-44e7-9e26-18e9807fcaf6", "address": "fa:16:3e:ff:90:f9", "network": {"id": "b1af4918-cc03-490f-9e76-dc4f5fd7f840", "bridge": "br-int", "label": "tempest-network-smoke--713941231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd69c5dfd-95", "ovs_interfaceid": "d69c5dfd-952c-44e7-9e26-18e9807fcaf6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:20:38 compute-0 nova_compute[187185]: 2025-11-29 07:20:38.875 187189 DEBUG oslo_concurrency.lockutils [None req-702da04b-33ab-40dd-a453-7031f2f5bfa8 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Releasing lock "refresh_cache-385b61e0-d06f-45d5-833f-956226dbe647" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:20:39 compute-0 nova_compute[187185]: 2025-11-29 07:20:39.081 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:39 compute-0 nova_compute[187185]: 2025-11-29 07:20:39.221 187189 DEBUG nova.compute.manager [None req-702da04b-33ab-40dd-a453-7031f2f5bfa8 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:20:40 compute-0 nova_compute[187185]: 2025-11-29 07:20:40.452 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:40 compute-0 nova_compute[187185]: 2025-11-29 07:20:40.894 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:41 compute-0 nova_compute[187185]: 2025-11-29 07:20:41.526 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:41 compute-0 podman[231788]: 2025-11-29 07:20:41.796797063 +0000 UTC m=+0.061823507 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:20:41 compute-0 podman[231789]: 2025-11-29 07:20:41.83495316 +0000 UTC m=+0.094434577 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6)
Nov 29 07:20:41 compute-0 podman[231790]: 2025-11-29 07:20:41.84026269 +0000 UTC m=+0.088314495 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 07:20:41 compute-0 kernel: tapd69c5dfd-95 (unregistering): left promiscuous mode
Nov 29 07:20:41 compute-0 NetworkManager[55227]: <info>  [1764400841.9678] device (tapd69c5dfd-95): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:20:41 compute-0 ovn_controller[95281]: 2025-11-29T07:20:41Z|00308|binding|INFO|Releasing lport d69c5dfd-952c-44e7-9e26-18e9807fcaf6 from this chassis (sb_readonly=0)
Nov 29 07:20:41 compute-0 ovn_controller[95281]: 2025-11-29T07:20:41Z|00309|binding|INFO|Setting lport d69c5dfd-952c-44e7-9e26-18e9807fcaf6 down in Southbound
Nov 29 07:20:41 compute-0 nova_compute[187185]: 2025-11-29 07:20:41.976 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:41 compute-0 ovn_controller[95281]: 2025-11-29T07:20:41Z|00310|binding|INFO|Removing iface tapd69c5dfd-95 ovn-installed in OVS
Nov 29 07:20:41 compute-0 nova_compute[187185]: 2025-11-29 07:20:41.978 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:41.988 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ff:90:f9 10.100.0.5'], port_security=['fa:16:3e:ff:90:f9 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '385b61e0-d06f-45d5-833f-956226dbe647', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b1af4918-cc03-490f-9e76-dc4f5fd7f840', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c231e63624d44fc19e0989abfb1afb22', 'neutron:revision_number': '4', 'neutron:security_group_ids': '64c144d3-0e65-4786-8bd8-0434ea854658', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.218'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=578489e5-bc3a-4682-96b0-942be7815ce6, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=d69c5dfd-952c-44e7-9e26-18e9807fcaf6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:20:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:41.990 104254 INFO neutron.agent.ovn.metadata.agent [-] Port d69c5dfd-952c-44e7-9e26-18e9807fcaf6 in datapath b1af4918-cc03-490f-9e76-dc4f5fd7f840 unbound from our chassis
Nov 29 07:20:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:41.993 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b1af4918-cc03-490f-9e76-dc4f5fd7f840, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:20:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:41.994 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[562724dd-c1e1-4757-accc-617589f0929b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:20:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:41.995 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b1af4918-cc03-490f-9e76-dc4f5fd7f840 namespace which is not needed anymore
Nov 29 07:20:41 compute-0 nova_compute[187185]: 2025-11-29 07:20:41.998 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:42 compute-0 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d0000006e.scope: Deactivated successfully.
Nov 29 07:20:42 compute-0 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d0000006e.scope: Consumed 13.713s CPU time.
Nov 29 07:20:42 compute-0 systemd-machined[153486]: Machine qemu-41-instance-0000006e terminated.
Nov 29 07:20:42 compute-0 neutron-haproxy-ovnmeta-b1af4918-cc03-490f-9e76-dc4f5fd7f840[231661]: [NOTICE]   (231665) : haproxy version is 2.8.14-c23fe91
Nov 29 07:20:42 compute-0 neutron-haproxy-ovnmeta-b1af4918-cc03-490f-9e76-dc4f5fd7f840[231661]: [NOTICE]   (231665) : path to executable is /usr/sbin/haproxy
Nov 29 07:20:42 compute-0 neutron-haproxy-ovnmeta-b1af4918-cc03-490f-9e76-dc4f5fd7f840[231661]: [WARNING]  (231665) : Exiting Master process...
Nov 29 07:20:42 compute-0 neutron-haproxy-ovnmeta-b1af4918-cc03-490f-9e76-dc4f5fd7f840[231661]: [ALERT]    (231665) : Current worker (231667) exited with code 143 (Terminated)
Nov 29 07:20:42 compute-0 neutron-haproxy-ovnmeta-b1af4918-cc03-490f-9e76-dc4f5fd7f840[231661]: [WARNING]  (231665) : All workers exited. Exiting... (0)
Nov 29 07:20:42 compute-0 systemd[1]: libpod-2c97e8bf0be4164737450f388112fac40cdb4499975b7b41d52487e53db64edf.scope: Deactivated successfully.
Nov 29 07:20:42 compute-0 podman[231875]: 2025-11-29 07:20:42.137654036 +0000 UTC m=+0.046955387 container died 2c97e8bf0be4164737450f388112fac40cdb4499975b7b41d52487e53db64edf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b1af4918-cc03-490f-9e76-dc4f5fd7f840, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:20:42 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2c97e8bf0be4164737450f388112fac40cdb4499975b7b41d52487e53db64edf-userdata-shm.mount: Deactivated successfully.
Nov 29 07:20:42 compute-0 systemd[1]: var-lib-containers-storage-overlay-894d588e214ecf69d3f49372baeb587d43fc6dd4d7f5e21f66b1d5afea862e1b-merged.mount: Deactivated successfully.
Nov 29 07:20:42 compute-0 podman[231875]: 2025-11-29 07:20:42.185538988 +0000 UTC m=+0.094840339 container cleanup 2c97e8bf0be4164737450f388112fac40cdb4499975b7b41d52487e53db64edf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b1af4918-cc03-490f-9e76-dc4f5fd7f840, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:20:42 compute-0 systemd[1]: libpod-conmon-2c97e8bf0be4164737450f388112fac40cdb4499975b7b41d52487e53db64edf.scope: Deactivated successfully.
Nov 29 07:20:42 compute-0 podman[231908]: 2025-11-29 07:20:42.276398103 +0000 UTC m=+0.059691546 container remove 2c97e8bf0be4164737450f388112fac40cdb4499975b7b41d52487e53db64edf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b1af4918-cc03-490f-9e76-dc4f5fd7f840, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:42.283 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e385e6d2-c103-4a32-8a4c-7e55e579860f]: (4, ('Sat Nov 29 07:20:42 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-b1af4918-cc03-490f-9e76-dc4f5fd7f840 (2c97e8bf0be4164737450f388112fac40cdb4499975b7b41d52487e53db64edf)\n2c97e8bf0be4164737450f388112fac40cdb4499975b7b41d52487e53db64edf\nSat Nov 29 07:20:42 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b1af4918-cc03-490f-9e76-dc4f5fd7f840 (2c97e8bf0be4164737450f388112fac40cdb4499975b7b41d52487e53db64edf)\n2c97e8bf0be4164737450f388112fac40cdb4499975b7b41d52487e53db64edf\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:42.286 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[44b75c63-4639-4acc-bcd2-32b81ad1cadc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:42.287 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb1af4918-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:20:42 compute-0 nova_compute[187185]: 2025-11-29 07:20:42.289 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:42 compute-0 kernel: tapb1af4918-c0: left promiscuous mode
Nov 29 07:20:42 compute-0 nova_compute[187185]: 2025-11-29 07:20:42.304 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:42.307 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[872c663c-9206-427d-a532-c1bc819df628]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:42.324 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[822ef641-c6b9-468c-a567-10462776cd6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:42.325 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[a175028b-34d5-4690-992d-3f513f8c2551]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:42.338 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[887718d5-8d66-44a2-b187-0ce834caf424]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 626265, 'reachable_time': 24874, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231942, 'error': None, 'target': 'ovnmeta-b1af4918-cc03-490f-9e76-dc4f5fd7f840', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:42.342 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b1af4918-cc03-490f-9e76-dc4f5fd7f840 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:42.342 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[ab837967-179b-4e5b-af23-31167852e571]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:20:42 compute-0 systemd[1]: run-netns-ovnmeta\x2db1af4918\x2dcc03\x2d490f\x2d9e76\x2ddc4f5fd7f840.mount: Deactivated successfully.
Nov 29 07:20:42 compute-0 nova_compute[187185]: 2025-11-29 07:20:42.386 187189 INFO nova.virt.libvirt.driver [None req-702da04b-33ab-40dd-a453-7031f2f5bfa8 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Instance shutdown successfully.
Nov 29 07:20:42 compute-0 kernel: tapd69c5dfd-95: entered promiscuous mode
Nov 29 07:20:42 compute-0 systemd-udevd[231856]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:20:42 compute-0 NetworkManager[55227]: <info>  [1764400842.4819] manager: (tapd69c5dfd-95): new Tun device (/org/freedesktop/NetworkManager/Devices/170)
Nov 29 07:20:42 compute-0 nova_compute[187185]: 2025-11-29 07:20:42.481 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:42 compute-0 ovn_controller[95281]: 2025-11-29T07:20:42Z|00311|binding|INFO|Claiming lport d69c5dfd-952c-44e7-9e26-18e9807fcaf6 for this chassis.
Nov 29 07:20:42 compute-0 ovn_controller[95281]: 2025-11-29T07:20:42Z|00312|binding|INFO|d69c5dfd-952c-44e7-9e26-18e9807fcaf6: Claiming fa:16:3e:ff:90:f9 10.100.0.5
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:42.490 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ff:90:f9 10.100.0.5'], port_security=['fa:16:3e:ff:90:f9 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '385b61e0-d06f-45d5-833f-956226dbe647', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b1af4918-cc03-490f-9e76-dc4f5fd7f840', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c231e63624d44fc19e0989abfb1afb22', 'neutron:revision_number': '5', 'neutron:security_group_ids': '64c144d3-0e65-4786-8bd8-0434ea854658', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.218'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=578489e5-bc3a-4682-96b0-942be7815ce6, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=d69c5dfd-952c-44e7-9e26-18e9807fcaf6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:42.492 104254 INFO neutron.agent.ovn.metadata.agent [-] Port d69c5dfd-952c-44e7-9e26-18e9807fcaf6 in datapath b1af4918-cc03-490f-9e76-dc4f5fd7f840 bound to our chassis
Nov 29 07:20:42 compute-0 ovn_controller[95281]: 2025-11-29T07:20:42Z|00313|binding|INFO|Setting lport d69c5dfd-952c-44e7-9e26-18e9807fcaf6 ovn-installed in OVS
Nov 29 07:20:42 compute-0 ovn_controller[95281]: 2025-11-29T07:20:42Z|00314|binding|INFO|Setting lport d69c5dfd-952c-44e7-9e26-18e9807fcaf6 up in Southbound
Nov 29 07:20:42 compute-0 NetworkManager[55227]: <info>  [1764400842.4947] device (tapd69c5dfd-95): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:20:42 compute-0 NetworkManager[55227]: <info>  [1764400842.4956] device (tapd69c5dfd-95): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:20:42 compute-0 nova_compute[187185]: 2025-11-29 07:20:42.495 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:42.494 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b1af4918-cc03-490f-9e76-dc4f5fd7f840
Nov 29 07:20:42 compute-0 nova_compute[187185]: 2025-11-29 07:20:42.496 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:42 compute-0 nova_compute[187185]: 2025-11-29 07:20:42.498 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:42 compute-0 nova_compute[187185]: 2025-11-29 07:20:42.502 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:42.510 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e5405fc7-b22b-4926-b61b-3722bc86588f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:42.511 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb1af4918-c1 in ovnmeta-b1af4918-cc03-490f-9e76-dc4f5fd7f840 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:42.513 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb1af4918-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:42.514 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[446fa46c-bdd4-4912-81ee-f25de0efc373]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:42.515 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[56c3516a-a5c0-4e64-972a-ea9a6c882203]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:42.529 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[15db8211-618d-4603-aeae-db575ee5cb0e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:20:42 compute-0 systemd-machined[153486]: New machine qemu-42-instance-0000006e.
Nov 29 07:20:42 compute-0 systemd[1]: Started Virtual Machine qemu-42-instance-0000006e.
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:42.553 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[135c20aa-b01f-4054-baa4-0d8ca146a48c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:42.592 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[5ef5a173-6748-41c2-be6b-dfcf5006af68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:20:42 compute-0 NetworkManager[55227]: <info>  [1764400842.6023] manager: (tapb1af4918-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/171)
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:42.601 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d8a46534-220b-4e91-b10b-f9b63efb64ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:42.645 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[8c4cddcc-21d4-4199-b1cd-d93ae236a1db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:42.650 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[dc5363ef-4898-4b36-8f0e-952417d73958]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:20:42 compute-0 NetworkManager[55227]: <info>  [1764400842.6777] device (tapb1af4918-c0): carrier: link connected
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:42.684 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[6994be61-86d2-48e3-9c97-716883e0f6c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:42.704 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[a7c2fdd7-8d2d-4ee5-8507-0bcd76815349]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb1af4918-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:15:17:c1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 103], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 629034, 'reachable_time': 38541, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231988, 'error': None, 'target': 'ovnmeta-b1af4918-cc03-490f-9e76-dc4f5fd7f840', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:42.721 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[5975d73a-d8d8-4626-b0d2-8a35902f63ba]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe15:17c1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 629034, 'tstamp': 629034}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231989, 'error': None, 'target': 'ovnmeta-b1af4918-cc03-490f-9e76-dc4f5fd7f840', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:42.740 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[b6730fe3-c462-4a95-9a1d-57b2dc3725c8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb1af4918-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:15:17:c1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 103], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 629034, 'reachable_time': 38541, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231990, 'error': None, 'target': 'ovnmeta-b1af4918-cc03-490f-9e76-dc4f5fd7f840', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:42.776 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[87e18cbe-2ee7-4acb-83e2-244a6ecdb922]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:42.852 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[fbd6e531-a53b-4224-b070-ea1813074b62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:42.853 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb1af4918-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:42.853 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:42.854 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb1af4918-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:20:42 compute-0 nova_compute[187185]: 2025-11-29 07:20:42.856 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:42 compute-0 kernel: tapb1af4918-c0: entered promiscuous mode
Nov 29 07:20:42 compute-0 NetworkManager[55227]: <info>  [1764400842.8582] manager: (tapb1af4918-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/172)
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:42.860 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb1af4918-c0, col_values=(('external_ids', {'iface-id': '38692e46-2719-40b3-95b5-e8d28c9319d3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:20:42 compute-0 ovn_controller[95281]: 2025-11-29T07:20:42Z|00315|binding|INFO|Releasing lport 38692e46-2719-40b3-95b5-e8d28c9319d3 from this chassis (sb_readonly=0)
Nov 29 07:20:42 compute-0 nova_compute[187185]: 2025-11-29 07:20:42.862 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:42.863 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b1af4918-cc03-490f-9e76-dc4f5fd7f840.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b1af4918-cc03-490f-9e76-dc4f5fd7f840.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:42.864 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[2e882941-eba6-4d07-a945-d1b92cb98584]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:42.865 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-b1af4918-cc03-490f-9e76-dc4f5fd7f840
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/b1af4918-cc03-490f-9e76-dc4f5fd7f840.pid.haproxy
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID b1af4918-cc03-490f-9e76-dc4f5fd7f840
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:20:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:20:42.866 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b1af4918-cc03-490f-9e76-dc4f5fd7f840', 'env', 'PROCESS_TAG=haproxy-b1af4918-cc03-490f-9e76-dc4f5fd7f840', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b1af4918-cc03-490f-9e76-dc4f5fd7f840.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:20:42 compute-0 nova_compute[187185]: 2025-11-29 07:20:42.875 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:43 compute-0 nova_compute[187185]: 2025-11-29 07:20:43.075 187189 DEBUG nova.virt.libvirt.host [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Removed pending event for 385b61e0-d06f-45d5-833f-956226dbe647 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Nov 29 07:20:43 compute-0 nova_compute[187185]: 2025-11-29 07:20:43.076 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400843.075343, 385b61e0-d06f-45d5-833f-956226dbe647 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:20:43 compute-0 nova_compute[187185]: 2025-11-29 07:20:43.076 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] VM Resumed (Lifecycle Event)
Nov 29 07:20:43 compute-0 nova_compute[187185]: 2025-11-29 07:20:43.080 187189 INFO nova.virt.libvirt.driver [-] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Instance running successfully.
Nov 29 07:20:43 compute-0 nova_compute[187185]: 2025-11-29 07:20:43.080 187189 INFO nova.virt.libvirt.driver [None req-702da04b-33ab-40dd-a453-7031f2f5bfa8 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Instance soft rebooted successfully.
Nov 29 07:20:43 compute-0 nova_compute[187185]: 2025-11-29 07:20:43.080 187189 DEBUG nova.compute.manager [None req-702da04b-33ab-40dd-a453-7031f2f5bfa8 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:20:43 compute-0 nova_compute[187185]: 2025-11-29 07:20:43.124 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:20:43 compute-0 nova_compute[187185]: 2025-11-29 07:20:43.128 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:20:43 compute-0 nova_compute[187185]: 2025-11-29 07:20:43.152 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] During sync_power_state the instance has a pending task (reboot_started). Skip.
Nov 29 07:20:43 compute-0 nova_compute[187185]: 2025-11-29 07:20:43.153 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400843.0760894, 385b61e0-d06f-45d5-833f-956226dbe647 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:20:43 compute-0 nova_compute[187185]: 2025-11-29 07:20:43.153 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] VM Started (Lifecycle Event)
Nov 29 07:20:43 compute-0 nova_compute[187185]: 2025-11-29 07:20:43.175 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:20:43 compute-0 nova_compute[187185]: 2025-11-29 07:20:43.183 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:20:43 compute-0 nova_compute[187185]: 2025-11-29 07:20:43.194 187189 DEBUG oslo_concurrency.lockutils [None req-702da04b-33ab-40dd-a453-7031f2f5bfa8 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "385b61e0-d06f-45d5-833f-956226dbe647" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 6.905s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:20:43 compute-0 podman[232029]: 2025-11-29 07:20:43.239929715 +0000 UTC m=+0.050635850 container create 58dc36e690dfd6938f24a1bfe01f51f2cb95556008dff11d6a0ec6df4e213fb9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b1af4918-cc03-490f-9e76-dc4f5fd7f840, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:20:43 compute-0 systemd[1]: Started libpod-conmon-58dc36e690dfd6938f24a1bfe01f51f2cb95556008dff11d6a0ec6df4e213fb9.scope.
Nov 29 07:20:43 compute-0 podman[232029]: 2025-11-29 07:20:43.21032655 +0000 UTC m=+0.021032705 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:20:43 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:20:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e0460d6cc72aa4a486124250f4366d75e772e418bf5de6f0bc36bca52107157/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:20:43 compute-0 podman[232029]: 2025-11-29 07:20:43.331888242 +0000 UTC m=+0.142594407 container init 58dc36e690dfd6938f24a1bfe01f51f2cb95556008dff11d6a0ec6df4e213fb9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b1af4918-cc03-490f-9e76-dc4f5fd7f840, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 07:20:43 compute-0 podman[232029]: 2025-11-29 07:20:43.337649744 +0000 UTC m=+0.148355879 container start 58dc36e690dfd6938f24a1bfe01f51f2cb95556008dff11d6a0ec6df4e213fb9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b1af4918-cc03-490f-9e76-dc4f5fd7f840, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:20:43 compute-0 neutron-haproxy-ovnmeta-b1af4918-cc03-490f-9e76-dc4f5fd7f840[232044]: [NOTICE]   (232048) : New worker (232050) forked
Nov 29 07:20:43 compute-0 neutron-haproxy-ovnmeta-b1af4918-cc03-490f-9e76-dc4f5fd7f840[232044]: [NOTICE]   (232048) : Loading success.
Nov 29 07:20:44 compute-0 nova_compute[187185]: 2025-11-29 07:20:44.084 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:44 compute-0 nova_compute[187185]: 2025-11-29 07:20:44.478 187189 DEBUG nova.compute.manager [req-b90c1e5d-c91b-45d8-a9a8-5e143008ec31 req-f4d8a434-2913-4c78-a651-c572839fe5b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Received event network-vif-unplugged-d69c5dfd-952c-44e7-9e26-18e9807fcaf6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:20:44 compute-0 nova_compute[187185]: 2025-11-29 07:20:44.479 187189 DEBUG oslo_concurrency.lockutils [req-b90c1e5d-c91b-45d8-a9a8-5e143008ec31 req-f4d8a434-2913-4c78-a651-c572839fe5b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "385b61e0-d06f-45d5-833f-956226dbe647-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:20:44 compute-0 nova_compute[187185]: 2025-11-29 07:20:44.479 187189 DEBUG oslo_concurrency.lockutils [req-b90c1e5d-c91b-45d8-a9a8-5e143008ec31 req-f4d8a434-2913-4c78-a651-c572839fe5b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "385b61e0-d06f-45d5-833f-956226dbe647-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:20:44 compute-0 nova_compute[187185]: 2025-11-29 07:20:44.479 187189 DEBUG oslo_concurrency.lockutils [req-b90c1e5d-c91b-45d8-a9a8-5e143008ec31 req-f4d8a434-2913-4c78-a651-c572839fe5b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "385b61e0-d06f-45d5-833f-956226dbe647-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:20:44 compute-0 nova_compute[187185]: 2025-11-29 07:20:44.480 187189 DEBUG nova.compute.manager [req-b90c1e5d-c91b-45d8-a9a8-5e143008ec31 req-f4d8a434-2913-4c78-a651-c572839fe5b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] No waiting events found dispatching network-vif-unplugged-d69c5dfd-952c-44e7-9e26-18e9807fcaf6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:20:44 compute-0 nova_compute[187185]: 2025-11-29 07:20:44.480 187189 WARNING nova.compute.manager [req-b90c1e5d-c91b-45d8-a9a8-5e143008ec31 req-f4d8a434-2913-4c78-a651-c572839fe5b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Received unexpected event network-vif-unplugged-d69c5dfd-952c-44e7-9e26-18e9807fcaf6 for instance with vm_state active and task_state None.
Nov 29 07:20:44 compute-0 nova_compute[187185]: 2025-11-29 07:20:44.480 187189 DEBUG nova.compute.manager [req-b90c1e5d-c91b-45d8-a9a8-5e143008ec31 req-f4d8a434-2913-4c78-a651-c572839fe5b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Received event network-vif-plugged-d69c5dfd-952c-44e7-9e26-18e9807fcaf6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:20:44 compute-0 nova_compute[187185]: 2025-11-29 07:20:44.481 187189 DEBUG oslo_concurrency.lockutils [req-b90c1e5d-c91b-45d8-a9a8-5e143008ec31 req-f4d8a434-2913-4c78-a651-c572839fe5b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "385b61e0-d06f-45d5-833f-956226dbe647-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:20:44 compute-0 nova_compute[187185]: 2025-11-29 07:20:44.481 187189 DEBUG oslo_concurrency.lockutils [req-b90c1e5d-c91b-45d8-a9a8-5e143008ec31 req-f4d8a434-2913-4c78-a651-c572839fe5b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "385b61e0-d06f-45d5-833f-956226dbe647-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:20:44 compute-0 nova_compute[187185]: 2025-11-29 07:20:44.481 187189 DEBUG oslo_concurrency.lockutils [req-b90c1e5d-c91b-45d8-a9a8-5e143008ec31 req-f4d8a434-2913-4c78-a651-c572839fe5b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "385b61e0-d06f-45d5-833f-956226dbe647-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:20:44 compute-0 nova_compute[187185]: 2025-11-29 07:20:44.482 187189 DEBUG nova.compute.manager [req-b90c1e5d-c91b-45d8-a9a8-5e143008ec31 req-f4d8a434-2913-4c78-a651-c572839fe5b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] No waiting events found dispatching network-vif-plugged-d69c5dfd-952c-44e7-9e26-18e9807fcaf6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:20:44 compute-0 nova_compute[187185]: 2025-11-29 07:20:44.482 187189 WARNING nova.compute.manager [req-b90c1e5d-c91b-45d8-a9a8-5e143008ec31 req-f4d8a434-2913-4c78-a651-c572839fe5b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Received unexpected event network-vif-plugged-d69c5dfd-952c-44e7-9e26-18e9807fcaf6 for instance with vm_state active and task_state None.
Nov 29 07:20:44 compute-0 nova_compute[187185]: 2025-11-29 07:20:44.482 187189 DEBUG nova.compute.manager [req-b90c1e5d-c91b-45d8-a9a8-5e143008ec31 req-f4d8a434-2913-4c78-a651-c572839fe5b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Received event network-vif-plugged-d69c5dfd-952c-44e7-9e26-18e9807fcaf6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:20:44 compute-0 nova_compute[187185]: 2025-11-29 07:20:44.483 187189 DEBUG oslo_concurrency.lockutils [req-b90c1e5d-c91b-45d8-a9a8-5e143008ec31 req-f4d8a434-2913-4c78-a651-c572839fe5b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "385b61e0-d06f-45d5-833f-956226dbe647-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:20:44 compute-0 nova_compute[187185]: 2025-11-29 07:20:44.483 187189 DEBUG oslo_concurrency.lockutils [req-b90c1e5d-c91b-45d8-a9a8-5e143008ec31 req-f4d8a434-2913-4c78-a651-c572839fe5b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "385b61e0-d06f-45d5-833f-956226dbe647-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:20:44 compute-0 nova_compute[187185]: 2025-11-29 07:20:44.483 187189 DEBUG oslo_concurrency.lockutils [req-b90c1e5d-c91b-45d8-a9a8-5e143008ec31 req-f4d8a434-2913-4c78-a651-c572839fe5b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "385b61e0-d06f-45d5-833f-956226dbe647-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:20:44 compute-0 nova_compute[187185]: 2025-11-29 07:20:44.483 187189 DEBUG nova.compute.manager [req-b90c1e5d-c91b-45d8-a9a8-5e143008ec31 req-f4d8a434-2913-4c78-a651-c572839fe5b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] No waiting events found dispatching network-vif-plugged-d69c5dfd-952c-44e7-9e26-18e9807fcaf6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:20:44 compute-0 nova_compute[187185]: 2025-11-29 07:20:44.484 187189 WARNING nova.compute.manager [req-b90c1e5d-c91b-45d8-a9a8-5e143008ec31 req-f4d8a434-2913-4c78-a651-c572839fe5b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Received unexpected event network-vif-plugged-d69c5dfd-952c-44e7-9e26-18e9807fcaf6 for instance with vm_state active and task_state None.
Nov 29 07:20:44 compute-0 nova_compute[187185]: 2025-11-29 07:20:44.484 187189 DEBUG nova.compute.manager [req-b90c1e5d-c91b-45d8-a9a8-5e143008ec31 req-f4d8a434-2913-4c78-a651-c572839fe5b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Received event network-vif-plugged-d69c5dfd-952c-44e7-9e26-18e9807fcaf6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:20:44 compute-0 nova_compute[187185]: 2025-11-29 07:20:44.484 187189 DEBUG oslo_concurrency.lockutils [req-b90c1e5d-c91b-45d8-a9a8-5e143008ec31 req-f4d8a434-2913-4c78-a651-c572839fe5b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "385b61e0-d06f-45d5-833f-956226dbe647-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:20:44 compute-0 nova_compute[187185]: 2025-11-29 07:20:44.484 187189 DEBUG oslo_concurrency.lockutils [req-b90c1e5d-c91b-45d8-a9a8-5e143008ec31 req-f4d8a434-2913-4c78-a651-c572839fe5b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "385b61e0-d06f-45d5-833f-956226dbe647-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:20:44 compute-0 nova_compute[187185]: 2025-11-29 07:20:44.485 187189 DEBUG oslo_concurrency.lockutils [req-b90c1e5d-c91b-45d8-a9a8-5e143008ec31 req-f4d8a434-2913-4c78-a651-c572839fe5b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "385b61e0-d06f-45d5-833f-956226dbe647-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:20:44 compute-0 nova_compute[187185]: 2025-11-29 07:20:44.485 187189 DEBUG nova.compute.manager [req-b90c1e5d-c91b-45d8-a9a8-5e143008ec31 req-f4d8a434-2913-4c78-a651-c572839fe5b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] No waiting events found dispatching network-vif-plugged-d69c5dfd-952c-44e7-9e26-18e9807fcaf6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:20:44 compute-0 nova_compute[187185]: 2025-11-29 07:20:44.485 187189 WARNING nova.compute.manager [req-b90c1e5d-c91b-45d8-a9a8-5e143008ec31 req-f4d8a434-2913-4c78-a651-c572839fe5b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Received unexpected event network-vif-plugged-d69c5dfd-952c-44e7-9e26-18e9807fcaf6 for instance with vm_state active and task_state None.
Nov 29 07:20:45 compute-0 nova_compute[187185]: 2025-11-29 07:20:45.896 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:46 compute-0 nova_compute[187185]: 2025-11-29 07:20:46.457 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:47.999 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '385b61e0-d06f-45d5-833f-956226dbe647', 'name': 'tempest-TestNetworkAdvancedServerOps-server-1185749356', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000006e', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c231e63624d44fc19e0989abfb1afb22', 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'hostId': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.000 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.004 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 385b61e0-d06f-45d5-833f-956226dbe647 / tapd69c5dfd-95 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.004 12 DEBUG ceilometer.compute.pollsters [-] 385b61e0-d06f-45d5-833f-956226dbe647/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f7343ef9-2a97-44cb-81ce-c2316689997a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-0000006e-385b61e0-d06f-45d5-833f-956226dbe647-tapd69c5dfd-95', 'timestamp': '2025-11-29T07:20:48.000696', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1185749356', 'name': 'tapd69c5dfd-95', 'instance_id': '385b61e0-d06f-45d5-833f-956226dbe647', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:90:f9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd69c5dfd-95'}, 'message_id': 'ed5f4d6c-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6295.718932823, 'message_signature': 'e415ebf4a89d1f5fa7c41a84a2101c1882e979165e5a9e5ae9db97bbe14b6501'}]}, 'timestamp': '2025-11-29 07:20:48.005362', '_unique_id': '838c556e6c4d4238b22b8596c53fa97c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.006 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.007 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.022 12 DEBUG ceilometer.compute.pollsters [-] 385b61e0-d06f-45d5-833f-956226dbe647/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.022 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 385b61e0-d06f-45d5-833f-956226dbe647: ceilometer.compute.pollsters.NoVolumeException
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.022 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.022 12 DEBUG ceilometer.compute.pollsters [-] 385b61e0-d06f-45d5-833f-956226dbe647/cpu volume: 4740000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '60ad739e-8ee2-491e-a515-22c0c3d77c98', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4740000000, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '385b61e0-d06f-45d5-833f-956226dbe647', 'timestamp': '2025-11-29T07:20:48.022527', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1185749356', 'name': 'instance-0000006e', 'instance_id': '385b61e0-d06f-45d5-833f-956226dbe647', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'ed6200ca-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6295.740134272, 'message_signature': '3c9d026c9e0fc26c2de5022122a63b7506070b33ae3804efe9982c2f106ebd3a'}]}, 'timestamp': '2025-11-29 07:20:48.022974', '_unique_id': 'e348804932b94d9e847c672467c10951'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.023 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.024 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.024 12 DEBUG ceilometer.compute.pollsters [-] 385b61e0-d06f-45d5-833f-956226dbe647/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5548dbc1-c7db-435d-918a-962c0e16ff2b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-0000006e-385b61e0-d06f-45d5-833f-956226dbe647-tapd69c5dfd-95', 'timestamp': '2025-11-29T07:20:48.024815', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1185749356', 'name': 'tapd69c5dfd-95', 'instance_id': '385b61e0-d06f-45d5-833f-956226dbe647', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:90:f9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd69c5dfd-95'}, 'message_id': 'ed6258ea-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6295.718932823, 'message_signature': '9ef1dfc603e49e3e0b509ff300e4d6f3a77b8964cac586a7f657fcf85ca4f864'}]}, 'timestamp': '2025-11-29 07:20:48.025387', '_unique_id': '811ec7c6d7e7474fa874cb44fb86860a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.026 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.027 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.027 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1185749356>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1185749356>]
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.027 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.038 12 DEBUG ceilometer.compute.pollsters [-] 385b61e0-d06f-45d5-833f-956226dbe647/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.039 12 DEBUG ceilometer.compute.pollsters [-] 385b61e0-d06f-45d5-833f-956226dbe647/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7983ae15-cd70-41d9-a6c2-7216980dbec2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '385b61e0-d06f-45d5-833f-956226dbe647-vda', 'timestamp': '2025-11-29T07:20:48.027457', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1185749356', 'name': 'instance-0000006e', 'instance_id': '385b61e0-d06f-45d5-833f-956226dbe647', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ed647aa8-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6295.745689688, 'message_signature': '5b48ae58d4706a249f7786800570f91122741add4b527a6ee238ca13d69f7589'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '385b61e0-d06f-45d5-833f-956226dbe647-sda', 'timestamp': '2025-11-29T07:20:48.027457', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1185749356', 'name': 'instance-0000006e', 'instance_id': '385b61e0-d06f-45d5-833f-956226dbe647', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ed648778-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6295.745689688, 'message_signature': 'ce9e8fc8a663db2aade9b89d4df6ac33e74ea6d8a2e32fc450ca4ef2829c6551'}]}, 'timestamp': '2025-11-29 07:20:48.039450', '_unique_id': '3b11ced287244394b6e944848218be06'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.040 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.041 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.041 12 DEBUG ceilometer.compute.pollsters [-] 385b61e0-d06f-45d5-833f-956226dbe647/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.041 12 DEBUG ceilometer.compute.pollsters [-] 385b61e0-d06f-45d5-833f-956226dbe647/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0bf301e1-c1b9-4db4-8691-d4c87270a731', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '385b61e0-d06f-45d5-833f-956226dbe647-vda', 'timestamp': '2025-11-29T07:20:48.041449', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1185749356', 'name': 'instance-0000006e', 'instance_id': '385b61e0-d06f-45d5-833f-956226dbe647', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ed64e0f6-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6295.745689688, 'message_signature': '0c7f893805c241535af48df71aa2cfb0dc8b7041774f44f656297f72b650dbae'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '385b61e0-d06f-45d5-833f-956226dbe647-sda', 'timestamp': '2025-11-29T07:20:48.041449', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1185749356', 'name': 'instance-0000006e', 'instance_id': '385b61e0-d06f-45d5-833f-956226dbe647', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ed64ebe6-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6295.745689688, 'message_signature': 'ae72922cbc771212a6db1c55cbf5ebda3ee9c3e1950f45f2409f9f89f47c8be4'}]}, 'timestamp': '2025-11-29 07:20:48.042006', '_unique_id': '405113a3c31e4cbb94a50de26edae739'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.042 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.043 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.043 12 DEBUG ceilometer.compute.pollsters [-] 385b61e0-d06f-45d5-833f-956226dbe647/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '557967d0-ddc0-4bef-987e-f996ecda651f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-0000006e-385b61e0-d06f-45d5-833f-956226dbe647-tapd69c5dfd-95', 'timestamp': '2025-11-29T07:20:48.043498', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1185749356', 'name': 'tapd69c5dfd-95', 'instance_id': '385b61e0-d06f-45d5-833f-956226dbe647', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:90:f9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd69c5dfd-95'}, 'message_id': 'ed653196-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6295.718932823, 'message_signature': '84dfb4a61ccf88b67e26cd90b19e8930bf1ee4ba0aa464366bfc1a8aece4d48b'}]}, 'timestamp': '2025-11-29 07:20:48.043811', '_unique_id': '76043ee591e442859afee5d783f7a95b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.044 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.045 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.045 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.045 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1185749356>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1185749356>]
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.045 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.070 12 DEBUG ceilometer.compute.pollsters [-] 385b61e0-d06f-45d5-833f-956226dbe647/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.071 12 DEBUG ceilometer.compute.pollsters [-] 385b61e0-d06f-45d5-833f-956226dbe647/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '23ec75b6-6913-49a0-95a6-3d3f890b3e03', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '385b61e0-d06f-45d5-833f-956226dbe647-vda', 'timestamp': '2025-11-29T07:20:48.045594', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1185749356', 'name': 'instance-0000006e', 'instance_id': '385b61e0-d06f-45d5-833f-956226dbe647', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ed696662-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6295.76382552, 'message_signature': '5ebdfd57ef89ad74d909fba402bce571c8f2ae684807e37a12c5d31d9e530c65'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 
'resource_id': '385b61e0-d06f-45d5-833f-956226dbe647-sda', 'timestamp': '2025-11-29T07:20:48.045594', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1185749356', 'name': 'instance-0000006e', 'instance_id': '385b61e0-d06f-45d5-833f-956226dbe647', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ed6974a4-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6295.76382552, 'message_signature': 'd853f9f671b6192536a9bb9abb15e8f3d25e9b781e79801b15dfde5460bca879'}]}, 'timestamp': '2025-11-29 07:20:48.071726', '_unique_id': '8e40bc40a0b941d0bbc02beadbe59be6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.072 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.073 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.073 12 DEBUG ceilometer.compute.pollsters [-] 385b61e0-d06f-45d5-833f-956226dbe647/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.073 12 DEBUG ceilometer.compute.pollsters [-] 385b61e0-d06f-45d5-833f-956226dbe647/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '58e235fd-7233-451e-91c8-15fec28f902c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '385b61e0-d06f-45d5-833f-956226dbe647-vda', 'timestamp': '2025-11-29T07:20:48.073511', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1185749356', 'name': 'instance-0000006e', 'instance_id': '385b61e0-d06f-45d5-833f-956226dbe647', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ed69c67a-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6295.76382552, 'message_signature': '9598dc47ea15384e22792d8c6d975f4efaa9af2cb7111c0b8150be622faa6d8d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 
'project_name': None, 'resource_id': '385b61e0-d06f-45d5-833f-956226dbe647-sda', 'timestamp': '2025-11-29T07:20:48.073511', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1185749356', 'name': 'instance-0000006e', 'instance_id': '385b61e0-d06f-45d5-833f-956226dbe647', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ed69d084-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6295.76382552, 'message_signature': '1c9bcdb5483cdf308b26eb4d076a33c96bb76a37d520395060206a8c9c4332d0'}]}, 'timestamp': '2025-11-29 07:20:48.074075', '_unique_id': '2e11e2e6fda943719795b203ed88d6a8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.074 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.075 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.075 12 DEBUG ceilometer.compute.pollsters [-] 385b61e0-d06f-45d5-833f-956226dbe647/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'da5f9599-8138-463f-b2d6-151e0ecf8b33', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-0000006e-385b61e0-d06f-45d5-833f-956226dbe647-tapd69c5dfd-95', 'timestamp': '2025-11-29T07:20:48.075368', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1185749356', 'name': 'tapd69c5dfd-95', 'instance_id': '385b61e0-d06f-45d5-833f-956226dbe647', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:90:f9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd69c5dfd-95'}, 'message_id': 'ed6a0cfc-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6295.718932823, 'message_signature': '1e25bee21087eb56ff39a34da5a6fa3bcac75ced984cd56ac3d0e49be6dc3b9a'}]}, 'timestamp': '2025-11-29 07:20:48.075619', '_unique_id': '24bf33d93760405aaa8216a2df3471ae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.076 12 DEBUG ceilometer.compute.pollsters [-] 385b61e0-d06f-45d5-833f-956226dbe647/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 DEBUG ceilometer.compute.pollsters [-] 385b61e0-d06f-45d5-833f-956226dbe647/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '24cecbf0-43dd-459b-b9dc-9766fb455a99', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '385b61e0-d06f-45d5-833f-956226dbe647-vda', 'timestamp': '2025-11-29T07:20:48.076906', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1185749356', 'name': 'instance-0000006e', 'instance_id': '385b61e0-d06f-45d5-833f-956226dbe647', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ed6a4a14-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6295.76382552, 'message_signature': '7ad0407c35ab3176c2a898eb5037a07b076044801906d96b99d129bb4f0e4c0e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '385b61e0-d06f-45d5-833f-956226dbe647-sda', 'timestamp': '2025-11-29T07:20:48.076906', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1185749356', 'name': 'instance-0000006e', 'instance_id': '385b61e0-d06f-45d5-833f-956226dbe647', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ed6a52e8-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6295.76382552, 'message_signature': 'd9695786ab8678404a9b0460d278ff5d60df1a23e355e6385b3f8bad7c2c2692'}]}, 'timestamp': '2025-11-29 07:20:48.077380', '_unique_id': 'da04fba07df04af7b4cfeb844560011b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.077 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.078 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.078 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.078 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1185749356>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1185749356>]
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.079 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.079 12 DEBUG ceilometer.compute.pollsters [-] 385b61e0-d06f-45d5-833f-956226dbe647/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.079 12 DEBUG ceilometer.compute.pollsters [-] 385b61e0-d06f-45d5-833f-956226dbe647/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9fe80d04-4283-4fd2-8145-cf1bac16f34d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '385b61e0-d06f-45d5-833f-956226dbe647-vda', 'timestamp': '2025-11-29T07:20:48.079111', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1185749356', 'name': 'instance-0000006e', 'instance_id': '385b61e0-d06f-45d5-833f-956226dbe647', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ed6aa0a4-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6295.76382552, 'message_signature': '2eaffdc4449ba9264cd8c5437e80708dedc4f203a4b4b44f7144bab3b9f82aee'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '385b61e0-d06f-45d5-833f-956226dbe647-sda', 'timestamp': '2025-11-29T07:20:48.079111', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1185749356', 'name': 'instance-0000006e', 'instance_id': '385b61e0-d06f-45d5-833f-956226dbe647', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ed6aaa36-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6295.76382552, 'message_signature': '17c77b7df921620260cfd112a60899b8fee634101ed65a7cb9e5b5f5d5b355e4'}]}, 'timestamp': '2025-11-29 07:20:48.079615', '_unique_id': 'f69627b2be4a437ca320009a658e0334'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.080 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1185749356>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1185749356>]
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 DEBUG ceilometer.compute.pollsters [-] 385b61e0-d06f-45d5-833f-956226dbe647/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '95efcde3-0e3e-44c1-b8e7-ff90dfb7faf2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-0000006e-385b61e0-d06f-45d5-833f-956226dbe647-tapd69c5dfd-95', 'timestamp': '2025-11-29T07:20:48.081128', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1185749356', 'name': 'tapd69c5dfd-95', 'instance_id': '385b61e0-d06f-45d5-833f-956226dbe647', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:90:f9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd69c5dfd-95'}, 'message_id': 'ed6aef0a-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6295.718932823, 'message_signature': '5fbe3670cc6e0ce31e12d24a28ee3f25ca8779522a2284231a35038d527325d6'}]}, 'timestamp': '2025-11-29 07:20:48.081421', '_unique_id': '917cfb78338548aa8b2b06f85b669896'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.081 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.082 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.082 12 DEBUG ceilometer.compute.pollsters [-] 385b61e0-d06f-45d5-833f-956226dbe647/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ecbb4bb4-0620-42be-9da5-e1c8a33b454f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-0000006e-385b61e0-d06f-45d5-833f-956226dbe647-tapd69c5dfd-95', 'timestamp': '2025-11-29T07:20:48.082743', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1185749356', 'name': 'tapd69c5dfd-95', 'instance_id': '385b61e0-d06f-45d5-833f-956226dbe647', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:90:f9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd69c5dfd-95'}, 'message_id': 'ed6b2e66-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6295.718932823, 'message_signature': 'c56f53551de2bbd8406321c0ad4e077517bdd542e60acc8abf107e3fc46887e0'}]}, 'timestamp': '2025-11-29 07:20:48.083039', '_unique_id': 'f2df98bbc73041dca8bb720f47e0a5ef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.083 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.084 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.084 12 DEBUG ceilometer.compute.pollsters [-] 385b61e0-d06f-45d5-833f-956226dbe647/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.084 12 DEBUG ceilometer.compute.pollsters [-] 385b61e0-d06f-45d5-833f-956226dbe647/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '108970b8-222a-4864-bc8c-dd262e8b4579', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '385b61e0-d06f-45d5-833f-956226dbe647-vda', 'timestamp': '2025-11-29T07:20:48.084403', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1185749356', 'name': 'instance-0000006e', 'instance_id': '385b61e0-d06f-45d5-833f-956226dbe647', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ed6b6ef8-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6295.76382552, 'message_signature': 'c2fcc093a7d95b85f9c3c37deadd488ae923798a10c637ecd3a986bfec46e83e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '385b61e0-d06f-45d5-833f-956226dbe647-sda', 'timestamp': '2025-11-29T07:20:48.084403', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1185749356', 'name': 'instance-0000006e', 'instance_id': '385b61e0-d06f-45d5-833f-956226dbe647', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ed6b7a88-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6295.76382552, 'message_signature': '7b78daa485d7953e576632fc40b820bfb7c89d8c1baabe9b63ad12787a1ce6f0'}]}, 'timestamp': '2025-11-29 07:20:48.085004', '_unique_id': '7a4834e6b75c4cb6942e84888922a33b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.085 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.086 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.086 12 DEBUG ceilometer.compute.pollsters [-] 385b61e0-d06f-45d5-833f-956226dbe647/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '95a600bf-c570-4040-869f-86a37b97b036', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-0000006e-385b61e0-d06f-45d5-833f-956226dbe647-tapd69c5dfd-95', 'timestamp': '2025-11-29T07:20:48.086529', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1185749356', 'name': 'tapd69c5dfd-95', 'instance_id': '385b61e0-d06f-45d5-833f-956226dbe647', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:90:f9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd69c5dfd-95'}, 'message_id': 'ed6bc0d8-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6295.718932823, 'message_signature': '43352619559089402f22481ddf590767a7a3db336b978bd4f712dc35a9c28748'}]}, 'timestamp': '2025-11-29 07:20:48.086762', '_unique_id': '6678a03843d84226922136816140b084'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.087 12 DEBUG ceilometer.compute.pollsters [-] 385b61e0-d06f-45d5-833f-956226dbe647/disk.device.read.latency volume: 206510719 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 DEBUG ceilometer.compute.pollsters [-] 385b61e0-d06f-45d5-833f-956226dbe647/disk.device.read.latency volume: 552635 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '802d3067-9dc4-4e7c-96bc-4ba6582e69e1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 206510719, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '385b61e0-d06f-45d5-833f-956226dbe647-vda', 'timestamp': '2025-11-29T07:20:48.087826', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1185749356', 'name': 'instance-0000006e', 'instance_id': '385b61e0-d06f-45d5-833f-956226dbe647', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ed6bf472-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6295.76382552, 'message_signature': 'e3203f526ef12462cc3caf1180626bc84a3253e01b84f2a213dc86c3690004ff'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 552635, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '385b61e0-d06f-45d5-833f-956226dbe647-sda', 'timestamp': '2025-11-29T07:20:48.087826', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1185749356', 'name': 'instance-0000006e', 'instance_id': '385b61e0-d06f-45d5-833f-956226dbe647', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ed6bfc06-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6295.76382552, 'message_signature': 'c3f63846e47688bb278a5850678dec3c1918493b90faedbdca8f82a123d03652'}]}, 'timestamp': '2025-11-29 07:20:48.088260', '_unique_id': '598bd678b46b4bfabf70a9d381f9df2a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.088 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.089 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.089 12 DEBUG ceilometer.compute.pollsters [-] 385b61e0-d06f-45d5-833f-956226dbe647/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ab547121-85e1-4950-b9aa-dc137301f363', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-0000006e-385b61e0-d06f-45d5-833f-956226dbe647-tapd69c5dfd-95', 'timestamp': '2025-11-29T07:20:48.089339', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1185749356', 'name': 'tapd69c5dfd-95', 'instance_id': '385b61e0-d06f-45d5-833f-956226dbe647', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:90:f9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd69c5dfd-95'}, 'message_id': 'ed6c2e56-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6295.718932823, 'message_signature': '5f0e274a0b1c888daf947e69aa2a0db38114434b5b12248f499dabdbc3748418'}]}, 'timestamp': '2025-11-29 07:20:48.089567', '_unique_id': '6700763578104654b5af326822af9ce0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 DEBUG ceilometer.compute.pollsters [-] 385b61e0-d06f-45d5-833f-956226dbe647/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.090 12 DEBUG ceilometer.compute.pollsters [-] 385b61e0-d06f-45d5-833f-956226dbe647/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eb890ed2-5d77-473a-9b39-99bfb1238f12', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '385b61e0-d06f-45d5-833f-956226dbe647-vda', 'timestamp': '2025-11-29T07:20:48.090600', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1185749356', 'name': 'instance-0000006e', 'instance_id': '385b61e0-d06f-45d5-833f-956226dbe647', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ed6c5f7a-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6295.745689688, 'message_signature': '52fc4712a86bde3a449413e2d72951eaef3e31228399ff54516132bdd911ac4c'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '385b61e0-d06f-45d5-833f-956226dbe647-sda', 'timestamp': '2025-11-29T07:20:48.090600', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1185749356', 'name': 'instance-0000006e', 'instance_id': '385b61e0-d06f-45d5-833f-956226dbe647', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ed6c67c2-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6295.745689688, 'message_signature': 'c058475cb3c0577198cc7c08d412dfb00de24f19f6a4a3744b838ac01e5ad605'}]}, 'timestamp': '2025-11-29 07:20:48.091023', '_unique_id': '6071653a270b4c2faf7f817c171776fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.091 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 DEBUG ceilometer.compute.pollsters [-] 385b61e0-d06f-45d5-833f-956226dbe647/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '614985db-b03e-4de4-8db5-ce6880eee952', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-0000006e-385b61e0-d06f-45d5-833f-956226dbe647-tapd69c5dfd-95', 'timestamp': '2025-11-29T07:20:48.092238', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1185749356', 'name': 'tapd69c5dfd-95', 'instance_id': '385b61e0-d06f-45d5-833f-956226dbe647', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:90:f9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd69c5dfd-95'}, 'message_id': 'ed6c9fa8-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6295.718932823, 'message_signature': 'ed91e1d8ca2cf5e5a0f879b13cdb26899048f96fac4f71dc225a9d06d85b4998'}]}, 'timestamp': '2025-11-29 07:20:48.092466', '_unique_id': '691aa6dcca854b1f96f12aa3240162d3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.092 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.093 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.093 12 DEBUG ceilometer.compute.pollsters [-] 385b61e0-d06f-45d5-833f-956226dbe647/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a62bdf1b-927e-4253-9d78-61f738799636', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-0000006e-385b61e0-d06f-45d5-833f-956226dbe647-tapd69c5dfd-95', 'timestamp': '2025-11-29T07:20:48.093533', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1185749356', 'name': 'tapd69c5dfd-95', 'instance_id': '385b61e0-d06f-45d5-833f-956226dbe647', 'instance_type': 'm1.nano', 'host': '69f4b3b6b7ed4050e071d416dd7b9d13cf27e9b5579d7f525e0e9bfc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:90:f9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd69c5dfd-95'}, 'message_id': 'ed6cd234-ccf3-11f0-8f64-fa163e220349', 'monotonic_time': 6295.718932823, 'message_signature': '0735c8282ac3588808e4d54648f0711b119a8109b36a3a0c2c08ed0d126b0228'}]}, 'timestamp': '2025-11-29 07:20:48.093756', '_unique_id': '55aa98314c4f43fb855fe890b9b1bb7e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:20:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:20:48.094 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:20:49 compute-0 nova_compute[187185]: 2025-11-29 07:20:49.087 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:50 compute-0 nova_compute[187185]: 2025-11-29 07:20:50.897 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:51 compute-0 podman[232060]: 2025-11-29 07:20:51.83745656 +0000 UTC m=+0.103529884 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 07:20:54 compute-0 nova_compute[187185]: 2025-11-29 07:20:54.090 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:55 compute-0 nova_compute[187185]: 2025-11-29 07:20:55.898 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:20:56 compute-0 ovn_controller[95281]: 2025-11-29T07:20:56Z|00041|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ff:90:f9 10.100.0.5
Nov 29 07:20:58 compute-0 podman[232098]: 2025-11-29 07:20:58.792916205 +0000 UTC m=+0.053756849 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 07:20:59 compute-0 nova_compute[187185]: 2025-11-29 07:20:59.094 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:21:00 compute-0 podman[232122]: 2025-11-29 07:21:00.803070895 +0000 UTC m=+0.059445849 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd)
Nov 29 07:21:00 compute-0 podman[232123]: 2025-11-29 07:21:00.821052973 +0000 UTC m=+0.071245302 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 29 07:21:00 compute-0 nova_compute[187185]: 2025-11-29 07:21:00.901 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:21:01 compute-0 nova_compute[187185]: 2025-11-29 07:21:01.960 187189 INFO nova.compute.manager [None req-94d6cb5b-7a2c-4d2b-8e38-600b074eb062 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Get console output
Nov 29 07:21:01 compute-0 nova_compute[187185]: 2025-11-29 07:21:01.966 213758 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 29 07:21:02 compute-0 ovn_controller[95281]: 2025-11-29T07:21:02Z|00316|binding|INFO|Releasing lport 38692e46-2719-40b3-95b5-e8d28c9319d3 from this chassis (sb_readonly=0)
Nov 29 07:21:02 compute-0 nova_compute[187185]: 2025-11-29 07:21:02.520 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:21:02 compute-0 nova_compute[187185]: 2025-11-29 07:21:02.855 187189 DEBUG nova.compute.manager [req-b7f9d6e5-b864-467f-b4ff-8916e1129c35 req-32b269ce-1fbe-4bfa-a580-3b0518439ad7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Received event network-changed-d69c5dfd-952c-44e7-9e26-18e9807fcaf6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:21:02 compute-0 nova_compute[187185]: 2025-11-29 07:21:02.856 187189 DEBUG nova.compute.manager [req-b7f9d6e5-b864-467f-b4ff-8916e1129c35 req-32b269ce-1fbe-4bfa-a580-3b0518439ad7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Refreshing instance network info cache due to event network-changed-d69c5dfd-952c-44e7-9e26-18e9807fcaf6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:21:02 compute-0 nova_compute[187185]: 2025-11-29 07:21:02.856 187189 DEBUG oslo_concurrency.lockutils [req-b7f9d6e5-b864-467f-b4ff-8916e1129c35 req-32b269ce-1fbe-4bfa-a580-3b0518439ad7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-385b61e0-d06f-45d5-833f-956226dbe647" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:21:02 compute-0 nova_compute[187185]: 2025-11-29 07:21:02.857 187189 DEBUG oslo_concurrency.lockutils [req-b7f9d6e5-b864-467f-b4ff-8916e1129c35 req-32b269ce-1fbe-4bfa-a580-3b0518439ad7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-385b61e0-d06f-45d5-833f-956226dbe647" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:21:02 compute-0 nova_compute[187185]: 2025-11-29 07:21:02.857 187189 DEBUG nova.network.neutron [req-b7f9d6e5-b864-467f-b4ff-8916e1129c35 req-32b269ce-1fbe-4bfa-a580-3b0518439ad7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Refreshing network info cache for port d69c5dfd-952c-44e7-9e26-18e9807fcaf6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:21:02 compute-0 nova_compute[187185]: 2025-11-29 07:21:02.932 187189 DEBUG oslo_concurrency.lockutils [None req-95288d11-f6bb-479a-a16c-b7873407a125 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "385b61e0-d06f-45d5-833f-956226dbe647" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:21:02 compute-0 nova_compute[187185]: 2025-11-29 07:21:02.932 187189 DEBUG oslo_concurrency.lockutils [None req-95288d11-f6bb-479a-a16c-b7873407a125 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "385b61e0-d06f-45d5-833f-956226dbe647" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:21:02 compute-0 nova_compute[187185]: 2025-11-29 07:21:02.933 187189 DEBUG oslo_concurrency.lockutils [None req-95288d11-f6bb-479a-a16c-b7873407a125 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "385b61e0-d06f-45d5-833f-956226dbe647-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:21:02 compute-0 nova_compute[187185]: 2025-11-29 07:21:02.933 187189 DEBUG oslo_concurrency.lockutils [None req-95288d11-f6bb-479a-a16c-b7873407a125 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "385b61e0-d06f-45d5-833f-956226dbe647-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:21:02 compute-0 nova_compute[187185]: 2025-11-29 07:21:02.933 187189 DEBUG oslo_concurrency.lockutils [None req-95288d11-f6bb-479a-a16c-b7873407a125 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "385b61e0-d06f-45d5-833f-956226dbe647-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:21:02 compute-0 nova_compute[187185]: 2025-11-29 07:21:02.945 187189 INFO nova.compute.manager [None req-95288d11-f6bb-479a-a16c-b7873407a125 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Terminating instance
Nov 29 07:21:02 compute-0 nova_compute[187185]: 2025-11-29 07:21:02.966 187189 DEBUG nova.compute.manager [None req-95288d11-f6bb-479a-a16c-b7873407a125 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 07:21:02 compute-0 kernel: tapd69c5dfd-95 (unregistering): left promiscuous mode
Nov 29 07:21:02 compute-0 NetworkManager[55227]: <info>  [1764400862.9985] device (tapd69c5dfd-95): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:21:03 compute-0 ovn_controller[95281]: 2025-11-29T07:21:03Z|00317|binding|INFO|Releasing lport d69c5dfd-952c-44e7-9e26-18e9807fcaf6 from this chassis (sb_readonly=0)
Nov 29 07:21:03 compute-0 ovn_controller[95281]: 2025-11-29T07:21:03Z|00318|binding|INFO|Setting lport d69c5dfd-952c-44e7-9e26-18e9807fcaf6 down in Southbound
Nov 29 07:21:03 compute-0 nova_compute[187185]: 2025-11-29 07:21:03.004 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:21:03 compute-0 ovn_controller[95281]: 2025-11-29T07:21:03Z|00319|binding|INFO|Removing iface tapd69c5dfd-95 ovn-installed in OVS
Nov 29 07:21:03 compute-0 nova_compute[187185]: 2025-11-29 07:21:03.007 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:21:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:21:03.015 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ff:90:f9 10.100.0.5'], port_security=['fa:16:3e:ff:90:f9 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '385b61e0-d06f-45d5-833f-956226dbe647', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b1af4918-cc03-490f-9e76-dc4f5fd7f840', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c231e63624d44fc19e0989abfb1afb22', 'neutron:revision_number': '6', 'neutron:security_group_ids': '64c144d3-0e65-4786-8bd8-0434ea854658', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=578489e5-bc3a-4682-96b0-942be7815ce6, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=d69c5dfd-952c-44e7-9e26-18e9807fcaf6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:21:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:21:03.017 104254 INFO neutron.agent.ovn.metadata.agent [-] Port d69c5dfd-952c-44e7-9e26-18e9807fcaf6 in datapath b1af4918-cc03-490f-9e76-dc4f5fd7f840 unbound from our chassis
Nov 29 07:21:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:21:03.019 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b1af4918-cc03-490f-9e76-dc4f5fd7f840, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:21:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:21:03.020 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[5e610da1-4aae-4f33-98b3-8a6e44222e55]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:21:03 compute-0 nova_compute[187185]: 2025-11-29 07:21:03.021 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:21:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:21:03.021 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b1af4918-cc03-490f-9e76-dc4f5fd7f840 namespace which is not needed anymore
Nov 29 07:21:03 compute-0 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d0000006e.scope: Deactivated successfully.
Nov 29 07:21:03 compute-0 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d0000006e.scope: Consumed 13.155s CPU time.
Nov 29 07:21:03 compute-0 ovn_controller[95281]: 2025-11-29T07:21:03Z|00320|binding|INFO|Releasing lport 38692e46-2719-40b3-95b5-e8d28c9319d3 from this chassis (sb_readonly=0)
Nov 29 07:21:03 compute-0 systemd-machined[153486]: Machine qemu-42-instance-0000006e terminated.
Nov 29 07:21:03 compute-0 nova_compute[187185]: 2025-11-29 07:21:03.075 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:21:03 compute-0 NetworkManager[55227]: <info>  [1764400863.1860] manager: (tapd69c5dfd-95): new Tun device (/org/freedesktop/NetworkManager/Devices/173)
Nov 29 07:21:03 compute-0 ovn_controller[95281]: 2025-11-29T07:21:03Z|00321|binding|INFO|Releasing lport 38692e46-2719-40b3-95b5-e8d28c9319d3 from this chassis (sb_readonly=0)
Nov 29 07:21:03 compute-0 nova_compute[187185]: 2025-11-29 07:21:03.240 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:21:03 compute-0 nova_compute[187185]: 2025-11-29 07:21:03.272 187189 INFO nova.virt.libvirt.driver [-] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Instance destroyed successfully.
Nov 29 07:21:03 compute-0 nova_compute[187185]: 2025-11-29 07:21:03.272 187189 DEBUG nova.objects.instance [None req-95288d11-f6bb-479a-a16c-b7873407a125 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'resources' on Instance uuid 385b61e0-d06f-45d5-833f-956226dbe647 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:21:03 compute-0 nova_compute[187185]: 2025-11-29 07:21:03.289 187189 DEBUG nova.virt.libvirt.vif [None req-95288d11-f6bb-479a-a16c-b7873407a125 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:20:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1185749356',display_name='tempest-TestNetworkAdvancedServerOps-server-1185749356',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1185749356',id=110,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGSwWnxtNlh6nwv52DSkLcXTayESRz1TKsVEdkgArNOlSLUsPbKJSA3vkHHPtiLgyTWyPt3OWUciI+4jP4we7rtLJMxd1AlvhHoYZGKDwql7lK2v6bgvKYVGR4xaItQ7lg==',key_name='tempest-TestNetworkAdvancedServerOps-167732049',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:20:16Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-ymcbmpe0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:20:43Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=385b61e0-d06f-45d5-833f-956226dbe647,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d69c5dfd-952c-44e7-9e26-18e9807fcaf6", "address": "fa:16:3e:ff:90:f9", "network": {"id": "b1af4918-cc03-490f-9e76-dc4f5fd7f840", "bridge": "br-int", "label": "tempest-network-smoke--713941231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd69c5dfd-95", "ovs_interfaceid": "d69c5dfd-952c-44e7-9e26-18e9807fcaf6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:21:03 compute-0 nova_compute[187185]: 2025-11-29 07:21:03.289 187189 DEBUG nova.network.os_vif_util [None req-95288d11-f6bb-479a-a16c-b7873407a125 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converting VIF {"id": "d69c5dfd-952c-44e7-9e26-18e9807fcaf6", "address": "fa:16:3e:ff:90:f9", "network": {"id": "b1af4918-cc03-490f-9e76-dc4f5fd7f840", "bridge": "br-int", "label": "tempest-network-smoke--713941231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd69c5dfd-95", "ovs_interfaceid": "d69c5dfd-952c-44e7-9e26-18e9807fcaf6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:21:03 compute-0 nova_compute[187185]: 2025-11-29 07:21:03.290 187189 DEBUG nova.network.os_vif_util [None req-95288d11-f6bb-479a-a16c-b7873407a125 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ff:90:f9,bridge_name='br-int',has_traffic_filtering=True,id=d69c5dfd-952c-44e7-9e26-18e9807fcaf6,network=Network(b1af4918-cc03-490f-9e76-dc4f5fd7f840),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd69c5dfd-95') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:21:03 compute-0 nova_compute[187185]: 2025-11-29 07:21:03.291 187189 DEBUG os_vif [None req-95288d11-f6bb-479a-a16c-b7873407a125 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ff:90:f9,bridge_name='br-int',has_traffic_filtering=True,id=d69c5dfd-952c-44e7-9e26-18e9807fcaf6,network=Network(b1af4918-cc03-490f-9e76-dc4f5fd7f840),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd69c5dfd-95') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:21:03 compute-0 nova_compute[187185]: 2025-11-29 07:21:03.293 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:21:03 compute-0 nova_compute[187185]: 2025-11-29 07:21:03.294 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd69c5dfd-95, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:21:03 compute-0 nova_compute[187185]: 2025-11-29 07:21:03.295 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:21:03 compute-0 nova_compute[187185]: 2025-11-29 07:21:03.297 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:21:03 compute-0 nova_compute[187185]: 2025-11-29 07:21:03.301 187189 INFO os_vif [None req-95288d11-f6bb-479a-a16c-b7873407a125 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ff:90:f9,bridge_name='br-int',has_traffic_filtering=True,id=d69c5dfd-952c-44e7-9e26-18e9807fcaf6,network=Network(b1af4918-cc03-490f-9e76-dc4f5fd7f840),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd69c5dfd-95')
Nov 29 07:21:03 compute-0 nova_compute[187185]: 2025-11-29 07:21:03.301 187189 INFO nova.virt.libvirt.driver [None req-95288d11-f6bb-479a-a16c-b7873407a125 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Deleting instance files /var/lib/nova/instances/385b61e0-d06f-45d5-833f-956226dbe647_del
Nov 29 07:21:03 compute-0 nova_compute[187185]: 2025-11-29 07:21:03.302 187189 INFO nova.virt.libvirt.driver [None req-95288d11-f6bb-479a-a16c-b7873407a125 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Deletion of /var/lib/nova/instances/385b61e0-d06f-45d5-833f-956226dbe647_del complete
Nov 29 07:21:03 compute-0 nova_compute[187185]: 2025-11-29 07:21:03.389 187189 INFO nova.compute.manager [None req-95288d11-f6bb-479a-a16c-b7873407a125 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Took 0.42 seconds to destroy the instance on the hypervisor.
Nov 29 07:21:03 compute-0 nova_compute[187185]: 2025-11-29 07:21:03.390 187189 DEBUG oslo.service.loopingcall [None req-95288d11-f6bb-479a-a16c-b7873407a125 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 07:21:03 compute-0 nova_compute[187185]: 2025-11-29 07:21:03.390 187189 DEBUG nova.compute.manager [-] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 07:21:03 compute-0 nova_compute[187185]: 2025-11-29 07:21:03.390 187189 DEBUG nova.network.neutron [-] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 07:21:03 compute-0 neutron-haproxy-ovnmeta-b1af4918-cc03-490f-9e76-dc4f5fd7f840[232044]: [NOTICE]   (232048) : haproxy version is 2.8.14-c23fe91
Nov 29 07:21:03 compute-0 neutron-haproxy-ovnmeta-b1af4918-cc03-490f-9e76-dc4f5fd7f840[232044]: [NOTICE]   (232048) : path to executable is /usr/sbin/haproxy
Nov 29 07:21:03 compute-0 neutron-haproxy-ovnmeta-b1af4918-cc03-490f-9e76-dc4f5fd7f840[232044]: [WARNING]  (232048) : Exiting Master process...
Nov 29 07:21:03 compute-0 neutron-haproxy-ovnmeta-b1af4918-cc03-490f-9e76-dc4f5fd7f840[232044]: [ALERT]    (232048) : Current worker (232050) exited with code 143 (Terminated)
Nov 29 07:21:03 compute-0 neutron-haproxy-ovnmeta-b1af4918-cc03-490f-9e76-dc4f5fd7f840[232044]: [WARNING]  (232048) : All workers exited. Exiting... (0)
Nov 29 07:21:03 compute-0 systemd[1]: libpod-58dc36e690dfd6938f24a1bfe01f51f2cb95556008dff11d6a0ec6df4e213fb9.scope: Deactivated successfully.
Nov 29 07:21:03 compute-0 podman[232186]: 2025-11-29 07:21:03.434423822 +0000 UTC m=+0.311082934 container died 58dc36e690dfd6938f24a1bfe01f51f2cb95556008dff11d6a0ec6df4e213fb9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b1af4918-cc03-490f-9e76-dc4f5fd7f840, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:21:03 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-58dc36e690dfd6938f24a1bfe01f51f2cb95556008dff11d6a0ec6df4e213fb9-userdata-shm.mount: Deactivated successfully.
Nov 29 07:21:03 compute-0 systemd[1]: var-lib-containers-storage-overlay-3e0460d6cc72aa4a486124250f4366d75e772e418bf5de6f0bc36bca52107157-merged.mount: Deactivated successfully.
Nov 29 07:21:03 compute-0 podman[232186]: 2025-11-29 07:21:03.807818703 +0000 UTC m=+0.684477805 container cleanup 58dc36e690dfd6938f24a1bfe01f51f2cb95556008dff11d6a0ec6df4e213fb9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b1af4918-cc03-490f-9e76-dc4f5fd7f840, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 29 07:21:03 compute-0 systemd[1]: libpod-conmon-58dc36e690dfd6938f24a1bfe01f51f2cb95556008dff11d6a0ec6df4e213fb9.scope: Deactivated successfully.
Nov 29 07:21:03 compute-0 podman[232235]: 2025-11-29 07:21:03.905318686 +0000 UTC m=+0.071206841 container remove 58dc36e690dfd6938f24a1bfe01f51f2cb95556008dff11d6a0ec6df4e213fb9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b1af4918-cc03-490f-9e76-dc4f5fd7f840, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 29 07:21:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:21:03.913 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[dd32383a-3e42-43b3-8a55-62f4ad8617a6]: (4, ('Sat Nov 29 07:21:03 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-b1af4918-cc03-490f-9e76-dc4f5fd7f840 (58dc36e690dfd6938f24a1bfe01f51f2cb95556008dff11d6a0ec6df4e213fb9)\n58dc36e690dfd6938f24a1bfe01f51f2cb95556008dff11d6a0ec6df4e213fb9\nSat Nov 29 07:21:03 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b1af4918-cc03-490f-9e76-dc4f5fd7f840 (58dc36e690dfd6938f24a1bfe01f51f2cb95556008dff11d6a0ec6df4e213fb9)\n58dc36e690dfd6938f24a1bfe01f51f2cb95556008dff11d6a0ec6df4e213fb9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:21:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:21:03.915 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[1c819df3-f60b-419f-b49f-a515816b475b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:21:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:21:03.916 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb1af4918-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:21:03 compute-0 nova_compute[187185]: 2025-11-29 07:21:03.918 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:21:03 compute-0 kernel: tapb1af4918-c0: left promiscuous mode
Nov 29 07:21:03 compute-0 nova_compute[187185]: 2025-11-29 07:21:03.940 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:21:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:21:03.944 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[9c9bebb5-3d62-479f-abcd-9e06f9e48045]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:21:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:21:03.970 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[ed1b4237-e31d-4b9f-b17e-5ce748b40f5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:21:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:21:03.971 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[8a094371-6cac-4ca3-8526-f4286c25338c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:21:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:21:03.988 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[c168d95b-e588-4f5d-8ea4-2d0c4bc934c0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 629024, 'reachable_time': 44001, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232250, 'error': None, 'target': 'ovnmeta-b1af4918-cc03-490f-9e76-dc4f5fd7f840', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:21:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:21:03.991 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b1af4918-cc03-490f-9e76-dc4f5fd7f840 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:21:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:21:03.991 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[afa24ed8-95a8-4fff-b2b0-1966e2c41a0a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:21:03 compute-0 systemd[1]: run-netns-ovnmeta\x2db1af4918\x2dcc03\x2d490f\x2d9e76\x2ddc4f5fd7f840.mount: Deactivated successfully.
Nov 29 07:21:04 compute-0 nova_compute[187185]: 2025-11-29 07:21:04.026 187189 DEBUG nova.compute.manager [req-051a5b44-c049-4c05-a44e-267989c09450 req-b7a9cbba-ad9a-47c6-9222-bc4d617d67d1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Received event network-vif-unplugged-d69c5dfd-952c-44e7-9e26-18e9807fcaf6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:21:04 compute-0 nova_compute[187185]: 2025-11-29 07:21:04.027 187189 DEBUG oslo_concurrency.lockutils [req-051a5b44-c049-4c05-a44e-267989c09450 req-b7a9cbba-ad9a-47c6-9222-bc4d617d67d1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "385b61e0-d06f-45d5-833f-956226dbe647-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:21:04 compute-0 nova_compute[187185]: 2025-11-29 07:21:04.027 187189 DEBUG oslo_concurrency.lockutils [req-051a5b44-c049-4c05-a44e-267989c09450 req-b7a9cbba-ad9a-47c6-9222-bc4d617d67d1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "385b61e0-d06f-45d5-833f-956226dbe647-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:21:04 compute-0 nova_compute[187185]: 2025-11-29 07:21:04.027 187189 DEBUG oslo_concurrency.lockutils [req-051a5b44-c049-4c05-a44e-267989c09450 req-b7a9cbba-ad9a-47c6-9222-bc4d617d67d1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "385b61e0-d06f-45d5-833f-956226dbe647-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:21:04 compute-0 nova_compute[187185]: 2025-11-29 07:21:04.027 187189 DEBUG nova.compute.manager [req-051a5b44-c049-4c05-a44e-267989c09450 req-b7a9cbba-ad9a-47c6-9222-bc4d617d67d1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] No waiting events found dispatching network-vif-unplugged-d69c5dfd-952c-44e7-9e26-18e9807fcaf6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:21:04 compute-0 nova_compute[187185]: 2025-11-29 07:21:04.028 187189 DEBUG nova.compute.manager [req-051a5b44-c049-4c05-a44e-267989c09450 req-b7a9cbba-ad9a-47c6-9222-bc4d617d67d1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Received event network-vif-unplugged-d69c5dfd-952c-44e7-9e26-18e9807fcaf6 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 07:21:04 compute-0 nova_compute[187185]: 2025-11-29 07:21:04.028 187189 DEBUG nova.compute.manager [req-051a5b44-c049-4c05-a44e-267989c09450 req-b7a9cbba-ad9a-47c6-9222-bc4d617d67d1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Received event network-vif-plugged-d69c5dfd-952c-44e7-9e26-18e9807fcaf6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:21:04 compute-0 nova_compute[187185]: 2025-11-29 07:21:04.028 187189 DEBUG oslo_concurrency.lockutils [req-051a5b44-c049-4c05-a44e-267989c09450 req-b7a9cbba-ad9a-47c6-9222-bc4d617d67d1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "385b61e0-d06f-45d5-833f-956226dbe647-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:21:04 compute-0 nova_compute[187185]: 2025-11-29 07:21:04.029 187189 DEBUG oslo_concurrency.lockutils [req-051a5b44-c049-4c05-a44e-267989c09450 req-b7a9cbba-ad9a-47c6-9222-bc4d617d67d1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "385b61e0-d06f-45d5-833f-956226dbe647-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:21:04 compute-0 nova_compute[187185]: 2025-11-29 07:21:04.029 187189 DEBUG oslo_concurrency.lockutils [req-051a5b44-c049-4c05-a44e-267989c09450 req-b7a9cbba-ad9a-47c6-9222-bc4d617d67d1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "385b61e0-d06f-45d5-833f-956226dbe647-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:21:04 compute-0 nova_compute[187185]: 2025-11-29 07:21:04.029 187189 DEBUG nova.compute.manager [req-051a5b44-c049-4c05-a44e-267989c09450 req-b7a9cbba-ad9a-47c6-9222-bc4d617d67d1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] No waiting events found dispatching network-vif-plugged-d69c5dfd-952c-44e7-9e26-18e9807fcaf6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:21:04 compute-0 nova_compute[187185]: 2025-11-29 07:21:04.029 187189 WARNING nova.compute.manager [req-051a5b44-c049-4c05-a44e-267989c09450 req-b7a9cbba-ad9a-47c6-9222-bc4d617d67d1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Received unexpected event network-vif-plugged-d69c5dfd-952c-44e7-9e26-18e9807fcaf6 for instance with vm_state active and task_state deleting.
Nov 29 07:21:04 compute-0 nova_compute[187185]: 2025-11-29 07:21:04.346 187189 DEBUG nova.network.neutron [-] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:21:04 compute-0 nova_compute[187185]: 2025-11-29 07:21:04.371 187189 INFO nova.compute.manager [-] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Took 0.98 seconds to deallocate network for instance.
Nov 29 07:21:04 compute-0 nova_compute[187185]: 2025-11-29 07:21:04.470 187189 DEBUG nova.network.neutron [req-b7f9d6e5-b864-467f-b4ff-8916e1129c35 req-32b269ce-1fbe-4bfa-a580-3b0518439ad7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Updated VIF entry in instance network info cache for port d69c5dfd-952c-44e7-9e26-18e9807fcaf6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:21:04 compute-0 nova_compute[187185]: 2025-11-29 07:21:04.471 187189 DEBUG nova.network.neutron [req-b7f9d6e5-b864-467f-b4ff-8916e1129c35 req-32b269ce-1fbe-4bfa-a580-3b0518439ad7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Updating instance_info_cache with network_info: [{"id": "d69c5dfd-952c-44e7-9e26-18e9807fcaf6", "address": "fa:16:3e:ff:90:f9", "network": {"id": "b1af4918-cc03-490f-9e76-dc4f5fd7f840", "bridge": "br-int", "label": "tempest-network-smoke--713941231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd69c5dfd-95", "ovs_interfaceid": "d69c5dfd-952c-44e7-9e26-18e9807fcaf6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:21:04 compute-0 nova_compute[187185]: 2025-11-29 07:21:04.475 187189 DEBUG oslo_concurrency.lockutils [None req-95288d11-f6bb-479a-a16c-b7873407a125 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:21:04 compute-0 nova_compute[187185]: 2025-11-29 07:21:04.476 187189 DEBUG oslo_concurrency.lockutils [None req-95288d11-f6bb-479a-a16c-b7873407a125 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:21:04 compute-0 nova_compute[187185]: 2025-11-29 07:21:04.502 187189 DEBUG oslo_concurrency.lockutils [req-b7f9d6e5-b864-467f-b4ff-8916e1129c35 req-32b269ce-1fbe-4bfa-a580-3b0518439ad7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-385b61e0-d06f-45d5-833f-956226dbe647" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:21:04 compute-0 nova_compute[187185]: 2025-11-29 07:21:04.576 187189 DEBUG nova.compute.provider_tree [None req-95288d11-f6bb-479a-a16c-b7873407a125 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:21:04 compute-0 nova_compute[187185]: 2025-11-29 07:21:04.596 187189 DEBUG nova.scheduler.client.report [None req-95288d11-f6bb-479a-a16c-b7873407a125 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:21:04 compute-0 nova_compute[187185]: 2025-11-29 07:21:04.630 187189 DEBUG oslo_concurrency.lockutils [None req-95288d11-f6bb-479a-a16c-b7873407a125 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.154s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:21:04 compute-0 nova_compute[187185]: 2025-11-29 07:21:04.679 187189 INFO nova.scheduler.client.report [None req-95288d11-f6bb-479a-a16c-b7873407a125 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Deleted allocations for instance 385b61e0-d06f-45d5-833f-956226dbe647
Nov 29 07:21:04 compute-0 nova_compute[187185]: 2025-11-29 07:21:04.778 187189 DEBUG oslo_concurrency.lockutils [None req-95288d11-f6bb-479a-a16c-b7873407a125 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "385b61e0-d06f-45d5-833f-956226dbe647" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.845s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:21:05 compute-0 nova_compute[187185]: 2025-11-29 07:21:05.903 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:21:06 compute-0 nova_compute[187185]: 2025-11-29 07:21:06.183 187189 DEBUG nova.compute.manager [req-37aa157b-abef-4844-b46c-c85d6cd20575 req-093a7f51-836c-4242-b2df-d4a4d03b86b1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Received event network-vif-deleted-d69c5dfd-952c-44e7-9e26-18e9807fcaf6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:21:06 compute-0 nova_compute[187185]: 2025-11-29 07:21:06.185 187189 INFO nova.compute.manager [req-37aa157b-abef-4844-b46c-c85d6cd20575 req-093a7f51-836c-4242-b2df-d4a4d03b86b1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Neutron deleted interface d69c5dfd-952c-44e7-9e26-18e9807fcaf6; detaching it from the instance and deleting it from the info cache
Nov 29 07:21:06 compute-0 nova_compute[187185]: 2025-11-29 07:21:06.185 187189 DEBUG nova.network.neutron [req-37aa157b-abef-4844-b46c-c85d6cd20575 req-093a7f51-836c-4242-b2df-d4a4d03b86b1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106
Nov 29 07:21:06 compute-0 nova_compute[187185]: 2025-11-29 07:21:06.188 187189 DEBUG nova.compute.manager [req-37aa157b-abef-4844-b46c-c85d6cd20575 req-093a7f51-836c-4242-b2df-d4a4d03b86b1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Detach interface failed, port_id=d69c5dfd-952c-44e7-9e26-18e9807fcaf6, reason: Instance 385b61e0-d06f-45d5-833f-956226dbe647 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Nov 29 07:21:08 compute-0 nova_compute[187185]: 2025-11-29 07:21:08.299 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:21:08 compute-0 nova_compute[187185]: 2025-11-29 07:21:08.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:21:08 compute-0 nova_compute[187185]: 2025-11-29 07:21:08.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:21:08 compute-0 nova_compute[187185]: 2025-11-29 07:21:08.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:21:08 compute-0 nova_compute[187185]: 2025-11-29 07:21:08.339 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 07:21:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:21:10.570 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=38, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=37) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:21:10 compute-0 nova_compute[187185]: 2025-11-29 07:21:10.570 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:21:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:21:10.573 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:21:10 compute-0 nova_compute[187185]: 2025-11-29 07:21:10.905 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:21:11 compute-0 nova_compute[187185]: 2025-11-29 07:21:11.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:21:11 compute-0 nova_compute[187185]: 2025-11-29 07:21:11.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:21:11 compute-0 nova_compute[187185]: 2025-11-29 07:21:11.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:21:11 compute-0 nova_compute[187185]: 2025-11-29 07:21:11.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:21:11 compute-0 nova_compute[187185]: 2025-11-29 07:21:11.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:21:11 compute-0 nova_compute[187185]: 2025-11-29 07:21:11.351 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:21:11 compute-0 nova_compute[187185]: 2025-11-29 07:21:11.351 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:21:11 compute-0 nova_compute[187185]: 2025-11-29 07:21:11.351 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:21:11 compute-0 nova_compute[187185]: 2025-11-29 07:21:11.352 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:21:11 compute-0 nova_compute[187185]: 2025-11-29 07:21:11.535 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:21:11 compute-0 nova_compute[187185]: 2025-11-29 07:21:11.536 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5728MB free_disk=73.29447555541992GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:21:11 compute-0 nova_compute[187185]: 2025-11-29 07:21:11.536 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:21:11 compute-0 nova_compute[187185]: 2025-11-29 07:21:11.537 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:21:11 compute-0 nova_compute[187185]: 2025-11-29 07:21:11.615 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:21:11 compute-0 nova_compute[187185]: 2025-11-29 07:21:11.616 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:21:11 compute-0 nova_compute[187185]: 2025-11-29 07:21:11.633 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:21:11 compute-0 nova_compute[187185]: 2025-11-29 07:21:11.649 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:21:11 compute-0 nova_compute[187185]: 2025-11-29 07:21:11.673 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:21:11 compute-0 nova_compute[187185]: 2025-11-29 07:21:11.673 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.136s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:21:12 compute-0 nova_compute[187185]: 2025-11-29 07:21:12.674 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:21:12 compute-0 nova_compute[187185]: 2025-11-29 07:21:12.675 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:21:12 compute-0 podman[232255]: 2025-11-29 07:21:12.791224312 +0000 UTC m=+0.053526282 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 07:21:12 compute-0 podman[232254]: 2025-11-29 07:21:12.796750308 +0000 UTC m=+0.062108954 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.buildah.version=1.33.7, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, config_id=edpm)
Nov 29 07:21:12 compute-0 podman[232253]: 2025-11-29 07:21:12.81452413 +0000 UTC m=+0.082145270 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 29 07:21:13 compute-0 nova_compute[187185]: 2025-11-29 07:21:13.302 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:21:13 compute-0 nova_compute[187185]: 2025-11-29 07:21:13.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:21:15 compute-0 nova_compute[187185]: 2025-11-29 07:21:15.907 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:21:16 compute-0 nova_compute[187185]: 2025-11-29 07:21:16.310 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:21:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:21:16.575 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '38'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:21:18 compute-0 nova_compute[187185]: 2025-11-29 07:21:18.270 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400863.2692764, 385b61e0-d06f-45d5-833f-956226dbe647 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:21:18 compute-0 nova_compute[187185]: 2025-11-29 07:21:18.271 187189 INFO nova.compute.manager [-] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] VM Stopped (Lifecycle Event)
Nov 29 07:21:18 compute-0 nova_compute[187185]: 2025-11-29 07:21:18.304 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:21:19 compute-0 nova_compute[187185]: 2025-11-29 07:21:19.347 187189 DEBUG nova.compute.manager [None req-2c938c25-2124-4917-9796-cb101ba0bbfa - - - - - -] [instance: 385b61e0-d06f-45d5-833f-956226dbe647] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:21:20 compute-0 nova_compute[187185]: 2025-11-29 07:21:20.907 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:21:22 compute-0 podman[232313]: 2025-11-29 07:21:22.824800249 +0000 UTC m=+0.091032411 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:21:23 compute-0 nova_compute[187185]: 2025-11-29 07:21:23.306 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:21:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:21:25.510 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:21:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:21:25.510 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:21:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:21:25.511 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:21:25 compute-0 nova_compute[187185]: 2025-11-29 07:21:25.909 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:21:28 compute-0 nova_compute[187185]: 2025-11-29 07:21:28.311 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:21:29 compute-0 podman[232341]: 2025-11-29 07:21:29.786801062 +0000 UTC m=+0.052906525 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 07:21:30 compute-0 nova_compute[187185]: 2025-11-29 07:21:30.911 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:21:31 compute-0 podman[232366]: 2025-11-29 07:21:31.808433116 +0000 UTC m=+0.068650720 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:21:31 compute-0 podman[232365]: 2025-11-29 07:21:31.817157162 +0000 UTC m=+0.079692171 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3)
Nov 29 07:21:33 compute-0 nova_compute[187185]: 2025-11-29 07:21:33.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:21:33 compute-0 nova_compute[187185]: 2025-11-29 07:21:33.317 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:21:33 compute-0 nova_compute[187185]: 2025-11-29 07:21:33.318 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:21:33 compute-0 nova_compute[187185]: 2025-11-29 07:21:33.319 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:21:33 compute-0 nova_compute[187185]: 2025-11-29 07:21:33.319 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:21:33 compute-0 nova_compute[187185]: 2025-11-29 07:21:33.319 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:21:33 compute-0 nova_compute[187185]: 2025-11-29 07:21:33.319 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:21:33 compute-0 nova_compute[187185]: 2025-11-29 07:21:33.321 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:21:33 compute-0 nova_compute[187185]: 2025-11-29 07:21:33.373 187189 DEBUG nova.virt.libvirt.imagecache [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314
Nov 29 07:21:33 compute-0 nova_compute[187185]: 2025-11-29 07:21:33.373 187189 WARNING nova.virt.libvirt.imagecache [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28
Nov 29 07:21:33 compute-0 nova_compute[187185]: 2025-11-29 07:21:33.373 187189 WARNING nova.virt.libvirt.imagecache [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123
Nov 29 07:21:33 compute-0 nova_compute[187185]: 2025-11-29 07:21:33.374 187189 INFO nova.virt.libvirt.imagecache [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Removable base files: /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123
Nov 29 07:21:33 compute-0 nova_compute[187185]: 2025-11-29 07:21:33.374 187189 INFO nova.virt.libvirt.imagecache [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28
Nov 29 07:21:33 compute-0 nova_compute[187185]: 2025-11-29 07:21:33.374 187189 INFO nova.virt.libvirt.imagecache [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123
Nov 29 07:21:33 compute-0 nova_compute[187185]: 2025-11-29 07:21:33.374 187189 DEBUG nova.virt.libvirt.imagecache [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350
Nov 29 07:21:33 compute-0 nova_compute[187185]: 2025-11-29 07:21:33.374 187189 DEBUG nova.virt.libvirt.imagecache [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299
Nov 29 07:21:33 compute-0 nova_compute[187185]: 2025-11-29 07:21:33.375 187189 DEBUG nova.virt.libvirt.imagecache [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284
Nov 29 07:21:35 compute-0 nova_compute[187185]: 2025-11-29 07:21:35.912 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:21:38 compute-0 nova_compute[187185]: 2025-11-29 07:21:38.325 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:21:40 compute-0 nova_compute[187185]: 2025-11-29 07:21:40.960 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:21:43 compute-0 nova_compute[187185]: 2025-11-29 07:21:43.330 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:21:43 compute-0 podman[232404]: 2025-11-29 07:21:43.791944583 +0000 UTC m=+0.057368531 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Nov 29 07:21:43 compute-0 podman[232406]: 2025-11-29 07:21:43.795880544 +0000 UTC m=+0.054733446 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 07:21:43 compute-0 podman[232405]: 2025-11-29 07:21:43.821702703 +0000 UTC m=+0.086395050 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, name=ubi9-minimal, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, vcs-type=git, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, container_name=openstack_network_exporter, version=9.6, release=1755695350, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 29 07:21:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:21:45.873 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:36:9b:10 10.100.0.2 2001:db8::f816:3eff:fe36:9b10'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe36:9b10/64', 'neutron:device_id': 'ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f75dc671-4e0c-40f1-8afd-c16b5e416d95', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=944fc855-be48-4f5c-ba58-0898fe543a04, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=6897d2ce-b04d-4d85-9bb6-9da51e7d7f20) old=Port_Binding(mac=['fa:16:3e:36:9b:10 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f75dc671-4e0c-40f1-8afd-c16b5e416d95', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:21:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:21:45.875 104254 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 6897d2ce-b04d-4d85-9bb6-9da51e7d7f20 in datapath f75dc671-4e0c-40f1-8afd-c16b5e416d95 updated
Nov 29 07:21:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:21:45.876 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f75dc671-4e0c-40f1-8afd-c16b5e416d95, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:21:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:21:45.877 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[f8a12a6e-9e75-4d28-b36c-5a17a85d0679]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:21:45 compute-0 nova_compute[187185]: 2025-11-29 07:21:45.962 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:21:48 compute-0 nova_compute[187185]: 2025-11-29 07:21:48.336 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:21:50 compute-0 nova_compute[187185]: 2025-11-29 07:21:50.965 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:21:53 compute-0 nova_compute[187185]: 2025-11-29 07:21:53.342 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:21:53 compute-0 podman[232469]: 2025-11-29 07:21:53.847572822 +0000 UTC m=+0.100920420 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:21:55 compute-0 nova_compute[187185]: 2025-11-29 07:21:55.967 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:21:56 compute-0 nova_compute[187185]: 2025-11-29 07:21:56.772 187189 DEBUG oslo_concurrency.lockutils [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "1be71451-9dcb-4882-afc5-fa2b37f3fa96" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:21:56 compute-0 nova_compute[187185]: 2025-11-29 07:21:56.774 187189 DEBUG oslo_concurrency.lockutils [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "1be71451-9dcb-4882-afc5-fa2b37f3fa96" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:21:56 compute-0 nova_compute[187185]: 2025-11-29 07:21:56.863 187189 DEBUG nova.compute.manager [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 07:21:57 compute-0 nova_compute[187185]: 2025-11-29 07:21:57.049 187189 DEBUG oslo_concurrency.lockutils [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:21:57 compute-0 nova_compute[187185]: 2025-11-29 07:21:57.049 187189 DEBUG oslo_concurrency.lockutils [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:21:57 compute-0 nova_compute[187185]: 2025-11-29 07:21:57.059 187189 DEBUG nova.virt.hardware [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 07:21:57 compute-0 nova_compute[187185]: 2025-11-29 07:21:57.060 187189 INFO nova.compute.claims [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Claim successful on node compute-0.ctlplane.example.com
Nov 29 07:21:57 compute-0 nova_compute[187185]: 2025-11-29 07:21:57.267 187189 DEBUG nova.compute.provider_tree [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:21:57 compute-0 nova_compute[187185]: 2025-11-29 07:21:57.284 187189 DEBUG nova.scheduler.client.report [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:21:57 compute-0 nova_compute[187185]: 2025-11-29 07:21:57.343 187189 DEBUG oslo_concurrency.lockutils [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.294s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:21:57 compute-0 nova_compute[187185]: 2025-11-29 07:21:57.344 187189 DEBUG nova.compute.manager [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 07:21:57 compute-0 nova_compute[187185]: 2025-11-29 07:21:57.447 187189 DEBUG nova.compute.manager [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 07:21:57 compute-0 nova_compute[187185]: 2025-11-29 07:21:57.448 187189 DEBUG nova.network.neutron [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 07:21:57 compute-0 nova_compute[187185]: 2025-11-29 07:21:57.522 187189 INFO nova.virt.libvirt.driver [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 07:21:57 compute-0 nova_compute[187185]: 2025-11-29 07:21:57.560 187189 DEBUG nova.compute.manager [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 07:21:57 compute-0 nova_compute[187185]: 2025-11-29 07:21:57.714 187189 DEBUG nova.compute.manager [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 07:21:57 compute-0 nova_compute[187185]: 2025-11-29 07:21:57.716 187189 DEBUG nova.virt.libvirt.driver [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 07:21:57 compute-0 nova_compute[187185]: 2025-11-29 07:21:57.716 187189 INFO nova.virt.libvirt.driver [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Creating image(s)
Nov 29 07:21:57 compute-0 nova_compute[187185]: 2025-11-29 07:21:57.717 187189 DEBUG oslo_concurrency.lockutils [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "/var/lib/nova/instances/1be71451-9dcb-4882-afc5-fa2b37f3fa96/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:21:57 compute-0 nova_compute[187185]: 2025-11-29 07:21:57.717 187189 DEBUG oslo_concurrency.lockutils [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "/var/lib/nova/instances/1be71451-9dcb-4882-afc5-fa2b37f3fa96/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:21:57 compute-0 nova_compute[187185]: 2025-11-29 07:21:57.718 187189 DEBUG oslo_concurrency.lockutils [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "/var/lib/nova/instances/1be71451-9dcb-4882-afc5-fa2b37f3fa96/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:21:57 compute-0 nova_compute[187185]: 2025-11-29 07:21:57.738 187189 DEBUG oslo_concurrency.processutils [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:21:57 compute-0 nova_compute[187185]: 2025-11-29 07:21:57.802 187189 DEBUG oslo_concurrency.processutils [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:21:57 compute-0 nova_compute[187185]: 2025-11-29 07:21:57.804 187189 DEBUG oslo_concurrency.lockutils [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:21:57 compute-0 nova_compute[187185]: 2025-11-29 07:21:57.804 187189 DEBUG oslo_concurrency.lockutils [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:21:57 compute-0 nova_compute[187185]: 2025-11-29 07:21:57.816 187189 DEBUG oslo_concurrency.processutils [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:21:57 compute-0 nova_compute[187185]: 2025-11-29 07:21:57.861 187189 DEBUG oslo_concurrency.lockutils [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Acquiring lock "729b50e7-7084-4f08-90ca-48a841f50cc9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:21:57 compute-0 nova_compute[187185]: 2025-11-29 07:21:57.862 187189 DEBUG oslo_concurrency.lockutils [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lock "729b50e7-7084-4f08-90ca-48a841f50cc9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:21:57 compute-0 nova_compute[187185]: 2025-11-29 07:21:57.882 187189 DEBUG oslo_concurrency.processutils [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:21:57 compute-0 nova_compute[187185]: 2025-11-29 07:21:57.882 187189 DEBUG oslo_concurrency.processutils [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/1be71451-9dcb-4882-afc5-fa2b37f3fa96/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:21:57 compute-0 nova_compute[187185]: 2025-11-29 07:21:57.901 187189 DEBUG nova.compute.manager [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 07:21:58 compute-0 nova_compute[187185]: 2025-11-29 07:21:58.034 187189 DEBUG nova.policy [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 07:21:58 compute-0 nova_compute[187185]: 2025-11-29 07:21:58.117 187189 DEBUG oslo_concurrency.lockutils [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:21:58 compute-0 nova_compute[187185]: 2025-11-29 07:21:58.118 187189 DEBUG oslo_concurrency.lockutils [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:21:58 compute-0 nova_compute[187185]: 2025-11-29 07:21:58.124 187189 DEBUG nova.virt.hardware [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 07:21:58 compute-0 nova_compute[187185]: 2025-11-29 07:21:58.124 187189 INFO nova.compute.claims [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Claim successful on node compute-0.ctlplane.example.com
Nov 29 07:21:58 compute-0 nova_compute[187185]: 2025-11-29 07:21:58.127 187189 DEBUG oslo_concurrency.processutils [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/1be71451-9dcb-4882-afc5-fa2b37f3fa96/disk 1073741824" returned: 0 in 0.245s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:21:58 compute-0 nova_compute[187185]: 2025-11-29 07:21:58.128 187189 DEBUG oslo_concurrency.lockutils [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.324s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:21:58 compute-0 nova_compute[187185]: 2025-11-29 07:21:58.128 187189 DEBUG oslo_concurrency.processutils [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:21:58 compute-0 nova_compute[187185]: 2025-11-29 07:21:58.181 187189 DEBUG oslo_concurrency.processutils [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:21:58 compute-0 nova_compute[187185]: 2025-11-29 07:21:58.182 187189 DEBUG nova.virt.disk.api [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Checking if we can resize image /var/lib/nova/instances/1be71451-9dcb-4882-afc5-fa2b37f3fa96/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:21:58 compute-0 nova_compute[187185]: 2025-11-29 07:21:58.182 187189 DEBUG oslo_concurrency.processutils [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1be71451-9dcb-4882-afc5-fa2b37f3fa96/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:21:58 compute-0 nova_compute[187185]: 2025-11-29 07:21:58.236 187189 DEBUG oslo_concurrency.processutils [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1be71451-9dcb-4882-afc5-fa2b37f3fa96/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:21:58 compute-0 nova_compute[187185]: 2025-11-29 07:21:58.237 187189 DEBUG nova.virt.disk.api [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Cannot resize image /var/lib/nova/instances/1be71451-9dcb-4882-afc5-fa2b37f3fa96/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:21:58 compute-0 nova_compute[187185]: 2025-11-29 07:21:58.238 187189 DEBUG nova.objects.instance [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'migration_context' on Instance uuid 1be71451-9dcb-4882-afc5-fa2b37f3fa96 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:21:58 compute-0 nova_compute[187185]: 2025-11-29 07:21:58.476 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:21:58 compute-0 nova_compute[187185]: 2025-11-29 07:21:58.594 187189 DEBUG nova.virt.libvirt.driver [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 07:21:58 compute-0 nova_compute[187185]: 2025-11-29 07:21:58.595 187189 DEBUG nova.virt.libvirt.driver [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Ensure instance console log exists: /var/lib/nova/instances/1be71451-9dcb-4882-afc5-fa2b37f3fa96/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 07:21:58 compute-0 nova_compute[187185]: 2025-11-29 07:21:58.596 187189 DEBUG oslo_concurrency.lockutils [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:21:58 compute-0 nova_compute[187185]: 2025-11-29 07:21:58.597 187189 DEBUG oslo_concurrency.lockutils [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:21:58 compute-0 nova_compute[187185]: 2025-11-29 07:21:58.597 187189 DEBUG oslo_concurrency.lockutils [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:21:58 compute-0 nova_compute[187185]: 2025-11-29 07:21:58.652 187189 DEBUG nova.compute.provider_tree [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:21:58 compute-0 nova_compute[187185]: 2025-11-29 07:21:58.696 187189 DEBUG nova.scheduler.client.report [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:21:58 compute-0 nova_compute[187185]: 2025-11-29 07:21:58.735 187189 DEBUG oslo_concurrency.lockutils [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.617s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:21:58 compute-0 nova_compute[187185]: 2025-11-29 07:21:58.737 187189 DEBUG nova.compute.manager [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 07:21:58 compute-0 nova_compute[187185]: 2025-11-29 07:21:58.824 187189 DEBUG nova.compute.manager [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 07:21:58 compute-0 nova_compute[187185]: 2025-11-29 07:21:58.824 187189 DEBUG nova.network.neutron [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 07:21:58 compute-0 nova_compute[187185]: 2025-11-29 07:21:58.881 187189 INFO nova.virt.libvirt.driver [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 07:21:58 compute-0 nova_compute[187185]: 2025-11-29 07:21:58.963 187189 DEBUG nova.compute.manager [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 07:21:59 compute-0 nova_compute[187185]: 2025-11-29 07:21:59.257 187189 DEBUG nova.compute.manager [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 07:21:59 compute-0 nova_compute[187185]: 2025-11-29 07:21:59.259 187189 DEBUG nova.virt.libvirt.driver [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 07:21:59 compute-0 nova_compute[187185]: 2025-11-29 07:21:59.260 187189 INFO nova.virt.libvirt.driver [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Creating image(s)
Nov 29 07:21:59 compute-0 nova_compute[187185]: 2025-11-29 07:21:59.261 187189 DEBUG oslo_concurrency.lockutils [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Acquiring lock "/var/lib/nova/instances/729b50e7-7084-4f08-90ca-48a841f50cc9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:21:59 compute-0 nova_compute[187185]: 2025-11-29 07:21:59.262 187189 DEBUG oslo_concurrency.lockutils [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lock "/var/lib/nova/instances/729b50e7-7084-4f08-90ca-48a841f50cc9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:21:59 compute-0 nova_compute[187185]: 2025-11-29 07:21:59.263 187189 DEBUG oslo_concurrency.lockutils [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lock "/var/lib/nova/instances/729b50e7-7084-4f08-90ca-48a841f50cc9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:21:59 compute-0 nova_compute[187185]: 2025-11-29 07:21:59.292 187189 DEBUG oslo_concurrency.processutils [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:21:59 compute-0 nova_compute[187185]: 2025-11-29 07:21:59.358 187189 DEBUG oslo_concurrency.processutils [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:21:59 compute-0 nova_compute[187185]: 2025-11-29 07:21:59.360 187189 DEBUG oslo_concurrency.lockutils [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:21:59 compute-0 nova_compute[187185]: 2025-11-29 07:21:59.360 187189 DEBUG oslo_concurrency.lockutils [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:21:59 compute-0 nova_compute[187185]: 2025-11-29 07:21:59.372 187189 DEBUG oslo_concurrency.processutils [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:21:59 compute-0 nova_compute[187185]: 2025-11-29 07:21:59.449 187189 DEBUG oslo_concurrency.processutils [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:21:59 compute-0 nova_compute[187185]: 2025-11-29 07:21:59.450 187189 DEBUG oslo_concurrency.processutils [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/729b50e7-7084-4f08-90ca-48a841f50cc9/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:21:59 compute-0 nova_compute[187185]: 2025-11-29 07:21:59.490 187189 DEBUG oslo_concurrency.processutils [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/729b50e7-7084-4f08-90ca-48a841f50cc9/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:21:59 compute-0 nova_compute[187185]: 2025-11-29 07:21:59.492 187189 DEBUG oslo_concurrency.lockutils [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:21:59 compute-0 nova_compute[187185]: 2025-11-29 07:21:59.493 187189 DEBUG oslo_concurrency.processutils [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:21:59 compute-0 nova_compute[187185]: 2025-11-29 07:21:59.520 187189 DEBUG nova.policy [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a992c32ce5fb4cbab645023852f14adc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '980ddbfed54546c89c75e94503491a61', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 07:21:59 compute-0 nova_compute[187185]: 2025-11-29 07:21:59.555 187189 DEBUG oslo_concurrency.processutils [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:21:59 compute-0 nova_compute[187185]: 2025-11-29 07:21:59.556 187189 DEBUG nova.virt.disk.api [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Checking if we can resize image /var/lib/nova/instances/729b50e7-7084-4f08-90ca-48a841f50cc9/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:21:59 compute-0 nova_compute[187185]: 2025-11-29 07:21:59.556 187189 DEBUG oslo_concurrency.processutils [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/729b50e7-7084-4f08-90ca-48a841f50cc9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:21:59 compute-0 nova_compute[187185]: 2025-11-29 07:21:59.628 187189 DEBUG oslo_concurrency.processutils [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/729b50e7-7084-4f08-90ca-48a841f50cc9/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:21:59 compute-0 nova_compute[187185]: 2025-11-29 07:21:59.630 187189 DEBUG nova.virt.disk.api [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Cannot resize image /var/lib/nova/instances/729b50e7-7084-4f08-90ca-48a841f50cc9/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:21:59 compute-0 nova_compute[187185]: 2025-11-29 07:21:59.630 187189 DEBUG nova.objects.instance [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lazy-loading 'migration_context' on Instance uuid 729b50e7-7084-4f08-90ca-48a841f50cc9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:21:59 compute-0 nova_compute[187185]: 2025-11-29 07:21:59.667 187189 DEBUG nova.virt.libvirt.driver [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 07:21:59 compute-0 nova_compute[187185]: 2025-11-29 07:21:59.668 187189 DEBUG nova.virt.libvirt.driver [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Ensure instance console log exists: /var/lib/nova/instances/729b50e7-7084-4f08-90ca-48a841f50cc9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 07:21:59 compute-0 nova_compute[187185]: 2025-11-29 07:21:59.669 187189 DEBUG oslo_concurrency.lockutils [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:21:59 compute-0 nova_compute[187185]: 2025-11-29 07:21:59.669 187189 DEBUG oslo_concurrency.lockutils [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:21:59 compute-0 nova_compute[187185]: 2025-11-29 07:21:59.669 187189 DEBUG oslo_concurrency.lockutils [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:22:00 compute-0 podman[232526]: 2025-11-29 07:22:00.804474229 +0000 UTC m=+0.067326362 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 07:22:00 compute-0 nova_compute[187185]: 2025-11-29 07:22:00.969 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:02 compute-0 podman[232550]: 2025-11-29 07:22:02.793509554 +0000 UTC m=+0.057933287 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 07:22:02 compute-0 podman[232551]: 2025-11-29 07:22:02.802095666 +0000 UTC m=+0.061292182 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 07:22:03 compute-0 nova_compute[187185]: 2025-11-29 07:22:03.479 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:03 compute-0 nova_compute[187185]: 2025-11-29 07:22:03.505 187189 DEBUG nova.network.neutron [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Successfully created port: 0b0600de-624c-4784-aeea-87a27d98f344 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 07:22:04 compute-0 nova_compute[187185]: 2025-11-29 07:22:04.450 187189 DEBUG nova.network.neutron [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Successfully created port: 81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 07:22:05 compute-0 nova_compute[187185]: 2025-11-29 07:22:05.974 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:07 compute-0 nova_compute[187185]: 2025-11-29 07:22:07.721 187189 DEBUG nova.network.neutron [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Successfully updated port: 0b0600de-624c-4784-aeea-87a27d98f344 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 07:22:07 compute-0 nova_compute[187185]: 2025-11-29 07:22:07.741 187189 DEBUG oslo_concurrency.lockutils [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "refresh_cache-1be71451-9dcb-4882-afc5-fa2b37f3fa96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:22:07 compute-0 nova_compute[187185]: 2025-11-29 07:22:07.742 187189 DEBUG oslo_concurrency.lockutils [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquired lock "refresh_cache-1be71451-9dcb-4882-afc5-fa2b37f3fa96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:22:07 compute-0 nova_compute[187185]: 2025-11-29 07:22:07.743 187189 DEBUG nova.network.neutron [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:22:07 compute-0 nova_compute[187185]: 2025-11-29 07:22:07.876 187189 DEBUG nova.compute.manager [req-02d6993a-72ec-40ee-b95f-ebcad918921d req-ec163bcd-c4d4-4d0f-b412-433ef01f2eb7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Received event network-changed-0b0600de-624c-4784-aeea-87a27d98f344 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:22:07 compute-0 nova_compute[187185]: 2025-11-29 07:22:07.877 187189 DEBUG nova.compute.manager [req-02d6993a-72ec-40ee-b95f-ebcad918921d req-ec163bcd-c4d4-4d0f-b412-433ef01f2eb7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Refreshing instance network info cache due to event network-changed-0b0600de-624c-4784-aeea-87a27d98f344. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:22:07 compute-0 nova_compute[187185]: 2025-11-29 07:22:07.877 187189 DEBUG oslo_concurrency.lockutils [req-02d6993a-72ec-40ee-b95f-ebcad918921d req-ec163bcd-c4d4-4d0f-b412-433ef01f2eb7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-1be71451-9dcb-4882-afc5-fa2b37f3fa96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:22:07 compute-0 nova_compute[187185]: 2025-11-29 07:22:07.945 187189 DEBUG nova.network.neutron [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Successfully updated port: 81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 07:22:07 compute-0 nova_compute[187185]: 2025-11-29 07:22:07.982 187189 DEBUG oslo_concurrency.lockutils [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Acquiring lock "refresh_cache-729b50e7-7084-4f08-90ca-48a841f50cc9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:22:07 compute-0 nova_compute[187185]: 2025-11-29 07:22:07.983 187189 DEBUG oslo_concurrency.lockutils [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Acquired lock "refresh_cache-729b50e7-7084-4f08-90ca-48a841f50cc9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:22:07 compute-0 nova_compute[187185]: 2025-11-29 07:22:07.983 187189 DEBUG nova.network.neutron [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:22:08 compute-0 nova_compute[187185]: 2025-11-29 07:22:08.021 187189 DEBUG nova.network.neutron [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:22:08 compute-0 nova_compute[187185]: 2025-11-29 07:22:08.098 187189 DEBUG nova.compute.manager [req-9183a757-d5c8-4584-8d34-a45e494e9a08 req-43ffab80-e7cc-46b5-aeb3-7041787fcb11 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Received event network-changed-81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:22:08 compute-0 nova_compute[187185]: 2025-11-29 07:22:08.099 187189 DEBUG nova.compute.manager [req-9183a757-d5c8-4584-8d34-a45e494e9a08 req-43ffab80-e7cc-46b5-aeb3-7041787fcb11 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Refreshing instance network info cache due to event network-changed-81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:22:08 compute-0 nova_compute[187185]: 2025-11-29 07:22:08.099 187189 DEBUG oslo_concurrency.lockutils [req-9183a757-d5c8-4584-8d34-a45e494e9a08 req-43ffab80-e7cc-46b5-aeb3-7041787fcb11 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-729b50e7-7084-4f08-90ca-48a841f50cc9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:22:08 compute-0 nova_compute[187185]: 2025-11-29 07:22:08.190 187189 DEBUG nova.network.neutron [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:22:08 compute-0 nova_compute[187185]: 2025-11-29 07:22:08.527 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:09 compute-0 nova_compute[187185]: 2025-11-29 07:22:09.374 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:22:09 compute-0 nova_compute[187185]: 2025-11-29 07:22:09.375 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:22:09 compute-0 nova_compute[187185]: 2025-11-29 07:22:09.375 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:22:10 compute-0 nova_compute[187185]: 2025-11-29 07:22:10.183 187189 DEBUG nova.network.neutron [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Updating instance_info_cache with network_info: [{"id": "0b0600de-624c-4784-aeea-87a27d98f344", "address": "fa:16:3e:b0:47:d2", "network": {"id": "f75dc671-4e0c-40f1-8afd-c16b5e416d95", "bridge": "br-int", "label": "tempest-network-smoke--588217173", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb0:47d2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b0600de-62", "ovs_interfaceid": "0b0600de-624c-4784-aeea-87a27d98f344", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:22:10 compute-0 nova_compute[187185]: 2025-11-29 07:22:10.978 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.473 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.474 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.474 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.475 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.475 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.476 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.497 187189 DEBUG oslo_concurrency.lockutils [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Releasing lock "refresh_cache-1be71451-9dcb-4882-afc5-fa2b37f3fa96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.497 187189 DEBUG nova.compute.manager [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Instance network_info: |[{"id": "0b0600de-624c-4784-aeea-87a27d98f344", "address": "fa:16:3e:b0:47:d2", "network": {"id": "f75dc671-4e0c-40f1-8afd-c16b5e416d95", "bridge": "br-int", "label": "tempest-network-smoke--588217173", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb0:47d2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b0600de-62", "ovs_interfaceid": "0b0600de-624c-4784-aeea-87a27d98f344", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.498 187189 DEBUG oslo_concurrency.lockutils [req-02d6993a-72ec-40ee-b95f-ebcad918921d req-ec163bcd-c4d4-4d0f-b412-433ef01f2eb7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-1be71451-9dcb-4882-afc5-fa2b37f3fa96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.498 187189 DEBUG nova.network.neutron [req-02d6993a-72ec-40ee-b95f-ebcad918921d req-ec163bcd-c4d4-4d0f-b412-433ef01f2eb7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Refreshing network info cache for port 0b0600de-624c-4784-aeea-87a27d98f344 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.501 187189 DEBUG nova.virt.libvirt.driver [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Start _get_guest_xml network_info=[{"id": "0b0600de-624c-4784-aeea-87a27d98f344", "address": "fa:16:3e:b0:47:d2", "network": {"id": "f75dc671-4e0c-40f1-8afd-c16b5e416d95", "bridge": "br-int", "label": "tempest-network-smoke--588217173", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb0:47d2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b0600de-62", "ovs_interfaceid": "0b0600de-624c-4784-aeea-87a27d98f344", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.505 187189 WARNING nova.virt.libvirt.driver [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.526 187189 DEBUG nova.virt.libvirt.host [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.527 187189 DEBUG nova.virt.libvirt.host [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.534 187189 DEBUG nova.virt.libvirt.host [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.535 187189 DEBUG nova.virt.libvirt.host [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.537 187189 DEBUG nova.virt.libvirt.driver [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.537 187189 DEBUG nova.virt.hardware [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.538 187189 DEBUG nova.virt.hardware [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.538 187189 DEBUG nova.virt.hardware [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.539 187189 DEBUG nova.virt.hardware [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.539 187189 DEBUG nova.virt.hardware [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.539 187189 DEBUG nova.virt.hardware [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.539 187189 DEBUG nova.virt.hardware [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.540 187189 DEBUG nova.virt.hardware [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.540 187189 DEBUG nova.virt.hardware [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.540 187189 DEBUG nova.virt.hardware [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.541 187189 DEBUG nova.virt.hardware [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.546 187189 DEBUG nova.virt.libvirt.vif [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:21:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-210192598',display_name='tempest-TestGettingAddress-server-210192598',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-210192598',id=115,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMqBtOVeWFxVzcYFJOuDJtYVuL20oDyqcRBPHq57GiuWFxaCS3KceqmhPXeIi9sFvrUoM3x5G9a+RY7U7UfyTQLwWhQmn8+j5tk7QGxgOZ6WpsSYFLeoEl1770NJZUoryw==',key_name='tempest-TestGettingAddress-2009457088',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-tnig7st0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:21:57Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=1be71451-9dcb-4882-afc5-fa2b37f3fa96,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0b0600de-624c-4784-aeea-87a27d98f344", "address": "fa:16:3e:b0:47:d2", "network": {"id": "f75dc671-4e0c-40f1-8afd-c16b5e416d95", "bridge": "br-int", "label": "tempest-network-smoke--588217173", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": 
"2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb0:47d2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b0600de-62", "ovs_interfaceid": "0b0600de-624c-4784-aeea-87a27d98f344", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.547 187189 DEBUG nova.network.os_vif_util [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "0b0600de-624c-4784-aeea-87a27d98f344", "address": "fa:16:3e:b0:47:d2", "network": {"id": "f75dc671-4e0c-40f1-8afd-c16b5e416d95", "bridge": "br-int", "label": "tempest-network-smoke--588217173", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb0:47d2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b0600de-62", "ovs_interfaceid": "0b0600de-624c-4784-aeea-87a27d98f344", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.548 187189 DEBUG nova.network.os_vif_util [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b0:47:d2,bridge_name='br-int',has_traffic_filtering=True,id=0b0600de-624c-4784-aeea-87a27d98f344,network=Network(f75dc671-4e0c-40f1-8afd-c16b5e416d95),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b0600de-62') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.549 187189 DEBUG nova.objects.instance [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'pci_devices' on Instance uuid 1be71451-9dcb-4882-afc5-fa2b37f3fa96 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.563 187189 DEBUG nova.virt.libvirt.driver [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:22:12 compute-0 nova_compute[187185]:   <uuid>1be71451-9dcb-4882-afc5-fa2b37f3fa96</uuid>
Nov 29 07:22:12 compute-0 nova_compute[187185]:   <name>instance-00000073</name>
Nov 29 07:22:12 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 07:22:12 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:22:12 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:22:12 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:       <nova:name>tempest-TestGettingAddress-server-210192598</nova:name>
Nov 29 07:22:12 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:22:12</nova:creationTime>
Nov 29 07:22:12 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 07:22:12 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 07:22:12 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:22:12 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:22:12 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:22:12 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:22:12 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:22:12 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:22:12 compute-0 nova_compute[187185]:         <nova:user uuid="31ac7b05b012433b89143dc9f259644a">tempest-TestGettingAddress-1465017630-project-member</nova:user>
Nov 29 07:22:12 compute-0 nova_compute[187185]:         <nova:project uuid="0111c22b4b954ea586ca20d91ed3970f">tempest-TestGettingAddress-1465017630</nova:project>
Nov 29 07:22:12 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:22:12 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 07:22:12 compute-0 nova_compute[187185]:         <nova:port uuid="0b0600de-624c-4784-aeea-87a27d98f344">
Nov 29 07:22:12 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:feb0:47d2" ipVersion="6"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:22:12 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:22:12 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:22:12 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <system>
Nov 29 07:22:12 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:22:12 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:22:12 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:22:12 compute-0 nova_compute[187185]:       <entry name="serial">1be71451-9dcb-4882-afc5-fa2b37f3fa96</entry>
Nov 29 07:22:12 compute-0 nova_compute[187185]:       <entry name="uuid">1be71451-9dcb-4882-afc5-fa2b37f3fa96</entry>
Nov 29 07:22:12 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     </system>
Nov 29 07:22:12 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:22:12 compute-0 nova_compute[187185]:   <os>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:   </os>
Nov 29 07:22:12 compute-0 nova_compute[187185]:   <features>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:   </features>
Nov 29 07:22:12 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:22:12 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:22:12 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:22:12 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/1be71451-9dcb-4882-afc5-fa2b37f3fa96/disk"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:22:12 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/1be71451-9dcb-4882-afc5-fa2b37f3fa96/disk.config"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:22:12 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:b0:47:d2"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:       <target dev="tap0b0600de-62"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:22:12 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/1be71451-9dcb-4882-afc5-fa2b37f3fa96/console.log" append="off"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <video>
Nov 29 07:22:12 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     </video>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:22:12 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:22:12 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:22:12 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:22:12 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:22:12 compute-0 nova_compute[187185]: </domain>
Nov 29 07:22:12 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.565 187189 DEBUG nova.compute.manager [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Preparing to wait for external event network-vif-plugged-0b0600de-624c-4784-aeea-87a27d98f344 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.565 187189 DEBUG oslo_concurrency.lockutils [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "1be71451-9dcb-4882-afc5-fa2b37f3fa96-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.565 187189 DEBUG oslo_concurrency.lockutils [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "1be71451-9dcb-4882-afc5-fa2b37f3fa96-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.566 187189 DEBUG oslo_concurrency.lockutils [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "1be71451-9dcb-4882-afc5-fa2b37f3fa96-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.566 187189 DEBUG nova.virt.libvirt.vif [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:21:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-210192598',display_name='tempest-TestGettingAddress-server-210192598',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-210192598',id=115,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMqBtOVeWFxVzcYFJOuDJtYVuL20oDyqcRBPHq57GiuWFxaCS3KceqmhPXeIi9sFvrUoM3x5G9a+RY7U7UfyTQLwWhQmn8+j5tk7QGxgOZ6WpsSYFLeoEl1770NJZUoryw==',key_name='tempest-TestGettingAddress-2009457088',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-tnig7st0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:21:57Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=1be71451-9dcb-4882-afc5-fa2b37f3fa96,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0b0600de-624c-4784-aeea-87a27d98f344", "address": "fa:16:3e:b0:47:d2", "network": {"id": "f75dc671-4e0c-40f1-8afd-c16b5e416d95", "bridge": "br-int", "label": "tempest-network-smoke--588217173", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": 
"2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb0:47d2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b0600de-62", "ovs_interfaceid": "0b0600de-624c-4784-aeea-87a27d98f344", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.567 187189 DEBUG nova.network.os_vif_util [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "0b0600de-624c-4784-aeea-87a27d98f344", "address": "fa:16:3e:b0:47:d2", "network": {"id": "f75dc671-4e0c-40f1-8afd-c16b5e416d95", "bridge": "br-int", "label": "tempest-network-smoke--588217173", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb0:47d2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b0600de-62", "ovs_interfaceid": "0b0600de-624c-4784-aeea-87a27d98f344", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.568 187189 DEBUG nova.network.os_vif_util [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b0:47:d2,bridge_name='br-int',has_traffic_filtering=True,id=0b0600de-624c-4784-aeea-87a27d98f344,network=Network(f75dc671-4e0c-40f1-8afd-c16b5e416d95),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b0600de-62') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.569 187189 DEBUG os_vif [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b0:47:d2,bridge_name='br-int',has_traffic_filtering=True,id=0b0600de-624c-4784-aeea-87a27d98f344,network=Network(f75dc671-4e0c-40f1-8afd-c16b5e416d95),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b0600de-62') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.569 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.570 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.570 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.573 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.573 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0b0600de-62, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.574 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0b0600de-62, col_values=(('external_ids', {'iface-id': '0b0600de-624c-4784-aeea-87a27d98f344', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b0:47:d2', 'vm-uuid': '1be71451-9dcb-4882-afc5-fa2b37f3fa96'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.576 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.578 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:22:12 compute-0 NetworkManager[55227]: <info>  [1764400932.5798] manager: (tap0b0600de-62): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/174)
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.582 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.584 187189 INFO os_vif [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b0:47:d2,bridge_name='br-int',has_traffic_filtering=True,id=0b0600de-624c-4784-aeea-87a27d98f344,network=Network(f75dc671-4e0c-40f1-8afd-c16b5e416d95),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b0600de-62')
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.652 187189 DEBUG nova.virt.libvirt.driver [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.653 187189 DEBUG nova.virt.libvirt.driver [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.653 187189 DEBUG nova.virt.libvirt.driver [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No VIF found with MAC fa:16:3e:b0:47:d2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:22:12 compute-0 nova_compute[187185]: 2025-11-29 07:22:12.654 187189 INFO nova.virt.libvirt.driver [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Using config drive
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.220 187189 DEBUG nova.network.neutron [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Updating instance_info_cache with network_info: [{"id": "81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f", "address": "fa:16:3e:62:5e:11", "network": {"id": "b58443a3-f575-4ff1-951d-e92781861793", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1930314854-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "980ddbfed54546c89c75e94503491a61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81aef0aa-c8", "ovs_interfaceid": "81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.257 187189 DEBUG oslo_concurrency.lockutils [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Releasing lock "refresh_cache-729b50e7-7084-4f08-90ca-48a841f50cc9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.258 187189 DEBUG nova.compute.manager [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Instance network_info: |[{"id": "81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f", "address": "fa:16:3e:62:5e:11", "network": {"id": "b58443a3-f575-4ff1-951d-e92781861793", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1930314854-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "980ddbfed54546c89c75e94503491a61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81aef0aa-c8", "ovs_interfaceid": "81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.259 187189 DEBUG oslo_concurrency.lockutils [req-9183a757-d5c8-4584-8d34-a45e494e9a08 req-43ffab80-e7cc-46b5-aeb3-7041787fcb11 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-729b50e7-7084-4f08-90ca-48a841f50cc9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.260 187189 DEBUG nova.network.neutron [req-9183a757-d5c8-4584-8d34-a45e494e9a08 req-43ffab80-e7cc-46b5-aeb3-7041787fcb11 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Refreshing network info cache for port 81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.266 187189 DEBUG nova.virt.libvirt.driver [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Start _get_guest_xml network_info=[{"id": "81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f", "address": "fa:16:3e:62:5e:11", "network": {"id": "b58443a3-f575-4ff1-951d-e92781861793", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1930314854-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "980ddbfed54546c89c75e94503491a61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81aef0aa-c8", "ovs_interfaceid": "81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.273 187189 WARNING nova.virt.libvirt.driver [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.277 187189 DEBUG nova.virt.libvirt.host [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.278 187189 DEBUG nova.virt.libvirt.host [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.282 187189 DEBUG nova.virt.libvirt.host [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.282 187189 DEBUG nova.virt.libvirt.host [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.283 187189 DEBUG nova.virt.libvirt.driver [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.283 187189 DEBUG nova.virt.hardware [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.284 187189 DEBUG nova.virt.hardware [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.284 187189 DEBUG nova.virt.hardware [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.284 187189 DEBUG nova.virt.hardware [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.285 187189 DEBUG nova.virt.hardware [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.285 187189 DEBUG nova.virt.hardware [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.285 187189 DEBUG nova.virt.hardware [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.285 187189 DEBUG nova.virt.hardware [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.286 187189 DEBUG nova.virt.hardware [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.286 187189 DEBUG nova.virt.hardware [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.286 187189 DEBUG nova.virt.hardware [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.289 187189 DEBUG nova.virt.libvirt.vif [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:21:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-1961507248',display_name='tempest-ServerRescueTestJSON-server-1961507248',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-1961507248',id=116,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='980ddbfed54546c89c75e94503491a61',ramdisk_id='',reservation_id='r-ok5v6h99',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSON-1854570869',owner_user_name='tempest-ServerRescueTestJSON-1854570869-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:21:59Z,user_data=None,user_id='a992c32ce5fb4cbab645023852f14adc',uuid=729b50e7-7084-4f08-90ca-48a841f50cc9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f", "address": "fa:16:3e:62:5e:11", "network": {"id": "b58443a3-f575-4ff1-951d-e92781861793", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1930314854-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "980ddbfed54546c89c75e94503491a61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81aef0aa-c8", "ovs_interfaceid": "81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.289 187189 DEBUG nova.network.os_vif_util [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Converting VIF {"id": "81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f", "address": "fa:16:3e:62:5e:11", "network": {"id": "b58443a3-f575-4ff1-951d-e92781861793", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1930314854-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "980ddbfed54546c89c75e94503491a61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81aef0aa-c8", "ovs_interfaceid": "81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.290 187189 DEBUG nova.network.os_vif_util [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:62:5e:11,bridge_name='br-int',has_traffic_filtering=True,id=81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f,network=Network(b58443a3-f575-4ff1-951d-e92781861793),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81aef0aa-c8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.291 187189 DEBUG nova.objects.instance [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lazy-loading 'pci_devices' on Instance uuid 729b50e7-7084-4f08-90ca-48a841f50cc9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.315 187189 DEBUG nova.virt.libvirt.driver [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:22:13 compute-0 nova_compute[187185]:   <uuid>729b50e7-7084-4f08-90ca-48a841f50cc9</uuid>
Nov 29 07:22:13 compute-0 nova_compute[187185]:   <name>instance-00000074</name>
Nov 29 07:22:13 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 07:22:13 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:22:13 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:22:13 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:       <nova:name>tempest-ServerRescueTestJSON-server-1961507248</nova:name>
Nov 29 07:22:13 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:22:13</nova:creationTime>
Nov 29 07:22:13 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 07:22:13 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 07:22:13 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:22:13 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:22:13 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:22:13 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:22:13 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:22:13 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:22:13 compute-0 nova_compute[187185]:         <nova:user uuid="a992c32ce5fb4cbab645023852f14adc">tempest-ServerRescueTestJSON-1854570869-project-member</nova:user>
Nov 29 07:22:13 compute-0 nova_compute[187185]:         <nova:project uuid="980ddbfed54546c89c75e94503491a61">tempest-ServerRescueTestJSON-1854570869</nova:project>
Nov 29 07:22:13 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:22:13 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 07:22:13 compute-0 nova_compute[187185]:         <nova:port uuid="81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f">
Nov 29 07:22:13 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:22:13 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:22:13 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:22:13 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <system>
Nov 29 07:22:13 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:22:13 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:22:13 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:22:13 compute-0 nova_compute[187185]:       <entry name="serial">729b50e7-7084-4f08-90ca-48a841f50cc9</entry>
Nov 29 07:22:13 compute-0 nova_compute[187185]:       <entry name="uuid">729b50e7-7084-4f08-90ca-48a841f50cc9</entry>
Nov 29 07:22:13 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     </system>
Nov 29 07:22:13 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:22:13 compute-0 nova_compute[187185]:   <os>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:   </os>
Nov 29 07:22:13 compute-0 nova_compute[187185]:   <features>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:   </features>
Nov 29 07:22:13 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:22:13 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:22:13 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:22:13 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/729b50e7-7084-4f08-90ca-48a841f50cc9/disk"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:22:13 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/729b50e7-7084-4f08-90ca-48a841f50cc9/disk.config"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:22:13 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:62:5e:11"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:       <target dev="tap81aef0aa-c8"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:22:13 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/729b50e7-7084-4f08-90ca-48a841f50cc9/console.log" append="off"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <video>
Nov 29 07:22:13 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     </video>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:22:13 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:22:13 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:22:13 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:22:13 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:22:13 compute-0 nova_compute[187185]: </domain>
Nov 29 07:22:13 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.317 187189 DEBUG nova.compute.manager [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Preparing to wait for external event network-vif-plugged-81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.317 187189 DEBUG oslo_concurrency.lockutils [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Acquiring lock "729b50e7-7084-4f08-90ca-48a841f50cc9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.317 187189 DEBUG oslo_concurrency.lockutils [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lock "729b50e7-7084-4f08-90ca-48a841f50cc9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.318 187189 DEBUG oslo_concurrency.lockutils [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lock "729b50e7-7084-4f08-90ca-48a841f50cc9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.318 187189 DEBUG nova.virt.libvirt.vif [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:21:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-1961507248',display_name='tempest-ServerRescueTestJSON-server-1961507248',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-1961507248',id=116,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='980ddbfed54546c89c75e94503491a61',ramdisk_id='',reservation_id='r-ok5v6h99',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSON-1854570869',owner_user_name='tempest-ServerRescueTestJSON-1854570869-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:21:59Z,user_data=None,user_id='a992c32ce5fb4cbab645023852f14adc',uuid=729b50e7-7084-4f08-90ca-48a841f50cc9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f", "address": "fa:16:3e:62:5e:11", "network": {"id": "b58443a3-f575-4ff1-951d-e92781861793", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1930314854-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "980ddbfed54546c89c75e94503491a61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81aef0aa-c8", "ovs_interfaceid": "81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.318 187189 DEBUG nova.network.os_vif_util [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Converting VIF {"id": "81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f", "address": "fa:16:3e:62:5e:11", "network": {"id": "b58443a3-f575-4ff1-951d-e92781861793", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1930314854-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "980ddbfed54546c89c75e94503491a61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81aef0aa-c8", "ovs_interfaceid": "81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.319 187189 DEBUG nova.network.os_vif_util [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:62:5e:11,bridge_name='br-int',has_traffic_filtering=True,id=81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f,network=Network(b58443a3-f575-4ff1-951d-e92781861793),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81aef0aa-c8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.319 187189 DEBUG os_vif [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:5e:11,bridge_name='br-int',has_traffic_filtering=True,id=81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f,network=Network(b58443a3-f575-4ff1-951d-e92781861793),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81aef0aa-c8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.319 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.320 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.320 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.320 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.323 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.325 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.325 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap81aef0aa-c8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.326 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap81aef0aa-c8, col_values=(('external_ids', {'iface-id': '81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:62:5e:11', 'vm-uuid': '729b50e7-7084-4f08-90ca-48a841f50cc9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.330 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:13 compute-0 NetworkManager[55227]: <info>  [1764400933.3312] manager: (tap81aef0aa-c8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/175)
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.333 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.340 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.341 187189 INFO os_vif [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:5e:11,bridge_name='br-int',has_traffic_filtering=True,id=81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f,network=Network(b58443a3-f575-4ff1-951d-e92781861793),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81aef0aa-c8')
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.396 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.396 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.396 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.396 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.455 187189 DEBUG nova.virt.libvirt.driver [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.455 187189 DEBUG nova.virt.libvirt.driver [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.455 187189 DEBUG nova.virt.libvirt.driver [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] No VIF found with MAC fa:16:3e:62:5e:11, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.456 187189 INFO nova.virt.libvirt.driver [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Using config drive
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.513 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1be71451-9dcb-4882-afc5-fa2b37f3fa96/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.579 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1be71451-9dcb-4882-afc5-fa2b37f3fa96/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.580 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1be71451-9dcb-4882-afc5-fa2b37f3fa96/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.648 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1be71451-9dcb-4882-afc5-fa2b37f3fa96/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.649 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Periodic task is updating the host stat, it is trying to get disk instance-00000073, but disk file was removed by concurrent operations such as resize.: FileNotFoundError: [Errno 2] No such file or directory: '/var/lib/nova/instances/1be71451-9dcb-4882-afc5-fa2b37f3fa96/disk.config'
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.654 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/729b50e7-7084-4f08-90ca-48a841f50cc9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.678 187189 INFO nova.virt.libvirt.driver [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Creating config drive at /var/lib/nova/instances/1be71451-9dcb-4882-afc5-fa2b37f3fa96/disk.config
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.684 187189 DEBUG oslo_concurrency.processutils [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1be71451-9dcb-4882-afc5-fa2b37f3fa96/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0dq6q7qp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.717 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/729b50e7-7084-4f08-90ca-48a841f50cc9/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.718 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/729b50e7-7084-4f08-90ca-48a841f50cc9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.781 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/729b50e7-7084-4f08-90ca-48a841f50cc9/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.783 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Periodic task is updating the host stat, it is trying to get disk instance-00000074, but disk file was removed by concurrent operations such as resize.: FileNotFoundError: [Errno 2] No such file or directory: '/var/lib/nova/instances/729b50e7-7084-4f08-90ca-48a841f50cc9/disk.config'
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.818 187189 DEBUG oslo_concurrency.processutils [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1be71451-9dcb-4882-afc5-fa2b37f3fa96/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0dq6q7qp" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:22:13 compute-0 ovn_controller[95281]: 2025-11-29T07:22:13Z|00322|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.865 187189 INFO nova.virt.libvirt.driver [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Creating config drive at /var/lib/nova/instances/729b50e7-7084-4f08-90ca-48a841f50cc9/disk.config
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.870 187189 DEBUG oslo_concurrency.processutils [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/729b50e7-7084-4f08-90ca-48a841f50cc9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpijrdkg5g execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:22:13 compute-0 NetworkManager[55227]: <info>  [1764400933.9254] manager: (tap0b0600de-62): new Tun device (/org/freedesktop/NetworkManager/Devices/176)
Nov 29 07:22:13 compute-0 kernel: tap0b0600de-62: entered promiscuous mode
Nov 29 07:22:13 compute-0 ovn_controller[95281]: 2025-11-29T07:22:13Z|00323|binding|INFO|Claiming lport 0b0600de-624c-4784-aeea-87a27d98f344 for this chassis.
Nov 29 07:22:13 compute-0 ovn_controller[95281]: 2025-11-29T07:22:13Z|00324|binding|INFO|0b0600de-624c-4784-aeea-87a27d98f344: Claiming fa:16:3e:b0:47:d2 10.100.0.13 2001:db8::f816:3eff:feb0:47d2
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.930 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.937 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:13 compute-0 systemd-udevd[232662]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:22:13 compute-0 NetworkManager[55227]: <info>  [1764400933.9708] device (tap0b0600de-62): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:22:13 compute-0 NetworkManager[55227]: <info>  [1764400933.9716] device (tap0b0600de-62): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:22:13 compute-0 systemd-machined[153486]: New machine qemu-43-instance-00000073.
Nov 29 07:22:13 compute-0 podman[232615]: 2025-11-29 07:22:13.992904146 +0000 UTC m=+0.084135127 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, config_id=edpm, name=ubi9-minimal, architecture=x86_64, io.buildah.version=1.33.7, io.openshift.expose-services=, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.995 187189 DEBUG oslo_concurrency.processutils [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/729b50e7-7084-4f08-90ca-48a841f50cc9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpijrdkg5g" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:22:13 compute-0 systemd[1]: Started Virtual Machine qemu-43-instance-00000073.
Nov 29 07:22:13 compute-0 nova_compute[187185]: 2025-11-29 07:22:13.998 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:14 compute-0 podman[232618]: 2025-11-29 07:22:14.007863548 +0000 UTC m=+0.094303333 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 07:22:14 compute-0 ovn_controller[95281]: 2025-11-29T07:22:14Z|00325|binding|INFO|Setting lport 0b0600de-624c-4784-aeea-87a27d98f344 ovn-installed in OVS
Nov 29 07:22:14 compute-0 nova_compute[187185]: 2025-11-29 07:22:14.009 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:14 compute-0 podman[232613]: 2025-11-29 07:22:14.014639259 +0000 UTC m=+0.113063623 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125)
Nov 29 07:22:14 compute-0 kernel: tap81aef0aa-c8: entered promiscuous mode
Nov 29 07:22:14 compute-0 NetworkManager[55227]: <info>  [1764400934.0707] manager: (tap81aef0aa-c8): new Tun device (/org/freedesktop/NetworkManager/Devices/177)
Nov 29 07:22:14 compute-0 nova_compute[187185]: 2025-11-29 07:22:14.072 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:14 compute-0 ovn_controller[95281]: 2025-11-29T07:22:14Z|00326|if_status|INFO|Not updating pb chassis for 81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f now as sb is readonly
Nov 29 07:22:14 compute-0 nova_compute[187185]: 2025-11-29 07:22:14.078 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:14 compute-0 NetworkManager[55227]: <info>  [1764400934.0867] device (tap81aef0aa-c8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:22:14 compute-0 NetworkManager[55227]: <info>  [1764400934.0877] device (tap81aef0aa-c8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:22:14 compute-0 nova_compute[187185]: 2025-11-29 07:22:14.088 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:14 compute-0 systemd-machined[153486]: New machine qemu-44-instance-00000074.
Nov 29 07:22:14 compute-0 nova_compute[187185]: 2025-11-29 07:22:14.123 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:22:14 compute-0 nova_compute[187185]: 2025-11-29 07:22:14.124 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5742MB free_disk=73.29401016235352GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:22:14 compute-0 nova_compute[187185]: 2025-11-29 07:22:14.124 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:22:14 compute-0 nova_compute[187185]: 2025-11-29 07:22:14.124 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:22:14 compute-0 systemd[1]: Started Virtual Machine qemu-44-instance-00000074.
Nov 29 07:22:14 compute-0 ovn_controller[95281]: 2025-11-29T07:22:14Z|00327|binding|INFO|Claiming lport 81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f for this chassis.
Nov 29 07:22:14 compute-0 ovn_controller[95281]: 2025-11-29T07:22:14Z|00328|binding|INFO|81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f: Claiming fa:16:3e:62:5e:11 10.100.0.8
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:22:14.184 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b0:47:d2 10.100.0.13 2001:db8::f816:3eff:feb0:47d2'], port_security=['fa:16:3e:b0:47:d2 10.100.0.13 2001:db8::f816:3eff:feb0:47d2'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28 2001:db8::f816:3eff:feb0:47d2/64', 'neutron:device_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f75dc671-4e0c-40f1-8afd-c16b5e416d95', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '17fd93d9-fafe-4a7d-9c01-ce54fbe8f760', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=944fc855-be48-4f5c-ba58-0898fe543a04, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=0b0600de-624c-4784-aeea-87a27d98f344) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:22:14.186 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 0b0600de-624c-4784-aeea-87a27d98f344 in datapath f75dc671-4e0c-40f1-8afd-c16b5e416d95 bound to our chassis
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:22:14.188 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f75dc671-4e0c-40f1-8afd-c16b5e416d95
Nov 29 07:22:14 compute-0 ovn_controller[95281]: 2025-11-29T07:22:14Z|00329|binding|INFO|Setting lport 0b0600de-624c-4784-aeea-87a27d98f344 up in Southbound
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:22:14.199 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:62:5e:11 10.100.0.8'], port_security=['fa:16:3e:62:5e:11 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '729b50e7-7084-4f08-90ca-48a841f50cc9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b58443a3-f575-4ff1-951d-e92781861793', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '980ddbfed54546c89c75e94503491a61', 'neutron:revision_number': '2', 'neutron:security_group_ids': '41411d2f-bfa5-47e9-8f9d-c921ac196944', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0ea1aa9c-a11a-4c3e-9a7b-dc58c9931652, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:22:14.201 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[9337002d-f4a9-446c-a4da-94fd466b99ef]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:22:14.202 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf75dc671-41 in ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:22:14.203 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf75dc671-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:22:14.203 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[08f918e8-1eac-412f-bbe7-b2a051e96d6f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:22:14.204 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[76a3f3f6-ae30-478f-8953-2d4ebbc293e8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:22:14.217 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[b4335172-809d-478c-a992-8381b570962a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:22:14.242 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e6a35e81-061a-4768-89f4-3ea07dbc78bc]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:22:14 compute-0 ovn_controller[95281]: 2025-11-29T07:22:14Z|00330|binding|INFO|Setting lport 81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f ovn-installed in OVS
Nov 29 07:22:14 compute-0 ovn_controller[95281]: 2025-11-29T07:22:14Z|00331|binding|INFO|Setting lport 81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f up in Southbound
Nov 29 07:22:14 compute-0 nova_compute[187185]: 2025-11-29 07:22:14.244 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:22:14.276 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[91d83894-e281-44ae-80f2-056c13a7a98d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:22:14 compute-0 NetworkManager[55227]: <info>  [1764400934.2843] manager: (tapf75dc671-40): new Veth device (/org/freedesktop/NetworkManager/Devices/178)
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:22:14.286 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6e857b3c-2926-4c7f-a34d-2ce928252d41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:22:14 compute-0 nova_compute[187185]: 2025-11-29 07:22:14.306 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance 1be71451-9dcb-4882-afc5-fa2b37f3fa96 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 07:22:14 compute-0 nova_compute[187185]: 2025-11-29 07:22:14.306 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance 729b50e7-7084-4f08-90ca-48a841f50cc9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 07:22:14 compute-0 nova_compute[187185]: 2025-11-29 07:22:14.306 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:22:14 compute-0 nova_compute[187185]: 2025-11-29 07:22:14.307 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:22:14.321 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[6af3cead-95ca-48aa-8fe5-3cce7a089635]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:22:14.326 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[d06437e7-edbd-44e7-8670-ac3122523e44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:22:14 compute-0 NetworkManager[55227]: <info>  [1764400934.3496] device (tapf75dc671-40): carrier: link connected
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:22:14.355 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[487ce53f-97df-46b6-8b86-f9a89f0a407d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:22:14.371 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[0d140e87-26c8-438b-9510-8259386ffa59]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf75dc671-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:9b:10'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 107], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 638201, 'reachable_time': 37273, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232746, 'error': None, 'target': 'ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:22:14.389 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d3d40061-9203-4c3b-a3bc-409f2428532a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe36:9b10'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 638201, 'tstamp': 638201}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232747, 'error': None, 'target': 'ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:22:14.408 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[58fde23a-9363-42b5-992f-9b53a9ba5419]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf75dc671-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:9b:10'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 107], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 638201, 'reachable_time': 37273, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232748, 'error': None, 'target': 'ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:22:14 compute-0 nova_compute[187185]: 2025-11-29 07:22:14.423 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:22:14.439 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d3ece8b5-40a8-44cc-bb89-b71f9055ac4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:22:14 compute-0 nova_compute[187185]: 2025-11-29 07:22:14.492 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:22:14 compute-0 nova_compute[187185]: 2025-11-29 07:22:14.498 187189 DEBUG nova.compute.manager [req-c4bd4b86-647b-4387-a64d-3bd6daff04cc req-1bf2ede8-8437-43b4-b685-ded04eda1cb2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Received event network-vif-plugged-0b0600de-624c-4784-aeea-87a27d98f344 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:22:14.493 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[f2529fa2-6fff-4488-91ec-24c2dae4c9f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:22:14.500 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf75dc671-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:22:14.500 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:22:14.500 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf75dc671-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:22:14 compute-0 nova_compute[187185]: 2025-11-29 07:22:14.498 187189 DEBUG oslo_concurrency.lockutils [req-c4bd4b86-647b-4387-a64d-3bd6daff04cc req-1bf2ede8-8437-43b4-b685-ded04eda1cb2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "1be71451-9dcb-4882-afc5-fa2b37f3fa96-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:22:14 compute-0 nova_compute[187185]: 2025-11-29 07:22:14.501 187189 DEBUG oslo_concurrency.lockutils [req-c4bd4b86-647b-4387-a64d-3bd6daff04cc req-1bf2ede8-8437-43b4-b685-ded04eda1cb2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1be71451-9dcb-4882-afc5-fa2b37f3fa96-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:22:14 compute-0 NetworkManager[55227]: <info>  [1764400934.5041] manager: (tapf75dc671-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/179)
Nov 29 07:22:14 compute-0 nova_compute[187185]: 2025-11-29 07:22:14.504 187189 DEBUG oslo_concurrency.lockutils [req-c4bd4b86-647b-4387-a64d-3bd6daff04cc req-1bf2ede8-8437-43b4-b685-ded04eda1cb2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1be71451-9dcb-4882-afc5-fa2b37f3fa96-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:22:14 compute-0 kernel: tapf75dc671-40: entered promiscuous mode
Nov 29 07:22:14 compute-0 nova_compute[187185]: 2025-11-29 07:22:14.504 187189 DEBUG nova.compute.manager [req-c4bd4b86-647b-4387-a64d-3bd6daff04cc req-1bf2ede8-8437-43b4-b685-ded04eda1cb2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Processing event network-vif-plugged-0b0600de-624c-4784-aeea-87a27d98f344 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 07:22:14 compute-0 nova_compute[187185]: 2025-11-29 07:22:14.505 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:22:14.509 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf75dc671-40, col_values=(('external_ids', {'iface-id': '6897d2ce-b04d-4d85-9bb6-9da51e7d7f20'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:22:14 compute-0 ovn_controller[95281]: 2025-11-29T07:22:14Z|00332|binding|INFO|Releasing lport 6897d2ce-b04d-4d85-9bb6-9da51e7d7f20 from this chassis (sb_readonly=0)
Nov 29 07:22:14 compute-0 nova_compute[187185]: 2025-11-29 07:22:14.511 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:22:14.514 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f75dc671-4e0c-40f1-8afd-c16b5e416d95.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f75dc671-4e0c-40f1-8afd-c16b5e416d95.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:22:14 compute-0 nova_compute[187185]: 2025-11-29 07:22:14.521 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:22:14 compute-0 nova_compute[187185]: 2025-11-29 07:22:14.522 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.397s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:22:14.524 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[25ee0d9f-7f96-48a5-aae8-372a8b6f0e3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:22:14.526 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-f75dc671-4e0c-40f1-8afd-c16b5e416d95
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/f75dc671-4e0c-40f1-8afd-c16b5e416d95.pid.haproxy
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID f75dc671-4e0c-40f1-8afd-c16b5e416d95
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:22:14 compute-0 nova_compute[187185]: 2025-11-29 07:22:14.527 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:22:14.528 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95', 'env', 'PROCESS_TAG=haproxy-f75dc671-4e0c-40f1-8afd-c16b5e416d95', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f75dc671-4e0c-40f1-8afd-c16b5e416d95.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:22:14 compute-0 nova_compute[187185]: 2025-11-29 07:22:14.766 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400934.7658, 729b50e7-7084-4f08-90ca-48a841f50cc9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:22:14 compute-0 nova_compute[187185]: 2025-11-29 07:22:14.767 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] VM Started (Lifecycle Event)
Nov 29 07:22:14 compute-0 nova_compute[187185]: 2025-11-29 07:22:14.798 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:22:14 compute-0 nova_compute[187185]: 2025-11-29 07:22:14.803 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400934.7670798, 729b50e7-7084-4f08-90ca-48a841f50cc9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:22:14 compute-0 nova_compute[187185]: 2025-11-29 07:22:14.804 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] VM Paused (Lifecycle Event)
Nov 29 07:22:14 compute-0 nova_compute[187185]: 2025-11-29 07:22:14.840 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:22:14 compute-0 nova_compute[187185]: 2025-11-29 07:22:14.844 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:22:14 compute-0 podman[232787]: 2025-11-29 07:22:14.893317465 +0000 UTC m=+0.022320221 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:22:15 compute-0 nova_compute[187185]: 2025-11-29 07:22:15.031 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:22:15 compute-0 nova_compute[187185]: 2025-11-29 07:22:15.287 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:22:15.289 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=39, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=38) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:22:15 compute-0 nova_compute[187185]: 2025-11-29 07:22:15.421 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400935.42098, 1be71451-9dcb-4882-afc5-fa2b37f3fa96 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:22:15 compute-0 nova_compute[187185]: 2025-11-29 07:22:15.422 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] VM Started (Lifecycle Event)
Nov 29 07:22:15 compute-0 nova_compute[187185]: 2025-11-29 07:22:15.425 187189 DEBUG nova.compute.manager [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:22:15 compute-0 nova_compute[187185]: 2025-11-29 07:22:15.429 187189 DEBUG nova.virt.libvirt.driver [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 07:22:15 compute-0 nova_compute[187185]: 2025-11-29 07:22:15.433 187189 INFO nova.virt.libvirt.driver [-] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Instance spawned successfully.
Nov 29 07:22:15 compute-0 nova_compute[187185]: 2025-11-29 07:22:15.433 187189 DEBUG nova.virt.libvirt.driver [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 07:22:15 compute-0 nova_compute[187185]: 2025-11-29 07:22:15.517 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:22:15 compute-0 nova_compute[187185]: 2025-11-29 07:22:15.518 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:22:15 compute-0 nova_compute[187185]: 2025-11-29 07:22:15.518 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:22:15 compute-0 podman[232787]: 2025-11-29 07:22:15.538266814 +0000 UTC m=+0.667269560 container create 45435afbfd6fea5f31eeb43dbfebbf120bf5e5419cc0e203aecb1e450dc74bf5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 07:22:15 compute-0 systemd[1]: Started libpod-conmon-45435afbfd6fea5f31eeb43dbfebbf120bf5e5419cc0e203aecb1e450dc74bf5.scope.
Nov 29 07:22:15 compute-0 nova_compute[187185]: 2025-11-29 07:22:15.625 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:22:15 compute-0 nova_compute[187185]: 2025-11-29 07:22:15.632 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:22:15 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:22:15 compute-0 nova_compute[187185]: 2025-11-29 07:22:15.636 187189 DEBUG nova.virt.libvirt.driver [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:22:15 compute-0 nova_compute[187185]: 2025-11-29 07:22:15.636 187189 DEBUG nova.virt.libvirt.driver [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:22:15 compute-0 nova_compute[187185]: 2025-11-29 07:22:15.637 187189 DEBUG nova.virt.libvirt.driver [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:22:15 compute-0 nova_compute[187185]: 2025-11-29 07:22:15.637 187189 DEBUG nova.virt.libvirt.driver [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:22:15 compute-0 nova_compute[187185]: 2025-11-29 07:22:15.638 187189 DEBUG nova.virt.libvirt.driver [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:22:15 compute-0 nova_compute[187185]: 2025-11-29 07:22:15.638 187189 DEBUG nova.virt.libvirt.driver [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:22:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/84478e0d93e119f51dbf82e39562f63d97cc070d4c52c626eaf8a590132d34d8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:22:15 compute-0 podman[232787]: 2025-11-29 07:22:15.66460246 +0000 UTC m=+0.793605226 container init 45435afbfd6fea5f31eeb43dbfebbf120bf5e5419cc0e203aecb1e450dc74bf5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Nov 29 07:22:15 compute-0 nova_compute[187185]: 2025-11-29 07:22:15.673 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:22:15 compute-0 nova_compute[187185]: 2025-11-29 07:22:15.674 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400935.4224946, 1be71451-9dcb-4882-afc5-fa2b37f3fa96 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:22:15 compute-0 nova_compute[187185]: 2025-11-29 07:22:15.674 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] VM Paused (Lifecycle Event)
Nov 29 07:22:15 compute-0 podman[232787]: 2025-11-29 07:22:15.678664497 +0000 UTC m=+0.807667233 container start 45435afbfd6fea5f31eeb43dbfebbf120bf5e5419cc0e203aecb1e450dc74bf5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 29 07:22:15 compute-0 nova_compute[187185]: 2025-11-29 07:22:15.686 187189 DEBUG nova.network.neutron [req-02d6993a-72ec-40ee-b95f-ebcad918921d req-ec163bcd-c4d4-4d0f-b412-433ef01f2eb7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Updated VIF entry in instance network info cache for port 0b0600de-624c-4784-aeea-87a27d98f344. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:22:15 compute-0 nova_compute[187185]: 2025-11-29 07:22:15.687 187189 DEBUG nova.network.neutron [req-02d6993a-72ec-40ee-b95f-ebcad918921d req-ec163bcd-c4d4-4d0f-b412-433ef01f2eb7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Updating instance_info_cache with network_info: [{"id": "0b0600de-624c-4784-aeea-87a27d98f344", "address": "fa:16:3e:b0:47:d2", "network": {"id": "f75dc671-4e0c-40f1-8afd-c16b5e416d95", "bridge": "br-int", "label": "tempest-network-smoke--588217173", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb0:47d2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b0600de-62", "ovs_interfaceid": "0b0600de-624c-4784-aeea-87a27d98f344", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:22:15 compute-0 neutron-haproxy-ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95[232809]: [NOTICE]   (232814) : New worker (232816) forked
Nov 29 07:22:15 compute-0 neutron-haproxy-ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95[232809]: [NOTICE]   (232814) : Loading success.
Nov 29 07:22:15 compute-0 nova_compute[187185]: 2025-11-29 07:22:15.716 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:22:15 compute-0 nova_compute[187185]: 2025-11-29 07:22:15.718 187189 DEBUG oslo_concurrency.lockutils [req-02d6993a-72ec-40ee-b95f-ebcad918921d req-ec163bcd-c4d4-4d0f-b412-433ef01f2eb7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-1be71451-9dcb-4882-afc5-fa2b37f3fa96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:22:15 compute-0 nova_compute[187185]: 2025-11-29 07:22:15.722 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400935.4280684, 1be71451-9dcb-4882-afc5-fa2b37f3fa96 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:22:15 compute-0 nova_compute[187185]: 2025-11-29 07:22:15.722 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] VM Resumed (Lifecycle Event)
Nov 29 07:22:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:22:15.742 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f in datapath b58443a3-f575-4ff1-951d-e92781861793 unbound from our chassis
Nov 29 07:22:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:22:15.744 104254 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network b58443a3-f575-4ff1-951d-e92781861793 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 29 07:22:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:22:15.745 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6ef15c1e-0e3e-4f92-a7f9-e4d35594e8ec]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:22:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:22:15.746 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:22:15 compute-0 nova_compute[187185]: 2025-11-29 07:22:15.749 187189 INFO nova.compute.manager [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Took 18.03 seconds to spawn the instance on the hypervisor.
Nov 29 07:22:15 compute-0 nova_compute[187185]: 2025-11-29 07:22:15.750 187189 DEBUG nova.compute.manager [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:22:15 compute-0 nova_compute[187185]: 2025-11-29 07:22:15.751 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:22:15 compute-0 nova_compute[187185]: 2025-11-29 07:22:15.758 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:22:15 compute-0 nova_compute[187185]: 2025-11-29 07:22:15.790 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:22:15 compute-0 nova_compute[187185]: 2025-11-29 07:22:15.845 187189 INFO nova.compute.manager [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Took 18.87 seconds to build instance.
Nov 29 07:22:15 compute-0 nova_compute[187185]: 2025-11-29 07:22:15.865 187189 DEBUG oslo_concurrency.lockutils [None req-1403f8e9-07ff-4622-9e53-c2a7d92dc293 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "1be71451-9dcb-4882-afc5-fa2b37f3fa96" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.091s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:22:15 compute-0 nova_compute[187185]: 2025-11-29 07:22:15.982 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:16 compute-0 nova_compute[187185]: 2025-11-29 07:22:16.107 187189 DEBUG nova.network.neutron [req-9183a757-d5c8-4584-8d34-a45e494e9a08 req-43ffab80-e7cc-46b5-aeb3-7041787fcb11 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Updated VIF entry in instance network info cache for port 81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:22:16 compute-0 nova_compute[187185]: 2025-11-29 07:22:16.108 187189 DEBUG nova.network.neutron [req-9183a757-d5c8-4584-8d34-a45e494e9a08 req-43ffab80-e7cc-46b5-aeb3-7041787fcb11 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Updating instance_info_cache with network_info: [{"id": "81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f", "address": "fa:16:3e:62:5e:11", "network": {"id": "b58443a3-f575-4ff1-951d-e92781861793", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1930314854-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "980ddbfed54546c89c75e94503491a61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81aef0aa-c8", "ovs_interfaceid": "81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:22:16 compute-0 nova_compute[187185]: 2025-11-29 07:22:16.127 187189 DEBUG oslo_concurrency.lockutils [req-9183a757-d5c8-4584-8d34-a45e494e9a08 req-43ffab80-e7cc-46b5-aeb3-7041787fcb11 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-729b50e7-7084-4f08-90ca-48a841f50cc9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:22:16 compute-0 nova_compute[187185]: 2025-11-29 07:22:16.681 187189 DEBUG nova.compute.manager [req-336207d7-bbe5-4a2a-9340-a67275657bb6 req-3a90c66b-1d0f-4735-8958-80d136504ec8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Received event network-vif-plugged-0b0600de-624c-4784-aeea-87a27d98f344 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:22:16 compute-0 nova_compute[187185]: 2025-11-29 07:22:16.682 187189 DEBUG oslo_concurrency.lockutils [req-336207d7-bbe5-4a2a-9340-a67275657bb6 req-3a90c66b-1d0f-4735-8958-80d136504ec8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "1be71451-9dcb-4882-afc5-fa2b37f3fa96-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:22:16 compute-0 nova_compute[187185]: 2025-11-29 07:22:16.683 187189 DEBUG oslo_concurrency.lockutils [req-336207d7-bbe5-4a2a-9340-a67275657bb6 req-3a90c66b-1d0f-4735-8958-80d136504ec8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1be71451-9dcb-4882-afc5-fa2b37f3fa96-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:22:16 compute-0 nova_compute[187185]: 2025-11-29 07:22:16.683 187189 DEBUG oslo_concurrency.lockutils [req-336207d7-bbe5-4a2a-9340-a67275657bb6 req-3a90c66b-1d0f-4735-8958-80d136504ec8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1be71451-9dcb-4882-afc5-fa2b37f3fa96-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:22:16 compute-0 nova_compute[187185]: 2025-11-29 07:22:16.684 187189 DEBUG nova.compute.manager [req-336207d7-bbe5-4a2a-9340-a67275657bb6 req-3a90c66b-1d0f-4735-8958-80d136504ec8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] No waiting events found dispatching network-vif-plugged-0b0600de-624c-4784-aeea-87a27d98f344 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:22:16 compute-0 nova_compute[187185]: 2025-11-29 07:22:16.684 187189 WARNING nova.compute.manager [req-336207d7-bbe5-4a2a-9340-a67275657bb6 req-3a90c66b-1d0f-4735-8958-80d136504ec8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Received unexpected event network-vif-plugged-0b0600de-624c-4784-aeea-87a27d98f344 for instance with vm_state active and task_state None.
Nov 29 07:22:16 compute-0 nova_compute[187185]: 2025-11-29 07:22:16.684 187189 DEBUG nova.compute.manager [req-336207d7-bbe5-4a2a-9340-a67275657bb6 req-3a90c66b-1d0f-4735-8958-80d136504ec8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Received event network-vif-plugged-81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:22:16 compute-0 nova_compute[187185]: 2025-11-29 07:22:16.685 187189 DEBUG oslo_concurrency.lockutils [req-336207d7-bbe5-4a2a-9340-a67275657bb6 req-3a90c66b-1d0f-4735-8958-80d136504ec8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "729b50e7-7084-4f08-90ca-48a841f50cc9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:22:16 compute-0 nova_compute[187185]: 2025-11-29 07:22:16.685 187189 DEBUG oslo_concurrency.lockutils [req-336207d7-bbe5-4a2a-9340-a67275657bb6 req-3a90c66b-1d0f-4735-8958-80d136504ec8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "729b50e7-7084-4f08-90ca-48a841f50cc9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:22:16 compute-0 nova_compute[187185]: 2025-11-29 07:22:16.685 187189 DEBUG oslo_concurrency.lockutils [req-336207d7-bbe5-4a2a-9340-a67275657bb6 req-3a90c66b-1d0f-4735-8958-80d136504ec8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "729b50e7-7084-4f08-90ca-48a841f50cc9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:22:16 compute-0 nova_compute[187185]: 2025-11-29 07:22:16.686 187189 DEBUG nova.compute.manager [req-336207d7-bbe5-4a2a-9340-a67275657bb6 req-3a90c66b-1d0f-4735-8958-80d136504ec8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Processing event network-vif-plugged-81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 07:22:16 compute-0 nova_compute[187185]: 2025-11-29 07:22:16.686 187189 DEBUG nova.compute.manager [req-336207d7-bbe5-4a2a-9340-a67275657bb6 req-3a90c66b-1d0f-4735-8958-80d136504ec8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Received event network-vif-plugged-81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:22:16 compute-0 nova_compute[187185]: 2025-11-29 07:22:16.686 187189 DEBUG oslo_concurrency.lockutils [req-336207d7-bbe5-4a2a-9340-a67275657bb6 req-3a90c66b-1d0f-4735-8958-80d136504ec8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "729b50e7-7084-4f08-90ca-48a841f50cc9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:22:16 compute-0 nova_compute[187185]: 2025-11-29 07:22:16.687 187189 DEBUG oslo_concurrency.lockutils [req-336207d7-bbe5-4a2a-9340-a67275657bb6 req-3a90c66b-1d0f-4735-8958-80d136504ec8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "729b50e7-7084-4f08-90ca-48a841f50cc9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:22:16 compute-0 nova_compute[187185]: 2025-11-29 07:22:16.687 187189 DEBUG oslo_concurrency.lockutils [req-336207d7-bbe5-4a2a-9340-a67275657bb6 req-3a90c66b-1d0f-4735-8958-80d136504ec8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "729b50e7-7084-4f08-90ca-48a841f50cc9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:22:16 compute-0 nova_compute[187185]: 2025-11-29 07:22:16.687 187189 DEBUG nova.compute.manager [req-336207d7-bbe5-4a2a-9340-a67275657bb6 req-3a90c66b-1d0f-4735-8958-80d136504ec8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] No waiting events found dispatching network-vif-plugged-81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:22:16 compute-0 nova_compute[187185]: 2025-11-29 07:22:16.688 187189 WARNING nova.compute.manager [req-336207d7-bbe5-4a2a-9340-a67275657bb6 req-3a90c66b-1d0f-4735-8958-80d136504ec8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Received unexpected event network-vif-plugged-81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f for instance with vm_state building and task_state spawning.
Nov 29 07:22:16 compute-0 nova_compute[187185]: 2025-11-29 07:22:16.688 187189 DEBUG nova.compute.manager [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:22:16 compute-0 nova_compute[187185]: 2025-11-29 07:22:16.692 187189 DEBUG nova.virt.libvirt.driver [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 07:22:16 compute-0 nova_compute[187185]: 2025-11-29 07:22:16.695 187189 INFO nova.virt.libvirt.driver [-] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Instance spawned successfully.
Nov 29 07:22:16 compute-0 nova_compute[187185]: 2025-11-29 07:22:16.696 187189 DEBUG nova.virt.libvirt.driver [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 07:22:16 compute-0 nova_compute[187185]: 2025-11-29 07:22:16.698 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400936.697485, 729b50e7-7084-4f08-90ca-48a841f50cc9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:22:16 compute-0 nova_compute[187185]: 2025-11-29 07:22:16.698 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] VM Resumed (Lifecycle Event)
Nov 29 07:22:16 compute-0 nova_compute[187185]: 2025-11-29 07:22:16.720 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:22:16 compute-0 nova_compute[187185]: 2025-11-29 07:22:16.725 187189 DEBUG nova.virt.libvirt.driver [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:22:16 compute-0 nova_compute[187185]: 2025-11-29 07:22:16.725 187189 DEBUG nova.virt.libvirt.driver [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:22:16 compute-0 nova_compute[187185]: 2025-11-29 07:22:16.726 187189 DEBUG nova.virt.libvirt.driver [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:22:16 compute-0 nova_compute[187185]: 2025-11-29 07:22:16.727 187189 DEBUG nova.virt.libvirt.driver [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:22:16 compute-0 nova_compute[187185]: 2025-11-29 07:22:16.727 187189 DEBUG nova.virt.libvirt.driver [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:22:16 compute-0 nova_compute[187185]: 2025-11-29 07:22:16.728 187189 DEBUG nova.virt.libvirt.driver [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:22:16 compute-0 nova_compute[187185]: 2025-11-29 07:22:16.731 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:22:16 compute-0 nova_compute[187185]: 2025-11-29 07:22:16.788 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:22:16 compute-0 nova_compute[187185]: 2025-11-29 07:22:16.820 187189 INFO nova.compute.manager [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Took 17.56 seconds to spawn the instance on the hypervisor.
Nov 29 07:22:16 compute-0 nova_compute[187185]: 2025-11-29 07:22:16.820 187189 DEBUG nova.compute.manager [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:22:16 compute-0 nova_compute[187185]: 2025-11-29 07:22:16.919 187189 INFO nova.compute.manager [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Took 18.86 seconds to build instance.
Nov 29 07:22:16 compute-0 nova_compute[187185]: 2025-11-29 07:22:16.941 187189 DEBUG oslo_concurrency.lockutils [None req-4564de3d-2752-4ad6-8cec-7ea8afcc5144 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lock "729b50e7-7084-4f08-90ca-48a841f50cc9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.080s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:22:17 compute-0 nova_compute[187185]: 2025-11-29 07:22:17.953 187189 INFO nova.compute.manager [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Rescuing
Nov 29 07:22:17 compute-0 nova_compute[187185]: 2025-11-29 07:22:17.954 187189 DEBUG oslo_concurrency.lockutils [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Acquiring lock "refresh_cache-729b50e7-7084-4f08-90ca-48a841f50cc9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:22:17 compute-0 nova_compute[187185]: 2025-11-29 07:22:17.955 187189 DEBUG oslo_concurrency.lockutils [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Acquired lock "refresh_cache-729b50e7-7084-4f08-90ca-48a841f50cc9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:22:17 compute-0 nova_compute[187185]: 2025-11-29 07:22:17.955 187189 DEBUG nova.network.neutron [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:22:18 compute-0 nova_compute[187185]: 2025-11-29 07:22:18.310 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:22:18 compute-0 nova_compute[187185]: 2025-11-29 07:22:18.332 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:20 compute-0 nova_compute[187185]: 2025-11-29 07:22:20.985 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:21 compute-0 nova_compute[187185]: 2025-11-29 07:22:21.493 187189 DEBUG nova.network.neutron [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Updating instance_info_cache with network_info: [{"id": "81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f", "address": "fa:16:3e:62:5e:11", "network": {"id": "b58443a3-f575-4ff1-951d-e92781861793", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1930314854-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "980ddbfed54546c89c75e94503491a61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81aef0aa-c8", "ovs_interfaceid": "81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:22:21 compute-0 nova_compute[187185]: 2025-11-29 07:22:21.761 187189 DEBUG oslo_concurrency.lockutils [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Releasing lock "refresh_cache-729b50e7-7084-4f08-90ca-48a841f50cc9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:22:21 compute-0 NetworkManager[55227]: <info>  [1764400941.7789] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/180)
Nov 29 07:22:21 compute-0 NetworkManager[55227]: <info>  [1764400941.7798] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/181)
Nov 29 07:22:21 compute-0 nova_compute[187185]: 2025-11-29 07:22:21.780 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:21 compute-0 nova_compute[187185]: 2025-11-29 07:22:21.918 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:21 compute-0 ovn_controller[95281]: 2025-11-29T07:22:21Z|00333|binding|INFO|Releasing lport 6897d2ce-b04d-4d85-9bb6-9da51e7d7f20 from this chassis (sb_readonly=0)
Nov 29 07:22:21 compute-0 nova_compute[187185]: 2025-11-29 07:22:21.965 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:22 compute-0 nova_compute[187185]: 2025-11-29 07:22:22.421 187189 DEBUG nova.virt.libvirt.driver [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Nov 29 07:22:23 compute-0 nova_compute[187185]: 2025-11-29 07:22:23.382 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:23 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:22:23.750 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '39'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:22:24 compute-0 nova_compute[187185]: 2025-11-29 07:22:24.739 187189 DEBUG nova.compute.manager [req-95484f93-3cb8-4e40-b88a-d33f882c3013 req-cfc8e04e-5588-4ccb-a6e6-09dfe5edb21a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Received event network-changed-0b0600de-624c-4784-aeea-87a27d98f344 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:22:24 compute-0 nova_compute[187185]: 2025-11-29 07:22:24.739 187189 DEBUG nova.compute.manager [req-95484f93-3cb8-4e40-b88a-d33f882c3013 req-cfc8e04e-5588-4ccb-a6e6-09dfe5edb21a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Refreshing instance network info cache due to event network-changed-0b0600de-624c-4784-aeea-87a27d98f344. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:22:24 compute-0 nova_compute[187185]: 2025-11-29 07:22:24.739 187189 DEBUG oslo_concurrency.lockutils [req-95484f93-3cb8-4e40-b88a-d33f882c3013 req-cfc8e04e-5588-4ccb-a6e6-09dfe5edb21a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-1be71451-9dcb-4882-afc5-fa2b37f3fa96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:22:24 compute-0 nova_compute[187185]: 2025-11-29 07:22:24.740 187189 DEBUG oslo_concurrency.lockutils [req-95484f93-3cb8-4e40-b88a-d33f882c3013 req-cfc8e04e-5588-4ccb-a6e6-09dfe5edb21a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-1be71451-9dcb-4882-afc5-fa2b37f3fa96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:22:24 compute-0 nova_compute[187185]: 2025-11-29 07:22:24.740 187189 DEBUG nova.network.neutron [req-95484f93-3cb8-4e40-b88a-d33f882c3013 req-cfc8e04e-5588-4ccb-a6e6-09dfe5edb21a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Refreshing network info cache for port 0b0600de-624c-4784-aeea-87a27d98f344 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:22:24 compute-0 podman[232826]: 2025-11-29 07:22:24.842547512 +0000 UTC m=+0.108050692 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:22:25 compute-0 nova_compute[187185]: 2025-11-29 07:22:25.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:22:25 compute-0 nova_compute[187185]: 2025-11-29 07:22:25.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 29 07:22:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:22:25.511 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:22:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:22:25.512 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:22:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:22:25.513 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:22:25 compute-0 nova_compute[187185]: 2025-11-29 07:22:25.987 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:27 compute-0 nova_compute[187185]: 2025-11-29 07:22:27.246 187189 DEBUG nova.network.neutron [req-95484f93-3cb8-4e40-b88a-d33f882c3013 req-cfc8e04e-5588-4ccb-a6e6-09dfe5edb21a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Updated VIF entry in instance network info cache for port 0b0600de-624c-4784-aeea-87a27d98f344. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:22:27 compute-0 nova_compute[187185]: 2025-11-29 07:22:27.247 187189 DEBUG nova.network.neutron [req-95484f93-3cb8-4e40-b88a-d33f882c3013 req-cfc8e04e-5588-4ccb-a6e6-09dfe5edb21a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Updating instance_info_cache with network_info: [{"id": "0b0600de-624c-4784-aeea-87a27d98f344", "address": "fa:16:3e:b0:47:d2", "network": {"id": "f75dc671-4e0c-40f1-8afd-c16b5e416d95", "bridge": "br-int", "label": "tempest-network-smoke--588217173", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb0:47d2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b0600de-62", "ovs_interfaceid": "0b0600de-624c-4784-aeea-87a27d98f344", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:22:27 compute-0 nova_compute[187185]: 2025-11-29 07:22:27.338 187189 DEBUG oslo_concurrency.lockutils [req-95484f93-3cb8-4e40-b88a-d33f882c3013 req-cfc8e04e-5588-4ccb-a6e6-09dfe5edb21a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-1be71451-9dcb-4882-afc5-fa2b37f3fa96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:22:27 compute-0 ovn_controller[95281]: 2025-11-29T07:22:27Z|00042|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b0:47:d2 10.100.0.13
Nov 29 07:22:27 compute-0 ovn_controller[95281]: 2025-11-29T07:22:27Z|00043|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b0:47:d2 10.100.0.13
Nov 29 07:22:28 compute-0 nova_compute[187185]: 2025-11-29 07:22:28.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:22:28 compute-0 nova_compute[187185]: 2025-11-29 07:22:28.384 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:30 compute-0 nova_compute[187185]: 2025-11-29 07:22:30.989 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:31 compute-0 podman[232876]: 2025-11-29 07:22:31.784800284 +0000 UTC m=+0.056803235 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 07:22:32 compute-0 nova_compute[187185]: 2025-11-29 07:22:32.864 187189 DEBUG nova.virt.libvirt.driver [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Nov 29 07:22:33 compute-0 nova_compute[187185]: 2025-11-29 07:22:33.390 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:33 compute-0 podman[232902]: 2025-11-29 07:22:33.790292712 +0000 UTC m=+0.058525683 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 07:22:33 compute-0 podman[232903]: 2025-11-29 07:22:33.81181769 +0000 UTC m=+0.072414465 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:22:34 compute-0 nova_compute[187185]: 2025-11-29 07:22:34.337 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:22:35 compute-0 kernel: tap81aef0aa-c8 (unregistering): left promiscuous mode
Nov 29 07:22:35 compute-0 NetworkManager[55227]: <info>  [1764400955.1079] device (tap81aef0aa-c8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:22:35 compute-0 ovn_controller[95281]: 2025-11-29T07:22:35Z|00334|binding|INFO|Releasing lport 81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f from this chassis (sb_readonly=0)
Nov 29 07:22:35 compute-0 ovn_controller[95281]: 2025-11-29T07:22:35Z|00335|binding|INFO|Setting lport 81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f down in Southbound
Nov 29 07:22:35 compute-0 ovn_controller[95281]: 2025-11-29T07:22:35Z|00336|binding|INFO|Removing iface tap81aef0aa-c8 ovn-installed in OVS
Nov 29 07:22:35 compute-0 nova_compute[187185]: 2025-11-29 07:22:35.114 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:35 compute-0 nova_compute[187185]: 2025-11-29 07:22:35.116 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:35 compute-0 nova_compute[187185]: 2025-11-29 07:22:35.127 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:35 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:22:35.127 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:62:5e:11 10.100.0.8'], port_security=['fa:16:3e:62:5e:11 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '729b50e7-7084-4f08-90ca-48a841f50cc9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b58443a3-f575-4ff1-951d-e92781861793', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '980ddbfed54546c89c75e94503491a61', 'neutron:revision_number': '4', 'neutron:security_group_ids': '41411d2f-bfa5-47e9-8f9d-c921ac196944', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0ea1aa9c-a11a-4c3e-9a7b-dc58c9931652, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:22:35 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:22:35.130 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f in datapath b58443a3-f575-4ff1-951d-e92781861793 unbound from our chassis
Nov 29 07:22:35 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:22:35.132 104254 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network b58443a3-f575-4ff1-951d-e92781861793 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 29 07:22:35 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:22:35.134 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[12e63b09-5f23-4c5e-be0f-3847bb0cc30c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:22:35 compute-0 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d00000074.scope: Deactivated successfully.
Nov 29 07:22:35 compute-0 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d00000074.scope: Consumed 13.145s CPU time.
Nov 29 07:22:35 compute-0 systemd-machined[153486]: Machine qemu-44-instance-00000074 terminated.
Nov 29 07:22:35 compute-0 nova_compute[187185]: 2025-11-29 07:22:35.730 187189 DEBUG nova.compute.manager [req-0dc70712-a1c7-449f-80d1-7e0b8b5295ec req-247b13ac-ed8d-4eac-bb23-406f98b3bab4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Received event network-vif-unplugged-81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:22:35 compute-0 nova_compute[187185]: 2025-11-29 07:22:35.731 187189 DEBUG oslo_concurrency.lockutils [req-0dc70712-a1c7-449f-80d1-7e0b8b5295ec req-247b13ac-ed8d-4eac-bb23-406f98b3bab4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "729b50e7-7084-4f08-90ca-48a841f50cc9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:22:35 compute-0 nova_compute[187185]: 2025-11-29 07:22:35.731 187189 DEBUG oslo_concurrency.lockutils [req-0dc70712-a1c7-449f-80d1-7e0b8b5295ec req-247b13ac-ed8d-4eac-bb23-406f98b3bab4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "729b50e7-7084-4f08-90ca-48a841f50cc9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:22:35 compute-0 nova_compute[187185]: 2025-11-29 07:22:35.732 187189 DEBUG oslo_concurrency.lockutils [req-0dc70712-a1c7-449f-80d1-7e0b8b5295ec req-247b13ac-ed8d-4eac-bb23-406f98b3bab4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "729b50e7-7084-4f08-90ca-48a841f50cc9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:22:35 compute-0 nova_compute[187185]: 2025-11-29 07:22:35.732 187189 DEBUG nova.compute.manager [req-0dc70712-a1c7-449f-80d1-7e0b8b5295ec req-247b13ac-ed8d-4eac-bb23-406f98b3bab4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] No waiting events found dispatching network-vif-unplugged-81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:22:35 compute-0 nova_compute[187185]: 2025-11-29 07:22:35.732 187189 WARNING nova.compute.manager [req-0dc70712-a1c7-449f-80d1-7e0b8b5295ec req-247b13ac-ed8d-4eac-bb23-406f98b3bab4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Received unexpected event network-vif-unplugged-81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f for instance with vm_state active and task_state rescuing.
Nov 29 07:22:35 compute-0 nova_compute[187185]: 2025-11-29 07:22:35.879 187189 INFO nova.virt.libvirt.driver [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Instance shutdown successfully after 13 seconds.
Nov 29 07:22:35 compute-0 nova_compute[187185]: 2025-11-29 07:22:35.885 187189 INFO nova.virt.libvirt.driver [-] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Instance destroyed successfully.
Nov 29 07:22:35 compute-0 nova_compute[187185]: 2025-11-29 07:22:35.885 187189 DEBUG nova.objects.instance [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lazy-loading 'numa_topology' on Instance uuid 729b50e7-7084-4f08-90ca-48a841f50cc9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:22:35 compute-0 nova_compute[187185]: 2025-11-29 07:22:35.990 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:36 compute-0 nova_compute[187185]: 2025-11-29 07:22:36.799 187189 INFO nova.virt.libvirt.driver [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Attempting rescue
Nov 29 07:22:36 compute-0 nova_compute[187185]: 2025-11-29 07:22:36.800 187189 DEBUG nova.virt.libvirt.driver [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314
Nov 29 07:22:36 compute-0 nova_compute[187185]: 2025-11-29 07:22:36.805 187189 DEBUG nova.virt.libvirt.driver [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Nov 29 07:22:36 compute-0 nova_compute[187185]: 2025-11-29 07:22:36.805 187189 INFO nova.virt.libvirt.driver [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Creating image(s)
Nov 29 07:22:36 compute-0 nova_compute[187185]: 2025-11-29 07:22:36.806 187189 DEBUG oslo_concurrency.lockutils [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Acquiring lock "/var/lib/nova/instances/729b50e7-7084-4f08-90ca-48a841f50cc9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:22:36 compute-0 nova_compute[187185]: 2025-11-29 07:22:36.806 187189 DEBUG oslo_concurrency.lockutils [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lock "/var/lib/nova/instances/729b50e7-7084-4f08-90ca-48a841f50cc9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:22:36 compute-0 nova_compute[187185]: 2025-11-29 07:22:36.807 187189 DEBUG oslo_concurrency.lockutils [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lock "/var/lib/nova/instances/729b50e7-7084-4f08-90ca-48a841f50cc9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:22:36 compute-0 nova_compute[187185]: 2025-11-29 07:22:36.807 187189 DEBUG nova.objects.instance [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 729b50e7-7084-4f08-90ca-48a841f50cc9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:22:36 compute-0 nova_compute[187185]: 2025-11-29 07:22:36.863 187189 DEBUG oslo_concurrency.lockutils [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:22:36 compute-0 nova_compute[187185]: 2025-11-29 07:22:36.863 187189 DEBUG oslo_concurrency.lockutils [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:22:36 compute-0 nova_compute[187185]: 2025-11-29 07:22:36.878 187189 DEBUG oslo_concurrency.processutils [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:22:36 compute-0 nova_compute[187185]: 2025-11-29 07:22:36.943 187189 DEBUG oslo_concurrency.processutils [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:22:36 compute-0 nova_compute[187185]: 2025-11-29 07:22:36.945 187189 DEBUG oslo_concurrency.processutils [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/729b50e7-7084-4f08-90ca-48a841f50cc9/disk.rescue execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:22:36 compute-0 nova_compute[187185]: 2025-11-29 07:22:36.982 187189 DEBUG oslo_concurrency.processutils [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/729b50e7-7084-4f08-90ca-48a841f50cc9/disk.rescue" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:22:36 compute-0 nova_compute[187185]: 2025-11-29 07:22:36.983 187189 DEBUG oslo_concurrency.lockutils [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:22:36 compute-0 nova_compute[187185]: 2025-11-29 07:22:36.983 187189 DEBUG nova.objects.instance [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lazy-loading 'migration_context' on Instance uuid 729b50e7-7084-4f08-90ca-48a841f50cc9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:22:37 compute-0 nova_compute[187185]: 2025-11-29 07:22:36.999 187189 DEBUG nova.virt.libvirt.driver [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 07:22:37 compute-0 nova_compute[187185]: 2025-11-29 07:22:37.000 187189 DEBUG nova.virt.libvirt.driver [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Start _get_guest_xml network_info=[{"id": "81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f", "address": "fa:16:3e:62:5e:11", "network": {"id": "b58443a3-f575-4ff1-951d-e92781861793", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1930314854-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-1930314854-network", "vif_mac": "fa:16:3e:62:5e:11"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "980ddbfed54546c89c75e94503491a61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81aef0aa-c8", "ovs_interfaceid": "81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:22:37 compute-0 nova_compute[187185]: 2025-11-29 07:22:37.001 187189 DEBUG nova.objects.instance [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lazy-loading 'resources' on Instance uuid 729b50e7-7084-4f08-90ca-48a841f50cc9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:22:37 compute-0 nova_compute[187185]: 2025-11-29 07:22:37.020 187189 WARNING nova.virt.libvirt.driver [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:22:37 compute-0 nova_compute[187185]: 2025-11-29 07:22:37.031 187189 DEBUG nova.virt.libvirt.host [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:22:37 compute-0 nova_compute[187185]: 2025-11-29 07:22:37.032 187189 DEBUG nova.virt.libvirt.host [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:22:37 compute-0 nova_compute[187185]: 2025-11-29 07:22:37.035 187189 DEBUG nova.virt.libvirt.host [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:22:37 compute-0 nova_compute[187185]: 2025-11-29 07:22:37.036 187189 DEBUG nova.virt.libvirt.host [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:22:37 compute-0 nova_compute[187185]: 2025-11-29 07:22:37.037 187189 DEBUG nova.virt.libvirt.driver [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:22:37 compute-0 nova_compute[187185]: 2025-11-29 07:22:37.037 187189 DEBUG nova.virt.hardware [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:22:37 compute-0 nova_compute[187185]: 2025-11-29 07:22:37.037 187189 DEBUG nova.virt.hardware [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:22:37 compute-0 nova_compute[187185]: 2025-11-29 07:22:37.038 187189 DEBUG nova.virt.hardware [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:22:37 compute-0 nova_compute[187185]: 2025-11-29 07:22:37.038 187189 DEBUG nova.virt.hardware [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:22:37 compute-0 nova_compute[187185]: 2025-11-29 07:22:37.038 187189 DEBUG nova.virt.hardware [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:22:37 compute-0 nova_compute[187185]: 2025-11-29 07:22:37.038 187189 DEBUG nova.virt.hardware [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:22:37 compute-0 nova_compute[187185]: 2025-11-29 07:22:37.038 187189 DEBUG nova.virt.hardware [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:22:37 compute-0 nova_compute[187185]: 2025-11-29 07:22:37.039 187189 DEBUG nova.virt.hardware [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:22:37 compute-0 nova_compute[187185]: 2025-11-29 07:22:37.039 187189 DEBUG nova.virt.hardware [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:22:37 compute-0 nova_compute[187185]: 2025-11-29 07:22:37.039 187189 DEBUG nova.virt.hardware [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:22:37 compute-0 nova_compute[187185]: 2025-11-29 07:22:37.039 187189 DEBUG nova.virt.hardware [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:22:37 compute-0 nova_compute[187185]: 2025-11-29 07:22:37.039 187189 DEBUG nova.objects.instance [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 729b50e7-7084-4f08-90ca-48a841f50cc9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:22:37 compute-0 nova_compute[187185]: 2025-11-29 07:22:37.069 187189 DEBUG nova.virt.libvirt.vif [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:21:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-1961507248',display_name='tempest-ServerRescueTestJSON-server-1961507248',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-1961507248',id=116,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:22:16Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='980ddbfed54546c89c75e94503491a61',ramdisk_id='',reservation_id='r-ok5v6h99',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_
disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSON-1854570869',owner_user_name='tempest-ServerRescueTestJSON-1854570869-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:22:16Z,user_data=None,user_id='a992c32ce5fb4cbab645023852f14adc',uuid=729b50e7-7084-4f08-90ca-48a841f50cc9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f", "address": "fa:16:3e:62:5e:11", "network": {"id": "b58443a3-f575-4ff1-951d-e92781861793", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1930314854-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-1930314854-network", "vif_mac": "fa:16:3e:62:5e:11"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "980ddbfed54546c89c75e94503491a61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81aef0aa-c8", "ovs_interfaceid": "81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:22:37 compute-0 nova_compute[187185]: 2025-11-29 07:22:37.069 187189 DEBUG nova.network.os_vif_util [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Converting VIF {"id": "81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f", "address": "fa:16:3e:62:5e:11", "network": {"id": "b58443a3-f575-4ff1-951d-e92781861793", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1930314854-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-1930314854-network", "vif_mac": "fa:16:3e:62:5e:11"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "980ddbfed54546c89c75e94503491a61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81aef0aa-c8", "ovs_interfaceid": "81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:22:37 compute-0 nova_compute[187185]: 2025-11-29 07:22:37.070 187189 DEBUG nova.network.os_vif_util [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:62:5e:11,bridge_name='br-int',has_traffic_filtering=True,id=81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f,network=Network(b58443a3-f575-4ff1-951d-e92781861793),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81aef0aa-c8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:22:37 compute-0 nova_compute[187185]: 2025-11-29 07:22:37.071 187189 DEBUG nova.objects.instance [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lazy-loading 'pci_devices' on Instance uuid 729b50e7-7084-4f08-90ca-48a841f50cc9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:22:37 compute-0 nova_compute[187185]: 2025-11-29 07:22:37.086 187189 DEBUG nova.virt.libvirt.driver [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:22:37 compute-0 nova_compute[187185]:   <uuid>729b50e7-7084-4f08-90ca-48a841f50cc9</uuid>
Nov 29 07:22:37 compute-0 nova_compute[187185]:   <name>instance-00000074</name>
Nov 29 07:22:37 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 07:22:37 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:22:37 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:22:37 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:       <nova:name>tempest-ServerRescueTestJSON-server-1961507248</nova:name>
Nov 29 07:22:37 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:22:37</nova:creationTime>
Nov 29 07:22:37 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 07:22:37 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 07:22:37 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:22:37 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:22:37 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:22:37 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:22:37 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:22:37 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:22:37 compute-0 nova_compute[187185]:         <nova:user uuid="a992c32ce5fb4cbab645023852f14adc">tempest-ServerRescueTestJSON-1854570869-project-member</nova:user>
Nov 29 07:22:37 compute-0 nova_compute[187185]:         <nova:project uuid="980ddbfed54546c89c75e94503491a61">tempest-ServerRescueTestJSON-1854570869</nova:project>
Nov 29 07:22:37 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:22:37 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 07:22:37 compute-0 nova_compute[187185]:         <nova:port uuid="81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f">
Nov 29 07:22:37 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:22:37 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:22:37 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:22:37 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <system>
Nov 29 07:22:37 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:22:37 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:22:37 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:22:37 compute-0 nova_compute[187185]:       <entry name="serial">729b50e7-7084-4f08-90ca-48a841f50cc9</entry>
Nov 29 07:22:37 compute-0 nova_compute[187185]:       <entry name="uuid">729b50e7-7084-4f08-90ca-48a841f50cc9</entry>
Nov 29 07:22:37 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     </system>
Nov 29 07:22:37 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:22:37 compute-0 nova_compute[187185]:   <os>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:   </os>
Nov 29 07:22:37 compute-0 nova_compute[187185]:   <features>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:   </features>
Nov 29 07:22:37 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:22:37 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:22:37 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:22:37 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/729b50e7-7084-4f08-90ca-48a841f50cc9/disk.rescue"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:22:37 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/729b50e7-7084-4f08-90ca-48a841f50cc9/disk"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:       <target dev="vdb" bus="virtio"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:22:37 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/729b50e7-7084-4f08-90ca-48a841f50cc9/disk.config.rescue"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:22:37 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:62:5e:11"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:       <target dev="tap81aef0aa-c8"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:22:37 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/729b50e7-7084-4f08-90ca-48a841f50cc9/console.log" append="off"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <video>
Nov 29 07:22:37 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     </video>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:22:37 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:22:37 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:22:37 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:22:37 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:22:37 compute-0 nova_compute[187185]: </domain>
Nov 29 07:22:37 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:22:37 compute-0 nova_compute[187185]: 2025-11-29 07:22:37.095 187189 INFO nova.virt.libvirt.driver [-] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Instance destroyed successfully.
Nov 29 07:22:37 compute-0 nova_compute[187185]: 2025-11-29 07:22:37.163 187189 DEBUG nova.virt.libvirt.driver [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:22:37 compute-0 nova_compute[187185]: 2025-11-29 07:22:37.163 187189 DEBUG nova.virt.libvirt.driver [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:22:37 compute-0 nova_compute[187185]: 2025-11-29 07:22:37.163 187189 DEBUG nova.virt.libvirt.driver [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:22:37 compute-0 nova_compute[187185]: 2025-11-29 07:22:37.164 187189 DEBUG nova.virt.libvirt.driver [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] No VIF found with MAC fa:16:3e:62:5e:11, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:22:37 compute-0 nova_compute[187185]: 2025-11-29 07:22:37.164 187189 INFO nova.virt.libvirt.driver [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Using config drive
Nov 29 07:22:37 compute-0 nova_compute[187185]: 2025-11-29 07:22:37.185 187189 DEBUG nova.objects.instance [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 729b50e7-7084-4f08-90ca-48a841f50cc9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:22:37 compute-0 nova_compute[187185]: 2025-11-29 07:22:37.226 187189 DEBUG nova.objects.instance [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lazy-loading 'keypairs' on Instance uuid 729b50e7-7084-4f08-90ca-48a841f50cc9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:22:37 compute-0 nova_compute[187185]: 2025-11-29 07:22:37.868 187189 DEBUG nova.compute.manager [req-a1d84c22-ee25-495e-bafc-cb15d1308954 req-4158c1c8-4d24-4eff-959b-29f6e504c43e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Received event network-vif-plugged-81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:22:37 compute-0 nova_compute[187185]: 2025-11-29 07:22:37.869 187189 DEBUG oslo_concurrency.lockutils [req-a1d84c22-ee25-495e-bafc-cb15d1308954 req-4158c1c8-4d24-4eff-959b-29f6e504c43e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "729b50e7-7084-4f08-90ca-48a841f50cc9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:22:37 compute-0 nova_compute[187185]: 2025-11-29 07:22:37.869 187189 DEBUG oslo_concurrency.lockutils [req-a1d84c22-ee25-495e-bafc-cb15d1308954 req-4158c1c8-4d24-4eff-959b-29f6e504c43e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "729b50e7-7084-4f08-90ca-48a841f50cc9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:22:37 compute-0 nova_compute[187185]: 2025-11-29 07:22:37.869 187189 DEBUG oslo_concurrency.lockutils [req-a1d84c22-ee25-495e-bafc-cb15d1308954 req-4158c1c8-4d24-4eff-959b-29f6e504c43e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "729b50e7-7084-4f08-90ca-48a841f50cc9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:22:37 compute-0 nova_compute[187185]: 2025-11-29 07:22:37.870 187189 DEBUG nova.compute.manager [req-a1d84c22-ee25-495e-bafc-cb15d1308954 req-4158c1c8-4d24-4eff-959b-29f6e504c43e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] No waiting events found dispatching network-vif-plugged-81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:22:37 compute-0 nova_compute[187185]: 2025-11-29 07:22:37.870 187189 WARNING nova.compute.manager [req-a1d84c22-ee25-495e-bafc-cb15d1308954 req-4158c1c8-4d24-4eff-959b-29f6e504c43e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Received unexpected event network-vif-plugged-81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f for instance with vm_state active and task_state rescuing.
Nov 29 07:22:38 compute-0 nova_compute[187185]: 2025-11-29 07:22:38.394 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:39 compute-0 nova_compute[187185]: 2025-11-29 07:22:39.269 187189 INFO nova.virt.libvirt.driver [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Creating config drive at /var/lib/nova/instances/729b50e7-7084-4f08-90ca-48a841f50cc9/disk.config.rescue
Nov 29 07:22:39 compute-0 nova_compute[187185]: 2025-11-29 07:22:39.275 187189 DEBUG oslo_concurrency.processutils [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/729b50e7-7084-4f08-90ca-48a841f50cc9/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjhq69qj3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:22:39 compute-0 nova_compute[187185]: 2025-11-29 07:22:39.409 187189 DEBUG oslo_concurrency.processutils [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/729b50e7-7084-4f08-90ca-48a841f50cc9/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjhq69qj3" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:22:39 compute-0 kernel: tap81aef0aa-c8: entered promiscuous mode
Nov 29 07:22:39 compute-0 NetworkManager[55227]: <info>  [1764400959.4695] manager: (tap81aef0aa-c8): new Tun device (/org/freedesktop/NetworkManager/Devices/182)
Nov 29 07:22:39 compute-0 ovn_controller[95281]: 2025-11-29T07:22:39Z|00337|binding|INFO|Claiming lport 81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f for this chassis.
Nov 29 07:22:39 compute-0 nova_compute[187185]: 2025-11-29 07:22:39.471 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:39 compute-0 ovn_controller[95281]: 2025-11-29T07:22:39Z|00338|binding|INFO|81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f: Claiming fa:16:3e:62:5e:11 10.100.0.8
Nov 29 07:22:39 compute-0 systemd-udevd[232991]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:22:39 compute-0 NetworkManager[55227]: <info>  [1764400959.5031] device (tap81aef0aa-c8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:22:39 compute-0 ovn_controller[95281]: 2025-11-29T07:22:39Z|00339|binding|INFO|Setting lport 81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f ovn-installed in OVS
Nov 29 07:22:39 compute-0 NetworkManager[55227]: <info>  [1764400959.5043] device (tap81aef0aa-c8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:22:39 compute-0 nova_compute[187185]: 2025-11-29 07:22:39.505 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:39 compute-0 nova_compute[187185]: 2025-11-29 07:22:39.508 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:39 compute-0 systemd-machined[153486]: New machine qemu-45-instance-00000074.
Nov 29 07:22:39 compute-0 systemd[1]: Started Virtual Machine qemu-45-instance-00000074.
Nov 29 07:22:39 compute-0 ovn_controller[95281]: 2025-11-29T07:22:39Z|00340|binding|INFO|Setting lport 81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f up in Southbound
Nov 29 07:22:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:22:39.608 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:62:5e:11 10.100.0.8'], port_security=['fa:16:3e:62:5e:11 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '729b50e7-7084-4f08-90ca-48a841f50cc9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b58443a3-f575-4ff1-951d-e92781861793', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '980ddbfed54546c89c75e94503491a61', 'neutron:revision_number': '5', 'neutron:security_group_ids': '41411d2f-bfa5-47e9-8f9d-c921ac196944', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0ea1aa9c-a11a-4c3e-9a7b-dc58c9931652, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:22:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:22:39.610 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f in datapath b58443a3-f575-4ff1-951d-e92781861793 bound to our chassis
Nov 29 07:22:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:22:39.612 104254 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network b58443a3-f575-4ff1-951d-e92781861793 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 29 07:22:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:22:39.613 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[59f525ad-280d-41bc-aeb6-5f40e49e87d3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:22:40 compute-0 nova_compute[187185]: 2025-11-29 07:22:40.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:22:40 compute-0 nova_compute[187185]: 2025-11-29 07:22:40.318 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 29 07:22:40 compute-0 nova_compute[187185]: 2025-11-29 07:22:40.659 187189 DEBUG nova.virt.libvirt.host [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Removed pending event for 729b50e7-7084-4f08-90ca-48a841f50cc9 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Nov 29 07:22:40 compute-0 nova_compute[187185]: 2025-11-29 07:22:40.661 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400960.6587102, 729b50e7-7084-4f08-90ca-48a841f50cc9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:22:40 compute-0 nova_compute[187185]: 2025-11-29 07:22:40.661 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] VM Resumed (Lifecycle Event)
Nov 29 07:22:40 compute-0 nova_compute[187185]: 2025-11-29 07:22:40.992 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:43 compute-0 nova_compute[187185]: 2025-11-29 07:22:43.397 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:44 compute-0 podman[233010]: 2025-11-29 07:22:44.802174099 +0000 UTC m=+0.061665812 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 07:22:44 compute-0 podman[233012]: 2025-11-29 07:22:44.80220848 +0000 UTC m=+0.060827958 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 07:22:44 compute-0 podman[233011]: 2025-11-29 07:22:44.807755147 +0000 UTC m=+0.069744310 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.buildah.version=1.33.7, version=9.6, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.openshift.tags=minimal rhel9, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git)
Nov 29 07:22:45 compute-0 nova_compute[187185]: 2025-11-29 07:22:45.485 187189 DEBUG nova.compute.manager [req-0f45cef1-45ad-4793-b59c-c18bc88f535b req-5be5b99d-7e1d-471d-8408-e4cf8a31544a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Received event network-vif-plugged-81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:22:45 compute-0 nova_compute[187185]: 2025-11-29 07:22:45.485 187189 DEBUG oslo_concurrency.lockutils [req-0f45cef1-45ad-4793-b59c-c18bc88f535b req-5be5b99d-7e1d-471d-8408-e4cf8a31544a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "729b50e7-7084-4f08-90ca-48a841f50cc9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:22:45 compute-0 nova_compute[187185]: 2025-11-29 07:22:45.486 187189 DEBUG oslo_concurrency.lockutils [req-0f45cef1-45ad-4793-b59c-c18bc88f535b req-5be5b99d-7e1d-471d-8408-e4cf8a31544a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "729b50e7-7084-4f08-90ca-48a841f50cc9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:22:45 compute-0 nova_compute[187185]: 2025-11-29 07:22:45.486 187189 DEBUG oslo_concurrency.lockutils [req-0f45cef1-45ad-4793-b59c-c18bc88f535b req-5be5b99d-7e1d-471d-8408-e4cf8a31544a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "729b50e7-7084-4f08-90ca-48a841f50cc9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:22:45 compute-0 nova_compute[187185]: 2025-11-29 07:22:45.486 187189 DEBUG nova.compute.manager [req-0f45cef1-45ad-4793-b59c-c18bc88f535b req-5be5b99d-7e1d-471d-8408-e4cf8a31544a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] No waiting events found dispatching network-vif-plugged-81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:22:45 compute-0 nova_compute[187185]: 2025-11-29 07:22:45.486 187189 WARNING nova.compute.manager [req-0f45cef1-45ad-4793-b59c-c18bc88f535b req-5be5b99d-7e1d-471d-8408-e4cf8a31544a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Received unexpected event network-vif-plugged-81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f for instance with vm_state active and task_state rescuing.
Nov 29 07:22:45 compute-0 nova_compute[187185]: 2025-11-29 07:22:45.488 187189 DEBUG nova.compute.manager [None req-0b0a8d42-56d4-4a26-9370-1f599b3e72e6 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:22:45 compute-0 nova_compute[187185]: 2025-11-29 07:22:45.620 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:22:45 compute-0 nova_compute[187185]: 2025-11-29 07:22:45.621 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 29 07:22:45 compute-0 nova_compute[187185]: 2025-11-29 07:22:45.626 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:22:45 compute-0 nova_compute[187185]: 2025-11-29 07:22:45.663 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] During sync_power_state the instance has a pending task (rescuing). Skip.
Nov 29 07:22:45 compute-0 nova_compute[187185]: 2025-11-29 07:22:45.664 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764400960.6601098, 729b50e7-7084-4f08-90ca-48a841f50cc9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:22:45 compute-0 nova_compute[187185]: 2025-11-29 07:22:45.664 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] VM Started (Lifecycle Event)
Nov 29 07:22:45 compute-0 nova_compute[187185]: 2025-11-29 07:22:45.994 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:46 compute-0 nova_compute[187185]: 2025-11-29 07:22:46.024 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:22:46 compute-0 nova_compute[187185]: 2025-11-29 07:22:46.029 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.001 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96', 'name': 'tempest-TestGettingAddress-server-210192598', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000073', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '0111c22b4b954ea586ca20d91ed3970f', 'user_id': '31ac7b05b012433b89143dc9f259644a', 'hostId': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.005 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '729b50e7-7084-4f08-90ca-48a841f50cc9', 'name': 'tempest-ServerRescueTestJSON-server-1961507248', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000074', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '980ddbfed54546c89c75e94503491a61', 'user_id': 'a992c32ce5fb4cbab645023852f14adc', 'hostId': 'c7f75f7a20c9811e89034f282d556cbdad4fa495e2d09a505238d522', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.005 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.026 12 DEBUG ceilometer.compute.pollsters [-] 1be71451-9dcb-4882-afc5-fa2b37f3fa96/cpu volume: 12070000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.047 12 DEBUG ceilometer.compute.pollsters [-] 729b50e7-7084-4f08-90ca-48a841f50cc9/cpu volume: 7000000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c4a77f22-c956-4c18-a4aa-50817ceb734d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12070000000, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96', 'timestamp': '2025-11-29T07:22:48.005574', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-210192598', 'name': 'instance-00000073', 'instance_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '34e94656-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.744558084, 'message_signature': 'd680d1e1a2f4f7179337f0780ea27bdcf366e91ccd1417716d61fac9a5d21ee4'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 7000000000, 'user_id': 'a992c32ce5fb4cbab645023852f14adc', 'user_name': None, 'project_id': '980ddbfed54546c89c75e94503491a61', 'project_name': None, 'resource_id': '729b50e7-7084-4f08-90ca-48a841f50cc9', 'timestamp': '2025-11-29T07:22:48.005574', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1961507248', 'name': 'instance-00000074', 'instance_id': '729b50e7-7084-4f08-90ca-48a841f50cc9', 'instance_type': 'm1.nano', 'host': 'c7f75f7a20c9811e89034f282d556cbdad4fa495e2d09a505238d522', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '34ec5f30-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.765337791, 'message_signature': '3a40aecb3757894c8b7994039643e60b2c70cf623cae5bca660cba1809b160b3'}]}, 'timestamp': '2025-11-29 07:22:48.048011', '_unique_id': '3b9f8aecd86949d5a4072b7928338f5f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.049 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.052 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.054 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 1be71451-9dcb-4882-afc5-fa2b37f3fa96 / tap0b0600de-62 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.055 12 DEBUG ceilometer.compute.pollsters [-] 1be71451-9dcb-4882-afc5-fa2b37f3fa96/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.057 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 729b50e7-7084-4f08-90ca-48a841f50cc9 / tap81aef0aa-c8 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.057 12 DEBUG ceilometer.compute.pollsters [-] 729b50e7-7084-4f08-90ca-48a841f50cc9/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '436b2db1-c725-499d-933e-c618b25bfea1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000073-1be71451-9dcb-4882-afc5-fa2b37f3fa96-tap0b0600de-62', 'timestamp': '2025-11-29T07:22:48.052798', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-210192598', 'name': 'tap0b0600de-62', 'instance_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b0:47:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0b0600de-62'}, 'message_id': '34ed8a68-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.771029312, 'message_signature': 'c220a31da75f236f186a8cd3bcc89e488cc6b6fa6d8d5edd14cbd7af6044ce6d'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'a992c32ce5fb4cbab645023852f14adc', 'user_name': None, 'project_id': '980ddbfed54546c89c75e94503491a61', 'project_name': None, 'resource_id': 'instance-00000074-729b50e7-7084-4f08-90ca-48a841f50cc9-tap81aef0aa-c8', 'timestamp': '2025-11-29T07:22:48.052798', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1961507248', 'name': 'tap81aef0aa-c8', 'instance_id': '729b50e7-7084-4f08-90ca-48a841f50cc9', 'instance_type': 'm1.nano', 'host': 'c7f75f7a20c9811e89034f282d556cbdad4fa495e2d09a505238d522', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:62:5e:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap81aef0aa-c8'}, 'message_id': '34ede5a8-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.773757709, 'message_signature': 'a12e1717b3e5c1cb54e1e5700247150bc480bd80b80c7c95aba39bcbb9be44af'}]}, 'timestamp': '2025-11-29 07:22:48.057931', '_unique_id': 'c677e9bad67545a18a125216fc374039'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.058 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.061 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.090 12 DEBUG ceilometer.compute.pollsters [-] 1be71451-9dcb-4882-afc5-fa2b37f3fa96/disk.device.write.bytes volume: 72941568 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.090 12 DEBUG ceilometer.compute.pollsters [-] 1be71451-9dcb-4882-afc5-fa2b37f3fa96/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.162 12 DEBUG ceilometer.compute.pollsters [-] 729b50e7-7084-4f08-90ca-48a841f50cc9/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.163 12 DEBUG ceilometer.compute.pollsters [-] 729b50e7-7084-4f08-90ca-48a841f50cc9/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.164 12 DEBUG ceilometer.compute.pollsters [-] 729b50e7-7084-4f08-90ca-48a841f50cc9/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dea73444-b3d7-45da-8167-b55983993dc0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72941568, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96-vda', 'timestamp': '2025-11-29T07:22:48.061564', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-210192598', 'name': 'instance-00000073', 'instance_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '34f2e6ca-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.779777989, 'message_signature': '16451b12842eec658d79a0fa50bb3eaea8c712a86a996e46ae57282341724a10'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 
'resource_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96-sda', 'timestamp': '2025-11-29T07:22:48.061564', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-210192598', 'name': 'instance-00000073', 'instance_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '34f2f534-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.779777989, 'message_signature': '2a962b9dfa9608e18ab195fad31a453b11927c7c52449612afa6e4823431bb0b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a992c32ce5fb4cbab645023852f14adc', 'user_name': None, 'project_id': '980ddbfed54546c89c75e94503491a61', 'project_name': None, 'resource_id': '729b50e7-7084-4f08-90ca-48a841f50cc9-vda', 'timestamp': '2025-11-29T07:22:48.061564', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1961507248', 'name': 'instance-00000074', 'instance_id': '729b50e7-7084-4f08-90ca-48a841f50cc9', 'instance_type': 'm1.nano', 'host': 'c7f75f7a20c9811e89034f282d556cbdad4fa495e2d09a505238d522', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '34fe0a64-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.809248871, 'message_signature': 'd4b9baf1da036af9e1e669f8621d772bb608f3b883405c7c60216a445962cc29'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a992c32ce5fb4cbab645023852f14adc', 'user_name': None, 'project_id': '980ddbfed54546c89c75e94503491a61', 'project_name': None, 'resource_id': '729b50e7-7084-4f08-90ca-48a841f50cc9-vdb', 'timestamp': '2025-11-29T07:22:48.061564', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1961507248', 'name': 'instance-00000074', 'instance_id': '729b50e7-7084-4f08-90ca-48a841f50cc9', 'instance_type': 'm1.nano', 'host': 'c7f75f7a20c9811e89034f282d556cbdad4fa495e2d09a505238d522', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '34fe2a6c-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.809248871, 'message_signature': 'd20152a7e46a6d7cec3c68c3ef7046e4713f0f70dff478f78b1f879470e7bb81'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a992c32ce5fb4cbab645023852f14adc', 'user_name': None, 
'project_id': '980ddbfed54546c89c75e94503491a61', 'project_name': None, 'resource_id': '729b50e7-7084-4f08-90ca-48a841f50cc9-sda', 'timestamp': '2025-11-29T07:22:48.061564', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1961507248', 'name': 'instance-00000074', 'instance_id': '729b50e7-7084-4f08-90ca-48a841f50cc9', 'instance_type': 'm1.nano', 'host': 'c7f75f7a20c9811e89034f282d556cbdad4fa495e2d09a505238d522', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '34fe4614-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.809248871, 'message_signature': 'f888063ae60e7128df1375089cca5fb72d22adba32120b33edc773c49def302f'}]}, 'timestamp': '2025-11-29 07:22:48.165358', '_unique_id': 'a9cebb8285c2465aa050fc8051d53a2a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.166 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.173 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.173 12 DEBUG ceilometer.compute.pollsters [-] 1be71451-9dcb-4882-afc5-fa2b37f3fa96/network.outgoing.packets volume: 31 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.174 12 DEBUG ceilometer.compute.pollsters [-] 729b50e7-7084-4f08-90ca-48a841f50cc9/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a965dcae-bc2a-404d-bb1e-84e7af1b0174', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 31, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000073-1be71451-9dcb-4882-afc5-fa2b37f3fa96-tap0b0600de-62', 'timestamp': '2025-11-29T07:22:48.173804', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-210192598', 'name': 'tap0b0600de-62', 'instance_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b0:47:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0b0600de-62'}, 'message_id': '34ffb0f8-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.771029312, 'message_signature': 'c6296e0c6068310e62aa9e3c214e74c364ce272e81032639096f71e5ff15df67'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'a992c32ce5fb4cbab645023852f14adc', 'user_name': None, 'project_id': '980ddbfed54546c89c75e94503491a61', 'project_name': None, 'resource_id': 'instance-00000074-729b50e7-7084-4f08-90ca-48a841f50cc9-tap81aef0aa-c8', 'timestamp': '2025-11-29T07:22:48.173804', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1961507248', 'name': 'tap81aef0aa-c8', 'instance_id': '729b50e7-7084-4f08-90ca-48a841f50cc9', 'instance_type': 'm1.nano', 'host': 'c7f75f7a20c9811e89034f282d556cbdad4fa495e2d09a505238d522', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:62:5e:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap81aef0aa-c8'}, 'message_id': '34ffd29a-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.773757709, 'message_signature': 'df2ae172fcb4c8a9cf473af0d13f8e12fe6ef0512db269c23ff86bdb62a7abf8'}]}, 'timestamp': '2025-11-29 07:22:48.175564', '_unique_id': '7910bacdb0784933ae3e7e9a4d9e4382'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.178 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.202 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.202 12 DEBUG ceilometer.compute.pollsters [-] 1be71451-9dcb-4882-afc5-fa2b37f3fa96/disk.device.read.bytes volume: 30534144 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.203 12 DEBUG ceilometer.compute.pollsters [-] 1be71451-9dcb-4882-afc5-fa2b37f3fa96/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.204 12 DEBUG ceilometer.compute.pollsters [-] 729b50e7-7084-4f08-90ca-48a841f50cc9/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.205 12 DEBUG ceilometer.compute.pollsters [-] 729b50e7-7084-4f08-90ca-48a841f50cc9/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.205 12 DEBUG ceilometer.compute.pollsters [-] 729b50e7-7084-4f08-90ca-48a841f50cc9/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '801fbce4-3056-4567-8188-228ced5f0567', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30534144, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96-vda', 'timestamp': '2025-11-29T07:22:48.202809', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-210192598', 'name': 'instance-00000073', 'instance_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '35041c92-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.779777989, 'message_signature': '904db69e187e5488281aa7419aafadfe9b2150f27e15a057217b375818b326d9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 
'resource_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96-sda', 'timestamp': '2025-11-29T07:22:48.202809', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-210192598', 'name': 'instance-00000073', 'instance_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '35043c4a-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.779777989, 'message_signature': '1ea18dc5b56515c376215765fa47dd898efc87b6918475f5871dbb81e34a91eb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': 'a992c32ce5fb4cbab645023852f14adc', 'user_name': None, 'project_id': '980ddbfed54546c89c75e94503491a61', 'project_name': None, 'resource_id': '729b50e7-7084-4f08-90ca-48a841f50cc9-vda', 'timestamp': '2025-11-29T07:22:48.202809', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1961507248', 'name': 'instance-00000074', 'instance_id': '729b50e7-7084-4f08-90ca-48a841f50cc9', 'instance_type': 'm1.nano', 'host': 'c7f75f7a20c9811e89034f282d556cbdad4fa495e2d09a505238d522', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3504572a-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.809248871, 'message_signature': 'a565e7bd7f61ac10a77a77bd4ae5d18a36d46da2bc229e793d4339fa9cfce206'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a992c32ce5fb4cbab645023852f14adc', 'user_name': None, 'project_id': '980ddbfed54546c89c75e94503491a61', 'project_name': None, 'resource_id': '729b50e7-7084-4f08-90ca-48a841f50cc9-vdb', 'timestamp': '2025-11-29T07:22:48.202809', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1961507248', 'name': 'instance-00000074', 'instance_id': '729b50e7-7084-4f08-90ca-48a841f50cc9', 'instance_type': 'm1.nano', 'host': 'c7f75f7a20c9811e89034f282d556cbdad4fa495e2d09a505238d522', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '350470e8-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.809248871, 'message_signature': 'c55ceee3fd4932f6fa1cdfb9eceadf9b0b440ad1a61fd04c01291bf8aa54386f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'a992c32ce5fb4cbab645023852f14adc', 'user_name': None, 
'project_id': '980ddbfed54546c89c75e94503491a61', 'project_name': None, 'resource_id': '729b50e7-7084-4f08-90ca-48a841f50cc9-sda', 'timestamp': '2025-11-29T07:22:48.202809', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1961507248', 'name': 'instance-00000074', 'instance_id': '729b50e7-7084-4f08-90ca-48a841f50cc9', 'instance_type': 'm1.nano', 'host': 'c7f75f7a20c9811e89034f282d556cbdad4fa495e2d09a505238d522', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '35048b3c-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.809248871, 'message_signature': '82d5b49bd24f6159afe958519aa089a8161d5c6705840ad37e999d7cc1b5818c'}]}, 'timestamp': '2025-11-29 07:22:48.206435', '_unique_id': '6b9d69c72cc64d2180988c67c1d655df'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.207 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.211 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.212 12 DEBUG ceilometer.compute.pollsters [-] 1be71451-9dcb-4882-afc5-fa2b37f3fa96/network.outgoing.bytes volume: 3704 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.212 12 DEBUG ceilometer.compute.pollsters [-] 729b50e7-7084-4f08-90ca-48a841f50cc9/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '402cb53b-42e0-4099-8fb3-502b1a03d68b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3704, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000073-1be71451-9dcb-4882-afc5-fa2b37f3fa96-tap0b0600de-62', 'timestamp': '2025-11-29T07:22:48.212072', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-210192598', 'name': 'tap0b0600de-62', 'instance_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b0:47:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0b0600de-62'}, 'message_id': '35057bbe-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.771029312, 'message_signature': '2bd2a574483f15bc0b45882b1cd23ec9d95951b03d6bc38c535bf4515e35c44e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'a992c32ce5fb4cbab645023852f14adc', 'user_name': None, 'project_id': '980ddbfed54546c89c75e94503491a61', 'project_name': None, 'resource_id': 'instance-00000074-729b50e7-7084-4f08-90ca-48a841f50cc9-tap81aef0aa-c8', 'timestamp': '2025-11-29T07:22:48.212072', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1961507248', 'name': 'tap81aef0aa-c8', 'instance_id': '729b50e7-7084-4f08-90ca-48a841f50cc9', 'instance_type': 'm1.nano', 'host': 'c7f75f7a20c9811e89034f282d556cbdad4fa495e2d09a505238d522', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:62:5e:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap81aef0aa-c8'}, 'message_id': '35058c26-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.773757709, 'message_signature': '5b23f141944d1464d3b30ca96c3346e7a2eaafecf79a3599b1e5580b89ff8393'}]}, 'timestamp': '2025-11-29 07:22:48.212956', '_unique_id': '7aa3aa7517c94be89722c9ee229f41d9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.213 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.217 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.230 12 DEBUG ceilometer.compute.pollsters [-] 1be71451-9dcb-4882-afc5-fa2b37f3fa96/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.231 12 DEBUG ceilometer.compute.pollsters [-] 1be71451-9dcb-4882-afc5-fa2b37f3fa96/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.257 12 DEBUG ceilometer.compute.pollsters [-] 729b50e7-7084-4f08-90ca-48a841f50cc9/disk.device.capacity volume: 117440512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.258 12 DEBUG ceilometer.compute.pollsters [-] 729b50e7-7084-4f08-90ca-48a841f50cc9/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.259 12 DEBUG ceilometer.compute.pollsters [-] 729b50e7-7084-4f08-90ca-48a841f50cc9/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1d666772-9ceb-489b-ba60-fc762dc3bd68', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96-vda', 'timestamp': '2025-11-29T07:22:48.217762', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-210192598', 'name': 'instance-00000073', 'instance_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3508588e-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.9360236, 'message_signature': 'b0ea09508616894b273fb3ecd2b0c84928d68c419699449caab6c36d57414f8b'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 
'1be71451-9dcb-4882-afc5-fa2b37f3fa96-sda', 'timestamp': '2025-11-29T07:22:48.217762', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-210192598', 'name': 'instance-00000073', 'instance_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '350869f0-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.9360236, 'message_signature': 'b6de1692cce758a6cd7cc58479b655a728cf4e5523092963ab57f4055884aa37'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 117440512, 'user_id': 'a992c32ce5fb4cbab645023852f14adc', 'user_name': None, 'project_id': '980ddbfed54546c89c75e94503491a61', 'project_name': None, 'resource_id': '729b50e7-7084-4f08-90ca-48a841f50cc9-vda', 'timestamp': '2025-11-29T07:22:48.217762', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1961507248', 'name': 'instance-00000074', 'instance_id': '729b50e7-7084-4f08-90ca-48a841f50cc9', 'instance_type': 'm1.nano', 'host': 'c7f75f7a20c9811e89034f282d556cbdad4fa495e2d09a505238d522', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '350c72e8-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.949952403, 'message_signature': 'c9625f9bec29407ae976b4c9a032f626b9bd94b1d1cf62842814c9dfe171e851'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'a992c32ce5fb4cbab645023852f14adc', 'user_name': None, 'project_id': '980ddbfed54546c89c75e94503491a61', 'project_name': None, 'resource_id': '729b50e7-7084-4f08-90ca-48a841f50cc9-vdb', 'timestamp': '2025-11-29T07:22:48.217762', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1961507248', 'name': 'instance-00000074', 'instance_id': '729b50e7-7084-4f08-90ca-48a841f50cc9', 'instance_type': 'm1.nano', 'host': 'c7f75f7a20c9811e89034f282d556cbdad4fa495e2d09a505238d522', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '350c90c0-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.949952403, 'message_signature': 'd3bd8d3964c241222e3c11c871f5f42c3e0f69015277a0ab6d23179de48ad427'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a992c32ce5fb4cbab645023852f14adc', 'user_name': None, 
'project_id': '980ddbfed54546c89c75e94503491a61', 'project_name': None, 'resource_id': '729b50e7-7084-4f08-90ca-48a841f50cc9-sda', 'timestamp': '2025-11-29T07:22:48.217762', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1961507248', 'name': 'instance-00000074', 'instance_id': '729b50e7-7084-4f08-90ca-48a841f50cc9', 'instance_type': 'm1.nano', 'host': 'c7f75f7a20c9811e89034f282d556cbdad4fa495e2d09a505238d522', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '350cb2bc-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.949952403, 'message_signature': 'c4a8f593ab4d96319b669124b05d1f15975354ee7d042b5cb8e71f9cfcdcdf3b'}]}, 'timestamp': '2025-11-29 07:22:48.259970', '_unique_id': 'fa61ff5bccea451284ea46259418e670'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.261 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.277 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.277 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.277 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-210192598>, <NovaLikeServer: tempest-ServerRescueTestJSON-server-1961507248>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-210192598>, <NovaLikeServer: tempest-ServerRescueTestJSON-server-1961507248>]
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.278 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.278 12 DEBUG ceilometer.compute.pollsters [-] 1be71451-9dcb-4882-afc5-fa2b37f3fa96/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.279 12 DEBUG ceilometer.compute.pollsters [-] 729b50e7-7084-4f08-90ca-48a841f50cc9/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '84deeef6-8e9b-4101-8198-952825c2c08f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000073-1be71451-9dcb-4882-afc5-fa2b37f3fa96-tap0b0600de-62', 'timestamp': '2025-11-29T07:22:48.278443', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-210192598', 'name': 'tap0b0600de-62', 'instance_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b0:47:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0b0600de-62'}, 'message_id': '350f9e32-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.771029312, 'message_signature': 'd0d2b4418a27b40220aef27363ea0471b62946d8bf08045b920d80d0b163e1e1'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'a992c32ce5fb4cbab645023852f14adc', 'user_name': None, 'project_id': '980ddbfed54546c89c75e94503491a61', 'project_name': None, 'resource_id': 'instance-00000074-729b50e7-7084-4f08-90ca-48a841f50cc9-tap81aef0aa-c8', 'timestamp': '2025-11-29T07:22:48.278443', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1961507248', 'name': 'tap81aef0aa-c8', 'instance_id': '729b50e7-7084-4f08-90ca-48a841f50cc9', 'instance_type': 'm1.nano', 'host': 'c7f75f7a20c9811e89034f282d556cbdad4fa495e2d09a505238d522', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:62:5e:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap81aef0aa-c8'}, 'message_id': '350fb8cc-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.773757709, 'message_signature': 'ce7d5266a0775d1fd06cf497ae5c82010d9162289a08865b8b00c82734a063c4'}]}, 'timestamp': '2025-11-29 07:22:48.279789', '_unique_id': 'c92e94b3900b4f288e7d210f57e09eb7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.282 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.290 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.291 12 DEBUG ceilometer.compute.pollsters [-] 1be71451-9dcb-4882-afc5-fa2b37f3fa96/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.291 12 DEBUG ceilometer.compute.pollsters [-] 1be71451-9dcb-4882-afc5-fa2b37f3fa96/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.292 12 DEBUG ceilometer.compute.pollsters [-] 729b50e7-7084-4f08-90ca-48a841f50cc9/disk.device.usage volume: 196616 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.292 12 DEBUG ceilometer.compute.pollsters [-] 729b50e7-7084-4f08-90ca-48a841f50cc9/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.293 12 DEBUG ceilometer.compute.pollsters [-] 729b50e7-7084-4f08-90ca-48a841f50cc9/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dafba674-a331-417c-b831-c1ac15a82937', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96-vda', 'timestamp': '2025-11-29T07:22:48.290987', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-210192598', 'name': 'instance-00000073', 'instance_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3511881e-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.9360236, 'message_signature': 'd624854eb7de85b4e339c8e69523d400f84026685e0a4b189bb94c73d029c361'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 
'1be71451-9dcb-4882-afc5-fa2b37f3fa96-sda', 'timestamp': '2025-11-29T07:22:48.290987', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-210192598', 'name': 'instance-00000073', 'instance_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3511a4b6-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.9360236, 'message_signature': 'ff63539a6a0848478d307f24f782c09421d4caf769aca292870068aeea0271a0'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196616, 'user_id': 'a992c32ce5fb4cbab645023852f14adc', 'user_name': None, 'project_id': '980ddbfed54546c89c75e94503491a61', 'project_name': None, 'resource_id': '729b50e7-7084-4f08-90ca-48a841f50cc9-vda', 'timestamp': '2025-11-29T07:22:48.290987', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1961507248', 'name': 'instance-00000074', 'instance_id': '729b50e7-7084-4f08-90ca-48a841f50cc9', 'instance_type': 'm1.nano', 'host': 'c7f75f7a20c9811e89034f282d556cbdad4fa495e2d09a505238d522', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 
'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3511b91a-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.949952403, 'message_signature': '3073f03529f87826362b6d70c0e8db54c752ab95a4208f493a6299c4808e0f3b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'a992c32ce5fb4cbab645023852f14adc', 'user_name': None, 'project_id': '980ddbfed54546c89c75e94503491a61', 'project_name': None, 'resource_id': '729b50e7-7084-4f08-90ca-48a841f50cc9-vdb', 'timestamp': '2025-11-29T07:22:48.290987', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1961507248', 'name': 'instance-00000074', 'instance_id': '729b50e7-7084-4f08-90ca-48a841f50cc9', 'instance_type': 'm1.nano', 'host': 'c7f75f7a20c9811e89034f282d556cbdad4fa495e2d09a505238d522', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '3511c66c-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.949952403, 'message_signature': '32848df9066fb0d60e42de47173b5bf669d5f282932b4653035a14591173103d'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a992c32ce5fb4cbab645023852f14adc', 'user_name': None, 'project_id': '980ddbfed54546c89c75e94503491a61', 
'project_name': None, 'resource_id': '729b50e7-7084-4f08-90ca-48a841f50cc9-sda', 'timestamp': '2025-11-29T07:22:48.290987', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1961507248', 'name': 'instance-00000074', 'instance_id': '729b50e7-7084-4f08-90ca-48a841f50cc9', 'instance_type': 'm1.nano', 'host': 'c7f75f7a20c9811e89034f282d556cbdad4fa495e2d09a505238d522', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3511d2d8-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.949952403, 'message_signature': '7d56470aa2d6d111854d8560d999a8a86e9417bb3a34f77922106c5d25a230ae'}]}, 'timestamp': '2025-11-29 07:22:48.293404', '_unique_id': '75033a35d3bc4efca3539eb493896750'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.295 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.300 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.300 12 DEBUG ceilometer.compute.pollsters [-] 1be71451-9dcb-4882-afc5-fa2b37f3fa96/network.incoming.bytes volume: 4313 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.301 12 DEBUG ceilometer.compute.pollsters [-] 729b50e7-7084-4f08-90ca-48a841f50cc9/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '444f2147-5f3d-4b5c-b282-72fde9642d74', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4313, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000073-1be71451-9dcb-4882-afc5-fa2b37f3fa96-tap0b0600de-62', 'timestamp': '2025-11-29T07:22:48.300492', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-210192598', 'name': 'tap0b0600de-62', 'instance_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b0:47:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0b0600de-62'}, 'message_id': '3512fc4e-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.771029312, 'message_signature': '47312e57256c08dd2623d3ff6f9a80ee03e11b6d6b6d8d849ef086f034d33002'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 
'a992c32ce5fb4cbab645023852f14adc', 'user_name': None, 'project_id': '980ddbfed54546c89c75e94503491a61', 'project_name': None, 'resource_id': 'instance-00000074-729b50e7-7084-4f08-90ca-48a841f50cc9-tap81aef0aa-c8', 'timestamp': '2025-11-29T07:22:48.300492', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1961507248', 'name': 'tap81aef0aa-c8', 'instance_id': '729b50e7-7084-4f08-90ca-48a841f50cc9', 'instance_type': 'm1.nano', 'host': 'c7f75f7a20c9811e89034f282d556cbdad4fa495e2d09a505238d522', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:62:5e:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap81aef0aa-c8'}, 'message_id': '35130d6a-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.773757709, 'message_signature': '6347c88161ef0a57888c1a6b0b38f39940330cf94a41f34d481f855e37e0d686'}]}, 'timestamp': '2025-11-29 07:22:48.301463', '_unique_id': '3defaeaac0124dc6a466bd1b10747fd9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.302 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.304 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.304 12 DEBUG ceilometer.compute.pollsters [-] 1be71451-9dcb-4882-afc5-fa2b37f3fa96/disk.device.allocation volume: 30679040 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.304 12 DEBUG ceilometer.compute.pollsters [-] 1be71451-9dcb-4882-afc5-fa2b37f3fa96/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.305 12 DEBUG ceilometer.compute.pollsters [-] 729b50e7-7084-4f08-90ca-48a841f50cc9/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.305 12 DEBUG ceilometer.compute.pollsters [-] 729b50e7-7084-4f08-90ca-48a841f50cc9/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.305 12 DEBUG ceilometer.compute.pollsters [-] 729b50e7-7084-4f08-90ca-48a841f50cc9/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ae59a769-e194-4be9-81d8-28fdf9fff0ea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30679040, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96-vda', 'timestamp': '2025-11-29T07:22:48.304339', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-210192598', 'name': 'instance-00000073', 'instance_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '35138ca4-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.9360236, 'message_signature': 'f12563e1197702994bb472450a0f71ea624941ab757ddde4ca8a3f22c5c35c2f'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 
'1be71451-9dcb-4882-afc5-fa2b37f3fa96-sda', 'timestamp': '2025-11-29T07:22:48.304339', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-210192598', 'name': 'instance-00000073', 'instance_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '35139bb8-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.9360236, 'message_signature': '57ddff771b841c70d630cc6d309925d30db78f2ee7b2d468e60b7551ec893638'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'a992c32ce5fb4cbab645023852f14adc', 'user_name': None, 'project_id': '980ddbfed54546c89c75e94503491a61', 'project_name': None, 'resource_id': '729b50e7-7084-4f08-90ca-48a841f50cc9-vda', 'timestamp': '2025-11-29T07:22:48.304339', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1961507248', 'name': 'instance-00000074', 'instance_id': '729b50e7-7084-4f08-90ca-48a841f50cc9', 'instance_type': 'm1.nano', 'host': 'c7f75f7a20c9811e89034f282d556cbdad4fa495e2d09a505238d522', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3513a86a-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.949952403, 'message_signature': 'a1d279305078662670e1326fd41d9c748f36422af117468144863fcbab609616'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': 'a992c32ce5fb4cbab645023852f14adc', 'user_name': None, 'project_id': '980ddbfed54546c89c75e94503491a61', 'project_name': None, 'resource_id': '729b50e7-7084-4f08-90ca-48a841f50cc9-vdb', 'timestamp': '2025-11-29T07:22:48.304339', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1961507248', 'name': 'instance-00000074', 'instance_id': '729b50e7-7084-4f08-90ca-48a841f50cc9', 'instance_type': 'm1.nano', 'host': 'c7f75f7a20c9811e89034f282d556cbdad4fa495e2d09a505238d522', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '3513b4ea-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.949952403, 'message_signature': 'fe88640f07122a7234108f694f8430b0877a614725e8e74238eb96e95bcb648a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'a992c32ce5fb4cbab645023852f14adc', 'user_name': None, 
'project_id': '980ddbfed54546c89c75e94503491a61', 'project_name': None, 'resource_id': '729b50e7-7084-4f08-90ca-48a841f50cc9-sda', 'timestamp': '2025-11-29T07:22:48.304339', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1961507248', 'name': 'instance-00000074', 'instance_id': '729b50e7-7084-4f08-90ca-48a841f50cc9', 'instance_type': 'm1.nano', 'host': 'c7f75f7a20c9811e89034f282d556cbdad4fa495e2d09a505238d522', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3513c390-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.949952403, 'message_signature': '0707f07fa2cdf9e30c533e6cefe3f5a6215683ef2320a9bc26f68c9d3d707fea'}]}, 'timestamp': '2025-11-29 07:22:48.306101', '_unique_id': 'db2d233d10e54f64b4b922f9babec7da'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.306 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.308 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.308 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.308 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-210192598>, <NovaLikeServer: tempest-ServerRescueTestJSON-server-1961507248>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-210192598>, <NovaLikeServer: tempest-ServerRescueTestJSON-server-1961507248>]
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.308 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.308 12 DEBUG ceilometer.compute.pollsters [-] 1be71451-9dcb-4882-afc5-fa2b37f3fa96/disk.device.read.requests volume: 1101 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.309 12 DEBUG ceilometer.compute.pollsters [-] 1be71451-9dcb-4882-afc5-fa2b37f3fa96/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.309 12 DEBUG ceilometer.compute.pollsters [-] 729b50e7-7084-4f08-90ca-48a841f50cc9/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.309 12 DEBUG ceilometer.compute.pollsters [-] 729b50e7-7084-4f08-90ca-48a841f50cc9/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.310 12 DEBUG ceilometer.compute.pollsters [-] 729b50e7-7084-4f08-90ca-48a841f50cc9/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c189227b-6fe1-4b88-887b-cfa516658ae1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1101, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96-vda', 'timestamp': '2025-11-29T07:22:48.308923', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-210192598', 'name': 'instance-00000073', 'instance_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '35143f50-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.779777989, 'message_signature': '85687279da81a4f0f52182d11ebade07390cc972526da5f44987c2bbab77fb07'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 
'resource_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96-sda', 'timestamp': '2025-11-29T07:22:48.308923', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-210192598', 'name': 'instance-00000073', 'instance_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '35144e3c-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.779777989, 'message_signature': '84023cd38a76c0373b9c5e0516294d9c2848147e890ba4851180556485e92ebc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': 'a992c32ce5fb4cbab645023852f14adc', 'user_name': None, 'project_id': '980ddbfed54546c89c75e94503491a61', 'project_name': None, 'resource_id': '729b50e7-7084-4f08-90ca-48a841f50cc9-vda', 'timestamp': '2025-11-29T07:22:48.308923', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1961507248', 'name': 'instance-00000074', 'instance_id': '729b50e7-7084-4f08-90ca-48a841f50cc9', 'instance_type': 'm1.nano', 'host': 'c7f75f7a20c9811e89034f282d556cbdad4fa495e2d09a505238d522', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '35145ab2-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.809248871, 'message_signature': '3c5df35c47a5c621ec896ae6a16d9d549f854581759a5ae6d943bbe443b74342'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a992c32ce5fb4cbab645023852f14adc', 'user_name': None, 'project_id': '980ddbfed54546c89c75e94503491a61', 'project_name': None, 'resource_id': '729b50e7-7084-4f08-90ca-48a841f50cc9-vdb', 'timestamp': '2025-11-29T07:22:48.308923', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1961507248', 'name': 'instance-00000074', 'instance_id': '729b50e7-7084-4f08-90ca-48a841f50cc9', 'instance_type': 'm1.nano', 'host': 'c7f75f7a20c9811e89034f282d556cbdad4fa495e2d09a505238d522', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '351469bc-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.809248871, 'message_signature': '8bdfcb8d9911eb0239890e10dc0a6677739a24f84b1fe060d9509f8057c45ea6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'a992c32ce5fb4cbab645023852f14adc', 'user_name': 
None, 'project_id': '980ddbfed54546c89c75e94503491a61', 'project_name': None, 'resource_id': '729b50e7-7084-4f08-90ca-48a841f50cc9-sda', 'timestamp': '2025-11-29T07:22:48.308923', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1961507248', 'name': 'instance-00000074', 'instance_id': '729b50e7-7084-4f08-90ca-48a841f50cc9', 'instance_type': 'm1.nano', 'host': 'c7f75f7a20c9811e89034f282d556cbdad4fa495e2d09a505238d522', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '35147678-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.809248871, 'message_signature': '24b4988316b7d18032437cd86f4ab1b2c953ff9d3f60a4f5b5c14fad0b6a834b'}]}, 'timestamp': '2025-11-29 07:22:48.310680', '_unique_id': '89398d6071244af58c6bdba1519d2cc1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.311 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.312 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.312 12 DEBUG ceilometer.compute.pollsters [-] 1be71451-9dcb-4882-afc5-fa2b37f3fa96/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.313 12 DEBUG ceilometer.compute.pollsters [-] 729b50e7-7084-4f08-90ca-48a841f50cc9/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8dbec3e3-0041-454b-a7ae-e99fae1f3d0f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000073-1be71451-9dcb-4882-afc5-fa2b37f3fa96-tap0b0600de-62', 'timestamp': '2025-11-29T07:22:48.312880', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-210192598', 'name': 'tap0b0600de-62', 'instance_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b0:47:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0b0600de-62'}, 'message_id': '3514da46-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.771029312, 'message_signature': '8cab475ee936a47680a8402c10262cd9de6974465e0944f71cde7755d58bf016'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'a992c32ce5fb4cbab645023852f14adc', 'user_name': None, 'project_id': '980ddbfed54546c89c75e94503491a61', 'project_name': None, 'resource_id': 'instance-00000074-729b50e7-7084-4f08-90ca-48a841f50cc9-tap81aef0aa-c8', 'timestamp': '2025-11-29T07:22:48.312880', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1961507248', 'name': 'tap81aef0aa-c8', 'instance_id': '729b50e7-7084-4f08-90ca-48a841f50cc9', 'instance_type': 'm1.nano', 'host': 'c7f75f7a20c9811e89034f282d556cbdad4fa495e2d09a505238d522', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:62:5e:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap81aef0aa-c8'}, 'message_id': '3514e752-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.773757709, 'message_signature': '5daba6deda63f7e9a23103ebb2e8d33a1a4913e83896f7b557f471bb0afc40c0'}]}, 'timestamp': '2025-11-29 07:22:48.313577', '_unique_id': 'd2db5100f61b440dbecc91b00addc308'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:22:48 compute-0 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.314 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.315 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.315 12 DEBUG ceilometer.compute.pollsters [-] 1be71451-9dcb-4882-afc5-fa2b37f3fa96/memory.usage volume: 42.51171875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.316 12 DEBUG ceilometer.compute.pollsters [-] 729b50e7-7084-4f08-90ca-48a841f50cc9/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.316 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 729b50e7-7084-4f08-90ca-48a841f50cc9: ceilometer.compute.pollsters.NoVolumeException
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '896ea934-4547-4237-8873-2530c71c7c50', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.51171875, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96', 'timestamp': '2025-11-29T07:22:48.315677', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-210192598', 'name': 'instance-00000073', 'instance_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '35154a9e-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.744558084, 'message_signature': '845d7d5c679afa7915494687265c4877590d6b2fb77080d57e16872fec25767e'}]}, 'timestamp': '2025-11-29 07:22:48.316424', '_unique_id': '0ab639ea1e52419ea1569f38ceec3395'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.317 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.318 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.318 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.319 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-210192598>, <NovaLikeServer: tempest-ServerRescueTestJSON-server-1961507248>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-210192598>, <NovaLikeServer: tempest-ServerRescueTestJSON-server-1961507248>]
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.319 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.319 12 DEBUG ceilometer.compute.pollsters [-] 1be71451-9dcb-4882-afc5-fa2b37f3fa96/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.319 12 DEBUG ceilometer.compute.pollsters [-] 729b50e7-7084-4f08-90ca-48a841f50cc9/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c7c7ede7-db68-4b2c-a743-6b9777b5020d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000073-1be71451-9dcb-4882-afc5-fa2b37f3fa96-tap0b0600de-62', 'timestamp': '2025-11-29T07:22:48.319499', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-210192598', 'name': 'tap0b0600de-62', 'instance_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b0:47:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0b0600de-62'}, 'message_id': '3515dc52-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.771029312, 'message_signature': '9164c28a37289b18b805fb8a611c586ff14f914982340ebca3f61be7c4e3d767'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a992c32ce5fb4cbab645023852f14adc', 'user_name': None, 'project_id': '980ddbfed54546c89c75e94503491a61', 'project_name': None, 'resource_id': 'instance-00000074-729b50e7-7084-4f08-90ca-48a841f50cc9-tap81aef0aa-c8', 'timestamp': '2025-11-29T07:22:48.319499', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1961507248', 'name': 'tap81aef0aa-c8', 'instance_id': '729b50e7-7084-4f08-90ca-48a841f50cc9', 'instance_type': 'm1.nano', 'host': 'c7f75f7a20c9811e89034f282d556cbdad4fa495e2d09a505238d522', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:62:5e:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap81aef0aa-c8'}, 'message_id': '3515ed64-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.773757709, 'message_signature': 'f4ffc5c5d504efb30011ba0c5e02bb8c32bc4b58bbee97953eab0779503134e5'}]}, 'timestamp': '2025-11-29 07:22:48.320289', '_unique_id': '8e773bb2b25240e18c93a71b72963ca5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.320 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.322 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.322 12 DEBUG ceilometer.compute.pollsters [-] 1be71451-9dcb-4882-afc5-fa2b37f3fa96/disk.device.write.requests volume: 308 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.322 12 DEBUG ceilometer.compute.pollsters [-] 1be71451-9dcb-4882-afc5-fa2b37f3fa96/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.323 12 DEBUG ceilometer.compute.pollsters [-] 729b50e7-7084-4f08-90ca-48a841f50cc9/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.323 12 DEBUG ceilometer.compute.pollsters [-] 729b50e7-7084-4f08-90ca-48a841f50cc9/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.323 12 DEBUG ceilometer.compute.pollsters [-] 729b50e7-7084-4f08-90ca-48a841f50cc9/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0050479e-b7db-4857-933c-c85fbfca267f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 308, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96-vda', 'timestamp': '2025-11-29T07:22:48.322320', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-210192598', 'name': 'instance-00000073', 'instance_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '35164a8e-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.779777989, 'message_signature': 'a12c2787e13f00979ac42bc0bd0655333f668f439115eb1cfa8130032dae19b4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96-sda', 'timestamp': '2025-11-29T07:22:48.322320', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-210192598', 'name': 'instance-00000073', 'instance_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '351657c2-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.779777989, 'message_signature': 'bce69a870a4a76c7d8af4ce77e9de70923c8524558d1a4e9d5a42af9c71b6c5b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a992c32ce5fb4cbab645023852f14adc', 'user_name': None, 'project_id': '980ddbfed54546c89c75e94503491a61', 'project_name': None, 'resource_id': '729b50e7-7084-4f08-90ca-48a841f50cc9-vda', 'timestamp': '2025-11-29T07:22:48.322320', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1961507248', 'name': 'instance-00000074', 'instance_id': '729b50e7-7084-4f08-90ca-48a841f50cc9', 'instance_type': 'm1.nano', 'host': 'c7f75f7a20c9811e89034f282d556cbdad4fa495e2d09a505238d522', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3516662c-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.809248871, 'message_signature': '65375375313dc74dd3c8fcde4aaaa6c67327c7a27c16552746800d9fadf7ab0a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a992c32ce5fb4cbab645023852f14adc', 'user_name': None, 'project_id': '980ddbfed54546c89c75e94503491a61', 'project_name': None, 'resource_id': '729b50e7-7084-4f08-90ca-48a841f50cc9-vdb', 'timestamp': '2025-11-29T07:22:48.322320', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1961507248', 'name': 'instance-00000074', 'instance_id': '729b50e7-7084-4f08-90ca-48a841f50cc9', 'instance_type': 'm1.nano', 'host': 'c7f75f7a20c9811e89034f282d556cbdad4fa495e2d09a505238d522', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '35167270-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.809248871, 'message_signature': '36933c43ae1e8692101229c355884411467f87ce9944d361afb62a15bb92c6f0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a992c32ce5fb4cbab645023852f14adc', 'user_name': None, 'project_id': '980ddbfed54546c89c75e94503491a61', 'project_name': None, 'resource_id': '729b50e7-7084-4f08-90ca-48a841f50cc9-sda', 'timestamp': '2025-11-29T07:22:48.322320', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1961507248', 'name': 'instance-00000074', 'instance_id': '729b50e7-7084-4f08-90ca-48a841f50cc9', 'instance_type': 'm1.nano', 'host': 'c7f75f7a20c9811e89034f282d556cbdad4fa495e2d09a505238d522', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '35167f0e-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.809248871, 'message_signature': '4b28e2f246b81228ba4f93c58c19dc1b4636b2edd92538d9737acba9fcfc0641'}]}, 'timestamp': '2025-11-29 07:22:48.324027', '_unique_id': '2b07beed07f240028a449dc64cf8868c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.324 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.325 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.325 12 DEBUG ceilometer.compute.pollsters [-] 1be71451-9dcb-4882-afc5-fa2b37f3fa96/disk.device.write.latency volume: 3959651732 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.326 12 DEBUG ceilometer.compute.pollsters [-] 1be71451-9dcb-4882-afc5-fa2b37f3fa96/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.326 12 DEBUG ceilometer.compute.pollsters [-] 729b50e7-7084-4f08-90ca-48a841f50cc9/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.327 12 DEBUG ceilometer.compute.pollsters [-] 729b50e7-7084-4f08-90ca-48a841f50cc9/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.327 12 DEBUG ceilometer.compute.pollsters [-] 729b50e7-7084-4f08-90ca-48a841f50cc9/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9c2585f5-3e16-4d3a-b089-eba8cfae764d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3959651732, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96-vda', 'timestamp': '2025-11-29T07:22:48.325723', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-210192598', 'name': 'instance-00000073', 'instance_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3516e390-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.779777989, 'message_signature': 'bc2ac9f983a9455ddca0e580b63ccb06dc19fbf39c03684c47a62a289cfe6fe0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 
'resource_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96-sda', 'timestamp': '2025-11-29T07:22:48.325723', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-210192598', 'name': 'instance-00000073', 'instance_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3516ee30-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.779777989, 'message_signature': '6bafcb9bc3e18d8142030a9734fef2445287197a1f9e87c4221e810cc60c4217'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a992c32ce5fb4cbab645023852f14adc', 'user_name': None, 'project_id': '980ddbfed54546c89c75e94503491a61', 'project_name': None, 'resource_id': '729b50e7-7084-4f08-90ca-48a841f50cc9-vda', 'timestamp': '2025-11-29T07:22:48.325723', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1961507248', 'name': 'instance-00000074', 'instance_id': '729b50e7-7084-4f08-90ca-48a841f50cc9', 'instance_type': 'm1.nano', 'host': 'c7f75f7a20c9811e89034f282d556cbdad4fa495e2d09a505238d522', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3516f90c-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.809248871, 'message_signature': '42bd8a7b206e86675a29d38f1ca8dee50b24e547adfe2199ed6b61c6ae63294d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a992c32ce5fb4cbab645023852f14adc', 'user_name': None, 'project_id': '980ddbfed54546c89c75e94503491a61', 'project_name': None, 'resource_id': '729b50e7-7084-4f08-90ca-48a841f50cc9-vdb', 'timestamp': '2025-11-29T07:22:48.325723', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1961507248', 'name': 'instance-00000074', 'instance_id': '729b50e7-7084-4f08-90ca-48a841f50cc9', 'instance_type': 'm1.nano', 'host': 'c7f75f7a20c9811e89034f282d556cbdad4fa495e2d09a505238d522', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '351702ee-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.809248871, 'message_signature': '6aef82bb65e9a08be6410dc611a4c1b28e32b96676738aa2b26e56012d90f9f3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a992c32ce5fb4cbab645023852f14adc', 'user_name': None, 
'project_id': '980ddbfed54546c89c75e94503491a61', 'project_name': None, 'resource_id': '729b50e7-7084-4f08-90ca-48a841f50cc9-sda', 'timestamp': '2025-11-29T07:22:48.325723', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1961507248', 'name': 'instance-00000074', 'instance_id': '729b50e7-7084-4f08-90ca-48a841f50cc9', 'instance_type': 'm1.nano', 'host': 'c7f75f7a20c9811e89034f282d556cbdad4fa495e2d09a505238d522', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '351713b0-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.809248871, 'message_signature': '39cabdad26c2cc744a606d8b6d406746b18a74b97b7958001d92f6bdcedf1c87'}]}, 'timestamp': '2025-11-29 07:22:48.327810', '_unique_id': '0ff54aee29854714a6ee5c97e6b876e3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.328 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.329 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.330 12 DEBUG ceilometer.compute.pollsters [-] 1be71451-9dcb-4882-afc5-fa2b37f3fa96/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.330 12 DEBUG ceilometer.compute.pollsters [-] 729b50e7-7084-4f08-90ca-48a841f50cc9/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '682035ef-6339-4759-93de-11a474172188', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000073-1be71451-9dcb-4882-afc5-fa2b37f3fa96-tap0b0600de-62', 'timestamp': '2025-11-29T07:22:48.330043', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-210192598', 'name': 'tap0b0600de-62', 'instance_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b0:47:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0b0600de-62'}, 'message_id': '35177878-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.771029312, 'message_signature': '4a2a5d3ac98d7b4d4b5e657a93d70b1ad4bf64fbdffd984bc8482b325a5fdea2'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'a992c32ce5fb4cbab645023852f14adc', 'user_name': None, 'project_id': '980ddbfed54546c89c75e94503491a61', 'project_name': None, 'resource_id': 'instance-00000074-729b50e7-7084-4f08-90ca-48a841f50cc9-tap81aef0aa-c8', 'timestamp': '2025-11-29T07:22:48.330043', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1961507248', 'name': 'tap81aef0aa-c8', 'instance_id': '729b50e7-7084-4f08-90ca-48a841f50cc9', 'instance_type': 'm1.nano', 'host': 'c7f75f7a20c9811e89034f282d556cbdad4fa495e2d09a505238d522', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:62:5e:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap81aef0aa-c8'}, 'message_id': '35178598-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.773757709, 'message_signature': '9ed5b7c031066ebbb6c7d46bec141e034f96df2eb253646dcd89f6e0b7574939'}]}, 'timestamp': '2025-11-29 07:22:48.330735', '_unique_id': '6c2690c8fe6844dd8363d1f3a4bbb416'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.331 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.332 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.333 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.333 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-210192598>, <NovaLikeServer: tempest-ServerRescueTestJSON-server-1961507248>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-210192598>, <NovaLikeServer: tempest-ServerRescueTestJSON-server-1961507248>]
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.333 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.333 12 DEBUG ceilometer.compute.pollsters [-] 1be71451-9dcb-4882-afc5-fa2b37f3fa96/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.334 12 DEBUG ceilometer.compute.pollsters [-] 729b50e7-7084-4f08-90ca-48a841f50cc9/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7660f263-3251-4bfa-90a7-e99940a76ab8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000073-1be71451-9dcb-4882-afc5-fa2b37f3fa96-tap0b0600de-62', 'timestamp': '2025-11-29T07:22:48.333560', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-210192598', 'name': 'tap0b0600de-62', 'instance_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b0:47:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0b0600de-62'}, 'message_id': '351801b2-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.771029312, 'message_signature': 'ea1e56380c927cfee2ad37f846e182bb63c5232859f2248f486a54f71d896837'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'a992c32ce5fb4cbab645023852f14adc', 'user_name': None, 'project_id': '980ddbfed54546c89c75e94503491a61', 'project_name': None, 'resource_id': 'instance-00000074-729b50e7-7084-4f08-90ca-48a841f50cc9-tap81aef0aa-c8', 'timestamp': '2025-11-29T07:22:48.333560', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1961507248', 'name': 'tap81aef0aa-c8', 'instance_id': '729b50e7-7084-4f08-90ca-48a841f50cc9', 'instance_type': 'm1.nano', 'host': 'c7f75f7a20c9811e89034f282d556cbdad4fa495e2d09a505238d522', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:62:5e:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap81aef0aa-c8'}, 'message_id': '35181936-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.773757709, 'message_signature': '99685774b5df35cad7aa5a4c31faf0e53b2f701189e5ad0086406bdf40e71edd'}]}, 'timestamp': '2025-11-29 07:22:48.334517', '_unique_id': 'f2d51bfd92f04924b46421507fb2c6eb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.335 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.336 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.336 12 DEBUG ceilometer.compute.pollsters [-] 1be71451-9dcb-4882-afc5-fa2b37f3fa96/disk.device.read.latency volume: 225856342 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.337 12 DEBUG ceilometer.compute.pollsters [-] 1be71451-9dcb-4882-afc5-fa2b37f3fa96/disk.device.read.latency volume: 19605441 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.337 12 DEBUG ceilometer.compute.pollsters [-] 729b50e7-7084-4f08-90ca-48a841f50cc9/disk.device.read.latency volume: 174181996 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.337 12 DEBUG ceilometer.compute.pollsters [-] 729b50e7-7084-4f08-90ca-48a841f50cc9/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.338 12 DEBUG ceilometer.compute.pollsters [-] 729b50e7-7084-4f08-90ca-48a841f50cc9/disk.device.read.latency volume: 718110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '755696bd-e15b-4134-84f3-ebc1f92bec3d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 225856342, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96-vda', 'timestamp': '2025-11-29T07:22:48.336599', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-210192598', 'name': 'instance-00000073', 'instance_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '35187886-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.779777989, 'message_signature': '221e4c0c33659a2c6695083c14564f59cd9f56d64194cf0c5cd7f66b480907c3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 19605441, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 
'resource_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96-sda', 'timestamp': '2025-11-29T07:22:48.336599', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-210192598', 'name': 'instance-00000073', 'instance_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '35188dd0-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.779777989, 'message_signature': '2531fe646a0229f094de61f70825b85e6db7bb2b371b343d429b87b29239feae'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 174181996, 'user_id': 'a992c32ce5fb4cbab645023852f14adc', 'user_name': None, 'project_id': '980ddbfed54546c89c75e94503491a61', 'project_name': None, 'resource_id': '729b50e7-7084-4f08-90ca-48a841f50cc9-vda', 'timestamp': '2025-11-29T07:22:48.336599', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1961507248', 'name': 'instance-00000074', 'instance_id': '729b50e7-7084-4f08-90ca-48a841f50cc9', 'instance_type': 'm1.nano', 'host': 'c7f75f7a20c9811e89034f282d556cbdad4fa495e2d09a505238d522', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '35189c94-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.809248871, 'message_signature': '5dbd356c32fec68865eb1ebc5f21fd65c828d5bf1c48c2a096dad7d9807365f4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a992c32ce5fb4cbab645023852f14adc', 'user_name': None, 'project_id': '980ddbfed54546c89c75e94503491a61', 'project_name': None, 'resource_id': '729b50e7-7084-4f08-90ca-48a841f50cc9-vdb', 'timestamp': '2025-11-29T07:22:48.336599', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1961507248', 'name': 'instance-00000074', 'instance_id': '729b50e7-7084-4f08-90ca-48a841f50cc9', 'instance_type': 'm1.nano', 'host': 'c7f75f7a20c9811e89034f282d556cbdad4fa495e2d09a505238d522', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vdb'}, 'message_id': '3518a9e6-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.809248871, 'message_signature': '36daf04ac56b40a1e9c1b63dc1fd695ba5cc8b839426016778c84a58d327b621'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 718110, 'user_id': 'a992c32ce5fb4cbab645023852f14adc', 'user_name': None, 
'project_id': '980ddbfed54546c89c75e94503491a61', 'project_name': None, 'resource_id': '729b50e7-7084-4f08-90ca-48a841f50cc9-sda', 'timestamp': '2025-11-29T07:22:48.336599', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1961507248', 'name': 'instance-00000074', 'instance_id': '729b50e7-7084-4f08-90ca-48a841f50cc9', 'instance_type': 'm1.nano', 'host': 'c7f75f7a20c9811e89034f282d556cbdad4fa495e2d09a505238d522', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3518b63e-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.809248871, 'message_signature': '8982721c2f62963d44ad7ecd45fab098287d9721688b33db9a6fa775fb20190d'}]}, 'timestamp': '2025-11-29 07:22:48.338584', '_unique_id': '96332674cada492fa2292a01bd3932fe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.339 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.340 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.340 12 DEBUG ceilometer.compute.pollsters [-] 1be71451-9dcb-4882-afc5-fa2b37f3fa96/network.incoming.packets volume: 27 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.341 12 DEBUG ceilometer.compute.pollsters [-] 729b50e7-7084-4f08-90ca-48a841f50cc9/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6e24ba5e-1682-41d3-ac6b-2e53121078db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 27, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000073-1be71451-9dcb-4882-afc5-fa2b37f3fa96-tap0b0600de-62', 'timestamp': '2025-11-29T07:22:48.340858', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-210192598', 'name': 'tap0b0600de-62', 'instance_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b0:47:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0b0600de-62'}, 'message_id': '35191f84-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.771029312, 'message_signature': '235b607616ea770874c37198f0b2070f84921eda856894b7f700c84790accd83'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'a992c32ce5fb4cbab645023852f14adc', 'user_name': None, 'project_id': '980ddbfed54546c89c75e94503491a61', 'project_name': None, 'resource_id': 'instance-00000074-729b50e7-7084-4f08-90ca-48a841f50cc9-tap81aef0aa-c8', 'timestamp': '2025-11-29T07:22:48.340858', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSON-server-1961507248', 'name': 'tap81aef0aa-c8', 'instance_id': '729b50e7-7084-4f08-90ca-48a841f50cc9', 'instance_type': 'm1.nano', 'host': 'c7f75f7a20c9811e89034f282d556cbdad4fa495e2d09a505238d522', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:62:5e:11', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap81aef0aa-c8'}, 'message_id': '35192cd6-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6415.773757709, 'message_signature': '398217e377ce690bba23d9b83d9bfcd4efe94d468d8df5f449e02c886ede8c25'}]}, 'timestamp': '2025-11-29 07:22:48.341576', '_unique_id': '21012528bc8843d1af19765fa6f3910c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:22:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:22:48.342 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:22:48 compute-0 nova_compute[187185]: 2025-11-29 07:22:48.402 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:48 compute-0 nova_compute[187185]: 2025-11-29 07:22:48.960 187189 DEBUG nova.compute.manager [req-43ca86f1-d55e-4781-81ff-39a548586ef3 req-c44e9ce9-5ccb-4ac5-9a4c-05a4dc8c5745 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Received event network-vif-plugged-81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:22:48 compute-0 nova_compute[187185]: 2025-11-29 07:22:48.960 187189 DEBUG oslo_concurrency.lockutils [req-43ca86f1-d55e-4781-81ff-39a548586ef3 req-c44e9ce9-5ccb-4ac5-9a4c-05a4dc8c5745 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "729b50e7-7084-4f08-90ca-48a841f50cc9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:22:48 compute-0 nova_compute[187185]: 2025-11-29 07:22:48.960 187189 DEBUG oslo_concurrency.lockutils [req-43ca86f1-d55e-4781-81ff-39a548586ef3 req-c44e9ce9-5ccb-4ac5-9a4c-05a4dc8c5745 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "729b50e7-7084-4f08-90ca-48a841f50cc9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:22:48 compute-0 nova_compute[187185]: 2025-11-29 07:22:48.961 187189 DEBUG oslo_concurrency.lockutils [req-43ca86f1-d55e-4781-81ff-39a548586ef3 req-c44e9ce9-5ccb-4ac5-9a4c-05a4dc8c5745 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "729b50e7-7084-4f08-90ca-48a841f50cc9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:22:48 compute-0 nova_compute[187185]: 2025-11-29 07:22:48.961 187189 DEBUG nova.compute.manager [req-43ca86f1-d55e-4781-81ff-39a548586ef3 req-c44e9ce9-5ccb-4ac5-9a4c-05a4dc8c5745 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] No waiting events found dispatching network-vif-plugged-81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:22:48 compute-0 nova_compute[187185]: 2025-11-29 07:22:48.961 187189 WARNING nova.compute.manager [req-43ca86f1-d55e-4781-81ff-39a548586ef3 req-c44e9ce9-5ccb-4ac5-9a4c-05a4dc8c5745 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Received unexpected event network-vif-plugged-81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f for instance with vm_state rescued and task_state None.
Nov 29 07:22:50 compute-0 nova_compute[187185]: 2025-11-29 07:22:50.997 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:53 compute-0 nova_compute[187185]: 2025-11-29 07:22:53.405 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:55 compute-0 podman[233077]: 2025-11-29 07:22:55.851504252 +0000 UTC m=+0.107750553 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Nov 29 07:22:56 compute-0 nova_compute[187185]: 2025-11-29 07:22:55.999 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:22:58 compute-0 nova_compute[187185]: 2025-11-29 07:22:58.409 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:23:01 compute-0 nova_compute[187185]: 2025-11-29 07:23:01.001 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:23:02 compute-0 podman[233103]: 2025-11-29 07:23:02.825681258 +0000 UTC m=+0.092339418 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 07:23:03 compute-0 nova_compute[187185]: 2025-11-29 07:23:03.412 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:23:04 compute-0 podman[233128]: 2025-11-29 07:23:04.832606886 +0000 UTC m=+0.092196893 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 07:23:04 compute-0 podman[233129]: 2025-11-29 07:23:04.865272469 +0000 UTC m=+0.098116181 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 07:23:06 compute-0 nova_compute[187185]: 2025-11-29 07:23:06.003 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:23:08 compute-0 nova_compute[187185]: 2025-11-29 07:23:08.417 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:23:11 compute-0 nova_compute[187185]: 2025-11-29 07:23:11.012 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:23:13 compute-0 nova_compute[187185]: 2025-11-29 07:23:13.422 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:23:15 compute-0 ovn_controller[95281]: 2025-11-29T07:23:15Z|00341|binding|INFO|Releasing lport 6897d2ce-b04d-4d85-9bb6-9da51e7d7f20 from this chassis (sb_readonly=0)
Nov 29 07:23:15 compute-0 nova_compute[187185]: 2025-11-29 07:23:15.557 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:23:15 compute-0 nova_compute[187185]: 2025-11-29 07:23:15.621 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:23:15 compute-0 nova_compute[187185]: 2025-11-29 07:23:15.622 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:23:15 compute-0 nova_compute[187185]: 2025-11-29 07:23:15.622 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:23:15 compute-0 podman[233169]: 2025-11-29 07:23:15.835082297 +0000 UTC m=+0.093928043 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 29 07:23:15 compute-0 podman[233171]: 2025-11-29 07:23:15.841678843 +0000 UTC m=+0.078659302 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 07:23:15 compute-0 podman[233170]: 2025-11-29 07:23:15.873942674 +0000 UTC m=+0.113647430 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=ubi9-minimal-container, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.expose-services=, config_id=edpm, maintainer=Red Hat, Inc.)
Nov 29 07:23:15 compute-0 nova_compute[187185]: 2025-11-29 07:23:15.896 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "refresh_cache-1be71451-9dcb-4882-afc5-fa2b37f3fa96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:23:15 compute-0 nova_compute[187185]: 2025-11-29 07:23:15.896 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquired lock "refresh_cache-1be71451-9dcb-4882-afc5-fa2b37f3fa96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:23:15 compute-0 nova_compute[187185]: 2025-11-29 07:23:15.896 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 29 07:23:15 compute-0 nova_compute[187185]: 2025-11-29 07:23:15.897 187189 DEBUG nova.objects.instance [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 1be71451-9dcb-4882-afc5-fa2b37f3fa96 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:23:16 compute-0 nova_compute[187185]: 2025-11-29 07:23:16.014 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:23:17 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:23:17.320 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=40, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=39) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:23:17 compute-0 nova_compute[187185]: 2025-11-29 07:23:17.322 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:23:17 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:23:17.323 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:23:18 compute-0 nova_compute[187185]: 2025-11-29 07:23:18.424 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:23:21 compute-0 nova_compute[187185]: 2025-11-29 07:23:21.017 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:23:21 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:23:21.325 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '40'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:23:22 compute-0 nova_compute[187185]: 2025-11-29 07:23:22.557 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Updating instance_info_cache with network_info: [{"id": "0b0600de-624c-4784-aeea-87a27d98f344", "address": "fa:16:3e:b0:47:d2", "network": {"id": "f75dc671-4e0c-40f1-8afd-c16b5e416d95", "bridge": "br-int", "label": "tempest-network-smoke--588217173", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb0:47d2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b0600de-62", "ovs_interfaceid": "0b0600de-624c-4784-aeea-87a27d98f344", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:23:22 compute-0 nova_compute[187185]: 2025-11-29 07:23:22.594 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Releasing lock "refresh_cache-1be71451-9dcb-4882-afc5-fa2b37f3fa96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:23:22 compute-0 nova_compute[187185]: 2025-11-29 07:23:22.594 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 29 07:23:22 compute-0 nova_compute[187185]: 2025-11-29 07:23:22.595 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:23:22 compute-0 nova_compute[187185]: 2025-11-29 07:23:22.595 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:23:22 compute-0 nova_compute[187185]: 2025-11-29 07:23:22.596 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:23:22 compute-0 nova_compute[187185]: 2025-11-29 07:23:22.596 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:23:22 compute-0 nova_compute[187185]: 2025-11-29 07:23:22.596 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:23:22 compute-0 nova_compute[187185]: 2025-11-29 07:23:22.596 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:23:22 compute-0 nova_compute[187185]: 2025-11-29 07:23:22.619 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Triggering sync for uuid 1be71451-9dcb-4882-afc5-fa2b37f3fa96 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Nov 29 07:23:22 compute-0 nova_compute[187185]: 2025-11-29 07:23:22.620 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Triggering sync for uuid 729b50e7-7084-4f08-90ca-48a841f50cc9 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Nov 29 07:23:22 compute-0 nova_compute[187185]: 2025-11-29 07:23:22.621 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "1be71451-9dcb-4882-afc5-fa2b37f3fa96" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:23:22 compute-0 nova_compute[187185]: 2025-11-29 07:23:22.622 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "1be71451-9dcb-4882-afc5-fa2b37f3fa96" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:23:22 compute-0 nova_compute[187185]: 2025-11-29 07:23:22.624 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "729b50e7-7084-4f08-90ca-48a841f50cc9" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:23:22 compute-0 nova_compute[187185]: 2025-11-29 07:23:22.624 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "729b50e7-7084-4f08-90ca-48a841f50cc9" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:23:22 compute-0 nova_compute[187185]: 2025-11-29 07:23:22.625 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:23:22 compute-0 nova_compute[187185]: 2025-11-29 07:23:22.626 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:23:22 compute-0 nova_compute[187185]: 2025-11-29 07:23:22.627 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:23:22 compute-0 nova_compute[187185]: 2025-11-29 07:23:22.662 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "1be71451-9dcb-4882-afc5-fa2b37f3fa96" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.040s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:23:22 compute-0 nova_compute[187185]: 2025-11-29 07:23:22.664 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "729b50e7-7084-4f08-90ca-48a841f50cc9" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.040s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:23:22 compute-0 nova_compute[187185]: 2025-11-29 07:23:22.673 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:23:22 compute-0 nova_compute[187185]: 2025-11-29 07:23:22.673 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:23:22 compute-0 nova_compute[187185]: 2025-11-29 07:23:22.674 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:23:22 compute-0 nova_compute[187185]: 2025-11-29 07:23:22.674 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:23:22 compute-0 nova_compute[187185]: 2025-11-29 07:23:22.784 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1be71451-9dcb-4882-afc5-fa2b37f3fa96/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:23:22 compute-0 nova_compute[187185]: 2025-11-29 07:23:22.884 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1be71451-9dcb-4882-afc5-fa2b37f3fa96/disk --force-share --output=json" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:23:22 compute-0 nova_compute[187185]: 2025-11-29 07:23:22.885 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1be71451-9dcb-4882-afc5-fa2b37f3fa96/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:23:22 compute-0 nova_compute[187185]: 2025-11-29 07:23:22.946 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1be71451-9dcb-4882-afc5-fa2b37f3fa96/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:23:22 compute-0 nova_compute[187185]: 2025-11-29 07:23:22.952 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/729b50e7-7084-4f08-90ca-48a841f50cc9/disk.rescue --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:23:23 compute-0 nova_compute[187185]: 2025-11-29 07:23:23.006 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/729b50e7-7084-4f08-90ca-48a841f50cc9/disk.rescue --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:23:23 compute-0 nova_compute[187185]: 2025-11-29 07:23:23.007 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/729b50e7-7084-4f08-90ca-48a841f50cc9/disk.rescue --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:23:23 compute-0 nova_compute[187185]: 2025-11-29 07:23:23.060 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/729b50e7-7084-4f08-90ca-48a841f50cc9/disk.rescue --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:23:23 compute-0 nova_compute[187185]: 2025-11-29 07:23:23.061 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/729b50e7-7084-4f08-90ca-48a841f50cc9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:23:23 compute-0 nova_compute[187185]: 2025-11-29 07:23:23.120 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/729b50e7-7084-4f08-90ca-48a841f50cc9/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:23:23 compute-0 nova_compute[187185]: 2025-11-29 07:23:23.121 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/729b50e7-7084-4f08-90ca-48a841f50cc9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:23:23 compute-0 nova_compute[187185]: 2025-11-29 07:23:23.183 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/729b50e7-7084-4f08-90ca-48a841f50cc9/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:23:23 compute-0 nova_compute[187185]: 2025-11-29 07:23:23.346 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:23:23 compute-0 nova_compute[187185]: 2025-11-29 07:23:23.347 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5399MB free_disk=73.23503112792969GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:23:23 compute-0 nova_compute[187185]: 2025-11-29 07:23:23.347 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:23:23 compute-0 nova_compute[187185]: 2025-11-29 07:23:23.348 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:23:23 compute-0 nova_compute[187185]: 2025-11-29 07:23:23.427 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:23:23 compute-0 nova_compute[187185]: 2025-11-29 07:23:23.649 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance 1be71451-9dcb-4882-afc5-fa2b37f3fa96 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 07:23:23 compute-0 nova_compute[187185]: 2025-11-29 07:23:23.650 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance 729b50e7-7084-4f08-90ca-48a841f50cc9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 07:23:23 compute-0 nova_compute[187185]: 2025-11-29 07:23:23.650 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:23:23 compute-0 nova_compute[187185]: 2025-11-29 07:23:23.651 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:23:23 compute-0 nova_compute[187185]: 2025-11-29 07:23:23.764 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:23:23 compute-0 nova_compute[187185]: 2025-11-29 07:23:23.799 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:23:23 compute-0 nova_compute[187185]: 2025-11-29 07:23:23.843 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:23:23 compute-0 nova_compute[187185]: 2025-11-29 07:23:23.843 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.495s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:23:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:23:25.513 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:23:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:23:25.514 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:23:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:23:25.515 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:23:26 compute-0 nova_compute[187185]: 2025-11-29 07:23:26.020 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:23:26 compute-0 nova_compute[187185]: 2025-11-29 07:23:26.532 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:23:26 compute-0 podman[233251]: 2025-11-29 07:23:26.866841304 +0000 UTC m=+0.128925781 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:23:28 compute-0 nova_compute[187185]: 2025-11-29 07:23:28.433 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:23:31 compute-0 nova_compute[187185]: 2025-11-29 07:23:31.024 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:23:33 compute-0 nova_compute[187185]: 2025-11-29 07:23:33.444 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:23:33 compute-0 podman[233277]: 2025-11-29 07:23:33.779982203 +0000 UTC m=+0.052870183 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 07:23:35 compute-0 podman[233302]: 2025-11-29 07:23:35.828765405 +0000 UTC m=+0.091676389 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3)
Nov 29 07:23:35 compute-0 podman[233301]: 2025-11-29 07:23:35.829223338 +0000 UTC m=+0.091001840 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 29 07:23:36 compute-0 nova_compute[187185]: 2025-11-29 07:23:36.078 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:23:38 compute-0 nova_compute[187185]: 2025-11-29 07:23:38.454 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:23:40 compute-0 nova_compute[187185]: 2025-11-29 07:23:40.444 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:23:41 compute-0 nova_compute[187185]: 2025-11-29 07:23:41.081 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:23:43 compute-0 nova_compute[187185]: 2025-11-29 07:23:43.459 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:23:46 compute-0 nova_compute[187185]: 2025-11-29 07:23:46.084 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:23:46 compute-0 podman[233342]: 2025-11-29 07:23:46.794066777 +0000 UTC m=+0.054167981 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 07:23:46 compute-0 podman[233340]: 2025-11-29 07:23:46.807953479 +0000 UTC m=+0.066072157 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:23:46 compute-0 podman[233341]: 2025-11-29 07:23:46.808195315 +0000 UTC m=+0.068657379 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, version=9.6, release=1755695350, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 29 07:23:48 compute-0 nova_compute[187185]: 2025-11-29 07:23:48.494 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:23:51 compute-0 nova_compute[187185]: 2025-11-29 07:23:51.086 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:23:53 compute-0 nova_compute[187185]: 2025-11-29 07:23:53.498 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:23:54 compute-0 nova_compute[187185]: 2025-11-29 07:23:54.055 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:23:56 compute-0 nova_compute[187185]: 2025-11-29 07:23:56.088 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:23:56 compute-0 nova_compute[187185]: 2025-11-29 07:23:56.827 187189 DEBUG nova.compute.manager [req-18ceae45-0e02-4f04-9942-eb828fb9e01b req-bd987f39-ab51-464d-a775-a6bd83a06af2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Received event network-changed-0b0600de-624c-4784-aeea-87a27d98f344 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:23:56 compute-0 nova_compute[187185]: 2025-11-29 07:23:56.828 187189 DEBUG nova.compute.manager [req-18ceae45-0e02-4f04-9942-eb828fb9e01b req-bd987f39-ab51-464d-a775-a6bd83a06af2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Refreshing instance network info cache due to event network-changed-0b0600de-624c-4784-aeea-87a27d98f344. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:23:56 compute-0 nova_compute[187185]: 2025-11-29 07:23:56.828 187189 DEBUG oslo_concurrency.lockutils [req-18ceae45-0e02-4f04-9942-eb828fb9e01b req-bd987f39-ab51-464d-a775-a6bd83a06af2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-1be71451-9dcb-4882-afc5-fa2b37f3fa96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:23:56 compute-0 nova_compute[187185]: 2025-11-29 07:23:56.828 187189 DEBUG oslo_concurrency.lockutils [req-18ceae45-0e02-4f04-9942-eb828fb9e01b req-bd987f39-ab51-464d-a775-a6bd83a06af2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-1be71451-9dcb-4882-afc5-fa2b37f3fa96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:23:56 compute-0 nova_compute[187185]: 2025-11-29 07:23:56.828 187189 DEBUG nova.network.neutron [req-18ceae45-0e02-4f04-9942-eb828fb9e01b req-bd987f39-ab51-464d-a775-a6bd83a06af2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Refreshing network info cache for port 0b0600de-624c-4784-aeea-87a27d98f344 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:23:56 compute-0 nova_compute[187185]: 2025-11-29 07:23:56.854 187189 DEBUG oslo_concurrency.lockutils [None req-8891e6d8-f041-4f90-9e7d-30b60a0e0e48 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Acquiring lock "729b50e7-7084-4f08-90ca-48a841f50cc9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:23:56 compute-0 nova_compute[187185]: 2025-11-29 07:23:56.855 187189 DEBUG oslo_concurrency.lockutils [None req-8891e6d8-f041-4f90-9e7d-30b60a0e0e48 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lock "729b50e7-7084-4f08-90ca-48a841f50cc9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:23:56 compute-0 nova_compute[187185]: 2025-11-29 07:23:56.855 187189 DEBUG oslo_concurrency.lockutils [None req-8891e6d8-f041-4f90-9e7d-30b60a0e0e48 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Acquiring lock "729b50e7-7084-4f08-90ca-48a841f50cc9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:23:56 compute-0 nova_compute[187185]: 2025-11-29 07:23:56.855 187189 DEBUG oslo_concurrency.lockutils [None req-8891e6d8-f041-4f90-9e7d-30b60a0e0e48 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lock "729b50e7-7084-4f08-90ca-48a841f50cc9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:23:56 compute-0 nova_compute[187185]: 2025-11-29 07:23:56.855 187189 DEBUG oslo_concurrency.lockutils [None req-8891e6d8-f041-4f90-9e7d-30b60a0e0e48 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lock "729b50e7-7084-4f08-90ca-48a841f50cc9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:23:57 compute-0 nova_compute[187185]: 2025-11-29 07:23:57.225 187189 INFO nova.compute.manager [None req-8891e6d8-f041-4f90-9e7d-30b60a0e0e48 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Terminating instance
Nov 29 07:23:57 compute-0 nova_compute[187185]: 2025-11-29 07:23:57.240 187189 DEBUG nova.compute.manager [None req-8891e6d8-f041-4f90-9e7d-30b60a0e0e48 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 07:23:57 compute-0 nova_compute[187185]: 2025-11-29 07:23:57.251 187189 DEBUG oslo_concurrency.lockutils [None req-efe63a8d-a172-4117-baa6-2a5ef1421bb1 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "1be71451-9dcb-4882-afc5-fa2b37f3fa96" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:23:57 compute-0 nova_compute[187185]: 2025-11-29 07:23:57.251 187189 DEBUG oslo_concurrency.lockutils [None req-efe63a8d-a172-4117-baa6-2a5ef1421bb1 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "1be71451-9dcb-4882-afc5-fa2b37f3fa96" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:23:57 compute-0 nova_compute[187185]: 2025-11-29 07:23:57.252 187189 DEBUG oslo_concurrency.lockutils [None req-efe63a8d-a172-4117-baa6-2a5ef1421bb1 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "1be71451-9dcb-4882-afc5-fa2b37f3fa96-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:23:57 compute-0 nova_compute[187185]: 2025-11-29 07:23:57.252 187189 DEBUG oslo_concurrency.lockutils [None req-efe63a8d-a172-4117-baa6-2a5ef1421bb1 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "1be71451-9dcb-4882-afc5-fa2b37f3fa96-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:23:57 compute-0 nova_compute[187185]: 2025-11-29 07:23:57.252 187189 DEBUG oslo_concurrency.lockutils [None req-efe63a8d-a172-4117-baa6-2a5ef1421bb1 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "1be71451-9dcb-4882-afc5-fa2b37f3fa96-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:23:57 compute-0 kernel: tap81aef0aa-c8 (unregistering): left promiscuous mode
Nov 29 07:23:57 compute-0 NetworkManager[55227]: <info>  [1764401037.2707] device (tap81aef0aa-c8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:23:57 compute-0 nova_compute[187185]: 2025-11-29 07:23:57.279 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:23:57 compute-0 ovn_controller[95281]: 2025-11-29T07:23:57Z|00342|binding|INFO|Releasing lport 81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f from this chassis (sb_readonly=0)
Nov 29 07:23:57 compute-0 ovn_controller[95281]: 2025-11-29T07:23:57Z|00343|binding|INFO|Setting lport 81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f down in Southbound
Nov 29 07:23:57 compute-0 ovn_controller[95281]: 2025-11-29T07:23:57Z|00344|binding|INFO|Removing iface tap81aef0aa-c8 ovn-installed in OVS
Nov 29 07:23:57 compute-0 nova_compute[187185]: 2025-11-29 07:23:57.281 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:23:57 compute-0 nova_compute[187185]: 2025-11-29 07:23:57.291 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:23:57 compute-0 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d00000074.scope: Deactivated successfully.
Nov 29 07:23:57 compute-0 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d00000074.scope: Consumed 16.207s CPU time.
Nov 29 07:23:57 compute-0 systemd-machined[153486]: Machine qemu-45-instance-00000074 terminated.
Nov 29 07:23:57 compute-0 podman[233415]: 2025-11-29 07:23:57.384286918 +0000 UTC m=+0.091202306 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 07:23:57 compute-0 nova_compute[187185]: 2025-11-29 07:23:57.522 187189 INFO nova.compute.manager [None req-efe63a8d-a172-4117-baa6-2a5ef1421bb1 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Terminating instance
Nov 29 07:23:57 compute-0 nova_compute[187185]: 2025-11-29 07:23:57.527 187189 INFO nova.virt.libvirt.driver [-] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Instance destroyed successfully.
Nov 29 07:23:57 compute-0 nova_compute[187185]: 2025-11-29 07:23:57.528 187189 DEBUG nova.objects.instance [None req-8891e6d8-f041-4f90-9e7d-30b60a0e0e48 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lazy-loading 'resources' on Instance uuid 729b50e7-7084-4f08-90ca-48a841f50cc9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:23:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:23:57.528 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:62:5e:11 10.100.0.8'], port_security=['fa:16:3e:62:5e:11 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '729b50e7-7084-4f08-90ca-48a841f50cc9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b58443a3-f575-4ff1-951d-e92781861793', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '980ddbfed54546c89c75e94503491a61', 'neutron:revision_number': '6', 'neutron:security_group_ids': '41411d2f-bfa5-47e9-8f9d-c921ac196944', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0ea1aa9c-a11a-4c3e-9a7b-dc58c9931652, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:23:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:23:57.530 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f in datapath b58443a3-f575-4ff1-951d-e92781861793 unbound from our chassis
Nov 29 07:23:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:23:57.532 104254 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network b58443a3-f575-4ff1-951d-e92781861793 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 29 07:23:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:23:57.534 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[f84aa059-2fb3-46d2-9b8c-2cff8db30dd2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:23:57 compute-0 nova_compute[187185]: 2025-11-29 07:23:57.560 187189 DEBUG nova.compute.manager [None req-efe63a8d-a172-4117-baa6-2a5ef1421bb1 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 07:23:57 compute-0 kernel: tap0b0600de-62 (unregistering): left promiscuous mode
Nov 29 07:23:57 compute-0 nova_compute[187185]: 2025-11-29 07:23:57.580 187189 DEBUG nova.virt.libvirt.vif [None req-8891e6d8-f041-4f90-9e7d-30b60a0e0e48 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:21:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-1961507248',display_name='tempest-ServerRescueTestJSON-server-1961507248',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-1961507248',id=116,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:22:45Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='980ddbfed54546c89c75e94503491a61',ramdisk_id='',reservation_id='r-ok5v6h99',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_d
isk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSON-1854570869',owner_user_name='tempest-ServerRescueTestJSON-1854570869-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:22:45Z,user_data=None,user_id='a992c32ce5fb4cbab645023852f14adc',uuid=729b50e7-7084-4f08-90ca-48a841f50cc9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='rescued') vif={"id": "81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f", "address": "fa:16:3e:62:5e:11", "network": {"id": "b58443a3-f575-4ff1-951d-e92781861793", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1930314854-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "980ddbfed54546c89c75e94503491a61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81aef0aa-c8", "ovs_interfaceid": "81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:23:57 compute-0 nova_compute[187185]: 2025-11-29 07:23:57.580 187189 DEBUG nova.network.os_vif_util [None req-8891e6d8-f041-4f90-9e7d-30b60a0e0e48 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Converting VIF {"id": "81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f", "address": "fa:16:3e:62:5e:11", "network": {"id": "b58443a3-f575-4ff1-951d-e92781861793", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1930314854-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "980ddbfed54546c89c75e94503491a61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81aef0aa-c8", "ovs_interfaceid": "81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:23:57 compute-0 NetworkManager[55227]: <info>  [1764401037.5812] device (tap0b0600de-62): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:23:57 compute-0 nova_compute[187185]: 2025-11-29 07:23:57.581 187189 DEBUG nova.network.os_vif_util [None req-8891e6d8-f041-4f90-9e7d-30b60a0e0e48 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:62:5e:11,bridge_name='br-int',has_traffic_filtering=True,id=81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f,network=Network(b58443a3-f575-4ff1-951d-e92781861793),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81aef0aa-c8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:23:57 compute-0 nova_compute[187185]: 2025-11-29 07:23:57.582 187189 DEBUG os_vif [None req-8891e6d8-f041-4f90-9e7d-30b60a0e0e48 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:62:5e:11,bridge_name='br-int',has_traffic_filtering=True,id=81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f,network=Network(b58443a3-f575-4ff1-951d-e92781861793),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81aef0aa-c8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:23:57 compute-0 nova_compute[187185]: 2025-11-29 07:23:57.583 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:23:57 compute-0 nova_compute[187185]: 2025-11-29 07:23:57.583 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap81aef0aa-c8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:23:57 compute-0 nova_compute[187185]: 2025-11-29 07:23:57.586 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:23:57 compute-0 nova_compute[187185]: 2025-11-29 07:23:57.588 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:23:57 compute-0 nova_compute[187185]: 2025-11-29 07:23:57.592 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:23:57 compute-0 ovn_controller[95281]: 2025-11-29T07:23:57Z|00345|binding|INFO|Releasing lport 0b0600de-624c-4784-aeea-87a27d98f344 from this chassis (sb_readonly=0)
Nov 29 07:23:57 compute-0 ovn_controller[95281]: 2025-11-29T07:23:57Z|00346|binding|INFO|Setting lport 0b0600de-624c-4784-aeea-87a27d98f344 down in Southbound
Nov 29 07:23:57 compute-0 ovn_controller[95281]: 2025-11-29T07:23:57Z|00347|binding|INFO|Removing iface tap0b0600de-62 ovn-installed in OVS
Nov 29 07:23:57 compute-0 nova_compute[187185]: 2025-11-29 07:23:57.596 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:23:57 compute-0 nova_compute[187185]: 2025-11-29 07:23:57.597 187189 INFO os_vif [None req-8891e6d8-f041-4f90-9e7d-30b60a0e0e48 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:62:5e:11,bridge_name='br-int',has_traffic_filtering=True,id=81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f,network=Network(b58443a3-f575-4ff1-951d-e92781861793),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81aef0aa-c8')
Nov 29 07:23:57 compute-0 nova_compute[187185]: 2025-11-29 07:23:57.598 187189 INFO nova.virt.libvirt.driver [None req-8891e6d8-f041-4f90-9e7d-30b60a0e0e48 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Deleting instance files /var/lib/nova/instances/729b50e7-7084-4f08-90ca-48a841f50cc9_del
Nov 29 07:23:57 compute-0 nova_compute[187185]: 2025-11-29 07:23:57.598 187189 INFO nova.virt.libvirt.driver [None req-8891e6d8-f041-4f90-9e7d-30b60a0e0e48 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Deletion of /var/lib/nova/instances/729b50e7-7084-4f08-90ca-48a841f50cc9_del complete
Nov 29 07:23:57 compute-0 nova_compute[187185]: 2025-11-29 07:23:57.610 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:23:57 compute-0 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d00000073.scope: Deactivated successfully.
Nov 29 07:23:57 compute-0 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d00000073.scope: Consumed 18.145s CPU time.
Nov 29 07:23:57 compute-0 systemd-machined[153486]: Machine qemu-43-instance-00000073 terminated.
Nov 29 07:23:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:23:57.699 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b0:47:d2 10.100.0.13 2001:db8::f816:3eff:feb0:47d2'], port_security=['fa:16:3e:b0:47:d2 10.100.0.13 2001:db8::f816:3eff:feb0:47d2'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28 2001:db8::f816:3eff:feb0:47d2/64', 'neutron:device_id': '1be71451-9dcb-4882-afc5-fa2b37f3fa96', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f75dc671-4e0c-40f1-8afd-c16b5e416d95', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '17fd93d9-fafe-4a7d-9c01-ce54fbe8f760', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=944fc855-be48-4f5c-ba58-0898fe543a04, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=0b0600de-624c-4784-aeea-87a27d98f344) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:23:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:23:57.700 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 0b0600de-624c-4784-aeea-87a27d98f344 in datapath f75dc671-4e0c-40f1-8afd-c16b5e416d95 unbound from our chassis
Nov 29 07:23:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:23:57.703 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f75dc671-4e0c-40f1-8afd-c16b5e416d95, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:23:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:23:57.704 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[0fcfad2d-e818-4e5a-8670-f7e21800e675]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:23:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:23:57.704 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95 namespace which is not needed anymore
Nov 29 07:23:57 compute-0 nova_compute[187185]: 2025-11-29 07:23:57.762 187189 INFO nova.compute.manager [None req-8891e6d8-f041-4f90-9e7d-30b60a0e0e48 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Took 0.52 seconds to destroy the instance on the hypervisor.
Nov 29 07:23:57 compute-0 nova_compute[187185]: 2025-11-29 07:23:57.763 187189 DEBUG oslo.service.loopingcall [None req-8891e6d8-f041-4f90-9e7d-30b60a0e0e48 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 07:23:57 compute-0 nova_compute[187185]: 2025-11-29 07:23:57.764 187189 DEBUG nova.compute.manager [-] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 07:23:57 compute-0 nova_compute[187185]: 2025-11-29 07:23:57.764 187189 DEBUG nova.network.neutron [-] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 07:23:57 compute-0 NetworkManager[55227]: <info>  [1764401037.7803] manager: (tap0b0600de-62): new Tun device (/org/freedesktop/NetworkManager/Devices/183)
Nov 29 07:23:57 compute-0 nova_compute[187185]: 2025-11-29 07:23:57.823 187189 INFO nova.virt.libvirt.driver [-] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Instance destroyed successfully.
Nov 29 07:23:57 compute-0 nova_compute[187185]: 2025-11-29 07:23:57.824 187189 DEBUG nova.objects.instance [None req-efe63a8d-a172-4117-baa6-2a5ef1421bb1 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'resources' on Instance uuid 1be71451-9dcb-4882-afc5-fa2b37f3fa96 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:23:57 compute-0 neutron-haproxy-ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95[232809]: [NOTICE]   (232814) : haproxy version is 2.8.14-c23fe91
Nov 29 07:23:57 compute-0 neutron-haproxy-ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95[232809]: [NOTICE]   (232814) : path to executable is /usr/sbin/haproxy
Nov 29 07:23:57 compute-0 neutron-haproxy-ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95[232809]: [WARNING]  (232814) : Exiting Master process...
Nov 29 07:23:57 compute-0 neutron-haproxy-ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95[232809]: [ALERT]    (232814) : Current worker (232816) exited with code 143 (Terminated)
Nov 29 07:23:57 compute-0 neutron-haproxy-ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95[232809]: [WARNING]  (232814) : All workers exited. Exiting... (0)
Nov 29 07:23:57 compute-0 systemd[1]: libpod-45435afbfd6fea5f31eeb43dbfebbf120bf5e5419cc0e203aecb1e450dc74bf5.scope: Deactivated successfully.
Nov 29 07:23:57 compute-0 podman[233500]: 2025-11-29 07:23:57.863015003 +0000 UTC m=+0.048563162 container died 45435afbfd6fea5f31eeb43dbfebbf120bf5e5419cc0e203aecb1e450dc74bf5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 29 07:23:57 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-45435afbfd6fea5f31eeb43dbfebbf120bf5e5419cc0e203aecb1e450dc74bf5-userdata-shm.mount: Deactivated successfully.
Nov 29 07:23:57 compute-0 systemd[1]: var-lib-containers-storage-overlay-84478e0d93e119f51dbf82e39562f63d97cc070d4c52c626eaf8a590132d34d8-merged.mount: Deactivated successfully.
Nov 29 07:23:57 compute-0 podman[233500]: 2025-11-29 07:23:57.905303337 +0000 UTC m=+0.090851486 container cleanup 45435afbfd6fea5f31eeb43dbfebbf120bf5e5419cc0e203aecb1e450dc74bf5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 29 07:23:57 compute-0 systemd[1]: libpod-conmon-45435afbfd6fea5f31eeb43dbfebbf120bf5e5419cc0e203aecb1e450dc74bf5.scope: Deactivated successfully.
Nov 29 07:23:57 compute-0 podman[233534]: 2025-11-29 07:23:57.975289083 +0000 UTC m=+0.039886008 container remove 45435afbfd6fea5f31eeb43dbfebbf120bf5e5419cc0e203aecb1e450dc74bf5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:23:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:23:57.980 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[229834f4-1f2a-4f44-a539-4572ee44ece7]: (4, ('Sat Nov 29 07:23:57 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95 (45435afbfd6fea5f31eeb43dbfebbf120bf5e5419cc0e203aecb1e450dc74bf5)\n45435afbfd6fea5f31eeb43dbfebbf120bf5e5419cc0e203aecb1e450dc74bf5\nSat Nov 29 07:23:57 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95 (45435afbfd6fea5f31eeb43dbfebbf120bf5e5419cc0e203aecb1e450dc74bf5)\n45435afbfd6fea5f31eeb43dbfebbf120bf5e5419cc0e203aecb1e450dc74bf5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:23:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:23:57.983 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[b62a6696-d20d-41c8-aa61-1d76419d5fa7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:23:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:23:57.984 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf75dc671-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:23:57 compute-0 nova_compute[187185]: 2025-11-29 07:23:57.986 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:23:57 compute-0 kernel: tapf75dc671-40: left promiscuous mode
Nov 29 07:23:58 compute-0 nova_compute[187185]: 2025-11-29 07:23:58.009 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:23:58 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:23:58.013 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[4b4cbb9b-14d6-482e-9727-8b8fd39c19f8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:23:58 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:23:58.031 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[3347345c-5f15-4026-807d-03c481e545e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:23:58 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:23:58.033 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[67cc06e4-cc98-460b-a69e-a8de62f40af5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:23:58 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:23:58.056 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[868a26e1-9791-4c06-9189-6e53c4400fcc]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 638193, 'reachable_time': 39053, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233554, 'error': None, 'target': 'ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:23:58 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:23:58.060 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:23:58 compute-0 systemd[1]: run-netns-ovnmeta\x2df75dc671\x2d4e0c\x2d40f1\x2d8afd\x2dc16b5e416d95.mount: Deactivated successfully.
Nov 29 07:23:58 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:23:58.060 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[c69e5dbf-a279-47a7-bb23-72dd626f64c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:23:58 compute-0 nova_compute[187185]: 2025-11-29 07:23:58.334 187189 DEBUG nova.virt.libvirt.vif [None req-efe63a8d-a172-4117-baa6-2a5ef1421bb1 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:21:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-210192598',display_name='tempest-TestGettingAddress-server-210192598',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-210192598',id=115,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMqBtOVeWFxVzcYFJOuDJtYVuL20oDyqcRBPHq57GiuWFxaCS3KceqmhPXeIi9sFvrUoM3x5G9a+RY7U7UfyTQLwWhQmn8+j5tk7QGxgOZ6WpsSYFLeoEl1770NJZUoryw==',key_name='tempest-TestGettingAddress-2009457088',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:22:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-tnig7st0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:22:15Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=1be71451-9dcb-4882-afc5-fa2b37f3fa96,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0b0600de-624c-4784-aeea-87a27d98f344", "address": "fa:16:3e:b0:47:d2", "network": {"id": "f75dc671-4e0c-40f1-8afd-c16b5e416d95", "bridge": "br-int", "label": "tempest-network-smoke--588217173", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb0:47d2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b0600de-62", "ovs_interfaceid": "0b0600de-624c-4784-aeea-87a27d98f344", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:23:58 compute-0 nova_compute[187185]: 2025-11-29 07:23:58.334 187189 DEBUG nova.network.os_vif_util [None req-efe63a8d-a172-4117-baa6-2a5ef1421bb1 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "0b0600de-624c-4784-aeea-87a27d98f344", "address": "fa:16:3e:b0:47:d2", "network": {"id": "f75dc671-4e0c-40f1-8afd-c16b5e416d95", "bridge": "br-int", "label": "tempest-network-smoke--588217173", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb0:47d2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b0600de-62", "ovs_interfaceid": "0b0600de-624c-4784-aeea-87a27d98f344", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:23:58 compute-0 nova_compute[187185]: 2025-11-29 07:23:58.337 187189 DEBUG nova.network.os_vif_util [None req-efe63a8d-a172-4117-baa6-2a5ef1421bb1 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b0:47:d2,bridge_name='br-int',has_traffic_filtering=True,id=0b0600de-624c-4784-aeea-87a27d98f344,network=Network(f75dc671-4e0c-40f1-8afd-c16b5e416d95),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b0600de-62') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:23:58 compute-0 nova_compute[187185]: 2025-11-29 07:23:58.337 187189 DEBUG os_vif [None req-efe63a8d-a172-4117-baa6-2a5ef1421bb1 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b0:47:d2,bridge_name='br-int',has_traffic_filtering=True,id=0b0600de-624c-4784-aeea-87a27d98f344,network=Network(f75dc671-4e0c-40f1-8afd-c16b5e416d95),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b0600de-62') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:23:58 compute-0 nova_compute[187185]: 2025-11-29 07:23:58.340 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:23:58 compute-0 nova_compute[187185]: 2025-11-29 07:23:58.340 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0b0600de-62, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:23:58 compute-0 nova_compute[187185]: 2025-11-29 07:23:58.342 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:23:58 compute-0 nova_compute[187185]: 2025-11-29 07:23:58.345 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:23:58 compute-0 nova_compute[187185]: 2025-11-29 07:23:58.348 187189 INFO os_vif [None req-efe63a8d-a172-4117-baa6-2a5ef1421bb1 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b0:47:d2,bridge_name='br-int',has_traffic_filtering=True,id=0b0600de-624c-4784-aeea-87a27d98f344,network=Network(f75dc671-4e0c-40f1-8afd-c16b5e416d95),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b0600de-62')
Nov 29 07:23:58 compute-0 nova_compute[187185]: 2025-11-29 07:23:58.348 187189 INFO nova.virt.libvirt.driver [None req-efe63a8d-a172-4117-baa6-2a5ef1421bb1 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Deleting instance files /var/lib/nova/instances/1be71451-9dcb-4882-afc5-fa2b37f3fa96_del
Nov 29 07:23:58 compute-0 nova_compute[187185]: 2025-11-29 07:23:58.349 187189 INFO nova.virt.libvirt.driver [None req-efe63a8d-a172-4117-baa6-2a5ef1421bb1 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Deletion of /var/lib/nova/instances/1be71451-9dcb-4882-afc5-fa2b37f3fa96_del complete
Nov 29 07:23:59 compute-0 nova_compute[187185]: 2025-11-29 07:23:59.194 187189 DEBUG nova.compute.manager [req-1dbad22d-9636-41c1-acb3-261f044787e7 req-473b6fd3-5a07-46f6-9470-12627413dfd5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Received event network-vif-unplugged-81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:23:59 compute-0 nova_compute[187185]: 2025-11-29 07:23:59.195 187189 DEBUG oslo_concurrency.lockutils [req-1dbad22d-9636-41c1-acb3-261f044787e7 req-473b6fd3-5a07-46f6-9470-12627413dfd5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "729b50e7-7084-4f08-90ca-48a841f50cc9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:23:59 compute-0 nova_compute[187185]: 2025-11-29 07:23:59.196 187189 DEBUG oslo_concurrency.lockutils [req-1dbad22d-9636-41c1-acb3-261f044787e7 req-473b6fd3-5a07-46f6-9470-12627413dfd5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "729b50e7-7084-4f08-90ca-48a841f50cc9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:23:59 compute-0 nova_compute[187185]: 2025-11-29 07:23:59.196 187189 DEBUG oslo_concurrency.lockutils [req-1dbad22d-9636-41c1-acb3-261f044787e7 req-473b6fd3-5a07-46f6-9470-12627413dfd5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "729b50e7-7084-4f08-90ca-48a841f50cc9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:23:59 compute-0 nova_compute[187185]: 2025-11-29 07:23:59.197 187189 DEBUG nova.compute.manager [req-1dbad22d-9636-41c1-acb3-261f044787e7 req-473b6fd3-5a07-46f6-9470-12627413dfd5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] No waiting events found dispatching network-vif-unplugged-81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:23:59 compute-0 nova_compute[187185]: 2025-11-29 07:23:59.197 187189 DEBUG nova.compute.manager [req-1dbad22d-9636-41c1-acb3-261f044787e7 req-473b6fd3-5a07-46f6-9470-12627413dfd5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Received event network-vif-unplugged-81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 07:23:59 compute-0 nova_compute[187185]: 2025-11-29 07:23:59.229 187189 INFO nova.compute.manager [None req-efe63a8d-a172-4117-baa6-2a5ef1421bb1 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Took 1.67 seconds to destroy the instance on the hypervisor.
Nov 29 07:23:59 compute-0 nova_compute[187185]: 2025-11-29 07:23:59.230 187189 DEBUG oslo.service.loopingcall [None req-efe63a8d-a172-4117-baa6-2a5ef1421bb1 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 07:23:59 compute-0 nova_compute[187185]: 2025-11-29 07:23:59.230 187189 DEBUG nova.compute.manager [-] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 07:23:59 compute-0 nova_compute[187185]: 2025-11-29 07:23:59.230 187189 DEBUG nova.network.neutron [-] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 07:23:59 compute-0 nova_compute[187185]: 2025-11-29 07:23:59.814 187189 DEBUG nova.network.neutron [-] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:23:59 compute-0 nova_compute[187185]: 2025-11-29 07:23:59.843 187189 DEBUG nova.compute.manager [req-f0967308-6bca-4836-8d9f-b4c0f4a31083 req-a6004976-fe72-4ca5-80fb-3c9669a27d34 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Received event network-vif-deleted-81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:23:59 compute-0 nova_compute[187185]: 2025-11-29 07:23:59.844 187189 INFO nova.compute.manager [req-f0967308-6bca-4836-8d9f-b4c0f4a31083 req-a6004976-fe72-4ca5-80fb-3c9669a27d34 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Neutron deleted interface 81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f; detaching it from the instance and deleting it from the info cache
Nov 29 07:23:59 compute-0 nova_compute[187185]: 2025-11-29 07:23:59.844 187189 DEBUG nova.network.neutron [req-f0967308-6bca-4836-8d9f-b4c0f4a31083 req-a6004976-fe72-4ca5-80fb-3c9669a27d34 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:23:59 compute-0 nova_compute[187185]: 2025-11-29 07:23:59.859 187189 INFO nova.compute.manager [-] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Took 2.09 seconds to deallocate network for instance.
Nov 29 07:23:59 compute-0 nova_compute[187185]: 2025-11-29 07:23:59.868 187189 DEBUG nova.compute.manager [req-f0967308-6bca-4836-8d9f-b4c0f4a31083 req-a6004976-fe72-4ca5-80fb-3c9669a27d34 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Detach interface failed, port_id=81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f, reason: Instance 729b50e7-7084-4f08-90ca-48a841f50cc9 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Nov 29 07:24:00 compute-0 nova_compute[187185]: 2025-11-29 07:24:00.247 187189 DEBUG oslo_concurrency.lockutils [None req-8891e6d8-f041-4f90-9e7d-30b60a0e0e48 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:24:00 compute-0 nova_compute[187185]: 2025-11-29 07:24:00.247 187189 DEBUG oslo_concurrency.lockutils [None req-8891e6d8-f041-4f90-9e7d-30b60a0e0e48 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:24:00 compute-0 nova_compute[187185]: 2025-11-29 07:24:00.331 187189 DEBUG nova.compute.provider_tree [None req-8891e6d8-f041-4f90-9e7d-30b60a0e0e48 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:24:00 compute-0 nova_compute[187185]: 2025-11-29 07:24:00.397 187189 DEBUG nova.network.neutron [req-18ceae45-0e02-4f04-9942-eb828fb9e01b req-bd987f39-ab51-464d-a775-a6bd83a06af2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Updated VIF entry in instance network info cache for port 0b0600de-624c-4784-aeea-87a27d98f344. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:24:00 compute-0 nova_compute[187185]: 2025-11-29 07:24:00.397 187189 DEBUG nova.network.neutron [req-18ceae45-0e02-4f04-9942-eb828fb9e01b req-bd987f39-ab51-464d-a775-a6bd83a06af2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Updating instance_info_cache with network_info: [{"id": "0b0600de-624c-4784-aeea-87a27d98f344", "address": "fa:16:3e:b0:47:d2", "network": {"id": "f75dc671-4e0c-40f1-8afd-c16b5e416d95", "bridge": "br-int", "label": "tempest-network-smoke--588217173", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb0:47d2", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b0600de-62", "ovs_interfaceid": "0b0600de-624c-4784-aeea-87a27d98f344", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:24:00 compute-0 nova_compute[187185]: 2025-11-29 07:24:00.400 187189 DEBUG nova.scheduler.client.report [None req-8891e6d8-f041-4f90-9e7d-30b60a0e0e48 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:24:00 compute-0 nova_compute[187185]: 2025-11-29 07:24:00.744 187189 DEBUG oslo_concurrency.lockutils [None req-8891e6d8-f041-4f90-9e7d-30b60a0e0e48 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.497s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:24:00 compute-0 nova_compute[187185]: 2025-11-29 07:24:00.748 187189 DEBUG oslo_concurrency.lockutils [req-18ceae45-0e02-4f04-9942-eb828fb9e01b req-bd987f39-ab51-464d-a775-a6bd83a06af2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-1be71451-9dcb-4882-afc5-fa2b37f3fa96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:24:00 compute-0 nova_compute[187185]: 2025-11-29 07:24:00.847 187189 INFO nova.scheduler.client.report [None req-8891e6d8-f041-4f90-9e7d-30b60a0e0e48 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Deleted allocations for instance 729b50e7-7084-4f08-90ca-48a841f50cc9
Nov 29 07:24:00 compute-0 nova_compute[187185]: 2025-11-29 07:24:00.977 187189 DEBUG oslo_concurrency.lockutils [None req-8891e6d8-f041-4f90-9e7d-30b60a0e0e48 a992c32ce5fb4cbab645023852f14adc 980ddbfed54546c89c75e94503491a61 - - default default] Lock "729b50e7-7084-4f08-90ca-48a841f50cc9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:24:01 compute-0 nova_compute[187185]: 2025-11-29 07:24:01.090 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:01 compute-0 nova_compute[187185]: 2025-11-29 07:24:01.314 187189 DEBUG nova.compute.manager [req-f9097905-9c58-4c64-8f46-9ecc6ea36611 req-a4e758c0-2628-4ccc-b0a6-2468da99b10b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Received event network-vif-plugged-81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:24:01 compute-0 nova_compute[187185]: 2025-11-29 07:24:01.314 187189 DEBUG oslo_concurrency.lockutils [req-f9097905-9c58-4c64-8f46-9ecc6ea36611 req-a4e758c0-2628-4ccc-b0a6-2468da99b10b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "729b50e7-7084-4f08-90ca-48a841f50cc9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:24:01 compute-0 nova_compute[187185]: 2025-11-29 07:24:01.315 187189 DEBUG oslo_concurrency.lockutils [req-f9097905-9c58-4c64-8f46-9ecc6ea36611 req-a4e758c0-2628-4ccc-b0a6-2468da99b10b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "729b50e7-7084-4f08-90ca-48a841f50cc9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:24:01 compute-0 nova_compute[187185]: 2025-11-29 07:24:01.315 187189 DEBUG oslo_concurrency.lockutils [req-f9097905-9c58-4c64-8f46-9ecc6ea36611 req-a4e758c0-2628-4ccc-b0a6-2468da99b10b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "729b50e7-7084-4f08-90ca-48a841f50cc9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:24:01 compute-0 nova_compute[187185]: 2025-11-29 07:24:01.315 187189 DEBUG nova.compute.manager [req-f9097905-9c58-4c64-8f46-9ecc6ea36611 req-a4e758c0-2628-4ccc-b0a6-2468da99b10b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] No waiting events found dispatching network-vif-plugged-81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:24:01 compute-0 nova_compute[187185]: 2025-11-29 07:24:01.315 187189 WARNING nova.compute.manager [req-f9097905-9c58-4c64-8f46-9ecc6ea36611 req-a4e758c0-2628-4ccc-b0a6-2468da99b10b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Received unexpected event network-vif-plugged-81aef0aa-c82c-44a0-8a3f-f3e8d2cc8c9f for instance with vm_state deleted and task_state None.
Nov 29 07:24:01 compute-0 nova_compute[187185]: 2025-11-29 07:24:01.315 187189 DEBUG nova.compute.manager [req-f9097905-9c58-4c64-8f46-9ecc6ea36611 req-a4e758c0-2628-4ccc-b0a6-2468da99b10b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Received event network-vif-unplugged-0b0600de-624c-4784-aeea-87a27d98f344 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:24:01 compute-0 nova_compute[187185]: 2025-11-29 07:24:01.316 187189 DEBUG oslo_concurrency.lockutils [req-f9097905-9c58-4c64-8f46-9ecc6ea36611 req-a4e758c0-2628-4ccc-b0a6-2468da99b10b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "1be71451-9dcb-4882-afc5-fa2b37f3fa96-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:24:01 compute-0 nova_compute[187185]: 2025-11-29 07:24:01.316 187189 DEBUG oslo_concurrency.lockutils [req-f9097905-9c58-4c64-8f46-9ecc6ea36611 req-a4e758c0-2628-4ccc-b0a6-2468da99b10b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1be71451-9dcb-4882-afc5-fa2b37f3fa96-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:24:01 compute-0 nova_compute[187185]: 2025-11-29 07:24:01.316 187189 DEBUG oslo_concurrency.lockutils [req-f9097905-9c58-4c64-8f46-9ecc6ea36611 req-a4e758c0-2628-4ccc-b0a6-2468da99b10b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1be71451-9dcb-4882-afc5-fa2b37f3fa96-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:24:01 compute-0 nova_compute[187185]: 2025-11-29 07:24:01.316 187189 DEBUG nova.compute.manager [req-f9097905-9c58-4c64-8f46-9ecc6ea36611 req-a4e758c0-2628-4ccc-b0a6-2468da99b10b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] No waiting events found dispatching network-vif-unplugged-0b0600de-624c-4784-aeea-87a27d98f344 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:24:01 compute-0 nova_compute[187185]: 2025-11-29 07:24:01.316 187189 DEBUG nova.compute.manager [req-f9097905-9c58-4c64-8f46-9ecc6ea36611 req-a4e758c0-2628-4ccc-b0a6-2468da99b10b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Received event network-vif-unplugged-0b0600de-624c-4784-aeea-87a27d98f344 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 07:24:01 compute-0 nova_compute[187185]: 2025-11-29 07:24:01.316 187189 DEBUG nova.compute.manager [req-f9097905-9c58-4c64-8f46-9ecc6ea36611 req-a4e758c0-2628-4ccc-b0a6-2468da99b10b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Received event network-vif-plugged-0b0600de-624c-4784-aeea-87a27d98f344 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:24:01 compute-0 nova_compute[187185]: 2025-11-29 07:24:01.317 187189 DEBUG oslo_concurrency.lockutils [req-f9097905-9c58-4c64-8f46-9ecc6ea36611 req-a4e758c0-2628-4ccc-b0a6-2468da99b10b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "1be71451-9dcb-4882-afc5-fa2b37f3fa96-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:24:01 compute-0 nova_compute[187185]: 2025-11-29 07:24:01.317 187189 DEBUG oslo_concurrency.lockutils [req-f9097905-9c58-4c64-8f46-9ecc6ea36611 req-a4e758c0-2628-4ccc-b0a6-2468da99b10b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1be71451-9dcb-4882-afc5-fa2b37f3fa96-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:24:01 compute-0 nova_compute[187185]: 2025-11-29 07:24:01.317 187189 DEBUG oslo_concurrency.lockutils [req-f9097905-9c58-4c64-8f46-9ecc6ea36611 req-a4e758c0-2628-4ccc-b0a6-2468da99b10b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1be71451-9dcb-4882-afc5-fa2b37f3fa96-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:24:01 compute-0 nova_compute[187185]: 2025-11-29 07:24:01.317 187189 DEBUG nova.compute.manager [req-f9097905-9c58-4c64-8f46-9ecc6ea36611 req-a4e758c0-2628-4ccc-b0a6-2468da99b10b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] No waiting events found dispatching network-vif-plugged-0b0600de-624c-4784-aeea-87a27d98f344 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:24:01 compute-0 nova_compute[187185]: 2025-11-29 07:24:01.317 187189 WARNING nova.compute.manager [req-f9097905-9c58-4c64-8f46-9ecc6ea36611 req-a4e758c0-2628-4ccc-b0a6-2468da99b10b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Received unexpected event network-vif-plugged-0b0600de-624c-4784-aeea-87a27d98f344 for instance with vm_state active and task_state deleting.
Nov 29 07:24:01 compute-0 nova_compute[187185]: 2025-11-29 07:24:01.499 187189 DEBUG nova.network.neutron [-] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:24:01 compute-0 nova_compute[187185]: 2025-11-29 07:24:01.532 187189 INFO nova.compute.manager [-] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Took 2.30 seconds to deallocate network for instance.
Nov 29 07:24:01 compute-0 nova_compute[187185]: 2025-11-29 07:24:01.641 187189 DEBUG oslo_concurrency.lockutils [None req-efe63a8d-a172-4117-baa6-2a5ef1421bb1 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:24:01 compute-0 nova_compute[187185]: 2025-11-29 07:24:01.641 187189 DEBUG oslo_concurrency.lockutils [None req-efe63a8d-a172-4117-baa6-2a5ef1421bb1 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:24:01 compute-0 nova_compute[187185]: 2025-11-29 07:24:01.685 187189 DEBUG nova.compute.provider_tree [None req-efe63a8d-a172-4117-baa6-2a5ef1421bb1 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:24:01 compute-0 nova_compute[187185]: 2025-11-29 07:24:01.698 187189 DEBUG nova.scheduler.client.report [None req-efe63a8d-a172-4117-baa6-2a5ef1421bb1 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:24:01 compute-0 nova_compute[187185]: 2025-11-29 07:24:01.720 187189 DEBUG oslo_concurrency.lockutils [None req-efe63a8d-a172-4117-baa6-2a5ef1421bb1 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.079s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:24:01 compute-0 nova_compute[187185]: 2025-11-29 07:24:01.741 187189 INFO nova.scheduler.client.report [None req-efe63a8d-a172-4117-baa6-2a5ef1421bb1 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Deleted allocations for instance 1be71451-9dcb-4882-afc5-fa2b37f3fa96
Nov 29 07:24:01 compute-0 nova_compute[187185]: 2025-11-29 07:24:01.828 187189 DEBUG oslo_concurrency.lockutils [None req-efe63a8d-a172-4117-baa6-2a5ef1421bb1 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "1be71451-9dcb-4882-afc5-fa2b37f3fa96" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.576s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:24:03 compute-0 nova_compute[187185]: 2025-11-29 07:24:03.146 187189 DEBUG nova.compute.manager [req-cda71621-896f-48c2-ae0d-8b299c501d88 req-659c340d-2984-4779-b66f-10dadaacc141 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Received event network-vif-deleted-0b0600de-624c-4784-aeea-87a27d98f344 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:24:03 compute-0 nova_compute[187185]: 2025-11-29 07:24:03.343 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:04 compute-0 podman[233556]: 2025-11-29 07:24:04.802227609 +0000 UTC m=+0.062462434 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 07:24:06 compute-0 nova_compute[187185]: 2025-11-29 07:24:06.092 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:06 compute-0 podman[233580]: 2025-11-29 07:24:06.791724196 +0000 UTC m=+0.060990493 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 29 07:24:06 compute-0 podman[233581]: 2025-11-29 07:24:06.793950199 +0000 UTC m=+0.061842807 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 07:24:08 compute-0 nova_compute[187185]: 2025-11-29 07:24:08.345 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:10 compute-0 nova_compute[187185]: 2025-11-29 07:24:10.189 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:10 compute-0 nova_compute[187185]: 2025-11-29 07:24:10.409 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:11 compute-0 nova_compute[187185]: 2025-11-29 07:24:11.093 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:12 compute-0 nova_compute[187185]: 2025-11-29 07:24:12.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:24:12 compute-0 nova_compute[187185]: 2025-11-29 07:24:12.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:24:12 compute-0 nova_compute[187185]: 2025-11-29 07:24:12.342 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 07:24:12 compute-0 nova_compute[187185]: 2025-11-29 07:24:12.342 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:24:12 compute-0 nova_compute[187185]: 2025-11-29 07:24:12.525 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764401037.523417, 729b50e7-7084-4f08-90ca-48a841f50cc9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:24:12 compute-0 nova_compute[187185]: 2025-11-29 07:24:12.526 187189 INFO nova.compute.manager [-] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] VM Stopped (Lifecycle Event)
Nov 29 07:24:12 compute-0 nova_compute[187185]: 2025-11-29 07:24:12.550 187189 DEBUG nova.compute.manager [None req-23c53718-1d6f-4b33-b9ca-28c58cae0016 - - - - - -] [instance: 729b50e7-7084-4f08-90ca-48a841f50cc9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:24:12 compute-0 nova_compute[187185]: 2025-11-29 07:24:12.819 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764401037.8184166, 1be71451-9dcb-4882-afc5-fa2b37f3fa96 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:24:12 compute-0 nova_compute[187185]: 2025-11-29 07:24:12.820 187189 INFO nova.compute.manager [-] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] VM Stopped (Lifecycle Event)
Nov 29 07:24:12 compute-0 nova_compute[187185]: 2025-11-29 07:24:12.853 187189 DEBUG nova.compute.manager [None req-422caa84-20e4-4027-93e9-fd8dd3d66ed2 - - - - - -] [instance: 1be71451-9dcb-4882-afc5-fa2b37f3fa96] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:24:13 compute-0 nova_compute[187185]: 2025-11-29 07:24:13.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:24:13 compute-0 nova_compute[187185]: 2025-11-29 07:24:13.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:24:13 compute-0 nova_compute[187185]: 2025-11-29 07:24:13.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:24:13 compute-0 nova_compute[187185]: 2025-11-29 07:24:13.347 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:13 compute-0 nova_compute[187185]: 2025-11-29 07:24:13.350 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:24:13 compute-0 nova_compute[187185]: 2025-11-29 07:24:13.350 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:24:13 compute-0 nova_compute[187185]: 2025-11-29 07:24:13.350 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:24:13 compute-0 nova_compute[187185]: 2025-11-29 07:24:13.351 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:24:13 compute-0 nova_compute[187185]: 2025-11-29 07:24:13.566 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:24:13 compute-0 nova_compute[187185]: 2025-11-29 07:24:13.567 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5734MB free_disk=73.29442596435547GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:24:13 compute-0 nova_compute[187185]: 2025-11-29 07:24:13.567 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:24:13 compute-0 nova_compute[187185]: 2025-11-29 07:24:13.568 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:24:13 compute-0 nova_compute[187185]: 2025-11-29 07:24:13.819 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:24:13 compute-0 nova_compute[187185]: 2025-11-29 07:24:13.819 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:24:13 compute-0 nova_compute[187185]: 2025-11-29 07:24:13.890 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Refreshing inventories for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 29 07:24:13 compute-0 nova_compute[187185]: 2025-11-29 07:24:13.975 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Updating ProviderTree inventory for provider 4e39a026-df39-4e20-874a-dbb5a40df044 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 29 07:24:13 compute-0 nova_compute[187185]: 2025-11-29 07:24:13.976 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Updating inventory in ProviderTree for provider 4e39a026-df39-4e20-874a-dbb5a40df044 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 29 07:24:14 compute-0 nova_compute[187185]: 2025-11-29 07:24:14.009 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Refreshing aggregate associations for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 29 07:24:14 compute-0 nova_compute[187185]: 2025-11-29 07:24:14.033 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Refreshing trait associations for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044, traits: HW_CPU_X86_SSE,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_TRUSTED_CERTS,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 29 07:24:14 compute-0 nova_compute[187185]: 2025-11-29 07:24:14.058 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:24:14 compute-0 nova_compute[187185]: 2025-11-29 07:24:14.074 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:24:14 compute-0 nova_compute[187185]: 2025-11-29 07:24:14.132 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:24:14 compute-0 nova_compute[187185]: 2025-11-29 07:24:14.132 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.564s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:24:15 compute-0 nova_compute[187185]: 2025-11-29 07:24:15.132 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:24:15 compute-0 nova_compute[187185]: 2025-11-29 07:24:15.133 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:24:16 compute-0 nova_compute[187185]: 2025-11-29 07:24:16.095 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:16 compute-0 nova_compute[187185]: 2025-11-29 07:24:16.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:24:16 compute-0 nova_compute[187185]: 2025-11-29 07:24:16.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:24:17 compute-0 podman[233621]: 2025-11-29 07:24:17.785389878 +0000 UTC m=+0.044676683 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 07:24:17 compute-0 podman[233620]: 2025-11-29 07:24:17.79289474 +0000 UTC m=+0.057739022 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, config_id=edpm, build-date=2025-08-20T13:12:41, version=9.6, distribution-scope=public)
Nov 29 07:24:17 compute-0 podman[233619]: 2025-11-29 07:24:17.799242759 +0000 UTC m=+0.059958804 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 07:24:18 compute-0 nova_compute[187185]: 2025-11-29 07:24:18.349 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:20 compute-0 nova_compute[187185]: 2025-11-29 07:24:20.311 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:24:21 compute-0 nova_compute[187185]: 2025-11-29 07:24:21.127 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:23 compute-0 nova_compute[187185]: 2025-11-29 07:24:23.351 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:23 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:23.943 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=41, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=40) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:24:23 compute-0 nova_compute[187185]: 2025-11-29 07:24:23.944 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:23 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:23.946 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:24:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:25.514 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:24:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:25.514 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:24:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:25.515 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:24:26 compute-0 nova_compute[187185]: 2025-11-29 07:24:26.129 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:26 compute-0 nova_compute[187185]: 2025-11-29 07:24:26.166 187189 DEBUG nova.compute.manager [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Nov 29 07:24:26 compute-0 nova_compute[187185]: 2025-11-29 07:24:26.260 187189 DEBUG oslo_concurrency.lockutils [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:24:26 compute-0 nova_compute[187185]: 2025-11-29 07:24:26.261 187189 DEBUG oslo_concurrency.lockutils [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:24:26 compute-0 nova_compute[187185]: 2025-11-29 07:24:26.283 187189 DEBUG nova.objects.instance [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lazy-loading 'pci_requests' on Instance uuid 2702fe48-44d0-408d-8d10-fd635e3779c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:24:26 compute-0 nova_compute[187185]: 2025-11-29 07:24:26.298 187189 DEBUG nova.virt.hardware [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 07:24:26 compute-0 nova_compute[187185]: 2025-11-29 07:24:26.299 187189 INFO nova.compute.claims [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Claim successful on node compute-0.ctlplane.example.com
Nov 29 07:24:26 compute-0 nova_compute[187185]: 2025-11-29 07:24:26.299 187189 DEBUG nova.objects.instance [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lazy-loading 'resources' on Instance uuid 2702fe48-44d0-408d-8d10-fd635e3779c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:24:26 compute-0 nova_compute[187185]: 2025-11-29 07:24:26.312 187189 DEBUG nova.objects.instance [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2702fe48-44d0-408d-8d10-fd635e3779c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:24:26 compute-0 nova_compute[187185]: 2025-11-29 07:24:26.369 187189 INFO nova.compute.resource_tracker [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Updating resource usage from migration e6c22fe2-15f6-43e8-b46c-ad0badaec107
Nov 29 07:24:26 compute-0 nova_compute[187185]: 2025-11-29 07:24:26.370 187189 DEBUG nova.compute.resource_tracker [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Starting to track incoming migration e6c22fe2-15f6-43e8-b46c-ad0badaec107 with flavor e29df891-dca5-4a1c-9258-dc512a46956f _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Nov 29 07:24:26 compute-0 nova_compute[187185]: 2025-11-29 07:24:26.438 187189 DEBUG nova.compute.provider_tree [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:24:26 compute-0 nova_compute[187185]: 2025-11-29 07:24:26.463 187189 DEBUG nova.scheduler.client.report [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:24:26 compute-0 nova_compute[187185]: 2025-11-29 07:24:26.487 187189 DEBUG oslo_concurrency.lockutils [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.227s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:24:26 compute-0 nova_compute[187185]: 2025-11-29 07:24:26.488 187189 INFO nova.compute.manager [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Migrating
Nov 29 07:24:27 compute-0 podman[233684]: 2025-11-29 07:24:27.863929204 +0000 UTC m=+0.126623863 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller)
Nov 29 07:24:28 compute-0 nova_compute[187185]: 2025-11-29 07:24:28.353 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:30 compute-0 sshd-session[233711]: Accepted publickey for nova from 192.168.122.102 port 37694 ssh2: ECDSA SHA256:TRxHqyAgYthLa8Amy2/P9EvhpVYcaH8++okwzvcHmaY
Nov 29 07:24:30 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Nov 29 07:24:30 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 29 07:24:30 compute-0 systemd-logind[788]: New session 36 of user nova.
Nov 29 07:24:30 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 29 07:24:30 compute-0 systemd[1]: Starting User Manager for UID 42436...
Nov 29 07:24:30 compute-0 systemd[233715]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 29 07:24:30 compute-0 systemd[233715]: Queued start job for default target Main User Target.
Nov 29 07:24:30 compute-0 systemd[233715]: Created slice User Application Slice.
Nov 29 07:24:30 compute-0 systemd[233715]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 29 07:24:30 compute-0 systemd[233715]: Started Daily Cleanup of User's Temporary Directories.
Nov 29 07:24:30 compute-0 systemd[233715]: Reached target Paths.
Nov 29 07:24:30 compute-0 systemd[233715]: Reached target Timers.
Nov 29 07:24:30 compute-0 systemd[233715]: Starting D-Bus User Message Bus Socket...
Nov 29 07:24:30 compute-0 systemd[233715]: Starting Create User's Volatile Files and Directories...
Nov 29 07:24:30 compute-0 systemd[233715]: Listening on D-Bus User Message Bus Socket.
Nov 29 07:24:30 compute-0 systemd[233715]: Reached target Sockets.
Nov 29 07:24:30 compute-0 systemd[233715]: Finished Create User's Volatile Files and Directories.
Nov 29 07:24:30 compute-0 systemd[233715]: Reached target Basic System.
Nov 29 07:24:30 compute-0 systemd[233715]: Reached target Main User Target.
Nov 29 07:24:30 compute-0 systemd[233715]: Startup finished in 134ms.
Nov 29 07:24:30 compute-0 systemd[1]: Started User Manager for UID 42436.
Nov 29 07:24:30 compute-0 systemd[1]: Started Session 36 of User nova.
Nov 29 07:24:30 compute-0 sshd-session[233711]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 29 07:24:30 compute-0 sshd-session[233730]: Received disconnect from 192.168.122.102 port 37694:11: disconnected by user
Nov 29 07:24:30 compute-0 sshd-session[233730]: Disconnected from user nova 192.168.122.102 port 37694
Nov 29 07:24:30 compute-0 sshd-session[233711]: pam_unix(sshd:session): session closed for user nova
Nov 29 07:24:30 compute-0 systemd[1]: session-36.scope: Deactivated successfully.
Nov 29 07:24:30 compute-0 systemd-logind[788]: Session 36 logged out. Waiting for processes to exit.
Nov 29 07:24:30 compute-0 systemd-logind[788]: Removed session 36.
Nov 29 07:24:30 compute-0 sshd-session[233732]: Accepted publickey for nova from 192.168.122.102 port 37700 ssh2: ECDSA SHA256:TRxHqyAgYthLa8Amy2/P9EvhpVYcaH8++okwzvcHmaY
Nov 29 07:24:30 compute-0 systemd-logind[788]: New session 38 of user nova.
Nov 29 07:24:30 compute-0 systemd[1]: Started Session 38 of User nova.
Nov 29 07:24:30 compute-0 sshd-session[233732]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 29 07:24:31 compute-0 sshd-session[233735]: Received disconnect from 192.168.122.102 port 37700:11: disconnected by user
Nov 29 07:24:31 compute-0 sshd-session[233735]: Disconnected from user nova 192.168.122.102 port 37700
Nov 29 07:24:31 compute-0 sshd-session[233732]: pam_unix(sshd:session): session closed for user nova
Nov 29 07:24:31 compute-0 systemd[1]: session-38.scope: Deactivated successfully.
Nov 29 07:24:31 compute-0 systemd-logind[788]: Session 38 logged out. Waiting for processes to exit.
Nov 29 07:24:31 compute-0 systemd-logind[788]: Removed session 38.
Nov 29 07:24:31 compute-0 nova_compute[187185]: 2025-11-29 07:24:31.133 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:32 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:32.949 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '41'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:24:33 compute-0 nova_compute[187185]: 2025-11-29 07:24:33.358 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:33 compute-0 nova_compute[187185]: 2025-11-29 07:24:33.661 187189 DEBUG nova.compute.manager [req-e6da2939-7cb3-4532-be1e-9c7e53a974f9 req-d161dea3-0984-4d7a-9eaf-537d3f74967c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Received event network-vif-unplugged-3484baf0-bfbb-4b67-b841-a369f9a2c534 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:24:33 compute-0 nova_compute[187185]: 2025-11-29 07:24:33.662 187189 DEBUG oslo_concurrency.lockutils [req-e6da2939-7cb3-4532-be1e-9c7e53a974f9 req-d161dea3-0984-4d7a-9eaf-537d3f74967c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2702fe48-44d0-408d-8d10-fd635e3779c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:24:33 compute-0 nova_compute[187185]: 2025-11-29 07:24:33.662 187189 DEBUG oslo_concurrency.lockutils [req-e6da2939-7cb3-4532-be1e-9c7e53a974f9 req-d161dea3-0984-4d7a-9eaf-537d3f74967c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2702fe48-44d0-408d-8d10-fd635e3779c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:24:33 compute-0 nova_compute[187185]: 2025-11-29 07:24:33.662 187189 DEBUG oslo_concurrency.lockutils [req-e6da2939-7cb3-4532-be1e-9c7e53a974f9 req-d161dea3-0984-4d7a-9eaf-537d3f74967c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2702fe48-44d0-408d-8d10-fd635e3779c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:24:33 compute-0 nova_compute[187185]: 2025-11-29 07:24:33.662 187189 DEBUG nova.compute.manager [req-e6da2939-7cb3-4532-be1e-9c7e53a974f9 req-d161dea3-0984-4d7a-9eaf-537d3f74967c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] No waiting events found dispatching network-vif-unplugged-3484baf0-bfbb-4b67-b841-a369f9a2c534 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:24:33 compute-0 nova_compute[187185]: 2025-11-29 07:24:33.662 187189 WARNING nova.compute.manager [req-e6da2939-7cb3-4532-be1e-9c7e53a974f9 req-d161dea3-0984-4d7a-9eaf-537d3f74967c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Received unexpected event network-vif-unplugged-3484baf0-bfbb-4b67-b841-a369f9a2c534 for instance with vm_state active and task_state resize_migrating.
Nov 29 07:24:34 compute-0 sshd-session[233737]: Accepted publickey for nova from 192.168.122.102 port 37702 ssh2: ECDSA SHA256:TRxHqyAgYthLa8Amy2/P9EvhpVYcaH8++okwzvcHmaY
Nov 29 07:24:34 compute-0 systemd-logind[788]: New session 39 of user nova.
Nov 29 07:24:34 compute-0 systemd[1]: Started Session 39 of User nova.
Nov 29 07:24:34 compute-0 sshd-session[233737]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 29 07:24:34 compute-0 sshd-session[233740]: Received disconnect from 192.168.122.102 port 37702:11: disconnected by user
Nov 29 07:24:34 compute-0 sshd-session[233740]: Disconnected from user nova 192.168.122.102 port 37702
Nov 29 07:24:34 compute-0 sshd-session[233737]: pam_unix(sshd:session): session closed for user nova
Nov 29 07:24:34 compute-0 systemd[1]: session-39.scope: Deactivated successfully.
Nov 29 07:24:34 compute-0 systemd-logind[788]: Session 39 logged out. Waiting for processes to exit.
Nov 29 07:24:34 compute-0 systemd-logind[788]: Removed session 39.
Nov 29 07:24:34 compute-0 sshd-session[233742]: Accepted publickey for nova from 192.168.122.102 port 37718 ssh2: ECDSA SHA256:TRxHqyAgYthLa8Amy2/P9EvhpVYcaH8++okwzvcHmaY
Nov 29 07:24:34 compute-0 systemd-logind[788]: New session 40 of user nova.
Nov 29 07:24:34 compute-0 systemd[1]: Started Session 40 of User nova.
Nov 29 07:24:34 compute-0 sshd-session[233742]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 29 07:24:35 compute-0 podman[233744]: 2025-11-29 07:24:35.016261621 +0000 UTC m=+0.062596277 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 07:24:35 compute-0 sshd-session[233751]: Received disconnect from 192.168.122.102 port 37718:11: disconnected by user
Nov 29 07:24:35 compute-0 sshd-session[233751]: Disconnected from user nova 192.168.122.102 port 37718
Nov 29 07:24:35 compute-0 sshd-session[233742]: pam_unix(sshd:session): session closed for user nova
Nov 29 07:24:35 compute-0 systemd[1]: session-40.scope: Deactivated successfully.
Nov 29 07:24:35 compute-0 systemd-logind[788]: Session 40 logged out. Waiting for processes to exit.
Nov 29 07:24:35 compute-0 systemd-logind[788]: Removed session 40.
Nov 29 07:24:35 compute-0 sshd-session[233771]: Accepted publickey for nova from 192.168.122.102 port 37720 ssh2: ECDSA SHA256:TRxHqyAgYthLa8Amy2/P9EvhpVYcaH8++okwzvcHmaY
Nov 29 07:24:35 compute-0 systemd-logind[788]: New session 41 of user nova.
Nov 29 07:24:35 compute-0 systemd[1]: Started Session 41 of User nova.
Nov 29 07:24:35 compute-0 sshd-session[233771]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 29 07:24:35 compute-0 sshd-session[233774]: Received disconnect from 192.168.122.102 port 37720:11: disconnected by user
Nov 29 07:24:35 compute-0 sshd-session[233774]: Disconnected from user nova 192.168.122.102 port 37720
Nov 29 07:24:35 compute-0 sshd-session[233771]: pam_unix(sshd:session): session closed for user nova
Nov 29 07:24:35 compute-0 systemd[1]: session-41.scope: Deactivated successfully.
Nov 29 07:24:35 compute-0 systemd-logind[788]: Session 41 logged out. Waiting for processes to exit.
Nov 29 07:24:35 compute-0 systemd-logind[788]: Removed session 41.
Nov 29 07:24:35 compute-0 nova_compute[187185]: 2025-11-29 07:24:35.571 187189 DEBUG nova.compute.manager [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Nov 29 07:24:35 compute-0 nova_compute[187185]: 2025-11-29 07:24:35.739 187189 DEBUG oslo_concurrency.lockutils [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:24:35 compute-0 nova_compute[187185]: 2025-11-29 07:24:35.741 187189 DEBUG oslo_concurrency.lockutils [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:24:36 compute-0 nova_compute[187185]: 2025-11-29 07:24:36.139 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:36 compute-0 nova_compute[187185]: 2025-11-29 07:24:36.227 187189 DEBUG nova.objects.instance [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'pci_requests' on Instance uuid 7c10cb24-586c-4507-8169-8258d7136397 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:24:36 compute-0 nova_compute[187185]: 2025-11-29 07:24:36.245 187189 DEBUG nova.virt.hardware [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 07:24:36 compute-0 nova_compute[187185]: 2025-11-29 07:24:36.245 187189 INFO nova.compute.claims [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Claim successful on node compute-0.ctlplane.example.com
Nov 29 07:24:36 compute-0 nova_compute[187185]: 2025-11-29 07:24:36.246 187189 DEBUG nova.objects.instance [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'resources' on Instance uuid 7c10cb24-586c-4507-8169-8258d7136397 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:24:36 compute-0 nova_compute[187185]: 2025-11-29 07:24:36.258 187189 DEBUG nova.objects.instance [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7c10cb24-586c-4507-8169-8258d7136397 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:24:36 compute-0 nova_compute[187185]: 2025-11-29 07:24:36.310 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:24:36 compute-0 nova_compute[187185]: 2025-11-29 07:24:36.314 187189 INFO nova.compute.resource_tracker [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Updating resource usage from migration 1dc05e65-96f1-44d7-bfe8-4b2c41239656
Nov 29 07:24:36 compute-0 nova_compute[187185]: 2025-11-29 07:24:36.314 187189 DEBUG nova.compute.resource_tracker [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Starting to track incoming migration 1dc05e65-96f1-44d7-bfe8-4b2c41239656 with flavor e29df891-dca5-4a1c-9258-dc512a46956f _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Nov 29 07:24:36 compute-0 nova_compute[187185]: 2025-11-29 07:24:36.407 187189 DEBUG nova.compute.provider_tree [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:24:36 compute-0 nova_compute[187185]: 2025-11-29 07:24:36.423 187189 DEBUG nova.scheduler.client.report [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:24:36 compute-0 nova_compute[187185]: 2025-11-29 07:24:36.461 187189 DEBUG oslo_concurrency.lockutils [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.721s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:24:36 compute-0 nova_compute[187185]: 2025-11-29 07:24:36.462 187189 INFO nova.compute.manager [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Migrating
Nov 29 07:24:36 compute-0 nova_compute[187185]: 2025-11-29 07:24:36.552 187189 INFO nova.network.neutron [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Updating port 3484baf0-bfbb-4b67-b841-a369f9a2c534 with attributes {'binding:host_id': 'compute-0.ctlplane.example.com', 'device_owner': 'compute:nova'}
Nov 29 07:24:36 compute-0 nova_compute[187185]: 2025-11-29 07:24:36.722 187189 DEBUG nova.compute.manager [req-23f4ae49-089d-4d87-ad8b-b22b49bf73f7 req-c2a4959c-99e3-46e3-af8f-d3fd48d11bd6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Received event network-vif-plugged-3484baf0-bfbb-4b67-b841-a369f9a2c534 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:24:36 compute-0 nova_compute[187185]: 2025-11-29 07:24:36.723 187189 DEBUG oslo_concurrency.lockutils [req-23f4ae49-089d-4d87-ad8b-b22b49bf73f7 req-c2a4959c-99e3-46e3-af8f-d3fd48d11bd6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2702fe48-44d0-408d-8d10-fd635e3779c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:24:36 compute-0 nova_compute[187185]: 2025-11-29 07:24:36.723 187189 DEBUG oslo_concurrency.lockutils [req-23f4ae49-089d-4d87-ad8b-b22b49bf73f7 req-c2a4959c-99e3-46e3-af8f-d3fd48d11bd6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2702fe48-44d0-408d-8d10-fd635e3779c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:24:36 compute-0 nova_compute[187185]: 2025-11-29 07:24:36.724 187189 DEBUG oslo_concurrency.lockutils [req-23f4ae49-089d-4d87-ad8b-b22b49bf73f7 req-c2a4959c-99e3-46e3-af8f-d3fd48d11bd6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2702fe48-44d0-408d-8d10-fd635e3779c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:24:36 compute-0 nova_compute[187185]: 2025-11-29 07:24:36.724 187189 DEBUG nova.compute.manager [req-23f4ae49-089d-4d87-ad8b-b22b49bf73f7 req-c2a4959c-99e3-46e3-af8f-d3fd48d11bd6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] No waiting events found dispatching network-vif-plugged-3484baf0-bfbb-4b67-b841-a369f9a2c534 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:24:36 compute-0 nova_compute[187185]: 2025-11-29 07:24:36.724 187189 WARNING nova.compute.manager [req-23f4ae49-089d-4d87-ad8b-b22b49bf73f7 req-c2a4959c-99e3-46e3-af8f-d3fd48d11bd6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Received unexpected event network-vif-plugged-3484baf0-bfbb-4b67-b841-a369f9a2c534 for instance with vm_state active and task_state resize_migrated.
Nov 29 07:24:36 compute-0 nova_compute[187185]: 2025-11-29 07:24:36.802 187189 DEBUG oslo_concurrency.lockutils [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "5f11adcd-958a-4269-905d-a017406505f0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:24:36 compute-0 nova_compute[187185]: 2025-11-29 07:24:36.803 187189 DEBUG oslo_concurrency.lockutils [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "5f11adcd-958a-4269-905d-a017406505f0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:24:36 compute-0 nova_compute[187185]: 2025-11-29 07:24:36.820 187189 DEBUG nova.compute.manager [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 07:24:36 compute-0 nova_compute[187185]: 2025-11-29 07:24:36.927 187189 DEBUG oslo_concurrency.lockutils [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:24:36 compute-0 nova_compute[187185]: 2025-11-29 07:24:36.928 187189 DEBUG oslo_concurrency.lockutils [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:24:36 compute-0 nova_compute[187185]: 2025-11-29 07:24:36.933 187189 DEBUG nova.virt.hardware [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 07:24:36 compute-0 nova_compute[187185]: 2025-11-29 07:24:36.934 187189 INFO nova.compute.claims [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Claim successful on node compute-0.ctlplane.example.com
Nov 29 07:24:37 compute-0 nova_compute[187185]: 2025-11-29 07:24:37.076 187189 DEBUG nova.compute.provider_tree [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:24:37 compute-0 nova_compute[187185]: 2025-11-29 07:24:37.090 187189 DEBUG nova.scheduler.client.report [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:24:37 compute-0 nova_compute[187185]: 2025-11-29 07:24:37.109 187189 DEBUG oslo_concurrency.lockutils [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.181s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:24:37 compute-0 nova_compute[187185]: 2025-11-29 07:24:37.110 187189 DEBUG nova.compute.manager [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 07:24:37 compute-0 nova_compute[187185]: 2025-11-29 07:24:37.160 187189 DEBUG nova.compute.manager [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 07:24:37 compute-0 nova_compute[187185]: 2025-11-29 07:24:37.161 187189 DEBUG nova.network.neutron [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 07:24:37 compute-0 nova_compute[187185]: 2025-11-29 07:24:37.177 187189 INFO nova.virt.libvirt.driver [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 07:24:37 compute-0 nova_compute[187185]: 2025-11-29 07:24:37.194 187189 DEBUG nova.compute.manager [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 07:24:37 compute-0 nova_compute[187185]: 2025-11-29 07:24:37.308 187189 DEBUG nova.compute.manager [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 07:24:37 compute-0 nova_compute[187185]: 2025-11-29 07:24:37.309 187189 DEBUG nova.virt.libvirt.driver [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 07:24:37 compute-0 nova_compute[187185]: 2025-11-29 07:24:37.309 187189 INFO nova.virt.libvirt.driver [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Creating image(s)
Nov 29 07:24:37 compute-0 nova_compute[187185]: 2025-11-29 07:24:37.310 187189 DEBUG oslo_concurrency.lockutils [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "/var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:24:37 compute-0 nova_compute[187185]: 2025-11-29 07:24:37.310 187189 DEBUG oslo_concurrency.lockutils [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "/var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:24:37 compute-0 nova_compute[187185]: 2025-11-29 07:24:37.311 187189 DEBUG oslo_concurrency.lockutils [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "/var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:24:37 compute-0 nova_compute[187185]: 2025-11-29 07:24:37.326 187189 DEBUG oslo_concurrency.processutils [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:24:37 compute-0 nova_compute[187185]: 2025-11-29 07:24:37.368 187189 DEBUG nova.policy [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 07:24:37 compute-0 nova_compute[187185]: 2025-11-29 07:24:37.388 187189 DEBUG oslo_concurrency.processutils [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:24:37 compute-0 nova_compute[187185]: 2025-11-29 07:24:37.389 187189 DEBUG oslo_concurrency.lockutils [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:24:37 compute-0 nova_compute[187185]: 2025-11-29 07:24:37.389 187189 DEBUG oslo_concurrency.lockutils [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:24:37 compute-0 nova_compute[187185]: 2025-11-29 07:24:37.400 187189 DEBUG oslo_concurrency.processutils [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:24:37 compute-0 nova_compute[187185]: 2025-11-29 07:24:37.465 187189 DEBUG oslo_concurrency.processutils [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:24:37 compute-0 nova_compute[187185]: 2025-11-29 07:24:37.467 187189 DEBUG oslo_concurrency.processutils [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:24:37 compute-0 nova_compute[187185]: 2025-11-29 07:24:37.521 187189 DEBUG oslo_concurrency.processutils [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/disk 1073741824" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:24:37 compute-0 nova_compute[187185]: 2025-11-29 07:24:37.523 187189 DEBUG oslo_concurrency.lockutils [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:24:37 compute-0 nova_compute[187185]: 2025-11-29 07:24:37.523 187189 DEBUG oslo_concurrency.processutils [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:24:37 compute-0 nova_compute[187185]: 2025-11-29 07:24:37.585 187189 DEBUG oslo_concurrency.processutils [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:24:37 compute-0 nova_compute[187185]: 2025-11-29 07:24:37.586 187189 DEBUG nova.virt.disk.api [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Checking if we can resize image /var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:24:37 compute-0 nova_compute[187185]: 2025-11-29 07:24:37.586 187189 DEBUG oslo_concurrency.processutils [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:24:37 compute-0 nova_compute[187185]: 2025-11-29 07:24:37.609 187189 DEBUG oslo_concurrency.lockutils [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "refresh_cache-2702fe48-44d0-408d-8d10-fd635e3779c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:24:37 compute-0 nova_compute[187185]: 2025-11-29 07:24:37.611 187189 DEBUG oslo_concurrency.lockutils [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquired lock "refresh_cache-2702fe48-44d0-408d-8d10-fd635e3779c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:24:37 compute-0 nova_compute[187185]: 2025-11-29 07:24:37.611 187189 DEBUG nova.network.neutron [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:24:37 compute-0 nova_compute[187185]: 2025-11-29 07:24:37.658 187189 DEBUG oslo_concurrency.processutils [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:24:37 compute-0 nova_compute[187185]: 2025-11-29 07:24:37.659 187189 DEBUG nova.virt.disk.api [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Cannot resize image /var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:24:37 compute-0 nova_compute[187185]: 2025-11-29 07:24:37.659 187189 DEBUG nova.objects.instance [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lazy-loading 'migration_context' on Instance uuid 5f11adcd-958a-4269-905d-a017406505f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:24:37 compute-0 nova_compute[187185]: 2025-11-29 07:24:37.674 187189 DEBUG nova.virt.libvirt.driver [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 07:24:37 compute-0 nova_compute[187185]: 2025-11-29 07:24:37.675 187189 DEBUG nova.virt.libvirt.driver [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Ensure instance console log exists: /var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 07:24:37 compute-0 nova_compute[187185]: 2025-11-29 07:24:37.675 187189 DEBUG oslo_concurrency.lockutils [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:24:37 compute-0 nova_compute[187185]: 2025-11-29 07:24:37.676 187189 DEBUG oslo_concurrency.lockutils [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:24:37 compute-0 nova_compute[187185]: 2025-11-29 07:24:37.676 187189 DEBUG oslo_concurrency.lockutils [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:24:37 compute-0 nova_compute[187185]: 2025-11-29 07:24:37.783 187189 DEBUG nova.compute.manager [req-c89c3a90-90d3-437d-b105-ac43d27e0b44 req-11701fdd-9974-4395-80a4-63e08560be7d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Received event network-changed-3484baf0-bfbb-4b67-b841-a369f9a2c534 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:24:37 compute-0 nova_compute[187185]: 2025-11-29 07:24:37.784 187189 DEBUG nova.compute.manager [req-c89c3a90-90d3-437d-b105-ac43d27e0b44 req-11701fdd-9974-4395-80a4-63e08560be7d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Refreshing instance network info cache due to event network-changed-3484baf0-bfbb-4b67-b841-a369f9a2c534. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:24:37 compute-0 nova_compute[187185]: 2025-11-29 07:24:37.784 187189 DEBUG oslo_concurrency.lockutils [req-c89c3a90-90d3-437d-b105-ac43d27e0b44 req-11701fdd-9974-4395-80a4-63e08560be7d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-2702fe48-44d0-408d-8d10-fd635e3779c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:24:37 compute-0 podman[233792]: 2025-11-29 07:24:37.81450394 +0000 UTC m=+0.075512251 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:24:37 compute-0 podman[233791]: 2025-11-29 07:24:37.826455547 +0000 UTC m=+0.087646243 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Nov 29 07:24:38 compute-0 nova_compute[187185]: 2025-11-29 07:24:38.229 187189 DEBUG nova.network.neutron [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Successfully created port: a1e67d00-8650-44ba-b75d-07f55b8d8810 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 07:24:38 compute-0 nova_compute[187185]: 2025-11-29 07:24:38.361 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:38 compute-0 sshd-session[233832]: Accepted publickey for nova from 192.168.122.101 port 34074 ssh2: ECDSA SHA256:TRxHqyAgYthLa8Amy2/P9EvhpVYcaH8++okwzvcHmaY
Nov 29 07:24:38 compute-0 systemd-logind[788]: New session 42 of user nova.
Nov 29 07:24:38 compute-0 systemd[1]: Started Session 42 of User nova.
Nov 29 07:24:38 compute-0 sshd-session[233832]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 29 07:24:38 compute-0 sshd-session[233835]: Received disconnect from 192.168.122.101 port 34074:11: disconnected by user
Nov 29 07:24:38 compute-0 sshd-session[233835]: Disconnected from user nova 192.168.122.101 port 34074
Nov 29 07:24:38 compute-0 sshd-session[233832]: pam_unix(sshd:session): session closed for user nova
Nov 29 07:24:38 compute-0 systemd[1]: session-42.scope: Deactivated successfully.
Nov 29 07:24:38 compute-0 systemd-logind[788]: Session 42 logged out. Waiting for processes to exit.
Nov 29 07:24:38 compute-0 systemd-logind[788]: Removed session 42.
Nov 29 07:24:38 compute-0 sshd-session[233837]: Accepted publickey for nova from 192.168.122.101 port 34088 ssh2: ECDSA SHA256:TRxHqyAgYthLa8Amy2/P9EvhpVYcaH8++okwzvcHmaY
Nov 29 07:24:38 compute-0 systemd-logind[788]: New session 43 of user nova.
Nov 29 07:24:38 compute-0 systemd[1]: Started Session 43 of User nova.
Nov 29 07:24:38 compute-0 sshd-session[233837]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 29 07:24:38 compute-0 sshd-session[233840]: Received disconnect from 192.168.122.101 port 34088:11: disconnected by user
Nov 29 07:24:38 compute-0 sshd-session[233840]: Disconnected from user nova 192.168.122.101 port 34088
Nov 29 07:24:38 compute-0 sshd-session[233837]: pam_unix(sshd:session): session closed for user nova
Nov 29 07:24:38 compute-0 systemd[1]: session-43.scope: Deactivated successfully.
Nov 29 07:24:38 compute-0 systemd-logind[788]: Session 43 logged out. Waiting for processes to exit.
Nov 29 07:24:38 compute-0 systemd-logind[788]: Removed session 43.
Nov 29 07:24:38 compute-0 nova_compute[187185]: 2025-11-29 07:24:38.992 187189 DEBUG nova.network.neutron [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Updating instance_info_cache with network_info: [{"id": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "address": "fa:16:3e:34:30:8b", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3484baf0-bf", "ovs_interfaceid": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.024 187189 DEBUG oslo_concurrency.lockutils [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Releasing lock "refresh_cache-2702fe48-44d0-408d-8d10-fd635e3779c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.028 187189 DEBUG oslo_concurrency.lockutils [req-c89c3a90-90d3-437d-b105-ac43d27e0b44 req-11701fdd-9974-4395-80a4-63e08560be7d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-2702fe48-44d0-408d-8d10-fd635e3779c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.028 187189 DEBUG nova.network.neutron [req-c89c3a90-90d3-437d-b105-ac43d27e0b44 req-11701fdd-9974-4395-80a4-63e08560be7d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Refreshing network info cache for port 3484baf0-bfbb-4b67-b841-a369f9a2c534 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.081 187189 DEBUG nova.network.neutron [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Successfully updated port: a1e67d00-8650-44ba-b75d-07f55b8d8810 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.107 187189 DEBUG oslo_concurrency.lockutils [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "refresh_cache-5f11adcd-958a-4269-905d-a017406505f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.108 187189 DEBUG oslo_concurrency.lockutils [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquired lock "refresh_cache-5f11adcd-958a-4269-905d-a017406505f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.108 187189 DEBUG nova.network.neutron [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.184 187189 DEBUG nova.virt.libvirt.driver [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.186 187189 DEBUG nova.virt.libvirt.driver [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.187 187189 INFO nova.virt.libvirt.driver [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Creating image(s)
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.188 187189 DEBUG nova.objects.instance [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 2702fe48-44d0-408d-8d10-fd635e3779c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.197 187189 DEBUG nova.compute.manager [req-d3f39337-d391-42cd-86a7-da2a5100e3be req-d5a78f53-9132-44f4-9757-df439bdbb503 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Received event network-changed-a1e67d00-8650-44ba-b75d-07f55b8d8810 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.197 187189 DEBUG nova.compute.manager [req-d3f39337-d391-42cd-86a7-da2a5100e3be req-d5a78f53-9132-44f4-9757-df439bdbb503 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Refreshing instance network info cache due to event network-changed-a1e67d00-8650-44ba-b75d-07f55b8d8810. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.197 187189 DEBUG oslo_concurrency.lockutils [req-d3f39337-d391-42cd-86a7-da2a5100e3be req-d5a78f53-9132-44f4-9757-df439bdbb503 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-5f11adcd-958a-4269-905d-a017406505f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.203 187189 DEBUG oslo_concurrency.processutils [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.277 187189 DEBUG oslo_concurrency.processutils [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.278 187189 DEBUG nova.virt.disk.api [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Checking if we can resize image /var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.278 187189 DEBUG oslo_concurrency.processutils [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.308 187189 DEBUG nova.network.neutron [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.339 187189 DEBUG oslo_concurrency.processutils [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.339 187189 DEBUG nova.virt.disk.api [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Cannot resize image /var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.361 187189 DEBUG nova.virt.libvirt.driver [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.361 187189 DEBUG nova.virt.libvirt.driver [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Ensure instance console log exists: /var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.362 187189 DEBUG oslo_concurrency.lockutils [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.362 187189 DEBUG oslo_concurrency.lockutils [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.362 187189 DEBUG oslo_concurrency.lockutils [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.364 187189 DEBUG nova.virt.libvirt.driver [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Start _get_guest_xml network_info=[{"id": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "address": "fa:16:3e:34:30:8b", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "vif_mac": "fa:16:3e:34:30:8b"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3484baf0-bf", "ovs_interfaceid": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.371 187189 WARNING nova.virt.libvirt.driver [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.376 187189 DEBUG nova.virt.libvirt.host [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.377 187189 DEBUG nova.virt.libvirt.host [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.382 187189 DEBUG nova.virt.libvirt.host [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.382 187189 DEBUG nova.virt.libvirt.host [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.383 187189 DEBUG nova.virt.libvirt.driver [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.384 187189 DEBUG nova.virt.hardware [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e29df891-dca5-4a1c-9258-dc512a46956f',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.384 187189 DEBUG nova.virt.hardware [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.384 187189 DEBUG nova.virt.hardware [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.384 187189 DEBUG nova.virt.hardware [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.384 187189 DEBUG nova.virt.hardware [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.385 187189 DEBUG nova.virt.hardware [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.385 187189 DEBUG nova.virt.hardware [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.385 187189 DEBUG nova.virt.hardware [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.385 187189 DEBUG nova.virt.hardware [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.385 187189 DEBUG nova.virt.hardware [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.386 187189 DEBUG nova.virt.hardware [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.386 187189 DEBUG nova.objects.instance [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 2702fe48-44d0-408d-8d10-fd635e3779c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.405 187189 DEBUG oslo_concurrency.processutils [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.467 187189 DEBUG oslo_concurrency.processutils [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/disk.config --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.468 187189 DEBUG oslo_concurrency.lockutils [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "/var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.468 187189 DEBUG oslo_concurrency.lockutils [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "/var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.470 187189 DEBUG oslo_concurrency.lockutils [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "/var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.471 187189 DEBUG nova.virt.libvirt.vif [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:24:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-864835491',display_name='tempest-ServerDiskConfigTestJSON-server-864835491',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-864835491',id=122,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:24:17Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6d55e57bfd184513a304a61cc1cb3730',ramdisk_id='',reservation_id='r-ybkvc9v9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-1282760174',owner_user_name='tempest-ServerDiskConfigTestJSON-1282760174-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:24:36Z,user_data=None,user_id='000fb7b950024e16902cd58f2ea16ac9',uuid=2702fe48-44d0-408d-8d10-fd635e3779c9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "address": "fa:16:3e:34:30:8b", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "vif_mac": "fa:16:3e:34:30:8b"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3484baf0-bf", "ovs_interfaceid": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.471 187189 DEBUG nova.network.os_vif_util [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converting VIF {"id": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "address": "fa:16:3e:34:30:8b", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "vif_mac": "fa:16:3e:34:30:8b"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3484baf0-bf", "ovs_interfaceid": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.472 187189 DEBUG nova.network.os_vif_util [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:34:30:8b,bridge_name='br-int',has_traffic_filtering=True,id=3484baf0-bfbb-4b67-b841-a369f9a2c534,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3484baf0-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.475 187189 DEBUG nova.virt.libvirt.driver [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:24:39 compute-0 nova_compute[187185]:   <uuid>2702fe48-44d0-408d-8d10-fd635e3779c9</uuid>
Nov 29 07:24:39 compute-0 nova_compute[187185]:   <name>instance-0000007a</name>
Nov 29 07:24:39 compute-0 nova_compute[187185]:   <memory>196608</memory>
Nov 29 07:24:39 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:24:39 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:24:39 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:       <nova:name>tempest-ServerDiskConfigTestJSON-server-864835491</nova:name>
Nov 29 07:24:39 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:24:39</nova:creationTime>
Nov 29 07:24:39 compute-0 nova_compute[187185]:       <nova:flavor name="m1.micro">
Nov 29 07:24:39 compute-0 nova_compute[187185]:         <nova:memory>192</nova:memory>
Nov 29 07:24:39 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:24:39 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:24:39 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:24:39 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:24:39 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:24:39 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:24:39 compute-0 nova_compute[187185]:         <nova:user uuid="000fb7b950024e16902cd58f2ea16ac9">tempest-ServerDiskConfigTestJSON-1282760174-project-member</nova:user>
Nov 29 07:24:39 compute-0 nova_compute[187185]:         <nova:project uuid="6d55e57bfd184513a304a61cc1cb3730">tempest-ServerDiskConfigTestJSON-1282760174</nova:project>
Nov 29 07:24:39 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:24:39 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 07:24:39 compute-0 nova_compute[187185]:         <nova:port uuid="3484baf0-bfbb-4b67-b841-a369f9a2c534">
Nov 29 07:24:39 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:24:39 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:24:39 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:24:39 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <system>
Nov 29 07:24:39 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:24:39 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:24:39 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:24:39 compute-0 nova_compute[187185]:       <entry name="serial">2702fe48-44d0-408d-8d10-fd635e3779c9</entry>
Nov 29 07:24:39 compute-0 nova_compute[187185]:       <entry name="uuid">2702fe48-44d0-408d-8d10-fd635e3779c9</entry>
Nov 29 07:24:39 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     </system>
Nov 29 07:24:39 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:24:39 compute-0 nova_compute[187185]:   <os>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:   </os>
Nov 29 07:24:39 compute-0 nova_compute[187185]:   <features>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:   </features>
Nov 29 07:24:39 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:24:39 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:24:39 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:24:39 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/disk"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:24:39 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/disk.config"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:24:39 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:34:30:8b"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:       <target dev="tap3484baf0-bf"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:24:39 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/console.log" append="off"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <video>
Nov 29 07:24:39 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     </video>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:24:39 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:24:39 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:24:39 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:24:39 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:24:39 compute-0 nova_compute[187185]: </domain>
Nov 29 07:24:39 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.477 187189 DEBUG nova.virt.libvirt.vif [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:24:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-864835491',display_name='tempest-ServerDiskConfigTestJSON-server-864835491',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-864835491',id=122,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:24:17Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6d55e57bfd184513a304a61cc1cb3730',ramdisk_id='',reservation_id='r-ybkvc9v9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-1282760174',owner_user_name='tempest-ServerDiskConfigTestJSON-1282760174-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:24:36Z,user_data=None,user_id='000fb7b950024e16902cd58f2ea16ac9',uuid=2702fe48-44d0-408d-8d10-fd635e3779c9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "address": "fa:16:3e:34:30:8b", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "vif_mac": "fa:16:3e:34:30:8b"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3484baf0-bf", "ovs_interfaceid": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.477 187189 DEBUG nova.network.os_vif_util [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converting VIF {"id": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "address": "fa:16:3e:34:30:8b", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "vif_mac": "fa:16:3e:34:30:8b"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3484baf0-bf", "ovs_interfaceid": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.478 187189 DEBUG nova.network.os_vif_util [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:34:30:8b,bridge_name='br-int',has_traffic_filtering=True,id=3484baf0-bfbb-4b67-b841-a369f9a2c534,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3484baf0-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.479 187189 DEBUG os_vif [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:34:30:8b,bridge_name='br-int',has_traffic_filtering=True,id=3484baf0-bfbb-4b67-b841-a369f9a2c534,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3484baf0-bf') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.479 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.480 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.480 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.484 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.484 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3484baf0-bf, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.485 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3484baf0-bf, col_values=(('external_ids', {'iface-id': '3484baf0-bfbb-4b67-b841-a369f9a2c534', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:34:30:8b', 'vm-uuid': '2702fe48-44d0-408d-8d10-fd635e3779c9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.487 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:39 compute-0 NetworkManager[55227]: <info>  [1764401079.4884] manager: (tap3484baf0-bf): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/184)
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.490 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.496 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.498 187189 INFO os_vif [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:34:30:8b,bridge_name='br-int',has_traffic_filtering=True,id=3484baf0-bfbb-4b67-b841-a369f9a2c534,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3484baf0-bf')
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.911 187189 DEBUG nova.virt.libvirt.driver [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.911 187189 DEBUG nova.virt.libvirt.driver [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.912 187189 DEBUG nova.virt.libvirt.driver [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] No VIF found with MAC fa:16:3e:34:30:8b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.912 187189 INFO nova.virt.libvirt.driver [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Using config drive
Nov 29 07:24:39 compute-0 NetworkManager[55227]: <info>  [1764401079.9891] manager: (tap3484baf0-bf): new Tun device (/org/freedesktop/NetworkManager/Devices/185)
Nov 29 07:24:39 compute-0 kernel: tap3484baf0-bf: entered promiscuous mode
Nov 29 07:24:39 compute-0 ovn_controller[95281]: 2025-11-29T07:24:39Z|00348|binding|INFO|Claiming lport 3484baf0-bfbb-4b67-b841-a369f9a2c534 for this chassis.
Nov 29 07:24:39 compute-0 ovn_controller[95281]: 2025-11-29T07:24:39Z|00349|binding|INFO|3484baf0-bfbb-4b67-b841-a369f9a2c534: Claiming fa:16:3e:34:30:8b 10.100.0.12
Nov 29 07:24:39 compute-0 nova_compute[187185]: 2025-11-29 07:24:39.992 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:40.006 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:34:30:8b 10.100.0.12'], port_security=['fa:16:3e:34:30:8b 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '2702fe48-44d0-408d-8d10-fd635e3779c9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6d55e57bfd184513a304a61cc1cb3730', 'neutron:revision_number': '6', 'neutron:security_group_ids': '44bb1ac9-49ae-4c0a-8013-0db5efadb536', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=804567bc-6857-4eb6-aa00-b449f09c69a2, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=3484baf0-bfbb-4b67-b841-a369f9a2c534) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:40.007 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 3484baf0-bfbb-4b67-b841-a369f9a2c534 in datapath 9b34af6b-edf9-4b27-b1dc-2b18c2eec958 bound to our chassis
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:40.008 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9b34af6b-edf9-4b27-b1dc-2b18c2eec958
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:40.021 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[50184a07-0f58-40c1-ae10-cbeddc5bdb6f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:40.022 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9b34af6b-e1 in ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:40.025 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9b34af6b-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:40.025 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[57b5db3d-4710-4cda-9a14-d3f3a8b64a0c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:40.026 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[f5e4fdb2-bf7e-4942-a6c7-9817a97281ba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:40 compute-0 systemd-udevd[233869]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:24:40 compute-0 systemd-machined[153486]: New machine qemu-46-instance-0000007a.
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:40.038 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[84515071-46b9-4fbb-b689-aba9e70a292e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:40 compute-0 NetworkManager[55227]: <info>  [1764401080.0504] device (tap3484baf0-bf): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:24:40 compute-0 NetworkManager[55227]: <info>  [1764401080.0515] device (tap3484baf0-bf): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.052 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.055 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:40 compute-0 ovn_controller[95281]: 2025-11-29T07:24:40Z|00350|binding|INFO|Setting lport 3484baf0-bfbb-4b67-b841-a369f9a2c534 ovn-installed in OVS
Nov 29 07:24:40 compute-0 ovn_controller[95281]: 2025-11-29T07:24:40Z|00351|binding|INFO|Setting lport 3484baf0-bfbb-4b67-b841-a369f9a2c534 up in Southbound
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.058 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:40 compute-0 systemd[1]: Started Virtual Machine qemu-46-instance-0000007a.
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:40.064 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[f5c04123-50d2-4428-bc58-57e3113a0d1e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.066 187189 DEBUG nova.network.neutron [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Updating instance_info_cache with network_info: [{"id": "a1e67d00-8650-44ba-b75d-07f55b8d8810", "address": "fa:16:3e:39:9f:d3", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1e67d00-86", "ovs_interfaceid": "a1e67d00-8650-44ba-b75d-07f55b8d8810", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.093 187189 DEBUG oslo_concurrency.lockutils [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Releasing lock "refresh_cache-5f11adcd-958a-4269-905d-a017406505f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.094 187189 DEBUG nova.compute.manager [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Instance network_info: |[{"id": "a1e67d00-8650-44ba-b75d-07f55b8d8810", "address": "fa:16:3e:39:9f:d3", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1e67d00-86", "ovs_interfaceid": "a1e67d00-8650-44ba-b75d-07f55b8d8810", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:40.093 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[3d0d6610-fd10-4812-b148-3cd5379ad995]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.094 187189 DEBUG oslo_concurrency.lockutils [req-d3f39337-d391-42cd-86a7-da2a5100e3be req-d5a78f53-9132-44f4-9757-df439bdbb503 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-5f11adcd-958a-4269-905d-a017406505f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.094 187189 DEBUG nova.network.neutron [req-d3f39337-d391-42cd-86a7-da2a5100e3be req-d5a78f53-9132-44f4-9757-df439bdbb503 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Refreshing network info cache for port a1e67d00-8650-44ba-b75d-07f55b8d8810 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.097 187189 DEBUG nova.virt.libvirt.driver [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Start _get_guest_xml network_info=[{"id": "a1e67d00-8650-44ba-b75d-07f55b8d8810", "address": "fa:16:3e:39:9f:d3", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1e67d00-86", "ovs_interfaceid": "a1e67d00-8650-44ba-b75d-07f55b8d8810", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:24:40 compute-0 systemd-udevd[233872]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:40.098 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[7c59892d-6e72-492a-b040-002fb70ec6da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:40 compute-0 NetworkManager[55227]: <info>  [1764401080.0999] manager: (tap9b34af6b-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/186)
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.107 187189 WARNING nova.virt.libvirt.driver [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.119 187189 DEBUG nova.virt.libvirt.host [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.119 187189 DEBUG nova.virt.libvirt.host [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.130 187189 DEBUG nova.virt.libvirt.host [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.131 187189 DEBUG nova.virt.libvirt.host [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.132 187189 DEBUG nova.virt.libvirt.driver [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.132 187189 DEBUG nova.virt.hardware [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:40.132 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[9e81dbde-b33b-41e5-a136-4411ee9fc6ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.133 187189 DEBUG nova.virt.hardware [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.133 187189 DEBUG nova.virt.hardware [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.133 187189 DEBUG nova.virt.hardware [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.133 187189 DEBUG nova.virt.hardware [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.133 187189 DEBUG nova.virt.hardware [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.134 187189 DEBUG nova.virt.hardware [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.134 187189 DEBUG nova.virt.hardware [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.134 187189 DEBUG nova.virt.hardware [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.134 187189 DEBUG nova.virt.hardware [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.135 187189 DEBUG nova.virt.hardware [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:40.136 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[27c77363-20dc-4271-a162-2df4736051b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.138 187189 DEBUG nova.virt.libvirt.vif [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:24:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1816602290',display_name='tempest-ServerStableDeviceRescueTest-server-1816602290',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-1816602290',id=123,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ac3bb322fa744e099b38e08abe12d0e2',ramdisk_id='',reservation_id='r-6vcqx0xs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-2012111838',owner_user_name='tempest-ServerStableDeviceRescueTest-2012111838-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:24:37Z,user_data=None,user_id='5be41a8530314f83bbecbb74b9276f2d',uuid=5f11adcd-958a-4269-905d-a017406505f0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a1e67d00-8650-44ba-b75d-07f55b8d8810", "address": "fa:16:3e:39:9f:d3", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1e67d00-86", "ovs_interfaceid": "a1e67d00-8650-44ba-b75d-07f55b8d8810", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.139 187189 DEBUG nova.network.os_vif_util [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Converting VIF {"id": "a1e67d00-8650-44ba-b75d-07f55b8d8810", "address": "fa:16:3e:39:9f:d3", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1e67d00-86", "ovs_interfaceid": "a1e67d00-8650-44ba-b75d-07f55b8d8810", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.140 187189 DEBUG nova.network.os_vif_util [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:39:9f:d3,bridge_name='br-int',has_traffic_filtering=True,id=a1e67d00-8650-44ba-b75d-07f55b8d8810,network=Network(240f16d8-602b-4aa1-8edb-e3a8d3674e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa1e67d00-86') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.141 187189 DEBUG nova.objects.instance [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5f11adcd-958a-4269-905d-a017406505f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:24:40 compute-0 NetworkManager[55227]: <info>  [1764401080.1628] device (tap9b34af6b-e0): carrier: link connected
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.162 187189 DEBUG nova.virt.libvirt.driver [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:24:40 compute-0 nova_compute[187185]:   <uuid>5f11adcd-958a-4269-905d-a017406505f0</uuid>
Nov 29 07:24:40 compute-0 nova_compute[187185]:   <name>instance-0000007b</name>
Nov 29 07:24:40 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 07:24:40 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:24:40 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:24:40 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:       <nova:name>tempest-ServerStableDeviceRescueTest-server-1816602290</nova:name>
Nov 29 07:24:40 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:24:40</nova:creationTime>
Nov 29 07:24:40 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 07:24:40 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 07:24:40 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:24:40 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:24:40 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:24:40 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:24:40 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:24:40 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:24:40 compute-0 nova_compute[187185]:         <nova:user uuid="5be41a8530314f83bbecbb74b9276f2d">tempest-ServerStableDeviceRescueTest-2012111838-project-member</nova:user>
Nov 29 07:24:40 compute-0 nova_compute[187185]:         <nova:project uuid="ac3bb322fa744e099b38e08abe12d0e2">tempest-ServerStableDeviceRescueTest-2012111838</nova:project>
Nov 29 07:24:40 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:24:40 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 07:24:40 compute-0 nova_compute[187185]:         <nova:port uuid="a1e67d00-8650-44ba-b75d-07f55b8d8810">
Nov 29 07:24:40 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:24:40 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:24:40 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:24:40 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <system>
Nov 29 07:24:40 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:24:40 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:24:40 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:24:40 compute-0 nova_compute[187185]:       <entry name="serial">5f11adcd-958a-4269-905d-a017406505f0</entry>
Nov 29 07:24:40 compute-0 nova_compute[187185]:       <entry name="uuid">5f11adcd-958a-4269-905d-a017406505f0</entry>
Nov 29 07:24:40 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     </system>
Nov 29 07:24:40 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:24:40 compute-0 nova_compute[187185]:   <os>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:   </os>
Nov 29 07:24:40 compute-0 nova_compute[187185]:   <features>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:   </features>
Nov 29 07:24:40 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:24:40 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:24:40 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:24:40 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/disk"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:24:40 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/disk.config"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:24:40 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:39:9f:d3"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:       <target dev="tapa1e67d00-86"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:24:40 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/console.log" append="off"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <video>
Nov 29 07:24:40 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     </video>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:24:40 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:24:40 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:24:40 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:24:40 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:24:40 compute-0 nova_compute[187185]: </domain>
Nov 29 07:24:40 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.162 187189 DEBUG nova.compute.manager [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Preparing to wait for external event network-vif-plugged-a1e67d00-8650-44ba-b75d-07f55b8d8810 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.163 187189 DEBUG oslo_concurrency.lockutils [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "5f11adcd-958a-4269-905d-a017406505f0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.163 187189 DEBUG oslo_concurrency.lockutils [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "5f11adcd-958a-4269-905d-a017406505f0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.163 187189 DEBUG oslo_concurrency.lockutils [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "5f11adcd-958a-4269-905d-a017406505f0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.164 187189 DEBUG nova.virt.libvirt.vif [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:24:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1816602290',display_name='tempest-ServerStableDeviceRescueTest-server-1816602290',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-1816602290',id=123,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ac3bb322fa744e099b38e08abe12d0e2',ramdisk_id='',reservation_id='r-6vcqx0xs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-2012111838',owner_user_name='tempest-ServerStableDeviceRescueTest-2012111838-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:24:37Z,user_data=None,user_id='5be41a8530314f83bbecbb74b9276f2d',uuid=5f11adcd-958a-4269-905d-a017406505f0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a1e67d00-8650-44ba-b75d-07f55b8d8810", "address": "fa:16:3e:39:9f:d3", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1e67d00-86", "ovs_interfaceid": "a1e67d00-8650-44ba-b75d-07f55b8d8810", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.164 187189 DEBUG nova.network.os_vif_util [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Converting VIF {"id": "a1e67d00-8650-44ba-b75d-07f55b8d8810", "address": "fa:16:3e:39:9f:d3", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1e67d00-86", "ovs_interfaceid": "a1e67d00-8650-44ba-b75d-07f55b8d8810", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.164 187189 DEBUG nova.network.os_vif_util [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:39:9f:d3,bridge_name='br-int',has_traffic_filtering=True,id=a1e67d00-8650-44ba-b75d-07f55b8d8810,network=Network(240f16d8-602b-4aa1-8edb-e3a8d3674e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa1e67d00-86') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.165 187189 DEBUG os_vif [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:39:9f:d3,bridge_name='br-int',has_traffic_filtering=True,id=a1e67d00-8650-44ba-b75d-07f55b8d8810,network=Network(240f16d8-602b-4aa1-8edb-e3a8d3674e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa1e67d00-86') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.165 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.165 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.165 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.167 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.168 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa1e67d00-86, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:40.168 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[e5978ef9-ec3d-47a1-a7db-d8d8e06b20ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.168 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa1e67d00-86, col_values=(('external_ids', {'iface-id': 'a1e67d00-8650-44ba-b75d-07f55b8d8810', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:39:9f:d3', 'vm-uuid': '5f11adcd-958a-4269-905d-a017406505f0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.169 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:40 compute-0 NetworkManager[55227]: <info>  [1764401080.1707] manager: (tapa1e67d00-86): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/187)
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.172 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.176 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.178 187189 INFO os_vif [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:39:9f:d3,bridge_name='br-int',has_traffic_filtering=True,id=a1e67d00-8650-44ba-b75d-07f55b8d8810,network=Network(240f16d8-602b-4aa1-8edb-e3a8d3674e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa1e67d00-86')
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:40.187 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[53a74dbd-1d5e-4d19-b73d-f7a9aa3716ba]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9b34af6b-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fd:40:d9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 113], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 652782, 'reachable_time': 43680, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233901, 'error': None, 'target': 'ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:40.205 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[f1cd7140-4d14-4afc-9d88-861add0ee656]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefd:40d9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 652782, 'tstamp': 652782}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233902, 'error': None, 'target': 'ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:40.222 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[438f9555-6ebf-488a-9067-9c78368fd80e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9b34af6b-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fd:40:d9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 113], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 652782, 'reachable_time': 43680, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233903, 'error': None, 'target': 'ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:40.248 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[67ea422d-88a8-4ced-89ea-dbfd3b809c1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.290 187189 DEBUG nova.virt.libvirt.driver [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.291 187189 DEBUG nova.virt.libvirt.driver [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.291 187189 DEBUG nova.virt.libvirt.driver [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] No VIF found with MAC fa:16:3e:39:9f:d3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.291 187189 INFO nova.virt.libvirt.driver [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Using config drive
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:40.308 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[9b4d3cfe-5fb7-4b63-ae9e-b29610c5366c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:40.310 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9b34af6b-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:40.310 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:40.310 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9b34af6b-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:24:40 compute-0 NetworkManager[55227]: <info>  [1764401080.3125] manager: (tap9b34af6b-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/188)
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.311 187189 DEBUG nova.network.neutron [req-c89c3a90-90d3-437d-b105-ac43d27e0b44 req-11701fdd-9974-4395-80a4-63e08560be7d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Updated VIF entry in instance network info cache for port 3484baf0-bfbb-4b67-b841-a369f9a2c534. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.313 187189 DEBUG nova.network.neutron [req-c89c3a90-90d3-437d-b105-ac43d27e0b44 req-11701fdd-9974-4395-80a4-63e08560be7d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Updating instance_info_cache with network_info: [{"id": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "address": "fa:16:3e:34:30:8b", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3484baf0-bf", "ovs_interfaceid": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:24:40 compute-0 kernel: tap9b34af6b-e0: entered promiscuous mode
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.320 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:40.320 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9b34af6b-e0, col_values=(('external_ids', {'iface-id': '88f3bff1-58a0-4231-87c4-807c4c2657d5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:24:40 compute-0 ovn_controller[95281]: 2025-11-29T07:24:40Z|00352|binding|INFO|Releasing lport 88f3bff1-58a0-4231-87c4-807c4c2657d5 from this chassis (sb_readonly=0)
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.335 187189 DEBUG oslo_concurrency.lockutils [req-c89c3a90-90d3-437d-b105-ac43d27e0b44 req-11701fdd-9974-4395-80a4-63e08560be7d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-2702fe48-44d0-408d-8d10-fd635e3779c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.336 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.337 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:40.338 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9b34af6b-edf9-4b27-b1dc-2b18c2eec958.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9b34af6b-edf9-4b27-b1dc-2b18c2eec958.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:40.339 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[96c8435f-9935-49b4-8dff-3d2960f5134c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:40.340 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-9b34af6b-edf9-4b27-b1dc-2b18c2eec958
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/9b34af6b-edf9-4b27-b1dc-2b18c2eec958.pid.haproxy
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID 9b34af6b-edf9-4b27-b1dc-2b18c2eec958
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:24:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:40.340 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'env', 'PROCESS_TAG=haproxy-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9b34af6b-edf9-4b27-b1dc-2b18c2eec958.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.352 187189 DEBUG nova.compute.manager [req-8a1539a9-9bfc-471e-818c-cad0d9236cef req-09db0890-b6da-467a-8a68-404d349cbd5b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Received event network-vif-plugged-3484baf0-bfbb-4b67-b841-a369f9a2c534 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.353 187189 DEBUG oslo_concurrency.lockutils [req-8a1539a9-9bfc-471e-818c-cad0d9236cef req-09db0890-b6da-467a-8a68-404d349cbd5b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2702fe48-44d0-408d-8d10-fd635e3779c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.353 187189 DEBUG oslo_concurrency.lockutils [req-8a1539a9-9bfc-471e-818c-cad0d9236cef req-09db0890-b6da-467a-8a68-404d349cbd5b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2702fe48-44d0-408d-8d10-fd635e3779c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.353 187189 DEBUG oslo_concurrency.lockutils [req-8a1539a9-9bfc-471e-818c-cad0d9236cef req-09db0890-b6da-467a-8a68-404d349cbd5b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2702fe48-44d0-408d-8d10-fd635e3779c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.353 187189 DEBUG nova.compute.manager [req-8a1539a9-9bfc-471e-818c-cad0d9236cef req-09db0890-b6da-467a-8a68-404d349cbd5b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] No waiting events found dispatching network-vif-plugged-3484baf0-bfbb-4b67-b841-a369f9a2c534 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.354 187189 WARNING nova.compute.manager [req-8a1539a9-9bfc-471e-818c-cad0d9236cef req-09db0890-b6da-467a-8a68-404d349cbd5b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Received unexpected event network-vif-plugged-3484baf0-bfbb-4b67-b841-a369f9a2c534 for instance with vm_state active and task_state resize_finish.
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.613 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764401080.6135664, 2702fe48-44d0-408d-8d10-fd635e3779c9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.614 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] VM Resumed (Lifecycle Event)
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.616 187189 DEBUG nova.compute.manager [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.619 187189 INFO nova.virt.libvirt.driver [-] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Instance running successfully.
Nov 29 07:24:40 compute-0 virtqemud[186729]: argument unsupported: QEMU guest agent is not configured
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.623 187189 DEBUG nova.virt.libvirt.guest [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.623 187189 DEBUG nova.virt.libvirt.driver [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.630 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.633 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.676 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] During sync_power_state the instance has a pending task (resize_finish). Skip.
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.677 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764401080.6150002, 2702fe48-44d0-408d-8d10-fd635e3779c9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.677 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] VM Started (Lifecycle Event)
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.700 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:24:40 compute-0 podman[233944]: 2025-11-29 07:24:40.787945823 +0000 UTC m=+0.111291290 container create 289dd7ff939ff073193c24d8adff8bd7c2ab1b059b013adf22cd659f84825024 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 29 07:24:40 compute-0 podman[233944]: 2025-11-29 07:24:40.696972857 +0000 UTC m=+0.020318354 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:24:40 compute-0 systemd[1]: Started libpod-conmon-289dd7ff939ff073193c24d8adff8bd7c2ab1b059b013adf22cd659f84825024.scope.
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.846 187189 INFO nova.virt.libvirt.driver [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Creating config drive at /var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/disk.config
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.850 187189 DEBUG oslo_concurrency.processutils [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpg_9x5ru5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:24:40 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:24:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7a9fa77b4055a7158239636c873de44924040c0504cef4308cb8e41e640386ab/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:24:40 compute-0 podman[233944]: 2025-11-29 07:24:40.88529929 +0000 UTC m=+0.208644807 container init 289dd7ff939ff073193c24d8adff8bd7c2ab1b059b013adf22cd659f84825024 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.886 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:24:40 compute-0 podman[233944]: 2025-11-29 07:24:40.891187756 +0000 UTC m=+0.214533223 container start 289dd7ff939ff073193c24d8adff8bd7c2ab1b059b013adf22cd659f84825024 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:24:40 compute-0 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[233960]: [NOTICE]   (233967) : New worker (233969) forked
Nov 29 07:24:40 compute-0 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[233960]: [NOTICE]   (233967) : Loading success.
Nov 29 07:24:40 compute-0 nova_compute[187185]: 2025-11-29 07:24:40.983 187189 DEBUG oslo_concurrency.processutils [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpg_9x5ru5" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:24:41 compute-0 systemd-udevd[233889]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:24:41 compute-0 NetworkManager[55227]: <info>  [1764401081.0509] manager: (tapa1e67d00-86): new Tun device (/org/freedesktop/NetworkManager/Devices/189)
Nov 29 07:24:41 compute-0 kernel: tapa1e67d00-86: entered promiscuous mode
Nov 29 07:24:41 compute-0 ovn_controller[95281]: 2025-11-29T07:24:41Z|00353|binding|INFO|Claiming lport a1e67d00-8650-44ba-b75d-07f55b8d8810 for this chassis.
Nov 29 07:24:41 compute-0 ovn_controller[95281]: 2025-11-29T07:24:41Z|00354|binding|INFO|a1e67d00-8650-44ba-b75d-07f55b8d8810: Claiming fa:16:3e:39:9f:d3 10.100.0.11
Nov 29 07:24:41 compute-0 nova_compute[187185]: 2025-11-29 07:24:41.057 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:41 compute-0 NetworkManager[55227]: <info>  [1764401081.0626] device (tapa1e67d00-86): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:24:41 compute-0 NetworkManager[55227]: <info>  [1764401081.0640] device (tapa1e67d00-86): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:41.072 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:39:9f:d3 10.100.0.11'], port_security=['fa:16:3e:39:9f:d3 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '5f11adcd-958a-4269-905d-a017406505f0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'neutron:revision_number': '2', 'neutron:security_group_ids': '339ff6a8-b11e-4176-931b-a82ab9688ace', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e0beb853-8490-4e92-a787-adc66ba47efc, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=a1e67d00-8650-44ba-b75d-07f55b8d8810) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:41.075 104254 INFO neutron.agent.ovn.metadata.agent [-] Port a1e67d00-8650-44ba-b75d-07f55b8d8810 in datapath 240f16d8-602b-4aa1-8edb-e3a8d3674e39 bound to our chassis
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:41.081 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 240f16d8-602b-4aa1-8edb-e3a8d3674e39
Nov 29 07:24:41 compute-0 systemd-machined[153486]: New machine qemu-47-instance-0000007b.
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:41.095 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[3efc0f7c-af50-4ec1-b34e-e5862d66343b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:41.096 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap240f16d8-61 in ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:41.100 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap240f16d8-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:41.100 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[b17e43ed-4e18-492c-a7df-5a21b7c86c23]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:41.102 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[2f712977-682c-40cf-9bd3-a0762b3f1e02]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:41 compute-0 nova_compute[187185]: 2025-11-29 07:24:41.117 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:41 compute-0 systemd[1]: Started Virtual Machine qemu-47-instance-0000007b.
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:41.118 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[8503aebf-af51-4094-826e-0083304dd95a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:41 compute-0 ovn_controller[95281]: 2025-11-29T07:24:41Z|00355|binding|INFO|Setting lport a1e67d00-8650-44ba-b75d-07f55b8d8810 ovn-installed in OVS
Nov 29 07:24:41 compute-0 ovn_controller[95281]: 2025-11-29T07:24:41Z|00356|binding|INFO|Setting lport a1e67d00-8650-44ba-b75d-07f55b8d8810 up in Southbound
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:41.140 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[8f40bdf6-3789-4a0e-be93-40c1276a95b4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:41 compute-0 nova_compute[187185]: 2025-11-29 07:24:41.141 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:41.181 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[d1fd560b-4dd4-429b-880c-fe4225d0cd85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:41.187 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[fd5c5767-927b-482b-9488-6601de4f5efc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:41 compute-0 NetworkManager[55227]: <info>  [1764401081.1883] manager: (tap240f16d8-60): new Veth device (/org/freedesktop/NetworkManager/Devices/190)
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:41.220 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[687c868c-286a-4b03-94a9-474a9eee1224]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:41.223 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[9068bcce-c463-409d-a2e6-0cf6ab22f83c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:41 compute-0 NetworkManager[55227]: <info>  [1764401081.2473] device (tap240f16d8-60): carrier: link connected
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:41.255 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[bf985494-c6ff-42a0-a7eb-19bc8cd9cfc7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:41.274 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[a1bb45cb-7c94-417b-9fe4-bb9bcb8eb091]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap240f16d8-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:aa:7e:40'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 115], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 652891, 'reachable_time': 22166, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234011, 'error': None, 'target': 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:41.303 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[7deb2117-a369-476a-be2b-c06e70023eae]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feaa:7e40'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 652891, 'tstamp': 652891}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234012, 'error': None, 'target': 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:41.319 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[2c4fafa4-b8b4-48b8-ae1d-ff527f35a16d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap240f16d8-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:aa:7e:40'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 115], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 652891, 'reachable_time': 22166, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234013, 'error': None, 'target': 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:41.355 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[9db814bf-ad8d-4fd2-a024-0ccb1ac62d60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:41.412 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[934f9627-98d8-44ca-baaa-f8b65d32d6ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:41.414 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap240f16d8-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:41.414 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:41.414 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap240f16d8-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:24:41 compute-0 kernel: tap240f16d8-60: entered promiscuous mode
Nov 29 07:24:41 compute-0 NetworkManager[55227]: <info>  [1764401081.4168] manager: (tap240f16d8-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/191)
Nov 29 07:24:41 compute-0 nova_compute[187185]: 2025-11-29 07:24:41.417 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:41.419 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap240f16d8-60, col_values=(('external_ids', {'iface-id': '0d1614aa-fbbf-4ad2-9ee2-8948d583ddf6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:24:41 compute-0 nova_compute[187185]: 2025-11-29 07:24:41.420 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:41 compute-0 ovn_controller[95281]: 2025-11-29T07:24:41Z|00357|binding|INFO|Releasing lport 0d1614aa-fbbf-4ad2-9ee2-8948d583ddf6 from this chassis (sb_readonly=0)
Nov 29 07:24:41 compute-0 nova_compute[187185]: 2025-11-29 07:24:41.432 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:41.433 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/240f16d8-602b-4aa1-8edb-e3a8d3674e39.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/240f16d8-602b-4aa1-8edb-e3a8d3674e39.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:41.434 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[7a3e4171-b2e9-40ce-91eb-310a532fa698]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:41.435 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-240f16d8-602b-4aa1-8edb-e3a8d3674e39
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/240f16d8-602b-4aa1-8edb-e3a8d3674e39.pid.haproxy
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID 240f16d8-602b-4aa1-8edb-e3a8d3674e39
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:24:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:41.437 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'env', 'PROCESS_TAG=haproxy-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/240f16d8-602b-4aa1-8edb-e3a8d3674e39.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:24:41 compute-0 nova_compute[187185]: 2025-11-29 07:24:41.483 187189 DEBUG nova.compute.manager [req-b4dfdd13-5923-487b-9131-4e73772c7240 req-a0278173-6e5d-45b1-ad59-0e51971b605d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Received event network-vif-plugged-a1e67d00-8650-44ba-b75d-07f55b8d8810 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:24:41 compute-0 nova_compute[187185]: 2025-11-29 07:24:41.483 187189 DEBUG oslo_concurrency.lockutils [req-b4dfdd13-5923-487b-9131-4e73772c7240 req-a0278173-6e5d-45b1-ad59-0e51971b605d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5f11adcd-958a-4269-905d-a017406505f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:24:41 compute-0 nova_compute[187185]: 2025-11-29 07:24:41.483 187189 DEBUG oslo_concurrency.lockutils [req-b4dfdd13-5923-487b-9131-4e73772c7240 req-a0278173-6e5d-45b1-ad59-0e51971b605d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f11adcd-958a-4269-905d-a017406505f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:24:41 compute-0 nova_compute[187185]: 2025-11-29 07:24:41.484 187189 DEBUG oslo_concurrency.lockutils [req-b4dfdd13-5923-487b-9131-4e73772c7240 req-a0278173-6e5d-45b1-ad59-0e51971b605d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f11adcd-958a-4269-905d-a017406505f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:24:41 compute-0 nova_compute[187185]: 2025-11-29 07:24:41.484 187189 DEBUG nova.compute.manager [req-b4dfdd13-5923-487b-9131-4e73772c7240 req-a0278173-6e5d-45b1-ad59-0e51971b605d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Processing event network-vif-plugged-a1e67d00-8650-44ba-b75d-07f55b8d8810 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 07:24:41 compute-0 nova_compute[187185]: 2025-11-29 07:24:41.704 187189 DEBUG nova.compute.manager [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:24:41 compute-0 nova_compute[187185]: 2025-11-29 07:24:41.705 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764401081.7053332, 5f11adcd-958a-4269-905d-a017406505f0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:24:41 compute-0 nova_compute[187185]: 2025-11-29 07:24:41.706 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5f11adcd-958a-4269-905d-a017406505f0] VM Started (Lifecycle Event)
Nov 29 07:24:41 compute-0 nova_compute[187185]: 2025-11-29 07:24:41.716 187189 DEBUG nova.virt.libvirt.driver [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 07:24:41 compute-0 nova_compute[187185]: 2025-11-29 07:24:41.726 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:24:41 compute-0 nova_compute[187185]: 2025-11-29 07:24:41.727 187189 INFO nova.virt.libvirt.driver [-] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Instance spawned successfully.
Nov 29 07:24:41 compute-0 nova_compute[187185]: 2025-11-29 07:24:41.727 187189 DEBUG nova.virt.libvirt.driver [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 07:24:41 compute-0 nova_compute[187185]: 2025-11-29 07:24:41.732 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:24:41 compute-0 nova_compute[187185]: 2025-11-29 07:24:41.739 187189 DEBUG nova.network.neutron [req-d3f39337-d391-42cd-86a7-da2a5100e3be req-d5a78f53-9132-44f4-9757-df439bdbb503 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Updated VIF entry in instance network info cache for port a1e67d00-8650-44ba-b75d-07f55b8d8810. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:24:41 compute-0 nova_compute[187185]: 2025-11-29 07:24:41.740 187189 DEBUG nova.network.neutron [req-d3f39337-d391-42cd-86a7-da2a5100e3be req-d5a78f53-9132-44f4-9757-df439bdbb503 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Updating instance_info_cache with network_info: [{"id": "a1e67d00-8650-44ba-b75d-07f55b8d8810", "address": "fa:16:3e:39:9f:d3", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1e67d00-86", "ovs_interfaceid": "a1e67d00-8650-44ba-b75d-07f55b8d8810", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:24:41 compute-0 nova_compute[187185]: 2025-11-29 07:24:41.755 187189 DEBUG nova.virt.libvirt.driver [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:24:41 compute-0 nova_compute[187185]: 2025-11-29 07:24:41.755 187189 DEBUG nova.virt.libvirt.driver [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:24:41 compute-0 nova_compute[187185]: 2025-11-29 07:24:41.756 187189 DEBUG nova.virt.libvirt.driver [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:24:41 compute-0 nova_compute[187185]: 2025-11-29 07:24:41.756 187189 DEBUG nova.virt.libvirt.driver [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:24:41 compute-0 nova_compute[187185]: 2025-11-29 07:24:41.757 187189 DEBUG nova.virt.libvirt.driver [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:24:41 compute-0 nova_compute[187185]: 2025-11-29 07:24:41.757 187189 DEBUG nova.virt.libvirt.driver [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:24:41 compute-0 nova_compute[187185]: 2025-11-29 07:24:41.761 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5f11adcd-958a-4269-905d-a017406505f0] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:24:41 compute-0 nova_compute[187185]: 2025-11-29 07:24:41.762 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764401081.7083428, 5f11adcd-958a-4269-905d-a017406505f0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:24:41 compute-0 nova_compute[187185]: 2025-11-29 07:24:41.762 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5f11adcd-958a-4269-905d-a017406505f0] VM Paused (Lifecycle Event)
Nov 29 07:24:41 compute-0 nova_compute[187185]: 2025-11-29 07:24:41.763 187189 DEBUG oslo_concurrency.lockutils [req-d3f39337-d391-42cd-86a7-da2a5100e3be req-d5a78f53-9132-44f4-9757-df439bdbb503 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-5f11adcd-958a-4269-905d-a017406505f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:24:41 compute-0 nova_compute[187185]: 2025-11-29 07:24:41.790 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:24:41 compute-0 nova_compute[187185]: 2025-11-29 07:24:41.793 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764401081.709034, 5f11adcd-958a-4269-905d-a017406505f0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:24:41 compute-0 nova_compute[187185]: 2025-11-29 07:24:41.793 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5f11adcd-958a-4269-905d-a017406505f0] VM Resumed (Lifecycle Event)
Nov 29 07:24:41 compute-0 nova_compute[187185]: 2025-11-29 07:24:41.815 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:24:41 compute-0 nova_compute[187185]: 2025-11-29 07:24:41.817 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:24:41 compute-0 nova_compute[187185]: 2025-11-29 07:24:41.846 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5f11adcd-958a-4269-905d-a017406505f0] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:24:41 compute-0 nova_compute[187185]: 2025-11-29 07:24:41.849 187189 INFO nova.compute.manager [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Took 4.54 seconds to spawn the instance on the hypervisor.
Nov 29 07:24:41 compute-0 nova_compute[187185]: 2025-11-29 07:24:41.849 187189 DEBUG nova.compute.manager [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:24:41 compute-0 podman[234050]: 2025-11-29 07:24:41.78972229 +0000 UTC m=+0.023332559 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:24:41 compute-0 nova_compute[187185]: 2025-11-29 07:24:41.945 187189 INFO nova.compute.manager [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Took 5.05 seconds to build instance.
Nov 29 07:24:41 compute-0 nova_compute[187185]: 2025-11-29 07:24:41.964 187189 DEBUG oslo_concurrency.lockutils [None req-6cd2ce26-d11c-4a25-a4b1-e5b5758def16 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "5f11adcd-958a-4269-905d-a017406505f0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.161s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:24:42 compute-0 sshd-session[234063]: Accepted publickey for nova from 192.168.122.101 port 59798 ssh2: ECDSA SHA256:TRxHqyAgYthLa8Amy2/P9EvhpVYcaH8++okwzvcHmaY
Nov 29 07:24:42 compute-0 systemd-logind[788]: New session 44 of user nova.
Nov 29 07:24:42 compute-0 systemd[1]: Started Session 44 of User nova.
Nov 29 07:24:42 compute-0 sshd-session[234063]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 29 07:24:42 compute-0 podman[234050]: 2025-11-29 07:24:42.366053377 +0000 UTC m=+0.599663616 container create 2f01ae3b54bfd5b2b064416d67b4ec240bcdc698e67290cf758c005b6164c851 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Nov 29 07:24:42 compute-0 nova_compute[187185]: 2025-11-29 07:24:42.490 187189 DEBUG nova.compute.manager [req-a2606dcf-5427-4e1a-91db-9be593f5223a req-eb7380d0-4699-41ba-a7da-189ec5ac3f40 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Received event network-vif-plugged-3484baf0-bfbb-4b67-b841-a369f9a2c534 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:24:42 compute-0 nova_compute[187185]: 2025-11-29 07:24:42.491 187189 DEBUG oslo_concurrency.lockutils [req-a2606dcf-5427-4e1a-91db-9be593f5223a req-eb7380d0-4699-41ba-a7da-189ec5ac3f40 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2702fe48-44d0-408d-8d10-fd635e3779c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:24:42 compute-0 nova_compute[187185]: 2025-11-29 07:24:42.491 187189 DEBUG oslo_concurrency.lockutils [req-a2606dcf-5427-4e1a-91db-9be593f5223a req-eb7380d0-4699-41ba-a7da-189ec5ac3f40 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2702fe48-44d0-408d-8d10-fd635e3779c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:24:42 compute-0 nova_compute[187185]: 2025-11-29 07:24:42.491 187189 DEBUG oslo_concurrency.lockutils [req-a2606dcf-5427-4e1a-91db-9be593f5223a req-eb7380d0-4699-41ba-a7da-189ec5ac3f40 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2702fe48-44d0-408d-8d10-fd635e3779c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:24:42 compute-0 nova_compute[187185]: 2025-11-29 07:24:42.492 187189 DEBUG nova.compute.manager [req-a2606dcf-5427-4e1a-91db-9be593f5223a req-eb7380d0-4699-41ba-a7da-189ec5ac3f40 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] No waiting events found dispatching network-vif-plugged-3484baf0-bfbb-4b67-b841-a369f9a2c534 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:24:42 compute-0 nova_compute[187185]: 2025-11-29 07:24:42.492 187189 WARNING nova.compute.manager [req-a2606dcf-5427-4e1a-91db-9be593f5223a req-eb7380d0-4699-41ba-a7da-189ec5ac3f40 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Received unexpected event network-vif-plugged-3484baf0-bfbb-4b67-b841-a369f9a2c534 for instance with vm_state resized and task_state None.
Nov 29 07:24:42 compute-0 sshd-session[234066]: Received disconnect from 192.168.122.101 port 59798:11: disconnected by user
Nov 29 07:24:42 compute-0 sshd-session[234066]: Disconnected from user nova 192.168.122.101 port 59798
Nov 29 07:24:42 compute-0 sshd-session[234063]: pam_unix(sshd:session): session closed for user nova
Nov 29 07:24:42 compute-0 systemd[1]: session-44.scope: Deactivated successfully.
Nov 29 07:24:42 compute-0 systemd-logind[788]: Session 44 logged out. Waiting for processes to exit.
Nov 29 07:24:42 compute-0 systemd-logind[788]: Removed session 44.
Nov 29 07:24:42 compute-0 sshd-session[234068]: Accepted publickey for nova from 192.168.122.101 port 59800 ssh2: ECDSA SHA256:TRxHqyAgYthLa8Amy2/P9EvhpVYcaH8++okwzvcHmaY
Nov 29 07:24:42 compute-0 systemd-logind[788]: New session 45 of user nova.
Nov 29 07:24:42 compute-0 systemd[1]: Started Session 45 of User nova.
Nov 29 07:24:42 compute-0 sshd-session[234068]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 29 07:24:42 compute-0 nova_compute[187185]: 2025-11-29 07:24:42.940 187189 DEBUG nova.compute.manager [None req-96604e5f-4fc4-4458-9022-5f71dc221c3e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:24:43 compute-0 nova_compute[187185]: 2025-11-29 07:24:43.029 187189 INFO nova.compute.manager [None req-96604e5f-4fc4-4458-9022-5f71dc221c3e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] instance snapshotting
Nov 29 07:24:43 compute-0 sshd-session[234071]: Received disconnect from 192.168.122.101 port 59800:11: disconnected by user
Nov 29 07:24:43 compute-0 sshd-session[234071]: Disconnected from user nova 192.168.122.101 port 59800
Nov 29 07:24:43 compute-0 sshd-session[234068]: pam_unix(sshd:session): session closed for user nova
Nov 29 07:24:43 compute-0 systemd[1]: session-45.scope: Deactivated successfully.
Nov 29 07:24:43 compute-0 systemd-logind[788]: Session 45 logged out. Waiting for processes to exit.
Nov 29 07:24:43 compute-0 systemd-logind[788]: Removed session 45.
Nov 29 07:24:43 compute-0 systemd[1]: Started libpod-conmon-2f01ae3b54bfd5b2b064416d67b4ec240bcdc698e67290cf758c005b6164c851.scope.
Nov 29 07:24:43 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:24:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5a87177512692a664580163c7f0bd904ce12b38b73b317c7fb430e8a6e2d7c3e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:24:43 compute-0 sshd-session[234078]: Accepted publickey for nova from 192.168.122.101 port 59814 ssh2: ECDSA SHA256:TRxHqyAgYthLa8Amy2/P9EvhpVYcaH8++okwzvcHmaY
Nov 29 07:24:43 compute-0 podman[234050]: 2025-11-29 07:24:43.218477132 +0000 UTC m=+1.452087451 container init 2f01ae3b54bfd5b2b064416d67b4ec240bcdc698e67290cf758c005b6164c851 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Nov 29 07:24:43 compute-0 systemd-logind[788]: New session 46 of user nova.
Nov 29 07:24:43 compute-0 podman[234050]: 2025-11-29 07:24:43.232345693 +0000 UTC m=+1.465955932 container start 2f01ae3b54bfd5b2b064416d67b4ec240bcdc698e67290cf758c005b6164c851 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 07:24:43 compute-0 systemd[1]: Started Session 46 of User nova.
Nov 29 07:24:43 compute-0 sshd-session[234078]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 29 07:24:43 compute-0 neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39[234075]: [NOTICE]   (234082) : New worker (234085) forked
Nov 29 07:24:43 compute-0 neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39[234075]: [NOTICE]   (234082) : Loading success.
Nov 29 07:24:43 compute-0 nova_compute[187185]: 2025-11-29 07:24:43.276 187189 INFO nova.virt.libvirt.driver [None req-96604e5f-4fc4-4458-9022-5f71dc221c3e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Beginning live snapshot process
Nov 29 07:24:43 compute-0 sshd-session[234084]: Received disconnect from 192.168.122.101 port 59814:11: disconnected by user
Nov 29 07:24:43 compute-0 sshd-session[234084]: Disconnected from user nova 192.168.122.101 port 59814
Nov 29 07:24:43 compute-0 sshd-session[234078]: pam_unix(sshd:session): session closed for user nova
Nov 29 07:24:43 compute-0 systemd[1]: session-46.scope: Deactivated successfully.
Nov 29 07:24:43 compute-0 systemd-logind[788]: Session 46 logged out. Waiting for processes to exit.
Nov 29 07:24:43 compute-0 systemd-logind[788]: Removed session 46.
Nov 29 07:24:43 compute-0 virtqemud[186729]: invalid argument: disk vda does not have an active block job
Nov 29 07:24:43 compute-0 nova_compute[187185]: 2025-11-29 07:24:43.511 187189 DEBUG oslo_concurrency.processutils [None req-96604e5f-4fc4-4458-9022-5f71dc221c3e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:24:43 compute-0 nova_compute[187185]: 2025-11-29 07:24:43.583 187189 DEBUG nova.compute.manager [req-38ee1941-7f94-48f7-bb3a-44ab1ac47c91 req-7851b78b-8435-46c0-af62-56cec340268d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Received event network-vif-plugged-a1e67d00-8650-44ba-b75d-07f55b8d8810 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:24:43 compute-0 nova_compute[187185]: 2025-11-29 07:24:43.584 187189 DEBUG oslo_concurrency.lockutils [req-38ee1941-7f94-48f7-bb3a-44ab1ac47c91 req-7851b78b-8435-46c0-af62-56cec340268d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5f11adcd-958a-4269-905d-a017406505f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:24:43 compute-0 nova_compute[187185]: 2025-11-29 07:24:43.584 187189 DEBUG oslo_concurrency.lockutils [req-38ee1941-7f94-48f7-bb3a-44ab1ac47c91 req-7851b78b-8435-46c0-af62-56cec340268d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f11adcd-958a-4269-905d-a017406505f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:24:43 compute-0 nova_compute[187185]: 2025-11-29 07:24:43.584 187189 DEBUG oslo_concurrency.lockutils [req-38ee1941-7f94-48f7-bb3a-44ab1ac47c91 req-7851b78b-8435-46c0-af62-56cec340268d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f11adcd-958a-4269-905d-a017406505f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:24:43 compute-0 nova_compute[187185]: 2025-11-29 07:24:43.585 187189 DEBUG nova.compute.manager [req-38ee1941-7f94-48f7-bb3a-44ab1ac47c91 req-7851b78b-8435-46c0-af62-56cec340268d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] No waiting events found dispatching network-vif-plugged-a1e67d00-8650-44ba-b75d-07f55b8d8810 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:24:43 compute-0 nova_compute[187185]: 2025-11-29 07:24:43.585 187189 WARNING nova.compute.manager [req-38ee1941-7f94-48f7-bb3a-44ab1ac47c91 req-7851b78b-8435-46c0-af62-56cec340268d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Received unexpected event network-vif-plugged-a1e67d00-8650-44ba-b75d-07f55b8d8810 for instance with vm_state active and task_state image_pending_upload.
Nov 29 07:24:43 compute-0 nova_compute[187185]: 2025-11-29 07:24:43.613 187189 DEBUG oslo_concurrency.processutils [None req-96604e5f-4fc4-4458-9022-5f71dc221c3e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/disk --force-share --output=json -f qcow2" returned: 0 in 0.102s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:24:43 compute-0 nova_compute[187185]: 2025-11-29 07:24:43.615 187189 DEBUG oslo_concurrency.processutils [None req-96604e5f-4fc4-4458-9022-5f71dc221c3e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:24:43 compute-0 nova_compute[187185]: 2025-11-29 07:24:43.699 187189 DEBUG oslo_concurrency.processutils [None req-96604e5f-4fc4-4458-9022-5f71dc221c3e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/disk --force-share --output=json -f qcow2" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:24:43 compute-0 nova_compute[187185]: 2025-11-29 07:24:43.731 187189 DEBUG oslo_concurrency.processutils [None req-96604e5f-4fc4-4458-9022-5f71dc221c3e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:24:43 compute-0 nova_compute[187185]: 2025-11-29 07:24:43.825 187189 DEBUG oslo_concurrency.processutils [None req-96604e5f-4fc4-4458-9022-5f71dc221c3e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:24:43 compute-0 nova_compute[187185]: 2025-11-29 07:24:43.826 187189 DEBUG oslo_concurrency.processutils [None req-96604e5f-4fc4-4458-9022-5f71dc221c3e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpxffofv8e/8421ae9395de47afa62f14d8a5a4f0ad.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:24:43 compute-0 nova_compute[187185]: 2025-11-29 07:24:43.867 187189 DEBUG oslo_concurrency.processutils [None req-96604e5f-4fc4-4458-9022-5f71dc221c3e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpxffofv8e/8421ae9395de47afa62f14d8a5a4f0ad.delta 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:24:43 compute-0 nova_compute[187185]: 2025-11-29 07:24:43.868 187189 INFO nova.virt.libvirt.driver [None req-96604e5f-4fc4-4458-9022-5f71dc221c3e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Quiescing instance not available: QEMU guest agent is not enabled.
Nov 29 07:24:43 compute-0 nova_compute[187185]: 2025-11-29 07:24:43.930 187189 DEBUG nova.virt.libvirt.guest [None req-96604e5f-4fc4-4458-9022-5f71dc221c3e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Nov 29 07:24:43 compute-0 nova_compute[187185]: 2025-11-29 07:24:43.936 187189 INFO nova.virt.libvirt.driver [None req-96604e5f-4fc4-4458-9022-5f71dc221c3e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Skipping quiescing instance: QEMU guest agent is not enabled.
Nov 29 07:24:43 compute-0 nova_compute[187185]: 2025-11-29 07:24:43.983 187189 INFO nova.network.neutron [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Updating port e89dd8de-f981-46cf-aa04-cfad6a9b2326 with attributes {'binding:host_id': 'compute-0.ctlplane.example.com', 'device_owner': 'compute:nova'}
Nov 29 07:24:43 compute-0 nova_compute[187185]: 2025-11-29 07:24:43.989 187189 DEBUG nova.privsep.utils [None req-96604e5f-4fc4-4458-9022-5f71dc221c3e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 29 07:24:43 compute-0 nova_compute[187185]: 2025-11-29 07:24:43.990 187189 DEBUG oslo_concurrency.processutils [None req-96604e5f-4fc4-4458-9022-5f71dc221c3e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpxffofv8e/8421ae9395de47afa62f14d8a5a4f0ad.delta /var/lib/nova/instances/snapshots/tmpxffofv8e/8421ae9395de47afa62f14d8a5a4f0ad execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:24:44 compute-0 nova_compute[187185]: 2025-11-29 07:24:44.217 187189 DEBUG oslo_concurrency.processutils [None req-96604e5f-4fc4-4458-9022-5f71dc221c3e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpxffofv8e/8421ae9395de47afa62f14d8a5a4f0ad.delta /var/lib/nova/instances/snapshots/tmpxffofv8e/8421ae9395de47afa62f14d8a5a4f0ad" returned: 0 in 0.227s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:24:44 compute-0 nova_compute[187185]: 2025-11-29 07:24:44.218 187189 INFO nova.virt.libvirt.driver [None req-96604e5f-4fc4-4458-9022-5f71dc221c3e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Snapshot extracted, beginning image upload
Nov 29 07:24:44 compute-0 nova_compute[187185]: 2025-11-29 07:24:44.482 187189 DEBUG nova.compute.manager [req-66b64ba3-279a-4d99-b56e-7e154fd9d698 req-d8c3565a-32f6-4062-9f41-e11c2d4f2d1d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Received event network-vif-unplugged-e89dd8de-f981-46cf-aa04-cfad6a9b2326 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:24:44 compute-0 nova_compute[187185]: 2025-11-29 07:24:44.483 187189 DEBUG oslo_concurrency.lockutils [req-66b64ba3-279a-4d99-b56e-7e154fd9d698 req-d8c3565a-32f6-4062-9f41-e11c2d4f2d1d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7c10cb24-586c-4507-8169-8258d7136397-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:24:44 compute-0 nova_compute[187185]: 2025-11-29 07:24:44.483 187189 DEBUG oslo_concurrency.lockutils [req-66b64ba3-279a-4d99-b56e-7e154fd9d698 req-d8c3565a-32f6-4062-9f41-e11c2d4f2d1d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7c10cb24-586c-4507-8169-8258d7136397-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:24:44 compute-0 nova_compute[187185]: 2025-11-29 07:24:44.483 187189 DEBUG oslo_concurrency.lockutils [req-66b64ba3-279a-4d99-b56e-7e154fd9d698 req-d8c3565a-32f6-4062-9f41-e11c2d4f2d1d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7c10cb24-586c-4507-8169-8258d7136397-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:24:44 compute-0 nova_compute[187185]: 2025-11-29 07:24:44.483 187189 DEBUG nova.compute.manager [req-66b64ba3-279a-4d99-b56e-7e154fd9d698 req-d8c3565a-32f6-4062-9f41-e11c2d4f2d1d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] No waiting events found dispatching network-vif-unplugged-e89dd8de-f981-46cf-aa04-cfad6a9b2326 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:24:44 compute-0 nova_compute[187185]: 2025-11-29 07:24:44.484 187189 WARNING nova.compute.manager [req-66b64ba3-279a-4d99-b56e-7e154fd9d698 req-d8c3565a-32f6-4062-9f41-e11c2d4f2d1d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Received unexpected event network-vif-unplugged-e89dd8de-f981-46cf-aa04-cfad6a9b2326 for instance with vm_state active and task_state resize_migrated.
Nov 29 07:24:45 compute-0 nova_compute[187185]: 2025-11-29 07:24:45.171 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:46 compute-0 nova_compute[187185]: 2025-11-29 07:24:46.186 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:48.001 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}dd148b1b64568516e8e9c4f7dca4b2c96ed9e7a6d38f1387503b8184f9bf9013" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Nov 29 07:24:48 compute-0 podman[234123]: 2025-11-29 07:24:48.812600566 +0000 UTC m=+0.069003497 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 07:24:48 compute-0 podman[234121]: 2025-11-29 07:24:48.818333078 +0000 UTC m=+0.072171057 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Nov 29 07:24:48 compute-0 podman[234122]: 2025-11-29 07:24:48.842740926 +0000 UTC m=+0.100876236 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vendor=Red Hat, Inc., vcs-type=git, version=9.6, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, architecture=x86_64, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.expose-services=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.340 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 644 Content-Type: application/json Date: Sat, 29 Nov 2025 07:24:48 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-ef1c458b-ee16-4240-943f-51194927c41a x-openstack-request-id: req-ef1c458b-ee16-4240-943f-51194927c41a _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.341 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31"}]}, {"id": "e29df891-dca5-4a1c-9258-dc512a46956f", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/e29df891-dca5-4a1c-9258-dc512a46956f"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/e29df891-dca5-4a1c-9258-dc512a46956f"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.341 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-ef1c458b-ee16-4240-943f-51194927c41a request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.343 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/e29df891-dca5-4a1c-9258-dc512a46956f -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}dd148b1b64568516e8e9c4f7dca4b2c96ed9e7a6d38f1387503b8184f9bf9013" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.409 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 496 Content-Type: application/json Date: Sat, 29 Nov 2025 07:24:49 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-5465545e-d7be-4ed8-b39e-1f34a9361391 x-openstack-request-id: req-5465545e-d7be-4ed8-b39e-1f34a9361391 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.409 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "e29df891-dca5-4a1c-9258-dc512a46956f", "name": "m1.micro", "ram": 192, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/e29df891-dca5-4a1c-9258-dc512a46956f"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/e29df891-dca5-4a1c-9258-dc512a46956f"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.409 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/e29df891-dca5-4a1c-9258-dc512a46956f used request id req-5465545e-d7be-4ed8-b39e-1f34a9361391 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.410 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '2702fe48-44d0-408d-8d10-fd635e3779c9', 'name': 'tempest-ServerDiskConfigTestJSON-server-864835491', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000007a', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '6d55e57bfd184513a304a61cc1cb3730', 'user_id': '000fb7b950024e16902cd58f2ea16ac9', 'hostId': 'b73f865756412e547668d2c5fefa1e20eb689217c3e5e5cfc8cbb829', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.413 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '5f11adcd-958a-4269-905d-a017406505f0', 'name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000007b', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'hostId': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.414 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.452 12 DEBUG ceilometer.compute.pollsters [-] 2702fe48-44d0-408d-8d10-fd635e3779c9/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.452 12 DEBUG ceilometer.compute.pollsters [-] 2702fe48-44d0-408d-8d10-fd635e3779c9/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.482 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.483 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bf0e6e78-f892-44a3-91c3-dcd19949d476', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '000fb7b950024e16902cd58f2ea16ac9', 'user_name': None, 'project_id': '6d55e57bfd184513a304a61cc1cb3730', 'project_name': None, 'resource_id': '2702fe48-44d0-408d-8d10-fd635e3779c9-vda', 'timestamp': '2025-11-29T07:24:49.414329', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-864835491', 'name': 'instance-0000007a', 'instance_id': '2702fe48-44d0-408d-8d10-fd635e3779c9', 'instance_type': 'm1.micro', 'host': 'b73f865756412e547668d2c5fefa1e20eb689217c3e5e5cfc8cbb829', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7d4942f2-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.132567412, 'message_signature': '01a79b00628678aa12ecc4c5ebaefe232e59f3ee6967c701ba01d2e05e66c465'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '000fb7b950024e16902cd58f2ea16ac9', 'user_name': None, 'project_id': '6d55e57bfd184513a304a61cc1cb3730', 'project_name': None, 'resource_id': '2702fe48-44d0-408d-8d10-fd635e3779c9-sda', 'timestamp': '2025-11-29T07:24:49.414329', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-864835491', 'name': 'instance-0000007a', 'instance_id': '2702fe48-44d0-408d-8d10-fd635e3779c9', 'instance_type': 'm1.micro', 'host': 'b73f865756412e547668d2c5fefa1e20eb689217c3e5e5cfc8cbb829', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7d495210-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.132567412, 'message_signature': '6c5e0b7da31f47395ae889f29738fe72625444dffd8d2aeee525c6b71fd5c15b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '5f11adcd-958a-4269-905d-a017406505f0-vda', 'timestamp': '2025-11-29T07:24:49.414329', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'instance-0000007b', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7d4de1f4-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.171234672, 'message_signature': '1ea0a017ac052d32d85fd0789df7a6e1d1058b8e802e9edc05ad909443c7d71e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '5f11adcd-958a-4269-905d-a017406505f0-sda', 'timestamp': '2025-11-29T07:24:49.414329', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'instance-0000007b', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7d4df0fe-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.171234672, 'message_signature': 'd763c015514aafef0b096b6ed027c0c8a33ac95e0697b81a9ae83c6b39c75944'}]}, 'timestamp': '2025-11-29 07:24:49.483304', '_unique_id': 'da8a7968436f490fa5051bf330692fa8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.484 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.485 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.490 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 2702fe48-44d0-408d-8d10-fd635e3779c9 / tap3484baf0-bf inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.490 12 DEBUG ceilometer.compute.pollsters [-] 2702fe48-44d0-408d-8d10-fd635e3779c9/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.492 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 5f11adcd-958a-4269-905d-a017406505f0 / tapa1e67d00-86 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.492 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '34491c66-5a8f-4a25-b7de-0f8e09a13a89', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '000fb7b950024e16902cd58f2ea16ac9', 'user_name': None, 'project_id': '6d55e57bfd184513a304a61cc1cb3730', 'project_name': None, 'resource_id': 'instance-0000007a-2702fe48-44d0-408d-8d10-fd635e3779c9-tap3484baf0-bf', 'timestamp': '2025-11-29T07:24:49.485603', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-864835491', 'name': 'tap3484baf0-bf', 'instance_id': '2702fe48-44d0-408d-8d10-fd635e3779c9', 'instance_type': 'm1.micro', 'host': 'b73f865756412e547668d2c5fefa1e20eb689217c3e5e5cfc8cbb829', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:34:30:8b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3484baf0-bf'}, 'message_id': '7d4f12d6-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.203842472, 'message_signature': '11046fbed6979179029d6487661cac049888c1eab413fdc5bbb8980a3bb738d6'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'instance-0000007b-5f11adcd-958a-4269-905d-a017406505f0-tapa1e67d00-86', 'timestamp': '2025-11-29T07:24:49.485603', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'tapa1e67d00-86', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:39:9f:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa1e67d00-86'}, 'message_id': '7d4f6e48-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.208955766, 'message_signature': 'ed15eb6b2d72b80e4eb62e01e82d8abed66681070981a6d0561860ddd7b2039e'}]}, 'timestamp': '2025-11-29 07:24:49.493079', '_unique_id': '051825df79e04da9a27a2375d3767588'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.494 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.495 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.510 12 DEBUG ceilometer.compute.pollsters [-] 2702fe48-44d0-408d-8d10-fd635e3779c9/disk.device.allocation volume: 29954048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.511 12 DEBUG ceilometer.compute.pollsters [-] 2702fe48-44d0-408d-8d10-fd635e3779c9/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.523 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.524 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f88a507b-ae9a-46db-afec-0c52850a8c8b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29954048, 'user_id': '000fb7b950024e16902cd58f2ea16ac9', 'user_name': None, 'project_id': '6d55e57bfd184513a304a61cc1cb3730', 'project_name': None, 'resource_id': '2702fe48-44d0-408d-8d10-fd635e3779c9-vda', 'timestamp': '2025-11-29T07:24:49.495818', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-864835491', 'name': 'instance-0000007a', 'instance_id': '2702fe48-44d0-408d-8d10-fd635e3779c9', 'instance_type': 'm1.micro', 'host': 'b73f865756412e547668d2c5fefa1e20eb689217c3e5e5cfc8cbb829', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7d523af6-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.214081301, 'message_signature': '2862e8119e96e715094a8e0b3c68ef4b7939038bcc52e93b623aeaa800ae4bb3'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '000fb7b950024e16902cd58f2ea16ac9', 'user_name': None, 'project_id': '6d55e57bfd184513a304a61cc1cb3730', 'project_name': None, 
'resource_id': '2702fe48-44d0-408d-8d10-fd635e3779c9-sda', 'timestamp': '2025-11-29T07:24:49.495818', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-864835491', 'name': 'instance-0000007a', 'instance_id': '2702fe48-44d0-408d-8d10-fd635e3779c9', 'instance_type': 'm1.micro', 'host': 'b73f865756412e547668d2c5fefa1e20eb689217c3e5e5cfc8cbb829', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7d524fbe-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.214081301, 'message_signature': 'ffb8f8d9282bb0d74b35f148098813365406f7c8ec08ed0f7cb90a5d20d049e2'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '5f11adcd-958a-4269-905d-a017406505f0-vda', 'timestamp': '2025-11-29T07:24:49.495818', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'instance-0000007b', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7d542924-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.230263157, 'message_signature': 'fdf84ca78c046efc7e354a2f680753a1a1b2615ea69454d42264d3ee466dee58'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '5f11adcd-958a-4269-905d-a017406505f0-sda', 'timestamp': '2025-11-29T07:24:49.495818', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'instance-0000007b', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7d54382e-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.230263157, 'message_signature': '61631a1e7aae120af70bf85659e3e8f654ee86fbfc9c05529ea783e0e1ec1a32'}]}, 'timestamp': '2025-11-29 07:24:49.524489', '_unique_id': '6542ee4156704b87a303ac847eb4e536'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.525 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.526 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.527 12 DEBUG ceilometer.compute.pollsters [-] 2702fe48-44d0-408d-8d10-fd635e3779c9/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.527 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4638f374-26ba-4d9e-9efe-78d1dbb0d2de', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '000fb7b950024e16902cd58f2ea16ac9', 'user_name': None, 'project_id': '6d55e57bfd184513a304a61cc1cb3730', 'project_name': None, 'resource_id': 'instance-0000007a-2702fe48-44d0-408d-8d10-fd635e3779c9-tap3484baf0-bf', 'timestamp': '2025-11-29T07:24:49.527062', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-864835491', 'name': 'tap3484baf0-bf', 'instance_id': '2702fe48-44d0-408d-8d10-fd635e3779c9', 'instance_type': 'm1.micro', 'host': 'b73f865756412e547668d2c5fefa1e20eb689217c3e5e5cfc8cbb829', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:34:30:8b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3484baf0-bf'}, 'message_id': '7d54abd8-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.203842472, 'message_signature': '01bbb8b92ca9ec694e912dbc259326aaf4e11bcb7c5e1d89454b5eda8ef52af6'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'instance-0000007b-5f11adcd-958a-4269-905d-a017406505f0-tapa1e67d00-86', 'timestamp': '2025-11-29T07:24:49.527062', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'tapa1e67d00-86', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:39:9f:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa1e67d00-86'}, 'message_id': '7d54b90c-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.208955766, 'message_signature': 'fa47f7f7fe7b465dee61287942fa28120b89afbeac562e7b8cce4a4ab00b5fb5'}]}, 'timestamp': '2025-11-29 07:24:49.527777', '_unique_id': '392ba9042cf64e2091ca127951e49a9c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.528 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.529 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.529 12 DEBUG ceilometer.compute.pollsters [-] 2702fe48-44d0-408d-8d10-fd635e3779c9/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.530 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bd879ab9-9fe4-4242-8fca-74de362df30b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '000fb7b950024e16902cd58f2ea16ac9', 'user_name': None, 'project_id': '6d55e57bfd184513a304a61cc1cb3730', 'project_name': None, 'resource_id': 'instance-0000007a-2702fe48-44d0-408d-8d10-fd635e3779c9-tap3484baf0-bf', 'timestamp': '2025-11-29T07:24:49.529700', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-864835491', 'name': 'tap3484baf0-bf', 'instance_id': '2702fe48-44d0-408d-8d10-fd635e3779c9', 'instance_type': 'm1.micro', 'host': 'b73f865756412e547668d2c5fefa1e20eb689217c3e5e5cfc8cbb829', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:34:30:8b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3484baf0-bf'}, 'message_id': '7d5512b2-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.203842472, 'message_signature': '5360df3836a526e6c19d324b3667f797f1795aeceaa12d38b99ee1c6a099515c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'instance-0000007b-5f11adcd-958a-4269-905d-a017406505f0-tapa1e67d00-86', 'timestamp': '2025-11-29T07:24:49.529700', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'tapa1e67d00-86', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:39:9f:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa1e67d00-86'}, 'message_id': '7d551fdc-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.208955766, 'message_signature': '6d6dbb810e82c63be577d2c7490e49c7e1e97cc24e3810585fbdd8d308ffa23d'}]}, 'timestamp': '2025-11-29 07:24:49.530398', '_unique_id': '81f3a23794fb42b296d2de426fdd87ba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.531 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.532 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.532 12 DEBUG ceilometer.compute.pollsters [-] 2702fe48-44d0-408d-8d10-fd635e3779c9/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.532 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '25ca9244-1cb4-413d-8ed6-dbb7389d739e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '000fb7b950024e16902cd58f2ea16ac9', 'user_name': None, 'project_id': '6d55e57bfd184513a304a61cc1cb3730', 'project_name': None, 'resource_id': 'instance-0000007a-2702fe48-44d0-408d-8d10-fd635e3779c9-tap3484baf0-bf', 'timestamp': '2025-11-29T07:24:49.532274', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-864835491', 'name': 'tap3484baf0-bf', 'instance_id': '2702fe48-44d0-408d-8d10-fd635e3779c9', 'instance_type': 'm1.micro', 'host': 'b73f865756412e547668d2c5fefa1e20eb689217c3e5e5cfc8cbb829', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:34:30:8b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3484baf0-bf'}, 'message_id': '7d5576e4-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.203842472, 'message_signature': 'aa959aecc68939affd37307b55ee5d3a03c5b9ef36f97c6a0f84ef79b4f04f03'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'instance-0000007b-5f11adcd-958a-4269-905d-a017406505f0-tapa1e67d00-86', 'timestamp': '2025-11-29T07:24:49.532274', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'tapa1e67d00-86', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:39:9f:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa1e67d00-86'}, 'message_id': '7d5585e4-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.208955766, 'message_signature': 'e9c47b89a8994b1bad2cfa79c0f18ec370e52a498d2e53df10a9f88c5fed5fc0'}]}, 'timestamp': '2025-11-29 07:24:49.533025', '_unique_id': 'ec57661000794333a1a69a35fc41a232'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.533 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.534 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.534 12 DEBUG ceilometer.compute.pollsters [-] 2702fe48-44d0-408d-8d10-fd635e3779c9/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.535 12 DEBUG ceilometer.compute.pollsters [-] 2702fe48-44d0-408d-8d10-fd635e3779c9/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.535 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.535 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fba78b74-929f-4d51-9fe4-b4b2f140935e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '000fb7b950024e16902cd58f2ea16ac9', 'user_name': None, 'project_id': '6d55e57bfd184513a304a61cc1cb3730', 'project_name': None, 'resource_id': '2702fe48-44d0-408d-8d10-fd635e3779c9-vda', 'timestamp': '2025-11-29T07:24:49.534775', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-864835491', 'name': 'instance-0000007a', 'instance_id': '2702fe48-44d0-408d-8d10-fd635e3779c9', 'instance_type': 'm1.micro', 'host': 'b73f865756412e547668d2c5fefa1e20eb689217c3e5e5cfc8cbb829', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7d55d92c-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.214081301, 'message_signature': '84d39b1e528a03a8bc36f14bb661c16fc48f63c76cabddbedd561ab64ecd91d4'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '000fb7b950024e16902cd58f2ea16ac9', 'user_name': None, 'project_id': '6d55e57bfd184513a304a61cc1cb3730', 'project_name': None, 'resource_id': '2702fe48-44d0-408d-8d10-fd635e3779c9-sda', 'timestamp': '2025-11-29T07:24:49.534775', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-864835491', 'name': 'instance-0000007a', 'instance_id': '2702fe48-44d0-408d-8d10-fd635e3779c9', 'instance_type': 'm1.micro', 'host': 'b73f865756412e547668d2c5fefa1e20eb689217c3e5e5cfc8cbb829', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7d55e5ca-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.214081301, 'message_signature': '17a85dbdb804b94ae58c930cfd4faa13028ccf8b82216ee703730644545070fb'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '5f11adcd-958a-4269-905d-a017406505f0-vda', 'timestamp': '2025-11-29T07:24:49.534775', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'instance-0000007b', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7d55f1e6-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.230263157, 'message_signature': '31effd58fbe84de04c8e29f0356a89ed1a8178176844b2fa044cfa16668a8978'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '5f11adcd-958a-4269-905d-a017406505f0-sda', 'timestamp': '2025-11-29T07:24:49.534775', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'instance-0000007b', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7d55fefc-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.230263157, 'message_signature': 'd9d2ea0b072a94149ab76370b55d7e82502f4f6fe886fb0b93a10d351cd65545'}]}, 'timestamp': '2025-11-29 07:24:49.536110', '_unique_id': '67cd36ebf2334f28a3148777459620c4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.536 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.537 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.558 12 DEBUG ceilometer.compute.pollsters [-] 2702fe48-44d0-408d-8d10-fd635e3779c9/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.558 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 2702fe48-44d0-408d-8d10-fd635e3779c9: ceilometer.compute.pollsters.NoVolumeException
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.575 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.575 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 5f11adcd-958a-4269-905d-a017406505f0: ceilometer.compute.pollsters.NoVolumeException
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.575 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.576 12 DEBUG ceilometer.compute.pollsters [-] 2702fe48-44d0-408d-8d10-fd635e3779c9/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.576 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0a1efd5c-e1d7-402a-9a6e-e762a4ee74d5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '000fb7b950024e16902cd58f2ea16ac9', 'user_name': None, 'project_id': '6d55e57bfd184513a304a61cc1cb3730', 'project_name': None, 'resource_id': 'instance-0000007a-2702fe48-44d0-408d-8d10-fd635e3779c9-tap3484baf0-bf', 'timestamp': '2025-11-29T07:24:49.576201', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-864835491', 'name': 'tap3484baf0-bf', 'instance_id': '2702fe48-44d0-408d-8d10-fd635e3779c9', 'instance_type': 'm1.micro', 'host': 'b73f865756412e547668d2c5fefa1e20eb689217c3e5e5cfc8cbb829', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:34:30:8b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3484baf0-bf'}, 'message_id': '7d5c2e30-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.203842472, 'message_signature': 'ebbda38a932b42e9bbe9d6fcf514afc638627dbc62cc1fcf727ed1e033f06210'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'instance-0000007b-5f11adcd-958a-4269-905d-a017406505f0-tapa1e67d00-86', 'timestamp': '2025-11-29T07:24:49.576201', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'tapa1e67d00-86', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:39:9f:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa1e67d00-86'}, 'message_id': '7d5c3f06-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.208955766, 'message_signature': '549a6b8420cdcb16c8a2c85a2c7db4b92c4f03819e45e8e90b97701c006c663d'}]}, 'timestamp': '2025-11-29 07:24:49.577097', '_unique_id': '2561a24acdad4704ace7260aad44cd9c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.578 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.579 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.579 12 DEBUG ceilometer.compute.pollsters [-] 2702fe48-44d0-408d-8d10-fd635e3779c9/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.580 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '03c03bb6-c6d0-4fe0-af90-c45297eb9fcb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '000fb7b950024e16902cd58f2ea16ac9', 'user_name': None, 'project_id': '6d55e57bfd184513a304a61cc1cb3730', 'project_name': None, 'resource_id': 'instance-0000007a-2702fe48-44d0-408d-8d10-fd635e3779c9-tap3484baf0-bf', 'timestamp': '2025-11-29T07:24:49.579662', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-864835491', 'name': 'tap3484baf0-bf', 'instance_id': '2702fe48-44d0-408d-8d10-fd635e3779c9', 'instance_type': 'm1.micro', 'host': 'b73f865756412e547668d2c5fefa1e20eb689217c3e5e5cfc8cbb829', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:34:30:8b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3484baf0-bf'}, 'message_id': '7d5cb904-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.203842472, 'message_signature': '84077a09714392596cbf60b81c303fa7b62863b3d8dfe187875fc989047c4303'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'instance-0000007b-5f11adcd-958a-4269-905d-a017406505f0-tapa1e67d00-86', 'timestamp': '2025-11-29T07:24:49.579662', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'tapa1e67d00-86', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:39:9f:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa1e67d00-86'}, 'message_id': '7d5cc66a-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.208955766, 'message_signature': '49c72fe1b4e5aceb7eb460c7c6628049014c8cf259a773bfe91a64698ca082e5'}]}, 'timestamp': '2025-11-29 07:24:49.580551', '_unique_id': 'b64984cdc8254807a24b2a758c195be4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.581 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.582 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.582 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.582 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-864835491>, <NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1816602290>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-864835491>, <NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1816602290>]
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.583 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.583 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.583 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-864835491>, <NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1816602290>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-864835491>, <NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1816602290>]
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.583 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.583 12 DEBUG ceilometer.compute.pollsters [-] 2702fe48-44d0-408d-8d10-fd635e3779c9/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.584 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '961e3439-c310-4031-8882-946c9c0917b8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '000fb7b950024e16902cd58f2ea16ac9', 'user_name': None, 'project_id': '6d55e57bfd184513a304a61cc1cb3730', 'project_name': None, 'resource_id': 'instance-0000007a-2702fe48-44d0-408d-8d10-fd635e3779c9-tap3484baf0-bf', 'timestamp': '2025-11-29T07:24:49.583938', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-864835491', 'name': 'tap3484baf0-bf', 'instance_id': '2702fe48-44d0-408d-8d10-fd635e3779c9', 'instance_type': 'm1.micro', 'host': 'b73f865756412e547668d2c5fefa1e20eb689217c3e5e5cfc8cbb829', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:34:30:8b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3484baf0-bf'}, 'message_id': '7d5d58d2-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.203842472, 'message_signature': '56f51926daf3f453a9bdcb82d5bc3977a5b697292d91b8f0f7c7a85f8db8ffc3'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 
'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'instance-0000007b-5f11adcd-958a-4269-905d-a017406505f0-tapa1e67d00-86', 'timestamp': '2025-11-29T07:24:49.583938', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'tapa1e67d00-86', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:39:9f:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa1e67d00-86'}, 'message_id': '7d5d65fc-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.208955766, 'message_signature': '3223d8f52252105d0524b225e5d143955c9d307a5af8e16b2cf5b15d44ce3924'}]}, 'timestamp': '2025-11-29 07:24:49.584635', '_unique_id': 'd1b455e6afb64ecba865b9b02c0efdfe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.585 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.586 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.586 12 DEBUG ceilometer.compute.pollsters [-] 2702fe48-44d0-408d-8d10-fd635e3779c9/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.586 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '45cc6805-fc08-4731-bb08-ad268aa943d2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '000fb7b950024e16902cd58f2ea16ac9', 'user_name': None, 'project_id': '6d55e57bfd184513a304a61cc1cb3730', 'project_name': None, 'resource_id': 'instance-0000007a-2702fe48-44d0-408d-8d10-fd635e3779c9-tap3484baf0-bf', 'timestamp': '2025-11-29T07:24:49.586557', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-864835491', 'name': 'tap3484baf0-bf', 'instance_id': '2702fe48-44d0-408d-8d10-fd635e3779c9', 'instance_type': 'm1.micro', 'host': 'b73f865756412e547668d2c5fefa1e20eb689217c3e5e5cfc8cbb829', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:34:30:8b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3484baf0-bf'}, 'message_id': '7d5dbf16-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.203842472, 'message_signature': '1fc8f2a9e443c4509bc9cdc32b3050a325fc5902c4fbea7b0b3dab3df2f0e4f4'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'instance-0000007b-5f11adcd-958a-4269-905d-a017406505f0-tapa1e67d00-86', 'timestamp': '2025-11-29T07:24:49.586557', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'tapa1e67d00-86', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:39:9f:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa1e67d00-86'}, 'message_id': '7d5dcd58-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.208955766, 'message_signature': '361d944262d741ede0f78f36316228a68a273d552d268f4a9f3ef88cfad1b705'}]}, 'timestamp': '2025-11-29 07:24:49.587305', '_unique_id': 'db83f4b06b234289a9d872fd11a85da5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.587 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.589 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.589 12 DEBUG ceilometer.compute.pollsters [-] 2702fe48-44d0-408d-8d10-fd635e3779c9/cpu volume: 8630000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.589 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/cpu volume: 7540000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e1701bc0-8c34-412c-8455-21ba584f7625', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 8630000000, 'user_id': '000fb7b950024e16902cd58f2ea16ac9', 'user_name': None, 'project_id': '6d55e57bfd184513a304a61cc1cb3730', 'project_name': None, 'resource_id': '2702fe48-44d0-408d-8d10-fd635e3779c9', 'timestamp': '2025-11-29T07:24:49.589240', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-864835491', 'name': 'instance-0000007a', 'instance_id': '2702fe48-44d0-408d-8d10-fd635e3779c9', 'instance_type': 'm1.micro', 'host': 'b73f865756412e547668d2c5fefa1e20eb689217c3e5e5cfc8cbb829', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '7d5e27c6-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.276657396, 'message_signature': '471c4efb9399fd84f8a93358d6d98e45dcb431063cc0077e9ea75e6d9c49f383'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 7540000000, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 
'5f11adcd-958a-4269-905d-a017406505f0', 'timestamp': '2025-11-29T07:24:49.589240', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'instance-0000007b', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '7d5e3414-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.293396978, 'message_signature': '4e5a634197a916b1ed97b398e08769d44052dd943ad08c4a7da9b00c5403d208'}]}, 'timestamp': '2025-11-29 07:24:49.589922', '_unique_id': '64929d492b9e4deb914b080f568ced87'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.590 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.591 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.591 12 DEBUG ceilometer.compute.pollsters [-] 2702fe48-44d0-408d-8d10-fd635e3779c9/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.592 12 DEBUG ceilometer.compute.pollsters [-] 2702fe48-44d0-408d-8d10-fd635e3779c9/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.592 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.592 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f966c037-9c8b-4416-9fa2-69f286d18aa0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '000fb7b950024e16902cd58f2ea16ac9', 'user_name': None, 'project_id': '6d55e57bfd184513a304a61cc1cb3730', 'project_name': None, 'resource_id': '2702fe48-44d0-408d-8d10-fd635e3779c9-vda', 'timestamp': '2025-11-29T07:24:49.591805', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-864835491', 'name': 'instance-0000007a', 'instance_id': '2702fe48-44d0-408d-8d10-fd635e3779c9', 'instance_type': 'm1.micro', 'host': 'b73f865756412e547668d2c5fefa1e20eb689217c3e5e5cfc8cbb829', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7d5e8d1a-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.132567412, 'message_signature': 'cd021495269295f99979490717baa39860cbe8bf40f6aebd701610afc7077c72'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '000fb7b950024e16902cd58f2ea16ac9', 'user_name': None, 'project_id': '6d55e57bfd184513a304a61cc1cb3730', 'project_name': 
None, 'resource_id': '2702fe48-44d0-408d-8d10-fd635e3779c9-sda', 'timestamp': '2025-11-29T07:24:49.591805', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-864835491', 'name': 'instance-0000007a', 'instance_id': '2702fe48-44d0-408d-8d10-fd635e3779c9', 'instance_type': 'm1.micro', 'host': 'b73f865756412e547668d2c5fefa1e20eb689217c3e5e5cfc8cbb829', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7d5e9968-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.132567412, 'message_signature': '990552f01042c5222e616f2a3f5e04319308e46c8aaf3030c30239e588c04ba9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '5f11adcd-958a-4269-905d-a017406505f0-vda', 'timestamp': '2025-11-29T07:24:49.591805', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'instance-0000007b', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7d5ea58e-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.171234672, 'message_signature': 'fc0122e8adcb5d3f4cbfac5391049dc0be28eccefe0d45f826d7b1d69b29891a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '5f11adcd-958a-4269-905d-a017406505f0-sda', 'timestamp': '2025-11-29T07:24:49.591805', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'instance-0000007b', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7d5eb272-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.171234672, 'message_signature': 'd685f3ad0800bc20cd3c9082960a39672b978ef986e207ffc439a58120d7cc9f'}]}, 'timestamp': '2025-11-29 07:24:49.593131', '_unique_id': '251eccd7b52342e894b89839de66a5d9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.593 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.594 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.595 12 DEBUG ceilometer.compute.pollsters [-] 2702fe48-44d0-408d-8d10-fd635e3779c9/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.595 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd89c6383-6954-453d-a5fb-1440bdecae48', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '000fb7b950024e16902cd58f2ea16ac9', 'user_name': None, 'project_id': '6d55e57bfd184513a304a61cc1cb3730', 'project_name': None, 'resource_id': 'instance-0000007a-2702fe48-44d0-408d-8d10-fd635e3779c9-tap3484baf0-bf', 'timestamp': '2025-11-29T07:24:49.595196', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-864835491', 'name': 'tap3484baf0-bf', 'instance_id': '2702fe48-44d0-408d-8d10-fd635e3779c9', 'instance_type': 'm1.micro', 'host': 'b73f865756412e547668d2c5fefa1e20eb689217c3e5e5cfc8cbb829', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:34:30:8b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3484baf0-bf'}, 'message_id': '7d5f1118-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.203842472, 'message_signature': 'e9afc7d338cfebf9891507b30ef3ba49a5739a45a73245f94e99f8ec39128abf'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'instance-0000007b-5f11adcd-958a-4269-905d-a017406505f0-tapa1e67d00-86', 'timestamp': '2025-11-29T07:24:49.595196', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'tapa1e67d00-86', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:39:9f:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa1e67d00-86'}, 'message_id': '7d5f1e7e-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.208955766, 'message_signature': '9ad6a060b2cee92ca8e663dd6b1fcfc8548b0f941312b01be94a32d1f3f63415'}]}, 'timestamp': '2025-11-29 07:24:49.595933', '_unique_id': '3d1673ecddb64a50a916693f96df9cca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.596 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.597 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.597 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.598 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-864835491>, <NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1816602290>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-864835491>, <NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1816602290>]
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.598 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.598 12 DEBUG ceilometer.compute.pollsters [-] 2702fe48-44d0-408d-8d10-fd635e3779c9/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.598 12 DEBUG ceilometer.compute.pollsters [-] 2702fe48-44d0-408d-8d10-fd635e3779c9/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.599 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.599 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd7294aa9-1fa1-4a5a-b590-c91e81513331', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '000fb7b950024e16902cd58f2ea16ac9', 'user_name': None, 'project_id': '6d55e57bfd184513a304a61cc1cb3730', 'project_name': None, 'resource_id': '2702fe48-44d0-408d-8d10-fd635e3779c9-vda', 'timestamp': '2025-11-29T07:24:49.598483', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-864835491', 'name': 'instance-0000007a', 'instance_id': '2702fe48-44d0-408d-8d10-fd635e3779c9', 'instance_type': 'm1.micro', 'host': 'b73f865756412e547668d2c5fefa1e20eb689217c3e5e5cfc8cbb829', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7d5f90a2-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.132567412, 'message_signature': 'f620be044cb2c98f576221609ed6e4bbba242f563909bae5b346fb7ad5701c3f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '000fb7b950024e16902cd58f2ea16ac9', 'user_name': None, 'project_id': '6d55e57bfd184513a304a61cc1cb3730', 'project_name': None, 
'resource_id': '2702fe48-44d0-408d-8d10-fd635e3779c9-sda', 'timestamp': '2025-11-29T07:24:49.598483', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-864835491', 'name': 'instance-0000007a', 'instance_id': '2702fe48-44d0-408d-8d10-fd635e3779c9', 'instance_type': 'm1.micro', 'host': 'b73f865756412e547668d2c5fefa1e20eb689217c3e5e5cfc8cbb829', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7d5f9dfe-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.132567412, 'message_signature': '6b9583928e38fb422e53d270aee7b594ad1b94adaff4feca7a863a7df525870d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '5f11adcd-958a-4269-905d-a017406505f0-vda', 'timestamp': '2025-11-29T07:24:49.598483', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'instance-0000007b', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7d5fab00-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.171234672, 'message_signature': 'c47ce04757bbd650344bf95e265c4169de69cbf96fff68e3db918102f7256503'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '5f11adcd-958a-4269-905d-a017406505f0-sda', 'timestamp': '2025-11-29T07:24:49.598483', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'instance-0000007b', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7d5fb712-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.171234672, 'message_signature': '94720ef529163263a9a11c4c4b63b0845d340096b02d297fc9907c3b743d26a9'}]}, 'timestamp': '2025-11-29 07:24:49.599805', '_unique_id': '2e614c904a654fb6adcfba29de74d8d3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.600 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.601 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.601 12 DEBUG ceilometer.compute.pollsters [-] 2702fe48-44d0-408d-8d10-fd635e3779c9/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.602 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3ffb8c7e-a2e4-46f1-bace-0d618932c185', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '000fb7b950024e16902cd58f2ea16ac9', 'user_name': None, 'project_id': '6d55e57bfd184513a304a61cc1cb3730', 'project_name': None, 'resource_id': 'instance-0000007a-2702fe48-44d0-408d-8d10-fd635e3779c9-tap3484baf0-bf', 'timestamp': '2025-11-29T07:24:49.601895', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-864835491', 'name': 'tap3484baf0-bf', 'instance_id': '2702fe48-44d0-408d-8d10-fd635e3779c9', 'instance_type': 'm1.micro', 'host': 'b73f865756412e547668d2c5fefa1e20eb689217c3e5e5cfc8cbb829', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:34:30:8b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3484baf0-bf'}, 'message_id': '7d601680-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.203842472, 'message_signature': 'cbfd9a95046b1f708da182b39534f846af98103c1d960f487c69d1971ea48cd6'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 
'5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'instance-0000007b-5f11adcd-958a-4269-905d-a017406505f0-tapa1e67d00-86', 'timestamp': '2025-11-29T07:24:49.601895', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'tapa1e67d00-86', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:39:9f:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa1e67d00-86'}, 'message_id': '7d602364-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.208955766, 'message_signature': '5fd2c4c80927c90092e3200466e1f231d2682bba9db60715104f907f5a09b48e'}]}, 'timestamp': '2025-11-29 07:24:49.602590', '_unique_id': 'b3f55e90c23743a0a87338450a85dfe2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.603 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.604 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.604 12 DEBUG ceilometer.compute.pollsters [-] 2702fe48-44d0-408d-8d10-fd635e3779c9/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.604 12 DEBUG ceilometer.compute.pollsters [-] 2702fe48-44d0-408d-8d10-fd635e3779c9/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.605 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.605 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ef6b8937-0890-4def-99f1-34e120c556bd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '000fb7b950024e16902cd58f2ea16ac9', 'user_name': None, 'project_id': '6d55e57bfd184513a304a61cc1cb3730', 'project_name': None, 'resource_id': '2702fe48-44d0-408d-8d10-fd635e3779c9-vda', 'timestamp': '2025-11-29T07:24:49.604538', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-864835491', 'name': 'instance-0000007a', 'instance_id': '2702fe48-44d0-408d-8d10-fd635e3779c9', 'instance_type': 'm1.micro', 'host': 'b73f865756412e547668d2c5fefa1e20eb689217c3e5e5cfc8cbb829', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7d607cf6-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.132567412, 'message_signature': 'fad3dfa965f743e56955dc5774cb1375770a373ac70771585075ba215f1a8672'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '000fb7b950024e16902cd58f2ea16ac9', 'user_name': None, 'project_id': '6d55e57bfd184513a304a61cc1cb3730', 'project_name': None, 
'resource_id': '2702fe48-44d0-408d-8d10-fd635e3779c9-sda', 'timestamp': '2025-11-29T07:24:49.604538', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-864835491', 'name': 'instance-0000007a', 'instance_id': '2702fe48-44d0-408d-8d10-fd635e3779c9', 'instance_type': 'm1.micro', 'host': 'b73f865756412e547668d2c5fefa1e20eb689217c3e5e5cfc8cbb829', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7d608a20-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.132567412, 'message_signature': '822db57573b8da26a9c17d5c80fa04ace24ad5a011d81e7122f5aacc5ce0ef88'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '5f11adcd-958a-4269-905d-a017406505f0-vda', 'timestamp': '2025-11-29T07:24:49.604538', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'instance-0000007b', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7d609678-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.171234672, 'message_signature': 'b03f19793492ad277647054f03c1b53c001d870c4c11017e4278c35d0f6bb20a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '5f11adcd-958a-4269-905d-a017406505f0-sda', 'timestamp': '2025-11-29T07:24:49.604538', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'instance-0000007b', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7d60a276-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.171234672, 'message_signature': 'ea7cde196c149d32e46bcc35c785258aa1f74be7d84835d79490b351ab854118'}]}, 'timestamp': '2025-11-29 07:24:49.605829', '_unique_id': 'f67c4f37f9c746a6afeb0915fc9554d2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.606 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.607 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.607 12 DEBUG ceilometer.compute.pollsters [-] 2702fe48-44d0-408d-8d10-fd635e3779c9/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.608 12 DEBUG ceilometer.compute.pollsters [-] 2702fe48-44d0-408d-8d10-fd635e3779c9/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.608 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.608 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '392bd4dc-1d99-4416-8b4e-98a708783492', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '000fb7b950024e16902cd58f2ea16ac9', 'user_name': None, 'project_id': '6d55e57bfd184513a304a61cc1cb3730', 'project_name': None, 'resource_id': '2702fe48-44d0-408d-8d10-fd635e3779c9-vda', 'timestamp': '2025-11-29T07:24:49.607796', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-864835491', 'name': 'instance-0000007a', 'instance_id': '2702fe48-44d0-408d-8d10-fd635e3779c9', 'instance_type': 'm1.micro', 'host': 'b73f865756412e547668d2c5fefa1e20eb689217c3e5e5cfc8cbb829', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7d60fdc0-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.132567412, 'message_signature': '38c0cb4ed81041dc0986bb6358b1848fc74b73b45b80b562d09ee84e665750db'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '000fb7b950024e16902cd58f2ea16ac9', 'user_name': None, 'project_id': '6d55e57bfd184513a304a61cc1cb3730', 'project_name': None, 
'resource_id': '2702fe48-44d0-408d-8d10-fd635e3779c9-sda', 'timestamp': '2025-11-29T07:24:49.607796', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-864835491', 'name': 'instance-0000007a', 'instance_id': '2702fe48-44d0-408d-8d10-fd635e3779c9', 'instance_type': 'm1.micro', 'host': 'b73f865756412e547668d2c5fefa1e20eb689217c3e5e5cfc8cbb829', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7d610a2c-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.132567412, 'message_signature': '4282e932384f40b2a75aeac7d08cb67db25f20f228d20567a178d854414face1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '5f11adcd-958a-4269-905d-a017406505f0-vda', 'timestamp': '2025-11-29T07:24:49.607796', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'instance-0000007b', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7d611616-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.171234672, 'message_signature': '5187103f62cee5692f1f6bba96c9068595539689eca3f91ab7d29cdfd22ff6b7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '5f11adcd-958a-4269-905d-a017406505f0-sda', 'timestamp': '2025-11-29T07:24:49.607796', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'instance-0000007b', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7d6122e6-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.171234672, 'message_signature': '3d347e653c42a1adb6a199b6efa54994e23fb696d8b3584244bd1dc955599c15'}]}, 'timestamp': '2025-11-29 07:24:49.609118', '_unique_id': '1fac311423cd4c179a32b8f23f78a3bc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.609 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.610 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.611 12 DEBUG ceilometer.compute.pollsters [-] 2702fe48-44d0-408d-8d10-fd635e3779c9/disk.device.read.latency volume: 354184038 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.611 12 DEBUG ceilometer.compute.pollsters [-] 2702fe48-44d0-408d-8d10-fd635e3779c9/disk.device.read.latency volume: 405651 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.611 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/disk.device.read.latency volume: 610967900 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.612 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/disk.device.read.latency volume: 663029 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '592cebd3-4566-43c3-bd16-eef8cf3a8545', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 354184038, 'user_id': '000fb7b950024e16902cd58f2ea16ac9', 'user_name': None, 'project_id': '6d55e57bfd184513a304a61cc1cb3730', 'project_name': None, 'resource_id': '2702fe48-44d0-408d-8d10-fd635e3779c9-vda', 'timestamp': '2025-11-29T07:24:49.611079', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-864835491', 'name': 'instance-0000007a', 'instance_id': '2702fe48-44d0-408d-8d10-fd635e3779c9', 'instance_type': 'm1.micro', 'host': 'b73f865756412e547668d2c5fefa1e20eb689217c3e5e5cfc8cbb829', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7d617d18-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.132567412, 'message_signature': 'fbb766a6b9db288f16948ce428a746f6351e572b531522fa64e3d55e43e3ea2d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 405651, 'user_id': '000fb7b950024e16902cd58f2ea16ac9', 'user_name': None, 'project_id': '6d55e57bfd184513a304a61cc1cb3730', 'project_name': 
None, 'resource_id': '2702fe48-44d0-408d-8d10-fd635e3779c9-sda', 'timestamp': '2025-11-29T07:24:49.611079', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-864835491', 'name': 'instance-0000007a', 'instance_id': '2702fe48-44d0-408d-8d10-fd635e3779c9', 'instance_type': 'm1.micro', 'host': 'b73f865756412e547668d2c5fefa1e20eb689217c3e5e5cfc8cbb829', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7d618984-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.132567412, 'message_signature': 'e5a9f203e59181353800575c63569ceb2eb774dae86be0b5b396ce1b637e71d6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 610967900, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '5f11adcd-958a-4269-905d-a017406505f0-vda', 'timestamp': '2025-11-29T07:24:49.611079', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'instance-0000007b', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7d61969a-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.171234672, 'message_signature': '8d232c19171a810a82609921edea9ef92f5c024ff80a0543d92a48c799b760cf'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 663029, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '5f11adcd-958a-4269-905d-a017406505f0-sda', 'timestamp': '2025-11-29T07:24:49.611079', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'instance-0000007b', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7d61a2ca-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.171234672, 'message_signature': '3200becdcb69aab1fc728ece41e905273c493f2c7f481dd892dfa1c93e83750d'}]}, 'timestamp': '2025-11-29 07:24:49.612388', '_unique_id': '565aa831d6474a68a72e8ca6add3b21d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.613 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.614 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.614 12 DEBUG ceilometer.compute.pollsters [-] 2702fe48-44d0-408d-8d10-fd635e3779c9/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.614 12 DEBUG ceilometer.compute.pollsters [-] 2702fe48-44d0-408d-8d10-fd635e3779c9/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.614 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.615 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd26316e9-c8f4-4cd1-b05a-d226e49a32bd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '000fb7b950024e16902cd58f2ea16ac9', 'user_name': None, 'project_id': '6d55e57bfd184513a304a61cc1cb3730', 'project_name': None, 'resource_id': '2702fe48-44d0-408d-8d10-fd635e3779c9-vda', 'timestamp': '2025-11-29T07:24:49.614324', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-864835491', 'name': 'instance-0000007a', 'instance_id': '2702fe48-44d0-408d-8d10-fd635e3779c9', 'instance_type': 'm1.micro', 'host': 'b73f865756412e547668d2c5fefa1e20eb689217c3e5e5cfc8cbb829', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7d61fb80-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.214081301, 'message_signature': 'dc9bb9aa2961e468780e7c7533d948d0509197cf0b3b78086d2248169bcea2f6'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '000fb7b950024e16902cd58f2ea16ac9', 'user_name': None, 'project_id': '6d55e57bfd184513a304a61cc1cb3730', 'project_name': None, 'resource_id': 
'2702fe48-44d0-408d-8d10-fd635e3779c9-sda', 'timestamp': '2025-11-29T07:24:49.614324', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-864835491', 'name': 'instance-0000007a', 'instance_id': '2702fe48-44d0-408d-8d10-fd635e3779c9', 'instance_type': 'm1.micro', 'host': 'b73f865756412e547668d2c5fefa1e20eb689217c3e5e5cfc8cbb829', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e29df891-dca5-4a1c-9258-dc512a46956f', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7d620864-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.214081301, 'message_signature': '981377a3f335bc1474a9f8c15c024b0d405b15bc437de8a494de375f78ba360d'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '5f11adcd-958a-4269-905d-a017406505f0-vda', 'timestamp': '2025-11-29T07:24:49.614324', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'instance-0000007b', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7d6214ee-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.230263157, 'message_signature': '28bd68618225aa0983c90014c328524a86d9c002704a53fe49762ecfe64b75dd'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '5f11adcd-958a-4269-905d-a017406505f0-sda', 'timestamp': '2025-11-29T07:24:49.614324', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'instance-0000007b', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7d6221e6-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6537.230263157, 'message_signature': 'ad4b0c58b65df1c7e6095d2234c2bcd9fcd0a5276d6f9eb64650ec1b764fa10c'}]}, 'timestamp': '2025-11-29 07:24:49.615643', '_unique_id': 'b2cef485f6b44d04af18da8e39f9fed8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.616 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.617 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.617 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:24:49 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:24:49.617 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-864835491>, <NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1816602290>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-864835491>, <NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1816602290>]
Nov 29 07:24:49 compute-0 nova_compute[187185]: 2025-11-29 07:24:49.717 187189 DEBUG oslo_concurrency.lockutils [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "refresh_cache-7c10cb24-586c-4507-8169-8258d7136397" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:24:49 compute-0 nova_compute[187185]: 2025-11-29 07:24:49.718 187189 DEBUG oslo_concurrency.lockutils [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquired lock "refresh_cache-7c10cb24-586c-4507-8169-8258d7136397" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:24:49 compute-0 nova_compute[187185]: 2025-11-29 07:24:49.718 187189 DEBUG nova.network.neutron [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:24:50 compute-0 nova_compute[187185]: 2025-11-29 07:24:50.173 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:50 compute-0 nova_compute[187185]: 2025-11-29 07:24:50.302 187189 DEBUG nova.compute.manager [req-992148c6-1bc1-4c8b-80da-a1d834a93281 req-1a708a7a-ec8c-40cf-a9c5-ee18c1d1c1d8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Received event network-changed-e89dd8de-f981-46cf-aa04-cfad6a9b2326 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:24:50 compute-0 nova_compute[187185]: 2025-11-29 07:24:50.303 187189 DEBUG nova.compute.manager [req-992148c6-1bc1-4c8b-80da-a1d834a93281 req-1a708a7a-ec8c-40cf-a9c5-ee18c1d1c1d8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Refreshing instance network info cache due to event network-changed-e89dd8de-f981-46cf-aa04-cfad6a9b2326. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:24:50 compute-0 nova_compute[187185]: 2025-11-29 07:24:50.303 187189 DEBUG oslo_concurrency.lockutils [req-992148c6-1bc1-4c8b-80da-a1d834a93281 req-1a708a7a-ec8c-40cf-a9c5-ee18c1d1c1d8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-7c10cb24-586c-4507-8169-8258d7136397" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:24:51 compute-0 nova_compute[187185]: 2025-11-29 07:24:51.104 187189 INFO nova.virt.libvirt.driver [None req-96604e5f-4fc4-4458-9022-5f71dc221c3e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Snapshot image upload complete
Nov 29 07:24:51 compute-0 nova_compute[187185]: 2025-11-29 07:24:51.106 187189 INFO nova.compute.manager [None req-96604e5f-4fc4-4458-9022-5f71dc221c3e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Took 8.06 seconds to snapshot the instance on the hypervisor.
Nov 29 07:24:51 compute-0 nova_compute[187185]: 2025-11-29 07:24:51.188 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:52 compute-0 nova_compute[187185]: 2025-11-29 07:24:52.622 187189 DEBUG nova.compute.manager [req-8e6bae3f-4775-4942-8d12-28b3f457e5ca req-62c85421-f419-43d8-8e92-9317d74db22c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Received event network-vif-plugged-e89dd8de-f981-46cf-aa04-cfad6a9b2326 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:24:52 compute-0 nova_compute[187185]: 2025-11-29 07:24:52.623 187189 DEBUG oslo_concurrency.lockutils [req-8e6bae3f-4775-4942-8d12-28b3f457e5ca req-62c85421-f419-43d8-8e92-9317d74db22c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7c10cb24-586c-4507-8169-8258d7136397-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:24:52 compute-0 nova_compute[187185]: 2025-11-29 07:24:52.623 187189 DEBUG oslo_concurrency.lockutils [req-8e6bae3f-4775-4942-8d12-28b3f457e5ca req-62c85421-f419-43d8-8e92-9317d74db22c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7c10cb24-586c-4507-8169-8258d7136397-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:24:52 compute-0 nova_compute[187185]: 2025-11-29 07:24:52.624 187189 DEBUG oslo_concurrency.lockutils [req-8e6bae3f-4775-4942-8d12-28b3f457e5ca req-62c85421-f419-43d8-8e92-9317d74db22c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7c10cb24-586c-4507-8169-8258d7136397-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:24:52 compute-0 nova_compute[187185]: 2025-11-29 07:24:52.624 187189 DEBUG nova.compute.manager [req-8e6bae3f-4775-4942-8d12-28b3f457e5ca req-62c85421-f419-43d8-8e92-9317d74db22c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] No waiting events found dispatching network-vif-plugged-e89dd8de-f981-46cf-aa04-cfad6a9b2326 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:24:52 compute-0 nova_compute[187185]: 2025-11-29 07:24:52.625 187189 WARNING nova.compute.manager [req-8e6bae3f-4775-4942-8d12-28b3f457e5ca req-62c85421-f419-43d8-8e92-9317d74db22c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Received unexpected event network-vif-plugged-e89dd8de-f981-46cf-aa04-cfad6a9b2326 for instance with vm_state active and task_state resize_migrated.
Nov 29 07:24:53 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Nov 29 07:24:53 compute-0 systemd[233715]: Activating special unit Exit the Session...
Nov 29 07:24:53 compute-0 systemd[233715]: Stopped target Main User Target.
Nov 29 07:24:53 compute-0 systemd[233715]: Stopped target Basic System.
Nov 29 07:24:53 compute-0 systemd[233715]: Stopped target Paths.
Nov 29 07:24:53 compute-0 systemd[233715]: Stopped target Sockets.
Nov 29 07:24:53 compute-0 systemd[233715]: Stopped target Timers.
Nov 29 07:24:53 compute-0 systemd[233715]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 29 07:24:53 compute-0 systemd[233715]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 29 07:24:53 compute-0 systemd[233715]: Closed D-Bus User Message Bus Socket.
Nov 29 07:24:53 compute-0 systemd[233715]: Stopped Create User's Volatile Files and Directories.
Nov 29 07:24:53 compute-0 systemd[233715]: Removed slice User Application Slice.
Nov 29 07:24:53 compute-0 systemd[233715]: Reached target Shutdown.
Nov 29 07:24:53 compute-0 systemd[233715]: Finished Exit the Session.
Nov 29 07:24:53 compute-0 systemd[233715]: Reached target Exit the Session.
Nov 29 07:24:53 compute-0 ovn_controller[95281]: 2025-11-29T07:24:53Z|00044|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:34:30:8b 10.100.0.12
Nov 29 07:24:53 compute-0 ovn_controller[95281]: 2025-11-29T07:24:53Z|00045|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:34:30:8b 10.100.0.12
Nov 29 07:24:53 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Nov 29 07:24:53 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Nov 29 07:24:53 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 29 07:24:53 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 29 07:24:53 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 29 07:24:53 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 29 07:24:53 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Nov 29 07:24:53 compute-0 systemd[1]: user-42436.slice: Consumed 1.150s CPU time.
Nov 29 07:24:54 compute-0 nova_compute[187185]: 2025-11-29 07:24:54.180 187189 DEBUG nova.network.neutron [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Updating instance_info_cache with network_info: [{"id": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "address": "fa:16:3e:b1:6e:42", "network": {"id": "be5e5e17-de26-4f07-84cb-bd99be23cd24", "bridge": "br-int", "label": "tempest-network-smoke--530096136", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape89dd8de-f9", "ovs_interfaceid": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:24:54 compute-0 ovn_controller[95281]: 2025-11-29T07:24:54Z|00046|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:39:9f:d3 10.100.0.11
Nov 29 07:24:54 compute-0 ovn_controller[95281]: 2025-11-29T07:24:54Z|00047|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:39:9f:d3 10.100.0.11
Nov 29 07:24:54 compute-0 nova_compute[187185]: 2025-11-29 07:24:54.556 187189 DEBUG oslo_concurrency.lockutils [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Releasing lock "refresh_cache-7c10cb24-586c-4507-8169-8258d7136397" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:24:54 compute-0 nova_compute[187185]: 2025-11-29 07:24:54.563 187189 DEBUG oslo_concurrency.lockutils [req-992148c6-1bc1-4c8b-80da-a1d834a93281 req-1a708a7a-ec8c-40cf-a9c5-ee18c1d1c1d8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-7c10cb24-586c-4507-8169-8258d7136397" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:24:54 compute-0 nova_compute[187185]: 2025-11-29 07:24:54.563 187189 DEBUG nova.network.neutron [req-992148c6-1bc1-4c8b-80da-a1d834a93281 req-1a708a7a-ec8c-40cf-a9c5-ee18c1d1c1d8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Refreshing network info cache for port e89dd8de-f981-46cf-aa04-cfad6a9b2326 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:24:55 compute-0 nova_compute[187185]: 2025-11-29 07:24:55.177 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:55 compute-0 nova_compute[187185]: 2025-11-29 07:24:55.276 187189 DEBUG nova.virt.libvirt.driver [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Nov 29 07:24:55 compute-0 nova_compute[187185]: 2025-11-29 07:24:55.278 187189 DEBUG nova.virt.libvirt.driver [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Nov 29 07:24:55 compute-0 nova_compute[187185]: 2025-11-29 07:24:55.279 187189 INFO nova.virt.libvirt.driver [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Creating image(s)
Nov 29 07:24:55 compute-0 nova_compute[187185]: 2025-11-29 07:24:55.280 187189 DEBUG nova.objects.instance [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 7c10cb24-586c-4507-8169-8258d7136397 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:24:55 compute-0 nova_compute[187185]: 2025-11-29 07:24:55.398 187189 DEBUG oslo_concurrency.processutils [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:24:55 compute-0 nova_compute[187185]: 2025-11-29 07:24:55.432 187189 INFO nova.compute.manager [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Rescuing
Nov 29 07:24:55 compute-0 nova_compute[187185]: 2025-11-29 07:24:55.432 187189 DEBUG oslo_concurrency.lockutils [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "refresh_cache-5f11adcd-958a-4269-905d-a017406505f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:24:55 compute-0 nova_compute[187185]: 2025-11-29 07:24:55.433 187189 DEBUG oslo_concurrency.lockutils [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquired lock "refresh_cache-5f11adcd-958a-4269-905d-a017406505f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:24:55 compute-0 nova_compute[187185]: 2025-11-29 07:24:55.433 187189 DEBUG nova.network.neutron [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:24:55 compute-0 nova_compute[187185]: 2025-11-29 07:24:55.491 187189 DEBUG oslo_concurrency.processutils [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:24:55 compute-0 nova_compute[187185]: 2025-11-29 07:24:55.492 187189 DEBUG nova.virt.disk.api [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Checking if we can resize image /var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:24:55 compute-0 nova_compute[187185]: 2025-11-29 07:24:55.493 187189 DEBUG oslo_concurrency.processutils [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:24:55 compute-0 nova_compute[187185]: 2025-11-29 07:24:55.563 187189 DEBUG oslo_concurrency.processutils [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:24:55 compute-0 nova_compute[187185]: 2025-11-29 07:24:55.563 187189 DEBUG nova.virt.disk.api [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Cannot resize image /var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:24:55 compute-0 nova_compute[187185]: 2025-11-29 07:24:55.761 187189 DEBUG nova.virt.libvirt.driver [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Nov 29 07:24:55 compute-0 nova_compute[187185]: 2025-11-29 07:24:55.762 187189 DEBUG nova.virt.libvirt.driver [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Ensure instance console log exists: /var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 07:24:55 compute-0 nova_compute[187185]: 2025-11-29 07:24:55.763 187189 DEBUG oslo_concurrency.lockutils [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:24:55 compute-0 nova_compute[187185]: 2025-11-29 07:24:55.764 187189 DEBUG oslo_concurrency.lockutils [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:24:55 compute-0 nova_compute[187185]: 2025-11-29 07:24:55.765 187189 DEBUG oslo_concurrency.lockutils [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:24:55 compute-0 nova_compute[187185]: 2025-11-29 07:24:55.771 187189 DEBUG nova.virt.libvirt.driver [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Start _get_guest_xml network_info=[{"id": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "address": "fa:16:3e:b1:6e:42", "network": {"id": "be5e5e17-de26-4f07-84cb-bd99be23cd24", "bridge": "br-int", "label": "tempest-network-smoke--530096136", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--530096136", "vif_mac": "fa:16:3e:b1:6e:42"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape89dd8de-f9", "ovs_interfaceid": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:24:55 compute-0 nova_compute[187185]: 2025-11-29 07:24:55.780 187189 WARNING nova.virt.libvirt.driver [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:24:55 compute-0 nova_compute[187185]: 2025-11-29 07:24:55.787 187189 DEBUG nova.virt.libvirt.host [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:24:55 compute-0 nova_compute[187185]: 2025-11-29 07:24:55.788 187189 DEBUG nova.virt.libvirt.host [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:24:55 compute-0 nova_compute[187185]: 2025-11-29 07:24:55.792 187189 DEBUG nova.virt.libvirt.host [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:24:55 compute-0 nova_compute[187185]: 2025-11-29 07:24:55.793 187189 DEBUG nova.virt.libvirt.host [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:24:55 compute-0 nova_compute[187185]: 2025-11-29 07:24:55.794 187189 DEBUG nova.virt.libvirt.driver [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:24:55 compute-0 nova_compute[187185]: 2025-11-29 07:24:55.795 187189 DEBUG nova.virt.hardware [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e29df891-dca5-4a1c-9258-dc512a46956f',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:24:55 compute-0 nova_compute[187185]: 2025-11-29 07:24:55.795 187189 DEBUG nova.virt.hardware [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:24:55 compute-0 nova_compute[187185]: 2025-11-29 07:24:55.796 187189 DEBUG nova.virt.hardware [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:24:55 compute-0 nova_compute[187185]: 2025-11-29 07:24:55.796 187189 DEBUG nova.virt.hardware [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:24:55 compute-0 nova_compute[187185]: 2025-11-29 07:24:55.796 187189 DEBUG nova.virt.hardware [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:24:55 compute-0 nova_compute[187185]: 2025-11-29 07:24:55.796 187189 DEBUG nova.virt.hardware [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:24:55 compute-0 nova_compute[187185]: 2025-11-29 07:24:55.797 187189 DEBUG nova.virt.hardware [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:24:55 compute-0 nova_compute[187185]: 2025-11-29 07:24:55.797 187189 DEBUG nova.virt.hardware [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:24:55 compute-0 nova_compute[187185]: 2025-11-29 07:24:55.797 187189 DEBUG nova.virt.hardware [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:24:55 compute-0 nova_compute[187185]: 2025-11-29 07:24:55.798 187189 DEBUG nova.virt.hardware [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:24:55 compute-0 nova_compute[187185]: 2025-11-29 07:24:55.798 187189 DEBUG nova.virt.hardware [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:24:55 compute-0 nova_compute[187185]: 2025-11-29 07:24:55.798 187189 DEBUG nova.objects.instance [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 7c10cb24-586c-4507-8169-8258d7136397 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:24:56 compute-0 nova_compute[187185]: 2025-11-29 07:24:56.191 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:56 compute-0 nova_compute[187185]: 2025-11-29 07:24:56.548 187189 DEBUG oslo_concurrency.processutils [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:24:56 compute-0 nova_compute[187185]: 2025-11-29 07:24:56.682 187189 DEBUG oslo_concurrency.processutils [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/disk.config --force-share --output=json" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:24:56 compute-0 nova_compute[187185]: 2025-11-29 07:24:56.684 187189 DEBUG oslo_concurrency.lockutils [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "/var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:24:56 compute-0 nova_compute[187185]: 2025-11-29 07:24:56.685 187189 DEBUG oslo_concurrency.lockutils [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "/var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:24:56 compute-0 nova_compute[187185]: 2025-11-29 07:24:56.686 187189 DEBUG oslo_concurrency.lockutils [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "/var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:24:56 compute-0 nova_compute[187185]: 2025-11-29 07:24:56.688 187189 DEBUG nova.virt.libvirt.vif [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:23:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1276368768',display_name='tempest-TestNetworkAdvancedServerOps-server-1276368768',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1276368768',id=121,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBUznZR2iOKaJbWAB1nxy/Np7mGSzlwsDQ7Ycl3wci2nJ60qWbosUg5gundiked4HoZaTmuE/0+OTOCJFQ4CjxMZqyT1FcUBwmvtOPuSl/eONA9sj7Vj+75xN046AU/KWg==',key_name='tempest-TestNetworkAdvancedServerOps-143614444',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:24:11Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-it4r0l7q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:24:43Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=7c10cb24-586c-4507-8169-8258d7136397,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "address": "fa:16:3e:b1:6e:42", "network": {"id": "be5e5e17-de26-4f07-84cb-bd99be23cd24", "bridge": "br-int", "label": "tempest-network-smoke--530096136", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--530096136", "vif_mac": "fa:16:3e:b1:6e:42"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape89dd8de-f9", "ovs_interfaceid": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:24:56 compute-0 nova_compute[187185]: 2025-11-29 07:24:56.689 187189 DEBUG nova.network.os_vif_util [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converting VIF {"id": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "address": "fa:16:3e:b1:6e:42", "network": {"id": "be5e5e17-de26-4f07-84cb-bd99be23cd24", "bridge": "br-int", "label": "tempest-network-smoke--530096136", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--530096136", "vif_mac": "fa:16:3e:b1:6e:42"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape89dd8de-f9", "ovs_interfaceid": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:24:56 compute-0 nova_compute[187185]: 2025-11-29 07:24:56.690 187189 DEBUG nova.network.os_vif_util [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b1:6e:42,bridge_name='br-int',has_traffic_filtering=True,id=e89dd8de-f981-46cf-aa04-cfad6a9b2326,network=Network(be5e5e17-de26-4f07-84cb-bd99be23cd24),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape89dd8de-f9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:24:56 compute-0 nova_compute[187185]: 2025-11-29 07:24:56.693 187189 DEBUG nova.virt.libvirt.driver [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:24:56 compute-0 nova_compute[187185]:   <uuid>7c10cb24-586c-4507-8169-8258d7136397</uuid>
Nov 29 07:24:56 compute-0 nova_compute[187185]:   <name>instance-00000079</name>
Nov 29 07:24:56 compute-0 nova_compute[187185]:   <memory>196608</memory>
Nov 29 07:24:56 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:24:56 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:24:56 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-1276368768</nova:name>
Nov 29 07:24:56 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:24:55</nova:creationTime>
Nov 29 07:24:56 compute-0 nova_compute[187185]:       <nova:flavor name="m1.micro">
Nov 29 07:24:56 compute-0 nova_compute[187185]:         <nova:memory>192</nova:memory>
Nov 29 07:24:56 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:24:56 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:24:56 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:24:56 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:24:56 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:24:56 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:24:56 compute-0 nova_compute[187185]:         <nova:user uuid="bfd2024670594b10941cec8a59d2573f">tempest-TestNetworkAdvancedServerOps-1380683659-project-member</nova:user>
Nov 29 07:24:56 compute-0 nova_compute[187185]:         <nova:project uuid="c231e63624d44fc19e0989abfb1afb22">tempest-TestNetworkAdvancedServerOps-1380683659</nova:project>
Nov 29 07:24:56 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:24:56 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 07:24:56 compute-0 nova_compute[187185]:         <nova:port uuid="e89dd8de-f981-46cf-aa04-cfad6a9b2326">
Nov 29 07:24:56 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:24:56 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:24:56 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:24:56 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <system>
Nov 29 07:24:56 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:24:56 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:24:56 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:24:56 compute-0 nova_compute[187185]:       <entry name="serial">7c10cb24-586c-4507-8169-8258d7136397</entry>
Nov 29 07:24:56 compute-0 nova_compute[187185]:       <entry name="uuid">7c10cb24-586c-4507-8169-8258d7136397</entry>
Nov 29 07:24:56 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     </system>
Nov 29 07:24:56 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:24:56 compute-0 nova_compute[187185]:   <os>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:   </os>
Nov 29 07:24:56 compute-0 nova_compute[187185]:   <features>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:   </features>
Nov 29 07:24:56 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:24:56 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:24:56 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:24:56 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/disk"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:24:56 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/disk.config"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:24:56 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:b1:6e:42"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:       <target dev="tape89dd8de-f9"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:24:56 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/console.log" append="off"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <video>
Nov 29 07:24:56 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     </video>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:24:56 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:24:56 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:24:56 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:24:56 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:24:56 compute-0 nova_compute[187185]: </domain>
Nov 29 07:24:56 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:24:56 compute-0 nova_compute[187185]: 2025-11-29 07:24:56.695 187189 DEBUG nova.virt.libvirt.vif [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:23:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1276368768',display_name='tempest-TestNetworkAdvancedServerOps-server-1276368768',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1276368768',id=121,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBUznZR2iOKaJbWAB1nxy/Np7mGSzlwsDQ7Ycl3wci2nJ60qWbosUg5gundiked4HoZaTmuE/0+OTOCJFQ4CjxMZqyT1FcUBwmvtOPuSl/eONA9sj7Vj+75xN046AU/KWg==',key_name='tempest-TestNetworkAdvancedServerOps-143614444',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:24:11Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-it4r0l7q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:24:43Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=7c10cb24-586c-4507-8169-8258d7136397,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "address": "fa:16:3e:b1:6e:42", "network": {"id": "be5e5e17-de26-4f07-84cb-bd99be23cd24", "bridge": "br-int", "label": "tempest-network-smoke--530096136", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--530096136", "vif_mac": "fa:16:3e:b1:6e:42"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape89dd8de-f9", "ovs_interfaceid": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:24:56 compute-0 nova_compute[187185]: 2025-11-29 07:24:56.696 187189 DEBUG nova.network.os_vif_util [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converting VIF {"id": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "address": "fa:16:3e:b1:6e:42", "network": {"id": "be5e5e17-de26-4f07-84cb-bd99be23cd24", "bridge": "br-int", "label": "tempest-network-smoke--530096136", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--530096136", "vif_mac": "fa:16:3e:b1:6e:42"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape89dd8de-f9", "ovs_interfaceid": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:24:56 compute-0 nova_compute[187185]: 2025-11-29 07:24:56.697 187189 DEBUG nova.network.os_vif_util [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b1:6e:42,bridge_name='br-int',has_traffic_filtering=True,id=e89dd8de-f981-46cf-aa04-cfad6a9b2326,network=Network(be5e5e17-de26-4f07-84cb-bd99be23cd24),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape89dd8de-f9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:24:56 compute-0 nova_compute[187185]: 2025-11-29 07:24:56.697 187189 DEBUG os_vif [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:6e:42,bridge_name='br-int',has_traffic_filtering=True,id=e89dd8de-f981-46cf-aa04-cfad6a9b2326,network=Network(be5e5e17-de26-4f07-84cb-bd99be23cd24),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape89dd8de-f9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:24:56 compute-0 nova_compute[187185]: 2025-11-29 07:24:56.698 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:56 compute-0 nova_compute[187185]: 2025-11-29 07:24:56.698 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:24:56 compute-0 nova_compute[187185]: 2025-11-29 07:24:56.699 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:24:56 compute-0 nova_compute[187185]: 2025-11-29 07:24:56.703 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:56 compute-0 nova_compute[187185]: 2025-11-29 07:24:56.703 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape89dd8de-f9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:24:56 compute-0 nova_compute[187185]: 2025-11-29 07:24:56.704 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape89dd8de-f9, col_values=(('external_ids', {'iface-id': 'e89dd8de-f981-46cf-aa04-cfad6a9b2326', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b1:6e:42', 'vm-uuid': '7c10cb24-586c-4507-8169-8258d7136397'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:24:56 compute-0 nova_compute[187185]: 2025-11-29 07:24:56.706 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:56 compute-0 NetworkManager[55227]: <info>  [1764401096.7068] manager: (tape89dd8de-f9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/192)
Nov 29 07:24:56 compute-0 nova_compute[187185]: 2025-11-29 07:24:56.708 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:24:56 compute-0 nova_compute[187185]: 2025-11-29 07:24:56.715 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:56 compute-0 nova_compute[187185]: 2025-11-29 07:24:56.716 187189 INFO os_vif [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:6e:42,bridge_name='br-int',has_traffic_filtering=True,id=e89dd8de-f981-46cf-aa04-cfad6a9b2326,network=Network(be5e5e17-de26-4f07-84cb-bd99be23cd24),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape89dd8de-f9')
Nov 29 07:24:56 compute-0 nova_compute[187185]: 2025-11-29 07:24:56.929 187189 DEBUG nova.virt.libvirt.driver [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:24:56 compute-0 nova_compute[187185]: 2025-11-29 07:24:56.929 187189 DEBUG nova.virt.libvirt.driver [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:24:56 compute-0 nova_compute[187185]: 2025-11-29 07:24:56.929 187189 DEBUG nova.virt.libvirt.driver [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] No VIF found with MAC fa:16:3e:b1:6e:42, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:24:56 compute-0 nova_compute[187185]: 2025-11-29 07:24:56.930 187189 INFO nova.virt.libvirt.driver [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Using config drive
Nov 29 07:24:57 compute-0 kernel: tape89dd8de-f9: entered promiscuous mode
Nov 29 07:24:57 compute-0 NetworkManager[55227]: <info>  [1764401097.0108] manager: (tape89dd8de-f9): new Tun device (/org/freedesktop/NetworkManager/Devices/193)
Nov 29 07:24:57 compute-0 ovn_controller[95281]: 2025-11-29T07:24:57Z|00358|binding|INFO|Claiming lport e89dd8de-f981-46cf-aa04-cfad6a9b2326 for this chassis.
Nov 29 07:24:57 compute-0 ovn_controller[95281]: 2025-11-29T07:24:57Z|00359|binding|INFO|e89dd8de-f981-46cf-aa04-cfad6a9b2326: Claiming fa:16:3e:b1:6e:42 10.100.0.7
Nov 29 07:24:57 compute-0 nova_compute[187185]: 2025-11-29 07:24:57.012 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:57 compute-0 nova_compute[187185]: 2025-11-29 07:24:57.016 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:57 compute-0 nova_compute[187185]: 2025-11-29 07:24:57.023 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:57 compute-0 nova_compute[187185]: 2025-11-29 07:24:57.028 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:57 compute-0 NetworkManager[55227]: <info>  [1764401097.0303] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/194)
Nov 29 07:24:57 compute-0 NetworkManager[55227]: <info>  [1764401097.0315] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/195)
Nov 29 07:24:57 compute-0 systemd-udevd[234235]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:24:57 compute-0 NetworkManager[55227]: <info>  [1764401097.0685] device (tape89dd8de-f9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:24:57 compute-0 NetworkManager[55227]: <info>  [1764401097.0705] device (tape89dd8de-f9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:24:57 compute-0 systemd-machined[153486]: New machine qemu-48-instance-00000079.
Nov 29 07:24:57 compute-0 systemd[1]: Started Virtual Machine qemu-48-instance-00000079.
Nov 29 07:24:57 compute-0 nova_compute[187185]: 2025-11-29 07:24:57.212 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:57 compute-0 nova_compute[187185]: 2025-11-29 07:24:57.230 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:57 compute-0 ovn_controller[95281]: 2025-11-29T07:24:57Z|00360|binding|INFO|Releasing lport 88f3bff1-58a0-4231-87c4-807c4c2657d5 from this chassis (sb_readonly=0)
Nov 29 07:24:57 compute-0 ovn_controller[95281]: 2025-11-29T07:24:57Z|00361|binding|INFO|Releasing lport 0d1614aa-fbbf-4ad2-9ee2-8948d583ddf6 from this chassis (sb_readonly=0)
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:57.257 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:6e:42 10.100.0.7'], port_security=['fa:16:3e:b1:6e:42 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '7c10cb24-586c-4507-8169-8258d7136397', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-be5e5e17-de26-4f07-84cb-bd99be23cd24', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c231e63624d44fc19e0989abfb1afb22', 'neutron:revision_number': '5', 'neutron:security_group_ids': '51b81e59-c129-44d0-83ab-ea09f800f560', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.173'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=40c46f88-56a4-469c-8869-7f0629f57469, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=e89dd8de-f981-46cf-aa04-cfad6a9b2326) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:57.258 104254 INFO neutron.agent.ovn.metadata.agent [-] Port e89dd8de-f981-46cf-aa04-cfad6a9b2326 in datapath be5e5e17-de26-4f07-84cb-bd99be23cd24 bound to our chassis
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:57.260 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network be5e5e17-de26-4f07-84cb-bd99be23cd24
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:57.280 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e88e1892-ebc1-481c-8447-973af7a3f97f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:57.282 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapbe5e5e17-d1 in ovnmeta-be5e5e17-de26-4f07-84cb-bd99be23cd24 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:57.285 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapbe5e5e17-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:57.285 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[a01ca508-5553-4d23-aaf0-764a08bf2c46]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:57.287 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[87275d93-51e2-41bb-a68d-5886dbc83e53]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:57 compute-0 ovn_controller[95281]: 2025-11-29T07:24:57Z|00362|binding|INFO|Setting lport e89dd8de-f981-46cf-aa04-cfad6a9b2326 ovn-installed in OVS
Nov 29 07:24:57 compute-0 ovn_controller[95281]: 2025-11-29T07:24:57Z|00363|binding|INFO|Setting lport e89dd8de-f981-46cf-aa04-cfad6a9b2326 up in Southbound
Nov 29 07:24:57 compute-0 nova_compute[187185]: 2025-11-29 07:24:57.291 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:57 compute-0 nova_compute[187185]: 2025-11-29 07:24:57.304 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:57.311 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[7f49e34b-a9a4-4b10-b0d1-60ca35a74d77]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:57.346 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[88521a6c-a31d-461d-9403-e7b27c694264]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:57 compute-0 nova_compute[187185]: 2025-11-29 07:24:57.376 187189 DEBUG nova.network.neutron [req-992148c6-1bc1-4c8b-80da-a1d834a93281 req-1a708a7a-ec8c-40cf-a9c5-ee18c1d1c1d8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Updated VIF entry in instance network info cache for port e89dd8de-f981-46cf-aa04-cfad6a9b2326. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:24:57 compute-0 nova_compute[187185]: 2025-11-29 07:24:57.377 187189 DEBUG nova.network.neutron [req-992148c6-1bc1-4c8b-80da-a1d834a93281 req-1a708a7a-ec8c-40cf-a9c5-ee18c1d1c1d8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Updating instance_info_cache with network_info: [{"id": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "address": "fa:16:3e:b1:6e:42", "network": {"id": "be5e5e17-de26-4f07-84cb-bd99be23cd24", "bridge": "br-int", "label": "tempest-network-smoke--530096136", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape89dd8de-f9", "ovs_interfaceid": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:57.387 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[52dd7051-c813-4294-9966-bc22d6804c82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:57 compute-0 NetworkManager[55227]: <info>  [1764401097.3970] manager: (tapbe5e5e17-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/196)
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:57.397 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[051d1b1b-5a8f-4310-870b-a35a3a57064d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:57 compute-0 nova_compute[187185]: 2025-11-29 07:24:57.425 187189 DEBUG oslo_concurrency.lockutils [req-992148c6-1bc1-4c8b-80da-a1d834a93281 req-1a708a7a-ec8c-40cf-a9c5-ee18c1d1c1d8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-7c10cb24-586c-4507-8169-8258d7136397" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:57.442 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[302a25a5-476f-4af9-af0a-848021d7b9e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:57.446 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[97e7f324-055d-4fa8-ad77-3eca2127d12b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:57 compute-0 NetworkManager[55227]: <info>  [1764401097.4753] device (tapbe5e5e17-d0): carrier: link connected
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:57.484 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[e57ea40f-9e19-4d20-9407-9541ba02804e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:57.510 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[0eb5c5c5-db73-4c71-930c-0e47c73e9a37]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbe5e5e17-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:44:a9:0d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 117], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 654513, 'reachable_time': 17762, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234271, 'error': None, 'target': 'ovnmeta-be5e5e17-de26-4f07-84cb-bd99be23cd24', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:57.534 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d73067a5-0e6c-4e7d-9c60-65f50e0f4f40]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe44:a90d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 654513, 'tstamp': 654513}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234272, 'error': None, 'target': 'ovnmeta-be5e5e17-de26-4f07-84cb-bd99be23cd24', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:57.562 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[83b934c4-004a-48d6-9bdf-83a3d41174fc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbe5e5e17-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:44:a9:0d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 117], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 654513, 'reachable_time': 17762, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234273, 'error': None, 'target': 'ovnmeta-be5e5e17-de26-4f07-84cb-bd99be23cd24', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:57 compute-0 nova_compute[187185]: 2025-11-29 07:24:57.567 187189 DEBUG nova.network.neutron [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Updating instance_info_cache with network_info: [{"id": "a1e67d00-8650-44ba-b75d-07f55b8d8810", "address": "fa:16:3e:39:9f:d3", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1e67d00-86", "ovs_interfaceid": "a1e67d00-8650-44ba-b75d-07f55b8d8810", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:57.602 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d2e3e67d-c15b-48cd-9aa0-8fd0cd1f52a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:57 compute-0 nova_compute[187185]: 2025-11-29 07:24:57.619 187189 DEBUG oslo_concurrency.lockutils [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Releasing lock "refresh_cache-5f11adcd-958a-4269-905d-a017406505f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:57.679 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[fc36083a-281a-4493-aca8-123edae27b06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:57 compute-0 nova_compute[187185]: 2025-11-29 07:24:57.680 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764401097.680268, 7c10cb24-586c-4507-8169-8258d7136397 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:24:57 compute-0 nova_compute[187185]: 2025-11-29 07:24:57.681 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 7c10cb24-586c-4507-8169-8258d7136397] VM Resumed (Lifecycle Event)
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:57.681 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbe5e5e17-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:57.681 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:57.682 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbe5e5e17-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:24:57 compute-0 nova_compute[187185]: 2025-11-29 07:24:57.683 187189 DEBUG nova.compute.manager [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:24:57 compute-0 nova_compute[187185]: 2025-11-29 07:24:57.702 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:57 compute-0 NetworkManager[55227]: <info>  [1764401097.7030] manager: (tapbe5e5e17-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/197)
Nov 29 07:24:57 compute-0 kernel: tapbe5e5e17-d0: entered promiscuous mode
Nov 29 07:24:57 compute-0 nova_compute[187185]: 2025-11-29 07:24:57.705 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:24:57 compute-0 nova_compute[187185]: 2025-11-29 07:24:57.705 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:57.705 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbe5e5e17-d0, col_values=(('external_ids', {'iface-id': '2da41e48-a12e-440c-815f-4c44c48f8762'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:24:57 compute-0 ovn_controller[95281]: 2025-11-29T07:24:57Z|00364|binding|INFO|Releasing lport 2da41e48-a12e-440c-815f-4c44c48f8762 from this chassis (sb_readonly=0)
Nov 29 07:24:57 compute-0 nova_compute[187185]: 2025-11-29 07:24:57.707 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:57.709 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/be5e5e17-de26-4f07-84cb-bd99be23cd24.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/be5e5e17-de26-4f07-84cb-bd99be23cd24.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:24:57 compute-0 nova_compute[187185]: 2025-11-29 07:24:57.711 187189 INFO nova.virt.libvirt.driver [-] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Instance running successfully.
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:57.711 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[fefe9d45-a681-4883-a050-00a59c246f61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:24:57 compute-0 virtqemud[186729]: argument unsupported: QEMU guest agent is not configured
Nov 29 07:24:57 compute-0 nova_compute[187185]: 2025-11-29 07:24:57.712 187189 DEBUG nova.virt.libvirt.guest [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:57.712 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-be5e5e17-de26-4f07-84cb-bd99be23cd24
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/be5e5e17-de26-4f07-84cb-bd99be23cd24.pid.haproxy
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID be5e5e17-de26-4f07-84cb-bd99be23cd24
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:24:57 compute-0 nova_compute[187185]: 2025-11-29 07:24:57.713 187189 DEBUG nova.virt.libvirt.driver [None req-67449d45-df70-4805-807d-0a74c7195dc5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Nov 29 07:24:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:24:57.713 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-be5e5e17-de26-4f07-84cb-bd99be23cd24', 'env', 'PROCESS_TAG=haproxy-be5e5e17-de26-4f07-84cb-bd99be23cd24', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/be5e5e17-de26-4f07-84cb-bd99be23cd24.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:24:57 compute-0 nova_compute[187185]: 2025-11-29 07:24:57.715 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:24:57 compute-0 nova_compute[187185]: 2025-11-29 07:24:57.720 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:24:58 compute-0 podman[234312]: 2025-11-29 07:24:58.073761459 +0000 UTC m=+0.030458520 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:24:58 compute-0 podman[234312]: 2025-11-29 07:24:58.277828095 +0000 UTC m=+0.234525156 container create a925727d1aad090da1682982cbc86285c575f8a68a767e5aad77430ee6ef2fbd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-be5e5e17-de26-4f07-84cb-bd99be23cd24, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 07:24:58 compute-0 systemd[1]: Started libpod-conmon-a925727d1aad090da1682982cbc86285c575f8a68a767e5aad77430ee6ef2fbd.scope.
Nov 29 07:24:58 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:24:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/082ca51418527a87ffea65750fd070756b7a67896159c76faedfeed34349cf43/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:24:58 compute-0 podman[234312]: 2025-11-29 07:24:58.432768706 +0000 UTC m=+0.389465767 container init a925727d1aad090da1682982cbc86285c575f8a68a767e5aad77430ee6ef2fbd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-be5e5e17-de26-4f07-84cb-bd99be23cd24, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:24:58 compute-0 podman[234312]: 2025-11-29 07:24:58.438061205 +0000 UTC m=+0.394758266 container start a925727d1aad090da1682982cbc86285c575f8a68a767e5aad77430ee6ef2fbd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-be5e5e17-de26-4f07-84cb-bd99be23cd24, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:24:58 compute-0 podman[234326]: 2025-11-29 07:24:58.450902037 +0000 UTC m=+0.125293055 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Nov 29 07:24:58 compute-0 neutron-haproxy-ovnmeta-be5e5e17-de26-4f07-84cb-bd99be23cd24[234343]: [NOTICE]   (234358) : New worker (234360) forked
Nov 29 07:24:58 compute-0 neutron-haproxy-ovnmeta-be5e5e17-de26-4f07-84cb-bd99be23cd24[234343]: [NOTICE]   (234358) : Loading success.
Nov 29 07:24:58 compute-0 nova_compute[187185]: 2025-11-29 07:24:58.662 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 7c10cb24-586c-4507-8169-8258d7136397] During sync_power_state the instance has a pending task (resize_finish). Skip.
Nov 29 07:24:58 compute-0 nova_compute[187185]: 2025-11-29 07:24:58.663 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764401097.6811006, 7c10cb24-586c-4507-8169-8258d7136397 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:24:58 compute-0 nova_compute[187185]: 2025-11-29 07:24:58.663 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 7c10cb24-586c-4507-8169-8258d7136397] VM Started (Lifecycle Event)
Nov 29 07:24:58 compute-0 nova_compute[187185]: 2025-11-29 07:24:58.684 187189 DEBUG nova.virt.libvirt.driver [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Nov 29 07:24:58 compute-0 nova_compute[187185]: 2025-11-29 07:24:58.706 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:24:58 compute-0 nova_compute[187185]: 2025-11-29 07:24:58.709 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:24:58 compute-0 nova_compute[187185]: 2025-11-29 07:24:58.733 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 7c10cb24-586c-4507-8169-8258d7136397] During sync_power_state the instance has a pending task (resize_finish). Skip.
Nov 29 07:24:59 compute-0 nova_compute[187185]: 2025-11-29 07:24:59.757 187189 DEBUG nova.compute.manager [req-9045edf0-42e6-40cc-add9-b93da415975b req-a23731e1-95ab-44a8-9fe8-c5274c3b700e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Received event network-vif-plugged-e89dd8de-f981-46cf-aa04-cfad6a9b2326 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:24:59 compute-0 nova_compute[187185]: 2025-11-29 07:24:59.758 187189 DEBUG oslo_concurrency.lockutils [req-9045edf0-42e6-40cc-add9-b93da415975b req-a23731e1-95ab-44a8-9fe8-c5274c3b700e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7c10cb24-586c-4507-8169-8258d7136397-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:24:59 compute-0 nova_compute[187185]: 2025-11-29 07:24:59.759 187189 DEBUG oslo_concurrency.lockutils [req-9045edf0-42e6-40cc-add9-b93da415975b req-a23731e1-95ab-44a8-9fe8-c5274c3b700e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7c10cb24-586c-4507-8169-8258d7136397-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:24:59 compute-0 nova_compute[187185]: 2025-11-29 07:24:59.759 187189 DEBUG oslo_concurrency.lockutils [req-9045edf0-42e6-40cc-add9-b93da415975b req-a23731e1-95ab-44a8-9fe8-c5274c3b700e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7c10cb24-586c-4507-8169-8258d7136397-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:24:59 compute-0 nova_compute[187185]: 2025-11-29 07:24:59.759 187189 DEBUG nova.compute.manager [req-9045edf0-42e6-40cc-add9-b93da415975b req-a23731e1-95ab-44a8-9fe8-c5274c3b700e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] No waiting events found dispatching network-vif-plugged-e89dd8de-f981-46cf-aa04-cfad6a9b2326 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:24:59 compute-0 nova_compute[187185]: 2025-11-29 07:24:59.760 187189 WARNING nova.compute.manager [req-9045edf0-42e6-40cc-add9-b93da415975b req-a23731e1-95ab-44a8-9fe8-c5274c3b700e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Received unexpected event network-vif-plugged-e89dd8de-f981-46cf-aa04-cfad6a9b2326 for instance with vm_state resized and task_state None.
Nov 29 07:25:00 compute-0 kernel: tapa1e67d00-86 (unregistering): left promiscuous mode
Nov 29 07:25:00 compute-0 NetworkManager[55227]: <info>  [1764401100.8855] device (tapa1e67d00-86): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:25:00 compute-0 ovn_controller[95281]: 2025-11-29T07:25:00Z|00365|binding|INFO|Releasing lport a1e67d00-8650-44ba-b75d-07f55b8d8810 from this chassis (sb_readonly=0)
Nov 29 07:25:00 compute-0 nova_compute[187185]: 2025-11-29 07:25:00.892 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:00 compute-0 ovn_controller[95281]: 2025-11-29T07:25:00Z|00366|binding|INFO|Setting lport a1e67d00-8650-44ba-b75d-07f55b8d8810 down in Southbound
Nov 29 07:25:00 compute-0 ovn_controller[95281]: 2025-11-29T07:25:00Z|00367|binding|INFO|Removing iface tapa1e67d00-86 ovn-installed in OVS
Nov 29 07:25:00 compute-0 nova_compute[187185]: 2025-11-29 07:25:00.897 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:00 compute-0 nova_compute[187185]: 2025-11-29 07:25:00.918 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:00 compute-0 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d0000007b.scope: Deactivated successfully.
Nov 29 07:25:00 compute-0 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d0000007b.scope: Consumed 13.684s CPU time.
Nov 29 07:25:00 compute-0 systemd-machined[153486]: Machine qemu-47-instance-0000007b terminated.
Nov 29 07:25:01 compute-0 nova_compute[187185]: 2025-11-29 07:25:01.193 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:01.362 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:39:9f:d3 10.100.0.11'], port_security=['fa:16:3e:39:9f:d3 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '5f11adcd-958a-4269-905d-a017406505f0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'neutron:revision_number': '4', 'neutron:security_group_ids': '339ff6a8-b11e-4176-931b-a82ab9688ace', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e0beb853-8490-4e92-a787-adc66ba47efc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=a1e67d00-8650-44ba-b75d-07f55b8d8810) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:25:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:01.363 104254 INFO neutron.agent.ovn.metadata.agent [-] Port a1e67d00-8650-44ba-b75d-07f55b8d8810 in datapath 240f16d8-602b-4aa1-8edb-e3a8d3674e39 unbound from our chassis
Nov 29 07:25:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:01.365 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 240f16d8-602b-4aa1-8edb-e3a8d3674e39, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:25:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:01.367 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e25946ef-966f-4b4d-bf90-dff2947c3ab7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:01.367 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39 namespace which is not needed anymore
Nov 29 07:25:01 compute-0 neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39[234075]: [NOTICE]   (234082) : haproxy version is 2.8.14-c23fe91
Nov 29 07:25:01 compute-0 neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39[234075]: [NOTICE]   (234082) : path to executable is /usr/sbin/haproxy
Nov 29 07:25:01 compute-0 neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39[234075]: [WARNING]  (234082) : Exiting Master process...
Nov 29 07:25:01 compute-0 neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39[234075]: [WARNING]  (234082) : Exiting Master process...
Nov 29 07:25:01 compute-0 neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39[234075]: [ALERT]    (234082) : Current worker (234085) exited with code 143 (Terminated)
Nov 29 07:25:01 compute-0 neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39[234075]: [WARNING]  (234082) : All workers exited. Exiting... (0)
Nov 29 07:25:01 compute-0 systemd[1]: libpod-2f01ae3b54bfd5b2b064416d67b4ec240bcdc698e67290cf758c005b6164c851.scope: Deactivated successfully.
Nov 29 07:25:01 compute-0 podman[234409]: 2025-11-29 07:25:01.527771247 +0000 UTC m=+0.052708497 container died 2f01ae3b54bfd5b2b064416d67b4ec240bcdc698e67290cf758c005b6164c851 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS)
Nov 29 07:25:01 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2f01ae3b54bfd5b2b064416d67b4ec240bcdc698e67290cf758c005b6164c851-userdata-shm.mount: Deactivated successfully.
Nov 29 07:25:01 compute-0 systemd[1]: var-lib-containers-storage-overlay-5a87177512692a664580163c7f0bd904ce12b38b73b317c7fb430e8a6e2d7c3e-merged.mount: Deactivated successfully.
Nov 29 07:25:01 compute-0 podman[234409]: 2025-11-29 07:25:01.584325573 +0000 UTC m=+0.109262793 container cleanup 2f01ae3b54bfd5b2b064416d67b4ec240bcdc698e67290cf758c005b6164c851 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:25:01 compute-0 systemd[1]: libpod-conmon-2f01ae3b54bfd5b2b064416d67b4ec240bcdc698e67290cf758c005b6164c851.scope: Deactivated successfully.
Nov 29 07:25:01 compute-0 podman[234438]: 2025-11-29 07:25:01.687006139 +0000 UTC m=+0.047048048 container remove 2f01ae3b54bfd5b2b064416d67b4ec240bcdc698e67290cf758c005b6164c851 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 29 07:25:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:01.696 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[9aadae50-dd40-411f-874b-5d8da2710fc2]: (4, ('Sat Nov 29 07:25:01 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39 (2f01ae3b54bfd5b2b064416d67b4ec240bcdc698e67290cf758c005b6164c851)\n2f01ae3b54bfd5b2b064416d67b4ec240bcdc698e67290cf758c005b6164c851\nSat Nov 29 07:25:01 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39 (2f01ae3b54bfd5b2b064416d67b4ec240bcdc698e67290cf758c005b6164c851)\n2f01ae3b54bfd5b2b064416d67b4ec240bcdc698e67290cf758c005b6164c851\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:01.698 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[0a1bc0dc-9521-4d8d-b8dc-7f4ff587c0e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:01.699 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap240f16d8-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:25:01 compute-0 nova_compute[187185]: 2025-11-29 07:25:01.701 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:01 compute-0 kernel: tap240f16d8-60: left promiscuous mode
Nov 29 07:25:01 compute-0 nova_compute[187185]: 2025-11-29 07:25:01.706 187189 DEBUG oslo_concurrency.lockutils [None req-e780857d-3ef6-4259-8989-007f87322a8e 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "2702fe48-44d0-408d-8d10-fd635e3779c9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:25:01 compute-0 nova_compute[187185]: 2025-11-29 07:25:01.706 187189 DEBUG oslo_concurrency.lockutils [None req-e780857d-3ef6-4259-8989-007f87322a8e 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "2702fe48-44d0-408d-8d10-fd635e3779c9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:25:01 compute-0 nova_compute[187185]: 2025-11-29 07:25:01.706 187189 DEBUG oslo_concurrency.lockutils [None req-e780857d-3ef6-4259-8989-007f87322a8e 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "2702fe48-44d0-408d-8d10-fd635e3779c9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:25:01 compute-0 nova_compute[187185]: 2025-11-29 07:25:01.706 187189 DEBUG oslo_concurrency.lockutils [None req-e780857d-3ef6-4259-8989-007f87322a8e 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "2702fe48-44d0-408d-8d10-fd635e3779c9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:25:01 compute-0 nova_compute[187185]: 2025-11-29 07:25:01.706 187189 DEBUG oslo_concurrency.lockutils [None req-e780857d-3ef6-4259-8989-007f87322a8e 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "2702fe48-44d0-408d-8d10-fd635e3779c9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:25:01 compute-0 nova_compute[187185]: 2025-11-29 07:25:01.708 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:01 compute-0 nova_compute[187185]: 2025-11-29 07:25:01.709 187189 INFO nova.virt.libvirt.driver [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Instance shutdown successfully after 3 seconds.
Nov 29 07:25:01 compute-0 nova_compute[187185]: 2025-11-29 07:25:01.717 187189 INFO nova.compute.manager [None req-e780857d-3ef6-4259-8989-007f87322a8e 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Terminating instance
Nov 29 07:25:01 compute-0 nova_compute[187185]: 2025-11-29 07:25:01.721 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:01 compute-0 nova_compute[187185]: 2025-11-29 07:25:01.722 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:01.725 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[c9a9950c-e67b-4868-b29e-f648f2d3520d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:01 compute-0 nova_compute[187185]: 2025-11-29 07:25:01.729 187189 INFO nova.virt.libvirt.driver [-] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Instance destroyed successfully.
Nov 29 07:25:01 compute-0 nova_compute[187185]: 2025-11-29 07:25:01.729 187189 DEBUG nova.objects.instance [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lazy-loading 'numa_topology' on Instance uuid 5f11adcd-958a-4269-905d-a017406505f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:25:01 compute-0 nova_compute[187185]: 2025-11-29 07:25:01.730 187189 DEBUG nova.compute.manager [None req-e780857d-3ef6-4259-8989-007f87322a8e 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 07:25:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:01.742 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[8955ddce-e096-4f8e-9e79-01283140ef4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:01.744 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[cc3033df-7d26-4421-a5fb-7f9c9e8a9e5a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:01 compute-0 nova_compute[187185]: 2025-11-29 07:25:01.754 187189 INFO nova.virt.libvirt.driver [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Attempting a stable device rescue
Nov 29 07:25:01 compute-0 kernel: tap3484baf0-bf (unregistering): left promiscuous mode
Nov 29 07:25:01 compute-0 NetworkManager[55227]: <info>  [1764401101.7623] device (tap3484baf0-bf): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:25:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:01.774 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[633ca229-04ba-4627-a487-bfa87e2e40b2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 652884, 'reachable_time': 27453, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234457, 'error': None, 'target': 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:01 compute-0 nova_compute[187185]: 2025-11-29 07:25:01.776 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:01 compute-0 ovn_controller[95281]: 2025-11-29T07:25:01Z|00368|binding|INFO|Releasing lport 3484baf0-bfbb-4b67-b841-a369f9a2c534 from this chassis (sb_readonly=0)
Nov 29 07:25:01 compute-0 ovn_controller[95281]: 2025-11-29T07:25:01Z|00369|binding|INFO|Setting lport 3484baf0-bfbb-4b67-b841-a369f9a2c534 down in Southbound
Nov 29 07:25:01 compute-0 ovn_controller[95281]: 2025-11-29T07:25:01Z|00370|binding|INFO|Removing iface tap3484baf0-bf ovn-installed in OVS
Nov 29 07:25:01 compute-0 systemd[1]: run-netns-ovnmeta\x2d240f16d8\x2d602b\x2d4aa1\x2d8edb\x2de3a8d3674e39.mount: Deactivated successfully.
Nov 29 07:25:01 compute-0 nova_compute[187185]: 2025-11-29 07:25:01.794 187189 DEBUG nova.compute.manager [req-f6877a2d-30ab-417b-b641-e037be2f50ac req-546916fc-6c69-4540-a3a2-9f2460a7c88c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Received event network-vif-unplugged-a1e67d00-8650-44ba-b75d-07f55b8d8810 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:25:01 compute-0 nova_compute[187185]: 2025-11-29 07:25:01.794 187189 DEBUG oslo_concurrency.lockutils [req-f6877a2d-30ab-417b-b641-e037be2f50ac req-546916fc-6c69-4540-a3a2-9f2460a7c88c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5f11adcd-958a-4269-905d-a017406505f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:25:01 compute-0 nova_compute[187185]: 2025-11-29 07:25:01.795 187189 DEBUG oslo_concurrency.lockutils [req-f6877a2d-30ab-417b-b641-e037be2f50ac req-546916fc-6c69-4540-a3a2-9f2460a7c88c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f11adcd-958a-4269-905d-a017406505f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:25:01 compute-0 nova_compute[187185]: 2025-11-29 07:25:01.795 187189 DEBUG oslo_concurrency.lockutils [req-f6877a2d-30ab-417b-b641-e037be2f50ac req-546916fc-6c69-4540-a3a2-9f2460a7c88c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f11adcd-958a-4269-905d-a017406505f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:25:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:01.795 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:34:30:8b 10.100.0.12'], port_security=['fa:16:3e:34:30:8b 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '2702fe48-44d0-408d-8d10-fd635e3779c9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6d55e57bfd184513a304a61cc1cb3730', 'neutron:revision_number': '8', 'neutron:security_group_ids': '44bb1ac9-49ae-4c0a-8013-0db5efadb536', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=804567bc-6857-4eb6-aa00-b449f09c69a2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=3484baf0-bfbb-4b67-b841-a369f9a2c534) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:25:01 compute-0 nova_compute[187185]: 2025-11-29 07:25:01.796 187189 DEBUG nova.compute.manager [req-f6877a2d-30ab-417b-b641-e037be2f50ac req-546916fc-6c69-4540-a3a2-9f2460a7c88c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] No waiting events found dispatching network-vif-unplugged-a1e67d00-8650-44ba-b75d-07f55b8d8810 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:25:01 compute-0 nova_compute[187185]: 2025-11-29 07:25:01.796 187189 WARNING nova.compute.manager [req-f6877a2d-30ab-417b-b641-e037be2f50ac req-546916fc-6c69-4540-a3a2-9f2460a7c88c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Received unexpected event network-vif-unplugged-a1e67d00-8650-44ba-b75d-07f55b8d8810 for instance with vm_state active and task_state rescuing.
Nov 29 07:25:01 compute-0 nova_compute[187185]: 2025-11-29 07:25:01.797 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:01.792 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:25:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:01.792 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[483505e5-b083-4300-b3b9-cc23ed1a91f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:01.800 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 3484baf0-bfbb-4b67-b841-a369f9a2c534 in datapath 9b34af6b-edf9-4b27-b1dc-2b18c2eec958 unbound from our chassis
Nov 29 07:25:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:01.804 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9b34af6b-edf9-4b27-b1dc-2b18c2eec958, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:25:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:01.806 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[28ac34d7-2bbf-4e5c-9450-253f8d5fcd07]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:01.807 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958 namespace which is not needed anymore
Nov 29 07:25:01 compute-0 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d0000007a.scope: Deactivated successfully.
Nov 29 07:25:01 compute-0 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d0000007a.scope: Consumed 13.117s CPU time.
Nov 29 07:25:01 compute-0 systemd-machined[153486]: Machine qemu-46-instance-0000007a terminated.
Nov 29 07:25:01 compute-0 nova_compute[187185]: 2025-11-29 07:25:01.963 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:01 compute-0 nova_compute[187185]: 2025-11-29 07:25:01.977 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:01 compute-0 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[233960]: [NOTICE]   (233967) : haproxy version is 2.8.14-c23fe91
Nov 29 07:25:01 compute-0 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[233960]: [NOTICE]   (233967) : path to executable is /usr/sbin/haproxy
Nov 29 07:25:01 compute-0 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[233960]: [WARNING]  (233967) : Exiting Master process...
Nov 29 07:25:01 compute-0 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[233960]: [ALERT]    (233967) : Current worker (233969) exited with code 143 (Terminated)
Nov 29 07:25:01 compute-0 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[233960]: [WARNING]  (233967) : All workers exited. Exiting... (0)
Nov 29 07:25:01 compute-0 systemd[1]: libpod-289dd7ff939ff073193c24d8adff8bd7c2ab1b059b013adf22cd659f84825024.scope: Deactivated successfully.
Nov 29 07:25:02 compute-0 podman[234479]: 2025-11-29 07:25:02.003641631 +0000 UTC m=+0.064065719 container died 289dd7ff939ff073193c24d8adff8bd7c2ab1b059b013adf22cd659f84825024 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 29 07:25:02 compute-0 nova_compute[187185]: 2025-11-29 07:25:02.013 187189 INFO nova.virt.libvirt.driver [-] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Instance destroyed successfully.
Nov 29 07:25:02 compute-0 nova_compute[187185]: 2025-11-29 07:25:02.014 187189 DEBUG nova.objects.instance [None req-e780857d-3ef6-4259-8989-007f87322a8e 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lazy-loading 'resources' on Instance uuid 2702fe48-44d0-408d-8d10-fd635e3779c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:25:02 compute-0 nova_compute[187185]: 2025-11-29 07:25:02.032 187189 DEBUG nova.virt.libvirt.vif [None req-e780857d-3ef6-4259-8989-007f87322a8e 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:24:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-864835491',display_name='tempest-ServerDiskConfigTestJSON-server-864835491',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-864835491',id=122,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:24:40Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6d55e57bfd184513a304a61cc1cb3730',ramdisk_id='',reservation_id='r-ybkvc9v9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1282760174',owner_user_name='tempest-ServerDiskConfigTestJSON-1282760174-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:24:43Z,user_data=None,user_id='000fb7b950024e16902cd58f2ea16ac9',uuid=2702fe48-44d0-408d-8d10-fd635e3779c9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "address": "fa:16:3e:34:30:8b", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3484baf0-bf", "ovs_interfaceid": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:25:02 compute-0 nova_compute[187185]: 2025-11-29 07:25:02.032 187189 DEBUG nova.network.os_vif_util [None req-e780857d-3ef6-4259-8989-007f87322a8e 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converting VIF {"id": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "address": "fa:16:3e:34:30:8b", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3484baf0-bf", "ovs_interfaceid": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:25:02 compute-0 nova_compute[187185]: 2025-11-29 07:25:02.033 187189 DEBUG nova.network.os_vif_util [None req-e780857d-3ef6-4259-8989-007f87322a8e 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:34:30:8b,bridge_name='br-int',has_traffic_filtering=True,id=3484baf0-bfbb-4b67-b841-a369f9a2c534,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3484baf0-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:25:02 compute-0 nova_compute[187185]: 2025-11-29 07:25:02.033 187189 DEBUG os_vif [None req-e780857d-3ef6-4259-8989-007f87322a8e 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:34:30:8b,bridge_name='br-int',has_traffic_filtering=True,id=3484baf0-bfbb-4b67-b841-a369f9a2c534,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3484baf0-bf') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:25:02 compute-0 nova_compute[187185]: 2025-11-29 07:25:02.034 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:02 compute-0 nova_compute[187185]: 2025-11-29 07:25:02.035 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3484baf0-bf, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:25:02 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-289dd7ff939ff073193c24d8adff8bd7c2ab1b059b013adf22cd659f84825024-userdata-shm.mount: Deactivated successfully.
Nov 29 07:25:02 compute-0 systemd[1]: var-lib-containers-storage-overlay-7a9fa77b4055a7158239636c873de44924040c0504cef4308cb8e41e640386ab-merged.mount: Deactivated successfully.
Nov 29 07:25:02 compute-0 nova_compute[187185]: 2025-11-29 07:25:02.042 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:25:02 compute-0 nova_compute[187185]: 2025-11-29 07:25:02.043 187189 INFO os_vif [None req-e780857d-3ef6-4259-8989-007f87322a8e 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:34:30:8b,bridge_name='br-int',has_traffic_filtering=True,id=3484baf0-bfbb-4b67-b841-a369f9a2c534,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3484baf0-bf')
Nov 29 07:25:02 compute-0 nova_compute[187185]: 2025-11-29 07:25:02.044 187189 INFO nova.virt.libvirt.driver [None req-e780857d-3ef6-4259-8989-007f87322a8e 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Deleting instance files /var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9_del
Nov 29 07:25:02 compute-0 podman[234479]: 2025-11-29 07:25:02.048545877 +0000 UTC m=+0.108969965 container cleanup 289dd7ff939ff073193c24d8adff8bd7c2ab1b059b013adf22cd659f84825024 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 07:25:02 compute-0 nova_compute[187185]: 2025-11-29 07:25:02.049 187189 INFO nova.virt.libvirt.driver [None req-e780857d-3ef6-4259-8989-007f87322a8e 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Deletion of /var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9_del complete
Nov 29 07:25:02 compute-0 systemd[1]: libpod-conmon-289dd7ff939ff073193c24d8adff8bd7c2ab1b059b013adf22cd659f84825024.scope: Deactivated successfully.
Nov 29 07:25:02 compute-0 nova_compute[187185]: 2025-11-29 07:25:02.076 187189 DEBUG nova.virt.libvirt.driver [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'scsi', 'dev': 'sdb', 'type': 'disk'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314
Nov 29 07:25:02 compute-0 nova_compute[187185]: 2025-11-29 07:25:02.081 187189 DEBUG nova.virt.libvirt.driver [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Nov 29 07:25:02 compute-0 nova_compute[187185]: 2025-11-29 07:25:02.081 187189 INFO nova.virt.libvirt.driver [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Creating image(s)
Nov 29 07:25:02 compute-0 nova_compute[187185]: 2025-11-29 07:25:02.082 187189 DEBUG oslo_concurrency.lockutils [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "/var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:25:02 compute-0 nova_compute[187185]: 2025-11-29 07:25:02.082 187189 DEBUG oslo_concurrency.lockutils [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "/var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:25:02 compute-0 nova_compute[187185]: 2025-11-29 07:25:02.083 187189 DEBUG oslo_concurrency.lockutils [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "/var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:25:02 compute-0 nova_compute[187185]: 2025-11-29 07:25:02.083 187189 DEBUG nova.objects.instance [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 5f11adcd-958a-4269-905d-a017406505f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:25:02 compute-0 nova_compute[187185]: 2025-11-29 07:25:02.100 187189 DEBUG oslo_concurrency.lockutils [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "f54dd85e52fe479e36220a2e2d112289f5828e52" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:25:02 compute-0 nova_compute[187185]: 2025-11-29 07:25:02.101 187189 DEBUG oslo_concurrency.lockutils [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "f54dd85e52fe479e36220a2e2d112289f5828e52" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:25:02 compute-0 podman[234528]: 2025-11-29 07:25:02.118898172 +0000 UTC m=+0.045991089 container remove 289dd7ff939ff073193c24d8adff8bd7c2ab1b059b013adf22cd659f84825024 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:25:02 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:02.126 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[30b2347e-d1a4-4f82-a772-d59cb5bcaacc]: (4, ('Sat Nov 29 07:25:01 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958 (289dd7ff939ff073193c24d8adff8bd7c2ab1b059b013adf22cd659f84825024)\n289dd7ff939ff073193c24d8adff8bd7c2ab1b059b013adf22cd659f84825024\nSat Nov 29 07:25:02 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958 (289dd7ff939ff073193c24d8adff8bd7c2ab1b059b013adf22cd659f84825024)\n289dd7ff939ff073193c24d8adff8bd7c2ab1b059b013adf22cd659f84825024\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:02 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:02.128 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[948f1270-a4cd-431f-9ffd-38cdf361245d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:02 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:02.129 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9b34af6b-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:25:02 compute-0 nova_compute[187185]: 2025-11-29 07:25:02.131 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:02 compute-0 kernel: tap9b34af6b-e0: left promiscuous mode
Nov 29 07:25:02 compute-0 nova_compute[187185]: 2025-11-29 07:25:02.141 187189 INFO nova.compute.manager [None req-e780857d-3ef6-4259-8989-007f87322a8e 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Took 0.41 seconds to destroy the instance on the hypervisor.
Nov 29 07:25:02 compute-0 nova_compute[187185]: 2025-11-29 07:25:02.142 187189 DEBUG oslo.service.loopingcall [None req-e780857d-3ef6-4259-8989-007f87322a8e 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 07:25:02 compute-0 nova_compute[187185]: 2025-11-29 07:25:02.142 187189 DEBUG nova.compute.manager [-] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 07:25:02 compute-0 nova_compute[187185]: 2025-11-29 07:25:02.143 187189 DEBUG nova.network.neutron [-] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 07:25:02 compute-0 nova_compute[187185]: 2025-11-29 07:25:02.149 187189 DEBUG nova.compute.manager [req-6d05b0e6-bc4e-4f1d-a191-50861dffba44 req-2e49ee27-22e9-44ab-97a5-e80adf4f33a8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Received event network-vif-plugged-e89dd8de-f981-46cf-aa04-cfad6a9b2326 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:25:02 compute-0 nova_compute[187185]: 2025-11-29 07:25:02.149 187189 DEBUG oslo_concurrency.lockutils [req-6d05b0e6-bc4e-4f1d-a191-50861dffba44 req-2e49ee27-22e9-44ab-97a5-e80adf4f33a8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7c10cb24-586c-4507-8169-8258d7136397-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:25:02 compute-0 nova_compute[187185]: 2025-11-29 07:25:02.151 187189 DEBUG oslo_concurrency.lockutils [req-6d05b0e6-bc4e-4f1d-a191-50861dffba44 req-2e49ee27-22e9-44ab-97a5-e80adf4f33a8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7c10cb24-586c-4507-8169-8258d7136397-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:25:02 compute-0 nova_compute[187185]: 2025-11-29 07:25:02.151 187189 DEBUG oslo_concurrency.lockutils [req-6d05b0e6-bc4e-4f1d-a191-50861dffba44 req-2e49ee27-22e9-44ab-97a5-e80adf4f33a8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7c10cb24-586c-4507-8169-8258d7136397-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:25:02 compute-0 nova_compute[187185]: 2025-11-29 07:25:02.151 187189 DEBUG nova.compute.manager [req-6d05b0e6-bc4e-4f1d-a191-50861dffba44 req-2e49ee27-22e9-44ab-97a5-e80adf4f33a8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] No waiting events found dispatching network-vif-plugged-e89dd8de-f981-46cf-aa04-cfad6a9b2326 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:25:02 compute-0 nova_compute[187185]: 2025-11-29 07:25:02.151 187189 WARNING nova.compute.manager [req-6d05b0e6-bc4e-4f1d-a191-50861dffba44 req-2e49ee27-22e9-44ab-97a5-e80adf4f33a8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Received unexpected event network-vif-plugged-e89dd8de-f981-46cf-aa04-cfad6a9b2326 for instance with vm_state resized and task_state None.
Nov 29 07:25:02 compute-0 nova_compute[187185]: 2025-11-29 07:25:02.152 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:02 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:02.153 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[98331624-a6ef-486a-93b6-b814c0aff528]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:02 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:02.166 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6957c05d-f15c-4f2c-884c-b4fd8aa02593]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:02 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:02.168 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[dcc5577a-1806-4ec0-ba45-bb098bdf9b9e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:02 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:02.189 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[da19d50e-56bb-492e-8428-ad73a5101613]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 652775, 'reachable_time': 16770, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234545, 'error': None, 'target': 'ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:02 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:02.191 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:25:02 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:02.191 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[8d6d154a-c19d-40b8-91e3-16d5aa81d4ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:02 compute-0 nova_compute[187185]: 2025-11-29 07:25:02.474 187189 DEBUG nova.compute.manager [req-70758d5e-8b4a-45db-a58f-ff5f31e1c9dd req-4144d624-7799-487d-9cc1-ca311d0fb66c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Received event network-vif-unplugged-3484baf0-bfbb-4b67-b841-a369f9a2c534 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:25:02 compute-0 nova_compute[187185]: 2025-11-29 07:25:02.474 187189 DEBUG oslo_concurrency.lockutils [req-70758d5e-8b4a-45db-a58f-ff5f31e1c9dd req-4144d624-7799-487d-9cc1-ca311d0fb66c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2702fe48-44d0-408d-8d10-fd635e3779c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:25:02 compute-0 nova_compute[187185]: 2025-11-29 07:25:02.475 187189 DEBUG oslo_concurrency.lockutils [req-70758d5e-8b4a-45db-a58f-ff5f31e1c9dd req-4144d624-7799-487d-9cc1-ca311d0fb66c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2702fe48-44d0-408d-8d10-fd635e3779c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:25:02 compute-0 nova_compute[187185]: 2025-11-29 07:25:02.475 187189 DEBUG oslo_concurrency.lockutils [req-70758d5e-8b4a-45db-a58f-ff5f31e1c9dd req-4144d624-7799-487d-9cc1-ca311d0fb66c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2702fe48-44d0-408d-8d10-fd635e3779c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:25:02 compute-0 nova_compute[187185]: 2025-11-29 07:25:02.475 187189 DEBUG nova.compute.manager [req-70758d5e-8b4a-45db-a58f-ff5f31e1c9dd req-4144d624-7799-487d-9cc1-ca311d0fb66c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] No waiting events found dispatching network-vif-unplugged-3484baf0-bfbb-4b67-b841-a369f9a2c534 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:25:02 compute-0 nova_compute[187185]: 2025-11-29 07:25:02.475 187189 DEBUG nova.compute.manager [req-70758d5e-8b4a-45db-a58f-ff5f31e1c9dd req-4144d624-7799-487d-9cc1-ca311d0fb66c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Received event network-vif-unplugged-3484baf0-bfbb-4b67-b841-a369f9a2c534 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 07:25:02 compute-0 systemd[1]: run-netns-ovnmeta\x2d9b34af6b\x2dedf9\x2d4b27\x2db1dc\x2d2b18c2eec958.mount: Deactivated successfully.
Nov 29 07:25:03 compute-0 nova_compute[187185]: 2025-11-29 07:25:03.087 187189 DEBUG nova.network.neutron [-] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:25:03 compute-0 nova_compute[187185]: 2025-11-29 07:25:03.282 187189 INFO nova.compute.manager [-] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Took 1.14 seconds to deallocate network for instance.
Nov 29 07:25:03 compute-0 nova_compute[187185]: 2025-11-29 07:25:03.380 187189 DEBUG oslo_concurrency.lockutils [None req-e780857d-3ef6-4259-8989-007f87322a8e 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:25:03 compute-0 nova_compute[187185]: 2025-11-29 07:25:03.381 187189 DEBUG oslo_concurrency.lockutils [None req-e780857d-3ef6-4259-8989-007f87322a8e 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:25:03 compute-0 nova_compute[187185]: 2025-11-29 07:25:03.388 187189 DEBUG oslo_concurrency.lockutils [None req-e780857d-3ef6-4259-8989-007f87322a8e 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.007s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:25:03 compute-0 nova_compute[187185]: 2025-11-29 07:25:03.436 187189 INFO nova.scheduler.client.report [None req-e780857d-3ef6-4259-8989-007f87322a8e 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Deleted allocations for instance 2702fe48-44d0-408d-8d10-fd635e3779c9
Nov 29 07:25:03 compute-0 nova_compute[187185]: 2025-11-29 07:25:03.502 187189 DEBUG oslo_concurrency.processutils [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f54dd85e52fe479e36220a2e2d112289f5828e52.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:25:03 compute-0 nova_compute[187185]: 2025-11-29 07:25:03.548 187189 DEBUG oslo_concurrency.lockutils [None req-e780857d-3ef6-4259-8989-007f87322a8e 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "2702fe48-44d0-408d-8d10-fd635e3779c9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.842s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:25:03 compute-0 nova_compute[187185]: 2025-11-29 07:25:03.578 187189 DEBUG oslo_concurrency.processutils [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f54dd85e52fe479e36220a2e2d112289f5828e52.part --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:25:03 compute-0 nova_compute[187185]: 2025-11-29 07:25:03.579 187189 DEBUG nova.virt.images [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] 0e7bea15-406f-4109-be7a-667834e4aa26 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Nov 29 07:25:03 compute-0 nova_compute[187185]: 2025-11-29 07:25:03.580 187189 DEBUG nova.privsep.utils [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 29 07:25:03 compute-0 nova_compute[187185]: 2025-11-29 07:25:03.581 187189 DEBUG oslo_concurrency.processutils [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/f54dd85e52fe479e36220a2e2d112289f5828e52.part /var/lib/nova/instances/_base/f54dd85e52fe479e36220a2e2d112289f5828e52.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:25:03 compute-0 nova_compute[187185]: 2025-11-29 07:25:03.812 187189 DEBUG oslo_concurrency.processutils [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/f54dd85e52fe479e36220a2e2d112289f5828e52.part /var/lib/nova/instances/_base/f54dd85e52fe479e36220a2e2d112289f5828e52.converted" returned: 0 in 0.231s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:25:03 compute-0 nova_compute[187185]: 2025-11-29 07:25:03.819 187189 DEBUG oslo_concurrency.processutils [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f54dd85e52fe479e36220a2e2d112289f5828e52.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:25:03 compute-0 nova_compute[187185]: 2025-11-29 07:25:03.876 187189 DEBUG oslo_concurrency.processutils [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f54dd85e52fe479e36220a2e2d112289f5828e52.converted --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:25:03 compute-0 nova_compute[187185]: 2025-11-29 07:25:03.878 187189 DEBUG oslo_concurrency.lockutils [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "f54dd85e52fe479e36220a2e2d112289f5828e52" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.777s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:25:03 compute-0 nova_compute[187185]: 2025-11-29 07:25:03.898 187189 DEBUG nova.compute.manager [req-d91d8e39-e9c0-4d42-bd92-46cdc7efae72 req-090e90b2-0047-4c3c-9cf1-35a6b215348e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Received event network-vif-plugged-a1e67d00-8650-44ba-b75d-07f55b8d8810 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:25:03 compute-0 nova_compute[187185]: 2025-11-29 07:25:03.899 187189 DEBUG oslo_concurrency.lockutils [req-d91d8e39-e9c0-4d42-bd92-46cdc7efae72 req-090e90b2-0047-4c3c-9cf1-35a6b215348e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5f11adcd-958a-4269-905d-a017406505f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:25:03 compute-0 nova_compute[187185]: 2025-11-29 07:25:03.899 187189 DEBUG oslo_concurrency.lockutils [req-d91d8e39-e9c0-4d42-bd92-46cdc7efae72 req-090e90b2-0047-4c3c-9cf1-35a6b215348e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f11adcd-958a-4269-905d-a017406505f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:25:03 compute-0 nova_compute[187185]: 2025-11-29 07:25:03.899 187189 DEBUG oslo_concurrency.lockutils [req-d91d8e39-e9c0-4d42-bd92-46cdc7efae72 req-090e90b2-0047-4c3c-9cf1-35a6b215348e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f11adcd-958a-4269-905d-a017406505f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:25:03 compute-0 nova_compute[187185]: 2025-11-29 07:25:03.900 187189 DEBUG nova.compute.manager [req-d91d8e39-e9c0-4d42-bd92-46cdc7efae72 req-090e90b2-0047-4c3c-9cf1-35a6b215348e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] No waiting events found dispatching network-vif-plugged-a1e67d00-8650-44ba-b75d-07f55b8d8810 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:25:03 compute-0 nova_compute[187185]: 2025-11-29 07:25:03.900 187189 WARNING nova.compute.manager [req-d91d8e39-e9c0-4d42-bd92-46cdc7efae72 req-090e90b2-0047-4c3c-9cf1-35a6b215348e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Received unexpected event network-vif-plugged-a1e67d00-8650-44ba-b75d-07f55b8d8810 for instance with vm_state active and task_state rescuing.
Nov 29 07:25:03 compute-0 nova_compute[187185]: 2025-11-29 07:25:03.900 187189 DEBUG nova.compute.manager [req-d91d8e39-e9c0-4d42-bd92-46cdc7efae72 req-090e90b2-0047-4c3c-9cf1-35a6b215348e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Received event network-vif-deleted-3484baf0-bfbb-4b67-b841-a369f9a2c534 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:25:03 compute-0 nova_compute[187185]: 2025-11-29 07:25:03.901 187189 DEBUG oslo_concurrency.lockutils [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "f54dd85e52fe479e36220a2e2d112289f5828e52" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:25:03 compute-0 nova_compute[187185]: 2025-11-29 07:25:03.902 187189 DEBUG oslo_concurrency.lockutils [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "f54dd85e52fe479e36220a2e2d112289f5828e52" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:25:03 compute-0 nova_compute[187185]: 2025-11-29 07:25:03.911 187189 DEBUG oslo_concurrency.processutils [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f54dd85e52fe479e36220a2e2d112289f5828e52 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:25:03 compute-0 nova_compute[187185]: 2025-11-29 07:25:03.968 187189 DEBUG oslo_concurrency.processutils [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f54dd85e52fe479e36220a2e2d112289f5828e52 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:25:03 compute-0 nova_compute[187185]: 2025-11-29 07:25:03.970 187189 DEBUG oslo_concurrency.processutils [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f54dd85e52fe479e36220a2e2d112289f5828e52,backing_fmt=raw /var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/disk.rescue execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.013 187189 DEBUG oslo_concurrency.processutils [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f54dd85e52fe479e36220a2e2d112289f5828e52,backing_fmt=raw /var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/disk.rescue" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.015 187189 DEBUG oslo_concurrency.lockutils [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "f54dd85e52fe479e36220a2e2d112289f5828e52" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.015 187189 DEBUG nova.objects.instance [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lazy-loading 'migration_context' on Instance uuid 5f11adcd-958a-4269-905d-a017406505f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.035 187189 DEBUG nova.virt.libvirt.driver [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.039 187189 DEBUG nova.virt.libvirt.driver [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Start _get_guest_xml network_info=[{"id": "a1e67d00-8650-44ba-b75d-07f55b8d8810", "address": "fa:16:3e:39:9f:d3", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "vif_mac": "fa:16:3e:39:9f:d3"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1e67d00-86", "ovs_interfaceid": "a1e67d00-8650-44ba-b75d-07f55b8d8810", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'scsi', 'dev': 'sdb', 'type': 'disk'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '0e7bea15-406f-4109-be7a-667834e4aa26', 'kernel_id': '', 'ramdisk_id': ''} block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.040 187189 DEBUG nova.objects.instance [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lazy-loading 'resources' on Instance uuid 5f11adcd-958a-4269-905d-a017406505f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.076 187189 WARNING nova.virt.libvirt.driver [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.085 187189 DEBUG nova.virt.libvirt.host [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.087 187189 DEBUG nova.virt.libvirt.host [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.092 187189 DEBUG nova.virt.libvirt.host [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.094 187189 DEBUG nova.virt.libvirt.host [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.096 187189 DEBUG nova.virt.libvirt.driver [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.096 187189 DEBUG nova.virt.hardware [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.097 187189 DEBUG nova.virt.hardware [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.097 187189 DEBUG nova.virt.hardware [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.097 187189 DEBUG nova.virt.hardware [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.098 187189 DEBUG nova.virt.hardware [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.098 187189 DEBUG nova.virt.hardware [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.098 187189 DEBUG nova.virt.hardware [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.099 187189 DEBUG nova.virt.hardware [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.099 187189 DEBUG nova.virt.hardware [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.099 187189 DEBUG nova.virt.hardware [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.100 187189 DEBUG nova.virt.hardware [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.100 187189 DEBUG nova.objects.instance [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 5f11adcd-958a-4269-905d-a017406505f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.124 187189 DEBUG oslo_concurrency.processutils [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.185 187189 DEBUG oslo_concurrency.processutils [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/disk.config --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.187 187189 DEBUG oslo_concurrency.lockutils [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "/var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.187 187189 DEBUG oslo_concurrency.lockutils [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "/var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.188 187189 DEBUG oslo_concurrency.lockutils [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "/var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.189 187189 DEBUG nova.virt.libvirt.vif [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:24:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1816602290',display_name='tempest-ServerStableDeviceRescueTest-server-1816602290',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-1816602290',id=123,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:24:41Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ac3bb322fa744e099b38e08abe12d0e2',ramdisk_id='',reservation_id='r-6vcqx0xs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_m
odel='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-2012111838',owner_user_name='tempest-ServerStableDeviceRescueTest-2012111838-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:24:51Z,user_data=None,user_id='5be41a8530314f83bbecbb74b9276f2d',uuid=5f11adcd-958a-4269-905d-a017406505f0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a1e67d00-8650-44ba-b75d-07f55b8d8810", "address": "fa:16:3e:39:9f:d3", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "vif_mac": "fa:16:3e:39:9f:d3"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1e67d00-86", "ovs_interfaceid": "a1e67d00-8650-44ba-b75d-07f55b8d8810", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.190 187189 DEBUG nova.network.os_vif_util [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Converting VIF {"id": "a1e67d00-8650-44ba-b75d-07f55b8d8810", "address": "fa:16:3e:39:9f:d3", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "vif_mac": "fa:16:3e:39:9f:d3"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1e67d00-86", "ovs_interfaceid": "a1e67d00-8650-44ba-b75d-07f55b8d8810", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.191 187189 DEBUG nova.network.os_vif_util [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:39:9f:d3,bridge_name='br-int',has_traffic_filtering=True,id=a1e67d00-8650-44ba-b75d-07f55b8d8810,network=Network(240f16d8-602b-4aa1-8edb-e3a8d3674e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa1e67d00-86') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.191 187189 DEBUG nova.objects.instance [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5f11adcd-958a-4269-905d-a017406505f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.206 187189 DEBUG nova.virt.libvirt.driver [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:25:04 compute-0 nova_compute[187185]:   <uuid>5f11adcd-958a-4269-905d-a017406505f0</uuid>
Nov 29 07:25:04 compute-0 nova_compute[187185]:   <name>instance-0000007b</name>
Nov 29 07:25:04 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 07:25:04 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:25:04 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:25:04 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:       <nova:name>tempest-ServerStableDeviceRescueTest-server-1816602290</nova:name>
Nov 29 07:25:04 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:25:04</nova:creationTime>
Nov 29 07:25:04 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 07:25:04 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 07:25:04 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:25:04 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:25:04 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:25:04 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:25:04 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:25:04 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:25:04 compute-0 nova_compute[187185]:         <nova:user uuid="5be41a8530314f83bbecbb74b9276f2d">tempest-ServerStableDeviceRescueTest-2012111838-project-member</nova:user>
Nov 29 07:25:04 compute-0 nova_compute[187185]:         <nova:project uuid="ac3bb322fa744e099b38e08abe12d0e2">tempest-ServerStableDeviceRescueTest-2012111838</nova:project>
Nov 29 07:25:04 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:25:04 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 07:25:04 compute-0 nova_compute[187185]:         <nova:port uuid="a1e67d00-8650-44ba-b75d-07f55b8d8810">
Nov 29 07:25:04 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:25:04 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:25:04 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:25:04 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <system>
Nov 29 07:25:04 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:25:04 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:25:04 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:25:04 compute-0 nova_compute[187185]:       <entry name="serial">5f11adcd-958a-4269-905d-a017406505f0</entry>
Nov 29 07:25:04 compute-0 nova_compute[187185]:       <entry name="uuid">5f11adcd-958a-4269-905d-a017406505f0</entry>
Nov 29 07:25:04 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     </system>
Nov 29 07:25:04 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:25:04 compute-0 nova_compute[187185]:   <os>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:   </os>
Nov 29 07:25:04 compute-0 nova_compute[187185]:   <features>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:   </features>
Nov 29 07:25:04 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:25:04 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:25:04 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:25:04 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/disk"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:25:04 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/disk.config"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:25:04 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/disk.rescue"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:       <target dev="sdb" bus="scsi"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:       <boot order="1"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:25:04 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:39:9f:d3"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:       <target dev="tapa1e67d00-86"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:25:04 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/console.log" append="off"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <video>
Nov 29 07:25:04 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     </video>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:25:04 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:25:04 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:25:04 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:25:04 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:25:04 compute-0 nova_compute[187185]: </domain>
Nov 29 07:25:04 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.222 187189 INFO nova.virt.libvirt.driver [-] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Instance destroyed successfully.
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.298 187189 DEBUG nova.virt.libvirt.driver [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.299 187189 DEBUG nova.virt.libvirt.driver [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.299 187189 DEBUG nova.virt.libvirt.driver [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] No BDM found with device name sdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.299 187189 DEBUG nova.virt.libvirt.driver [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] No VIF found with MAC fa:16:3e:39:9f:d3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.300 187189 INFO nova.virt.libvirt.driver [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Using config drive
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.315 187189 DEBUG nova.objects.instance [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 5f11adcd-958a-4269-905d-a017406505f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.341 187189 DEBUG nova.objects.instance [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lazy-loading 'keypairs' on Instance uuid 5f11adcd-958a-4269-905d-a017406505f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.583 187189 DEBUG nova.compute.manager [req-d5153a97-681b-44c5-ab6c-0b2d0a542809 req-bab2b541-b4a3-4e86-a7fe-955a79ae896d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Received event network-vif-plugged-3484baf0-bfbb-4b67-b841-a369f9a2c534 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.584 187189 DEBUG oslo_concurrency.lockutils [req-d5153a97-681b-44c5-ab6c-0b2d0a542809 req-bab2b541-b4a3-4e86-a7fe-955a79ae896d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2702fe48-44d0-408d-8d10-fd635e3779c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.585 187189 DEBUG oslo_concurrency.lockutils [req-d5153a97-681b-44c5-ab6c-0b2d0a542809 req-bab2b541-b4a3-4e86-a7fe-955a79ae896d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2702fe48-44d0-408d-8d10-fd635e3779c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.586 187189 DEBUG oslo_concurrency.lockutils [req-d5153a97-681b-44c5-ab6c-0b2d0a542809 req-bab2b541-b4a3-4e86-a7fe-955a79ae896d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2702fe48-44d0-408d-8d10-fd635e3779c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.586 187189 DEBUG nova.compute.manager [req-d5153a97-681b-44c5-ab6c-0b2d0a542809 req-bab2b541-b4a3-4e86-a7fe-955a79ae896d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] No waiting events found dispatching network-vif-plugged-3484baf0-bfbb-4b67-b841-a369f9a2c534 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.587 187189 WARNING nova.compute.manager [req-d5153a97-681b-44c5-ab6c-0b2d0a542809 req-bab2b541-b4a3-4e86-a7fe-955a79ae896d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Received unexpected event network-vif-plugged-3484baf0-bfbb-4b67-b841-a369f9a2c534 for instance with vm_state deleted and task_state None.
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.791 187189 INFO nova.virt.libvirt.driver [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Creating config drive at /var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/disk.config.rescue
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.798 187189 DEBUG oslo_concurrency.processutils [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpi936a46p execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:25:04 compute-0 nova_compute[187185]: 2025-11-29 07:25:04.932 187189 DEBUG oslo_concurrency.processutils [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpi936a46p" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:25:05 compute-0 kernel: tapa1e67d00-86: entered promiscuous mode
Nov 29 07:25:05 compute-0 NetworkManager[55227]: <info>  [1764401105.0344] manager: (tapa1e67d00-86): new Tun device (/org/freedesktop/NetworkManager/Devices/198)
Nov 29 07:25:05 compute-0 ovn_controller[95281]: 2025-11-29T07:25:05Z|00371|binding|INFO|Claiming lport a1e67d00-8650-44ba-b75d-07f55b8d8810 for this chassis.
Nov 29 07:25:05 compute-0 ovn_controller[95281]: 2025-11-29T07:25:05Z|00372|binding|INFO|a1e67d00-8650-44ba-b75d-07f55b8d8810: Claiming fa:16:3e:39:9f:d3 10.100.0.11
Nov 29 07:25:05 compute-0 nova_compute[187185]: 2025-11-29 07:25:05.035 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:05.044 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:39:9f:d3 10.100.0.11'], port_security=['fa:16:3e:39:9f:d3 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '5f11adcd-958a-4269-905d-a017406505f0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'neutron:revision_number': '5', 'neutron:security_group_ids': '339ff6a8-b11e-4176-931b-a82ab9688ace', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e0beb853-8490-4e92-a787-adc66ba47efc, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=a1e67d00-8650-44ba-b75d-07f55b8d8810) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:05.046 104254 INFO neutron.agent.ovn.metadata.agent [-] Port a1e67d00-8650-44ba-b75d-07f55b8d8810 in datapath 240f16d8-602b-4aa1-8edb-e3a8d3674e39 bound to our chassis
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:05.050 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 240f16d8-602b-4aa1-8edb-e3a8d3674e39
Nov 29 07:25:05 compute-0 ovn_controller[95281]: 2025-11-29T07:25:05Z|00373|binding|INFO|Setting lport a1e67d00-8650-44ba-b75d-07f55b8d8810 ovn-installed in OVS
Nov 29 07:25:05 compute-0 ovn_controller[95281]: 2025-11-29T07:25:05Z|00374|binding|INFO|Setting lport a1e67d00-8650-44ba-b75d-07f55b8d8810 up in Southbound
Nov 29 07:25:05 compute-0 nova_compute[187185]: 2025-11-29 07:25:05.072 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:05 compute-0 nova_compute[187185]: 2025-11-29 07:25:05.080 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:05.077 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[4260ffc6-e2f7-4a05-94fb-5bdff3bac491]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:05.082 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap240f16d8-61 in ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:05.084 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap240f16d8-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:05.084 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[df78c014-e149-46e0-b2d3-721875f135eb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:05.086 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[3787eed8-b65a-4f27-bed8-ea1e1b57f19e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:05 compute-0 systemd-udevd[234596]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:05.099 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[f9e6e86a-7a57-4cdc-8fa8-b536cf972c15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:05 compute-0 systemd-machined[153486]: New machine qemu-49-instance-0000007b.
Nov 29 07:25:05 compute-0 systemd[1]: Started Virtual Machine qemu-49-instance-0000007b.
Nov 29 07:25:05 compute-0 NetworkManager[55227]: <info>  [1764401105.1069] device (tapa1e67d00-86): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:25:05 compute-0 NetworkManager[55227]: <info>  [1764401105.1077] device (tapa1e67d00-86): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:05.212 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[568db742-a516-474c-b1f0-b61e30ce01b6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:05.253 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[2f3e0259-50de-48f8-9053-362a8b059ef3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:05 compute-0 NetworkManager[55227]: <info>  [1764401105.2630] manager: (tap240f16d8-60): new Veth device (/org/freedesktop/NetworkManager/Devices/199)
Nov 29 07:25:05 compute-0 systemd-udevd[234600]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:05.261 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[3dfc0c0b-6d2f-4345-a9d0-9296d821d8d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:05 compute-0 podman[234586]: 2025-11-29 07:25:05.275231293 +0000 UTC m=+0.204035876 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:05.307 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[bec178f1-7997-41ab-bfd2-d65334408624]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:05.310 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[48d0ce1c-a6d7-48a1-8e84-5108899f1faf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:05 compute-0 NetworkManager[55227]: <info>  [1764401105.3413] device (tap240f16d8-60): carrier: link connected
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:05.350 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[952b5aba-467b-4218-8ac9-713e0b349941]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:05.369 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6bfbb3e6-cf8c-43d5-9d62-1bf7fcd99574]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap240f16d8-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:aa:7e:40'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 121], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 655300, 'reachable_time': 34989, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234644, 'error': None, 'target': 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:05.389 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[f0a7fac5-4a62-4b8f-91fb-0a192f95f6c6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feaa:7e40'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 655300, 'tstamp': 655300}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234645, 'error': None, 'target': 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:05.408 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[5a443eb8-9a7c-49ee-a2b4-c562e2d00f11]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap240f16d8-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:aa:7e:40'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 121], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 655300, 'reachable_time': 34989, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234646, 'error': None, 'target': 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:05.449 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6dc0ba90-8eaa-4cd4-a118-c49ae8c7def0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:05.523 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[1cc409b8-9fcb-457a-af86-3115c34626fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:05.525 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap240f16d8-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:05.525 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:05.526 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap240f16d8-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:25:05 compute-0 nova_compute[187185]: 2025-11-29 07:25:05.528 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:05 compute-0 kernel: tap240f16d8-60: entered promiscuous mode
Nov 29 07:25:05 compute-0 NetworkManager[55227]: <info>  [1764401105.5291] manager: (tap240f16d8-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/200)
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:05.531 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap240f16d8-60, col_values=(('external_ids', {'iface-id': '0d1614aa-fbbf-4ad2-9ee2-8948d583ddf6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:25:05 compute-0 ovn_controller[95281]: 2025-11-29T07:25:05Z|00375|binding|INFO|Releasing lport 0d1614aa-fbbf-4ad2-9ee2-8948d583ddf6 from this chassis (sb_readonly=0)
Nov 29 07:25:05 compute-0 nova_compute[187185]: 2025-11-29 07:25:05.532 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:05 compute-0 nova_compute[187185]: 2025-11-29 07:25:05.544 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:05.545 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/240f16d8-602b-4aa1-8edb-e3a8d3674e39.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/240f16d8-602b-4aa1-8edb-e3a8d3674e39.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:05.547 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[42eb3c1f-25ee-48d9-9471-f322e90b5509]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:05.548 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-240f16d8-602b-4aa1-8edb-e3a8d3674e39
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/240f16d8-602b-4aa1-8edb-e3a8d3674e39.pid.haproxy
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID 240f16d8-602b-4aa1-8edb-e3a8d3674e39
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:25:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:05.549 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'env', 'PROCESS_TAG=haproxy-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/240f16d8-602b-4aa1-8edb-e3a8d3674e39.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:25:05 compute-0 nova_compute[187185]: 2025-11-29 07:25:05.967 187189 DEBUG nova.compute.manager [req-138aa92d-1cd6-480f-9631-89fdde6df544 req-0681938c-65a8-44ab-b571-9b32236d10b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Received event network-vif-plugged-a1e67d00-8650-44ba-b75d-07f55b8d8810 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:25:05 compute-0 nova_compute[187185]: 2025-11-29 07:25:05.968 187189 DEBUG oslo_concurrency.lockutils [req-138aa92d-1cd6-480f-9631-89fdde6df544 req-0681938c-65a8-44ab-b571-9b32236d10b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5f11adcd-958a-4269-905d-a017406505f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:25:05 compute-0 nova_compute[187185]: 2025-11-29 07:25:05.969 187189 DEBUG oslo_concurrency.lockutils [req-138aa92d-1cd6-480f-9631-89fdde6df544 req-0681938c-65a8-44ab-b571-9b32236d10b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f11adcd-958a-4269-905d-a017406505f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:25:05 compute-0 nova_compute[187185]: 2025-11-29 07:25:05.969 187189 DEBUG oslo_concurrency.lockutils [req-138aa92d-1cd6-480f-9631-89fdde6df544 req-0681938c-65a8-44ab-b571-9b32236d10b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f11adcd-958a-4269-905d-a017406505f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:25:05 compute-0 nova_compute[187185]: 2025-11-29 07:25:05.969 187189 DEBUG nova.compute.manager [req-138aa92d-1cd6-480f-9631-89fdde6df544 req-0681938c-65a8-44ab-b571-9b32236d10b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] No waiting events found dispatching network-vif-plugged-a1e67d00-8650-44ba-b75d-07f55b8d8810 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:25:05 compute-0 nova_compute[187185]: 2025-11-29 07:25:05.969 187189 WARNING nova.compute.manager [req-138aa92d-1cd6-480f-9631-89fdde6df544 req-0681938c-65a8-44ab-b571-9b32236d10b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Received unexpected event network-vif-plugged-a1e67d00-8650-44ba-b75d-07f55b8d8810 for instance with vm_state active and task_state rescuing.
Nov 29 07:25:06 compute-0 podman[234678]: 2025-11-29 07:25:05.924258771 +0000 UTC m=+0.026451528 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:25:06 compute-0 nova_compute[187185]: 2025-11-29 07:25:06.197 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:06 compute-0 nova_compute[187185]: 2025-11-29 07:25:06.321 187189 DEBUG nova.virt.libvirt.host [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Removed pending event for 5f11adcd-958a-4269-905d-a017406505f0 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Nov 29 07:25:06 compute-0 nova_compute[187185]: 2025-11-29 07:25:06.322 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764401106.3192694, 5f11adcd-958a-4269-905d-a017406505f0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:25:06 compute-0 nova_compute[187185]: 2025-11-29 07:25:06.322 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5f11adcd-958a-4269-905d-a017406505f0] VM Resumed (Lifecycle Event)
Nov 29 07:25:06 compute-0 nova_compute[187185]: 2025-11-29 07:25:06.338 187189 DEBUG nova.compute.manager [None req-2fb51c01-0d99-42a6-86c0-b6bb66198a7e 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:25:06 compute-0 nova_compute[187185]: 2025-11-29 07:25:06.347 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:25:06 compute-0 nova_compute[187185]: 2025-11-29 07:25:06.354 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:25:06 compute-0 nova_compute[187185]: 2025-11-29 07:25:06.394 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5f11adcd-958a-4269-905d-a017406505f0] During sync_power_state the instance has a pending task (rescuing). Skip.
Nov 29 07:25:06 compute-0 nova_compute[187185]: 2025-11-29 07:25:06.394 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764401106.320349, 5f11adcd-958a-4269-905d-a017406505f0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:25:06 compute-0 nova_compute[187185]: 2025-11-29 07:25:06.395 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5f11adcd-958a-4269-905d-a017406505f0] VM Started (Lifecycle Event)
Nov 29 07:25:06 compute-0 podman[234678]: 2025-11-29 07:25:06.444623998 +0000 UTC m=+0.546816765 container create 73a66039136ca9b46c1f15cb7816017d6343364983179d93514488b356c539aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 07:25:06 compute-0 nova_compute[187185]: 2025-11-29 07:25:06.465 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:25:06 compute-0 nova_compute[187185]: 2025-11-29 07:25:06.472 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:25:06 compute-0 systemd[1]: Started libpod-conmon-73a66039136ca9b46c1f15cb7816017d6343364983179d93514488b356c539aa.scope.
Nov 29 07:25:06 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:25:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/74aeeaa8d974bc06e491fe15c661f88a3b2a9fb0c5e8e2a6cbc224cce4d630f4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:25:06 compute-0 podman[234678]: 2025-11-29 07:25:06.556273427 +0000 UTC m=+0.658466194 container init 73a66039136ca9b46c1f15cb7816017d6343364983179d93514488b356c539aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 29 07:25:06 compute-0 podman[234678]: 2025-11-29 07:25:06.561946907 +0000 UTC m=+0.664139644 container start 73a66039136ca9b46c1f15cb7816017d6343364983179d93514488b356c539aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 29 07:25:06 compute-0 neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39[234701]: [NOTICE]   (234705) : New worker (234707) forked
Nov 29 07:25:06 compute-0 neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39[234701]: [NOTICE]   (234705) : Loading success.
Nov 29 07:25:07 compute-0 nova_compute[187185]: 2025-11-29 07:25:07.037 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:08 compute-0 nova_compute[187185]: 2025-11-29 07:25:08.122 187189 DEBUG nova.compute.manager [req-30bfc431-3526-406f-810c-1070f7349e65 req-2a79c8e3-3a54-4468-b6da-a840e358596f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Received event network-vif-plugged-a1e67d00-8650-44ba-b75d-07f55b8d8810 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:25:08 compute-0 nova_compute[187185]: 2025-11-29 07:25:08.124 187189 DEBUG oslo_concurrency.lockutils [req-30bfc431-3526-406f-810c-1070f7349e65 req-2a79c8e3-3a54-4468-b6da-a840e358596f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5f11adcd-958a-4269-905d-a017406505f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:25:08 compute-0 nova_compute[187185]: 2025-11-29 07:25:08.125 187189 DEBUG oslo_concurrency.lockutils [req-30bfc431-3526-406f-810c-1070f7349e65 req-2a79c8e3-3a54-4468-b6da-a840e358596f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f11adcd-958a-4269-905d-a017406505f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:25:08 compute-0 nova_compute[187185]: 2025-11-29 07:25:08.125 187189 DEBUG oslo_concurrency.lockutils [req-30bfc431-3526-406f-810c-1070f7349e65 req-2a79c8e3-3a54-4468-b6da-a840e358596f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f11adcd-958a-4269-905d-a017406505f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:25:08 compute-0 nova_compute[187185]: 2025-11-29 07:25:08.126 187189 DEBUG nova.compute.manager [req-30bfc431-3526-406f-810c-1070f7349e65 req-2a79c8e3-3a54-4468-b6da-a840e358596f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] No waiting events found dispatching network-vif-plugged-a1e67d00-8650-44ba-b75d-07f55b8d8810 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:25:08 compute-0 nova_compute[187185]: 2025-11-29 07:25:08.127 187189 WARNING nova.compute.manager [req-30bfc431-3526-406f-810c-1070f7349e65 req-2a79c8e3-3a54-4468-b6da-a840e358596f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Received unexpected event network-vif-plugged-a1e67d00-8650-44ba-b75d-07f55b8d8810 for instance with vm_state rescued and task_state unrescuing.
Nov 29 07:25:08 compute-0 nova_compute[187185]: 2025-11-29 07:25:08.201 187189 INFO nova.compute.manager [None req-0d8421d2-7b50-4435-b590-67124d97cf43 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Unrescuing
Nov 29 07:25:08 compute-0 nova_compute[187185]: 2025-11-29 07:25:08.202 187189 DEBUG oslo_concurrency.lockutils [None req-0d8421d2-7b50-4435-b590-67124d97cf43 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "refresh_cache-5f11adcd-958a-4269-905d-a017406505f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:25:08 compute-0 nova_compute[187185]: 2025-11-29 07:25:08.203 187189 DEBUG oslo_concurrency.lockutils [None req-0d8421d2-7b50-4435-b590-67124d97cf43 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquired lock "refresh_cache-5f11adcd-958a-4269-905d-a017406505f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:25:08 compute-0 nova_compute[187185]: 2025-11-29 07:25:08.204 187189 DEBUG nova.network.neutron [None req-0d8421d2-7b50-4435-b590-67124d97cf43 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:25:08 compute-0 podman[234716]: 2025-11-29 07:25:08.812569621 +0000 UTC m=+0.074305827 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:25:08 compute-0 podman[234717]: 2025-11-29 07:25:08.822404218 +0000 UTC m=+0.084088883 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 07:25:10 compute-0 nova_compute[187185]: 2025-11-29 07:25:10.268 187189 DEBUG nova.network.neutron [None req-0d8421d2-7b50-4435-b590-67124d97cf43 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Updating instance_info_cache with network_info: [{"id": "a1e67d00-8650-44ba-b75d-07f55b8d8810", "address": "fa:16:3e:39:9f:d3", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1e67d00-86", "ovs_interfaceid": "a1e67d00-8650-44ba-b75d-07f55b8d8810", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:25:10 compute-0 nova_compute[187185]: 2025-11-29 07:25:10.286 187189 DEBUG oslo_concurrency.lockutils [None req-0d8421d2-7b50-4435-b590-67124d97cf43 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Releasing lock "refresh_cache-5f11adcd-958a-4269-905d-a017406505f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:25:10 compute-0 nova_compute[187185]: 2025-11-29 07:25:10.288 187189 DEBUG nova.objects.instance [None req-0d8421d2-7b50-4435-b590-67124d97cf43 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lazy-loading 'flavor' on Instance uuid 5f11adcd-958a-4269-905d-a017406505f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:25:10 compute-0 kernel: tapa1e67d00-86 (unregistering): left promiscuous mode
Nov 29 07:25:10 compute-0 NetworkManager[55227]: <info>  [1764401110.3649] device (tapa1e67d00-86): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:25:10 compute-0 nova_compute[187185]: 2025-11-29 07:25:10.379 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:10 compute-0 ovn_controller[95281]: 2025-11-29T07:25:10Z|00376|binding|INFO|Releasing lport a1e67d00-8650-44ba-b75d-07f55b8d8810 from this chassis (sb_readonly=0)
Nov 29 07:25:10 compute-0 ovn_controller[95281]: 2025-11-29T07:25:10Z|00377|binding|INFO|Setting lport a1e67d00-8650-44ba-b75d-07f55b8d8810 down in Southbound
Nov 29 07:25:10 compute-0 ovn_controller[95281]: 2025-11-29T07:25:10Z|00378|binding|INFO|Removing iface tapa1e67d00-86 ovn-installed in OVS
Nov 29 07:25:10 compute-0 nova_compute[187185]: 2025-11-29 07:25:10.381 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:10.386 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:39:9f:d3 10.100.0.11'], port_security=['fa:16:3e:39:9f:d3 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '5f11adcd-958a-4269-905d-a017406505f0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'neutron:revision_number': '6', 'neutron:security_group_ids': '339ff6a8-b11e-4176-931b-a82ab9688ace', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e0beb853-8490-4e92-a787-adc66ba47efc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=a1e67d00-8650-44ba-b75d-07f55b8d8810) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:25:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:10.388 104254 INFO neutron.agent.ovn.metadata.agent [-] Port a1e67d00-8650-44ba-b75d-07f55b8d8810 in datapath 240f16d8-602b-4aa1-8edb-e3a8d3674e39 unbound from our chassis
Nov 29 07:25:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:10.390 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 240f16d8-602b-4aa1-8edb-e3a8d3674e39, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:25:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:10.391 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[bb325424-b10d-43f9-a8f0-a145dcdd9871]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:10.392 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39 namespace which is not needed anymore
Nov 29 07:25:10 compute-0 nova_compute[187185]: 2025-11-29 07:25:10.398 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:10 compute-0 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d0000007b.scope: Deactivated successfully.
Nov 29 07:25:10 compute-0 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d0000007b.scope: Consumed 5.229s CPU time.
Nov 29 07:25:10 compute-0 systemd-machined[153486]: Machine qemu-49-instance-0000007b terminated.
Nov 29 07:25:10 compute-0 neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39[234701]: [NOTICE]   (234705) : haproxy version is 2.8.14-c23fe91
Nov 29 07:25:10 compute-0 neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39[234701]: [NOTICE]   (234705) : path to executable is /usr/sbin/haproxy
Nov 29 07:25:10 compute-0 neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39[234701]: [WARNING]  (234705) : Exiting Master process...
Nov 29 07:25:10 compute-0 neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39[234701]: [WARNING]  (234705) : Exiting Master process...
Nov 29 07:25:10 compute-0 neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39[234701]: [ALERT]    (234705) : Current worker (234707) exited with code 143 (Terminated)
Nov 29 07:25:10 compute-0 neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39[234701]: [WARNING]  (234705) : All workers exited. Exiting... (0)
Nov 29 07:25:10 compute-0 systemd[1]: libpod-73a66039136ca9b46c1f15cb7816017d6343364983179d93514488b356c539aa.scope: Deactivated successfully.
Nov 29 07:25:10 compute-0 podman[234791]: 2025-11-29 07:25:10.542749264 +0000 UTC m=+0.044703092 container died 73a66039136ca9b46c1f15cb7816017d6343364983179d93514488b356c539aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 29 07:25:10 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-73a66039136ca9b46c1f15cb7816017d6343364983179d93514488b356c539aa-userdata-shm.mount: Deactivated successfully.
Nov 29 07:25:10 compute-0 systemd[1]: var-lib-containers-storage-overlay-74aeeaa8d974bc06e491fe15c661f88a3b2a9fb0c5e8e2a6cbc224cce4d630f4-merged.mount: Deactivated successfully.
Nov 29 07:25:10 compute-0 podman[234791]: 2025-11-29 07:25:10.601527932 +0000 UTC m=+0.103481760 container cleanup 73a66039136ca9b46c1f15cb7816017d6343364983179d93514488b356c539aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Nov 29 07:25:10 compute-0 systemd[1]: libpod-conmon-73a66039136ca9b46c1f15cb7816017d6343364983179d93514488b356c539aa.scope: Deactivated successfully.
Nov 29 07:25:10 compute-0 nova_compute[187185]: 2025-11-29 07:25:10.621 187189 INFO nova.virt.libvirt.driver [-] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Instance destroyed successfully.
Nov 29 07:25:10 compute-0 nova_compute[187185]: 2025-11-29 07:25:10.621 187189 DEBUG nova.objects.instance [None req-0d8421d2-7b50-4435-b590-67124d97cf43 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lazy-loading 'numa_topology' on Instance uuid 5f11adcd-958a-4269-905d-a017406505f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:25:10 compute-0 podman[234839]: 2025-11-29 07:25:10.688143575 +0000 UTC m=+0.055020913 container remove 73a66039136ca9b46c1f15cb7816017d6343364983179d93514488b356c539aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 29 07:25:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:10.696 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[f6fe6b23-db12-479a-8d70-b62fe3140b01]: (4, ('Sat Nov 29 07:25:10 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39 (73a66039136ca9b46c1f15cb7816017d6343364983179d93514488b356c539aa)\n73a66039136ca9b46c1f15cb7816017d6343364983179d93514488b356c539aa\nSat Nov 29 07:25:10 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39 (73a66039136ca9b46c1f15cb7816017d6343364983179d93514488b356c539aa)\n73a66039136ca9b46c1f15cb7816017d6343364983179d93514488b356c539aa\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:10.699 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[4f6c1ea1-608b-417a-b8b3-ad5af504f983]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:10.700 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap240f16d8-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:25:10 compute-0 nova_compute[187185]: 2025-11-29 07:25:10.702 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:10 compute-0 kernel: tap240f16d8-60: left promiscuous mode
Nov 29 07:25:10 compute-0 nova_compute[187185]: 2025-11-29 07:25:10.719 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:10.723 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[aec10422-8402-4945-87bf-5e7141efec2a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:10 compute-0 NetworkManager[55227]: <info>  [1764401110.7247] manager: (tapa1e67d00-86): new Tun device (/org/freedesktop/NetworkManager/Devices/201)
Nov 29 07:25:10 compute-0 systemd-udevd[234771]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:25:10 compute-0 kernel: tapa1e67d00-86: entered promiscuous mode
Nov 29 07:25:10 compute-0 nova_compute[187185]: 2025-11-29 07:25:10.729 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:10 compute-0 ovn_controller[95281]: 2025-11-29T07:25:10Z|00379|binding|INFO|Claiming lport a1e67d00-8650-44ba-b75d-07f55b8d8810 for this chassis.
Nov 29 07:25:10 compute-0 ovn_controller[95281]: 2025-11-29T07:25:10Z|00380|binding|INFO|a1e67d00-8650-44ba-b75d-07f55b8d8810: Claiming fa:16:3e:39:9f:d3 10.100.0.11
Nov 29 07:25:10 compute-0 NetworkManager[55227]: <info>  [1764401110.7359] device (tapa1e67d00-86): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:25:10 compute-0 NetworkManager[55227]: <info>  [1764401110.7365] device (tapa1e67d00-86): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:25:10 compute-0 ovn_controller[95281]: 2025-11-29T07:25:10Z|00381|binding|INFO|Setting lport a1e67d00-8650-44ba-b75d-07f55b8d8810 ovn-installed in OVS
Nov 29 07:25:10 compute-0 ovn_controller[95281]: 2025-11-29T07:25:10Z|00382|binding|INFO|Setting lport a1e67d00-8650-44ba-b75d-07f55b8d8810 up in Southbound
Nov 29 07:25:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:10.741 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:39:9f:d3 10.100.0.11'], port_security=['fa:16:3e:39:9f:d3 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '5f11adcd-958a-4269-905d-a017406505f0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'neutron:revision_number': '6', 'neutron:security_group_ids': '339ff6a8-b11e-4176-931b-a82ab9688ace', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e0beb853-8490-4e92-a787-adc66ba47efc, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=a1e67d00-8650-44ba-b75d-07f55b8d8810) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:25:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:10.742 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[2154aeba-77ee-4d25-a793-a006bf5bb2fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:10 compute-0 nova_compute[187185]: 2025-11-29 07:25:10.742 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:10.743 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[1d2f1815-1d14-4d25-a5b7-db52e4dd8cdc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:10 compute-0 nova_compute[187185]: 2025-11-29 07:25:10.744 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:10.758 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[f110b58b-35c3-4007-abee-4e651ee030f1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 655290, 'reachable_time': 16960, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234869, 'error': None, 'target': 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:10.760 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:25:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:10.760 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[0ed9bc2c-774d-4c79-a9e8-d88eb542ebf9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:10.760 104254 INFO neutron.agent.ovn.metadata.agent [-] Port a1e67d00-8650-44ba-b75d-07f55b8d8810 in datapath 240f16d8-602b-4aa1-8edb-e3a8d3674e39 unbound from our chassis
Nov 29 07:25:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:10.762 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 240f16d8-602b-4aa1-8edb-e3a8d3674e39
Nov 29 07:25:10 compute-0 systemd-machined[153486]: New machine qemu-50-instance-0000007b.
Nov 29 07:25:10 compute-0 systemd[1]: run-netns-ovnmeta\x2d240f16d8\x2d602b\x2d4aa1\x2d8edb\x2de3a8d3674e39.mount: Deactivated successfully.
Nov 29 07:25:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:10.771 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[a8ccc989-d964-4d9c-9322-1e1d73b2b7ba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:10.772 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap240f16d8-61 in ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:25:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:10.774 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap240f16d8-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:25:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:10.774 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[5b586a3e-3477-4156-b3b4-f057ad3f923d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:10.775 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[063cfb2b-ab3b-43b4-a21b-494cd661a775]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:10 compute-0 systemd[1]: Started Virtual Machine qemu-50-instance-0000007b.
Nov 29 07:25:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:10.785 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[6510e289-d83f-4599-8f76-e38a2a7a11a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:10.807 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[dbf9d064-46da-48ee-99a1-cd79bf7d76c4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:10.839 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[ee022ba9-5743-4aba-9f04-9ea4f17641cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:10.845 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[788361d5-f309-46ad-9cab-14dc91e82cad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:10 compute-0 NetworkManager[55227]: <info>  [1764401110.8463] manager: (tap240f16d8-60): new Veth device (/org/freedesktop/NetworkManager/Devices/202)
Nov 29 07:25:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:10.879 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[b52b6393-7316-4187-bb85-c8bfa8bcdf47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:10.883 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[27cd6e0b-af64-40bf-ae43-0f93b9e65736]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:10 compute-0 NetworkManager[55227]: <info>  [1764401110.9026] device (tap240f16d8-60): carrier: link connected
Nov 29 07:25:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:10.907 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[62df8f9b-deb7-4be2-a8b0-b378acc46221]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:10.925 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[8b3ff6ff-b9dc-4022-a3df-9d3a6b5ba273]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap240f16d8-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:aa:7e:40'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 124], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 655856, 'reachable_time': 41600, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234903, 'error': None, 'target': 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:10.937 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[8e70269a-12db-4023-9f5b-551a8ddd593b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feaa:7e40'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 655856, 'tstamp': 655856}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234904, 'error': None, 'target': 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:10.967 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[7a56af61-f2b6-4c56-a6cb-3c26937eb839]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap240f16d8-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:aa:7e:40'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 124], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 655856, 'reachable_time': 41600, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234905, 'error': None, 'target': 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:11 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:11.012 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[c23efc9d-14a9-4629-b99f-ae112ca1bfa4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:11 compute-0 ovn_controller[95281]: 2025-11-29T07:25:11Z|00048|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b1:6e:42 10.100.0.7
Nov 29 07:25:11 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:11.093 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[7f6b2e06-6ad7-490c-bcff-10c0561fd853]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:11 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:11.094 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap240f16d8-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:25:11 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:11.094 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:25:11 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:11.095 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap240f16d8-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:25:11 compute-0 kernel: tap240f16d8-60: entered promiscuous mode
Nov 29 07:25:11 compute-0 NetworkManager[55227]: <info>  [1764401111.0983] manager: (tap240f16d8-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/203)
Nov 29 07:25:11 compute-0 nova_compute[187185]: 2025-11-29 07:25:11.097 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:11 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:11.101 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap240f16d8-60, col_values=(('external_ids', {'iface-id': '0d1614aa-fbbf-4ad2-9ee2-8948d583ddf6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:25:11 compute-0 ovn_controller[95281]: 2025-11-29T07:25:11Z|00383|binding|INFO|Releasing lport 0d1614aa-fbbf-4ad2-9ee2-8948d583ddf6 from this chassis (sb_readonly=0)
Nov 29 07:25:11 compute-0 nova_compute[187185]: 2025-11-29 07:25:11.103 187189 DEBUG nova.virt.libvirt.host [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Removed pending event for 5f11adcd-958a-4269-905d-a017406505f0 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Nov 29 07:25:11 compute-0 nova_compute[187185]: 2025-11-29 07:25:11.105 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764401111.1025488, 5f11adcd-958a-4269-905d-a017406505f0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:25:11 compute-0 nova_compute[187185]: 2025-11-29 07:25:11.105 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5f11adcd-958a-4269-905d-a017406505f0] VM Resumed (Lifecycle Event)
Nov 29 07:25:11 compute-0 nova_compute[187185]: 2025-11-29 07:25:11.107 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:11 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:11.107 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/240f16d8-602b-4aa1-8edb-e3a8d3674e39.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/240f16d8-602b-4aa1-8edb-e3a8d3674e39.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:25:11 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:11.108 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[f612986a-9a5f-4214-912c-1f8fa25f8322]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:11 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:11.109 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:25:11 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:25:11 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:25:11 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-240f16d8-602b-4aa1-8edb-e3a8d3674e39
Nov 29 07:25:11 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:25:11 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:25:11 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:25:11 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/240f16d8-602b-4aa1-8edb-e3a8d3674e39.pid.haproxy
Nov 29 07:25:11 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:25:11 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:25:11 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:25:11 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:25:11 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:25:11 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:25:11 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:25:11 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:25:11 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:25:11 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:25:11 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:25:11 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:25:11 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:25:11 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:25:11 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:25:11 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:25:11 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:25:11 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:25:11 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:25:11 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:25:11 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID 240f16d8-602b-4aa1-8edb-e3a8d3674e39
Nov 29 07:25:11 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:25:11 compute-0 nova_compute[187185]: 2025-11-29 07:25:11.110 187189 DEBUG nova.compute.manager [None req-0d8421d2-7b50-4435-b590-67124d97cf43 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:25:11 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:11.110 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'env', 'PROCESS_TAG=haproxy-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/240f16d8-602b-4aa1-8edb-e3a8d3674e39.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:25:11 compute-0 nova_compute[187185]: 2025-11-29 07:25:11.117 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:11 compute-0 nova_compute[187185]: 2025-11-29 07:25:11.147 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:25:11 compute-0 nova_compute[187185]: 2025-11-29 07:25:11.149 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:25:11 compute-0 nova_compute[187185]: 2025-11-29 07:25:11.179 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5f11adcd-958a-4269-905d-a017406505f0] During sync_power_state the instance has a pending task (unrescuing). Skip.
Nov 29 07:25:11 compute-0 nova_compute[187185]: 2025-11-29 07:25:11.179 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764401111.1058953, 5f11adcd-958a-4269-905d-a017406505f0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:25:11 compute-0 nova_compute[187185]: 2025-11-29 07:25:11.180 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5f11adcd-958a-4269-905d-a017406505f0] VM Started (Lifecycle Event)
Nov 29 07:25:11 compute-0 nova_compute[187185]: 2025-11-29 07:25:11.202 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:11 compute-0 nova_compute[187185]: 2025-11-29 07:25:11.290 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:25:11 compute-0 nova_compute[187185]: 2025-11-29 07:25:11.300 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:25:11 compute-0 podman[234944]: 2025-11-29 07:25:11.523621002 +0000 UTC m=+0.071260991 container create 978450dd442a870e1f1d2fdc6c438c8950d308fb165d2cbbf6bc90afd8e60c5c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 29 07:25:11 compute-0 podman[234944]: 2025-11-29 07:25:11.481191315 +0000 UTC m=+0.028831384 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:25:11 compute-0 systemd[1]: Started libpod-conmon-978450dd442a870e1f1d2fdc6c438c8950d308fb165d2cbbf6bc90afd8e60c5c.scope.
Nov 29 07:25:11 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:25:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4c2f41f87dfbb1291e21fe35775c6228824079c81320e8d28a6c79bd1f224b19/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:25:11 compute-0 podman[234944]: 2025-11-29 07:25:11.685978942 +0000 UTC m=+0.233618921 container init 978450dd442a870e1f1d2fdc6c438c8950d308fb165d2cbbf6bc90afd8e60c5c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 29 07:25:11 compute-0 podman[234944]: 2025-11-29 07:25:11.692683421 +0000 UTC m=+0.240323410 container start 978450dd442a870e1f1d2fdc6c438c8950d308fb165d2cbbf6bc90afd8e60c5c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 29 07:25:11 compute-0 neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39[234959]: [NOTICE]   (234964) : New worker (234966) forked
Nov 29 07:25:11 compute-0 neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39[234959]: [NOTICE]   (234964) : Loading success.
Nov 29 07:25:12 compute-0 nova_compute[187185]: 2025-11-29 07:25:12.040 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:12 compute-0 nova_compute[187185]: 2025-11-29 07:25:12.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:25:12 compute-0 nova_compute[187185]: 2025-11-29 07:25:12.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:25:12 compute-0 nova_compute[187185]: 2025-11-29 07:25:12.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:25:12 compute-0 nova_compute[187185]: 2025-11-29 07:25:12.498 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "refresh_cache-7c10cb24-586c-4507-8169-8258d7136397" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:25:12 compute-0 nova_compute[187185]: 2025-11-29 07:25:12.498 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquired lock "refresh_cache-7c10cb24-586c-4507-8169-8258d7136397" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:25:12 compute-0 nova_compute[187185]: 2025-11-29 07:25:12.499 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 29 07:25:12 compute-0 nova_compute[187185]: 2025-11-29 07:25:12.499 187189 DEBUG nova.objects.instance [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 7c10cb24-586c-4507-8169-8258d7136397 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:25:12 compute-0 nova_compute[187185]: 2025-11-29 07:25:12.919 187189 DEBUG nova.compute.manager [req-19b940c0-a188-4eb4-a2b9-d5441d2f98ff req-4f33a9a7-a280-4b48-9042-5f1ea224b21e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Received event network-vif-unplugged-a1e67d00-8650-44ba-b75d-07f55b8d8810 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:25:12 compute-0 nova_compute[187185]: 2025-11-29 07:25:12.919 187189 DEBUG oslo_concurrency.lockutils [req-19b940c0-a188-4eb4-a2b9-d5441d2f98ff req-4f33a9a7-a280-4b48-9042-5f1ea224b21e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5f11adcd-958a-4269-905d-a017406505f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:25:12 compute-0 nova_compute[187185]: 2025-11-29 07:25:12.920 187189 DEBUG oslo_concurrency.lockutils [req-19b940c0-a188-4eb4-a2b9-d5441d2f98ff req-4f33a9a7-a280-4b48-9042-5f1ea224b21e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f11adcd-958a-4269-905d-a017406505f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:25:12 compute-0 nova_compute[187185]: 2025-11-29 07:25:12.920 187189 DEBUG oslo_concurrency.lockutils [req-19b940c0-a188-4eb4-a2b9-d5441d2f98ff req-4f33a9a7-a280-4b48-9042-5f1ea224b21e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f11adcd-958a-4269-905d-a017406505f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:25:12 compute-0 nova_compute[187185]: 2025-11-29 07:25:12.920 187189 DEBUG nova.compute.manager [req-19b940c0-a188-4eb4-a2b9-d5441d2f98ff req-4f33a9a7-a280-4b48-9042-5f1ea224b21e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] No waiting events found dispatching network-vif-unplugged-a1e67d00-8650-44ba-b75d-07f55b8d8810 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:25:12 compute-0 nova_compute[187185]: 2025-11-29 07:25:12.921 187189 WARNING nova.compute.manager [req-19b940c0-a188-4eb4-a2b9-d5441d2f98ff req-4f33a9a7-a280-4b48-9042-5f1ea224b21e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Received unexpected event network-vif-unplugged-a1e67d00-8650-44ba-b75d-07f55b8d8810 for instance with vm_state active and task_state None.
Nov 29 07:25:12 compute-0 nova_compute[187185]: 2025-11-29 07:25:12.921 187189 DEBUG nova.compute.manager [req-19b940c0-a188-4eb4-a2b9-d5441d2f98ff req-4f33a9a7-a280-4b48-9042-5f1ea224b21e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Received event network-vif-plugged-a1e67d00-8650-44ba-b75d-07f55b8d8810 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:25:12 compute-0 nova_compute[187185]: 2025-11-29 07:25:12.921 187189 DEBUG oslo_concurrency.lockutils [req-19b940c0-a188-4eb4-a2b9-d5441d2f98ff req-4f33a9a7-a280-4b48-9042-5f1ea224b21e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5f11adcd-958a-4269-905d-a017406505f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:25:12 compute-0 nova_compute[187185]: 2025-11-29 07:25:12.922 187189 DEBUG oslo_concurrency.lockutils [req-19b940c0-a188-4eb4-a2b9-d5441d2f98ff req-4f33a9a7-a280-4b48-9042-5f1ea224b21e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f11adcd-958a-4269-905d-a017406505f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:25:12 compute-0 nova_compute[187185]: 2025-11-29 07:25:12.922 187189 DEBUG oslo_concurrency.lockutils [req-19b940c0-a188-4eb4-a2b9-d5441d2f98ff req-4f33a9a7-a280-4b48-9042-5f1ea224b21e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f11adcd-958a-4269-905d-a017406505f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:25:12 compute-0 nova_compute[187185]: 2025-11-29 07:25:12.922 187189 DEBUG nova.compute.manager [req-19b940c0-a188-4eb4-a2b9-d5441d2f98ff req-4f33a9a7-a280-4b48-9042-5f1ea224b21e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] No waiting events found dispatching network-vif-plugged-a1e67d00-8650-44ba-b75d-07f55b8d8810 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:25:12 compute-0 nova_compute[187185]: 2025-11-29 07:25:12.923 187189 WARNING nova.compute.manager [req-19b940c0-a188-4eb4-a2b9-d5441d2f98ff req-4f33a9a7-a280-4b48-9042-5f1ea224b21e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Received unexpected event network-vif-plugged-a1e67d00-8650-44ba-b75d-07f55b8d8810 for instance with vm_state active and task_state None.
Nov 29 07:25:12 compute-0 nova_compute[187185]: 2025-11-29 07:25:12.923 187189 DEBUG nova.compute.manager [req-19b940c0-a188-4eb4-a2b9-d5441d2f98ff req-4f33a9a7-a280-4b48-9042-5f1ea224b21e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Received event network-vif-plugged-a1e67d00-8650-44ba-b75d-07f55b8d8810 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:25:12 compute-0 nova_compute[187185]: 2025-11-29 07:25:12.923 187189 DEBUG oslo_concurrency.lockutils [req-19b940c0-a188-4eb4-a2b9-d5441d2f98ff req-4f33a9a7-a280-4b48-9042-5f1ea224b21e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5f11adcd-958a-4269-905d-a017406505f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:25:12 compute-0 nova_compute[187185]: 2025-11-29 07:25:12.924 187189 DEBUG oslo_concurrency.lockutils [req-19b940c0-a188-4eb4-a2b9-d5441d2f98ff req-4f33a9a7-a280-4b48-9042-5f1ea224b21e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f11adcd-958a-4269-905d-a017406505f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:25:12 compute-0 nova_compute[187185]: 2025-11-29 07:25:12.924 187189 DEBUG oslo_concurrency.lockutils [req-19b940c0-a188-4eb4-a2b9-d5441d2f98ff req-4f33a9a7-a280-4b48-9042-5f1ea224b21e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f11adcd-958a-4269-905d-a017406505f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:25:12 compute-0 nova_compute[187185]: 2025-11-29 07:25:12.924 187189 DEBUG nova.compute.manager [req-19b940c0-a188-4eb4-a2b9-d5441d2f98ff req-4f33a9a7-a280-4b48-9042-5f1ea224b21e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] No waiting events found dispatching network-vif-plugged-a1e67d00-8650-44ba-b75d-07f55b8d8810 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:25:12 compute-0 nova_compute[187185]: 2025-11-29 07:25:12.925 187189 WARNING nova.compute.manager [req-19b940c0-a188-4eb4-a2b9-d5441d2f98ff req-4f33a9a7-a280-4b48-9042-5f1ea224b21e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Received unexpected event network-vif-plugged-a1e67d00-8650-44ba-b75d-07f55b8d8810 for instance with vm_state active and task_state None.
Nov 29 07:25:12 compute-0 nova_compute[187185]: 2025-11-29 07:25:12.925 187189 DEBUG nova.compute.manager [req-19b940c0-a188-4eb4-a2b9-d5441d2f98ff req-4f33a9a7-a280-4b48-9042-5f1ea224b21e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Received event network-vif-plugged-a1e67d00-8650-44ba-b75d-07f55b8d8810 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:25:12 compute-0 nova_compute[187185]: 2025-11-29 07:25:12.925 187189 DEBUG oslo_concurrency.lockutils [req-19b940c0-a188-4eb4-a2b9-d5441d2f98ff req-4f33a9a7-a280-4b48-9042-5f1ea224b21e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5f11adcd-958a-4269-905d-a017406505f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:25:12 compute-0 nova_compute[187185]: 2025-11-29 07:25:12.926 187189 DEBUG oslo_concurrency.lockutils [req-19b940c0-a188-4eb4-a2b9-d5441d2f98ff req-4f33a9a7-a280-4b48-9042-5f1ea224b21e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f11adcd-958a-4269-905d-a017406505f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:25:12 compute-0 nova_compute[187185]: 2025-11-29 07:25:12.926 187189 DEBUG oslo_concurrency.lockutils [req-19b940c0-a188-4eb4-a2b9-d5441d2f98ff req-4f33a9a7-a280-4b48-9042-5f1ea224b21e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f11adcd-958a-4269-905d-a017406505f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:25:12 compute-0 nova_compute[187185]: 2025-11-29 07:25:12.926 187189 DEBUG nova.compute.manager [req-19b940c0-a188-4eb4-a2b9-d5441d2f98ff req-4f33a9a7-a280-4b48-9042-5f1ea224b21e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] No waiting events found dispatching network-vif-plugged-a1e67d00-8650-44ba-b75d-07f55b8d8810 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:25:12 compute-0 nova_compute[187185]: 2025-11-29 07:25:12.927 187189 WARNING nova.compute.manager [req-19b940c0-a188-4eb4-a2b9-d5441d2f98ff req-4f33a9a7-a280-4b48-9042-5f1ea224b21e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Received unexpected event network-vif-plugged-a1e67d00-8650-44ba-b75d-07f55b8d8810 for instance with vm_state active and task_state None.
Nov 29 07:25:13 compute-0 nova_compute[187185]: 2025-11-29 07:25:13.831 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Updating instance_info_cache with network_info: [{"id": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "address": "fa:16:3e:b1:6e:42", "network": {"id": "be5e5e17-de26-4f07-84cb-bd99be23cd24", "bridge": "br-int", "label": "tempest-network-smoke--530096136", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape89dd8de-f9", "ovs_interfaceid": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:25:13 compute-0 nova_compute[187185]: 2025-11-29 07:25:13.853 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Releasing lock "refresh_cache-7c10cb24-586c-4507-8169-8258d7136397" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:25:13 compute-0 nova_compute[187185]: 2025-11-29 07:25:13.853 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 29 07:25:13 compute-0 nova_compute[187185]: 2025-11-29 07:25:13.854 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:25:13 compute-0 nova_compute[187185]: 2025-11-29 07:25:13.854 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:25:13 compute-0 nova_compute[187185]: 2025-11-29 07:25:13.886 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:25:13 compute-0 nova_compute[187185]: 2025-11-29 07:25:13.887 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:25:13 compute-0 nova_compute[187185]: 2025-11-29 07:25:13.887 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:25:13 compute-0 nova_compute[187185]: 2025-11-29 07:25:13.887 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:25:13 compute-0 nova_compute[187185]: 2025-11-29 07:25:13.989 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:25:14 compute-0 nova_compute[187185]: 2025-11-29 07:25:14.079 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:25:14 compute-0 nova_compute[187185]: 2025-11-29 07:25:14.080 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:25:14 compute-0 nova_compute[187185]: 2025-11-29 07:25:14.141 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:25:14 compute-0 nova_compute[187185]: 2025-11-29 07:25:14.146 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:25:14 compute-0 nova_compute[187185]: 2025-11-29 07:25:14.204 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:25:14 compute-0 nova_compute[187185]: 2025-11-29 07:25:14.205 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:25:14 compute-0 nova_compute[187185]: 2025-11-29 07:25:14.258 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:25:14 compute-0 nova_compute[187185]: 2025-11-29 07:25:14.401 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:25:14 compute-0 nova_compute[187185]: 2025-11-29 07:25:14.403 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5284MB free_disk=73.20218276977539GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:25:14 compute-0 nova_compute[187185]: 2025-11-29 07:25:14.404 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:25:14 compute-0 nova_compute[187185]: 2025-11-29 07:25:14.404 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:25:14 compute-0 nova_compute[187185]: 2025-11-29 07:25:14.540 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance 7c10cb24-586c-4507-8169-8258d7136397 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 07:25:14 compute-0 nova_compute[187185]: 2025-11-29 07:25:14.540 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance 5f11adcd-958a-4269-905d-a017406505f0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 07:25:14 compute-0 nova_compute[187185]: 2025-11-29 07:25:14.541 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:25:14 compute-0 nova_compute[187185]: 2025-11-29 07:25:14.541 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=832MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:25:14 compute-0 nova_compute[187185]: 2025-11-29 07:25:14.611 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:25:14 compute-0 nova_compute[187185]: 2025-11-29 07:25:14.635 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:25:14 compute-0 nova_compute[187185]: 2025-11-29 07:25:14.673 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:25:14 compute-0 nova_compute[187185]: 2025-11-29 07:25:14.674 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.270s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:25:15 compute-0 nova_compute[187185]: 2025-11-29 07:25:15.136 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:25:15 compute-0 nova_compute[187185]: 2025-11-29 07:25:15.137 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:25:16 compute-0 nova_compute[187185]: 2025-11-29 07:25:16.206 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:16 compute-0 nova_compute[187185]: 2025-11-29 07:25:16.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:25:16 compute-0 nova_compute[187185]: 2025-11-29 07:25:16.318 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:25:16 compute-0 nova_compute[187185]: 2025-11-29 07:25:16.318 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:25:16 compute-0 nova_compute[187185]: 2025-11-29 07:25:16.319 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:25:17 compute-0 nova_compute[187185]: 2025-11-29 07:25:17.013 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764401102.0115488, 2702fe48-44d0-408d-8d10-fd635e3779c9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:25:17 compute-0 nova_compute[187185]: 2025-11-29 07:25:17.014 187189 INFO nova.compute.manager [-] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] VM Stopped (Lifecycle Event)
Nov 29 07:25:17 compute-0 nova_compute[187185]: 2025-11-29 07:25:17.033 187189 DEBUG nova.compute.manager [None req-661721b5-6afd-418c-bc77-290e0a46a742 - - - - - -] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:25:17 compute-0 nova_compute[187185]: 2025-11-29 07:25:17.042 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:17 compute-0 nova_compute[187185]: 2025-11-29 07:25:17.311 187189 INFO nova.compute.manager [None req-c5bbf31a-39a6-4468-a158-1bed02ebf20b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Get console output
Nov 29 07:25:17 compute-0 nova_compute[187185]: 2025-11-29 07:25:17.317 213758 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 29 07:25:18 compute-0 nova_compute[187185]: 2025-11-29 07:25:18.997 187189 DEBUG oslo_concurrency.lockutils [None req-77425e4c-30dd-4bb9-8109-64c6470afecb bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "7c10cb24-586c-4507-8169-8258d7136397" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:25:18 compute-0 nova_compute[187185]: 2025-11-29 07:25:18.999 187189 DEBUG oslo_concurrency.lockutils [None req-77425e4c-30dd-4bb9-8109-64c6470afecb bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "7c10cb24-586c-4507-8169-8258d7136397" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:25:19 compute-0 nova_compute[187185]: 2025-11-29 07:25:19.000 187189 DEBUG oslo_concurrency.lockutils [None req-77425e4c-30dd-4bb9-8109-64c6470afecb bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "7c10cb24-586c-4507-8169-8258d7136397-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:25:19 compute-0 nova_compute[187185]: 2025-11-29 07:25:19.000 187189 DEBUG oslo_concurrency.lockutils [None req-77425e4c-30dd-4bb9-8109-64c6470afecb bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "7c10cb24-586c-4507-8169-8258d7136397-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:25:19 compute-0 nova_compute[187185]: 2025-11-29 07:25:19.001 187189 DEBUG oslo_concurrency.lockutils [None req-77425e4c-30dd-4bb9-8109-64c6470afecb bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "7c10cb24-586c-4507-8169-8258d7136397-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:25:19 compute-0 nova_compute[187185]: 2025-11-29 07:25:19.021 187189 INFO nova.compute.manager [None req-77425e4c-30dd-4bb9-8109-64c6470afecb bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Terminating instance
Nov 29 07:25:19 compute-0 nova_compute[187185]: 2025-11-29 07:25:19.048 187189 DEBUG nova.compute.manager [None req-77425e4c-30dd-4bb9-8109-64c6470afecb bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 07:25:19 compute-0 kernel: tape89dd8de-f9 (unregistering): left promiscuous mode
Nov 29 07:25:19 compute-0 NetworkManager[55227]: <info>  [1764401119.0699] device (tape89dd8de-f9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:25:19 compute-0 nova_compute[187185]: 2025-11-29 07:25:19.075 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:19 compute-0 ovn_controller[95281]: 2025-11-29T07:25:19Z|00384|binding|INFO|Releasing lport e89dd8de-f981-46cf-aa04-cfad6a9b2326 from this chassis (sb_readonly=0)
Nov 29 07:25:19 compute-0 ovn_controller[95281]: 2025-11-29T07:25:19Z|00385|binding|INFO|Setting lport e89dd8de-f981-46cf-aa04-cfad6a9b2326 down in Southbound
Nov 29 07:25:19 compute-0 ovn_controller[95281]: 2025-11-29T07:25:19Z|00386|binding|INFO|Removing iface tape89dd8de-f9 ovn-installed in OVS
Nov 29 07:25:19 compute-0 nova_compute[187185]: 2025-11-29 07:25:19.089 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:19 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:19.090 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:6e:42 10.100.0.7'], port_security=['fa:16:3e:b1:6e:42 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '7c10cb24-586c-4507-8169-8258d7136397', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-be5e5e17-de26-4f07-84cb-bd99be23cd24', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c231e63624d44fc19e0989abfb1afb22', 'neutron:revision_number': '7', 'neutron:security_group_ids': '51b81e59-c129-44d0-83ab-ea09f800f560', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=40c46f88-56a4-469c-8869-7f0629f57469, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=e89dd8de-f981-46cf-aa04-cfad6a9b2326) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:25:19 compute-0 nova_compute[187185]: 2025-11-29 07:25:19.092 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:19 compute-0 nova_compute[187185]: 2025-11-29 07:25:19.094 187189 DEBUG nova.compute.manager [req-6422d357-98e6-4c47-a4cf-8e16ccf919b3 req-c4a62d36-15d5-4326-ab60-7838a6953448 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Received event network-changed-e89dd8de-f981-46cf-aa04-cfad6a9b2326 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:25:19 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:19.093 104254 INFO neutron.agent.ovn.metadata.agent [-] Port e89dd8de-f981-46cf-aa04-cfad6a9b2326 in datapath be5e5e17-de26-4f07-84cb-bd99be23cd24 unbound from our chassis
Nov 29 07:25:19 compute-0 nova_compute[187185]: 2025-11-29 07:25:19.095 187189 DEBUG nova.compute.manager [req-6422d357-98e6-4c47-a4cf-8e16ccf919b3 req-c4a62d36-15d5-4326-ab60-7838a6953448 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Refreshing instance network info cache due to event network-changed-e89dd8de-f981-46cf-aa04-cfad6a9b2326. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:25:19 compute-0 nova_compute[187185]: 2025-11-29 07:25:19.095 187189 DEBUG oslo_concurrency.lockutils [req-6422d357-98e6-4c47-a4cf-8e16ccf919b3 req-c4a62d36-15d5-4326-ab60-7838a6953448 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-7c10cb24-586c-4507-8169-8258d7136397" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:25:19 compute-0 nova_compute[187185]: 2025-11-29 07:25:19.095 187189 DEBUG oslo_concurrency.lockutils [req-6422d357-98e6-4c47-a4cf-8e16ccf919b3 req-c4a62d36-15d5-4326-ab60-7838a6953448 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-7c10cb24-586c-4507-8169-8258d7136397" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:25:19 compute-0 nova_compute[187185]: 2025-11-29 07:25:19.095 187189 DEBUG nova.network.neutron [req-6422d357-98e6-4c47-a4cf-8e16ccf919b3 req-c4a62d36-15d5-4326-ab60-7838a6953448 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Refreshing network info cache for port e89dd8de-f981-46cf-aa04-cfad6a9b2326 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:25:19 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:19.100 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network be5e5e17-de26-4f07-84cb-bd99be23cd24, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:25:19 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:19.102 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[9eab91ea-a26d-44d8-87c4-d558c5b3914b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:19 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:19.103 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-be5e5e17-de26-4f07-84cb-bd99be23cd24 namespace which is not needed anymore
Nov 29 07:25:19 compute-0 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d00000079.scope: Deactivated successfully.
Nov 29 07:25:19 compute-0 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d00000079.scope: Consumed 13.670s CPU time.
Nov 29 07:25:19 compute-0 systemd-machined[153486]: Machine qemu-48-instance-00000079 terminated.
Nov 29 07:25:19 compute-0 podman[234991]: 2025-11-29 07:25:19.178240288 +0000 UTC m=+0.077593620 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.buildah.version=1.33.7, vcs-type=git, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., maintainer=Red Hat, Inc., distribution-scope=public, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, managed_by=edpm_ansible, config_id=edpm, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container)
Nov 29 07:25:19 compute-0 podman[234988]: 2025-11-29 07:25:19.199489797 +0000 UTC m=+0.100596988 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, 
org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:25:19 compute-0 podman[234995]: 2025-11-29 07:25:19.201647498 +0000 UTC m=+0.089515456 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 07:25:19 compute-0 neutron-haproxy-ovnmeta-be5e5e17-de26-4f07-84cb-bd99be23cd24[234343]: [NOTICE]   (234358) : haproxy version is 2.8.14-c23fe91
Nov 29 07:25:19 compute-0 neutron-haproxy-ovnmeta-be5e5e17-de26-4f07-84cb-bd99be23cd24[234343]: [NOTICE]   (234358) : path to executable is /usr/sbin/haproxy
Nov 29 07:25:19 compute-0 neutron-haproxy-ovnmeta-be5e5e17-de26-4f07-84cb-bd99be23cd24[234343]: [WARNING]  (234358) : Exiting Master process...
Nov 29 07:25:19 compute-0 neutron-haproxy-ovnmeta-be5e5e17-de26-4f07-84cb-bd99be23cd24[234343]: [ALERT]    (234358) : Current worker (234360) exited with code 143 (Terminated)
Nov 29 07:25:19 compute-0 neutron-haproxy-ovnmeta-be5e5e17-de26-4f07-84cb-bd99be23cd24[234343]: [WARNING]  (234358) : All workers exited. Exiting... (0)
Nov 29 07:25:19 compute-0 systemd[1]: libpod-a925727d1aad090da1682982cbc86285c575f8a68a767e5aad77430ee6ef2fbd.scope: Deactivated successfully.
Nov 29 07:25:19 compute-0 podman[235065]: 2025-11-29 07:25:19.239413394 +0000 UTC m=+0.046625617 container died a925727d1aad090da1682982cbc86285c575f8a68a767e5aad77430ee6ef2fbd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-be5e5e17-de26-4f07-84cb-bd99be23cd24, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 07:25:19 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a925727d1aad090da1682982cbc86285c575f8a68a767e5aad77430ee6ef2fbd-userdata-shm.mount: Deactivated successfully.
Nov 29 07:25:19 compute-0 kernel: tape89dd8de-f9: entered promiscuous mode
Nov 29 07:25:19 compute-0 systemd[1]: var-lib-containers-storage-overlay-082ca51418527a87ffea65750fd070756b7a67896159c76faedfeed34349cf43-merged.mount: Deactivated successfully.
Nov 29 07:25:19 compute-0 kernel: tape89dd8de-f9 (unregistering): left promiscuous mode
Nov 29 07:25:19 compute-0 NetworkManager[55227]: <info>  [1764401119.2748] manager: (tape89dd8de-f9): new Tun device (/org/freedesktop/NetworkManager/Devices/204)
Nov 29 07:25:19 compute-0 podman[235065]: 2025-11-29 07:25:19.280745999 +0000 UTC m=+0.087958232 container cleanup a925727d1aad090da1682982cbc86285c575f8a68a767e5aad77430ee6ef2fbd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-be5e5e17-de26-4f07-84cb-bd99be23cd24, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:25:19 compute-0 nova_compute[187185]: 2025-11-29 07:25:19.359 187189 DEBUG nova.compute.manager [req-25425927-9cdc-43c0-a4f8-ff5ddfebc752 req-8dfb047e-5657-42f6-aedf-b53c4c761916 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Received event network-vif-unplugged-e89dd8de-f981-46cf-aa04-cfad6a9b2326 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:25:19 compute-0 nova_compute[187185]: 2025-11-29 07:25:19.360 187189 DEBUG oslo_concurrency.lockutils [req-25425927-9cdc-43c0-a4f8-ff5ddfebc752 req-8dfb047e-5657-42f6-aedf-b53c4c761916 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7c10cb24-586c-4507-8169-8258d7136397-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:25:19 compute-0 nova_compute[187185]: 2025-11-29 07:25:19.360 187189 DEBUG oslo_concurrency.lockutils [req-25425927-9cdc-43c0-a4f8-ff5ddfebc752 req-8dfb047e-5657-42f6-aedf-b53c4c761916 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7c10cb24-586c-4507-8169-8258d7136397-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:25:19 compute-0 nova_compute[187185]: 2025-11-29 07:25:19.360 187189 DEBUG oslo_concurrency.lockutils [req-25425927-9cdc-43c0-a4f8-ff5ddfebc752 req-8dfb047e-5657-42f6-aedf-b53c4c761916 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7c10cb24-586c-4507-8169-8258d7136397-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:25:19 compute-0 nova_compute[187185]: 2025-11-29 07:25:19.362 187189 DEBUG nova.compute.manager [req-25425927-9cdc-43c0-a4f8-ff5ddfebc752 req-8dfb047e-5657-42f6-aedf-b53c4c761916 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] No waiting events found dispatching network-vif-unplugged-e89dd8de-f981-46cf-aa04-cfad6a9b2326 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:25:19 compute-0 nova_compute[187185]: 2025-11-29 07:25:19.362 187189 DEBUG nova.compute.manager [req-25425927-9cdc-43c0-a4f8-ff5ddfebc752 req-8dfb047e-5657-42f6-aedf-b53c4c761916 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Received event network-vif-unplugged-e89dd8de-f981-46cf-aa04-cfad6a9b2326 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 07:25:19 compute-0 nova_compute[187185]: 2025-11-29 07:25:19.362 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:19 compute-0 systemd[1]: libpod-conmon-a925727d1aad090da1682982cbc86285c575f8a68a767e5aad77430ee6ef2fbd.scope: Deactivated successfully.
Nov 29 07:25:19 compute-0 nova_compute[187185]: 2025-11-29 07:25:19.396 187189 INFO nova.virt.libvirt.driver [-] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Instance destroyed successfully.
Nov 29 07:25:19 compute-0 nova_compute[187185]: 2025-11-29 07:25:19.396 187189 DEBUG nova.objects.instance [None req-77425e4c-30dd-4bb9-8109-64c6470afecb bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'resources' on Instance uuid 7c10cb24-586c-4507-8169-8258d7136397 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:25:19 compute-0 nova_compute[187185]: 2025-11-29 07:25:19.411 187189 DEBUG nova.virt.libvirt.vif [None req-77425e4c-30dd-4bb9-8109-64c6470afecb bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:23:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1276368768',display_name='tempest-TestNetworkAdvancedServerOps-server-1276368768',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1276368768',id=121,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBUznZR2iOKaJbWAB1nxy/Np7mGSzlwsDQ7Ycl3wci2nJ60qWbosUg5gundiked4HoZaTmuE/0+OTOCJFQ4CjxMZqyT1FcUBwmvtOPuSl/eONA9sj7Vj+75xN046AU/KWg==',key_name='tempest-TestNetworkAdvancedServerOps-143614444',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:24:58Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-it4r0l7q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:25:04Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=7c10cb24-586c-4507-8169-8258d7136397,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "address": "fa:16:3e:b1:6e:42", "network": {"id": "be5e5e17-de26-4f07-84cb-bd99be23cd24", "bridge": "br-int", "label": "tempest-network-smoke--530096136", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape89dd8de-f9", "ovs_interfaceid": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:25:19 compute-0 nova_compute[187185]: 2025-11-29 07:25:19.412 187189 DEBUG nova.network.os_vif_util [None req-77425e4c-30dd-4bb9-8109-64c6470afecb bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converting VIF {"id": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "address": "fa:16:3e:b1:6e:42", "network": {"id": "be5e5e17-de26-4f07-84cb-bd99be23cd24", "bridge": "br-int", "label": "tempest-network-smoke--530096136", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape89dd8de-f9", "ovs_interfaceid": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:25:19 compute-0 nova_compute[187185]: 2025-11-29 07:25:19.412 187189 DEBUG nova.network.os_vif_util [None req-77425e4c-30dd-4bb9-8109-64c6470afecb bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b1:6e:42,bridge_name='br-int',has_traffic_filtering=True,id=e89dd8de-f981-46cf-aa04-cfad6a9b2326,network=Network(be5e5e17-de26-4f07-84cb-bd99be23cd24),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape89dd8de-f9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:25:19 compute-0 nova_compute[187185]: 2025-11-29 07:25:19.413 187189 DEBUG os_vif [None req-77425e4c-30dd-4bb9-8109-64c6470afecb bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b1:6e:42,bridge_name='br-int',has_traffic_filtering=True,id=e89dd8de-f981-46cf-aa04-cfad6a9b2326,network=Network(be5e5e17-de26-4f07-84cb-bd99be23cd24),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape89dd8de-f9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:25:19 compute-0 nova_compute[187185]: 2025-11-29 07:25:19.415 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:19 compute-0 nova_compute[187185]: 2025-11-29 07:25:19.415 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape89dd8de-f9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:25:19 compute-0 nova_compute[187185]: 2025-11-29 07:25:19.418 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:25:19 compute-0 nova_compute[187185]: 2025-11-29 07:25:19.421 187189 INFO os_vif [None req-77425e4c-30dd-4bb9-8109-64c6470afecb bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b1:6e:42,bridge_name='br-int',has_traffic_filtering=True,id=e89dd8de-f981-46cf-aa04-cfad6a9b2326,network=Network(be5e5e17-de26-4f07-84cb-bd99be23cd24),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape89dd8de-f9')
Nov 29 07:25:19 compute-0 nova_compute[187185]: 2025-11-29 07:25:19.422 187189 INFO nova.virt.libvirt.driver [None req-77425e4c-30dd-4bb9-8109-64c6470afecb bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Deleting instance files /var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397_del
Nov 29 07:25:19 compute-0 nova_compute[187185]: 2025-11-29 07:25:19.429 187189 INFO nova.virt.libvirt.driver [None req-77425e4c-30dd-4bb9-8109-64c6470afecb bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Deletion of /var/lib/nova/instances/7c10cb24-586c-4507-8169-8258d7136397_del complete
Nov 29 07:25:19 compute-0 podman[235104]: 2025-11-29 07:25:19.462177807 +0000 UTC m=+0.071119687 container remove a925727d1aad090da1682982cbc86285c575f8a68a767e5aad77430ee6ef2fbd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-be5e5e17-de26-4f07-84cb-bd99be23cd24, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:25:19 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:19.470 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[ed08d0a5-27a1-470e-b7c5-ae33fcd1c369]: (4, ('Sat Nov 29 07:25:19 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-be5e5e17-de26-4f07-84cb-bd99be23cd24 (a925727d1aad090da1682982cbc86285c575f8a68a767e5aad77430ee6ef2fbd)\na925727d1aad090da1682982cbc86285c575f8a68a767e5aad77430ee6ef2fbd\nSat Nov 29 07:25:19 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-be5e5e17-de26-4f07-84cb-bd99be23cd24 (a925727d1aad090da1682982cbc86285c575f8a68a767e5aad77430ee6ef2fbd)\na925727d1aad090da1682982cbc86285c575f8a68a767e5aad77430ee6ef2fbd\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:19 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:19.472 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[4d05e0ae-40b5-441c-9565-4c5e39c2066d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:19 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:19.473 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbe5e5e17-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:25:19 compute-0 nova_compute[187185]: 2025-11-29 07:25:19.475 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:19 compute-0 kernel: tapbe5e5e17-d0: left promiscuous mode
Nov 29 07:25:19 compute-0 nova_compute[187185]: 2025-11-29 07:25:19.487 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:19 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:19.489 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[30805784-333e-4e99-bd99-9b5b16911826]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:19 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:19.510 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[65bdf527-cf93-4ca1-8d06-4fb8d0aa5e12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:19 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:19.511 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e33d1632-051d-4fa2-b4ac-8d895477512b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:19 compute-0 nova_compute[187185]: 2025-11-29 07:25:19.514 187189 INFO nova.compute.manager [None req-77425e4c-30dd-4bb9-8109-64c6470afecb bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Took 0.47 seconds to destroy the instance on the hypervisor.
Nov 29 07:25:19 compute-0 nova_compute[187185]: 2025-11-29 07:25:19.515 187189 DEBUG oslo.service.loopingcall [None req-77425e4c-30dd-4bb9-8109-64c6470afecb bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 07:25:19 compute-0 nova_compute[187185]: 2025-11-29 07:25:19.515 187189 DEBUG nova.compute.manager [-] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 07:25:19 compute-0 nova_compute[187185]: 2025-11-29 07:25:19.515 187189 DEBUG nova.network.neutron [-] [instance: 7c10cb24-586c-4507-8169-8258d7136397] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 07:25:19 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:19.527 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[b263e3e9-b0b9-49ff-ba75-53b8ec9f5a3f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 654504, 'reachable_time': 28703, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235124, 'error': None, 'target': 'ovnmeta-be5e5e17-de26-4f07-84cb-bd99be23cd24', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:19 compute-0 systemd[1]: run-netns-ovnmeta\x2dbe5e5e17\x2dde26\x2d4f07\x2d84cb\x2dbd99be23cd24.mount: Deactivated successfully.
Nov 29 07:25:19 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:19.530 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-be5e5e17-de26-4f07-84cb-bd99be23cd24 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:25:19 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:19.531 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[9470fbdf-68bf-49f6-a0e8-a39d1641263a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:20 compute-0 nova_compute[187185]: 2025-11-29 07:25:20.312 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:25:20 compute-0 nova_compute[187185]: 2025-11-29 07:25:20.519 187189 DEBUG nova.network.neutron [-] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:25:20 compute-0 nova_compute[187185]: 2025-11-29 07:25:20.539 187189 INFO nova.compute.manager [-] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Took 1.02 seconds to deallocate network for instance.
Nov 29 07:25:20 compute-0 nova_compute[187185]: 2025-11-29 07:25:20.619 187189 DEBUG oslo_concurrency.lockutils [None req-77425e4c-30dd-4bb9-8109-64c6470afecb bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:25:20 compute-0 nova_compute[187185]: 2025-11-29 07:25:20.619 187189 DEBUG oslo_concurrency.lockutils [None req-77425e4c-30dd-4bb9-8109-64c6470afecb bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:25:20 compute-0 nova_compute[187185]: 2025-11-29 07:25:20.712 187189 DEBUG nova.compute.provider_tree [None req-77425e4c-30dd-4bb9-8109-64c6470afecb bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:25:20 compute-0 nova_compute[187185]: 2025-11-29 07:25:20.729 187189 DEBUG nova.scheduler.client.report [None req-77425e4c-30dd-4bb9-8109-64c6470afecb bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:25:20 compute-0 nova_compute[187185]: 2025-11-29 07:25:20.755 187189 DEBUG oslo_concurrency.lockutils [None req-77425e4c-30dd-4bb9-8109-64c6470afecb bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.136s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:25:20 compute-0 nova_compute[187185]: 2025-11-29 07:25:20.785 187189 INFO nova.scheduler.client.report [None req-77425e4c-30dd-4bb9-8109-64c6470afecb bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Deleted allocations for instance 7c10cb24-586c-4507-8169-8258d7136397
Nov 29 07:25:20 compute-0 nova_compute[187185]: 2025-11-29 07:25:20.888 187189 DEBUG oslo_concurrency.lockutils [None req-77425e4c-30dd-4bb9-8109-64c6470afecb bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "7c10cb24-586c-4507-8169-8258d7136397" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.889s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:25:20 compute-0 nova_compute[187185]: 2025-11-29 07:25:20.895 187189 DEBUG nova.network.neutron [req-6422d357-98e6-4c47-a4cf-8e16ccf919b3 req-c4a62d36-15d5-4326-ab60-7838a6953448 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Updated VIF entry in instance network info cache for port e89dd8de-f981-46cf-aa04-cfad6a9b2326. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:25:20 compute-0 nova_compute[187185]: 2025-11-29 07:25:20.895 187189 DEBUG nova.network.neutron [req-6422d357-98e6-4c47-a4cf-8e16ccf919b3 req-c4a62d36-15d5-4326-ab60-7838a6953448 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Updating instance_info_cache with network_info: [{"id": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "address": "fa:16:3e:b1:6e:42", "network": {"id": "be5e5e17-de26-4f07-84cb-bd99be23cd24", "bridge": "br-int", "label": "tempest-network-smoke--530096136", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape89dd8de-f9", "ovs_interfaceid": "e89dd8de-f981-46cf-aa04-cfad6a9b2326", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:25:20 compute-0 nova_compute[187185]: 2025-11-29 07:25:20.915 187189 DEBUG oslo_concurrency.lockutils [req-6422d357-98e6-4c47-a4cf-8e16ccf919b3 req-c4a62d36-15d5-4326-ab60-7838a6953448 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-7c10cb24-586c-4507-8169-8258d7136397" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:25:21 compute-0 nova_compute[187185]: 2025-11-29 07:25:21.211 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:21 compute-0 nova_compute[187185]: 2025-11-29 07:25:21.598 187189 DEBUG nova.compute.manager [req-5f4db0c6-ca79-46c8-8e9c-60d40a6e1df1 req-c6a4a08a-a1e4-4fd3-b4ce-85f8cca20a32 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Received event network-vif-plugged-e89dd8de-f981-46cf-aa04-cfad6a9b2326 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:25:21 compute-0 nova_compute[187185]: 2025-11-29 07:25:21.599 187189 DEBUG oslo_concurrency.lockutils [req-5f4db0c6-ca79-46c8-8e9c-60d40a6e1df1 req-c6a4a08a-a1e4-4fd3-b4ce-85f8cca20a32 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7c10cb24-586c-4507-8169-8258d7136397-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:25:21 compute-0 nova_compute[187185]: 2025-11-29 07:25:21.599 187189 DEBUG oslo_concurrency.lockutils [req-5f4db0c6-ca79-46c8-8e9c-60d40a6e1df1 req-c6a4a08a-a1e4-4fd3-b4ce-85f8cca20a32 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7c10cb24-586c-4507-8169-8258d7136397-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:25:21 compute-0 nova_compute[187185]: 2025-11-29 07:25:21.599 187189 DEBUG oslo_concurrency.lockutils [req-5f4db0c6-ca79-46c8-8e9c-60d40a6e1df1 req-c6a4a08a-a1e4-4fd3-b4ce-85f8cca20a32 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7c10cb24-586c-4507-8169-8258d7136397-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:25:21 compute-0 nova_compute[187185]: 2025-11-29 07:25:21.599 187189 DEBUG nova.compute.manager [req-5f4db0c6-ca79-46c8-8e9c-60d40a6e1df1 req-c6a4a08a-a1e4-4fd3-b4ce-85f8cca20a32 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] No waiting events found dispatching network-vif-plugged-e89dd8de-f981-46cf-aa04-cfad6a9b2326 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:25:21 compute-0 nova_compute[187185]: 2025-11-29 07:25:21.600 187189 WARNING nova.compute.manager [req-5f4db0c6-ca79-46c8-8e9c-60d40a6e1df1 req-c6a4a08a-a1e4-4fd3-b4ce-85f8cca20a32 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Received unexpected event network-vif-plugged-e89dd8de-f981-46cf-aa04-cfad6a9b2326 for instance with vm_state deleted and task_state None.
Nov 29 07:25:21 compute-0 nova_compute[187185]: 2025-11-29 07:25:21.600 187189 DEBUG nova.compute.manager [req-5f4db0c6-ca79-46c8-8e9c-60d40a6e1df1 req-c6a4a08a-a1e4-4fd3-b4ce-85f8cca20a32 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Received event network-vif-deleted-e89dd8de-f981-46cf-aa04-cfad6a9b2326 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:25:23 compute-0 ovn_controller[95281]: 2025-11-29T07:25:23Z|00049|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:39:9f:d3 10.100.0.11
Nov 29 07:25:23 compute-0 ovn_controller[95281]: 2025-11-29T07:25:23Z|00050|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:39:9f:d3 10.100.0.11
Nov 29 07:25:24 compute-0 nova_compute[187185]: 2025-11-29 07:25:24.419 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:25.515 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:25:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:25.516 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:25:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:25.517 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:25:26 compute-0 nova_compute[187185]: 2025-11-29 07:25:26.212 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:28 compute-0 podman[235134]: 2025-11-29 07:25:28.856094774 +0000 UTC m=+0.103446349 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3)
Nov 29 07:25:29 compute-0 nova_compute[187185]: 2025-11-29 07:25:29.427 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:29 compute-0 ovn_controller[95281]: 2025-11-29T07:25:29Z|00387|binding|INFO|Releasing lport 0d1614aa-fbbf-4ad2-9ee2-8948d583ddf6 from this chassis (sb_readonly=0)
Nov 29 07:25:29 compute-0 nova_compute[187185]: 2025-11-29 07:25:29.463 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:29 compute-0 ovn_controller[95281]: 2025-11-29T07:25:29Z|00388|binding|INFO|Releasing lport 0d1614aa-fbbf-4ad2-9ee2-8948d583ddf6 from this chassis (sb_readonly=0)
Nov 29 07:25:29 compute-0 nova_compute[187185]: 2025-11-29 07:25:29.708 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:30.385 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=42, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=41) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:25:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:30.386 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:25:30 compute-0 nova_compute[187185]: 2025-11-29 07:25:30.387 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:31 compute-0 nova_compute[187185]: 2025-11-29 07:25:31.213 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:31 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:31.389 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '42'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:25:32 compute-0 nova_compute[187185]: 2025-11-29 07:25:32.595 187189 DEBUG oslo_concurrency.lockutils [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "bbcf2b17-c33f-4a89-9f82-60b4dcfa7208" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:25:32 compute-0 nova_compute[187185]: 2025-11-29 07:25:32.595 187189 DEBUG oslo_concurrency.lockutils [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "bbcf2b17-c33f-4a89-9f82-60b4dcfa7208" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:25:32 compute-0 nova_compute[187185]: 2025-11-29 07:25:32.615 187189 DEBUG nova.compute.manager [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 07:25:32 compute-0 nova_compute[187185]: 2025-11-29 07:25:32.744 187189 DEBUG oslo_concurrency.lockutils [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:25:32 compute-0 nova_compute[187185]: 2025-11-29 07:25:32.745 187189 DEBUG oslo_concurrency.lockutils [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:25:32 compute-0 nova_compute[187185]: 2025-11-29 07:25:32.753 187189 DEBUG nova.virt.hardware [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 07:25:32 compute-0 nova_compute[187185]: 2025-11-29 07:25:32.754 187189 INFO nova.compute.claims [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Claim successful on node compute-0.ctlplane.example.com
Nov 29 07:25:32 compute-0 nova_compute[187185]: 2025-11-29 07:25:32.936 187189 DEBUG nova.compute.provider_tree [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:25:33 compute-0 nova_compute[187185]: 2025-11-29 07:25:33.104 187189 DEBUG nova.scheduler.client.report [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:25:33 compute-0 nova_compute[187185]: 2025-11-29 07:25:33.131 187189 DEBUG oslo_concurrency.lockutils [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.386s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:25:33 compute-0 nova_compute[187185]: 2025-11-29 07:25:33.132 187189 DEBUG nova.compute.manager [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 07:25:33 compute-0 nova_compute[187185]: 2025-11-29 07:25:33.210 187189 DEBUG nova.compute.manager [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 07:25:33 compute-0 nova_compute[187185]: 2025-11-29 07:25:33.210 187189 DEBUG nova.network.neutron [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 07:25:33 compute-0 nova_compute[187185]: 2025-11-29 07:25:33.243 187189 INFO nova.virt.libvirt.driver [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 07:25:33 compute-0 nova_compute[187185]: 2025-11-29 07:25:33.267 187189 DEBUG nova.compute.manager [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 07:25:33 compute-0 nova_compute[187185]: 2025-11-29 07:25:33.394 187189 DEBUG nova.compute.manager [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 07:25:33 compute-0 nova_compute[187185]: 2025-11-29 07:25:33.396 187189 DEBUG nova.virt.libvirt.driver [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 07:25:33 compute-0 nova_compute[187185]: 2025-11-29 07:25:33.397 187189 INFO nova.virt.libvirt.driver [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Creating image(s)
Nov 29 07:25:33 compute-0 nova_compute[187185]: 2025-11-29 07:25:33.398 187189 DEBUG oslo_concurrency.lockutils [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "/var/lib/nova/instances/bbcf2b17-c33f-4a89-9f82-60b4dcfa7208/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:25:33 compute-0 nova_compute[187185]: 2025-11-29 07:25:33.399 187189 DEBUG oslo_concurrency.lockutils [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "/var/lib/nova/instances/bbcf2b17-c33f-4a89-9f82-60b4dcfa7208/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:25:33 compute-0 nova_compute[187185]: 2025-11-29 07:25:33.400 187189 DEBUG oslo_concurrency.lockutils [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "/var/lib/nova/instances/bbcf2b17-c33f-4a89-9f82-60b4dcfa7208/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:25:33 compute-0 nova_compute[187185]: 2025-11-29 07:25:33.426 187189 DEBUG nova.policy [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 07:25:33 compute-0 nova_compute[187185]: 2025-11-29 07:25:33.430 187189 DEBUG oslo_concurrency.processutils [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:25:33 compute-0 nova_compute[187185]: 2025-11-29 07:25:33.529 187189 DEBUG oslo_concurrency.processutils [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:25:33 compute-0 nova_compute[187185]: 2025-11-29 07:25:33.531 187189 DEBUG oslo_concurrency.lockutils [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:25:33 compute-0 nova_compute[187185]: 2025-11-29 07:25:33.532 187189 DEBUG oslo_concurrency.lockutils [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:25:33 compute-0 nova_compute[187185]: 2025-11-29 07:25:33.558 187189 DEBUG oslo_concurrency.processutils [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:25:33 compute-0 nova_compute[187185]: 2025-11-29 07:25:33.633 187189 DEBUG oslo_concurrency.processutils [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:25:33 compute-0 nova_compute[187185]: 2025-11-29 07:25:33.634 187189 DEBUG oslo_concurrency.processutils [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/bbcf2b17-c33f-4a89-9f82-60b4dcfa7208/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:25:33 compute-0 nova_compute[187185]: 2025-11-29 07:25:33.972 187189 DEBUG oslo_concurrency.processutils [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/bbcf2b17-c33f-4a89-9f82-60b4dcfa7208/disk 1073741824" returned: 0 in 0.337s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:25:33 compute-0 nova_compute[187185]: 2025-11-29 07:25:33.974 187189 DEBUG oslo_concurrency.lockutils [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.442s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:25:33 compute-0 nova_compute[187185]: 2025-11-29 07:25:33.975 187189 DEBUG oslo_concurrency.processutils [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:25:34 compute-0 nova_compute[187185]: 2025-11-29 07:25:34.062 187189 DEBUG oslo_concurrency.processutils [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:25:34 compute-0 nova_compute[187185]: 2025-11-29 07:25:34.063 187189 DEBUG nova.virt.disk.api [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Checking if we can resize image /var/lib/nova/instances/bbcf2b17-c33f-4a89-9f82-60b4dcfa7208/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:25:34 compute-0 nova_compute[187185]: 2025-11-29 07:25:34.064 187189 DEBUG oslo_concurrency.processutils [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbcf2b17-c33f-4a89-9f82-60b4dcfa7208/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:25:34 compute-0 nova_compute[187185]: 2025-11-29 07:25:34.118 187189 DEBUG oslo_concurrency.processutils [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbcf2b17-c33f-4a89-9f82-60b4dcfa7208/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:25:34 compute-0 nova_compute[187185]: 2025-11-29 07:25:34.119 187189 DEBUG nova.virt.disk.api [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Cannot resize image /var/lib/nova/instances/bbcf2b17-c33f-4a89-9f82-60b4dcfa7208/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:25:34 compute-0 nova_compute[187185]: 2025-11-29 07:25:34.120 187189 DEBUG nova.objects.instance [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'migration_context' on Instance uuid bbcf2b17-c33f-4a89-9f82-60b4dcfa7208 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:25:34 compute-0 nova_compute[187185]: 2025-11-29 07:25:34.136 187189 DEBUG nova.virt.libvirt.driver [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 07:25:34 compute-0 nova_compute[187185]: 2025-11-29 07:25:34.136 187189 DEBUG nova.virt.libvirt.driver [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Ensure instance console log exists: /var/lib/nova/instances/bbcf2b17-c33f-4a89-9f82-60b4dcfa7208/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 07:25:34 compute-0 nova_compute[187185]: 2025-11-29 07:25:34.137 187189 DEBUG oslo_concurrency.lockutils [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:25:34 compute-0 nova_compute[187185]: 2025-11-29 07:25:34.137 187189 DEBUG oslo_concurrency.lockutils [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:25:34 compute-0 nova_compute[187185]: 2025-11-29 07:25:34.137 187189 DEBUG oslo_concurrency.lockutils [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:25:34 compute-0 nova_compute[187185]: 2025-11-29 07:25:34.394 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764401119.3929214, 7c10cb24-586c-4507-8169-8258d7136397 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:25:34 compute-0 nova_compute[187185]: 2025-11-29 07:25:34.395 187189 INFO nova.compute.manager [-] [instance: 7c10cb24-586c-4507-8169-8258d7136397] VM Stopped (Lifecycle Event)
Nov 29 07:25:34 compute-0 nova_compute[187185]: 2025-11-29 07:25:34.422 187189 DEBUG nova.compute.manager [None req-f2976ee7-9571-45ee-8d41-4f25103903b0 - - - - - -] [instance: 7c10cb24-586c-4507-8169-8258d7136397] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:25:34 compute-0 nova_compute[187185]: 2025-11-29 07:25:34.429 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:35 compute-0 nova_compute[187185]: 2025-11-29 07:25:35.151 187189 DEBUG nova.network.neutron [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Successfully created port: 723f857c-f89e-440a-83f2-6bf46b479fca _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 07:25:35 compute-0 podman[235176]: 2025-11-29 07:25:35.814401049 +0000 UTC m=+0.071165428 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 07:25:36 compute-0 nova_compute[187185]: 2025-11-29 07:25:36.216 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:38 compute-0 nova_compute[187185]: 2025-11-29 07:25:38.480 187189 DEBUG nova.network.neutron [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Successfully created port: adf9aa84-82bb-4e89-a0a5-7fad93336a39 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 07:25:39 compute-0 nova_compute[187185]: 2025-11-29 07:25:39.431 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:39 compute-0 podman[235200]: 2025-11-29 07:25:39.848036007 +0000 UTC m=+0.104722315 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 07:25:39 compute-0 podman[235201]: 2025-11-29 07:25:39.853214303 +0000 UTC m=+0.104054856 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm)
Nov 29 07:25:40 compute-0 nova_compute[187185]: 2025-11-29 07:25:40.729 187189 DEBUG nova.network.neutron [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Successfully updated port: 723f857c-f89e-440a-83f2-6bf46b479fca _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 07:25:41 compute-0 nova_compute[187185]: 2025-11-29 07:25:41.218 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:42 compute-0 nova_compute[187185]: 2025-11-29 07:25:42.042 187189 DEBUG nova.network.neutron [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Successfully updated port: adf9aa84-82bb-4e89-a0a5-7fad93336a39 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 07:25:42 compute-0 nova_compute[187185]: 2025-11-29 07:25:42.057 187189 DEBUG oslo_concurrency.lockutils [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "refresh_cache-bbcf2b17-c33f-4a89-9f82-60b4dcfa7208" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:25:42 compute-0 nova_compute[187185]: 2025-11-29 07:25:42.058 187189 DEBUG oslo_concurrency.lockutils [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquired lock "refresh_cache-bbcf2b17-c33f-4a89-9f82-60b4dcfa7208" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:25:42 compute-0 nova_compute[187185]: 2025-11-29 07:25:42.058 187189 DEBUG nova.network.neutron [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:25:42 compute-0 nova_compute[187185]: 2025-11-29 07:25:42.174 187189 DEBUG nova.compute.manager [req-e1961052-085d-4858-a035-2cbb7440b681 req-72f8951f-37e0-4384-acc5-0786599709e6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Received event network-changed-723f857c-f89e-440a-83f2-6bf46b479fca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:25:42 compute-0 nova_compute[187185]: 2025-11-29 07:25:42.175 187189 DEBUG nova.compute.manager [req-e1961052-085d-4858-a035-2cbb7440b681 req-72f8951f-37e0-4384-acc5-0786599709e6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Refreshing instance network info cache due to event network-changed-723f857c-f89e-440a-83f2-6bf46b479fca. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:25:42 compute-0 nova_compute[187185]: 2025-11-29 07:25:42.175 187189 DEBUG oslo_concurrency.lockutils [req-e1961052-085d-4858-a035-2cbb7440b681 req-72f8951f-37e0-4384-acc5-0786599709e6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-bbcf2b17-c33f-4a89-9f82-60b4dcfa7208" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:25:42 compute-0 nova_compute[187185]: 2025-11-29 07:25:42.424 187189 DEBUG nova.network.neutron [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.433 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.695 187189 DEBUG nova.network.neutron [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Updating instance_info_cache with network_info: [{"id": "723f857c-f89e-440a-83f2-6bf46b479fca", "address": "fa:16:3e:62:41:bf", "network": {"id": "ae86c83f-be5a-4cd0-9064-11898ee2fcef", "bridge": "br-int", "label": "tempest-network-smoke--695329026", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap723f857c-f8", "ovs_interfaceid": "723f857c-f89e-440a-83f2-6bf46b479fca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "adf9aa84-82bb-4e89-a0a5-7fad93336a39", "address": "fa:16:3e:59:27:b3", "network": {"id": "a3d94aff-5439-43d3-a356-7aafae582344", "bridge": "br-int", "label": "tempest-network-smoke--90551140", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe59:27b3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 
1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapadf9aa84-82", "ovs_interfaceid": "adf9aa84-82bb-4e89-a0a5-7fad93336a39", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.736 187189 DEBUG oslo_concurrency.lockutils [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Releasing lock "refresh_cache-bbcf2b17-c33f-4a89-9f82-60b4dcfa7208" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.737 187189 DEBUG nova.compute.manager [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Instance network_info: |[{"id": "723f857c-f89e-440a-83f2-6bf46b479fca", "address": "fa:16:3e:62:41:bf", "network": {"id": "ae86c83f-be5a-4cd0-9064-11898ee2fcef", "bridge": "br-int", "label": "tempest-network-smoke--695329026", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap723f857c-f8", "ovs_interfaceid": "723f857c-f89e-440a-83f2-6bf46b479fca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "adf9aa84-82bb-4e89-a0a5-7fad93336a39", "address": "fa:16:3e:59:27:b3", "network": {"id": "a3d94aff-5439-43d3-a356-7aafae582344", "bridge": "br-int", "label": "tempest-network-smoke--90551140", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe59:27b3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapadf9aa84-82", "ovs_interfaceid": "adf9aa84-82bb-4e89-a0a5-7fad93336a39", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.737 187189 DEBUG oslo_concurrency.lockutils [req-e1961052-085d-4858-a035-2cbb7440b681 req-72f8951f-37e0-4384-acc5-0786599709e6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-bbcf2b17-c33f-4a89-9f82-60b4dcfa7208" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.738 187189 DEBUG nova.network.neutron [req-e1961052-085d-4858-a035-2cbb7440b681 req-72f8951f-37e0-4384-acc5-0786599709e6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Refreshing network info cache for port 723f857c-f89e-440a-83f2-6bf46b479fca _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.745 187189 DEBUG nova.virt.libvirt.driver [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Start _get_guest_xml network_info=[{"id": "723f857c-f89e-440a-83f2-6bf46b479fca", "address": "fa:16:3e:62:41:bf", "network": {"id": "ae86c83f-be5a-4cd0-9064-11898ee2fcef", "bridge": "br-int", "label": "tempest-network-smoke--695329026", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap723f857c-f8", "ovs_interfaceid": "723f857c-f89e-440a-83f2-6bf46b479fca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "adf9aa84-82bb-4e89-a0a5-7fad93336a39", "address": "fa:16:3e:59:27:b3", "network": {"id": "a3d94aff-5439-43d3-a356-7aafae582344", "bridge": "br-int", "label": "tempest-network-smoke--90551140", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe59:27b3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapadf9aa84-82", "ovs_interfaceid": "adf9aa84-82bb-4e89-a0a5-7fad93336a39", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.755 187189 WARNING nova.virt.libvirt.driver [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.766 187189 DEBUG nova.virt.libvirt.host [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.767 187189 DEBUG nova.virt.libvirt.host [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.773 187189 DEBUG nova.virt.libvirt.host [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.773 187189 DEBUG nova.virt.libvirt.host [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.775 187189 DEBUG nova.virt.libvirt.driver [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.775 187189 DEBUG nova.virt.hardware [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.776 187189 DEBUG nova.virt.hardware [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.776 187189 DEBUG nova.virt.hardware [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.776 187189 DEBUG nova.virt.hardware [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.777 187189 DEBUG nova.virt.hardware [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.777 187189 DEBUG nova.virt.hardware [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.777 187189 DEBUG nova.virt.hardware [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.778 187189 DEBUG nova.virt.hardware [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.778 187189 DEBUG nova.virt.hardware [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.778 187189 DEBUG nova.virt.hardware [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.779 187189 DEBUG nova.virt.hardware [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.783 187189 DEBUG nova.virt.libvirt.vif [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:25:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-670731095',display_name='tempest-TestGettingAddress-server-670731095',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-670731095',id=127,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBfPCDIsgTH7NijT1yLJHsEeHpblJVOMzuD1uWLhNG/kTdHG+MlT37mOzLs3Jd0/9NUkh6NevkJ52dyRmEbrCaMvcIh0EOIGfP4sOHwd11Jy3SL4tJpdp4JARnM5Jon1zg==',key_name='tempest-TestGettingAddress-2055091882',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-08141zmb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:25:33Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=bbcf2b17-c33f-4a89-9f82-60b4dcfa7208,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "723f857c-f89e-440a-83f2-6bf46b479fca", "address": "fa:16:3e:62:41:bf", "network": {"id": "ae86c83f-be5a-4cd0-9064-11898ee2fcef", "bridge": "br-int", "label": "tempest-network-smoke--695329026", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap723f857c-f8", "ovs_interfaceid": "723f857c-f89e-440a-83f2-6bf46b479fca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.784 187189 DEBUG nova.network.os_vif_util [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "723f857c-f89e-440a-83f2-6bf46b479fca", "address": "fa:16:3e:62:41:bf", "network": {"id": "ae86c83f-be5a-4cd0-9064-11898ee2fcef", "bridge": "br-int", "label": "tempest-network-smoke--695329026", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap723f857c-f8", "ovs_interfaceid": "723f857c-f89e-440a-83f2-6bf46b479fca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.784 187189 DEBUG nova.network.os_vif_util [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:62:41:bf,bridge_name='br-int',has_traffic_filtering=True,id=723f857c-f89e-440a-83f2-6bf46b479fca,network=Network(ae86c83f-be5a-4cd0-9064-11898ee2fcef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap723f857c-f8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.785 187189 DEBUG nova.virt.libvirt.vif [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:25:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-670731095',display_name='tempest-TestGettingAddress-server-670731095',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-670731095',id=127,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBfPCDIsgTH7NijT1yLJHsEeHpblJVOMzuD1uWLhNG/kTdHG+MlT37mOzLs3Jd0/9NUkh6NevkJ52dyRmEbrCaMvcIh0EOIGfP4sOHwd11Jy3SL4tJpdp4JARnM5Jon1zg==',key_name='tempest-TestGettingAddress-2055091882',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-08141zmb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:25:33Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=bbcf2b17-c33f-4a89-9f82-60b4dcfa7208,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "adf9aa84-82bb-4e89-a0a5-7fad93336a39", "address": "fa:16:3e:59:27:b3", "network": {"id": "a3d94aff-5439-43d3-a356-7aafae582344", "bridge": "br-int", "label": "tempest-network-smoke--90551140", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe59:27b3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapadf9aa84-82", "ovs_interfaceid": "adf9aa84-82bb-4e89-a0a5-7fad93336a39", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.786 187189 DEBUG nova.network.os_vif_util [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "adf9aa84-82bb-4e89-a0a5-7fad93336a39", "address": "fa:16:3e:59:27:b3", "network": {"id": "a3d94aff-5439-43d3-a356-7aafae582344", "bridge": "br-int", "label": "tempest-network-smoke--90551140", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe59:27b3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapadf9aa84-82", "ovs_interfaceid": "adf9aa84-82bb-4e89-a0a5-7fad93336a39", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.786 187189 DEBUG nova.network.os_vif_util [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:59:27:b3,bridge_name='br-int',has_traffic_filtering=True,id=adf9aa84-82bb-4e89-a0a5-7fad93336a39,network=Network(a3d94aff-5439-43d3-a356-7aafae582344),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapadf9aa84-82') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.787 187189 DEBUG nova.objects.instance [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'pci_devices' on Instance uuid bbcf2b17-c33f-4a89-9f82-60b4dcfa7208 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.803 187189 DEBUG nova.virt.libvirt.driver [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:25:44 compute-0 nova_compute[187185]:   <uuid>bbcf2b17-c33f-4a89-9f82-60b4dcfa7208</uuid>
Nov 29 07:25:44 compute-0 nova_compute[187185]:   <name>instance-0000007f</name>
Nov 29 07:25:44 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 07:25:44 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:25:44 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:25:44 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:       <nova:name>tempest-TestGettingAddress-server-670731095</nova:name>
Nov 29 07:25:44 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:25:44</nova:creationTime>
Nov 29 07:25:44 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 07:25:44 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 07:25:44 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:25:44 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:25:44 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:25:44 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:25:44 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:25:44 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:25:44 compute-0 nova_compute[187185]:         <nova:user uuid="31ac7b05b012433b89143dc9f259644a">tempest-TestGettingAddress-1465017630-project-member</nova:user>
Nov 29 07:25:44 compute-0 nova_compute[187185]:         <nova:project uuid="0111c22b4b954ea586ca20d91ed3970f">tempest-TestGettingAddress-1465017630</nova:project>
Nov 29 07:25:44 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:25:44 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 07:25:44 compute-0 nova_compute[187185]:         <nova:port uuid="723f857c-f89e-440a-83f2-6bf46b479fca">
Nov 29 07:25:44 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:25:44 compute-0 nova_compute[187185]:         <nova:port uuid="adf9aa84-82bb-4e89-a0a5-7fad93336a39">
Nov 29 07:25:44 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fe59:27b3" ipVersion="6"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:25:44 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:25:44 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:25:44 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <system>
Nov 29 07:25:44 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:25:44 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:25:44 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:25:44 compute-0 nova_compute[187185]:       <entry name="serial">bbcf2b17-c33f-4a89-9f82-60b4dcfa7208</entry>
Nov 29 07:25:44 compute-0 nova_compute[187185]:       <entry name="uuid">bbcf2b17-c33f-4a89-9f82-60b4dcfa7208</entry>
Nov 29 07:25:44 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     </system>
Nov 29 07:25:44 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:25:44 compute-0 nova_compute[187185]:   <os>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:   </os>
Nov 29 07:25:44 compute-0 nova_compute[187185]:   <features>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:   </features>
Nov 29 07:25:44 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:25:44 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:25:44 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:25:44 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/bbcf2b17-c33f-4a89-9f82-60b4dcfa7208/disk"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:25:44 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/bbcf2b17-c33f-4a89-9f82-60b4dcfa7208/disk.config"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:25:44 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:62:41:bf"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:       <target dev="tap723f857c-f8"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:25:44 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:59:27:b3"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:       <target dev="tapadf9aa84-82"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:25:44 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/bbcf2b17-c33f-4a89-9f82-60b4dcfa7208/console.log" append="off"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <video>
Nov 29 07:25:44 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     </video>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:25:44 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:25:44 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:25:44 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:25:44 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:25:44 compute-0 nova_compute[187185]: </domain>
Nov 29 07:25:44 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.805 187189 DEBUG nova.compute.manager [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Preparing to wait for external event network-vif-plugged-723f857c-f89e-440a-83f2-6bf46b479fca prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.805 187189 DEBUG oslo_concurrency.lockutils [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "bbcf2b17-c33f-4a89-9f82-60b4dcfa7208-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.806 187189 DEBUG oslo_concurrency.lockutils [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "bbcf2b17-c33f-4a89-9f82-60b4dcfa7208-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.806 187189 DEBUG oslo_concurrency.lockutils [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "bbcf2b17-c33f-4a89-9f82-60b4dcfa7208-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.806 187189 DEBUG nova.compute.manager [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Preparing to wait for external event network-vif-plugged-adf9aa84-82bb-4e89-a0a5-7fad93336a39 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.807 187189 DEBUG oslo_concurrency.lockutils [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "bbcf2b17-c33f-4a89-9f82-60b4dcfa7208-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.807 187189 DEBUG oslo_concurrency.lockutils [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "bbcf2b17-c33f-4a89-9f82-60b4dcfa7208-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.807 187189 DEBUG oslo_concurrency.lockutils [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "bbcf2b17-c33f-4a89-9f82-60b4dcfa7208-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.808 187189 DEBUG nova.virt.libvirt.vif [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:25:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-670731095',display_name='tempest-TestGettingAddress-server-670731095',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-670731095',id=127,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBfPCDIsgTH7NijT1yLJHsEeHpblJVOMzuD1uWLhNG/kTdHG+MlT37mOzLs3Jd0/9NUkh6NevkJ52dyRmEbrCaMvcIh0EOIGfP4sOHwd11Jy3SL4tJpdp4JARnM5Jon1zg==',key_name='tempest-TestGettingAddress-2055091882',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-08141zmb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:25:33Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=bbcf2b17-c33f-4a89-9f82-60b4dcfa7208,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "723f857c-f89e-440a-83f2-6bf46b479fca", "address": "fa:16:3e:62:41:bf", "network": {"id": "ae86c83f-be5a-4cd0-9064-11898ee2fcef", "bridge": "br-int", "label": "tempest-network-smoke--695329026", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap723f857c-f8", "ovs_interfaceid": "723f857c-f89e-440a-83f2-6bf46b479fca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.808 187189 DEBUG nova.network.os_vif_util [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "723f857c-f89e-440a-83f2-6bf46b479fca", "address": "fa:16:3e:62:41:bf", "network": {"id": "ae86c83f-be5a-4cd0-9064-11898ee2fcef", "bridge": "br-int", "label": "tempest-network-smoke--695329026", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap723f857c-f8", "ovs_interfaceid": "723f857c-f89e-440a-83f2-6bf46b479fca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.809 187189 DEBUG nova.network.os_vif_util [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:62:41:bf,bridge_name='br-int',has_traffic_filtering=True,id=723f857c-f89e-440a-83f2-6bf46b479fca,network=Network(ae86c83f-be5a-4cd0-9064-11898ee2fcef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap723f857c-f8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.809 187189 DEBUG os_vif [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:41:bf,bridge_name='br-int',has_traffic_filtering=True,id=723f857c-f89e-440a-83f2-6bf46b479fca,network=Network(ae86c83f-be5a-4cd0-9064-11898ee2fcef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap723f857c-f8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.810 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.810 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.811 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.813 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.813 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap723f857c-f8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.814 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap723f857c-f8, col_values=(('external_ids', {'iface-id': '723f857c-f89e-440a-83f2-6bf46b479fca', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:62:41:bf', 'vm-uuid': 'bbcf2b17-c33f-4a89-9f82-60b4dcfa7208'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.829 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:44 compute-0 NetworkManager[55227]: <info>  [1764401144.8311] manager: (tap723f857c-f8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/205)
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.833 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.836 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.837 187189 INFO os_vif [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:41:bf,bridge_name='br-int',has_traffic_filtering=True,id=723f857c-f89e-440a-83f2-6bf46b479fca,network=Network(ae86c83f-be5a-4cd0-9064-11898ee2fcef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap723f857c-f8')
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.838 187189 DEBUG nova.virt.libvirt.vif [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:25:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-670731095',display_name='tempest-TestGettingAddress-server-670731095',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-670731095',id=127,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBfPCDIsgTH7NijT1yLJHsEeHpblJVOMzuD1uWLhNG/kTdHG+MlT37mOzLs3Jd0/9NUkh6NevkJ52dyRmEbrCaMvcIh0EOIGfP4sOHwd11Jy3SL4tJpdp4JARnM5Jon1zg==',key_name='tempest-TestGettingAddress-2055091882',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-08141zmb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:25:33Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=bbcf2b17-c33f-4a89-9f82-60b4dcfa7208,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "adf9aa84-82bb-4e89-a0a5-7fad93336a39", "address": "fa:16:3e:59:27:b3", "network": {"id": "a3d94aff-5439-43d3-a356-7aafae582344", "bridge": "br-int", "label": "tempest-network-smoke--90551140", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe59:27b3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapadf9aa84-82", "ovs_interfaceid": "adf9aa84-82bb-4e89-a0a5-7fad93336a39", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.839 187189 DEBUG nova.network.os_vif_util [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "adf9aa84-82bb-4e89-a0a5-7fad93336a39", "address": "fa:16:3e:59:27:b3", "network": {"id": "a3d94aff-5439-43d3-a356-7aafae582344", "bridge": "br-int", "label": "tempest-network-smoke--90551140", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe59:27b3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapadf9aa84-82", "ovs_interfaceid": "adf9aa84-82bb-4e89-a0a5-7fad93336a39", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.840 187189 DEBUG nova.network.os_vif_util [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:59:27:b3,bridge_name='br-int',has_traffic_filtering=True,id=adf9aa84-82bb-4e89-a0a5-7fad93336a39,network=Network(a3d94aff-5439-43d3-a356-7aafae582344),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapadf9aa84-82') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.841 187189 DEBUG os_vif [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:59:27:b3,bridge_name='br-int',has_traffic_filtering=True,id=adf9aa84-82bb-4e89-a0a5-7fad93336a39,network=Network(a3d94aff-5439-43d3-a356-7aafae582344),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapadf9aa84-82') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.841 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.842 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.842 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.845 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.845 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapadf9aa84-82, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.846 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapadf9aa84-82, col_values=(('external_ids', {'iface-id': 'adf9aa84-82bb-4e89-a0a5-7fad93336a39', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:59:27:b3', 'vm-uuid': 'bbcf2b17-c33f-4a89-9f82-60b4dcfa7208'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.848 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:44 compute-0 NetworkManager[55227]: <info>  [1764401144.8496] manager: (tapadf9aa84-82): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/206)
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.851 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.856 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.857 187189 INFO os_vif [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:59:27:b3,bridge_name='br-int',has_traffic_filtering=True,id=adf9aa84-82bb-4e89-a0a5-7fad93336a39,network=Network(a3d94aff-5439-43d3-a356-7aafae582344),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapadf9aa84-82')
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.956 187189 DEBUG nova.virt.libvirt.driver [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.957 187189 DEBUG nova.virt.libvirt.driver [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.958 187189 DEBUG nova.virt.libvirt.driver [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No VIF found with MAC fa:16:3e:62:41:bf, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.958 187189 DEBUG nova.virt.libvirt.driver [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No VIF found with MAC fa:16:3e:59:27:b3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:25:44 compute-0 nova_compute[187185]: 2025-11-29 07:25:44.959 187189 INFO nova.virt.libvirt.driver [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Using config drive
Nov 29 07:25:45 compute-0 nova_compute[187185]: 2025-11-29 07:25:45.434 187189 INFO nova.virt.libvirt.driver [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Creating config drive at /var/lib/nova/instances/bbcf2b17-c33f-4a89-9f82-60b4dcfa7208/disk.config
Nov 29 07:25:45 compute-0 nova_compute[187185]: 2025-11-29 07:25:45.444 187189 DEBUG oslo_concurrency.processutils [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bbcf2b17-c33f-4a89-9f82-60b4dcfa7208/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkicavqd5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:25:45 compute-0 nova_compute[187185]: 2025-11-29 07:25:45.581 187189 DEBUG oslo_concurrency.processutils [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bbcf2b17-c33f-4a89-9f82-60b4dcfa7208/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkicavqd5" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:25:45 compute-0 NetworkManager[55227]: <info>  [1764401145.6563] manager: (tap723f857c-f8): new Tun device (/org/freedesktop/NetworkManager/Devices/207)
Nov 29 07:25:45 compute-0 kernel: tap723f857c-f8: entered promiscuous mode
Nov 29 07:25:45 compute-0 ovn_controller[95281]: 2025-11-29T07:25:45Z|00389|binding|INFO|Claiming lport 723f857c-f89e-440a-83f2-6bf46b479fca for this chassis.
Nov 29 07:25:45 compute-0 ovn_controller[95281]: 2025-11-29T07:25:45Z|00390|binding|INFO|723f857c-f89e-440a-83f2-6bf46b479fca: Claiming fa:16:3e:62:41:bf 10.100.0.11
Nov 29 07:25:45 compute-0 nova_compute[187185]: 2025-11-29 07:25:45.661 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:45 compute-0 NetworkManager[55227]: <info>  [1764401145.6760] manager: (tapadf9aa84-82): new Tun device (/org/freedesktop/NetworkManager/Devices/208)
Nov 29 07:25:45 compute-0 nova_compute[187185]: 2025-11-29 07:25:45.675 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:45 compute-0 kernel: tapadf9aa84-82: entered promiscuous mode
Nov 29 07:25:45 compute-0 nova_compute[187185]: 2025-11-29 07:25:45.679 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:45 compute-0 NetworkManager[55227]: <info>  [1764401145.6911] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/209)
Nov 29 07:25:45 compute-0 ovn_controller[95281]: 2025-11-29T07:25:45Z|00391|if_status|INFO|Not updating pb chassis for adf9aa84-82bb-4e89-a0a5-7fad93336a39 now as sb is readonly
Nov 29 07:25:45 compute-0 nova_compute[187185]: 2025-11-29 07:25:45.691 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:45 compute-0 NetworkManager[55227]: <info>  [1764401145.6928] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/210)
Nov 29 07:25:45 compute-0 systemd-udevd[235266]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:25:45 compute-0 systemd-udevd[235267]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:25:45 compute-0 NetworkManager[55227]: <info>  [1764401145.7114] device (tap723f857c-f8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:25:45 compute-0 NetworkManager[55227]: <info>  [1764401145.7130] device (tap723f857c-f8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:25:45 compute-0 NetworkManager[55227]: <info>  [1764401145.7164] device (tapadf9aa84-82): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:25:45 compute-0 NetworkManager[55227]: <info>  [1764401145.7174] device (tapadf9aa84-82): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:25:45 compute-0 systemd-machined[153486]: New machine qemu-51-instance-0000007f.
Nov 29 07:25:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:45.785 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:62:41:bf 10.100.0.11'], port_security=['fa:16:3e:62:41:bf 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'bbcf2b17-c33f-4a89-9f82-60b4dcfa7208', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ae86c83f-be5a-4cd0-9064-11898ee2fcef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'dcb73a4e-e43c-4221-95d7-295071c2bea0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f5c6bb94-c536-451b-a4cb-db984bf0cbdf, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=723f857c-f89e-440a-83f2-6bf46b479fca) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:25:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:45.786 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 723f857c-f89e-440a-83f2-6bf46b479fca in datapath ae86c83f-be5a-4cd0-9064-11898ee2fcef bound to our chassis
Nov 29 07:25:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:45.788 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ae86c83f-be5a-4cd0-9064-11898ee2fcef
Nov 29 07:25:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:45.799 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[135c1843-f3de-438d-b05f-71bde11fde37]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:45.800 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapae86c83f-b1 in ovnmeta-ae86c83f-be5a-4cd0-9064-11898ee2fcef namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:25:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:45.802 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapae86c83f-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:25:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:45.802 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[2ed5976d-b710-41bb-b5c8-4ea93fdc22d9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:45.803 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[46f9482d-e92f-4288-9a25-5bba86733ec3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:45.819 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[c2cfa606-0518-4882-ac0a-208a7b3899b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:45 compute-0 systemd[1]: Started Virtual Machine qemu-51-instance-0000007f.
Nov 29 07:25:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:45.854 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[77d38496-2f93-4c1e-82fa-7f9ded23da01]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:45 compute-0 ovn_controller[95281]: 2025-11-29T07:25:45Z|00392|binding|INFO|Claiming lport adf9aa84-82bb-4e89-a0a5-7fad93336a39 for this chassis.
Nov 29 07:25:45 compute-0 ovn_controller[95281]: 2025-11-29T07:25:45Z|00393|binding|INFO|adf9aa84-82bb-4e89-a0a5-7fad93336a39: Claiming fa:16:3e:59:27:b3 2001:db8::f816:3eff:fe59:27b3
Nov 29 07:25:45 compute-0 ovn_controller[95281]: 2025-11-29T07:25:45Z|00394|binding|INFO|Releasing lport 0d1614aa-fbbf-4ad2-9ee2-8948d583ddf6 from this chassis (sb_readonly=0)
Nov 29 07:25:45 compute-0 ovn_controller[95281]: 2025-11-29T07:25:45Z|00395|binding|INFO|Setting lport 723f857c-f89e-440a-83f2-6bf46b479fca ovn-installed in OVS
Nov 29 07:25:45 compute-0 ovn_controller[95281]: 2025-11-29T07:25:45Z|00396|binding|INFO|Setting lport 723f857c-f89e-440a-83f2-6bf46b479fca up in Southbound
Nov 29 07:25:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:45.914 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:59:27:b3 2001:db8::f816:3eff:fe59:27b3'], port_security=['fa:16:3e:59:27:b3 2001:db8::f816:3eff:fe59:27b3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe59:27b3/64', 'neutron:device_id': 'bbcf2b17-c33f-4a89-9f82-60b4dcfa7208', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a3d94aff-5439-43d3-a356-7aafae582344', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'dcb73a4e-e43c-4221-95d7-295071c2bea0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=890f979e-778b-42a4-aff1-be3795cfb05f, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=adf9aa84-82bb-4e89-a0a5-7fad93336a39) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:25:45 compute-0 nova_compute[187185]: 2025-11-29 07:25:45.914 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:45 compute-0 ovn_controller[95281]: 2025-11-29T07:25:45Z|00397|binding|INFO|Releasing lport 0d1614aa-fbbf-4ad2-9ee2-8948d583ddf6 from this chassis (sb_readonly=0)
Nov 29 07:25:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:45.919 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[3aace0a7-035e-404d-8f94-60894e3bf1bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:45.940 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[547726ef-0886-40c5-b1e8-217145d80722]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:45 compute-0 NetworkManager[55227]: <info>  [1764401145.9414] manager: (tapae86c83f-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/211)
Nov 29 07:25:45 compute-0 systemd-udevd[235272]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:25:45 compute-0 ovn_controller[95281]: 2025-11-29T07:25:45Z|00398|binding|INFO|Setting lport adf9aa84-82bb-4e89-a0a5-7fad93336a39 ovn-installed in OVS
Nov 29 07:25:45 compute-0 ovn_controller[95281]: 2025-11-29T07:25:45Z|00399|binding|INFO|Setting lport adf9aa84-82bb-4e89-a0a5-7fad93336a39 up in Southbound
Nov 29 07:25:45 compute-0 nova_compute[187185]: 2025-11-29 07:25:45.944 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:45.977 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[e82e2d7b-e13f-446b-b599-b2eb343c92b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:45.981 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[0c6bf056-7aa8-4db4-9c93-719fb87744e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:46 compute-0 NetworkManager[55227]: <info>  [1764401146.0077] device (tapae86c83f-b0): carrier: link connected
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:46.013 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[549cd8f6-a33a-4fe1-89ff-1d9cba13a91a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:46.034 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[4aa33bb2-0bd7-411d-91c0-126816670072]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapae86c83f-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fc:2f:e5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 128], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 659367, 'reachable_time': 36535, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235302, 'error': None, 'target': 'ovnmeta-ae86c83f-be5a-4cd0-9064-11898ee2fcef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:46.053 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[8ac89414-02dd-421f-af6b-9c15a34cd545]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefc:2fe5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 659367, 'tstamp': 659367}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235303, 'error': None, 'target': 'ovnmeta-ae86c83f-be5a-4cd0-9064-11898ee2fcef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:46.073 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[9d4948b4-2430-42b4-8d76-910b2381a24f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapae86c83f-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fc:2f:e5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 128], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 659367, 'reachable_time': 36535, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235304, 'error': None, 'target': 'ovnmeta-ae86c83f-be5a-4cd0-9064-11898ee2fcef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:46.109 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[16e86bfd-6896-4431-93d6-480bb7e10346]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.157 187189 DEBUG nova.compute.manager [req-eebc78df-44e8-4f4a-8887-0aa08aa3306c req-643c29d5-5ffe-4dff-bd5f-56e5454b1ed2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Received event network-vif-plugged-723f857c-f89e-440a-83f2-6bf46b479fca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.158 187189 DEBUG oslo_concurrency.lockutils [req-eebc78df-44e8-4f4a-8887-0aa08aa3306c req-643c29d5-5ffe-4dff-bd5f-56e5454b1ed2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "bbcf2b17-c33f-4a89-9f82-60b4dcfa7208-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.159 187189 DEBUG oslo_concurrency.lockutils [req-eebc78df-44e8-4f4a-8887-0aa08aa3306c req-643c29d5-5ffe-4dff-bd5f-56e5454b1ed2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "bbcf2b17-c33f-4a89-9f82-60b4dcfa7208-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.159 187189 DEBUG oslo_concurrency.lockutils [req-eebc78df-44e8-4f4a-8887-0aa08aa3306c req-643c29d5-5ffe-4dff-bd5f-56e5454b1ed2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "bbcf2b17-c33f-4a89-9f82-60b4dcfa7208-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.160 187189 DEBUG nova.compute.manager [req-eebc78df-44e8-4f4a-8887-0aa08aa3306c req-643c29d5-5ffe-4dff-bd5f-56e5454b1ed2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Processing event network-vif-plugged-723f857c-f89e-440a-83f2-6bf46b479fca _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:46.200 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[376e6aef-213f-4b38-bf3f-6938e9cad206]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:46.202 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapae86c83f-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:46.202 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:46.203 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapae86c83f-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:25:46 compute-0 kernel: tapae86c83f-b0: entered promiscuous mode
Nov 29 07:25:46 compute-0 NetworkManager[55227]: <info>  [1764401146.2068] manager: (tapae86c83f-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/212)
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.205 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.210 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:46.212 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapae86c83f-b0, col_values=(('external_ids', {'iface-id': 'e3e5d9ef-c03b-4d54-8f92-11c237a85862'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:25:46 compute-0 ovn_controller[95281]: 2025-11-29T07:25:46Z|00400|binding|INFO|Releasing lport e3e5d9ef-c03b-4d54-8f92-11c237a85862 from this chassis (sb_readonly=0)
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.214 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.216 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:46.219 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ae86c83f-be5a-4cd0-9064-11898ee2fcef.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ae86c83f-be5a-4cd0-9064-11898ee2fcef.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:46.220 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[198f5f16-089c-46fe-b292-0eba6b5da258]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:46.221 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-ae86c83f-be5a-4cd0-9064-11898ee2fcef
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/ae86c83f-be5a-4cd0-9064-11898ee2fcef.pid.haproxy
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID ae86c83f-be5a-4cd0-9064-11898ee2fcef
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:46.222 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ae86c83f-be5a-4cd0-9064-11898ee2fcef', 'env', 'PROCESS_TAG=haproxy-ae86c83f-be5a-4cd0-9064-11898ee2fcef', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ae86c83f-be5a-4cd0-9064-11898ee2fcef.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.238 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.302 187189 DEBUG nova.network.neutron [req-e1961052-085d-4858-a035-2cbb7440b681 req-72f8951f-37e0-4384-acc5-0786599709e6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Updated VIF entry in instance network info cache for port 723f857c-f89e-440a-83f2-6bf46b479fca. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.303 187189 DEBUG nova.network.neutron [req-e1961052-085d-4858-a035-2cbb7440b681 req-72f8951f-37e0-4384-acc5-0786599709e6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Updating instance_info_cache with network_info: [{"id": "723f857c-f89e-440a-83f2-6bf46b479fca", "address": "fa:16:3e:62:41:bf", "network": {"id": "ae86c83f-be5a-4cd0-9064-11898ee2fcef", "bridge": "br-int", "label": "tempest-network-smoke--695329026", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap723f857c-f8", "ovs_interfaceid": "723f857c-f89e-440a-83f2-6bf46b479fca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "adf9aa84-82bb-4e89-a0a5-7fad93336a39", "address": "fa:16:3e:59:27:b3", "network": {"id": "a3d94aff-5439-43d3-a356-7aafae582344", "bridge": "br-int", "label": "tempest-network-smoke--90551140", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe59:27b3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapadf9aa84-82", "ovs_interfaceid": "adf9aa84-82bb-4e89-a0a5-7fad93336a39", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.315 187189 DEBUG nova.compute.manager [req-64a0be19-dead-42ea-abd1-1e298a741e8a req-d5ac6f71-1bed-475b-8a62-25b7ea0e2a4b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Received event network-vif-plugged-adf9aa84-82bb-4e89-a0a5-7fad93336a39 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.315 187189 DEBUG oslo_concurrency.lockutils [req-64a0be19-dead-42ea-abd1-1e298a741e8a req-d5ac6f71-1bed-475b-8a62-25b7ea0e2a4b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "bbcf2b17-c33f-4a89-9f82-60b4dcfa7208-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.316 187189 DEBUG oslo_concurrency.lockutils [req-64a0be19-dead-42ea-abd1-1e298a741e8a req-d5ac6f71-1bed-475b-8a62-25b7ea0e2a4b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "bbcf2b17-c33f-4a89-9f82-60b4dcfa7208-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.316 187189 DEBUG oslo_concurrency.lockutils [req-64a0be19-dead-42ea-abd1-1e298a741e8a req-d5ac6f71-1bed-475b-8a62-25b7ea0e2a4b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "bbcf2b17-c33f-4a89-9f82-60b4dcfa7208-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.316 187189 DEBUG nova.compute.manager [req-64a0be19-dead-42ea-abd1-1e298a741e8a req-d5ac6f71-1bed-475b-8a62-25b7ea0e2a4b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Processing event network-vif-plugged-adf9aa84-82bb-4e89-a0a5-7fad93336a39 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.328 187189 DEBUG oslo_concurrency.lockutils [req-e1961052-085d-4858-a035-2cbb7440b681 req-72f8951f-37e0-4384-acc5-0786599709e6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-bbcf2b17-c33f-4a89-9f82-60b4dcfa7208" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.329 187189 DEBUG nova.compute.manager [req-e1961052-085d-4858-a035-2cbb7440b681 req-72f8951f-37e0-4384-acc5-0786599709e6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Received event network-changed-adf9aa84-82bb-4e89-a0a5-7fad93336a39 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.329 187189 DEBUG nova.compute.manager [req-e1961052-085d-4858-a035-2cbb7440b681 req-72f8951f-37e0-4384-acc5-0786599709e6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Refreshing instance network info cache due to event network-changed-adf9aa84-82bb-4e89-a0a5-7fad93336a39. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.330 187189 DEBUG oslo_concurrency.lockutils [req-e1961052-085d-4858-a035-2cbb7440b681 req-72f8951f-37e0-4384-acc5-0786599709e6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-bbcf2b17-c33f-4a89-9f82-60b4dcfa7208" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.330 187189 DEBUG oslo_concurrency.lockutils [req-e1961052-085d-4858-a035-2cbb7440b681 req-72f8951f-37e0-4384-acc5-0786599709e6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-bbcf2b17-c33f-4a89-9f82-60b4dcfa7208" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.331 187189 DEBUG nova.network.neutron [req-e1961052-085d-4858-a035-2cbb7440b681 req-72f8951f-37e0-4384-acc5-0786599709e6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Refreshing network info cache for port adf9aa84-82bb-4e89-a0a5-7fad93336a39 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.456 187189 DEBUG nova.compute.manager [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Instance event wait completed in 0 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.458 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764401146.4562647, bbcf2b17-c33f-4a89-9f82-60b4dcfa7208 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.458 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] VM Started (Lifecycle Event)
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.462 187189 DEBUG nova.virt.libvirt.driver [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.467 187189 INFO nova.virt.libvirt.driver [-] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Instance spawned successfully.
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.467 187189 DEBUG nova.virt.libvirt.driver [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.487 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.493 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.497 187189 DEBUG nova.virt.libvirt.driver [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.498 187189 DEBUG nova.virt.libvirt.driver [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.498 187189 DEBUG nova.virt.libvirt.driver [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.498 187189 DEBUG nova.virt.libvirt.driver [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.499 187189 DEBUG nova.virt.libvirt.driver [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.499 187189 DEBUG nova.virt.libvirt.driver [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.538 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.539 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764401146.4577863, bbcf2b17-c33f-4a89-9f82-60b4dcfa7208 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.539 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] VM Paused (Lifecycle Event)
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.579 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.589 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764401146.4609752, bbcf2b17-c33f-4a89-9f82-60b4dcfa7208 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.589 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] VM Resumed (Lifecycle Event)
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.614 187189 INFO nova.compute.manager [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Took 13.22 seconds to spawn the instance on the hypervisor.
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.614 187189 DEBUG nova.compute.manager [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.619 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.631 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.672 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:25:46 compute-0 podman[235343]: 2025-11-29 07:25:46.705334434 +0000 UTC m=+0.069901393 container create 07cf9803e720de352ca147f2db2eebe7f27c16a244beaebdff83d1665389700b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ae86c83f-be5a-4cd0-9064-11898ee2fcef, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.734 187189 INFO nova.compute.manager [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Took 14.03 seconds to build instance.
Nov 29 07:25:46 compute-0 podman[235343]: 2025-11-29 07:25:46.66657154 +0000 UTC m=+0.031138549 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:25:46 compute-0 nova_compute[187185]: 2025-11-29 07:25:46.758 187189 DEBUG oslo_concurrency.lockutils [None req-c55132a5-b74c-48be-a5d1-9d31ee8e1bdf 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "bbcf2b17-c33f-4a89-9f82-60b4dcfa7208" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.163s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:25:46 compute-0 systemd[1]: Started libpod-conmon-07cf9803e720de352ca147f2db2eebe7f27c16a244beaebdff83d1665389700b.scope.
Nov 29 07:25:46 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:25:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/31ecee06b277bb3bbe12006d6ee8294511ab996b4eb4a1d3296d530b7464a827/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:25:46 compute-0 podman[235343]: 2025-11-29 07:25:46.825469903 +0000 UTC m=+0.190036882 container init 07cf9803e720de352ca147f2db2eebe7f27c16a244beaebdff83d1665389700b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ae86c83f-be5a-4cd0-9064-11898ee2fcef, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:25:46 compute-0 podman[235343]: 2025-11-29 07:25:46.836520854 +0000 UTC m=+0.201087813 container start 07cf9803e720de352ca147f2db2eebe7f27c16a244beaebdff83d1665389700b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ae86c83f-be5a-4cd0-9064-11898ee2fcef, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 07:25:46 compute-0 neutron-haproxy-ovnmeta-ae86c83f-be5a-4cd0-9064-11898ee2fcef[235358]: [NOTICE]   (235362) : New worker (235364) forked
Nov 29 07:25:46 compute-0 neutron-haproxy-ovnmeta-ae86c83f-be5a-4cd0-9064-11898ee2fcef[235358]: [NOTICE]   (235362) : Loading success.
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:46.917 104254 INFO neutron.agent.ovn.metadata.agent [-] Port adf9aa84-82bb-4e89-a0a5-7fad93336a39 in datapath a3d94aff-5439-43d3-a356-7aafae582344 unbound from our chassis
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:46.921 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a3d94aff-5439-43d3-a356-7aafae582344
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:46.938 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[3752f7bc-7cc4-4c3d-85e6-91abd5ccc2e3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:46.939 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa3d94aff-51 in ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:46.941 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa3d94aff-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:46.941 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[5d16807d-9508-4a75-8e5a-7c8f80f5e8c5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:46.942 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[852c13dc-175b-4fcf-ae04-9fdb9af936f4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:46.959 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[0b230433-fead-4701-a041-f7160d56e491]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:46.983 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e1531651-fca0-4a9f-9b33-e7a4bac59df6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:47 compute-0 nova_compute[187185]: 2025-11-29 07:25:47.027 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:47.051 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[a1d8d352-ed0d-4fd5-9089-6a9ade6f462c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:47 compute-0 NetworkManager[55227]: <info>  [1764401147.0599] manager: (tapa3d94aff-50): new Veth device (/org/freedesktop/NetworkManager/Devices/213)
Nov 29 07:25:47 compute-0 systemd-udevd[235292]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:25:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:47.058 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d5d68e66-3fa6-4dee-a628-d5343fdfba88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:47.098 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[43d80be8-f813-4a08-ae47-3467d4eccc1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:47.103 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[c0a17a55-e4fe-4c76-95fc-5980ce9966ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:47 compute-0 NetworkManager[55227]: <info>  [1764401147.1388] device (tapa3d94aff-50): carrier: link connected
Nov 29 07:25:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:47.148 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[a99fd9fd-31ac-4b9b-88af-1e445660432e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:47.176 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[99cec919-d531-46b4-8786-5caeac711078]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa3d94aff-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:82:4f:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 129], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 659480, 'reachable_time': 21036, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235383, 'error': None, 'target': 'ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:47.200 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[9f9ffa7f-89af-48a8-8617-b23d51c7ee26]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe82:4fda'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 659480, 'tstamp': 659480}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235384, 'error': None, 'target': 'ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:47.223 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[1e93e7ad-2a5d-41e5-8f56-a183088f0a80]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa3d94aff-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:82:4f:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 180, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 180, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 129], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 659480, 'reachable_time': 21036, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235385, 'error': None, 'target': 'ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:47.268 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[b54c8719-f30b-4905-b5c8-a8b8a67c9e5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:47.320 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[5b7228a6-40c7-400d-aa4c-ac1a877e690c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:47.322 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa3d94aff-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:25:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:47.322 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:25:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:47.323 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa3d94aff-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:25:47 compute-0 NetworkManager[55227]: <info>  [1764401147.3260] manager: (tapa3d94aff-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/214)
Nov 29 07:25:47 compute-0 kernel: tapa3d94aff-50: entered promiscuous mode
Nov 29 07:25:47 compute-0 ovn_controller[95281]: 2025-11-29T07:25:47Z|00401|binding|INFO|Releasing lport 06f4ec62-8b16-4a76-9398-b2117639cd20 from this chassis (sb_readonly=0)
Nov 29 07:25:47 compute-0 nova_compute[187185]: 2025-11-29 07:25:47.328 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:47 compute-0 nova_compute[187185]: 2025-11-29 07:25:47.332 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:47.331 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa3d94aff-50, col_values=(('external_ids', {'iface-id': '06f4ec62-8b16-4a76-9398-b2117639cd20'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:25:47 compute-0 nova_compute[187185]: 2025-11-29 07:25:47.358 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:47.359 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a3d94aff-5439-43d3-a356-7aafae582344.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a3d94aff-5439-43d3-a356-7aafae582344.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:25:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:47.361 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[7fada7de-92f1-492e-8ad7-c1b4a30f7eb1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:25:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:47.362 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:25:47 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:25:47 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:25:47 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-a3d94aff-5439-43d3-a356-7aafae582344
Nov 29 07:25:47 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:25:47 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:25:47 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:25:47 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/a3d94aff-5439-43d3-a356-7aafae582344.pid.haproxy
Nov 29 07:25:47 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:25:47 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:25:47 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:25:47 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:25:47 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:25:47 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:25:47 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:25:47 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:25:47 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:25:47 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:25:47 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:25:47 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:25:47 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:25:47 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:25:47 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:25:47 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:25:47 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:25:47 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:25:47 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:25:47 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:25:47 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID a3d94aff-5439-43d3-a356-7aafae582344
Nov 29 07:25:47 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:25:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:25:47.363 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344', 'env', 'PROCESS_TAG=haproxy-a3d94aff-5439-43d3-a356-7aafae582344', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a3d94aff-5439-43d3-a356-7aafae582344.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:25:47 compute-0 nova_compute[187185]: 2025-11-29 07:25:47.828 187189 DEBUG nova.network.neutron [req-e1961052-085d-4858-a035-2cbb7440b681 req-72f8951f-37e0-4384-acc5-0786599709e6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Updated VIF entry in instance network info cache for port adf9aa84-82bb-4e89-a0a5-7fad93336a39. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:25:47 compute-0 nova_compute[187185]: 2025-11-29 07:25:47.829 187189 DEBUG nova.network.neutron [req-e1961052-085d-4858-a035-2cbb7440b681 req-72f8951f-37e0-4384-acc5-0786599709e6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Updating instance_info_cache with network_info: [{"id": "723f857c-f89e-440a-83f2-6bf46b479fca", "address": "fa:16:3e:62:41:bf", "network": {"id": "ae86c83f-be5a-4cd0-9064-11898ee2fcef", "bridge": "br-int", "label": "tempest-network-smoke--695329026", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap723f857c-f8", "ovs_interfaceid": "723f857c-f89e-440a-83f2-6bf46b479fca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "adf9aa84-82bb-4e89-a0a5-7fad93336a39", "address": "fa:16:3e:59:27:b3", "network": {"id": "a3d94aff-5439-43d3-a356-7aafae582344", "bridge": "br-int", "label": "tempest-network-smoke--90551140", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe59:27b3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapadf9aa84-82", "ovs_interfaceid": "adf9aa84-82bb-4e89-a0a5-7fad93336a39", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:25:47 compute-0 podman[235416]: 2025-11-29 07:25:47.746028999 +0000 UTC m=+0.028198876 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:25:47 compute-0 nova_compute[187185]: 2025-11-29 07:25:47.859 187189 DEBUG oslo_concurrency.lockutils [req-e1961052-085d-4858-a035-2cbb7440b681 req-72f8951f-37e0-4384-acc5-0786599709e6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-bbcf2b17-c33f-4a89-9f82-60b4dcfa7208" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:25:48 compute-0 nova_compute[187185]: 2025-11-29 07:25:48.339 187189 DEBUG nova.compute.manager [req-f084c317-dac8-4d48-a7e5-b4b64fbd724e req-898aeccc-f17c-4d59-b358-7f8108070085 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Received event network-vif-plugged-723f857c-f89e-440a-83f2-6bf46b479fca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:25:48 compute-0 nova_compute[187185]: 2025-11-29 07:25:48.339 187189 DEBUG oslo_concurrency.lockutils [req-f084c317-dac8-4d48-a7e5-b4b64fbd724e req-898aeccc-f17c-4d59-b358-7f8108070085 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "bbcf2b17-c33f-4a89-9f82-60b4dcfa7208-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:25:48 compute-0 nova_compute[187185]: 2025-11-29 07:25:48.340 187189 DEBUG oslo_concurrency.lockutils [req-f084c317-dac8-4d48-a7e5-b4b64fbd724e req-898aeccc-f17c-4d59-b358-7f8108070085 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "bbcf2b17-c33f-4a89-9f82-60b4dcfa7208-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:25:48 compute-0 nova_compute[187185]: 2025-11-29 07:25:48.340 187189 DEBUG oslo_concurrency.lockutils [req-f084c317-dac8-4d48-a7e5-b4b64fbd724e req-898aeccc-f17c-4d59-b358-7f8108070085 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "bbcf2b17-c33f-4a89-9f82-60b4dcfa7208-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:25:48 compute-0 nova_compute[187185]: 2025-11-29 07:25:48.340 187189 DEBUG nova.compute.manager [req-f084c317-dac8-4d48-a7e5-b4b64fbd724e req-898aeccc-f17c-4d59-b358-7f8108070085 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] No waiting events found dispatching network-vif-plugged-723f857c-f89e-440a-83f2-6bf46b479fca pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:25:48 compute-0 nova_compute[187185]: 2025-11-29 07:25:48.340 187189 WARNING nova.compute.manager [req-f084c317-dac8-4d48-a7e5-b4b64fbd724e req-898aeccc-f17c-4d59-b358-7f8108070085 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Received unexpected event network-vif-plugged-723f857c-f89e-440a-83f2-6bf46b479fca for instance with vm_state active and task_state None.
Nov 29 07:25:49 compute-0 podman[235416]: 2025-11-29 07:25:49.242396537 +0000 UTC m=+1.524566384 container create 4469ec564937420c0cd49acaf1ee01a11fcc422abb5fcd89a399de6d5648a7d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:25:49 compute-0 systemd[1]: Started libpod-conmon-4469ec564937420c0cd49acaf1ee01a11fcc422abb5fcd89a399de6d5648a7d5.scope.
Nov 29 07:25:49 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:25:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e33e2f8868301b1a02773571b7f9c79e146cb26656f609fb98303cb3e27a629f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:25:49 compute-0 nova_compute[187185]: 2025-11-29 07:25:49.849 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:49 compute-0 nova_compute[187185]: 2025-11-29 07:25:49.896 187189 DEBUG nova.compute.manager [req-48604b5d-d3cf-4db7-8c41-a72b22b91b6e req-fe86a616-5664-4a43-8009-ab6319f430b0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Received event network-vif-plugged-adf9aa84-82bb-4e89-a0a5-7fad93336a39 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:25:49 compute-0 podman[235416]: 2025-11-29 07:25:49.896639262 +0000 UTC m=+2.178809099 container init 4469ec564937420c0cd49acaf1ee01a11fcc422abb5fcd89a399de6d5648a7d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0)
Nov 29 07:25:49 compute-0 nova_compute[187185]: 2025-11-29 07:25:49.896 187189 DEBUG oslo_concurrency.lockutils [req-48604b5d-d3cf-4db7-8c41-a72b22b91b6e req-fe86a616-5664-4a43-8009-ab6319f430b0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "bbcf2b17-c33f-4a89-9f82-60b4dcfa7208-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:25:49 compute-0 nova_compute[187185]: 2025-11-29 07:25:49.897 187189 DEBUG oslo_concurrency.lockutils [req-48604b5d-d3cf-4db7-8c41-a72b22b91b6e req-fe86a616-5664-4a43-8009-ab6319f430b0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "bbcf2b17-c33f-4a89-9f82-60b4dcfa7208-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:25:49 compute-0 nova_compute[187185]: 2025-11-29 07:25:49.897 187189 DEBUG oslo_concurrency.lockutils [req-48604b5d-d3cf-4db7-8c41-a72b22b91b6e req-fe86a616-5664-4a43-8009-ab6319f430b0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "bbcf2b17-c33f-4a89-9f82-60b4dcfa7208-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:25:49 compute-0 nova_compute[187185]: 2025-11-29 07:25:49.897 187189 DEBUG nova.compute.manager [req-48604b5d-d3cf-4db7-8c41-a72b22b91b6e req-fe86a616-5664-4a43-8009-ab6319f430b0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] No waiting events found dispatching network-vif-plugged-adf9aa84-82bb-4e89-a0a5-7fad93336a39 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:25:49 compute-0 nova_compute[187185]: 2025-11-29 07:25:49.897 187189 WARNING nova.compute.manager [req-48604b5d-d3cf-4db7-8c41-a72b22b91b6e req-fe86a616-5664-4a43-8009-ab6319f430b0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Received unexpected event network-vif-plugged-adf9aa84-82bb-4e89-a0a5-7fad93336a39 for instance with vm_state active and task_state None.
Nov 29 07:25:49 compute-0 podman[235416]: 2025-11-29 07:25:49.908109615 +0000 UTC m=+2.190279492 container start 4469ec564937420c0cd49acaf1ee01a11fcc422abb5fcd89a399de6d5648a7d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 07:25:49 compute-0 neutron-haproxy-ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344[235465]: [NOTICE]   (235500) : New worker (235502) forked
Nov 29 07:25:49 compute-0 neutron-haproxy-ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344[235465]: [NOTICE]   (235500) : Loading success.
Nov 29 07:25:49 compute-0 podman[235432]: 2025-11-29 07:25:49.96144104 +0000 UTC m=+0.662079437 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 07:25:49 compute-0 podman[235430]: 2025-11-29 07:25:49.961976515 +0000 UTC m=+0.678529871 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 29 07:25:50 compute-0 podman[235431]: 2025-11-29 07:25:50.038766951 +0000 UTC m=+0.746421166 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, config_id=edpm, name=ubi9-minimal, distribution-scope=public, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., version=9.6, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc.)
Nov 29 07:25:51 compute-0 nova_compute[187185]: 2025-11-29 07:25:51.242 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:53 compute-0 nova_compute[187185]: 2025-11-29 07:25:53.235 187189 DEBUG nova.compute.manager [req-3a679cca-3bb2-4050-8931-7832d5bf23ea req-f7dc8af8-8c49-4fda-8cc0-ed5aa211b9ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Received event network-changed-723f857c-f89e-440a-83f2-6bf46b479fca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:25:53 compute-0 nova_compute[187185]: 2025-11-29 07:25:53.236 187189 DEBUG nova.compute.manager [req-3a679cca-3bb2-4050-8931-7832d5bf23ea req-f7dc8af8-8c49-4fda-8cc0-ed5aa211b9ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Refreshing instance network info cache due to event network-changed-723f857c-f89e-440a-83f2-6bf46b479fca. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:25:53 compute-0 nova_compute[187185]: 2025-11-29 07:25:53.236 187189 DEBUG oslo_concurrency.lockutils [req-3a679cca-3bb2-4050-8931-7832d5bf23ea req-f7dc8af8-8c49-4fda-8cc0-ed5aa211b9ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-bbcf2b17-c33f-4a89-9f82-60b4dcfa7208" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:25:53 compute-0 nova_compute[187185]: 2025-11-29 07:25:53.236 187189 DEBUG oslo_concurrency.lockutils [req-3a679cca-3bb2-4050-8931-7832d5bf23ea req-f7dc8af8-8c49-4fda-8cc0-ed5aa211b9ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-bbcf2b17-c33f-4a89-9f82-60b4dcfa7208" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:25:53 compute-0 nova_compute[187185]: 2025-11-29 07:25:53.236 187189 DEBUG nova.network.neutron [req-3a679cca-3bb2-4050-8931-7832d5bf23ea req-f7dc8af8-8c49-4fda-8cc0-ed5aa211b9ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Refreshing network info cache for port 723f857c-f89e-440a-83f2-6bf46b479fca _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:25:54 compute-0 nova_compute[187185]: 2025-11-29 07:25:54.853 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:56 compute-0 nova_compute[187185]: 2025-11-29 07:25:56.244 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:25:58 compute-0 nova_compute[187185]: 2025-11-29 07:25:58.998 187189 DEBUG nova.network.neutron [req-3a679cca-3bb2-4050-8931-7832d5bf23ea req-f7dc8af8-8c49-4fda-8cc0-ed5aa211b9ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Updated VIF entry in instance network info cache for port 723f857c-f89e-440a-83f2-6bf46b479fca. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:25:59 compute-0 nova_compute[187185]: 2025-11-29 07:25:58.999 187189 DEBUG nova.network.neutron [req-3a679cca-3bb2-4050-8931-7832d5bf23ea req-f7dc8af8-8c49-4fda-8cc0-ed5aa211b9ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Updating instance_info_cache with network_info: [{"id": "723f857c-f89e-440a-83f2-6bf46b479fca", "address": "fa:16:3e:62:41:bf", "network": {"id": "ae86c83f-be5a-4cd0-9064-11898ee2fcef", "bridge": "br-int", "label": "tempest-network-smoke--695329026", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.209", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap723f857c-f8", "ovs_interfaceid": "723f857c-f89e-440a-83f2-6bf46b479fca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "adf9aa84-82bb-4e89-a0a5-7fad93336a39", "address": "fa:16:3e:59:27:b3", "network": {"id": "a3d94aff-5439-43d3-a356-7aafae582344", "bridge": "br-int", "label": "tempest-network-smoke--90551140", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe59:27b3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapadf9aa84-82", "ovs_interfaceid": "adf9aa84-82bb-4e89-a0a5-7fad93336a39", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:25:59 compute-0 nova_compute[187185]: 2025-11-29 07:25:59.039 187189 DEBUG oslo_concurrency.lockutils [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "eafc7a74-759b-40e8-a27f-d9610458b32a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:25:59 compute-0 nova_compute[187185]: 2025-11-29 07:25:59.040 187189 DEBUG oslo_concurrency.lockutils [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "eafc7a74-759b-40e8-a27f-d9610458b32a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:25:59 compute-0 nova_compute[187185]: 2025-11-29 07:25:59.041 187189 DEBUG oslo_concurrency.lockutils [req-3a679cca-3bb2-4050-8931-7832d5bf23ea req-f7dc8af8-8c49-4fda-8cc0-ed5aa211b9ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-bbcf2b17-c33f-4a89-9f82-60b4dcfa7208" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:25:59 compute-0 nova_compute[187185]: 2025-11-29 07:25:59.073 187189 DEBUG nova.compute.manager [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 07:25:59 compute-0 nova_compute[187185]: 2025-11-29 07:25:59.171 187189 DEBUG oslo_concurrency.lockutils [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:25:59 compute-0 nova_compute[187185]: 2025-11-29 07:25:59.172 187189 DEBUG oslo_concurrency.lockutils [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:25:59 compute-0 nova_compute[187185]: 2025-11-29 07:25:59.180 187189 DEBUG nova.virt.hardware [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 07:25:59 compute-0 nova_compute[187185]: 2025-11-29 07:25:59.180 187189 INFO nova.compute.claims [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Claim successful on node compute-0.ctlplane.example.com
Nov 29 07:25:59 compute-0 nova_compute[187185]: 2025-11-29 07:25:59.383 187189 DEBUG nova.compute.provider_tree [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:25:59 compute-0 ovn_controller[95281]: 2025-11-29T07:25:59Z|00051|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:62:41:bf 10.100.0.11
Nov 29 07:25:59 compute-0 ovn_controller[95281]: 2025-11-29T07:25:59Z|00052|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:62:41:bf 10.100.0.11
Nov 29 07:25:59 compute-0 podman[235523]: 2025-11-29 07:25:59.845498401 +0000 UTC m=+0.107298928 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 29 07:25:59 compute-0 nova_compute[187185]: 2025-11-29 07:25:59.855 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:00 compute-0 nova_compute[187185]: 2025-11-29 07:26:00.132 187189 DEBUG nova.scheduler.client.report [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:26:00 compute-0 nova_compute[187185]: 2025-11-29 07:26:00.166 187189 DEBUG oslo_concurrency.lockutils [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.995s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:26:00 compute-0 nova_compute[187185]: 2025-11-29 07:26:00.167 187189 DEBUG nova.compute.manager [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 07:26:00 compute-0 nova_compute[187185]: 2025-11-29 07:26:00.238 187189 DEBUG nova.compute.manager [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 07:26:00 compute-0 nova_compute[187185]: 2025-11-29 07:26:00.239 187189 DEBUG nova.network.neutron [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 07:26:00 compute-0 nova_compute[187185]: 2025-11-29 07:26:00.258 187189 INFO nova.virt.libvirt.driver [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 07:26:00 compute-0 nova_compute[187185]: 2025-11-29 07:26:00.283 187189 DEBUG nova.compute.manager [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 07:26:00 compute-0 nova_compute[187185]: 2025-11-29 07:26:00.432 187189 DEBUG nova.compute.manager [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 07:26:00 compute-0 nova_compute[187185]: 2025-11-29 07:26:00.434 187189 DEBUG nova.virt.libvirt.driver [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 07:26:00 compute-0 nova_compute[187185]: 2025-11-29 07:26:00.434 187189 INFO nova.virt.libvirt.driver [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Creating image(s)
Nov 29 07:26:00 compute-0 nova_compute[187185]: 2025-11-29 07:26:00.435 187189 DEBUG oslo_concurrency.lockutils [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "/var/lib/nova/instances/eafc7a74-759b-40e8-a27f-d9610458b32a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:26:00 compute-0 nova_compute[187185]: 2025-11-29 07:26:00.435 187189 DEBUG oslo_concurrency.lockutils [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "/var/lib/nova/instances/eafc7a74-759b-40e8-a27f-d9610458b32a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:26:00 compute-0 nova_compute[187185]: 2025-11-29 07:26:00.436 187189 DEBUG oslo_concurrency.lockutils [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "/var/lib/nova/instances/eafc7a74-759b-40e8-a27f-d9610458b32a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:26:00 compute-0 nova_compute[187185]: 2025-11-29 07:26:00.454 187189 DEBUG oslo_concurrency.processutils [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:26:00 compute-0 nova_compute[187185]: 2025-11-29 07:26:00.481 187189 DEBUG nova.policy [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '000fb7b950024e16902cd58f2ea16ac9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6d55e57bfd184513a304a61cc1cb3730', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 07:26:00 compute-0 nova_compute[187185]: 2025-11-29 07:26:00.511 187189 DEBUG oslo_concurrency.processutils [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:26:00 compute-0 nova_compute[187185]: 2025-11-29 07:26:00.512 187189 DEBUG oslo_concurrency.lockutils [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:26:00 compute-0 nova_compute[187185]: 2025-11-29 07:26:00.512 187189 DEBUG oslo_concurrency.lockutils [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:26:00 compute-0 nova_compute[187185]: 2025-11-29 07:26:00.523 187189 DEBUG oslo_concurrency.processutils [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:26:00 compute-0 nova_compute[187185]: 2025-11-29 07:26:00.574 187189 DEBUG oslo_concurrency.processutils [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:26:00 compute-0 nova_compute[187185]: 2025-11-29 07:26:00.575 187189 DEBUG oslo_concurrency.processutils [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/eafc7a74-759b-40e8-a27f-d9610458b32a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:26:00 compute-0 nova_compute[187185]: 2025-11-29 07:26:00.614 187189 DEBUG oslo_concurrency.processutils [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/eafc7a74-759b-40e8-a27f-d9610458b32a/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:26:00 compute-0 nova_compute[187185]: 2025-11-29 07:26:00.615 187189 DEBUG oslo_concurrency.lockutils [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:26:00 compute-0 nova_compute[187185]: 2025-11-29 07:26:00.615 187189 DEBUG oslo_concurrency.processutils [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:26:00 compute-0 nova_compute[187185]: 2025-11-29 07:26:00.678 187189 DEBUG oslo_concurrency.processutils [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:26:00 compute-0 nova_compute[187185]: 2025-11-29 07:26:00.679 187189 DEBUG nova.virt.disk.api [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Checking if we can resize image /var/lib/nova/instances/eafc7a74-759b-40e8-a27f-d9610458b32a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:26:00 compute-0 nova_compute[187185]: 2025-11-29 07:26:00.680 187189 DEBUG oslo_concurrency.processutils [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eafc7a74-759b-40e8-a27f-d9610458b32a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:26:00 compute-0 nova_compute[187185]: 2025-11-29 07:26:00.742 187189 DEBUG oslo_concurrency.processutils [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eafc7a74-759b-40e8-a27f-d9610458b32a/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:26:00 compute-0 nova_compute[187185]: 2025-11-29 07:26:00.743 187189 DEBUG nova.virt.disk.api [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Cannot resize image /var/lib/nova/instances/eafc7a74-759b-40e8-a27f-d9610458b32a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:26:00 compute-0 nova_compute[187185]: 2025-11-29 07:26:00.743 187189 DEBUG nova.objects.instance [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lazy-loading 'migration_context' on Instance uuid eafc7a74-759b-40e8-a27f-d9610458b32a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:26:00 compute-0 nova_compute[187185]: 2025-11-29 07:26:00.760 187189 DEBUG nova.virt.libvirt.driver [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 07:26:00 compute-0 nova_compute[187185]: 2025-11-29 07:26:00.760 187189 DEBUG nova.virt.libvirt.driver [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Ensure instance console log exists: /var/lib/nova/instances/eafc7a74-759b-40e8-a27f-d9610458b32a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 07:26:00 compute-0 nova_compute[187185]: 2025-11-29 07:26:00.761 187189 DEBUG oslo_concurrency.lockutils [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:26:00 compute-0 nova_compute[187185]: 2025-11-29 07:26:00.761 187189 DEBUG oslo_concurrency.lockutils [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:26:00 compute-0 nova_compute[187185]: 2025-11-29 07:26:00.761 187189 DEBUG oslo_concurrency.lockutils [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:26:01 compute-0 nova_compute[187185]: 2025-11-29 07:26:01.063 187189 DEBUG nova.network.neutron [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Successfully created port: 74192508-a888-4ab0-9ebe-a3404b5c5812 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 07:26:01 compute-0 nova_compute[187185]: 2025-11-29 07:26:01.246 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:02 compute-0 nova_compute[187185]: 2025-11-29 07:26:02.729 187189 DEBUG nova.network.neutron [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Successfully updated port: 74192508-a888-4ab0-9ebe-a3404b5c5812 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 07:26:02 compute-0 nova_compute[187185]: 2025-11-29 07:26:02.748 187189 DEBUG oslo_concurrency.lockutils [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "refresh_cache-eafc7a74-759b-40e8-a27f-d9610458b32a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:26:02 compute-0 nova_compute[187185]: 2025-11-29 07:26:02.749 187189 DEBUG oslo_concurrency.lockutils [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquired lock "refresh_cache-eafc7a74-759b-40e8-a27f-d9610458b32a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:26:02 compute-0 nova_compute[187185]: 2025-11-29 07:26:02.749 187189 DEBUG nova.network.neutron [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:26:02 compute-0 nova_compute[187185]: 2025-11-29 07:26:02.837 187189 DEBUG nova.compute.manager [req-1b9aa24a-a26c-44f0-a187-8af5d7e06d30 req-2702d1f7-1709-4f91-b548-6c3b164c218c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Received event network-changed-74192508-a888-4ab0-9ebe-a3404b5c5812 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:26:02 compute-0 nova_compute[187185]: 2025-11-29 07:26:02.838 187189 DEBUG nova.compute.manager [req-1b9aa24a-a26c-44f0-a187-8af5d7e06d30 req-2702d1f7-1709-4f91-b548-6c3b164c218c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Refreshing instance network info cache due to event network-changed-74192508-a888-4ab0-9ebe-a3404b5c5812. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:26:02 compute-0 nova_compute[187185]: 2025-11-29 07:26:02.838 187189 DEBUG oslo_concurrency.lockutils [req-1b9aa24a-a26c-44f0-a187-8af5d7e06d30 req-2702d1f7-1709-4f91-b548-6c3b164c218c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-eafc7a74-759b-40e8-a27f-d9610458b32a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:26:03 compute-0 nova_compute[187185]: 2025-11-29 07:26:03.502 187189 DEBUG nova.network.neutron [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.726 187189 DEBUG nova.network.neutron [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Updating instance_info_cache with network_info: [{"id": "74192508-a888-4ab0-9ebe-a3404b5c5812", "address": "fa:16:3e:88:47:ff", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74192508-a8", "ovs_interfaceid": "74192508-a888-4ab0-9ebe-a3404b5c5812", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.784 187189 DEBUG oslo_concurrency.lockutils [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Releasing lock "refresh_cache-eafc7a74-759b-40e8-a27f-d9610458b32a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.784 187189 DEBUG nova.compute.manager [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Instance network_info: |[{"id": "74192508-a888-4ab0-9ebe-a3404b5c5812", "address": "fa:16:3e:88:47:ff", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74192508-a8", "ovs_interfaceid": "74192508-a888-4ab0-9ebe-a3404b5c5812", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.785 187189 DEBUG oslo_concurrency.lockutils [req-1b9aa24a-a26c-44f0-a187-8af5d7e06d30 req-2702d1f7-1709-4f91-b548-6c3b164c218c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-eafc7a74-759b-40e8-a27f-d9610458b32a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.785 187189 DEBUG nova.network.neutron [req-1b9aa24a-a26c-44f0-a187-8af5d7e06d30 req-2702d1f7-1709-4f91-b548-6c3b164c218c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Refreshing network info cache for port 74192508-a888-4ab0-9ebe-a3404b5c5812 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.787 187189 DEBUG nova.virt.libvirt.driver [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Start _get_guest_xml network_info=[{"id": "74192508-a888-4ab0-9ebe-a3404b5c5812", "address": "fa:16:3e:88:47:ff", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74192508-a8", "ovs_interfaceid": "74192508-a888-4ab0-9ebe-a3404b5c5812", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.792 187189 WARNING nova.virt.libvirt.driver [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.801 187189 DEBUG nova.virt.libvirt.host [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.802 187189 DEBUG nova.virt.libvirt.host [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.812 187189 DEBUG nova.virt.libvirt.host [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.813 187189 DEBUG nova.virt.libvirt.host [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.814 187189 DEBUG nova.virt.libvirt.driver [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.814 187189 DEBUG nova.virt.hardware [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.815 187189 DEBUG nova.virt.hardware [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.815 187189 DEBUG nova.virt.hardware [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.815 187189 DEBUG nova.virt.hardware [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.815 187189 DEBUG nova.virt.hardware [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.816 187189 DEBUG nova.virt.hardware [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.816 187189 DEBUG nova.virt.hardware [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.816 187189 DEBUG nova.virt.hardware [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.816 187189 DEBUG nova.virt.hardware [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.816 187189 DEBUG nova.virt.hardware [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.817 187189 DEBUG nova.virt.hardware [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.820 187189 DEBUG nova.virt.libvirt.vif [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:25:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-636885058',display_name='tempest-ServerDiskConfigTestJSON-server-636885058',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-636885058',id=129,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6d55e57bfd184513a304a61cc1cb3730',ramdisk_id='',reservation_id='r-nezhywl9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1282760174',owner_user_name='tempest-ServerDiskCo
nfigTestJSON-1282760174-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:26:00Z,user_data=None,user_id='000fb7b950024e16902cd58f2ea16ac9',uuid=eafc7a74-759b-40e8-a27f-d9610458b32a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "74192508-a888-4ab0-9ebe-a3404b5c5812", "address": "fa:16:3e:88:47:ff", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74192508-a8", "ovs_interfaceid": "74192508-a888-4ab0-9ebe-a3404b5c5812", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.820 187189 DEBUG nova.network.os_vif_util [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converting VIF {"id": "74192508-a888-4ab0-9ebe-a3404b5c5812", "address": "fa:16:3e:88:47:ff", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74192508-a8", "ovs_interfaceid": "74192508-a888-4ab0-9ebe-a3404b5c5812", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.821 187189 DEBUG nova.network.os_vif_util [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:88:47:ff,bridge_name='br-int',has_traffic_filtering=True,id=74192508-a888-4ab0-9ebe-a3404b5c5812,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74192508-a8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.822 187189 DEBUG nova.objects.instance [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lazy-loading 'pci_devices' on Instance uuid eafc7a74-759b-40e8-a27f-d9610458b32a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.842 187189 DEBUG nova.virt.libvirt.driver [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:26:04 compute-0 nova_compute[187185]:   <uuid>eafc7a74-759b-40e8-a27f-d9610458b32a</uuid>
Nov 29 07:26:04 compute-0 nova_compute[187185]:   <name>instance-00000081</name>
Nov 29 07:26:04 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 07:26:04 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:26:04 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:26:04 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:       <nova:name>tempest-ServerDiskConfigTestJSON-server-636885058</nova:name>
Nov 29 07:26:04 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:26:04</nova:creationTime>
Nov 29 07:26:04 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 07:26:04 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 07:26:04 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:26:04 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:26:04 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:26:04 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:26:04 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:26:04 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:26:04 compute-0 nova_compute[187185]:         <nova:user uuid="000fb7b950024e16902cd58f2ea16ac9">tempest-ServerDiskConfigTestJSON-1282760174-project-member</nova:user>
Nov 29 07:26:04 compute-0 nova_compute[187185]:         <nova:project uuid="6d55e57bfd184513a304a61cc1cb3730">tempest-ServerDiskConfigTestJSON-1282760174</nova:project>
Nov 29 07:26:04 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:26:04 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 07:26:04 compute-0 nova_compute[187185]:         <nova:port uuid="74192508-a888-4ab0-9ebe-a3404b5c5812">
Nov 29 07:26:04 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:26:04 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:26:04 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:26:04 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <system>
Nov 29 07:26:04 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:26:04 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:26:04 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:26:04 compute-0 nova_compute[187185]:       <entry name="serial">eafc7a74-759b-40e8-a27f-d9610458b32a</entry>
Nov 29 07:26:04 compute-0 nova_compute[187185]:       <entry name="uuid">eafc7a74-759b-40e8-a27f-d9610458b32a</entry>
Nov 29 07:26:04 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     </system>
Nov 29 07:26:04 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:26:04 compute-0 nova_compute[187185]:   <os>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:   </os>
Nov 29 07:26:04 compute-0 nova_compute[187185]:   <features>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:   </features>
Nov 29 07:26:04 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:26:04 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:26:04 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:26:04 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/eafc7a74-759b-40e8-a27f-d9610458b32a/disk"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:26:04 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/eafc7a74-759b-40e8-a27f-d9610458b32a/disk.config"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:26:04 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:88:47:ff"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:       <target dev="tap74192508-a8"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:26:04 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/eafc7a74-759b-40e8-a27f-d9610458b32a/console.log" append="off"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <video>
Nov 29 07:26:04 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     </video>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:26:04 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:26:04 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:26:04 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:26:04 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:26:04 compute-0 nova_compute[187185]: </domain>
Nov 29 07:26:04 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.844 187189 DEBUG nova.compute.manager [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Preparing to wait for external event network-vif-plugged-74192508-a888-4ab0-9ebe-a3404b5c5812 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.844 187189 DEBUG oslo_concurrency.lockutils [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "eafc7a74-759b-40e8-a27f-d9610458b32a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.845 187189 DEBUG oslo_concurrency.lockutils [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "eafc7a74-759b-40e8-a27f-d9610458b32a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.845 187189 DEBUG oslo_concurrency.lockutils [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "eafc7a74-759b-40e8-a27f-d9610458b32a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.845 187189 DEBUG nova.virt.libvirt.vif [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:25:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-636885058',display_name='tempest-ServerDiskConfigTestJSON-server-636885058',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-636885058',id=129,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6d55e57bfd184513a304a61cc1cb3730',ramdisk_id='',reservation_id='r-nezhywl9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1282760174',owner_user_name='tempest-Se
rverDiskConfigTestJSON-1282760174-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:26:00Z,user_data=None,user_id='000fb7b950024e16902cd58f2ea16ac9',uuid=eafc7a74-759b-40e8-a27f-d9610458b32a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "74192508-a888-4ab0-9ebe-a3404b5c5812", "address": "fa:16:3e:88:47:ff", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74192508-a8", "ovs_interfaceid": "74192508-a888-4ab0-9ebe-a3404b5c5812", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.846 187189 DEBUG nova.network.os_vif_util [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converting VIF {"id": "74192508-a888-4ab0-9ebe-a3404b5c5812", "address": "fa:16:3e:88:47:ff", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74192508-a8", "ovs_interfaceid": "74192508-a888-4ab0-9ebe-a3404b5c5812", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.846 187189 DEBUG nova.network.os_vif_util [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:88:47:ff,bridge_name='br-int',has_traffic_filtering=True,id=74192508-a888-4ab0-9ebe-a3404b5c5812,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74192508-a8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.847 187189 DEBUG os_vif [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:88:47:ff,bridge_name='br-int',has_traffic_filtering=True,id=74192508-a888-4ab0-9ebe-a3404b5c5812,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74192508-a8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.847 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.848 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.848 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.851 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.852 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap74192508-a8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.852 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap74192508-a8, col_values=(('external_ids', {'iface-id': '74192508-a888-4ab0-9ebe-a3404b5c5812', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:88:47:ff', 'vm-uuid': 'eafc7a74-759b-40e8-a27f-d9610458b32a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.854 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:04 compute-0 NetworkManager[55227]: <info>  [1764401164.8558] manager: (tap74192508-a8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/215)
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.856 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.865 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.866 187189 INFO os_vif [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:88:47:ff,bridge_name='br-int',has_traffic_filtering=True,id=74192508-a888-4ab0-9ebe-a3404b5c5812,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74192508-a8')
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.962 187189 DEBUG nova.virt.libvirt.driver [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.962 187189 DEBUG nova.virt.libvirt.driver [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.963 187189 DEBUG nova.virt.libvirt.driver [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] No VIF found with MAC fa:16:3e:88:47:ff, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:26:04 compute-0 nova_compute[187185]: 2025-11-29 07:26:04.964 187189 INFO nova.virt.libvirt.driver [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Using config drive
Nov 29 07:26:05 compute-0 nova_compute[187185]: 2025-11-29 07:26:05.667 187189 INFO nova.virt.libvirt.driver [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Creating config drive at /var/lib/nova/instances/eafc7a74-759b-40e8-a27f-d9610458b32a/disk.config
Nov 29 07:26:05 compute-0 nova_compute[187185]: 2025-11-29 07:26:05.672 187189 DEBUG oslo_concurrency.processutils [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/eafc7a74-759b-40e8-a27f-d9610458b32a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzo8__d6k execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:26:05 compute-0 nova_compute[187185]: 2025-11-29 07:26:05.798 187189 DEBUG oslo_concurrency.processutils [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/eafc7a74-759b-40e8-a27f-d9610458b32a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzo8__d6k" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:26:05 compute-0 NetworkManager[55227]: <info>  [1764401165.8863] manager: (tap74192508-a8): new Tun device (/org/freedesktop/NetworkManager/Devices/216)
Nov 29 07:26:05 compute-0 kernel: tap74192508-a8: entered promiscuous mode
Nov 29 07:26:05 compute-0 nova_compute[187185]: 2025-11-29 07:26:05.933 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:05 compute-0 ovn_controller[95281]: 2025-11-29T07:26:05Z|00402|binding|INFO|Claiming lport 74192508-a888-4ab0-9ebe-a3404b5c5812 for this chassis.
Nov 29 07:26:05 compute-0 ovn_controller[95281]: 2025-11-29T07:26:05Z|00403|binding|INFO|74192508-a888-4ab0-9ebe-a3404b5c5812: Claiming fa:16:3e:88:47:ff 10.100.0.12
Nov 29 07:26:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:05.945 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:88:47:ff 10.100.0.12'], port_security=['fa:16:3e:88:47:ff 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'eafc7a74-759b-40e8-a27f-d9610458b32a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6d55e57bfd184513a304a61cc1cb3730', 'neutron:revision_number': '2', 'neutron:security_group_ids': '44bb1ac9-49ae-4c0a-8013-0db5efadb536', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=804567bc-6857-4eb6-aa00-b449f09c69a2, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=74192508-a888-4ab0-9ebe-a3404b5c5812) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:26:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:05.946 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 74192508-a888-4ab0-9ebe-a3404b5c5812 in datapath 9b34af6b-edf9-4b27-b1dc-2b18c2eec958 bound to our chassis
Nov 29 07:26:05 compute-0 ovn_controller[95281]: 2025-11-29T07:26:05Z|00404|binding|INFO|Setting lport 74192508-a888-4ab0-9ebe-a3404b5c5812 ovn-installed in OVS
Nov 29 07:26:05 compute-0 ovn_controller[95281]: 2025-11-29T07:26:05Z|00405|binding|INFO|Setting lport 74192508-a888-4ab0-9ebe-a3404b5c5812 up in Southbound
Nov 29 07:26:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:05.949 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9b34af6b-edf9-4b27-b1dc-2b18c2eec958
Nov 29 07:26:05 compute-0 nova_compute[187185]: 2025-11-29 07:26:05.952 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:05 compute-0 systemd-udevd[235593]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:26:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:05.962 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[cdb253c7-25be-49cc-b417-372dfd1afd58]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:26:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:05.964 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9b34af6b-e1 in ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:26:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:05.966 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9b34af6b-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:26:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:05.966 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[60bb74fd-5c5b-4716-8f95-8034d4425e3e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:26:05 compute-0 systemd-machined[153486]: New machine qemu-52-instance-00000081.
Nov 29 07:26:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:05.967 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[f5cf1e27-e152-4abe-8057-b54d46e000ad]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:26:05 compute-0 NetworkManager[55227]: <info>  [1764401165.9760] device (tap74192508-a8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:26:05 compute-0 NetworkManager[55227]: <info>  [1764401165.9766] device (tap74192508-a8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:26:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:05.979 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[70c6ef04-8971-4168-8f64-768e7994191e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:26:05 compute-0 systemd[1]: Started Virtual Machine qemu-52-instance-00000081.
Nov 29 07:26:06 compute-0 podman[235576]: 2025-11-29 07:26:06.003356258 +0000 UTC m=+0.121467667 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:06.008 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[a9be22d9-f639-4dec-9518-d09fa3dabf9c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:06.046 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[6f9fad61-fdf2-4c15-a99b-1cb8d5b7bbac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:06.052 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[dd42b3ae-9ed3-4f20-9f5e-4128cf35e7b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:26:06 compute-0 NetworkManager[55227]: <info>  [1764401166.0537] manager: (tap9b34af6b-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/217)
Nov 29 07:26:06 compute-0 systemd-udevd[235600]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:06.085 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[378c3bea-5594-4241-805a-80d3aa43c7ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:06.088 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[9e15491c-5dfc-4942-b24f-02e924288071]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:26:06 compute-0 NetworkManager[55227]: <info>  [1764401166.1070] device (tap9b34af6b-e0): carrier: link connected
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:06.113 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[ee5e3c1b-8825-4ac3-81e0-f170f3c96486]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:06.129 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[4d0360de-d345-4eb4-b426-9aad4d4bc85c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9b34af6b-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fd:40:d9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 131], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 661377, 'reachable_time': 38385, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235638, 'error': None, 'target': 'ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:06.142 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[5835d1a8-8644-43f4-8099-0b4f2ac1f6fb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefd:40d9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 661377, 'tstamp': 661377}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235639, 'error': None, 'target': 'ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:06.156 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[148f1d04-4db7-42f3-9552-392856c3407c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9b34af6b-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fd:40:d9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 131], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 661377, 'reachable_time': 38385, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235640, 'error': None, 'target': 'ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:06.185 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[b05bc8a5-3273-4ecb-b673-f8a21aeaf9f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:26:06 compute-0 nova_compute[187185]: 2025-11-29 07:26:06.248 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:06.252 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[1729993f-ea10-4b46-a39a-c375a72d8e6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:06.253 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9b34af6b-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:06.253 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:06.253 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9b34af6b-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:26:06 compute-0 nova_compute[187185]: 2025-11-29 07:26:06.255 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:06 compute-0 NetworkManager[55227]: <info>  [1764401166.2562] manager: (tap9b34af6b-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/218)
Nov 29 07:26:06 compute-0 kernel: tap9b34af6b-e0: entered promiscuous mode
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:06.259 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9b34af6b-e0, col_values=(('external_ids', {'iface-id': '88f3bff1-58a0-4231-87c4-807c4c2657d5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:26:06 compute-0 nova_compute[187185]: 2025-11-29 07:26:06.260 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:06 compute-0 ovn_controller[95281]: 2025-11-29T07:26:06Z|00406|binding|INFO|Releasing lport 88f3bff1-58a0-4231-87c4-807c4c2657d5 from this chassis (sb_readonly=0)
Nov 29 07:26:06 compute-0 nova_compute[187185]: 2025-11-29 07:26:06.276 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:06.278 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9b34af6b-edf9-4b27-b1dc-2b18c2eec958.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9b34af6b-edf9-4b27-b1dc-2b18c2eec958.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:06.279 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[8a975fd7-e0a2-444c-a5fe-33f9527db8b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:06.280 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-9b34af6b-edf9-4b27-b1dc-2b18c2eec958
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/9b34af6b-edf9-4b27-b1dc-2b18c2eec958.pid.haproxy
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID 9b34af6b-edf9-4b27-b1dc-2b18c2eec958
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:26:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:06.281 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'env', 'PROCESS_TAG=haproxy-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9b34af6b-edf9-4b27-b1dc-2b18c2eec958.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:26:06 compute-0 nova_compute[187185]: 2025-11-29 07:26:06.621 187189 DEBUG nova.compute.manager [req-629fecdf-a70f-49af-8ce7-8fa0bd58f01f req-68fe7c89-51f8-4b16-ada1-1fab0e1380eb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Received event network-vif-plugged-74192508-a888-4ab0-9ebe-a3404b5c5812 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:26:06 compute-0 nova_compute[187185]: 2025-11-29 07:26:06.623 187189 DEBUG oslo_concurrency.lockutils [req-629fecdf-a70f-49af-8ce7-8fa0bd58f01f req-68fe7c89-51f8-4b16-ada1-1fab0e1380eb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "eafc7a74-759b-40e8-a27f-d9610458b32a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:26:06 compute-0 nova_compute[187185]: 2025-11-29 07:26:06.624 187189 DEBUG oslo_concurrency.lockutils [req-629fecdf-a70f-49af-8ce7-8fa0bd58f01f req-68fe7c89-51f8-4b16-ada1-1fab0e1380eb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "eafc7a74-759b-40e8-a27f-d9610458b32a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:26:06 compute-0 nova_compute[187185]: 2025-11-29 07:26:06.624 187189 DEBUG oslo_concurrency.lockutils [req-629fecdf-a70f-49af-8ce7-8fa0bd58f01f req-68fe7c89-51f8-4b16-ada1-1fab0e1380eb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "eafc7a74-759b-40e8-a27f-d9610458b32a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:26:06 compute-0 nova_compute[187185]: 2025-11-29 07:26:06.624 187189 DEBUG nova.compute.manager [req-629fecdf-a70f-49af-8ce7-8fa0bd58f01f req-68fe7c89-51f8-4b16-ada1-1fab0e1380eb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Processing event network-vif-plugged-74192508-a888-4ab0-9ebe-a3404b5c5812 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 07:26:06 compute-0 podman[235675]: 2025-11-29 07:26:06.630243111 +0000 UTC m=+0.059681724 container create 2daeaac3c1b8b269d24769164a245758006849d706196fa1af36115333103d21 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true)
Nov 29 07:26:06 compute-0 nova_compute[187185]: 2025-11-29 07:26:06.641 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764401166.6410768, eafc7a74-759b-40e8-a27f-d9610458b32a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:26:06 compute-0 nova_compute[187185]: 2025-11-29 07:26:06.641 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] VM Started (Lifecycle Event)
Nov 29 07:26:06 compute-0 nova_compute[187185]: 2025-11-29 07:26:06.645 187189 DEBUG nova.compute.manager [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:26:06 compute-0 nova_compute[187185]: 2025-11-29 07:26:06.648 187189 DEBUG nova.virt.libvirt.driver [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 07:26:06 compute-0 nova_compute[187185]: 2025-11-29 07:26:06.652 187189 INFO nova.virt.libvirt.driver [-] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Instance spawned successfully.
Nov 29 07:26:06 compute-0 nova_compute[187185]: 2025-11-29 07:26:06.652 187189 DEBUG nova.virt.libvirt.driver [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 07:26:06 compute-0 nova_compute[187185]: 2025-11-29 07:26:06.654 187189 DEBUG nova.network.neutron [req-1b9aa24a-a26c-44f0-a187-8af5d7e06d30 req-2702d1f7-1709-4f91-b548-6c3b164c218c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Updated VIF entry in instance network info cache for port 74192508-a888-4ab0-9ebe-a3404b5c5812. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:26:06 compute-0 nova_compute[187185]: 2025-11-29 07:26:06.654 187189 DEBUG nova.network.neutron [req-1b9aa24a-a26c-44f0-a187-8af5d7e06d30 req-2702d1f7-1709-4f91-b548-6c3b164c218c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Updating instance_info_cache with network_info: [{"id": "74192508-a888-4ab0-9ebe-a3404b5c5812", "address": "fa:16:3e:88:47:ff", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74192508-a8", "ovs_interfaceid": "74192508-a888-4ab0-9ebe-a3404b5c5812", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:26:06 compute-0 systemd[1]: Started libpod-conmon-2daeaac3c1b8b269d24769164a245758006849d706196fa1af36115333103d21.scope.
Nov 29 07:26:06 compute-0 nova_compute[187185]: 2025-11-29 07:26:06.671 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:26:06 compute-0 nova_compute[187185]: 2025-11-29 07:26:06.676 187189 DEBUG oslo_concurrency.lockutils [req-1b9aa24a-a26c-44f0-a187-8af5d7e06d30 req-2702d1f7-1709-4f91-b548-6c3b164c218c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-eafc7a74-759b-40e8-a27f-d9610458b32a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:26:06 compute-0 nova_compute[187185]: 2025-11-29 07:26:06.677 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:26:06 compute-0 nova_compute[187185]: 2025-11-29 07:26:06.682 187189 DEBUG nova.virt.libvirt.driver [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:26:06 compute-0 nova_compute[187185]: 2025-11-29 07:26:06.682 187189 DEBUG nova.virt.libvirt.driver [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:26:06 compute-0 nova_compute[187185]: 2025-11-29 07:26:06.683 187189 DEBUG nova.virt.libvirt.driver [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:26:06 compute-0 nova_compute[187185]: 2025-11-29 07:26:06.683 187189 DEBUG nova.virt.libvirt.driver [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:26:06 compute-0 nova_compute[187185]: 2025-11-29 07:26:06.684 187189 DEBUG nova.virt.libvirt.driver [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:26:06 compute-0 nova_compute[187185]: 2025-11-29 07:26:06.684 187189 DEBUG nova.virt.libvirt.driver [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:26:06 compute-0 podman[235675]: 2025-11-29 07:26:06.592373963 +0000 UTC m=+0.021812606 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:26:06 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:26:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/16b70a41039a7948cdc10b915386f17fea69f0034eeb65e6d856d7548fbfd869/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:26:06 compute-0 podman[235675]: 2025-11-29 07:26:06.715419964 +0000 UTC m=+0.144858627 container init 2daeaac3c1b8b269d24769164a245758006849d706196fa1af36115333103d21 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:26:06 compute-0 nova_compute[187185]: 2025-11-29 07:26:06.716 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:26:06 compute-0 nova_compute[187185]: 2025-11-29 07:26:06.716 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764401166.6435287, eafc7a74-759b-40e8-a27f-d9610458b32a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:26:06 compute-0 nova_compute[187185]: 2025-11-29 07:26:06.717 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] VM Paused (Lifecycle Event)
Nov 29 07:26:06 compute-0 podman[235675]: 2025-11-29 07:26:06.721297919 +0000 UTC m=+0.150736552 container start 2daeaac3c1b8b269d24769164a245758006849d706196fa1af36115333103d21 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 07:26:06 compute-0 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[235692]: [NOTICE]   (235697) : New worker (235699) forked
Nov 29 07:26:06 compute-0 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[235692]: [NOTICE]   (235697) : Loading success.
Nov 29 07:26:06 compute-0 nova_compute[187185]: 2025-11-29 07:26:06.747 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:26:06 compute-0 nova_compute[187185]: 2025-11-29 07:26:06.752 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764401166.6489472, eafc7a74-759b-40e8-a27f-d9610458b32a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:26:06 compute-0 nova_compute[187185]: 2025-11-29 07:26:06.752 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] VM Resumed (Lifecycle Event)
Nov 29 07:26:06 compute-0 nova_compute[187185]: 2025-11-29 07:26:06.776 187189 INFO nova.compute.manager [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Took 6.34 seconds to spawn the instance on the hypervisor.
Nov 29 07:26:06 compute-0 nova_compute[187185]: 2025-11-29 07:26:06.776 187189 DEBUG nova.compute.manager [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:26:06 compute-0 nova_compute[187185]: 2025-11-29 07:26:06.778 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:26:06 compute-0 nova_compute[187185]: 2025-11-29 07:26:06.787 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:26:06 compute-0 nova_compute[187185]: 2025-11-29 07:26:06.812 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:26:06 compute-0 nova_compute[187185]: 2025-11-29 07:26:06.902 187189 INFO nova.compute.manager [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Took 7.77 seconds to build instance.
Nov 29 07:26:06 compute-0 nova_compute[187185]: 2025-11-29 07:26:06.927 187189 DEBUG oslo_concurrency.lockutils [None req-d3639b35-4fc1-4b21-8ce8-19f5a1883ace 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "eafc7a74-759b-40e8-a27f-d9610458b32a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.887s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:26:09 compute-0 nova_compute[187185]: 2025-11-29 07:26:09.125 187189 DEBUG nova.compute.manager [req-f9b8e1bb-15f9-4657-b0a5-726464ae3b9b req-9a3bfe7a-77a6-442f-a67f-bdf105cb5888 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Received event network-vif-plugged-74192508-a888-4ab0-9ebe-a3404b5c5812 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:26:09 compute-0 nova_compute[187185]: 2025-11-29 07:26:09.126 187189 DEBUG oslo_concurrency.lockutils [req-f9b8e1bb-15f9-4657-b0a5-726464ae3b9b req-9a3bfe7a-77a6-442f-a67f-bdf105cb5888 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "eafc7a74-759b-40e8-a27f-d9610458b32a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:26:09 compute-0 nova_compute[187185]: 2025-11-29 07:26:09.126 187189 DEBUG oslo_concurrency.lockutils [req-f9b8e1bb-15f9-4657-b0a5-726464ae3b9b req-9a3bfe7a-77a6-442f-a67f-bdf105cb5888 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "eafc7a74-759b-40e8-a27f-d9610458b32a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:26:09 compute-0 nova_compute[187185]: 2025-11-29 07:26:09.126 187189 DEBUG oslo_concurrency.lockutils [req-f9b8e1bb-15f9-4657-b0a5-726464ae3b9b req-9a3bfe7a-77a6-442f-a67f-bdf105cb5888 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "eafc7a74-759b-40e8-a27f-d9610458b32a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:26:09 compute-0 nova_compute[187185]: 2025-11-29 07:26:09.126 187189 DEBUG nova.compute.manager [req-f9b8e1bb-15f9-4657-b0a5-726464ae3b9b req-9a3bfe7a-77a6-442f-a67f-bdf105cb5888 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] No waiting events found dispatching network-vif-plugged-74192508-a888-4ab0-9ebe-a3404b5c5812 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:26:09 compute-0 nova_compute[187185]: 2025-11-29 07:26:09.127 187189 WARNING nova.compute.manager [req-f9b8e1bb-15f9-4657-b0a5-726464ae3b9b req-9a3bfe7a-77a6-442f-a67f-bdf105cb5888 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Received unexpected event network-vif-plugged-74192508-a888-4ab0-9ebe-a3404b5c5812 for instance with vm_state active and task_state None.
Nov 29 07:26:09 compute-0 nova_compute[187185]: 2025-11-29 07:26:09.855 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:10 compute-0 podman[235708]: 2025-11-29 07:26:10.795386648 +0000 UTC m=+0.060583120 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:26:10 compute-0 podman[235709]: 2025-11-29 07:26:10.812108399 +0000 UTC m=+0.070982533 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible)
Nov 29 07:26:11 compute-0 nova_compute[187185]: 2025-11-29 07:26:11.253 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:13 compute-0 nova_compute[187185]: 2025-11-29 07:26:13.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:26:13 compute-0 nova_compute[187185]: 2025-11-29 07:26:13.315 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:26:14 compute-0 nova_compute[187185]: 2025-11-29 07:26:14.898 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:15 compute-0 nova_compute[187185]: 2025-11-29 07:26:15.060 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "refresh_cache-5f11adcd-958a-4269-905d-a017406505f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:26:15 compute-0 nova_compute[187185]: 2025-11-29 07:26:15.061 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquired lock "refresh_cache-5f11adcd-958a-4269-905d-a017406505f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:26:15 compute-0 nova_compute[187185]: 2025-11-29 07:26:15.061 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 29 07:26:16 compute-0 nova_compute[187185]: 2025-11-29 07:26:16.256 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:19 compute-0 ovn_controller[95281]: 2025-11-29T07:26:19Z|00053|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:88:47:ff 10.100.0.12
Nov 29 07:26:19 compute-0 ovn_controller[95281]: 2025-11-29T07:26:19Z|00054|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:88:47:ff 10.100.0.12
Nov 29 07:26:19 compute-0 nova_compute[187185]: 2025-11-29 07:26:19.900 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:20 compute-0 nova_compute[187185]: 2025-11-29 07:26:20.010 187189 DEBUG nova.compute.manager [req-d8c13066-885b-48de-bd9e-036eba44549d req-f643a237-f990-414d-abbd-9d9db89cb892 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Received event network-changed-723f857c-f89e-440a-83f2-6bf46b479fca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:26:20 compute-0 nova_compute[187185]: 2025-11-29 07:26:20.010 187189 DEBUG nova.compute.manager [req-d8c13066-885b-48de-bd9e-036eba44549d req-f643a237-f990-414d-abbd-9d9db89cb892 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Refreshing instance network info cache due to event network-changed-723f857c-f89e-440a-83f2-6bf46b479fca. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:26:20 compute-0 nova_compute[187185]: 2025-11-29 07:26:20.011 187189 DEBUG oslo_concurrency.lockutils [req-d8c13066-885b-48de-bd9e-036eba44549d req-f643a237-f990-414d-abbd-9d9db89cb892 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-bbcf2b17-c33f-4a89-9f82-60b4dcfa7208" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:26:20 compute-0 nova_compute[187185]: 2025-11-29 07:26:20.011 187189 DEBUG oslo_concurrency.lockutils [req-d8c13066-885b-48de-bd9e-036eba44549d req-f643a237-f990-414d-abbd-9d9db89cb892 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-bbcf2b17-c33f-4a89-9f82-60b4dcfa7208" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:26:20 compute-0 nova_compute[187185]: 2025-11-29 07:26:20.011 187189 DEBUG nova.network.neutron [req-d8c13066-885b-48de-bd9e-036eba44549d req-f643a237-f990-414d-abbd-9d9db89cb892 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Refreshing network info cache for port 723f857c-f89e-440a-83f2-6bf46b479fca _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:26:20 compute-0 podman[235758]: 2025-11-29 07:26:20.800500156 +0000 UTC m=+0.053197951 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 07:26:20 compute-0 podman[235756]: 2025-11-29 07:26:20.802298417 +0000 UTC m=+0.059099118 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Nov 29 07:26:20 compute-0 podman[235757]: 2025-11-29 07:26:20.815062597 +0000 UTC m=+0.068968426 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 29 07:26:21 compute-0 nova_compute[187185]: 2025-11-29 07:26:21.259 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:24 compute-0 nova_compute[187185]: 2025-11-29 07:26:24.902 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:25.516 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:26:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:25.517 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:26:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:25.519 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:26:26 compute-0 nova_compute[187185]: 2025-11-29 07:26:26.264 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:29 compute-0 nova_compute[187185]: 2025-11-29 07:26:29.594 187189 DEBUG nova.network.neutron [req-d8c13066-885b-48de-bd9e-036eba44549d req-f643a237-f990-414d-abbd-9d9db89cb892 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Updated VIF entry in instance network info cache for port 723f857c-f89e-440a-83f2-6bf46b479fca. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:26:29 compute-0 nova_compute[187185]: 2025-11-29 07:26:29.595 187189 DEBUG nova.network.neutron [req-d8c13066-885b-48de-bd9e-036eba44549d req-f643a237-f990-414d-abbd-9d9db89cb892 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Updating instance_info_cache with network_info: [{"id": "723f857c-f89e-440a-83f2-6bf46b479fca", "address": "fa:16:3e:62:41:bf", "network": {"id": "ae86c83f-be5a-4cd0-9064-11898ee2fcef", "bridge": "br-int", "label": "tempest-network-smoke--695329026", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap723f857c-f8", "ovs_interfaceid": "723f857c-f89e-440a-83f2-6bf46b479fca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "adf9aa84-82bb-4e89-a0a5-7fad93336a39", "address": "fa:16:3e:59:27:b3", "network": {"id": "a3d94aff-5439-43d3-a356-7aafae582344", "bridge": "br-int", "label": "tempest-network-smoke--90551140", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe59:27b3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapadf9aa84-82", "ovs_interfaceid": "adf9aa84-82bb-4e89-a0a5-7fad93336a39", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:26:29 compute-0 nova_compute[187185]: 2025-11-29 07:26:29.827 187189 DEBUG oslo_concurrency.lockutils [None req-134c6108-9b85-46fe-a2f2-6ce48302746b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "bbcf2b17-c33f-4a89-9f82-60b4dcfa7208" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:26:29 compute-0 nova_compute[187185]: 2025-11-29 07:26:29.827 187189 DEBUG oslo_concurrency.lockutils [None req-134c6108-9b85-46fe-a2f2-6ce48302746b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "bbcf2b17-c33f-4a89-9f82-60b4dcfa7208" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:26:29 compute-0 nova_compute[187185]: 2025-11-29 07:26:29.827 187189 DEBUG oslo_concurrency.lockutils [None req-134c6108-9b85-46fe-a2f2-6ce48302746b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "bbcf2b17-c33f-4a89-9f82-60b4dcfa7208-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:26:29 compute-0 nova_compute[187185]: 2025-11-29 07:26:29.828 187189 DEBUG oslo_concurrency.lockutils [None req-134c6108-9b85-46fe-a2f2-6ce48302746b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "bbcf2b17-c33f-4a89-9f82-60b4dcfa7208-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:26:29 compute-0 nova_compute[187185]: 2025-11-29 07:26:29.828 187189 DEBUG oslo_concurrency.lockutils [None req-134c6108-9b85-46fe-a2f2-6ce48302746b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "bbcf2b17-c33f-4a89-9f82-60b4dcfa7208-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:26:29 compute-0 nova_compute[187185]: 2025-11-29 07:26:29.905 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:30 compute-0 nova_compute[187185]: 2025-11-29 07:26:30.487 187189 DEBUG oslo_concurrency.lockutils [req-d8c13066-885b-48de-bd9e-036eba44549d req-f643a237-f990-414d-abbd-9d9db89cb892 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-bbcf2b17-c33f-4a89-9f82-60b4dcfa7208" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:26:30 compute-0 nova_compute[187185]: 2025-11-29 07:26:30.503 187189 DEBUG oslo_concurrency.lockutils [None req-336c97c4-1177-448c-b7ea-d7ada580baae 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "eafc7a74-759b-40e8-a27f-d9610458b32a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:26:30 compute-0 nova_compute[187185]: 2025-11-29 07:26:30.503 187189 DEBUG oslo_concurrency.lockutils [None req-336c97c4-1177-448c-b7ea-d7ada580baae 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "eafc7a74-759b-40e8-a27f-d9610458b32a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:26:30 compute-0 nova_compute[187185]: 2025-11-29 07:26:30.504 187189 DEBUG oslo_concurrency.lockutils [None req-336c97c4-1177-448c-b7ea-d7ada580baae 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "eafc7a74-759b-40e8-a27f-d9610458b32a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:26:30 compute-0 nova_compute[187185]: 2025-11-29 07:26:30.504 187189 DEBUG oslo_concurrency.lockutils [None req-336c97c4-1177-448c-b7ea-d7ada580baae 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "eafc7a74-759b-40e8-a27f-d9610458b32a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:26:30 compute-0 nova_compute[187185]: 2025-11-29 07:26:30.504 187189 DEBUG oslo_concurrency.lockutils [None req-336c97c4-1177-448c-b7ea-d7ada580baae 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "eafc7a74-759b-40e8-a27f-d9610458b32a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:26:30 compute-0 nova_compute[187185]: 2025-11-29 07:26:30.716 187189 INFO nova.compute.manager [None req-134c6108-9b85-46fe-a2f2-6ce48302746b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Terminating instance
Nov 29 07:26:30 compute-0 nova_compute[187185]: 2025-11-29 07:26:30.820 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Updating instance_info_cache with network_info: [{"id": "a1e67d00-8650-44ba-b75d-07f55b8d8810", "address": "fa:16:3e:39:9f:d3", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1e67d00-86", "ovs_interfaceid": "a1e67d00-8650-44ba-b75d-07f55b8d8810", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:26:30 compute-0 podman[235833]: 2025-11-29 07:26:30.847799933 +0000 UTC m=+0.117341061 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:26:31 compute-0 nova_compute[187185]: 2025-11-29 07:26:31.267 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:32 compute-0 nova_compute[187185]: 2025-11-29 07:26:32.125 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Releasing lock "refresh_cache-5f11adcd-958a-4269-905d-a017406505f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:26:32 compute-0 nova_compute[187185]: 2025-11-29 07:26:32.126 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 29 07:26:32 compute-0 nova_compute[187185]: 2025-11-29 07:26:32.127 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:26:32 compute-0 nova_compute[187185]: 2025-11-29 07:26:32.127 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:26:32 compute-0 nova_compute[187185]: 2025-11-29 07:26:32.128 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:26:32 compute-0 nova_compute[187185]: 2025-11-29 07:26:32.128 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:26:32 compute-0 nova_compute[187185]: 2025-11-29 07:26:32.129 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:26:32 compute-0 nova_compute[187185]: 2025-11-29 07:26:32.129 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:26:32 compute-0 nova_compute[187185]: 2025-11-29 07:26:32.129 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:26:32 compute-0 nova_compute[187185]: 2025-11-29 07:26:32.130 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:26:32 compute-0 nova_compute[187185]: 2025-11-29 07:26:32.158 187189 INFO nova.compute.manager [None req-336c97c4-1177-448c-b7ea-d7ada580baae 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Terminating instance
Nov 29 07:26:32 compute-0 nova_compute[187185]: 2025-11-29 07:26:32.870 187189 DEBUG nova.compute.manager [None req-134c6108-9b85-46fe-a2f2-6ce48302746b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 07:26:32 compute-0 kernel: tap723f857c-f8 (unregistering): left promiscuous mode
Nov 29 07:26:32 compute-0 NetworkManager[55227]: <info>  [1764401192.9138] device (tap723f857c-f8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:26:32 compute-0 nova_compute[187185]: 2025-11-29 07:26:32.968 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:32 compute-0 ovn_controller[95281]: 2025-11-29T07:26:32Z|00407|binding|INFO|Releasing lport 723f857c-f89e-440a-83f2-6bf46b479fca from this chassis (sb_readonly=0)
Nov 29 07:26:32 compute-0 ovn_controller[95281]: 2025-11-29T07:26:32Z|00408|binding|INFO|Setting lport 723f857c-f89e-440a-83f2-6bf46b479fca down in Southbound
Nov 29 07:26:32 compute-0 ovn_controller[95281]: 2025-11-29T07:26:32Z|00409|binding|INFO|Removing iface tap723f857c-f8 ovn-installed in OVS
Nov 29 07:26:32 compute-0 nova_compute[187185]: 2025-11-29 07:26:32.974 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:32 compute-0 nova_compute[187185]: 2025-11-29 07:26:32.989 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:32 compute-0 kernel: tapadf9aa84-82 (unregistering): left promiscuous mode
Nov 29 07:26:33 compute-0 NetworkManager[55227]: <info>  [1764401193.0022] device (tapadf9aa84-82): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:26:33 compute-0 ovn_controller[95281]: 2025-11-29T07:26:33Z|00410|binding|INFO|Releasing lport adf9aa84-82bb-4e89-a0a5-7fad93336a39 from this chassis (sb_readonly=1)
Nov 29 07:26:33 compute-0 ovn_controller[95281]: 2025-11-29T07:26:33Z|00411|binding|INFO|Removing iface tapadf9aa84-82 ovn-installed in OVS
Nov 29 07:26:33 compute-0 ovn_controller[95281]: 2025-11-29T07:26:33Z|00412|if_status|INFO|Dropped 4 log messages in last 1496 seconds (most recently, 1496 seconds ago) due to excessive rate
Nov 29 07:26:33 compute-0 ovn_controller[95281]: 2025-11-29T07:26:33Z|00413|if_status|INFO|Not setting lport adf9aa84-82bb-4e89-a0a5-7fad93336a39 down as sb is readonly
Nov 29 07:26:33 compute-0 nova_compute[187185]: 2025-11-29 07:26:33.014 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:33 compute-0 nova_compute[187185]: 2025-11-29 07:26:33.019 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:33 compute-0 nova_compute[187185]: 2025-11-29 07:26:33.029 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:33 compute-0 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d0000007f.scope: Deactivated successfully.
Nov 29 07:26:33 compute-0 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d0000007f.scope: Consumed 15.597s CPU time.
Nov 29 07:26:33 compute-0 systemd-machined[153486]: Machine qemu-51-instance-0000007f terminated.
Nov 29 07:26:33 compute-0 NetworkManager[55227]: <info>  [1764401193.1237] manager: (tapadf9aa84-82): new Tun device (/org/freedesktop/NetworkManager/Devices/219)
Nov 29 07:26:33 compute-0 nova_compute[187185]: 2025-11-29 07:26:33.169 187189 INFO nova.virt.libvirt.driver [-] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Instance destroyed successfully.
Nov 29 07:26:33 compute-0 nova_compute[187185]: 2025-11-29 07:26:33.169 187189 DEBUG nova.objects.instance [None req-134c6108-9b85-46fe-a2f2-6ce48302746b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'resources' on Instance uuid bbcf2b17-c33f-4a89-9f82-60b4dcfa7208 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:26:33 compute-0 nova_compute[187185]: 2025-11-29 07:26:33.273 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:26:33 compute-0 nova_compute[187185]: 2025-11-29 07:26:33.273 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:26:33 compute-0 nova_compute[187185]: 2025-11-29 07:26:33.274 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:26:33 compute-0 nova_compute[187185]: 2025-11-29 07:26:33.274 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:26:33 compute-0 ovn_controller[95281]: 2025-11-29T07:26:33Z|00414|binding|INFO|Setting lport adf9aa84-82bb-4e89-a0a5-7fad93336a39 down in Southbound
Nov 29 07:26:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:33.704 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:62:41:bf 10.100.0.11'], port_security=['fa:16:3e:62:41:bf 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'bbcf2b17-c33f-4a89-9f82-60b4dcfa7208', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ae86c83f-be5a-4cd0-9064-11898ee2fcef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'dcb73a4e-e43c-4221-95d7-295071c2bea0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f5c6bb94-c536-451b-a4cb-db984bf0cbdf, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=723f857c-f89e-440a-83f2-6bf46b479fca) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:26:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:33.707 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 723f857c-f89e-440a-83f2-6bf46b479fca in datapath ae86c83f-be5a-4cd0-9064-11898ee2fcef unbound from our chassis
Nov 29 07:26:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:33.712 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ae86c83f-be5a-4cd0-9064-11898ee2fcef, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:26:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:33.714 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[ed8eb76f-c908-40e7-83cc-5818cfefa26f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:26:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:33.715 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ae86c83f-be5a-4cd0-9064-11898ee2fcef namespace which is not needed anymore
Nov 29 07:26:33 compute-0 nova_compute[187185]: 2025-11-29 07:26:33.896 187189 DEBUG nova.virt.libvirt.vif [None req-134c6108-9b85-46fe-a2f2-6ce48302746b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:25:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-670731095',display_name='tempest-TestGettingAddress-server-670731095',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-670731095',id=127,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBfPCDIsgTH7NijT1yLJHsEeHpblJVOMzuD1uWLhNG/kTdHG+MlT37mOzLs3Jd0/9NUkh6NevkJ52dyRmEbrCaMvcIh0EOIGfP4sOHwd11Jy3SL4tJpdp4JARnM5Jon1zg==',key_name='tempest-TestGettingAddress-2055091882',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:25:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-08141zmb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:25:46Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=bbcf2b17-c33f-4a89-9f82-60b4dcfa7208,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "723f857c-f89e-440a-83f2-6bf46b479fca", "address": "fa:16:3e:62:41:bf", "network": {"id": "ae86c83f-be5a-4cd0-9064-11898ee2fcef", "bridge": "br-int", "label": "tempest-network-smoke--695329026", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.209", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap723f857c-f8", "ovs_interfaceid": "723f857c-f89e-440a-83f2-6bf46b479fca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:26:33 compute-0 nova_compute[187185]: 2025-11-29 07:26:33.896 187189 DEBUG nova.network.os_vif_util [None req-134c6108-9b85-46fe-a2f2-6ce48302746b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "723f857c-f89e-440a-83f2-6bf46b479fca", "address": "fa:16:3e:62:41:bf", "network": {"id": "ae86c83f-be5a-4cd0-9064-11898ee2fcef", "bridge": "br-int", "label": "tempest-network-smoke--695329026", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.209", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap723f857c-f8", "ovs_interfaceid": "723f857c-f89e-440a-83f2-6bf46b479fca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:26:33 compute-0 nova_compute[187185]: 2025-11-29 07:26:33.897 187189 DEBUG nova.network.os_vif_util [None req-134c6108-9b85-46fe-a2f2-6ce48302746b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:62:41:bf,bridge_name='br-int',has_traffic_filtering=True,id=723f857c-f89e-440a-83f2-6bf46b479fca,network=Network(ae86c83f-be5a-4cd0-9064-11898ee2fcef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap723f857c-f8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:26:33 compute-0 nova_compute[187185]: 2025-11-29 07:26:33.897 187189 DEBUG os_vif [None req-134c6108-9b85-46fe-a2f2-6ce48302746b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:62:41:bf,bridge_name='br-int',has_traffic_filtering=True,id=723f857c-f89e-440a-83f2-6bf46b479fca,network=Network(ae86c83f-be5a-4cd0-9064-11898ee2fcef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap723f857c-f8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:26:33 compute-0 nova_compute[187185]: 2025-11-29 07:26:33.901 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:33 compute-0 nova_compute[187185]: 2025-11-29 07:26:33.902 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap723f857c-f8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:26:33 compute-0 nova_compute[187185]: 2025-11-29 07:26:33.904 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:33 compute-0 nova_compute[187185]: 2025-11-29 07:26:33.907 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:26:33 compute-0 nova_compute[187185]: 2025-11-29 07:26:33.914 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:33 compute-0 neutron-haproxy-ovnmeta-ae86c83f-be5a-4cd0-9064-11898ee2fcef[235358]: [NOTICE]   (235362) : haproxy version is 2.8.14-c23fe91
Nov 29 07:26:33 compute-0 neutron-haproxy-ovnmeta-ae86c83f-be5a-4cd0-9064-11898ee2fcef[235358]: [NOTICE]   (235362) : path to executable is /usr/sbin/haproxy
Nov 29 07:26:33 compute-0 neutron-haproxy-ovnmeta-ae86c83f-be5a-4cd0-9064-11898ee2fcef[235358]: [WARNING]  (235362) : Exiting Master process...
Nov 29 07:26:33 compute-0 neutron-haproxy-ovnmeta-ae86c83f-be5a-4cd0-9064-11898ee2fcef[235358]: [ALERT]    (235362) : Current worker (235364) exited with code 143 (Terminated)
Nov 29 07:26:33 compute-0 neutron-haproxy-ovnmeta-ae86c83f-be5a-4cd0-9064-11898ee2fcef[235358]: [WARNING]  (235362) : All workers exited. Exiting... (0)
Nov 29 07:26:33 compute-0 nova_compute[187185]: 2025-11-29 07:26:33.919 187189 INFO os_vif [None req-134c6108-9b85-46fe-a2f2-6ce48302746b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:62:41:bf,bridge_name='br-int',has_traffic_filtering=True,id=723f857c-f89e-440a-83f2-6bf46b479fca,network=Network(ae86c83f-be5a-4cd0-9064-11898ee2fcef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap723f857c-f8')
Nov 29 07:26:33 compute-0 nova_compute[187185]: 2025-11-29 07:26:33.921 187189 DEBUG nova.virt.libvirt.vif [None req-134c6108-9b85-46fe-a2f2-6ce48302746b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:25:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-670731095',display_name='tempest-TestGettingAddress-server-670731095',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-670731095',id=127,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBfPCDIsgTH7NijT1yLJHsEeHpblJVOMzuD1uWLhNG/kTdHG+MlT37mOzLs3Jd0/9NUkh6NevkJ52dyRmEbrCaMvcIh0EOIGfP4sOHwd11Jy3SL4tJpdp4JARnM5Jon1zg==',key_name='tempest-TestGettingAddress-2055091882',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:25:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-08141zmb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:25:46Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=bbcf2b17-c33f-4a89-9f82-60b4dcfa7208,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "adf9aa84-82bb-4e89-a0a5-7fad93336a39", "address": "fa:16:3e:59:27:b3", "network": {"id": "a3d94aff-5439-43d3-a356-7aafae582344", "bridge": "br-int", "label": "tempest-network-smoke--90551140", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe59:27b3", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapadf9aa84-82", "ovs_interfaceid": "adf9aa84-82bb-4e89-a0a5-7fad93336a39", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:26:33 compute-0 nova_compute[187185]: 2025-11-29 07:26:33.921 187189 DEBUG nova.network.os_vif_util [None req-134c6108-9b85-46fe-a2f2-6ce48302746b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "adf9aa84-82bb-4e89-a0a5-7fad93336a39", "address": "fa:16:3e:59:27:b3", "network": {"id": "a3d94aff-5439-43d3-a356-7aafae582344", "bridge": "br-int", "label": "tempest-network-smoke--90551140", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe59:27b3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapadf9aa84-82", "ovs_interfaceid": "adf9aa84-82bb-4e89-a0a5-7fad93336a39", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:26:33 compute-0 nova_compute[187185]: 2025-11-29 07:26:33.922 187189 DEBUG nova.network.os_vif_util [None req-134c6108-9b85-46fe-a2f2-6ce48302746b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:59:27:b3,bridge_name='br-int',has_traffic_filtering=True,id=adf9aa84-82bb-4e89-a0a5-7fad93336a39,network=Network(a3d94aff-5439-43d3-a356-7aafae582344),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapadf9aa84-82') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:26:33 compute-0 nova_compute[187185]: 2025-11-29 07:26:33.922 187189 DEBUG os_vif [None req-134c6108-9b85-46fe-a2f2-6ce48302746b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:59:27:b3,bridge_name='br-int',has_traffic_filtering=True,id=adf9aa84-82bb-4e89-a0a5-7fad93336a39,network=Network(a3d94aff-5439-43d3-a356-7aafae582344),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapadf9aa84-82') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:26:33 compute-0 systemd[1]: libpod-07cf9803e720de352ca147f2db2eebe7f27c16a244beaebdff83d1665389700b.scope: Deactivated successfully.
Nov 29 07:26:33 compute-0 nova_compute[187185]: 2025-11-29 07:26:33.924 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:33 compute-0 nova_compute[187185]: 2025-11-29 07:26:33.924 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapadf9aa84-82, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:26:33 compute-0 nova_compute[187185]: 2025-11-29 07:26:33.926 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:33 compute-0 podman[235918]: 2025-11-29 07:26:33.929770625 +0000 UTC m=+0.066056744 container died 07cf9803e720de352ca147f2db2eebe7f27c16a244beaebdff83d1665389700b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ae86c83f-be5a-4cd0-9064-11898ee2fcef, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:26:33 compute-0 nova_compute[187185]: 2025-11-29 07:26:33.930 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:26:33 compute-0 nova_compute[187185]: 2025-11-29 07:26:33.933 187189 INFO os_vif [None req-134c6108-9b85-46fe-a2f2-6ce48302746b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:59:27:b3,bridge_name='br-int',has_traffic_filtering=True,id=adf9aa84-82bb-4e89-a0a5-7fad93336a39,network=Network(a3d94aff-5439-43d3-a356-7aafae582344),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapadf9aa84-82')
Nov 29 07:26:33 compute-0 nova_compute[187185]: 2025-11-29 07:26:33.934 187189 INFO nova.virt.libvirt.driver [None req-134c6108-9b85-46fe-a2f2-6ce48302746b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Deleting instance files /var/lib/nova/instances/bbcf2b17-c33f-4a89-9f82-60b4dcfa7208_del
Nov 29 07:26:33 compute-0 nova_compute[187185]: 2025-11-29 07:26:33.935 187189 INFO nova.virt.libvirt.driver [None req-134c6108-9b85-46fe-a2f2-6ce48302746b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Deletion of /var/lib/nova/instances/bbcf2b17-c33f-4a89-9f82-60b4dcfa7208_del complete
Nov 29 07:26:33 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-07cf9803e720de352ca147f2db2eebe7f27c16a244beaebdff83d1665389700b-userdata-shm.mount: Deactivated successfully.
Nov 29 07:26:33 compute-0 systemd[1]: var-lib-containers-storage-overlay-31ecee06b277bb3bbe12006d6ee8294511ab996b4eb4a1d3296d530b7464a827-merged.mount: Deactivated successfully.
Nov 29 07:26:33 compute-0 podman[235918]: 2025-11-29 07:26:33.983809559 +0000 UTC m=+0.120095688 container cleanup 07cf9803e720de352ca147f2db2eebe7f27c16a244beaebdff83d1665389700b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ae86c83f-be5a-4cd0-9064-11898ee2fcef, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 29 07:26:34 compute-0 systemd[1]: libpod-conmon-07cf9803e720de352ca147f2db2eebe7f27c16a244beaebdff83d1665389700b.scope: Deactivated successfully.
Nov 29 07:26:34 compute-0 podman[235950]: 2025-11-29 07:26:34.050799299 +0000 UTC m=+0.044240349 container remove 07cf9803e720de352ca147f2db2eebe7f27c16a244beaebdff83d1665389700b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ae86c83f-be5a-4cd0-9064-11898ee2fcef, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 29 07:26:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:34.058 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[0bf9b724-9083-4d9a-a888-cd609e4d3f16]: (4, ('Sat Nov 29 07:26:33 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ae86c83f-be5a-4cd0-9064-11898ee2fcef (07cf9803e720de352ca147f2db2eebe7f27c16a244beaebdff83d1665389700b)\n07cf9803e720de352ca147f2db2eebe7f27c16a244beaebdff83d1665389700b\nSat Nov 29 07:26:33 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ae86c83f-be5a-4cd0-9064-11898ee2fcef (07cf9803e720de352ca147f2db2eebe7f27c16a244beaebdff83d1665389700b)\n07cf9803e720de352ca147f2db2eebe7f27c16a244beaebdff83d1665389700b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:26:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:34.060 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[a134b5ab-a678-4401-b927-c3f1051b3503]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:26:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:34.062 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapae86c83f-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:26:34 compute-0 nova_compute[187185]: 2025-11-29 07:26:34.064 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:34 compute-0 kernel: tapae86c83f-b0: left promiscuous mode
Nov 29 07:26:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:34.071 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[bebecdd6-1e45-4cea-baf9-e76d5a71f0f7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:26:34 compute-0 nova_compute[187185]: 2025-11-29 07:26:34.084 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:34.092 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[91e7291e-0fdf-46f9-95cf-e2cc98941823]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:26:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:34.094 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6fb5bba4-b79e-4714-8ece-20fb79fc93ca]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:26:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:34.120 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[8b7f5472-c4c7-44fa-9f59-d4f4291c9d71]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 659357, 'reachable_time': 17491, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235965, 'error': None, 'target': 'ovnmeta-ae86c83f-be5a-4cd0-9064-11898ee2fcef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:26:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:34.125 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ae86c83f-be5a-4cd0-9064-11898ee2fcef deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:26:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:34.126 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[37b4dc57-0289-4402-b7e2-a5fd3ba64722]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:26:34 compute-0 systemd[1]: run-netns-ovnmeta\x2dae86c83f\x2dbe5a\x2d4cd0\x2d9064\x2d11898ee2fcef.mount: Deactivated successfully.
Nov 29 07:26:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:34.152 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:59:27:b3 2001:db8::f816:3eff:fe59:27b3'], port_security=['fa:16:3e:59:27:b3 2001:db8::f816:3eff:fe59:27b3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe59:27b3/64', 'neutron:device_id': 'bbcf2b17-c33f-4a89-9f82-60b4dcfa7208', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a3d94aff-5439-43d3-a356-7aafae582344', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'dcb73a4e-e43c-4221-95d7-295071c2bea0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=890f979e-778b-42a4-aff1-be3795cfb05f, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=adf9aa84-82bb-4e89-a0a5-7fad93336a39) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:26:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:34.153 104254 INFO neutron.agent.ovn.metadata.agent [-] Port adf9aa84-82bb-4e89-a0a5-7fad93336a39 in datapath a3d94aff-5439-43d3-a356-7aafae582344 unbound from our chassis
Nov 29 07:26:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:34.157 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a3d94aff-5439-43d3-a356-7aafae582344, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:26:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:34.158 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[15b58f28-2555-46ba-a0a6-56397d218e90]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:26:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:34.159 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344 namespace which is not needed anymore
Nov 29 07:26:34 compute-0 neutron-haproxy-ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344[235465]: [NOTICE]   (235500) : haproxy version is 2.8.14-c23fe91
Nov 29 07:26:34 compute-0 neutron-haproxy-ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344[235465]: [NOTICE]   (235500) : path to executable is /usr/sbin/haproxy
Nov 29 07:26:34 compute-0 neutron-haproxy-ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344[235465]: [WARNING]  (235500) : Exiting Master process...
Nov 29 07:26:34 compute-0 neutron-haproxy-ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344[235465]: [WARNING]  (235500) : Exiting Master process...
Nov 29 07:26:34 compute-0 neutron-haproxy-ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344[235465]: [ALERT]    (235500) : Current worker (235502) exited with code 143 (Terminated)
Nov 29 07:26:34 compute-0 neutron-haproxy-ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344[235465]: [WARNING]  (235500) : All workers exited. Exiting... (0)
Nov 29 07:26:34 compute-0 systemd[1]: libpod-4469ec564937420c0cd49acaf1ee01a11fcc422abb5fcd89a399de6d5648a7d5.scope: Deactivated successfully.
Nov 29 07:26:34 compute-0 podman[235983]: 2025-11-29 07:26:34.318975884 +0000 UTC m=+0.049062395 container died 4469ec564937420c0cd49acaf1ee01a11fcc422abb5fcd89a399de6d5648a7d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:26:34 compute-0 systemd[1]: var-lib-containers-storage-overlay-e33e2f8868301b1a02773571b7f9c79e146cb26656f609fb98303cb3e27a629f-merged.mount: Deactivated successfully.
Nov 29 07:26:34 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4469ec564937420c0cd49acaf1ee01a11fcc422abb5fcd89a399de6d5648a7d5-userdata-shm.mount: Deactivated successfully.
Nov 29 07:26:34 compute-0 podman[235983]: 2025-11-29 07:26:34.352594342 +0000 UTC m=+0.082680853 container cleanup 4469ec564937420c0cd49acaf1ee01a11fcc422abb5fcd89a399de6d5648a7d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:26:34 compute-0 systemd[1]: libpod-conmon-4469ec564937420c0cd49acaf1ee01a11fcc422abb5fcd89a399de6d5648a7d5.scope: Deactivated successfully.
Nov 29 07:26:34 compute-0 podman[236010]: 2025-11-29 07:26:34.440154002 +0000 UTC m=+0.061480085 container remove 4469ec564937420c0cd49acaf1ee01a11fcc422abb5fcd89a399de6d5648a7d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 29 07:26:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:34.448 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[aab31172-5513-4f0f-a105-622b480801cd]: (4, ('Sat Nov 29 07:26:34 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344 (4469ec564937420c0cd49acaf1ee01a11fcc422abb5fcd89a399de6d5648a7d5)\n4469ec564937420c0cd49acaf1ee01a11fcc422abb5fcd89a399de6d5648a7d5\nSat Nov 29 07:26:34 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344 (4469ec564937420c0cd49acaf1ee01a11fcc422abb5fcd89a399de6d5648a7d5)\n4469ec564937420c0cd49acaf1ee01a11fcc422abb5fcd89a399de6d5648a7d5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:26:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:34.453 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[c33da216-0ba9-4ee1-8b84-21ba40e714b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:26:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:34.456 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa3d94aff-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:26:34 compute-0 nova_compute[187185]: 2025-11-29 07:26:34.459 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:34 compute-0 kernel: tapa3d94aff-50: left promiscuous mode
Nov 29 07:26:34 compute-0 nova_compute[187185]: 2025-11-29 07:26:34.463 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:34.469 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[c8161b16-4dd2-4560-abbb-fdd283aa029e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:26:34 compute-0 nova_compute[187185]: 2025-11-29 07:26:34.479 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:34.486 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d39e02a1-c52e-4535-bdb3-26c4cec0a0ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:26:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:34.487 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[7b1c390f-2984-469a-9470-d13fb61889c6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:26:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:34.512 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[c95b33a3-d5d7-4192-83bd-ca00c042aaf7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 659470, 'reachable_time': 23746, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236030, 'error': None, 'target': 'ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:26:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:34.516 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a3d94aff-5439-43d3-a356-7aafae582344 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:26:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:34.516 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[7ddabeb7-f022-4c59-aa63-1971ba496c74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:26:34 compute-0 systemd[1]: run-netns-ovnmeta\x2da3d94aff\x2d5439\x2d43d3\x2da356\x2d7aafae582344.mount: Deactivated successfully.
Nov 29 07:26:35 compute-0 nova_compute[187185]: 2025-11-29 07:26:35.099 187189 DEBUG nova.compute.manager [None req-336c97c4-1177-448c-b7ea-d7ada580baae 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 07:26:35 compute-0 kernel: tap74192508-a8 (unregistering): left promiscuous mode
Nov 29 07:26:35 compute-0 NetworkManager[55227]: <info>  [1764401195.1263] device (tap74192508-a8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:26:35 compute-0 ovn_controller[95281]: 2025-11-29T07:26:35Z|00415|binding|INFO|Releasing lport 74192508-a888-4ab0-9ebe-a3404b5c5812 from this chassis (sb_readonly=0)
Nov 29 07:26:35 compute-0 nova_compute[187185]: 2025-11-29 07:26:35.130 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:35 compute-0 ovn_controller[95281]: 2025-11-29T07:26:35Z|00416|binding|INFO|Setting lport 74192508-a888-4ab0-9ebe-a3404b5c5812 down in Southbound
Nov 29 07:26:35 compute-0 ovn_controller[95281]: 2025-11-29T07:26:35Z|00417|binding|INFO|Removing iface tap74192508-a8 ovn-installed in OVS
Nov 29 07:26:35 compute-0 nova_compute[187185]: 2025-11-29 07:26:35.134 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:35 compute-0 nova_compute[187185]: 2025-11-29 07:26:35.155 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:35 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:35.185 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:88:47:ff 10.100.0.12'], port_security=['fa:16:3e:88:47:ff 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'eafc7a74-759b-40e8-a27f-d9610458b32a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6d55e57bfd184513a304a61cc1cb3730', 'neutron:revision_number': '4', 'neutron:security_group_ids': '44bb1ac9-49ae-4c0a-8013-0db5efadb536', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=804567bc-6857-4eb6-aa00-b449f09c69a2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=74192508-a888-4ab0-9ebe-a3404b5c5812) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:26:35 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:35.187 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 74192508-a888-4ab0-9ebe-a3404b5c5812 in datapath 9b34af6b-edf9-4b27-b1dc-2b18c2eec958 unbound from our chassis
Nov 29 07:26:35 compute-0 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d00000081.scope: Deactivated successfully.
Nov 29 07:26:35 compute-0 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d00000081.scope: Consumed 13.679s CPU time.
Nov 29 07:26:35 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:35.192 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9b34af6b-edf9-4b27-b1dc-2b18c2eec958, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:26:35 compute-0 systemd-machined[153486]: Machine qemu-52-instance-00000081 terminated.
Nov 29 07:26:35 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:35.193 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[183e4bad-a1f3-4281-bedd-db8cdc0c1789]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:26:35 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:35.194 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958 namespace which is not needed anymore
Nov 29 07:26:35 compute-0 nova_compute[187185]: 2025-11-29 07:26:35.378 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eafc7a74-759b-40e8-a27f-d9610458b32a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:26:35 compute-0 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[235692]: [NOTICE]   (235697) : haproxy version is 2.8.14-c23fe91
Nov 29 07:26:35 compute-0 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[235692]: [NOTICE]   (235697) : path to executable is /usr/sbin/haproxy
Nov 29 07:26:35 compute-0 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[235692]: [WARNING]  (235697) : Exiting Master process...
Nov 29 07:26:35 compute-0 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[235692]: [WARNING]  (235697) : Exiting Master process...
Nov 29 07:26:35 compute-0 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[235692]: [ALERT]    (235697) : Current worker (235699) exited with code 143 (Terminated)
Nov 29 07:26:35 compute-0 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[235692]: [WARNING]  (235697) : All workers exited. Exiting... (0)
Nov 29 07:26:35 compute-0 systemd[1]: libpod-2daeaac3c1b8b269d24769164a245758006849d706196fa1af36115333103d21.scope: Deactivated successfully.
Nov 29 07:26:35 compute-0 podman[236052]: 2025-11-29 07:26:35.390081056 +0000 UTC m=+0.061838715 container died 2daeaac3c1b8b269d24769164a245758006849d706196fa1af36115333103d21 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:26:35 compute-0 nova_compute[187185]: 2025-11-29 07:26:35.410 187189 INFO nova.virt.libvirt.driver [-] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Instance destroyed successfully.
Nov 29 07:26:35 compute-0 nova_compute[187185]: 2025-11-29 07:26:35.412 187189 DEBUG nova.objects.instance [None req-336c97c4-1177-448c-b7ea-d7ada580baae 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lazy-loading 'resources' on Instance uuid eafc7a74-759b-40e8-a27f-d9610458b32a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:26:35 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2daeaac3c1b8b269d24769164a245758006849d706196fa1af36115333103d21-userdata-shm.mount: Deactivated successfully.
Nov 29 07:26:35 compute-0 systemd[1]: var-lib-containers-storage-overlay-16b70a41039a7948cdc10b915386f17fea69f0034eeb65e6d856d7548fbfd869-merged.mount: Deactivated successfully.
Nov 29 07:26:35 compute-0 podman[236052]: 2025-11-29 07:26:35.442911697 +0000 UTC m=+0.114669346 container cleanup 2daeaac3c1b8b269d24769164a245758006849d706196fa1af36115333103d21 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:26:35 compute-0 systemd[1]: libpod-conmon-2daeaac3c1b8b269d24769164a245758006849d706196fa1af36115333103d21.scope: Deactivated successfully.
Nov 29 07:26:35 compute-0 nova_compute[187185]: 2025-11-29 07:26:35.463 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eafc7a74-759b-40e8-a27f-d9610458b32a/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:26:35 compute-0 nova_compute[187185]: 2025-11-29 07:26:35.464 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eafc7a74-759b-40e8-a27f-d9610458b32a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:26:35 compute-0 podman[236098]: 2025-11-29 07:26:35.567112299 +0000 UTC m=+0.097470319 container remove 2daeaac3c1b8b269d24769164a245758006849d706196fa1af36115333103d21 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 29 07:26:35 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:35.575 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[3842a926-b624-47e3-8b16-d0ebd566f5b7]: (4, ('Sat Nov 29 07:26:35 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958 (2daeaac3c1b8b269d24769164a245758006849d706196fa1af36115333103d21)\n2daeaac3c1b8b269d24769164a245758006849d706196fa1af36115333103d21\nSat Nov 29 07:26:35 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958 (2daeaac3c1b8b269d24769164a245758006849d706196fa1af36115333103d21)\n2daeaac3c1b8b269d24769164a245758006849d706196fa1af36115333103d21\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:26:35 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:35.577 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[fd6975e0-b1eb-45c5-9645-390679446036]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:26:35 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:35.578 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9b34af6b-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:26:35 compute-0 nova_compute[187185]: 2025-11-29 07:26:35.580 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:35 compute-0 kernel: tap9b34af6b-e0: left promiscuous mode
Nov 29 07:26:35 compute-0 nova_compute[187185]: 2025-11-29 07:26:35.590 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eafc7a74-759b-40e8-a27f-d9610458b32a/disk --force-share --output=json" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:26:35 compute-0 nova_compute[187185]: 2025-11-29 07:26:35.597 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:26:35 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:35.600 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[1235d9b1-d936-404c-bf21-30b66b875540]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:26:35 compute-0 nova_compute[187185]: 2025-11-29 07:26:35.623 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:35 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:35.629 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[4be86c70-961d-425d-9aee-df167929e57f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:26:35 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:35.631 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[7826d82c-8a54-4642-99ff-e3afb9a947a8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:26:35 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:35.645 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[9809469a-ce25-4919-9d40-94e5fa6e0718]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 661370, 'reachable_time': 19761, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236121, 'error': None, 'target': 'ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:26:35 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:35.647 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:26:35 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:35.647 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[c7a6f4c0-e2f7-4866-bec6-b0f239b10f5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:26:35 compute-0 systemd[1]: run-netns-ovnmeta\x2d9b34af6b\x2dedf9\x2d4b27\x2db1dc\x2d2b18c2eec958.mount: Deactivated successfully.
Nov 29 07:26:35 compute-0 nova_compute[187185]: 2025-11-29 07:26:35.678 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:26:35 compute-0 nova_compute[187185]: 2025-11-29 07:26:35.680 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:26:35 compute-0 nova_compute[187185]: 2025-11-29 07:26:35.767 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:26:35 compute-0 nova_compute[187185]: 2025-11-29 07:26:35.774 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Periodic task is updating the host stats, it is trying to get disk info for instance-0000007f, but the backing disk storage was removed by a concurrent operation such as resize. Error: No disk at /var/lib/nova/instances/bbcf2b17-c33f-4a89-9f82-60b4dcfa7208/disk: nova.exception.DiskNotFound: No disk at /var/lib/nova/instances/bbcf2b17-c33f-4a89-9f82-60b4dcfa7208/disk
Nov 29 07:26:35 compute-0 nova_compute[187185]: 2025-11-29 07:26:35.964 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:26:35 compute-0 nova_compute[187185]: 2025-11-29 07:26:35.965 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5366MB free_disk=73.15781021118164GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:26:35 compute-0 nova_compute[187185]: 2025-11-29 07:26:35.965 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:26:35 compute-0 nova_compute[187185]: 2025-11-29 07:26:35.966 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:26:36 compute-0 nova_compute[187185]: 2025-11-29 07:26:36.321 187189 DEBUG nova.virt.libvirt.vif [None req-336c97c4-1177-448c-b7ea-d7ada580baae 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:25:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-636885058',display_name='tempest-ServerDiskConfigTestJSON-server-636885058',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-636885058',id=129,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:26:06Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6d55e57bfd184513a304a61cc1cb3730',ramdisk_id='',reservation_id='r-nezhywl9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',im
age_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1282760174',owner_user_name='tempest-ServerDiskConfigTestJSON-1282760174-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:26:10Z,user_data=None,user_id='000fb7b950024e16902cd58f2ea16ac9',uuid=eafc7a74-759b-40e8-a27f-d9610458b32a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "74192508-a888-4ab0-9ebe-a3404b5c5812", "address": "fa:16:3e:88:47:ff", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74192508-a8", "ovs_interfaceid": "74192508-a888-4ab0-9ebe-a3404b5c5812", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:26:36 compute-0 nova_compute[187185]: 2025-11-29 07:26:36.322 187189 DEBUG nova.network.os_vif_util [None req-336c97c4-1177-448c-b7ea-d7ada580baae 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converting VIF {"id": "74192508-a888-4ab0-9ebe-a3404b5c5812", "address": "fa:16:3e:88:47:ff", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74192508-a8", "ovs_interfaceid": "74192508-a888-4ab0-9ebe-a3404b5c5812", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:26:36 compute-0 nova_compute[187185]: 2025-11-29 07:26:36.324 187189 DEBUG nova.network.os_vif_util [None req-336c97c4-1177-448c-b7ea-d7ada580baae 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:88:47:ff,bridge_name='br-int',has_traffic_filtering=True,id=74192508-a888-4ab0-9ebe-a3404b5c5812,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74192508-a8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:26:36 compute-0 nova_compute[187185]: 2025-11-29 07:26:36.325 187189 DEBUG os_vif [None req-336c97c4-1177-448c-b7ea-d7ada580baae 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:88:47:ff,bridge_name='br-int',has_traffic_filtering=True,id=74192508-a888-4ab0-9ebe-a3404b5c5812,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74192508-a8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:26:36 compute-0 nova_compute[187185]: 2025-11-29 07:26:36.328 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:36 compute-0 nova_compute[187185]: 2025-11-29 07:26:36.331 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap74192508-a8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:26:36 compute-0 nova_compute[187185]: 2025-11-29 07:26:36.334 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:36 compute-0 nova_compute[187185]: 2025-11-29 07:26:36.336 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:36 compute-0 nova_compute[187185]: 2025-11-29 07:26:36.341 187189 INFO os_vif [None req-336c97c4-1177-448c-b7ea-d7ada580baae 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:88:47:ff,bridge_name='br-int',has_traffic_filtering=True,id=74192508-a888-4ab0-9ebe-a3404b5c5812,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74192508-a8')
Nov 29 07:26:36 compute-0 nova_compute[187185]: 2025-11-29 07:26:36.343 187189 INFO nova.virt.libvirt.driver [None req-336c97c4-1177-448c-b7ea-d7ada580baae 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Deleting instance files /var/lib/nova/instances/eafc7a74-759b-40e8-a27f-d9610458b32a_del
Nov 29 07:26:36 compute-0 nova_compute[187185]: 2025-11-29 07:26:36.344 187189 INFO nova.virt.libvirt.driver [None req-336c97c4-1177-448c-b7ea-d7ada580baae 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Deletion of /var/lib/nova/instances/eafc7a74-759b-40e8-a27f-d9610458b32a_del complete
Nov 29 07:26:36 compute-0 nova_compute[187185]: 2025-11-29 07:26:36.837 187189 INFO nova.compute.manager [None req-134c6108-9b85-46fe-a2f2-6ce48302746b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Took 3.97 seconds to destroy the instance on the hypervisor.
Nov 29 07:26:36 compute-0 nova_compute[187185]: 2025-11-29 07:26:36.837 187189 DEBUG oslo.service.loopingcall [None req-134c6108-9b85-46fe-a2f2-6ce48302746b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 07:26:36 compute-0 nova_compute[187185]: 2025-11-29 07:26:36.838 187189 DEBUG nova.compute.manager [-] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 07:26:36 compute-0 nova_compute[187185]: 2025-11-29 07:26:36.838 187189 DEBUG nova.network.neutron [-] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 07:26:36 compute-0 podman[236128]: 2025-11-29 07:26:36.851619782 +0000 UTC m=+0.093542520 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 07:26:36 compute-0 nova_compute[187185]: 2025-11-29 07:26:36.976 187189 DEBUG nova.compute.manager [req-98d5b4b7-dbf5-42c1-8eef-f2552d3ea0ad req-70233ff7-ac0c-4d08-86b8-5ddb01550964 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Received event network-vif-unplugged-723f857c-f89e-440a-83f2-6bf46b479fca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:26:36 compute-0 nova_compute[187185]: 2025-11-29 07:26:36.977 187189 DEBUG oslo_concurrency.lockutils [req-98d5b4b7-dbf5-42c1-8eef-f2552d3ea0ad req-70233ff7-ac0c-4d08-86b8-5ddb01550964 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "bbcf2b17-c33f-4a89-9f82-60b4dcfa7208-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:26:36 compute-0 nova_compute[187185]: 2025-11-29 07:26:36.977 187189 DEBUG oslo_concurrency.lockutils [req-98d5b4b7-dbf5-42c1-8eef-f2552d3ea0ad req-70233ff7-ac0c-4d08-86b8-5ddb01550964 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "bbcf2b17-c33f-4a89-9f82-60b4dcfa7208-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:26:36 compute-0 nova_compute[187185]: 2025-11-29 07:26:36.977 187189 DEBUG oslo_concurrency.lockutils [req-98d5b4b7-dbf5-42c1-8eef-f2552d3ea0ad req-70233ff7-ac0c-4d08-86b8-5ddb01550964 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "bbcf2b17-c33f-4a89-9f82-60b4dcfa7208-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:26:36 compute-0 nova_compute[187185]: 2025-11-29 07:26:36.977 187189 DEBUG nova.compute.manager [req-98d5b4b7-dbf5-42c1-8eef-f2552d3ea0ad req-70233ff7-ac0c-4d08-86b8-5ddb01550964 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] No waiting events found dispatching network-vif-unplugged-723f857c-f89e-440a-83f2-6bf46b479fca pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:26:36 compute-0 nova_compute[187185]: 2025-11-29 07:26:36.978 187189 DEBUG nova.compute.manager [req-98d5b4b7-dbf5-42c1-8eef-f2552d3ea0ad req-70233ff7-ac0c-4d08-86b8-5ddb01550964 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Received event network-vif-unplugged-723f857c-f89e-440a-83f2-6bf46b479fca for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 07:26:37 compute-0 nova_compute[187185]: 2025-11-29 07:26:37.304 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance 5f11adcd-958a-4269-905d-a017406505f0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 07:26:37 compute-0 nova_compute[187185]: 2025-11-29 07:26:37.305 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance bbcf2b17-c33f-4a89-9f82-60b4dcfa7208 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 07:26:37 compute-0 nova_compute[187185]: 2025-11-29 07:26:37.305 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance eafc7a74-759b-40e8-a27f-d9610458b32a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 07:26:37 compute-0 nova_compute[187185]: 2025-11-29 07:26:37.305 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:26:37 compute-0 nova_compute[187185]: 2025-11-29 07:26:37.306 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:26:37 compute-0 nova_compute[187185]: 2025-11-29 07:26:37.405 187189 DEBUG nova.compute.manager [req-1610ad67-3637-4a84-a0b5-6deda142ac57 req-62b71d78-de22-4e56-bb58-635a4d2f4ae5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Received event network-vif-unplugged-adf9aa84-82bb-4e89-a0a5-7fad93336a39 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:26:37 compute-0 nova_compute[187185]: 2025-11-29 07:26:37.406 187189 DEBUG oslo_concurrency.lockutils [req-1610ad67-3637-4a84-a0b5-6deda142ac57 req-62b71d78-de22-4e56-bb58-635a4d2f4ae5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "bbcf2b17-c33f-4a89-9f82-60b4dcfa7208-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:26:37 compute-0 nova_compute[187185]: 2025-11-29 07:26:37.406 187189 DEBUG oslo_concurrency.lockutils [req-1610ad67-3637-4a84-a0b5-6deda142ac57 req-62b71d78-de22-4e56-bb58-635a4d2f4ae5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "bbcf2b17-c33f-4a89-9f82-60b4dcfa7208-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:26:37 compute-0 nova_compute[187185]: 2025-11-29 07:26:37.406 187189 DEBUG oslo_concurrency.lockutils [req-1610ad67-3637-4a84-a0b5-6deda142ac57 req-62b71d78-de22-4e56-bb58-635a4d2f4ae5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "bbcf2b17-c33f-4a89-9f82-60b4dcfa7208-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:26:37 compute-0 nova_compute[187185]: 2025-11-29 07:26:37.406 187189 DEBUG nova.compute.manager [req-1610ad67-3637-4a84-a0b5-6deda142ac57 req-62b71d78-de22-4e56-bb58-635a4d2f4ae5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] No waiting events found dispatching network-vif-unplugged-adf9aa84-82bb-4e89-a0a5-7fad93336a39 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:26:37 compute-0 nova_compute[187185]: 2025-11-29 07:26:37.407 187189 DEBUG nova.compute.manager [req-1610ad67-3637-4a84-a0b5-6deda142ac57 req-62b71d78-de22-4e56-bb58-635a4d2f4ae5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Received event network-vif-unplugged-adf9aa84-82bb-4e89-a0a5-7fad93336a39 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 07:26:37 compute-0 nova_compute[187185]: 2025-11-29 07:26:37.467 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:26:37 compute-0 nova_compute[187185]: 2025-11-29 07:26:37.503 187189 INFO nova.compute.manager [None req-336c97c4-1177-448c-b7ea-d7ada580baae 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Took 2.40 seconds to destroy the instance on the hypervisor.
Nov 29 07:26:37 compute-0 nova_compute[187185]: 2025-11-29 07:26:37.504 187189 DEBUG oslo.service.loopingcall [None req-336c97c4-1177-448c-b7ea-d7ada580baae 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 07:26:37 compute-0 nova_compute[187185]: 2025-11-29 07:26:37.505 187189 DEBUG nova.compute.manager [-] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 07:26:37 compute-0 nova_compute[187185]: 2025-11-29 07:26:37.505 187189 DEBUG nova.network.neutron [-] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 07:26:37 compute-0 nova_compute[187185]: 2025-11-29 07:26:37.528 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:26:37 compute-0 nova_compute[187185]: 2025-11-29 07:26:37.699 187189 DEBUG nova.compute.manager [req-45d786a5-bc37-41bc-a2b8-d63ab024b3fe req-8f228af4-1b7f-4672-840a-91a05a556ab8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Received event network-vif-unplugged-74192508-a888-4ab0-9ebe-a3404b5c5812 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:26:37 compute-0 nova_compute[187185]: 2025-11-29 07:26:37.699 187189 DEBUG oslo_concurrency.lockutils [req-45d786a5-bc37-41bc-a2b8-d63ab024b3fe req-8f228af4-1b7f-4672-840a-91a05a556ab8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "eafc7a74-759b-40e8-a27f-d9610458b32a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:26:37 compute-0 nova_compute[187185]: 2025-11-29 07:26:37.699 187189 DEBUG oslo_concurrency.lockutils [req-45d786a5-bc37-41bc-a2b8-d63ab024b3fe req-8f228af4-1b7f-4672-840a-91a05a556ab8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "eafc7a74-759b-40e8-a27f-d9610458b32a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:26:37 compute-0 nova_compute[187185]: 2025-11-29 07:26:37.700 187189 DEBUG oslo_concurrency.lockutils [req-45d786a5-bc37-41bc-a2b8-d63ab024b3fe req-8f228af4-1b7f-4672-840a-91a05a556ab8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "eafc7a74-759b-40e8-a27f-d9610458b32a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:26:37 compute-0 nova_compute[187185]: 2025-11-29 07:26:37.700 187189 DEBUG nova.compute.manager [req-45d786a5-bc37-41bc-a2b8-d63ab024b3fe req-8f228af4-1b7f-4672-840a-91a05a556ab8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] No waiting events found dispatching network-vif-unplugged-74192508-a888-4ab0-9ebe-a3404b5c5812 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:26:37 compute-0 nova_compute[187185]: 2025-11-29 07:26:37.700 187189 DEBUG nova.compute.manager [req-45d786a5-bc37-41bc-a2b8-d63ab024b3fe req-8f228af4-1b7f-4672-840a-91a05a556ab8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Received event network-vif-unplugged-74192508-a888-4ab0-9ebe-a3404b5c5812 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 07:26:37 compute-0 nova_compute[187185]: 2025-11-29 07:26:37.769 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:26:37 compute-0 nova_compute[187185]: 2025-11-29 07:26:37.769 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.803s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:26:37 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:37.840 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=43, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=42) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:26:37 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:37.841 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:26:37 compute-0 nova_compute[187185]: 2025-11-29 07:26:37.842 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:38 compute-0 nova_compute[187185]: 2025-11-29 07:26:38.404 187189 DEBUG nova.compute.manager [req-c0405212-d8f0-4c17-9111-d3f5a9724db7 req-8d38f737-8852-4324-9143-bcb2d2c66159 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Received event network-vif-deleted-723f857c-f89e-440a-83f2-6bf46b479fca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:26:38 compute-0 nova_compute[187185]: 2025-11-29 07:26:38.405 187189 INFO nova.compute.manager [req-c0405212-d8f0-4c17-9111-d3f5a9724db7 req-8d38f737-8852-4324-9143-bcb2d2c66159 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Neutron deleted interface 723f857c-f89e-440a-83f2-6bf46b479fca; detaching it from the instance and deleting it from the info cache
Nov 29 07:26:38 compute-0 nova_compute[187185]: 2025-11-29 07:26:38.405 187189 DEBUG nova.network.neutron [req-c0405212-d8f0-4c17-9111-d3f5a9724db7 req-8d38f737-8852-4324-9143-bcb2d2c66159 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Updating instance_info_cache with network_info: [{"id": "adf9aa84-82bb-4e89-a0a5-7fad93336a39", "address": "fa:16:3e:59:27:b3", "network": {"id": "a3d94aff-5439-43d3-a356-7aafae582344", "bridge": "br-int", "label": "tempest-network-smoke--90551140", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe59:27b3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapadf9aa84-82", "ovs_interfaceid": "adf9aa84-82bb-4e89-a0a5-7fad93336a39", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:26:38 compute-0 nova_compute[187185]: 2025-11-29 07:26:38.443 187189 DEBUG nova.compute.manager [req-c0405212-d8f0-4c17-9111-d3f5a9724db7 req-8d38f737-8852-4324-9143-bcb2d2c66159 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Detach interface failed, port_id=723f857c-f89e-440a-83f2-6bf46b479fca, reason: Instance bbcf2b17-c33f-4a89-9f82-60b4dcfa7208 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Nov 29 07:26:38 compute-0 nova_compute[187185]: 2025-11-29 07:26:38.671 187189 DEBUG nova.network.neutron [-] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:26:38 compute-0 nova_compute[187185]: 2025-11-29 07:26:38.717 187189 INFO nova.compute.manager [-] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Took 1.21 seconds to deallocate network for instance.
Nov 29 07:26:38 compute-0 nova_compute[187185]: 2025-11-29 07:26:38.821 187189 DEBUG oslo_concurrency.lockutils [None req-336c97c4-1177-448c-b7ea-d7ada580baae 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:26:38 compute-0 nova_compute[187185]: 2025-11-29 07:26:38.821 187189 DEBUG oslo_concurrency.lockutils [None req-336c97c4-1177-448c-b7ea-d7ada580baae 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:26:38 compute-0 nova_compute[187185]: 2025-11-29 07:26:38.987 187189 DEBUG nova.compute.provider_tree [None req-336c97c4-1177-448c-b7ea-d7ada580baae 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:26:39 compute-0 nova_compute[187185]: 2025-11-29 07:26:39.116 187189 DEBUG nova.scheduler.client.report [None req-336c97c4-1177-448c-b7ea-d7ada580baae 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:26:39 compute-0 nova_compute[187185]: 2025-11-29 07:26:39.132 187189 DEBUG nova.compute.manager [req-bfd734de-010c-4833-aacd-96448ef9a41a req-c797ff3b-de07-449c-beb6-dce53a06379c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Received event network-vif-plugged-723f857c-f89e-440a-83f2-6bf46b479fca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:26:39 compute-0 nova_compute[187185]: 2025-11-29 07:26:39.132 187189 DEBUG oslo_concurrency.lockutils [req-bfd734de-010c-4833-aacd-96448ef9a41a req-c797ff3b-de07-449c-beb6-dce53a06379c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "bbcf2b17-c33f-4a89-9f82-60b4dcfa7208-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:26:39 compute-0 nova_compute[187185]: 2025-11-29 07:26:39.134 187189 DEBUG oslo_concurrency.lockutils [req-bfd734de-010c-4833-aacd-96448ef9a41a req-c797ff3b-de07-449c-beb6-dce53a06379c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "bbcf2b17-c33f-4a89-9f82-60b4dcfa7208-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:26:39 compute-0 nova_compute[187185]: 2025-11-29 07:26:39.134 187189 DEBUG oslo_concurrency.lockutils [req-bfd734de-010c-4833-aacd-96448ef9a41a req-c797ff3b-de07-449c-beb6-dce53a06379c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "bbcf2b17-c33f-4a89-9f82-60b4dcfa7208-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:26:39 compute-0 nova_compute[187185]: 2025-11-29 07:26:39.135 187189 DEBUG nova.compute.manager [req-bfd734de-010c-4833-aacd-96448ef9a41a req-c797ff3b-de07-449c-beb6-dce53a06379c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] No waiting events found dispatching network-vif-plugged-723f857c-f89e-440a-83f2-6bf46b479fca pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:26:39 compute-0 nova_compute[187185]: 2025-11-29 07:26:39.135 187189 WARNING nova.compute.manager [req-bfd734de-010c-4833-aacd-96448ef9a41a req-c797ff3b-de07-449c-beb6-dce53a06379c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Received unexpected event network-vif-plugged-723f857c-f89e-440a-83f2-6bf46b479fca for instance with vm_state active and task_state deleting.
Nov 29 07:26:39 compute-0 nova_compute[187185]: 2025-11-29 07:26:39.136 187189 DEBUG nova.compute.manager [req-bfd734de-010c-4833-aacd-96448ef9a41a req-c797ff3b-de07-449c-beb6-dce53a06379c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Received event network-vif-deleted-74192508-a888-4ab0-9ebe-a3404b5c5812 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:26:39 compute-0 nova_compute[187185]: 2025-11-29 07:26:39.198 187189 DEBUG oslo_concurrency.lockutils [None req-336c97c4-1177-448c-b7ea-d7ada580baae 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.377s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:26:39 compute-0 nova_compute[187185]: 2025-11-29 07:26:39.204 187189 DEBUG nova.network.neutron [-] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:26:39 compute-0 nova_compute[187185]: 2025-11-29 07:26:39.243 187189 INFO nova.compute.manager [-] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Took 2.40 seconds to deallocate network for instance.
Nov 29 07:26:39 compute-0 nova_compute[187185]: 2025-11-29 07:26:39.258 187189 INFO nova.scheduler.client.report [None req-336c97c4-1177-448c-b7ea-d7ada580baae 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Deleted allocations for instance eafc7a74-759b-40e8-a27f-d9610458b32a
Nov 29 07:26:39 compute-0 nova_compute[187185]: 2025-11-29 07:26:39.563 187189 DEBUG oslo_concurrency.lockutils [None req-134c6108-9b85-46fe-a2f2-6ce48302746b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:26:39 compute-0 nova_compute[187185]: 2025-11-29 07:26:39.564 187189 DEBUG oslo_concurrency.lockutils [None req-134c6108-9b85-46fe-a2f2-6ce48302746b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:26:39 compute-0 nova_compute[187185]: 2025-11-29 07:26:39.678 187189 DEBUG oslo_concurrency.lockutils [None req-336c97c4-1177-448c-b7ea-d7ada580baae 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "eafc7a74-759b-40e8-a27f-d9610458b32a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 9.175s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:26:39 compute-0 nova_compute[187185]: 2025-11-29 07:26:39.708 187189 DEBUG nova.compute.provider_tree [None req-134c6108-9b85-46fe-a2f2-6ce48302746b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:26:39 compute-0 nova_compute[187185]: 2025-11-29 07:26:39.992 187189 DEBUG nova.compute.manager [req-5451a623-ab5c-439d-8fc4-74d6cd424237 req-0f576e40-3bf5-4696-829e-da31caac2bfe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Received event network-vif-plugged-adf9aa84-82bb-4e89-a0a5-7fad93336a39 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:26:39 compute-0 nova_compute[187185]: 2025-11-29 07:26:39.993 187189 DEBUG oslo_concurrency.lockutils [req-5451a623-ab5c-439d-8fc4-74d6cd424237 req-0f576e40-3bf5-4696-829e-da31caac2bfe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "bbcf2b17-c33f-4a89-9f82-60b4dcfa7208-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:26:39 compute-0 nova_compute[187185]: 2025-11-29 07:26:39.993 187189 DEBUG oslo_concurrency.lockutils [req-5451a623-ab5c-439d-8fc4-74d6cd424237 req-0f576e40-3bf5-4696-829e-da31caac2bfe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "bbcf2b17-c33f-4a89-9f82-60b4dcfa7208-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:26:39 compute-0 nova_compute[187185]: 2025-11-29 07:26:39.993 187189 DEBUG oslo_concurrency.lockutils [req-5451a623-ab5c-439d-8fc4-74d6cd424237 req-0f576e40-3bf5-4696-829e-da31caac2bfe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "bbcf2b17-c33f-4a89-9f82-60b4dcfa7208-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:26:39 compute-0 nova_compute[187185]: 2025-11-29 07:26:39.994 187189 DEBUG nova.compute.manager [req-5451a623-ab5c-439d-8fc4-74d6cd424237 req-0f576e40-3bf5-4696-829e-da31caac2bfe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] No waiting events found dispatching network-vif-plugged-adf9aa84-82bb-4e89-a0a5-7fad93336a39 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:26:39 compute-0 nova_compute[187185]: 2025-11-29 07:26:39.994 187189 WARNING nova.compute.manager [req-5451a623-ab5c-439d-8fc4-74d6cd424237 req-0f576e40-3bf5-4696-829e-da31caac2bfe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Received unexpected event network-vif-plugged-adf9aa84-82bb-4e89-a0a5-7fad93336a39 for instance with vm_state deleted and task_state None.
Nov 29 07:26:40 compute-0 nova_compute[187185]: 2025-11-29 07:26:40.709 187189 DEBUG nova.compute.manager [req-46a2a8b4-20f0-49e5-9811-ccb3e65e3c34 req-cffd08d5-06b8-4d05-b9c2-2501e56a0bf6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Received event network-vif-plugged-74192508-a888-4ab0-9ebe-a3404b5c5812 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:26:40 compute-0 nova_compute[187185]: 2025-11-29 07:26:40.710 187189 DEBUG oslo_concurrency.lockutils [req-46a2a8b4-20f0-49e5-9811-ccb3e65e3c34 req-cffd08d5-06b8-4d05-b9c2-2501e56a0bf6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "eafc7a74-759b-40e8-a27f-d9610458b32a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:26:40 compute-0 nova_compute[187185]: 2025-11-29 07:26:40.710 187189 DEBUG oslo_concurrency.lockutils [req-46a2a8b4-20f0-49e5-9811-ccb3e65e3c34 req-cffd08d5-06b8-4d05-b9c2-2501e56a0bf6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "eafc7a74-759b-40e8-a27f-d9610458b32a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:26:40 compute-0 nova_compute[187185]: 2025-11-29 07:26:40.710 187189 DEBUG oslo_concurrency.lockutils [req-46a2a8b4-20f0-49e5-9811-ccb3e65e3c34 req-cffd08d5-06b8-4d05-b9c2-2501e56a0bf6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "eafc7a74-759b-40e8-a27f-d9610458b32a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:26:40 compute-0 nova_compute[187185]: 2025-11-29 07:26:40.710 187189 DEBUG nova.compute.manager [req-46a2a8b4-20f0-49e5-9811-ccb3e65e3c34 req-cffd08d5-06b8-4d05-b9c2-2501e56a0bf6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] No waiting events found dispatching network-vif-plugged-74192508-a888-4ab0-9ebe-a3404b5c5812 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:26:40 compute-0 nova_compute[187185]: 2025-11-29 07:26:40.711 187189 WARNING nova.compute.manager [req-46a2a8b4-20f0-49e5-9811-ccb3e65e3c34 req-cffd08d5-06b8-4d05-b9c2-2501e56a0bf6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Received unexpected event network-vif-plugged-74192508-a888-4ab0-9ebe-a3404b5c5812 for instance with vm_state deleted and task_state None.
Nov 29 07:26:40 compute-0 nova_compute[187185]: 2025-11-29 07:26:40.820 187189 DEBUG nova.scheduler.client.report [None req-134c6108-9b85-46fe-a2f2-6ce48302746b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:26:40 compute-0 nova_compute[187185]: 2025-11-29 07:26:40.944 187189 DEBUG oslo_concurrency.lockutils [None req-134c6108-9b85-46fe-a2f2-6ce48302746b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.381s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:26:41 compute-0 nova_compute[187185]: 2025-11-29 07:26:41.321 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:41 compute-0 nova_compute[187185]: 2025-11-29 07:26:41.334 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:41 compute-0 nova_compute[187185]: 2025-11-29 07:26:41.794 187189 INFO nova.scheduler.client.report [None req-134c6108-9b85-46fe-a2f2-6ce48302746b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Deleted allocations for instance bbcf2b17-c33f-4a89-9f82-60b4dcfa7208
Nov 29 07:26:41 compute-0 nova_compute[187185]: 2025-11-29 07:26:41.800 187189 DEBUG nova.compute.manager [req-396a716c-b65b-4a4a-ad50-440a052ceb58 req-6acb18ef-4e56-459f-9180-70c5c3320671 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Received event network-vif-deleted-adf9aa84-82bb-4e89-a0a5-7fad93336a39 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:26:41 compute-0 podman[236153]: 2025-11-29 07:26:41.835437753 +0000 UTC m=+0.100441495 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=edpm)
Nov 29 07:26:41 compute-0 podman[236152]: 2025-11-29 07:26:41.848156731 +0000 UTC m=+0.113190573 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, container_name=multipathd, org.label-schema.license=GPLv2)
Nov 29 07:26:42 compute-0 nova_compute[187185]: 2025-11-29 07:26:42.075 187189 DEBUG oslo_concurrency.lockutils [None req-134c6108-9b85-46fe-a2f2-6ce48302746b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "bbcf2b17-c33f-4a89-9f82-60b4dcfa7208" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 12.248s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:26:44 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:26:44.844 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '43'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:26:46 compute-0 nova_compute[187185]: 2025-11-29 07:26:46.325 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:46 compute-0 nova_compute[187185]: 2025-11-29 07:26:46.337 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:46 compute-0 nova_compute[187185]: 2025-11-29 07:26:46.766 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:26:46 compute-0 nova_compute[187185]: 2025-11-29 07:26:46.767 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.011 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '5f11adcd-958a-4269-905d-a017406505f0', 'name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000007b', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'hostId': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.014 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.019 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5decc0ee-5d9c-4f09-8b48-d91e5d9eef20', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'instance-0000007b-5f11adcd-958a-4269-905d-a017406505f0-tapa1e67d00-86', 'timestamp': '2025-11-29T07:26:48.014795', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'tapa1e67d00-86', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:39:9f:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa1e67d00-86'}, 'message_id': 'c3f55290-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6655.733204392, 'message_signature': '6164ad7ff78bcba69d2f2538cbfef440cc71ef65ad872c45585569c356cb27db'}]}, 'timestamp': '2025-11-29 07:26:48.021320', '_unique_id': '137375668a93404fbe32106479c3fc14'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.025 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.028 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.028 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ddc7d93c-3b71-465e-88c7-8e46ab6fd16e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'instance-0000007b-5f11adcd-958a-4269-905d-a017406505f0-tapa1e67d00-86', 'timestamp': '2025-11-29T07:26:48.028494', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'tapa1e67d00-86', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:39:9f:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa1e67d00-86'}, 'message_id': 'c3f691a0-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6655.733204392, 'message_signature': '86b97a2f70fa165fa7b300159eb7bd8aa58d11f329cde0c32255ac1109a65aa9'}]}, 'timestamp': '2025-11-29 07:26:48.029215', '_unique_id': '2be76a7d323940349b62a1852c19079a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.030 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.031 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.075 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/disk.device.write.bytes volume: 413696 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.076 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d82f9e0-aacc-405d-ac10-8d2755b02041', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 413696, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '5f11adcd-958a-4269-905d-a017406505f0-vda', 'timestamp': '2025-11-29T07:26:48.031613', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'instance-0000007b', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c3fdd190-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6655.749850461, 'message_signature': 'f59ba2635a2e68be2aac5c341dc85f4f5cba8a0f99c709c3bc7172acaecc51f9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '5f11adcd-958a-4269-905d-a017406505f0-sda', 'timestamp': '2025-11-29T07:26:48.031613', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'instance-0000007b', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c3fddece-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6655.749850461, 'message_signature': 'c15ceabfe94f73180e7c950a16c56ffe2ab2967d470966ac43d8c0748d22bee7'}]}, 'timestamp': '2025-11-29 07:26:48.076825', '_unique_id': 'da3354ec9de046e79a5b48bdc3e8591f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.078 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.091 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/disk.device.allocation volume: 30744576 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.092 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'df5312b6-ca11-4581-ad11-1c6eeb0f2f71', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30744576, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '5f11adcd-958a-4269-905d-a017406505f0-vda', 'timestamp': '2025-11-29T07:26:48.078955', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'instance-0000007b', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c4003ac0-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6655.797215537, 'message_signature': '0f37b9d52eef161bc20ff00274395fbfa046857191bd35165c346aa11429e771'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '5f11adcd-958a-4269-905d-a017406505f0-sda', 'timestamp': '2025-11-29T07:26:48.078955', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'instance-0000007b', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c4004574-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6655.797215537, 'message_signature': '678ca9cd074db8bf5a45d2c802566d4b42d2c38ef9a43280c605b6baa53ef358'}]}, 'timestamp': '2025-11-29 07:26:48.092508', '_unique_id': '59fb49d326ef4a6f8f2231d5b16828a3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.093 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.094 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.094 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5fb1e3a3-ef8e-4cf3-9f47-925b4e2c9848', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'instance-0000007b-5f11adcd-958a-4269-905d-a017406505f0-tapa1e67d00-86', 'timestamp': '2025-11-29T07:26:48.094359', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'tapa1e67d00-86', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:39:9f:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa1e67d00-86'}, 'message_id': 'c40097c2-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6655.733204392, 'message_signature': 'b63ff9709390862b222e81087a8d0026f67e9dd2a465114a3a25d6fa860839ae'}]}, 'timestamp': '2025-11-29 07:26:48.094623', '_unique_id': 'c2a20db3056c442d9c6af25fbec4da44'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.095 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/disk.device.read.bytes volume: 32057344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8bbda4eb-24fe-4ff7-b50d-0233847a4024', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 32057344, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '5f11adcd-958a-4269-905d-a017406505f0-vda', 'timestamp': '2025-11-29T07:26:48.095814', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'instance-0000007b', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c400d0d4-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6655.749850461, 'message_signature': '0604c453f145f3cfe1bb616491a2c903b8163225ee1432835da998e469667d6b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '5f11adcd-958a-4269-905d-a017406505f0-sda', 'timestamp': '2025-11-29T07:26:48.095814', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'instance-0000007b', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c400d8fe-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6655.749850461, 'message_signature': '24e685ce8a1fb5df6564d3c071ea0ccb34d4d937a4578576643b0eed8441aedb'}]}, 'timestamp': '2025-11-29 07:26:48.096274', '_unique_id': 'c458f1f3c9ff4f59b5c0f2e7c5b7d019'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.096 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.097 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.097 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.097 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/network.outgoing.bytes.delta volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '002b0c48-9779-49b0-a37c-21c7ebb6e431', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'instance-0000007b-5f11adcd-958a-4269-905d-a017406505f0-tapa1e67d00-86', 'timestamp': '2025-11-29T07:26:48.097651', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'tapa1e67d00-86', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:39:9f:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa1e67d00-86'}, 'message_id': 'c4011756-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6655.733204392, 'message_signature': '13df4c3c396b8c99f802752508e4a7ab57fa81a799c53b8a84f2776f44a25590'}]}, 'timestamp': '2025-11-29 07:26:48.097904', '_unique_id': '116356eb434b4115a806f0a94bc37dea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.098 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/disk.device.write.requests volume: 50 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b2668f76-3909-46ad-bc28-be7d97c2d68f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 50, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '5f11adcd-958a-4269-905d-a017406505f0-vda', 'timestamp': '2025-11-29T07:26:48.099001', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'instance-0000007b', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c4014bc2-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6655.749850461, 'message_signature': '9c407c971e6cad4c3485e7c24481fb5ad6226d87fa3a2e0d70bc9b876a1ee710'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '5f11adcd-958a-4269-905d-a017406505f0-sda', 'timestamp': '2025-11-29T07:26:48.099001', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'instance-0000007b', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c401536a-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6655.749850461, 'message_signature': 'eba0c4f078d9745f65374cd5b96b25de88d683aeb325b243ec5a9ed4822b6786'}]}, 'timestamp': '2025-11-29 07:26:48.099428', '_unique_id': '3d33f47474e6401a9b1c0fef5f7ef25c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.099 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.100 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.100 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.100 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/network.incoming.bytes volume: 1430 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bd034d6b-21f8-40c1-9f1d-8748592f01bc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1430, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'instance-0000007b-5f11adcd-958a-4269-905d-a017406505f0-tapa1e67d00-86', 'timestamp': '2025-11-29T07:26:48.100608', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'tapa1e67d00-86', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:39:9f:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa1e67d00-86'}, 'message_id': 'c4018b64-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6655.733204392, 'message_signature': '27b22314d7946662bf1b199da40cd3a0b2eaccfd7a22e43f735dbd1393473cc1'}]}, 'timestamp': '2025-11-29 07:26:48.100876', '_unique_id': '2b752dcf28fc42b89f785e410ea1fd10'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.101 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4b88693e-97bb-42df-9cd5-568240739dcd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'instance-0000007b-5f11adcd-958a-4269-905d-a017406505f0-tapa1e67d00-86', 'timestamp': '2025-11-29T07:26:48.101968', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'tapa1e67d00-86', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:39:9f:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa1e67d00-86'}, 'message_id': 'c401bfbc-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6655.733204392, 'message_signature': 'fbfc380845f943bac89869ddc1ae1e87ca601724cf305baca4d2d8be4a809461'}]}, 'timestamp': '2025-11-29 07:26:48.102196', '_unique_id': '660e8b63df7245598187542d4bb4533e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.102 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.103 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.103 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/disk.device.read.requests volume: 1213 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '241ec8f0-6634-480b-b04d-2315d8e1e155', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1213, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '5f11adcd-958a-4269-905d-a017406505f0-vda', 'timestamp': '2025-11-29T07:26:48.103749', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'instance-0000007b', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c40206e8-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6655.749850461, 'message_signature': '97b8cbfa9361248cf19f7de44f429d491a46a025e828475fcec5ace309526418'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '5f11adcd-958a-4269-905d-a017406505f0-sda', 'timestamp': '2025-11-29T07:26:48.103749', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'instance-0000007b', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c4020f3a-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6655.749850461, 'message_signature': '72e699a99f00ca56b7d7e34827a8fd83ea2c8512ff9affae89665256f14d4c0d'}]}, 'timestamp': '2025-11-29 07:26:48.104215', '_unique_id': 'deaafa157deb4739928feb1c9d70311c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.104 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.105 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.105 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.105 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'df0f338e-7bb2-4ea0-af2d-8e28e23a37dd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '5f11adcd-958a-4269-905d-a017406505f0-vda', 'timestamp': '2025-11-29T07:26:48.105546', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'instance-0000007b', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c4024dba-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6655.797215537, 'message_signature': '2783f5a47371ec802f7f8b594ade9127ba1b0190a2d844ebad85a9ec1f051ff1'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '5f11adcd-958a-4269-905d-a017406505f0-sda', 'timestamp': '2025-11-29T07:26:48.105546', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'instance-0000007b', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c4025846-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6655.797215537, 'message_signature': '71f85027f3413de4b6b75700e4929077c3173612f135f2470815752aef2b3137'}]}, 'timestamp': '2025-11-29 07:26:48.106095', '_unique_id': '36157b0b7890443d821932d6d75f79a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.106 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ff028712-1f82-4bf3-be54-2c50165e9f4a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'instance-0000007b-5f11adcd-958a-4269-905d-a017406505f0-tapa1e67d00-86', 'timestamp': '2025-11-29T07:26:48.107240', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'tapa1e67d00-86', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:39:9f:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa1e67d00-86'}, 'message_id': 'c4028dca-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6655.733204392, 'message_signature': '101e7c171e075607cd12535c49d064734c9b34941380d5c12b963724c3fe3f06'}]}, 'timestamp': '2025-11-29 07:26:48.107493', '_unique_id': 'bead7d663e364168aab6973d383283d2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.107 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.108 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.108 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/network.incoming.packets volume: 12 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '05fe0577-ee6c-4719-946b-0764c8a6228c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 12, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'instance-0000007b-5f11adcd-958a-4269-905d-a017406505f0-tapa1e67d00-86', 'timestamp': '2025-11-29T07:26:48.108627', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'tapa1e67d00-86', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:39:9f:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa1e67d00-86'}, 'message_id': 'c402c3da-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6655.733204392, 'message_signature': 'dc225995f559f30ab59508fda4554b31d05291b88144660df86b6cd0a8476771'}]}, 'timestamp': '2025-11-29 07:26:48.108877', '_unique_id': '07e5491c48e1429a98b98d0031e3512e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.109 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c1ddaedb-499e-4020-8c88-ab42adef2e79', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'instance-0000007b-5f11adcd-958a-4269-905d-a017406505f0-tapa1e67d00-86', 'timestamp': '2025-11-29T07:26:48.109947', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'tapa1e67d00-86', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:39:9f:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa1e67d00-86'}, 'message_id': 'c402f77e-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6655.733204392, 'message_signature': '147aea23313c37b4c182262cfa81dbce5a1a7e6975a54063e87d61f1253c2c19'}]}, 'timestamp': '2025-11-29 07:26:48.110194', '_unique_id': 'ae55b681f8fa4745a3c407b2927b0a7f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.110 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.111 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.111 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/disk.device.usage volume: 30081024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.111 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c7deb818-4a52-4734-8411-58ee400cd619', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30081024, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '5f11adcd-958a-4269-905d-a017406505f0-vda', 'timestamp': '2025-11-29T07:26:48.111336', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'instance-0000007b', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c4032e60-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6655.797215537, 'message_signature': '9428970719d9ec889747cd586687750ef5383a9094a65d93dc30703fc7235cce'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '5f11adcd-958a-4269-905d-a017406505f0-sda', 'timestamp': '2025-11-29T07:26:48.111336', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'instance-0000007b', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c4033630-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6655.797215537, 'message_signature': 'ca93f895725f6054cda7722881aef42304d00c579873c928c73295f4a5f032a1'}]}, 'timestamp': '2025-11-29 07:26:48.111766', '_unique_id': 'a3939338e48340e2b0045f9e38defd9f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.112 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/disk.device.read.latency volume: 262824280 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/disk.device.read.latency volume: 26017664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '290bfc72-8ff6-4d11-8c09-38447698b2f6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 262824280, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '5f11adcd-958a-4269-905d-a017406505f0-vda', 'timestamp': '2025-11-29T07:26:48.112942', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'instance-0000007b', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c4036c5e-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6655.749850461, 'message_signature': 'da80232f230da4c5b54aa8da84cec4fa3bd4947ffe8db98d5baed036e6cd3196'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 26017664, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '5f11adcd-958a-4269-905d-a017406505f0-sda', 'timestamp': '2025-11-29T07:26:48.112942', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'instance-0000007b', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c40373e8-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6655.749850461, 'message_signature': '94c94ecf5c0372e7b4b2601366a36db792ad6dd93a2db18ba769e41324a30221'}]}, 'timestamp': '2025-11-29 07:26:48.113373', '_unique_id': '0f10f9dddfa3416c914df2179024a38f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.113 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.114 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.114 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/network.incoming.bytes.delta volume: 1340 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8cf4e4f4-6371-4003-9402-56d51ffd3022', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 1340, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': 'instance-0000007b-5f11adcd-958a-4269-905d-a017406505f0-tapa1e67d00-86', 'timestamp': '2025-11-29T07:26:48.114434', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'tapa1e67d00-86', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:39:9f:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa1e67d00-86'}, 'message_id': 'c403a6ba-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6655.733204392, 'message_signature': '494ad70457278f9bbdfecc7e5cc8206a97373dd15363b372222067b6360a85b5'}]}, 'timestamp': '2025-11-29 07:26:48.114660', '_unique_id': 'ed97a33250b24b7b9ef40c06cd060783'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/disk.device.write.latency volume: 49120189 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.115 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1d54be8e-4b75-4d49-8522-75fb662c67b7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 49120189, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '5f11adcd-958a-4269-905d-a017406505f0-vda', 'timestamp': '2025-11-29T07:26:48.115746', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'instance-0000007b', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c403db26-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6655.749850461, 'message_signature': '8758c1a8a1b7c998ab481a19e4f4bec015f73d94c3a7fe11d6e18fc8873d9bec'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '5f11adcd-958a-4269-905d-a017406505f0-sda', 'timestamp': '2025-11-29T07:26:48.115746', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'instance-0000007b', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c403e2e2-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6655.749850461, 'message_signature': '75d0356d9e29134e7183339cfa8a006f34828b0988e625d6ebf082b3e1407832'}]}, 'timestamp': '2025-11-29 07:26:48.116185', '_unique_id': '1a51f08d57bd4d37ac3441e70343b004'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.116 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.117 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.117 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.134 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/memory.usage volume: 42.28125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eb27b1e9-9465-4b47-9de0-df7f94414876', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.28125, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '5f11adcd-958a-4269-905d-a017406505f0', 'timestamp': '2025-11-29T07:26:48.117324', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'instance-0000007b', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'c406b5bc-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6655.852529168, 'message_signature': '336218939366d87bbdf3b2cd4f18b07656a280f62497aeb4f4c16c4950136767'}]}, 'timestamp': '2025-11-29 07:26:48.134732', '_unique_id': '1bedc7658c4c48858e6578ceed031abf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.135 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 DEBUG ceilometer.compute.pollsters [-] 5f11adcd-958a-4269-905d-a017406505f0/cpu volume: 11760000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fa2eaa77-751a-4a7e-b890-4d6a902975c7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11760000000, 'user_id': '5be41a8530314f83bbecbb74b9276f2d', 'user_name': None, 'project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'project_name': None, 'resource_id': '5f11adcd-958a-4269-905d-a017406505f0', 'timestamp': '2025-11-29T07:26:48.136123', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1816602290', 'name': 'instance-0000007b', 'instance_id': '5f11adcd-958a-4269-905d-a017406505f0', 'instance_type': 'm1.nano', 'host': '2d923278f47bed4174fb4baf058dcb7fe228d44740cb6ebbf041ff6d', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'c406f612-ccf4-11f0-8f64-fa163e220349', 'monotonic_time': 6655.852529168, 'message_signature': '641d2ed36e36b89f5c57a9fa366655d72c64597ba8d2350cbc0a7d6d18be4e5e'}]}, 'timestamp': '2025-11-29 07:26:48.136356', '_unique_id': 'faa7a0b2993d4f6fa693fe91338d7f88'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:26:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:26:48.136 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:26:48 compute-0 nova_compute[187185]: 2025-11-29 07:26:48.168 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764401193.166586, bbcf2b17-c33f-4a89-9f82-60b4dcfa7208 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:26:48 compute-0 nova_compute[187185]: 2025-11-29 07:26:48.169 187189 INFO nova.compute.manager [-] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] VM Stopped (Lifecycle Event)
Nov 29 07:26:48 compute-0 nova_compute[187185]: 2025-11-29 07:26:48.220 187189 DEBUG nova.compute.manager [None req-473d811b-8458-4b77-a323-2de8aceb6a43 - - - - - -] [instance: bbcf2b17-c33f-4a89-9f82-60b4dcfa7208] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:26:50 compute-0 nova_compute[187185]: 2025-11-29 07:26:50.375 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764401195.3734822, eafc7a74-759b-40e8-a27f-d9610458b32a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:26:50 compute-0 nova_compute[187185]: 2025-11-29 07:26:50.376 187189 INFO nova.compute.manager [-] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] VM Stopped (Lifecycle Event)
Nov 29 07:26:50 compute-0 nova_compute[187185]: 2025-11-29 07:26:50.412 187189 DEBUG nova.compute.manager [None req-18af21b9-d01a-4417-b604-e1dc8e0336e2 - - - - - -] [instance: eafc7a74-759b-40e8-a27f-d9610458b32a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:26:51 compute-0 nova_compute[187185]: 2025-11-29 07:26:51.326 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:51 compute-0 nova_compute[187185]: 2025-11-29 07:26:51.339 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:51 compute-0 podman[236200]: 2025-11-29 07:26:51.841277089 +0000 UTC m=+0.067606948 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 07:26:51 compute-0 podman[236193]: 2025-11-29 07:26:51.850336725 +0000 UTC m=+0.099261511 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 07:26:51 compute-0 podman[236194]: 2025-11-29 07:26:51.854223044 +0000 UTC m=+0.085750390 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_id=edpm, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, vcs-type=git, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc.)
Nov 29 07:26:53 compute-0 ovn_controller[95281]: 2025-11-29T07:26:53Z|00418|binding|INFO|Releasing lport 0d1614aa-fbbf-4ad2-9ee2-8948d583ddf6 from this chassis (sb_readonly=0)
Nov 29 07:26:53 compute-0 nova_compute[187185]: 2025-11-29 07:26:53.683 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:53 compute-0 ovn_controller[95281]: 2025-11-29T07:26:53Z|00419|binding|INFO|Releasing lport 0d1614aa-fbbf-4ad2-9ee2-8948d583ddf6 from this chassis (sb_readonly=0)
Nov 29 07:26:53 compute-0 nova_compute[187185]: 2025-11-29 07:26:53.926 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:56 compute-0 nova_compute[187185]: 2025-11-29 07:26:56.328 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:26:56 compute-0 nova_compute[187185]: 2025-11-29 07:26:56.340 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:27:01 compute-0 nova_compute[187185]: 2025-11-29 07:27:01.342 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:27:01 compute-0 nova_compute[187185]: 2025-11-29 07:27:01.344 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:27:01 compute-0 nova_compute[187185]: 2025-11-29 07:27:01.345 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 29 07:27:01 compute-0 nova_compute[187185]: 2025-11-29 07:27:01.345 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 29 07:27:01 compute-0 nova_compute[187185]: 2025-11-29 07:27:01.359 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:27:01 compute-0 nova_compute[187185]: 2025-11-29 07:27:01.360 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 29 07:27:01 compute-0 podman[236255]: 2025-11-29 07:27:01.546065234 +0000 UTC m=+0.150603199 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:27:06 compute-0 nova_compute[187185]: 2025-11-29 07:27:06.360 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:27:07 compute-0 podman[236281]: 2025-11-29 07:27:07.791990525 +0000 UTC m=+0.058201743 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 07:27:11 compute-0 nova_compute[187185]: 2025-11-29 07:27:11.363 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:27:11 compute-0 nova_compute[187185]: 2025-11-29 07:27:11.364 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:27:11 compute-0 nova_compute[187185]: 2025-11-29 07:27:11.364 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 29 07:27:11 compute-0 nova_compute[187185]: 2025-11-29 07:27:11.365 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 29 07:27:11 compute-0 nova_compute[187185]: 2025-11-29 07:27:11.365 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 29 07:27:11 compute-0 nova_compute[187185]: 2025-11-29 07:27:11.367 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:27:12 compute-0 podman[236307]: 2025-11-29 07:27:12.852054605 +0000 UTC m=+0.090442462 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3)
Nov 29 07:27:12 compute-0 podman[236306]: 2025-11-29 07:27:12.867568123 +0000 UTC m=+0.112470874 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd)
Nov 29 07:27:13 compute-0 nova_compute[187185]: 2025-11-29 07:27:13.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:27:13 compute-0 nova_compute[187185]: 2025-11-29 07:27:13.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:27:13 compute-0 nova_compute[187185]: 2025-11-29 07:27:13.318 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:27:13 compute-0 nova_compute[187185]: 2025-11-29 07:27:13.737 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "refresh_cache-5f11adcd-958a-4269-905d-a017406505f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:27:13 compute-0 nova_compute[187185]: 2025-11-29 07:27:13.738 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquired lock "refresh_cache-5f11adcd-958a-4269-905d-a017406505f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:27:13 compute-0 nova_compute[187185]: 2025-11-29 07:27:13.738 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 29 07:27:13 compute-0 nova_compute[187185]: 2025-11-29 07:27:13.739 187189 DEBUG nova.objects.instance [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 5f11adcd-958a-4269-905d-a017406505f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:27:16 compute-0 nova_compute[187185]: 2025-11-29 07:27:16.368 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:27:16 compute-0 nova_compute[187185]: 2025-11-29 07:27:16.370 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:27:16 compute-0 nova_compute[187185]: 2025-11-29 07:27:16.370 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 29 07:27:16 compute-0 nova_compute[187185]: 2025-11-29 07:27:16.370 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 29 07:27:16 compute-0 nova_compute[187185]: 2025-11-29 07:27:16.403 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:27:16 compute-0 nova_compute[187185]: 2025-11-29 07:27:16.404 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 29 07:27:19 compute-0 nova_compute[187185]: 2025-11-29 07:27:19.942 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Updating instance_info_cache with network_info: [{"id": "a1e67d00-8650-44ba-b75d-07f55b8d8810", "address": "fa:16:3e:39:9f:d3", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1e67d00-86", "ovs_interfaceid": "a1e67d00-8650-44ba-b75d-07f55b8d8810", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:27:19 compute-0 nova_compute[187185]: 2025-11-29 07:27:19.967 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Releasing lock "refresh_cache-5f11adcd-958a-4269-905d-a017406505f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:27:19 compute-0 nova_compute[187185]: 2025-11-29 07:27:19.967 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 29 07:27:19 compute-0 nova_compute[187185]: 2025-11-29 07:27:19.968 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:27:19 compute-0 nova_compute[187185]: 2025-11-29 07:27:19.968 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:27:19 compute-0 nova_compute[187185]: 2025-11-29 07:27:19.968 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:27:19 compute-0 nova_compute[187185]: 2025-11-29 07:27:19.968 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:27:19 compute-0 nova_compute[187185]: 2025-11-29 07:27:19.969 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:27:19 compute-0 nova_compute[187185]: 2025-11-29 07:27:19.969 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:27:19 compute-0 nova_compute[187185]: 2025-11-29 07:27:19.969 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:27:19 compute-0 nova_compute[187185]: 2025-11-29 07:27:19.969 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:27:19 compute-0 nova_compute[187185]: 2025-11-29 07:27:19.993 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:27:19 compute-0 nova_compute[187185]: 2025-11-29 07:27:19.996 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:27:19 compute-0 nova_compute[187185]: 2025-11-29 07:27:19.996 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:27:19 compute-0 nova_compute[187185]: 2025-11-29 07:27:19.997 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:27:20 compute-0 nova_compute[187185]: 2025-11-29 07:27:20.199 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:27:20 compute-0 nova_compute[187185]: 2025-11-29 07:27:20.301 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/disk --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:27:20 compute-0 nova_compute[187185]: 2025-11-29 07:27:20.303 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:27:20 compute-0 nova_compute[187185]: 2025-11-29 07:27:20.388 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:27:20 compute-0 nova_compute[187185]: 2025-11-29 07:27:20.579 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:27:20 compute-0 nova_compute[187185]: 2025-11-29 07:27:20.581 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5545MB free_disk=73.21519088745117GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:27:20 compute-0 nova_compute[187185]: 2025-11-29 07:27:20.582 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:27:20 compute-0 nova_compute[187185]: 2025-11-29 07:27:20.582 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:27:21 compute-0 nova_compute[187185]: 2025-11-29 07:27:21.405 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:27:22 compute-0 podman[236354]: 2025-11-29 07:27:22.824788007 +0000 UTC m=+0.072780334 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 07:27:22 compute-0 podman[236352]: 2025-11-29 07:27:22.83765695 +0000 UTC m=+0.096399290 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:27:22 compute-0 podman[236353]: 2025-11-29 07:27:22.848106555 +0000 UTC m=+0.099778656 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.openshift.expose-services=, release=1755695350, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vcs-type=git, build-date=2025-08-20T13:12:41, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 29 07:27:23 compute-0 nova_compute[187185]: 2025-11-29 07:27:23.448 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance 5f11adcd-958a-4269-905d-a017406505f0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 07:27:23 compute-0 nova_compute[187185]: 2025-11-29 07:27:23.449 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:27:23 compute-0 nova_compute[187185]: 2025-11-29 07:27:23.450 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:27:23 compute-0 nova_compute[187185]: 2025-11-29 07:27:23.525 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:27:24 compute-0 nova_compute[187185]: 2025-11-29 07:27:24.923 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:27:25 compute-0 nova_compute[187185]: 2025-11-29 07:27:25.102 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:27:25 compute-0 nova_compute[187185]: 2025-11-29 07:27:25.103 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.521s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:27:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:27:25.517 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:27:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:27:25.518 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:27:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:27:25.518 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:27:26 compute-0 nova_compute[187185]: 2025-11-29 07:27:26.440 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:27:26 compute-0 nova_compute[187185]: 2025-11-29 07:27:26.442 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:27:26 compute-0 nova_compute[187185]: 2025-11-29 07:27:26.442 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5035 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 29 07:27:26 compute-0 nova_compute[187185]: 2025-11-29 07:27:26.442 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 29 07:27:26 compute-0 nova_compute[187185]: 2025-11-29 07:27:26.443 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 29 07:27:26 compute-0 nova_compute[187185]: 2025-11-29 07:27:26.445 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:27:31 compute-0 nova_compute[187185]: 2025-11-29 07:27:31.446 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:27:31 compute-0 podman[236416]: 2025-11-29 07:27:31.843060898 +0000 UTC m=+0.112164834 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 07:27:36 compute-0 nova_compute[187185]: 2025-11-29 07:27:36.099 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:27:36 compute-0 nova_compute[187185]: 2025-11-29 07:27:36.448 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:27:36 compute-0 nova_compute[187185]: 2025-11-29 07:27:36.450 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:27:37 compute-0 nova_compute[187185]: 2025-11-29 07:27:37.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:27:37 compute-0 nova_compute[187185]: 2025-11-29 07:27:37.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 29 07:27:37 compute-0 nova_compute[187185]: 2025-11-29 07:27:37.339 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:27:38 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:27:38.482 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=44, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=43) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:27:38 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:27:38.483 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:27:38 compute-0 nova_compute[187185]: 2025-11-29 07:27:38.484 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:27:38 compute-0 podman[236442]: 2025-11-29 07:27:38.779674093 +0000 UTC m=+0.052543243 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 07:27:41 compute-0 nova_compute[187185]: 2025-11-29 07:27:41.451 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:27:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:27:41.485 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '44'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:27:43 compute-0 podman[236467]: 2025-11-29 07:27:43.825628085 +0000 UTC m=+0.086192802 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.build-date=20251125)
Nov 29 07:27:43 compute-0 podman[236468]: 2025-11-29 07:27:43.834444014 +0000 UTC m=+0.088300972 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 07:27:46 compute-0 nova_compute[187185]: 2025-11-29 07:27:46.482 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:27:46 compute-0 nova_compute[187185]: 2025-11-29 07:27:46.934 187189 DEBUG oslo_concurrency.lockutils [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Acquiring lock "f22f95f6-efd0-4710-adf7-895e0acda50c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:27:46 compute-0 nova_compute[187185]: 2025-11-29 07:27:46.935 187189 DEBUG oslo_concurrency.lockutils [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "f22f95f6-efd0-4710-adf7-895e0acda50c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:27:46 compute-0 nova_compute[187185]: 2025-11-29 07:27:46.984 187189 DEBUG nova.compute.manager [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 07:27:47 compute-0 nova_compute[187185]: 2025-11-29 07:27:47.145 187189 DEBUG oslo_concurrency.lockutils [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:27:47 compute-0 nova_compute[187185]: 2025-11-29 07:27:47.146 187189 DEBUG oslo_concurrency.lockutils [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:27:47 compute-0 nova_compute[187185]: 2025-11-29 07:27:47.154 187189 DEBUG nova.virt.hardware [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 07:27:47 compute-0 nova_compute[187185]: 2025-11-29 07:27:47.154 187189 INFO nova.compute.claims [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Claim successful on node compute-0.ctlplane.example.com
Nov 29 07:27:47 compute-0 nova_compute[187185]: 2025-11-29 07:27:47.476 187189 DEBUG nova.compute.provider_tree [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:27:47 compute-0 nova_compute[187185]: 2025-11-29 07:27:47.493 187189 DEBUG nova.scheduler.client.report [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:27:47 compute-0 nova_compute[187185]: 2025-11-29 07:27:47.602 187189 DEBUG oslo_concurrency.lockutils [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.456s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:27:47 compute-0 nova_compute[187185]: 2025-11-29 07:27:47.603 187189 DEBUG nova.compute.manager [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 07:27:47 compute-0 nova_compute[187185]: 2025-11-29 07:27:47.722 187189 DEBUG nova.compute.manager [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 07:27:47 compute-0 nova_compute[187185]: 2025-11-29 07:27:47.723 187189 DEBUG nova.network.neutron [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 07:27:47 compute-0 nova_compute[187185]: 2025-11-29 07:27:47.789 187189 INFO nova.virt.libvirt.driver [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 07:27:47 compute-0 nova_compute[187185]: 2025-11-29 07:27:47.829 187189 DEBUG nova.compute.manager [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 07:27:48 compute-0 nova_compute[187185]: 2025-11-29 07:27:48.100 187189 DEBUG nova.compute.manager [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 07:27:48 compute-0 nova_compute[187185]: 2025-11-29 07:27:48.102 187189 DEBUG nova.virt.libvirt.driver [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 07:27:48 compute-0 nova_compute[187185]: 2025-11-29 07:27:48.103 187189 INFO nova.virt.libvirt.driver [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Creating image(s)
Nov 29 07:27:48 compute-0 nova_compute[187185]: 2025-11-29 07:27:48.103 187189 DEBUG oslo_concurrency.lockutils [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Acquiring lock "/var/lib/nova/instances/f22f95f6-efd0-4710-adf7-895e0acda50c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:27:48 compute-0 nova_compute[187185]: 2025-11-29 07:27:48.104 187189 DEBUG oslo_concurrency.lockutils [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "/var/lib/nova/instances/f22f95f6-efd0-4710-adf7-895e0acda50c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:27:48 compute-0 nova_compute[187185]: 2025-11-29 07:27:48.105 187189 DEBUG oslo_concurrency.lockutils [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "/var/lib/nova/instances/f22f95f6-efd0-4710-adf7-895e0acda50c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:27:48 compute-0 nova_compute[187185]: 2025-11-29 07:27:48.123 187189 DEBUG oslo_concurrency.processutils [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:27:48 compute-0 nova_compute[187185]: 2025-11-29 07:27:48.180 187189 DEBUG oslo_concurrency.processutils [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:27:48 compute-0 nova_compute[187185]: 2025-11-29 07:27:48.181 187189 DEBUG oslo_concurrency.lockutils [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Acquiring lock "923f30c548f83d073f1130ce28fd6a6debb4b123" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:27:48 compute-0 nova_compute[187185]: 2025-11-29 07:27:48.182 187189 DEBUG oslo_concurrency.lockutils [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "923f30c548f83d073f1130ce28fd6a6debb4b123" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:27:48 compute-0 nova_compute[187185]: 2025-11-29 07:27:48.197 187189 DEBUG oslo_concurrency.processutils [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:27:48 compute-0 nova_compute[187185]: 2025-11-29 07:27:48.258 187189 DEBUG oslo_concurrency.processutils [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:27:48 compute-0 nova_compute[187185]: 2025-11-29 07:27:48.260 187189 DEBUG oslo_concurrency.processutils [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123,backing_fmt=raw /var/lib/nova/instances/f22f95f6-efd0-4710-adf7-895e0acda50c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:27:48 compute-0 nova_compute[187185]: 2025-11-29 07:27:48.310 187189 DEBUG oslo_concurrency.processutils [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123,backing_fmt=raw /var/lib/nova/instances/f22f95f6-efd0-4710-adf7-895e0acda50c/disk 1073741824" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:27:48 compute-0 nova_compute[187185]: 2025-11-29 07:27:48.312 187189 DEBUG oslo_concurrency.lockutils [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "923f30c548f83d073f1130ce28fd6a6debb4b123" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:27:48 compute-0 nova_compute[187185]: 2025-11-29 07:27:48.313 187189 DEBUG oslo_concurrency.processutils [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:27:48 compute-0 nova_compute[187185]: 2025-11-29 07:27:48.378 187189 DEBUG oslo_concurrency.processutils [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:27:48 compute-0 nova_compute[187185]: 2025-11-29 07:27:48.379 187189 DEBUG nova.virt.disk.api [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Checking if we can resize image /var/lib/nova/instances/f22f95f6-efd0-4710-adf7-895e0acda50c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:27:48 compute-0 nova_compute[187185]: 2025-11-29 07:27:48.379 187189 DEBUG oslo_concurrency.processutils [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f22f95f6-efd0-4710-adf7-895e0acda50c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:27:48 compute-0 nova_compute[187185]: 2025-11-29 07:27:48.460 187189 DEBUG oslo_concurrency.processutils [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f22f95f6-efd0-4710-adf7-895e0acda50c/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:27:48 compute-0 nova_compute[187185]: 2025-11-29 07:27:48.461 187189 DEBUG nova.virt.disk.api [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Cannot resize image /var/lib/nova/instances/f22f95f6-efd0-4710-adf7-895e0acda50c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:27:48 compute-0 nova_compute[187185]: 2025-11-29 07:27:48.462 187189 DEBUG nova.objects.instance [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lazy-loading 'migration_context' on Instance uuid f22f95f6-efd0-4710-adf7-895e0acda50c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:27:49 compute-0 nova_compute[187185]: 2025-11-29 07:27:49.239 187189 DEBUG nova.virt.libvirt.driver [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 07:27:49 compute-0 nova_compute[187185]: 2025-11-29 07:27:49.239 187189 DEBUG nova.virt.libvirt.driver [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Ensure instance console log exists: /var/lib/nova/instances/f22f95f6-efd0-4710-adf7-895e0acda50c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 07:27:49 compute-0 nova_compute[187185]: 2025-11-29 07:27:49.240 187189 DEBUG oslo_concurrency.lockutils [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:27:49 compute-0 nova_compute[187185]: 2025-11-29 07:27:49.241 187189 DEBUG oslo_concurrency.lockutils [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:27:49 compute-0 nova_compute[187185]: 2025-11-29 07:27:49.241 187189 DEBUG oslo_concurrency.lockutils [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:27:49 compute-0 nova_compute[187185]: 2025-11-29 07:27:49.265 187189 DEBUG nova.policy [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 07:27:49 compute-0 nova_compute[187185]: 2025-11-29 07:27:49.350 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:27:49 compute-0 nova_compute[187185]: 2025-11-29 07:27:49.351 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 29 07:27:49 compute-0 nova_compute[187185]: 2025-11-29 07:27:49.374 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 29 07:27:51 compute-0 nova_compute[187185]: 2025-11-29 07:27:51.484 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:27:51 compute-0 nova_compute[187185]: 2025-11-29 07:27:51.543 187189 DEBUG nova.network.neutron [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Successfully created port: c3de84a1-7764-4c77-a2fd-fd169639ed1e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 07:27:53 compute-0 nova_compute[187185]: 2025-11-29 07:27:53.792 187189 DEBUG oslo_concurrency.lockutils [None req-ae3e8584-36be-4f58-ae6d-a09054525e72 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "5f11adcd-958a-4269-905d-a017406505f0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:27:53 compute-0 nova_compute[187185]: 2025-11-29 07:27:53.793 187189 DEBUG oslo_concurrency.lockutils [None req-ae3e8584-36be-4f58-ae6d-a09054525e72 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "5f11adcd-958a-4269-905d-a017406505f0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:27:53 compute-0 nova_compute[187185]: 2025-11-29 07:27:53.793 187189 DEBUG oslo_concurrency.lockutils [None req-ae3e8584-36be-4f58-ae6d-a09054525e72 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "5f11adcd-958a-4269-905d-a017406505f0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:27:53 compute-0 nova_compute[187185]: 2025-11-29 07:27:53.793 187189 DEBUG oslo_concurrency.lockutils [None req-ae3e8584-36be-4f58-ae6d-a09054525e72 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "5f11adcd-958a-4269-905d-a017406505f0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:27:53 compute-0 nova_compute[187185]: 2025-11-29 07:27:53.793 187189 DEBUG oslo_concurrency.lockutils [None req-ae3e8584-36be-4f58-ae6d-a09054525e72 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "5f11adcd-958a-4269-905d-a017406505f0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:27:53 compute-0 nova_compute[187185]: 2025-11-29 07:27:53.805 187189 INFO nova.compute.manager [None req-ae3e8584-36be-4f58-ae6d-a09054525e72 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Terminating instance
Nov 29 07:27:53 compute-0 nova_compute[187185]: 2025-11-29 07:27:53.820 187189 DEBUG nova.compute.manager [None req-ae3e8584-36be-4f58-ae6d-a09054525e72 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 07:27:53 compute-0 kernel: tapa1e67d00-86 (unregistering): left promiscuous mode
Nov 29 07:27:53 compute-0 podman[236519]: 2025-11-29 07:27:53.842730021 +0000 UTC m=+0.086337066 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 29 07:27:53 compute-0 podman[236520]: 2025-11-29 07:27:53.84657154 +0000 UTC m=+0.083483536 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, vcs-type=git, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9-minimal, version=9.6, distribution-scope=public, maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 29 07:27:53 compute-0 NetworkManager[55227]: <info>  [1764401273.8492] device (tapa1e67d00-86): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:27:53 compute-0 podman[236521]: 2025-11-29 07:27:53.850414878 +0000 UTC m=+0.077985701 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 07:27:53 compute-0 nova_compute[187185]: 2025-11-29 07:27:53.854 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:27:53 compute-0 ovn_controller[95281]: 2025-11-29T07:27:53Z|00420|binding|INFO|Releasing lport a1e67d00-8650-44ba-b75d-07f55b8d8810 from this chassis (sb_readonly=0)
Nov 29 07:27:53 compute-0 ovn_controller[95281]: 2025-11-29T07:27:53Z|00421|binding|INFO|Setting lport a1e67d00-8650-44ba-b75d-07f55b8d8810 down in Southbound
Nov 29 07:27:53 compute-0 ovn_controller[95281]: 2025-11-29T07:27:53Z|00422|binding|INFO|Removing iface tapa1e67d00-86 ovn-installed in OVS
Nov 29 07:27:53 compute-0 nova_compute[187185]: 2025-11-29 07:27:53.873 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:27:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:27:53.896 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:39:9f:d3 10.100.0.11'], port_security=['fa:16:3e:39:9f:d3 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '5f11adcd-958a-4269-905d-a017406505f0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac3bb322fa744e099b38e08abe12d0e2', 'neutron:revision_number': '8', 'neutron:security_group_ids': '339ff6a8-b11e-4176-931b-a82ab9688ace', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e0beb853-8490-4e92-a787-adc66ba47efc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=a1e67d00-8650-44ba-b75d-07f55b8d8810) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:27:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:27:53.898 104254 INFO neutron.agent.ovn.metadata.agent [-] Port a1e67d00-8650-44ba-b75d-07f55b8d8810 in datapath 240f16d8-602b-4aa1-8edb-e3a8d3674e39 unbound from our chassis
Nov 29 07:27:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:27:53.900 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 240f16d8-602b-4aa1-8edb-e3a8d3674e39, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:27:53 compute-0 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d0000007b.scope: Deactivated successfully.
Nov 29 07:27:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:27:53.901 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[a73049f3-c273-445c-a01f-78722eddf7d3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:27:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:27:53.902 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39 namespace which is not needed anymore
Nov 29 07:27:53 compute-0 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d0000007b.scope: Consumed 19.930s CPU time.
Nov 29 07:27:53 compute-0 systemd-machined[153486]: Machine qemu-50-instance-0000007b terminated.
Nov 29 07:27:54 compute-0 nova_compute[187185]: 2025-11-29 07:27:54.050 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:27:54 compute-0 nova_compute[187185]: 2025-11-29 07:27:54.113 187189 INFO nova.virt.libvirt.driver [-] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Instance destroyed successfully.
Nov 29 07:27:54 compute-0 nova_compute[187185]: 2025-11-29 07:27:54.114 187189 DEBUG nova.objects.instance [None req-ae3e8584-36be-4f58-ae6d-a09054525e72 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lazy-loading 'resources' on Instance uuid 5f11adcd-958a-4269-905d-a017406505f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:27:54 compute-0 nova_compute[187185]: 2025-11-29 07:27:54.191 187189 DEBUG nova.virt.libvirt.vif [None req-ae3e8584-36be-4f58-ae6d-a09054525e72 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:24:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1816602290',display_name='tempest-ServerStableDeviceRescueTest-server-1816602290',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-1816602290',id=123,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:25:06Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ac3bb322fa744e099b38e08abe12d0e2',ramdisk_id='',reservation_id='r-6vcqx0xs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-2012111838',owner_user_name='tempest-ServerStableDeviceRescueTest-2012111838-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:25:11Z,user_data=None,user_id='5be41a8530314f83bbecbb74b9276f2d',uuid=5f11adcd-958a-4269-905d-a017406505f0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a1e67d00-8650-44ba-b75d-07f55b8d8810", "address": "fa:16:3e:39:9f:d3", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1e67d00-86", "ovs_interfaceid": "a1e67d00-8650-44ba-b75d-07f55b8d8810", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:27:54 compute-0 nova_compute[187185]: 2025-11-29 07:27:54.192 187189 DEBUG nova.network.os_vif_util [None req-ae3e8584-36be-4f58-ae6d-a09054525e72 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Converting VIF {"id": "a1e67d00-8650-44ba-b75d-07f55b8d8810", "address": "fa:16:3e:39:9f:d3", "network": {"id": "240f16d8-602b-4aa1-8edb-e3a8d3674e39", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-405832774-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ac3bb322fa744e099b38e08abe12d0e2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1e67d00-86", "ovs_interfaceid": "a1e67d00-8650-44ba-b75d-07f55b8d8810", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:27:54 compute-0 nova_compute[187185]: 2025-11-29 07:27:54.194 187189 DEBUG nova.network.os_vif_util [None req-ae3e8584-36be-4f58-ae6d-a09054525e72 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:39:9f:d3,bridge_name='br-int',has_traffic_filtering=True,id=a1e67d00-8650-44ba-b75d-07f55b8d8810,network=Network(240f16d8-602b-4aa1-8edb-e3a8d3674e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa1e67d00-86') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:27:54 compute-0 nova_compute[187185]: 2025-11-29 07:27:54.195 187189 DEBUG os_vif [None req-ae3e8584-36be-4f58-ae6d-a09054525e72 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:39:9f:d3,bridge_name='br-int',has_traffic_filtering=True,id=a1e67d00-8650-44ba-b75d-07f55b8d8810,network=Network(240f16d8-602b-4aa1-8edb-e3a8d3674e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa1e67d00-86') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:27:54 compute-0 nova_compute[187185]: 2025-11-29 07:27:54.198 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:27:54 compute-0 nova_compute[187185]: 2025-11-29 07:27:54.199 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa1e67d00-86, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:27:54 compute-0 nova_compute[187185]: 2025-11-29 07:27:54.241 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:27:54 compute-0 nova_compute[187185]: 2025-11-29 07:27:54.246 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:27:54 compute-0 nova_compute[187185]: 2025-11-29 07:27:54.250 187189 INFO os_vif [None req-ae3e8584-36be-4f58-ae6d-a09054525e72 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:39:9f:d3,bridge_name='br-int',has_traffic_filtering=True,id=a1e67d00-8650-44ba-b75d-07f55b8d8810,network=Network(240f16d8-602b-4aa1-8edb-e3a8d3674e39),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa1e67d00-86')
Nov 29 07:27:54 compute-0 nova_compute[187185]: 2025-11-29 07:27:54.251 187189 INFO nova.virt.libvirt.driver [None req-ae3e8584-36be-4f58-ae6d-a09054525e72 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Deleting instance files /var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0_del
Nov 29 07:27:54 compute-0 neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39[234959]: [NOTICE]   (234964) : haproxy version is 2.8.14-c23fe91
Nov 29 07:27:54 compute-0 neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39[234959]: [NOTICE]   (234964) : path to executable is /usr/sbin/haproxy
Nov 29 07:27:54 compute-0 neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39[234959]: [WARNING]  (234964) : Exiting Master process...
Nov 29 07:27:54 compute-0 nova_compute[187185]: 2025-11-29 07:27:54.252 187189 INFO nova.virt.libvirt.driver [None req-ae3e8584-36be-4f58-ae6d-a09054525e72 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Deletion of /var/lib/nova/instances/5f11adcd-958a-4269-905d-a017406505f0_del complete
Nov 29 07:27:54 compute-0 neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39[234959]: [ALERT]    (234964) : Current worker (234966) exited with code 143 (Terminated)
Nov 29 07:27:54 compute-0 neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39[234959]: [WARNING]  (234964) : All workers exited. Exiting... (0)
Nov 29 07:27:54 compute-0 systemd[1]: libpod-978450dd442a870e1f1d2fdc6c438c8950d308fb165d2cbbf6bc90afd8e60c5c.scope: Deactivated successfully.
Nov 29 07:27:54 compute-0 podman[236601]: 2025-11-29 07:27:54.262582442 +0000 UTC m=+0.242054656 container died 978450dd442a870e1f1d2fdc6c438c8950d308fb165d2cbbf6bc90afd8e60c5c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 07:27:54 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-978450dd442a870e1f1d2fdc6c438c8950d308fb165d2cbbf6bc90afd8e60c5c-userdata-shm.mount: Deactivated successfully.
Nov 29 07:27:54 compute-0 systemd[1]: var-lib-containers-storage-overlay-4c2f41f87dfbb1291e21fe35775c6228824079c81320e8d28a6c79bd1f224b19-merged.mount: Deactivated successfully.
Nov 29 07:27:54 compute-0 podman[236601]: 2025-11-29 07:27:54.307345946 +0000 UTC m=+0.286818160 container cleanup 978450dd442a870e1f1d2fdc6c438c8950d308fb165d2cbbf6bc90afd8e60c5c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:27:54 compute-0 systemd[1]: libpod-conmon-978450dd442a870e1f1d2fdc6c438c8950d308fb165d2cbbf6bc90afd8e60c5c.scope: Deactivated successfully.
Nov 29 07:27:54 compute-0 nova_compute[187185]: 2025-11-29 07:27:54.474 187189 INFO nova.compute.manager [None req-ae3e8584-36be-4f58-ae6d-a09054525e72 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Took 0.65 seconds to destroy the instance on the hypervisor.
Nov 29 07:27:54 compute-0 nova_compute[187185]: 2025-11-29 07:27:54.475 187189 DEBUG oslo.service.loopingcall [None req-ae3e8584-36be-4f58-ae6d-a09054525e72 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 07:27:54 compute-0 nova_compute[187185]: 2025-11-29 07:27:54.478 187189 DEBUG nova.network.neutron [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Successfully updated port: c3de84a1-7764-4c77-a2fd-fd169639ed1e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 07:27:54 compute-0 nova_compute[187185]: 2025-11-29 07:27:54.479 187189 DEBUG nova.compute.manager [-] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 07:27:54 compute-0 nova_compute[187185]: 2025-11-29 07:27:54.480 187189 DEBUG nova.network.neutron [-] [instance: 5f11adcd-958a-4269-905d-a017406505f0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 07:27:54 compute-0 nova_compute[187185]: 2025-11-29 07:27:54.524 187189 DEBUG oslo_concurrency.lockutils [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Acquiring lock "refresh_cache-f22f95f6-efd0-4710-adf7-895e0acda50c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:27:54 compute-0 nova_compute[187185]: 2025-11-29 07:27:54.525 187189 DEBUG oslo_concurrency.lockutils [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Acquired lock "refresh_cache-f22f95f6-efd0-4710-adf7-895e0acda50c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:27:54 compute-0 nova_compute[187185]: 2025-11-29 07:27:54.525 187189 DEBUG nova.network.neutron [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:27:54 compute-0 podman[236644]: 2025-11-29 07:27:54.539386041 +0000 UTC m=+0.203135371 container remove 978450dd442a870e1f1d2fdc6c438c8950d308fb165d2cbbf6bc90afd8e60c5c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 29 07:27:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:27:54.547 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[80e39f3b-3c50-4a9a-b91f-bccbbbfd9c11]: (4, ('Sat Nov 29 07:27:54 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39 (978450dd442a870e1f1d2fdc6c438c8950d308fb165d2cbbf6bc90afd8e60c5c)\n978450dd442a870e1f1d2fdc6c438c8950d308fb165d2cbbf6bc90afd8e60c5c\nSat Nov 29 07:27:54 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39 (978450dd442a870e1f1d2fdc6c438c8950d308fb165d2cbbf6bc90afd8e60c5c)\n978450dd442a870e1f1d2fdc6c438c8950d308fb165d2cbbf6bc90afd8e60c5c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:27:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:27:54.549 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[f0ce7315-ba30-48ff-a002-b9c408d1c4cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:27:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:27:54.551 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap240f16d8-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:27:54 compute-0 nova_compute[187185]: 2025-11-29 07:27:54.554 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:27:54 compute-0 kernel: tap240f16d8-60: left promiscuous mode
Nov 29 07:27:54 compute-0 nova_compute[187185]: 2025-11-29 07:27:54.557 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:27:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:27:54.561 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[101bb4ac-61ad-4d3b-9de0-036be05f201c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:27:54 compute-0 nova_compute[187185]: 2025-11-29 07:27:54.572 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:27:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:27:54.587 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[56b6706a-c07e-4f92-9798-30039821465a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:27:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:27:54.589 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6d7ce7f7-0804-4d1a-9ad1-30d9b4c614f8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:27:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:27:54.608 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e873f591-7ceb-47a3-8e69-b445013af0de]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 655849, 'reachable_time': 24314, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236657, 'error': None, 'target': 'ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:27:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:27:54.611 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-240f16d8-602b-4aa1-8edb-e3a8d3674e39 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:27:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:27:54.612 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[d502d469-2eec-4d7e-af30-b9e73f80d0e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:27:54 compute-0 systemd[1]: run-netns-ovnmeta\x2d240f16d8\x2d602b\x2d4aa1\x2d8edb\x2de3a8d3674e39.mount: Deactivated successfully.
Nov 29 07:27:54 compute-0 nova_compute[187185]: 2025-11-29 07:27:54.660 187189 DEBUG nova.compute.manager [req-79f20601-9118-4625-93f0-85ad1e5ee30a req-237055ee-607a-4d8a-aaf3-c35d1c07c83f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Received event network-vif-unplugged-a1e67d00-8650-44ba-b75d-07f55b8d8810 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:27:54 compute-0 nova_compute[187185]: 2025-11-29 07:27:54.661 187189 DEBUG oslo_concurrency.lockutils [req-79f20601-9118-4625-93f0-85ad1e5ee30a req-237055ee-607a-4d8a-aaf3-c35d1c07c83f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5f11adcd-958a-4269-905d-a017406505f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:27:54 compute-0 nova_compute[187185]: 2025-11-29 07:27:54.662 187189 DEBUG oslo_concurrency.lockutils [req-79f20601-9118-4625-93f0-85ad1e5ee30a req-237055ee-607a-4d8a-aaf3-c35d1c07c83f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f11adcd-958a-4269-905d-a017406505f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:27:54 compute-0 nova_compute[187185]: 2025-11-29 07:27:54.662 187189 DEBUG oslo_concurrency.lockutils [req-79f20601-9118-4625-93f0-85ad1e5ee30a req-237055ee-607a-4d8a-aaf3-c35d1c07c83f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f11adcd-958a-4269-905d-a017406505f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:27:54 compute-0 nova_compute[187185]: 2025-11-29 07:27:54.662 187189 DEBUG nova.compute.manager [req-79f20601-9118-4625-93f0-85ad1e5ee30a req-237055ee-607a-4d8a-aaf3-c35d1c07c83f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] No waiting events found dispatching network-vif-unplugged-a1e67d00-8650-44ba-b75d-07f55b8d8810 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:27:54 compute-0 nova_compute[187185]: 2025-11-29 07:27:54.663 187189 DEBUG nova.compute.manager [req-79f20601-9118-4625-93f0-85ad1e5ee30a req-237055ee-607a-4d8a-aaf3-c35d1c07c83f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Received event network-vif-unplugged-a1e67d00-8650-44ba-b75d-07f55b8d8810 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 07:27:55 compute-0 nova_compute[187185]: 2025-11-29 07:27:55.666 187189 DEBUG nova.network.neutron [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:27:56 compute-0 nova_compute[187185]: 2025-11-29 07:27:56.021 187189 DEBUG nova.compute.manager [req-f516ab28-a182-48b8-94b7-58975445e3ff req-a3ec2e2e-f05c-4374-a191-f66d45734a3a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Received event network-changed-c3de84a1-7764-4c77-a2fd-fd169639ed1e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:27:56 compute-0 nova_compute[187185]: 2025-11-29 07:27:56.021 187189 DEBUG nova.compute.manager [req-f516ab28-a182-48b8-94b7-58975445e3ff req-a3ec2e2e-f05c-4374-a191-f66d45734a3a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Refreshing instance network info cache due to event network-changed-c3de84a1-7764-4c77-a2fd-fd169639ed1e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:27:56 compute-0 nova_compute[187185]: 2025-11-29 07:27:56.021 187189 DEBUG oslo_concurrency.lockutils [req-f516ab28-a182-48b8-94b7-58975445e3ff req-a3ec2e2e-f05c-4374-a191-f66d45734a3a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-f22f95f6-efd0-4710-adf7-895e0acda50c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:27:56 compute-0 nova_compute[187185]: 2025-11-29 07:27:56.486 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:27:56 compute-0 nova_compute[187185]: 2025-11-29 07:27:56.964 187189 DEBUG nova.compute.manager [req-a9bbb2a0-67ff-4793-970c-7d522adfd52b req-d9cebf97-1e27-45fe-a649-20d1494dc21c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Received event network-vif-plugged-a1e67d00-8650-44ba-b75d-07f55b8d8810 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:27:56 compute-0 nova_compute[187185]: 2025-11-29 07:27:56.965 187189 DEBUG oslo_concurrency.lockutils [req-a9bbb2a0-67ff-4793-970c-7d522adfd52b req-d9cebf97-1e27-45fe-a649-20d1494dc21c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5f11adcd-958a-4269-905d-a017406505f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:27:56 compute-0 nova_compute[187185]: 2025-11-29 07:27:56.965 187189 DEBUG oslo_concurrency.lockutils [req-a9bbb2a0-67ff-4793-970c-7d522adfd52b req-d9cebf97-1e27-45fe-a649-20d1494dc21c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f11adcd-958a-4269-905d-a017406505f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:27:56 compute-0 nova_compute[187185]: 2025-11-29 07:27:56.965 187189 DEBUG oslo_concurrency.lockutils [req-a9bbb2a0-67ff-4793-970c-7d522adfd52b req-d9cebf97-1e27-45fe-a649-20d1494dc21c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5f11adcd-958a-4269-905d-a017406505f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:27:56 compute-0 nova_compute[187185]: 2025-11-29 07:27:56.965 187189 DEBUG nova.compute.manager [req-a9bbb2a0-67ff-4793-970c-7d522adfd52b req-d9cebf97-1e27-45fe-a649-20d1494dc21c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] No waiting events found dispatching network-vif-plugged-a1e67d00-8650-44ba-b75d-07f55b8d8810 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:27:56 compute-0 nova_compute[187185]: 2025-11-29 07:27:56.966 187189 WARNING nova.compute.manager [req-a9bbb2a0-67ff-4793-970c-7d522adfd52b req-d9cebf97-1e27-45fe-a649-20d1494dc21c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Received unexpected event network-vif-plugged-a1e67d00-8650-44ba-b75d-07f55b8d8810 for instance with vm_state active and task_state deleting.
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.038 187189 DEBUG nova.network.neutron [-] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.067 187189 INFO nova.compute.manager [-] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Took 3.59 seconds to deallocate network for instance.
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.188 187189 DEBUG oslo_concurrency.lockutils [None req-ae3e8584-36be-4f58-ae6d-a09054525e72 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.189 187189 DEBUG oslo_concurrency.lockutils [None req-ae3e8584-36be-4f58-ae6d-a09054525e72 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.238 187189 DEBUG nova.compute.manager [req-4ccde33a-d092-4db6-9ed2-782eb39f20d9 req-beb6ea32-15c2-4ab3-b3d2-99752781a72a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Received event network-vif-deleted-a1e67d00-8650-44ba-b75d-07f55b8d8810 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.379 187189 DEBUG nova.compute.provider_tree [None req-ae3e8584-36be-4f58-ae6d-a09054525e72 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.395 187189 DEBUG nova.scheduler.client.report [None req-ae3e8584-36be-4f58-ae6d-a09054525e72 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.450 187189 DEBUG oslo_concurrency.lockutils [None req-ae3e8584-36be-4f58-ae6d-a09054525e72 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.261s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.492 187189 DEBUG nova.network.neutron [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Updating instance_info_cache with network_info: [{"id": "c3de84a1-7764-4c77-a2fd-fd169639ed1e", "address": "fa:16:3e:f7:bb:74", "network": {"id": "28412826-5463-46e4-95cb-a7d788b1ab15", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1363473809-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7843cfa993a1428aaaa660321ebba1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3de84a1-77", "ovs_interfaceid": "c3de84a1-7764-4c77-a2fd-fd169639ed1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.507 187189 INFO nova.scheduler.client.report [None req-ae3e8584-36be-4f58-ae6d-a09054525e72 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Deleted allocations for instance 5f11adcd-958a-4269-905d-a017406505f0
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.574 187189 DEBUG oslo_concurrency.lockutils [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Releasing lock "refresh_cache-f22f95f6-efd0-4710-adf7-895e0acda50c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.575 187189 DEBUG nova.compute.manager [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Instance network_info: |[{"id": "c3de84a1-7764-4c77-a2fd-fd169639ed1e", "address": "fa:16:3e:f7:bb:74", "network": {"id": "28412826-5463-46e4-95cb-a7d788b1ab15", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1363473809-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7843cfa993a1428aaaa660321ebba1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3de84a1-77", "ovs_interfaceid": "c3de84a1-7764-4c77-a2fd-fd169639ed1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.576 187189 DEBUG oslo_concurrency.lockutils [req-f516ab28-a182-48b8-94b7-58975445e3ff req-a3ec2e2e-f05c-4374-a191-f66d45734a3a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-f22f95f6-efd0-4710-adf7-895e0acda50c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.577 187189 DEBUG nova.network.neutron [req-f516ab28-a182-48b8-94b7-58975445e3ff req-a3ec2e2e-f05c-4374-a191-f66d45734a3a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Refreshing network info cache for port c3de84a1-7764-4c77-a2fd-fd169639ed1e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.582 187189 DEBUG nova.virt.libvirt.driver [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Start _get_guest_xml network_info=[{"id": "c3de84a1-7764-4c77-a2fd-fd169639ed1e", "address": "fa:16:3e:f7:bb:74", "network": {"id": "28412826-5463-46e4-95cb-a7d788b1ab15", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1363473809-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7843cfa993a1428aaaa660321ebba1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3de84a1-77", "ovs_interfaceid": "c3de84a1-7764-4c77-a2fd-fd169639ed1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:09Z,direct_url=<?>,disk_format='qcow2',id=3372b7b2-657b-4c4d-9d9d-7c5b771a630a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.588 187189 WARNING nova.virt.libvirt.driver [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.593 187189 DEBUG nova.virt.libvirt.host [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.595 187189 DEBUG nova.virt.libvirt.host [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.602 187189 DEBUG nova.virt.libvirt.host [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.603 187189 DEBUG nova.virt.libvirt.host [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.605 187189 DEBUG nova.virt.libvirt.driver [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.605 187189 DEBUG nova.virt.hardware [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:09Z,direct_url=<?>,disk_format='qcow2',id=3372b7b2-657b-4c4d-9d9d-7c5b771a630a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.606 187189 DEBUG nova.virt.hardware [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.606 187189 DEBUG nova.virt.hardware [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.607 187189 DEBUG nova.virt.hardware [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.607 187189 DEBUG nova.virt.hardware [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.607 187189 DEBUG nova.virt.hardware [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.607 187189 DEBUG nova.virt.hardware [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.608 187189 DEBUG nova.virt.hardware [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.608 187189 DEBUG nova.virt.hardware [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.608 187189 DEBUG nova.virt.hardware [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.608 187189 DEBUG nova.virt.hardware [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.612 187189 DEBUG nova.virt.libvirt.vif [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:27:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1499668672',display_name='tempest-ListServerFiltersTestJSON-instance-1499668672',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1499668672',id=132,image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7843cfa993a1428aaaa660321ebba1ac',ramdisk_id='',reservation_id='r-meotbwol',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-1571311845',owner_user_name='tempest-ListServerFiltersTestJSON-1571311845-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:27:47Z,user_data=None,user_id='3e2a40601ced4de78fe1767769f262c0',uuid=f22f95f6-efd0-4710-adf7-895e0acda50c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c3de84a1-7764-4c77-a2fd-fd169639ed1e", "address": "fa:16:3e:f7:bb:74", "network": {"id": "28412826-5463-46e4-95cb-a7d788b1ab15", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1363473809-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7843cfa993a1428aaaa660321ebba1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3de84a1-77", "ovs_interfaceid": "c3de84a1-7764-4c77-a2fd-fd169639ed1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.612 187189 DEBUG nova.network.os_vif_util [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Converting VIF {"id": "c3de84a1-7764-4c77-a2fd-fd169639ed1e", "address": "fa:16:3e:f7:bb:74", "network": {"id": "28412826-5463-46e4-95cb-a7d788b1ab15", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1363473809-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7843cfa993a1428aaaa660321ebba1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3de84a1-77", "ovs_interfaceid": "c3de84a1-7764-4c77-a2fd-fd169639ed1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.613 187189 DEBUG nova.network.os_vif_util [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f7:bb:74,bridge_name='br-int',has_traffic_filtering=True,id=c3de84a1-7764-4c77-a2fd-fd169639ed1e,network=Network(28412826-5463-46e4-95cb-a7d788b1ab15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3de84a1-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.614 187189 DEBUG nova.objects.instance [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lazy-loading 'pci_devices' on Instance uuid f22f95f6-efd0-4710-adf7-895e0acda50c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.645 187189 DEBUG nova.virt.libvirt.driver [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:27:58 compute-0 nova_compute[187185]:   <uuid>f22f95f6-efd0-4710-adf7-895e0acda50c</uuid>
Nov 29 07:27:58 compute-0 nova_compute[187185]:   <name>instance-00000084</name>
Nov 29 07:27:58 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 07:27:58 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:27:58 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:27:58 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:       <nova:name>tempest-ListServerFiltersTestJSON-instance-1499668672</nova:name>
Nov 29 07:27:58 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:27:58</nova:creationTime>
Nov 29 07:27:58 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 07:27:58 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 07:27:58 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:27:58 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:27:58 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:27:58 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:27:58 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:27:58 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:27:58 compute-0 nova_compute[187185]:         <nova:user uuid="3e2a40601ced4de78fe1767769f262c0">tempest-ListServerFiltersTestJSON-1571311845-project-member</nova:user>
Nov 29 07:27:58 compute-0 nova_compute[187185]:         <nova:project uuid="7843cfa993a1428aaaa660321ebba1ac">tempest-ListServerFiltersTestJSON-1571311845</nova:project>
Nov 29 07:27:58 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:27:58 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="3372b7b2-657b-4c4d-9d9d-7c5b771a630a"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 07:27:58 compute-0 nova_compute[187185]:         <nova:port uuid="c3de84a1-7764-4c77-a2fd-fd169639ed1e">
Nov 29 07:27:58 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:27:58 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:27:58 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:27:58 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <system>
Nov 29 07:27:58 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:27:58 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:27:58 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:27:58 compute-0 nova_compute[187185]:       <entry name="serial">f22f95f6-efd0-4710-adf7-895e0acda50c</entry>
Nov 29 07:27:58 compute-0 nova_compute[187185]:       <entry name="uuid">f22f95f6-efd0-4710-adf7-895e0acda50c</entry>
Nov 29 07:27:58 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     </system>
Nov 29 07:27:58 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:27:58 compute-0 nova_compute[187185]:   <os>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:   </os>
Nov 29 07:27:58 compute-0 nova_compute[187185]:   <features>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:   </features>
Nov 29 07:27:58 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:27:58 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:27:58 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:27:58 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/f22f95f6-efd0-4710-adf7-895e0acda50c/disk"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:27:58 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/f22f95f6-efd0-4710-adf7-895e0acda50c/disk.config"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:27:58 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:f7:bb:74"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:       <target dev="tapc3de84a1-77"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:27:58 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/f22f95f6-efd0-4710-adf7-895e0acda50c/console.log" append="off"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <video>
Nov 29 07:27:58 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     </video>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:27:58 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:27:58 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:27:58 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:27:58 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:27:58 compute-0 nova_compute[187185]: </domain>
Nov 29 07:27:58 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.647 187189 DEBUG nova.compute.manager [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Preparing to wait for external event network-vif-plugged-c3de84a1-7764-4c77-a2fd-fd169639ed1e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.647 187189 DEBUG oslo_concurrency.lockutils [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Acquiring lock "f22f95f6-efd0-4710-adf7-895e0acda50c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.648 187189 DEBUG oslo_concurrency.lockutils [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "f22f95f6-efd0-4710-adf7-895e0acda50c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.648 187189 DEBUG oslo_concurrency.lockutils [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "f22f95f6-efd0-4710-adf7-895e0acda50c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.649 187189 DEBUG nova.virt.libvirt.vif [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:27:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1499668672',display_name='tempest-ListServerFiltersTestJSON-instance-1499668672',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1499668672',id=132,image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7843cfa993a1428aaaa660321ebba1ac',ramdisk_id='',reservation_id='r-meotbwol',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-1571311845',owner_user_name='tempest-ListServerFiltersTestJSON-1571311845-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:27:47Z,user_data=None,user_id='3e2a40601ced4de78fe1767769f262c0',uuid=f22f95f6-efd0-4710-adf7-895e0acda50c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c3de84a1-7764-4c77-a2fd-fd169639ed1e", "address": "fa:16:3e:f7:bb:74", "network": {"id": "28412826-5463-46e4-95cb-a7d788b1ab15", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1363473809-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7843cfa993a1428aaaa660321ebba1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3de84a1-77", "ovs_interfaceid": "c3de84a1-7764-4c77-a2fd-fd169639ed1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.649 187189 DEBUG nova.network.os_vif_util [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Converting VIF {"id": "c3de84a1-7764-4c77-a2fd-fd169639ed1e", "address": "fa:16:3e:f7:bb:74", "network": {"id": "28412826-5463-46e4-95cb-a7d788b1ab15", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1363473809-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7843cfa993a1428aaaa660321ebba1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3de84a1-77", "ovs_interfaceid": "c3de84a1-7764-4c77-a2fd-fd169639ed1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.649 187189 DEBUG nova.network.os_vif_util [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f7:bb:74,bridge_name='br-int',has_traffic_filtering=True,id=c3de84a1-7764-4c77-a2fd-fd169639ed1e,network=Network(28412826-5463-46e4-95cb-a7d788b1ab15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3de84a1-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.650 187189 DEBUG os_vif [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f7:bb:74,bridge_name='br-int',has_traffic_filtering=True,id=c3de84a1-7764-4c77-a2fd-fd169639ed1e,network=Network(28412826-5463-46e4-95cb-a7d788b1ab15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3de84a1-77') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.651 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.651 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.651 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.653 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.654 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc3de84a1-77, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.654 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc3de84a1-77, col_values=(('external_ids', {'iface-id': 'c3de84a1-7764-4c77-a2fd-fd169639ed1e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f7:bb:74', 'vm-uuid': 'f22f95f6-efd0-4710-adf7-895e0acda50c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.656 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:27:58 compute-0 NetworkManager[55227]: <info>  [1764401278.6570] manager: (tapc3de84a1-77): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/220)
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.658 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.662 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.664 187189 INFO os_vif [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f7:bb:74,bridge_name='br-int',has_traffic_filtering=True,id=c3de84a1-7764-4c77-a2fd-fd169639ed1e,network=Network(28412826-5463-46e4-95cb-a7d788b1ab15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3de84a1-77')
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.714 187189 DEBUG oslo_concurrency.lockutils [None req-ae3e8584-36be-4f58-ae6d-a09054525e72 5be41a8530314f83bbecbb74b9276f2d ac3bb322fa744e099b38e08abe12d0e2 - - default default] Lock "5f11adcd-958a-4269-905d-a017406505f0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.921s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.790 187189 DEBUG nova.virt.libvirt.driver [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.791 187189 DEBUG nova.virt.libvirt.driver [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.791 187189 DEBUG nova.virt.libvirt.driver [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] No VIF found with MAC fa:16:3e:f7:bb:74, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:27:58 compute-0 nova_compute[187185]: 2025-11-29 07:27:58.792 187189 INFO nova.virt.libvirt.driver [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Using config drive
Nov 29 07:28:00 compute-0 nova_compute[187185]: 2025-11-29 07:28:00.305 187189 INFO nova.virt.libvirt.driver [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Creating config drive at /var/lib/nova/instances/f22f95f6-efd0-4710-adf7-895e0acda50c/disk.config
Nov 29 07:28:00 compute-0 nova_compute[187185]: 2025-11-29 07:28:00.316 187189 DEBUG oslo_concurrency.processutils [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f22f95f6-efd0-4710-adf7-895e0acda50c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_qfahqyv execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:28:00 compute-0 nova_compute[187185]: 2025-11-29 07:28:00.466 187189 DEBUG oslo_concurrency.processutils [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f22f95f6-efd0-4710-adf7-895e0acda50c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_qfahqyv" returned: 0 in 0.150s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:28:00 compute-0 kernel: tapc3de84a1-77: entered promiscuous mode
Nov 29 07:28:00 compute-0 NetworkManager[55227]: <info>  [1764401280.5637] manager: (tapc3de84a1-77): new Tun device (/org/freedesktop/NetworkManager/Devices/221)
Nov 29 07:28:00 compute-0 ovn_controller[95281]: 2025-11-29T07:28:00Z|00423|binding|INFO|Claiming lport c3de84a1-7764-4c77-a2fd-fd169639ed1e for this chassis.
Nov 29 07:28:00 compute-0 ovn_controller[95281]: 2025-11-29T07:28:00Z|00424|binding|INFO|c3de84a1-7764-4c77-a2fd-fd169639ed1e: Claiming fa:16:3e:f7:bb:74 10.100.0.13
Nov 29 07:28:00 compute-0 nova_compute[187185]: 2025-11-29 07:28:00.566 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:00.585 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f7:bb:74 10.100.0.13'], port_security=['fa:16:3e:f7:bb:74 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-28412826-5463-46e4-95cb-a7d788b1ab15', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7843cfa993a1428aaaa660321ebba1ac', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b91ab01c-e143-4067-9931-a92270268d77', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3cbf7b29-c247-42f8-abc3-94d1e6be8d3f, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=c3de84a1-7764-4c77-a2fd-fd169639ed1e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:00.587 104254 INFO neutron.agent.ovn.metadata.agent [-] Port c3de84a1-7764-4c77-a2fd-fd169639ed1e in datapath 28412826-5463-46e4-95cb-a7d788b1ab15 bound to our chassis
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:00.590 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 28412826-5463-46e4-95cb-a7d788b1ab15
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:00.603 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[a806ce95-588b-4eab-b511-7b232a5cb24a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:00.603 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap28412826-51 in ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:00.605 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap28412826-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:00.605 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[92923da3-661e-4edf-bb01-01ce274c3b76]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:00.606 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[da307dc2-eff3-4e5b-9bcb-4ca0a0a0de31]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:00 compute-0 systemd-udevd[236677]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:00.621 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[a110e607-239a-43f5-a4d0-f63ace57c239]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:00 compute-0 NetworkManager[55227]: <info>  [1764401280.6261] device (tapc3de84a1-77): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:28:00 compute-0 nova_compute[187185]: 2025-11-29 07:28:00.623 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:00 compute-0 ovn_controller[95281]: 2025-11-29T07:28:00Z|00425|binding|INFO|Setting lport c3de84a1-7764-4c77-a2fd-fd169639ed1e ovn-installed in OVS
Nov 29 07:28:00 compute-0 ovn_controller[95281]: 2025-11-29T07:28:00Z|00426|binding|INFO|Setting lport c3de84a1-7764-4c77-a2fd-fd169639ed1e up in Southbound
Nov 29 07:28:00 compute-0 NetworkManager[55227]: <info>  [1764401280.6281] device (tapc3de84a1-77): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:28:00 compute-0 nova_compute[187185]: 2025-11-29 07:28:00.629 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:00 compute-0 systemd-machined[153486]: New machine qemu-53-instance-00000084.
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:00.649 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[8c091c00-dc9d-42f6-8db0-3669563d0115]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:00 compute-0 systemd[1]: Started Virtual Machine qemu-53-instance-00000084.
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:00.688 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[379d919e-d015-45a0-bbdf-0abdd4c224ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:00.695 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[612fb416-9bcc-4a13-a2e1-4e6377aea1cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:00 compute-0 NetworkManager[55227]: <info>  [1764401280.6967] manager: (tap28412826-50): new Veth device (/org/freedesktop/NetworkManager/Devices/222)
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:00.732 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[97725fff-895f-42a3-821b-c2b64d7399c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:00.735 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[550f1f78-5dc8-4fff-89db-10445476af0c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:00 compute-0 NetworkManager[55227]: <info>  [1764401280.7631] device (tap28412826-50): carrier: link connected
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:00.769 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[bba4c3a6-bb54-4b11-aa03-2257abf07c67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:00.784 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[f004a409-2704-4918-88d4-32c2a5888ade]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap28412826-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fc:c0:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 137], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 672842, 'reachable_time': 35550, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236712, 'error': None, 'target': 'ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:00.801 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[73138b22-4166-4ec0-aba7-42facf0940ba]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefc:c072'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 672842, 'tstamp': 672842}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236713, 'error': None, 'target': 'ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:00.823 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[5064ac56-d4d9-4ded-aaea-d4b45c3d682a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap28412826-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fc:c0:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 137], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 672842, 'reachable_time': 35550, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 236714, 'error': None, 'target': 'ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:00.861 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[847dee10-9d1b-4b24-9734-b5ecf0c8bd26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:00.929 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[3a5ed41c-5d0c-4d72-8f4d-bc5a08c7b738]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:00.930 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap28412826-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:00.930 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:00.931 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap28412826-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:28:00 compute-0 nova_compute[187185]: 2025-11-29 07:28:00.932 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:00 compute-0 kernel: tap28412826-50: entered promiscuous mode
Nov 29 07:28:00 compute-0 NetworkManager[55227]: <info>  [1764401280.9334] manager: (tap28412826-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/223)
Nov 29 07:28:00 compute-0 nova_compute[187185]: 2025-11-29 07:28:00.935 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:00.936 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap28412826-50, col_values=(('external_ids', {'iface-id': '2abf732f-8f8c-470e-b6e2-def265b14d70'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:28:00 compute-0 nova_compute[187185]: 2025-11-29 07:28:00.936 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:00 compute-0 ovn_controller[95281]: 2025-11-29T07:28:00Z|00427|binding|INFO|Releasing lport 2abf732f-8f8c-470e-b6e2-def265b14d70 from this chassis (sb_readonly=0)
Nov 29 07:28:00 compute-0 nova_compute[187185]: 2025-11-29 07:28:00.954 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:00.956 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/28412826-5463-46e4-95cb-a7d788b1ab15.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/28412826-5463-46e4-95cb-a7d788b1ab15.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:00.957 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[04322910-f6d3-4290-a8e1-bc83bc692c1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:00.958 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-28412826-5463-46e4-95cb-a7d788b1ab15
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/28412826-5463-46e4-95cb-a7d788b1ab15.pid.haproxy
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID 28412826-5463-46e4-95cb-a7d788b1ab15
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:28:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:00.959 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15', 'env', 'PROCESS_TAG=haproxy-28412826-5463-46e4-95cb-a7d788b1ab15', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/28412826-5463-46e4-95cb-a7d788b1ab15.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:28:01 compute-0 nova_compute[187185]: 2025-11-29 07:28:01.163 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764401281.162189, f22f95f6-efd0-4710-adf7-895e0acda50c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:28:01 compute-0 nova_compute[187185]: 2025-11-29 07:28:01.166 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] VM Started (Lifecycle Event)
Nov 29 07:28:01 compute-0 nova_compute[187185]: 2025-11-29 07:28:01.188 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:28:01 compute-0 nova_compute[187185]: 2025-11-29 07:28:01.194 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764401281.164323, f22f95f6-efd0-4710-adf7-895e0acda50c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:28:01 compute-0 nova_compute[187185]: 2025-11-29 07:28:01.194 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] VM Paused (Lifecycle Event)
Nov 29 07:28:01 compute-0 nova_compute[187185]: 2025-11-29 07:28:01.238 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:28:01 compute-0 nova_compute[187185]: 2025-11-29 07:28:01.242 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:28:01 compute-0 nova_compute[187185]: 2025-11-29 07:28:01.264 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:28:01 compute-0 podman[236753]: 2025-11-29 07:28:01.407812833 +0000 UTC m=+0.064495650 container create 6080745654af630fcfb35ff0e01dd7a6af8f7fadbae34c161bfc4be9d220da35 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 07:28:01 compute-0 systemd[1]: Started libpod-conmon-6080745654af630fcfb35ff0e01dd7a6af8f7fadbae34c161bfc4be9d220da35.scope.
Nov 29 07:28:01 compute-0 podman[236753]: 2025-11-29 07:28:01.365493569 +0000 UTC m=+0.022176376 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:28:01 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:28:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6f199aa1b7ed84e9d95e9aaa5b9e5c9d0ce191c277ba5603629bbbd6a31194db/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:28:01 compute-0 nova_compute[187185]: 2025-11-29 07:28:01.507 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:01 compute-0 podman[236753]: 2025-11-29 07:28:01.520202362 +0000 UTC m=+0.176885159 container init 6080745654af630fcfb35ff0e01dd7a6af8f7fadbae34c161bfc4be9d220da35 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Nov 29 07:28:01 compute-0 podman[236753]: 2025-11-29 07:28:01.528018923 +0000 UTC m=+0.184701720 container start 6080745654af630fcfb35ff0e01dd7a6af8f7fadbae34c161bfc4be9d220da35 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 29 07:28:01 compute-0 nova_compute[187185]: 2025-11-29 07:28:01.528 187189 DEBUG nova.compute.manager [req-f3bf0509-3bd5-4152-a711-d3a995e6ebad req-97bb0b8e-cd8e-43fa-983a-fb713dc6324e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Received event network-vif-plugged-c3de84a1-7764-4c77-a2fd-fd169639ed1e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:28:01 compute-0 nova_compute[187185]: 2025-11-29 07:28:01.529 187189 DEBUG oslo_concurrency.lockutils [req-f3bf0509-3bd5-4152-a711-d3a995e6ebad req-97bb0b8e-cd8e-43fa-983a-fb713dc6324e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "f22f95f6-efd0-4710-adf7-895e0acda50c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:28:01 compute-0 nova_compute[187185]: 2025-11-29 07:28:01.529 187189 DEBUG oslo_concurrency.lockutils [req-f3bf0509-3bd5-4152-a711-d3a995e6ebad req-97bb0b8e-cd8e-43fa-983a-fb713dc6324e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f22f95f6-efd0-4710-adf7-895e0acda50c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:28:01 compute-0 nova_compute[187185]: 2025-11-29 07:28:01.530 187189 DEBUG oslo_concurrency.lockutils [req-f3bf0509-3bd5-4152-a711-d3a995e6ebad req-97bb0b8e-cd8e-43fa-983a-fb713dc6324e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f22f95f6-efd0-4710-adf7-895e0acda50c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:28:01 compute-0 nova_compute[187185]: 2025-11-29 07:28:01.530 187189 DEBUG nova.compute.manager [req-f3bf0509-3bd5-4152-a711-d3a995e6ebad req-97bb0b8e-cd8e-43fa-983a-fb713dc6324e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Processing event network-vif-plugged-c3de84a1-7764-4c77-a2fd-fd169639ed1e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 07:28:01 compute-0 nova_compute[187185]: 2025-11-29 07:28:01.532 187189 DEBUG nova.network.neutron [req-f516ab28-a182-48b8-94b7-58975445e3ff req-a3ec2e2e-f05c-4374-a191-f66d45734a3a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Updated VIF entry in instance network info cache for port c3de84a1-7764-4c77-a2fd-fd169639ed1e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:28:01 compute-0 nova_compute[187185]: 2025-11-29 07:28:01.532 187189 DEBUG nova.network.neutron [req-f516ab28-a182-48b8-94b7-58975445e3ff req-a3ec2e2e-f05c-4374-a191-f66d45734a3a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Updating instance_info_cache with network_info: [{"id": "c3de84a1-7764-4c77-a2fd-fd169639ed1e", "address": "fa:16:3e:f7:bb:74", "network": {"id": "28412826-5463-46e4-95cb-a7d788b1ab15", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1363473809-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7843cfa993a1428aaaa660321ebba1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3de84a1-77", "ovs_interfaceid": "c3de84a1-7764-4c77-a2fd-fd169639ed1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:28:01 compute-0 nova_compute[187185]: 2025-11-29 07:28:01.534 187189 DEBUG nova.compute.manager [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:28:01 compute-0 nova_compute[187185]: 2025-11-29 07:28:01.538 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764401281.5386071, f22f95f6-efd0-4710-adf7-895e0acda50c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:28:01 compute-0 nova_compute[187185]: 2025-11-29 07:28:01.539 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] VM Resumed (Lifecycle Event)
Nov 29 07:28:01 compute-0 nova_compute[187185]: 2025-11-29 07:28:01.541 187189 DEBUG nova.virt.libvirt.driver [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 07:28:01 compute-0 nova_compute[187185]: 2025-11-29 07:28:01.547 187189 INFO nova.virt.libvirt.driver [-] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Instance spawned successfully.
Nov 29 07:28:01 compute-0 nova_compute[187185]: 2025-11-29 07:28:01.547 187189 DEBUG nova.virt.libvirt.driver [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 07:28:01 compute-0 neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15[236768]: [NOTICE]   (236773) : New worker (236775) forked
Nov 29 07:28:01 compute-0 neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15[236768]: [NOTICE]   (236773) : Loading success.
Nov 29 07:28:01 compute-0 nova_compute[187185]: 2025-11-29 07:28:01.612 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:28:01 compute-0 nova_compute[187185]: 2025-11-29 07:28:01.615 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:28:01 compute-0 nova_compute[187185]: 2025-11-29 07:28:01.650 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:28:01 compute-0 nova_compute[187185]: 2025-11-29 07:28:01.663 187189 DEBUG oslo_concurrency.lockutils [req-f516ab28-a182-48b8-94b7-58975445e3ff req-a3ec2e2e-f05c-4374-a191-f66d45734a3a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-f22f95f6-efd0-4710-adf7-895e0acda50c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:28:01 compute-0 nova_compute[187185]: 2025-11-29 07:28:01.667 187189 DEBUG nova.virt.libvirt.driver [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:28:01 compute-0 nova_compute[187185]: 2025-11-29 07:28:01.668 187189 DEBUG nova.virt.libvirt.driver [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:28:01 compute-0 nova_compute[187185]: 2025-11-29 07:28:01.668 187189 DEBUG nova.virt.libvirt.driver [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:28:01 compute-0 nova_compute[187185]: 2025-11-29 07:28:01.669 187189 DEBUG nova.virt.libvirt.driver [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:28:01 compute-0 nova_compute[187185]: 2025-11-29 07:28:01.669 187189 DEBUG nova.virt.libvirt.driver [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:28:01 compute-0 nova_compute[187185]: 2025-11-29 07:28:01.671 187189 DEBUG nova.virt.libvirt.driver [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:28:01 compute-0 nova_compute[187185]: 2025-11-29 07:28:01.806 187189 INFO nova.compute.manager [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Took 13.70 seconds to spawn the instance on the hypervisor.
Nov 29 07:28:01 compute-0 nova_compute[187185]: 2025-11-29 07:28:01.806 187189 DEBUG nova.compute.manager [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:28:01 compute-0 nova_compute[187185]: 2025-11-29 07:28:01.924 187189 INFO nova.compute.manager [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Took 14.85 seconds to build instance.
Nov 29 07:28:01 compute-0 nova_compute[187185]: 2025-11-29 07:28:01.948 187189 DEBUG oslo_concurrency.lockutils [None req-3c11af81-8da9-48f6-983c-355069c4c778 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "f22f95f6-efd0-4710-adf7-895e0acda50c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.013s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:28:02 compute-0 podman[236784]: 2025-11-29 07:28:02.831094359 +0000 UTC m=+0.094034454 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125)
Nov 29 07:28:03 compute-0 nova_compute[187185]: 2025-11-29 07:28:03.704 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:03 compute-0 nova_compute[187185]: 2025-11-29 07:28:03.709 187189 DEBUG nova.compute.manager [req-8e078265-ccfd-406f-aba1-780bebb0fff9 req-31284a57-b44b-445c-afb2-374d1febc3be 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Received event network-vif-plugged-c3de84a1-7764-4c77-a2fd-fd169639ed1e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:28:03 compute-0 nova_compute[187185]: 2025-11-29 07:28:03.710 187189 DEBUG oslo_concurrency.lockutils [req-8e078265-ccfd-406f-aba1-780bebb0fff9 req-31284a57-b44b-445c-afb2-374d1febc3be 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "f22f95f6-efd0-4710-adf7-895e0acda50c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:28:03 compute-0 nova_compute[187185]: 2025-11-29 07:28:03.710 187189 DEBUG oslo_concurrency.lockutils [req-8e078265-ccfd-406f-aba1-780bebb0fff9 req-31284a57-b44b-445c-afb2-374d1febc3be 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f22f95f6-efd0-4710-adf7-895e0acda50c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:28:03 compute-0 nova_compute[187185]: 2025-11-29 07:28:03.710 187189 DEBUG oslo_concurrency.lockutils [req-8e078265-ccfd-406f-aba1-780bebb0fff9 req-31284a57-b44b-445c-afb2-374d1febc3be 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f22f95f6-efd0-4710-adf7-895e0acda50c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:28:03 compute-0 nova_compute[187185]: 2025-11-29 07:28:03.711 187189 DEBUG nova.compute.manager [req-8e078265-ccfd-406f-aba1-780bebb0fff9 req-31284a57-b44b-445c-afb2-374d1febc3be 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] No waiting events found dispatching network-vif-plugged-c3de84a1-7764-4c77-a2fd-fd169639ed1e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:28:03 compute-0 nova_compute[187185]: 2025-11-29 07:28:03.711 187189 WARNING nova.compute.manager [req-8e078265-ccfd-406f-aba1-780bebb0fff9 req-31284a57-b44b-445c-afb2-374d1febc3be 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Received unexpected event network-vif-plugged-c3de84a1-7764-4c77-a2fd-fd169639ed1e for instance with vm_state active and task_state None.
Nov 29 07:28:06 compute-0 nova_compute[187185]: 2025-11-29 07:28:06.510 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:08 compute-0 ovn_controller[95281]: 2025-11-29T07:28:08Z|00428|binding|INFO|Releasing lport 2abf732f-8f8c-470e-b6e2-def265b14d70 from this chassis (sb_readonly=0)
Nov 29 07:28:08 compute-0 nova_compute[187185]: 2025-11-29 07:28:08.162 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:08 compute-0 nova_compute[187185]: 2025-11-29 07:28:08.707 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:09 compute-0 nova_compute[187185]: 2025-11-29 07:28:09.113 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764401274.108925, 5f11adcd-958a-4269-905d-a017406505f0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:28:09 compute-0 nova_compute[187185]: 2025-11-29 07:28:09.117 187189 INFO nova.compute.manager [-] [instance: 5f11adcd-958a-4269-905d-a017406505f0] VM Stopped (Lifecycle Event)
Nov 29 07:28:09 compute-0 nova_compute[187185]: 2025-11-29 07:28:09.204 187189 DEBUG nova.compute.manager [None req-c862f8e5-63d8-4e03-b01d-27ad7e40014b - - - - - -] [instance: 5f11adcd-958a-4269-905d-a017406505f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:28:09 compute-0 podman[236810]: 2025-11-29 07:28:09.799205359 +0000 UTC m=+0.063871182 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 07:28:11 compute-0 nova_compute[187185]: 2025-11-29 07:28:11.511 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:13 compute-0 ovn_controller[95281]: 2025-11-29T07:28:13Z|00055|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f7:bb:74 10.100.0.13
Nov 29 07:28:13 compute-0 ovn_controller[95281]: 2025-11-29T07:28:13Z|00056|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f7:bb:74 10.100.0.13
Nov 29 07:28:13 compute-0 nova_compute[187185]: 2025-11-29 07:28:13.710 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:14 compute-0 nova_compute[187185]: 2025-11-29 07:28:14.342 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:28:14 compute-0 nova_compute[187185]: 2025-11-29 07:28:14.343 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:28:14 compute-0 nova_compute[187185]: 2025-11-29 07:28:14.344 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:28:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:14.682 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b9:46:33 2001:db8:0:1:f816:3eff:feb9:4633 2001:db8::f816:3eff:feb9:4633'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:feb9:4633/64 2001:db8::f816:3eff:feb9:4633/64', 'neutron:device_id': 'ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ff387e90-45c2-42d7-b536-fee4d2b6eb5e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ed5ad144-c783-4b67-a226-e0c5588d3535, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=f0ce4da0-40ec-44ef-8179-4cbfad9b57f1) old=Port_Binding(mac=['fa:16:3e:b9:46:33 2001:db8::f816:3eff:feb9:4633'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:feb9:4633/64', 'neutron:device_id': 'ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ff387e90-45c2-42d7-b536-fee4d2b6eb5e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:28:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:14.684 104254 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port f0ce4da0-40ec-44ef-8179-4cbfad9b57f1 in datapath ff387e90-45c2-42d7-b536-fee4d2b6eb5e updated
Nov 29 07:28:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:14.686 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ff387e90-45c2-42d7-b536-fee4d2b6eb5e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:28:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:14.687 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[9c22f37b-43aa-4afd-a42c-c1e51d424413]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:14 compute-0 nova_compute[187185]: 2025-11-29 07:28:14.707 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "refresh_cache-f22f95f6-efd0-4710-adf7-895e0acda50c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:28:14 compute-0 nova_compute[187185]: 2025-11-29 07:28:14.708 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquired lock "refresh_cache-f22f95f6-efd0-4710-adf7-895e0acda50c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:28:14 compute-0 nova_compute[187185]: 2025-11-29 07:28:14.708 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 29 07:28:14 compute-0 nova_compute[187185]: 2025-11-29 07:28:14.709 187189 DEBUG nova.objects.instance [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f22f95f6-efd0-4710-adf7-895e0acda50c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:28:14 compute-0 podman[236854]: 2025-11-29 07:28:14.7965163 +0000 UTC m=+0.056430562 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 07:28:14 compute-0 podman[236855]: 2025-11-29 07:28:14.808746255 +0000 UTC m=+0.066186078 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 29 07:28:16 compute-0 nova_compute[187185]: 2025-11-29 07:28:16.514 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:16 compute-0 nova_compute[187185]: 2025-11-29 07:28:16.987 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Updating instance_info_cache with network_info: [{"id": "c3de84a1-7764-4c77-a2fd-fd169639ed1e", "address": "fa:16:3e:f7:bb:74", "network": {"id": "28412826-5463-46e4-95cb-a7d788b1ab15", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1363473809-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7843cfa993a1428aaaa660321ebba1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3de84a1-77", "ovs_interfaceid": "c3de84a1-7764-4c77-a2fd-fd169639ed1e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:28:17 compute-0 nova_compute[187185]: 2025-11-29 07:28:17.023 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Releasing lock "refresh_cache-f22f95f6-efd0-4710-adf7-895e0acda50c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:28:17 compute-0 nova_compute[187185]: 2025-11-29 07:28:17.023 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 29 07:28:17 compute-0 nova_compute[187185]: 2025-11-29 07:28:17.024 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:28:17 compute-0 nova_compute[187185]: 2025-11-29 07:28:17.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:28:17 compute-0 nova_compute[187185]: 2025-11-29 07:28:17.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:28:17 compute-0 nova_compute[187185]: 2025-11-29 07:28:17.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:28:17 compute-0 nova_compute[187185]: 2025-11-29 07:28:17.337 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:28:17 compute-0 nova_compute[187185]: 2025-11-29 07:28:17.338 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:28:17 compute-0 nova_compute[187185]: 2025-11-29 07:28:17.339 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:28:17 compute-0 nova_compute[187185]: 2025-11-29 07:28:17.339 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:28:17 compute-0 nova_compute[187185]: 2025-11-29 07:28:17.421 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f22f95f6-efd0-4710-adf7-895e0acda50c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:28:17 compute-0 nova_compute[187185]: 2025-11-29 07:28:17.504 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f22f95f6-efd0-4710-adf7-895e0acda50c/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:28:17 compute-0 nova_compute[187185]: 2025-11-29 07:28:17.505 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f22f95f6-efd0-4710-adf7-895e0acda50c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:28:17 compute-0 nova_compute[187185]: 2025-11-29 07:28:17.559 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f22f95f6-efd0-4710-adf7-895e0acda50c/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:28:17 compute-0 nova_compute[187185]: 2025-11-29 07:28:17.758 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:28:17 compute-0 nova_compute[187185]: 2025-11-29 07:28:17.760 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5529MB free_disk=73.21548080444336GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:28:17 compute-0 nova_compute[187185]: 2025-11-29 07:28:17.760 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:28:17 compute-0 nova_compute[187185]: 2025-11-29 07:28:17.761 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:28:17 compute-0 nova_compute[187185]: 2025-11-29 07:28:17.870 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance f22f95f6-efd0-4710-adf7-895e0acda50c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 07:28:17 compute-0 nova_compute[187185]: 2025-11-29 07:28:17.871 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:28:17 compute-0 nova_compute[187185]: 2025-11-29 07:28:17.872 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:28:17 compute-0 nova_compute[187185]: 2025-11-29 07:28:17.943 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:28:17 compute-0 nova_compute[187185]: 2025-11-29 07:28:17.967 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:28:18 compute-0 nova_compute[187185]: 2025-11-29 07:28:18.024 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:28:18 compute-0 nova_compute[187185]: 2025-11-29 07:28:18.025 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.265s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:28:18 compute-0 nova_compute[187185]: 2025-11-29 07:28:18.714 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:19 compute-0 nova_compute[187185]: 2025-11-29 07:28:19.027 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:28:19 compute-0 nova_compute[187185]: 2025-11-29 07:28:19.027 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:28:20 compute-0 nova_compute[187185]: 2025-11-29 07:28:20.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:28:20 compute-0 nova_compute[187185]: 2025-11-29 07:28:20.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:28:20 compute-0 nova_compute[187185]: 2025-11-29 07:28:20.828 187189 DEBUG oslo_concurrency.lockutils [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "2cb4e847-114f-440a-b231-65e3fff0f0d2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:28:20 compute-0 nova_compute[187185]: 2025-11-29 07:28:20.829 187189 DEBUG oslo_concurrency.lockutils [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2cb4e847-114f-440a-b231-65e3fff0f0d2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:28:20 compute-0 nova_compute[187185]: 2025-11-29 07:28:20.867 187189 DEBUG nova.compute.manager [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 07:28:20 compute-0 nova_compute[187185]: 2025-11-29 07:28:20.957 187189 DEBUG oslo_concurrency.lockutils [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:28:20 compute-0 nova_compute[187185]: 2025-11-29 07:28:20.958 187189 DEBUG oslo_concurrency.lockutils [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:28:20 compute-0 nova_compute[187185]: 2025-11-29 07:28:20.964 187189 DEBUG nova.virt.hardware [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 07:28:20 compute-0 nova_compute[187185]: 2025-11-29 07:28:20.965 187189 INFO nova.compute.claims [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Claim successful on node compute-0.ctlplane.example.com
Nov 29 07:28:21 compute-0 nova_compute[187185]: 2025-11-29 07:28:21.110 187189 DEBUG nova.compute.provider_tree [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:28:21 compute-0 nova_compute[187185]: 2025-11-29 07:28:21.151 187189 DEBUG nova.scheduler.client.report [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:28:21 compute-0 nova_compute[187185]: 2025-11-29 07:28:21.194 187189 DEBUG oslo_concurrency.lockutils [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.236s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:28:21 compute-0 nova_compute[187185]: 2025-11-29 07:28:21.196 187189 DEBUG nova.compute.manager [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 07:28:21 compute-0 nova_compute[187185]: 2025-11-29 07:28:21.274 187189 DEBUG nova.compute.manager [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 07:28:21 compute-0 nova_compute[187185]: 2025-11-29 07:28:21.275 187189 DEBUG nova.network.neutron [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 07:28:21 compute-0 nova_compute[187185]: 2025-11-29 07:28:21.295 187189 INFO nova.virt.libvirt.driver [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 07:28:21 compute-0 nova_compute[187185]: 2025-11-29 07:28:21.313 187189 DEBUG nova.compute.manager [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 07:28:21 compute-0 nova_compute[187185]: 2025-11-29 07:28:21.446 187189 DEBUG nova.compute.manager [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 07:28:21 compute-0 nova_compute[187185]: 2025-11-29 07:28:21.450 187189 DEBUG nova.virt.libvirt.driver [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 07:28:21 compute-0 nova_compute[187185]: 2025-11-29 07:28:21.451 187189 INFO nova.virt.libvirt.driver [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Creating image(s)
Nov 29 07:28:21 compute-0 nova_compute[187185]: 2025-11-29 07:28:21.452 187189 DEBUG oslo_concurrency.lockutils [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "/var/lib/nova/instances/2cb4e847-114f-440a-b231-65e3fff0f0d2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:28:21 compute-0 nova_compute[187185]: 2025-11-29 07:28:21.452 187189 DEBUG oslo_concurrency.lockutils [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "/var/lib/nova/instances/2cb4e847-114f-440a-b231-65e3fff0f0d2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:28:21 compute-0 nova_compute[187185]: 2025-11-29 07:28:21.454 187189 DEBUG oslo_concurrency.lockutils [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "/var/lib/nova/instances/2cb4e847-114f-440a-b231-65e3fff0f0d2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:28:21 compute-0 nova_compute[187185]: 2025-11-29 07:28:21.481 187189 DEBUG oslo_concurrency.processutils [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:28:21 compute-0 nova_compute[187185]: 2025-11-29 07:28:21.517 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:21 compute-0 nova_compute[187185]: 2025-11-29 07:28:21.567 187189 DEBUG oslo_concurrency.processutils [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:28:21 compute-0 nova_compute[187185]: 2025-11-29 07:28:21.568 187189 DEBUG oslo_concurrency.lockutils [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:28:21 compute-0 nova_compute[187185]: 2025-11-29 07:28:21.569 187189 DEBUG oslo_concurrency.lockutils [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:28:21 compute-0 nova_compute[187185]: 2025-11-29 07:28:21.587 187189 DEBUG oslo_concurrency.processutils [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:28:21 compute-0 nova_compute[187185]: 2025-11-29 07:28:21.641 187189 DEBUG oslo_concurrency.processutils [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:28:21 compute-0 nova_compute[187185]: 2025-11-29 07:28:21.642 187189 DEBUG oslo_concurrency.processutils [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/2cb4e847-114f-440a-b231-65e3fff0f0d2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:28:21 compute-0 nova_compute[187185]: 2025-11-29 07:28:21.678 187189 DEBUG oslo_concurrency.processutils [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/2cb4e847-114f-440a-b231-65e3fff0f0d2/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:28:21 compute-0 nova_compute[187185]: 2025-11-29 07:28:21.679 187189 DEBUG oslo_concurrency.lockutils [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:28:21 compute-0 nova_compute[187185]: 2025-11-29 07:28:21.680 187189 DEBUG oslo_concurrency.processutils [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:28:21 compute-0 nova_compute[187185]: 2025-11-29 07:28:21.743 187189 DEBUG nova.policy [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 07:28:21 compute-0 nova_compute[187185]: 2025-11-29 07:28:21.749 187189 DEBUG oslo_concurrency.processutils [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:28:21 compute-0 nova_compute[187185]: 2025-11-29 07:28:21.750 187189 DEBUG nova.virt.disk.api [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Checking if we can resize image /var/lib/nova/instances/2cb4e847-114f-440a-b231-65e3fff0f0d2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:28:21 compute-0 nova_compute[187185]: 2025-11-29 07:28:21.751 187189 DEBUG oslo_concurrency.processutils [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2cb4e847-114f-440a-b231-65e3fff0f0d2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:28:21 compute-0 nova_compute[187185]: 2025-11-29 07:28:21.811 187189 DEBUG oslo_concurrency.processutils [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2cb4e847-114f-440a-b231-65e3fff0f0d2/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:28:21 compute-0 nova_compute[187185]: 2025-11-29 07:28:21.813 187189 DEBUG nova.virt.disk.api [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Cannot resize image /var/lib/nova/instances/2cb4e847-114f-440a-b231-65e3fff0f0d2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:28:21 compute-0 nova_compute[187185]: 2025-11-29 07:28:21.813 187189 DEBUG nova.objects.instance [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'migration_context' on Instance uuid 2cb4e847-114f-440a-b231-65e3fff0f0d2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:28:21 compute-0 nova_compute[187185]: 2025-11-29 07:28:21.831 187189 DEBUG nova.virt.libvirt.driver [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 07:28:21 compute-0 nova_compute[187185]: 2025-11-29 07:28:21.831 187189 DEBUG nova.virt.libvirt.driver [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Ensure instance console log exists: /var/lib/nova/instances/2cb4e847-114f-440a-b231-65e3fff0f0d2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 07:28:21 compute-0 nova_compute[187185]: 2025-11-29 07:28:21.832 187189 DEBUG oslo_concurrency.lockutils [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:28:21 compute-0 nova_compute[187185]: 2025-11-29 07:28:21.833 187189 DEBUG oslo_concurrency.lockutils [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:28:21 compute-0 nova_compute[187185]: 2025-11-29 07:28:21.834 187189 DEBUG oslo_concurrency.lockutils [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:28:22 compute-0 nova_compute[187185]: 2025-11-29 07:28:22.483 187189 DEBUG nova.network.neutron [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Successfully created port: 836d3fdf-e98b-4a41-864f-9e3fbdb29394 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 07:28:23 compute-0 nova_compute[187185]: 2025-11-29 07:28:23.051 187189 DEBUG nova.network.neutron [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Successfully created port: 9c194df0-4c84-41bd-94af-7e4ecd312dd5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 07:28:23 compute-0 nova_compute[187185]: 2025-11-29 07:28:23.718 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:24 compute-0 nova_compute[187185]: 2025-11-29 07:28:24.962 187189 DEBUG nova.network.neutron [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Successfully updated port: 836d3fdf-e98b-4a41-864f-9e3fbdb29394 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 07:28:24 compute-0 podman[236916]: 2025-11-29 07:28:24.995861883 +0000 UTC m=+0.055914858 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent)
Nov 29 07:28:25 compute-0 podman[236917]: 2025-11-29 07:28:25.020694473 +0000 UTC m=+0.073774532 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.buildah.version=1.33.7, architecture=x86_64)
Nov 29 07:28:25 compute-0 nova_compute[187185]: 2025-11-29 07:28:25.056 187189 DEBUG nova.compute.manager [req-785be74e-6198-4e2c-bde2-de57c34ddbae req-3bbba090-43b3-4d00-bf42-e73d7f04cc2a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Received event network-changed-836d3fdf-e98b-4a41-864f-9e3fbdb29394 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:28:25 compute-0 nova_compute[187185]: 2025-11-29 07:28:25.056 187189 DEBUG nova.compute.manager [req-785be74e-6198-4e2c-bde2-de57c34ddbae req-3bbba090-43b3-4d00-bf42-e73d7f04cc2a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Refreshing instance network info cache due to event network-changed-836d3fdf-e98b-4a41-864f-9e3fbdb29394. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:28:25 compute-0 nova_compute[187185]: 2025-11-29 07:28:25.057 187189 DEBUG oslo_concurrency.lockutils [req-785be74e-6198-4e2c-bde2-de57c34ddbae req-3bbba090-43b3-4d00-bf42-e73d7f04cc2a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-2cb4e847-114f-440a-b231-65e3fff0f0d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:28:25 compute-0 nova_compute[187185]: 2025-11-29 07:28:25.057 187189 DEBUG oslo_concurrency.lockutils [req-785be74e-6198-4e2c-bde2-de57c34ddbae req-3bbba090-43b3-4d00-bf42-e73d7f04cc2a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-2cb4e847-114f-440a-b231-65e3fff0f0d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:28:25 compute-0 nova_compute[187185]: 2025-11-29 07:28:25.058 187189 DEBUG nova.network.neutron [req-785be74e-6198-4e2c-bde2-de57c34ddbae req-3bbba090-43b3-4d00-bf42-e73d7f04cc2a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Refreshing network info cache for port 836d3fdf-e98b-4a41-864f-9e3fbdb29394 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:28:25 compute-0 podman[236923]: 2025-11-29 07:28:25.058932842 +0000 UTC m=+0.100272349 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 07:28:25 compute-0 nova_compute[187185]: 2025-11-29 07:28:25.316 187189 DEBUG nova.network.neutron [req-785be74e-6198-4e2c-bde2-de57c34ddbae req-3bbba090-43b3-4d00-bf42-e73d7f04cc2a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:28:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:25.518 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:28:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:25.519 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:28:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:25.520 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:28:25 compute-0 nova_compute[187185]: 2025-11-29 07:28:25.783 187189 DEBUG nova.network.neutron [req-785be74e-6198-4e2c-bde2-de57c34ddbae req-3bbba090-43b3-4d00-bf42-e73d7f04cc2a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:28:25 compute-0 nova_compute[187185]: 2025-11-29 07:28:25.801 187189 DEBUG oslo_concurrency.lockutils [req-785be74e-6198-4e2c-bde2-de57c34ddbae req-3bbba090-43b3-4d00-bf42-e73d7f04cc2a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-2cb4e847-114f-440a-b231-65e3fff0f0d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:28:25 compute-0 nova_compute[187185]: 2025-11-29 07:28:25.924 187189 DEBUG nova.network.neutron [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Successfully updated port: 9c194df0-4c84-41bd-94af-7e4ecd312dd5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 07:28:25 compute-0 nova_compute[187185]: 2025-11-29 07:28:25.954 187189 DEBUG oslo_concurrency.lockutils [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "refresh_cache-2cb4e847-114f-440a-b231-65e3fff0f0d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:28:25 compute-0 nova_compute[187185]: 2025-11-29 07:28:25.954 187189 DEBUG oslo_concurrency.lockutils [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquired lock "refresh_cache-2cb4e847-114f-440a-b231-65e3fff0f0d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:28:25 compute-0 nova_compute[187185]: 2025-11-29 07:28:25.955 187189 DEBUG nova.network.neutron [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:28:26 compute-0 nova_compute[187185]: 2025-11-29 07:28:26.169 187189 DEBUG nova.network.neutron [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:28:26 compute-0 nova_compute[187185]: 2025-11-29 07:28:26.310 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:28:26 compute-0 nova_compute[187185]: 2025-11-29 07:28:26.530 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:27 compute-0 nova_compute[187185]: 2025-11-29 07:28:27.152 187189 DEBUG nova.compute.manager [req-a1eb42e1-37a9-4b7e-8b72-067a93a3dc3f req-ba901f00-c716-4808-ab9f-42afe87e7567 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Received event network-changed-9c194df0-4c84-41bd-94af-7e4ecd312dd5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:28:27 compute-0 nova_compute[187185]: 2025-11-29 07:28:27.153 187189 DEBUG nova.compute.manager [req-a1eb42e1-37a9-4b7e-8b72-067a93a3dc3f req-ba901f00-c716-4808-ab9f-42afe87e7567 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Refreshing instance network info cache due to event network-changed-9c194df0-4c84-41bd-94af-7e4ecd312dd5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:28:27 compute-0 nova_compute[187185]: 2025-11-29 07:28:27.153 187189 DEBUG oslo_concurrency.lockutils [req-a1eb42e1-37a9-4b7e-8b72-067a93a3dc3f req-ba901f00-c716-4808-ab9f-42afe87e7567 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-2cb4e847-114f-440a-b231-65e3fff0f0d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:28:28 compute-0 nova_compute[187185]: 2025-11-29 07:28:28.722 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.065 187189 DEBUG nova.network.neutron [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Updating instance_info_cache with network_info: [{"id": "836d3fdf-e98b-4a41-864f-9e3fbdb29394", "address": "fa:16:3e:d5:a1:ff", "network": {"id": "94472368-b72a-4e5d-ac59-40b24b7ba792", "bridge": "br-int", "label": "tempest-network-smoke--1761418689", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap836d3fdf-e9", "ovs_interfaceid": "836d3fdf-e98b-4a41-864f-9e3fbdb29394", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "9c194df0-4c84-41bd-94af-7e4ecd312dd5", "address": "fa:16:3e:da:a3:a5", "network": {"id": "ff387e90-45c2-42d7-b536-fee4d2b6eb5e", "bridge": "br-int", "label": "tempest-network-smoke--1344723239", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feda:a3a5", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feda:a3a5", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c194df0-4c", "ovs_interfaceid": "9c194df0-4c84-41bd-94af-7e4ecd312dd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.091 187189 DEBUG oslo_concurrency.lockutils [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Releasing lock "refresh_cache-2cb4e847-114f-440a-b231-65e3fff0f0d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.091 187189 DEBUG nova.compute.manager [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Instance network_info: |[{"id": "836d3fdf-e98b-4a41-864f-9e3fbdb29394", "address": "fa:16:3e:d5:a1:ff", "network": {"id": "94472368-b72a-4e5d-ac59-40b24b7ba792", "bridge": "br-int", "label": "tempest-network-smoke--1761418689", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap836d3fdf-e9", "ovs_interfaceid": "836d3fdf-e98b-4a41-864f-9e3fbdb29394", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "9c194df0-4c84-41bd-94af-7e4ecd312dd5", "address": "fa:16:3e:da:a3:a5", "network": {"id": "ff387e90-45c2-42d7-b536-fee4d2b6eb5e", "bridge": "br-int", "label": "tempest-network-smoke--1344723239", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feda:a3a5", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feda:a3a5", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c194df0-4c", "ovs_interfaceid": "9c194df0-4c84-41bd-94af-7e4ecd312dd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.092 187189 DEBUG oslo_concurrency.lockutils [req-a1eb42e1-37a9-4b7e-8b72-067a93a3dc3f req-ba901f00-c716-4808-ab9f-42afe87e7567 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-2cb4e847-114f-440a-b231-65e3fff0f0d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.092 187189 DEBUG nova.network.neutron [req-a1eb42e1-37a9-4b7e-8b72-067a93a3dc3f req-ba901f00-c716-4808-ab9f-42afe87e7567 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Refreshing network info cache for port 9c194df0-4c84-41bd-94af-7e4ecd312dd5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.098 187189 DEBUG nova.virt.libvirt.driver [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Start _get_guest_xml network_info=[{"id": "836d3fdf-e98b-4a41-864f-9e3fbdb29394", "address": "fa:16:3e:d5:a1:ff", "network": {"id": "94472368-b72a-4e5d-ac59-40b24b7ba792", "bridge": "br-int", "label": "tempest-network-smoke--1761418689", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap836d3fdf-e9", "ovs_interfaceid": "836d3fdf-e98b-4a41-864f-9e3fbdb29394", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "9c194df0-4c84-41bd-94af-7e4ecd312dd5", "address": "fa:16:3e:da:a3:a5", "network": {"id": "ff387e90-45c2-42d7-b536-fee4d2b6eb5e", "bridge": "br-int", "label": "tempest-network-smoke--1344723239", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feda:a3a5", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feda:a3a5", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c194df0-4c", "ovs_interfaceid": "9c194df0-4c84-41bd-94af-7e4ecd312dd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.105 187189 WARNING nova.virt.libvirt.driver [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.110 187189 DEBUG nova.virt.libvirt.host [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.111 187189 DEBUG nova.virt.libvirt.host [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.119 187189 DEBUG nova.virt.libvirt.host [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.120 187189 DEBUG nova.virt.libvirt.host [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.122 187189 DEBUG nova.virt.libvirt.driver [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.122 187189 DEBUG nova.virt.hardware [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.122 187189 DEBUG nova.virt.hardware [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.123 187189 DEBUG nova.virt.hardware [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.123 187189 DEBUG nova.virt.hardware [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.123 187189 DEBUG nova.virt.hardware [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.124 187189 DEBUG nova.virt.hardware [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.124 187189 DEBUG nova.virt.hardware [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.124 187189 DEBUG nova.virt.hardware [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.125 187189 DEBUG nova.virt.hardware [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.125 187189 DEBUG nova.virt.hardware [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.125 187189 DEBUG nova.virt.hardware [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.130 187189 DEBUG nova.virt.libvirt.vif [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:28:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-345833542',display_name='tempest-TestGettingAddress-server-345833542',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-345833542',id=135,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMDVssChktFh4KaYAW3qIeGMuNugTc7Dbw5csxo4G7cMEBd3kE//4edEXQEzew5nx3IZ0PY0s8qOeS5wzlVZFNNAqhiy1Tau85BZFwFUT3MGeCBu5wHnKv5gWDKRFcpa7w==',key_name='tempest-TestGettingAddress-312359009',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-vmipnsga',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:28:21Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=2cb4e847-114f-440a-b231-65e3fff0f0d2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "836d3fdf-e98b-4a41-864f-9e3fbdb29394", "address": "fa:16:3e:d5:a1:ff", "network": {"id": "94472368-b72a-4e5d-ac59-40b24b7ba792", "bridge": "br-int", "label": "tempest-network-smoke--1761418689", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap836d3fdf-e9", "ovs_interfaceid": "836d3fdf-e98b-4a41-864f-9e3fbdb29394", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.130 187189 DEBUG nova.network.os_vif_util [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "836d3fdf-e98b-4a41-864f-9e3fbdb29394", "address": "fa:16:3e:d5:a1:ff", "network": {"id": "94472368-b72a-4e5d-ac59-40b24b7ba792", "bridge": "br-int", "label": "tempest-network-smoke--1761418689", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap836d3fdf-e9", "ovs_interfaceid": "836d3fdf-e98b-4a41-864f-9e3fbdb29394", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.131 187189 DEBUG nova.network.os_vif_util [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d5:a1:ff,bridge_name='br-int',has_traffic_filtering=True,id=836d3fdf-e98b-4a41-864f-9e3fbdb29394,network=Network(94472368-b72a-4e5d-ac59-40b24b7ba792),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap836d3fdf-e9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.132 187189 DEBUG nova.virt.libvirt.vif [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:28:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-345833542',display_name='tempest-TestGettingAddress-server-345833542',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-345833542',id=135,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMDVssChktFh4KaYAW3qIeGMuNugTc7Dbw5csxo4G7cMEBd3kE//4edEXQEzew5nx3IZ0PY0s8qOeS5wzlVZFNNAqhiy1Tau85BZFwFUT3MGeCBu5wHnKv5gWDKRFcpa7w==',key_name='tempest-TestGettingAddress-312359009',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-vmipnsga',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:28:21Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=2cb4e847-114f-440a-b231-65e3fff0f0d2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9c194df0-4c84-41bd-94af-7e4ecd312dd5", "address": "fa:16:3e:da:a3:a5", "network": {"id": "ff387e90-45c2-42d7-b536-fee4d2b6eb5e", "bridge": "br-int", "label": "tempest-network-smoke--1344723239", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feda:a3a5", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feda:a3a5", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c194df0-4c", "ovs_interfaceid": "9c194df0-4c84-41bd-94af-7e4ecd312dd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.133 187189 DEBUG nova.network.os_vif_util [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "9c194df0-4c84-41bd-94af-7e4ecd312dd5", "address": "fa:16:3e:da:a3:a5", "network": {"id": "ff387e90-45c2-42d7-b536-fee4d2b6eb5e", "bridge": "br-int", "label": "tempest-network-smoke--1344723239", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feda:a3a5", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feda:a3a5", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c194df0-4c", "ovs_interfaceid": "9c194df0-4c84-41bd-94af-7e4ecd312dd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.133 187189 DEBUG nova.network.os_vif_util [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:da:a3:a5,bridge_name='br-int',has_traffic_filtering=True,id=9c194df0-4c84-41bd-94af-7e4ecd312dd5,network=Network(ff387e90-45c2-42d7-b536-fee4d2b6eb5e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9c194df0-4c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.134 187189 DEBUG nova.objects.instance [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'pci_devices' on Instance uuid 2cb4e847-114f-440a-b231-65e3fff0f0d2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.152 187189 DEBUG nova.virt.libvirt.driver [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:28:29 compute-0 nova_compute[187185]:   <uuid>2cb4e847-114f-440a-b231-65e3fff0f0d2</uuid>
Nov 29 07:28:29 compute-0 nova_compute[187185]:   <name>instance-00000087</name>
Nov 29 07:28:29 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 07:28:29 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:28:29 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:28:29 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:       <nova:name>tempest-TestGettingAddress-server-345833542</nova:name>
Nov 29 07:28:29 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:28:29</nova:creationTime>
Nov 29 07:28:29 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 07:28:29 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 07:28:29 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:28:29 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:28:29 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:28:29 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:28:29 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:28:29 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:28:29 compute-0 nova_compute[187185]:         <nova:user uuid="31ac7b05b012433b89143dc9f259644a">tempest-TestGettingAddress-1465017630-project-member</nova:user>
Nov 29 07:28:29 compute-0 nova_compute[187185]:         <nova:project uuid="0111c22b4b954ea586ca20d91ed3970f">tempest-TestGettingAddress-1465017630</nova:project>
Nov 29 07:28:29 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:28:29 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 07:28:29 compute-0 nova_compute[187185]:         <nova:port uuid="836d3fdf-e98b-4a41-864f-9e3fbdb29394">
Nov 29 07:28:29 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:28:29 compute-0 nova_compute[187185]:         <nova:port uuid="9c194df0-4c84-41bd-94af-7e4ecd312dd5">
Nov 29 07:28:29 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:feda:a3a5" ipVersion="6"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:feda:a3a5" ipVersion="6"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:28:29 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:28:29 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:28:29 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <system>
Nov 29 07:28:29 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:28:29 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:28:29 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:28:29 compute-0 nova_compute[187185]:       <entry name="serial">2cb4e847-114f-440a-b231-65e3fff0f0d2</entry>
Nov 29 07:28:29 compute-0 nova_compute[187185]:       <entry name="uuid">2cb4e847-114f-440a-b231-65e3fff0f0d2</entry>
Nov 29 07:28:29 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     </system>
Nov 29 07:28:29 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:28:29 compute-0 nova_compute[187185]:   <os>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:   </os>
Nov 29 07:28:29 compute-0 nova_compute[187185]:   <features>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:   </features>
Nov 29 07:28:29 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:28:29 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:28:29 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:28:29 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/2cb4e847-114f-440a-b231-65e3fff0f0d2/disk"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:28:29 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/2cb4e847-114f-440a-b231-65e3fff0f0d2/disk.config"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:28:29 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:d5:a1:ff"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:       <target dev="tap836d3fdf-e9"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:28:29 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:da:a3:a5"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:       <target dev="tap9c194df0-4c"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:28:29 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/2cb4e847-114f-440a-b231-65e3fff0f0d2/console.log" append="off"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <video>
Nov 29 07:28:29 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     </video>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:28:29 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:28:29 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:28:29 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:28:29 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:28:29 compute-0 nova_compute[187185]: </domain>
Nov 29 07:28:29 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.154 187189 DEBUG nova.compute.manager [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Preparing to wait for external event network-vif-plugged-836d3fdf-e98b-4a41-864f-9e3fbdb29394 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.154 187189 DEBUG oslo_concurrency.lockutils [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "2cb4e847-114f-440a-b231-65e3fff0f0d2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.154 187189 DEBUG oslo_concurrency.lockutils [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2cb4e847-114f-440a-b231-65e3fff0f0d2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.155 187189 DEBUG oslo_concurrency.lockutils [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2cb4e847-114f-440a-b231-65e3fff0f0d2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.155 187189 DEBUG nova.compute.manager [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Preparing to wait for external event network-vif-plugged-9c194df0-4c84-41bd-94af-7e4ecd312dd5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.155 187189 DEBUG oslo_concurrency.lockutils [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "2cb4e847-114f-440a-b231-65e3fff0f0d2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.156 187189 DEBUG oslo_concurrency.lockutils [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2cb4e847-114f-440a-b231-65e3fff0f0d2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.156 187189 DEBUG oslo_concurrency.lockutils [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2cb4e847-114f-440a-b231-65e3fff0f0d2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.157 187189 DEBUG nova.virt.libvirt.vif [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:28:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-345833542',display_name='tempest-TestGettingAddress-server-345833542',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-345833542',id=135,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMDVssChktFh4KaYAW3qIeGMuNugTc7Dbw5csxo4G7cMEBd3kE//4edEXQEzew5nx3IZ0PY0s8qOeS5wzlVZFNNAqhiy1Tau85BZFwFUT3MGeCBu5wHnKv5gWDKRFcpa7w==',key_name='tempest-TestGettingAddress-312359009',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-vmipnsga',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:28:21Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=2cb4e847-114f-440a-b231-65e3fff0f0d2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "836d3fdf-e98b-4a41-864f-9e3fbdb29394", "address": "fa:16:3e:d5:a1:ff", "network": {"id": "94472368-b72a-4e5d-ac59-40b24b7ba792", "bridge": "br-int", "label": "tempest-network-smoke--1761418689", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap836d3fdf-e9", "ovs_interfaceid": "836d3fdf-e98b-4a41-864f-9e3fbdb29394", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.157 187189 DEBUG nova.network.os_vif_util [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "836d3fdf-e98b-4a41-864f-9e3fbdb29394", "address": "fa:16:3e:d5:a1:ff", "network": {"id": "94472368-b72a-4e5d-ac59-40b24b7ba792", "bridge": "br-int", "label": "tempest-network-smoke--1761418689", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap836d3fdf-e9", "ovs_interfaceid": "836d3fdf-e98b-4a41-864f-9e3fbdb29394", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.158 187189 DEBUG nova.network.os_vif_util [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d5:a1:ff,bridge_name='br-int',has_traffic_filtering=True,id=836d3fdf-e98b-4a41-864f-9e3fbdb29394,network=Network(94472368-b72a-4e5d-ac59-40b24b7ba792),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap836d3fdf-e9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.158 187189 DEBUG os_vif [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d5:a1:ff,bridge_name='br-int',has_traffic_filtering=True,id=836d3fdf-e98b-4a41-864f-9e3fbdb29394,network=Network(94472368-b72a-4e5d-ac59-40b24b7ba792),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap836d3fdf-e9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.158 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.159 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.159 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.163 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.163 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap836d3fdf-e9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.163 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap836d3fdf-e9, col_values=(('external_ids', {'iface-id': '836d3fdf-e98b-4a41-864f-9e3fbdb29394', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d5:a1:ff', 'vm-uuid': '2cb4e847-114f-440a-b231-65e3fff0f0d2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.189 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:29 compute-0 NetworkManager[55227]: <info>  [1764401309.1901] manager: (tap836d3fdf-e9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/224)
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.192 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.198 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.200 187189 INFO os_vif [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d5:a1:ff,bridge_name='br-int',has_traffic_filtering=True,id=836d3fdf-e98b-4a41-864f-9e3fbdb29394,network=Network(94472368-b72a-4e5d-ac59-40b24b7ba792),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap836d3fdf-e9')
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.201 187189 DEBUG nova.virt.libvirt.vif [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:28:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-345833542',display_name='tempest-TestGettingAddress-server-345833542',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-345833542',id=135,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMDVssChktFh4KaYAW3qIeGMuNugTc7Dbw5csxo4G7cMEBd3kE//4edEXQEzew5nx3IZ0PY0s8qOeS5wzlVZFNNAqhiy1Tau85BZFwFUT3MGeCBu5wHnKv5gWDKRFcpa7w==',key_name='tempest-TestGettingAddress-312359009',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-vmipnsga',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:28:21Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=2cb4e847-114f-440a-b231-65e3fff0f0d2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9c194df0-4c84-41bd-94af-7e4ecd312dd5", "address": "fa:16:3e:da:a3:a5", "network": {"id": "ff387e90-45c2-42d7-b536-fee4d2b6eb5e", "bridge": "br-int", "label": "tempest-network-smoke--1344723239", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feda:a3a5", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": 
{"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feda:a3a5", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c194df0-4c", "ovs_interfaceid": "9c194df0-4c84-41bd-94af-7e4ecd312dd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.201 187189 DEBUG nova.network.os_vif_util [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "9c194df0-4c84-41bd-94af-7e4ecd312dd5", "address": "fa:16:3e:da:a3:a5", "network": {"id": "ff387e90-45c2-42d7-b536-fee4d2b6eb5e", "bridge": "br-int", "label": "tempest-network-smoke--1344723239", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feda:a3a5", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feda:a3a5", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c194df0-4c", "ovs_interfaceid": "9c194df0-4c84-41bd-94af-7e4ecd312dd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.203 187189 DEBUG nova.network.os_vif_util [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:da:a3:a5,bridge_name='br-int',has_traffic_filtering=True,id=9c194df0-4c84-41bd-94af-7e4ecd312dd5,network=Network(ff387e90-45c2-42d7-b536-fee4d2b6eb5e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9c194df0-4c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.203 187189 DEBUG os_vif [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:a3:a5,bridge_name='br-int',has_traffic_filtering=True,id=9c194df0-4c84-41bd-94af-7e4ecd312dd5,network=Network(ff387e90-45c2-42d7-b536-fee4d2b6eb5e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9c194df0-4c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.204 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.204 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.205 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.207 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.207 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9c194df0-4c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.208 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9c194df0-4c, col_values=(('external_ids', {'iface-id': '9c194df0-4c84-41bd-94af-7e4ecd312dd5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:da:a3:a5', 'vm-uuid': '2cb4e847-114f-440a-b231-65e3fff0f0d2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.209 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:29 compute-0 NetworkManager[55227]: <info>  [1764401309.2107] manager: (tap9c194df0-4c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/225)
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.212 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.217 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.218 187189 INFO os_vif [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:da:a3:a5,bridge_name='br-int',has_traffic_filtering=True,id=9c194df0-4c84-41bd-94af-7e4ecd312dd5,network=Network(ff387e90-45c2-42d7-b536-fee4d2b6eb5e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9c194df0-4c')
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.270 187189 DEBUG nova.virt.libvirt.driver [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.271 187189 DEBUG nova.virt.libvirt.driver [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.271 187189 DEBUG nova.virt.libvirt.driver [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No VIF found with MAC fa:16:3e:d5:a1:ff, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.272 187189 DEBUG nova.virt.libvirt.driver [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No VIF found with MAC fa:16:3e:da:a3:a5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:28:29 compute-0 nova_compute[187185]: 2025-11-29 07:28:29.272 187189 INFO nova.virt.libvirt.driver [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Using config drive
Nov 29 07:28:30 compute-0 nova_compute[187185]: 2025-11-29 07:28:30.017 187189 INFO nova.virt.libvirt.driver [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Creating config drive at /var/lib/nova/instances/2cb4e847-114f-440a-b231-65e3fff0f0d2/disk.config
Nov 29 07:28:30 compute-0 nova_compute[187185]: 2025-11-29 07:28:30.027 187189 DEBUG oslo_concurrency.processutils [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2cb4e847-114f-440a-b231-65e3fff0f0d2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpi_373y9a execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:28:30 compute-0 nova_compute[187185]: 2025-11-29 07:28:30.173 187189 DEBUG oslo_concurrency.processutils [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2cb4e847-114f-440a-b231-65e3fff0f0d2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpi_373y9a" returned: 0 in 0.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:28:30 compute-0 kernel: tap836d3fdf-e9: entered promiscuous mode
Nov 29 07:28:30 compute-0 NetworkManager[55227]: <info>  [1764401310.2607] manager: (tap836d3fdf-e9): new Tun device (/org/freedesktop/NetworkManager/Devices/226)
Nov 29 07:28:30 compute-0 nova_compute[187185]: 2025-11-29 07:28:30.269 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:30 compute-0 ovn_controller[95281]: 2025-11-29T07:28:30Z|00429|binding|INFO|Claiming lport 836d3fdf-e98b-4a41-864f-9e3fbdb29394 for this chassis.
Nov 29 07:28:30 compute-0 ovn_controller[95281]: 2025-11-29T07:28:30Z|00430|binding|INFO|836d3fdf-e98b-4a41-864f-9e3fbdb29394: Claiming fa:16:3e:d5:a1:ff 10.100.0.9
Nov 29 07:28:30 compute-0 nova_compute[187185]: 2025-11-29 07:28:30.289 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:30 compute-0 NetworkManager[55227]: <info>  [1764401310.2923] manager: (tap9c194df0-4c): new Tun device (/org/freedesktop/NetworkManager/Devices/227)
Nov 29 07:28:30 compute-0 systemd-udevd[237000]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:28:30 compute-0 systemd-udevd[237001]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:30.304 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:a1:ff 10.100.0.9'], port_security=['fa:16:3e:d5:a1:ff 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-94472368-b72a-4e5d-ac59-40b24b7ba792', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '291ecd66-4825-4cfd-92f0-8370bd5efe48', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c99c04c3-6b8c-480e-be26-e44e383928c7, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=836d3fdf-e98b-4a41-864f-9e3fbdb29394) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:30.308 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 836d3fdf-e98b-4a41-864f-9e3fbdb29394 in datapath 94472368-b72a-4e5d-ac59-40b24b7ba792 bound to our chassis
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:30.312 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 94472368-b72a-4e5d-ac59-40b24b7ba792
Nov 29 07:28:30 compute-0 NetworkManager[55227]: <info>  [1764401310.3155] device (tap836d3fdf-e9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:28:30 compute-0 NetworkManager[55227]: <info>  [1764401310.3168] device (tap836d3fdf-e9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:30.326 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[a4dd14e3-b286-4b83-a636-b9dc65d9209b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:30.327 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap94472368-b1 in ovnmeta-94472368-b72a-4e5d-ac59-40b24b7ba792 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:30.329 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap94472368-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:30.329 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[64b21582-48a2-4fa3-8b35-e11e0ff44a4b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:30.330 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[2eba955f-fca0-400c-9fd8-954f120de8d0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:30 compute-0 systemd-machined[153486]: New machine qemu-54-instance-00000087.
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:30.346 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[79459134-c3bd-4490-8abf-f5508f027dff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:30 compute-0 systemd[1]: Started Virtual Machine qemu-54-instance-00000087.
Nov 29 07:28:30 compute-0 NetworkManager[55227]: <info>  [1764401310.3657] device (tap9c194df0-4c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:28:30 compute-0 kernel: tap9c194df0-4c: entered promiscuous mode
Nov 29 07:28:30 compute-0 NetworkManager[55227]: <info>  [1764401310.3670] device (tap9c194df0-4c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:28:30 compute-0 ovn_controller[95281]: 2025-11-29T07:28:30Z|00431|binding|INFO|Claiming lport 9c194df0-4c84-41bd-94af-7e4ecd312dd5 for this chassis.
Nov 29 07:28:30 compute-0 ovn_controller[95281]: 2025-11-29T07:28:30Z|00432|binding|INFO|9c194df0-4c84-41bd-94af-7e4ecd312dd5: Claiming fa:16:3e:da:a3:a5 2001:db8:0:1:f816:3eff:feda:a3a5 2001:db8::f816:3eff:feda:a3a5
Nov 29 07:28:30 compute-0 nova_compute[187185]: 2025-11-29 07:28:30.368 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:30.374 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[2bcfe09b-87a6-4298-be10-e43a77298bad]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:30.377 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:da:a3:a5 2001:db8:0:1:f816:3eff:feda:a3a5 2001:db8::f816:3eff:feda:a3a5'], port_security=['fa:16:3e:da:a3:a5 2001:db8:0:1:f816:3eff:feda:a3a5 2001:db8::f816:3eff:feda:a3a5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:feda:a3a5/64 2001:db8::f816:3eff:feda:a3a5/64', 'neutron:device_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ff387e90-45c2-42d7-b536-fee4d2b6eb5e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '291ecd66-4825-4cfd-92f0-8370bd5efe48', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ed5ad144-c783-4b67-a226-e0c5588d3535, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=9c194df0-4c84-41bd-94af-7e4ecd312dd5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:28:30 compute-0 ovn_controller[95281]: 2025-11-29T07:28:30Z|00433|binding|INFO|Setting lport 836d3fdf-e98b-4a41-864f-9e3fbdb29394 ovn-installed in OVS
Nov 29 07:28:30 compute-0 ovn_controller[95281]: 2025-11-29T07:28:30Z|00434|binding|INFO|Setting lport 836d3fdf-e98b-4a41-864f-9e3fbdb29394 up in Southbound
Nov 29 07:28:30 compute-0 nova_compute[187185]: 2025-11-29 07:28:30.380 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:30 compute-0 ovn_controller[95281]: 2025-11-29T07:28:30Z|00435|binding|INFO|Setting lport 9c194df0-4c84-41bd-94af-7e4ecd312dd5 ovn-installed in OVS
Nov 29 07:28:30 compute-0 ovn_controller[95281]: 2025-11-29T07:28:30Z|00436|binding|INFO|Setting lport 9c194df0-4c84-41bd-94af-7e4ecd312dd5 up in Southbound
Nov 29 07:28:30 compute-0 nova_compute[187185]: 2025-11-29 07:28:30.390 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:30.412 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[1d576db6-f8c7-4be1-993c-097bc32b3d0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:30.420 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[ce98d140-5d55-4fb8-b31b-5bb27695d4c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:30 compute-0 NetworkManager[55227]: <info>  [1764401310.4217] manager: (tap94472368-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/228)
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:30.459 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[ae092097-3023-4641-b1f5-fb1a1e61e873]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:30.462 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[1841d28f-0099-4809-bcb8-e0742bda3cff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:30 compute-0 NetworkManager[55227]: <info>  [1764401310.4856] device (tap94472368-b0): carrier: link connected
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:30.492 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[71d7755b-9472-4e4b-b720-b5a870367094]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:30.512 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[1a825b85-6b4d-4c68-8205-1cffe87a9817]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap94472368-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:59:43:56'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 140], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 675814, 'reachable_time': 36493, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237038, 'error': None, 'target': 'ovnmeta-94472368-b72a-4e5d-ac59-40b24b7ba792', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:30.530 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[64d78a5d-368c-4eec-9386-ce883b137e8d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe59:4356'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 675814, 'tstamp': 675814}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237039, 'error': None, 'target': 'ovnmeta-94472368-b72a-4e5d-ac59-40b24b7ba792', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:30.552 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[f8a1ce5d-6dbf-4ea8-b3c7-abba5e50584a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap94472368-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:59:43:56'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 140], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 675814, 'reachable_time': 36493, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 237040, 'error': None, 'target': 'ovnmeta-94472368-b72a-4e5d-ac59-40b24b7ba792', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:30.585 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[5c56ecb8-54bf-4a59-9b0e-5f694e660ee1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:30.656 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[c733e59c-6f18-4c94-96f1-eb0699f3e032]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:30.658 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap94472368-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:30.658 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:30.659 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap94472368-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:28:30 compute-0 NetworkManager[55227]: <info>  [1764401310.6621] manager: (tap94472368-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/229)
Nov 29 07:28:30 compute-0 kernel: tap94472368-b0: entered promiscuous mode
Nov 29 07:28:30 compute-0 nova_compute[187185]: 2025-11-29 07:28:30.661 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:30.666 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap94472368-b0, col_values=(('external_ids', {'iface-id': '9d125548-8068-4815-941c-f4536091ef07'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:28:30 compute-0 nova_compute[187185]: 2025-11-29 07:28:30.667 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:30 compute-0 ovn_controller[95281]: 2025-11-29T07:28:30Z|00437|binding|INFO|Releasing lport 9d125548-8068-4815-941c-f4536091ef07 from this chassis (sb_readonly=0)
Nov 29 07:28:30 compute-0 nova_compute[187185]: 2025-11-29 07:28:30.671 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:30.671 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/94472368-b72a-4e5d-ac59-40b24b7ba792.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/94472368-b72a-4e5d-ac59-40b24b7ba792.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:30.672 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[3e40b837-fac2-403c-9a37-27cbae343fa5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:30.673 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-94472368-b72a-4e5d-ac59-40b24b7ba792
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/94472368-b72a-4e5d-ac59-40b24b7ba792.pid.haproxy
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID 94472368-b72a-4e5d-ac59-40b24b7ba792
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:28:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:30.673 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-94472368-b72a-4e5d-ac59-40b24b7ba792', 'env', 'PROCESS_TAG=haproxy-94472368-b72a-4e5d-ac59-40b24b7ba792', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/94472368-b72a-4e5d-ac59-40b24b7ba792.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:28:30 compute-0 nova_compute[187185]: 2025-11-29 07:28:30.696 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:31 compute-0 podman[237072]: 2025-11-29 07:28:31.015387406 +0000 UTC m=+0.055702342 container create 56e5ba6434cb225dde086a2ec606b7be5a95b70b3965f2411aee8995833bf9f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94472368-b72a-4e5d-ac59-40b24b7ba792, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:28:31 compute-0 systemd[1]: Started libpod-conmon-56e5ba6434cb225dde086a2ec606b7be5a95b70b3965f2411aee8995833bf9f2.scope.
Nov 29 07:28:31 compute-0 podman[237072]: 2025-11-29 07:28:30.98042313 +0000 UTC m=+0.020738086 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:28:31 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:28:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1b47d04aff71d36b1a4920b5e5f528b4acbf90772f99e52b6266be3c57eeb86d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:28:31 compute-0 podman[237072]: 2025-11-29 07:28:31.112689671 +0000 UTC m=+0.153004617 container init 56e5ba6434cb225dde086a2ec606b7be5a95b70b3965f2411aee8995833bf9f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94472368-b72a-4e5d-ac59-40b24b7ba792, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 29 07:28:31 compute-0 podman[237072]: 2025-11-29 07:28:31.123626639 +0000 UTC m=+0.163941565 container start 56e5ba6434cb225dde086a2ec606b7be5a95b70b3965f2411aee8995833bf9f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94472368-b72a-4e5d-ac59-40b24b7ba792, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 07:28:31 compute-0 neutron-haproxy-ovnmeta-94472368-b72a-4e5d-ac59-40b24b7ba792[237087]: [NOTICE]   (237091) : New worker (237093) forked
Nov 29 07:28:31 compute-0 neutron-haproxy-ovnmeta-94472368-b72a-4e5d-ac59-40b24b7ba792[237087]: [NOTICE]   (237091) : Loading success.
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:31.195 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 9c194df0-4c84-41bd-94af-7e4ecd312dd5 in datapath ff387e90-45c2-42d7-b536-fee4d2b6eb5e unbound from our chassis
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:31.198 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ff387e90-45c2-42d7-b536-fee4d2b6eb5e
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:31.209 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[3d648d13-79df-4b52-ac74-774c3861af02]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:31.210 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapff387e90-41 in ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:31.213 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapff387e90-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:31.213 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[9bb84c61-cff0-4d9c-8cfb-5533b58a86c5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:31.214 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[988c384f-ce35-43ad-a205-a34edb1b65e9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:31.227 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[939ec5da-baaa-4b36-b1e3-11b80fad0178]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:31.242 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[89797a23-e7a0-403c-9212-95c330cd6fdf]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:31.285 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[5f0d4d89-f5d2-4574-bcd2-bd4176cc9de2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:31.296 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[86285518-1067-40ee-81ad-c65a76935fae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:31 compute-0 NetworkManager[55227]: <info>  [1764401311.2992] manager: (tapff387e90-40): new Veth device (/org/freedesktop/NetworkManager/Devices/230)
Nov 29 07:28:31 compute-0 systemd-udevd[237025]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:31.349 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[169a3401-5b62-414c-8647-ab794a0e38b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:31.354 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[2e8703ee-b7f3-4460-990d-f6992207a83c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:31 compute-0 nova_compute[187185]: 2025-11-29 07:28:31.360 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764401311.359183, 2cb4e847-114f-440a-b231-65e3fff0f0d2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:28:31 compute-0 nova_compute[187185]: 2025-11-29 07:28:31.361 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] VM Started (Lifecycle Event)
Nov 29 07:28:31 compute-0 NetworkManager[55227]: <info>  [1764401311.3905] device (tapff387e90-40): carrier: link connected
Nov 29 07:28:31 compute-0 nova_compute[187185]: 2025-11-29 07:28:31.390 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:28:31 compute-0 nova_compute[187185]: 2025-11-29 07:28:31.396 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764401311.3606026, 2cb4e847-114f-440a-b231-65e3fff0f0d2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:28:31 compute-0 nova_compute[187185]: 2025-11-29 07:28:31.396 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] VM Paused (Lifecycle Event)
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:31.398 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[f3fa28b2-d34d-4a88-872c-2524b5a1b4a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:31 compute-0 nova_compute[187185]: 2025-11-29 07:28:31.413 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:28:31 compute-0 nova_compute[187185]: 2025-11-29 07:28:31.418 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:31.419 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[5fd0fb22-3e57-4206-8faa-b7814a69aac9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapff387e90-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b9:46:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 141], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 675905, 'reachable_time': 26929, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237122, 'error': None, 'target': 'ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:31.441 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[0586b50e-1fd9-4bb7-9d53-ec412f238e67]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb9:4633'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 675905, 'tstamp': 675905}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237123, 'error': None, 'target': 'ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:31 compute-0 nova_compute[187185]: 2025-11-29 07:28:31.459 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:31.458 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[3c44f43a-4322-453d-8f4f-4c0a00a067d1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapff387e90-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b9:46:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 141], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 675905, 'reachable_time': 26929, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 237124, 'error': None, 'target': 'ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:31.491 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e3521b2b-a363-4cfc-bc6e-4904a82ac37b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:31 compute-0 nova_compute[187185]: 2025-11-29 07:28:31.532 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:31.532 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[a6d6d785-55fb-4de3-af0c-07810b72d357]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:31.535 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapff387e90-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:31.535 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:31.536 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapff387e90-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:28:31 compute-0 NetworkManager[55227]: <info>  [1764401311.5381] manager: (tapff387e90-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/231)
Nov 29 07:28:31 compute-0 kernel: tapff387e90-40: entered promiscuous mode
Nov 29 07:28:31 compute-0 nova_compute[187185]: 2025-11-29 07:28:31.537 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:31.541 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapff387e90-40, col_values=(('external_ids', {'iface-id': 'f0ce4da0-40ec-44ef-8179-4cbfad9b57f1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:28:31 compute-0 nova_compute[187185]: 2025-11-29 07:28:31.542 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:31 compute-0 ovn_controller[95281]: 2025-11-29T07:28:31Z|00438|binding|INFO|Releasing lport f0ce4da0-40ec-44ef-8179-4cbfad9b57f1 from this chassis (sb_readonly=0)
Nov 29 07:28:31 compute-0 nova_compute[187185]: 2025-11-29 07:28:31.543 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:31.544 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ff387e90-45c2-42d7-b536-fee4d2b6eb5e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ff387e90-45c2-42d7-b536-fee4d2b6eb5e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:31.544 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[8616b41b-99e9-49f9-b41d-5c87dc1066db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:31.545 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-ff387e90-45c2-42d7-b536-fee4d2b6eb5e
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/ff387e90-45c2-42d7-b536-fee4d2b6eb5e.pid.haproxy
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID ff387e90-45c2-42d7-b536-fee4d2b6eb5e
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:28:31 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:31.546 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e', 'env', 'PROCESS_TAG=haproxy-ff387e90-45c2-42d7-b536-fee4d2b6eb5e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ff387e90-45c2-42d7-b536-fee4d2b6eb5e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:28:31 compute-0 nova_compute[187185]: 2025-11-29 07:28:31.555 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:31 compute-0 nova_compute[187185]: 2025-11-29 07:28:31.587 187189 DEBUG nova.compute.manager [req-7099a8c6-9dd3-4b1f-90c0-198a4fbbd6ad req-cab7e48b-1b6d-434b-872e-73698e1ea39f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Received event network-vif-plugged-836d3fdf-e98b-4a41-864f-9e3fbdb29394 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:28:31 compute-0 nova_compute[187185]: 2025-11-29 07:28:31.588 187189 DEBUG oslo_concurrency.lockutils [req-7099a8c6-9dd3-4b1f-90c0-198a4fbbd6ad req-cab7e48b-1b6d-434b-872e-73698e1ea39f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2cb4e847-114f-440a-b231-65e3fff0f0d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:28:31 compute-0 nova_compute[187185]: 2025-11-29 07:28:31.588 187189 DEBUG oslo_concurrency.lockutils [req-7099a8c6-9dd3-4b1f-90c0-198a4fbbd6ad req-cab7e48b-1b6d-434b-872e-73698e1ea39f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2cb4e847-114f-440a-b231-65e3fff0f0d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:28:31 compute-0 nova_compute[187185]: 2025-11-29 07:28:31.588 187189 DEBUG oslo_concurrency.lockutils [req-7099a8c6-9dd3-4b1f-90c0-198a4fbbd6ad req-cab7e48b-1b6d-434b-872e-73698e1ea39f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2cb4e847-114f-440a-b231-65e3fff0f0d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:28:31 compute-0 nova_compute[187185]: 2025-11-29 07:28:31.589 187189 DEBUG nova.compute.manager [req-7099a8c6-9dd3-4b1f-90c0-198a4fbbd6ad req-cab7e48b-1b6d-434b-872e-73698e1ea39f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Processing event network-vif-plugged-836d3fdf-e98b-4a41-864f-9e3fbdb29394 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 07:28:31 compute-0 nova_compute[187185]: 2025-11-29 07:28:31.589 187189 DEBUG nova.compute.manager [req-7099a8c6-9dd3-4b1f-90c0-198a4fbbd6ad req-cab7e48b-1b6d-434b-872e-73698e1ea39f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Received event network-vif-plugged-836d3fdf-e98b-4a41-864f-9e3fbdb29394 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:28:31 compute-0 nova_compute[187185]: 2025-11-29 07:28:31.589 187189 DEBUG oslo_concurrency.lockutils [req-7099a8c6-9dd3-4b1f-90c0-198a4fbbd6ad req-cab7e48b-1b6d-434b-872e-73698e1ea39f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2cb4e847-114f-440a-b231-65e3fff0f0d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:28:31 compute-0 nova_compute[187185]: 2025-11-29 07:28:31.589 187189 DEBUG oslo_concurrency.lockutils [req-7099a8c6-9dd3-4b1f-90c0-198a4fbbd6ad req-cab7e48b-1b6d-434b-872e-73698e1ea39f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2cb4e847-114f-440a-b231-65e3fff0f0d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:28:31 compute-0 nova_compute[187185]: 2025-11-29 07:28:31.590 187189 DEBUG oslo_concurrency.lockutils [req-7099a8c6-9dd3-4b1f-90c0-198a4fbbd6ad req-cab7e48b-1b6d-434b-872e-73698e1ea39f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2cb4e847-114f-440a-b231-65e3fff0f0d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:28:31 compute-0 nova_compute[187185]: 2025-11-29 07:28:31.590 187189 DEBUG nova.compute.manager [req-7099a8c6-9dd3-4b1f-90c0-198a4fbbd6ad req-cab7e48b-1b6d-434b-872e-73698e1ea39f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] No event matching network-vif-plugged-836d3fdf-e98b-4a41-864f-9e3fbdb29394 in dict_keys([('network-vif-plugged', '9c194df0-4c84-41bd-94af-7e4ecd312dd5')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325
Nov 29 07:28:31 compute-0 nova_compute[187185]: 2025-11-29 07:28:31.590 187189 WARNING nova.compute.manager [req-7099a8c6-9dd3-4b1f-90c0-198a4fbbd6ad req-cab7e48b-1b6d-434b-872e-73698e1ea39f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Received unexpected event network-vif-plugged-836d3fdf-e98b-4a41-864f-9e3fbdb29394 for instance with vm_state building and task_state spawning.
Nov 29 07:28:31 compute-0 nova_compute[187185]: 2025-11-29 07:28:31.796 187189 DEBUG nova.network.neutron [req-a1eb42e1-37a9-4b7e-8b72-067a93a3dc3f req-ba901f00-c716-4808-ab9f-42afe87e7567 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Updated VIF entry in instance network info cache for port 9c194df0-4c84-41bd-94af-7e4ecd312dd5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:28:31 compute-0 nova_compute[187185]: 2025-11-29 07:28:31.797 187189 DEBUG nova.network.neutron [req-a1eb42e1-37a9-4b7e-8b72-067a93a3dc3f req-ba901f00-c716-4808-ab9f-42afe87e7567 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Updating instance_info_cache with network_info: [{"id": "836d3fdf-e98b-4a41-864f-9e3fbdb29394", "address": "fa:16:3e:d5:a1:ff", "network": {"id": "94472368-b72a-4e5d-ac59-40b24b7ba792", "bridge": "br-int", "label": "tempest-network-smoke--1761418689", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap836d3fdf-e9", "ovs_interfaceid": "836d3fdf-e98b-4a41-864f-9e3fbdb29394", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "9c194df0-4c84-41bd-94af-7e4ecd312dd5", "address": "fa:16:3e:da:a3:a5", "network": {"id": "ff387e90-45c2-42d7-b536-fee4d2b6eb5e", "bridge": "br-int", "label": "tempest-network-smoke--1344723239", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feda:a3a5", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], 
"gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feda:a3a5", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c194df0-4c", "ovs_interfaceid": "9c194df0-4c84-41bd-94af-7e4ecd312dd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:28:31 compute-0 nova_compute[187185]: 2025-11-29 07:28:31.813 187189 DEBUG oslo_concurrency.lockutils [req-a1eb42e1-37a9-4b7e-8b72-067a93a3dc3f req-ba901f00-c716-4808-ab9f-42afe87e7567 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-2cb4e847-114f-440a-b231-65e3fff0f0d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:28:31 compute-0 podman[237154]: 2025-11-29 07:28:31.954124676 +0000 UTC m=+0.058835201 container create e033e20188aa4ad87e9df95b5b83bbfd0157c0456746005e1ce373a546b84257 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 29 07:28:32 compute-0 systemd[1]: Started libpod-conmon-e033e20188aa4ad87e9df95b5b83bbfd0157c0456746005e1ce373a546b84257.scope.
Nov 29 07:28:32 compute-0 podman[237154]: 2025-11-29 07:28:31.924661685 +0000 UTC m=+0.029372190 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:28:32 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:28:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4a170deaf292f901bfd34cda8779c75d6590e63df05c96aa56a3e0691adcc5f7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:28:32 compute-0 podman[237154]: 2025-11-29 07:28:32.048774506 +0000 UTC m=+0.153485031 container init e033e20188aa4ad87e9df95b5b83bbfd0157c0456746005e1ce373a546b84257 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:28:32 compute-0 podman[237154]: 2025-11-29 07:28:32.058458629 +0000 UTC m=+0.163169114 container start e033e20188aa4ad87e9df95b5b83bbfd0157c0456746005e1ce373a546b84257 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Nov 29 07:28:32 compute-0 neutron-haproxy-ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e[237169]: [NOTICE]   (237173) : New worker (237175) forked
Nov 29 07:28:32 compute-0 neutron-haproxy-ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e[237169]: [NOTICE]   (237173) : Loading success.
Nov 29 07:28:33 compute-0 nova_compute[187185]: 2025-11-29 07:28:33.513 187189 DEBUG nova.compute.manager [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Received event network-vif-plugged-9c194df0-4c84-41bd-94af-7e4ecd312dd5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:28:33 compute-0 nova_compute[187185]: 2025-11-29 07:28:33.514 187189 DEBUG oslo_concurrency.lockutils [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2cb4e847-114f-440a-b231-65e3fff0f0d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:28:33 compute-0 nova_compute[187185]: 2025-11-29 07:28:33.514 187189 DEBUG oslo_concurrency.lockutils [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2cb4e847-114f-440a-b231-65e3fff0f0d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:28:33 compute-0 nova_compute[187185]: 2025-11-29 07:28:33.515 187189 DEBUG oslo_concurrency.lockutils [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2cb4e847-114f-440a-b231-65e3fff0f0d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:28:33 compute-0 nova_compute[187185]: 2025-11-29 07:28:33.515 187189 DEBUG nova.compute.manager [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Processing event network-vif-plugged-9c194df0-4c84-41bd-94af-7e4ecd312dd5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 07:28:33 compute-0 nova_compute[187185]: 2025-11-29 07:28:33.516 187189 DEBUG nova.compute.manager [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Received event network-vif-plugged-9c194df0-4c84-41bd-94af-7e4ecd312dd5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:28:33 compute-0 nova_compute[187185]: 2025-11-29 07:28:33.517 187189 DEBUG oslo_concurrency.lockutils [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2cb4e847-114f-440a-b231-65e3fff0f0d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:28:33 compute-0 nova_compute[187185]: 2025-11-29 07:28:33.517 187189 DEBUG oslo_concurrency.lockutils [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2cb4e847-114f-440a-b231-65e3fff0f0d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:28:33 compute-0 nova_compute[187185]: 2025-11-29 07:28:33.518 187189 DEBUG oslo_concurrency.lockutils [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2cb4e847-114f-440a-b231-65e3fff0f0d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:28:33 compute-0 nova_compute[187185]: 2025-11-29 07:28:33.518 187189 DEBUG nova.compute.manager [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] No waiting events found dispatching network-vif-plugged-9c194df0-4c84-41bd-94af-7e4ecd312dd5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:28:33 compute-0 nova_compute[187185]: 2025-11-29 07:28:33.518 187189 WARNING nova.compute.manager [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Received unexpected event network-vif-plugged-9c194df0-4c84-41bd-94af-7e4ecd312dd5 for instance with vm_state building and task_state spawning.
Nov 29 07:28:33 compute-0 nova_compute[187185]: 2025-11-29 07:28:33.520 187189 DEBUG nova.compute.manager [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Instance event wait completed in 2 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:28:33 compute-0 nova_compute[187185]: 2025-11-29 07:28:33.525 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764401313.525258, 2cb4e847-114f-440a-b231-65e3fff0f0d2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:28:33 compute-0 nova_compute[187185]: 2025-11-29 07:28:33.526 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] VM Resumed (Lifecycle Event)
Nov 29 07:28:33 compute-0 nova_compute[187185]: 2025-11-29 07:28:33.529 187189 DEBUG nova.virt.libvirt.driver [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 07:28:33 compute-0 nova_compute[187185]: 2025-11-29 07:28:33.533 187189 INFO nova.virt.libvirt.driver [-] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Instance spawned successfully.
Nov 29 07:28:33 compute-0 nova_compute[187185]: 2025-11-29 07:28:33.534 187189 DEBUG nova.virt.libvirt.driver [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 07:28:33 compute-0 nova_compute[187185]: 2025-11-29 07:28:33.551 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:28:33 compute-0 nova_compute[187185]: 2025-11-29 07:28:33.563 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:28:33 compute-0 nova_compute[187185]: 2025-11-29 07:28:33.569 187189 DEBUG nova.virt.libvirt.driver [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:28:33 compute-0 nova_compute[187185]: 2025-11-29 07:28:33.570 187189 DEBUG nova.virt.libvirt.driver [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:28:33 compute-0 nova_compute[187185]: 2025-11-29 07:28:33.570 187189 DEBUG nova.virt.libvirt.driver [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:28:33 compute-0 nova_compute[187185]: 2025-11-29 07:28:33.571 187189 DEBUG nova.virt.libvirt.driver [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:28:33 compute-0 nova_compute[187185]: 2025-11-29 07:28:33.571 187189 DEBUG nova.virt.libvirt.driver [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:28:33 compute-0 nova_compute[187185]: 2025-11-29 07:28:33.571 187189 DEBUG nova.virt.libvirt.driver [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:28:33 compute-0 nova_compute[187185]: 2025-11-29 07:28:33.610 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:28:33 compute-0 nova_compute[187185]: 2025-11-29 07:28:33.719 187189 INFO nova.compute.manager [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Took 12.27 seconds to spawn the instance on the hypervisor.
Nov 29 07:28:33 compute-0 nova_compute[187185]: 2025-11-29 07:28:33.720 187189 DEBUG nova.compute.manager [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:28:33 compute-0 nova_compute[187185]: 2025-11-29 07:28:33.851 187189 INFO nova.compute.manager [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Took 12.92 seconds to build instance.
Nov 29 07:28:33 compute-0 podman[237184]: 2025-11-29 07:28:33.863782062 +0000 UTC m=+0.121507369 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 07:28:33 compute-0 nova_compute[187185]: 2025-11-29 07:28:33.879 187189 DEBUG oslo_concurrency.lockutils [None req-da09f1e7-2e99-4e3c-b7b7-f24a8aad0f97 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2cb4e847-114f-440a-b231-65e3fff0f0d2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.050s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:28:34 compute-0 nova_compute[187185]: 2025-11-29 07:28:34.210 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:36 compute-0 nova_compute[187185]: 2025-11-29 07:28:36.536 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:37 compute-0 nova_compute[187185]: 2025-11-29 07:28:37.223 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:37 compute-0 NetworkManager[55227]: <info>  [1764401317.2243] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/232)
Nov 29 07:28:37 compute-0 NetworkManager[55227]: <info>  [1764401317.2256] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/233)
Nov 29 07:28:37 compute-0 nova_compute[187185]: 2025-11-29 07:28:37.401 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:37 compute-0 ovn_controller[95281]: 2025-11-29T07:28:37Z|00439|binding|INFO|Releasing lport 2abf732f-8f8c-470e-b6e2-def265b14d70 from this chassis (sb_readonly=0)
Nov 29 07:28:37 compute-0 ovn_controller[95281]: 2025-11-29T07:28:37Z|00440|binding|INFO|Releasing lport 9d125548-8068-4815-941c-f4536091ef07 from this chassis (sb_readonly=0)
Nov 29 07:28:37 compute-0 ovn_controller[95281]: 2025-11-29T07:28:37Z|00441|binding|INFO|Releasing lport f0ce4da0-40ec-44ef-8179-4cbfad9b57f1 from this chassis (sb_readonly=0)
Nov 29 07:28:37 compute-0 nova_compute[187185]: 2025-11-29 07:28:37.427 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:39 compute-0 nova_compute[187185]: 2025-11-29 07:28:39.215 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:39 compute-0 nova_compute[187185]: 2025-11-29 07:28:39.283 187189 DEBUG nova.compute.manager [req-78f8d6fa-c97d-4288-a066-23685256197d req-190d4e6b-ea2c-47f5-b7a3-102474c1363b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Received event network-changed-836d3fdf-e98b-4a41-864f-9e3fbdb29394 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:28:39 compute-0 nova_compute[187185]: 2025-11-29 07:28:39.284 187189 DEBUG nova.compute.manager [req-78f8d6fa-c97d-4288-a066-23685256197d req-190d4e6b-ea2c-47f5-b7a3-102474c1363b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Refreshing instance network info cache due to event network-changed-836d3fdf-e98b-4a41-864f-9e3fbdb29394. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:28:39 compute-0 nova_compute[187185]: 2025-11-29 07:28:39.284 187189 DEBUG oslo_concurrency.lockutils [req-78f8d6fa-c97d-4288-a066-23685256197d req-190d4e6b-ea2c-47f5-b7a3-102474c1363b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-2cb4e847-114f-440a-b231-65e3fff0f0d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:28:39 compute-0 nova_compute[187185]: 2025-11-29 07:28:39.284 187189 DEBUG oslo_concurrency.lockutils [req-78f8d6fa-c97d-4288-a066-23685256197d req-190d4e6b-ea2c-47f5-b7a3-102474c1363b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-2cb4e847-114f-440a-b231-65e3fff0f0d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:28:39 compute-0 nova_compute[187185]: 2025-11-29 07:28:39.284 187189 DEBUG nova.network.neutron [req-78f8d6fa-c97d-4288-a066-23685256197d req-190d4e6b-ea2c-47f5-b7a3-102474c1363b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Refreshing network info cache for port 836d3fdf-e98b-4a41-864f-9e3fbdb29394 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:28:40 compute-0 podman[237212]: 2025-11-29 07:28:40.834114369 +0000 UTC m=+0.089944980 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 07:28:41 compute-0 nova_compute[187185]: 2025-11-29 07:28:41.569 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:43 compute-0 nova_compute[187185]: 2025-11-29 07:28:43.746 187189 DEBUG nova.network.neutron [req-78f8d6fa-c97d-4288-a066-23685256197d req-190d4e6b-ea2c-47f5-b7a3-102474c1363b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Updated VIF entry in instance network info cache for port 836d3fdf-e98b-4a41-864f-9e3fbdb29394. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:28:43 compute-0 nova_compute[187185]: 2025-11-29 07:28:43.747 187189 DEBUG nova.network.neutron [req-78f8d6fa-c97d-4288-a066-23685256197d req-190d4e6b-ea2c-47f5-b7a3-102474c1363b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Updating instance_info_cache with network_info: [{"id": "836d3fdf-e98b-4a41-864f-9e3fbdb29394", "address": "fa:16:3e:d5:a1:ff", "network": {"id": "94472368-b72a-4e5d-ac59-40b24b7ba792", "bridge": "br-int", "label": "tempest-network-smoke--1761418689", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.220", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap836d3fdf-e9", "ovs_interfaceid": "836d3fdf-e98b-4a41-864f-9e3fbdb29394", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "9c194df0-4c84-41bd-94af-7e4ecd312dd5", "address": "fa:16:3e:da:a3:a5", "network": {"id": "ff387e90-45c2-42d7-b536-fee4d2b6eb5e", "bridge": "br-int", "label": "tempest-network-smoke--1344723239", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feda:a3a5", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feda:a3a5", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c194df0-4c", "ovs_interfaceid": "9c194df0-4c84-41bd-94af-7e4ecd312dd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:28:43 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:43.851 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=45, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=44) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:28:43 compute-0 nova_compute[187185]: 2025-11-29 07:28:43.852 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:43 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:43.855 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:28:43 compute-0 nova_compute[187185]: 2025-11-29 07:28:43.920 187189 DEBUG oslo_concurrency.lockutils [req-78f8d6fa-c97d-4288-a066-23685256197d req-190d4e6b-ea2c-47f5-b7a3-102474c1363b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-2cb4e847-114f-440a-b231-65e3fff0f0d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:28:44 compute-0 nova_compute[187185]: 2025-11-29 07:28:44.218 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:44 compute-0 nova_compute[187185]: 2025-11-29 07:28:44.311 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:28:45 compute-0 podman[237247]: 2025-11-29 07:28:45.852424979 +0000 UTC m=+0.093316288 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 29 07:28:45 compute-0 podman[237246]: 2025-11-29 07:28:45.858572305 +0000 UTC m=+0.104508578 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 07:28:46 compute-0 nova_compute[187185]: 2025-11-29 07:28:46.572 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.011 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f22f95f6-efd0-4710-adf7-895e0acda50c', 'name': 'tempest-ListServerFiltersTestJSON-instance-1499668672', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000084', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '7843cfa993a1428aaaa660321ebba1ac', 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'hostId': 'fb1f4d6d255efe63d2ef702fe4c95a98e37c027293f6cb5b0c303f7f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.015 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '2cb4e847-114f-440a-b231-65e3fff0f0d2', 'name': 'tempest-TestGettingAddress-server-345833542', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000087', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '0111c22b4b954ea586ca20d91ed3970f', 'user_id': '31ac7b05b012433b89143dc9f259644a', 'hostId': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.016 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.019 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for f22f95f6-efd0-4710-adf7-895e0acda50c / tapc3de84a1-77 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.019 12 DEBUG ceilometer.compute.pollsters [-] f22f95f6-efd0-4710-adf7-895e0acda50c/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.021 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 2cb4e847-114f-440a-b231-65e3fff0f0d2 / tap836d3fdf-e9 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.022 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 2cb4e847-114f-440a-b231-65e3fff0f0d2 / tap9c194df0-4c inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.022 12 DEBUG ceilometer.compute.pollsters [-] 2cb4e847-114f-440a-b231-65e3fff0f0d2/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.023 12 DEBUG ceilometer.compute.pollsters [-] 2cb4e847-114f-440a-b231-65e3fff0f0d2/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fd701d9a-fe76-43a8-a12b-3420d3c16b12', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 'resource_id': 'instance-00000084-f22f95f6-efd0-4710-adf7-895e0acda50c-tapc3de84a1-77', 'timestamp': '2025-11-29T07:28:48.016352', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1499668672', 'name': 'tapc3de84a1-77', 'instance_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c', 'instance_type': 'm1.nano', 'host': 'fb1f4d6d255efe63d2ef702fe4c95a98e37c027293f6cb5b0c303f7f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f7:bb:74', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc3de84a1-77'}, 'message_id': '0b7bca5e-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.734611496, 'message_signature': '49a00f1be8bcad39911849cb53a340bce59f05f360de4fce1cb7dc99f32c391b'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000087-2cb4e847-114f-440a-b231-65e3fff0f0d2-tap836d3fdf-e9', 'timestamp': '2025-11-29T07:28:48.016352', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-345833542', 'name': 'tap836d3fdf-e9', 'instance_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d5:a1:ff', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap836d3fdf-e9'}, 'message_id': '0b7c3782-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.738558679, 'message_signature': '82d6dc88e0b2799b43868ac3a0a2f9903b1090e933c9568ebd7de564b68bc866'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000087-2cb4e847-114f-440a-b231-65e3fff0f0d2-tap9c194df0-4c', 'timestamp': '2025-11-29T07:28:48.016352', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-345833542', 'name': 'tap9c194df0-4c', 'instance_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 
'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:da:a3:a5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9c194df0-4c'}, 'message_id': '0b7c4588-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.738558679, 'message_signature': 'c6bd6dfdb18c24afbac8a056aca1cc96aea72ea94bf9e7aa0d97fb349e01afb7'}]}, 'timestamp': '2025-11-29 07:28:48.023447', '_unique_id': 'ad996f6725824d2e831f3e88f7b2cc80'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.025 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.026 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.026 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.026 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1499668672>, <NovaLikeServer: tempest-TestGettingAddress-server-345833542>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1499668672>, <NovaLikeServer: tempest-TestGettingAddress-server-345833542>]
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.027 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.054 12 DEBUG ceilometer.compute.pollsters [-] f22f95f6-efd0-4710-adf7-895e0acda50c/disk.device.write.requests volume: 308 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.055 12 DEBUG ceilometer.compute.pollsters [-] f22f95f6-efd0-4710-adf7-895e0acda50c/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.083 12 DEBUG ceilometer.compute.pollsters [-] 2cb4e847-114f-440a-b231-65e3fff0f0d2/disk.device.write.requests volume: 236 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.084 12 DEBUG ceilometer.compute.pollsters [-] 2cb4e847-114f-440a-b231-65e3fff0f0d2/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd0a9a9bd-b26e-4590-bb02-72b8a8414a91', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 308, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 'resource_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c-vda', 'timestamp': '2025-11-29T07:28:48.027330', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1499668672', 'name': 'instance-00000084', 'instance_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c', 'instance_type': 'm1.nano', 'host': 'fb1f4d6d255efe63d2ef702fe4c95a98e37c027293f6cb5b0c303f7f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0b8114c8-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.7455536, 'message_signature': 'e0ee7f4664d236839b225bb80133ad7d7eda29302292e1cb64d58828227567c3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 
'project_name': None, 'resource_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c-sda', 'timestamp': '2025-11-29T07:28:48.027330', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1499668672', 'name': 'instance-00000084', 'instance_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c', 'instance_type': 'm1.nano', 'host': 'fb1f4d6d255efe63d2ef702fe4c95a98e37c027293f6cb5b0c303f7f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0b812378-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.7455536, 'message_signature': 'd0cfbc275a2d7fe047336d7c435e31fd79908b49c6d70d3e97772f5979a480fc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 236, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2-vda', 'timestamp': '2025-11-29T07:28:48.027330', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-345833542', 'name': 'instance-00000087', 'instance_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': 
'', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0b8595b6-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.773508962, 'message_signature': 'abad0d7ac4fea5b113c9f835e03281048d86b8fcc3cbb5526fa882b31af8467f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2-sda', 'timestamp': '2025-11-29T07:28:48.027330', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-345833542', 'name': 'instance-00000087', 'instance_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0b85b0dc-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.773508962, 'message_signature': 'ba70b1b0657721d79ef369a2aaf595ad11593cc39fee6f7cdbf02ab07913a16e'}]}, 'timestamp': '2025-11-29 07:28:48.085249', '_unique_id': '50b06504f7bd48dd8a8359874e5e3842'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.086 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.088 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.089 12 DEBUG ceilometer.compute.pollsters [-] f22f95f6-efd0-4710-adf7-895e0acda50c/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.089 12 DEBUG ceilometer.compute.pollsters [-] 2cb4e847-114f-440a-b231-65e3fff0f0d2/network.outgoing.bytes volume: 266 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.090 12 DEBUG ceilometer.compute.pollsters [-] 2cb4e847-114f-440a-b231-65e3fff0f0d2/network.outgoing.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '12b28cd0-b2ff-4f4e-b954-79b6d8c453fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 'resource_id': 'instance-00000084-f22f95f6-efd0-4710-adf7-895e0acda50c-tapc3de84a1-77', 'timestamp': '2025-11-29T07:28:48.089355', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1499668672', 'name': 'tapc3de84a1-77', 'instance_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c', 'instance_type': 'm1.nano', 'host': 'fb1f4d6d255efe63d2ef702fe4c95a98e37c027293f6cb5b0c303f7f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f7:bb:74', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc3de84a1-77'}, 'message_id': '0b86670c-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.734611496, 'message_signature': '7045816f7b92c8346b643813eb6070ff41abac7e2cf5d5f2826ba305770a1123'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 266, 'user_id': 
'31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000087-2cb4e847-114f-440a-b231-65e3fff0f0d2-tap836d3fdf-e9', 'timestamp': '2025-11-29T07:28:48.089355', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-345833542', 'name': 'tap836d3fdf-e9', 'instance_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d5:a1:ff', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap836d3fdf-e9'}, 'message_id': '0b867af8-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.738558679, 'message_signature': '91fb47dd862ba2d9f0f8fc037ef68449070b8691eefce6faf15e63a74037956d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000087-2cb4e847-114f-440a-b231-65e3fff0f0d2-tap9c194df0-4c', 'timestamp': '2025-11-29T07:28:48.089355', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-345833542', 'name': 'tap9c194df0-4c', 'instance_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 
'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:da:a3:a5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9c194df0-4c'}, 'message_id': '0b868976-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.738558679, 'message_signature': 'f8078cbf5d071f3384cdae48d66dbe7abbef5ce2e88d68983c89e900a671ff28'}]}, 'timestamp': '2025-11-29 07:28:48.090848', '_unique_id': '68e9a4f6f21b4d29bbffed228a1aa096'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.092 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.093 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.093 12 DEBUG ceilometer.compute.pollsters [-] f22f95f6-efd0-4710-adf7-895e0acda50c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.094 12 DEBUG ceilometer.compute.pollsters [-] 2cb4e847-114f-440a-b231-65e3fff0f0d2/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.094 12 DEBUG ceilometer.compute.pollsters [-] 2cb4e847-114f-440a-b231-65e3fff0f0d2/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2e53c8d2-1269-41a6-b1c9-4b69d972ccc8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 'resource_id': 'instance-00000084-f22f95f6-efd0-4710-adf7-895e0acda50c-tapc3de84a1-77', 'timestamp': '2025-11-29T07:28:48.093764', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1499668672', 'name': 'tapc3de84a1-77', 'instance_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c', 'instance_type': 'm1.nano', 'host': 'fb1f4d6d255efe63d2ef702fe4c95a98e37c027293f6cb5b0c303f7f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f7:bb:74', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc3de84a1-77'}, 'message_id': '0b87106c-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.734611496, 'message_signature': '7abb6f7fa84c75126e7fb9a30165774db1ca8d2621f66c9306ae5f193a32f740'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000087-2cb4e847-114f-440a-b231-65e3fff0f0d2-tap836d3fdf-e9', 'timestamp': '2025-11-29T07:28:48.093764', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-345833542', 'name': 'tap836d3fdf-e9', 'instance_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d5:a1:ff', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap836d3fdf-e9'}, 'message_id': '0b872098-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.738558679, 'message_signature': '338341ab6bc7131118f27467ee8a77fab431e46a15ad63969a5d7721d6d5f7d7'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000087-2cb4e847-114f-440a-b231-65e3fff0f0d2-tap9c194df0-4c', 'timestamp': '2025-11-29T07:28:48.093764', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-345833542', 'name': 'tap9c194df0-4c', 'instance_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2', 'instance_type': 'm1.nano', 'host': 
'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:da:a3:a5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9c194df0-4c'}, 'message_id': '0b873510-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.738558679, 'message_signature': 'f01b1aceece8505fc221e131139027b818aaaee750288a22e49d2c8315d6053d'}]}, 'timestamp': '2025-11-29 07:28:48.095097', '_unique_id': 'df3a2528c74740a3ae793efaa49c94d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.095 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.097 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.098 12 DEBUG ceilometer.compute.pollsters [-] f22f95f6-efd0-4710-adf7-895e0acda50c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.098 12 DEBUG ceilometer.compute.pollsters [-] 2cb4e847-114f-440a-b231-65e3fff0f0d2/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.099 12 DEBUG ceilometer.compute.pollsters [-] 2cb4e847-114f-440a-b231-65e3fff0f0d2/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '918b3084-89d6-4ca2-ae0e-d8ac8c11cee3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 'resource_id': 'instance-00000084-f22f95f6-efd0-4710-adf7-895e0acda50c-tapc3de84a1-77', 'timestamp': '2025-11-29T07:28:48.098008', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1499668672', 'name': 'tapc3de84a1-77', 'instance_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c', 'instance_type': 'm1.nano', 'host': 'fb1f4d6d255efe63d2ef702fe4c95a98e37c027293f6cb5b0c303f7f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f7:bb:74', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc3de84a1-77'}, 'message_id': '0b87b9c2-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.734611496, 'message_signature': '6a566cdc7f6740d1dcaa40e096c16ae30a6e2c4f46a4aff855ea3e01f2ce6c16'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000087-2cb4e847-114f-440a-b231-65e3fff0f0d2-tap836d3fdf-e9', 'timestamp': '2025-11-29T07:28:48.098008', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-345833542', 'name': 'tap836d3fdf-e9', 'instance_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d5:a1:ff', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap836d3fdf-e9'}, 'message_id': '0b87d83a-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.738558679, 'message_signature': '1f478b8e3f9c8cb68f1c1bea5127ddfdca640e0bad3e8c8bfd70210bd039c3df'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000087-2cb4e847-114f-440a-b231-65e3fff0f0d2-tap9c194df0-4c', 'timestamp': '2025-11-29T07:28:48.098008', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-345833542', 'name': 'tap9c194df0-4c', 'instance_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2', 'instance_type': 'm1.nano', 'host': 
'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:da:a3:a5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9c194df0-4c'}, 'message_id': '0b87ea64-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.738558679, 'message_signature': '4858dc02ed656160173785086cf6328575f948bc09b3e4fca0418500f603fbac'}]}, 'timestamp': '2025-11-29 07:28:48.099955', '_unique_id': 'd7760cdec8d34aeda34ce4cd5afff8f9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.101 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.104 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.105 12 DEBUG ceilometer.compute.pollsters [-] f22f95f6-efd0-4710-adf7-895e0acda50c/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.105 12 DEBUG ceilometer.compute.pollsters [-] 2cb4e847-114f-440a-b231-65e3fff0f0d2/network.outgoing.packets volume: 3 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.105 12 DEBUG ceilometer.compute.pollsters [-] 2cb4e847-114f-440a-b231-65e3fff0f0d2/network.outgoing.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0f5352f3-8f90-4995-a6f0-b30fe3212071', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 'resource_id': 'instance-00000084-f22f95f6-efd0-4710-adf7-895e0acda50c-tapc3de84a1-77', 'timestamp': '2025-11-29T07:28:48.104980', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1499668672', 'name': 'tapc3de84a1-77', 'instance_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c', 'instance_type': 'm1.nano', 'host': 'fb1f4d6d255efe63d2ef702fe4c95a98e37c027293f6cb5b0c303f7f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f7:bb:74', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc3de84a1-77'}, 'message_id': '0b88cb28-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.734611496, 'message_signature': 'f6218f5c2e80a3080dc44bb39b2c454ce92a2950f855169d37d125471a720233'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 3, 
'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000087-2cb4e847-114f-440a-b231-65e3fff0f0d2-tap836d3fdf-e9', 'timestamp': '2025-11-29T07:28:48.104980', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-345833542', 'name': 'tap836d3fdf-e9', 'instance_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d5:a1:ff', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap836d3fdf-e9'}, 'message_id': '0b88dc62-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.738558679, 'message_signature': 'c809d9d963b9201a221b84d9967f0d72105e2f6a38879a88a8b1d29f1ee1dfc4'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000087-2cb4e847-114f-440a-b231-65e3fff0f0d2-tap9c194df0-4c', 'timestamp': '2025-11-29T07:28:48.104980', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-345833542', 'name': 'tap9c194df0-4c', 'instance_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2', 'instance_type': 'm1.nano', 'host': 
'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:da:a3:a5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9c194df0-4c'}, 'message_id': '0b88ec0c-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.738558679, 'message_signature': 'ff6eb90c051e649d9eb0259410575587725b38e5b23e120091e5e51850a0de61'}]}, 'timestamp': '2025-11-29 07:28:48.106355', '_unique_id': '494d498a8731413eae99453402cd1cdd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.107 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.109 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.109 12 DEBUG ceilometer.compute.pollsters [-] f22f95f6-efd0-4710-adf7-895e0acda50c/disk.device.read.latency volume: 253773626 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.109 12 DEBUG ceilometer.compute.pollsters [-] f22f95f6-efd0-4710-adf7-895e0acda50c/disk.device.read.latency volume: 30150181 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.110 12 DEBUG ceilometer.compute.pollsters [-] 2cb4e847-114f-440a-b231-65e3fff0f0d2/disk.device.read.latency volume: 251318908 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.110 12 DEBUG ceilometer.compute.pollsters [-] 2cb4e847-114f-440a-b231-65e3fff0f0d2/disk.device.read.latency volume: 22830866 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e7653404-b5e4-47a8-bcc8-59967ef28e5c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 253773626, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 'resource_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c-vda', 'timestamp': '2025-11-29T07:28:48.109347', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1499668672', 'name': 'instance-00000084', 'instance_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c', 'instance_type': 'm1.nano', 'host': 'fb1f4d6d255efe63d2ef702fe4c95a98e37c027293f6cb5b0c303f7f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0b897226-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.7455536, 'message_signature': '4d7aacb917827e8de9d5aef6ea8404797061947cc6e5597c21f20aa7494d230e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 30150181, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': 
None, 'resource_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c-sda', 'timestamp': '2025-11-29T07:28:48.109347', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1499668672', 'name': 'instance-00000084', 'instance_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c', 'instance_type': 'm1.nano', 'host': 'fb1f4d6d255efe63d2ef702fe4c95a98e37c027293f6cb5b0c303f7f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0b89813a-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.7455536, 'message_signature': '95bd57eec94bc7a9fa0b5694d20590f4bbb6dabe6d0fd691ae93d99688c3bcfd'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 251318908, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2-vda', 'timestamp': '2025-11-29T07:28:48.109347', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-345833542', 'name': 'instance-00000087', 'instance_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0b898e6e-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.773508962, 'message_signature': '23e93ca0aedcbdf379e12e41999499c4b9b94bd0eba8e878c47f0dbe5e5e05db'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22830866, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2-sda', 'timestamp': '2025-11-29T07:28:48.109347', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-345833542', 'name': 'instance-00000087', 'instance_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0b899b34-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.773508962, 'message_signature': 'b565fad3afe806f24da3a40d4363b8f594b6866cfa2e5ce75bec5e1fb93de738'}]}, 'timestamp': '2025-11-29 07:28:48.110845', '_unique_id': '3693cad456f448e8997b28f633193198'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.111 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.113 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.126 12 DEBUG ceilometer.compute.pollsters [-] f22f95f6-efd0-4710-adf7-895e0acda50c/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.127 12 DEBUG ceilometer.compute.pollsters [-] f22f95f6-efd0-4710-adf7-895e0acda50c/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.142 12 DEBUG ceilometer.compute.pollsters [-] 2cb4e847-114f-440a-b231-65e3fff0f0d2/disk.device.allocation volume: 29106176 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.143 12 DEBUG ceilometer.compute.pollsters [-] 2cb4e847-114f-440a-b231-65e3fff0f0d2/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ce4a897f-709f-4b1c-b46f-809e8fedcf7f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 'resource_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c-vda', 'timestamp': '2025-11-29T07:28:48.113227', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1499668672', 'name': 'instance-00000084', 'instance_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c', 'instance_type': 'm1.nano', 'host': 'fb1f4d6d255efe63d2ef702fe4c95a98e37c027293f6cb5b0c303f7f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0b8c2c50-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.831504805, 'message_signature': 'a7fd2219fb1550e8965cc22a5231b94ec326ac5f160d2d886af28143cdad924d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 
'resource_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c-sda', 'timestamp': '2025-11-29T07:28:48.113227', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1499668672', 'name': 'instance-00000084', 'instance_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c', 'instance_type': 'm1.nano', 'host': 'fb1f4d6d255efe63d2ef702fe4c95a98e37c027293f6cb5b0c303f7f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0b8c3ec0-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.831504805, 'message_signature': 'e960462cf482527802fb87f43d34d445d914d8ba983cc5347bde6aa843e3ece2'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29106176, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2-vda', 'timestamp': '2025-11-29T07:28:48.113227', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-345833542', 'name': 'instance-00000087', 'instance_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0b8e95c6-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.846353271, 'message_signature': '9dfc7d179abc3c6c44c09c8ea795cfe78e6dd7c37f6a01018fb91a8eef1db60a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2-sda', 'timestamp': '2025-11-29T07:28:48.113227', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-345833542', 'name': 'instance-00000087', 'instance_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0b8ea548-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.846353271, 'message_signature': 'd6dac91c74bc28d5450ebcdd3637aa67892cf3d26973be3ec18c959619bb5fbd'}]}, 'timestamp': '2025-11-29 07:28:48.143919', '_unique_id': 'c02b2f6e363d441a954c99e627ea0452'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.145 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.146 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.146 12 DEBUG ceilometer.compute.pollsters [-] f22f95f6-efd0-4710-adf7-895e0acda50c/disk.device.write.bytes volume: 72888320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.147 12 DEBUG ceilometer.compute.pollsters [-] f22f95f6-efd0-4710-adf7-895e0acda50c/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.147 12 DEBUG ceilometer.compute.pollsters [-] 2cb4e847-114f-440a-b231-65e3fff0f0d2/disk.device.write.bytes volume: 25681920 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.148 12 DEBUG ceilometer.compute.pollsters [-] 2cb4e847-114f-440a-b231-65e3fff0f0d2/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '26804ed6-0981-4c9c-a6a8-83324cb97044', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72888320, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 'resource_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c-vda', 'timestamp': '2025-11-29T07:28:48.146824', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1499668672', 'name': 'instance-00000084', 'instance_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c', 'instance_type': 'm1.nano', 'host': 'fb1f4d6d255efe63d2ef702fe4c95a98e37c027293f6cb5b0c303f7f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0b8f29fa-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.7455536, 'message_signature': 'f8d79bf1a0ff53a9303d42b04b62211cedd54980c0f468c14f9491bd7e7ecdac'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 
'resource_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c-sda', 'timestamp': '2025-11-29T07:28:48.146824', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1499668672', 'name': 'instance-00000084', 'instance_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c', 'instance_type': 'm1.nano', 'host': 'fb1f4d6d255efe63d2ef702fe4c95a98e37c027293f6cb5b0c303f7f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0b8f38a0-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.7455536, 'message_signature': '9f63642851e00ed94fa3a35ef4409f3d0b5aac2bef3aaac7961eb61bcbf5e1bc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 25681920, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2-vda', 'timestamp': '2025-11-29T07:28:48.146824', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-345833542', 'name': 'instance-00000087', 'instance_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0b8f45b6-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.773508962, 'message_signature': '93e33281802cbf60005136f041b3cfe515b8c6ce876f041ffc24acb7db29a3bf'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2-sda', 'timestamp': '2025-11-29T07:28:48.146824', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-345833542', 'name': 'instance-00000087', 'instance_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0b8f5538-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.773508962, 'message_signature': 'f925fcd35d38b0cac74034e3cee78ae09f04a872423b2ed61de6f11057347b33'}]}, 'timestamp': '2025-11-29 07:28:48.148359', '_unique_id': 'e0755a3a754e476888bcccfc9fb21712'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.149 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.150 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.151 12 DEBUG ceilometer.compute.pollsters [-] f22f95f6-efd0-4710-adf7-895e0acda50c/disk.device.write.latency volume: 6288645800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.151 12 DEBUG ceilometer.compute.pollsters [-] f22f95f6-efd0-4710-adf7-895e0acda50c/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.151 12 DEBUG ceilometer.compute.pollsters [-] 2cb4e847-114f-440a-b231-65e3fff0f0d2/disk.device.write.latency volume: 39904770920 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.152 12 DEBUG ceilometer.compute.pollsters [-] 2cb4e847-114f-440a-b231-65e3fff0f0d2/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f6139df1-45d6-4d9b-b777-e34b943020be', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6288645800, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 'resource_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c-vda', 'timestamp': '2025-11-29T07:28:48.150989', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1499668672', 'name': 'instance-00000084', 'instance_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c', 'instance_type': 'm1.nano', 'host': 'fb1f4d6d255efe63d2ef702fe4c95a98e37c027293f6cb5b0c303f7f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0b8fcac2-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.7455536, 'message_signature': '458fa46d35cf79072cf8c67e6b52471ab8df5ad44df8778dee4c9c369dcdb797'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': 
None, 'resource_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c-sda', 'timestamp': '2025-11-29T07:28:48.150989', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1499668672', 'name': 'instance-00000084', 'instance_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c', 'instance_type': 'm1.nano', 'host': 'fb1f4d6d255efe63d2ef702fe4c95a98e37c027293f6cb5b0c303f7f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0b8fdaa8-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.7455536, 'message_signature': '5c3cf65df387da87ae5106eb0598ab4e031f14e054f947cf6869533db0e2c9d3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 39904770920, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2-vda', 'timestamp': '2025-11-29T07:28:48.150989', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-345833542', 'name': 'instance-00000087', 'instance_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0b8fede0-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.773508962, 'message_signature': 'b11de6c32749101b567a8e590686ae72b6588d65ca46c7004eca033c54696b2c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2-sda', 'timestamp': '2025-11-29T07:28:48.150989', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-345833542', 'name': 'instance-00000087', 'instance_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0b8ffc18-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.773508962, 'message_signature': '6ec1c8d87954a90fffe059c225aadd6b7adb85ec8f0a41af567c34f0d495b49d'}]}, 'timestamp': '2025-11-29 07:28:48.152612', '_unique_id': 'b38655d52f424c6b8b1dc9bd05258653'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.153 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.154 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.154 12 DEBUG ceilometer.compute.pollsters [-] f22f95f6-efd0-4710-adf7-895e0acda50c/network.incoming.bytes volume: 1520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.155 12 DEBUG ceilometer.compute.pollsters [-] 2cb4e847-114f-440a-b231-65e3fff0f0d2/network.incoming.bytes volume: 844 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.155 12 DEBUG ceilometer.compute.pollsters [-] 2cb4e847-114f-440a-b231-65e3fff0f0d2/network.incoming.bytes volume: 622 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3d2dd26b-c216-444e-ad6f-34b9d06302a5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1520, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 'resource_id': 'instance-00000084-f22f95f6-efd0-4710-adf7-895e0acda50c-tapc3de84a1-77', 'timestamp': '2025-11-29T07:28:48.154774', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1499668672', 'name': 'tapc3de84a1-77', 'instance_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c', 'instance_type': 'm1.nano', 'host': 'fb1f4d6d255efe63d2ef702fe4c95a98e37c027293f6cb5b0c303f7f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f7:bb:74', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc3de84a1-77'}, 'message_id': '0b906054-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.734611496, 'message_signature': 'f261d25accb342f96c474a94dd4c01354c618dc10f5a8c1d763d6da9da7756e0'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 844, 'user_id': 
'31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000087-2cb4e847-114f-440a-b231-65e3fff0f0d2-tap836d3fdf-e9', 'timestamp': '2025-11-29T07:28:48.154774', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-345833542', 'name': 'tap836d3fdf-e9', 'instance_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d5:a1:ff', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap836d3fdf-e9'}, 'message_id': '0b906e96-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.738558679, 'message_signature': '2040c9d11fd5d80b62340eac675f99445b3b2239616079cb81d0a82498803d85'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 622, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000087-2cb4e847-114f-440a-b231-65e3fff0f0d2-tap9c194df0-4c', 'timestamp': '2025-11-29T07:28:48.154774', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-345833542', 'name': 'tap9c194df0-4c', 'instance_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 
'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:da:a3:a5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9c194df0-4c'}, 'message_id': '0b907cec-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.738558679, 'message_signature': 'f9310027349f301cc5a217cf45e4f8317a9cd1b514e7ccba716ddc37509a072a'}]}, 'timestamp': '2025-11-29 07:28:48.155947', '_unique_id': '60045d528f26413590b367a1fddea41a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.156 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.157 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.181 12 DEBUG ceilometer.compute.pollsters [-] f22f95f6-efd0-4710-adf7-895e0acda50c/memory.usage volume: 42.671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.199 12 DEBUG ceilometer.compute.pollsters [-] 2cb4e847-114f-440a-b231-65e3fff0f0d2/memory.usage volume: 40.39453125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '86cc0b89-22ec-49e6-9e22-d705c9648d85', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.671875, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 'resource_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c', 'timestamp': '2025-11-29T07:28:48.158083', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1499668672', 'name': 'instance-00000084', 'instance_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c', 'instance_type': 'm1.nano', 'host': 'fb1f4d6d255efe63d2ef702fe4c95a98e37c027293f6cb5b0c303f7f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '0b948102-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.89965376, 'message_signature': 'acefe548e8aafcc5d73a00cc69348a40ccee6f711b639ef45a56154d88e3030b'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.39453125, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 
'2cb4e847-114f-440a-b231-65e3fff0f0d2', 'timestamp': '2025-11-29T07:28:48.158083', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-345833542', 'name': 'instance-00000087', 'instance_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '0b974400-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.918010186, 'message_signature': 'c24d855588c40c76bedc3b34641335b32196ef597453dea16cc9cf71ad3a5e26'}]}, 'timestamp': '2025-11-29 07:28:48.200403', '_unique_id': '668441ff65d649d79dba54fc82c8193d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.201 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.202 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.202 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.202 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1499668672>, <NovaLikeServer: tempest-TestGettingAddress-server-345833542>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1499668672>, <NovaLikeServer: tempest-TestGettingAddress-server-345833542>]
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.203 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 07:28:48 compute-0 ovn_controller[95281]: 2025-11-29T07:28:48Z|00057|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d5:a1:ff 10.100.0.9
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.203 12 DEBUG ceilometer.compute.pollsters [-] f22f95f6-efd0-4710-adf7-895e0acda50c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.203 12 DEBUG ceilometer.compute.pollsters [-] 2cb4e847-114f-440a-b231-65e3fff0f0d2/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.204 12 DEBUG ceilometer.compute.pollsters [-] 2cb4e847-114f-440a-b231-65e3fff0f0d2/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ovn_controller[95281]: 2025-11-29T07:28:48Z|00058|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d5:a1:ff 10.100.0.9
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ad79a82e-c77a-41e8-906a-8bf12ec7a0f2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 'resource_id': 'instance-00000084-f22f95f6-efd0-4710-adf7-895e0acda50c-tapc3de84a1-77', 'timestamp': '2025-11-29T07:28:48.203217', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1499668672', 'name': 'tapc3de84a1-77', 'instance_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c', 'instance_type': 'm1.nano', 'host': 'fb1f4d6d255efe63d2ef702fe4c95a98e37c027293f6cb5b0c303f7f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f7:bb:74', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc3de84a1-77'}, 'message_id': '0b97c63c-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.734611496, 'message_signature': 'f95d3baf4c31192933f281843213d5f372caa280f34c5de01e64d7e4a06959f4'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000087-2cb4e847-114f-440a-b231-65e3fff0f0d2-tap836d3fdf-e9', 'timestamp': '2025-11-29T07:28:48.203217', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-345833542', 'name': 'tap836d3fdf-e9', 'instance_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d5:a1:ff', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap836d3fdf-e9'}, 'message_id': '0b97d816-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.738558679, 'message_signature': '1ca72696dcff19f5076ae9725cad62e2231fec736253b01345fc1bfb41a6937c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000087-2cb4e847-114f-440a-b231-65e3fff0f0d2-tap9c194df0-4c', 'timestamp': '2025-11-29T07:28:48.203217', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-345833542', 'name': 'tap9c194df0-4c', 'instance_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2', 'instance_type': 'm1.nano', 'host': 
'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:da:a3:a5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9c194df0-4c'}, 'message_id': '0b97e3ec-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.738558679, 'message_signature': '40383a1c7e7933039110543ad3c345b864acd648ef6c3c792e9d58df9f609b88'}]}, 'timestamp': '2025-11-29 07:28:48.204460', '_unique_id': 'e5ffa23d79d74b08809f67b428926a2b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.205 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.206 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.206 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.206 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1499668672>, <NovaLikeServer: tempest-TestGettingAddress-server-345833542>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1499668672>, <NovaLikeServer: tempest-TestGettingAddress-server-345833542>]
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.206 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.206 12 DEBUG ceilometer.compute.pollsters [-] f22f95f6-efd0-4710-adf7-895e0acda50c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.206 12 DEBUG ceilometer.compute.pollsters [-] 2cb4e847-114f-440a-b231-65e3fff0f0d2/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.207 12 DEBUG ceilometer.compute.pollsters [-] 2cb4e847-114f-440a-b231-65e3fff0f0d2/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '247e0e74-7201-4b9f-afef-a855d92be282', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 'resource_id': 'instance-00000084-f22f95f6-efd0-4710-adf7-895e0acda50c-tapc3de84a1-77', 'timestamp': '2025-11-29T07:28:48.206592', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1499668672', 'name': 'tapc3de84a1-77', 'instance_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c', 'instance_type': 'm1.nano', 'host': 'fb1f4d6d255efe63d2ef702fe4c95a98e37c027293f6cb5b0c303f7f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f7:bb:74', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc3de84a1-77'}, 'message_id': '0b98445e-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.734611496, 'message_signature': 'b0336e168e82d278634019e004d0324e9cf0b010010ab4780090e7d7b679e65b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000087-2cb4e847-114f-440a-b231-65e3fff0f0d2-tap836d3fdf-e9', 'timestamp': '2025-11-29T07:28:48.206592', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-345833542', 'name': 'tap836d3fdf-e9', 'instance_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d5:a1:ff', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap836d3fdf-e9'}, 'message_id': '0b985232-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.738558679, 'message_signature': '5cec2644d6250276e9b28a348e1cf19b634eebbead5d444b3792cff75328ccaf'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000087-2cb4e847-114f-440a-b231-65e3fff0f0d2-tap9c194df0-4c', 'timestamp': '2025-11-29T07:28:48.206592', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-345833542', 'name': 'tap9c194df0-4c', 'instance_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 
'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:da:a3:a5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9c194df0-4c'}, 'message_id': '0b985d7c-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.738558679, 'message_signature': '1f2fa7bfb9f802895ac425c762745b48c1f2cb2645b4f027523a82da406dd152'}]}, 'timestamp': '2025-11-29 07:28:48.207547', '_unique_id': 'e2a66ac8b5284fd39fe91e5b926d16b6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.209 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.209 12 DEBUG ceilometer.compute.pollsters [-] f22f95f6-efd0-4710-adf7-895e0acda50c/disk.device.read.bytes volume: 30513664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.209 12 DEBUG ceilometer.compute.pollsters [-] f22f95f6-efd0-4710-adf7-895e0acda50c/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.209 12 DEBUG ceilometer.compute.pollsters [-] 2cb4e847-114f-440a-b231-65e3fff0f0d2/disk.device.read.bytes volume: 29453312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.210 12 DEBUG ceilometer.compute.pollsters [-] 2cb4e847-114f-440a-b231-65e3fff0f0d2/disk.device.read.bytes volume: 221502 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5e511efa-ca48-4f03-be0d-89c0a00e0564', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30513664, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 'resource_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c-vda', 'timestamp': '2025-11-29T07:28:48.209114', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1499668672', 'name': 'instance-00000084', 'instance_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c', 'instance_type': 'm1.nano', 'host': 'fb1f4d6d255efe63d2ef702fe4c95a98e37c027293f6cb5b0c303f7f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0b98a692-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.7455536, 'message_signature': 'decb21f48f9fb4d15e9dde0ae79f523e9be5bbc964b5194a8bc3711c70035bc5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 
'resource_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c-sda', 'timestamp': '2025-11-29T07:28:48.209114', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1499668672', 'name': 'instance-00000084', 'instance_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c', 'instance_type': 'm1.nano', 'host': 'fb1f4d6d255efe63d2ef702fe4c95a98e37c027293f6cb5b0c303f7f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0b98b3a8-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.7455536, 'message_signature': '6ddf49cce920d3370b97f5f364f8f620d42a2254bc28cc94a81748973d15e83b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29453312, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2-vda', 'timestamp': '2025-11-29T07:28:48.209114', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-345833542', 'name': 'instance-00000087', 'instance_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0b98c1c2-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.773508962, 'message_signature': 'e2a0a8432f5e4375a4d90d1f11ab5bbc661d995eb58b65ddee46aa428e358770'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 221502, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2-sda', 'timestamp': '2025-11-29T07:28:48.209114', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-345833542', 'name': 'instance-00000087', 'instance_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0b98ce74-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.773508962, 'message_signature': '22b6797c26d2163205f0b123fd9b98aa165758f4e2655092e1f1ec23f5553083'}]}, 'timestamp': '2025-11-29 07:28:48.210423', '_unique_id': '7e3c9f34572843d68dfa0ac0adb4afb3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.211 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.212 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.212 12 DEBUG ceilometer.compute.pollsters [-] f22f95f6-efd0-4710-adf7-895e0acda50c/disk.device.read.requests volume: 1102 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.212 12 DEBUG ceilometer.compute.pollsters [-] f22f95f6-efd0-4710-adf7-895e0acda50c/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.212 12 DEBUG ceilometer.compute.pollsters [-] 2cb4e847-114f-440a-b231-65e3fff0f0d2/disk.device.read.requests volume: 1055 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.212 12 DEBUG ceilometer.compute.pollsters [-] 2cb4e847-114f-440a-b231-65e3fff0f0d2/disk.device.read.requests volume: 95 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dbf59b57-3c9a-4fe6-90de-513d29a6339a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1102, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 'resource_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c-vda', 'timestamp': '2025-11-29T07:28:48.212113', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1499668672', 'name': 'instance-00000084', 'instance_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c', 'instance_type': 'm1.nano', 'host': 'fb1f4d6d255efe63d2ef702fe4c95a98e37c027293f6cb5b0c303f7f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0b991ba4-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.7455536, 'message_signature': '66678bc02eeaf821e151dabd89d689baa8853cb25e91bac97d209c8f4f65e880'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 
'project_name': None, 'resource_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c-sda', 'timestamp': '2025-11-29T07:28:48.212113', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1499668672', 'name': 'instance-00000084', 'instance_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c', 'instance_type': 'm1.nano', 'host': 'fb1f4d6d255efe63d2ef702fe4c95a98e37c027293f6cb5b0c303f7f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0b9925d6-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.7455536, 'message_signature': '05176fe3e10bc7c6864aaaa92f967e80fd4fc7ca374a9300318e135f5b876eb9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1055, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2-vda', 'timestamp': '2025-11-29T07:28:48.212113', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-345833542', 'name': 'instance-00000087', 'instance_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': 
'', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0b99310c-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.773508962, 'message_signature': '3009ad45acb349027f2f16b33ac70d77e658c10dc71938fa13ade4ac64a52cb5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 95, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2-sda', 'timestamp': '2025-11-29T07:28:48.212113', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-345833542', 'name': 'instance-00000087', 'instance_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0b993bf2-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.773508962, 'message_signature': 'da96e864abebc344e140570f3ed0bc12321ccb0870bd86475c1546bdfa53f77d'}]}, 'timestamp': '2025-11-29 07:28:48.213217', '_unique_id': 'ff3914d7a85d486c9f1d209bd862911d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.213 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.214 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.214 12 DEBUG ceilometer.compute.pollsters [-] f22f95f6-efd0-4710-adf7-895e0acda50c/cpu volume: 12130000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.215 12 DEBUG ceilometer.compute.pollsters [-] 2cb4e847-114f-440a-b231-65e3fff0f0d2/cpu volume: 11400000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '779bc568-a866-4e3e-be57-f7435443c02e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12130000000, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 'resource_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c', 'timestamp': '2025-11-29T07:28:48.214780', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1499668672', 'name': 'instance-00000084', 'instance_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c', 'instance_type': 'm1.nano', 'host': 'fb1f4d6d255efe63d2ef702fe4c95a98e37c027293f6cb5b0c303f7f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '0b998508-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.89965376, 'message_signature': '390d8224a329634618f44f830a54fe27f64e81ec8b10354591566ad873665f1a'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11400000000, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 
'2cb4e847-114f-440a-b231-65e3fff0f0d2', 'timestamp': '2025-11-29T07:28:48.214780', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-345833542', 'name': 'instance-00000087', 'instance_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '0b998fda-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.918010186, 'message_signature': 'de432cd52d4920ec7710ab5b2214f8611b030b6ead34a96d90a874cd4af5c654'}]}, 'timestamp': '2025-11-29 07:28:48.215394', '_unique_id': '0adf60767fba4458aae62669fd16ca77'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.216 12 DEBUG ceilometer.compute.pollsters [-] f22f95f6-efd0-4710-adf7-895e0acda50c/network.incoming.packets volume: 13 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.217 12 DEBUG ceilometer.compute.pollsters [-] 2cb4e847-114f-440a-b231-65e3fff0f0d2/network.incoming.packets volume: 10 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.217 12 DEBUG ceilometer.compute.pollsters [-] 2cb4e847-114f-440a-b231-65e3fff0f0d2/network.incoming.packets volume: 7 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fe69b44b-2b77-49b8-9eb7-ea6ef2b03ab6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 13, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 'resource_id': 'instance-00000084-f22f95f6-efd0-4710-adf7-895e0acda50c-tapc3de84a1-77', 'timestamp': '2025-11-29T07:28:48.216841', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1499668672', 'name': 'tapc3de84a1-77', 'instance_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c', 'instance_type': 'm1.nano', 'host': 'fb1f4d6d255efe63d2ef702fe4c95a98e37c027293f6cb5b0c303f7f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f7:bb:74', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc3de84a1-77'}, 'message_id': '0b99d54e-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.734611496, 'message_signature': 'e143a2e9ae8f35840d551b38d22b8ed70e2b920dcbac802babfd8678b41396d5'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 10, 
'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000087-2cb4e847-114f-440a-b231-65e3fff0f0d2-tap836d3fdf-e9', 'timestamp': '2025-11-29T07:28:48.216841', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-345833542', 'name': 'tap836d3fdf-e9', 'instance_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d5:a1:ff', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap836d3fdf-e9'}, 'message_id': '0b99e03e-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.738558679, 'message_signature': '35c36343df7f5ba3a96f3dab7d88de881944223ab2de80bb8bb0e0dd805c417e'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 7, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000087-2cb4e847-114f-440a-b231-65e3fff0f0d2-tap9c194df0-4c', 'timestamp': '2025-11-29T07:28:48.216841', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-345833542', 'name': 'tap9c194df0-4c', 'instance_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2', 'instance_type': 'm1.nano', 'host': 
'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:da:a3:a5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9c194df0-4c'}, 'message_id': '0b99ea7a-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.738558679, 'message_signature': '508086f8df50aaacf54a21cf04e369d7b1b5b742547c9862477512fed722445f'}]}, 'timestamp': '2025-11-29 07:28:48.217701', '_unique_id': '30e6ee937f8f4e988255939b75b10b91'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.218 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.219 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.219 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.219 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1499668672>, <NovaLikeServer: tempest-TestGettingAddress-server-345833542>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1499668672>, <NovaLikeServer: tempest-TestGettingAddress-server-345833542>]
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.219 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.219 12 DEBUG ceilometer.compute.pollsters [-] f22f95f6-efd0-4710-adf7-895e0acda50c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.219 12 DEBUG ceilometer.compute.pollsters [-] 2cb4e847-114f-440a-b231-65e3fff0f0d2/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.220 12 DEBUG ceilometer.compute.pollsters [-] 2cb4e847-114f-440a-b231-65e3fff0f0d2/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '12101a07-8051-4f0a-95b5-427484f4c27f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 'resource_id': 'instance-00000084-f22f95f6-efd0-4710-adf7-895e0acda50c-tapc3de84a1-77', 'timestamp': '2025-11-29T07:28:48.219607', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1499668672', 'name': 'tapc3de84a1-77', 'instance_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c', 'instance_type': 'm1.nano', 'host': 'fb1f4d6d255efe63d2ef702fe4c95a98e37c027293f6cb5b0c303f7f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f7:bb:74', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc3de84a1-77'}, 'message_id': '0b9a4088-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.734611496, 'message_signature': 'd01b556b5019b23fe2db84e5c5e7688c16c1732487e8278fdeca6d57200a79fb'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000087-2cb4e847-114f-440a-b231-65e3fff0f0d2-tap836d3fdf-e9', 'timestamp': '2025-11-29T07:28:48.219607', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-345833542', 'name': 'tap836d3fdf-e9', 'instance_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d5:a1:ff', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap836d3fdf-e9'}, 'message_id': '0b9a4cea-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.738558679, 'message_signature': 'e4ac597ce03ec4a5f7e02dfc5b69bfb4f8132a3eb5b6ef0065bce6a635343bc5'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000087-2cb4e847-114f-440a-b231-65e3fff0f0d2-tap9c194df0-4c', 'timestamp': '2025-11-29T07:28:48.219607', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-345833542', 'name': 'tap9c194df0-4c', 'instance_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2', 'instance_type': 'm1.nano', 'host': 
'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:da:a3:a5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9c194df0-4c'}, 'message_id': '0b9a5974-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.738558679, 'message_signature': '89e8bcc7b27f6364100792e62a4989acd1cf0d4bc7d6503f0bb63a8e00f946f0'}]}, 'timestamp': '2025-11-29 07:28:48.220537', '_unique_id': 'ac8528bcb53542deaedd49bbcc26309f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.221 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.222 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.222 12 DEBUG ceilometer.compute.pollsters [-] f22f95f6-efd0-4710-adf7-895e0acda50c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.222 12 DEBUG ceilometer.compute.pollsters [-] f22f95f6-efd0-4710-adf7-895e0acda50c/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.222 12 DEBUG ceilometer.compute.pollsters [-] 2cb4e847-114f-440a-b231-65e3fff0f0d2/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.222 12 DEBUG ceilometer.compute.pollsters [-] 2cb4e847-114f-440a-b231-65e3fff0f0d2/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f491bb1b-bd4b-400b-9cb7-74f2fbfccfd5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 'resource_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c-vda', 'timestamp': '2025-11-29T07:28:48.222096', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1499668672', 'name': 'instance-00000084', 'instance_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c', 'instance_type': 'm1.nano', 'host': 'fb1f4d6d255efe63d2ef702fe4c95a98e37c027293f6cb5b0c303f7f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0b9aa190-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.831504805, 'message_signature': 'cecefbd6ba38888d6251dc18acf9712d9d7ce5c0dd22dc1b16d87eb021a94f30'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 
'resource_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c-sda', 'timestamp': '2025-11-29T07:28:48.222096', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1499668672', 'name': 'instance-00000084', 'instance_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c', 'instance_type': 'm1.nano', 'host': 'fb1f4d6d255efe63d2ef702fe4c95a98e37c027293f6cb5b0c303f7f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0b9aabd6-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.831504805, 'message_signature': 'b56e01cd82e30091384d118b7d5fdd7d1b5cc59bd8762f1eb2375284d73af3b1'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2-vda', 'timestamp': '2025-11-29T07:28:48.222096', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-345833542', 'name': 'instance-00000087', 'instance_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0b9ab5cc-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.846353271, 'message_signature': '21c0bdaa1356e757cbece7c1e54868341d0fe8dd8d904dcb927609058341c0ba'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2-sda', 'timestamp': '2025-11-29T07:28:48.222096', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-345833542', 'name': 'instance-00000087', 'instance_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0b9ac10c-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.846353271, 'message_signature': 'c11874ff474b71dee3a700af69c7e02764892005ed6aef052765f608a59df4fb'}]}, 'timestamp': '2025-11-29 07:28:48.223177', '_unique_id': 'e48ea32594934827b55c60dee3f3f13c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.223 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.224 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.224 12 DEBUG ceilometer.compute.pollsters [-] f22f95f6-efd0-4710-adf7-895e0acda50c/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.225 12 DEBUG ceilometer.compute.pollsters [-] f22f95f6-efd0-4710-adf7-895e0acda50c/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.225 12 DEBUG ceilometer.compute.pollsters [-] 2cb4e847-114f-440a-b231-65e3fff0f0d2/disk.device.usage volume: 28901376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.225 12 DEBUG ceilometer.compute.pollsters [-] 2cb4e847-114f-440a-b231-65e3fff0f0d2/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fb28aac9-2f6c-4fa3-b48c-d445fb86fffa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 'resource_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c-vda', 'timestamp': '2025-11-29T07:28:48.224932', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1499668672', 'name': 'instance-00000084', 'instance_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c', 'instance_type': 'm1.nano', 'host': 'fb1f4d6d255efe63d2ef702fe4c95a98e37c027293f6cb5b0c303f7f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0b9b1206-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.831504805, 'message_signature': '92157c7aa7f1adc34879f1ac53fc64e7c30fe9fda15afc199a40ee72c8a529e0'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 'resource_id': 
'f22f95f6-efd0-4710-adf7-895e0acda50c-sda', 'timestamp': '2025-11-29T07:28:48.224932', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1499668672', 'name': 'instance-00000084', 'instance_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c', 'instance_type': 'm1.nano', 'host': 'fb1f4d6d255efe63d2ef702fe4c95a98e37c027293f6cb5b0c303f7f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0b9b1f58-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.831504805, 'message_signature': '8d89c80239d93f1bf786fd0a7c9613693929fd7158d4885769447c48976d53c3'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 28901376, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2-vda', 'timestamp': '2025-11-29T07:28:48.224932', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-345833542', 'name': 'instance-00000087', 'instance_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0b9b29da-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.846353271, 'message_signature': '04c2112fbeecf8896a0e989f9f6108a810c6c1e215512c319327a00336b9d74f'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2-sda', 'timestamp': '2025-11-29T07:28:48.224932', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-345833542', 'name': 'instance-00000087', 'instance_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0b9b36aa-ccf5-11f0-8f64-fa163e220349', 'monotonic_time': 6775.846353271, 'message_signature': '2e107438dc452b13e6e3eb5b7e61a2adbfae10d6504eb1ab8c7ea18dee992fda'}]}, 'timestamp': '2025-11-29 07:28:48.226195', '_unique_id': 'caa6ce0aa4d74bd88976a980ec9f1608'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:28:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:28:48.226 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:28:48 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:28:48.858 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '45'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:28:49 compute-0 nova_compute[187185]: 2025-11-29 07:28:49.221 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:51 compute-0 nova_compute[187185]: 2025-11-29 07:28:51.575 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:54 compute-0 nova_compute[187185]: 2025-11-29 07:28:54.224 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:55 compute-0 podman[237284]: 2025-11-29 07:28:55.833237192 +0000 UTC m=+0.089651052 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 07:28:55 compute-0 podman[237285]: 2025-11-29 07:28:55.843478026 +0000 UTC m=+0.092588047 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vcs-type=git, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, config_id=edpm, architecture=x86_64, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 29 07:28:55 compute-0 podman[237286]: 2025-11-29 07:28:55.862837301 +0000 UTC m=+0.091181176 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 07:28:56 compute-0 nova_compute[187185]: 2025-11-29 07:28:56.611 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:28:59 compute-0 nova_compute[187185]: 2025-11-29 07:28:59.228 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:00 compute-0 nova_compute[187185]: 2025-11-29 07:29:00.068 187189 DEBUG oslo_concurrency.lockutils [None req-8964fcd3-ea2c-43c0-b2dd-954a5dc953bf 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Acquiring lock "f22f95f6-efd0-4710-adf7-895e0acda50c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:29:00 compute-0 nova_compute[187185]: 2025-11-29 07:29:00.069 187189 DEBUG oslo_concurrency.lockutils [None req-8964fcd3-ea2c-43c0-b2dd-954a5dc953bf 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "f22f95f6-efd0-4710-adf7-895e0acda50c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:29:00 compute-0 nova_compute[187185]: 2025-11-29 07:29:00.069 187189 DEBUG oslo_concurrency.lockutils [None req-8964fcd3-ea2c-43c0-b2dd-954a5dc953bf 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Acquiring lock "f22f95f6-efd0-4710-adf7-895e0acda50c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:29:00 compute-0 nova_compute[187185]: 2025-11-29 07:29:00.070 187189 DEBUG oslo_concurrency.lockutils [None req-8964fcd3-ea2c-43c0-b2dd-954a5dc953bf 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "f22f95f6-efd0-4710-adf7-895e0acda50c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:29:00 compute-0 nova_compute[187185]: 2025-11-29 07:29:00.070 187189 DEBUG oslo_concurrency.lockutils [None req-8964fcd3-ea2c-43c0-b2dd-954a5dc953bf 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "f22f95f6-efd0-4710-adf7-895e0acda50c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:29:00 compute-0 nova_compute[187185]: 2025-11-29 07:29:00.092 187189 INFO nova.compute.manager [None req-8964fcd3-ea2c-43c0-b2dd-954a5dc953bf 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Terminating instance
Nov 29 07:29:00 compute-0 nova_compute[187185]: 2025-11-29 07:29:00.132 187189 DEBUG nova.compute.manager [None req-8964fcd3-ea2c-43c0-b2dd-954a5dc953bf 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 07:29:00 compute-0 kernel: tapc3de84a1-77 (unregistering): left promiscuous mode
Nov 29 07:29:00 compute-0 NetworkManager[55227]: <info>  [1764401340.1605] device (tapc3de84a1-77): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:29:00 compute-0 ovn_controller[95281]: 2025-11-29T07:29:00Z|00442|binding|INFO|Releasing lport c3de84a1-7764-4c77-a2fd-fd169639ed1e from this chassis (sb_readonly=0)
Nov 29 07:29:00 compute-0 nova_compute[187185]: 2025-11-29 07:29:00.177 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:00 compute-0 ovn_controller[95281]: 2025-11-29T07:29:00Z|00443|binding|INFO|Setting lport c3de84a1-7764-4c77-a2fd-fd169639ed1e down in Southbound
Nov 29 07:29:00 compute-0 ovn_controller[95281]: 2025-11-29T07:29:00Z|00444|binding|INFO|Removing iface tapc3de84a1-77 ovn-installed in OVS
Nov 29 07:29:00 compute-0 nova_compute[187185]: 2025-11-29 07:29:00.183 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:00 compute-0 nova_compute[187185]: 2025-11-29 07:29:00.202 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:00 compute-0 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d00000084.scope: Deactivated successfully.
Nov 29 07:29:00 compute-0 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d00000084.scope: Consumed 15.711s CPU time.
Nov 29 07:29:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:29:00.228 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f7:bb:74 10.100.0.13'], port_security=['fa:16:3e:f7:bb:74 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'f22f95f6-efd0-4710-adf7-895e0acda50c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-28412826-5463-46e4-95cb-a7d788b1ab15', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7843cfa993a1428aaaa660321ebba1ac', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b91ab01c-e143-4067-9931-a92270268d77', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3cbf7b29-c247-42f8-abc3-94d1e6be8d3f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=c3de84a1-7764-4c77-a2fd-fd169639ed1e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:29:00 compute-0 systemd-machined[153486]: Machine qemu-53-instance-00000084 terminated.
Nov 29 07:29:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:29:00.231 104254 INFO neutron.agent.ovn.metadata.agent [-] Port c3de84a1-7764-4c77-a2fd-fd169639ed1e in datapath 28412826-5463-46e4-95cb-a7d788b1ab15 unbound from our chassis
Nov 29 07:29:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:29:00.235 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 28412826-5463-46e4-95cb-a7d788b1ab15, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:29:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:29:00.237 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[c0f3b2f4-2a48-4622-bd0c-d97ee9179349]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:29:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:29:00.238 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15 namespace which is not needed anymore
Nov 29 07:29:00 compute-0 nova_compute[187185]: 2025-11-29 07:29:00.413 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:00 compute-0 neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15[236768]: [NOTICE]   (236773) : haproxy version is 2.8.14-c23fe91
Nov 29 07:29:00 compute-0 neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15[236768]: [NOTICE]   (236773) : path to executable is /usr/sbin/haproxy
Nov 29 07:29:00 compute-0 neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15[236768]: [WARNING]  (236773) : Exiting Master process...
Nov 29 07:29:00 compute-0 neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15[236768]: [WARNING]  (236773) : Exiting Master process...
Nov 29 07:29:00 compute-0 neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15[236768]: [ALERT]    (236773) : Current worker (236775) exited with code 143 (Terminated)
Nov 29 07:29:00 compute-0 neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15[236768]: [WARNING]  (236773) : All workers exited. Exiting... (0)
Nov 29 07:29:00 compute-0 systemd[1]: libpod-6080745654af630fcfb35ff0e01dd7a6af8f7fadbae34c161bfc4be9d220da35.scope: Deactivated successfully.
Nov 29 07:29:00 compute-0 podman[237375]: 2025-11-29 07:29:00.452544138 +0000 UTC m=+0.062417301 container died 6080745654af630fcfb35ff0e01dd7a6af8f7fadbae34c161bfc4be9d220da35 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:29:00 compute-0 nova_compute[187185]: 2025-11-29 07:29:00.457 187189 INFO nova.virt.libvirt.driver [-] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Instance destroyed successfully.
Nov 29 07:29:00 compute-0 nova_compute[187185]: 2025-11-29 07:29:00.458 187189 DEBUG nova.objects.instance [None req-8964fcd3-ea2c-43c0-b2dd-954a5dc953bf 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lazy-loading 'resources' on Instance uuid f22f95f6-efd0-4710-adf7-895e0acda50c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:29:00 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6080745654af630fcfb35ff0e01dd7a6af8f7fadbae34c161bfc4be9d220da35-userdata-shm.mount: Deactivated successfully.
Nov 29 07:29:00 compute-0 systemd[1]: var-lib-containers-storage-overlay-6f199aa1b7ed84e9d95e9aaa5b9e5c9d0ce191c277ba5603629bbbd6a31194db-merged.mount: Deactivated successfully.
Nov 29 07:29:00 compute-0 podman[237375]: 2025-11-29 07:29:00.491050092 +0000 UTC m=+0.100923265 container cleanup 6080745654af630fcfb35ff0e01dd7a6af8f7fadbae34c161bfc4be9d220da35 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 29 07:29:00 compute-0 nova_compute[187185]: 2025-11-29 07:29:00.493 187189 DEBUG nova.virt.libvirt.vif [None req-8964fcd3-ea2c-43c0-b2dd-954a5dc953bf 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:27:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1499668672',display_name='tempest-ListServerFiltersTestJSON-instance-1499668672',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1499668672',id=132,image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:28:01Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7843cfa993a1428aaaa660321ebba1ac',ramdisk_id='',reservation_id='r-meotbwol',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model
='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-1571311845',owner_user_name='tempest-ListServerFiltersTestJSON-1571311845-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:28:01Z,user_data=None,user_id='3e2a40601ced4de78fe1767769f262c0',uuid=f22f95f6-efd0-4710-adf7-895e0acda50c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c3de84a1-7764-4c77-a2fd-fd169639ed1e", "address": "fa:16:3e:f7:bb:74", "network": {"id": "28412826-5463-46e4-95cb-a7d788b1ab15", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1363473809-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7843cfa993a1428aaaa660321ebba1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3de84a1-77", "ovs_interfaceid": "c3de84a1-7764-4c77-a2fd-fd169639ed1e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:29:00 compute-0 nova_compute[187185]: 2025-11-29 07:29:00.494 187189 DEBUG nova.network.os_vif_util [None req-8964fcd3-ea2c-43c0-b2dd-954a5dc953bf 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Converting VIF {"id": "c3de84a1-7764-4c77-a2fd-fd169639ed1e", "address": "fa:16:3e:f7:bb:74", "network": {"id": "28412826-5463-46e4-95cb-a7d788b1ab15", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1363473809-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7843cfa993a1428aaaa660321ebba1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3de84a1-77", "ovs_interfaceid": "c3de84a1-7764-4c77-a2fd-fd169639ed1e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:29:00 compute-0 nova_compute[187185]: 2025-11-29 07:29:00.495 187189 DEBUG nova.network.os_vif_util [None req-8964fcd3-ea2c-43c0-b2dd-954a5dc953bf 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f7:bb:74,bridge_name='br-int',has_traffic_filtering=True,id=c3de84a1-7764-4c77-a2fd-fd169639ed1e,network=Network(28412826-5463-46e4-95cb-a7d788b1ab15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3de84a1-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:29:00 compute-0 nova_compute[187185]: 2025-11-29 07:29:00.495 187189 DEBUG os_vif [None req-8964fcd3-ea2c-43c0-b2dd-954a5dc953bf 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f7:bb:74,bridge_name='br-int',has_traffic_filtering=True,id=c3de84a1-7764-4c77-a2fd-fd169639ed1e,network=Network(28412826-5463-46e4-95cb-a7d788b1ab15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3de84a1-77') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:29:00 compute-0 nova_compute[187185]: 2025-11-29 07:29:00.497 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:00 compute-0 nova_compute[187185]: 2025-11-29 07:29:00.497 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc3de84a1-77, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:29:00 compute-0 nova_compute[187185]: 2025-11-29 07:29:00.499 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:00 compute-0 nova_compute[187185]: 2025-11-29 07:29:00.501 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:29:00 compute-0 nova_compute[187185]: 2025-11-29 07:29:00.503 187189 INFO os_vif [None req-8964fcd3-ea2c-43c0-b2dd-954a5dc953bf 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f7:bb:74,bridge_name='br-int',has_traffic_filtering=True,id=c3de84a1-7764-4c77-a2fd-fd169639ed1e,network=Network(28412826-5463-46e4-95cb-a7d788b1ab15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3de84a1-77')
Nov 29 07:29:00 compute-0 nova_compute[187185]: 2025-11-29 07:29:00.504 187189 INFO nova.virt.libvirt.driver [None req-8964fcd3-ea2c-43c0-b2dd-954a5dc953bf 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Deleting instance files /var/lib/nova/instances/f22f95f6-efd0-4710-adf7-895e0acda50c_del
Nov 29 07:29:00 compute-0 nova_compute[187185]: 2025-11-29 07:29:00.505 187189 INFO nova.virt.libvirt.driver [None req-8964fcd3-ea2c-43c0-b2dd-954a5dc953bf 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Deletion of /var/lib/nova/instances/f22f95f6-efd0-4710-adf7-895e0acda50c_del complete
Nov 29 07:29:00 compute-0 systemd[1]: libpod-conmon-6080745654af630fcfb35ff0e01dd7a6af8f7fadbae34c161bfc4be9d220da35.scope: Deactivated successfully.
Nov 29 07:29:00 compute-0 nova_compute[187185]: 2025-11-29 07:29:00.528 187189 DEBUG nova.compute.manager [req-5f6296c6-ca03-420f-b33b-2b53c897e005 req-80310b58-af78-4454-ad99-1c97de69760e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Received event network-vif-unplugged-c3de84a1-7764-4c77-a2fd-fd169639ed1e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:29:00 compute-0 nova_compute[187185]: 2025-11-29 07:29:00.529 187189 DEBUG oslo_concurrency.lockutils [req-5f6296c6-ca03-420f-b33b-2b53c897e005 req-80310b58-af78-4454-ad99-1c97de69760e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "f22f95f6-efd0-4710-adf7-895e0acda50c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:29:00 compute-0 nova_compute[187185]: 2025-11-29 07:29:00.530 187189 DEBUG oslo_concurrency.lockutils [req-5f6296c6-ca03-420f-b33b-2b53c897e005 req-80310b58-af78-4454-ad99-1c97de69760e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f22f95f6-efd0-4710-adf7-895e0acda50c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:29:00 compute-0 nova_compute[187185]: 2025-11-29 07:29:00.530 187189 DEBUG oslo_concurrency.lockutils [req-5f6296c6-ca03-420f-b33b-2b53c897e005 req-80310b58-af78-4454-ad99-1c97de69760e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f22f95f6-efd0-4710-adf7-895e0acda50c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:29:00 compute-0 nova_compute[187185]: 2025-11-29 07:29:00.530 187189 DEBUG nova.compute.manager [req-5f6296c6-ca03-420f-b33b-2b53c897e005 req-80310b58-af78-4454-ad99-1c97de69760e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] No waiting events found dispatching network-vif-unplugged-c3de84a1-7764-4c77-a2fd-fd169639ed1e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:29:00 compute-0 nova_compute[187185]: 2025-11-29 07:29:00.531 187189 DEBUG nova.compute.manager [req-5f6296c6-ca03-420f-b33b-2b53c897e005 req-80310b58-af78-4454-ad99-1c97de69760e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Received event network-vif-unplugged-c3de84a1-7764-4c77-a2fd-fd169639ed1e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 07:29:00 compute-0 podman[237415]: 2025-11-29 07:29:00.558318971 +0000 UTC m=+0.043977742 container remove 6080745654af630fcfb35ff0e01dd7a6af8f7fadbae34c161bfc4be9d220da35 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0)
Nov 29 07:29:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:29:00.567 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[9c078257-b42c-4a31-b32a-17381adbe18e]: (4, ('Sat Nov 29 07:29:00 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15 (6080745654af630fcfb35ff0e01dd7a6af8f7fadbae34c161bfc4be9d220da35)\n6080745654af630fcfb35ff0e01dd7a6af8f7fadbae34c161bfc4be9d220da35\nSat Nov 29 07:29:00 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15 (6080745654af630fcfb35ff0e01dd7a6af8f7fadbae34c161bfc4be9d220da35)\n6080745654af630fcfb35ff0e01dd7a6af8f7fadbae34c161bfc4be9d220da35\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:29:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:29:00.569 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[48ef54e0-c97a-46b3-a17c-bd01eae4d331]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:29:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:29:00.570 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap28412826-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:29:00 compute-0 nova_compute[187185]: 2025-11-29 07:29:00.572 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:00 compute-0 kernel: tap28412826-50: left promiscuous mode
Nov 29 07:29:00 compute-0 nova_compute[187185]: 2025-11-29 07:29:00.588 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:29:00.591 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[0292ec62-7e2b-44f8-836b-b5b9341fdd6b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:29:00 compute-0 nova_compute[187185]: 2025-11-29 07:29:00.596 187189 INFO nova.compute.manager [None req-8964fcd3-ea2c-43c0-b2dd-954a5dc953bf 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Took 0.46 seconds to destroy the instance on the hypervisor.
Nov 29 07:29:00 compute-0 nova_compute[187185]: 2025-11-29 07:29:00.599 187189 DEBUG oslo.service.loopingcall [None req-8964fcd3-ea2c-43c0-b2dd-954a5dc953bf 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 07:29:00 compute-0 nova_compute[187185]: 2025-11-29 07:29:00.599 187189 DEBUG nova.compute.manager [-] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 07:29:00 compute-0 nova_compute[187185]: 2025-11-29 07:29:00.599 187189 DEBUG nova.network.neutron [-] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 07:29:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:29:00.609 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[82da7d35-981d-4fdc-81a7-39e49ad5d5ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:29:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:29:00.610 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6ea5fba4-fe99-463f-b636-5ad34b024f3a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:29:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:29:00.623 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[c063f106-9f39-4ea5-90bf-6a5fbb660cb0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 672834, 'reachable_time': 25557, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237432, 'error': None, 'target': 'ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:29:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:29:00.626 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:29:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:29:00.626 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[d91bc9fe-9533-4624-b89d-9feb52fd7baf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:29:00 compute-0 systemd[1]: run-netns-ovnmeta\x2d28412826\x2d5463\x2d46e4\x2d95cb\x2da7d788b1ab15.mount: Deactivated successfully.
Nov 29 07:29:01 compute-0 nova_compute[187185]: 2025-11-29 07:29:01.179 187189 DEBUG nova.network.neutron [-] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:29:01 compute-0 nova_compute[187185]: 2025-11-29 07:29:01.219 187189 INFO nova.compute.manager [-] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Took 0.62 seconds to deallocate network for instance.
Nov 29 07:29:01 compute-0 nova_compute[187185]: 2025-11-29 07:29:01.326 187189 DEBUG oslo_concurrency.lockutils [None req-8964fcd3-ea2c-43c0-b2dd-954a5dc953bf 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:29:01 compute-0 nova_compute[187185]: 2025-11-29 07:29:01.327 187189 DEBUG oslo_concurrency.lockutils [None req-8964fcd3-ea2c-43c0-b2dd-954a5dc953bf 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:29:01 compute-0 nova_compute[187185]: 2025-11-29 07:29:01.404 187189 DEBUG nova.compute.provider_tree [None req-8964fcd3-ea2c-43c0-b2dd-954a5dc953bf 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:29:01 compute-0 nova_compute[187185]: 2025-11-29 07:29:01.418 187189 DEBUG nova.scheduler.client.report [None req-8964fcd3-ea2c-43c0-b2dd-954a5dc953bf 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:29:01 compute-0 nova_compute[187185]: 2025-11-29 07:29:01.460 187189 DEBUG oslo_concurrency.lockutils [None req-8964fcd3-ea2c-43c0-b2dd-954a5dc953bf 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:29:01 compute-0 nova_compute[187185]: 2025-11-29 07:29:01.485 187189 INFO nova.scheduler.client.report [None req-8964fcd3-ea2c-43c0-b2dd-954a5dc953bf 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Deleted allocations for instance f22f95f6-efd0-4710-adf7-895e0acda50c
Nov 29 07:29:01 compute-0 nova_compute[187185]: 2025-11-29 07:29:01.602 187189 DEBUG oslo_concurrency.lockutils [None req-8964fcd3-ea2c-43c0-b2dd-954a5dc953bf 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "f22f95f6-efd0-4710-adf7-895e0acda50c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.533s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:29:01 compute-0 nova_compute[187185]: 2025-11-29 07:29:01.613 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:02 compute-0 nova_compute[187185]: 2025-11-29 07:29:02.642 187189 DEBUG nova.compute.manager [req-7ef4b297-731f-479f-a64b-66e26b0d069a req-d1609b7a-9add-4870-b110-13ff463df348 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Received event network-vif-plugged-c3de84a1-7764-4c77-a2fd-fd169639ed1e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:29:02 compute-0 nova_compute[187185]: 2025-11-29 07:29:02.642 187189 DEBUG oslo_concurrency.lockutils [req-7ef4b297-731f-479f-a64b-66e26b0d069a req-d1609b7a-9add-4870-b110-13ff463df348 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "f22f95f6-efd0-4710-adf7-895e0acda50c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:29:02 compute-0 nova_compute[187185]: 2025-11-29 07:29:02.643 187189 DEBUG oslo_concurrency.lockutils [req-7ef4b297-731f-479f-a64b-66e26b0d069a req-d1609b7a-9add-4870-b110-13ff463df348 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f22f95f6-efd0-4710-adf7-895e0acda50c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:29:02 compute-0 nova_compute[187185]: 2025-11-29 07:29:02.643 187189 DEBUG oslo_concurrency.lockutils [req-7ef4b297-731f-479f-a64b-66e26b0d069a req-d1609b7a-9add-4870-b110-13ff463df348 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f22f95f6-efd0-4710-adf7-895e0acda50c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:29:02 compute-0 nova_compute[187185]: 2025-11-29 07:29:02.643 187189 DEBUG nova.compute.manager [req-7ef4b297-731f-479f-a64b-66e26b0d069a req-d1609b7a-9add-4870-b110-13ff463df348 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] No waiting events found dispatching network-vif-plugged-c3de84a1-7764-4c77-a2fd-fd169639ed1e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:29:02 compute-0 nova_compute[187185]: 2025-11-29 07:29:02.643 187189 WARNING nova.compute.manager [req-7ef4b297-731f-479f-a64b-66e26b0d069a req-d1609b7a-9add-4870-b110-13ff463df348 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Received unexpected event network-vif-plugged-c3de84a1-7764-4c77-a2fd-fd169639ed1e for instance with vm_state deleted and task_state None.
Nov 29 07:29:02 compute-0 nova_compute[187185]: 2025-11-29 07:29:02.644 187189 DEBUG nova.compute.manager [req-7ef4b297-731f-479f-a64b-66e26b0d069a req-d1609b7a-9add-4870-b110-13ff463df348 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Received event network-vif-deleted-c3de84a1-7764-4c77-a2fd-fd169639ed1e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:29:04 compute-0 podman[237433]: 2025-11-29 07:29:04.85657984 +0000 UTC m=+0.115300098 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 07:29:05 compute-0 nova_compute[187185]: 2025-11-29 07:29:05.500 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:05 compute-0 ovn_controller[95281]: 2025-11-29T07:29:05Z|00445|binding|INFO|Releasing lport 9d125548-8068-4815-941c-f4536091ef07 from this chassis (sb_readonly=0)
Nov 29 07:29:05 compute-0 ovn_controller[95281]: 2025-11-29T07:29:05Z|00446|binding|INFO|Releasing lport f0ce4da0-40ec-44ef-8179-4cbfad9b57f1 from this chassis (sb_readonly=0)
Nov 29 07:29:05 compute-0 nova_compute[187185]: 2025-11-29 07:29:05.677 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:06 compute-0 nova_compute[187185]: 2025-11-29 07:29:06.615 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:10 compute-0 nova_compute[187185]: 2025-11-29 07:29:10.502 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:11 compute-0 nova_compute[187185]: 2025-11-29 07:29:11.618 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:11 compute-0 podman[237459]: 2025-11-29 07:29:11.788152955 +0000 UTC m=+0.052995521 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 07:29:11 compute-0 ovn_controller[95281]: 2025-11-29T07:29:11Z|00447|binding|INFO|Releasing lport 9d125548-8068-4815-941c-f4536091ef07 from this chassis (sb_readonly=0)
Nov 29 07:29:11 compute-0 ovn_controller[95281]: 2025-11-29T07:29:11Z|00448|binding|INFO|Releasing lport f0ce4da0-40ec-44ef-8179-4cbfad9b57f1 from this chassis (sb_readonly=0)
Nov 29 07:29:11 compute-0 nova_compute[187185]: 2025-11-29 07:29:11.989 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:13 compute-0 nova_compute[187185]: 2025-11-29 07:29:13.371 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:15 compute-0 nova_compute[187185]: 2025-11-29 07:29:15.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:29:15 compute-0 nova_compute[187185]: 2025-11-29 07:29:15.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:29:15 compute-0 nova_compute[187185]: 2025-11-29 07:29:15.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:29:15 compute-0 nova_compute[187185]: 2025-11-29 07:29:15.455 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764401340.4544113, f22f95f6-efd0-4710-adf7-895e0acda50c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:29:15 compute-0 nova_compute[187185]: 2025-11-29 07:29:15.456 187189 INFO nova.compute.manager [-] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] VM Stopped (Lifecycle Event)
Nov 29 07:29:15 compute-0 nova_compute[187185]: 2025-11-29 07:29:15.484 187189 DEBUG nova.compute.manager [None req-e3d6197a-6e2d-4f57-98bb-8990452a9086 - - - - - -] [instance: f22f95f6-efd0-4710-adf7-895e0acda50c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:29:15 compute-0 nova_compute[187185]: 2025-11-29 07:29:15.505 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:15 compute-0 nova_compute[187185]: 2025-11-29 07:29:15.642 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "refresh_cache-2cb4e847-114f-440a-b231-65e3fff0f0d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:29:15 compute-0 nova_compute[187185]: 2025-11-29 07:29:15.643 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquired lock "refresh_cache-2cb4e847-114f-440a-b231-65e3fff0f0d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:29:15 compute-0 nova_compute[187185]: 2025-11-29 07:29:15.644 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 29 07:29:15 compute-0 nova_compute[187185]: 2025-11-29 07:29:15.644 187189 DEBUG nova.objects.instance [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 2cb4e847-114f-440a-b231-65e3fff0f0d2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:29:16 compute-0 nova_compute[187185]: 2025-11-29 07:29:16.620 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:16 compute-0 podman[237484]: 2025-11-29 07:29:16.800068263 +0000 UTC m=+0.061111704 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible)
Nov 29 07:29:16 compute-0 podman[237485]: 2025-11-29 07:29:16.815900307 +0000 UTC m=+0.070082631 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Nov 29 07:29:17 compute-0 nova_compute[187185]: 2025-11-29 07:29:17.950 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:19 compute-0 nova_compute[187185]: 2025-11-29 07:29:19.365 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Updating instance_info_cache with network_info: [{"id": "836d3fdf-e98b-4a41-864f-9e3fbdb29394", "address": "fa:16:3e:d5:a1:ff", "network": {"id": "94472368-b72a-4e5d-ac59-40b24b7ba792", "bridge": "br-int", "label": "tempest-network-smoke--1761418689", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.220", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap836d3fdf-e9", "ovs_interfaceid": "836d3fdf-e98b-4a41-864f-9e3fbdb29394", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "9c194df0-4c84-41bd-94af-7e4ecd312dd5", "address": "fa:16:3e:da:a3:a5", "network": {"id": "ff387e90-45c2-42d7-b536-fee4d2b6eb5e", "bridge": "br-int", "label": "tempest-network-smoke--1344723239", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feda:a3a5", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": 
"2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feda:a3a5", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c194df0-4c", "ovs_interfaceid": "9c194df0-4c84-41bd-94af-7e4ecd312dd5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:29:19 compute-0 nova_compute[187185]: 2025-11-29 07:29:19.405 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Releasing lock "refresh_cache-2cb4e847-114f-440a-b231-65e3fff0f0d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:29:19 compute-0 nova_compute[187185]: 2025-11-29 07:29:19.406 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 29 07:29:19 compute-0 nova_compute[187185]: 2025-11-29 07:29:19.407 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:29:19 compute-0 nova_compute[187185]: 2025-11-29 07:29:19.408 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:29:19 compute-0 nova_compute[187185]: 2025-11-29 07:29:19.408 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:29:19 compute-0 nova_compute[187185]: 2025-11-29 07:29:19.409 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:29:19 compute-0 nova_compute[187185]: 2025-11-29 07:29:19.438 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:29:19 compute-0 nova_compute[187185]: 2025-11-29 07:29:19.438 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:29:19 compute-0 nova_compute[187185]: 2025-11-29 07:29:19.439 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:29:19 compute-0 nova_compute[187185]: 2025-11-29 07:29:19.439 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:29:19 compute-0 nova_compute[187185]: 2025-11-29 07:29:19.538 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2cb4e847-114f-440a-b231-65e3fff0f0d2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:29:19 compute-0 nova_compute[187185]: 2025-11-29 07:29:19.615 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2cb4e847-114f-440a-b231-65e3fff0f0d2/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:29:19 compute-0 nova_compute[187185]: 2025-11-29 07:29:19.617 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2cb4e847-114f-440a-b231-65e3fff0f0d2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:29:19 compute-0 nova_compute[187185]: 2025-11-29 07:29:19.671 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2cb4e847-114f-440a-b231-65e3fff0f0d2/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:29:19 compute-0 nova_compute[187185]: 2025-11-29 07:29:19.849 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:29:19 compute-0 nova_compute[187185]: 2025-11-29 07:29:19.851 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5521MB free_disk=73.21570205688477GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:29:19 compute-0 nova_compute[187185]: 2025-11-29 07:29:19.851 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:29:19 compute-0 nova_compute[187185]: 2025-11-29 07:29:19.851 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:29:19 compute-0 ovn_controller[95281]: 2025-11-29T07:29:19Z|00449|binding|INFO|Releasing lport 9d125548-8068-4815-941c-f4536091ef07 from this chassis (sb_readonly=0)
Nov 29 07:29:19 compute-0 ovn_controller[95281]: 2025-11-29T07:29:19Z|00450|binding|INFO|Releasing lport f0ce4da0-40ec-44ef-8179-4cbfad9b57f1 from this chassis (sb_readonly=0)
Nov 29 07:29:19 compute-0 nova_compute[187185]: 2025-11-29 07:29:19.975 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:20 compute-0 nova_compute[187185]: 2025-11-29 07:29:20.098 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance 2cb4e847-114f-440a-b231-65e3fff0f0d2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 07:29:20 compute-0 nova_compute[187185]: 2025-11-29 07:29:20.099 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:29:20 compute-0 nova_compute[187185]: 2025-11-29 07:29:20.099 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:29:20 compute-0 nova_compute[187185]: 2025-11-29 07:29:20.178 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Refreshing inventories for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 29 07:29:20 compute-0 nova_compute[187185]: 2025-11-29 07:29:20.254 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Updating ProviderTree inventory for provider 4e39a026-df39-4e20-874a-dbb5a40df044 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 29 07:29:20 compute-0 nova_compute[187185]: 2025-11-29 07:29:20.254 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Updating inventory in ProviderTree for provider 4e39a026-df39-4e20-874a-dbb5a40df044 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 29 07:29:20 compute-0 nova_compute[187185]: 2025-11-29 07:29:20.268 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Refreshing aggregate associations for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 29 07:29:20 compute-0 nova_compute[187185]: 2025-11-29 07:29:20.291 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Refreshing trait associations for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044, traits: HW_CPU_X86_SSE,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_TRUSTED_CERTS,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 29 07:29:20 compute-0 nova_compute[187185]: 2025-11-29 07:29:20.328 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:29:20 compute-0 nova_compute[187185]: 2025-11-29 07:29:20.508 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:21 compute-0 nova_compute[187185]: 2025-11-29 07:29:21.626 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:21 compute-0 nova_compute[187185]: 2025-11-29 07:29:21.839 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:29:21 compute-0 nova_compute[187185]: 2025-11-29 07:29:21.943 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:29:21 compute-0 nova_compute[187185]: 2025-11-29 07:29:21.944 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:29:22 compute-0 nova_compute[187185]: 2025-11-29 07:29:22.853 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:29:22 compute-0 nova_compute[187185]: 2025-11-29 07:29:22.854 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:29:22 compute-0 nova_compute[187185]: 2025-11-29 07:29:22.854 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:29:22 compute-0 nova_compute[187185]: 2025-11-29 07:29:22.854 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:29:25 compute-0 nova_compute[187185]: 2025-11-29 07:29:25.465 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:25 compute-0 nova_compute[187185]: 2025-11-29 07:29:25.510 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:29:25.520 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:29:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:29:25.521 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:29:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:29:25.522 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:29:26 compute-0 nova_compute[187185]: 2025-11-29 07:29:26.312 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:29:26 compute-0 nova_compute[187185]: 2025-11-29 07:29:26.629 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:26 compute-0 podman[237535]: 2025-11-29 07:29:26.809509495 +0000 UTC m=+0.066695474 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 07:29:26 compute-0 podman[237534]: 2025-11-29 07:29:26.821965202 +0000 UTC m=+0.071560953 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_id=edpm, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vcs-type=git, io.openshift.expose-services=)
Nov 29 07:29:26 compute-0 podman[237533]: 2025-11-29 07:29:26.842891552 +0000 UTC m=+0.106787083 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Nov 29 07:29:30 compute-0 nova_compute[187185]: 2025-11-29 07:29:30.512 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:30 compute-0 nova_compute[187185]: 2025-11-29 07:29:30.605 187189 DEBUG oslo_concurrency.lockutils [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Acquiring lock "ae465294-b6bd-4481-9fc1-daf5f8ec28a1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:29:30 compute-0 nova_compute[187185]: 2025-11-29 07:29:30.606 187189 DEBUG oslo_concurrency.lockutils [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Lock "ae465294-b6bd-4481-9fc1-daf5f8ec28a1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:29:30 compute-0 nova_compute[187185]: 2025-11-29 07:29:30.640 187189 DEBUG nova.compute.manager [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 07:29:31 compute-0 nova_compute[187185]: 2025-11-29 07:29:31.571 187189 DEBUG oslo_concurrency.lockutils [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:29:31 compute-0 nova_compute[187185]: 2025-11-29 07:29:31.572 187189 DEBUG oslo_concurrency.lockutils [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:29:31 compute-0 nova_compute[187185]: 2025-11-29 07:29:31.580 187189 DEBUG nova.virt.hardware [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 07:29:31 compute-0 nova_compute[187185]: 2025-11-29 07:29:31.581 187189 INFO nova.compute.claims [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Claim successful on node compute-0.ctlplane.example.com
Nov 29 07:29:31 compute-0 nova_compute[187185]: 2025-11-29 07:29:31.633 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:31 compute-0 nova_compute[187185]: 2025-11-29 07:29:31.956 187189 DEBUG nova.compute.provider_tree [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:29:32 compute-0 nova_compute[187185]: 2025-11-29 07:29:32.495 187189 DEBUG nova.scheduler.client.report [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:29:32 compute-0 nova_compute[187185]: 2025-11-29 07:29:32.669 187189 DEBUG oslo_concurrency.lockutils [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.097s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:29:32 compute-0 nova_compute[187185]: 2025-11-29 07:29:32.670 187189 DEBUG nova.compute.manager [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 07:29:32 compute-0 nova_compute[187185]: 2025-11-29 07:29:32.758 187189 DEBUG nova.compute.manager [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948
Nov 29 07:29:32 compute-0 nova_compute[187185]: 2025-11-29 07:29:32.775 187189 INFO nova.virt.libvirt.driver [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 07:29:32 compute-0 nova_compute[187185]: 2025-11-29 07:29:32.800 187189 DEBUG nova.compute.manager [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 07:29:32 compute-0 nova_compute[187185]: 2025-11-29 07:29:32.962 187189 DEBUG nova.compute.manager [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 07:29:32 compute-0 nova_compute[187185]: 2025-11-29 07:29:32.963 187189 DEBUG nova.virt.libvirt.driver [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 07:29:32 compute-0 nova_compute[187185]: 2025-11-29 07:29:32.964 187189 INFO nova.virt.libvirt.driver [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Creating image(s)
Nov 29 07:29:32 compute-0 nova_compute[187185]: 2025-11-29 07:29:32.964 187189 DEBUG oslo_concurrency.lockutils [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Acquiring lock "/var/lib/nova/instances/ae465294-b6bd-4481-9fc1-daf5f8ec28a1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:29:32 compute-0 nova_compute[187185]: 2025-11-29 07:29:32.965 187189 DEBUG oslo_concurrency.lockutils [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Lock "/var/lib/nova/instances/ae465294-b6bd-4481-9fc1-daf5f8ec28a1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:29:32 compute-0 nova_compute[187185]: 2025-11-29 07:29:32.966 187189 DEBUG oslo_concurrency.lockutils [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Lock "/var/lib/nova/instances/ae465294-b6bd-4481-9fc1-daf5f8ec28a1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:29:32 compute-0 nova_compute[187185]: 2025-11-29 07:29:32.984 187189 DEBUG oslo_concurrency.processutils [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:29:33 compute-0 nova_compute[187185]: 2025-11-29 07:29:33.004 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:33 compute-0 nova_compute[187185]: 2025-11-29 07:29:33.055 187189 DEBUG oslo_concurrency.processutils [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:29:33 compute-0 nova_compute[187185]: 2025-11-29 07:29:33.056 187189 DEBUG oslo_concurrency.lockutils [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:29:33 compute-0 nova_compute[187185]: 2025-11-29 07:29:33.057 187189 DEBUG oslo_concurrency.lockutils [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:29:33 compute-0 nova_compute[187185]: 2025-11-29 07:29:33.073 187189 DEBUG oslo_concurrency.processutils [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:29:33 compute-0 nova_compute[187185]: 2025-11-29 07:29:33.129 187189 DEBUG oslo_concurrency.processutils [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:29:33 compute-0 nova_compute[187185]: 2025-11-29 07:29:33.131 187189 DEBUG oslo_concurrency.processutils [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/ae465294-b6bd-4481-9fc1-daf5f8ec28a1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:29:33 compute-0 nova_compute[187185]: 2025-11-29 07:29:33.173 187189 DEBUG oslo_concurrency.processutils [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/ae465294-b6bd-4481-9fc1-daf5f8ec28a1/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:29:33 compute-0 nova_compute[187185]: 2025-11-29 07:29:33.175 187189 DEBUG oslo_concurrency.lockutils [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:29:33 compute-0 nova_compute[187185]: 2025-11-29 07:29:33.175 187189 DEBUG oslo_concurrency.processutils [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:29:33 compute-0 nova_compute[187185]: 2025-11-29 07:29:33.250 187189 DEBUG oslo_concurrency.processutils [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:29:33 compute-0 nova_compute[187185]: 2025-11-29 07:29:33.251 187189 DEBUG nova.virt.disk.api [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Checking if we can resize image /var/lib/nova/instances/ae465294-b6bd-4481-9fc1-daf5f8ec28a1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:29:33 compute-0 nova_compute[187185]: 2025-11-29 07:29:33.251 187189 DEBUG oslo_concurrency.processutils [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae465294-b6bd-4481-9fc1-daf5f8ec28a1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:29:33 compute-0 nova_compute[187185]: 2025-11-29 07:29:33.316 187189 DEBUG oslo_concurrency.processutils [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae465294-b6bd-4481-9fc1-daf5f8ec28a1/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:29:33 compute-0 nova_compute[187185]: 2025-11-29 07:29:33.318 187189 DEBUG nova.virt.disk.api [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Cannot resize image /var/lib/nova/instances/ae465294-b6bd-4481-9fc1-daf5f8ec28a1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:29:33 compute-0 nova_compute[187185]: 2025-11-29 07:29:33.319 187189 DEBUG nova.objects.instance [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Lazy-loading 'migration_context' on Instance uuid ae465294-b6bd-4481-9fc1-daf5f8ec28a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:29:33 compute-0 nova_compute[187185]: 2025-11-29 07:29:33.337 187189 DEBUG nova.virt.libvirt.driver [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 07:29:33 compute-0 nova_compute[187185]: 2025-11-29 07:29:33.337 187189 DEBUG nova.virt.libvirt.driver [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Ensure instance console log exists: /var/lib/nova/instances/ae465294-b6bd-4481-9fc1-daf5f8ec28a1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 07:29:33 compute-0 nova_compute[187185]: 2025-11-29 07:29:33.338 187189 DEBUG oslo_concurrency.lockutils [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:29:33 compute-0 nova_compute[187185]: 2025-11-29 07:29:33.339 187189 DEBUG oslo_concurrency.lockutils [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:29:33 compute-0 nova_compute[187185]: 2025-11-29 07:29:33.339 187189 DEBUG oslo_concurrency.lockutils [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:29:33 compute-0 nova_compute[187185]: 2025-11-29 07:29:33.342 187189 DEBUG nova.virt.libvirt.driver [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:29:33 compute-0 nova_compute[187185]: 2025-11-29 07:29:33.349 187189 WARNING nova.virt.libvirt.driver [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:29:33 compute-0 nova_compute[187185]: 2025-11-29 07:29:33.357 187189 DEBUG nova.virt.libvirt.host [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:29:33 compute-0 nova_compute[187185]: 2025-11-29 07:29:33.358 187189 DEBUG nova.virt.libvirt.host [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:29:33 compute-0 nova_compute[187185]: 2025-11-29 07:29:33.362 187189 DEBUG nova.virt.libvirt.host [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:29:33 compute-0 nova_compute[187185]: 2025-11-29 07:29:33.363 187189 DEBUG nova.virt.libvirt.host [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:29:33 compute-0 nova_compute[187185]: 2025-11-29 07:29:33.365 187189 DEBUG nova.virt.libvirt.driver [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:29:33 compute-0 nova_compute[187185]: 2025-11-29 07:29:33.365 187189 DEBUG nova.virt.hardware [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:29:33 compute-0 nova_compute[187185]: 2025-11-29 07:29:33.366 187189 DEBUG nova.virt.hardware [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:29:33 compute-0 nova_compute[187185]: 2025-11-29 07:29:33.366 187189 DEBUG nova.virt.hardware [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:29:33 compute-0 nova_compute[187185]: 2025-11-29 07:29:33.366 187189 DEBUG nova.virt.hardware [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:29:33 compute-0 nova_compute[187185]: 2025-11-29 07:29:33.367 187189 DEBUG nova.virt.hardware [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:29:33 compute-0 nova_compute[187185]: 2025-11-29 07:29:33.367 187189 DEBUG nova.virt.hardware [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:29:33 compute-0 nova_compute[187185]: 2025-11-29 07:29:33.367 187189 DEBUG nova.virt.hardware [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:29:33 compute-0 nova_compute[187185]: 2025-11-29 07:29:33.367 187189 DEBUG nova.virt.hardware [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:29:33 compute-0 nova_compute[187185]: 2025-11-29 07:29:33.367 187189 DEBUG nova.virt.hardware [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:29:33 compute-0 nova_compute[187185]: 2025-11-29 07:29:33.368 187189 DEBUG nova.virt.hardware [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:29:33 compute-0 nova_compute[187185]: 2025-11-29 07:29:33.368 187189 DEBUG nova.virt.hardware [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:29:33 compute-0 nova_compute[187185]: 2025-11-29 07:29:33.371 187189 DEBUG nova.objects.instance [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Lazy-loading 'pci_devices' on Instance uuid ae465294-b6bd-4481-9fc1-daf5f8ec28a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:29:33 compute-0 nova_compute[187185]: 2025-11-29 07:29:33.390 187189 DEBUG nova.virt.libvirt.driver [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:29:33 compute-0 nova_compute[187185]:   <uuid>ae465294-b6bd-4481-9fc1-daf5f8ec28a1</uuid>
Nov 29 07:29:33 compute-0 nova_compute[187185]:   <name>instance-0000008c</name>
Nov 29 07:29:33 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 07:29:33 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:29:33 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:29:33 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:       <nova:name>tempest-ServerShowV254Test-server-89990852</nova:name>
Nov 29 07:29:33 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:29:33</nova:creationTime>
Nov 29 07:29:33 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 07:29:33 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 07:29:33 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:29:33 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:29:33 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:29:33 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:29:33 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:29:33 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:29:33 compute-0 nova_compute[187185]:         <nova:user uuid="794d169205b34112913d4e9eee1e8456">tempest-ServerShowV254Test-1438671461-project-member</nova:user>
Nov 29 07:29:33 compute-0 nova_compute[187185]:         <nova:project uuid="3b998704163a4d19bfb6f025b536cacc">tempest-ServerShowV254Test-1438671461</nova:project>
Nov 29 07:29:33 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:29:33 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:       <nova:ports/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:29:33 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:29:33 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:29:33 compute-0 nova_compute[187185]:     <system>
Nov 29 07:29:33 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:29:33 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:29:33 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:29:33 compute-0 nova_compute[187185]:       <entry name="serial">ae465294-b6bd-4481-9fc1-daf5f8ec28a1</entry>
Nov 29 07:29:33 compute-0 nova_compute[187185]:       <entry name="uuid">ae465294-b6bd-4481-9fc1-daf5f8ec28a1</entry>
Nov 29 07:29:33 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     </system>
Nov 29 07:29:33 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:29:33 compute-0 nova_compute[187185]:   <os>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:   </os>
Nov 29 07:29:33 compute-0 nova_compute[187185]:   <features>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:   </features>
Nov 29 07:29:33 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:29:33 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:29:33 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:29:33 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:29:33 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:29:33 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/ae465294-b6bd-4481-9fc1-daf5f8ec28a1/disk"/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:29:33 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/ae465294-b6bd-4481-9fc1-daf5f8ec28a1/disk.config"/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:29:33 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/ae465294-b6bd-4481-9fc1-daf5f8ec28a1/console.log" append="off"/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     <video>
Nov 29 07:29:33 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     </video>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:29:33 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:29:33 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:29:33 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:29:33 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:29:33 compute-0 nova_compute[187185]: </domain>
Nov 29 07:29:33 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:29:33 compute-0 nova_compute[187185]: 2025-11-29 07:29:33.452 187189 DEBUG nova.virt.libvirt.driver [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:29:33 compute-0 nova_compute[187185]: 2025-11-29 07:29:33.452 187189 DEBUG nova.virt.libvirt.driver [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:29:33 compute-0 nova_compute[187185]: 2025-11-29 07:29:33.453 187189 INFO nova.virt.libvirt.driver [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Using config drive
Nov 29 07:29:33 compute-0 nova_compute[187185]: 2025-11-29 07:29:33.736 187189 INFO nova.virt.libvirt.driver [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Creating config drive at /var/lib/nova/instances/ae465294-b6bd-4481-9fc1-daf5f8ec28a1/disk.config
Nov 29 07:29:33 compute-0 nova_compute[187185]: 2025-11-29 07:29:33.741 187189 DEBUG oslo_concurrency.processutils [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ae465294-b6bd-4481-9fc1-daf5f8ec28a1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprsqjeb_c execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:29:33 compute-0 nova_compute[187185]: 2025-11-29 07:29:33.872 187189 DEBUG oslo_concurrency.processutils [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ae465294-b6bd-4481-9fc1-daf5f8ec28a1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprsqjeb_c" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:29:33 compute-0 systemd-machined[153486]: New machine qemu-55-instance-0000008c.
Nov 29 07:29:34 compute-0 systemd[1]: Started Virtual Machine qemu-55-instance-0000008c.
Nov 29 07:29:34 compute-0 nova_compute[187185]: 2025-11-29 07:29:34.373 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764401374.3729742, ae465294-b6bd-4481-9fc1-daf5f8ec28a1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:29:34 compute-0 nova_compute[187185]: 2025-11-29 07:29:34.375 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] VM Resumed (Lifecycle Event)
Nov 29 07:29:34 compute-0 nova_compute[187185]: 2025-11-29 07:29:34.378 187189 DEBUG nova.compute.manager [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:29:34 compute-0 nova_compute[187185]: 2025-11-29 07:29:34.379 187189 DEBUG nova.virt.libvirt.driver [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 07:29:34 compute-0 nova_compute[187185]: 2025-11-29 07:29:34.389 187189 INFO nova.virt.libvirt.driver [-] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Instance spawned successfully.
Nov 29 07:29:34 compute-0 nova_compute[187185]: 2025-11-29 07:29:34.389 187189 DEBUG nova.virt.libvirt.driver [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 07:29:34 compute-0 nova_compute[187185]: 2025-11-29 07:29:34.394 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:29:34 compute-0 nova_compute[187185]: 2025-11-29 07:29:34.400 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:29:34 compute-0 nova_compute[187185]: 2025-11-29 07:29:34.417 187189 DEBUG nova.virt.libvirt.driver [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:29:34 compute-0 nova_compute[187185]: 2025-11-29 07:29:34.417 187189 DEBUG nova.virt.libvirt.driver [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:29:34 compute-0 nova_compute[187185]: 2025-11-29 07:29:34.418 187189 DEBUG nova.virt.libvirt.driver [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:29:34 compute-0 nova_compute[187185]: 2025-11-29 07:29:34.419 187189 DEBUG nova.virt.libvirt.driver [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:29:34 compute-0 nova_compute[187185]: 2025-11-29 07:29:34.420 187189 DEBUG nova.virt.libvirt.driver [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:29:34 compute-0 nova_compute[187185]: 2025-11-29 07:29:34.421 187189 DEBUG nova.virt.libvirt.driver [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:29:34 compute-0 nova_compute[187185]: 2025-11-29 07:29:34.427 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:29:34 compute-0 nova_compute[187185]: 2025-11-29 07:29:34.428 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764401374.3749323, ae465294-b6bd-4481-9fc1-daf5f8ec28a1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:29:34 compute-0 nova_compute[187185]: 2025-11-29 07:29:34.428 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] VM Started (Lifecycle Event)
Nov 29 07:29:34 compute-0 nova_compute[187185]: 2025-11-29 07:29:34.456 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:29:34 compute-0 nova_compute[187185]: 2025-11-29 07:29:34.463 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:29:34 compute-0 nova_compute[187185]: 2025-11-29 07:29:34.492 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] During sync_power_state the instance has a pending task (spawning). Skip.
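The two `Synchronizing instance power state` messages above compare the database's power_state 0 with the hypervisor's 1. Nova encodes power states as small integers in `nova/compute/power_state.py`; a reference sketch of that mapping (values reproduced from upstream Nova as I understand them, with a hypothetical `describe_sync` helper):

```python
# Integer power states as defined in nova/compute/power_state.py.
POWER_STATES = {
    0: "NOSTATE",    # DB value before the guest first reports in
    1: "RUNNING",    # what libvirt reports once the domain is up
    3: "PAUSED",
    4: "SHUTDOWN",
    6: "CRASHED",
    7: "SUSPENDED",
}

def describe_sync(db_state, vm_state):
    # Render the mismatch the way the log lines above imply it.
    return (f"DB={POWER_STATES.get(db_state, 'UNKNOWN')} vs "
            f"VM={POWER_STATES.get(vm_state, 'UNKNOWN')}")

print(describe_sync(0, 1))  # → DB=NOSTATE vs VM=RUNNING
```

Because the instance still has a pending `spawning` task, the sync is skipped rather than forcing the DB state to match the hypervisor.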
Nov 29 07:29:34 compute-0 nova_compute[187185]: 2025-11-29 07:29:34.516 187189 INFO nova.compute.manager [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Took 1.55 seconds to spawn the instance on the hypervisor.
Nov 29 07:29:34 compute-0 nova_compute[187185]: 2025-11-29 07:29:34.516 187189 DEBUG nova.compute.manager [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:29:34 compute-0 nova_compute[187185]: 2025-11-29 07:29:34.632 187189 INFO nova.compute.manager [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Took 3.13 seconds to build instance.
Nov 29 07:29:34 compute-0 nova_compute[187185]: 2025-11-29 07:29:34.650 187189 DEBUG oslo_concurrency.lockutils [None req-7951a19f-a7e4-41f5-bf88-01155b13e742 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Lock "ae465294-b6bd-4481-9fc1-daf5f8ec28a1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.044s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:29:35 compute-0 nova_compute[187185]: 2025-11-29 07:29:35.514 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:35 compute-0 podman[237640]: 2025-11-29 07:29:35.859101825 +0000 UTC m=+0.123272347 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125)
Nov 29 07:29:36 compute-0 nova_compute[187185]: 2025-11-29 07:29:36.636 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:37 compute-0 nova_compute[187185]: 2025-11-29 07:29:37.464 187189 INFO nova.compute.manager [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Rebuilding instance
Nov 29 07:29:37 compute-0 nova_compute[187185]: 2025-11-29 07:29:37.790 187189 DEBUG nova.compute.manager [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:29:37 compute-0 nova_compute[187185]: 2025-11-29 07:29:37.856 187189 DEBUG nova.objects.instance [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Lazy-loading 'pci_requests' on Instance uuid ae465294-b6bd-4481-9fc1-daf5f8ec28a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:29:37 compute-0 nova_compute[187185]: 2025-11-29 07:29:37.866 187189 DEBUG nova.objects.instance [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Lazy-loading 'pci_devices' on Instance uuid ae465294-b6bd-4481-9fc1-daf5f8ec28a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:29:37 compute-0 nova_compute[187185]: 2025-11-29 07:29:37.876 187189 DEBUG nova.objects.instance [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Lazy-loading 'resources' on Instance uuid ae465294-b6bd-4481-9fc1-daf5f8ec28a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:29:37 compute-0 nova_compute[187185]: 2025-11-29 07:29:37.886 187189 DEBUG nova.objects.instance [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Lazy-loading 'migration_context' on Instance uuid ae465294-b6bd-4481-9fc1-daf5f8ec28a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:29:37 compute-0 nova_compute[187185]: 2025-11-29 07:29:37.901 187189 DEBUG nova.objects.instance [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Nov 29 07:29:37 compute-0 nova_compute[187185]: 2025-11-29 07:29:37.907 187189 DEBUG nova.virt.libvirt.driver [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Nov 29 07:29:40 compute-0 nova_compute[187185]: 2025-11-29 07:29:40.518 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:41 compute-0 nova_compute[187185]: 2025-11-29 07:29:41.639 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:42 compute-0 podman[237666]: 2025-11-29 07:29:42.808991655 +0000 UTC m=+0.070070111 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 07:29:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:29:45.299 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=46, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=45) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:29:45 compute-0 nova_compute[187185]: 2025-11-29 07:29:45.300 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:29:45.302 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:29:45 compute-0 nova_compute[187185]: 2025-11-29 07:29:45.520 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:46 compute-0 nova_compute[187185]: 2025-11-29 07:29:46.643 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:47 compute-0 podman[237711]: 2025-11-29 07:29:47.841285967 +0000 UTC m=+0.098101815 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:29:47 compute-0 podman[237712]: 2025-11-29 07:29:47.84872325 +0000 UTC m=+0.096624482 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:29:47 compute-0 ovn_controller[95281]: 2025-11-29T07:29:47Z|00451|binding|INFO|Releasing lport 9d125548-8068-4815-941c-f4536091ef07 from this chassis (sb_readonly=0)
Nov 29 07:29:47 compute-0 ovn_controller[95281]: 2025-11-29T07:29:47Z|00452|binding|INFO|Releasing lport f0ce4da0-40ec-44ef-8179-4cbfad9b57f1 from this chassis (sb_readonly=0)
Nov 29 07:29:47 compute-0 nova_compute[187185]: 2025-11-29 07:29:47.950 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:47 compute-0 nova_compute[187185]: 2025-11-29 07:29:47.971 187189 DEBUG nova.virt.libvirt.driver [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Nov 29 07:29:50 compute-0 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d0000008c.scope: Deactivated successfully.
Nov 29 07:29:50 compute-0 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d0000008c.scope: Consumed 13.565s CPU time.
Nov 29 07:29:50 compute-0 systemd-machined[153486]: Machine qemu-55-instance-0000008c terminated.
Nov 29 07:29:50 compute-0 nova_compute[187185]: 2025-11-29 07:29:50.523 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:50 compute-0 nova_compute[187185]: 2025-11-29 07:29:50.989 187189 INFO nova.virt.libvirt.driver [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Instance shutdown successfully after 13 seconds.
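The `_clean_shutdown` sequence above requests a guest power-off, notices the domain is still in state 1 after 10 seconds, resends the shutdown, and sees it stop at 13 seconds. The polling pattern can be sketched like this; the `guest` object and every name here are illustrative stand-ins, not Nova's actual signatures:

```python
import time

def clean_shutdown(guest, timeout=60, retry_interval=10, poll=1):
    """Request an ACPI shutdown and poll until the guest stops.

    Resends the request every `retry_interval` seconds, mirroring the
    "resending shutdown" message in the log: some guests miss or ignore
    the first ACPI event, so a periodic re-ask is cheap insurance.
    """
    guest.shutdown()
    for waited in range(0, timeout, poll):
        if not guest.is_running():
            return True
        if waited and waited % retry_interval == 0:
            guest.shutdown()  # guest ignored us: ask again
        time.sleep(poll)
    return not guest.is_running()
```

If the loop exhausts the timeout with the guest still running, Nova falls back to a hard destroy of the domain.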
Nov 29 07:29:50 compute-0 nova_compute[187185]: 2025-11-29 07:29:50.998 187189 INFO nova.virt.libvirt.driver [-] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Instance destroyed successfully.
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.004 187189 INFO nova.virt.libvirt.driver [-] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Instance destroyed successfully.
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.004 187189 INFO nova.virt.libvirt.driver [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Deleting instance files /var/lib/nova/instances/ae465294-b6bd-4481-9fc1-daf5f8ec28a1_del
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.006 187189 INFO nova.virt.libvirt.driver [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Deletion of /var/lib/nova/instances/ae465294-b6bd-4481-9fc1-daf5f8ec28a1_del complete
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.211 187189 DEBUG nova.virt.libvirt.driver [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.212 187189 INFO nova.virt.libvirt.driver [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Creating image(s)
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.212 187189 DEBUG oslo_concurrency.lockutils [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Acquiring lock "/var/lib/nova/instances/ae465294-b6bd-4481-9fc1-daf5f8ec28a1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.213 187189 DEBUG oslo_concurrency.lockutils [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Lock "/var/lib/nova/instances/ae465294-b6bd-4481-9fc1-daf5f8ec28a1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.213 187189 DEBUG oslo_concurrency.lockutils [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Lock "/var/lib/nova/instances/ae465294-b6bd-4481-9fc1-daf5f8ec28a1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.226 187189 DEBUG oslo_concurrency.processutils [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.328 187189 DEBUG oslo_concurrency.processutils [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json" returned: 0 in 0.102s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.329 187189 DEBUG oslo_concurrency.lockutils [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Acquiring lock "923f30c548f83d073f1130ce28fd6a6debb4b123" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.331 187189 DEBUG oslo_concurrency.lockutils [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Lock "923f30c548f83d073f1130ce28fd6a6debb4b123" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.350 187189 DEBUG oslo_concurrency.processutils [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.435 187189 DEBUG oslo_concurrency.processutils [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.436 187189 DEBUG oslo_concurrency.processutils [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123,backing_fmt=raw /var/lib/nova/instances/ae465294-b6bd-4481-9fc1-daf5f8ec28a1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.489 187189 DEBUG oslo_concurrency.processutils [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123,backing_fmt=raw /var/lib/nova/instances/ae465294-b6bd-4481-9fc1-daf5f8ec28a1/disk 1073741824" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
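The rebuild recreates the root disk as a qcow2 overlay on the cached `_base` image, so only blocks the guest actually writes consume space in the instance directory. A sketch of the same invocation; the helper name and example paths are illustrative:

```python
import shutil
import subprocess

def qcow2_overlay_cmd(backing_file, overlay_path, size_bytes):
    # Copy-on-write overlay: reads fall through to the shared raw base
    # image until the corresponding block is written into the per-instance
    # qcow2 file. backing_fmt=raw matches the logged command.
    return [
        "qemu-img", "create", "-f", "qcow2",
        "-o", f"backing_file={backing_file},backing_fmt=raw",
        overlay_path, str(size_bytes),
    ]

if __name__ == "__main__":
    cmd = qcow2_overlay_cmd("/var/lib/nova/instances/_base/base.raw",
                            "/tmp/overlay.qcow2", 1073741824)
    if shutil.which("qemu-img"):  # run only where qemu-img is available
        subprocess.run(cmd, check=True)
```

This is why deleting or corrupting a `_base` image breaks every instance whose overlay still references it.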
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.491 187189 DEBUG oslo_concurrency.lockutils [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Lock "923f30c548f83d073f1130ce28fd6a6debb4b123" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.491 187189 DEBUG oslo_concurrency.processutils [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.582 187189 DEBUG oslo_concurrency.processutils [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.584 187189 DEBUG nova.virt.disk.api [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Checking if we can resize image /var/lib/nova/instances/ae465294-b6bd-4481-9fc1-daf5f8ec28a1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.585 187189 DEBUG oslo_concurrency.processutils [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae465294-b6bd-4481-9fc1-daf5f8ec28a1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.646 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.671 187189 DEBUG oslo_concurrency.processutils [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae465294-b6bd-4481-9fc1-daf5f8ec28a1/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.672 187189 DEBUG nova.virt.disk.api [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Cannot resize image /var/lib/nova/instances/ae465294-b6bd-4481-9fc1-daf5f8ec28a1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.672 187189 DEBUG nova.virt.libvirt.driver [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.673 187189 DEBUG nova.virt.libvirt.driver [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Ensure instance console log exists: /var/lib/nova/instances/ae465294-b6bd-4481-9fc1-daf5f8ec28a1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.673 187189 DEBUG oslo_concurrency.lockutils [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.674 187189 DEBUG oslo_concurrency.lockutils [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.674 187189 DEBUG oslo_concurrency.lockutils [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.676 187189 DEBUG nova.virt.libvirt.driver [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:09Z,direct_url=<?>,disk_format='qcow2',id=3372b7b2-657b-4c4d-9d9d-7c5b771a630a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.683 187189 WARNING nova.virt.libvirt.driver [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.694 187189 DEBUG nova.virt.libvirt.host [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.695 187189 DEBUG nova.virt.libvirt.host [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.699 187189 DEBUG nova.virt.libvirt.host [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.700 187189 DEBUG nova.virt.libvirt.host [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.702 187189 DEBUG nova.virt.libvirt.driver [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.702 187189 DEBUG nova.virt.hardware [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:09Z,direct_url=<?>,disk_format='qcow2',id=3372b7b2-657b-4c4d-9d9d-7c5b771a630a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.702 187189 DEBUG nova.virt.hardware [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.703 187189 DEBUG nova.virt.hardware [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.703 187189 DEBUG nova.virt.hardware [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.703 187189 DEBUG nova.virt.hardware [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.703 187189 DEBUG nova.virt.hardware [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.703 187189 DEBUG nova.virt.hardware [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.703 187189 DEBUG nova.virt.hardware [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.704 187189 DEBUG nova.virt.hardware [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.704 187189 DEBUG nova.virt.hardware [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.704 187189 DEBUG nova.virt.hardware [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.704 187189 DEBUG nova.objects.instance [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Lazy-loading 'vcpu_model' on Instance uuid ae465294-b6bd-4481-9fc1-daf5f8ec28a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.735 187189 DEBUG nova.virt.libvirt.driver [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:29:51 compute-0 nova_compute[187185]:   <uuid>ae465294-b6bd-4481-9fc1-daf5f8ec28a1</uuid>
Nov 29 07:29:51 compute-0 nova_compute[187185]:   <name>instance-0000008c</name>
Nov 29 07:29:51 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 07:29:51 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:29:51 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:29:51 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:       <nova:name>tempest-ServerShowV254Test-server-89990852</nova:name>
Nov 29 07:29:51 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:29:51</nova:creationTime>
Nov 29 07:29:51 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 07:29:51 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 07:29:51 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:29:51 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:29:51 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:29:51 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:29:51 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:29:51 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:29:51 compute-0 nova_compute[187185]:         <nova:user uuid="794d169205b34112913d4e9eee1e8456">tempest-ServerShowV254Test-1438671461-project-member</nova:user>
Nov 29 07:29:51 compute-0 nova_compute[187185]:         <nova:project uuid="3b998704163a4d19bfb6f025b536cacc">tempest-ServerShowV254Test-1438671461</nova:project>
Nov 29 07:29:51 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:29:51 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="3372b7b2-657b-4c4d-9d9d-7c5b771a630a"/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:       <nova:ports/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:29:51 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:29:51 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:29:51 compute-0 nova_compute[187185]:     <system>
Nov 29 07:29:51 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:29:51 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:29:51 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:29:51 compute-0 nova_compute[187185]:       <entry name="serial">ae465294-b6bd-4481-9fc1-daf5f8ec28a1</entry>
Nov 29 07:29:51 compute-0 nova_compute[187185]:       <entry name="uuid">ae465294-b6bd-4481-9fc1-daf5f8ec28a1</entry>
Nov 29 07:29:51 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     </system>
Nov 29 07:29:51 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:29:51 compute-0 nova_compute[187185]:   <os>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:   </os>
Nov 29 07:29:51 compute-0 nova_compute[187185]:   <features>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:   </features>
Nov 29 07:29:51 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:29:51 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:29:51 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:29:51 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:29:51 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:29:51 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/ae465294-b6bd-4481-9fc1-daf5f8ec28a1/disk"/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:29:51 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/ae465294-b6bd-4481-9fc1-daf5f8ec28a1/disk.config"/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:29:51 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/ae465294-b6bd-4481-9fc1-daf5f8ec28a1/console.log" append="off"/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     <video>
Nov 29 07:29:51 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     </video>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:29:51 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:29:51 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:29:51 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:29:51 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:29:51 compute-0 nova_compute[187185]: </domain>
Nov 29 07:29:51 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.893 187189 DEBUG nova.virt.libvirt.driver [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.894 187189 DEBUG nova.virt.libvirt.driver [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.896 187189 INFO nova.virt.libvirt.driver [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Using config drive
Nov 29 07:29:51 compute-0 sshd-session[237774]: Received disconnect from 115.190.136.184 port 48550:11: Bye Bye [preauth]
Nov 29 07:29:51 compute-0 sshd-session[237774]: Disconnected from authenticating user root 115.190.136.184 port 48550 [preauth]
Nov 29 07:29:51 compute-0 nova_compute[187185]: 2025-11-29 07:29:51.992 187189 DEBUG nova.objects.instance [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Lazy-loading 'ec2_ids' on Instance uuid ae465294-b6bd-4481-9fc1-daf5f8ec28a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:29:52 compute-0 nova_compute[187185]: 2025-11-29 07:29:52.181 187189 INFO nova.virt.libvirt.driver [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Creating config drive at /var/lib/nova/instances/ae465294-b6bd-4481-9fc1-daf5f8ec28a1/disk.config
Nov 29 07:29:52 compute-0 nova_compute[187185]: 2025-11-29 07:29:52.188 187189 DEBUG oslo_concurrency.processutils [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ae465294-b6bd-4481-9fc1-daf5f8ec28a1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9l6y3dvc execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:29:52 compute-0 nova_compute[187185]: 2025-11-29 07:29:52.332 187189 DEBUG oslo_concurrency.processutils [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ae465294-b6bd-4481-9fc1-daf5f8ec28a1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9l6y3dvc" returned: 0 in 0.144s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:29:52 compute-0 systemd-machined[153486]: New machine qemu-56-instance-0000008c.
Nov 29 07:29:52 compute-0 systemd[1]: Started Virtual Machine qemu-56-instance-0000008c.
Nov 29 07:29:52 compute-0 nova_compute[187185]: 2025-11-29 07:29:52.749 187189 DEBUG nova.virt.libvirt.host [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Removed pending event for ae465294-b6bd-4481-9fc1-daf5f8ec28a1 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Nov 29 07:29:52 compute-0 nova_compute[187185]: 2025-11-29 07:29:52.750 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764401392.7494118, ae465294-b6bd-4481-9fc1-daf5f8ec28a1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:29:52 compute-0 nova_compute[187185]: 2025-11-29 07:29:52.750 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] VM Resumed (Lifecycle Event)
Nov 29 07:29:52 compute-0 nova_compute[187185]: 2025-11-29 07:29:52.754 187189 DEBUG nova.compute.manager [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:29:52 compute-0 nova_compute[187185]: 2025-11-29 07:29:52.754 187189 DEBUG nova.virt.libvirt.driver [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 07:29:52 compute-0 nova_compute[187185]: 2025-11-29 07:29:52.774 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:29:52 compute-0 nova_compute[187185]: 2025-11-29 07:29:52.775 187189 INFO nova.virt.libvirt.driver [-] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Instance spawned successfully.
Nov 29 07:29:52 compute-0 nova_compute[187185]: 2025-11-29 07:29:52.775 187189 DEBUG nova.virt.libvirt.driver [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 07:29:52 compute-0 nova_compute[187185]: 2025-11-29 07:29:52.782 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:29:52 compute-0 nova_compute[187185]: 2025-11-29 07:29:52.799 187189 DEBUG nova.virt.libvirt.driver [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:29:52 compute-0 nova_compute[187185]: 2025-11-29 07:29:52.800 187189 DEBUG nova.virt.libvirt.driver [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:29:52 compute-0 nova_compute[187185]: 2025-11-29 07:29:52.801 187189 DEBUG nova.virt.libvirt.driver [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:29:52 compute-0 nova_compute[187185]: 2025-11-29 07:29:52.802 187189 DEBUG nova.virt.libvirt.driver [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:29:52 compute-0 nova_compute[187185]: 2025-11-29 07:29:52.802 187189 DEBUG nova.virt.libvirt.driver [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:29:52 compute-0 nova_compute[187185]: 2025-11-29 07:29:52.803 187189 DEBUG nova.virt.libvirt.driver [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:29:52 compute-0 nova_compute[187185]: 2025-11-29 07:29:52.809 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Nov 29 07:29:52 compute-0 nova_compute[187185]: 2025-11-29 07:29:52.810 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764401392.7529778, ae465294-b6bd-4481-9fc1-daf5f8ec28a1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:29:52 compute-0 nova_compute[187185]: 2025-11-29 07:29:52.810 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] VM Started (Lifecycle Event)
Nov 29 07:29:52 compute-0 nova_compute[187185]: 2025-11-29 07:29:52.837 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:29:52 compute-0 nova_compute[187185]: 2025-11-29 07:29:52.841 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:29:52 compute-0 nova_compute[187185]: 2025-11-29 07:29:52.863 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Nov 29 07:29:52 compute-0 nova_compute[187185]: 2025-11-29 07:29:52.891 187189 DEBUG nova.compute.manager [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:29:52 compute-0 nova_compute[187185]: 2025-11-29 07:29:52.991 187189 DEBUG oslo_concurrency.lockutils [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:29:52 compute-0 nova_compute[187185]: 2025-11-29 07:29:52.992 187189 DEBUG oslo_concurrency.lockutils [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:29:52 compute-0 nova_compute[187185]: 2025-11-29 07:29:52.992 187189 DEBUG nova.objects.instance [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Nov 29 07:29:53 compute-0 nova_compute[187185]: 2025-11-29 07:29:53.069 187189 DEBUG oslo_concurrency.lockutils [None req-c39d0557-c261-4454-975f-c0b8e7c4c5a5 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.077s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:29:53 compute-0 nova_compute[187185]: 2025-11-29 07:29:53.496 187189 DEBUG oslo_concurrency.lockutils [None req-aadc7f8e-c2d4-49ae-996f-aff918531323 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Acquiring lock "ae465294-b6bd-4481-9fc1-daf5f8ec28a1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:29:53 compute-0 nova_compute[187185]: 2025-11-29 07:29:53.496 187189 DEBUG oslo_concurrency.lockutils [None req-aadc7f8e-c2d4-49ae-996f-aff918531323 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Lock "ae465294-b6bd-4481-9fc1-daf5f8ec28a1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:29:53 compute-0 nova_compute[187185]: 2025-11-29 07:29:53.497 187189 DEBUG oslo_concurrency.lockutils [None req-aadc7f8e-c2d4-49ae-996f-aff918531323 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Acquiring lock "ae465294-b6bd-4481-9fc1-daf5f8ec28a1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:29:53 compute-0 nova_compute[187185]: 2025-11-29 07:29:53.497 187189 DEBUG oslo_concurrency.lockutils [None req-aadc7f8e-c2d4-49ae-996f-aff918531323 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Lock "ae465294-b6bd-4481-9fc1-daf5f8ec28a1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:29:53 compute-0 nova_compute[187185]: 2025-11-29 07:29:53.497 187189 DEBUG oslo_concurrency.lockutils [None req-aadc7f8e-c2d4-49ae-996f-aff918531323 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Lock "ae465294-b6bd-4481-9fc1-daf5f8ec28a1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:29:53 compute-0 nova_compute[187185]: 2025-11-29 07:29:53.510 187189 INFO nova.compute.manager [None req-aadc7f8e-c2d4-49ae-996f-aff918531323 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Terminating instance
Nov 29 07:29:53 compute-0 nova_compute[187185]: 2025-11-29 07:29:53.520 187189 DEBUG oslo_concurrency.lockutils [None req-aadc7f8e-c2d4-49ae-996f-aff918531323 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Acquiring lock "refresh_cache-ae465294-b6bd-4481-9fc1-daf5f8ec28a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:29:53 compute-0 nova_compute[187185]: 2025-11-29 07:29:53.521 187189 DEBUG oslo_concurrency.lockutils [None req-aadc7f8e-c2d4-49ae-996f-aff918531323 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Acquired lock "refresh_cache-ae465294-b6bd-4481-9fc1-daf5f8ec28a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:29:53 compute-0 nova_compute[187185]: 2025-11-29 07:29:53.521 187189 DEBUG nova.network.neutron [None req-aadc7f8e-c2d4-49ae-996f-aff918531323 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:29:53 compute-0 nova_compute[187185]: 2025-11-29 07:29:53.714 187189 DEBUG nova.network.neutron [None req-aadc7f8e-c2d4-49ae-996f-aff918531323 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:29:53 compute-0 nova_compute[187185]: 2025-11-29 07:29:53.974 187189 DEBUG nova.network.neutron [None req-aadc7f8e-c2d4-49ae-996f-aff918531323 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:29:53 compute-0 nova_compute[187185]: 2025-11-29 07:29:53.992 187189 DEBUG oslo_concurrency.lockutils [None req-aadc7f8e-c2d4-49ae-996f-aff918531323 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Releasing lock "refresh_cache-ae465294-b6bd-4481-9fc1-daf5f8ec28a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:29:53 compute-0 nova_compute[187185]: 2025-11-29 07:29:53.993 187189 DEBUG nova.compute.manager [None req-aadc7f8e-c2d4-49ae-996f-aff918531323 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 07:29:54 compute-0 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d0000008c.scope: Deactivated successfully.
Nov 29 07:29:54 compute-0 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d0000008c.scope: Consumed 1.558s CPU time.
Nov 29 07:29:54 compute-0 systemd-machined[153486]: Machine qemu-56-instance-0000008c terminated.
Nov 29 07:29:54 compute-0 nova_compute[187185]: 2025-11-29 07:29:54.262 187189 INFO nova.virt.libvirt.driver [-] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Instance destroyed successfully.
Nov 29 07:29:54 compute-0 nova_compute[187185]: 2025-11-29 07:29:54.263 187189 DEBUG nova.objects.instance [None req-aadc7f8e-c2d4-49ae-996f-aff918531323 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Lazy-loading 'resources' on Instance uuid ae465294-b6bd-4481-9fc1-daf5f8ec28a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:29:54 compute-0 nova_compute[187185]: 2025-11-29 07:29:54.277 187189 INFO nova.virt.libvirt.driver [None req-aadc7f8e-c2d4-49ae-996f-aff918531323 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Deleting instance files /var/lib/nova/instances/ae465294-b6bd-4481-9fc1-daf5f8ec28a1_del
Nov 29 07:29:54 compute-0 nova_compute[187185]: 2025-11-29 07:29:54.277 187189 INFO nova.virt.libvirt.driver [None req-aadc7f8e-c2d4-49ae-996f-aff918531323 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Deletion of /var/lib/nova/instances/ae465294-b6bd-4481-9fc1-daf5f8ec28a1_del complete
Nov 29 07:29:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:29:54.304 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '46'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:29:54 compute-0 nova_compute[187185]: 2025-11-29 07:29:54.348 187189 INFO nova.compute.manager [None req-aadc7f8e-c2d4-49ae-996f-aff918531323 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Took 0.35 seconds to destroy the instance on the hypervisor.
Nov 29 07:29:54 compute-0 nova_compute[187185]: 2025-11-29 07:29:54.349 187189 DEBUG oslo.service.loopingcall [None req-aadc7f8e-c2d4-49ae-996f-aff918531323 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 07:29:54 compute-0 nova_compute[187185]: 2025-11-29 07:29:54.350 187189 DEBUG nova.compute.manager [-] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 07:29:54 compute-0 nova_compute[187185]: 2025-11-29 07:29:54.350 187189 DEBUG nova.network.neutron [-] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 07:29:54 compute-0 nova_compute[187185]: 2025-11-29 07:29:54.578 187189 DEBUG nova.network.neutron [-] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:29:54 compute-0 nova_compute[187185]: 2025-11-29 07:29:54.594 187189 DEBUG nova.network.neutron [-] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:29:54 compute-0 nova_compute[187185]: 2025-11-29 07:29:54.607 187189 INFO nova.compute.manager [-] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Took 0.26 seconds to deallocate network for instance.
Nov 29 07:29:54 compute-0 nova_compute[187185]: 2025-11-29 07:29:54.682 187189 DEBUG oslo_concurrency.lockutils [None req-aadc7f8e-c2d4-49ae-996f-aff918531323 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:29:54 compute-0 nova_compute[187185]: 2025-11-29 07:29:54.683 187189 DEBUG oslo_concurrency.lockutils [None req-aadc7f8e-c2d4-49ae-996f-aff918531323 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:29:54 compute-0 nova_compute[187185]: 2025-11-29 07:29:54.772 187189 DEBUG nova.compute.provider_tree [None req-aadc7f8e-c2d4-49ae-996f-aff918531323 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:29:54 compute-0 nova_compute[187185]: 2025-11-29 07:29:54.790 187189 DEBUG nova.scheduler.client.report [None req-aadc7f8e-c2d4-49ae-996f-aff918531323 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:29:54 compute-0 nova_compute[187185]: 2025-11-29 07:29:54.816 187189 DEBUG oslo_concurrency.lockutils [None req-aadc7f8e-c2d4-49ae-996f-aff918531323 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:29:54 compute-0 nova_compute[187185]: 2025-11-29 07:29:54.839 187189 INFO nova.scheduler.client.report [None req-aadc7f8e-c2d4-49ae-996f-aff918531323 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Deleted allocations for instance ae465294-b6bd-4481-9fc1-daf5f8ec28a1
Nov 29 07:29:54 compute-0 nova_compute[187185]: 2025-11-29 07:29:54.936 187189 DEBUG oslo_concurrency.lockutils [None req-aadc7f8e-c2d4-49ae-996f-aff918531323 794d169205b34112913d4e9eee1e8456 3b998704163a4d19bfb6f025b536cacc - - default default] Lock "ae465294-b6bd-4481-9fc1-daf5f8ec28a1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.439s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:29:55 compute-0 nova_compute[187185]: 2025-11-29 07:29:55.284 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:55 compute-0 nova_compute[187185]: 2025-11-29 07:29:55.526 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:56 compute-0 nova_compute[187185]: 2025-11-29 07:29:56.699 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:57 compute-0 podman[237828]: 2025-11-29 07:29:57.798003877 +0000 UTC m=+0.057912912 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 07:29:57 compute-0 podman[237830]: 2025-11-29 07:29:57.804540954 +0000 UTC m=+0.056568823 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 07:29:57 compute-0 podman[237829]: 2025-11-29 07:29:57.807122959 +0000 UTC m=+0.066980353 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, distribution-scope=public, maintainer=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, vcs-type=git, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, release=1755695350, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, config_id=edpm, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 29 07:29:58 compute-0 nova_compute[187185]: 2025-11-29 07:29:58.298 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:59 compute-0 nova_compute[187185]: 2025-11-29 07:29:59.517 187189 DEBUG nova.compute.manager [req-266aff83-8fbb-474b-8606-4f8b0df6404f req-c5f665d8-c428-4fd6-9b37-329979969a18 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Received event network-changed-836d3fdf-e98b-4a41-864f-9e3fbdb29394 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:29:59 compute-0 nova_compute[187185]: 2025-11-29 07:29:59.518 187189 DEBUG nova.compute.manager [req-266aff83-8fbb-474b-8606-4f8b0df6404f req-c5f665d8-c428-4fd6-9b37-329979969a18 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Refreshing instance network info cache due to event network-changed-836d3fdf-e98b-4a41-864f-9e3fbdb29394. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:29:59 compute-0 nova_compute[187185]: 2025-11-29 07:29:59.518 187189 DEBUG oslo_concurrency.lockutils [req-266aff83-8fbb-474b-8606-4f8b0df6404f req-c5f665d8-c428-4fd6-9b37-329979969a18 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-2cb4e847-114f-440a-b231-65e3fff0f0d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:29:59 compute-0 nova_compute[187185]: 2025-11-29 07:29:59.518 187189 DEBUG oslo_concurrency.lockutils [req-266aff83-8fbb-474b-8606-4f8b0df6404f req-c5f665d8-c428-4fd6-9b37-329979969a18 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-2cb4e847-114f-440a-b231-65e3fff0f0d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:29:59 compute-0 nova_compute[187185]: 2025-11-29 07:29:59.519 187189 DEBUG nova.network.neutron [req-266aff83-8fbb-474b-8606-4f8b0df6404f req-c5f665d8-c428-4fd6-9b37-329979969a18 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Refreshing network info cache for port 836d3fdf-e98b-4a41-864f-9e3fbdb29394 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:29:59 compute-0 nova_compute[187185]: 2025-11-29 07:29:59.640 187189 DEBUG oslo_concurrency.lockutils [None req-7f2bc6b6-81f0-4e08-a67d-08c09fc33155 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "2cb4e847-114f-440a-b231-65e3fff0f0d2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:29:59 compute-0 nova_compute[187185]: 2025-11-29 07:29:59.641 187189 DEBUG oslo_concurrency.lockutils [None req-7f2bc6b6-81f0-4e08-a67d-08c09fc33155 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2cb4e847-114f-440a-b231-65e3fff0f0d2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:29:59 compute-0 nova_compute[187185]: 2025-11-29 07:29:59.641 187189 DEBUG oslo_concurrency.lockutils [None req-7f2bc6b6-81f0-4e08-a67d-08c09fc33155 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "2cb4e847-114f-440a-b231-65e3fff0f0d2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:29:59 compute-0 nova_compute[187185]: 2025-11-29 07:29:59.641 187189 DEBUG oslo_concurrency.lockutils [None req-7f2bc6b6-81f0-4e08-a67d-08c09fc33155 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2cb4e847-114f-440a-b231-65e3fff0f0d2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:29:59 compute-0 nova_compute[187185]: 2025-11-29 07:29:59.642 187189 DEBUG oslo_concurrency.lockutils [None req-7f2bc6b6-81f0-4e08-a67d-08c09fc33155 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2cb4e847-114f-440a-b231-65e3fff0f0d2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:29:59 compute-0 nova_compute[187185]: 2025-11-29 07:29:59.659 187189 INFO nova.compute.manager [None req-7f2bc6b6-81f0-4e08-a67d-08c09fc33155 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Terminating instance
Nov 29 07:29:59 compute-0 nova_compute[187185]: 2025-11-29 07:29:59.673 187189 DEBUG nova.compute.manager [None req-7f2bc6b6-81f0-4e08-a67d-08c09fc33155 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 07:29:59 compute-0 kernel: tap836d3fdf-e9 (unregistering): left promiscuous mode
Nov 29 07:29:59 compute-0 NetworkManager[55227]: <info>  [1764401399.7023] device (tap836d3fdf-e9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:29:59 compute-0 ovn_controller[95281]: 2025-11-29T07:29:59Z|00453|binding|INFO|Releasing lport 836d3fdf-e98b-4a41-864f-9e3fbdb29394 from this chassis (sb_readonly=0)
Nov 29 07:29:59 compute-0 nova_compute[187185]: 2025-11-29 07:29:59.717 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:59 compute-0 ovn_controller[95281]: 2025-11-29T07:29:59Z|00454|binding|INFO|Setting lport 836d3fdf-e98b-4a41-864f-9e3fbdb29394 down in Southbound
Nov 29 07:29:59 compute-0 ovn_controller[95281]: 2025-11-29T07:29:59Z|00455|binding|INFO|Removing iface tap836d3fdf-e9 ovn-installed in OVS
Nov 29 07:29:59 compute-0 nova_compute[187185]: 2025-11-29 07:29:59.719 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:59 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:29:59.727 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:a1:ff 10.100.0.9'], port_security=['fa:16:3e:d5:a1:ff 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-94472368-b72a-4e5d-ac59-40b24b7ba792', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '291ecd66-4825-4cfd-92f0-8370bd5efe48', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c99c04c3-6b8c-480e-be26-e44e383928c7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=836d3fdf-e98b-4a41-864f-9e3fbdb29394) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:29:59 compute-0 kernel: tap9c194df0-4c (unregistering): left promiscuous mode
Nov 29 07:29:59 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:29:59.730 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 836d3fdf-e98b-4a41-864f-9e3fbdb29394 in datapath 94472368-b72a-4e5d-ac59-40b24b7ba792 unbound from our chassis
Nov 29 07:29:59 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:29:59.734 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 94472368-b72a-4e5d-ac59-40b24b7ba792, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:29:59 compute-0 NetworkManager[55227]: <info>  [1764401399.7365] device (tap9c194df0-4c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:29:59 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:29:59.736 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[b9efdbee-7164-4ddf-8e75-63ccb3ab267b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:29:59 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:29:59.737 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-94472368-b72a-4e5d-ac59-40b24b7ba792 namespace which is not needed anymore
Nov 29 07:29:59 compute-0 nova_compute[187185]: 2025-11-29 07:29:59.738 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:59 compute-0 nova_compute[187185]: 2025-11-29 07:29:59.751 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:59 compute-0 ovn_controller[95281]: 2025-11-29T07:29:59Z|00456|binding|INFO|Releasing lport 9c194df0-4c84-41bd-94af-7e4ecd312dd5 from this chassis (sb_readonly=0)
Nov 29 07:29:59 compute-0 ovn_controller[95281]: 2025-11-29T07:29:59Z|00457|binding|INFO|Setting lport 9c194df0-4c84-41bd-94af-7e4ecd312dd5 down in Southbound
Nov 29 07:29:59 compute-0 ovn_controller[95281]: 2025-11-29T07:29:59Z|00458|binding|INFO|Removing iface tap9c194df0-4c ovn-installed in OVS
Nov 29 07:29:59 compute-0 nova_compute[187185]: 2025-11-29 07:29:59.757 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:59 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:29:59.761 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:da:a3:a5 2001:db8:0:1:f816:3eff:feda:a3a5 2001:db8::f816:3eff:feda:a3a5'], port_security=['fa:16:3e:da:a3:a5 2001:db8:0:1:f816:3eff:feda:a3a5 2001:db8::f816:3eff:feda:a3a5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:feda:a3a5/64 2001:db8::f816:3eff:feda:a3a5/64', 'neutron:device_id': '2cb4e847-114f-440a-b231-65e3fff0f0d2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ff387e90-45c2-42d7-b536-fee4d2b6eb5e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '291ecd66-4825-4cfd-92f0-8370bd5efe48', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ed5ad144-c783-4b67-a226-e0c5588d3535, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=9c194df0-4c84-41bd-94af-7e4ecd312dd5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:29:59 compute-0 nova_compute[187185]: 2025-11-29 07:29:59.778 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:59 compute-0 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d00000087.scope: Deactivated successfully.
Nov 29 07:29:59 compute-0 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d00000087.scope: Consumed 18.156s CPU time.
Nov 29 07:29:59 compute-0 systemd-machined[153486]: Machine qemu-54-instance-00000087 terminated.
Nov 29 07:29:59 compute-0 neutron-haproxy-ovnmeta-94472368-b72a-4e5d-ac59-40b24b7ba792[237087]: [NOTICE]   (237091) : haproxy version is 2.8.14-c23fe91
Nov 29 07:29:59 compute-0 neutron-haproxy-ovnmeta-94472368-b72a-4e5d-ac59-40b24b7ba792[237087]: [NOTICE]   (237091) : path to executable is /usr/sbin/haproxy
Nov 29 07:29:59 compute-0 neutron-haproxy-ovnmeta-94472368-b72a-4e5d-ac59-40b24b7ba792[237087]: [WARNING]  (237091) : Exiting Master process...
Nov 29 07:29:59 compute-0 neutron-haproxy-ovnmeta-94472368-b72a-4e5d-ac59-40b24b7ba792[237087]: [ALERT]    (237091) : Current worker (237093) exited with code 143 (Terminated)
Nov 29 07:29:59 compute-0 neutron-haproxy-ovnmeta-94472368-b72a-4e5d-ac59-40b24b7ba792[237087]: [WARNING]  (237091) : All workers exited. Exiting... (0)
Nov 29 07:29:59 compute-0 systemd[1]: libpod-56e5ba6434cb225dde086a2ec606b7be5a95b70b3965f2411aee8995833bf9f2.scope: Deactivated successfully.
Nov 29 07:29:59 compute-0 podman[237916]: 2025-11-29 07:29:59.886523248 +0000 UTC m=+0.048015419 container died 56e5ba6434cb225dde086a2ec606b7be5a95b70b3965f2411aee8995833bf9f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94472368-b72a-4e5d-ac59-40b24b7ba792, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:29:59 compute-0 NetworkManager[55227]: <info>  [1764401399.9146] manager: (tap9c194df0-4c): new Tun device (/org/freedesktop/NetworkManager/Devices/234)
Nov 29 07:29:59 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-56e5ba6434cb225dde086a2ec606b7be5a95b70b3965f2411aee8995833bf9f2-userdata-shm.mount: Deactivated successfully.
Nov 29 07:29:59 compute-0 systemd[1]: var-lib-containers-storage-overlay-1b47d04aff71d36b1a4920b5e5f528b4acbf90772f99e52b6266be3c57eeb86d-merged.mount: Deactivated successfully.
Nov 29 07:29:59 compute-0 podman[237916]: 2025-11-29 07:29:59.93227013 +0000 UTC m=+0.093762281 container cleanup 56e5ba6434cb225dde086a2ec606b7be5a95b70b3965f2411aee8995833bf9f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94472368-b72a-4e5d-ac59-40b24b7ba792, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:29:59 compute-0 nova_compute[187185]: 2025-11-29 07:29:59.963 187189 INFO nova.virt.libvirt.driver [-] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Instance destroyed successfully.
Nov 29 07:29:59 compute-0 nova_compute[187185]: 2025-11-29 07:29:59.964 187189 DEBUG nova.objects.instance [None req-7f2bc6b6-81f0-4e08-a67d-08c09fc33155 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'resources' on Instance uuid 2cb4e847-114f-440a-b231-65e3fff0f0d2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:29:59 compute-0 systemd[1]: libpod-conmon-56e5ba6434cb225dde086a2ec606b7be5a95b70b3965f2411aee8995833bf9f2.scope: Deactivated successfully.
Nov 29 07:29:59 compute-0 nova_compute[187185]: 2025-11-29 07:29:59.989 187189 DEBUG nova.virt.libvirt.vif [None req-7f2bc6b6-81f0-4e08-a67d-08c09fc33155 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:28:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-345833542',display_name='tempest-TestGettingAddress-server-345833542',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-345833542',id=135,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMDVssChktFh4KaYAW3qIeGMuNugTc7Dbw5csxo4G7cMEBd3kE//4edEXQEzew5nx3IZ0PY0s8qOeS5wzlVZFNNAqhiy1Tau85BZFwFUT3MGeCBu5wHnKv5gWDKRFcpa7w==',key_name='tempest-TestGettingAddress-312359009',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:28:33Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-vmipnsga',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:28:33Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=2cb4e847-114f-440a-b231-65e3fff0f0d2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "836d3fdf-e98b-4a41-864f-9e3fbdb29394", "address": "fa:16:3e:d5:a1:ff", "network": {"id": "94472368-b72a-4e5d-ac59-40b24b7ba792", "bridge": "br-int", "label": "tempest-network-smoke--1761418689", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.220", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap836d3fdf-e9", "ovs_interfaceid": "836d3fdf-e98b-4a41-864f-9e3fbdb29394", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:29:59 compute-0 nova_compute[187185]: 2025-11-29 07:29:59.990 187189 DEBUG nova.network.os_vif_util [None req-7f2bc6b6-81f0-4e08-a67d-08c09fc33155 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "836d3fdf-e98b-4a41-864f-9e3fbdb29394", "address": "fa:16:3e:d5:a1:ff", "network": {"id": "94472368-b72a-4e5d-ac59-40b24b7ba792", "bridge": "br-int", "label": "tempest-network-smoke--1761418689", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.220", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap836d3fdf-e9", "ovs_interfaceid": "836d3fdf-e98b-4a41-864f-9e3fbdb29394", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:29:59 compute-0 nova_compute[187185]: 2025-11-29 07:29:59.991 187189 DEBUG nova.network.os_vif_util [None req-7f2bc6b6-81f0-4e08-a67d-08c09fc33155 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d5:a1:ff,bridge_name='br-int',has_traffic_filtering=True,id=836d3fdf-e98b-4a41-864f-9e3fbdb29394,network=Network(94472368-b72a-4e5d-ac59-40b24b7ba792),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap836d3fdf-e9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:29:59 compute-0 nova_compute[187185]: 2025-11-29 07:29:59.991 187189 DEBUG os_vif [None req-7f2bc6b6-81f0-4e08-a67d-08c09fc33155 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d5:a1:ff,bridge_name='br-int',has_traffic_filtering=True,id=836d3fdf-e98b-4a41-864f-9e3fbdb29394,network=Network(94472368-b72a-4e5d-ac59-40b24b7ba792),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap836d3fdf-e9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:29:59 compute-0 nova_compute[187185]: 2025-11-29 07:29:59.994 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:29:59 compute-0 nova_compute[187185]: 2025-11-29 07:29:59.994 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap836d3fdf-e9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:30:00 compute-0 podman[237965]: 2025-11-29 07:30:00.010080021 +0000 UTC m=+0.048384818 container remove 56e5ba6434cb225dde086a2ec606b7be5a95b70b3965f2411aee8995833bf9f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94472368-b72a-4e5d-ac59-40b24b7ba792, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 07:30:00 compute-0 nova_compute[187185]: 2025-11-29 07:30:00.026 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:30:00 compute-0 nova_compute[187185]: 2025-11-29 07:30:00.029 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:30:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:30:00.030 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[c4b9d3db-c274-4740-aa1d-20677f32bb0e]: (4, ('Sat Nov 29 07:29:59 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-94472368-b72a-4e5d-ac59-40b24b7ba792 (56e5ba6434cb225dde086a2ec606b7be5a95b70b3965f2411aee8995833bf9f2)\n56e5ba6434cb225dde086a2ec606b7be5a95b70b3965f2411aee8995833bf9f2\nSat Nov 29 07:29:59 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-94472368-b72a-4e5d-ac59-40b24b7ba792 (56e5ba6434cb225dde086a2ec606b7be5a95b70b3965f2411aee8995833bf9f2)\n56e5ba6434cb225dde086a2ec606b7be5a95b70b3965f2411aee8995833bf9f2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:30:00 compute-0 nova_compute[187185]: 2025-11-29 07:30:00.032 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:30:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:30:00.032 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[7d18fbe1-1b71-4414-b610-62b0075ad723]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:30:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:30:00.035 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap94472368-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:30:00 compute-0 nova_compute[187185]: 2025-11-29 07:30:00.037 187189 INFO os_vif [None req-7f2bc6b6-81f0-4e08-a67d-08c09fc33155 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d5:a1:ff,bridge_name='br-int',has_traffic_filtering=True,id=836d3fdf-e98b-4a41-864f-9e3fbdb29394,network=Network(94472368-b72a-4e5d-ac59-40b24b7ba792),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap836d3fdf-e9')
Nov 29 07:30:00 compute-0 kernel: tap94472368-b0: left promiscuous mode
Nov 29 07:30:00 compute-0 nova_compute[187185]: 2025-11-29 07:30:00.038 187189 DEBUG nova.virt.libvirt.vif [None req-7f2bc6b6-81f0-4e08-a67d-08c09fc33155 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:28:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-345833542',display_name='tempest-TestGettingAddress-server-345833542',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-345833542',id=135,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMDVssChktFh4KaYAW3qIeGMuNugTc7Dbw5csxo4G7cMEBd3kE//4edEXQEzew5nx3IZ0PY0s8qOeS5wzlVZFNNAqhiy1Tau85BZFwFUT3MGeCBu5wHnKv5gWDKRFcpa7w==',key_name='tempest-TestGettingAddress-312359009',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:28:33Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-vmipnsga',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:28:33Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=2cb4e847-114f-440a-b231-65e3fff0f0d2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9c194df0-4c84-41bd-94af-7e4ecd312dd5", "address": "fa:16:3e:da:a3:a5", "network": {"id": "ff387e90-45c2-42d7-b536-fee4d2b6eb5e", "bridge": "br-int", "label": "tempest-network-smoke--1344723239", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": 
"2001:db8:0:1:f816:3eff:feda:a3a5", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feda:a3a5", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c194df0-4c", "ovs_interfaceid": "9c194df0-4c84-41bd-94af-7e4ecd312dd5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:30:00 compute-0 nova_compute[187185]: 2025-11-29 07:30:00.038 187189 DEBUG nova.network.os_vif_util [None req-7f2bc6b6-81f0-4e08-a67d-08c09fc33155 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "9c194df0-4c84-41bd-94af-7e4ecd312dd5", "address": "fa:16:3e:da:a3:a5", "network": {"id": "ff387e90-45c2-42d7-b536-fee4d2b6eb5e", "bridge": "br-int", "label": "tempest-network-smoke--1344723239", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feda:a3a5", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feda:a3a5", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c194df0-4c", "ovs_interfaceid": "9c194df0-4c84-41bd-94af-7e4ecd312dd5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:30:00 compute-0 nova_compute[187185]: 2025-11-29 07:30:00.039 187189 DEBUG nova.network.os_vif_util [None req-7f2bc6b6-81f0-4e08-a67d-08c09fc33155 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:da:a3:a5,bridge_name='br-int',has_traffic_filtering=True,id=9c194df0-4c84-41bd-94af-7e4ecd312dd5,network=Network(ff387e90-45c2-42d7-b536-fee4d2b6eb5e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9c194df0-4c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:30:00 compute-0 nova_compute[187185]: 2025-11-29 07:30:00.039 187189 DEBUG os_vif [None req-7f2bc6b6-81f0-4e08-a67d-08c09fc33155 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:da:a3:a5,bridge_name='br-int',has_traffic_filtering=True,id=9c194df0-4c84-41bd-94af-7e4ecd312dd5,network=Network(ff387e90-45c2-42d7-b536-fee4d2b6eb5e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9c194df0-4c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:30:00 compute-0 nova_compute[187185]: 2025-11-29 07:30:00.041 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:30:00 compute-0 nova_compute[187185]: 2025-11-29 07:30:00.041 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9c194df0-4c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:30:00 compute-0 nova_compute[187185]: 2025-11-29 07:30:00.042 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:30:00 compute-0 nova_compute[187185]: 2025-11-29 07:30:00.044 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:30:00 compute-0 nova_compute[187185]: 2025-11-29 07:30:00.052 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:30:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:30:00.056 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[77477dcc-0be0-4b22-bbe5-46fc4f038a0e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:30:00 compute-0 nova_compute[187185]: 2025-11-29 07:30:00.056 187189 INFO os_vif [None req-7f2bc6b6-81f0-4e08-a67d-08c09fc33155 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:da:a3:a5,bridge_name='br-int',has_traffic_filtering=True,id=9c194df0-4c84-41bd-94af-7e4ecd312dd5,network=Network(ff387e90-45c2-42d7-b536-fee4d2b6eb5e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9c194df0-4c')
Nov 29 07:30:00 compute-0 nova_compute[187185]: 2025-11-29 07:30:00.057 187189 INFO nova.virt.libvirt.driver [None req-7f2bc6b6-81f0-4e08-a67d-08c09fc33155 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Deleting instance files /var/lib/nova/instances/2cb4e847-114f-440a-b231-65e3fff0f0d2_del
Nov 29 07:30:00 compute-0 nova_compute[187185]: 2025-11-29 07:30:00.058 187189 INFO nova.virt.libvirt.driver [None req-7f2bc6b6-81f0-4e08-a67d-08c09fc33155 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Deletion of /var/lib/nova/instances/2cb4e847-114f-440a-b231-65e3fff0f0d2_del complete
Nov 29 07:30:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:30:00.074 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[511860c4-8501-4277-8361-99aa54f65154]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:30:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:30:00.075 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[ede1e31e-fc3a-44cc-a97b-abdcb864b46d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:30:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:30:00.092 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[75233ac8-9ed9-4bfb-ae7b-d0f7bbcae135]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 675807, 'reachable_time': 40630, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237986, 'error': None, 'target': 'ovnmeta-94472368-b72a-4e5d-ac59-40b24b7ba792', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:30:00 compute-0 systemd[1]: run-netns-ovnmeta\x2d94472368\x2db72a\x2d4e5d\x2dac59\x2d40b24b7ba792.mount: Deactivated successfully.
Nov 29 07:30:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:30:00.097 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-94472368-b72a-4e5d-ac59-40b24b7ba792 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:30:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:30:00.098 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[5f9b76e8-affc-449d-a2ee-a4fbe4151c37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:30:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:30:00.099 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 9c194df0-4c84-41bd-94af-7e4ecd312dd5 in datapath ff387e90-45c2-42d7-b536-fee4d2b6eb5e unbound from our chassis
Nov 29 07:30:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:30:00.100 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ff387e90-45c2-42d7-b536-fee4d2b6eb5e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:30:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:30:00.101 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[c08a7043-d685-4ed8-91b4-e9b45b130814]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:30:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:30:00.101 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e namespace which is not needed anymore
Nov 29 07:30:00 compute-0 neutron-haproxy-ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e[237169]: [NOTICE]   (237173) : haproxy version is 2.8.14-c23fe91
Nov 29 07:30:00 compute-0 neutron-haproxy-ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e[237169]: [NOTICE]   (237173) : path to executable is /usr/sbin/haproxy
Nov 29 07:30:00 compute-0 neutron-haproxy-ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e[237169]: [WARNING]  (237173) : Exiting Master process...
Nov 29 07:30:00 compute-0 neutron-haproxy-ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e[237169]: [ALERT]    (237173) : Current worker (237175) exited with code 143 (Terminated)
Nov 29 07:30:00 compute-0 neutron-haproxy-ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e[237169]: [WARNING]  (237173) : All workers exited. Exiting... (0)
Nov 29 07:30:00 compute-0 systemd[1]: libpod-e033e20188aa4ad87e9df95b5b83bbfd0157c0456746005e1ce373a546b84257.scope: Deactivated successfully.
Nov 29 07:30:00 compute-0 podman[238003]: 2025-11-29 07:30:00.255040877 +0000 UTC m=+0.052017183 container died e033e20188aa4ad87e9df95b5b83bbfd0157c0456746005e1ce373a546b84257 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:30:00 compute-0 nova_compute[187185]: 2025-11-29 07:30:00.266 187189 INFO nova.compute.manager [None req-7f2bc6b6-81f0-4e08-a67d-08c09fc33155 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Took 0.59 seconds to destroy the instance on the hypervisor.
Nov 29 07:30:00 compute-0 nova_compute[187185]: 2025-11-29 07:30:00.267 187189 DEBUG oslo.service.loopingcall [None req-7f2bc6b6-81f0-4e08-a67d-08c09fc33155 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 07:30:00 compute-0 nova_compute[187185]: 2025-11-29 07:30:00.267 187189 DEBUG nova.compute.manager [-] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 07:30:00 compute-0 nova_compute[187185]: 2025-11-29 07:30:00.267 187189 DEBUG nova.network.neutron [-] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 07:30:00 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e033e20188aa4ad87e9df95b5b83bbfd0157c0456746005e1ce373a546b84257-userdata-shm.mount: Deactivated successfully.
Nov 29 07:30:00 compute-0 systemd[1]: var-lib-containers-storage-overlay-4a170deaf292f901bfd34cda8779c75d6590e63df05c96aa56a3e0691adcc5f7-merged.mount: Deactivated successfully.
Nov 29 07:30:00 compute-0 podman[238003]: 2025-11-29 07:30:00.297414393 +0000 UTC m=+0.094390699 container cleanup e033e20188aa4ad87e9df95b5b83bbfd0157c0456746005e1ce373a546b84257 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:30:00 compute-0 systemd[1]: libpod-conmon-e033e20188aa4ad87e9df95b5b83bbfd0157c0456746005e1ce373a546b84257.scope: Deactivated successfully.
Nov 29 07:30:00 compute-0 podman[238032]: 2025-11-29 07:30:00.359949956 +0000 UTC m=+0.043048096 container remove e033e20188aa4ad87e9df95b5b83bbfd0157c0456746005e1ce373a546b84257 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 07:30:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:30:00.364 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[44abde52-d132-4e1d-a471-68180db08081]: (4, ('Sat Nov 29 07:30:00 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e (e033e20188aa4ad87e9df95b5b83bbfd0157c0456746005e1ce373a546b84257)\ne033e20188aa4ad87e9df95b5b83bbfd0157c0456746005e1ce373a546b84257\nSat Nov 29 07:30:00 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e (e033e20188aa4ad87e9df95b5b83bbfd0157c0456746005e1ce373a546b84257)\ne033e20188aa4ad87e9df95b5b83bbfd0157c0456746005e1ce373a546b84257\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:30:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:30:00.366 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d2e7d6c4-ae42-4104-902c-33520f2d4b13]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:30:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:30:00.366 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapff387e90-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:30:00 compute-0 nova_compute[187185]: 2025-11-29 07:30:00.368 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:30:00 compute-0 kernel: tapff387e90-40: left promiscuous mode
Nov 29 07:30:00 compute-0 nova_compute[187185]: 2025-11-29 07:30:00.379 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:30:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:30:00.382 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[81aaf6ff-c028-418e-96e9-2c3edb0978f9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:30:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:30:00.409 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[99628103-7c21-42ad-9fad-2d2ab44cc5fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:30:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:30:00.410 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[02b80e11-34bd-4e0b-b320-b4ffa7b32a64]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:30:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:30:00.425 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[3ae43ead-5db2-4a0e-99d9-d8a5d66ade95]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 675894, 'reachable_time': 32118, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238048, 'error': None, 'target': 'ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:30:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:30:00.428 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:30:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:30:00.428 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[dbd8bcc3-f951-4725-85b9-6a732f5a12c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:30:00 compute-0 systemd[1]: run-netns-ovnmeta\x2dff387e90\x2d45c2\x2d42d7\x2db536\x2dfee4d2b6eb5e.mount: Deactivated successfully.
Nov 29 07:30:01 compute-0 nova_compute[187185]: 2025-11-29 07:30:01.258 187189 DEBUG nova.compute.manager [req-f0efa979-b0f0-48bf-912f-ffd49ad503d9 req-49606fd8-924e-432e-ac41-66a59958eb98 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Received event network-vif-unplugged-9c194df0-4c84-41bd-94af-7e4ecd312dd5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:30:01 compute-0 nova_compute[187185]: 2025-11-29 07:30:01.259 187189 DEBUG oslo_concurrency.lockutils [req-f0efa979-b0f0-48bf-912f-ffd49ad503d9 req-49606fd8-924e-432e-ac41-66a59958eb98 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2cb4e847-114f-440a-b231-65e3fff0f0d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:30:01 compute-0 nova_compute[187185]: 2025-11-29 07:30:01.260 187189 DEBUG oslo_concurrency.lockutils [req-f0efa979-b0f0-48bf-912f-ffd49ad503d9 req-49606fd8-924e-432e-ac41-66a59958eb98 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2cb4e847-114f-440a-b231-65e3fff0f0d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:30:01 compute-0 nova_compute[187185]: 2025-11-29 07:30:01.260 187189 DEBUG oslo_concurrency.lockutils [req-f0efa979-b0f0-48bf-912f-ffd49ad503d9 req-49606fd8-924e-432e-ac41-66a59958eb98 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2cb4e847-114f-440a-b231-65e3fff0f0d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:30:01 compute-0 nova_compute[187185]: 2025-11-29 07:30:01.261 187189 DEBUG nova.compute.manager [req-f0efa979-b0f0-48bf-912f-ffd49ad503d9 req-49606fd8-924e-432e-ac41-66a59958eb98 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] No waiting events found dispatching network-vif-unplugged-9c194df0-4c84-41bd-94af-7e4ecd312dd5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:30:01 compute-0 nova_compute[187185]: 2025-11-29 07:30:01.262 187189 DEBUG nova.compute.manager [req-f0efa979-b0f0-48bf-912f-ffd49ad503d9 req-49606fd8-924e-432e-ac41-66a59958eb98 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Received event network-vif-unplugged-9c194df0-4c84-41bd-94af-7e4ecd312dd5 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 07:30:01 compute-0 nova_compute[187185]: 2025-11-29 07:30:01.365 187189 DEBUG nova.compute.manager [req-28896196-c20b-4c15-867d-b497f3490d6b req-78c275f9-1949-41f0-9bdb-b68d74f8719a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Received event network-vif-unplugged-836d3fdf-e98b-4a41-864f-9e3fbdb29394 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:30:01 compute-0 nova_compute[187185]: 2025-11-29 07:30:01.365 187189 DEBUG oslo_concurrency.lockutils [req-28896196-c20b-4c15-867d-b497f3490d6b req-78c275f9-1949-41f0-9bdb-b68d74f8719a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2cb4e847-114f-440a-b231-65e3fff0f0d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:30:01 compute-0 nova_compute[187185]: 2025-11-29 07:30:01.366 187189 DEBUG oslo_concurrency.lockutils [req-28896196-c20b-4c15-867d-b497f3490d6b req-78c275f9-1949-41f0-9bdb-b68d74f8719a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2cb4e847-114f-440a-b231-65e3fff0f0d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:30:01 compute-0 nova_compute[187185]: 2025-11-29 07:30:01.366 187189 DEBUG oslo_concurrency.lockutils [req-28896196-c20b-4c15-867d-b497f3490d6b req-78c275f9-1949-41f0-9bdb-b68d74f8719a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2cb4e847-114f-440a-b231-65e3fff0f0d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:30:01 compute-0 nova_compute[187185]: 2025-11-29 07:30:01.366 187189 DEBUG nova.compute.manager [req-28896196-c20b-4c15-867d-b497f3490d6b req-78c275f9-1949-41f0-9bdb-b68d74f8719a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] No waiting events found dispatching network-vif-unplugged-836d3fdf-e98b-4a41-864f-9e3fbdb29394 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:30:01 compute-0 nova_compute[187185]: 2025-11-29 07:30:01.366 187189 DEBUG nova.compute.manager [req-28896196-c20b-4c15-867d-b497f3490d6b req-78c275f9-1949-41f0-9bdb-b68d74f8719a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Received event network-vif-unplugged-836d3fdf-e98b-4a41-864f-9e3fbdb29394 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 07:30:01 compute-0 nova_compute[187185]: 2025-11-29 07:30:01.611 187189 DEBUG nova.network.neutron [-] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:30:01 compute-0 nova_compute[187185]: 2025-11-29 07:30:01.630 187189 INFO nova.compute.manager [-] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Took 1.36 seconds to deallocate network for instance.
Nov 29 07:30:01 compute-0 nova_compute[187185]: 2025-11-29 07:30:01.703 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:30:01 compute-0 nova_compute[187185]: 2025-11-29 07:30:01.706 187189 DEBUG oslo_concurrency.lockutils [None req-7f2bc6b6-81f0-4e08-a67d-08c09fc33155 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:30:01 compute-0 nova_compute[187185]: 2025-11-29 07:30:01.706 187189 DEBUG oslo_concurrency.lockutils [None req-7f2bc6b6-81f0-4e08-a67d-08c09fc33155 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:30:01 compute-0 nova_compute[187185]: 2025-11-29 07:30:01.790 187189 DEBUG nova.compute.provider_tree [None req-7f2bc6b6-81f0-4e08-a67d-08c09fc33155 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:30:01 compute-0 nova_compute[187185]: 2025-11-29 07:30:01.805 187189 DEBUG nova.scheduler.client.report [None req-7f2bc6b6-81f0-4e08-a67d-08c09fc33155 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:30:01 compute-0 nova_compute[187185]: 2025-11-29 07:30:01.845 187189 DEBUG oslo_concurrency.lockutils [None req-7f2bc6b6-81f0-4e08-a67d-08c09fc33155 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:30:01 compute-0 nova_compute[187185]: 2025-11-29 07:30:01.867 187189 INFO nova.scheduler.client.report [None req-7f2bc6b6-81f0-4e08-a67d-08c09fc33155 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Deleted allocations for instance 2cb4e847-114f-440a-b231-65e3fff0f0d2
Nov 29 07:30:01 compute-0 nova_compute[187185]: 2025-11-29 07:30:01.975 187189 DEBUG oslo_concurrency.lockutils [None req-7f2bc6b6-81f0-4e08-a67d-08c09fc33155 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2cb4e847-114f-440a-b231-65e3fff0f0d2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.334s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:30:02 compute-0 nova_compute[187185]: 2025-11-29 07:30:02.048 187189 DEBUG nova.network.neutron [req-266aff83-8fbb-474b-8606-4f8b0df6404f req-c5f665d8-c428-4fd6-9b37-329979969a18 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Updated VIF entry in instance network info cache for port 836d3fdf-e98b-4a41-864f-9e3fbdb29394. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:30:02 compute-0 nova_compute[187185]: 2025-11-29 07:30:02.049 187189 DEBUG nova.network.neutron [req-266aff83-8fbb-474b-8606-4f8b0df6404f req-c5f665d8-c428-4fd6-9b37-329979969a18 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Updating instance_info_cache with network_info: [{"id": "836d3fdf-e98b-4a41-864f-9e3fbdb29394", "address": "fa:16:3e:d5:a1:ff", "network": {"id": "94472368-b72a-4e5d-ac59-40b24b7ba792", "bridge": "br-int", "label": "tempest-network-smoke--1761418689", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap836d3fdf-e9", "ovs_interfaceid": "836d3fdf-e98b-4a41-864f-9e3fbdb29394", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "9c194df0-4c84-41bd-94af-7e4ecd312dd5", "address": "fa:16:3e:da:a3:a5", "network": {"id": "ff387e90-45c2-42d7-b536-fee4d2b6eb5e", "bridge": "br-int", "label": "tempest-network-smoke--1344723239", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feda:a3a5", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], 
"gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feda:a3a5", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c194df0-4c", "ovs_interfaceid": "9c194df0-4c84-41bd-94af-7e4ecd312dd5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:30:02 compute-0 nova_compute[187185]: 2025-11-29 07:30:02.072 187189 DEBUG oslo_concurrency.lockutils [req-266aff83-8fbb-474b-8606-4f8b0df6404f req-c5f665d8-c428-4fd6-9b37-329979969a18 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-2cb4e847-114f-440a-b231-65e3fff0f0d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:30:03 compute-0 nova_compute[187185]: 2025-11-29 07:30:03.408 187189 DEBUG nova.compute.manager [req-931dc20e-ef43-4684-8371-235cabee78d7 req-d3418131-f04f-4b54-b15c-2d421eae89db 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Received event network-vif-plugged-9c194df0-4c84-41bd-94af-7e4ecd312dd5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:30:03 compute-0 nova_compute[187185]: 2025-11-29 07:30:03.409 187189 DEBUG oslo_concurrency.lockutils [req-931dc20e-ef43-4684-8371-235cabee78d7 req-d3418131-f04f-4b54-b15c-2d421eae89db 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2cb4e847-114f-440a-b231-65e3fff0f0d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:30:03 compute-0 nova_compute[187185]: 2025-11-29 07:30:03.409 187189 DEBUG oslo_concurrency.lockutils [req-931dc20e-ef43-4684-8371-235cabee78d7 req-d3418131-f04f-4b54-b15c-2d421eae89db 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2cb4e847-114f-440a-b231-65e3fff0f0d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:30:03 compute-0 nova_compute[187185]: 2025-11-29 07:30:03.409 187189 DEBUG oslo_concurrency.lockutils [req-931dc20e-ef43-4684-8371-235cabee78d7 req-d3418131-f04f-4b54-b15c-2d421eae89db 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2cb4e847-114f-440a-b231-65e3fff0f0d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:30:03 compute-0 nova_compute[187185]: 2025-11-29 07:30:03.410 187189 DEBUG nova.compute.manager [req-931dc20e-ef43-4684-8371-235cabee78d7 req-d3418131-f04f-4b54-b15c-2d421eae89db 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] No waiting events found dispatching network-vif-plugged-9c194df0-4c84-41bd-94af-7e4ecd312dd5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:30:03 compute-0 nova_compute[187185]: 2025-11-29 07:30:03.410 187189 WARNING nova.compute.manager [req-931dc20e-ef43-4684-8371-235cabee78d7 req-d3418131-f04f-4b54-b15c-2d421eae89db 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Received unexpected event network-vif-plugged-9c194df0-4c84-41bd-94af-7e4ecd312dd5 for instance with vm_state deleted and task_state None.
Nov 29 07:30:03 compute-0 nova_compute[187185]: 2025-11-29 07:30:03.410 187189 DEBUG nova.compute.manager [req-931dc20e-ef43-4684-8371-235cabee78d7 req-d3418131-f04f-4b54-b15c-2d421eae89db 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Received event network-vif-deleted-9c194df0-4c84-41bd-94af-7e4ecd312dd5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:30:03 compute-0 nova_compute[187185]: 2025-11-29 07:30:03.410 187189 DEBUG nova.compute.manager [req-931dc20e-ef43-4684-8371-235cabee78d7 req-d3418131-f04f-4b54-b15c-2d421eae89db 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Received event network-vif-deleted-836d3fdf-e98b-4a41-864f-9e3fbdb29394 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:30:03 compute-0 nova_compute[187185]: 2025-11-29 07:30:03.676 187189 DEBUG nova.compute.manager [req-5290c6dc-09d0-49d5-bbfe-8eb26638fee1 req-585908a3-a9cd-48be-8145-bb2b19914745 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Received event network-vif-plugged-836d3fdf-e98b-4a41-864f-9e3fbdb29394 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:30:03 compute-0 nova_compute[187185]: 2025-11-29 07:30:03.677 187189 DEBUG oslo_concurrency.lockutils [req-5290c6dc-09d0-49d5-bbfe-8eb26638fee1 req-585908a3-a9cd-48be-8145-bb2b19914745 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2cb4e847-114f-440a-b231-65e3fff0f0d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:30:03 compute-0 nova_compute[187185]: 2025-11-29 07:30:03.677 187189 DEBUG oslo_concurrency.lockutils [req-5290c6dc-09d0-49d5-bbfe-8eb26638fee1 req-585908a3-a9cd-48be-8145-bb2b19914745 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2cb4e847-114f-440a-b231-65e3fff0f0d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:30:03 compute-0 nova_compute[187185]: 2025-11-29 07:30:03.678 187189 DEBUG oslo_concurrency.lockutils [req-5290c6dc-09d0-49d5-bbfe-8eb26638fee1 req-585908a3-a9cd-48be-8145-bb2b19914745 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2cb4e847-114f-440a-b231-65e3fff0f0d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:30:03 compute-0 nova_compute[187185]: 2025-11-29 07:30:03.678 187189 DEBUG nova.compute.manager [req-5290c6dc-09d0-49d5-bbfe-8eb26638fee1 req-585908a3-a9cd-48be-8145-bb2b19914745 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] No waiting events found dispatching network-vif-plugged-836d3fdf-e98b-4a41-864f-9e3fbdb29394 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:30:03 compute-0 nova_compute[187185]: 2025-11-29 07:30:03.679 187189 WARNING nova.compute.manager [req-5290c6dc-09d0-49d5-bbfe-8eb26638fee1 req-585908a3-a9cd-48be-8145-bb2b19914745 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Received unexpected event network-vif-plugged-836d3fdf-e98b-4a41-864f-9e3fbdb29394 for instance with vm_state deleted and task_state None.
Nov 29 07:30:05 compute-0 nova_compute[187185]: 2025-11-29 07:30:05.043 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:30:06 compute-0 nova_compute[187185]: 2025-11-29 07:30:06.673 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:30:06 compute-0 nova_compute[187185]: 2025-11-29 07:30:06.704 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:30:06 compute-0 podman[238049]: 2025-11-29 07:30:06.911358727 +0000 UTC m=+0.166762094 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 29 07:30:09 compute-0 nova_compute[187185]: 2025-11-29 07:30:09.261 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764401394.259709, ae465294-b6bd-4481-9fc1-daf5f8ec28a1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:30:09 compute-0 nova_compute[187185]: 2025-11-29 07:30:09.262 187189 INFO nova.compute.manager [-] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] VM Stopped (Lifecycle Event)
Nov 29 07:30:09 compute-0 nova_compute[187185]: 2025-11-29 07:30:09.281 187189 DEBUG nova.compute.manager [None req-3d42d1af-bd85-47c4-b2a6-6f7c8988f743 - - - - - -] [instance: ae465294-b6bd-4481-9fc1-daf5f8ec28a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:30:10 compute-0 nova_compute[187185]: 2025-11-29 07:30:10.045 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:30:11 compute-0 nova_compute[187185]: 2025-11-29 07:30:11.706 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:30:13 compute-0 nova_compute[187185]: 2025-11-29 07:30:13.688 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:30:13 compute-0 nova_compute[187185]: 2025-11-29 07:30:13.911 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:30:13 compute-0 podman[238075]: 2025-11-29 07:30:13.993629523 +0000 UTC m=+0.073883610 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 07:30:14 compute-0 nova_compute[187185]: 2025-11-29 07:30:14.960 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764401399.9597566, 2cb4e847-114f-440a-b231-65e3fff0f0d2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:30:14 compute-0 nova_compute[187185]: 2025-11-29 07:30:14.961 187189 INFO nova.compute.manager [-] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] VM Stopped (Lifecycle Event)
Nov 29 07:30:14 compute-0 nova_compute[187185]: 2025-11-29 07:30:14.979 187189 DEBUG nova.compute.manager [None req-4849f452-2570-464c-9787-e472ec5e5e17 - - - - - -] [instance: 2cb4e847-114f-440a-b231-65e3fff0f0d2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:30:15 compute-0 nova_compute[187185]: 2025-11-29 07:30:15.048 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:30:16 compute-0 nova_compute[187185]: 2025-11-29 07:30:16.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:30:16 compute-0 nova_compute[187185]: 2025-11-29 07:30:16.708 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:30:17 compute-0 nova_compute[187185]: 2025-11-29 07:30:17.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:30:17 compute-0 nova_compute[187185]: 2025-11-29 07:30:17.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:30:17 compute-0 nova_compute[187185]: 2025-11-29 07:30:17.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:30:17 compute-0 nova_compute[187185]: 2025-11-29 07:30:17.335 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 07:30:17 compute-0 sshd-session[238100]: Invalid user mailuser from 190.181.27.27 port 36338
Nov 29 07:30:17 compute-0 sshd-session[238100]: Received disconnect from 190.181.27.27 port 36338:11: Bye Bye [preauth]
Nov 29 07:30:17 compute-0 sshd-session[238100]: Disconnected from invalid user mailuser 190.181.27.27 port 36338 [preauth]
Nov 29 07:30:18 compute-0 podman[238102]: 2025-11-29 07:30:18.799042297 +0000 UTC m=+0.065338945 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:30:18 compute-0 podman[238103]: 2025-11-29 07:30:18.834326829 +0000 UTC m=+0.088862819 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:30:19 compute-0 nova_compute[187185]: 2025-11-29 07:30:19.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:30:20 compute-0 nova_compute[187185]: 2025-11-29 07:30:20.049 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:30:20 compute-0 nova_compute[187185]: 2025-11-29 07:30:20.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:30:20 compute-0 nova_compute[187185]: 2025-11-29 07:30:20.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:30:20 compute-0 nova_compute[187185]: 2025-11-29 07:30:20.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:30:20 compute-0 nova_compute[187185]: 2025-11-29 07:30:20.318 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:30:20 compute-0 nova_compute[187185]: 2025-11-29 07:30:20.401 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:30:20 compute-0 nova_compute[187185]: 2025-11-29 07:30:20.401 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:30:20 compute-0 nova_compute[187185]: 2025-11-29 07:30:20.401 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:30:20 compute-0 nova_compute[187185]: 2025-11-29 07:30:20.402 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:30:20 compute-0 nova_compute[187185]: 2025-11-29 07:30:20.609 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:30:20 compute-0 nova_compute[187185]: 2025-11-29 07:30:20.611 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5718MB free_disk=73.24444961547852GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:30:20 compute-0 nova_compute[187185]: 2025-11-29 07:30:20.611 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:30:20 compute-0 nova_compute[187185]: 2025-11-29 07:30:20.612 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:30:20 compute-0 nova_compute[187185]: 2025-11-29 07:30:20.899 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:30:20 compute-0 nova_compute[187185]: 2025-11-29 07:30:20.900 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:30:20 compute-0 nova_compute[187185]: 2025-11-29 07:30:20.926 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:30:20 compute-0 nova_compute[187185]: 2025-11-29 07:30:20.980 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:30:21 compute-0 nova_compute[187185]: 2025-11-29 07:30:21.010 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:30:21 compute-0 nova_compute[187185]: 2025-11-29 07:30:21.011 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.399s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:30:21 compute-0 nova_compute[187185]: 2025-11-29 07:30:21.710 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:30:22 compute-0 nova_compute[187185]: 2025-11-29 07:30:22.012 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:30:22 compute-0 nova_compute[187185]: 2025-11-29 07:30:22.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:30:25 compute-0 nova_compute[187185]: 2025-11-29 07:30:25.051 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:30:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:30:25.520 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:30:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:30:25.521 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:30:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:30:25.521 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:30:26 compute-0 nova_compute[187185]: 2025-11-29 07:30:26.311 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:30:26 compute-0 nova_compute[187185]: 2025-11-29 07:30:26.712 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:30:28 compute-0 podman[238146]: 2025-11-29 07:30:28.804984237 +0000 UTC m=+0.062701589 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 29 07:30:28 compute-0 podman[238147]: 2025-11-29 07:30:28.809689602 +0000 UTC m=+0.059464887 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, managed_by=edpm_ansible, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 29 07:30:28 compute-0 podman[238148]: 2025-11-29 07:30:28.82079252 +0000 UTC m=+0.060814585 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 07:30:30 compute-0 nova_compute[187185]: 2025-11-29 07:30:30.053 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:30:31 compute-0 nova_compute[187185]: 2025-11-29 07:30:31.715 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:30:35 compute-0 nova_compute[187185]: 2025-11-29 07:30:35.055 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:30:36 compute-0 nova_compute[187185]: 2025-11-29 07:30:36.717 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:30:37 compute-0 podman[238210]: 2025-11-29 07:30:37.840620859 +0000 UTC m=+0.105089755 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 07:30:40 compute-0 nova_compute[187185]: 2025-11-29 07:30:40.058 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:30:41 compute-0 nova_compute[187185]: 2025-11-29 07:30:41.719 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:30:42 compute-0 nova_compute[187185]: 2025-11-29 07:30:42.024 187189 DEBUG oslo_concurrency.lockutils [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Acquiring lock "985dc7b7-0644-4c5a-8218-9f925ac9e6ec" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:30:42 compute-0 nova_compute[187185]: 2025-11-29 07:30:42.024 187189 DEBUG oslo_concurrency.lockutils [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Lock "985dc7b7-0644-4c5a-8218-9f925ac9e6ec" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:30:42 compute-0 nova_compute[187185]: 2025-11-29 07:30:42.139 187189 DEBUG nova.compute.manager [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 07:30:42 compute-0 nova_compute[187185]: 2025-11-29 07:30:42.561 187189 DEBUG oslo_concurrency.lockutils [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:30:42 compute-0 nova_compute[187185]: 2025-11-29 07:30:42.562 187189 DEBUG oslo_concurrency.lockutils [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:30:42 compute-0 nova_compute[187185]: 2025-11-29 07:30:42.570 187189 DEBUG nova.virt.hardware [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 07:30:42 compute-0 nova_compute[187185]: 2025-11-29 07:30:42.571 187189 INFO nova.compute.claims [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Claim successful on node compute-0.ctlplane.example.com
Nov 29 07:30:42 compute-0 nova_compute[187185]: 2025-11-29 07:30:42.856 187189 DEBUG nova.compute.provider_tree [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:30:43 compute-0 nova_compute[187185]: 2025-11-29 07:30:43.059 187189 DEBUG nova.scheduler.client.report [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:30:43 compute-0 nova_compute[187185]: 2025-11-29 07:30:43.085 187189 DEBUG oslo_concurrency.lockutils [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.522s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:30:43 compute-0 nova_compute[187185]: 2025-11-29 07:30:43.086 187189 DEBUG nova.compute.manager [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 07:30:44 compute-0 nova_compute[187185]: 2025-11-29 07:30:44.420 187189 DEBUG nova.compute.manager [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 07:30:44 compute-0 nova_compute[187185]: 2025-11-29 07:30:44.420 187189 DEBUG nova.network.neutron [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 07:30:44 compute-0 nova_compute[187185]: 2025-11-29 07:30:44.714 187189 DEBUG nova.policy [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '97d49ad735124e92ba228df4a6eba8b4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e5d0aef61d814c0ca5b9ed1fabe86010', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 07:30:44 compute-0 podman[238237]: 2025-11-29 07:30:44.788290416 +0000 UTC m=+0.048573895 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 07:30:44 compute-0 nova_compute[187185]: 2025-11-29 07:30:44.829 187189 INFO nova.virt.libvirt.driver [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 07:30:45 compute-0 nova_compute[187185]: 2025-11-29 07:30:45.051 187189 DEBUG nova.compute.manager [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 07:30:45 compute-0 nova_compute[187185]: 2025-11-29 07:30:45.060 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:30:45 compute-0 nova_compute[187185]: 2025-11-29 07:30:45.310 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:30:46 compute-0 nova_compute[187185]: 2025-11-29 07:30:46.386 187189 DEBUG nova.compute.manager [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 07:30:46 compute-0 nova_compute[187185]: 2025-11-29 07:30:46.389 187189 DEBUG nova.virt.libvirt.driver [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 07:30:46 compute-0 nova_compute[187185]: 2025-11-29 07:30:46.389 187189 INFO nova.virt.libvirt.driver [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Creating image(s)
Nov 29 07:30:46 compute-0 nova_compute[187185]: 2025-11-29 07:30:46.391 187189 DEBUG oslo_concurrency.lockutils [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Acquiring lock "/var/lib/nova/instances/985dc7b7-0644-4c5a-8218-9f925ac9e6ec/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:30:46 compute-0 nova_compute[187185]: 2025-11-29 07:30:46.391 187189 DEBUG oslo_concurrency.lockutils [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Lock "/var/lib/nova/instances/985dc7b7-0644-4c5a-8218-9f925ac9e6ec/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:30:46 compute-0 nova_compute[187185]: 2025-11-29 07:30:46.392 187189 DEBUG oslo_concurrency.lockutils [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Lock "/var/lib/nova/instances/985dc7b7-0644-4c5a-8218-9f925ac9e6ec/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:30:46 compute-0 nova_compute[187185]: 2025-11-29 07:30:46.417 187189 DEBUG oslo_concurrency.processutils [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:30:46 compute-0 nova_compute[187185]: 2025-11-29 07:30:46.482 187189 DEBUG oslo_concurrency.processutils [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:30:46 compute-0 nova_compute[187185]: 2025-11-29 07:30:46.484 187189 DEBUG oslo_concurrency.lockutils [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:30:46 compute-0 nova_compute[187185]: 2025-11-29 07:30:46.485 187189 DEBUG oslo_concurrency.lockutils [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:30:46 compute-0 nova_compute[187185]: 2025-11-29 07:30:46.509 187189 DEBUG oslo_concurrency.processutils [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:30:46 compute-0 nova_compute[187185]: 2025-11-29 07:30:46.580 187189 DEBUG oslo_concurrency.processutils [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:30:46 compute-0 nova_compute[187185]: 2025-11-29 07:30:46.582 187189 DEBUG oslo_concurrency.processutils [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/985dc7b7-0644-4c5a-8218-9f925ac9e6ec/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:30:46 compute-0 nova_compute[187185]: 2025-11-29 07:30:46.638 187189 DEBUG oslo_concurrency.processutils [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/985dc7b7-0644-4c5a-8218-9f925ac9e6ec/disk 1073741824" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:30:46 compute-0 nova_compute[187185]: 2025-11-29 07:30:46.640 187189 DEBUG oslo_concurrency.lockutils [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:30:46 compute-0 nova_compute[187185]: 2025-11-29 07:30:46.641 187189 DEBUG oslo_concurrency.processutils [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:30:46 compute-0 nova_compute[187185]: 2025-11-29 07:30:46.721 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:30:46 compute-0 nova_compute[187185]: 2025-11-29 07:30:46.731 187189 DEBUG oslo_concurrency.processutils [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:30:46 compute-0 nova_compute[187185]: 2025-11-29 07:30:46.732 187189 DEBUG nova.virt.disk.api [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Checking if we can resize image /var/lib/nova/instances/985dc7b7-0644-4c5a-8218-9f925ac9e6ec/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:30:46 compute-0 nova_compute[187185]: 2025-11-29 07:30:46.733 187189 DEBUG oslo_concurrency.processutils [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/985dc7b7-0644-4c5a-8218-9f925ac9e6ec/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:30:46 compute-0 nova_compute[187185]: 2025-11-29 07:30:46.823 187189 DEBUG oslo_concurrency.processutils [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/985dc7b7-0644-4c5a-8218-9f925ac9e6ec/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:30:46 compute-0 nova_compute[187185]: 2025-11-29 07:30:46.824 187189 DEBUG nova.virt.disk.api [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Cannot resize image /var/lib/nova/instances/985dc7b7-0644-4c5a-8218-9f925ac9e6ec/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:30:46 compute-0 nova_compute[187185]: 2025-11-29 07:30:46.825 187189 DEBUG nova.objects.instance [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Lazy-loading 'migration_context' on Instance uuid 985dc7b7-0644-4c5a-8218-9f925ac9e6ec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:30:46 compute-0 nova_compute[187185]: 2025-11-29 07:30:46.855 187189 DEBUG nova.virt.libvirt.driver [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 07:30:46 compute-0 nova_compute[187185]: 2025-11-29 07:30:46.857 187189 DEBUG nova.virt.libvirt.driver [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Ensure instance console log exists: /var/lib/nova/instances/985dc7b7-0644-4c5a-8218-9f925ac9e6ec/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 07:30:46 compute-0 nova_compute[187185]: 2025-11-29 07:30:46.858 187189 DEBUG oslo_concurrency.lockutils [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:30:46 compute-0 nova_compute[187185]: 2025-11-29 07:30:46.859 187189 DEBUG oslo_concurrency.lockutils [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:30:46 compute-0 nova_compute[187185]: 2025-11-29 07:30:46.859 187189 DEBUG oslo_concurrency.lockutils [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:30:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:30:48.008 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:30:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:30:48.011 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:30:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:30:48.012 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:30:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:30:48.012 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:30:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:30:48.012 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:30:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:30:48.012 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:30:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:30:48.012 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:30:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:30:48.012 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:30:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:30:48.012 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:30:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:30:48.012 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:30:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:30:48.013 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:30:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:30:48.013 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:30:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:30:48.013 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:30:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:30:48.013 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:30:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:30:48.013 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:30:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:30:48.013 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:30:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:30:48.013 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:30:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:30:48.013 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:30:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:30:48.013 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:30:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:30:48.013 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:30:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:30:48.013 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:30:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:30:48.013 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:30:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:30:48.014 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:30:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:30:48.014 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:30:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:30:48.014 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:30:49 compute-0 nova_compute[187185]: 2025-11-29 07:30:49.175 187189 DEBUG nova.network.neutron [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Successfully created port: cb5f49f5-de68-4f15-a036-06a0f45556de _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 07:30:49 compute-0 podman[238274]: 2025-11-29 07:30:49.788844246 +0000 UTC m=+0.058421267 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251125, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:30:49 compute-0 podman[238275]: 2025-11-29 07:30:49.804752752 +0000 UTC m=+0.066980222 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3)
Nov 29 07:30:50 compute-0 nova_compute[187185]: 2025-11-29 07:30:50.062 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:30:51 compute-0 nova_compute[187185]: 2025-11-29 07:30:51.723 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:30:55 compute-0 nova_compute[187185]: 2025-11-29 07:30:55.064 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:30:56 compute-0 nova_compute[187185]: 2025-11-29 07:30:56.727 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:30:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:30:56.827 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=47, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=46) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:30:56 compute-0 nova_compute[187185]: 2025-11-29 07:30:56.829 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:30:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:30:56.831 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:30:59 compute-0 podman[238313]: 2025-11-29 07:30:59.780517958 +0000 UTC m=+0.052014723 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:30:59 compute-0 podman[238314]: 2025-11-29 07:30:59.792769659 +0000 UTC m=+0.059228729 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, release=1755695350, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., config_id=edpm, container_name=openstack_network_exporter, version=9.6, 
com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 29 07:30:59 compute-0 podman[238315]: 2025-11-29 07:30:59.798797052 +0000 UTC m=+0.063175723 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 07:31:00 compute-0 nova_compute[187185]: 2025-11-29 07:31:00.065 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:31:01 compute-0 nova_compute[187185]: 2025-11-29 07:31:01.729 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:31:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:31:01.834 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '47'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:31:05 compute-0 nova_compute[187185]: 2025-11-29 07:31:05.067 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:31:06 compute-0 nova_compute[187185]: 2025-11-29 07:31:06.731 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:31:08 compute-0 podman[238373]: 2025-11-29 07:31:08.881253506 +0000 UTC m=+0.138248046 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:31:10 compute-0 nova_compute[187185]: 2025-11-29 07:31:10.076 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:31:11 compute-0 nova_compute[187185]: 2025-11-29 07:31:11.736 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:31:14 compute-0 nova_compute[187185]: 2025-11-29 07:31:14.101 187189 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 0.01 sec
Nov 29 07:31:15 compute-0 nova_compute[187185]: 2025-11-29 07:31:15.079 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:31:15 compute-0 podman[238400]: 2025-11-29 07:31:15.796798353 +0000 UTC m=+0.059007764 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 07:31:16 compute-0 nova_compute[187185]: 2025-11-29 07:31:16.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:31:16 compute-0 nova_compute[187185]: 2025-11-29 07:31:16.737 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:31:17 compute-0 nova_compute[187185]: 2025-11-29 07:31:17.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:31:17 compute-0 nova_compute[187185]: 2025-11-29 07:31:17.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:31:17 compute-0 nova_compute[187185]: 2025-11-29 07:31:17.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:31:18 compute-0 nova_compute[187185]: 2025-11-29 07:31:18.011 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Nov 29 07:31:18 compute-0 nova_compute[187185]: 2025-11-29 07:31:18.012 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 07:31:18 compute-0 ovn_controller[95281]: 2025-11-29T07:31:18Z|00459|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Nov 29 07:31:20 compute-0 nova_compute[187185]: 2025-11-29 07:31:20.117 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:31:20 compute-0 podman[238426]: 2025-11-29 07:31:20.81811106 +0000 UTC m=+0.079003877 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=edpm, 
container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 07:31:20 compute-0 podman[238425]: 2025-11-29 07:31:20.837545407 +0000 UTC m=+0.097565909 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:31:21 compute-0 nova_compute[187185]: 2025-11-29 07:31:21.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:31:21 compute-0 nova_compute[187185]: 2025-11-29 07:31:21.319 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:31:21 compute-0 nova_compute[187185]: 2025-11-29 07:31:21.740 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:31:22 compute-0 nova_compute[187185]: 2025-11-29 07:31:22.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:31:22 compute-0 nova_compute[187185]: 2025-11-29 07:31:22.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:31:22 compute-0 nova_compute[187185]: 2025-11-29 07:31:22.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:31:22 compute-0 nova_compute[187185]: 2025-11-29 07:31:22.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:31:22 compute-0 nova_compute[187185]: 2025-11-29 07:31:22.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:31:24 compute-0 nova_compute[187185]: 2025-11-29 07:31:24.832 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:31:24 compute-0 nova_compute[187185]: 2025-11-29 07:31:24.833 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:31:24 compute-0 nova_compute[187185]: 2025-11-29 07:31:24.833 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:31:24 compute-0 nova_compute[187185]: 2025-11-29 07:31:24.834 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:31:25 compute-0 nova_compute[187185]: 2025-11-29 07:31:25.118 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:31:25 compute-0 nova_compute[187185]: 2025-11-29 07:31:25.126 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:31:25 compute-0 nova_compute[187185]: 2025-11-29 07:31:25.127 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5715MB free_disk=73.24440002441406GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:31:25 compute-0 nova_compute[187185]: 2025-11-29 07:31:25.127 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:31:25 compute-0 nova_compute[187185]: 2025-11-29 07:31:25.128 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:31:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:31:25.520 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:31:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:31:25.521 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:31:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:31:25.521 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:31:26 compute-0 nova_compute[187185]: 2025-11-29 07:31:26.742 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:31:30 compute-0 nova_compute[187185]: 2025-11-29 07:31:30.121 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:31:30 compute-0 podman[238464]: 2025-11-29 07:31:30.252384175 +0000 UTC m=+0.087821780 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 07:31:30 compute-0 podman[238466]: 2025-11-29 07:31:30.267854509 +0000 UTC m=+0.100920806 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 07:31:30 compute-0 podman[238465]: 2025-11-29 07:31:30.269264129 +0000 UTC m=+0.100087071 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=edpm_ansible, architecture=x86_64, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, config_id=edpm, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter)
Nov 29 07:31:31 compute-0 nova_compute[187185]: 2025-11-29 07:31:31.744 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:31:32 compute-0 nova_compute[187185]: 2025-11-29 07:31:32.064 187189 DEBUG nova.network.neutron [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Successfully updated port: cb5f49f5-de68-4f15-a036-06a0f45556de _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 07:31:35 compute-0 nova_compute[187185]: 2025-11-29 07:31:35.124 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:31:36 compute-0 nova_compute[187185]: 2025-11-29 07:31:36.746 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:31:37 compute-0 nova_compute[187185]: 2025-11-29 07:31:37.834 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance 985dc7b7-0644-4c5a-8218-9f925ac9e6ec actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 07:31:37 compute-0 nova_compute[187185]: 2025-11-29 07:31:37.834 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:31:37 compute-0 nova_compute[187185]: 2025-11-29 07:31:37.835 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:31:37 compute-0 nova_compute[187185]: 2025-11-29 07:31:37.908 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:31:39 compute-0 podman[238525]: 2025-11-29 07:31:39.84845135 +0000 UTC m=+0.108885983 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 29 07:31:40 compute-0 nova_compute[187185]: 2025-11-29 07:31:40.127 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:31:41 compute-0 nova_compute[187185]: 2025-11-29 07:31:41.749 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:31:44 compute-0 nova_compute[187185]: 2025-11-29 07:31:44.008 187189 DEBUG nova.compute.manager [req-fc0513af-25ab-45db-b157-7a55c1157315 req-d584b924-b49a-4874-ae82-78e8d4828788 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Received event network-changed-cb5f49f5-de68-4f15-a036-06a0f45556de external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:31:44 compute-0 nova_compute[187185]: 2025-11-29 07:31:44.009 187189 DEBUG nova.compute.manager [req-fc0513af-25ab-45db-b157-7a55c1157315 req-d584b924-b49a-4874-ae82-78e8d4828788 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Refreshing instance network info cache due to event network-changed-cb5f49f5-de68-4f15-a036-06a0f45556de. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:31:44 compute-0 nova_compute[187185]: 2025-11-29 07:31:44.009 187189 DEBUG oslo_concurrency.lockutils [req-fc0513af-25ab-45db-b157-7a55c1157315 req-d584b924-b49a-4874-ae82-78e8d4828788 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-985dc7b7-0644-4c5a-8218-9f925ac9e6ec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:31:44 compute-0 nova_compute[187185]: 2025-11-29 07:31:44.009 187189 DEBUG oslo_concurrency.lockutils [req-fc0513af-25ab-45db-b157-7a55c1157315 req-d584b924-b49a-4874-ae82-78e8d4828788 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-985dc7b7-0644-4c5a-8218-9f925ac9e6ec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:31:44 compute-0 nova_compute[187185]: 2025-11-29 07:31:44.009 187189 DEBUG nova.network.neutron [req-fc0513af-25ab-45db-b157-7a55c1157315 req-d584b924-b49a-4874-ae82-78e8d4828788 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Refreshing network info cache for port cb5f49f5-de68-4f15-a036-06a0f45556de _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:31:44 compute-0 nova_compute[187185]: 2025-11-29 07:31:44.158 187189 DEBUG oslo_concurrency.lockutils [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Acquiring lock "refresh_cache-985dc7b7-0644-4c5a-8218-9f925ac9e6ec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:31:45 compute-0 nova_compute[187185]: 2025-11-29 07:31:45.130 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:31:46 compute-0 nova_compute[187185]: 2025-11-29 07:31:46.751 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:31:46 compute-0 podman[238551]: 2025-11-29 07:31:46.809950633 +0000 UTC m=+0.076498475 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 07:31:50 compute-0 nova_compute[187185]: 2025-11-29 07:31:50.132 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:31:51 compute-0 nova_compute[187185]: 2025-11-29 07:31:51.754 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:31:51 compute-0 podman[238574]: 2025-11-29 07:31:51.821238322 +0000 UTC m=+0.073739936 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Nov 29 07:31:51 compute-0 podman[238575]: 2025-11-29 07:31:51.85708508 +0000 UTC m=+0.102960904 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, 
container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 29 07:31:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:31:54.834 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=48, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=47) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:31:54 compute-0 nova_compute[187185]: 2025-11-29 07:31:54.835 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:31:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:31:54.836 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:31:54 compute-0 nova_compute[187185]: 2025-11-29 07:31:54.880 187189 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 10.77 sec
Nov 29 07:31:54 compute-0 nova_compute[187185]: 2025-11-29 07:31:54.920 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:31:55 compute-0 nova_compute[187185]: 2025-11-29 07:31:55.134 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:31:55 compute-0 nova_compute[187185]: 2025-11-29 07:31:55.570 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:31:55 compute-0 nova_compute[187185]: 2025-11-29 07:31:55.570 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 30.443s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:31:55 compute-0 nova_compute[187185]: 2025-11-29 07:31:55.906 187189 DEBUG nova.network.neutron [req-fc0513af-25ab-45db-b157-7a55c1157315 req-d584b924-b49a-4874-ae82-78e8d4828788 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:31:56 compute-0 nova_compute[187185]: 2025-11-29 07:31:56.546 187189 DEBUG nova.network.neutron [req-fc0513af-25ab-45db-b157-7a55c1157315 req-d584b924-b49a-4874-ae82-78e8d4828788 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:31:56 compute-0 nova_compute[187185]: 2025-11-29 07:31:56.755 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:31:57 compute-0 nova_compute[187185]: 2025-11-29 07:31:57.281 187189 DEBUG oslo_concurrency.lockutils [req-fc0513af-25ab-45db-b157-7a55c1157315 req-d584b924-b49a-4874-ae82-78e8d4828788 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-985dc7b7-0644-4c5a-8218-9f925ac9e6ec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:31:57 compute-0 nova_compute[187185]: 2025-11-29 07:31:57.282 187189 DEBUG oslo_concurrency.lockutils [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Acquired lock "refresh_cache-985dc7b7-0644-4c5a-8218-9f925ac9e6ec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:31:57 compute-0 nova_compute[187185]: 2025-11-29 07:31:57.283 187189 DEBUG nova.network.neutron [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:31:58 compute-0 nova_compute[187185]: 2025-11-29 07:31:58.065 187189 DEBUG nova.network.neutron [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:32:00 compute-0 nova_compute[187185]: 2025-11-29 07:32:00.136 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:32:00 compute-0 podman[238613]: 2025-11-29 07:32:00.805593794 +0000 UTC m=+0.060683182 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Nov 29 07:32:00 compute-0 podman[238615]: 2025-11-29 07:32:00.811671548 +0000 UTC m=+0.067644971 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 07:32:00 compute-0 podman[238614]: 2025-11-29 07:32:00.836385957 +0000 UTC m=+0.087976605 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, version=9.6, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_id=edpm, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1755695350)
Nov 29 07:32:01 compute-0 nova_compute[187185]: 2025-11-29 07:32:01.567 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:32:01 compute-0 nova_compute[187185]: 2025-11-29 07:32:01.757 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:32:03 compute-0 nova_compute[187185]: 2025-11-29 07:32:03.147 187189 DEBUG nova.network.neutron [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Updating instance_info_cache with network_info: [{"id": "cb5f49f5-de68-4f15-a036-06a0f45556de", "address": "fa:16:3e:e4:08:a3", "network": {"id": "27dbf3f9-5c08-43d4-9c88-c573d9704843", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-1864816291-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e5d0aef61d814c0ca5b9ed1fabe86010", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcb5f49f5-de", "ovs_interfaceid": "cb5f49f5-de68-4f15-a036-06a0f45556de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:32:04 compute-0 nova_compute[187185]: 2025-11-29 07:32:04.706 187189 DEBUG oslo_concurrency.lockutils [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Releasing lock "refresh_cache-985dc7b7-0644-4c5a-8218-9f925ac9e6ec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:32:04 compute-0 nova_compute[187185]: 2025-11-29 07:32:04.706 187189 DEBUG nova.compute.manager [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Instance network_info: |[{"id": "cb5f49f5-de68-4f15-a036-06a0f45556de", "address": "fa:16:3e:e4:08:a3", "network": {"id": "27dbf3f9-5c08-43d4-9c88-c573d9704843", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-1864816291-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e5d0aef61d814c0ca5b9ed1fabe86010", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcb5f49f5-de", "ovs_interfaceid": "cb5f49f5-de68-4f15-a036-06a0f45556de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 07:32:04 compute-0 nova_compute[187185]: 2025-11-29 07:32:04.711 187189 DEBUG nova.virt.libvirt.driver [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Start _get_guest_xml network_info=[{"id": "cb5f49f5-de68-4f15-a036-06a0f45556de", "address": "fa:16:3e:e4:08:a3", "network": {"id": "27dbf3f9-5c08-43d4-9c88-c573d9704843", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-1864816291-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e5d0aef61d814c0ca5b9ed1fabe86010", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcb5f49f5-de", "ovs_interfaceid": "cb5f49f5-de68-4f15-a036-06a0f45556de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:32:04 compute-0 nova_compute[187185]: 2025-11-29 07:32:04.717 187189 WARNING nova.virt.libvirt.driver [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:32:04 compute-0 nova_compute[187185]: 2025-11-29 07:32:04.722 187189 DEBUG nova.virt.libvirt.host [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:32:04 compute-0 nova_compute[187185]: 2025-11-29 07:32:04.723 187189 DEBUG nova.virt.libvirt.host [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:32:04 compute-0 nova_compute[187185]: 2025-11-29 07:32:04.728 187189 DEBUG nova.virt.libvirt.host [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:32:04 compute-0 nova_compute[187185]: 2025-11-29 07:32:04.729 187189 DEBUG nova.virt.libvirt.host [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:32:04 compute-0 nova_compute[187185]: 2025-11-29 07:32:04.731 187189 DEBUG nova.virt.libvirt.driver [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:32:04 compute-0 nova_compute[187185]: 2025-11-29 07:32:04.731 187189 DEBUG nova.virt.hardware [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:32:04 compute-0 nova_compute[187185]: 2025-11-29 07:32:04.732 187189 DEBUG nova.virt.hardware [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:32:04 compute-0 nova_compute[187185]: 2025-11-29 07:32:04.733 187189 DEBUG nova.virt.hardware [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:32:04 compute-0 nova_compute[187185]: 2025-11-29 07:32:04.733 187189 DEBUG nova.virt.hardware [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:32:04 compute-0 nova_compute[187185]: 2025-11-29 07:32:04.734 187189 DEBUG nova.virt.hardware [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:32:04 compute-0 nova_compute[187185]: 2025-11-29 07:32:04.734 187189 DEBUG nova.virt.hardware [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:32:04 compute-0 nova_compute[187185]: 2025-11-29 07:32:04.735 187189 DEBUG nova.virt.hardware [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:32:04 compute-0 nova_compute[187185]: 2025-11-29 07:32:04.735 187189 DEBUG nova.virt.hardware [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:32:04 compute-0 nova_compute[187185]: 2025-11-29 07:32:04.736 187189 DEBUG nova.virt.hardware [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:32:04 compute-0 nova_compute[187185]: 2025-11-29 07:32:04.736 187189 DEBUG nova.virt.hardware [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:32:04 compute-0 nova_compute[187185]: 2025-11-29 07:32:04.737 187189 DEBUG nova.virt.hardware [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:32:04 compute-0 nova_compute[187185]: 2025-11-29 07:32:04.743 187189 DEBUG nova.virt.libvirt.vif [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:30:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestMultiTenantJSON-server-963727031',display_name='tempest-ServersNegativeTestMultiTenantJSON-server-963727031',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversnegativetestmultitenantjson-server-963727031',id=144,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e5d0aef61d814c0ca5b9ed1fabe86010',ramdisk_id='',reservation_id='r-kxwjyrip',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestMultiTenantJSON-1102700638',owner_user_name='tempest-ServersNegativeTestMultiTenantJSON-1102700638-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:30:45Z,user_data=None,user_id='97d49ad735124e92ba228df4a6eba8b4',uuid=985dc7b7-0644-4c5a-8218-9f925ac9e6ec,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cb5f49f5-de68-4f15-a036-06a0f45556de", "address": "fa:16:3e:e4:08:a3", "network": {"id": "27dbf3f9-5c08-43d4-9c88-c573d9704843", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-1864816291-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e5d0aef61d814c0ca5b9ed1fabe86010", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcb5f49f5-de", "ovs_interfaceid": "cb5f49f5-de68-4f15-a036-06a0f45556de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:32:04 compute-0 nova_compute[187185]: 2025-11-29 07:32:04.743 187189 DEBUG nova.network.os_vif_util [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Converting VIF {"id": "cb5f49f5-de68-4f15-a036-06a0f45556de", "address": "fa:16:3e:e4:08:a3", "network": {"id": "27dbf3f9-5c08-43d4-9c88-c573d9704843", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-1864816291-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e5d0aef61d814c0ca5b9ed1fabe86010", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcb5f49f5-de", "ovs_interfaceid": "cb5f49f5-de68-4f15-a036-06a0f45556de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:32:04 compute-0 nova_compute[187185]: 2025-11-29 07:32:04.745 187189 DEBUG nova.network.os_vif_util [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e4:08:a3,bridge_name='br-int',has_traffic_filtering=True,id=cb5f49f5-de68-4f15-a036-06a0f45556de,network=Network(27dbf3f9-5c08-43d4-9c88-c573d9704843),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcb5f49f5-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:32:04 compute-0 nova_compute[187185]: 2025-11-29 07:32:04.747 187189 DEBUG nova.objects.instance [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Lazy-loading 'pci_devices' on Instance uuid 985dc7b7-0644-4c5a-8218-9f925ac9e6ec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:32:04 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:04.839 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '48'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:32:04 compute-0 nova_compute[187185]: 2025-11-29 07:32:04.988 187189 DEBUG nova.virt.libvirt.driver [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:32:04 compute-0 nova_compute[187185]:   <uuid>985dc7b7-0644-4c5a-8218-9f925ac9e6ec</uuid>
Nov 29 07:32:04 compute-0 nova_compute[187185]:   <name>instance-00000090</name>
Nov 29 07:32:04 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 07:32:04 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:32:04 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:32:04 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:       <nova:name>tempest-ServersNegativeTestMultiTenantJSON-server-963727031</nova:name>
Nov 29 07:32:04 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:32:04</nova:creationTime>
Nov 29 07:32:04 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 07:32:04 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 07:32:04 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:32:04 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:32:04 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:32:04 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:32:04 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:32:04 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:32:04 compute-0 nova_compute[187185]:         <nova:user uuid="97d49ad735124e92ba228df4a6eba8b4">tempest-ServersNegativeTestMultiTenantJSON-1102700638-project-member</nova:user>
Nov 29 07:32:04 compute-0 nova_compute[187185]:         <nova:project uuid="e5d0aef61d814c0ca5b9ed1fabe86010">tempest-ServersNegativeTestMultiTenantJSON-1102700638</nova:project>
Nov 29 07:32:04 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:32:04 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 07:32:04 compute-0 nova_compute[187185]:         <nova:port uuid="cb5f49f5-de68-4f15-a036-06a0f45556de">
Nov 29 07:32:04 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:32:04 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:32:04 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:32:04 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <system>
Nov 29 07:32:04 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:32:04 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:32:04 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:32:04 compute-0 nova_compute[187185]:       <entry name="serial">985dc7b7-0644-4c5a-8218-9f925ac9e6ec</entry>
Nov 29 07:32:04 compute-0 nova_compute[187185]:       <entry name="uuid">985dc7b7-0644-4c5a-8218-9f925ac9e6ec</entry>
Nov 29 07:32:04 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     </system>
Nov 29 07:32:04 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:32:04 compute-0 nova_compute[187185]:   <os>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:   </os>
Nov 29 07:32:04 compute-0 nova_compute[187185]:   <features>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:   </features>
Nov 29 07:32:04 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:32:04 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:32:04 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:32:04 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/985dc7b7-0644-4c5a-8218-9f925ac9e6ec/disk"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:32:04 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/985dc7b7-0644-4c5a-8218-9f925ac9e6ec/disk.config"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:32:04 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:e4:08:a3"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:       <target dev="tapcb5f49f5-de"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:32:04 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/985dc7b7-0644-4c5a-8218-9f925ac9e6ec/console.log" append="off"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <video>
Nov 29 07:32:04 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     </video>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:32:04 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:32:04 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:32:04 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:32:04 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:32:04 compute-0 nova_compute[187185]: </domain>
Nov 29 07:32:04 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:32:04 compute-0 nova_compute[187185]: 2025-11-29 07:32:04.989 187189 DEBUG nova.compute.manager [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Preparing to wait for external event network-vif-plugged-cb5f49f5-de68-4f15-a036-06a0f45556de prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 07:32:04 compute-0 nova_compute[187185]: 2025-11-29 07:32:04.989 187189 DEBUG oslo_concurrency.lockutils [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Acquiring lock "985dc7b7-0644-4c5a-8218-9f925ac9e6ec-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:32:04 compute-0 nova_compute[187185]: 2025-11-29 07:32:04.990 187189 DEBUG oslo_concurrency.lockutils [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Lock "985dc7b7-0644-4c5a-8218-9f925ac9e6ec-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:32:04 compute-0 nova_compute[187185]: 2025-11-29 07:32:04.990 187189 DEBUG oslo_concurrency.lockutils [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Lock "985dc7b7-0644-4c5a-8218-9f925ac9e6ec-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:32:04 compute-0 nova_compute[187185]: 2025-11-29 07:32:04.991 187189 DEBUG nova.virt.libvirt.vif [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:30:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestMultiTenantJSON-server-963727031',display_name='tempest-ServersNegativeTestMultiTenantJSON-server-963727031',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversnegativetestmultitenantjson-server-963727031',id=144,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e5d0aef61d814c0ca5b9ed1fabe86010',ramdisk_id='',reservation_id='r-kxwjyrip',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestMultiTenantJSON-1102700638',owner_user_name='tempest-ServersNegativeTestMultiTenantJSON-1102700638-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:30:45Z,user_data=None,user_id='97d49ad735124e92ba228df4a6eba8b4',uuid=985dc7b7-0644-4c5a-8218-9f925ac9e6ec,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cb5f49f5-de68-4f15-a036-06a0f45556de", "address": "fa:16:3e:e4:08:a3", "network": {"id": "27dbf3f9-5c08-43d4-9c88-c573d9704843", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-1864816291-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e5d0aef61d814c0ca5b9ed1fabe86010", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcb5f49f5-de", "ovs_interfaceid": "cb5f49f5-de68-4f15-a036-06a0f45556de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:32:04 compute-0 nova_compute[187185]: 2025-11-29 07:32:04.991 187189 DEBUG nova.network.os_vif_util [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Converting VIF {"id": "cb5f49f5-de68-4f15-a036-06a0f45556de", "address": "fa:16:3e:e4:08:a3", "network": {"id": "27dbf3f9-5c08-43d4-9c88-c573d9704843", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-1864816291-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e5d0aef61d814c0ca5b9ed1fabe86010", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcb5f49f5-de", "ovs_interfaceid": "cb5f49f5-de68-4f15-a036-06a0f45556de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:32:04 compute-0 nova_compute[187185]: 2025-11-29 07:32:04.992 187189 DEBUG nova.network.os_vif_util [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e4:08:a3,bridge_name='br-int',has_traffic_filtering=True,id=cb5f49f5-de68-4f15-a036-06a0f45556de,network=Network(27dbf3f9-5c08-43d4-9c88-c573d9704843),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcb5f49f5-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:32:04 compute-0 nova_compute[187185]: 2025-11-29 07:32:04.993 187189 DEBUG os_vif [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e4:08:a3,bridge_name='br-int',has_traffic_filtering=True,id=cb5f49f5-de68-4f15-a036-06a0f45556de,network=Network(27dbf3f9-5c08-43d4-9c88-c573d9704843),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcb5f49f5-de') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:32:04 compute-0 nova_compute[187185]: 2025-11-29 07:32:04.994 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:32:04 compute-0 nova_compute[187185]: 2025-11-29 07:32:04.994 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:32:04 compute-0 nova_compute[187185]: 2025-11-29 07:32:04.995 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:32:04 compute-0 nova_compute[187185]: 2025-11-29 07:32:04.999 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:32:05 compute-0 nova_compute[187185]: 2025-11-29 07:32:04.999 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcb5f49f5-de, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:32:05 compute-0 nova_compute[187185]: 2025-11-29 07:32:05.000 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcb5f49f5-de, col_values=(('external_ids', {'iface-id': 'cb5f49f5-de68-4f15-a036-06a0f45556de', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e4:08:a3', 'vm-uuid': '985dc7b7-0644-4c5a-8218-9f925ac9e6ec'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:32:05 compute-0 NetworkManager[55227]: <info>  [1764401525.0042] manager: (tapcb5f49f5-de): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/235)
Nov 29 07:32:05 compute-0 nova_compute[187185]: 2025-11-29 07:32:05.002 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:32:05 compute-0 nova_compute[187185]: 2025-11-29 07:32:05.004 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:32:05 compute-0 nova_compute[187185]: 2025-11-29 07:32:05.009 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:32:05 compute-0 nova_compute[187185]: 2025-11-29 07:32:05.010 187189 INFO os_vif [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e4:08:a3,bridge_name='br-int',has_traffic_filtering=True,id=cb5f49f5-de68-4f15-a036-06a0f45556de,network=Network(27dbf3f9-5c08-43d4-9c88-c573d9704843),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcb5f49f5-de')
Nov 29 07:32:06 compute-0 nova_compute[187185]: 2025-11-29 07:32:06.356 187189 DEBUG nova.virt.libvirt.driver [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:32:06 compute-0 nova_compute[187185]: 2025-11-29 07:32:06.357 187189 DEBUG nova.virt.libvirt.driver [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:32:06 compute-0 nova_compute[187185]: 2025-11-29 07:32:06.357 187189 DEBUG nova.virt.libvirt.driver [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] No VIF found with MAC fa:16:3e:e4:08:a3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:32:06 compute-0 nova_compute[187185]: 2025-11-29 07:32:06.358 187189 INFO nova.virt.libvirt.driver [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Using config drive
Nov 29 07:32:06 compute-0 nova_compute[187185]: 2025-11-29 07:32:06.759 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:32:08 compute-0 nova_compute[187185]: 2025-11-29 07:32:08.370 187189 INFO nova.virt.libvirt.driver [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Creating config drive at /var/lib/nova/instances/985dc7b7-0644-4c5a-8218-9f925ac9e6ec/disk.config
Nov 29 07:32:08 compute-0 nova_compute[187185]: 2025-11-29 07:32:08.377 187189 DEBUG oslo_concurrency.processutils [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/985dc7b7-0644-4c5a-8218-9f925ac9e6ec/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdqp56qwb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:32:08 compute-0 nova_compute[187185]: 2025-11-29 07:32:08.524 187189 DEBUG oslo_concurrency.processutils [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/985dc7b7-0644-4c5a-8218-9f925ac9e6ec/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdqp56qwb" returned: 0 in 0.147s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:32:08 compute-0 kernel: tapcb5f49f5-de: entered promiscuous mode
Nov 29 07:32:08 compute-0 NetworkManager[55227]: <info>  [1764401528.5883] manager: (tapcb5f49f5-de): new Tun device (/org/freedesktop/NetworkManager/Devices/236)
Nov 29 07:32:08 compute-0 ovn_controller[95281]: 2025-11-29T07:32:08Z|00460|binding|INFO|Claiming lport cb5f49f5-de68-4f15-a036-06a0f45556de for this chassis.
Nov 29 07:32:08 compute-0 ovn_controller[95281]: 2025-11-29T07:32:08Z|00461|binding|INFO|cb5f49f5-de68-4f15-a036-06a0f45556de: Claiming fa:16:3e:e4:08:a3 10.100.0.12
Nov 29 07:32:08 compute-0 nova_compute[187185]: 2025-11-29 07:32:08.591 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:32:08 compute-0 systemd-udevd[238692]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:32:08 compute-0 systemd-machined[153486]: New machine qemu-57-instance-00000090.
Nov 29 07:32:08 compute-0 NetworkManager[55227]: <info>  [1764401528.6413] device (tapcb5f49f5-de): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:32:08 compute-0 NetworkManager[55227]: <info>  [1764401528.6423] device (tapcb5f49f5-de): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:32:08 compute-0 nova_compute[187185]: 2025-11-29 07:32:08.666 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:32:08 compute-0 ovn_controller[95281]: 2025-11-29T07:32:08Z|00462|binding|INFO|Setting lport cb5f49f5-de68-4f15-a036-06a0f45556de ovn-installed in OVS
Nov 29 07:32:08 compute-0 nova_compute[187185]: 2025-11-29 07:32:08.672 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:32:08 compute-0 systemd[1]: Started Virtual Machine qemu-57-instance-00000090.
Nov 29 07:32:09 compute-0 ovn_controller[95281]: 2025-11-29T07:32:09Z|00463|binding|INFO|Setting lport cb5f49f5-de68-4f15-a036-06a0f45556de up in Southbound
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:09.278 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e4:08:a3 10.100.0.12'], port_security=['fa:16:3e:e4:08:a3 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '985dc7b7-0644-4c5a-8218-9f925ac9e6ec', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-27dbf3f9-5c08-43d4-9c88-c573d9704843', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e5d0aef61d814c0ca5b9ed1fabe86010', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9a10274d-cfa2-4077-a15f-fd976f5d404a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=34c7f94d-b8a2-4bb5-86a2-29917808b83d, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=cb5f49f5-de68-4f15-a036-06a0f45556de) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:09.280 104254 INFO neutron.agent.ovn.metadata.agent [-] Port cb5f49f5-de68-4f15-a036-06a0f45556de in datapath 27dbf3f9-5c08-43d4-9c88-c573d9704843 bound to our chassis
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:09.284 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 27dbf3f9-5c08-43d4-9c88-c573d9704843
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:09.299 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6f31fd2a-f87a-42bc-bcf1-4d169f147086]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:09.300 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap27dbf3f9-51 in ovnmeta-27dbf3f9-5c08-43d4-9c88-c573d9704843 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:09.303 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap27dbf3f9-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:09.303 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[1996faf4-e817-4444-b233-46a493e2c910]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:09.305 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[be2aab94-c8cb-4965-88d3-4d8aa74ff503]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:09.323 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[aa1e5a9d-beb4-4124-8817-12a04721ceeb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:09.339 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[1bf9b42b-de41-4412-924f-f723bbdf9aa4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:09.381 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[f73ee812-3444-4b90-977e-cdb5c6a07e41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:09.397 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[94fd850b-d88e-41b2-8693-183ed788e8d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:32:09 compute-0 NetworkManager[55227]: <info>  [1764401529.3980] manager: (tap27dbf3f9-50): new Veth device (/org/freedesktop/NetworkManager/Devices/237)
Nov 29 07:32:09 compute-0 systemd-udevd[238695]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:09.437 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[2a3d7fae-faa4-46a9-8f8d-d757b38affe8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:09.441 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[751953fd-24e2-40c5-8630-a937d1a13589]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:32:09 compute-0 NetworkManager[55227]: <info>  [1764401529.4657] device (tap27dbf3f9-50): carrier: link connected
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:09.471 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[a49ba09e-6338-4e4e-b170-3a326346f4fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:09.495 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[c318bb0d-d424-4547-90ad-235a393edebb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap27dbf3f9-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:64:62:97'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 146], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 697712, 'reachable_time': 27820, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238728, 'error': None, 'target': 'ovnmeta-27dbf3f9-5c08-43d4-9c88-c573d9704843', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:09.514 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[5b22a506-2216-4044-a52e-347de55d11f9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe64:6297'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 697712, 'tstamp': 697712}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238729, 'error': None, 'target': 'ovnmeta-27dbf3f9-5c08-43d4-9c88-c573d9704843', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:09.538 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[37d970c3-b616-4c6a-ba48-48033aa350b0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap27dbf3f9-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:64:62:97'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 146], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 697712, 'reachable_time': 27820, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 238730, 'error': None, 'target': 'ovnmeta-27dbf3f9-5c08-43d4-9c88-c573d9704843', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:09.585 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e6b7c0a2-0f6e-43cd-b04c-921c25e7430d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:09.648 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[919385a9-7d7f-4e94-a811-38d9faa74ebe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:09.649 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap27dbf3f9-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:09.650 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:09.650 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap27dbf3f9-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:32:09 compute-0 nova_compute[187185]: 2025-11-29 07:32:09.652 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:32:09 compute-0 NetworkManager[55227]: <info>  [1764401529.6535] manager: (tap27dbf3f9-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/238)
Nov 29 07:32:09 compute-0 kernel: tap27dbf3f9-50: entered promiscuous mode
Nov 29 07:32:09 compute-0 nova_compute[187185]: 2025-11-29 07:32:09.655 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:09.658 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap27dbf3f9-50, col_values=(('external_ids', {'iface-id': '9491d627-f813-4ff3-8ba6-9afa2788b0bb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:32:09 compute-0 nova_compute[187185]: 2025-11-29 07:32:09.659 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:32:09 compute-0 ovn_controller[95281]: 2025-11-29T07:32:09Z|00464|binding|INFO|Releasing lport 9491d627-f813-4ff3-8ba6-9afa2788b0bb from this chassis (sb_readonly=0)
Nov 29 07:32:09 compute-0 nova_compute[187185]: 2025-11-29 07:32:09.671 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764401529.6705813, 985dc7b7-0644-4c5a-8218-9f925ac9e6ec => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:32:09 compute-0 nova_compute[187185]: 2025-11-29 07:32:09.671 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] VM Started (Lifecycle Event)
Nov 29 07:32:09 compute-0 nova_compute[187185]: 2025-11-29 07:32:09.680 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:09.681 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/27dbf3f9-5c08-43d4-9c88-c573d9704843.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/27dbf3f9-5c08-43d4-9c88-c573d9704843.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:09.682 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[ce750337-2715-4744-bdef-96aaf892323d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:09.683 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-27dbf3f9-5c08-43d4-9c88-c573d9704843
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/27dbf3f9-5c08-43d4-9c88-c573d9704843.pid.haproxy
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID 27dbf3f9-5c08-43d4-9c88-c573d9704843
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:09.684 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-27dbf3f9-5c08-43d4-9c88-c573d9704843', 'env', 'PROCESS_TAG=haproxy-27dbf3f9-5c08-43d4-9c88-c573d9704843', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/27dbf3f9-5c08-43d4-9c88-c573d9704843.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:32:09 compute-0 nova_compute[187185]: 2025-11-29 07:32:09.720 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:32:09 compute-0 nova_compute[187185]: 2025-11-29 07:32:09.725 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764401529.6706898, 985dc7b7-0644-4c5a-8218-9f925ac9e6ec => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:32:09 compute-0 nova_compute[187185]: 2025-11-29 07:32:09.726 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] VM Paused (Lifecycle Event)
Nov 29 07:32:09 compute-0 nova_compute[187185]: 2025-11-29 07:32:09.895 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:32:09 compute-0 nova_compute[187185]: 2025-11-29 07:32:09.910 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:32:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:09.958 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=49, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=48) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:32:09 compute-0 nova_compute[187185]: 2025-11-29 07:32:09.960 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:32:09 compute-0 nova_compute[187185]: 2025-11-29 07:32:09.963 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:32:10 compute-0 nova_compute[187185]: 2025-11-29 07:32:10.002 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:32:10 compute-0 podman[238770]: 2025-11-29 07:32:10.131105128 +0000 UTC m=+0.061562386 container create fc186ba6037cafe9fae5c9df2e8b8828511c0e999d3d0d33b4b0546ca62ed52b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-27dbf3f9-5c08-43d4-9c88-c573d9704843, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Nov 29 07:32:10 compute-0 systemd[1]: Started libpod-conmon-fc186ba6037cafe9fae5c9df2e8b8828511c0e999d3d0d33b4b0546ca62ed52b.scope.
Nov 29 07:32:10 compute-0 podman[238770]: 2025-11-29 07:32:10.096538667 +0000 UTC m=+0.026995925 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:32:10 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:32:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e398ebdb211977bff1f1c94660271e62ba5415faa1de8ab98d381aab389188bd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:32:10 compute-0 podman[238770]: 2025-11-29 07:32:10.21657549 +0000 UTC m=+0.147032718 container init fc186ba6037cafe9fae5c9df2e8b8828511c0e999d3d0d33b4b0546ca62ed52b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-27dbf3f9-5c08-43d4-9c88-c573d9704843, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:32:10 compute-0 podman[238770]: 2025-11-29 07:32:10.221391278 +0000 UTC m=+0.151848506 container start fc186ba6037cafe9fae5c9df2e8b8828511c0e999d3d0d33b4b0546ca62ed52b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-27dbf3f9-5c08-43d4-9c88-c573d9704843, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:32:10 compute-0 neutron-haproxy-ovnmeta-27dbf3f9-5c08-43d4-9c88-c573d9704843[238786]: [NOTICE]   (238807) : New worker (238814) forked
Nov 29 07:32:10 compute-0 neutron-haproxy-ovnmeta-27dbf3f9-5c08-43d4-9c88-c573d9704843[238786]: [NOTICE]   (238807) : Loading success.
Nov 29 07:32:10 compute-0 podman[238783]: 2025-11-29 07:32:10.275528601 +0000 UTC m=+0.101291666 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 29 07:32:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:10.286 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:32:11 compute-0 nova_compute[187185]: 2025-11-29 07:32:11.761 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:32:14 compute-0 nova_compute[187185]: 2025-11-29 07:32:14.370 187189 DEBUG nova.compute.manager [req-47ca91bb-c985-4060-9ae1-75df00cca07b req-7e0c4bbd-4af1-4404-a04c-f93ae1543a97 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Received event network-vif-plugged-cb5f49f5-de68-4f15-a036-06a0f45556de external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:32:14 compute-0 nova_compute[187185]: 2025-11-29 07:32:14.370 187189 DEBUG oslo_concurrency.lockutils [req-47ca91bb-c985-4060-9ae1-75df00cca07b req-7e0c4bbd-4af1-4404-a04c-f93ae1543a97 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "985dc7b7-0644-4c5a-8218-9f925ac9e6ec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:32:14 compute-0 nova_compute[187185]: 2025-11-29 07:32:14.371 187189 DEBUG oslo_concurrency.lockutils [req-47ca91bb-c985-4060-9ae1-75df00cca07b req-7e0c4bbd-4af1-4404-a04c-f93ae1543a97 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "985dc7b7-0644-4c5a-8218-9f925ac9e6ec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:32:14 compute-0 nova_compute[187185]: 2025-11-29 07:32:14.371 187189 DEBUG oslo_concurrency.lockutils [req-47ca91bb-c985-4060-9ae1-75df00cca07b req-7e0c4bbd-4af1-4404-a04c-f93ae1543a97 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "985dc7b7-0644-4c5a-8218-9f925ac9e6ec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:32:14 compute-0 nova_compute[187185]: 2025-11-29 07:32:14.371 187189 DEBUG nova.compute.manager [req-47ca91bb-c985-4060-9ae1-75df00cca07b req-7e0c4bbd-4af1-4404-a04c-f93ae1543a97 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Processing event network-vif-plugged-cb5f49f5-de68-4f15-a036-06a0f45556de _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 07:32:14 compute-0 nova_compute[187185]: 2025-11-29 07:32:14.372 187189 DEBUG nova.compute.manager [req-47ca91bb-c985-4060-9ae1-75df00cca07b req-7e0c4bbd-4af1-4404-a04c-f93ae1543a97 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Received event network-vif-plugged-cb5f49f5-de68-4f15-a036-06a0f45556de external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:32:14 compute-0 nova_compute[187185]: 2025-11-29 07:32:14.372 187189 DEBUG oslo_concurrency.lockutils [req-47ca91bb-c985-4060-9ae1-75df00cca07b req-7e0c4bbd-4af1-4404-a04c-f93ae1543a97 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "985dc7b7-0644-4c5a-8218-9f925ac9e6ec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:32:14 compute-0 nova_compute[187185]: 2025-11-29 07:32:14.372 187189 DEBUG oslo_concurrency.lockutils [req-47ca91bb-c985-4060-9ae1-75df00cca07b req-7e0c4bbd-4af1-4404-a04c-f93ae1543a97 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "985dc7b7-0644-4c5a-8218-9f925ac9e6ec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:32:14 compute-0 nova_compute[187185]: 2025-11-29 07:32:14.372 187189 DEBUG oslo_concurrency.lockutils [req-47ca91bb-c985-4060-9ae1-75df00cca07b req-7e0c4bbd-4af1-4404-a04c-f93ae1543a97 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "985dc7b7-0644-4c5a-8218-9f925ac9e6ec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:32:14 compute-0 nova_compute[187185]: 2025-11-29 07:32:14.373 187189 DEBUG nova.compute.manager [req-47ca91bb-c985-4060-9ae1-75df00cca07b req-7e0c4bbd-4af1-4404-a04c-f93ae1543a97 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] No waiting events found dispatching network-vif-plugged-cb5f49f5-de68-4f15-a036-06a0f45556de pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:32:14 compute-0 nova_compute[187185]: 2025-11-29 07:32:14.373 187189 WARNING nova.compute.manager [req-47ca91bb-c985-4060-9ae1-75df00cca07b req-7e0c4bbd-4af1-4404-a04c-f93ae1543a97 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Received unexpected event network-vif-plugged-cb5f49f5-de68-4f15-a036-06a0f45556de for instance with vm_state building and task_state spawning.
Nov 29 07:32:14 compute-0 nova_compute[187185]: 2025-11-29 07:32:14.373 187189 DEBUG nova.compute.manager [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Instance event wait completed in 4 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:32:14 compute-0 nova_compute[187185]: 2025-11-29 07:32:14.380 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764401534.3800626, 985dc7b7-0644-4c5a-8218-9f925ac9e6ec => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:32:14 compute-0 nova_compute[187185]: 2025-11-29 07:32:14.380 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] VM Resumed (Lifecycle Event)
Nov 29 07:32:14 compute-0 nova_compute[187185]: 2025-11-29 07:32:14.384 187189 DEBUG nova.virt.libvirt.driver [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 07:32:14 compute-0 nova_compute[187185]: 2025-11-29 07:32:14.387 187189 INFO nova.virt.libvirt.driver [-] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Instance spawned successfully.
Nov 29 07:32:14 compute-0 nova_compute[187185]: 2025-11-29 07:32:14.387 187189 DEBUG nova.virt.libvirt.driver [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 07:32:15 compute-0 nova_compute[187185]: 2025-11-29 07:32:15.003 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:32:16 compute-0 nova_compute[187185]: 2025-11-29 07:32:16.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:32:16 compute-0 nova_compute[187185]: 2025-11-29 07:32:16.809 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:32:17 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:17.289 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '49'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:32:17 compute-0 podman[238827]: 2025-11-29 07:32:17.812305204 +0000 UTC m=+0.064824901 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 07:32:18 compute-0 nova_compute[187185]: 2025-11-29 07:32:18.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:32:18 compute-0 nova_compute[187185]: 2025-11-29 07:32:18.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:32:18 compute-0 nova_compute[187185]: 2025-11-29 07:32:18.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:32:19 compute-0 nova_compute[187185]: 2025-11-29 07:32:19.018 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:32:19 compute-0 nova_compute[187185]: 2025-11-29 07:32:19.029 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:32:19 compute-0 nova_compute[187185]: 2025-11-29 07:32:19.036 187189 DEBUG nova.virt.libvirt.driver [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:32:19 compute-0 nova_compute[187185]: 2025-11-29 07:32:19.037 187189 DEBUG nova.virt.libvirt.driver [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:32:19 compute-0 nova_compute[187185]: 2025-11-29 07:32:19.038 187189 DEBUG nova.virt.libvirt.driver [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:32:19 compute-0 nova_compute[187185]: 2025-11-29 07:32:19.039 187189 DEBUG nova.virt.libvirt.driver [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:32:19 compute-0 nova_compute[187185]: 2025-11-29 07:32:19.040 187189 DEBUG nova.virt.libvirt.driver [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:32:19 compute-0 nova_compute[187185]: 2025-11-29 07:32:19.041 187189 DEBUG nova.virt.libvirt.driver [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:32:20 compute-0 nova_compute[187185]: 2025-11-29 07:32:20.006 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:32:20 compute-0 nova_compute[187185]: 2025-11-29 07:32:20.628 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:32:21 compute-0 nova_compute[187185]: 2025-11-29 07:32:21.803 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Nov 29 07:32:21 compute-0 nova_compute[187185]: 2025-11-29 07:32:21.804 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 07:32:21 compute-0 nova_compute[187185]: 2025-11-29 07:32:21.806 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:32:21 compute-0 nova_compute[187185]: 2025-11-29 07:32:21.807 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:32:21 compute-0 nova_compute[187185]: 2025-11-29 07:32:21.811 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:32:22 compute-0 nova_compute[187185]: 2025-11-29 07:32:22.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:32:22 compute-0 nova_compute[187185]: 2025-11-29 07:32:22.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:32:22 compute-0 podman[238850]: 2025-11-29 07:32:22.807875401 +0000 UTC m=+0.072341546 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3)
Nov 29 07:32:22 compute-0 podman[238851]: 2025-11-29 07:32:22.833202817 +0000 UTC m=+0.087569082 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 07:32:22 compute-0 nova_compute[187185]: 2025-11-29 07:32:22.984 187189 INFO nova.compute.manager [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Took 96.60 seconds to spawn the instance on the hypervisor.
Nov 29 07:32:22 compute-0 nova_compute[187185]: 2025-11-29 07:32:22.984 187189 DEBUG nova.compute.manager [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:32:23 compute-0 nova_compute[187185]: 2025-11-29 07:32:23.133 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:32:23 compute-0 nova_compute[187185]: 2025-11-29 07:32:23.133 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:32:23 compute-0 nova_compute[187185]: 2025-11-29 07:32:23.134 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:32:23 compute-0 nova_compute[187185]: 2025-11-29 07:32:23.134 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:32:25 compute-0 nova_compute[187185]: 2025-11-29 07:32:25.011 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:32:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:25.522 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:32:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:25.524 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:32:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:25.524 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:32:26 compute-0 nova_compute[187185]: 2025-11-29 07:32:26.263 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/985dc7b7-0644-4c5a-8218-9f925ac9e6ec/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:32:26 compute-0 nova_compute[187185]: 2025-11-29 07:32:26.349 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/985dc7b7-0644-4c5a-8218-9f925ac9e6ec/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:32:26 compute-0 nova_compute[187185]: 2025-11-29 07:32:26.352 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/985dc7b7-0644-4c5a-8218-9f925ac9e6ec/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:32:26 compute-0 nova_compute[187185]: 2025-11-29 07:32:26.424 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/985dc7b7-0644-4c5a-8218-9f925ac9e6ec/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:32:26 compute-0 nova_compute[187185]: 2025-11-29 07:32:26.626 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:32:26 compute-0 nova_compute[187185]: 2025-11-29 07:32:26.627 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5567MB free_disk=73.2434196472168GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:32:26 compute-0 nova_compute[187185]: 2025-11-29 07:32:26.628 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:32:26 compute-0 nova_compute[187185]: 2025-11-29 07:32:26.628 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:32:26 compute-0 nova_compute[187185]: 2025-11-29 07:32:26.815 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:32:26 compute-0 ovn_controller[95281]: 2025-11-29T07:32:26Z|00059|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e4:08:a3 10.100.0.12
Nov 29 07:32:26 compute-0 ovn_controller[95281]: 2025-11-29T07:32:26Z|00060|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e4:08:a3 10.100.0.12
Nov 29 07:32:28 compute-0 nova_compute[187185]: 2025-11-29 07:32:28.150 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance 985dc7b7-0644-4c5a-8218-9f925ac9e6ec actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 07:32:28 compute-0 nova_compute[187185]: 2025-11-29 07:32:28.150 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:32:28 compute-0 nova_compute[187185]: 2025-11-29 07:32:28.151 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:32:28 compute-0 nova_compute[187185]: 2025-11-29 07:32:28.200 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:32:28 compute-0 nova_compute[187185]: 2025-11-29 07:32:28.210 187189 INFO nova.compute.manager [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Took 105.73 seconds to build instance.
Nov 29 07:32:28 compute-0 nova_compute[187185]: 2025-11-29 07:32:28.298 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:32:28 compute-0 nova_compute[187185]: 2025-11-29 07:32:28.459 187189 DEBUG oslo_concurrency.lockutils [None req-801e5ef4-1f3a-4079-b806-dbbe4ad68a5b 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Lock "985dc7b7-0644-4c5a-8218-9f925ac9e6ec" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 106.434s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:32:28 compute-0 nova_compute[187185]: 2025-11-29 07:32:28.647 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:32:28 compute-0 nova_compute[187185]: 2025-11-29 07:32:28.647 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.019s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:32:28 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:28.661 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4a:f1:bd 2001:db8:0:1:f816:3eff:fe4a:f1bd 2001:db8::f816:3eff:fe4a:f1bd'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe4a:f1bd/64 2001:db8::f816:3eff:fe4a:f1bd/64', 'neutron:device_id': 'ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-716ed53e-cc56-4286-b418-2f5e02d33124', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c069d1db-d7e5-4641-988e-cd6e75103caa, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=b0b0536c-6e35-42c5-8936-a1236a4f216e) old=Port_Binding(mac=['fa:16:3e:4a:f1:bd 2001:db8::f816:3eff:fe4a:f1bd'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe4a:f1bd/64', 'neutron:device_id': 'ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-716ed53e-cc56-4286-b418-2f5e02d33124', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:32:28 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:28.664 104254 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port b0b0536c-6e35-42c5-8936-a1236a4f216e in datapath 716ed53e-cc56-4286-b418-2f5e02d33124 updated
Nov 29 07:32:28 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:28.667 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 716ed53e-cc56-4286-b418-2f5e02d33124, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:32:28 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:28.668 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[5e40bbc6-d97c-4a1b-9e97-6aeace0ec3bf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:32:29 compute-0 nova_compute[187185]: 2025-11-29 07:32:29.649 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:32:29 compute-0 nova_compute[187185]: 2025-11-29 07:32:29.649 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:32:29 compute-0 nova_compute[187185]: 2025-11-29 07:32:29.650 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:32:29 compute-0 nova_compute[187185]: 2025-11-29 07:32:29.650 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:32:30 compute-0 nova_compute[187185]: 2025-11-29 07:32:30.062 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:32:31 compute-0 nova_compute[187185]: 2025-11-29 07:32:31.594 187189 DEBUG oslo_concurrency.lockutils [None req-605fbfd8-b22e-471b-b64a-8bddf762833d 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Acquiring lock "985dc7b7-0644-4c5a-8218-9f925ac9e6ec" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:32:31 compute-0 nova_compute[187185]: 2025-11-29 07:32:31.596 187189 DEBUG oslo_concurrency.lockutils [None req-605fbfd8-b22e-471b-b64a-8bddf762833d 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Lock "985dc7b7-0644-4c5a-8218-9f925ac9e6ec" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:32:31 compute-0 nova_compute[187185]: 2025-11-29 07:32:31.597 187189 DEBUG oslo_concurrency.lockutils [None req-605fbfd8-b22e-471b-b64a-8bddf762833d 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Acquiring lock "985dc7b7-0644-4c5a-8218-9f925ac9e6ec-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:32:31 compute-0 nova_compute[187185]: 2025-11-29 07:32:31.597 187189 DEBUG oslo_concurrency.lockutils [None req-605fbfd8-b22e-471b-b64a-8bddf762833d 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Lock "985dc7b7-0644-4c5a-8218-9f925ac9e6ec-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:32:31 compute-0 nova_compute[187185]: 2025-11-29 07:32:31.598 187189 DEBUG oslo_concurrency.lockutils [None req-605fbfd8-b22e-471b-b64a-8bddf762833d 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Lock "985dc7b7-0644-4c5a-8218-9f925ac9e6ec-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:32:31 compute-0 nova_compute[187185]: 2025-11-29 07:32:31.620 187189 INFO nova.compute.manager [None req-605fbfd8-b22e-471b-b64a-8bddf762833d 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Terminating instance
Nov 29 07:32:31 compute-0 nova_compute[187185]: 2025-11-29 07:32:31.730 187189 DEBUG nova.compute.manager [None req-605fbfd8-b22e-471b-b64a-8bddf762833d 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 07:32:31 compute-0 kernel: tapcb5f49f5-de (unregistering): left promiscuous mode
Nov 29 07:32:31 compute-0 NetworkManager[55227]: <info>  [1764401551.7550] device (tapcb5f49f5-de): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:32:31 compute-0 nova_compute[187185]: 2025-11-29 07:32:31.764 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:32:31 compute-0 ovn_controller[95281]: 2025-11-29T07:32:31Z|00465|binding|INFO|Releasing lport cb5f49f5-de68-4f15-a036-06a0f45556de from this chassis (sb_readonly=0)
Nov 29 07:32:31 compute-0 ovn_controller[95281]: 2025-11-29T07:32:31Z|00466|binding|INFO|Setting lport cb5f49f5-de68-4f15-a036-06a0f45556de down in Southbound
Nov 29 07:32:31 compute-0 ovn_controller[95281]: 2025-11-29T07:32:31Z|00467|binding|INFO|Removing iface tapcb5f49f5-de ovn-installed in OVS
Nov 29 07:32:31 compute-0 nova_compute[187185]: 2025-11-29 07:32:31.766 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:32:31 compute-0 nova_compute[187185]: 2025-11-29 07:32:31.820 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:32:31 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:31.820 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e4:08:a3 10.100.0.12'], port_security=['fa:16:3e:e4:08:a3 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '985dc7b7-0644-4c5a-8218-9f925ac9e6ec', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-27dbf3f9-5c08-43d4-9c88-c573d9704843', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e5d0aef61d814c0ca5b9ed1fabe86010', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9a10274d-cfa2-4077-a15f-fd976f5d404a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=34c7f94d-b8a2-4bb5-86a2-29917808b83d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=cb5f49f5-de68-4f15-a036-06a0f45556de) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:32:31 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:31.823 104254 INFO neutron.agent.ovn.metadata.agent [-] Port cb5f49f5-de68-4f15-a036-06a0f45556de in datapath 27dbf3f9-5c08-43d4-9c88-c573d9704843 unbound from our chassis
Nov 29 07:32:31 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:31.827 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 27dbf3f9-5c08-43d4-9c88-c573d9704843, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:32:31 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:31.828 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[fb1db957-bcc2-4f87-9416-c77dc57e2cb7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:32:31 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:31.829 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-27dbf3f9-5c08-43d4-9c88-c573d9704843 namespace which is not needed anymore
Nov 29 07:32:31 compute-0 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d00000090.scope: Deactivated successfully.
Nov 29 07:32:31 compute-0 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d00000090.scope: Consumed 13.404s CPU time.
Nov 29 07:32:31 compute-0 systemd-machined[153486]: Machine qemu-57-instance-00000090 terminated.
Nov 29 07:32:31 compute-0 podman[238917]: 2025-11-29 07:32:31.864083964 +0000 UTC m=+0.102609424 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 07:32:31 compute-0 podman[238916]: 2025-11-29 07:32:31.88728702 +0000 UTC m=+0.145162905 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=9.6, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_id=edpm, io.openshift.tags=minimal rhel9, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible)
Nov 29 07:32:31 compute-0 podman[238915]: 2025-11-29 07:32:31.899601923 +0000 UTC m=+0.149563111 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:32:31 compute-0 neutron-haproxy-ovnmeta-27dbf3f9-5c08-43d4-9c88-c573d9704843[238786]: [NOTICE]   (238807) : haproxy version is 2.8.14-c23fe91
Nov 29 07:32:31 compute-0 neutron-haproxy-ovnmeta-27dbf3f9-5c08-43d4-9c88-c573d9704843[238786]: [NOTICE]   (238807) : path to executable is /usr/sbin/haproxy
Nov 29 07:32:31 compute-0 neutron-haproxy-ovnmeta-27dbf3f9-5c08-43d4-9c88-c573d9704843[238786]: [WARNING]  (238807) : Exiting Master process...
Nov 29 07:32:31 compute-0 neutron-haproxy-ovnmeta-27dbf3f9-5c08-43d4-9c88-c573d9704843[238786]: [ALERT]    (238807) : Current worker (238814) exited with code 143 (Terminated)
Nov 29 07:32:31 compute-0 neutron-haproxy-ovnmeta-27dbf3f9-5c08-43d4-9c88-c573d9704843[238786]: [WARNING]  (238807) : All workers exited. Exiting... (0)
Nov 29 07:32:31 compute-0 systemd[1]: libpod-fc186ba6037cafe9fae5c9df2e8b8828511c0e999d3d0d33b4b0546ca62ed52b.scope: Deactivated successfully.
Nov 29 07:32:31 compute-0 podman[239004]: 2025-11-29 07:32:31.98841529 +0000 UTC m=+0.053983139 container died fc186ba6037cafe9fae5c9df2e8b8828511c0e999d3d0d33b4b0546ca62ed52b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-27dbf3f9-5c08-43d4-9c88-c573d9704843, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 29 07:32:32 compute-0 nova_compute[187185]: 2025-11-29 07:32:32.009 187189 INFO nova.virt.libvirt.driver [-] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Instance destroyed successfully.
Nov 29 07:32:32 compute-0 nova_compute[187185]: 2025-11-29 07:32:32.010 187189 DEBUG nova.objects.instance [None req-605fbfd8-b22e-471b-b64a-8bddf762833d 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Lazy-loading 'resources' on Instance uuid 985dc7b7-0644-4c5a-8218-9f925ac9e6ec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:32:32 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fc186ba6037cafe9fae5c9df2e8b8828511c0e999d3d0d33b4b0546ca62ed52b-userdata-shm.mount: Deactivated successfully.
Nov 29 07:32:32 compute-0 systemd[1]: var-lib-containers-storage-overlay-e398ebdb211977bff1f1c94660271e62ba5415faa1de8ab98d381aab389188bd-merged.mount: Deactivated successfully.
Nov 29 07:32:32 compute-0 podman[239004]: 2025-11-29 07:32:32.037762026 +0000 UTC m=+0.103329875 container cleanup fc186ba6037cafe9fae5c9df2e8b8828511c0e999d3d0d33b4b0546ca62ed52b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-27dbf3f9-5c08-43d4-9c88-c573d9704843, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:32:32 compute-0 systemd[1]: libpod-conmon-fc186ba6037cafe9fae5c9df2e8b8828511c0e999d3d0d33b4b0546ca62ed52b.scope: Deactivated successfully.
Nov 29 07:32:32 compute-0 nova_compute[187185]: 2025-11-29 07:32:32.105 187189 DEBUG nova.virt.libvirt.vif [None req-605fbfd8-b22e-471b-b64a-8bddf762833d 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:30:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestMultiTenantJSON-server-963727031',display_name='tempest-ServersNegativeTestMultiTenantJSON-server-963727031',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversnegativetestmultitenantjson-server-963727031',id=144,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:32:22Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e5d0aef61d814c0ca5b9ed1fabe86010',ramdisk_id='',reservation_id='r-kxwjyrip',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',
image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestMultiTenantJSON-1102700638',owner_user_name='tempest-ServersNegativeTestMultiTenantJSON-1102700638-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:32:24Z,user_data=None,user_id='97d49ad735124e92ba228df4a6eba8b4',uuid=985dc7b7-0644-4c5a-8218-9f925ac9e6ec,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cb5f49f5-de68-4f15-a036-06a0f45556de", "address": "fa:16:3e:e4:08:a3", "network": {"id": "27dbf3f9-5c08-43d4-9c88-c573d9704843", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-1864816291-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e5d0aef61d814c0ca5b9ed1fabe86010", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcb5f49f5-de", "ovs_interfaceid": "cb5f49f5-de68-4f15-a036-06a0f45556de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:32:32 compute-0 nova_compute[187185]: 2025-11-29 07:32:32.107 187189 DEBUG nova.network.os_vif_util [None req-605fbfd8-b22e-471b-b64a-8bddf762833d 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Converting VIF {"id": "cb5f49f5-de68-4f15-a036-06a0f45556de", "address": "fa:16:3e:e4:08:a3", "network": {"id": "27dbf3f9-5c08-43d4-9c88-c573d9704843", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-1864816291-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e5d0aef61d814c0ca5b9ed1fabe86010", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcb5f49f5-de", "ovs_interfaceid": "cb5f49f5-de68-4f15-a036-06a0f45556de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:32:32 compute-0 nova_compute[187185]: 2025-11-29 07:32:32.108 187189 DEBUG nova.network.os_vif_util [None req-605fbfd8-b22e-471b-b64a-8bddf762833d 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e4:08:a3,bridge_name='br-int',has_traffic_filtering=True,id=cb5f49f5-de68-4f15-a036-06a0f45556de,network=Network(27dbf3f9-5c08-43d4-9c88-c573d9704843),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcb5f49f5-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:32:32 compute-0 nova_compute[187185]: 2025-11-29 07:32:32.109 187189 DEBUG os_vif [None req-605fbfd8-b22e-471b-b64a-8bddf762833d 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e4:08:a3,bridge_name='br-int',has_traffic_filtering=True,id=cb5f49f5-de68-4f15-a036-06a0f45556de,network=Network(27dbf3f9-5c08-43d4-9c88-c573d9704843),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcb5f49f5-de') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:32:32 compute-0 nova_compute[187185]: 2025-11-29 07:32:32.111 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:32:32 compute-0 nova_compute[187185]: 2025-11-29 07:32:32.112 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcb5f49f5-de, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:32:32 compute-0 nova_compute[187185]: 2025-11-29 07:32:32.113 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:32:32 compute-0 nova_compute[187185]: 2025-11-29 07:32:32.115 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:32:32 compute-0 nova_compute[187185]: 2025-11-29 07:32:32.119 187189 INFO os_vif [None req-605fbfd8-b22e-471b-b64a-8bddf762833d 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e4:08:a3,bridge_name='br-int',has_traffic_filtering=True,id=cb5f49f5-de68-4f15-a036-06a0f45556de,network=Network(27dbf3f9-5c08-43d4-9c88-c573d9704843),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcb5f49f5-de')
Nov 29 07:32:32 compute-0 nova_compute[187185]: 2025-11-29 07:32:32.120 187189 INFO nova.virt.libvirt.driver [None req-605fbfd8-b22e-471b-b64a-8bddf762833d 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Deleting instance files /var/lib/nova/instances/985dc7b7-0644-4c5a-8218-9f925ac9e6ec_del
Nov 29 07:32:32 compute-0 nova_compute[187185]: 2025-11-29 07:32:32.121 187189 INFO nova.virt.libvirt.driver [None req-605fbfd8-b22e-471b-b64a-8bddf762833d 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Deletion of /var/lib/nova/instances/985dc7b7-0644-4c5a-8218-9f925ac9e6ec_del complete
Nov 29 07:32:32 compute-0 podman[239048]: 2025-11-29 07:32:32.124422211 +0000 UTC m=+0.056269755 container remove fc186ba6037cafe9fae5c9df2e8b8828511c0e999d3d0d33b4b0546ca62ed52b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-27dbf3f9-5c08-43d4-9c88-c573d9704843, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 29 07:32:32 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:32.128 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e92c6737-c817-4380-8180-033d9b8cabdb]: (4, ('Sat Nov 29 07:32:31 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-27dbf3f9-5c08-43d4-9c88-c573d9704843 (fc186ba6037cafe9fae5c9df2e8b8828511c0e999d3d0d33b4b0546ca62ed52b)\nfc186ba6037cafe9fae5c9df2e8b8828511c0e999d3d0d33b4b0546ca62ed52b\nSat Nov 29 07:32:32 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-27dbf3f9-5c08-43d4-9c88-c573d9704843 (fc186ba6037cafe9fae5c9df2e8b8828511c0e999d3d0d33b4b0546ca62ed52b)\nfc186ba6037cafe9fae5c9df2e8b8828511c0e999d3d0d33b4b0546ca62ed52b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:32:32 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:32.130 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[3e860962-51a3-4040-9b7f-1eb1aa140d3a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:32:32 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:32.132 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap27dbf3f9-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:32:32 compute-0 kernel: tap27dbf3f9-50: left promiscuous mode
Nov 29 07:32:32 compute-0 nova_compute[187185]: 2025-11-29 07:32:32.135 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:32:32 compute-0 nova_compute[187185]: 2025-11-29 07:32:32.146 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:32:32 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:32.149 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[de530ad1-6eb2-436c-a7e7-62ebb2973f6f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:32:32 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:32.175 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6142891a-f4c9-4bb4-aad6-a917b8e8c058]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:32:32 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:32.177 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[1ed67905-87dc-4118-ae97-957028c95ae6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:32:32 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:32.192 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[f44611b9-3744-4705-96f2-010f4112662a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 697704, 'reachable_time': 15638, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239063, 'error': None, 'target': 'ovnmeta-27dbf3f9-5c08-43d4-9c88-c573d9704843', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:32:32 compute-0 systemd[1]: run-netns-ovnmeta\x2d27dbf3f9\x2d5c08\x2d43d4\x2d9c88\x2dc573d9704843.mount: Deactivated successfully.
Nov 29 07:32:32 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:32.197 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-27dbf3f9-5c08-43d4-9c88-c573d9704843 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:32:32 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:32:32.198 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[649a33fb-9030-46e0-b08b-47b950867c7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:32:32 compute-0 nova_compute[187185]: 2025-11-29 07:32:32.258 187189 INFO nova.compute.manager [None req-605fbfd8-b22e-471b-b64a-8bddf762833d 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Took 0.53 seconds to destroy the instance on the hypervisor.
Nov 29 07:32:32 compute-0 nova_compute[187185]: 2025-11-29 07:32:32.259 187189 DEBUG oslo.service.loopingcall [None req-605fbfd8-b22e-471b-b64a-8bddf762833d 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 07:32:32 compute-0 nova_compute[187185]: 2025-11-29 07:32:32.260 187189 DEBUG nova.compute.manager [-] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 07:32:32 compute-0 nova_compute[187185]: 2025-11-29 07:32:32.261 187189 DEBUG nova.network.neutron [-] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 07:32:32 compute-0 nova_compute[187185]: 2025-11-29 07:32:32.770 187189 DEBUG nova.compute.manager [req-b9bf543b-4d1d-4319-9c05-b044f42fe486 req-3d4f2b21-217f-4247-918e-5f37845aed8d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Received event network-vif-unplugged-cb5f49f5-de68-4f15-a036-06a0f45556de external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:32:32 compute-0 nova_compute[187185]: 2025-11-29 07:32:32.770 187189 DEBUG oslo_concurrency.lockutils [req-b9bf543b-4d1d-4319-9c05-b044f42fe486 req-3d4f2b21-217f-4247-918e-5f37845aed8d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "985dc7b7-0644-4c5a-8218-9f925ac9e6ec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:32:32 compute-0 nova_compute[187185]: 2025-11-29 07:32:32.771 187189 DEBUG oslo_concurrency.lockutils [req-b9bf543b-4d1d-4319-9c05-b044f42fe486 req-3d4f2b21-217f-4247-918e-5f37845aed8d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "985dc7b7-0644-4c5a-8218-9f925ac9e6ec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:32:32 compute-0 nova_compute[187185]: 2025-11-29 07:32:32.771 187189 DEBUG oslo_concurrency.lockutils [req-b9bf543b-4d1d-4319-9c05-b044f42fe486 req-3d4f2b21-217f-4247-918e-5f37845aed8d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "985dc7b7-0644-4c5a-8218-9f925ac9e6ec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:32:32 compute-0 nova_compute[187185]: 2025-11-29 07:32:32.771 187189 DEBUG nova.compute.manager [req-b9bf543b-4d1d-4319-9c05-b044f42fe486 req-3d4f2b21-217f-4247-918e-5f37845aed8d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] No waiting events found dispatching network-vif-unplugged-cb5f49f5-de68-4f15-a036-06a0f45556de pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:32:32 compute-0 nova_compute[187185]: 2025-11-29 07:32:32.772 187189 DEBUG nova.compute.manager [req-b9bf543b-4d1d-4319-9c05-b044f42fe486 req-3d4f2b21-217f-4247-918e-5f37845aed8d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Received event network-vif-unplugged-cb5f49f5-de68-4f15-a036-06a0f45556de for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 07:32:35 compute-0 nova_compute[187185]: 2025-11-29 07:32:35.398 187189 DEBUG nova.compute.manager [req-fe2054a7-5f98-43bf-8dba-50b77d448c3b req-b217fa12-bcbe-4be8-9dd7-5a05d832b0ec 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Received event network-vif-plugged-cb5f49f5-de68-4f15-a036-06a0f45556de external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:32:35 compute-0 nova_compute[187185]: 2025-11-29 07:32:35.398 187189 DEBUG oslo_concurrency.lockutils [req-fe2054a7-5f98-43bf-8dba-50b77d448c3b req-b217fa12-bcbe-4be8-9dd7-5a05d832b0ec 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "985dc7b7-0644-4c5a-8218-9f925ac9e6ec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:32:35 compute-0 nova_compute[187185]: 2025-11-29 07:32:35.399 187189 DEBUG oslo_concurrency.lockutils [req-fe2054a7-5f98-43bf-8dba-50b77d448c3b req-b217fa12-bcbe-4be8-9dd7-5a05d832b0ec 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "985dc7b7-0644-4c5a-8218-9f925ac9e6ec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:32:35 compute-0 nova_compute[187185]: 2025-11-29 07:32:35.399 187189 DEBUG oslo_concurrency.lockutils [req-fe2054a7-5f98-43bf-8dba-50b77d448c3b req-b217fa12-bcbe-4be8-9dd7-5a05d832b0ec 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "985dc7b7-0644-4c5a-8218-9f925ac9e6ec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:32:35 compute-0 nova_compute[187185]: 2025-11-29 07:32:35.399 187189 DEBUG nova.compute.manager [req-fe2054a7-5f98-43bf-8dba-50b77d448c3b req-b217fa12-bcbe-4be8-9dd7-5a05d832b0ec 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] No waiting events found dispatching network-vif-plugged-cb5f49f5-de68-4f15-a036-06a0f45556de pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:32:35 compute-0 nova_compute[187185]: 2025-11-29 07:32:35.400 187189 WARNING nova.compute.manager [req-fe2054a7-5f98-43bf-8dba-50b77d448c3b req-b217fa12-bcbe-4be8-9dd7-5a05d832b0ec 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Received unexpected event network-vif-plugged-cb5f49f5-de68-4f15-a036-06a0f45556de for instance with vm_state active and task_state deleting.
Nov 29 07:32:36 compute-0 nova_compute[187185]: 2025-11-29 07:32:36.595 187189 DEBUG nova.compute.manager [req-dbdfbfec-1d8d-4932-8507-402fe343c962 req-25c00f2b-2487-4c37-ae39-52bbff9d368f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Received event network-vif-deleted-cb5f49f5-de68-4f15-a036-06a0f45556de external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:32:36 compute-0 nova_compute[187185]: 2025-11-29 07:32:36.596 187189 INFO nova.compute.manager [req-dbdfbfec-1d8d-4932-8507-402fe343c962 req-25c00f2b-2487-4c37-ae39-52bbff9d368f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Neutron deleted interface cb5f49f5-de68-4f15-a036-06a0f45556de; detaching it from the instance and deleting it from the info cache
Nov 29 07:32:36 compute-0 nova_compute[187185]: 2025-11-29 07:32:36.597 187189 DEBUG nova.network.neutron [req-dbdfbfec-1d8d-4932-8507-402fe343c962 req-25c00f2b-2487-4c37-ae39-52bbff9d368f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:32:36 compute-0 nova_compute[187185]: 2025-11-29 07:32:36.601 187189 DEBUG nova.network.neutron [-] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:32:36 compute-0 nova_compute[187185]: 2025-11-29 07:32:36.626 187189 INFO nova.compute.manager [-] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Took 4.36 seconds to deallocate network for instance.
Nov 29 07:32:36 compute-0 nova_compute[187185]: 2025-11-29 07:32:36.630 187189 DEBUG nova.compute.manager [req-dbdfbfec-1d8d-4932-8507-402fe343c962 req-25c00f2b-2487-4c37-ae39-52bbff9d368f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Detach interface failed, port_id=cb5f49f5-de68-4f15-a036-06a0f45556de, reason: Instance 985dc7b7-0644-4c5a-8218-9f925ac9e6ec could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Nov 29 07:32:36 compute-0 nova_compute[187185]: 2025-11-29 07:32:36.822 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:32:36 compute-0 nova_compute[187185]: 2025-11-29 07:32:36.975 187189 DEBUG oslo_concurrency.lockutils [None req-605fbfd8-b22e-471b-b64a-8bddf762833d 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:32:36 compute-0 nova_compute[187185]: 2025-11-29 07:32:36.976 187189 DEBUG oslo_concurrency.lockutils [None req-605fbfd8-b22e-471b-b64a-8bddf762833d 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:32:37 compute-0 nova_compute[187185]: 2025-11-29 07:32:37.037 187189 DEBUG nova.compute.provider_tree [None req-605fbfd8-b22e-471b-b64a-8bddf762833d 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:32:37 compute-0 nova_compute[187185]: 2025-11-29 07:32:37.114 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:32:37 compute-0 nova_compute[187185]: 2025-11-29 07:32:37.197 187189 DEBUG nova.scheduler.client.report [None req-605fbfd8-b22e-471b-b64a-8bddf762833d 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:32:37 compute-0 nova_compute[187185]: 2025-11-29 07:32:37.278 187189 DEBUG oslo_concurrency.lockutils [None req-605fbfd8-b22e-471b-b64a-8bddf762833d 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.302s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:32:37 compute-0 nova_compute[187185]: 2025-11-29 07:32:37.307 187189 INFO nova.scheduler.client.report [None req-605fbfd8-b22e-471b-b64a-8bddf762833d 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Deleted allocations for instance 985dc7b7-0644-4c5a-8218-9f925ac9e6ec
Nov 29 07:32:37 compute-0 nova_compute[187185]: 2025-11-29 07:32:37.460 187189 DEBUG oslo_concurrency.lockutils [None req-605fbfd8-b22e-471b-b64a-8bddf762833d 97d49ad735124e92ba228df4a6eba8b4 e5d0aef61d814c0ca5b9ed1fabe86010 - - default default] Lock "985dc7b7-0644-4c5a-8218-9f925ac9e6ec" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.864s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:32:38 compute-0 nova_compute[187185]: 2025-11-29 07:32:38.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:32:38 compute-0 nova_compute[187185]: 2025-11-29 07:32:38.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 29 07:32:40 compute-0 podman[239064]: 2025-11-29 07:32:40.879294078 +0000 UTC m=+0.135443786 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:32:41 compute-0 nova_compute[187185]: 2025-11-29 07:32:41.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:32:41 compute-0 nova_compute[187185]: 2025-11-29 07:32:41.823 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:32:42 compute-0 nova_compute[187185]: 2025-11-29 07:32:42.116 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:32:45 compute-0 nova_compute[187185]: 2025-11-29 07:32:45.328 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:32:46 compute-0 nova_compute[187185]: 2025-11-29 07:32:46.825 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:32:47 compute-0 nova_compute[187185]: 2025-11-29 07:32:47.008 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764401552.0059705, 985dc7b7-0644-4c5a-8218-9f925ac9e6ec => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:32:47 compute-0 nova_compute[187185]: 2025-11-29 07:32:47.009 187189 INFO nova.compute.manager [-] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] VM Stopped (Lifecycle Event)
Nov 29 07:32:47 compute-0 nova_compute[187185]: 2025-11-29 07:32:47.119 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:32:47 compute-0 nova_compute[187185]: 2025-11-29 07:32:47.290 187189 DEBUG nova.compute.manager [None req-c534af57-f2b7-4db7-9ede-577cf97a503a - - - - - -] [instance: 985dc7b7-0644-4c5a-8218-9f925ac9e6ec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:32:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:32:48.006 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:32:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:32:48.007 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:32:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:32:48.007 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:32:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:32:48.007 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:32:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:32:48.008 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:32:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:32:48.008 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:32:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:32:48.008 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:32:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:32:48.008 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:32:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:32:48.008 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:32:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:32:48.008 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:32:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:32:48.008 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:32:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:32:48.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:32:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:32:48.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:32:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:32:48.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:32:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:32:48.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:32:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:32:48.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:32:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:32:48.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:32:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:32:48.010 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:32:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:32:48.010 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:32:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:32:48.010 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:32:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:32:48.010 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:32:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:32:48.010 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:32:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:32:48.010 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:32:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:32:48.010 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:32:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:32:48.011 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:32:48 compute-0 podman[239090]: 2025-11-29 07:32:48.808261419 +0000 UTC m=+0.063746759 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 07:32:48 compute-0 nova_compute[187185]: 2025-11-29 07:32:48.898 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:32:51 compute-0 nova_compute[187185]: 2025-11-29 07:32:51.827 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:32:52 compute-0 nova_compute[187185]: 2025-11-29 07:32:52.122 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:32:53 compute-0 podman[239114]: 2025-11-29 07:32:53.821636618 +0000 UTC m=+0.080180240 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 29 07:32:53 compute-0 podman[239115]: 2025-11-29 07:32:53.851951098 +0000 UTC m=+0.103041147 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute)
Nov 29 07:32:56 compute-0 nova_compute[187185]: 2025-11-29 07:32:56.830 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:32:57 compute-0 nova_compute[187185]: 2025-11-29 07:32:57.124 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:33:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:33:00.010 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=50, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=49) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:33:00 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:33:00.011 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:33:00 compute-0 nova_compute[187185]: 2025-11-29 07:33:00.011 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:33:01 compute-0 nova_compute[187185]: 2025-11-29 07:33:01.832 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:33:02 compute-0 nova_compute[187185]: 2025-11-29 07:33:02.127 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:33:02 compute-0 podman[239152]: 2025-11-29 07:33:02.797995789 +0000 UTC m=+0.053443824 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 07:33:02 compute-0 podman[239151]: 2025-11-29 07:33:02.807218043 +0000 UTC m=+0.069669449 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, managed_by=edpm_ansible, vendor=Red Hat, Inc., architecture=x86_64, config_id=edpm, vcs-type=git, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9-minimal)
Nov 29 07:33:02 compute-0 podman[239150]: 2025-11-29 07:33:02.812164795 +0000 UTC m=+0.078412830 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Nov 29 07:33:03 compute-0 nova_compute[187185]: 2025-11-29 07:33:03.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:33:03 compute-0 nova_compute[187185]: 2025-11-29 07:33:03.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 29 07:33:04 compute-0 nova_compute[187185]: 2025-11-29 07:33:04.417 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 29 07:33:06 compute-0 nova_compute[187185]: 2025-11-29 07:33:06.835 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:33:07 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:33:07.014 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '50'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:33:07 compute-0 nova_compute[187185]: 2025-11-29 07:33:07.129 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:33:11 compute-0 sshd-session[239212]: Invalid user under from 20.255.62.58 port 54594
Nov 29 07:33:11 compute-0 nova_compute[187185]: 2025-11-29 07:33:11.837 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:33:11 compute-0 podman[239214]: 2025-11-29 07:33:11.886769435 +0000 UTC m=+0.137102823 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 29 07:33:11 compute-0 sshd-session[239212]: Received disconnect from 20.255.62.58 port 54594:11: Bye Bye [preauth]
Nov 29 07:33:11 compute-0 sshd-session[239212]: Disconnected from invalid user under 20.255.62.58 port 54594 [preauth]
Nov 29 07:33:12 compute-0 nova_compute[187185]: 2025-11-29 07:33:12.181 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:33:16 compute-0 nova_compute[187185]: 2025-11-29 07:33:16.679 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:33:16 compute-0 nova_compute[187185]: 2025-11-29 07:33:16.839 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:33:17 compute-0 nova_compute[187185]: 2025-11-29 07:33:17.182 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:33:18 compute-0 nova_compute[187185]: 2025-11-29 07:33:18.338 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:33:19 compute-0 nova_compute[187185]: 2025-11-29 07:33:19.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:33:19 compute-0 nova_compute[187185]: 2025-11-29 07:33:19.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:33:19 compute-0 nova_compute[187185]: 2025-11-29 07:33:19.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:33:19 compute-0 podman[239242]: 2025-11-29 07:33:19.796198775 +0000 UTC m=+0.064419098 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 07:33:21 compute-0 nova_compute[187185]: 2025-11-29 07:33:21.438 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 07:33:21 compute-0 nova_compute[187185]: 2025-11-29 07:33:21.439 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:33:21 compute-0 nova_compute[187185]: 2025-11-29 07:33:21.841 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:33:22 compute-0 nova_compute[187185]: 2025-11-29 07:33:22.183 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:33:23 compute-0 nova_compute[187185]: 2025-11-29 07:33:23.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:33:24 compute-0 nova_compute[187185]: 2025-11-29 07:33:24.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:33:24 compute-0 nova_compute[187185]: 2025-11-29 07:33:24.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:33:24 compute-0 nova_compute[187185]: 2025-11-29 07:33:24.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:33:24 compute-0 nova_compute[187185]: 2025-11-29 07:33:24.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:33:24 compute-0 nova_compute[187185]: 2025-11-29 07:33:24.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:33:24 compute-0 nova_compute[187185]: 2025-11-29 07:33:24.365 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:33:24 compute-0 nova_compute[187185]: 2025-11-29 07:33:24.366 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:33:24 compute-0 nova_compute[187185]: 2025-11-29 07:33:24.366 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:33:24 compute-0 nova_compute[187185]: 2025-11-29 07:33:24.367 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:33:24 compute-0 podman[239269]: 2025-11-29 07:33:24.478872928 +0000 UTC m=+0.062422311 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:33:24 compute-0 podman[239268]: 2025-11-29 07:33:24.482516093 +0000 UTC m=+0.069386451 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd)
Nov 29 07:33:24 compute-0 nova_compute[187185]: 2025-11-29 07:33:24.549 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:33:24 compute-0 nova_compute[187185]: 2025-11-29 07:33:24.550 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5737MB free_disk=73.24454498291016GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:33:24 compute-0 nova_compute[187185]: 2025-11-29 07:33:24.550 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:33:24 compute-0 nova_compute[187185]: 2025-11-29 07:33:24.550 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:33:24 compute-0 nova_compute[187185]: 2025-11-29 07:33:24.615 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:33:24 compute-0 nova_compute[187185]: 2025-11-29 07:33:24.616 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:33:24 compute-0 nova_compute[187185]: 2025-11-29 07:33:24.638 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:33:24 compute-0 nova_compute[187185]: 2025-11-29 07:33:24.657 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:33:24 compute-0 nova_compute[187185]: 2025-11-29 07:33:24.688 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:33:24 compute-0 nova_compute[187185]: 2025-11-29 07:33:24.688 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:33:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:33:25.522 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:33:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:33:25.523 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:33:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:33:25.523 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:33:26 compute-0 nova_compute[187185]: 2025-11-29 07:33:26.842 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:33:27 compute-0 nova_compute[187185]: 2025-11-29 07:33:27.185 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:33:28 compute-0 nova_compute[187185]: 2025-11-29 07:33:28.683 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:33:31 compute-0 nova_compute[187185]: 2025-11-29 07:33:31.844 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:33:32 compute-0 nova_compute[187185]: 2025-11-29 07:33:32.240 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:33:33 compute-0 podman[239310]: 2025-11-29 07:33:33.797576169 +0000 UTC m=+0.062694569 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 07:33:33 compute-0 podman[239312]: 2025-11-29 07:33:33.819341634 +0000 UTC m=+0.069839234 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 07:33:33 compute-0 podman[239311]: 2025-11-29 07:33:33.832061799 +0000 UTC m=+0.094703458 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, config_id=edpm, io.openshift.tags=minimal rhel9, version=9.6, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 29 07:33:36 compute-0 nova_compute[187185]: 2025-11-29 07:33:36.875 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:33:37 compute-0 nova_compute[187185]: 2025-11-29 07:33:37.243 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:33:37 compute-0 ovn_controller[95281]: 2025-11-29T07:33:37Z|00468|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 29 07:33:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:33:39.911 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=51, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=50) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:33:39 compute-0 nova_compute[187185]: 2025-11-29 07:33:39.912 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:33:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:33:39.913 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:33:41 compute-0 nova_compute[187185]: 2025-11-29 07:33:41.877 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:33:42 compute-0 nova_compute[187185]: 2025-11-29 07:33:42.245 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:33:42 compute-0 podman[239373]: 2025-11-29 07:33:42.892971885 +0000 UTC m=+0.157762536 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 07:33:46 compute-0 nova_compute[187185]: 2025-11-29 07:33:46.881 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:33:47 compute-0 nova_compute[187185]: 2025-11-29 07:33:47.248 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:33:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:33:47.917 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '51'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:33:50 compute-0 podman[239398]: 2025-11-29 07:33:50.801660455 +0000 UTC m=+0.066662613 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 07:33:51 compute-0 nova_compute[187185]: 2025-11-29 07:33:51.883 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:33:52 compute-0 nova_compute[187185]: 2025-11-29 07:33:52.249 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:33:54 compute-0 podman[239423]: 2025-11-29 07:33:54.815488947 +0000 UTC m=+0.072187512 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:33:54 compute-0 podman[239424]: 2025-11-29 07:33:54.817811433 +0000 UTC m=+0.067992101 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 29 07:33:56 compute-0 nova_compute[187185]: 2025-11-29 07:33:56.895 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:33:57 compute-0 nova_compute[187185]: 2025-11-29 07:33:57.252 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:34:01 compute-0 nova_compute[187185]: 2025-11-29 07:34:01.110 187189 DEBUG oslo_concurrency.lockutils [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "0a1598ac-7fe7-4004-ad07-c9b7428d7822" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:34:01 compute-0 nova_compute[187185]: 2025-11-29 07:34:01.111 187189 DEBUG oslo_concurrency.lockutils [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "0a1598ac-7fe7-4004-ad07-c9b7428d7822" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:34:01 compute-0 nova_compute[187185]: 2025-11-29 07:34:01.194 187189 DEBUG nova.compute.manager [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 07:34:01 compute-0 nova_compute[187185]: 2025-11-29 07:34:01.897 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:34:02 compute-0 nova_compute[187185]: 2025-11-29 07:34:02.253 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:34:04 compute-0 podman[239467]: 2025-11-29 07:34:04.801770073 +0000 UTC m=+0.059307082 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 07:34:04 compute-0 podman[239465]: 2025-11-29 07:34:04.801740342 +0000 UTC m=+0.062972957 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Nov 29 07:34:04 compute-0 podman[239466]: 2025-11-29 07:34:04.832036671 +0000 UTC m=+0.092636808 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., name=ubi9-minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 29 07:34:05 compute-0 nova_compute[187185]: 2025-11-29 07:34:05.830 187189 DEBUG oslo_concurrency.lockutils [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:34:05 compute-0 nova_compute[187185]: 2025-11-29 07:34:05.831 187189 DEBUG oslo_concurrency.lockutils [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:34:05 compute-0 nova_compute[187185]: 2025-11-29 07:34:05.840 187189 DEBUG nova.virt.hardware [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 07:34:05 compute-0 nova_compute[187185]: 2025-11-29 07:34:05.841 187189 INFO nova.compute.claims [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Claim successful on node compute-0.ctlplane.example.com
Nov 29 07:34:06 compute-0 nova_compute[187185]: 2025-11-29 07:34:06.899 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:34:07 compute-0 nova_compute[187185]: 2025-11-29 07:34:07.255 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:34:09 compute-0 nova_compute[187185]: 2025-11-29 07:34:09.053 187189 DEBUG nova.compute.provider_tree [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:34:09 compute-0 nova_compute[187185]: 2025-11-29 07:34:09.111 187189 DEBUG nova.scheduler.client.report [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:34:09 compute-0 nova_compute[187185]: 2025-11-29 07:34:09.221 187189 DEBUG oslo_concurrency.lockutils [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 3.390s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:34:09 compute-0 nova_compute[187185]: 2025-11-29 07:34:09.222 187189 DEBUG nova.compute.manager [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 07:34:09 compute-0 nova_compute[187185]: 2025-11-29 07:34:09.491 187189 DEBUG nova.compute.manager [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 07:34:09 compute-0 nova_compute[187185]: 2025-11-29 07:34:09.492 187189 DEBUG nova.network.neutron [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 07:34:09 compute-0 nova_compute[187185]: 2025-11-29 07:34:09.570 187189 INFO nova.virt.libvirt.driver [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 07:34:09 compute-0 nova_compute[187185]: 2025-11-29 07:34:09.669 187189 DEBUG nova.compute.manager [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 07:34:11 compute-0 nova_compute[187185]: 2025-11-29 07:34:11.901 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:34:12 compute-0 nova_compute[187185]: 2025-11-29 07:34:12.257 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:34:13 compute-0 podman[239527]: 2025-11-29 07:34:13.861425203 +0000 UTC m=+0.123457542 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 07:34:16 compute-0 nova_compute[187185]: 2025-11-29 07:34:16.903 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:34:17 compute-0 nova_compute[187185]: 2025-11-29 07:34:17.259 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:34:19 compute-0 nova_compute[187185]: 2025-11-29 07:34:19.840 187189 DEBUG nova.policy [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 07:34:20 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:34:20.225 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=52, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=51) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:34:20 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:34:20.226 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:34:20 compute-0 nova_compute[187185]: 2025-11-29 07:34:20.228 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:34:20 compute-0 nova_compute[187185]: 2025-11-29 07:34:20.297 187189 DEBUG nova.compute.manager [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 07:34:20 compute-0 nova_compute[187185]: 2025-11-29 07:34:20.299 187189 DEBUG nova.virt.libvirt.driver [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 07:34:20 compute-0 nova_compute[187185]: 2025-11-29 07:34:20.300 187189 INFO nova.virt.libvirt.driver [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Creating image(s)
Nov 29 07:34:20 compute-0 nova_compute[187185]: 2025-11-29 07:34:20.301 187189 DEBUG oslo_concurrency.lockutils [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "/var/lib/nova/instances/0a1598ac-7fe7-4004-ad07-c9b7428d7822/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:34:20 compute-0 nova_compute[187185]: 2025-11-29 07:34:20.302 187189 DEBUG oslo_concurrency.lockutils [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "/var/lib/nova/instances/0a1598ac-7fe7-4004-ad07-c9b7428d7822/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:34:20 compute-0 nova_compute[187185]: 2025-11-29 07:34:20.303 187189 DEBUG oslo_concurrency.lockutils [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "/var/lib/nova/instances/0a1598ac-7fe7-4004-ad07-c9b7428d7822/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:34:20 compute-0 nova_compute[187185]: 2025-11-29 07:34:20.333 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:34:20 compute-0 nova_compute[187185]: 2025-11-29 07:34:20.334 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:34:20 compute-0 nova_compute[187185]: 2025-11-29 07:34:20.334 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:34:20 compute-0 nova_compute[187185]: 2025-11-29 07:34:20.338 187189 DEBUG oslo_concurrency.processutils [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:34:20 compute-0 nova_compute[187185]: 2025-11-29 07:34:20.396 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Nov 29 07:34:20 compute-0 nova_compute[187185]: 2025-11-29 07:34:20.397 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 07:34:20 compute-0 nova_compute[187185]: 2025-11-29 07:34:20.398 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:34:20 compute-0 nova_compute[187185]: 2025-11-29 07:34:20.403 187189 DEBUG oslo_concurrency.processutils [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:34:20 compute-0 nova_compute[187185]: 2025-11-29 07:34:20.404 187189 DEBUG oslo_concurrency.lockutils [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:34:20 compute-0 nova_compute[187185]: 2025-11-29 07:34:20.404 187189 DEBUG oslo_concurrency.lockutils [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:34:20 compute-0 nova_compute[187185]: 2025-11-29 07:34:20.419 187189 DEBUG oslo_concurrency.processutils [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:34:20 compute-0 nova_compute[187185]: 2025-11-29 07:34:20.481 187189 DEBUG oslo_concurrency.processutils [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:34:20 compute-0 nova_compute[187185]: 2025-11-29 07:34:20.483 187189 DEBUG oslo_concurrency.processutils [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/0a1598ac-7fe7-4004-ad07-c9b7428d7822/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:34:20 compute-0 nova_compute[187185]: 2025-11-29 07:34:20.639 187189 DEBUG oslo_concurrency.processutils [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/0a1598ac-7fe7-4004-ad07-c9b7428d7822/disk 1073741824" returned: 0 in 0.156s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:34:20 compute-0 nova_compute[187185]: 2025-11-29 07:34:20.641 187189 DEBUG oslo_concurrency.lockutils [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.236s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:34:20 compute-0 nova_compute[187185]: 2025-11-29 07:34:20.642 187189 DEBUG oslo_concurrency.processutils [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:34:20 compute-0 nova_compute[187185]: 2025-11-29 07:34:20.733 187189 DEBUG oslo_concurrency.processutils [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:34:20 compute-0 nova_compute[187185]: 2025-11-29 07:34:20.735 187189 DEBUG nova.virt.disk.api [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Checking if we can resize image /var/lib/nova/instances/0a1598ac-7fe7-4004-ad07-c9b7428d7822/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:34:20 compute-0 nova_compute[187185]: 2025-11-29 07:34:20.736 187189 DEBUG oslo_concurrency.processutils [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0a1598ac-7fe7-4004-ad07-c9b7428d7822/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:34:20 compute-0 nova_compute[187185]: 2025-11-29 07:34:20.803 187189 DEBUG oslo_concurrency.processutils [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0a1598ac-7fe7-4004-ad07-c9b7428d7822/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:34:20 compute-0 nova_compute[187185]: 2025-11-29 07:34:20.805 187189 DEBUG nova.virt.disk.api [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Cannot resize image /var/lib/nova/instances/0a1598ac-7fe7-4004-ad07-c9b7428d7822/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:34:20 compute-0 nova_compute[187185]: 2025-11-29 07:34:20.806 187189 DEBUG nova.objects.instance [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'migration_context' on Instance uuid 0a1598ac-7fe7-4004-ad07-c9b7428d7822 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:34:21 compute-0 nova_compute[187185]: 2025-11-29 07:34:21.085 187189 DEBUG nova.virt.libvirt.driver [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 07:34:21 compute-0 nova_compute[187185]: 2025-11-29 07:34:21.087 187189 DEBUG nova.virt.libvirt.driver [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Ensure instance console log exists: /var/lib/nova/instances/0a1598ac-7fe7-4004-ad07-c9b7428d7822/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 07:34:21 compute-0 nova_compute[187185]: 2025-11-29 07:34:21.088 187189 DEBUG oslo_concurrency.lockutils [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:34:21 compute-0 nova_compute[187185]: 2025-11-29 07:34:21.089 187189 DEBUG oslo_concurrency.lockutils [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:34:21 compute-0 nova_compute[187185]: 2025-11-29 07:34:21.089 187189 DEBUG oslo_concurrency.lockutils [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:34:21 compute-0 nova_compute[187185]: 2025-11-29 07:34:21.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:34:21 compute-0 podman[239570]: 2025-11-29 07:34:21.813483356 +0000 UTC m=+0.076354671 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 07:34:21 compute-0 nova_compute[187185]: 2025-11-29 07:34:21.906 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:34:22 compute-0 nova_compute[187185]: 2025-11-29 07:34:22.297 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:34:22 compute-0 sshd-session[239568]: Received disconnect from 45.78.219.119 port 38756:11: Bye Bye [preauth]
Nov 29 07:34:22 compute-0 sshd-session[239568]: Disconnected from authenticating user root 45.78.219.119 port 38756 [preauth]
Nov 29 07:34:23 compute-0 nova_compute[187185]: 2025-11-29 07:34:23.114 187189 DEBUG nova.network.neutron [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Successfully created port: 3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 07:34:23 compute-0 nova_compute[187185]: 2025-11-29 07:34:23.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:34:23 compute-0 sshd-session[239464]: error: kex_exchange_identification: read: Connection timed out
Nov 29 07:34:23 compute-0 sshd-session[239464]: banner exchange: Connection from 101.126.89.35 port 33342: Connection timed out
Nov 29 07:34:24 compute-0 nova_compute[187185]: 2025-11-29 07:34:24.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:34:24 compute-0 nova_compute[187185]: 2025-11-29 07:34:24.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:34:24 compute-0 nova_compute[187185]: 2025-11-29 07:34:24.945 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:34:24 compute-0 nova_compute[187185]: 2025-11-29 07:34:24.945 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:34:24 compute-0 nova_compute[187185]: 2025-11-29 07:34:24.946 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:34:24 compute-0 nova_compute[187185]: 2025-11-29 07:34:24.946 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:34:25 compute-0 nova_compute[187185]: 2025-11-29 07:34:25.451 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:34:25 compute-0 nova_compute[187185]: 2025-11-29 07:34:25.452 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5731MB free_disk=73.24433517456055GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:34:25 compute-0 nova_compute[187185]: 2025-11-29 07:34:25.453 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:34:25 compute-0 nova_compute[187185]: 2025-11-29 07:34:25.453 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:34:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:34:25.523 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:34:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:34:25.524 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:34:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:34:25.524 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:34:25 compute-0 podman[239595]: 2025-11-29 07:34:25.805711087 +0000 UTC m=+0.063069890 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:34:25 compute-0 podman[239594]: 2025-11-29 07:34:25.813285455 +0000 UTC m=+0.074098497 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, managed_by=edpm_ansible)
Nov 29 07:34:26 compute-0 nova_compute[187185]: 2025-11-29 07:34:26.908 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:34:27 compute-0 nova_compute[187185]: 2025-11-29 07:34:27.298 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:34:28 compute-0 sshd-session[239631]: Invalid user admin from 190.181.27.27 port 47282
Nov 29 07:34:29 compute-0 sshd-session[239631]: Received disconnect from 190.181.27.27 port 47282:11: Bye Bye [preauth]
Nov 29 07:34:29 compute-0 sshd-session[239631]: Disconnected from invalid user admin 190.181.27.27 port 47282 [preauth]
Nov 29 07:34:29 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:34:29.229 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '52'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:34:31 compute-0 nova_compute[187185]: 2025-11-29 07:34:31.942 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:34:32 compute-0 nova_compute[187185]: 2025-11-29 07:34:32.299 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:34:35 compute-0 podman[239640]: 2025-11-29 07:34:35.809491387 +0000 UTC m=+0.050198451 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 07:34:35 compute-0 podman[239634]: 2025-11-29 07:34:35.817076434 +0000 UTC m=+0.063719738 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, container_name=openstack_network_exporter, name=ubi9-minimal, vcs-type=git, distribution-scope=public, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.openshift.expose-services=, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 29 07:34:35 compute-0 podman[239633]: 2025-11-29 07:34:35.841232377 +0000 UTC m=+0.097060295 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Nov 29 07:34:36 compute-0 nova_compute[187185]: 2025-11-29 07:34:36.944 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:34:37 compute-0 nova_compute[187185]: 2025-11-29 07:34:37.301 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:34:41 compute-0 nova_compute[187185]: 2025-11-29 07:34:41.673 187189 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 6.77 sec
Nov 29 07:34:41 compute-0 nova_compute[187185]: 2025-11-29 07:34:41.947 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:34:42 compute-0 nova_compute[187185]: 2025-11-29 07:34:42.246 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance 0a1598ac-7fe7-4004-ad07-c9b7428d7822 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 07:34:42 compute-0 nova_compute[187185]: 2025-11-29 07:34:42.247 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:34:42 compute-0 nova_compute[187185]: 2025-11-29 07:34:42.247 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:34:42 compute-0 nova_compute[187185]: 2025-11-29 07:34:42.302 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:34:42 compute-0 nova_compute[187185]: 2025-11-29 07:34:42.360 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Refreshing inventories for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 29 07:34:42 compute-0 nova_compute[187185]: 2025-11-29 07:34:42.440 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Updating ProviderTree inventory for provider 4e39a026-df39-4e20-874a-dbb5a40df044 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 29 07:34:42 compute-0 nova_compute[187185]: 2025-11-29 07:34:42.441 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Updating inventory in ProviderTree for provider 4e39a026-df39-4e20-874a-dbb5a40df044 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 29 07:34:42 compute-0 nova_compute[187185]: 2025-11-29 07:34:42.457 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Refreshing aggregate associations for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 29 07:34:42 compute-0 nova_compute[187185]: 2025-11-29 07:34:42.503 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Refreshing trait associations for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044, traits: HW_CPU_X86_SSE,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_TRUSTED_CERTS,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 29 07:34:42 compute-0 nova_compute[187185]: 2025-11-29 07:34:42.575 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:34:43 compute-0 sshd-session[239697]: Invalid user fred from 115.190.187.93 port 46300
Nov 29 07:34:43 compute-0 sshd-session[239697]: Received disconnect from 115.190.187.93 port 46300:11: Bye Bye [preauth]
Nov 29 07:34:43 compute-0 sshd-session[239697]: Disconnected from invalid user fred 115.190.187.93 port 46300 [preauth]
Nov 29 07:34:44 compute-0 nova_compute[187185]: 2025-11-29 07:34:44.526 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:34:44 compute-0 nova_compute[187185]: 2025-11-29 07:34:44.660 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:34:44 compute-0 nova_compute[187185]: 2025-11-29 07:34:44.660 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 19.207s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:34:44 compute-0 podman[239699]: 2025-11-29 07:34:44.944917441 +0000 UTC m=+0.197594838 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 07:34:45 compute-0 nova_compute[187185]: 2025-11-29 07:34:45.664 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:34:45 compute-0 nova_compute[187185]: 2025-11-29 07:34:45.664 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:34:45 compute-0 nova_compute[187185]: 2025-11-29 07:34:45.665 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:34:45 compute-0 nova_compute[187185]: 2025-11-29 07:34:45.665 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:34:46 compute-0 nova_compute[187185]: 2025-11-29 07:34:46.116 187189 DEBUG nova.network.neutron [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Successfully created port: cca30a15-3dfe-49f3-8b60-50dc1d039795 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 07:34:46 compute-0 nova_compute[187185]: 2025-11-29 07:34:46.950 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:34:47 compute-0 nova_compute[187185]: 2025-11-29 07:34:47.304 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:34:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:34:48.006 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:34:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:34:48.007 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:34:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:34:48.008 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:34:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:34:48.008 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:34:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:34:48.008 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:34:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:34:48.008 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:34:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:34:48.008 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:34:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:34:48.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:34:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:34:48.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:34:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:34:48.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:34:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:34:48.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:34:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:34:48.010 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:34:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:34:48.010 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:34:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:34:48.010 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:34:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:34:48.010 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:34:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:34:48.011 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:34:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:34:48.011 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:34:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:34:48.011 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:34:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:34:48.011 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:34:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:34:48.011 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:34:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:34:48.011 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:34:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:34:48.011 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:34:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:34:48.012 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:34:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:34:48.012 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:34:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:34:48.012 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:34:49 compute-0 nova_compute[187185]: 2025-11-29 07:34:49.311 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:34:49 compute-0 nova_compute[187185]: 2025-11-29 07:34:49.461 187189 DEBUG nova.network.neutron [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Successfully updated port: 3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 07:34:50 compute-0 nova_compute[187185]: 2025-11-29 07:34:50.168 187189 DEBUG nova.compute.manager [req-6349539e-a403-4b45-9acf-a9eb7a082339 req-9db0c465-9caa-4345-9c14-fb089b6472aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Received event network-changed-3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:34:50 compute-0 nova_compute[187185]: 2025-11-29 07:34:50.169 187189 DEBUG nova.compute.manager [req-6349539e-a403-4b45-9acf-a9eb7a082339 req-9db0c465-9caa-4345-9c14-fb089b6472aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Refreshing instance network info cache due to event network-changed-3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:34:50 compute-0 nova_compute[187185]: 2025-11-29 07:34:50.170 187189 DEBUG oslo_concurrency.lockutils [req-6349539e-a403-4b45-9acf-a9eb7a082339 req-9db0c465-9caa-4345-9c14-fb089b6472aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-0a1598ac-7fe7-4004-ad07-c9b7428d7822" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:34:50 compute-0 nova_compute[187185]: 2025-11-29 07:34:50.170 187189 DEBUG oslo_concurrency.lockutils [req-6349539e-a403-4b45-9acf-a9eb7a082339 req-9db0c465-9caa-4345-9c14-fb089b6472aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-0a1598ac-7fe7-4004-ad07-c9b7428d7822" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:34:50 compute-0 nova_compute[187185]: 2025-11-29 07:34:50.171 187189 DEBUG nova.network.neutron [req-6349539e-a403-4b45-9acf-a9eb7a082339 req-9db0c465-9caa-4345-9c14-fb089b6472aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Refreshing network info cache for port 3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:34:51 compute-0 nova_compute[187185]: 2025-11-29 07:34:51.953 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:34:52 compute-0 nova_compute[187185]: 2025-11-29 07:34:52.036 187189 DEBUG nova.network.neutron [req-6349539e-a403-4b45-9acf-a9eb7a082339 req-9db0c465-9caa-4345-9c14-fb089b6472aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:34:52 compute-0 nova_compute[187185]: 2025-11-29 07:34:52.306 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:34:52 compute-0 podman[239725]: 2025-11-29 07:34:52.825150773 +0000 UTC m=+0.074996522 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 07:34:53 compute-0 nova_compute[187185]: 2025-11-29 07:34:53.587 187189 DEBUG nova.network.neutron [req-6349539e-a403-4b45-9acf-a9eb7a082339 req-9db0c465-9caa-4345-9c14-fb089b6472aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:34:53 compute-0 nova_compute[187185]: 2025-11-29 07:34:53.639 187189 DEBUG oslo_concurrency.lockutils [req-6349539e-a403-4b45-9acf-a9eb7a082339 req-9db0c465-9caa-4345-9c14-fb089b6472aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-0a1598ac-7fe7-4004-ad07-c9b7428d7822" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:34:54 compute-0 nova_compute[187185]: 2025-11-29 07:34:54.654 187189 DEBUG nova.network.neutron [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Successfully updated port: cca30a15-3dfe-49f3-8b60-50dc1d039795 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 07:34:54 compute-0 nova_compute[187185]: 2025-11-29 07:34:54.699 187189 DEBUG oslo_concurrency.lockutils [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "refresh_cache-0a1598ac-7fe7-4004-ad07-c9b7428d7822" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:34:54 compute-0 nova_compute[187185]: 2025-11-29 07:34:54.700 187189 DEBUG oslo_concurrency.lockutils [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquired lock "refresh_cache-0a1598ac-7fe7-4004-ad07-c9b7428d7822" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:34:54 compute-0 nova_compute[187185]: 2025-11-29 07:34:54.700 187189 DEBUG nova.network.neutron [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:34:55 compute-0 nova_compute[187185]: 2025-11-29 07:34:55.399 187189 DEBUG nova.compute.manager [req-04656a8a-431e-48a0-9d29-0716cd9d16be req-5a69b7b8-2a40-41a6-8b7e-4fef32442bc7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Received event network-changed-cca30a15-3dfe-49f3-8b60-50dc1d039795 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:34:55 compute-0 nova_compute[187185]: 2025-11-29 07:34:55.399 187189 DEBUG nova.compute.manager [req-04656a8a-431e-48a0-9d29-0716cd9d16be req-5a69b7b8-2a40-41a6-8b7e-4fef32442bc7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Refreshing instance network info cache due to event network-changed-cca30a15-3dfe-49f3-8b60-50dc1d039795. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:34:55 compute-0 nova_compute[187185]: 2025-11-29 07:34:55.399 187189 DEBUG oslo_concurrency.lockutils [req-04656a8a-431e-48a0-9d29-0716cd9d16be req-5a69b7b8-2a40-41a6-8b7e-4fef32442bc7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-0a1598ac-7fe7-4004-ad07-c9b7428d7822" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:34:56 compute-0 podman[239750]: 2025-11-29 07:34:56.792170492 +0000 UTC m=+0.055892064 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:34:56 compute-0 podman[239749]: 2025-11-29 07:34:56.799223714 +0000 UTC m=+0.067043244 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3)
Nov 29 07:34:56 compute-0 nova_compute[187185]: 2025-11-29 07:34:56.955 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:34:57 compute-0 nova_compute[187185]: 2025-11-29 07:34:57.091 187189 DEBUG nova.network.neutron [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:34:57 compute-0 nova_compute[187185]: 2025-11-29 07:34:57.308 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:01 compute-0 nova_compute[187185]: 2025-11-29 07:35:01.957 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:02 compute-0 nova_compute[187185]: 2025-11-29 07:35:02.310 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:05 compute-0 sshd-session[239789]: Invalid user username from 20.255.62.58 port 49902
Nov 29 07:35:05 compute-0 nova_compute[187185]: 2025-11-29 07:35:05.471 187189 DEBUG nova.network.neutron [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Updating instance_info_cache with network_info: [{"id": "3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5", "address": "fa:16:3e:62:77:74", "network": {"id": "ee42f4e1-7038-4a24-8d9b-8ee99ca415d0", "bridge": "br-int", "label": "tempest-network-smoke--1973917221", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a32e9dc-c9", "ovs_interfaceid": "3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "cca30a15-3dfe-49f3-8b60-50dc1d039795", "address": "fa:16:3e:6a:4b:a1", "network": {"id": "716ed53e-cc56-4286-b418-2f5e02d33124", "bridge": "br-int", "label": "tempest-network-smoke--990040375", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe6a:4ba1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", 
"type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe6a:4ba1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcca30a15-3d", "ovs_interfaceid": "cca30a15-3dfe-49f3-8b60-50dc1d039795", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:35:05 compute-0 sshd-session[239789]: Received disconnect from 20.255.62.58 port 49902:11: Bye Bye [preauth]
Nov 29 07:35:05 compute-0 sshd-session[239789]: Disconnected from invalid user username 20.255.62.58 port 49902 [preauth]
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.555 187189 DEBUG oslo_concurrency.lockutils [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Releasing lock "refresh_cache-0a1598ac-7fe7-4004-ad07-c9b7428d7822" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.556 187189 DEBUG nova.compute.manager [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Instance network_info: |[{"id": "3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5", "address": "fa:16:3e:62:77:74", "network": {"id": "ee42f4e1-7038-4a24-8d9b-8ee99ca415d0", "bridge": "br-int", "label": "tempest-network-smoke--1973917221", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a32e9dc-c9", "ovs_interfaceid": "3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "cca30a15-3dfe-49f3-8b60-50dc1d039795", "address": "fa:16:3e:6a:4b:a1", "network": {"id": "716ed53e-cc56-4286-b418-2f5e02d33124", "bridge": "br-int", "label": "tempest-network-smoke--990040375", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe6a:4ba1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", 
"version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe6a:4ba1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcca30a15-3d", "ovs_interfaceid": "cca30a15-3dfe-49f3-8b60-50dc1d039795", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.557 187189 DEBUG oslo_concurrency.lockutils [req-04656a8a-431e-48a0-9d29-0716cd9d16be req-5a69b7b8-2a40-41a6-8b7e-4fef32442bc7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-0a1598ac-7fe7-4004-ad07-c9b7428d7822" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.557 187189 DEBUG nova.network.neutron [req-04656a8a-431e-48a0-9d29-0716cd9d16be req-5a69b7b8-2a40-41a6-8b7e-4fef32442bc7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Refreshing network info cache for port cca30a15-3dfe-49f3-8b60-50dc1d039795 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.562 187189 DEBUG nova.virt.libvirt.driver [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Start _get_guest_xml network_info=[{"id": "3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5", "address": "fa:16:3e:62:77:74", "network": {"id": "ee42f4e1-7038-4a24-8d9b-8ee99ca415d0", "bridge": "br-int", "label": "tempest-network-smoke--1973917221", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a32e9dc-c9", "ovs_interfaceid": "3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "cca30a15-3dfe-49f3-8b60-50dc1d039795", "address": "fa:16:3e:6a:4b:a1", "network": {"id": "716ed53e-cc56-4286-b418-2f5e02d33124", "bridge": "br-int", "label": "tempest-network-smoke--990040375", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe6a:4ba1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": 
"gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe6a:4ba1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcca30a15-3d", "ovs_interfaceid": "cca30a15-3dfe-49f3-8b60-50dc1d039795", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.567 187189 WARNING nova.virt.libvirt.driver [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.588 187189 DEBUG nova.virt.libvirt.host [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.589 187189 DEBUG nova.virt.libvirt.host [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.592 187189 DEBUG nova.virt.libvirt.host [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.593 187189 DEBUG nova.virt.libvirt.host [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.594 187189 DEBUG nova.virt.libvirt.driver [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.595 187189 DEBUG nova.virt.hardware [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.595 187189 DEBUG nova.virt.hardware [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.595 187189 DEBUG nova.virt.hardware [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.596 187189 DEBUG nova.virt.hardware [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.596 187189 DEBUG nova.virt.hardware [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.596 187189 DEBUG nova.virt.hardware [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.597 187189 DEBUG nova.virt.hardware [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.597 187189 DEBUG nova.virt.hardware [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.597 187189 DEBUG nova.virt.hardware [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.597 187189 DEBUG nova.virt.hardware [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.598 187189 DEBUG nova.virt.hardware [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.601 187189 DEBUG nova.virt.libvirt.vif [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:33:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1402689375',display_name='tempest-TestGettingAddress-server-1402689375',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1402689375',id=149,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBK18BM+cTj/3sUxBzPfGqMZVFW8u0MAYl6D47npYBAuFCVOWerNtWreCBLJmXVjolkwHlCMhwEYpOxlO5EeqIXZi9GpQKv0jmJXL6Uw9pSzb3x5DyZSKVQsbQBVQ+UXj+w==',key_name='tempest-TestGettingAddress-668700412',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-izv751fp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:34:10Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=0a1598ac-7fe7-4004-ad07-c9b7428d7822,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5", "address": "fa:16:3e:62:77:74", "network": {"id": "ee42f4e1-7038-4a24-8d9b-8ee99ca415d0", "bridge": "br-int", "label": "tempest-network-smoke--1973917221", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a32e9dc-c9", "ovs_interfaceid": "3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.602 187189 DEBUG nova.network.os_vif_util [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5", "address": "fa:16:3e:62:77:74", "network": {"id": "ee42f4e1-7038-4a24-8d9b-8ee99ca415d0", "bridge": "br-int", "label": "tempest-network-smoke--1973917221", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a32e9dc-c9", "ovs_interfaceid": "3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.603 187189 DEBUG nova.network.os_vif_util [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:62:77:74,bridge_name='br-int',has_traffic_filtering=True,id=3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5,network=Network(ee42f4e1-7038-4a24-8d9b-8ee99ca415d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a32e9dc-c9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.603 187189 DEBUG nova.virt.libvirt.vif [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:33:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1402689375',display_name='tempest-TestGettingAddress-server-1402689375',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1402689375',id=149,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBK18BM+cTj/3sUxBzPfGqMZVFW8u0MAYl6D47npYBAuFCVOWerNtWreCBLJmXVjolkwHlCMhwEYpOxlO5EeqIXZi9GpQKv0jmJXL6Uw9pSzb3x5DyZSKVQsbQBVQ+UXj+w==',key_name='tempest-TestGettingAddress-668700412',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-izv751fp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:34:10Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=0a1598ac-7fe7-4004-ad07-c9b7428d7822,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cca30a15-3dfe-49f3-8b60-50dc1d039795", "address": "fa:16:3e:6a:4b:a1", "network": {"id": "716ed53e-cc56-4286-b418-2f5e02d33124", "bridge": "br-int", "label": "tempest-network-smoke--990040375", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe6a:4ba1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe6a:4ba1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcca30a15-3d", "ovs_interfaceid": "cca30a15-3dfe-49f3-8b60-50dc1d039795", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.604 187189 DEBUG nova.network.os_vif_util [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "cca30a15-3dfe-49f3-8b60-50dc1d039795", "address": "fa:16:3e:6a:4b:a1", "network": {"id": "716ed53e-cc56-4286-b418-2f5e02d33124", "bridge": "br-int", "label": "tempest-network-smoke--990040375", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe6a:4ba1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe6a:4ba1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcca30a15-3d", "ovs_interfaceid": "cca30a15-3dfe-49f3-8b60-50dc1d039795", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.605 187189 DEBUG nova.network.os_vif_util [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6a:4b:a1,bridge_name='br-int',has_traffic_filtering=True,id=cca30a15-3dfe-49f3-8b60-50dc1d039795,network=Network(716ed53e-cc56-4286-b418-2f5e02d33124),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcca30a15-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.605 187189 DEBUG nova.objects.instance [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'pci_devices' on Instance uuid 0a1598ac-7fe7-4004-ad07-c9b7428d7822 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.633 187189 DEBUG nova.virt.libvirt.driver [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:35:06 compute-0 nova_compute[187185]:   <uuid>0a1598ac-7fe7-4004-ad07-c9b7428d7822</uuid>
Nov 29 07:35:06 compute-0 nova_compute[187185]:   <name>instance-00000095</name>
Nov 29 07:35:06 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 07:35:06 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:35:06 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:35:06 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:       <nova:name>tempest-TestGettingAddress-server-1402689375</nova:name>
Nov 29 07:35:06 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:35:06</nova:creationTime>
Nov 29 07:35:06 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 07:35:06 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 07:35:06 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:35:06 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:35:06 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:35:06 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:35:06 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:35:06 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:35:06 compute-0 nova_compute[187185]:         <nova:user uuid="31ac7b05b012433b89143dc9f259644a">tempest-TestGettingAddress-1465017630-project-member</nova:user>
Nov 29 07:35:06 compute-0 nova_compute[187185]:         <nova:project uuid="0111c22b4b954ea586ca20d91ed3970f">tempest-TestGettingAddress-1465017630</nova:project>
Nov 29 07:35:06 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:35:06 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 07:35:06 compute-0 nova_compute[187185]:         <nova:port uuid="3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5">
Nov 29 07:35:06 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:35:06 compute-0 nova_compute[187185]:         <nova:port uuid="cca30a15-3dfe-49f3-8b60-50dc1d039795">
Nov 29 07:35:06 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fe6a:4ba1" ipVersion="6"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe6a:4ba1" ipVersion="6"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:35:06 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:35:06 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:35:06 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <system>
Nov 29 07:35:06 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:35:06 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:35:06 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:35:06 compute-0 nova_compute[187185]:       <entry name="serial">0a1598ac-7fe7-4004-ad07-c9b7428d7822</entry>
Nov 29 07:35:06 compute-0 nova_compute[187185]:       <entry name="uuid">0a1598ac-7fe7-4004-ad07-c9b7428d7822</entry>
Nov 29 07:35:06 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     </system>
Nov 29 07:35:06 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:35:06 compute-0 nova_compute[187185]:   <os>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:   </os>
Nov 29 07:35:06 compute-0 nova_compute[187185]:   <features>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:   </features>
Nov 29 07:35:06 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:35:06 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:35:06 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:35:06 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/0a1598ac-7fe7-4004-ad07-c9b7428d7822/disk"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:35:06 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/0a1598ac-7fe7-4004-ad07-c9b7428d7822/disk.config"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:35:06 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:62:77:74"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:       <target dev="tap3a32e9dc-c9"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:35:06 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:6a:4b:a1"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:       <target dev="tapcca30a15-3d"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:35:06 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/0a1598ac-7fe7-4004-ad07-c9b7428d7822/console.log" append="off"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <video>
Nov 29 07:35:06 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     </video>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:35:06 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:35:06 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:35:06 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:35:06 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:35:06 compute-0 nova_compute[187185]: </domain>
Nov 29 07:35:06 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.635 187189 DEBUG nova.compute.manager [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Preparing to wait for external event network-vif-plugged-3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.636 187189 DEBUG oslo_concurrency.lockutils [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "0a1598ac-7fe7-4004-ad07-c9b7428d7822-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.636 187189 DEBUG oslo_concurrency.lockutils [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "0a1598ac-7fe7-4004-ad07-c9b7428d7822-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.637 187189 DEBUG oslo_concurrency.lockutils [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "0a1598ac-7fe7-4004-ad07-c9b7428d7822-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.637 187189 DEBUG nova.compute.manager [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Preparing to wait for external event network-vif-plugged-cca30a15-3dfe-49f3-8b60-50dc1d039795 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.637 187189 DEBUG oslo_concurrency.lockutils [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "0a1598ac-7fe7-4004-ad07-c9b7428d7822-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.637 187189 DEBUG oslo_concurrency.lockutils [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "0a1598ac-7fe7-4004-ad07-c9b7428d7822-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.637 187189 DEBUG oslo_concurrency.lockutils [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "0a1598ac-7fe7-4004-ad07-c9b7428d7822-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.638 187189 DEBUG nova.virt.libvirt.vif [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:33:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1402689375',display_name='tempest-TestGettingAddress-server-1402689375',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1402689375',id=149,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBK18BM+cTj/3sUxBzPfGqMZVFW8u0MAYl6D47npYBAuFCVOWerNtWreCBLJmXVjolkwHlCMhwEYpOxlO5EeqIXZi9GpQKv0jmJXL6Uw9pSzb3x5DyZSKVQsbQBVQ+UXj+w==',key_name='tempest-TestGettingAddress-668700412',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-izv751fp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:34:10Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=0a1598ac-7fe7-4004-ad07-c9b7428d7822,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5", "address": "fa:16:3e:62:77:74", "network": {"id": "ee42f4e1-7038-4a24-8d9b-8ee99ca415d0", "bridge": "br-int", "label": "tempest-network-smoke--1973917221", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a32e9dc-c9", "ovs_interfaceid": "3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.638 187189 DEBUG nova.network.os_vif_util [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5", "address": "fa:16:3e:62:77:74", "network": {"id": "ee42f4e1-7038-4a24-8d9b-8ee99ca415d0", "bridge": "br-int", "label": "tempest-network-smoke--1973917221", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a32e9dc-c9", "ovs_interfaceid": "3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.639 187189 DEBUG nova.network.os_vif_util [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:62:77:74,bridge_name='br-int',has_traffic_filtering=True,id=3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5,network=Network(ee42f4e1-7038-4a24-8d9b-8ee99ca415d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a32e9dc-c9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.639 187189 DEBUG os_vif [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:77:74,bridge_name='br-int',has_traffic_filtering=True,id=3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5,network=Network(ee42f4e1-7038-4a24-8d9b-8ee99ca415d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a32e9dc-c9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.640 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.641 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.641 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.645 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.645 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3a32e9dc-c9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.646 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3a32e9dc-c9, col_values=(('external_ids', {'iface-id': '3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:62:77:74', 'vm-uuid': '0a1598ac-7fe7-4004-ad07-c9b7428d7822'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.648 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:06 compute-0 NetworkManager[55227]: <info>  [1764401706.6494] manager: (tap3a32e9dc-c9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/239)
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.649 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.660 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.661 187189 INFO os_vif [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:77:74,bridge_name='br-int',has_traffic_filtering=True,id=3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5,network=Network(ee42f4e1-7038-4a24-8d9b-8ee99ca415d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a32e9dc-c9')
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.662 187189 DEBUG nova.virt.libvirt.vif [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:33:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1402689375',display_name='tempest-TestGettingAddress-server-1402689375',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1402689375',id=149,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBK18BM+cTj/3sUxBzPfGqMZVFW8u0MAYl6D47npYBAuFCVOWerNtWreCBLJmXVjolkwHlCMhwEYpOxlO5EeqIXZi9GpQKv0jmJXL6Uw9pSzb3x5DyZSKVQsbQBVQ+UXj+w==',key_name='tempest-TestGettingAddress-668700412',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-izv751fp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:34:10Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=0a1598ac-7fe7-4004-ad07-c9b7428d7822,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cca30a15-3dfe-49f3-8b60-50dc1d039795", "address": "fa:16:3e:6a:4b:a1", "network": {"id": "716ed53e-cc56-4286-b418-2f5e02d33124", "bridge": "br-int", "label": "tempest-network-smoke--990040375", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe6a:4ba1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe6a:4ba1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcca30a15-3d", "ovs_interfaceid": "cca30a15-3dfe-49f3-8b60-50dc1d039795", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.662 187189 DEBUG nova.network.os_vif_util [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "cca30a15-3dfe-49f3-8b60-50dc1d039795", "address": "fa:16:3e:6a:4b:a1", "network": {"id": "716ed53e-cc56-4286-b418-2f5e02d33124", "bridge": "br-int", "label": "tempest-network-smoke--990040375", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe6a:4ba1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe6a:4ba1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcca30a15-3d", "ovs_interfaceid": "cca30a15-3dfe-49f3-8b60-50dc1d039795", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.663 187189 DEBUG nova.network.os_vif_util [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6a:4b:a1,bridge_name='br-int',has_traffic_filtering=True,id=cca30a15-3dfe-49f3-8b60-50dc1d039795,network=Network(716ed53e-cc56-4286-b418-2f5e02d33124),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcca30a15-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.663 187189 DEBUG os_vif [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:4b:a1,bridge_name='br-int',has_traffic_filtering=True,id=cca30a15-3dfe-49f3-8b60-50dc1d039795,network=Network(716ed53e-cc56-4286-b418-2f5e02d33124),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcca30a15-3d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.664 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.664 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.664 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.667 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.667 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcca30a15-3d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.667 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcca30a15-3d, col_values=(('external_ids', {'iface-id': 'cca30a15-3dfe-49f3-8b60-50dc1d039795', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6a:4b:a1', 'vm-uuid': '0a1598ac-7fe7-4004-ad07-c9b7428d7822'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.669 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:06 compute-0 NetworkManager[55227]: <info>  [1764401706.6703] manager: (tapcca30a15-3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/240)
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.671 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.679 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.680 187189 INFO os_vif [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:4b:a1,bridge_name='br-int',has_traffic_filtering=True,id=cca30a15-3dfe-49f3-8b60-50dc1d039795,network=Network(716ed53e-cc56-4286-b418-2f5e02d33124),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcca30a15-3d')
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.759 187189 DEBUG nova.virt.libvirt.driver [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.760 187189 DEBUG nova.virt.libvirt.driver [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.760 187189 DEBUG nova.virt.libvirt.driver [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No VIF found with MAC fa:16:3e:62:77:74, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.760 187189 DEBUG nova.virt.libvirt.driver [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No VIF found with MAC fa:16:3e:6a:4b:a1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.761 187189 INFO nova.virt.libvirt.driver [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Using config drive
Nov 29 07:35:06 compute-0 podman[239797]: 2025-11-29 07:35:06.805572018 +0000 UTC m=+0.060796905 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 07:35:06 compute-0 podman[239796]: 2025-11-29 07:35:06.814594177 +0000 UTC m=+0.071622045 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, vcs-type=git, 
com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, architecture=x86_64, config_id=edpm, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6)
Nov 29 07:35:06 compute-0 podman[239795]: 2025-11-29 07:35:06.819729534 +0000 UTC m=+0.074498277 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 29 07:35:06 compute-0 nova_compute[187185]: 2025-11-29 07:35:06.960 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:07 compute-0 nova_compute[187185]: 2025-11-29 07:35:07.753 187189 INFO nova.virt.libvirt.driver [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Creating config drive at /var/lib/nova/instances/0a1598ac-7fe7-4004-ad07-c9b7428d7822/disk.config
Nov 29 07:35:07 compute-0 nova_compute[187185]: 2025-11-29 07:35:07.759 187189 DEBUG oslo_concurrency.processutils [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0a1598ac-7fe7-4004-ad07-c9b7428d7822/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3g62o9zx execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:35:07 compute-0 nova_compute[187185]: 2025-11-29 07:35:07.894 187189 DEBUG oslo_concurrency.processutils [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0a1598ac-7fe7-4004-ad07-c9b7428d7822/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3g62o9zx" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:35:07 compute-0 NetworkManager[55227]: <info>  [1764401707.9822] manager: (tap3a32e9dc-c9): new Tun device (/org/freedesktop/NetworkManager/Devices/241)
Nov 29 07:35:07 compute-0 kernel: tap3a32e9dc-c9: entered promiscuous mode
Nov 29 07:35:07 compute-0 nova_compute[187185]: 2025-11-29 07:35:07.988 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:07 compute-0 ovn_controller[95281]: 2025-11-29T07:35:07Z|00469|binding|INFO|Claiming lport 3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5 for this chassis.
Nov 29 07:35:07 compute-0 ovn_controller[95281]: 2025-11-29T07:35:07Z|00470|binding|INFO|3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5: Claiming fa:16:3e:62:77:74 10.100.0.4
Nov 29 07:35:08 compute-0 kernel: tapcca30a15-3d: entered promiscuous mode
Nov 29 07:35:08 compute-0 NetworkManager[55227]: <info>  [1764401708.0016] manager: (tapcca30a15-3d): new Tun device (/org/freedesktop/NetworkManager/Devices/242)
Nov 29 07:35:08 compute-0 nova_compute[187185]: 2025-11-29 07:35:08.003 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:08 compute-0 nova_compute[187185]: 2025-11-29 07:35:08.008 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:08 compute-0 nova_compute[187185]: 2025-11-29 07:35:08.010 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:08 compute-0 nova_compute[187185]: 2025-11-29 07:35:08.014 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:08 compute-0 NetworkManager[55227]: <info>  [1764401708.0161] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/243)
Nov 29 07:35:08 compute-0 ovn_controller[95281]: 2025-11-29T07:35:08Z|00471|if_status|INFO|Not updating pb chassis for cca30a15-3dfe-49f3-8b60-50dc1d039795 now as sb is readonly
Nov 29 07:35:08 compute-0 NetworkManager[55227]: <info>  [1764401708.0169] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/244)
Nov 29 07:35:08 compute-0 systemd-udevd[239876]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:35:08 compute-0 systemd-udevd[239878]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:35:08 compute-0 NetworkManager[55227]: <info>  [1764401708.0433] device (tapcca30a15-3d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:35:08 compute-0 NetworkManager[55227]: <info>  [1764401708.0447] device (tapcca30a15-3d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:35:08 compute-0 NetworkManager[55227]: <info>  [1764401708.0454] device (tap3a32e9dc-c9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:35:08 compute-0 NetworkManager[55227]: <info>  [1764401708.0463] device (tap3a32e9dc-c9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:08.039 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:62:77:74 10.100.0.4'], port_security=['fa:16:3e:62:77:74 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '0a1598ac-7fe7-4004-ad07-c9b7428d7822', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7900eada-6f98-452f-b178-ad26f2b82064', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0790a775-3668-4bb8-97f1-d4276df58523, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:08.040 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5 in datapath ee42f4e1-7038-4a24-8d9b-8ee99ca415d0 bound to our chassis
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:08.042 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee42f4e1-7038-4a24-8d9b-8ee99ca415d0
Nov 29 07:35:08 compute-0 systemd-machined[153486]: New machine qemu-58-instance-00000095.
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:08.054 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[9a951d8c-59a0-4d18-b630-9059d9ca71fe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:08.055 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapee42f4e1-71 in ovnmeta-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:08.058 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapee42f4e1-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:08.058 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[3f261906-3310-43a3-9934-2db79019a8d8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:08.059 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d4717af4-81f2-4926-b54f-31d6c11dc119]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:08.073 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[ca9eff40-af50-4a42-8c69-4d08cbc43fdc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:08 compute-0 systemd[1]: Started Virtual Machine qemu-58-instance-00000095.
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:08.099 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[a476b695-45cc-429c-8b33-bd3f08081bb2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:08.143 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[89114af0-b26f-44d5-a043-f6ad9d3bb099]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:08.163 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d62e90ab-530c-4ab5-89ef-4db60f0203a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:08 compute-0 NetworkManager[55227]: <info>  [1764401708.1651] manager: (tapee42f4e1-70): new Veth device (/org/freedesktop/NetworkManager/Devices/245)
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:08.206 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[b20d4fad-ed1c-4c7f-9391-bbf928169955]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:08.211 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[813b0bcc-0fb0-4820-9425-b93d8aa50461]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:08 compute-0 nova_compute[187185]: 2025-11-29 07:35:08.216 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:08 compute-0 ovn_controller[95281]: 2025-11-29T07:35:08Z|00472|binding|INFO|Claiming lport cca30a15-3dfe-49f3-8b60-50dc1d039795 for this chassis.
Nov 29 07:35:08 compute-0 ovn_controller[95281]: 2025-11-29T07:35:08Z|00473|binding|INFO|cca30a15-3dfe-49f3-8b60-50dc1d039795: Claiming fa:16:3e:6a:4b:a1 2001:db8:0:1:f816:3eff:fe6a:4ba1 2001:db8::f816:3eff:fe6a:4ba1
Nov 29 07:35:08 compute-0 ovn_controller[95281]: 2025-11-29T07:35:08Z|00474|binding|INFO|Setting lport 3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5 ovn-installed in OVS
Nov 29 07:35:08 compute-0 NetworkManager[55227]: <info>  [1764401708.2387] device (tapee42f4e1-70): carrier: link connected
Nov 29 07:35:08 compute-0 nova_compute[187185]: 2025-11-29 07:35:08.240 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:08.244 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[95466d9f-5d2a-48ab-b71a-5894f1cfed80]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:08 compute-0 ovn_controller[95281]: 2025-11-29T07:35:08Z|00475|binding|INFO|Setting lport cca30a15-3dfe-49f3-8b60-50dc1d039795 ovn-installed in OVS
Nov 29 07:35:08 compute-0 nova_compute[187185]: 2025-11-29 07:35:08.259 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:08.263 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[07fe285b-ab4e-4b0e-a3bf-4d4a8538890e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee42f4e1-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bd:a6:7e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 150], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 715590, 'reachable_time': 28678, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239912, 'error': None, 'target': 'ovnmeta-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:08.280 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6f73cc04-c90a-480a-9e3d-21f7281297a3]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febd:a67e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 715590, 'tstamp': 715590}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239913, 'error': None, 'target': 'ovnmeta-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:08.299 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e32116c4-7aa4-4016-a2d5-87c09204c96b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee42f4e1-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bd:a6:7e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 150], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 715590, 'reachable_time': 28678, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 239914, 'error': None, 'target': 'ovnmeta-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:08.337 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d4851db6-526f-4264-bf4a-f8153c9ea63b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:08.408 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[24e102ba-beb9-4e33-b241-532881352886]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:08.411 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee42f4e1-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:08.411 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:08.412 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee42f4e1-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:35:08 compute-0 kernel: tapee42f4e1-70: entered promiscuous mode
Nov 29 07:35:08 compute-0 NetworkManager[55227]: <info>  [1764401708.4173] manager: (tapee42f4e1-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/246)
Nov 29 07:35:08 compute-0 nova_compute[187185]: 2025-11-29 07:35:08.417 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:08.420 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee42f4e1-70, col_values=(('external_ids', {'iface-id': '78e8cb8e-6743-4ef2-8e7c-19feddf2ed97'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:35:08 compute-0 nova_compute[187185]: 2025-11-29 07:35:08.421 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:08 compute-0 ovn_controller[95281]: 2025-11-29T07:35:08Z|00476|binding|INFO|Releasing lport 78e8cb8e-6743-4ef2-8e7c-19feddf2ed97 from this chassis (sb_readonly=1)
Nov 29 07:35:08 compute-0 nova_compute[187185]: 2025-11-29 07:35:08.447 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:08.449 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ee42f4e1-7038-4a24-8d9b-8ee99ca415d0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ee42f4e1-7038-4a24-8d9b-8ee99ca415d0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:08.450 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[68baa471-6800-4e9e-bf92-add964a71019]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:08.450 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/ee42f4e1-7038-4a24-8d9b-8ee99ca415d0.pid.haproxy
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID ee42f4e1-7038-4a24-8d9b-8ee99ca415d0
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:08.451 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0', 'env', 'PROCESS_TAG=haproxy-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ee42f4e1-7038-4a24-8d9b-8ee99ca415d0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:35:08 compute-0 nova_compute[187185]: 2025-11-29 07:35:08.472 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764401708.4713564, 0a1598ac-7fe7-4004-ad07-c9b7428d7822 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:35:08 compute-0 nova_compute[187185]: 2025-11-29 07:35:08.472 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] VM Started (Lifecycle Event)
Nov 29 07:35:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:08.723 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6a:4b:a1 2001:db8:0:1:f816:3eff:fe6a:4ba1 2001:db8::f816:3eff:fe6a:4ba1'], port_security=['fa:16:3e:6a:4b:a1 2001:db8:0:1:f816:3eff:fe6a:4ba1 2001:db8::f816:3eff:fe6a:4ba1'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe6a:4ba1/64 2001:db8::f816:3eff:fe6a:4ba1/64', 'neutron:device_id': '0a1598ac-7fe7-4004-ad07-c9b7428d7822', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-716ed53e-cc56-4286-b418-2f5e02d33124', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7900eada-6f98-452f-b178-ad26f2b82064', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c069d1db-d7e5-4641-988e-cd6e75103caa, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=cca30a15-3dfe-49f3-8b60-50dc1d039795) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:35:08 compute-0 ovn_controller[95281]: 2025-11-29T07:35:08Z|00477|binding|INFO|Setting lport cca30a15-3dfe-49f3-8b60-50dc1d039795 up in Southbound
Nov 29 07:35:08 compute-0 ovn_controller[95281]: 2025-11-29T07:35:08Z|00478|binding|INFO|Setting lport 3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5 up in Southbound
Nov 29 07:35:08 compute-0 nova_compute[187185]: 2025-11-29 07:35:08.831 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:35:08 compute-0 nova_compute[187185]: 2025-11-29 07:35:08.838 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764401708.4715598, 0a1598ac-7fe7-4004-ad07-c9b7428d7822 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:35:08 compute-0 nova_compute[187185]: 2025-11-29 07:35:08.838 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] VM Paused (Lifecycle Event)
Nov 29 07:35:08 compute-0 nova_compute[187185]: 2025-11-29 07:35:08.889 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:35:08 compute-0 nova_compute[187185]: 2025-11-29 07:35:08.893 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:35:08 compute-0 nova_compute[187185]: 2025-11-29 07:35:08.933 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:35:09 compute-0 podman[239954]: 2025-11-29 07:35:08.914631058 +0000 UTC m=+0.023140884 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:35:09 compute-0 podman[239954]: 2025-11-29 07:35:09.595436635 +0000 UTC m=+0.703946431 container create 4ac20c242a1c358d32fac0b34af1e03750ed6fc415de3adb3019035ae1bb7e10 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 07:35:09 compute-0 nova_compute[187185]: 2025-11-29 07:35:09.689 187189 DEBUG nova.compute.manager [req-b9f005f6-0397-44a1-85d8-34c1343d00e8 req-4a2b4f9d-6b76-457c-9bec-fb4d22978f70 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Received event network-vif-plugged-cca30a15-3dfe-49f3-8b60-50dc1d039795 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:35:09 compute-0 nova_compute[187185]: 2025-11-29 07:35:09.692 187189 DEBUG oslo_concurrency.lockutils [req-b9f005f6-0397-44a1-85d8-34c1343d00e8 req-4a2b4f9d-6b76-457c-9bec-fb4d22978f70 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "0a1598ac-7fe7-4004-ad07-c9b7428d7822-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:35:09 compute-0 nova_compute[187185]: 2025-11-29 07:35:09.692 187189 DEBUG oslo_concurrency.lockutils [req-b9f005f6-0397-44a1-85d8-34c1343d00e8 req-4a2b4f9d-6b76-457c-9bec-fb4d22978f70 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0a1598ac-7fe7-4004-ad07-c9b7428d7822-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:35:09 compute-0 nova_compute[187185]: 2025-11-29 07:35:09.692 187189 DEBUG oslo_concurrency.lockutils [req-b9f005f6-0397-44a1-85d8-34c1343d00e8 req-4a2b4f9d-6b76-457c-9bec-fb4d22978f70 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0a1598ac-7fe7-4004-ad07-c9b7428d7822-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:35:09 compute-0 nova_compute[187185]: 2025-11-29 07:35:09.693 187189 DEBUG nova.compute.manager [req-b9f005f6-0397-44a1-85d8-34c1343d00e8 req-4a2b4f9d-6b76-457c-9bec-fb4d22978f70 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Processing event network-vif-plugged-cca30a15-3dfe-49f3-8b60-50dc1d039795 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 07:35:09 compute-0 systemd[1]: Started libpod-conmon-4ac20c242a1c358d32fac0b34af1e03750ed6fc415de3adb3019035ae1bb7e10.scope.
Nov 29 07:35:09 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:35:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d1ed4379c7927cc2537b696b8eb11bf17d1616abc7f5929465b99a3cf6e60f8a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:35:10 compute-0 nova_compute[187185]: 2025-11-29 07:35:10.096 187189 DEBUG nova.compute.manager [req-d26153f6-106f-42df-8665-15a28cd25cf4 req-f8c03c33-b53c-4cce-a0ea-dcfd49991ee5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Received event network-vif-plugged-3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:35:10 compute-0 nova_compute[187185]: 2025-11-29 07:35:10.097 187189 DEBUG oslo_concurrency.lockutils [req-d26153f6-106f-42df-8665-15a28cd25cf4 req-f8c03c33-b53c-4cce-a0ea-dcfd49991ee5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "0a1598ac-7fe7-4004-ad07-c9b7428d7822-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:35:10 compute-0 nova_compute[187185]: 2025-11-29 07:35:10.098 187189 DEBUG oslo_concurrency.lockutils [req-d26153f6-106f-42df-8665-15a28cd25cf4 req-f8c03c33-b53c-4cce-a0ea-dcfd49991ee5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0a1598ac-7fe7-4004-ad07-c9b7428d7822-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:35:10 compute-0 nova_compute[187185]: 2025-11-29 07:35:10.098 187189 DEBUG oslo_concurrency.lockutils [req-d26153f6-106f-42df-8665-15a28cd25cf4 req-f8c03c33-b53c-4cce-a0ea-dcfd49991ee5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0a1598ac-7fe7-4004-ad07-c9b7428d7822-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:35:10 compute-0 nova_compute[187185]: 2025-11-29 07:35:10.099 187189 DEBUG nova.compute.manager [req-d26153f6-106f-42df-8665-15a28cd25cf4 req-f8c03c33-b53c-4cce-a0ea-dcfd49991ee5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Processing event network-vif-plugged-3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 07:35:10 compute-0 nova_compute[187185]: 2025-11-29 07:35:10.100 187189 DEBUG nova.compute.manager [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Instance event wait completed in 1 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:35:10 compute-0 nova_compute[187185]: 2025-11-29 07:35:10.106 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764401710.1060195, 0a1598ac-7fe7-4004-ad07-c9b7428d7822 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:35:10 compute-0 nova_compute[187185]: 2025-11-29 07:35:10.107 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] VM Resumed (Lifecycle Event)
Nov 29 07:35:10 compute-0 nova_compute[187185]: 2025-11-29 07:35:10.110 187189 DEBUG nova.virt.libvirt.driver [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 07:35:10 compute-0 nova_compute[187185]: 2025-11-29 07:35:10.113 187189 INFO nova.virt.libvirt.driver [-] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Instance spawned successfully.
Nov 29 07:35:10 compute-0 nova_compute[187185]: 2025-11-29 07:35:10.114 187189 DEBUG nova.virt.libvirt.driver [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 07:35:10 compute-0 podman[239954]: 2025-11-29 07:35:10.122884703 +0000 UTC m=+1.231394529 container init 4ac20c242a1c358d32fac0b34af1e03750ed6fc415de3adb3019035ae1bb7e10 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 29 07:35:10 compute-0 podman[239954]: 2025-11-29 07:35:10.135369231 +0000 UTC m=+1.243879057 container start 4ac20c242a1c358d32fac0b34af1e03750ed6fc415de3adb3019035ae1bb7e10 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 29 07:35:10 compute-0 nova_compute[187185]: 2025-11-29 07:35:10.143 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:35:10 compute-0 nova_compute[187185]: 2025-11-29 07:35:10.151 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:35:10 compute-0 nova_compute[187185]: 2025-11-29 07:35:10.155 187189 DEBUG nova.virt.libvirt.driver [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:35:10 compute-0 nova_compute[187185]: 2025-11-29 07:35:10.156 187189 DEBUG nova.virt.libvirt.driver [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:35:10 compute-0 nova_compute[187185]: 2025-11-29 07:35:10.156 187189 DEBUG nova.virt.libvirt.driver [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:35:10 compute-0 nova_compute[187185]: 2025-11-29 07:35:10.157 187189 DEBUG nova.virt.libvirt.driver [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:35:10 compute-0 nova_compute[187185]: 2025-11-29 07:35:10.157 187189 DEBUG nova.virt.libvirt.driver [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:35:10 compute-0 nova_compute[187185]: 2025-11-29 07:35:10.158 187189 DEBUG nova.virt.libvirt.driver [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:35:10 compute-0 neutron-haproxy-ovnmeta-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0[239970]: [NOTICE]   (239974) : New worker (239976) forked
Nov 29 07:35:10 compute-0 neutron-haproxy-ovnmeta-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0[239970]: [NOTICE]   (239974) : Loading success.
Nov 29 07:35:10 compute-0 nova_compute[187185]: 2025-11-29 07:35:10.202 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:10.213 104254 INFO neutron.agent.ovn.metadata.agent [-] Port cca30a15-3dfe-49f3-8b60-50dc1d039795 in datapath 716ed53e-cc56-4286-b418-2f5e02d33124 unbound from our chassis
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:10.215 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 716ed53e-cc56-4286-b418-2f5e02d33124
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:10.234 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e7848d42-41a0-4f7f-9629-8f83874b6c00]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:10.244 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap716ed53e-c1 in ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:10.247 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap716ed53e-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:10.247 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[a0459c7d-a299-42c0-ba96-806519af2554]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:10.248 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[494097b6-244d-4e29-8d56-8e0a08609a6f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:10.264 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[47b02593-78fa-492d-b124-9d91f2384bed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:10.283 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[72140cf1-cad9-4fda-9226-6f429f1b1c2e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:10.319 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[93dd8812-39ed-4b05-8d58-2594f67f5d23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:10 compute-0 nova_compute[187185]: 2025-11-29 07:35:10.321 187189 INFO nova.compute.manager [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Took 50.02 seconds to spawn the instance on the hypervisor.
Nov 29 07:35:10 compute-0 nova_compute[187185]: 2025-11-29 07:35:10.323 187189 DEBUG nova.compute.manager [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:10.326 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[28f08b4e-8b8a-4dc8-b2d2-02598690c190]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:10 compute-0 systemd-udevd[239903]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:35:10 compute-0 NetworkManager[55227]: <info>  [1764401710.3323] manager: (tap716ed53e-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/247)
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:10.375 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[57d97468-bd94-48f5-8e6f-f19092a21572]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:10.379 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[1f2ed1c0-bd9b-4e4a-9993-5f96a6143eaf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:10 compute-0 NetworkManager[55227]: <info>  [1764401710.4056] device (tap716ed53e-c0): carrier: link connected
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:10.413 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[caa91efb-78ab-464e-8b89-c8d3ab8f1693]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:10.431 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[8fc6403b-a5fc-4891-8848-1328dde80c17]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap716ed53e-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4a:f1:bd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 151], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 715806, 'reachable_time': 42683, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239996, 'error': None, 'target': 'ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:10.452 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d6c6b12f-be96-41e9-846a-10ca8fa57623]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4a:f1bd'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 715806, 'tstamp': 715806}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239997, 'error': None, 'target': 'ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:10 compute-0 nova_compute[187185]: 2025-11-29 07:35:10.457 187189 INFO nova.compute.manager [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Took 64.77 seconds to build instance.
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:10.471 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[0479280f-5957-44da-8d9d-5ef67374a3aa]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap716ed53e-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4a:f1:bd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 151], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 715806, 'reachable_time': 42683, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 239998, 'error': None, 'target': 'ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:10 compute-0 nova_compute[187185]: 2025-11-29 07:35:10.493 187189 DEBUG oslo_concurrency.lockutils [None req-71ab529c-0ebe-42c4-98f3-f5a3f7b0b823 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "0a1598ac-7fe7-4004-ad07-c9b7428d7822" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 69.382s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:10.503 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[cfe1df50-3219-486e-8cf3-9aa2113f9233]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:10.539 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[13044464-f7d0-46ca-a6db-209607b0f437]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:10.541 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap716ed53e-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:10.542 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:10.542 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap716ed53e-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:35:10 compute-0 kernel: tap716ed53e-c0: entered promiscuous mode
Nov 29 07:35:10 compute-0 NetworkManager[55227]: <info>  [1764401710.5452] manager: (tap716ed53e-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/248)
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:10.549 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap716ed53e-c0, col_values=(('external_ids', {'iface-id': 'b0b0536c-6e35-42c5-8936-a1236a4f216e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:35:10 compute-0 ovn_controller[95281]: 2025-11-29T07:35:10Z|00479|binding|INFO|Releasing lport b0b0536c-6e35-42c5-8936-a1236a4f216e from this chassis (sb_readonly=0)
Nov 29 07:35:10 compute-0 nova_compute[187185]: 2025-11-29 07:35:10.549 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:10 compute-0 nova_compute[187185]: 2025-11-29 07:35:10.564 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:10.566 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/716ed53e-cc56-4286-b418-2f5e02d33124.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/716ed53e-cc56-4286-b418-2f5e02d33124.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:10.567 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e6c31412-b1f6-48c8-b070-b34bbf5f5959]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:10.568 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-716ed53e-cc56-4286-b418-2f5e02d33124
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/716ed53e-cc56-4286-b418-2f5e02d33124.pid.haproxy
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID 716ed53e-cc56-4286-b418-2f5e02d33124
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:35:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:10.569 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124', 'env', 'PROCESS_TAG=haproxy-716ed53e-cc56-4286-b418-2f5e02d33124', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/716ed53e-cc56-4286-b418-2f5e02d33124.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:35:10 compute-0 nova_compute[187185]: 2025-11-29 07:35:10.889 187189 DEBUG nova.network.neutron [req-04656a8a-431e-48a0-9d29-0716cd9d16be req-5a69b7b8-2a40-41a6-8b7e-4fef32442bc7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Updated VIF entry in instance network info cache for port cca30a15-3dfe-49f3-8b60-50dc1d039795. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:35:10 compute-0 nova_compute[187185]: 2025-11-29 07:35:10.890 187189 DEBUG nova.network.neutron [req-04656a8a-431e-48a0-9d29-0716cd9d16be req-5a69b7b8-2a40-41a6-8b7e-4fef32442bc7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Updating instance_info_cache with network_info: [{"id": "3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5", "address": "fa:16:3e:62:77:74", "network": {"id": "ee42f4e1-7038-4a24-8d9b-8ee99ca415d0", "bridge": "br-int", "label": "tempest-network-smoke--1973917221", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a32e9dc-c9", "ovs_interfaceid": "3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "cca30a15-3dfe-49f3-8b60-50dc1d039795", "address": "fa:16:3e:6a:4b:a1", "network": {"id": "716ed53e-cc56-4286-b418-2f5e02d33124", "bridge": "br-int", "label": "tempest-network-smoke--990040375", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe6a:4ba1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], 
"gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe6a:4ba1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcca30a15-3d", "ovs_interfaceid": "cca30a15-3dfe-49f3-8b60-50dc1d039795", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:35:11 compute-0 nova_compute[187185]: 2025-11-29 07:35:11.009 187189 DEBUG oslo_concurrency.lockutils [req-04656a8a-431e-48a0-9d29-0716cd9d16be req-5a69b7b8-2a40-41a6-8b7e-4fef32442bc7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-0a1598ac-7fe7-4004-ad07-c9b7428d7822" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:35:11 compute-0 podman[240029]: 2025-11-29 07:35:10.959472507 +0000 UTC m=+0.032429620 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:35:11 compute-0 podman[240029]: 2025-11-29 07:35:11.323877719 +0000 UTC m=+0.396834842 container create e37448a287192cb2344f84c6c8d53a306d1bbe4e7474ac0cdbd933e55b45a3eb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 07:35:11 compute-0 systemd[1]: Started libpod-conmon-e37448a287192cb2344f84c6c8d53a306d1bbe4e7474ac0cdbd933e55b45a3eb.scope.
Nov 29 07:35:11 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:35:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/804a92621ba746f77d052e4232d066b14ef35779de78486fcda7eb4cfb823a6f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:35:11 compute-0 podman[240029]: 2025-11-29 07:35:11.618085667 +0000 UTC m=+0.691042800 container init e37448a287192cb2344f84c6c8d53a306d1bbe4e7474ac0cdbd933e55b45a3eb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:35:11 compute-0 podman[240029]: 2025-11-29 07:35:11.631870942 +0000 UTC m=+0.704828065 container start e37448a287192cb2344f84c6c8d53a306d1bbe4e7474ac0cdbd933e55b45a3eb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 29 07:35:11 compute-0 neutron-haproxy-ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124[240044]: [NOTICE]   (240048) : New worker (240050) forked
Nov 29 07:35:11 compute-0 neutron-haproxy-ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124[240044]: [NOTICE]   (240048) : Loading success.
Nov 29 07:35:11 compute-0 nova_compute[187185]: 2025-11-29 07:35:11.670 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:11 compute-0 nova_compute[187185]: 2025-11-29 07:35:11.967 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:12 compute-0 nova_compute[187185]: 2025-11-29 07:35:12.811 187189 DEBUG nova.compute.manager [req-1bf01349-8db9-4d62-8062-029c3a55d5c0 req-6af709a3-4922-4c30-a142-84cf5fbdaeec 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Received event network-vif-plugged-cca30a15-3dfe-49f3-8b60-50dc1d039795 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:35:12 compute-0 nova_compute[187185]: 2025-11-29 07:35:12.812 187189 DEBUG oslo_concurrency.lockutils [req-1bf01349-8db9-4d62-8062-029c3a55d5c0 req-6af709a3-4922-4c30-a142-84cf5fbdaeec 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "0a1598ac-7fe7-4004-ad07-c9b7428d7822-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:35:12 compute-0 nova_compute[187185]: 2025-11-29 07:35:12.812 187189 DEBUG oslo_concurrency.lockutils [req-1bf01349-8db9-4d62-8062-029c3a55d5c0 req-6af709a3-4922-4c30-a142-84cf5fbdaeec 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0a1598ac-7fe7-4004-ad07-c9b7428d7822-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:35:12 compute-0 nova_compute[187185]: 2025-11-29 07:35:12.813 187189 DEBUG oslo_concurrency.lockutils [req-1bf01349-8db9-4d62-8062-029c3a55d5c0 req-6af709a3-4922-4c30-a142-84cf5fbdaeec 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0a1598ac-7fe7-4004-ad07-c9b7428d7822-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:35:12 compute-0 nova_compute[187185]: 2025-11-29 07:35:12.813 187189 DEBUG nova.compute.manager [req-1bf01349-8db9-4d62-8062-029c3a55d5c0 req-6af709a3-4922-4c30-a142-84cf5fbdaeec 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] No waiting events found dispatching network-vif-plugged-cca30a15-3dfe-49f3-8b60-50dc1d039795 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:35:12 compute-0 nova_compute[187185]: 2025-11-29 07:35:12.814 187189 WARNING nova.compute.manager [req-1bf01349-8db9-4d62-8062-029c3a55d5c0 req-6af709a3-4922-4c30-a142-84cf5fbdaeec 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Received unexpected event network-vif-plugged-cca30a15-3dfe-49f3-8b60-50dc1d039795 for instance with vm_state active and task_state None.
Nov 29 07:35:14 compute-0 nova_compute[187185]: 2025-11-29 07:35:14.017 187189 DEBUG nova.compute.manager [req-039f00ed-7dde-4eff-a07b-7a6cb8eed8cd req-ff40570a-c149-4dae-a250-9ed955f9c136 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Received event network-vif-plugged-3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:35:14 compute-0 nova_compute[187185]: 2025-11-29 07:35:14.018 187189 DEBUG oslo_concurrency.lockutils [req-039f00ed-7dde-4eff-a07b-7a6cb8eed8cd req-ff40570a-c149-4dae-a250-9ed955f9c136 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "0a1598ac-7fe7-4004-ad07-c9b7428d7822-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:35:14 compute-0 nova_compute[187185]: 2025-11-29 07:35:14.018 187189 DEBUG oslo_concurrency.lockutils [req-039f00ed-7dde-4eff-a07b-7a6cb8eed8cd req-ff40570a-c149-4dae-a250-9ed955f9c136 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0a1598ac-7fe7-4004-ad07-c9b7428d7822-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:35:14 compute-0 nova_compute[187185]: 2025-11-29 07:35:14.019 187189 DEBUG oslo_concurrency.lockutils [req-039f00ed-7dde-4eff-a07b-7a6cb8eed8cd req-ff40570a-c149-4dae-a250-9ed955f9c136 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0a1598ac-7fe7-4004-ad07-c9b7428d7822-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:35:14 compute-0 nova_compute[187185]: 2025-11-29 07:35:14.019 187189 DEBUG nova.compute.manager [req-039f00ed-7dde-4eff-a07b-7a6cb8eed8cd req-ff40570a-c149-4dae-a250-9ed955f9c136 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] No waiting events found dispatching network-vif-plugged-3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:35:14 compute-0 nova_compute[187185]: 2025-11-29 07:35:14.020 187189 WARNING nova.compute.manager [req-039f00ed-7dde-4eff-a07b-7a6cb8eed8cd req-ff40570a-c149-4dae-a250-9ed955f9c136 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Received unexpected event network-vif-plugged-3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5 for instance with vm_state active and task_state None.
Nov 29 07:35:15 compute-0 podman[240059]: 2025-11-29 07:35:15.850097447 +0000 UTC m=+0.118139836 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 07:35:16 compute-0 nova_compute[187185]: 2025-11-29 07:35:16.673 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:16 compute-0 nova_compute[187185]: 2025-11-29 07:35:16.969 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:18 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:18.176 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=53, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=52) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:35:18 compute-0 nova_compute[187185]: 2025-11-29 07:35:18.179 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:18 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:18.181 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:35:20 compute-0 nova_compute[187185]: 2025-11-29 07:35:20.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:35:20 compute-0 sshd-session[239986]: error: kex_exchange_identification: read: Connection timed out
Nov 29 07:35:20 compute-0 sshd-session[239986]: banner exchange: Connection from 115.190.99.78 port 37058: Connection timed out
Nov 29 07:35:21 compute-0 nova_compute[187185]: 2025-11-29 07:35:21.319 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:35:21 compute-0 nova_compute[187185]: 2025-11-29 07:35:21.320 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:35:21 compute-0 nova_compute[187185]: 2025-11-29 07:35:21.320 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:35:21 compute-0 nova_compute[187185]: 2025-11-29 07:35:21.676 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:21 compute-0 ovn_controller[95281]: 2025-11-29T07:35:21Z|00061|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:62:77:74 10.100.0.4
Nov 29 07:35:21 compute-0 ovn_controller[95281]: 2025-11-29T07:35:21Z|00062|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:62:77:74 10.100.0.4
Nov 29 07:35:21 compute-0 nova_compute[187185]: 2025-11-29 07:35:21.970 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:22 compute-0 nova_compute[187185]: 2025-11-29 07:35:22.810 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "refresh_cache-0a1598ac-7fe7-4004-ad07-c9b7428d7822" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:35:22 compute-0 nova_compute[187185]: 2025-11-29 07:35:22.810 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquired lock "refresh_cache-0a1598ac-7fe7-4004-ad07-c9b7428d7822" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:35:22 compute-0 nova_compute[187185]: 2025-11-29 07:35:22.811 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 29 07:35:22 compute-0 nova_compute[187185]: 2025-11-29 07:35:22.811 187189 DEBUG nova.objects.instance [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 0a1598ac-7fe7-4004-ad07-c9b7428d7822 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:35:23 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:23.184 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '53'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:35:23 compute-0 podman[240103]: 2025-11-29 07:35:23.816610956 +0000 UTC m=+0.072811983 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 07:35:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:25.524 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:35:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:25.525 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:35:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:25.526 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:35:26 compute-0 nova_compute[187185]: 2025-11-29 07:35:26.679 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:26 compute-0 nova_compute[187185]: 2025-11-29 07:35:26.973 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:27 compute-0 nova_compute[187185]: 2025-11-29 07:35:27.457 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:27 compute-0 podman[240128]: 2025-11-29 07:35:27.803868407 +0000 UTC m=+0.062785879 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 29 07:35:27 compute-0 podman[240129]: 2025-11-29 07:35:27.808903899 +0000 UTC m=+0.065438994 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, 
tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:35:28 compute-0 nova_compute[187185]: 2025-11-29 07:35:28.443 187189 DEBUG nova.compute.manager [req-ec656717-08c2-4674-8e3b-565fb2082953 req-fc8a240c-c917-49fa-b66b-2378ec2f9124 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Received event network-changed-3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:35:28 compute-0 nova_compute[187185]: 2025-11-29 07:35:28.445 187189 DEBUG nova.compute.manager [req-ec656717-08c2-4674-8e3b-565fb2082953 req-fc8a240c-c917-49fa-b66b-2378ec2f9124 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Refreshing instance network info cache due to event network-changed-3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:35:28 compute-0 nova_compute[187185]: 2025-11-29 07:35:28.446 187189 DEBUG oslo_concurrency.lockutils [req-ec656717-08c2-4674-8e3b-565fb2082953 req-fc8a240c-c917-49fa-b66b-2378ec2f9124 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-0a1598ac-7fe7-4004-ad07-c9b7428d7822" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:35:31 compute-0 nova_compute[187185]: 2025-11-29 07:35:31.665 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Updating instance_info_cache with network_info: [{"id": "3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5", "address": "fa:16:3e:62:77:74", "network": {"id": "ee42f4e1-7038-4a24-8d9b-8ee99ca415d0", "bridge": "br-int", "label": "tempest-network-smoke--1973917221", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a32e9dc-c9", "ovs_interfaceid": "3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "cca30a15-3dfe-49f3-8b60-50dc1d039795", "address": "fa:16:3e:6a:4b:a1", "network": {"id": "716ed53e-cc56-4286-b418-2f5e02d33124", "bridge": "br-int", "label": "tempest-network-smoke--990040375", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe6a:4ba1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", 
"type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe6a:4ba1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcca30a15-3d", "ovs_interfaceid": "cca30a15-3dfe-49f3-8b60-50dc1d039795", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:35:31 compute-0 nova_compute[187185]: 2025-11-29 07:35:31.682 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:31 compute-0 nova_compute[187185]: 2025-11-29 07:35:31.694 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Releasing lock "refresh_cache-0a1598ac-7fe7-4004-ad07-c9b7428d7822" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:35:31 compute-0 nova_compute[187185]: 2025-11-29 07:35:31.695 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 29 07:35:31 compute-0 nova_compute[187185]: 2025-11-29 07:35:31.695 187189 DEBUG oslo_concurrency.lockutils [req-ec656717-08c2-4674-8e3b-565fb2082953 req-fc8a240c-c917-49fa-b66b-2378ec2f9124 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-0a1598ac-7fe7-4004-ad07-c9b7428d7822" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:35:31 compute-0 nova_compute[187185]: 2025-11-29 07:35:31.695 187189 DEBUG nova.network.neutron [req-ec656717-08c2-4674-8e3b-565fb2082953 req-fc8a240c-c917-49fa-b66b-2378ec2f9124 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Refreshing network info cache for port 3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:35:31 compute-0 nova_compute[187185]: 2025-11-29 07:35:31.696 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:35:31 compute-0 nova_compute[187185]: 2025-11-29 07:35:31.697 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:35:31 compute-0 nova_compute[187185]: 2025-11-29 07:35:31.697 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:35:31 compute-0 nova_compute[187185]: 2025-11-29 07:35:31.697 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:35:31 compute-0 nova_compute[187185]: 2025-11-29 07:35:31.698 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:35:31 compute-0 nova_compute[187185]: 2025-11-29 07:35:31.698 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:35:31 compute-0 nova_compute[187185]: 2025-11-29 07:35:31.698 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:35:31 compute-0 nova_compute[187185]: 2025-11-29 07:35:31.723 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:35:31 compute-0 nova_compute[187185]: 2025-11-29 07:35:31.723 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:35:31 compute-0 nova_compute[187185]: 2025-11-29 07:35:31.723 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:35:31 compute-0 nova_compute[187185]: 2025-11-29 07:35:31.724 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:35:31 compute-0 nova_compute[187185]: 2025-11-29 07:35:31.804 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0a1598ac-7fe7-4004-ad07-c9b7428d7822/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:35:31 compute-0 nova_compute[187185]: 2025-11-29 07:35:31.902 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0a1598ac-7fe7-4004-ad07-c9b7428d7822/disk --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:35:31 compute-0 nova_compute[187185]: 2025-11-29 07:35:31.903 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0a1598ac-7fe7-4004-ad07-c9b7428d7822/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:35:31 compute-0 nova_compute[187185]: 2025-11-29 07:35:31.976 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:31 compute-0 nova_compute[187185]: 2025-11-29 07:35:31.984 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0a1598ac-7fe7-4004-ad07-c9b7428d7822/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:35:32 compute-0 nova_compute[187185]: 2025-11-29 07:35:32.209 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:35:32 compute-0 nova_compute[187185]: 2025-11-29 07:35:32.211 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5532MB free_disk=73.21580123901367GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:35:32 compute-0 nova_compute[187185]: 2025-11-29 07:35:32.212 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:35:32 compute-0 nova_compute[187185]: 2025-11-29 07:35:32.212 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:35:32 compute-0 nova_compute[187185]: 2025-11-29 07:35:32.335 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance 0a1598ac-7fe7-4004-ad07-c9b7428d7822 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 07:35:32 compute-0 nova_compute[187185]: 2025-11-29 07:35:32.335 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:35:32 compute-0 nova_compute[187185]: 2025-11-29 07:35:32.336 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:35:32 compute-0 nova_compute[187185]: 2025-11-29 07:35:32.406 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:35:32 compute-0 nova_compute[187185]: 2025-11-29 07:35:32.447 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:35:32 compute-0 nova_compute[187185]: 2025-11-29 07:35:32.457 187189 DEBUG nova.compute.manager [req-63fc076f-dfe4-464f-a153-8f172300f585 req-affffac4-a8b5-49d7-8113-d6ee67906754 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Received event network-changed-3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:35:32 compute-0 nova_compute[187185]: 2025-11-29 07:35:32.458 187189 DEBUG nova.compute.manager [req-63fc076f-dfe4-464f-a153-8f172300f585 req-affffac4-a8b5-49d7-8113-d6ee67906754 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Refreshing instance network info cache due to event network-changed-3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:35:32 compute-0 nova_compute[187185]: 2025-11-29 07:35:32.459 187189 DEBUG oslo_concurrency.lockutils [req-63fc076f-dfe4-464f-a153-8f172300f585 req-affffac4-a8b5-49d7-8113-d6ee67906754 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-0a1598ac-7fe7-4004-ad07-c9b7428d7822" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:35:32 compute-0 nova_compute[187185]: 2025-11-29 07:35:32.606 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:35:32 compute-0 nova_compute[187185]: 2025-11-29 07:35:32.606 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.394s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:35:32 compute-0 nova_compute[187185]: 2025-11-29 07:35:32.965 187189 DEBUG oslo_concurrency.lockutils [None req-08cbea0e-ee3a-446c-a32b-eb10dcc22daa 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "0a1598ac-7fe7-4004-ad07-c9b7428d7822" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:35:32 compute-0 nova_compute[187185]: 2025-11-29 07:35:32.966 187189 DEBUG oslo_concurrency.lockutils [None req-08cbea0e-ee3a-446c-a32b-eb10dcc22daa 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "0a1598ac-7fe7-4004-ad07-c9b7428d7822" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:35:32 compute-0 nova_compute[187185]: 2025-11-29 07:35:32.966 187189 DEBUG oslo_concurrency.lockutils [None req-08cbea0e-ee3a-446c-a32b-eb10dcc22daa 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "0a1598ac-7fe7-4004-ad07-c9b7428d7822-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:35:32 compute-0 nova_compute[187185]: 2025-11-29 07:35:32.966 187189 DEBUG oslo_concurrency.lockutils [None req-08cbea0e-ee3a-446c-a32b-eb10dcc22daa 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "0a1598ac-7fe7-4004-ad07-c9b7428d7822-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:35:32 compute-0 nova_compute[187185]: 2025-11-29 07:35:32.967 187189 DEBUG oslo_concurrency.lockutils [None req-08cbea0e-ee3a-446c-a32b-eb10dcc22daa 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "0a1598ac-7fe7-4004-ad07-c9b7428d7822-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:35:32 compute-0 nova_compute[187185]: 2025-11-29 07:35:32.982 187189 INFO nova.compute.manager [None req-08cbea0e-ee3a-446c-a32b-eb10dcc22daa 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Terminating instance
Nov 29 07:35:33 compute-0 nova_compute[187185]: 2025-11-29 07:35:33.002 187189 DEBUG nova.compute.manager [None req-08cbea0e-ee3a-446c-a32b-eb10dcc22daa 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 07:35:33 compute-0 kernel: tap3a32e9dc-c9 (unregistering): left promiscuous mode
Nov 29 07:35:33 compute-0 NetworkManager[55227]: <info>  [1764401733.0267] device (tap3a32e9dc-c9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:35:33 compute-0 nova_compute[187185]: 2025-11-29 07:35:33.034 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:33 compute-0 ovn_controller[95281]: 2025-11-29T07:35:33Z|00480|binding|INFO|Releasing lport 3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5 from this chassis (sb_readonly=0)
Nov 29 07:35:33 compute-0 ovn_controller[95281]: 2025-11-29T07:35:33Z|00481|binding|INFO|Setting lport 3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5 down in Southbound
Nov 29 07:35:33 compute-0 ovn_controller[95281]: 2025-11-29T07:35:33Z|00482|binding|INFO|Removing iface tap3a32e9dc-c9 ovn-installed in OVS
Nov 29 07:35:33 compute-0 nova_compute[187185]: 2025-11-29 07:35:33.037 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:33 compute-0 nova_compute[187185]: 2025-11-29 07:35:33.047 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:33.049 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:62:77:74 10.100.0.4'], port_security=['fa:16:3e:62:77:74 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '0a1598ac-7fe7-4004-ad07-c9b7428d7822', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7900eada-6f98-452f-b178-ad26f2b82064', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0790a775-3668-4bb8-97f1-d4276df58523, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:35:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:33.053 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5 in datapath ee42f4e1-7038-4a24-8d9b-8ee99ca415d0 unbound from our chassis
Nov 29 07:35:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:33.057 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ee42f4e1-7038-4a24-8d9b-8ee99ca415d0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:35:33 compute-0 kernel: tapcca30a15-3d (unregistering): left promiscuous mode
Nov 29 07:35:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:33.059 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[f45002e2-45e5-4d74-aee1-39b59e1c5bbd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:33.060 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0 namespace which is not needed anymore
Nov 29 07:35:33 compute-0 NetworkManager[55227]: <info>  [1764401733.0634] device (tapcca30a15-3d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:35:33 compute-0 nova_compute[187185]: 2025-11-29 07:35:33.065 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:33 compute-0 nova_compute[187185]: 2025-11-29 07:35:33.072 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:33 compute-0 ovn_controller[95281]: 2025-11-29T07:35:33Z|00483|binding|INFO|Releasing lport cca30a15-3dfe-49f3-8b60-50dc1d039795 from this chassis (sb_readonly=0)
Nov 29 07:35:33 compute-0 ovn_controller[95281]: 2025-11-29T07:35:33Z|00484|binding|INFO|Setting lport cca30a15-3dfe-49f3-8b60-50dc1d039795 down in Southbound
Nov 29 07:35:33 compute-0 ovn_controller[95281]: 2025-11-29T07:35:33Z|00485|binding|INFO|Removing iface tapcca30a15-3d ovn-installed in OVS
Nov 29 07:35:33 compute-0 nova_compute[187185]: 2025-11-29 07:35:33.080 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:33.090 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6a:4b:a1 2001:db8:0:1:f816:3eff:fe6a:4ba1 2001:db8::f816:3eff:fe6a:4ba1'], port_security=['fa:16:3e:6a:4b:a1 2001:db8:0:1:f816:3eff:fe6a:4ba1 2001:db8::f816:3eff:fe6a:4ba1'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe6a:4ba1/64 2001:db8::f816:3eff:fe6a:4ba1/64', 'neutron:device_id': '0a1598ac-7fe7-4004-ad07-c9b7428d7822', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-716ed53e-cc56-4286-b418-2f5e02d33124', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7900eada-6f98-452f-b178-ad26f2b82064', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c069d1db-d7e5-4641-988e-cd6e75103caa, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=cca30a15-3dfe-49f3-8b60-50dc1d039795) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:35:33 compute-0 nova_compute[187185]: 2025-11-29 07:35:33.091 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:33 compute-0 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d00000095.scope: Deactivated successfully.
Nov 29 07:35:33 compute-0 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d00000095.scope: Consumed 13.384s CPU time.
Nov 29 07:35:33 compute-0 systemd-machined[153486]: Machine qemu-58-instance-00000095 terminated.
Nov 29 07:35:33 compute-0 neutron-haproxy-ovnmeta-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0[239970]: [NOTICE]   (239974) : haproxy version is 2.8.14-c23fe91
Nov 29 07:35:33 compute-0 neutron-haproxy-ovnmeta-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0[239970]: [NOTICE]   (239974) : path to executable is /usr/sbin/haproxy
Nov 29 07:35:33 compute-0 neutron-haproxy-ovnmeta-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0[239970]: [WARNING]  (239974) : Exiting Master process...
Nov 29 07:35:33 compute-0 neutron-haproxy-ovnmeta-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0[239970]: [ALERT]    (239974) : Current worker (239976) exited with code 143 (Terminated)
Nov 29 07:35:33 compute-0 neutron-haproxy-ovnmeta-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0[239970]: [WARNING]  (239974) : All workers exited. Exiting... (0)
Nov 29 07:35:33 compute-0 systemd[1]: libpod-4ac20c242a1c358d32fac0b34af1e03750ed6fc415de3adb3019035ae1bb7e10.scope: Deactivated successfully.
Nov 29 07:35:33 compute-0 NetworkManager[55227]: <info>  [1764401733.2272] manager: (tap3a32e9dc-c9): new Tun device (/org/freedesktop/NetworkManager/Devices/249)
Nov 29 07:35:33 compute-0 podman[240203]: 2025-11-29 07:35:33.228228369 +0000 UTC m=+0.053895647 container died 4ac20c242a1c358d32fac0b34af1e03750ed6fc415de3adb3019035ae1bb7e10 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:35:33 compute-0 NetworkManager[55227]: <info>  [1764401733.2390] manager: (tapcca30a15-3d): new Tun device (/org/freedesktop/NetworkManager/Devices/250)
Nov 29 07:35:33 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4ac20c242a1c358d32fac0b34af1e03750ed6fc415de3adb3019035ae1bb7e10-userdata-shm.mount: Deactivated successfully.
Nov 29 07:35:33 compute-0 systemd[1]: var-lib-containers-storage-overlay-d1ed4379c7927cc2537b696b8eb11bf17d1616abc7f5929465b99a3cf6e60f8a-merged.mount: Deactivated successfully.
Nov 29 07:35:33 compute-0 nova_compute[187185]: 2025-11-29 07:35:33.285 187189 INFO nova.virt.libvirt.driver [-] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Instance destroyed successfully.
Nov 29 07:35:33 compute-0 nova_compute[187185]: 2025-11-29 07:35:33.286 187189 DEBUG nova.objects.instance [None req-08cbea0e-ee3a-446c-a32b-eb10dcc22daa 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'resources' on Instance uuid 0a1598ac-7fe7-4004-ad07-c9b7428d7822 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:35:33 compute-0 podman[240203]: 2025-11-29 07:35:33.287093885 +0000 UTC m=+0.112761153 container cleanup 4ac20c242a1c358d32fac0b34af1e03750ed6fc415de3adb3019035ae1bb7e10 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3)
Nov 29 07:35:33 compute-0 systemd[1]: libpod-conmon-4ac20c242a1c358d32fac0b34af1e03750ed6fc415de3adb3019035ae1bb7e10.scope: Deactivated successfully.
Nov 29 07:35:33 compute-0 nova_compute[187185]: 2025-11-29 07:35:33.315 187189 DEBUG nova.virt.libvirt.vif [None req-08cbea0e-ee3a-446c-a32b-eb10dcc22daa 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:33:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1402689375',display_name='tempest-TestGettingAddress-server-1402689375',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1402689375',id=149,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBK18BM+cTj/3sUxBzPfGqMZVFW8u0MAYl6D47npYBAuFCVOWerNtWreCBLJmXVjolkwHlCMhwEYpOxlO5EeqIXZi9GpQKv0jmJXL6Uw9pSzb3x5DyZSKVQsbQBVQ+UXj+w==',key_name='tempest-TestGettingAddress-668700412',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:35:10Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-izv751fp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:35:10Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=0a1598ac-7fe7-4004-ad07-c9b7428d7822,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5", "address": "fa:16:3e:62:77:74", "network": {"id": "ee42f4e1-7038-4a24-8d9b-8ee99ca415d0", "bridge": "br-int", "label": "tempest-network-smoke--1973917221", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a32e9dc-c9", "ovs_interfaceid": "3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:35:33 compute-0 nova_compute[187185]: 2025-11-29 07:35:33.315 187189 DEBUG nova.network.os_vif_util [None req-08cbea0e-ee3a-446c-a32b-eb10dcc22daa 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5", "address": "fa:16:3e:62:77:74", "network": {"id": "ee42f4e1-7038-4a24-8d9b-8ee99ca415d0", "bridge": "br-int", "label": "tempest-network-smoke--1973917221", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a32e9dc-c9", "ovs_interfaceid": "3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:35:33 compute-0 nova_compute[187185]: 2025-11-29 07:35:33.316 187189 DEBUG nova.network.os_vif_util [None req-08cbea0e-ee3a-446c-a32b-eb10dcc22daa 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:62:77:74,bridge_name='br-int',has_traffic_filtering=True,id=3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5,network=Network(ee42f4e1-7038-4a24-8d9b-8ee99ca415d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a32e9dc-c9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:35:33 compute-0 nova_compute[187185]: 2025-11-29 07:35:33.316 187189 DEBUG os_vif [None req-08cbea0e-ee3a-446c-a32b-eb10dcc22daa 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:62:77:74,bridge_name='br-int',has_traffic_filtering=True,id=3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5,network=Network(ee42f4e1-7038-4a24-8d9b-8ee99ca415d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a32e9dc-c9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:35:33 compute-0 nova_compute[187185]: 2025-11-29 07:35:33.318 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:33 compute-0 nova_compute[187185]: 2025-11-29 07:35:33.318 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3a32e9dc-c9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:35:33 compute-0 nova_compute[187185]: 2025-11-29 07:35:33.323 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:35:33 compute-0 nova_compute[187185]: 2025-11-29 07:35:33.326 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:33 compute-0 nova_compute[187185]: 2025-11-29 07:35:33.330 187189 INFO os_vif [None req-08cbea0e-ee3a-446c-a32b-eb10dcc22daa 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:62:77:74,bridge_name='br-int',has_traffic_filtering=True,id=3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5,network=Network(ee42f4e1-7038-4a24-8d9b-8ee99ca415d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a32e9dc-c9')
Nov 29 07:35:33 compute-0 nova_compute[187185]: 2025-11-29 07:35:33.331 187189 DEBUG nova.virt.libvirt.vif [None req-08cbea0e-ee3a-446c-a32b-eb10dcc22daa 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:33:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1402689375',display_name='tempest-TestGettingAddress-server-1402689375',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1402689375',id=149,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBK18BM+cTj/3sUxBzPfGqMZVFW8u0MAYl6D47npYBAuFCVOWerNtWreCBLJmXVjolkwHlCMhwEYpOxlO5EeqIXZi9GpQKv0jmJXL6Uw9pSzb3x5DyZSKVQsbQBVQ+UXj+w==',key_name='tempest-TestGettingAddress-668700412',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:35:10Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-izv751fp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:35:10Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=0a1598ac-7fe7-4004-ad07-c9b7428d7822,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cca30a15-3dfe-49f3-8b60-50dc1d039795", "address": "fa:16:3e:6a:4b:a1", "network": {"id": "716ed53e-cc56-4286-b418-2f5e02d33124", "bridge": "br-int", "label": "tempest-network-smoke--990040375", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe6a:4ba1", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe6a:4ba1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcca30a15-3d", "ovs_interfaceid": "cca30a15-3dfe-49f3-8b60-50dc1d039795", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:35:33 compute-0 nova_compute[187185]: 2025-11-29 07:35:33.331 187189 DEBUG nova.network.os_vif_util [None req-08cbea0e-ee3a-446c-a32b-eb10dcc22daa 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "cca30a15-3dfe-49f3-8b60-50dc1d039795", "address": "fa:16:3e:6a:4b:a1", "network": {"id": "716ed53e-cc56-4286-b418-2f5e02d33124", "bridge": "br-int", "label": "tempest-network-smoke--990040375", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe6a:4ba1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe6a:4ba1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcca30a15-3d", "ovs_interfaceid": "cca30a15-3dfe-49f3-8b60-50dc1d039795", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:35:33 compute-0 nova_compute[187185]: 2025-11-29 07:35:33.332 187189 DEBUG nova.network.os_vif_util [None req-08cbea0e-ee3a-446c-a32b-eb10dcc22daa 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6a:4b:a1,bridge_name='br-int',has_traffic_filtering=True,id=cca30a15-3dfe-49f3-8b60-50dc1d039795,network=Network(716ed53e-cc56-4286-b418-2f5e02d33124),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcca30a15-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:35:33 compute-0 nova_compute[187185]: 2025-11-29 07:35:33.333 187189 DEBUG os_vif [None req-08cbea0e-ee3a-446c-a32b-eb10dcc22daa 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6a:4b:a1,bridge_name='br-int',has_traffic_filtering=True,id=cca30a15-3dfe-49f3-8b60-50dc1d039795,network=Network(716ed53e-cc56-4286-b418-2f5e02d33124),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcca30a15-3d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:35:33 compute-0 nova_compute[187185]: 2025-11-29 07:35:33.334 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:33 compute-0 nova_compute[187185]: 2025-11-29 07:35:33.334 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcca30a15-3d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:35:33 compute-0 nova_compute[187185]: 2025-11-29 07:35:33.336 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:33 compute-0 nova_compute[187185]: 2025-11-29 07:35:33.337 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:33 compute-0 nova_compute[187185]: 2025-11-29 07:35:33.339 187189 INFO os_vif [None req-08cbea0e-ee3a-446c-a32b-eb10dcc22daa 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6a:4b:a1,bridge_name='br-int',has_traffic_filtering=True,id=cca30a15-3dfe-49f3-8b60-50dc1d039795,network=Network(716ed53e-cc56-4286-b418-2f5e02d33124),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcca30a15-3d')
Nov 29 07:35:33 compute-0 nova_compute[187185]: 2025-11-29 07:35:33.340 187189 INFO nova.virt.libvirt.driver [None req-08cbea0e-ee3a-446c-a32b-eb10dcc22daa 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Deleting instance files /var/lib/nova/instances/0a1598ac-7fe7-4004-ad07-c9b7428d7822_del
Nov 29 07:35:33 compute-0 nova_compute[187185]: 2025-11-29 07:35:33.341 187189 INFO nova.virt.libvirt.driver [None req-08cbea0e-ee3a-446c-a32b-eb10dcc22daa 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Deletion of /var/lib/nova/instances/0a1598ac-7fe7-4004-ad07-c9b7428d7822_del complete
Nov 29 07:35:33 compute-0 podman[240261]: 2025-11-29 07:35:33.384462262 +0000 UTC m=+0.069929241 container remove 4ac20c242a1c358d32fac0b34af1e03750ed6fc415de3adb3019035ae1bb7e10 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:35:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:33.390 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[5876f234-7194-47ac-bbcc-aa5c6aa8b757]: (4, ('Sat Nov 29 07:35:33 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0 (4ac20c242a1c358d32fac0b34af1e03750ed6fc415de3adb3019035ae1bb7e10)\n4ac20c242a1c358d32fac0b34af1e03750ed6fc415de3adb3019035ae1bb7e10\nSat Nov 29 07:35:33 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0 (4ac20c242a1c358d32fac0b34af1e03750ed6fc415de3adb3019035ae1bb7e10)\n4ac20c242a1c358d32fac0b34af1e03750ed6fc415de3adb3019035ae1bb7e10\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:33.391 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6edb4ca9-0d95-4f23-b6ff-b4dfd384132b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:33.392 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee42f4e1-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:35:33 compute-0 kernel: tapee42f4e1-70: left promiscuous mode
Nov 29 07:35:33 compute-0 nova_compute[187185]: 2025-11-29 07:35:33.395 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:33 compute-0 nova_compute[187185]: 2025-11-29 07:35:33.405 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:33.408 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[3fe70a2f-a41e-48f7-aa94-82eb42754f79]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:33.430 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[635b45b7-9370-46e1-9786-dfc2a0133dc7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:33.432 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[5351161c-bbda-42fa-bbc2-304cc935b82c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:33 compute-0 nova_compute[187185]: 2025-11-29 07:35:33.442 187189 INFO nova.compute.manager [None req-08cbea0e-ee3a-446c-a32b-eb10dcc22daa 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Took 0.44 seconds to destroy the instance on the hypervisor.
Nov 29 07:35:33 compute-0 nova_compute[187185]: 2025-11-29 07:35:33.443 187189 DEBUG oslo.service.loopingcall [None req-08cbea0e-ee3a-446c-a32b-eb10dcc22daa 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 07:35:33 compute-0 nova_compute[187185]: 2025-11-29 07:35:33.444 187189 DEBUG nova.compute.manager [-] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 07:35:33 compute-0 nova_compute[187185]: 2025-11-29 07:35:33.444 187189 DEBUG nova.network.neutron [-] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 07:35:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:33.448 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[7a1ec3b4-f49b-4375-bffe-0fc13ce813f2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 715579, 'reachable_time': 36515, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240275, 'error': None, 'target': 'ovnmeta-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:33 compute-0 systemd[1]: run-netns-ovnmeta\x2dee42f4e1\x2d7038\x2d4a24\x2d8d9b\x2d8ee99ca415d0.mount: Deactivated successfully.
Nov 29 07:35:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:33.455 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:35:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:33.456 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[2c13a9c5-95d9-4efa-9325-248307cc9c1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:33.457 104254 INFO neutron.agent.ovn.metadata.agent [-] Port cca30a15-3dfe-49f3-8b60-50dc1d039795 in datapath 716ed53e-cc56-4286-b418-2f5e02d33124 unbound from our chassis
Nov 29 07:35:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:33.458 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 716ed53e-cc56-4286-b418-2f5e02d33124, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:35:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:33.460 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[dc38160b-23a3-485a-92dc-2a1db9003e64]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:33.460 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124 namespace which is not needed anymore
Nov 29 07:35:33 compute-0 nova_compute[187185]: 2025-11-29 07:35:33.472 187189 DEBUG nova.compute.manager [req-6527f3be-0b76-4fb1-a0bd-f8402243d213 req-fb130c62-65e8-45c7-badb-298f4641456e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Received event network-vif-unplugged-cca30a15-3dfe-49f3-8b60-50dc1d039795 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:35:33 compute-0 nova_compute[187185]: 2025-11-29 07:35:33.472 187189 DEBUG oslo_concurrency.lockutils [req-6527f3be-0b76-4fb1-a0bd-f8402243d213 req-fb130c62-65e8-45c7-badb-298f4641456e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "0a1598ac-7fe7-4004-ad07-c9b7428d7822-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:35:33 compute-0 nova_compute[187185]: 2025-11-29 07:35:33.473 187189 DEBUG oslo_concurrency.lockutils [req-6527f3be-0b76-4fb1-a0bd-f8402243d213 req-fb130c62-65e8-45c7-badb-298f4641456e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0a1598ac-7fe7-4004-ad07-c9b7428d7822-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:35:33 compute-0 nova_compute[187185]: 2025-11-29 07:35:33.473 187189 DEBUG oslo_concurrency.lockutils [req-6527f3be-0b76-4fb1-a0bd-f8402243d213 req-fb130c62-65e8-45c7-badb-298f4641456e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0a1598ac-7fe7-4004-ad07-c9b7428d7822-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:35:33 compute-0 nova_compute[187185]: 2025-11-29 07:35:33.473 187189 DEBUG nova.compute.manager [req-6527f3be-0b76-4fb1-a0bd-f8402243d213 req-fb130c62-65e8-45c7-badb-298f4641456e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] No waiting events found dispatching network-vif-unplugged-cca30a15-3dfe-49f3-8b60-50dc1d039795 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:35:33 compute-0 nova_compute[187185]: 2025-11-29 07:35:33.474 187189 DEBUG nova.compute.manager [req-6527f3be-0b76-4fb1-a0bd-f8402243d213 req-fb130c62-65e8-45c7-badb-298f4641456e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Received event network-vif-unplugged-cca30a15-3dfe-49f3-8b60-50dc1d039795 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 07:35:33 compute-0 neutron-haproxy-ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124[240044]: [NOTICE]   (240048) : haproxy version is 2.8.14-c23fe91
Nov 29 07:35:33 compute-0 neutron-haproxy-ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124[240044]: [NOTICE]   (240048) : path to executable is /usr/sbin/haproxy
Nov 29 07:35:33 compute-0 neutron-haproxy-ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124[240044]: [WARNING]  (240048) : Exiting Master process...
Nov 29 07:35:33 compute-0 neutron-haproxy-ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124[240044]: [WARNING]  (240048) : Exiting Master process...
Nov 29 07:35:33 compute-0 neutron-haproxy-ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124[240044]: [ALERT]    (240048) : Current worker (240050) exited with code 143 (Terminated)
Nov 29 07:35:33 compute-0 neutron-haproxy-ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124[240044]: [WARNING]  (240048) : All workers exited. Exiting... (0)
Nov 29 07:35:33 compute-0 systemd[1]: libpod-e37448a287192cb2344f84c6c8d53a306d1bbe4e7474ac0cdbd933e55b45a3eb.scope: Deactivated successfully.
Nov 29 07:35:33 compute-0 podman[240293]: 2025-11-29 07:35:33.633710129 +0000 UTC m=+0.063577721 container died e37448a287192cb2344f84c6c8d53a306d1bbe4e7474ac0cdbd933e55b45a3eb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 07:35:33 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e37448a287192cb2344f84c6c8d53a306d1bbe4e7474ac0cdbd933e55b45a3eb-userdata-shm.mount: Deactivated successfully.
Nov 29 07:35:33 compute-0 systemd[1]: var-lib-containers-storage-overlay-804a92621ba746f77d052e4232d066b14ef35779de78486fcda7eb4cfb823a6f-merged.mount: Deactivated successfully.
Nov 29 07:35:33 compute-0 podman[240293]: 2025-11-29 07:35:33.681240305 +0000 UTC m=+0.111107887 container cleanup e37448a287192cb2344f84c6c8d53a306d1bbe4e7474ac0cdbd933e55b45a3eb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:35:33 compute-0 systemd[1]: libpod-conmon-e37448a287192cb2344f84c6c8d53a306d1bbe4e7474ac0cdbd933e55b45a3eb.scope: Deactivated successfully.
Nov 29 07:35:33 compute-0 podman[240323]: 2025-11-29 07:35:33.766829748 +0000 UTC m=+0.057314023 container remove e37448a287192cb2344f84c6c8d53a306d1bbe4e7474ac0cdbd933e55b45a3eb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:35:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:33.772 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[756b6c1c-947d-4e51-a0a9-56c9a167e9ee]: (4, ('Sat Nov 29 07:35:33 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124 (e37448a287192cb2344f84c6c8d53a306d1bbe4e7474ac0cdbd933e55b45a3eb)\ne37448a287192cb2344f84c6c8d53a306d1bbe4e7474ac0cdbd933e55b45a3eb\nSat Nov 29 07:35:33 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124 (e37448a287192cb2344f84c6c8d53a306d1bbe4e7474ac0cdbd933e55b45a3eb)\ne37448a287192cb2344f84c6c8d53a306d1bbe4e7474ac0cdbd933e55b45a3eb\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:33.775 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[fa773401-8e02-4b94-8a9a-555c56331e49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:33.776 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap716ed53e-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:35:33 compute-0 nova_compute[187185]: 2025-11-29 07:35:33.779 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:33 compute-0 kernel: tap716ed53e-c0: left promiscuous mode
Nov 29 07:35:33 compute-0 nova_compute[187185]: 2025-11-29 07:35:33.782 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:33.792 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[19455151-0d29-4603-901b-0ff3f5020b2e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:33 compute-0 nova_compute[187185]: 2025-11-29 07:35:33.808 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:33.814 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[b4bec997-aa9a-41db-953d-57617f1bdc98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:33.815 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[3ca0fe78-6188-4684-b3cf-7527318f683a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:33.838 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[84050be8-7bef-44f4-a795-c78b97b5445d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 715797, 'reachable_time': 37960, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240338, 'error': None, 'target': 'ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:33.842 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:35:33 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:35:33.842 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[854d65fc-22d0-4dc4-aced-68b3a81bea69]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:35:34 compute-0 systemd[1]: run-netns-ovnmeta\x2d716ed53e\x2dcc56\x2d4286\x2db418\x2d2f5e02d33124.mount: Deactivated successfully.
Nov 29 07:35:34 compute-0 nova_compute[187185]: 2025-11-29 07:35:34.612 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:34 compute-0 nova_compute[187185]: 2025-11-29 07:35:34.645 187189 DEBUG nova.network.neutron [req-ec656717-08c2-4674-8e3b-565fb2082953 req-fc8a240c-c917-49fa-b66b-2378ec2f9124 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Updated VIF entry in instance network info cache for port 3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:35:34 compute-0 nova_compute[187185]: 2025-11-29 07:35:34.645 187189 DEBUG nova.network.neutron [req-ec656717-08c2-4674-8e3b-565fb2082953 req-fc8a240c-c917-49fa-b66b-2378ec2f9124 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Updating instance_info_cache with network_info: [{"id": "3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5", "address": "fa:16:3e:62:77:74", "network": {"id": "ee42f4e1-7038-4a24-8d9b-8ee99ca415d0", "bridge": "br-int", "label": "tempest-network-smoke--1973917221", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a32e9dc-c9", "ovs_interfaceid": "3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "cca30a15-3dfe-49f3-8b60-50dc1d039795", "address": "fa:16:3e:6a:4b:a1", "network": {"id": "716ed53e-cc56-4286-b418-2f5e02d33124", "bridge": "br-int", "label": "tempest-network-smoke--990040375", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe6a:4ba1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], 
"gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe6a:4ba1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcca30a15-3d", "ovs_interfaceid": "cca30a15-3dfe-49f3-8b60-50dc1d039795", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:35:34 compute-0 nova_compute[187185]: 2025-11-29 07:35:34.886 187189 DEBUG oslo_concurrency.lockutils [req-ec656717-08c2-4674-8e3b-565fb2082953 req-fc8a240c-c917-49fa-b66b-2378ec2f9124 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-0a1598ac-7fe7-4004-ad07-c9b7428d7822" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:35:34 compute-0 nova_compute[187185]: 2025-11-29 07:35:34.887 187189 DEBUG oslo_concurrency.lockutils [req-63fc076f-dfe4-464f-a153-8f172300f585 req-affffac4-a8b5-49d7-8113-d6ee67906754 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-0a1598ac-7fe7-4004-ad07-c9b7428d7822" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:35:34 compute-0 nova_compute[187185]: 2025-11-29 07:35:34.888 187189 DEBUG nova.network.neutron [req-63fc076f-dfe4-464f-a153-8f172300f585 req-affffac4-a8b5-49d7-8113-d6ee67906754 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Refreshing network info cache for port 3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:35:35 compute-0 nova_compute[187185]: 2025-11-29 07:35:35.283 187189 DEBUG nova.compute.manager [req-1ee2ce42-e55e-4405-a663-8827537c612c req-93288b21-a32a-4dad-9a1a-b0f8fe736fbd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Received event network-vif-deleted-3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:35:35 compute-0 nova_compute[187185]: 2025-11-29 07:35:35.283 187189 INFO nova.compute.manager [req-1ee2ce42-e55e-4405-a663-8827537c612c req-93288b21-a32a-4dad-9a1a-b0f8fe736fbd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Neutron deleted interface 3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5; detaching it from the instance and deleting it from the info cache
Nov 29 07:35:35 compute-0 nova_compute[187185]: 2025-11-29 07:35:35.283 187189 DEBUG nova.network.neutron [req-1ee2ce42-e55e-4405-a663-8827537c612c req-93288b21-a32a-4dad-9a1a-b0f8fe736fbd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Updating instance_info_cache with network_info: [{"id": "cca30a15-3dfe-49f3-8b60-50dc1d039795", "address": "fa:16:3e:6a:4b:a1", "network": {"id": "716ed53e-cc56-4286-b418-2f5e02d33124", "bridge": "br-int", "label": "tempest-network-smoke--990040375", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe6a:4ba1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe6a:4ba1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcca30a15-3d", "ovs_interfaceid": "cca30a15-3dfe-49f3-8b60-50dc1d039795", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:35:35 compute-0 nova_compute[187185]: 2025-11-29 07:35:35.322 187189 DEBUG nova.compute.manager [req-1ee2ce42-e55e-4405-a663-8827537c612c req-93288b21-a32a-4dad-9a1a-b0f8fe736fbd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Detach interface failed, port_id=3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5, reason: Instance 0a1598ac-7fe7-4004-ad07-c9b7428d7822 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Nov 29 07:35:35 compute-0 nova_compute[187185]: 2025-11-29 07:35:35.606 187189 DEBUG nova.compute.manager [req-7df0426c-ffba-445d-8c3e-35f86d073f1b req-840e2dfc-6284-4c31-a155-ff480160e864 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Received event network-vif-plugged-cca30a15-3dfe-49f3-8b60-50dc1d039795 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:35:35 compute-0 nova_compute[187185]: 2025-11-29 07:35:35.607 187189 DEBUG oslo_concurrency.lockutils [req-7df0426c-ffba-445d-8c3e-35f86d073f1b req-840e2dfc-6284-4c31-a155-ff480160e864 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "0a1598ac-7fe7-4004-ad07-c9b7428d7822-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:35:35 compute-0 nova_compute[187185]: 2025-11-29 07:35:35.607 187189 DEBUG oslo_concurrency.lockutils [req-7df0426c-ffba-445d-8c3e-35f86d073f1b req-840e2dfc-6284-4c31-a155-ff480160e864 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0a1598ac-7fe7-4004-ad07-c9b7428d7822-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:35:35 compute-0 nova_compute[187185]: 2025-11-29 07:35:35.607 187189 DEBUG oslo_concurrency.lockutils [req-7df0426c-ffba-445d-8c3e-35f86d073f1b req-840e2dfc-6284-4c31-a155-ff480160e864 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0a1598ac-7fe7-4004-ad07-c9b7428d7822-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:35:35 compute-0 nova_compute[187185]: 2025-11-29 07:35:35.607 187189 DEBUG nova.compute.manager [req-7df0426c-ffba-445d-8c3e-35f86d073f1b req-840e2dfc-6284-4c31-a155-ff480160e864 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] No waiting events found dispatching network-vif-plugged-cca30a15-3dfe-49f3-8b60-50dc1d039795 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:35:35 compute-0 nova_compute[187185]: 2025-11-29 07:35:35.607 187189 WARNING nova.compute.manager [req-7df0426c-ffba-445d-8c3e-35f86d073f1b req-840e2dfc-6284-4c31-a155-ff480160e864 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Received unexpected event network-vif-plugged-cca30a15-3dfe-49f3-8b60-50dc1d039795 for instance with vm_state active and task_state deleting.
Nov 29 07:35:35 compute-0 nova_compute[187185]: 2025-11-29 07:35:35.653 187189 INFO nova.network.neutron [req-63fc076f-dfe4-464f-a153-8f172300f585 req-affffac4-a8b5-49d7-8113-d6ee67906754 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Port 3a32e9dc-c955-41ca-97f3-2e1f3aaeeba5 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Nov 29 07:35:35 compute-0 nova_compute[187185]: 2025-11-29 07:35:35.653 187189 DEBUG nova.network.neutron [req-63fc076f-dfe4-464f-a153-8f172300f585 req-affffac4-a8b5-49d7-8113-d6ee67906754 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Updating instance_info_cache with network_info: [{"id": "cca30a15-3dfe-49f3-8b60-50dc1d039795", "address": "fa:16:3e:6a:4b:a1", "network": {"id": "716ed53e-cc56-4286-b418-2f5e02d33124", "bridge": "br-int", "label": "tempest-network-smoke--990040375", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe6a:4ba1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe6a:4ba1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcca30a15-3d", "ovs_interfaceid": "cca30a15-3dfe-49f3-8b60-50dc1d039795", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:35:35 compute-0 nova_compute[187185]: 2025-11-29 07:35:35.722 187189 DEBUG oslo_concurrency.lockutils [req-63fc076f-dfe4-464f-a153-8f172300f585 req-affffac4-a8b5-49d7-8113-d6ee67906754 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-0a1598ac-7fe7-4004-ad07-c9b7428d7822" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:35:36 compute-0 nova_compute[187185]: 2025-11-29 07:35:36.325 187189 DEBUG nova.network.neutron [-] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:35:36 compute-0 nova_compute[187185]: 2025-11-29 07:35:36.850 187189 INFO nova.compute.manager [-] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Took 3.41 seconds to deallocate network for instance.
Nov 29 07:35:36 compute-0 nova_compute[187185]: 2025-11-29 07:35:36.978 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:37 compute-0 nova_compute[187185]: 2025-11-29 07:35:37.362 187189 DEBUG oslo_concurrency.lockutils [None req-08cbea0e-ee3a-446c-a32b-eb10dcc22daa 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:35:37 compute-0 nova_compute[187185]: 2025-11-29 07:35:37.363 187189 DEBUG oslo_concurrency.lockutils [None req-08cbea0e-ee3a-446c-a32b-eb10dcc22daa 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:35:37 compute-0 nova_compute[187185]: 2025-11-29 07:35:37.556 187189 DEBUG nova.compute.provider_tree [None req-08cbea0e-ee3a-446c-a32b-eb10dcc22daa 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:35:37 compute-0 nova_compute[187185]: 2025-11-29 07:35:37.572 187189 DEBUG nova.scheduler.client.report [None req-08cbea0e-ee3a-446c-a32b-eb10dcc22daa 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:35:37 compute-0 nova_compute[187185]: 2025-11-29 07:35:37.611 187189 DEBUG oslo_concurrency.lockutils [None req-08cbea0e-ee3a-446c-a32b-eb10dcc22daa 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.248s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:35:37 compute-0 nova_compute[187185]: 2025-11-29 07:35:37.649 187189 DEBUG nova.compute.manager [req-2bbed9c3-f216-496b-82f9-131b06cf0df3 req-dcccc269-bdef-4e00-a550-a50ed61efd43 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Received event network-vif-deleted-cca30a15-3dfe-49f3-8b60-50dc1d039795 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:35:37 compute-0 nova_compute[187185]: 2025-11-29 07:35:37.665 187189 INFO nova.scheduler.client.report [None req-08cbea0e-ee3a-446c-a32b-eb10dcc22daa 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Deleted allocations for instance 0a1598ac-7fe7-4004-ad07-c9b7428d7822
Nov 29 07:35:37 compute-0 nova_compute[187185]: 2025-11-29 07:35:37.803 187189 DEBUG oslo_concurrency.lockutils [None req-08cbea0e-ee3a-446c-a32b-eb10dcc22daa 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "0a1598ac-7fe7-4004-ad07-c9b7428d7822" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.837s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:35:37 compute-0 podman[240341]: 2025-11-29 07:35:37.807653627 +0000 UTC m=+0.061114131 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 07:35:37 compute-0 podman[240340]: 2025-11-29 07:35:37.808422079 +0000 UTC m=+0.066533465 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, config_id=edpm, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., version=9.6, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64)
Nov 29 07:35:37 compute-0 podman[240339]: 2025-11-29 07:35:37.826075698 +0000 UTC m=+0.088085914 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 07:35:38 compute-0 nova_compute[187185]: 2025-11-29 07:35:38.338 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:40 compute-0 nova_compute[187185]: 2025-11-29 07:35:40.598 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:35:41 compute-0 nova_compute[187185]: 2025-11-29 07:35:41.981 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:43 compute-0 nova_compute[187185]: 2025-11-29 07:35:43.340 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:46 compute-0 podman[240403]: 2025-11-29 07:35:46.873042227 +0000 UTC m=+0.126307657 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Nov 29 07:35:46 compute-0 nova_compute[187185]: 2025-11-29 07:35:46.984 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:48 compute-0 nova_compute[187185]: 2025-11-29 07:35:48.285 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764401733.2837682, 0a1598ac-7fe7-4004-ad07-c9b7428d7822 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:35:48 compute-0 nova_compute[187185]: 2025-11-29 07:35:48.286 187189 INFO nova.compute.manager [-] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] VM Stopped (Lifecycle Event)
Nov 29 07:35:48 compute-0 nova_compute[187185]: 2025-11-29 07:35:48.342 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:51 compute-0 sshd-session[240431]: Invalid user under from 190.181.27.27 port 34578
Nov 29 07:35:51 compute-0 sshd-session[240431]: Received disconnect from 190.181.27.27 port 34578:11: Bye Bye [preauth]
Nov 29 07:35:51 compute-0 sshd-session[240431]: Disconnected from invalid user under 190.181.27.27 port 34578 [preauth]
Nov 29 07:35:51 compute-0 nova_compute[187185]: 2025-11-29 07:35:51.986 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:52 compute-0 nova_compute[187185]: 2025-11-29 07:35:52.826 187189 DEBUG nova.compute.manager [None req-d8b8461e-c1d4-4326-989e-cd3010ba768f - - - - - -] [instance: 0a1598ac-7fe7-4004-ad07-c9b7428d7822] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:35:53 compute-0 nova_compute[187185]: 2025-11-29 07:35:53.344 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:54 compute-0 nova_compute[187185]: 2025-11-29 07:35:54.071 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:54 compute-0 nova_compute[187185]: 2025-11-29 07:35:54.349 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:54 compute-0 podman[240434]: 2025-11-29 07:35:54.824921762 +0000 UTC m=+0.084349429 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 07:35:56 compute-0 nova_compute[187185]: 2025-11-29 07:35:56.987 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:58 compute-0 nova_compute[187185]: 2025-11-29 07:35:58.345 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:35:58 compute-0 podman[240459]: 2025-11-29 07:35:58.461572187 +0000 UTC m=+0.086765448 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 29 07:35:58 compute-0 podman[240460]: 2025-11-29 07:35:58.46769601 +0000 UTC m=+0.085070660 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute)
Nov 29 07:36:01 compute-0 nova_compute[187185]: 2025-11-29 07:36:01.988 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:36:03 compute-0 nova_compute[187185]: 2025-11-29 07:36:03.348 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:36:06 compute-0 nova_compute[187185]: 2025-11-29 07:36:06.990 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:36:08 compute-0 nova_compute[187185]: 2025-11-29 07:36:08.350 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:36:08 compute-0 podman[240500]: 2025-11-29 07:36:08.807314759 +0000 UTC m=+0.070408655 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, vcs-type=git, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, version=9.6, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Nov 29 07:36:08 compute-0 podman[240499]: 2025-11-29 07:36:08.826506592 +0000 UTC m=+0.083256348 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Nov 29 07:36:08 compute-0 podman[240501]: 2025-11-29 07:36:08.833229943 +0000 UTC m=+0.077681161 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 07:36:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:36:09.235 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=54, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=53) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:36:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:36:09.236 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:36:09 compute-0 nova_compute[187185]: 2025-11-29 07:36:09.280 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:36:11 compute-0 nova_compute[187185]: 2025-11-29 07:36:11.991 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:36:13 compute-0 nova_compute[187185]: 2025-11-29 07:36:13.352 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:36:16 compute-0 nova_compute[187185]: 2025-11-29 07:36:16.993 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:36:17 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:36:17.238 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '54'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:36:17 compute-0 podman[240563]: 2025-11-29 07:36:17.83806068 +0000 UTC m=+0.106326491 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 29 07:36:18 compute-0 nova_compute[187185]: 2025-11-29 07:36:18.354 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:36:20 compute-0 sshd-session[240590]: Invalid user system from 20.255.62.58 port 40176
Nov 29 07:36:20 compute-0 sshd-session[240590]: Received disconnect from 20.255.62.58 port 40176:11: Bye Bye [preauth]
Nov 29 07:36:20 compute-0 sshd-session[240590]: Disconnected from invalid user system 20.255.62.58 port 40176 [preauth]
Nov 29 07:36:21 compute-0 nova_compute[187185]: 2025-11-29 07:36:21.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:36:21 compute-0 nova_compute[187185]: 2025-11-29 07:36:21.315 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:36:21 compute-0 nova_compute[187185]: 2025-11-29 07:36:21.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:36:21 compute-0 nova_compute[187185]: 2025-11-29 07:36:21.537 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 07:36:21 compute-0 nova_compute[187185]: 2025-11-29 07:36:21.537 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:36:21 compute-0 nova_compute[187185]: 2025-11-29 07:36:21.995 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:36:22 compute-0 nova_compute[187185]: 2025-11-29 07:36:22.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:36:23 compute-0 nova_compute[187185]: 2025-11-29 07:36:23.356 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:36:25 compute-0 nova_compute[187185]: 2025-11-29 07:36:25.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:36:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:36:25.526 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:36:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:36:25.526 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:36:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:36:25.526 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:36:25 compute-0 podman[240592]: 2025-11-29 07:36:25.826895871 +0000 UTC m=+0.057490939 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 07:36:25 compute-0 nova_compute[187185]: 2025-11-29 07:36:25.918 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:36:25 compute-0 nova_compute[187185]: 2025-11-29 07:36:25.919 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:36:25 compute-0 nova_compute[187185]: 2025-11-29 07:36:25.919 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:36:25 compute-0 nova_compute[187185]: 2025-11-29 07:36:25.919 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:36:26 compute-0 nova_compute[187185]: 2025-11-29 07:36:26.135 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:36:26 compute-0 nova_compute[187185]: 2025-11-29 07:36:26.136 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5741MB free_disk=73.25408935546875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:36:26 compute-0 nova_compute[187185]: 2025-11-29 07:36:26.136 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:36:26 compute-0 nova_compute[187185]: 2025-11-29 07:36:26.136 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:36:26 compute-0 sshd-session[240562]: error: kex_exchange_identification: read: Connection timed out
Nov 29 07:36:26 compute-0 sshd-session[240562]: banner exchange: Connection from 120.48.39.73 port 46558: Connection timed out
Nov 29 07:36:26 compute-0 nova_compute[187185]: 2025-11-29 07:36:26.998 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:36:27 compute-0 nova_compute[187185]: 2025-11-29 07:36:27.022 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:36:27 compute-0 nova_compute[187185]: 2025-11-29 07:36:27.023 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:36:27 compute-0 nova_compute[187185]: 2025-11-29 07:36:27.098 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:36:27 compute-0 nova_compute[187185]: 2025-11-29 07:36:27.116 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:36:27 compute-0 nova_compute[187185]: 2025-11-29 07:36:27.222 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:36:27 compute-0 nova_compute[187185]: 2025-11-29 07:36:27.222 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.086s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:36:28 compute-0 nova_compute[187185]: 2025-11-29 07:36:28.223 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:36:28 compute-0 nova_compute[187185]: 2025-11-29 07:36:28.223 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:36:28 compute-0 nova_compute[187185]: 2025-11-29 07:36:28.223 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:36:28 compute-0 nova_compute[187185]: 2025-11-29 07:36:28.224 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:36:28 compute-0 nova_compute[187185]: 2025-11-29 07:36:28.224 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:36:28 compute-0 nova_compute[187185]: 2025-11-29 07:36:28.359 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:36:28 compute-0 podman[240618]: 2025-11-29 07:36:28.83537487 +0000 UTC m=+0.083024632 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 29 07:36:28 compute-0 podman[240619]: 2025-11-29 07:36:28.862143658 +0000 UTC m=+0.107642729 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:36:29 compute-0 nova_compute[187185]: 2025-11-29 07:36:29.312 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:36:32 compute-0 nova_compute[187185]: 2025-11-29 07:36:32.000 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:36:33 compute-0 nova_compute[187185]: 2025-11-29 07:36:33.361 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:36:37 compute-0 nova_compute[187185]: 2025-11-29 07:36:37.002 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:36:38 compute-0 nova_compute[187185]: 2025-11-29 07:36:38.363 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:36:39 compute-0 podman[240659]: 2025-11-29 07:36:39.819415365 +0000 UTC m=+0.070103096 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:36:39 compute-0 podman[240660]: 2025-11-29 07:36:39.842983882 +0000 UTC m=+0.086091979 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., config_id=edpm, release=1755695350, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter)
Nov 29 07:36:39 compute-0 podman[240661]: 2025-11-29 07:36:39.863352049 +0000 UTC m=+0.105463857 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 07:36:42 compute-0 nova_compute[187185]: 2025-11-29 07:36:42.004 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:36:43 compute-0 sshd-session[240617]: error: kex_exchange_identification: read: Connection timed out
Nov 29 07:36:43 compute-0 sshd-session[240617]: banner exchange: Connection from 106.13.174.45 port 33690: Connection timed out
Nov 29 07:36:43 compute-0 nova_compute[187185]: 2025-11-29 07:36:43.366 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:36:47 compute-0 nova_compute[187185]: 2025-11-29 07:36:47.007 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:36:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:36:48.008 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:36:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:36:48.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:36:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:36:48.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:36:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:36:48.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:36:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:36:48.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:36:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:36:48.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:36:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:36:48.010 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:36:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:36:48.010 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:36:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:36:48.010 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:36:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:36:48.010 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:36:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:36:48.010 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:36:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:36:48.010 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:36:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:36:48.010 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:36:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:36:48.010 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:36:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:36:48.010 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:36:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:36:48.010 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:36:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:36:48.011 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:36:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:36:48.011 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:36:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:36:48.011 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:36:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:36:48.011 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:36:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:36:48.011 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:36:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:36:48.011 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:36:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:36:48.011 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:36:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:36:48.011 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:36:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:36:48.011 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:36:48 compute-0 nova_compute[187185]: 2025-11-29 07:36:48.367 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:36:49 compute-0 podman[240722]: 2025-11-29 07:36:49.004424052 +0000 UTC m=+0.256964096 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 07:36:49 compute-0 nova_compute[187185]: 2025-11-29 07:36:49.345 187189 DEBUG oslo_concurrency.lockutils [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "093d7b65-31ad-44e4-a172-5a83bc186750" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:36:49 compute-0 nova_compute[187185]: 2025-11-29 07:36:49.347 187189 DEBUG oslo_concurrency.lockutils [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "093d7b65-31ad-44e4-a172-5a83bc186750" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:36:49 compute-0 nova_compute[187185]: 2025-11-29 07:36:49.511 187189 DEBUG nova.compute.manager [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 07:36:49 compute-0 nova_compute[187185]: 2025-11-29 07:36:49.624 187189 DEBUG oslo_concurrency.lockutils [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:36:49 compute-0 nova_compute[187185]: 2025-11-29 07:36:49.625 187189 DEBUG oslo_concurrency.lockutils [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:36:49 compute-0 nova_compute[187185]: 2025-11-29 07:36:49.632 187189 DEBUG nova.virt.hardware [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 07:36:49 compute-0 nova_compute[187185]: 2025-11-29 07:36:49.632 187189 INFO nova.compute.claims [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Claim successful on node compute-0.ctlplane.example.com
Nov 29 07:36:49 compute-0 nova_compute[187185]: 2025-11-29 07:36:49.764 187189 DEBUG nova.compute.provider_tree [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:36:49 compute-0 nova_compute[187185]: 2025-11-29 07:36:49.782 187189 DEBUG nova.scheduler.client.report [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:36:49 compute-0 nova_compute[187185]: 2025-11-29 07:36:49.799 187189 DEBUG oslo_concurrency.lockutils [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.175s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:36:49 compute-0 nova_compute[187185]: 2025-11-29 07:36:49.800 187189 DEBUG nova.compute.manager [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 07:36:49 compute-0 nova_compute[187185]: 2025-11-29 07:36:49.867 187189 DEBUG nova.compute.manager [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 07:36:49 compute-0 nova_compute[187185]: 2025-11-29 07:36:49.868 187189 DEBUG nova.network.neutron [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 07:36:49 compute-0 nova_compute[187185]: 2025-11-29 07:36:49.890 187189 INFO nova.virt.libvirt.driver [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 07:36:49 compute-0 nova_compute[187185]: 2025-11-29 07:36:49.907 187189 DEBUG nova.compute.manager [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 07:36:50 compute-0 nova_compute[187185]: 2025-11-29 07:36:50.077 187189 DEBUG nova.compute.manager [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 07:36:50 compute-0 nova_compute[187185]: 2025-11-29 07:36:50.079 187189 DEBUG nova.virt.libvirt.driver [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 07:36:50 compute-0 nova_compute[187185]: 2025-11-29 07:36:50.079 187189 INFO nova.virt.libvirt.driver [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Creating image(s)
Nov 29 07:36:50 compute-0 nova_compute[187185]: 2025-11-29 07:36:50.080 187189 DEBUG oslo_concurrency.lockutils [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "/var/lib/nova/instances/093d7b65-31ad-44e4-a172-5a83bc186750/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:36:50 compute-0 nova_compute[187185]: 2025-11-29 07:36:50.080 187189 DEBUG oslo_concurrency.lockutils [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "/var/lib/nova/instances/093d7b65-31ad-44e4-a172-5a83bc186750/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:36:50 compute-0 nova_compute[187185]: 2025-11-29 07:36:50.081 187189 DEBUG oslo_concurrency.lockutils [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "/var/lib/nova/instances/093d7b65-31ad-44e4-a172-5a83bc186750/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:36:50 compute-0 nova_compute[187185]: 2025-11-29 07:36:50.102 187189 DEBUG oslo_concurrency.processutils [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:36:50 compute-0 nova_compute[187185]: 2025-11-29 07:36:50.211 187189 DEBUG oslo_concurrency.processutils [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.108s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:36:50 compute-0 nova_compute[187185]: 2025-11-29 07:36:50.214 187189 DEBUG oslo_concurrency.lockutils [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:36:50 compute-0 nova_compute[187185]: 2025-11-29 07:36:50.216 187189 DEBUG oslo_concurrency.lockutils [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:36:50 compute-0 nova_compute[187185]: 2025-11-29 07:36:50.240 187189 DEBUG oslo_concurrency.processutils [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:36:50 compute-0 nova_compute[187185]: 2025-11-29 07:36:50.303 187189 DEBUG oslo_concurrency.processutils [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:36:50 compute-0 nova_compute[187185]: 2025-11-29 07:36:50.305 187189 DEBUG oslo_concurrency.processutils [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/093d7b65-31ad-44e4-a172-5a83bc186750/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:36:50 compute-0 nova_compute[187185]: 2025-11-29 07:36:50.323 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:36:50 compute-0 nova_compute[187185]: 2025-11-29 07:36:50.388 187189 DEBUG nova.policy [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 07:36:50 compute-0 nova_compute[187185]: 2025-11-29 07:36:50.443 187189 DEBUG oslo_concurrency.processutils [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/093d7b65-31ad-44e4-a172-5a83bc186750/disk 1073741824" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:36:50 compute-0 nova_compute[187185]: 2025-11-29 07:36:50.443 187189 DEBUG oslo_concurrency.lockutils [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.227s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:36:50 compute-0 nova_compute[187185]: 2025-11-29 07:36:50.444 187189 DEBUG oslo_concurrency.processutils [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:36:50 compute-0 nova_compute[187185]: 2025-11-29 07:36:50.507 187189 DEBUG oslo_concurrency.processutils [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:36:50 compute-0 nova_compute[187185]: 2025-11-29 07:36:50.509 187189 DEBUG nova.virt.disk.api [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Checking if we can resize image /var/lib/nova/instances/093d7b65-31ad-44e4-a172-5a83bc186750/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:36:50 compute-0 nova_compute[187185]: 2025-11-29 07:36:50.509 187189 DEBUG oslo_concurrency.processutils [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/093d7b65-31ad-44e4-a172-5a83bc186750/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:36:50 compute-0 nova_compute[187185]: 2025-11-29 07:36:50.592 187189 DEBUG oslo_concurrency.processutils [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/093d7b65-31ad-44e4-a172-5a83bc186750/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:36:50 compute-0 nova_compute[187185]: 2025-11-29 07:36:50.593 187189 DEBUG nova.virt.disk.api [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Cannot resize image /var/lib/nova/instances/093d7b65-31ad-44e4-a172-5a83bc186750/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:36:50 compute-0 nova_compute[187185]: 2025-11-29 07:36:50.593 187189 DEBUG nova.objects.instance [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'migration_context' on Instance uuid 093d7b65-31ad-44e4-a172-5a83bc186750 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:36:50 compute-0 nova_compute[187185]: 2025-11-29 07:36:50.617 187189 DEBUG nova.virt.libvirt.driver [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 07:36:50 compute-0 nova_compute[187185]: 2025-11-29 07:36:50.618 187189 DEBUG nova.virt.libvirt.driver [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Ensure instance console log exists: /var/lib/nova/instances/093d7b65-31ad-44e4-a172-5a83bc186750/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 07:36:50 compute-0 nova_compute[187185]: 2025-11-29 07:36:50.618 187189 DEBUG oslo_concurrency.lockutils [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:36:50 compute-0 nova_compute[187185]: 2025-11-29 07:36:50.619 187189 DEBUG oslo_concurrency.lockutils [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:36:50 compute-0 nova_compute[187185]: 2025-11-29 07:36:50.619 187189 DEBUG oslo_concurrency.lockutils [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:36:51 compute-0 nova_compute[187185]: 2025-11-29 07:36:51.321 187189 DEBUG nova.network.neutron [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Successfully created port: 15c71f59-cb6a-4b75-9869-d159a0b45b73 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 07:36:52 compute-0 nova_compute[187185]: 2025-11-29 07:36:52.007 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:36:52 compute-0 nova_compute[187185]: 2025-11-29 07:36:52.293 187189 DEBUG nova.network.neutron [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Successfully updated port: 15c71f59-cb6a-4b75-9869-d159a0b45b73 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 07:36:52 compute-0 nova_compute[187185]: 2025-11-29 07:36:52.313 187189 DEBUG oslo_concurrency.lockutils [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "refresh_cache-093d7b65-31ad-44e4-a172-5a83bc186750" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:36:52 compute-0 nova_compute[187185]: 2025-11-29 07:36:52.313 187189 DEBUG oslo_concurrency.lockutils [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquired lock "refresh_cache-093d7b65-31ad-44e4-a172-5a83bc186750" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:36:52 compute-0 nova_compute[187185]: 2025-11-29 07:36:52.313 187189 DEBUG nova.network.neutron [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:36:52 compute-0 nova_compute[187185]: 2025-11-29 07:36:52.513 187189 DEBUG nova.network.neutron [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:36:52 compute-0 nova_compute[187185]: 2025-11-29 07:36:52.796 187189 DEBUG nova.compute.manager [req-8368e9dd-4e21-4817-8d83-fbf2069758e5 req-4f3cf705-ad5a-4bf2-877d-b2a916492623 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Received event network-changed-15c71f59-cb6a-4b75-9869-d159a0b45b73 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:36:52 compute-0 nova_compute[187185]: 2025-11-29 07:36:52.796 187189 DEBUG nova.compute.manager [req-8368e9dd-4e21-4817-8d83-fbf2069758e5 req-4f3cf705-ad5a-4bf2-877d-b2a916492623 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Refreshing instance network info cache due to event network-changed-15c71f59-cb6a-4b75-9869-d159a0b45b73. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:36:52 compute-0 nova_compute[187185]: 2025-11-29 07:36:52.796 187189 DEBUG oslo_concurrency.lockutils [req-8368e9dd-4e21-4817-8d83-fbf2069758e5 req-4f3cf705-ad5a-4bf2-877d-b2a916492623 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-093d7b65-31ad-44e4-a172-5a83bc186750" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:36:53 compute-0 nova_compute[187185]: 2025-11-29 07:36:53.370 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.010 187189 DEBUG nova.network.neutron [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Updating instance_info_cache with network_info: [{"id": "15c71f59-cb6a-4b75-9869-d159a0b45b73", "address": "fa:16:3e:01:e5:f7", "network": {"id": "796c16d7-a3e7-4359-87e5-120ec2e2bed2", "bridge": "br-int", "label": "tempest-network-smoke--1542129935", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15c71f59-cb", "ovs_interfaceid": "15c71f59-cb6a-4b75-9869-d159a0b45b73", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.516 187189 DEBUG oslo_concurrency.lockutils [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Releasing lock "refresh_cache-093d7b65-31ad-44e4-a172-5a83bc186750" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.517 187189 DEBUG nova.compute.manager [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Instance network_info: |[{"id": "15c71f59-cb6a-4b75-9869-d159a0b45b73", "address": "fa:16:3e:01:e5:f7", "network": {"id": "796c16d7-a3e7-4359-87e5-120ec2e2bed2", "bridge": "br-int", "label": "tempest-network-smoke--1542129935", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15c71f59-cb", "ovs_interfaceid": "15c71f59-cb6a-4b75-9869-d159a0b45b73", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.518 187189 DEBUG oslo_concurrency.lockutils [req-8368e9dd-4e21-4817-8d83-fbf2069758e5 req-4f3cf705-ad5a-4bf2-877d-b2a916492623 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-093d7b65-31ad-44e4-a172-5a83bc186750" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.519 187189 DEBUG nova.network.neutron [req-8368e9dd-4e21-4817-8d83-fbf2069758e5 req-4f3cf705-ad5a-4bf2-877d-b2a916492623 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Refreshing network info cache for port 15c71f59-cb6a-4b75-9869-d159a0b45b73 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.524 187189 DEBUG nova.virt.libvirt.driver [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Start _get_guest_xml network_info=[{"id": "15c71f59-cb6a-4b75-9869-d159a0b45b73", "address": "fa:16:3e:01:e5:f7", "network": {"id": "796c16d7-a3e7-4359-87e5-120ec2e2bed2", "bridge": "br-int", "label": "tempest-network-smoke--1542129935", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15c71f59-cb", "ovs_interfaceid": "15c71f59-cb6a-4b75-9869-d159a0b45b73", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.531 187189 WARNING nova.virt.libvirt.driver [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.537 187189 DEBUG nova.virt.libvirt.host [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.537 187189 DEBUG nova.virt.libvirt.host [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.541 187189 DEBUG nova.virt.libvirt.host [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.541 187189 DEBUG nova.virt.libvirt.host [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.542 187189 DEBUG nova.virt.libvirt.driver [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.542 187189 DEBUG nova.virt.hardware [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.543 187189 DEBUG nova.virt.hardware [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.543 187189 DEBUG nova.virt.hardware [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.543 187189 DEBUG nova.virt.hardware [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.543 187189 DEBUG nova.virt.hardware [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.544 187189 DEBUG nova.virt.hardware [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.544 187189 DEBUG nova.virt.hardware [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.544 187189 DEBUG nova.virt.hardware [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.544 187189 DEBUG nova.virt.hardware [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.544 187189 DEBUG nova.virt.hardware [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.545 187189 DEBUG nova.virt.hardware [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.549 187189 DEBUG nova.virt.libvirt.vif [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:36:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-121163888',display_name='tempest-TestNetworkBasicOps-server-121163888',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-121163888',id=151,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOK26+5uIDc4lonJrFoxGqCd8TWMP8ZTknX2qMMhQQA1+nZWPodq11xfasxwa5Xz+ha7qs1MGlSGWz18/tKdvpySVwpztLYQvXGkyIbHXbHHe+jsmqiqCW7siJOXD6nqJg==',key_name='tempest-TestNetworkBasicOps-1548697059',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-0825x0rp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:36:49Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=093d7b65-31ad-44e4-a172-5a83bc186750,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "15c71f59-cb6a-4b75-9869-d159a0b45b73", "address": "fa:16:3e:01:e5:f7", "network": {"id": "796c16d7-a3e7-4359-87e5-120ec2e2bed2", "bridge": "br-int", "label": "tempest-network-smoke--1542129935", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15c71f59-cb", "ovs_interfaceid": "15c71f59-cb6a-4b75-9869-d159a0b45b73", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.550 187189 DEBUG nova.network.os_vif_util [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "15c71f59-cb6a-4b75-9869-d159a0b45b73", "address": "fa:16:3e:01:e5:f7", "network": {"id": "796c16d7-a3e7-4359-87e5-120ec2e2bed2", "bridge": "br-int", "label": "tempest-network-smoke--1542129935", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15c71f59-cb", "ovs_interfaceid": "15c71f59-cb6a-4b75-9869-d159a0b45b73", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.551 187189 DEBUG nova.network.os_vif_util [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:01:e5:f7,bridge_name='br-int',has_traffic_filtering=True,id=15c71f59-cb6a-4b75-9869-d159a0b45b73,network=Network(796c16d7-a3e7-4359-87e5-120ec2e2bed2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15c71f59-cb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.553 187189 DEBUG nova.objects.instance [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 093d7b65-31ad-44e4-a172-5a83bc186750 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.592 187189 DEBUG nova.virt.libvirt.driver [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:36:54 compute-0 nova_compute[187185]:   <uuid>093d7b65-31ad-44e4-a172-5a83bc186750</uuid>
Nov 29 07:36:54 compute-0 nova_compute[187185]:   <name>instance-00000097</name>
Nov 29 07:36:54 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 07:36:54 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:36:54 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:36:54 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:       <nova:name>tempest-TestNetworkBasicOps-server-121163888</nova:name>
Nov 29 07:36:54 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:36:54</nova:creationTime>
Nov 29 07:36:54 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 07:36:54 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 07:36:54 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:36:54 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:36:54 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:36:54 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:36:54 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:36:54 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:36:54 compute-0 nova_compute[187185]:         <nova:user uuid="1dd0a7f5aaff402eb032cd5e60540dcb">tempest-TestNetworkBasicOps-1587012976-project-member</nova:user>
Nov 29 07:36:54 compute-0 nova_compute[187185]:         <nova:project uuid="ec8b80be17a14d1caf666636283749d0">tempest-TestNetworkBasicOps-1587012976</nova:project>
Nov 29 07:36:54 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:36:54 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 07:36:54 compute-0 nova_compute[187185]:         <nova:port uuid="15c71f59-cb6a-4b75-9869-d159a0b45b73">
Nov 29 07:36:54 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:36:54 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:36:54 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:36:54 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <system>
Nov 29 07:36:54 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:36:54 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:36:54 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:36:54 compute-0 nova_compute[187185]:       <entry name="serial">093d7b65-31ad-44e4-a172-5a83bc186750</entry>
Nov 29 07:36:54 compute-0 nova_compute[187185]:       <entry name="uuid">093d7b65-31ad-44e4-a172-5a83bc186750</entry>
Nov 29 07:36:54 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     </system>
Nov 29 07:36:54 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:36:54 compute-0 nova_compute[187185]:   <os>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:   </os>
Nov 29 07:36:54 compute-0 nova_compute[187185]:   <features>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:   </features>
Nov 29 07:36:54 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:36:54 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:36:54 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:36:54 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/093d7b65-31ad-44e4-a172-5a83bc186750/disk"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:36:54 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/093d7b65-31ad-44e4-a172-5a83bc186750/disk.config"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:36:54 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:01:e5:f7"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:       <target dev="tap15c71f59-cb"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:36:54 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/093d7b65-31ad-44e4-a172-5a83bc186750/console.log" append="off"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <video>
Nov 29 07:36:54 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     </video>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:36:54 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:36:54 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:36:54 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:36:54 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:36:54 compute-0 nova_compute[187185]: </domain>
Nov 29 07:36:54 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.593 187189 DEBUG nova.compute.manager [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Preparing to wait for external event network-vif-plugged-15c71f59-cb6a-4b75-9869-d159a0b45b73 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.594 187189 DEBUG oslo_concurrency.lockutils [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "093d7b65-31ad-44e4-a172-5a83bc186750-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.594 187189 DEBUG oslo_concurrency.lockutils [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "093d7b65-31ad-44e4-a172-5a83bc186750-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.594 187189 DEBUG oslo_concurrency.lockutils [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "093d7b65-31ad-44e4-a172-5a83bc186750-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.595 187189 DEBUG nova.virt.libvirt.vif [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:36:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-121163888',display_name='tempest-TestNetworkBasicOps-server-121163888',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-121163888',id=151,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOK26+5uIDc4lonJrFoxGqCd8TWMP8ZTknX2qMMhQQA1+nZWPodq11xfasxwa5Xz+ha7qs1MGlSGWz18/tKdvpySVwpztLYQvXGkyIbHXbHHe+jsmqiqCW7siJOXD6nqJg==',key_name='tempest-TestNetworkBasicOps-1548697059',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-0825x0rp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:36:49Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=093d7b65-31ad-44e4-a172-5a83bc186750,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "15c71f59-cb6a-4b75-9869-d159a0b45b73", "address": "fa:16:3e:01:e5:f7", "network": {"id": "796c16d7-a3e7-4359-87e5-120ec2e2bed2", "bridge": "br-int", "label": "tempest-network-smoke--1542129935", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15c71f59-cb", "ovs_interfaceid": "15c71f59-cb6a-4b75-9869-d159a0b45b73", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.595 187189 DEBUG nova.network.os_vif_util [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "15c71f59-cb6a-4b75-9869-d159a0b45b73", "address": "fa:16:3e:01:e5:f7", "network": {"id": "796c16d7-a3e7-4359-87e5-120ec2e2bed2", "bridge": "br-int", "label": "tempest-network-smoke--1542129935", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15c71f59-cb", "ovs_interfaceid": "15c71f59-cb6a-4b75-9869-d159a0b45b73", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.596 187189 DEBUG nova.network.os_vif_util [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:01:e5:f7,bridge_name='br-int',has_traffic_filtering=True,id=15c71f59-cb6a-4b75-9869-d159a0b45b73,network=Network(796c16d7-a3e7-4359-87e5-120ec2e2bed2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15c71f59-cb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.596 187189 DEBUG os_vif [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:01:e5:f7,bridge_name='br-int',has_traffic_filtering=True,id=15c71f59-cb6a-4b75-9869-d159a0b45b73,network=Network(796c16d7-a3e7-4359-87e5-120ec2e2bed2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15c71f59-cb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.597 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.597 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.598 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.603 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.604 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap15c71f59-cb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.605 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap15c71f59-cb, col_values=(('external_ids', {'iface-id': '15c71f59-cb6a-4b75-9869-d159a0b45b73', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:01:e5:f7', 'vm-uuid': '093d7b65-31ad-44e4-a172-5a83bc186750'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.607 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:36:54 compute-0 NetworkManager[55227]: <info>  [1764401814.6088] manager: (tap15c71f59-cb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/251)
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.610 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.617 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.618 187189 INFO os_vif [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:01:e5:f7,bridge_name='br-int',has_traffic_filtering=True,id=15c71f59-cb6a-4b75-9869-d159a0b45b73,network=Network(796c16d7-a3e7-4359-87e5-120ec2e2bed2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15c71f59-cb')
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.827 187189 DEBUG nova.virt.libvirt.driver [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.827 187189 DEBUG nova.virt.libvirt.driver [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.827 187189 DEBUG nova.virt.libvirt.driver [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No VIF found with MAC fa:16:3e:01:e5:f7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:36:54 compute-0 nova_compute[187185]: 2025-11-29 07:36:54.828 187189 INFO nova.virt.libvirt.driver [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Using config drive
Nov 29 07:36:55 compute-0 nova_compute[187185]: 2025-11-29 07:36:55.377 187189 INFO nova.virt.libvirt.driver [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Creating config drive at /var/lib/nova/instances/093d7b65-31ad-44e4-a172-5a83bc186750/disk.config
Nov 29 07:36:55 compute-0 nova_compute[187185]: 2025-11-29 07:36:55.383 187189 DEBUG oslo_concurrency.processutils [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/093d7b65-31ad-44e4-a172-5a83bc186750/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf0v4gvhp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:36:55 compute-0 nova_compute[187185]: 2025-11-29 07:36:55.531 187189 DEBUG oslo_concurrency.processutils [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/093d7b65-31ad-44e4-a172-5a83bc186750/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf0v4gvhp" returned: 0 in 0.147s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:36:55 compute-0 kernel: tap15c71f59-cb: entered promiscuous mode
Nov 29 07:36:55 compute-0 NetworkManager[55227]: <info>  [1764401815.6455] manager: (tap15c71f59-cb): new Tun device (/org/freedesktop/NetworkManager/Devices/252)
Nov 29 07:36:55 compute-0 ovn_controller[95281]: 2025-11-29T07:36:55Z|00486|binding|INFO|Claiming lport 15c71f59-cb6a-4b75-9869-d159a0b45b73 for this chassis.
Nov 29 07:36:55 compute-0 ovn_controller[95281]: 2025-11-29T07:36:55Z|00487|binding|INFO|15c71f59-cb6a-4b75-9869-d159a0b45b73: Claiming fa:16:3e:01:e5:f7 10.100.0.13
Nov 29 07:36:55 compute-0 nova_compute[187185]: 2025-11-29 07:36:55.675 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:36:55 compute-0 systemd-udevd[240780]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:36:55 compute-0 nova_compute[187185]: 2025-11-29 07:36:55.688 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:36:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:36:55.699 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:01:e5:f7 10.100.0.13'], port_security=['fa:16:3e:01:e5:f7 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '093d7b65-31ad-44e4-a172-5a83bc186750', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-796c16d7-a3e7-4359-87e5-120ec2e2bed2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ec8b80be17a14d1caf666636283749d0', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9bc2cde2-973a-4def-a1fd-de58363c7267', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=71aa9491-c36d-40d4-9f3a-8ab24eb9a682, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=15c71f59-cb6a-4b75-9869-d159a0b45b73) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:36:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:36:55.702 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 15c71f59-cb6a-4b75-9869-d159a0b45b73 in datapath 796c16d7-a3e7-4359-87e5-120ec2e2bed2 bound to our chassis
Nov 29 07:36:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:36:55.705 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 796c16d7-a3e7-4359-87e5-120ec2e2bed2
Nov 29 07:36:55 compute-0 NetworkManager[55227]: <info>  [1764401815.7244] device (tap15c71f59-cb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:36:55 compute-0 NetworkManager[55227]: <info>  [1764401815.7255] device (tap15c71f59-cb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:36:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:36:55.723 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[44ae482d-36aa-4d54-a4ae-0121460cc37f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:36:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:36:55.724 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap796c16d7-a1 in ovnmeta-796c16d7-a3e7-4359-87e5-120ec2e2bed2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:36:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:36:55.728 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap796c16d7-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:36:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:36:55.729 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[24ec50e7-b877-4be6-9520-1c8aa26c0c76]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:36:55 compute-0 systemd-machined[153486]: New machine qemu-59-instance-00000097.
Nov 29 07:36:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:36:55.730 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d696344c-7d41-471f-a499-ec726ef56536]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:36:55 compute-0 systemd[1]: Started Virtual Machine qemu-59-instance-00000097.
Nov 29 07:36:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:36:55.749 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[a4f2d5fe-d124-49fd-b096-bd3922c91767]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:36:55 compute-0 ovn_controller[95281]: 2025-11-29T07:36:55Z|00488|binding|INFO|Setting lport 15c71f59-cb6a-4b75-9869-d159a0b45b73 ovn-installed in OVS
Nov 29 07:36:55 compute-0 ovn_controller[95281]: 2025-11-29T07:36:55Z|00489|binding|INFO|Setting lport 15c71f59-cb6a-4b75-9869-d159a0b45b73 up in Southbound
Nov 29 07:36:55 compute-0 nova_compute[187185]: 2025-11-29 07:36:55.755 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:36:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:36:55.771 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[527dec3d-427d-4927-8eb4-70bc108cf3fc]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:36:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:36:55.825 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[a151d63f-71e9-434d-8b74-c16dea9b1056]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:36:55 compute-0 systemd-udevd[240786]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:36:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:36:55.837 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[a041384a-bd33-4400-881b-cec3732a9d89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:36:55 compute-0 NetworkManager[55227]: <info>  [1764401815.8392] manager: (tap796c16d7-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/253)
Nov 29 07:36:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:36:55.887 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[8f3cf12f-234d-4e72-bdbc-562477f636be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:36:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:36:55.891 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[0dde9ebc-8472-4f5f-9eab-c06979ec7f91]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:36:55 compute-0 NetworkManager[55227]: <info>  [1764401815.9272] device (tap796c16d7-a0): carrier: link connected
Nov 29 07:36:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:36:55.932 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[0bd721da-6ba9-4598-b7ed-2a843467d092]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:36:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:36:55.956 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e317c23f-b4e8-4ddf-8a50-f48d3234b308]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap796c16d7-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:52:c6:fa'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 155], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 726359, 'reachable_time': 41254, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240828, 'error': None, 'target': 'ovnmeta-796c16d7-a3e7-4359-87e5-120ec2e2bed2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:36:55 compute-0 podman[240807]: 2025-11-29 07:36:55.969107795 +0000 UTC m=+0.073365088 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 07:36:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:36:55.977 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[b91389b9-8053-43de-bdb2-5d58843113a8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe52:c6fa'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 726359, 'tstamp': 726359}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240839, 'error': None, 'target': 'ovnmeta-796c16d7-a3e7-4359-87e5-120ec2e2bed2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:36:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:36:56.001 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[40ed9d1b-4d6a-4b62-bef6-2af2cdc48b1c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap796c16d7-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:52:c6:fa'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 155], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 726359, 'reachable_time': 41254, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 240840, 'error': None, 'target': 'ovnmeta-796c16d7-a3e7-4359-87e5-120ec2e2bed2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:36:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:36:56.043 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6e701f81-46b6-4aed-8990-854e0c29d1fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:36:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:36:56.131 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[74a637cd-7821-4c90-bcbb-a5773a9962ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:36:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:36:56.133 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap796c16d7-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:36:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:36:56.133 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:36:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:36:56.134 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap796c16d7-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:36:56 compute-0 NetworkManager[55227]: <info>  [1764401816.1365] manager: (tap796c16d7-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/254)
Nov 29 07:36:56 compute-0 nova_compute[187185]: 2025-11-29 07:36:56.136 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:36:56 compute-0 kernel: tap796c16d7-a0: entered promiscuous mode
Nov 29 07:36:56 compute-0 nova_compute[187185]: 2025-11-29 07:36:56.141 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:36:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:36:56.142 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap796c16d7-a0, col_values=(('external_ids', {'iface-id': '83384887-ef2a-4b4a-a7db-cff35b23542d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:36:56 compute-0 nova_compute[187185]: 2025-11-29 07:36:56.143 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:36:56 compute-0 ovn_controller[95281]: 2025-11-29T07:36:56Z|00490|binding|INFO|Releasing lport 83384887-ef2a-4b4a-a7db-cff35b23542d from this chassis (sb_readonly=0)
Nov 29 07:36:56 compute-0 nova_compute[187185]: 2025-11-29 07:36:56.165 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:36:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:36:56.166 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/796c16d7-a3e7-4359-87e5-120ec2e2bed2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/796c16d7-a3e7-4359-87e5-120ec2e2bed2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:36:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:36:56.167 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[66ec572c-1124-48e7-93d6-4dfb72ef2d9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:36:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:36:56.168 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:36:56 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:36:56 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:36:56 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-796c16d7-a3e7-4359-87e5-120ec2e2bed2
Nov 29 07:36:56 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:36:56 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:36:56 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:36:56 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/796c16d7-a3e7-4359-87e5-120ec2e2bed2.pid.haproxy
Nov 29 07:36:56 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:36:56 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:36:56 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:36:56 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:36:56 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:36:56 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:36:56 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:36:56 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:36:56 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:36:56 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:36:56 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:36:56 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:36:56 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:36:56 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:36:56 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:36:56 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:36:56 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:36:56 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:36:56 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:36:56 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:36:56 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID 796c16d7-a3e7-4359-87e5-120ec2e2bed2
Nov 29 07:36:56 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:36:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:36:56.169 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-796c16d7-a3e7-4359-87e5-120ec2e2bed2', 'env', 'PROCESS_TAG=haproxy-796c16d7-a3e7-4359-87e5-120ec2e2bed2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/796c16d7-a3e7-4359-87e5-120ec2e2bed2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:36:56 compute-0 nova_compute[187185]: 2025-11-29 07:36:56.312 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764401816.3107538, 093d7b65-31ad-44e4-a172-5a83bc186750 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:36:56 compute-0 nova_compute[187185]: 2025-11-29 07:36:56.313 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] VM Started (Lifecycle Event)
Nov 29 07:36:56 compute-0 nova_compute[187185]: 2025-11-29 07:36:56.368 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:36:56 compute-0 nova_compute[187185]: 2025-11-29 07:36:56.374 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764401816.3123636, 093d7b65-31ad-44e4-a172-5a83bc186750 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:36:56 compute-0 nova_compute[187185]: 2025-11-29 07:36:56.374 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] VM Paused (Lifecycle Event)
Nov 29 07:36:56 compute-0 nova_compute[187185]: 2025-11-29 07:36:56.599 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:36:56 compute-0 nova_compute[187185]: 2025-11-29 07:36:56.606 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:36:56 compute-0 podman[240879]: 2025-11-29 07:36:56.541913783 +0000 UTC m=+0.030015751 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:36:56 compute-0 nova_compute[187185]: 2025-11-29 07:36:56.651 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:36:56 compute-0 podman[240879]: 2025-11-29 07:36:56.713395168 +0000 UTC m=+0.201497176 container create 58b20ee195526e8c0c4406bd57876dbade9e9b97cbfe66964b3b51742883ba19 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-796c16d7-a3e7-4359-87e5-120ec2e2bed2, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 07:36:56 compute-0 systemd[1]: Started libpod-conmon-58b20ee195526e8c0c4406bd57876dbade9e9b97cbfe66964b3b51742883ba19.scope.
Nov 29 07:36:56 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:36:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/560ad32b59fd6b42f8321ca2ef993b7988ceaa904f408dda7b1012d08e579d99/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:36:56 compute-0 podman[240879]: 2025-11-29 07:36:56.91723895 +0000 UTC m=+0.405340918 container init 58b20ee195526e8c0c4406bd57876dbade9e9b97cbfe66964b3b51742883ba19 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-796c16d7-a3e7-4359-87e5-120ec2e2bed2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 07:36:56 compute-0 podman[240879]: 2025-11-29 07:36:56.924543326 +0000 UTC m=+0.412645294 container start 58b20ee195526e8c0c4406bd57876dbade9e9b97cbfe66964b3b51742883ba19 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-796c16d7-a3e7-4359-87e5-120ec2e2bed2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:36:56 compute-0 neutron-haproxy-ovnmeta-796c16d7-a3e7-4359-87e5-120ec2e2bed2[240895]: [NOTICE]   (240899) : New worker (240901) forked
Nov 29 07:36:56 compute-0 neutron-haproxy-ovnmeta-796c16d7-a3e7-4359-87e5-120ec2e2bed2[240895]: [NOTICE]   (240899) : Loading success.
Nov 29 07:36:57 compute-0 nova_compute[187185]: 2025-11-29 07:36:57.010 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:36:57 compute-0 nova_compute[187185]: 2025-11-29 07:36:57.247 187189 DEBUG nova.compute.manager [req-7dec1354-b7ff-4eac-b178-e93e75af6aab req-8aecb927-8cf9-4a12-921f-3ba92b7c1157 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Received event network-vif-plugged-15c71f59-cb6a-4b75-9869-d159a0b45b73 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:36:57 compute-0 nova_compute[187185]: 2025-11-29 07:36:57.247 187189 DEBUG oslo_concurrency.lockutils [req-7dec1354-b7ff-4eac-b178-e93e75af6aab req-8aecb927-8cf9-4a12-921f-3ba92b7c1157 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "093d7b65-31ad-44e4-a172-5a83bc186750-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:36:57 compute-0 nova_compute[187185]: 2025-11-29 07:36:57.248 187189 DEBUG oslo_concurrency.lockutils [req-7dec1354-b7ff-4eac-b178-e93e75af6aab req-8aecb927-8cf9-4a12-921f-3ba92b7c1157 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "093d7b65-31ad-44e4-a172-5a83bc186750-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:36:57 compute-0 nova_compute[187185]: 2025-11-29 07:36:57.248 187189 DEBUG oslo_concurrency.lockutils [req-7dec1354-b7ff-4eac-b178-e93e75af6aab req-8aecb927-8cf9-4a12-921f-3ba92b7c1157 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "093d7b65-31ad-44e4-a172-5a83bc186750-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:36:57 compute-0 nova_compute[187185]: 2025-11-29 07:36:57.249 187189 DEBUG nova.compute.manager [req-7dec1354-b7ff-4eac-b178-e93e75af6aab req-8aecb927-8cf9-4a12-921f-3ba92b7c1157 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Processing event network-vif-plugged-15c71f59-cb6a-4b75-9869-d159a0b45b73 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 07:36:57 compute-0 nova_compute[187185]: 2025-11-29 07:36:57.250 187189 DEBUG nova.compute.manager [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:36:57 compute-0 nova_compute[187185]: 2025-11-29 07:36:57.255 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764401817.2549598, 093d7b65-31ad-44e4-a172-5a83bc186750 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:36:57 compute-0 nova_compute[187185]: 2025-11-29 07:36:57.255 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] VM Resumed (Lifecycle Event)
Nov 29 07:36:57 compute-0 nova_compute[187185]: 2025-11-29 07:36:57.258 187189 DEBUG nova.virt.libvirt.driver [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 07:36:57 compute-0 nova_compute[187185]: 2025-11-29 07:36:57.267 187189 INFO nova.virt.libvirt.driver [-] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Instance spawned successfully.
Nov 29 07:36:57 compute-0 nova_compute[187185]: 2025-11-29 07:36:57.268 187189 DEBUG nova.virt.libvirt.driver [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 07:36:57 compute-0 nova_compute[187185]: 2025-11-29 07:36:57.300 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:36:57 compute-0 nova_compute[187185]: 2025-11-29 07:36:57.308 187189 DEBUG nova.virt.libvirt.driver [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:36:57 compute-0 nova_compute[187185]: 2025-11-29 07:36:57.308 187189 DEBUG nova.virt.libvirt.driver [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:36:57 compute-0 nova_compute[187185]: 2025-11-29 07:36:57.309 187189 DEBUG nova.virt.libvirt.driver [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:36:57 compute-0 nova_compute[187185]: 2025-11-29 07:36:57.310 187189 DEBUG nova.virt.libvirt.driver [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:36:57 compute-0 nova_compute[187185]: 2025-11-29 07:36:57.311 187189 DEBUG nova.virt.libvirt.driver [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:36:57 compute-0 nova_compute[187185]: 2025-11-29 07:36:57.312 187189 DEBUG nova.virt.libvirt.driver [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:36:57 compute-0 nova_compute[187185]: 2025-11-29 07:36:57.324 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:36:57 compute-0 nova_compute[187185]: 2025-11-29 07:36:57.360 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:36:57 compute-0 nova_compute[187185]: 2025-11-29 07:36:57.487 187189 INFO nova.compute.manager [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Took 7.41 seconds to spawn the instance on the hypervisor.
Nov 29 07:36:57 compute-0 nova_compute[187185]: 2025-11-29 07:36:57.487 187189 DEBUG nova.compute.manager [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:36:57 compute-0 nova_compute[187185]: 2025-11-29 07:36:57.547 187189 DEBUG nova.network.neutron [req-8368e9dd-4e21-4817-8d83-fbf2069758e5 req-4f3cf705-ad5a-4bf2-877d-b2a916492623 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Updated VIF entry in instance network info cache for port 15c71f59-cb6a-4b75-9869-d159a0b45b73. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:36:57 compute-0 nova_compute[187185]: 2025-11-29 07:36:57.548 187189 DEBUG nova.network.neutron [req-8368e9dd-4e21-4817-8d83-fbf2069758e5 req-4f3cf705-ad5a-4bf2-877d-b2a916492623 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Updating instance_info_cache with network_info: [{"id": "15c71f59-cb6a-4b75-9869-d159a0b45b73", "address": "fa:16:3e:01:e5:f7", "network": {"id": "796c16d7-a3e7-4359-87e5-120ec2e2bed2", "bridge": "br-int", "label": "tempest-network-smoke--1542129935", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15c71f59-cb", "ovs_interfaceid": "15c71f59-cb6a-4b75-9869-d159a0b45b73", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:36:57 compute-0 nova_compute[187185]: 2025-11-29 07:36:57.572 187189 DEBUG oslo_concurrency.lockutils [req-8368e9dd-4e21-4817-8d83-fbf2069758e5 req-4f3cf705-ad5a-4bf2-877d-b2a916492623 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-093d7b65-31ad-44e4-a172-5a83bc186750" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:36:57 compute-0 nova_compute[187185]: 2025-11-29 07:36:57.600 187189 INFO nova.compute.manager [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Took 8.02 seconds to build instance.
Nov 29 07:36:57 compute-0 nova_compute[187185]: 2025-11-29 07:36:57.621 187189 DEBUG oslo_concurrency.lockutils [None req-a575a7b9-0857-4fad-aab0-cba30a8b7d54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "093d7b65-31ad-44e4-a172-5a83bc186750" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.275s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:36:59 compute-0 nova_compute[187185]: 2025-11-29 07:36:59.392 187189 DEBUG nova.compute.manager [req-d4099eb6-d26b-47fe-840e-d1d79a1508af req-11035f55-4042-4b29-b0ee-7eacf1a7ffb0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Received event network-vif-plugged-15c71f59-cb6a-4b75-9869-d159a0b45b73 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:36:59 compute-0 nova_compute[187185]: 2025-11-29 07:36:59.392 187189 DEBUG oslo_concurrency.lockutils [req-d4099eb6-d26b-47fe-840e-d1d79a1508af req-11035f55-4042-4b29-b0ee-7eacf1a7ffb0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "093d7b65-31ad-44e4-a172-5a83bc186750-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:36:59 compute-0 nova_compute[187185]: 2025-11-29 07:36:59.393 187189 DEBUG oslo_concurrency.lockutils [req-d4099eb6-d26b-47fe-840e-d1d79a1508af req-11035f55-4042-4b29-b0ee-7eacf1a7ffb0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "093d7b65-31ad-44e4-a172-5a83bc186750-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:36:59 compute-0 nova_compute[187185]: 2025-11-29 07:36:59.393 187189 DEBUG oslo_concurrency.lockutils [req-d4099eb6-d26b-47fe-840e-d1d79a1508af req-11035f55-4042-4b29-b0ee-7eacf1a7ffb0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "093d7b65-31ad-44e4-a172-5a83bc186750-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:36:59 compute-0 nova_compute[187185]: 2025-11-29 07:36:59.394 187189 DEBUG nova.compute.manager [req-d4099eb6-d26b-47fe-840e-d1d79a1508af req-11035f55-4042-4b29-b0ee-7eacf1a7ffb0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] No waiting events found dispatching network-vif-plugged-15c71f59-cb6a-4b75-9869-d159a0b45b73 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:36:59 compute-0 nova_compute[187185]: 2025-11-29 07:36:59.394 187189 WARNING nova.compute.manager [req-d4099eb6-d26b-47fe-840e-d1d79a1508af req-11035f55-4042-4b29-b0ee-7eacf1a7ffb0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Received unexpected event network-vif-plugged-15c71f59-cb6a-4b75-9869-d159a0b45b73 for instance with vm_state active and task_state None.
Nov 29 07:36:59 compute-0 nova_compute[187185]: 2025-11-29 07:36:59.608 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:36:59 compute-0 podman[240911]: 2025-11-29 07:36:59.802163631 +0000 UTC m=+0.054487314 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_id=edpm, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 07:36:59 compute-0 podman[240910]: 2025-11-29 07:36:59.83074 +0000 UTC m=+0.082468176 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, config_id=multipathd)
Nov 29 07:37:01 compute-0 nova_compute[187185]: 2025-11-29 07:37:01.704 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:37:01 compute-0 NetworkManager[55227]: <info>  [1764401821.7055] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/255)
Nov 29 07:37:01 compute-0 NetworkManager[55227]: <info>  [1764401821.7069] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/256)
Nov 29 07:37:01 compute-0 nova_compute[187185]: 2025-11-29 07:37:01.818 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:37:01 compute-0 ovn_controller[95281]: 2025-11-29T07:37:01Z|00491|binding|INFO|Releasing lport 83384887-ef2a-4b4a-a7db-cff35b23542d from this chassis (sb_readonly=0)
Nov 29 07:37:01 compute-0 nova_compute[187185]: 2025-11-29 07:37:01.882 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:37:02 compute-0 nova_compute[187185]: 2025-11-29 07:37:02.012 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:37:02 compute-0 nova_compute[187185]: 2025-11-29 07:37:02.166 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:37:02 compute-0 nova_compute[187185]: 2025-11-29 07:37:02.353 187189 DEBUG nova.compute.manager [req-fe01087b-6374-47a7-a951-10cb01c87717 req-bcf3c8ea-6970-4b3c-83b9-04c9c7d41a07 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Received event network-changed-15c71f59-cb6a-4b75-9869-d159a0b45b73 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:37:02 compute-0 nova_compute[187185]: 2025-11-29 07:37:02.354 187189 DEBUG nova.compute.manager [req-fe01087b-6374-47a7-a951-10cb01c87717 req-bcf3c8ea-6970-4b3c-83b9-04c9c7d41a07 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Refreshing instance network info cache due to event network-changed-15c71f59-cb6a-4b75-9869-d159a0b45b73. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:37:02 compute-0 nova_compute[187185]: 2025-11-29 07:37:02.354 187189 DEBUG oslo_concurrency.lockutils [req-fe01087b-6374-47a7-a951-10cb01c87717 req-bcf3c8ea-6970-4b3c-83b9-04c9c7d41a07 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-093d7b65-31ad-44e4-a172-5a83bc186750" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:37:02 compute-0 nova_compute[187185]: 2025-11-29 07:37:02.355 187189 DEBUG oslo_concurrency.lockutils [req-fe01087b-6374-47a7-a951-10cb01c87717 req-bcf3c8ea-6970-4b3c-83b9-04c9c7d41a07 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-093d7b65-31ad-44e4-a172-5a83bc186750" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:37:02 compute-0 nova_compute[187185]: 2025-11-29 07:37:02.355 187189 DEBUG nova.network.neutron [req-fe01087b-6374-47a7-a951-10cb01c87717 req-bcf3c8ea-6970-4b3c-83b9-04c9c7d41a07 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Refreshing network info cache for port 15c71f59-cb6a-4b75-9869-d159a0b45b73 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:37:04 compute-0 nova_compute[187185]: 2025-11-29 07:37:04.612 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:37:05 compute-0 nova_compute[187185]: 2025-11-29 07:37:05.084 187189 DEBUG nova.network.neutron [req-fe01087b-6374-47a7-a951-10cb01c87717 req-bcf3c8ea-6970-4b3c-83b9-04c9c7d41a07 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Updated VIF entry in instance network info cache for port 15c71f59-cb6a-4b75-9869-d159a0b45b73. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:37:05 compute-0 nova_compute[187185]: 2025-11-29 07:37:05.085 187189 DEBUG nova.network.neutron [req-fe01087b-6374-47a7-a951-10cb01c87717 req-bcf3c8ea-6970-4b3c-83b9-04c9c7d41a07 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Updating instance_info_cache with network_info: [{"id": "15c71f59-cb6a-4b75-9869-d159a0b45b73", "address": "fa:16:3e:01:e5:f7", "network": {"id": "796c16d7-a3e7-4359-87e5-120ec2e2bed2", "bridge": "br-int", "label": "tempest-network-smoke--1542129935", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15c71f59-cb", "ovs_interfaceid": "15c71f59-cb6a-4b75-9869-d159a0b45b73", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:37:05 compute-0 nova_compute[187185]: 2025-11-29 07:37:05.104 187189 DEBUG oslo_concurrency.lockutils [req-fe01087b-6374-47a7-a951-10cb01c87717 req-bcf3c8ea-6970-4b3c-83b9-04c9c7d41a07 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-093d7b65-31ad-44e4-a172-5a83bc186750" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:37:07 compute-0 nova_compute[187185]: 2025-11-29 07:37:07.017 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:37:09 compute-0 nova_compute[187185]: 2025-11-29 07:37:09.032 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:37:09 compute-0 nova_compute[187185]: 2025-11-29 07:37:09.616 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:37:10 compute-0 ovn_controller[95281]: 2025-11-29T07:37:10Z|00063|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:01:e5:f7 10.100.0.13
Nov 29 07:37:10 compute-0 ovn_controller[95281]: 2025-11-29T07:37:10Z|00064|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:01:e5:f7 10.100.0.13
Nov 29 07:37:10 compute-0 podman[240972]: 2025-11-29 07:37:10.825354915 +0000 UTC m=+0.078480113 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 07:37:10 compute-0 podman[240974]: 2025-11-29 07:37:10.850580639 +0000 UTC m=+0.088154057 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 07:37:10 compute-0 podman[240973]: 2025-11-29 07:37:10.880813105 +0000 UTC m=+0.122985383 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, distribution-scope=public, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9-minimal, config_id=edpm, container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc.)
Nov 29 07:37:12 compute-0 nova_compute[187185]: 2025-11-29 07:37:12.018 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:37:12 compute-0 nova_compute[187185]: 2025-11-29 07:37:12.383 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:37:13 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:37:13.528 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=55, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=54) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:37:13 compute-0 nova_compute[187185]: 2025-11-29 07:37:13.529 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:37:13 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:37:13.530 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:37:14 compute-0 nova_compute[187185]: 2025-11-29 07:37:14.620 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:37:17 compute-0 nova_compute[187185]: 2025-11-29 07:37:17.021 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:37:17 compute-0 nova_compute[187185]: 2025-11-29 07:37:17.490 187189 INFO nova.compute.manager [None req-30f50fc1-1a45-49cb-bc5a-658663151ce7 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Get console output
Nov 29 07:37:17 compute-0 nova_compute[187185]: 2025-11-29 07:37:17.496 213758 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 29 07:37:17 compute-0 sshd-session[241033]: Invalid user tempuser from 190.181.27.27 port 43516
Nov 29 07:37:17 compute-0 sshd-session[241033]: Received disconnect from 190.181.27.27 port 43516:11: Bye Bye [preauth]
Nov 29 07:37:17 compute-0 sshd-session[241033]: Disconnected from invalid user tempuser 190.181.27.27 port 43516 [preauth]
Nov 29 07:37:18 compute-0 nova_compute[187185]: 2025-11-29 07:37:18.448 187189 INFO nova.compute.manager [None req-eaba5cfc-be8a-4191-94a3-dc5910797b2c 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Get console output
Nov 29 07:37:18 compute-0 nova_compute[187185]: 2025-11-29 07:37:18.455 213758 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 29 07:37:19 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:37:19.533 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '55'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:37:19 compute-0 nova_compute[187185]: 2025-11-29 07:37:19.624 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:37:19 compute-0 podman[241036]: 2025-11-29 07:37:19.897347555 +0000 UTC m=+0.150228101 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 07:37:20 compute-0 nova_compute[187185]: 2025-11-29 07:37:20.989 187189 DEBUG nova.compute.manager [req-0cb37614-c376-43fa-96f9-ea914d42ee00 req-34ca454d-d7b0-4bb0-93ce-a9a0773a972e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Received event network-changed-15c71f59-cb6a-4b75-9869-d159a0b45b73 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:37:20 compute-0 nova_compute[187185]: 2025-11-29 07:37:20.989 187189 DEBUG nova.compute.manager [req-0cb37614-c376-43fa-96f9-ea914d42ee00 req-34ca454d-d7b0-4bb0-93ce-a9a0773a972e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Refreshing instance network info cache due to event network-changed-15c71f59-cb6a-4b75-9869-d159a0b45b73. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:37:20 compute-0 nova_compute[187185]: 2025-11-29 07:37:20.990 187189 DEBUG oslo_concurrency.lockutils [req-0cb37614-c376-43fa-96f9-ea914d42ee00 req-34ca454d-d7b0-4bb0-93ce-a9a0773a972e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-093d7b65-31ad-44e4-a172-5a83bc186750" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:37:20 compute-0 nova_compute[187185]: 2025-11-29 07:37:20.991 187189 DEBUG oslo_concurrency.lockutils [req-0cb37614-c376-43fa-96f9-ea914d42ee00 req-34ca454d-d7b0-4bb0-93ce-a9a0773a972e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-093d7b65-31ad-44e4-a172-5a83bc186750" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:37:20 compute-0 nova_compute[187185]: 2025-11-29 07:37:20.991 187189 DEBUG nova.network.neutron [req-0cb37614-c376-43fa-96f9-ea914d42ee00 req-34ca454d-d7b0-4bb0-93ce-a9a0773a972e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Refreshing network info cache for port 15c71f59-cb6a-4b75-9869-d159a0b45b73 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:37:21 compute-0 nova_compute[187185]: 2025-11-29 07:37:21.041 187189 DEBUG oslo_concurrency.lockutils [None req-010d3cb7-7103-4dd3-b272-69e6ea578084 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "093d7b65-31ad-44e4-a172-5a83bc186750" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:37:21 compute-0 nova_compute[187185]: 2025-11-29 07:37:21.042 187189 DEBUG oslo_concurrency.lockutils [None req-010d3cb7-7103-4dd3-b272-69e6ea578084 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "093d7b65-31ad-44e4-a172-5a83bc186750" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:37:21 compute-0 nova_compute[187185]: 2025-11-29 07:37:21.043 187189 DEBUG oslo_concurrency.lockutils [None req-010d3cb7-7103-4dd3-b272-69e6ea578084 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "093d7b65-31ad-44e4-a172-5a83bc186750-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:37:21 compute-0 nova_compute[187185]: 2025-11-29 07:37:21.043 187189 DEBUG oslo_concurrency.lockutils [None req-010d3cb7-7103-4dd3-b272-69e6ea578084 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "093d7b65-31ad-44e4-a172-5a83bc186750-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:37:21 compute-0 nova_compute[187185]: 2025-11-29 07:37:21.044 187189 DEBUG oslo_concurrency.lockutils [None req-010d3cb7-7103-4dd3-b272-69e6ea578084 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "093d7b65-31ad-44e4-a172-5a83bc186750-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:37:21 compute-0 nova_compute[187185]: 2025-11-29 07:37:21.061 187189 INFO nova.compute.manager [None req-010d3cb7-7103-4dd3-b272-69e6ea578084 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Terminating instance
Nov 29 07:37:21 compute-0 nova_compute[187185]: 2025-11-29 07:37:21.075 187189 DEBUG nova.compute.manager [None req-010d3cb7-7103-4dd3-b272-69e6ea578084 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 07:37:21 compute-0 kernel: tap15c71f59-cb (unregistering): left promiscuous mode
Nov 29 07:37:21 compute-0 NetworkManager[55227]: <info>  [1764401841.1048] device (tap15c71f59-cb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:37:21 compute-0 nova_compute[187185]: 2025-11-29 07:37:21.115 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:37:21 compute-0 ovn_controller[95281]: 2025-11-29T07:37:21Z|00492|binding|INFO|Releasing lport 15c71f59-cb6a-4b75-9869-d159a0b45b73 from this chassis (sb_readonly=0)
Nov 29 07:37:21 compute-0 ovn_controller[95281]: 2025-11-29T07:37:21Z|00493|binding|INFO|Setting lport 15c71f59-cb6a-4b75-9869-d159a0b45b73 down in Southbound
Nov 29 07:37:21 compute-0 ovn_controller[95281]: 2025-11-29T07:37:21Z|00494|binding|INFO|Removing iface tap15c71f59-cb ovn-installed in OVS
Nov 29 07:37:21 compute-0 nova_compute[187185]: 2025-11-29 07:37:21.121 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:37:21 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:37:21.125 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:01:e5:f7 10.100.0.13'], port_security=['fa:16:3e:01:e5:f7 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '093d7b65-31ad-44e4-a172-5a83bc186750', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-796c16d7-a3e7-4359-87e5-120ec2e2bed2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ec8b80be17a14d1caf666636283749d0', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9bc2cde2-973a-4def-a1fd-de58363c7267', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=71aa9491-c36d-40d4-9f3a-8ab24eb9a682, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=15c71f59-cb6a-4b75-9869-d159a0b45b73) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:37:21 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:37:21.127 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 15c71f59-cb6a-4b75-9869-d159a0b45b73 in datapath 796c16d7-a3e7-4359-87e5-120ec2e2bed2 unbound from our chassis
Nov 29 07:37:21 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:37:21.131 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 796c16d7-a3e7-4359-87e5-120ec2e2bed2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:37:21 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:37:21.134 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[7a214c01-71f1-449f-89e0-ef285e965a60]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:37:21 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:37:21.136 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-796c16d7-a3e7-4359-87e5-120ec2e2bed2 namespace which is not needed anymore
Nov 29 07:37:21 compute-0 nova_compute[187185]: 2025-11-29 07:37:21.151 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:37:21 compute-0 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d00000097.scope: Deactivated successfully.
Nov 29 07:37:21 compute-0 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d00000097.scope: Consumed 13.218s CPU time.
Nov 29 07:37:21 compute-0 systemd-machined[153486]: Machine qemu-59-instance-00000097 terminated.
Nov 29 07:37:21 compute-0 nova_compute[187185]: 2025-11-29 07:37:21.310 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:37:21 compute-0 nova_compute[187185]: 2025-11-29 07:37:21.316 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:37:21 compute-0 neutron-haproxy-ovnmeta-796c16d7-a3e7-4359-87e5-120ec2e2bed2[240895]: [NOTICE]   (240899) : haproxy version is 2.8.14-c23fe91
Nov 29 07:37:21 compute-0 neutron-haproxy-ovnmeta-796c16d7-a3e7-4359-87e5-120ec2e2bed2[240895]: [NOTICE]   (240899) : path to executable is /usr/sbin/haproxy
Nov 29 07:37:21 compute-0 neutron-haproxy-ovnmeta-796c16d7-a3e7-4359-87e5-120ec2e2bed2[240895]: [WARNING]  (240899) : Exiting Master process...
Nov 29 07:37:21 compute-0 neutron-haproxy-ovnmeta-796c16d7-a3e7-4359-87e5-120ec2e2bed2[240895]: [ALERT]    (240899) : Current worker (240901) exited with code 143 (Terminated)
Nov 29 07:37:21 compute-0 neutron-haproxy-ovnmeta-796c16d7-a3e7-4359-87e5-120ec2e2bed2[240895]: [WARNING]  (240899) : All workers exited. Exiting... (0)
Nov 29 07:37:21 compute-0 systemd[1]: libpod-58b20ee195526e8c0c4406bd57876dbade9e9b97cbfe66964b3b51742883ba19.scope: Deactivated successfully.
Nov 29 07:37:21 compute-0 podman[241090]: 2025-11-29 07:37:21.339029585 +0000 UTC m=+0.072441825 container died 58b20ee195526e8c0c4406bd57876dbade9e9b97cbfe66964b3b51742883ba19 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-796c16d7-a3e7-4359-87e5-120ec2e2bed2, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 07:37:21 compute-0 nova_compute[187185]: 2025-11-29 07:37:21.368 187189 INFO nova.virt.libvirt.driver [-] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Instance destroyed successfully.
Nov 29 07:37:21 compute-0 nova_compute[187185]: 2025-11-29 07:37:21.368 187189 DEBUG nova.objects.instance [None req-010d3cb7-7103-4dd3-b272-69e6ea578084 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'resources' on Instance uuid 093d7b65-31ad-44e4-a172-5a83bc186750 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:37:21 compute-0 nova_compute[187185]: 2025-11-29 07:37:21.382 187189 DEBUG nova.virt.libvirt.vif [None req-010d3cb7-7103-4dd3-b272-69e6ea578084 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:36:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-121163888',display_name='tempest-TestNetworkBasicOps-server-121163888',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-121163888',id=151,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOK26+5uIDc4lonJrFoxGqCd8TWMP8ZTknX2qMMhQQA1+nZWPodq11xfasxwa5Xz+ha7qs1MGlSGWz18/tKdvpySVwpztLYQvXGkyIbHXbHHe+jsmqiqCW7siJOXD6nqJg==',key_name='tempest-TestNetworkBasicOps-1548697059',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:36:57Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-0825x0rp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:36:57Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=093d7b65-31ad-44e4-a172-5a83bc186750,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "15c71f59-cb6a-4b75-9869-d159a0b45b73", "address": "fa:16:3e:01:e5:f7", "network": {"id": "796c16d7-a3e7-4359-87e5-120ec2e2bed2", "bridge": "br-int", "label": "tempest-network-smoke--1542129935", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15c71f59-cb", "ovs_interfaceid": "15c71f59-cb6a-4b75-9869-d159a0b45b73", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:37:21 compute-0 nova_compute[187185]: 2025-11-29 07:37:21.382 187189 DEBUG nova.network.os_vif_util [None req-010d3cb7-7103-4dd3-b272-69e6ea578084 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "15c71f59-cb6a-4b75-9869-d159a0b45b73", "address": "fa:16:3e:01:e5:f7", "network": {"id": "796c16d7-a3e7-4359-87e5-120ec2e2bed2", "bridge": "br-int", "label": "tempest-network-smoke--1542129935", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15c71f59-cb", "ovs_interfaceid": "15c71f59-cb6a-4b75-9869-d159a0b45b73", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:37:21 compute-0 nova_compute[187185]: 2025-11-29 07:37:21.383 187189 DEBUG nova.network.os_vif_util [None req-010d3cb7-7103-4dd3-b272-69e6ea578084 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:01:e5:f7,bridge_name='br-int',has_traffic_filtering=True,id=15c71f59-cb6a-4b75-9869-d159a0b45b73,network=Network(796c16d7-a3e7-4359-87e5-120ec2e2bed2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15c71f59-cb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:37:21 compute-0 nova_compute[187185]: 2025-11-29 07:37:21.384 187189 DEBUG os_vif [None req-010d3cb7-7103-4dd3-b272-69e6ea578084 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:01:e5:f7,bridge_name='br-int',has_traffic_filtering=True,id=15c71f59-cb6a-4b75-9869-d159a0b45b73,network=Network(796c16d7-a3e7-4359-87e5-120ec2e2bed2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15c71f59-cb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:37:21 compute-0 nova_compute[187185]: 2025-11-29 07:37:21.386 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:37:21 compute-0 nova_compute[187185]: 2025-11-29 07:37:21.386 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap15c71f59-cb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:37:21 compute-0 nova_compute[187185]: 2025-11-29 07:37:21.389 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:37:21 compute-0 nova_compute[187185]: 2025-11-29 07:37:21.390 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:37:21 compute-0 nova_compute[187185]: 2025-11-29 07:37:21.393 187189 INFO os_vif [None req-010d3cb7-7103-4dd3-b272-69e6ea578084 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:01:e5:f7,bridge_name='br-int',has_traffic_filtering=True,id=15c71f59-cb6a-4b75-9869-d159a0b45b73,network=Network(796c16d7-a3e7-4359-87e5-120ec2e2bed2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15c71f59-cb')
Nov 29 07:37:21 compute-0 nova_compute[187185]: 2025-11-29 07:37:21.394 187189 INFO nova.virt.libvirt.driver [None req-010d3cb7-7103-4dd3-b272-69e6ea578084 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Deleting instance files /var/lib/nova/instances/093d7b65-31ad-44e4-a172-5a83bc186750_del
Nov 29 07:37:21 compute-0 nova_compute[187185]: 2025-11-29 07:37:21.395 187189 INFO nova.virt.libvirt.driver [None req-010d3cb7-7103-4dd3-b272-69e6ea578084 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Deletion of /var/lib/nova/instances/093d7b65-31ad-44e4-a172-5a83bc186750_del complete
Nov 29 07:37:21 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-58b20ee195526e8c0c4406bd57876dbade9e9b97cbfe66964b3b51742883ba19-userdata-shm.mount: Deactivated successfully.
Nov 29 07:37:21 compute-0 systemd[1]: var-lib-containers-storage-overlay-560ad32b59fd6b42f8321ca2ef993b7988ceaa904f408dda7b1012d08e579d99-merged.mount: Deactivated successfully.
Nov 29 07:37:21 compute-0 podman[241090]: 2025-11-29 07:37:21.412963962 +0000 UTC m=+0.146376202 container cleanup 58b20ee195526e8c0c4406bd57876dbade9e9b97cbfe66964b3b51742883ba19 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-796c16d7-a3e7-4359-87e5-120ec2e2bed2, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 29 07:37:21 compute-0 systemd[1]: libpod-conmon-58b20ee195526e8c0c4406bd57876dbade9e9b97cbfe66964b3b51742883ba19.scope: Deactivated successfully.
Nov 29 07:37:21 compute-0 nova_compute[187185]: 2025-11-29 07:37:21.441 187189 DEBUG nova.compute.manager [req-1c5d8c32-86a9-4688-b178-3746a6c514f1 req-ca4aa0da-4a5f-43fb-b29a-e798c1c13cbf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Received event network-vif-unplugged-15c71f59-cb6a-4b75-9869-d159a0b45b73 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:37:21 compute-0 nova_compute[187185]: 2025-11-29 07:37:21.442 187189 DEBUG oslo_concurrency.lockutils [req-1c5d8c32-86a9-4688-b178-3746a6c514f1 req-ca4aa0da-4a5f-43fb-b29a-e798c1c13cbf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "093d7b65-31ad-44e4-a172-5a83bc186750-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:37:21 compute-0 nova_compute[187185]: 2025-11-29 07:37:21.442 187189 DEBUG oslo_concurrency.lockutils [req-1c5d8c32-86a9-4688-b178-3746a6c514f1 req-ca4aa0da-4a5f-43fb-b29a-e798c1c13cbf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "093d7b65-31ad-44e4-a172-5a83bc186750-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:37:21 compute-0 nova_compute[187185]: 2025-11-29 07:37:21.443 187189 DEBUG oslo_concurrency.lockutils [req-1c5d8c32-86a9-4688-b178-3746a6c514f1 req-ca4aa0da-4a5f-43fb-b29a-e798c1c13cbf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "093d7b65-31ad-44e4-a172-5a83bc186750-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:37:21 compute-0 nova_compute[187185]: 2025-11-29 07:37:21.443 187189 DEBUG nova.compute.manager [req-1c5d8c32-86a9-4688-b178-3746a6c514f1 req-ca4aa0da-4a5f-43fb-b29a-e798c1c13cbf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] No waiting events found dispatching network-vif-unplugged-15c71f59-cb6a-4b75-9869-d159a0b45b73 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:37:21 compute-0 nova_compute[187185]: 2025-11-29 07:37:21.443 187189 DEBUG nova.compute.manager [req-1c5d8c32-86a9-4688-b178-3746a6c514f1 req-ca4aa0da-4a5f-43fb-b29a-e798c1c13cbf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Received event network-vif-unplugged-15c71f59-cb6a-4b75-9869-d159a0b45b73 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 07:37:21 compute-0 nova_compute[187185]: 2025-11-29 07:37:21.487 187189 INFO nova.compute.manager [None req-010d3cb7-7103-4dd3-b272-69e6ea578084 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Took 0.41 seconds to destroy the instance on the hypervisor.
Nov 29 07:37:21 compute-0 nova_compute[187185]: 2025-11-29 07:37:21.488 187189 DEBUG oslo.service.loopingcall [None req-010d3cb7-7103-4dd3-b272-69e6ea578084 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 07:37:21 compute-0 nova_compute[187185]: 2025-11-29 07:37:21.488 187189 DEBUG nova.compute.manager [-] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 07:37:21 compute-0 nova_compute[187185]: 2025-11-29 07:37:21.489 187189 DEBUG nova.network.neutron [-] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 07:37:21 compute-0 podman[241136]: 2025-11-29 07:37:21.503560122 +0000 UTC m=+0.059233641 container remove 58b20ee195526e8c0c4406bd57876dbade9e9b97cbfe66964b3b51742883ba19 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-796c16d7-a3e7-4359-87e5-120ec2e2bed2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:37:21 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:37:21.513 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[08a94337-5395-411a-9ee0-dc5354747020]: (4, ('Sat Nov 29 07:37:21 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-796c16d7-a3e7-4359-87e5-120ec2e2bed2 (58b20ee195526e8c0c4406bd57876dbade9e9b97cbfe66964b3b51742883ba19)\n58b20ee195526e8c0c4406bd57876dbade9e9b97cbfe66964b3b51742883ba19\nSat Nov 29 07:37:21 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-796c16d7-a3e7-4359-87e5-120ec2e2bed2 (58b20ee195526e8c0c4406bd57876dbade9e9b97cbfe66964b3b51742883ba19)\n58b20ee195526e8c0c4406bd57876dbade9e9b97cbfe66964b3b51742883ba19\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:37:21 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:37:21.515 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[265458b9-f86c-421f-8a01-7f67c8d800d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:37:21 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:37:21.517 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap796c16d7-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:37:21 compute-0 nova_compute[187185]: 2025-11-29 07:37:21.520 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:37:21 compute-0 kernel: tap796c16d7-a0: left promiscuous mode
Nov 29 07:37:21 compute-0 nova_compute[187185]: 2025-11-29 07:37:21.538 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:37:21 compute-0 nova_compute[187185]: 2025-11-29 07:37:21.540 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:37:21 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:37:21.542 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[648f130b-6c5c-4afe-ae8a-fd0f59ff16ca]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:37:21 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:37:21.562 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[ae377919-f142-41a7-86f8-a26b73fe44de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:37:21 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:37:21.564 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[f48d10fd-21f7-48bd-be07-59cda35d5d24]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:37:21 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:37:21.586 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[eca6f360-c249-471f-8e35-f5d9f4b2713f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 726347, 'reachable_time': 36518, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241151, 'error': None, 'target': 'ovnmeta-796c16d7-a3e7-4359-87e5-120ec2e2bed2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:37:21 compute-0 systemd[1]: run-netns-ovnmeta\x2d796c16d7\x2da3e7\x2d4359\x2d87e5\x2d120ec2e2bed2.mount: Deactivated successfully.
Nov 29 07:37:21 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:37:21.592 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-796c16d7-a3e7-4359-87e5-120ec2e2bed2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:37:21 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:37:21.592 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[cc0db046-fb74-4d6e-9b29-aef32c5050f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:37:21 compute-0 nova_compute[187185]: 2025-11-29 07:37:21.669 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:37:22 compute-0 nova_compute[187185]: 2025-11-29 07:37:22.023 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:37:22 compute-0 nova_compute[187185]: 2025-11-29 07:37:22.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:37:22 compute-0 nova_compute[187185]: 2025-11-29 07:37:22.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:37:22 compute-0 nova_compute[187185]: 2025-11-29 07:37:22.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:37:22 compute-0 nova_compute[187185]: 2025-11-29 07:37:22.338 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875
Nov 29 07:37:22 compute-0 nova_compute[187185]: 2025-11-29 07:37:22.339 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 07:37:22 compute-0 nova_compute[187185]: 2025-11-29 07:37:22.339 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:37:22 compute-0 nova_compute[187185]: 2025-11-29 07:37:22.396 187189 DEBUG nova.network.neutron [-] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:37:22 compute-0 nova_compute[187185]: 2025-11-29 07:37:22.418 187189 INFO nova.compute.manager [-] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Took 0.93 seconds to deallocate network for instance.
Nov 29 07:37:22 compute-0 nova_compute[187185]: 2025-11-29 07:37:22.513 187189 DEBUG oslo_concurrency.lockutils [None req-010d3cb7-7103-4dd3-b272-69e6ea578084 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:37:22 compute-0 nova_compute[187185]: 2025-11-29 07:37:22.513 187189 DEBUG oslo_concurrency.lockutils [None req-010d3cb7-7103-4dd3-b272-69e6ea578084 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:37:22 compute-0 nova_compute[187185]: 2025-11-29 07:37:22.599 187189 DEBUG nova.compute.provider_tree [None req-010d3cb7-7103-4dd3-b272-69e6ea578084 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:37:22 compute-0 nova_compute[187185]: 2025-11-29 07:37:22.614 187189 DEBUG nova.scheduler.client.report [None req-010d3cb7-7103-4dd3-b272-69e6ea578084 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:37:22 compute-0 nova_compute[187185]: 2025-11-29 07:37:22.642 187189 DEBUG oslo_concurrency.lockutils [None req-010d3cb7-7103-4dd3-b272-69e6ea578084 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:37:22 compute-0 nova_compute[187185]: 2025-11-29 07:37:22.671 187189 INFO nova.scheduler.client.report [None req-010d3cb7-7103-4dd3-b272-69e6ea578084 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Deleted allocations for instance 093d7b65-31ad-44e4-a172-5a83bc186750
Nov 29 07:37:22 compute-0 nova_compute[187185]: 2025-11-29 07:37:22.823 187189 DEBUG nova.network.neutron [req-0cb37614-c376-43fa-96f9-ea914d42ee00 req-34ca454d-d7b0-4bb0-93ce-a9a0773a972e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Updated VIF entry in instance network info cache for port 15c71f59-cb6a-4b75-9869-d159a0b45b73. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:37:22 compute-0 nova_compute[187185]: 2025-11-29 07:37:22.824 187189 DEBUG nova.network.neutron [req-0cb37614-c376-43fa-96f9-ea914d42ee00 req-34ca454d-d7b0-4bb0-93ce-a9a0773a972e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Updating instance_info_cache with network_info: [{"id": "15c71f59-cb6a-4b75-9869-d159a0b45b73", "address": "fa:16:3e:01:e5:f7", "network": {"id": "796c16d7-a3e7-4359-87e5-120ec2e2bed2", "bridge": "br-int", "label": "tempest-network-smoke--1542129935", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15c71f59-cb", "ovs_interfaceid": "15c71f59-cb6a-4b75-9869-d159a0b45b73", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:37:22 compute-0 nova_compute[187185]: 2025-11-29 07:37:22.882 187189 DEBUG oslo_concurrency.lockutils [None req-010d3cb7-7103-4dd3-b272-69e6ea578084 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "093d7b65-31ad-44e4-a172-5a83bc186750" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.840s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:37:22 compute-0 nova_compute[187185]: 2025-11-29 07:37:22.902 187189 DEBUG oslo_concurrency.lockutils [req-0cb37614-c376-43fa-96f9-ea914d42ee00 req-34ca454d-d7b0-4bb0-93ce-a9a0773a972e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-093d7b65-31ad-44e4-a172-5a83bc186750" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:37:23 compute-0 nova_compute[187185]: 2025-11-29 07:37:23.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:37:23 compute-0 nova_compute[187185]: 2025-11-29 07:37:23.965 187189 DEBUG nova.compute.manager [req-5f3ff842-2bfc-49bb-a37f-c65fc7555029 req-db409089-2e8c-45c4-99fb-c02f2e0c15b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Received event network-vif-plugged-15c71f59-cb6a-4b75-9869-d159a0b45b73 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:37:23 compute-0 nova_compute[187185]: 2025-11-29 07:37:23.965 187189 DEBUG oslo_concurrency.lockutils [req-5f3ff842-2bfc-49bb-a37f-c65fc7555029 req-db409089-2e8c-45c4-99fb-c02f2e0c15b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "093d7b65-31ad-44e4-a172-5a83bc186750-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:37:23 compute-0 nova_compute[187185]: 2025-11-29 07:37:23.966 187189 DEBUG oslo_concurrency.lockutils [req-5f3ff842-2bfc-49bb-a37f-c65fc7555029 req-db409089-2e8c-45c4-99fb-c02f2e0c15b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "093d7b65-31ad-44e4-a172-5a83bc186750-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:37:23 compute-0 nova_compute[187185]: 2025-11-29 07:37:23.966 187189 DEBUG oslo_concurrency.lockutils [req-5f3ff842-2bfc-49bb-a37f-c65fc7555029 req-db409089-2e8c-45c4-99fb-c02f2e0c15b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "093d7b65-31ad-44e4-a172-5a83bc186750-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:37:23 compute-0 nova_compute[187185]: 2025-11-29 07:37:23.966 187189 DEBUG nova.compute.manager [req-5f3ff842-2bfc-49bb-a37f-c65fc7555029 req-db409089-2e8c-45c4-99fb-c02f2e0c15b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] No waiting events found dispatching network-vif-plugged-15c71f59-cb6a-4b75-9869-d159a0b45b73 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:37:23 compute-0 nova_compute[187185]: 2025-11-29 07:37:23.967 187189 WARNING nova.compute.manager [req-5f3ff842-2bfc-49bb-a37f-c65fc7555029 req-db409089-2e8c-45c4-99fb-c02f2e0c15b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Received unexpected event network-vif-plugged-15c71f59-cb6a-4b75-9869-d159a0b45b73 for instance with vm_state deleted and task_state None.
Nov 29 07:37:23 compute-0 nova_compute[187185]: 2025-11-29 07:37:23.967 187189 DEBUG nova.compute.manager [req-5f3ff842-2bfc-49bb-a37f-c65fc7555029 req-db409089-2e8c-45c4-99fb-c02f2e0c15b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Received event network-vif-deleted-15c71f59-cb6a-4b75-9869-d159a0b45b73 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:37:25 compute-0 nova_compute[187185]: 2025-11-29 07:37:25.318 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:37:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:37:25.527 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:37:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:37:25.529 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:37:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:37:25.529 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:37:25 compute-0 nova_compute[187185]: 2025-11-29 07:37:25.700 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:37:25 compute-0 nova_compute[187185]: 2025-11-29 07:37:25.701 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:37:25 compute-0 nova_compute[187185]: 2025-11-29 07:37:25.701 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:37:25 compute-0 nova_compute[187185]: 2025-11-29 07:37:25.702 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:37:25 compute-0 nova_compute[187185]: 2025-11-29 07:37:25.957 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:37:25 compute-0 nova_compute[187185]: 2025-11-29 07:37:25.958 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5732MB free_disk=73.2540397644043GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:37:25 compute-0 nova_compute[187185]: 2025-11-29 07:37:25.959 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:37:25 compute-0 nova_compute[187185]: 2025-11-29 07:37:25.959 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:37:26 compute-0 nova_compute[187185]: 2025-11-29 07:37:26.395 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:37:26 compute-0 podman[241153]: 2025-11-29 07:37:26.830025566 +0000 UTC m=+0.086543526 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 07:37:27 compute-0 nova_compute[187185]: 2025-11-29 07:37:27.029 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:37:27 compute-0 nova_compute[187185]: 2025-11-29 07:37:27.404 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:37:27 compute-0 nova_compute[187185]: 2025-11-29 07:37:27.405 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:37:27 compute-0 nova_compute[187185]: 2025-11-29 07:37:27.430 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:37:27 compute-0 nova_compute[187185]: 2025-11-29 07:37:27.642 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:37:28 compute-0 nova_compute[187185]: 2025-11-29 07:37:28.761 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:37:28 compute-0 nova_compute[187185]: 2025-11-29 07:37:28.761 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.802s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:37:30 compute-0 sshd-session[241178]: error: kex_exchange_identification: read: Connection reset by peer
Nov 29 07:37:30 compute-0 sshd-session[241178]: Connection reset by 45.140.17.97 port 51122
Nov 29 07:37:30 compute-0 nova_compute[187185]: 2025-11-29 07:37:30.760 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:37:30 compute-0 nova_compute[187185]: 2025-11-29 07:37:30.761 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:37:30 compute-0 nova_compute[187185]: 2025-11-29 07:37:30.761 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:37:30 compute-0 nova_compute[187185]: 2025-11-29 07:37:30.761 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:37:30 compute-0 nova_compute[187185]: 2025-11-29 07:37:30.762 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:37:30 compute-0 podman[241180]: 2025-11-29 07:37:30.842593411 +0000 UTC m=+0.086566747 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:37:30 compute-0 podman[241179]: 2025-11-29 07:37:30.842190669 +0000 UTC m=+0.095482840 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Nov 29 07:37:31 compute-0 nova_compute[187185]: 2025-11-29 07:37:31.313 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:37:31 compute-0 nova_compute[187185]: 2025-11-29 07:37:31.399 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:37:32 compute-0 nova_compute[187185]: 2025-11-29 07:37:32.032 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:37:36 compute-0 nova_compute[187185]: 2025-11-29 07:37:36.366 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764401841.364576, 093d7b65-31ad-44e4-a172-5a83bc186750 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:37:36 compute-0 nova_compute[187185]: 2025-11-29 07:37:36.366 187189 INFO nova.compute.manager [-] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] VM Stopped (Lifecycle Event)
Nov 29 07:37:36 compute-0 nova_compute[187185]: 2025-11-29 07:37:36.401 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:37:37 compute-0 nova_compute[187185]: 2025-11-29 07:37:37.034 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:37:37 compute-0 nova_compute[187185]: 2025-11-29 07:37:37.158 187189 DEBUG nova.compute.manager [None req-11644e7b-4a4d-4197-808b-bb07d29db471 - - - - - -] [instance: 093d7b65-31ad-44e4-a172-5a83bc186750] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:37:38 compute-0 nova_compute[187185]: 2025-11-29 07:37:38.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:37:38 compute-0 nova_compute[187185]: 2025-11-29 07:37:38.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 29 07:37:40 compute-0 sshd-session[241219]: Invalid user minecraft from 20.255.62.58 port 35058
Nov 29 07:37:40 compute-0 sshd-session[241219]: Received disconnect from 20.255.62.58 port 35058:11: Bye Bye [preauth]
Nov 29 07:37:40 compute-0 sshd-session[241219]: Disconnected from invalid user minecraft 20.255.62.58 port 35058 [preauth]
Nov 29 07:37:41 compute-0 nova_compute[187185]: 2025-11-29 07:37:41.404 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:37:41 compute-0 podman[241223]: 2025-11-29 07:37:41.807449535 +0000 UTC m=+0.067622109 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 07:37:41 compute-0 podman[241221]: 2025-11-29 07:37:41.827048811 +0000 UTC m=+0.083665174 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 07:37:41 compute-0 podman[241222]: 2025-11-29 07:37:41.831810486 +0000 UTC m=+0.086704770 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_id=edpm, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, name=ubi9-minimal)
Nov 29 07:37:42 compute-0 nova_compute[187185]: 2025-11-29 07:37:42.036 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:37:43 compute-0 sshd[128727]: Timeout before authentication for connection from 115.190.136.184 to 38.102.83.110, pid = 240402
Nov 29 07:37:46 compute-0 nova_compute[187185]: 2025-11-29 07:37:46.406 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:37:47 compute-0 nova_compute[187185]: 2025-11-29 07:37:47.039 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:37:48 compute-0 nova_compute[187185]: 2025-11-29 07:37:48.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:37:50 compute-0 podman[241281]: 2025-11-29 07:37:50.868048211 +0000 UTC m=+0.133583331 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller)
Nov 29 07:37:51 compute-0 nova_compute[187185]: 2025-11-29 07:37:51.408 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:37:51 compute-0 nova_compute[187185]: 2025-11-29 07:37:51.827 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:37:52 compute-0 nova_compute[187185]: 2025-11-29 07:37:52.056 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:37:52 compute-0 nova_compute[187185]: 2025-11-29 07:37:52.072 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:37:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:37:56.433 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=56, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=55) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:37:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:37:56.435 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:37:56 compute-0 nova_compute[187185]: 2025-11-29 07:37:56.441 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:37:57 compute-0 nova_compute[187185]: 2025-11-29 07:37:57.062 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:37:57 compute-0 podman[241308]: 2025-11-29 07:37:57.786390104 +0000 UTC m=+0.048418424 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 07:38:01 compute-0 nova_compute[187185]: 2025-11-29 07:38:01.484 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:01 compute-0 podman[241335]: 2025-11-29 07:38:01.830899797 +0000 UTC m=+0.086590577 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible)
Nov 29 07:38:01 compute-0 podman[241334]: 2025-11-29 07:38:01.839000827 +0000 UTC m=+0.093403731 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:38:02 compute-0 nova_compute[187185]: 2025-11-29 07:38:02.064 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:05.437 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '56'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:38:06 compute-0 nova_compute[187185]: 2025-11-29 07:38:06.486 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:07 compute-0 nova_compute[187185]: 2025-11-29 07:38:07.067 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:11 compute-0 nova_compute[187185]: 2025-11-29 07:38:11.488 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:12 compute-0 nova_compute[187185]: 2025-11-29 07:38:12.069 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:12 compute-0 podman[241373]: 2025-11-29 07:38:12.818652312 +0000 UTC m=+0.069250926 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_id=edpm, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.openshift.expose-services=, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, vcs-type=git)
Nov 29 07:38:12 compute-0 podman[241372]: 2025-11-29 07:38:12.819772204 +0000 UTC m=+0.072848288 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:38:12 compute-0 podman[241374]: 2025-11-29 07:38:12.832740181 +0000 UTC m=+0.077873440 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 07:38:14 compute-0 nova_compute[187185]: 2025-11-29 07:38:14.685 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:38:14 compute-0 nova_compute[187185]: 2025-11-29 07:38:14.685 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 29 07:38:14 compute-0 nova_compute[187185]: 2025-11-29 07:38:14.703 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 29 07:38:16 compute-0 nova_compute[187185]: 2025-11-29 07:38:16.490 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:16 compute-0 nova_compute[187185]: 2025-11-29 07:38:16.827 187189 DEBUG oslo_concurrency.lockutils [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:38:16 compute-0 nova_compute[187185]: 2025-11-29 07:38:16.828 187189 DEBUG oslo_concurrency.lockutils [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:38:16 compute-0 nova_compute[187185]: 2025-11-29 07:38:16.847 187189 DEBUG nova.compute.manager [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 07:38:16 compute-0 nova_compute[187185]: 2025-11-29 07:38:16.930 187189 DEBUG oslo_concurrency.lockutils [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "f9a714d0-1a3d-4de4-8fc2-fca74c904eff" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:38:16 compute-0 nova_compute[187185]: 2025-11-29 07:38:16.930 187189 DEBUG oslo_concurrency.lockutils [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "f9a714d0-1a3d-4de4-8fc2-fca74c904eff" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:38:16 compute-0 nova_compute[187185]: 2025-11-29 07:38:16.977 187189 DEBUG nova.compute.manager [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 07:38:17 compute-0 nova_compute[187185]: 2025-11-29 07:38:17.008 187189 DEBUG oslo_concurrency.lockutils [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:38:17 compute-0 nova_compute[187185]: 2025-11-29 07:38:17.009 187189 DEBUG oslo_concurrency.lockutils [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:38:17 compute-0 nova_compute[187185]: 2025-11-29 07:38:17.019 187189 DEBUG nova.virt.hardware [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 07:38:17 compute-0 nova_compute[187185]: 2025-11-29 07:38:17.019 187189 INFO nova.compute.claims [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Claim successful on node compute-0.ctlplane.example.com
Nov 29 07:38:17 compute-0 nova_compute[187185]: 2025-11-29 07:38:17.198 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:18 compute-0 nova_compute[187185]: 2025-11-29 07:38:18.414 187189 DEBUG oslo_concurrency.lockutils [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:38:18 compute-0 nova_compute[187185]: 2025-11-29 07:38:18.478 187189 DEBUG nova.compute.provider_tree [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:38:18 compute-0 nova_compute[187185]: 2025-11-29 07:38:18.580 187189 DEBUG nova.scheduler.client.report [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:38:18 compute-0 nova_compute[187185]: 2025-11-29 07:38:18.604 187189 DEBUG oslo_concurrency.lockutils [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.595s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:38:18 compute-0 nova_compute[187185]: 2025-11-29 07:38:18.605 187189 DEBUG nova.compute.manager [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 07:38:18 compute-0 nova_compute[187185]: 2025-11-29 07:38:18.609 187189 DEBUG oslo_concurrency.lockutils [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.195s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:38:18 compute-0 nova_compute[187185]: 2025-11-29 07:38:18.620 187189 DEBUG nova.virt.hardware [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 07:38:18 compute-0 nova_compute[187185]: 2025-11-29 07:38:18.621 187189 INFO nova.compute.claims [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Claim successful on node compute-0.ctlplane.example.com
Nov 29 07:38:19 compute-0 nova_compute[187185]: 2025-11-29 07:38:19.489 187189 DEBUG nova.compute.manager [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 07:38:19 compute-0 nova_compute[187185]: 2025-11-29 07:38:19.490 187189 DEBUG nova.network.neutron [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 07:38:19 compute-0 nova_compute[187185]: 2025-11-29 07:38:19.616 187189 DEBUG nova.compute.provider_tree [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:38:19 compute-0 nova_compute[187185]: 2025-11-29 07:38:19.700 187189 DEBUG nova.scheduler.client.report [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:38:19 compute-0 nova_compute[187185]: 2025-11-29 07:38:19.705 187189 INFO nova.virt.libvirt.driver [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 07:38:20 compute-0 nova_compute[187185]: 2025-11-29 07:38:20.169 187189 DEBUG oslo_concurrency.lockutils [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.560s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:38:20 compute-0 nova_compute[187185]: 2025-11-29 07:38:20.170 187189 DEBUG nova.compute.manager [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 07:38:20 compute-0 nova_compute[187185]: 2025-11-29 07:38:20.378 187189 DEBUG nova.policy [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 07:38:20 compute-0 nova_compute[187185]: 2025-11-29 07:38:20.589 187189 DEBUG nova.compute.manager [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 07:38:21 compute-0 nova_compute[187185]: 2025-11-29 07:38:21.493 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:21 compute-0 podman[241436]: 2025-11-29 07:38:21.891642547 +0000 UTC m=+0.137301516 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 29 07:38:22 compute-0 nova_compute[187185]: 2025-11-29 07:38:22.201 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:22 compute-0 nova_compute[187185]: 2025-11-29 07:38:22.333 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:38:22 compute-0 nova_compute[187185]: 2025-11-29 07:38:22.334 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:38:22 compute-0 nova_compute[187185]: 2025-11-29 07:38:22.335 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:38:24 compute-0 nova_compute[187185]: 2025-11-29 07:38:24.178 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Nov 29 07:38:24 compute-0 nova_compute[187185]: 2025-11-29 07:38:24.179 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Nov 29 07:38:24 compute-0 nova_compute[187185]: 2025-11-29 07:38:24.179 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 07:38:24 compute-0 nova_compute[187185]: 2025-11-29 07:38:24.180 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:38:24 compute-0 nova_compute[187185]: 2025-11-29 07:38:24.956 187189 DEBUG nova.compute.manager [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 07:38:24 compute-0 nova_compute[187185]: 2025-11-29 07:38:24.957 187189 DEBUG nova.network.neutron [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 07:38:25 compute-0 nova_compute[187185]: 2025-11-29 07:38:25.055 187189 INFO nova.virt.libvirt.driver [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 07:38:25 compute-0 nova_compute[187185]: 2025-11-29 07:38:25.105 187189 DEBUG nova.compute.manager [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 07:38:25 compute-0 nova_compute[187185]: 2025-11-29 07:38:25.172 187189 DEBUG nova.policy [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 07:38:25 compute-0 nova_compute[187185]: 2025-11-29 07:38:25.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:38:25 compute-0 nova_compute[187185]: 2025-11-29 07:38:25.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:38:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:25.528 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:38:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:25.529 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:38:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:25.529 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:38:25 compute-0 nova_compute[187185]: 2025-11-29 07:38:25.761 187189 DEBUG nova.compute.manager [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 07:38:25 compute-0 nova_compute[187185]: 2025-11-29 07:38:25.762 187189 DEBUG nova.virt.libvirt.driver [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 07:38:25 compute-0 nova_compute[187185]: 2025-11-29 07:38:25.764 187189 INFO nova.virt.libvirt.driver [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Creating image(s)
Nov 29 07:38:25 compute-0 nova_compute[187185]: 2025-11-29 07:38:25.765 187189 DEBUG oslo_concurrency.lockutils [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "/var/lib/nova/instances/e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:38:25 compute-0 nova_compute[187185]: 2025-11-29 07:38:25.765 187189 DEBUG oslo_concurrency.lockutils [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "/var/lib/nova/instances/e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:38:25 compute-0 nova_compute[187185]: 2025-11-29 07:38:25.766 187189 DEBUG oslo_concurrency.lockutils [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "/var/lib/nova/instances/e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:38:25 compute-0 nova_compute[187185]: 2025-11-29 07:38:25.782 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:38:25 compute-0 nova_compute[187185]: 2025-11-29 07:38:25.782 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:38:25 compute-0 nova_compute[187185]: 2025-11-29 07:38:25.782 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:38:25 compute-0 nova_compute[187185]: 2025-11-29 07:38:25.782 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:38:25 compute-0 nova_compute[187185]: 2025-11-29 07:38:25.784 187189 DEBUG oslo_concurrency.processutils [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:38:25 compute-0 nova_compute[187185]: 2025-11-29 07:38:25.865 187189 DEBUG oslo_concurrency.processutils [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:38:25 compute-0 nova_compute[187185]: 2025-11-29 07:38:25.866 187189 DEBUG oslo_concurrency.lockutils [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:38:25 compute-0 nova_compute[187185]: 2025-11-29 07:38:25.867 187189 DEBUG oslo_concurrency.lockutils [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:38:25 compute-0 nova_compute[187185]: 2025-11-29 07:38:25.881 187189 DEBUG oslo_concurrency.processutils [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:38:25 compute-0 nova_compute[187185]: 2025-11-29 07:38:25.969 187189 DEBUG oslo_concurrency.processutils [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:38:25 compute-0 nova_compute[187185]: 2025-11-29 07:38:25.971 187189 DEBUG oslo_concurrency.processutils [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:38:26 compute-0 nova_compute[187185]: 2025-11-29 07:38:26.030 187189 DEBUG oslo_concurrency.processutils [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/disk 1073741824" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:38:26 compute-0 nova_compute[187185]: 2025-11-29 07:38:26.031 187189 DEBUG oslo_concurrency.lockutils [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.164s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:38:26 compute-0 nova_compute[187185]: 2025-11-29 07:38:26.032 187189 DEBUG oslo_concurrency.processutils [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:38:26 compute-0 nova_compute[187185]: 2025-11-29 07:38:26.106 187189 DEBUG oslo_concurrency.processutils [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:38:26 compute-0 nova_compute[187185]: 2025-11-29 07:38:26.107 187189 DEBUG nova.virt.disk.api [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Checking if we can resize image /var/lib/nova/instances/e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:38:26 compute-0 nova_compute[187185]: 2025-11-29 07:38:26.108 187189 DEBUG oslo_concurrency.processutils [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:38:26 compute-0 nova_compute[187185]: 2025-11-29 07:38:26.171 187189 DEBUG oslo_concurrency.processutils [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:38:26 compute-0 nova_compute[187185]: 2025-11-29 07:38:26.173 187189 DEBUG nova.virt.disk.api [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Cannot resize image /var/lib/nova/instances/e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:38:26 compute-0 nova_compute[187185]: 2025-11-29 07:38:26.173 187189 DEBUG nova.objects.instance [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'migration_context' on Instance uuid e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:38:26 compute-0 nova_compute[187185]: 2025-11-29 07:38:26.179 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:38:26 compute-0 nova_compute[187185]: 2025-11-29 07:38:26.180 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5725MB free_disk=73.2540397644043GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:38:26 compute-0 nova_compute[187185]: 2025-11-29 07:38:26.181 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:38:26 compute-0 nova_compute[187185]: 2025-11-29 07:38:26.181 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:38:26 compute-0 nova_compute[187185]: 2025-11-29 07:38:26.497 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:27 compute-0 nova_compute[187185]: 2025-11-29 07:38:27.172 187189 DEBUG nova.virt.libvirt.driver [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 07:38:27 compute-0 nova_compute[187185]: 2025-11-29 07:38:27.173 187189 DEBUG nova.virt.libvirt.driver [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Ensure instance console log exists: /var/lib/nova/instances/e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 07:38:27 compute-0 nova_compute[187185]: 2025-11-29 07:38:27.174 187189 DEBUG oslo_concurrency.lockutils [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:38:27 compute-0 nova_compute[187185]: 2025-11-29 07:38:27.174 187189 DEBUG oslo_concurrency.lockutils [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:38:27 compute-0 nova_compute[187185]: 2025-11-29 07:38:27.175 187189 DEBUG oslo_concurrency.lockutils [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:38:27 compute-0 nova_compute[187185]: 2025-11-29 07:38:27.203 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:28 compute-0 nova_compute[187185]: 2025-11-29 07:38:28.158 187189 DEBUG nova.compute.manager [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 07:38:28 compute-0 nova_compute[187185]: 2025-11-29 07:38:28.160 187189 DEBUG nova.virt.libvirt.driver [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 07:38:28 compute-0 nova_compute[187185]: 2025-11-29 07:38:28.160 187189 INFO nova.virt.libvirt.driver [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Creating image(s)
Nov 29 07:38:28 compute-0 nova_compute[187185]: 2025-11-29 07:38:28.161 187189 DEBUG oslo_concurrency.lockutils [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "/var/lib/nova/instances/f9a714d0-1a3d-4de4-8fc2-fca74c904eff/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:38:28 compute-0 nova_compute[187185]: 2025-11-29 07:38:28.161 187189 DEBUG oslo_concurrency.lockutils [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "/var/lib/nova/instances/f9a714d0-1a3d-4de4-8fc2-fca74c904eff/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:38:28 compute-0 nova_compute[187185]: 2025-11-29 07:38:28.161 187189 DEBUG oslo_concurrency.lockutils [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "/var/lib/nova/instances/f9a714d0-1a3d-4de4-8fc2-fca74c904eff/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:38:28 compute-0 nova_compute[187185]: 2025-11-29 07:38:28.175 187189 DEBUG oslo_concurrency.processutils [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:38:28 compute-0 nova_compute[187185]: 2025-11-29 07:38:28.232 187189 DEBUG oslo_concurrency.processutils [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:38:28 compute-0 nova_compute[187185]: 2025-11-29 07:38:28.233 187189 DEBUG oslo_concurrency.lockutils [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:38:28 compute-0 nova_compute[187185]: 2025-11-29 07:38:28.234 187189 DEBUG oslo_concurrency.lockutils [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:38:28 compute-0 nova_compute[187185]: 2025-11-29 07:38:28.249 187189 DEBUG oslo_concurrency.processutils [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:38:28 compute-0 nova_compute[187185]: 2025-11-29 07:38:28.313 187189 DEBUG oslo_concurrency.processutils [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:38:28 compute-0 nova_compute[187185]: 2025-11-29 07:38:28.315 187189 DEBUG oslo_concurrency.processutils [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/f9a714d0-1a3d-4de4-8fc2-fca74c904eff/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:38:28 compute-0 nova_compute[187185]: 2025-11-29 07:38:28.681 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 07:38:28 compute-0 nova_compute[187185]: 2025-11-29 07:38:28.681 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance f9a714d0-1a3d-4de4-8fc2-fca74c904eff actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 07:38:28 compute-0 nova_compute[187185]: 2025-11-29 07:38:28.682 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:38:28 compute-0 nova_compute[187185]: 2025-11-29 07:38:28.682 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:38:28 compute-0 podman[241486]: 2025-11-29 07:38:28.814724925 +0000 UTC m=+0.073079035 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 07:38:28 compute-0 nova_compute[187185]: 2025-11-29 07:38:28.890 187189 DEBUG oslo_concurrency.processutils [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/f9a714d0-1a3d-4de4-8fc2-fca74c904eff/disk 1073741824" returned: 0 in 0.575s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:38:28 compute-0 nova_compute[187185]: 2025-11-29 07:38:28.890 187189 DEBUG oslo_concurrency.lockutils [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.657s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:38:28 compute-0 nova_compute[187185]: 2025-11-29 07:38:28.891 187189 DEBUG oslo_concurrency.processutils [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:38:28 compute-0 nova_compute[187185]: 2025-11-29 07:38:28.977 187189 DEBUG oslo_concurrency.processutils [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:38:28 compute-0 nova_compute[187185]: 2025-11-29 07:38:28.979 187189 DEBUG nova.virt.disk.api [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Checking if we can resize image /var/lib/nova/instances/f9a714d0-1a3d-4de4-8fc2-fca74c904eff/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:38:28 compute-0 nova_compute[187185]: 2025-11-29 07:38:28.980 187189 DEBUG oslo_concurrency.processutils [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f9a714d0-1a3d-4de4-8fc2-fca74c904eff/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:38:29 compute-0 nova_compute[187185]: 2025-11-29 07:38:29.039 187189 DEBUG oslo_concurrency.processutils [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f9a714d0-1a3d-4de4-8fc2-fca74c904eff/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:38:29 compute-0 nova_compute[187185]: 2025-11-29 07:38:29.040 187189 DEBUG nova.virt.disk.api [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Cannot resize image /var/lib/nova/instances/f9a714d0-1a3d-4de4-8fc2-fca74c904eff/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:38:29 compute-0 nova_compute[187185]: 2025-11-29 07:38:29.041 187189 DEBUG nova.objects.instance [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'migration_context' on Instance uuid f9a714d0-1a3d-4de4-8fc2-fca74c904eff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:38:29 compute-0 nova_compute[187185]: 2025-11-29 07:38:29.126 187189 DEBUG nova.virt.libvirt.driver [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 07:38:29 compute-0 nova_compute[187185]: 2025-11-29 07:38:29.126 187189 DEBUG nova.virt.libvirt.driver [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Ensure instance console log exists: /var/lib/nova/instances/f9a714d0-1a3d-4de4-8fc2-fca74c904eff/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 07:38:29 compute-0 nova_compute[187185]: 2025-11-29 07:38:29.127 187189 DEBUG oslo_concurrency.lockutils [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:38:29 compute-0 nova_compute[187185]: 2025-11-29 07:38:29.127 187189 DEBUG oslo_concurrency.lockutils [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:38:29 compute-0 nova_compute[187185]: 2025-11-29 07:38:29.128 187189 DEBUG oslo_concurrency.lockutils [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:38:29 compute-0 nova_compute[187185]: 2025-11-29 07:38:29.213 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:38:29 compute-0 nova_compute[187185]: 2025-11-29 07:38:29.242 187189 DEBUG nova.network.neutron [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Successfully created port: ef353cc9-1e6a-4c76-9acf-917aecd8f9a6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 07:38:29 compute-0 nova_compute[187185]: 2025-11-29 07:38:29.614 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:38:30 compute-0 nova_compute[187185]: 2025-11-29 07:38:30.881 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:38:30 compute-0 nova_compute[187185]: 2025-11-29 07:38:30.882 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.701s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:38:31 compute-0 nova_compute[187185]: 2025-11-29 07:38:31.499 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:32 compute-0 nova_compute[187185]: 2025-11-29 07:38:32.206 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:32 compute-0 nova_compute[187185]: 2025-11-29 07:38:32.796 187189 DEBUG nova.network.neutron [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Successfully created port: c65a8c36-5997-4e67-9fa0-e361b7c334ca _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 07:38:32 compute-0 podman[241517]: 2025-11-29 07:38:32.850676896 +0000 UTC m=+0.096485847 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 07:38:32 compute-0 podman[241516]: 2025-11-29 07:38:32.87335618 +0000 UTC m=+0.128611969 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible)
Nov 29 07:38:32 compute-0 nova_compute[187185]: 2025-11-29 07:38:32.883 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:38:32 compute-0 nova_compute[187185]: 2025-11-29 07:38:32.883 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:38:32 compute-0 nova_compute[187185]: 2025-11-29 07:38:32.883 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:38:32 compute-0 nova_compute[187185]: 2025-11-29 07:38:32.883 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:38:32 compute-0 nova_compute[187185]: 2025-11-29 07:38:32.884 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:38:32 compute-0 nova_compute[187185]: 2025-11-29 07:38:32.884 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:38:33 compute-0 nova_compute[187185]: 2025-11-29 07:38:33.485 187189 DEBUG nova.network.neutron [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Successfully created port: 948b9a73-91cd-4803-b642-d1a75f163368 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 07:38:35 compute-0 nova_compute[187185]: 2025-11-29 07:38:35.308 187189 DEBUG nova.network.neutron [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Successfully updated port: c65a8c36-5997-4e67-9fa0-e361b7c334ca _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 07:38:35 compute-0 nova_compute[187185]: 2025-11-29 07:38:35.502 187189 DEBUG nova.compute.manager [req-788bbf3f-74f8-4282-b31e-33392f6bb53e req-86c9ecf3-fb7c-4c9c-bde1-0931411cc4ea 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Received event network-changed-c65a8c36-5997-4e67-9fa0-e361b7c334ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:38:35 compute-0 nova_compute[187185]: 2025-11-29 07:38:35.503 187189 DEBUG nova.compute.manager [req-788bbf3f-74f8-4282-b31e-33392f6bb53e req-86c9ecf3-fb7c-4c9c-bde1-0931411cc4ea 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Refreshing instance network info cache due to event network-changed-c65a8c36-5997-4e67-9fa0-e361b7c334ca. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:38:35 compute-0 nova_compute[187185]: 2025-11-29 07:38:35.503 187189 DEBUG oslo_concurrency.lockutils [req-788bbf3f-74f8-4282-b31e-33392f6bb53e req-86c9ecf3-fb7c-4c9c-bde1-0931411cc4ea 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-f9a714d0-1a3d-4de4-8fc2-fca74c904eff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:38:35 compute-0 nova_compute[187185]: 2025-11-29 07:38:35.503 187189 DEBUG oslo_concurrency.lockutils [req-788bbf3f-74f8-4282-b31e-33392f6bb53e req-86c9ecf3-fb7c-4c9c-bde1-0931411cc4ea 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-f9a714d0-1a3d-4de4-8fc2-fca74c904eff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:38:35 compute-0 nova_compute[187185]: 2025-11-29 07:38:35.504 187189 DEBUG nova.network.neutron [req-788bbf3f-74f8-4282-b31e-33392f6bb53e req-86c9ecf3-fb7c-4c9c-bde1-0931411cc4ea 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Refreshing network info cache for port c65a8c36-5997-4e67-9fa0-e361b7c334ca _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:38:35 compute-0 nova_compute[187185]: 2025-11-29 07:38:35.517 187189 DEBUG oslo_concurrency.lockutils [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "refresh_cache-f9a714d0-1a3d-4de4-8fc2-fca74c904eff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:38:35 compute-0 nova_compute[187185]: 2025-11-29 07:38:35.624 187189 DEBUG nova.network.neutron [req-788bbf3f-74f8-4282-b31e-33392f6bb53e req-86c9ecf3-fb7c-4c9c-bde1-0931411cc4ea 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:38:36 compute-0 nova_compute[187185]: 2025-11-29 07:38:36.271 187189 DEBUG nova.network.neutron [req-788bbf3f-74f8-4282-b31e-33392f6bb53e req-86c9ecf3-fb7c-4c9c-bde1-0931411cc4ea 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:38:36 compute-0 nova_compute[187185]: 2025-11-29 07:38:36.290 187189 DEBUG nova.network.neutron [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Successfully updated port: ef353cc9-1e6a-4c76-9acf-917aecd8f9a6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 07:38:36 compute-0 nova_compute[187185]: 2025-11-29 07:38:36.349 187189 DEBUG oslo_concurrency.lockutils [req-788bbf3f-74f8-4282-b31e-33392f6bb53e req-86c9ecf3-fb7c-4c9c-bde1-0931411cc4ea 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-f9a714d0-1a3d-4de4-8fc2-fca74c904eff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:38:36 compute-0 nova_compute[187185]: 2025-11-29 07:38:36.353 187189 DEBUG oslo_concurrency.lockutils [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquired lock "refresh_cache-f9a714d0-1a3d-4de4-8fc2-fca74c904eff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:38:36 compute-0 nova_compute[187185]: 2025-11-29 07:38:36.353 187189 DEBUG nova.network.neutron [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:38:36 compute-0 nova_compute[187185]: 2025-11-29 07:38:36.526 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:36 compute-0 nova_compute[187185]: 2025-11-29 07:38:36.753 187189 DEBUG nova.network.neutron [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:38:37 compute-0 nova_compute[187185]: 2025-11-29 07:38:37.208 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:37 compute-0 nova_compute[187185]: 2025-11-29 07:38:37.589 187189 DEBUG nova.network.neutron [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Successfully updated port: 948b9a73-91cd-4803-b642-d1a75f163368 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 07:38:37 compute-0 nova_compute[187185]: 2025-11-29 07:38:37.645 187189 DEBUG nova.compute.manager [req-55de944d-51b8-4d1c-87ba-d48a884fec81 req-1afa57d7-793e-4b37-b9f1-625e077f5b0b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Received event network-changed-ef353cc9-1e6a-4c76-9acf-917aecd8f9a6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:38:37 compute-0 nova_compute[187185]: 2025-11-29 07:38:37.646 187189 DEBUG nova.compute.manager [req-55de944d-51b8-4d1c-87ba-d48a884fec81 req-1afa57d7-793e-4b37-b9f1-625e077f5b0b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Refreshing instance network info cache due to event network-changed-ef353cc9-1e6a-4c76-9acf-917aecd8f9a6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:38:37 compute-0 nova_compute[187185]: 2025-11-29 07:38:37.646 187189 DEBUG oslo_concurrency.lockutils [req-55de944d-51b8-4d1c-87ba-d48a884fec81 req-1afa57d7-793e-4b37-b9f1-625e077f5b0b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:38:37 compute-0 nova_compute[187185]: 2025-11-29 07:38:37.646 187189 DEBUG oslo_concurrency.lockutils [req-55de944d-51b8-4d1c-87ba-d48a884fec81 req-1afa57d7-793e-4b37-b9f1-625e077f5b0b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:38:37 compute-0 nova_compute[187185]: 2025-11-29 07:38:37.647 187189 DEBUG nova.network.neutron [req-55de944d-51b8-4d1c-87ba-d48a884fec81 req-1afa57d7-793e-4b37-b9f1-625e077f5b0b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Refreshing network info cache for port ef353cc9-1e6a-4c76-9acf-917aecd8f9a6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:38:37 compute-0 nova_compute[187185]: 2025-11-29 07:38:37.655 187189 DEBUG oslo_concurrency.lockutils [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "refresh_cache-e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:38:37 compute-0 nova_compute[187185]: 2025-11-29 07:38:37.831 187189 DEBUG nova.network.neutron [req-55de944d-51b8-4d1c-87ba-d48a884fec81 req-1afa57d7-793e-4b37-b9f1-625e077f5b0b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:38:37 compute-0 nova_compute[187185]: 2025-11-29 07:38:37.864 187189 DEBUG nova.network.neutron [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Updating instance_info_cache with network_info: [{"id": "c65a8c36-5997-4e67-9fa0-e361b7c334ca", "address": "fa:16:3e:5b:95:82", "network": {"id": "8ac0e70c-84ba-415c-841b-4a5a525b1a9d", "bridge": "br-int", "label": "tempest-network-smoke--597026508", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc65a8c36-59", "ovs_interfaceid": "c65a8c36-5997-4e67-9fa0-e361b7c334ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:38:38 compute-0 nova_compute[187185]: 2025-11-29 07:38:38.246 187189 DEBUG oslo_concurrency.lockutils [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Releasing lock "refresh_cache-f9a714d0-1a3d-4de4-8fc2-fca74c904eff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:38:38 compute-0 nova_compute[187185]: 2025-11-29 07:38:38.247 187189 DEBUG nova.compute.manager [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Instance network_info: |[{"id": "c65a8c36-5997-4e67-9fa0-e361b7c334ca", "address": "fa:16:3e:5b:95:82", "network": {"id": "8ac0e70c-84ba-415c-841b-4a5a525b1a9d", "bridge": "br-int", "label": "tempest-network-smoke--597026508", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc65a8c36-59", "ovs_interfaceid": "c65a8c36-5997-4e67-9fa0-e361b7c334ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 07:38:38 compute-0 nova_compute[187185]: 2025-11-29 07:38:38.251 187189 DEBUG nova.virt.libvirt.driver [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Start _get_guest_xml network_info=[{"id": "c65a8c36-5997-4e67-9fa0-e361b7c334ca", "address": "fa:16:3e:5b:95:82", "network": {"id": "8ac0e70c-84ba-415c-841b-4a5a525b1a9d", "bridge": "br-int", "label": "tempest-network-smoke--597026508", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc65a8c36-59", "ovs_interfaceid": "c65a8c36-5997-4e67-9fa0-e361b7c334ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:38:38 compute-0 nova_compute[187185]: 2025-11-29 07:38:38.258 187189 WARNING nova.virt.libvirt.driver [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:38:38 compute-0 nova_compute[187185]: 2025-11-29 07:38:38.267 187189 DEBUG nova.virt.libvirt.host [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:38:38 compute-0 nova_compute[187185]: 2025-11-29 07:38:38.268 187189 DEBUG nova.virt.libvirt.host [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:38:38 compute-0 nova_compute[187185]: 2025-11-29 07:38:38.270 187189 DEBUG nova.virt.libvirt.host [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:38:38 compute-0 nova_compute[187185]: 2025-11-29 07:38:38.271 187189 DEBUG nova.virt.libvirt.host [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:38:38 compute-0 nova_compute[187185]: 2025-11-29 07:38:38.272 187189 DEBUG nova.virt.libvirt.driver [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:38:38 compute-0 nova_compute[187185]: 2025-11-29 07:38:38.272 187189 DEBUG nova.virt.hardware [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:38:38 compute-0 nova_compute[187185]: 2025-11-29 07:38:38.272 187189 DEBUG nova.virt.hardware [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:38:38 compute-0 nova_compute[187185]: 2025-11-29 07:38:38.273 187189 DEBUG nova.virt.hardware [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:38:38 compute-0 nova_compute[187185]: 2025-11-29 07:38:38.273 187189 DEBUG nova.virt.hardware [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:38:38 compute-0 nova_compute[187185]: 2025-11-29 07:38:38.273 187189 DEBUG nova.virt.hardware [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:38:38 compute-0 nova_compute[187185]: 2025-11-29 07:38:38.273 187189 DEBUG nova.virt.hardware [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:38:38 compute-0 nova_compute[187185]: 2025-11-29 07:38:38.273 187189 DEBUG nova.virt.hardware [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:38:38 compute-0 nova_compute[187185]: 2025-11-29 07:38:38.274 187189 DEBUG nova.virt.hardware [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:38:38 compute-0 nova_compute[187185]: 2025-11-29 07:38:38.274 187189 DEBUG nova.virt.hardware [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:38:38 compute-0 nova_compute[187185]: 2025-11-29 07:38:38.274 187189 DEBUG nova.virt.hardware [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:38:38 compute-0 nova_compute[187185]: 2025-11-29 07:38:38.274 187189 DEBUG nova.virt.hardware [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:38:38 compute-0 nova_compute[187185]: 2025-11-29 07:38:38.277 187189 DEBUG nova.virt.libvirt.vif [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:38:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-501093689',display_name='tempest-TestNetworkBasicOps-server-501093689',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-501093689',id=155,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ5Ztzxn5CQqZYwb2ZkKXPHZ0ufdHUQG1uY8bsPpbcdD+g/M62uTuwYkEj7cRasbqzaR/Kz4+EyR6ADmDGzV1OyeOoChkIqHM+KkJJEQWYFaNk70r6jDVnPuHLA056MeoA==',key_name='tempest-TestNetworkBasicOps-1295521734',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-rdm7tpik',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:38:25Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=f9a714d0-1a3d-4de4-8fc2-fca74c904eff,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c65a8c36-5997-4e67-9fa0-e361b7c334ca", "address": "fa:16:3e:5b:95:82", "network": {"id": "8ac0e70c-84ba-415c-841b-4a5a525b1a9d", "bridge": "br-int", "label": "tempest-network-smoke--597026508", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc65a8c36-59", "ovs_interfaceid": "c65a8c36-5997-4e67-9fa0-e361b7c334ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:38:38 compute-0 nova_compute[187185]: 2025-11-29 07:38:38.277 187189 DEBUG nova.network.os_vif_util [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "c65a8c36-5997-4e67-9fa0-e361b7c334ca", "address": "fa:16:3e:5b:95:82", "network": {"id": "8ac0e70c-84ba-415c-841b-4a5a525b1a9d", "bridge": "br-int", "label": "tempest-network-smoke--597026508", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc65a8c36-59", "ovs_interfaceid": "c65a8c36-5997-4e67-9fa0-e361b7c334ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:38:38 compute-0 nova_compute[187185]: 2025-11-29 07:38:38.278 187189 DEBUG nova.network.os_vif_util [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:95:82,bridge_name='br-int',has_traffic_filtering=True,id=c65a8c36-5997-4e67-9fa0-e361b7c334ca,network=Network(8ac0e70c-84ba-415c-841b-4a5a525b1a9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc65a8c36-59') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:38:38 compute-0 nova_compute[187185]: 2025-11-29 07:38:38.279 187189 DEBUG nova.objects.instance [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'pci_devices' on Instance uuid f9a714d0-1a3d-4de4-8fc2-fca74c904eff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:38:38 compute-0 nova_compute[187185]: 2025-11-29 07:38:38.344 187189 DEBUG nova.virt.libvirt.driver [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:38:38 compute-0 nova_compute[187185]:   <uuid>f9a714d0-1a3d-4de4-8fc2-fca74c904eff</uuid>
Nov 29 07:38:38 compute-0 nova_compute[187185]:   <name>instance-0000009b</name>
Nov 29 07:38:38 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 07:38:38 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:38:38 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:38:38 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:       <nova:name>tempest-TestNetworkBasicOps-server-501093689</nova:name>
Nov 29 07:38:38 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:38:38</nova:creationTime>
Nov 29 07:38:38 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 07:38:38 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 07:38:38 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:38:38 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:38:38 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:38:38 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:38:38 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:38:38 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:38:38 compute-0 nova_compute[187185]:         <nova:user uuid="1dd0a7f5aaff402eb032cd5e60540dcb">tempest-TestNetworkBasicOps-1587012976-project-member</nova:user>
Nov 29 07:38:38 compute-0 nova_compute[187185]:         <nova:project uuid="ec8b80be17a14d1caf666636283749d0">tempest-TestNetworkBasicOps-1587012976</nova:project>
Nov 29 07:38:38 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:38:38 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 07:38:38 compute-0 nova_compute[187185]:         <nova:port uuid="c65a8c36-5997-4e67-9fa0-e361b7c334ca">
Nov 29 07:38:38 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:38:38 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:38:38 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:38:38 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <system>
Nov 29 07:38:38 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:38:38 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:38:38 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:38:38 compute-0 nova_compute[187185]:       <entry name="serial">f9a714d0-1a3d-4de4-8fc2-fca74c904eff</entry>
Nov 29 07:38:38 compute-0 nova_compute[187185]:       <entry name="uuid">f9a714d0-1a3d-4de4-8fc2-fca74c904eff</entry>
Nov 29 07:38:38 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     </system>
Nov 29 07:38:38 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:38:38 compute-0 nova_compute[187185]:   <os>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:   </os>
Nov 29 07:38:38 compute-0 nova_compute[187185]:   <features>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:   </features>
Nov 29 07:38:38 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:38:38 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:38:38 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:38:38 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/f9a714d0-1a3d-4de4-8fc2-fca74c904eff/disk"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:38:38 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/f9a714d0-1a3d-4de4-8fc2-fca74c904eff/disk.config"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:38:38 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:5b:95:82"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:       <target dev="tapc65a8c36-59"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:38:38 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/f9a714d0-1a3d-4de4-8fc2-fca74c904eff/console.log" append="off"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <video>
Nov 29 07:38:38 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     </video>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:38:38 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:38:38 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:38:38 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:38:38 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:38:38 compute-0 nova_compute[187185]: </domain>
Nov 29 07:38:38 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:38:38 compute-0 nova_compute[187185]: 2025-11-29 07:38:38.346 187189 DEBUG nova.compute.manager [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Preparing to wait for external event network-vif-plugged-c65a8c36-5997-4e67-9fa0-e361b7c334ca prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 07:38:38 compute-0 nova_compute[187185]: 2025-11-29 07:38:38.347 187189 DEBUG oslo_concurrency.lockutils [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "f9a714d0-1a3d-4de4-8fc2-fca74c904eff-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:38:38 compute-0 nova_compute[187185]: 2025-11-29 07:38:38.348 187189 DEBUG oslo_concurrency.lockutils [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "f9a714d0-1a3d-4de4-8fc2-fca74c904eff-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:38:38 compute-0 nova_compute[187185]: 2025-11-29 07:38:38.348 187189 DEBUG oslo_concurrency.lockutils [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "f9a714d0-1a3d-4de4-8fc2-fca74c904eff-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:38:38 compute-0 nova_compute[187185]: 2025-11-29 07:38:38.349 187189 DEBUG nova.virt.libvirt.vif [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:38:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-501093689',display_name='tempest-TestNetworkBasicOps-server-501093689',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-501093689',id=155,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ5Ztzxn5CQqZYwb2ZkKXPHZ0ufdHUQG1uY8bsPpbcdD+g/M62uTuwYkEj7cRasbqzaR/Kz4+EyR6ADmDGzV1OyeOoChkIqHM+KkJJEQWYFaNk70r6jDVnPuHLA056MeoA==',key_name='tempest-TestNetworkBasicOps-1295521734',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-rdm7tpik',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:38:25Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=f9a714d0-1a3d-4de4-8fc2-fca74c904eff,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c65a8c36-5997-4e67-9fa0-e361b7c334ca", "address": "fa:16:3e:5b:95:82", "network": {"id": "8ac0e70c-84ba-415c-841b-4a5a525b1a9d", "bridge": "br-int", "label": "tempest-network-smoke--597026508", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc65a8c36-59", "ovs_interfaceid": "c65a8c36-5997-4e67-9fa0-e361b7c334ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:38:38 compute-0 nova_compute[187185]: 2025-11-29 07:38:38.350 187189 DEBUG nova.network.os_vif_util [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "c65a8c36-5997-4e67-9fa0-e361b7c334ca", "address": "fa:16:3e:5b:95:82", "network": {"id": "8ac0e70c-84ba-415c-841b-4a5a525b1a9d", "bridge": "br-int", "label": "tempest-network-smoke--597026508", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc65a8c36-59", "ovs_interfaceid": "c65a8c36-5997-4e67-9fa0-e361b7c334ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:38:38 compute-0 nova_compute[187185]: 2025-11-29 07:38:38.351 187189 DEBUG nova.network.os_vif_util [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:95:82,bridge_name='br-int',has_traffic_filtering=True,id=c65a8c36-5997-4e67-9fa0-e361b7c334ca,network=Network(8ac0e70c-84ba-415c-841b-4a5a525b1a9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc65a8c36-59') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:38:38 compute-0 nova_compute[187185]: 2025-11-29 07:38:38.352 187189 DEBUG os_vif [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:95:82,bridge_name='br-int',has_traffic_filtering=True,id=c65a8c36-5997-4e67-9fa0-e361b7c334ca,network=Network(8ac0e70c-84ba-415c-841b-4a5a525b1a9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc65a8c36-59') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:38:38 compute-0 nova_compute[187185]: 2025-11-29 07:38:38.353 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:38 compute-0 nova_compute[187185]: 2025-11-29 07:38:38.353 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:38:38 compute-0 nova_compute[187185]: 2025-11-29 07:38:38.354 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:38:38 compute-0 nova_compute[187185]: 2025-11-29 07:38:38.359 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:38 compute-0 nova_compute[187185]: 2025-11-29 07:38:38.360 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc65a8c36-59, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:38:38 compute-0 nova_compute[187185]: 2025-11-29 07:38:38.360 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc65a8c36-59, col_values=(('external_ids', {'iface-id': 'c65a8c36-5997-4e67-9fa0-e361b7c334ca', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5b:95:82', 'vm-uuid': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:38:38 compute-0 NetworkManager[55227]: <info>  [1764401918.3650] manager: (tapc65a8c36-59): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/257)
Nov 29 07:38:38 compute-0 nova_compute[187185]: 2025-11-29 07:38:38.367 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:38 compute-0 nova_compute[187185]: 2025-11-29 07:38:38.371 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:38:38 compute-0 nova_compute[187185]: 2025-11-29 07:38:38.374 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:38 compute-0 nova_compute[187185]: 2025-11-29 07:38:38.375 187189 INFO os_vif [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:95:82,bridge_name='br-int',has_traffic_filtering=True,id=c65a8c36-5997-4e67-9fa0-e361b7c334ca,network=Network(8ac0e70c-84ba-415c-841b-4a5a525b1a9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc65a8c36-59')
Nov 29 07:38:38 compute-0 nova_compute[187185]: 2025-11-29 07:38:38.479 187189 DEBUG nova.network.neutron [req-55de944d-51b8-4d1c-87ba-d48a884fec81 req-1afa57d7-793e-4b37-b9f1-625e077f5b0b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:38:38 compute-0 nova_compute[187185]: 2025-11-29 07:38:38.734 187189 DEBUG oslo_concurrency.lockutils [req-55de944d-51b8-4d1c-87ba-d48a884fec81 req-1afa57d7-793e-4b37-b9f1-625e077f5b0b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:38:38 compute-0 nova_compute[187185]: 2025-11-29 07:38:38.735 187189 DEBUG oslo_concurrency.lockutils [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquired lock "refresh_cache-e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:38:38 compute-0 nova_compute[187185]: 2025-11-29 07:38:38.736 187189 DEBUG nova.network.neutron [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:38:39 compute-0 nova_compute[187185]: 2025-11-29 07:38:39.281 187189 DEBUG nova.virt.libvirt.driver [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:38:39 compute-0 nova_compute[187185]: 2025-11-29 07:38:39.282 187189 DEBUG nova.virt.libvirt.driver [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:38:39 compute-0 nova_compute[187185]: 2025-11-29 07:38:39.282 187189 DEBUG nova.virt.libvirt.driver [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No VIF found with MAC fa:16:3e:5b:95:82, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:38:39 compute-0 nova_compute[187185]: 2025-11-29 07:38:39.283 187189 INFO nova.virt.libvirt.driver [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Using config drive
Nov 29 07:38:39 compute-0 nova_compute[187185]: 2025-11-29 07:38:39.412 187189 DEBUG nova.network.neutron [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:38:39 compute-0 sshd-session[241556]: Invalid user ftpadmin from 190.181.27.27 port 49608
Nov 29 07:38:39 compute-0 sshd-session[241556]: Received disconnect from 190.181.27.27 port 49608:11: Bye Bye [preauth]
Nov 29 07:38:39 compute-0 sshd-session[241556]: Disconnected from invalid user ftpadmin 190.181.27.27 port 49608 [preauth]
Nov 29 07:38:40 compute-0 nova_compute[187185]: 2025-11-29 07:38:40.249 187189 DEBUG nova.compute.manager [req-ded0f6a5-d5e0-46bd-922e-79bdde672a7e req-a04cf16d-89fd-4b9d-9bce-e4da3cbadf8e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Received event network-changed-948b9a73-91cd-4803-b642-d1a75f163368 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:38:40 compute-0 nova_compute[187185]: 2025-11-29 07:38:40.250 187189 DEBUG nova.compute.manager [req-ded0f6a5-d5e0-46bd-922e-79bdde672a7e req-a04cf16d-89fd-4b9d-9bce-e4da3cbadf8e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Refreshing instance network info cache due to event network-changed-948b9a73-91cd-4803-b642-d1a75f163368. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:38:40 compute-0 nova_compute[187185]: 2025-11-29 07:38:40.250 187189 DEBUG oslo_concurrency.lockutils [req-ded0f6a5-d5e0-46bd-922e-79bdde672a7e req-a04cf16d-89fd-4b9d-9bce-e4da3cbadf8e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:38:40 compute-0 nova_compute[187185]: 2025-11-29 07:38:40.480 187189 INFO nova.virt.libvirt.driver [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Creating config drive at /var/lib/nova/instances/f9a714d0-1a3d-4de4-8fc2-fca74c904eff/disk.config
Nov 29 07:38:40 compute-0 nova_compute[187185]: 2025-11-29 07:38:40.487 187189 DEBUG oslo_concurrency.processutils [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f9a714d0-1a3d-4de4-8fc2-fca74c904eff/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpchitsa9r execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:38:40 compute-0 nova_compute[187185]: 2025-11-29 07:38:40.615 187189 DEBUG oslo_concurrency.processutils [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f9a714d0-1a3d-4de4-8fc2-fca74c904eff/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpchitsa9r" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:38:40 compute-0 kernel: tapc65a8c36-59: entered promiscuous mode
Nov 29 07:38:40 compute-0 ovn_controller[95281]: 2025-11-29T07:38:40Z|00495|binding|INFO|Claiming lport c65a8c36-5997-4e67-9fa0-e361b7c334ca for this chassis.
Nov 29 07:38:40 compute-0 ovn_controller[95281]: 2025-11-29T07:38:40Z|00496|binding|INFO|c65a8c36-5997-4e67-9fa0-e361b7c334ca: Claiming fa:16:3e:5b:95:82 10.100.0.11
Nov 29 07:38:40 compute-0 NetworkManager[55227]: <info>  [1764401920.6952] manager: (tapc65a8c36-59): new Tun device (/org/freedesktop/NetworkManager/Devices/258)
Nov 29 07:38:40 compute-0 nova_compute[187185]: 2025-11-29 07:38:40.697 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:40.707 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:95:82 10.100.0.11'], port_security=['fa:16:3e:5b:95:82 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8ac0e70c-84ba-415c-841b-4a5a525b1a9d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ec8b80be17a14d1caf666636283749d0', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ebfe9011-71c9-4c89-a651-d4470dffd8d6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a06529de-8ea4-4c02-8447-f35b6f567d2c, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=c65a8c36-5997-4e67-9fa0-e361b7c334ca) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:38:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:40.709 104254 INFO neutron.agent.ovn.metadata.agent [-] Port c65a8c36-5997-4e67-9fa0-e361b7c334ca in datapath 8ac0e70c-84ba-415c-841b-4a5a525b1a9d bound to our chassis
Nov 29 07:38:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:40.711 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8ac0e70c-84ba-415c-841b-4a5a525b1a9d
Nov 29 07:38:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:40.726 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[0f95ad28-40ac-40b2-b8be-f61c80bf8de1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:40.727 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8ac0e70c-81 in ovnmeta-8ac0e70c-84ba-415c-841b-4a5a525b1a9d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:38:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:40.730 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8ac0e70c-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:38:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:40.730 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[05f817ad-cf40-44f8-ba5d-5018ecf31d6b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:40.731 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[94aa7130-2b99-4453-9e65-f76faf749fd6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:40 compute-0 systemd-udevd[241579]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:38:40 compute-0 systemd-machined[153486]: New machine qemu-60-instance-0000009b.
Nov 29 07:38:40 compute-0 NetworkManager[55227]: <info>  [1764401920.7459] device (tapc65a8c36-59): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:38:40 compute-0 NetworkManager[55227]: <info>  [1764401920.7470] device (tapc65a8c36-59): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:38:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:40.748 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[90f8876e-07c2-4a7a-9727-6fc3091f7704]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:40 compute-0 nova_compute[187185]: 2025-11-29 07:38:40.752 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:40 compute-0 ovn_controller[95281]: 2025-11-29T07:38:40Z|00497|binding|INFO|Setting lport c65a8c36-5997-4e67-9fa0-e361b7c334ca ovn-installed in OVS
Nov 29 07:38:40 compute-0 ovn_controller[95281]: 2025-11-29T07:38:40Z|00498|binding|INFO|Setting lport c65a8c36-5997-4e67-9fa0-e361b7c334ca up in Southbound
Nov 29 07:38:40 compute-0 nova_compute[187185]: 2025-11-29 07:38:40.755 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:40 compute-0 systemd[1]: Started Virtual Machine qemu-60-instance-0000009b.
Nov 29 07:38:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:40.762 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[589ed69f-efe1-404d-9d83-80ba014704ff]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:40.794 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[967e35e9-367e-4a0b-a82b-b0d6dabcfd05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:40.801 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[16c66ab9-5bb1-445c-8ae3-341a1a12f388]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:40 compute-0 systemd-udevd[241582]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:38:40 compute-0 NetworkManager[55227]: <info>  [1764401920.8034] manager: (tap8ac0e70c-80): new Veth device (/org/freedesktop/NetworkManager/Devices/259)
Nov 29 07:38:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:40.835 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[cd10766e-49b3-4b18-b29f-f3923b711293]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:40.838 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[aae9320f-d048-462b-bf02-795d6bd3b749]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:40 compute-0 NetworkManager[55227]: <info>  [1764401920.8632] device (tap8ac0e70c-80): carrier: link connected
Nov 29 07:38:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:40.869 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[5a7a3a77-2f83-4529-b792-2c653484ce42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:40.889 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[68b377f1-64f5-4469-a901-9551f706b374]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8ac0e70c-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b6:3f:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 158], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 736852, 'reachable_time': 24370, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241611, 'error': None, 'target': 'ovnmeta-8ac0e70c-84ba-415c-841b-4a5a525b1a9d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:40.912 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[fc5c0aeb-0c68-42f6-b9d5-e5ca56bac601]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb6:3f33'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 736852, 'tstamp': 736852}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241612, 'error': None, 'target': 'ovnmeta-8ac0e70c-84ba-415c-841b-4a5a525b1a9d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:40.932 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[bda9a4f8-c07a-43ef-9a57-192b6234f345]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8ac0e70c-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b6:3f:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 158], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 736852, 'reachable_time': 24370, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 241613, 'error': None, 'target': 'ovnmeta-8ac0e70c-84ba-415c-841b-4a5a525b1a9d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:40 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:40.975 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[145477b6-2254-4c48-9bfd-ed59a620adc6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.000 187189 DEBUG nova.network.neutron [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Updating instance_info_cache with network_info: [{"id": "ef353cc9-1e6a-4c76-9acf-917aecd8f9a6", "address": "fa:16:3e:bc:8f:e7", "network": {"id": "51013e93-c048-46cc-9a9d-a184eb63e1b4", "bridge": "br-int", "label": "tempest-network-smoke--890595323", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef353cc9-1e", "ovs_interfaceid": "ef353cc9-1e6a-4c76-9acf-917aecd8f9a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "948b9a73-91cd-4803-b642-d1a75f163368", "address": "fa:16:3e:ac:e6:05", "network": {"id": "3085017e-01d1-448e-9eca-033b34f9e960", "bridge": "br-int", "label": "tempest-network-smoke--1681462854", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feac:e605", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", 
"mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap948b9a73-91", "ovs_interfaceid": "948b9a73-91cd-4803-b642-d1a75f163368", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.050 187189 DEBUG oslo_concurrency.lockutils [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Releasing lock "refresh_cache-e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.051 187189 DEBUG nova.compute.manager [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Instance network_info: |[{"id": "ef353cc9-1e6a-4c76-9acf-917aecd8f9a6", "address": "fa:16:3e:bc:8f:e7", "network": {"id": "51013e93-c048-46cc-9a9d-a184eb63e1b4", "bridge": "br-int", "label": "tempest-network-smoke--890595323", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef353cc9-1e", "ovs_interfaceid": "ef353cc9-1e6a-4c76-9acf-917aecd8f9a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "948b9a73-91cd-4803-b642-d1a75f163368", "address": "fa:16:3e:ac:e6:05", "network": {"id": "3085017e-01d1-448e-9eca-033b34f9e960", "bridge": "br-int", "label": "tempest-network-smoke--1681462854", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feac:e605", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap948b9a73-91", "ovs_interfaceid": "948b9a73-91cd-4803-b642-d1a75f163368", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.052 187189 DEBUG oslo_concurrency.lockutils [req-ded0f6a5-d5e0-46bd-922e-79bdde672a7e req-a04cf16d-89fd-4b9d-9bce-e4da3cbadf8e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.052 187189 DEBUG nova.network.neutron [req-ded0f6a5-d5e0-46bd-922e-79bdde672a7e req-a04cf16d-89fd-4b9d-9bce-e4da3cbadf8e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Refreshing network info cache for port 948b9a73-91cd-4803-b642-d1a75f163368 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.058 187189 DEBUG nova.virt.libvirt.driver [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Start _get_guest_xml network_info=[{"id": "ef353cc9-1e6a-4c76-9acf-917aecd8f9a6", "address": "fa:16:3e:bc:8f:e7", "network": {"id": "51013e93-c048-46cc-9a9d-a184eb63e1b4", "bridge": "br-int", "label": "tempest-network-smoke--890595323", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef353cc9-1e", "ovs_interfaceid": "ef353cc9-1e6a-4c76-9acf-917aecd8f9a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "948b9a73-91cd-4803-b642-d1a75f163368", "address": "fa:16:3e:ac:e6:05", "network": {"id": "3085017e-01d1-448e-9eca-033b34f9e960", "bridge": "br-int", "label": "tempest-network-smoke--1681462854", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feac:e605", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap948b9a73-91", "ovs_interfaceid": "948b9a73-91cd-4803-b642-d1a75f163368", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.060 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764401921.0595622, f9a714d0-1a3d-4de4-8fc2-fca74c904eff => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.060 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] VM Started (Lifecycle Event)
Nov 29 07:38:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:41.065 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[f52c24d6-dbf5-4111-b7c5-031c013561a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.068 187189 WARNING nova.virt.libvirt.driver [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:38:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:41.068 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8ac0e70c-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:38:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:41.069 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:38:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:41.070 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8ac0e70c-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.072 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:41 compute-0 kernel: tap8ac0e70c-80: entered promiscuous mode
Nov 29 07:38:41 compute-0 NetworkManager[55227]: <info>  [1764401921.0731] manager: (tap8ac0e70c-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/260)
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.074 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:41.078 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8ac0e70c-80, col_values=(('external_ids', {'iface-id': '17e71a59-7bb8-4f35-826d-2efc90d0ca9c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.078 187189 DEBUG nova.virt.libvirt.host [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:38:41 compute-0 ovn_controller[95281]: 2025-11-29T07:38:41Z|00499|binding|INFO|Releasing lport 17e71a59-7bb8-4f35-826d-2efc90d0ca9c from this chassis (sb_readonly=0)
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.079 187189 DEBUG nova.virt.libvirt.host [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.080 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:41.084 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8ac0e70c-84ba-415c-841b-4a5a525b1a9d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8ac0e70c-84ba-415c-841b-4a5a525b1a9d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.084 187189 DEBUG nova.virt.libvirt.host [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.084 187189 DEBUG nova.virt.libvirt.host [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:38:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:41.085 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[249b54ca-3881-4d71-9183-c04501a1c282]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.085 187189 DEBUG nova.virt.libvirt.driver [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.086 187189 DEBUG nova.virt.hardware [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.086 187189 DEBUG nova.virt.hardware [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.086 187189 DEBUG nova.virt.hardware [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.086 187189 DEBUG nova.virt.hardware [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.086 187189 DEBUG nova.virt.hardware [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.087 187189 DEBUG nova.virt.hardware [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.087 187189 DEBUG nova.virt.hardware [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.087 187189 DEBUG nova.virt.hardware [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.087 187189 DEBUG nova.virt.hardware [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.087 187189 DEBUG nova.virt.hardware [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.087 187189 DEBUG nova.virt.hardware [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:38:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:41.088 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:38:41 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:38:41 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:38:41 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-8ac0e70c-84ba-415c-841b-4a5a525b1a9d
Nov 29 07:38:41 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:38:41 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:38:41 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:38:41 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/8ac0e70c-84ba-415c-841b-4a5a525b1a9d.pid.haproxy
Nov 29 07:38:41 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:38:41 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:38:41 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:38:41 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:38:41 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:38:41 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:38:41 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:38:41 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:38:41 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:38:41 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:38:41 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:38:41 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:38:41 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:38:41 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:38:41 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:38:41 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:38:41 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:38:41 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:38:41 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:38:41 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:38:41 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID 8ac0e70c-84ba-415c-841b-4a5a525b1a9d
Nov 29 07:38:41 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:38:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:41.090 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8ac0e70c-84ba-415c-841b-4a5a525b1a9d', 'env', 'PROCESS_TAG=haproxy-8ac0e70c-84ba-415c-841b-4a5a525b1a9d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8ac0e70c-84ba-415c-841b-4a5a525b1a9d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.091 187189 DEBUG nova.virt.libvirt.vif [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:38:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1271234496',display_name='tempest-TestGettingAddress-server-1271234496',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1271234496',id=154,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB3is4E+83iEsFAN3k4vmM3DIB3FGnja3FrFYLTxp4hwLSgaHjf7h1x9RWYq0bVKXNWPDcV4Et8a4J2p23tZrThcwleGMa0jsxDB4wzeuHiJCpifdffKRCxdxwXJAAdszQ==',key_name='tempest-TestGettingAddress-369881010',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-mzjvfdsh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:38:21Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ef353cc9-1e6a-4c76-9acf-917aecd8f9a6", "address": "fa:16:3e:bc:8f:e7", "network": {"id": "51013e93-c048-46cc-9a9d-a184eb63e1b4", "bridge": "br-int", "label": "tempest-network-smoke--890595323", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef353cc9-1e", "ovs_interfaceid": "ef353cc9-1e6a-4c76-9acf-917aecd8f9a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.092 187189 DEBUG nova.network.os_vif_util [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "ef353cc9-1e6a-4c76-9acf-917aecd8f9a6", "address": "fa:16:3e:bc:8f:e7", "network": {"id": "51013e93-c048-46cc-9a9d-a184eb63e1b4", "bridge": "br-int", "label": "tempest-network-smoke--890595323", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef353cc9-1e", "ovs_interfaceid": "ef353cc9-1e6a-4c76-9acf-917aecd8f9a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.092 187189 DEBUG nova.network.os_vif_util [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bc:8f:e7,bridge_name='br-int',has_traffic_filtering=True,id=ef353cc9-1e6a-4c76-9acf-917aecd8f9a6,network=Network(51013e93-c048-46cc-9a9d-a184eb63e1b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef353cc9-1e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.093 187189 DEBUG nova.virt.libvirt.vif [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:38:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1271234496',display_name='tempest-TestGettingAddress-server-1271234496',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1271234496',id=154,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB3is4E+83iEsFAN3k4vmM3DIB3FGnja3FrFYLTxp4hwLSgaHjf7h1x9RWYq0bVKXNWPDcV4Et8a4J2p23tZrThcwleGMa0jsxDB4wzeuHiJCpifdffKRCxdxwXJAAdszQ==',key_name='tempest-TestGettingAddress-369881010',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-mzjvfdsh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:38:21Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "948b9a73-91cd-4803-b642-d1a75f163368", "address": "fa:16:3e:ac:e6:05", "network": {"id": "3085017e-01d1-448e-9eca-033b34f9e960", "bridge": "br-int", "label": "tempest-network-smoke--1681462854", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feac:e605", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap948b9a73-91", "ovs_interfaceid": "948b9a73-91cd-4803-b642-d1a75f163368", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.093 187189 DEBUG nova.network.os_vif_util [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "948b9a73-91cd-4803-b642-d1a75f163368", "address": "fa:16:3e:ac:e6:05", "network": {"id": "3085017e-01d1-448e-9eca-033b34f9e960", "bridge": "br-int", "label": "tempest-network-smoke--1681462854", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feac:e605", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap948b9a73-91", "ovs_interfaceid": "948b9a73-91cd-4803-b642-d1a75f163368", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.094 187189 DEBUG nova.network.os_vif_util [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:e6:05,bridge_name='br-int',has_traffic_filtering=True,id=948b9a73-91cd-4803-b642-d1a75f163368,network=Network(3085017e-01d1-448e-9eca-033b34f9e960),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap948b9a73-91') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.095 187189 DEBUG nova.objects.instance [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'pci_devices' on Instance uuid e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.096 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.163 187189 DEBUG nova.compute.manager [req-cd0a8097-dae1-4454-92f4-22ad98c78c55 req-15e3e26d-2822-45d8-8ac0-8cf063d172a0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Received event network-vif-plugged-c65a8c36-5997-4e67-9fa0-e361b7c334ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.163 187189 DEBUG oslo_concurrency.lockutils [req-cd0a8097-dae1-4454-92f4-22ad98c78c55 req-15e3e26d-2822-45d8-8ac0-8cf063d172a0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "f9a714d0-1a3d-4de4-8fc2-fca74c904eff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.163 187189 DEBUG oslo_concurrency.lockutils [req-cd0a8097-dae1-4454-92f4-22ad98c78c55 req-15e3e26d-2822-45d8-8ac0-8cf063d172a0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f9a714d0-1a3d-4de4-8fc2-fca74c904eff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.163 187189 DEBUG oslo_concurrency.lockutils [req-cd0a8097-dae1-4454-92f4-22ad98c78c55 req-15e3e26d-2822-45d8-8ac0-8cf063d172a0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f9a714d0-1a3d-4de4-8fc2-fca74c904eff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.164 187189 DEBUG nova.compute.manager [req-cd0a8097-dae1-4454-92f4-22ad98c78c55 req-15e3e26d-2822-45d8-8ac0-8cf063d172a0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Processing event network-vif-plugged-c65a8c36-5997-4e67-9fa0-e361b7c334ca _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.164 187189 DEBUG nova.compute.manager [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.167 187189 DEBUG nova.virt.libvirt.driver [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.177 187189 INFO nova.virt.libvirt.driver [-] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Instance spawned successfully.
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.177 187189 DEBUG nova.virt.libvirt.driver [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.184 187189 DEBUG nova.virt.libvirt.driver [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:38:41 compute-0 nova_compute[187185]:   <uuid>e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d</uuid>
Nov 29 07:38:41 compute-0 nova_compute[187185]:   <name>instance-0000009a</name>
Nov 29 07:38:41 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 07:38:41 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:38:41 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:38:41 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:       <nova:name>tempest-TestGettingAddress-server-1271234496</nova:name>
Nov 29 07:38:41 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:38:41</nova:creationTime>
Nov 29 07:38:41 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 07:38:41 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 07:38:41 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:38:41 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:38:41 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:38:41 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:38:41 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:38:41 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:38:41 compute-0 nova_compute[187185]:         <nova:user uuid="31ac7b05b012433b89143dc9f259644a">tempest-TestGettingAddress-1465017630-project-member</nova:user>
Nov 29 07:38:41 compute-0 nova_compute[187185]:         <nova:project uuid="0111c22b4b954ea586ca20d91ed3970f">tempest-TestGettingAddress-1465017630</nova:project>
Nov 29 07:38:41 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:38:41 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 07:38:41 compute-0 nova_compute[187185]:         <nova:port uuid="ef353cc9-1e6a-4c76-9acf-917aecd8f9a6">
Nov 29 07:38:41 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:38:41 compute-0 nova_compute[187185]:         <nova:port uuid="948b9a73-91cd-4803-b642-d1a75f163368">
Nov 29 07:38:41 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:feac:e605" ipVersion="6"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:38:41 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:38:41 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:38:41 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <system>
Nov 29 07:38:41 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:38:41 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:38:41 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:38:41 compute-0 nova_compute[187185]:       <entry name="serial">e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d</entry>
Nov 29 07:38:41 compute-0 nova_compute[187185]:       <entry name="uuid">e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d</entry>
Nov 29 07:38:41 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     </system>
Nov 29 07:38:41 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:38:41 compute-0 nova_compute[187185]:   <os>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:   </os>
Nov 29 07:38:41 compute-0 nova_compute[187185]:   <features>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:   </features>
Nov 29 07:38:41 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:38:41 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:38:41 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:38:41 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/disk"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:38:41 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/disk.config"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:38:41 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:bc:8f:e7"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:       <target dev="tapef353cc9-1e"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:38:41 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:ac:e6:05"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:       <target dev="tap948b9a73-91"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:38:41 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/console.log" append="off"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <video>
Nov 29 07:38:41 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     </video>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:38:41 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:38:41 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:38:41 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:38:41 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:38:41 compute-0 nova_compute[187185]: </domain>
Nov 29 07:38:41 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.184 187189 DEBUG nova.compute.manager [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Preparing to wait for external event network-vif-plugged-ef353cc9-1e6a-4c76-9acf-917aecd8f9a6 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.184 187189 DEBUG oslo_concurrency.lockutils [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.184 187189 DEBUG oslo_concurrency.lockutils [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.184 187189 DEBUG oslo_concurrency.lockutils [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.185 187189 DEBUG nova.compute.manager [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Preparing to wait for external event network-vif-plugged-948b9a73-91cd-4803-b642-d1a75f163368 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.185 187189 DEBUG oslo_concurrency.lockutils [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.185 187189 DEBUG oslo_concurrency.lockutils [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.185 187189 DEBUG oslo_concurrency.lockutils [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.186 187189 DEBUG nova.virt.libvirt.vif [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:38:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1271234496',display_name='tempest-TestGettingAddress-server-1271234496',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1271234496',id=154,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB3is4E+83iEsFAN3k4vmM3DIB3FGnja3FrFYLTxp4hwLSgaHjf7h1x9RWYq0bVKXNWPDcV4Et8a4J2p23tZrThcwleGMa0jsxDB4wzeuHiJCpifdffKRCxdxwXJAAdszQ==',key_name='tempest-TestGettingAddress-369881010',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-mzjvfdsh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:38:21Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ef353cc9-1e6a-4c76-9acf-917aecd8f9a6", "address": "fa:16:3e:bc:8f:e7", "network": {"id": "51013e93-c048-46cc-9a9d-a184eb63e1b4", "bridge": "br-int", "label": "tempest-network-smoke--890595323", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef353cc9-1e", "ovs_interfaceid": "ef353cc9-1e6a-4c76-9acf-917aecd8f9a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.186 187189 DEBUG nova.network.os_vif_util [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "ef353cc9-1e6a-4c76-9acf-917aecd8f9a6", "address": "fa:16:3e:bc:8f:e7", "network": {"id": "51013e93-c048-46cc-9a9d-a184eb63e1b4", "bridge": "br-int", "label": "tempest-network-smoke--890595323", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef353cc9-1e", "ovs_interfaceid": "ef353cc9-1e6a-4c76-9acf-917aecd8f9a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.186 187189 DEBUG nova.network.os_vif_util [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bc:8f:e7,bridge_name='br-int',has_traffic_filtering=True,id=ef353cc9-1e6a-4c76-9acf-917aecd8f9a6,network=Network(51013e93-c048-46cc-9a9d-a184eb63e1b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef353cc9-1e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.186 187189 DEBUG os_vif [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bc:8f:e7,bridge_name='br-int',has_traffic_filtering=True,id=ef353cc9-1e6a-4c76-9acf-917aecd8f9a6,network=Network(51013e93-c048-46cc-9a9d-a184eb63e1b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef353cc9-1e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.187 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.187 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.188 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.188 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.191 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.191 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapef353cc9-1e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.191 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapef353cc9-1e, col_values=(('external_ids', {'iface-id': 'ef353cc9-1e6a-4c76-9acf-917aecd8f9a6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bc:8f:e7', 'vm-uuid': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.193 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.194 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:41 compute-0 NetworkManager[55227]: <info>  [1764401921.1949] manager: (tapef353cc9-1e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/261)
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.195 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.200 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.201 187189 INFO os_vif [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bc:8f:e7,bridge_name='br-int',has_traffic_filtering=True,id=ef353cc9-1e6a-4c76-9acf-917aecd8f9a6,network=Network(51013e93-c048-46cc-9a9d-a184eb63e1b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef353cc9-1e')
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.201 187189 DEBUG nova.virt.libvirt.vif [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:38:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1271234496',display_name='tempest-TestGettingAddress-server-1271234496',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1271234496',id=154,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB3is4E+83iEsFAN3k4vmM3DIB3FGnja3FrFYLTxp4hwLSgaHjf7h1x9RWYq0bVKXNWPDcV4Et8a4J2p23tZrThcwleGMa0jsxDB4wzeuHiJCpifdffKRCxdxwXJAAdszQ==',key_name='tempest-TestGettingAddress-369881010',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-mzjvfdsh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:38:21Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "948b9a73-91cd-4803-b642-d1a75f163368", "address": "fa:16:3e:ac:e6:05", "network": {"id": "3085017e-01d1-448e-9eca-033b34f9e960", "bridge": "br-int", "label": "tempest-network-smoke--1681462854", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feac:e605", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap948b9a73-91", "ovs_interfaceid": "948b9a73-91cd-4803-b642-d1a75f163368", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.202 187189 DEBUG nova.network.os_vif_util [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "948b9a73-91cd-4803-b642-d1a75f163368", "address": "fa:16:3e:ac:e6:05", "network": {"id": "3085017e-01d1-448e-9eca-033b34f9e960", "bridge": "br-int", "label": "tempest-network-smoke--1681462854", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feac:e605", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap948b9a73-91", "ovs_interfaceid": "948b9a73-91cd-4803-b642-d1a75f163368", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.202 187189 DEBUG nova.network.os_vif_util [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:e6:05,bridge_name='br-int',has_traffic_filtering=True,id=948b9a73-91cd-4803-b642-d1a75f163368,network=Network(3085017e-01d1-448e-9eca-033b34f9e960),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap948b9a73-91') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.203 187189 DEBUG os_vif [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:e6:05,bridge_name='br-int',has_traffic_filtering=True,id=948b9a73-91cd-4803-b642-d1a75f163368,network=Network(3085017e-01d1-448e-9eca-033b34f9e960),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap948b9a73-91') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.203 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.203 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.203 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.205 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.205 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap948b9a73-91, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.206 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap948b9a73-91, col_values=(('external_ids', {'iface-id': '948b9a73-91cd-4803-b642-d1a75f163368', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ac:e6:05', 'vm-uuid': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.207 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:41 compute-0 NetworkManager[55227]: <info>  [1764401921.2084] manager: (tap948b9a73-91): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/262)
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.209 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.214 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.214 187189 INFO os_vif [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:e6:05,bridge_name='br-int',has_traffic_filtering=True,id=948b9a73-91cd-4803-b642-d1a75f163368,network=Network(3085017e-01d1-448e-9eca-033b34f9e960),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap948b9a73-91')
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.226 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.227 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764401921.0629814, f9a714d0-1a3d-4de4-8fc2-fca74c904eff => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.227 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] VM Paused (Lifecycle Event)
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.230 187189 DEBUG nova.virt.libvirt.driver [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.231 187189 DEBUG nova.virt.libvirt.driver [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.231 187189 DEBUG nova.virt.libvirt.driver [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.231 187189 DEBUG nova.virt.libvirt.driver [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.232 187189 DEBUG nova.virt.libvirt.driver [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.232 187189 DEBUG nova.virt.libvirt.driver [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.316 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.322 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764401921.1673195, f9a714d0-1a3d-4de4-8fc2-fca74c904eff => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.322 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] VM Resumed (Lifecycle Event)
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.364 187189 DEBUG nova.virt.libvirt.driver [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.364 187189 DEBUG nova.virt.libvirt.driver [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.364 187189 DEBUG nova.virt.libvirt.driver [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No VIF found with MAC fa:16:3e:bc:8f:e7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.365 187189 DEBUG nova.virt.libvirt.driver [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No VIF found with MAC fa:16:3e:ac:e6:05, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.365 187189 INFO nova.virt.libvirt.driver [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Using config drive
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.370 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.374 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.449 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.483 187189 INFO nova.compute.manager [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Took 13.32 seconds to spawn the instance on the hypervisor.
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.483 187189 DEBUG nova.compute.manager [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:38:41 compute-0 podman[241656]: 2025-11-29 07:38:41.518524621 +0000 UTC m=+0.092857225 container create 79e8a40b1d8a63164f793d3c266a1b12a0665348af437d223acced0c91ad08e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8ac0e70c-84ba-415c-841b-4a5a525b1a9d, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 29 07:38:41 compute-0 podman[241656]: 2025-11-29 07:38:41.452402275 +0000 UTC m=+0.026734919 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:38:41 compute-0 systemd[1]: Started libpod-conmon-79e8a40b1d8a63164f793d3c266a1b12a0665348af437d223acced0c91ad08e0.scope.
Nov 29 07:38:41 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:38:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4215897df8d170872bdb6a6c4424230df1f7d017c06f1dfad42a35d7f26d2d9f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:38:41 compute-0 podman[241656]: 2025-11-29 07:38:41.672047285 +0000 UTC m=+0.246379889 container init 79e8a40b1d8a63164f793d3c266a1b12a0665348af437d223acced0c91ad08e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8ac0e70c-84ba-415c-841b-4a5a525b1a9d, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 07:38:41 compute-0 podman[241656]: 2025-11-29 07:38:41.681879784 +0000 UTC m=+0.256212388 container start 79e8a40b1d8a63164f793d3c266a1b12a0665348af437d223acced0c91ad08e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8ac0e70c-84ba-415c-841b-4a5a525b1a9d, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 29 07:38:41 compute-0 neutron-haproxy-ovnmeta-8ac0e70c-84ba-415c-841b-4a5a525b1a9d[241672]: [NOTICE]   (241677) : New worker (241679) forked
Nov 29 07:38:41 compute-0 neutron-haproxy-ovnmeta-8ac0e70c-84ba-415c-841b-4a5a525b1a9d[241672]: [NOTICE]   (241677) : Loading success.
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.768 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:41.769 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=57, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=56) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.770 187189 INFO nova.compute.manager [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Took 23.76 seconds to build instance.
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.819 187189 DEBUG oslo_concurrency.lockutils [None req-e0528dc3-1835-4d03-9f55-427250e17d1e 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "f9a714d0-1a3d-4de4-8fc2-fca74c904eff" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 24.888s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.835 187189 INFO nova.virt.libvirt.driver [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Creating config drive at /var/lib/nova/instances/e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/disk.config
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.841 187189 DEBUG oslo_concurrency.processutils [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfc5i8dh4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:38:41 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:41.853 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:38:41 compute-0 nova_compute[187185]: 2025-11-29 07:38:41.971 187189 DEBUG oslo_concurrency.processutils [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfc5i8dh4" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:38:42 compute-0 kernel: tapef353cc9-1e: entered promiscuous mode
Nov 29 07:38:42 compute-0 systemd-udevd[241607]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:38:42 compute-0 NetworkManager[55227]: <info>  [1764401922.0491] manager: (tapef353cc9-1e): new Tun device (/org/freedesktop/NetworkManager/Devices/263)
Nov 29 07:38:42 compute-0 nova_compute[187185]: 2025-11-29 07:38:42.088 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:42 compute-0 ovn_controller[95281]: 2025-11-29T07:38:42Z|00500|binding|INFO|Claiming lport ef353cc9-1e6a-4c76-9acf-917aecd8f9a6 for this chassis.
Nov 29 07:38:42 compute-0 ovn_controller[95281]: 2025-11-29T07:38:42Z|00501|binding|INFO|ef353cc9-1e6a-4c76-9acf-917aecd8f9a6: Claiming fa:16:3e:bc:8f:e7 10.100.0.14
Nov 29 07:38:42 compute-0 nova_compute[187185]: 2025-11-29 07:38:42.091 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:42 compute-0 NetworkManager[55227]: <info>  [1764401922.0969] device (tapef353cc9-1e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:38:42 compute-0 NetworkManager[55227]: <info>  [1764401922.0991] device (tapef353cc9-1e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:38:42 compute-0 nova_compute[187185]: 2025-11-29 07:38:42.104 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:42 compute-0 NetworkManager[55227]: <info>  [1764401922.1145] manager: (tap948b9a73-91): new Tun device (/org/freedesktop/NetworkManager/Devices/264)
Nov 29 07:38:42 compute-0 kernel: tap948b9a73-91: entered promiscuous mode
Nov 29 07:38:42 compute-0 nova_compute[187185]: 2025-11-29 07:38:42.118 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:42 compute-0 NetworkManager[55227]: <info>  [1764401922.1253] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/265)
Nov 29 07:38:42 compute-0 ovn_controller[95281]: 2025-11-29T07:38:42Z|00502|if_status|INFO|Not updating pb chassis for 948b9a73-91cd-4803-b642-d1a75f163368 now as sb is readonly
Nov 29 07:38:42 compute-0 NetworkManager[55227]: <info>  [1764401922.1281] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/266)
Nov 29 07:38:42 compute-0 NetworkManager[55227]: <info>  [1764401922.1313] device (tap948b9a73-91): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:38:42 compute-0 NetworkManager[55227]: <info>  [1764401922.1327] device (tap948b9a73-91): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:38:42 compute-0 nova_compute[187185]: 2025-11-29 07:38:42.134 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:42 compute-0 systemd-machined[153486]: New machine qemu-61-instance-0000009a.
Nov 29 07:38:42 compute-0 systemd[1]: Started Virtual Machine qemu-61-instance-0000009a.
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:42.222 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bc:8f:e7 10.100.0.14'], port_security=['fa:16:3e:bc:8f:e7 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-51013e93-c048-46cc-9a9d-a184eb63e1b4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b28f67ac-b290-4be5-88df-4393a9d30b89', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a7c7b7bb-9c39-4f1c-a218-71c5fbf31db4, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=ef353cc9-1e6a-4c76-9acf-917aecd8f9a6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:42.224 104254 INFO neutron.agent.ovn.metadata.agent [-] Port ef353cc9-1e6a-4c76-9acf-917aecd8f9a6 in datapath 51013e93-c048-46cc-9a9d-a184eb63e1b4 bound to our chassis
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:42.226 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 51013e93-c048-46cc-9a9d-a184eb63e1b4
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:42.242 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[30947c82-d28b-4f70-a6e7-09bd95549a1c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:42.244 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap51013e93-c1 in ovnmeta-51013e93-c048-46cc-9a9d-a184eb63e1b4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:42.248 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap51013e93-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:42.248 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[56b126b5-e66e-4ddc-b826-083113373055]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:42.249 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[c1b416c5-7f0d-46e6-ab41-f67b5d10c591]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:42.262 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[5819f56b-d077-4a14-a503-95fc2dac235b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:42.293 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[af28369b-475e-4151-b83e-9e29e131535b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:42.338 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[5c8301c9-0760-48c8-bc14-51b232cda001]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:42 compute-0 ovn_controller[95281]: 2025-11-29T07:38:42Z|00503|memory|INFO|peak resident set size grew 52% in last 4031.5 seconds, from 16384 kB to 24952 kB
Nov 29 07:38:42 compute-0 ovn_controller[95281]: 2025-11-29T07:38:42Z|00504|memory|INFO|idl-cells-OVN_Southbound:10593 idl-cells-Open_vSwitch:927 if_status_mgr_ifaces_state_usage-KB:1 if_status_mgr_ifaces_usage-KB:1 lflow-cache-entries-cache-expr:357 lflow-cache-entries-cache-matches:289 lflow-cache-size-KB:1522 local_datapath_usage-KB:3 ofctrl_desired_flow_usage-KB:668 ofctrl_installed_flow_usage-KB:473 ofctrl_rconn_packet_counter-KB:349 ofctrl_sb_flow_ref_usage-KB:247 oflow_update_usage-KB:1
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:42.350 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[501c3090-946e-46b1-a1a9-c2217fe7b57b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:42 compute-0 NetworkManager[55227]: <info>  [1764401922.3536] manager: (tap51013e93-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/267)
Nov 29 07:38:42 compute-0 nova_compute[187185]: 2025-11-29 07:38:42.354 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:42 compute-0 nova_compute[187185]: 2025-11-29 07:38:42.359 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:42 compute-0 ovn_controller[95281]: 2025-11-29T07:38:42Z|00505|binding|INFO|Releasing lport 17e71a59-7bb8-4f35-826d-2efc90d0ca9c from this chassis (sb_readonly=0)
Nov 29 07:38:42 compute-0 ovn_controller[95281]: 2025-11-29T07:38:42Z|00506|binding|INFO|Claiming lport 948b9a73-91cd-4803-b642-d1a75f163368 for this chassis.
Nov 29 07:38:42 compute-0 ovn_controller[95281]: 2025-11-29T07:38:42Z|00507|binding|INFO|948b9a73-91cd-4803-b642-d1a75f163368: Claiming fa:16:3e:ac:e6:05 2001:db8::f816:3eff:feac:e605
Nov 29 07:38:42 compute-0 ovn_controller[95281]: 2025-11-29T07:38:42Z|00508|binding|INFO|Setting lport ef353cc9-1e6a-4c76-9acf-917aecd8f9a6 ovn-installed in OVS
Nov 29 07:38:42 compute-0 nova_compute[187185]: 2025-11-29 07:38:42.407 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:42.408 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:e6:05 2001:db8::f816:3eff:feac:e605'], port_security=['fa:16:3e:ac:e6:05 2001:db8::f816:3eff:feac:e605'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:feac:e605/64', 'neutron:device_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3085017e-01d1-448e-9eca-033b34f9e960', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b28f67ac-b290-4be5-88df-4393a9d30b89', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3544b146-f2db-4cb1-8570-c2c0dfd4e173, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=948b9a73-91cd-4803-b642-d1a75f163368) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:38:42 compute-0 ovn_controller[95281]: 2025-11-29T07:38:42Z|00509|binding|INFO|Setting lport ef353cc9-1e6a-4c76-9acf-917aecd8f9a6 up in Southbound
Nov 29 07:38:42 compute-0 ovn_controller[95281]: 2025-11-29T07:38:42Z|00510|binding|INFO|Setting lport 948b9a73-91cd-4803-b642-d1a75f163368 ovn-installed in OVS
Nov 29 07:38:42 compute-0 ovn_controller[95281]: 2025-11-29T07:38:42Z|00511|binding|INFO|Setting lport 948b9a73-91cd-4803-b642-d1a75f163368 up in Southbound
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:42.420 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[060b6845-23cb-4913-a34c-468819552647]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:42 compute-0 nova_compute[187185]: 2025-11-29 07:38:42.422 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:42.424 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[75dec84b-0d55-42a4-9a9c-4d9bb000da7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:42 compute-0 NetworkManager[55227]: <info>  [1764401922.4538] device (tap51013e93-c0): carrier: link connected
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:42.462 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[1e3a75eb-0070-4df4-9060-619455870b9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:42.486 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e8e97e33-8a85-4e8c-8fed-30645e0dbeb7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap51013e93-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:01:0a:76'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 161], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 737011, 'reachable_time': 39637, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241730, 'error': None, 'target': 'ovnmeta-51013e93-c048-46cc-9a9d-a184eb63e1b4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:42.509 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[862ba0e0-9539-4b67-9cf5-74abcc3d6ff8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe01:a76'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 737011, 'tstamp': 737011}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241733, 'error': None, 'target': 'ovnmeta-51013e93-c048-46cc-9a9d-a184eb63e1b4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:42.532 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[24440c8a-d076-497d-87b6-4dc4bcae087d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap51013e93-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:01:0a:76'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 161], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 737011, 'reachable_time': 39637, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 241739, 'error': None, 'target': 'ovnmeta-51013e93-c048-46cc-9a9d-a184eb63e1b4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:42.572 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[aa0e6587-a6ff-410b-bc48-895744810c40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:42 compute-0 nova_compute[187185]: 2025-11-29 07:38:42.589 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764401922.5883322, e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:38:42 compute-0 nova_compute[187185]: 2025-11-29 07:38:42.589 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] VM Started (Lifecycle Event)
Nov 29 07:38:42 compute-0 nova_compute[187185]: 2025-11-29 07:38:42.635 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:38:42 compute-0 nova_compute[187185]: 2025-11-29 07:38:42.640 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764401922.589095, e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:38:42 compute-0 nova_compute[187185]: 2025-11-29 07:38:42.640 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] VM Paused (Lifecycle Event)
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:42.656 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[bb06f74a-49d7-4906-a698-313c2b7055cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:42.658 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap51013e93-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:42.658 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:42.659 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap51013e93-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:38:42 compute-0 NetworkManager[55227]: <info>  [1764401922.6628] manager: (tap51013e93-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/268)
Nov 29 07:38:42 compute-0 kernel: tap51013e93-c0: entered promiscuous mode
Nov 29 07:38:42 compute-0 nova_compute[187185]: 2025-11-29 07:38:42.661 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:42 compute-0 nova_compute[187185]: 2025-11-29 07:38:42.665 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:42.666 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap51013e93-c0, col_values=(('external_ids', {'iface-id': '2ba344d6-366c-48e1-aea9-1f498f9fe4ff'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:38:42 compute-0 nova_compute[187185]: 2025-11-29 07:38:42.668 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:42 compute-0 ovn_controller[95281]: 2025-11-29T07:38:42Z|00512|binding|INFO|Releasing lport 2ba344d6-366c-48e1-aea9-1f498f9fe4ff from this chassis (sb_readonly=0)
Nov 29 07:38:42 compute-0 nova_compute[187185]: 2025-11-29 07:38:42.691 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:42 compute-0 nova_compute[187185]: 2025-11-29 07:38:42.695 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:42.696 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/51013e93-c048-46cc-9a9d-a184eb63e1b4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/51013e93-c048-46cc-9a9d-a184eb63e1b4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:38:42 compute-0 nova_compute[187185]: 2025-11-29 07:38:42.697 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:42.698 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[12db4eb4-d407-43dc-94a7-288305f3b6e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:42.701 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-51013e93-c048-46cc-9a9d-a184eb63e1b4
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/51013e93-c048-46cc-9a9d-a184eb63e1b4.pid.haproxy
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID 51013e93-c048-46cc-9a9d-a184eb63e1b4
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:38:42 compute-0 nova_compute[187185]: 2025-11-29 07:38:42.701 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:42.702 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-51013e93-c048-46cc-9a9d-a184eb63e1b4', 'env', 'PROCESS_TAG=haproxy-51013e93-c048-46cc-9a9d-a184eb63e1b4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/51013e93-c048-46cc-9a9d-a184eb63e1b4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:38:42 compute-0 nova_compute[187185]: 2025-11-29 07:38:42.723 187189 DEBUG nova.compute.manager [req-928e8589-199a-4b79-a15c-3c156cb86004 req-1ad49d91-8349-4261-afb1-06b85b450ae6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Received event network-vif-plugged-948b9a73-91cd-4803-b642-d1a75f163368 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:38:42 compute-0 nova_compute[187185]: 2025-11-29 07:38:42.723 187189 DEBUG oslo_concurrency.lockutils [req-928e8589-199a-4b79-a15c-3c156cb86004 req-1ad49d91-8349-4261-afb1-06b85b450ae6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:38:42 compute-0 nova_compute[187185]: 2025-11-29 07:38:42.724 187189 DEBUG oslo_concurrency.lockutils [req-928e8589-199a-4b79-a15c-3c156cb86004 req-1ad49d91-8349-4261-afb1-06b85b450ae6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:38:42 compute-0 nova_compute[187185]: 2025-11-29 07:38:42.724 187189 DEBUG oslo_concurrency.lockutils [req-928e8589-199a-4b79-a15c-3c156cb86004 req-1ad49d91-8349-4261-afb1-06b85b450ae6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:38:42 compute-0 nova_compute[187185]: 2025-11-29 07:38:42.724 187189 DEBUG nova.compute.manager [req-928e8589-199a-4b79-a15c-3c156cb86004 req-1ad49d91-8349-4261-afb1-06b85b450ae6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Processing event network-vif-plugged-948b9a73-91cd-4803-b642-d1a75f163368 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 07:38:42 compute-0 nova_compute[187185]: 2025-11-29 07:38:42.728 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:38:42 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:42.855 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '57'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:38:42 compute-0 nova_compute[187185]: 2025-11-29 07:38:42.920 187189 DEBUG nova.network.neutron [req-ded0f6a5-d5e0-46bd-922e-79bdde672a7e req-a04cf16d-89fd-4b9d-9bce-e4da3cbadf8e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Updated VIF entry in instance network info cache for port 948b9a73-91cd-4803-b642-d1a75f163368. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:38:42 compute-0 nova_compute[187185]: 2025-11-29 07:38:42.921 187189 DEBUG nova.network.neutron [req-ded0f6a5-d5e0-46bd-922e-79bdde672a7e req-a04cf16d-89fd-4b9d-9bce-e4da3cbadf8e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Updating instance_info_cache with network_info: [{"id": "ef353cc9-1e6a-4c76-9acf-917aecd8f9a6", "address": "fa:16:3e:bc:8f:e7", "network": {"id": "51013e93-c048-46cc-9a9d-a184eb63e1b4", "bridge": "br-int", "label": "tempest-network-smoke--890595323", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef353cc9-1e", "ovs_interfaceid": "ef353cc9-1e6a-4c76-9acf-917aecd8f9a6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "948b9a73-91cd-4803-b642-d1a75f163368", "address": "fa:16:3e:ac:e6:05", "network": {"id": "3085017e-01d1-448e-9eca-033b34f9e960", "bridge": "br-int", "label": "tempest-network-smoke--1681462854", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feac:e605", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap948b9a73-91", "ovs_interfaceid": "948b9a73-91cd-4803-b642-d1a75f163368", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:38:42 compute-0 nova_compute[187185]: 2025-11-29 07:38:42.943 187189 DEBUG oslo_concurrency.lockutils [req-ded0f6a5-d5e0-46bd-922e-79bdde672a7e req-a04cf16d-89fd-4b9d-9bce-e4da3cbadf8e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:38:43 compute-0 podman[241770]: 2025-11-29 07:38:43.097673779 +0000 UTC m=+0.029757445 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:38:43 compute-0 nova_compute[187185]: 2025-11-29 07:38:43.309 187189 DEBUG nova.compute.manager [req-eb8c6e21-445c-4879-9503-5c0d59a5b78b req-5242242e-a963-459a-b728-e67416807126 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Received event network-vif-plugged-c65a8c36-5997-4e67-9fa0-e361b7c334ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:38:43 compute-0 nova_compute[187185]: 2025-11-29 07:38:43.309 187189 DEBUG oslo_concurrency.lockutils [req-eb8c6e21-445c-4879-9503-5c0d59a5b78b req-5242242e-a963-459a-b728-e67416807126 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "f9a714d0-1a3d-4de4-8fc2-fca74c904eff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:38:43 compute-0 nova_compute[187185]: 2025-11-29 07:38:43.310 187189 DEBUG oslo_concurrency.lockutils [req-eb8c6e21-445c-4879-9503-5c0d59a5b78b req-5242242e-a963-459a-b728-e67416807126 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f9a714d0-1a3d-4de4-8fc2-fca74c904eff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:38:43 compute-0 nova_compute[187185]: 2025-11-29 07:38:43.310 187189 DEBUG oslo_concurrency.lockutils [req-eb8c6e21-445c-4879-9503-5c0d59a5b78b req-5242242e-a963-459a-b728-e67416807126 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f9a714d0-1a3d-4de4-8fc2-fca74c904eff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:38:43 compute-0 nova_compute[187185]: 2025-11-29 07:38:43.311 187189 DEBUG nova.compute.manager [req-eb8c6e21-445c-4879-9503-5c0d59a5b78b req-5242242e-a963-459a-b728-e67416807126 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] No waiting events found dispatching network-vif-plugged-c65a8c36-5997-4e67-9fa0-e361b7c334ca pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:38:43 compute-0 nova_compute[187185]: 2025-11-29 07:38:43.311 187189 WARNING nova.compute.manager [req-eb8c6e21-445c-4879-9503-5c0d59a5b78b req-5242242e-a963-459a-b728-e67416807126 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Received unexpected event network-vif-plugged-c65a8c36-5997-4e67-9fa0-e361b7c334ca for instance with vm_state active and task_state None.
Nov 29 07:38:43 compute-0 nova_compute[187185]: 2025-11-29 07:38:43.312 187189 DEBUG nova.compute.manager [req-eb8c6e21-445c-4879-9503-5c0d59a5b78b req-5242242e-a963-459a-b728-e67416807126 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Received event network-vif-plugged-ef353cc9-1e6a-4c76-9acf-917aecd8f9a6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:38:43 compute-0 nova_compute[187185]: 2025-11-29 07:38:43.312 187189 DEBUG oslo_concurrency.lockutils [req-eb8c6e21-445c-4879-9503-5c0d59a5b78b req-5242242e-a963-459a-b728-e67416807126 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:38:43 compute-0 nova_compute[187185]: 2025-11-29 07:38:43.313 187189 DEBUG oslo_concurrency.lockutils [req-eb8c6e21-445c-4879-9503-5c0d59a5b78b req-5242242e-a963-459a-b728-e67416807126 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:38:43 compute-0 nova_compute[187185]: 2025-11-29 07:38:43.313 187189 DEBUG oslo_concurrency.lockutils [req-eb8c6e21-445c-4879-9503-5c0d59a5b78b req-5242242e-a963-459a-b728-e67416807126 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:38:43 compute-0 nova_compute[187185]: 2025-11-29 07:38:43.314 187189 DEBUG nova.compute.manager [req-eb8c6e21-445c-4879-9503-5c0d59a5b78b req-5242242e-a963-459a-b728-e67416807126 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Processing event network-vif-plugged-ef353cc9-1e6a-4c76-9acf-917aecd8f9a6 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 07:38:43 compute-0 nova_compute[187185]: 2025-11-29 07:38:43.314 187189 DEBUG nova.compute.manager [req-eb8c6e21-445c-4879-9503-5c0d59a5b78b req-5242242e-a963-459a-b728-e67416807126 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Received event network-vif-plugged-ef353cc9-1e6a-4c76-9acf-917aecd8f9a6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:38:43 compute-0 nova_compute[187185]: 2025-11-29 07:38:43.315 187189 DEBUG oslo_concurrency.lockutils [req-eb8c6e21-445c-4879-9503-5c0d59a5b78b req-5242242e-a963-459a-b728-e67416807126 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:38:43 compute-0 nova_compute[187185]: 2025-11-29 07:38:43.315 187189 DEBUG oslo_concurrency.lockutils [req-eb8c6e21-445c-4879-9503-5c0d59a5b78b req-5242242e-a963-459a-b728-e67416807126 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:38:43 compute-0 nova_compute[187185]: 2025-11-29 07:38:43.316 187189 DEBUG oslo_concurrency.lockutils [req-eb8c6e21-445c-4879-9503-5c0d59a5b78b req-5242242e-a963-459a-b728-e67416807126 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:38:43 compute-0 nova_compute[187185]: 2025-11-29 07:38:43.316 187189 DEBUG nova.compute.manager [req-eb8c6e21-445c-4879-9503-5c0d59a5b78b req-5242242e-a963-459a-b728-e67416807126 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] No waiting events found dispatching network-vif-plugged-ef353cc9-1e6a-4c76-9acf-917aecd8f9a6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:38:43 compute-0 nova_compute[187185]: 2025-11-29 07:38:43.316 187189 WARNING nova.compute.manager [req-eb8c6e21-445c-4879-9503-5c0d59a5b78b req-5242242e-a963-459a-b728-e67416807126 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Received unexpected event network-vif-plugged-ef353cc9-1e6a-4c76-9acf-917aecd8f9a6 for instance with vm_state building and task_state spawning.
Nov 29 07:38:43 compute-0 nova_compute[187185]: 2025-11-29 07:38:43.317 187189 DEBUG nova.compute.manager [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Instance event wait completed in 0 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:38:43 compute-0 nova_compute[187185]: 2025-11-29 07:38:43.333 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764401923.3225267, e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:38:43 compute-0 nova_compute[187185]: 2025-11-29 07:38:43.336 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] VM Resumed (Lifecycle Event)
Nov 29 07:38:43 compute-0 nova_compute[187185]: 2025-11-29 07:38:43.342 187189 DEBUG nova.virt.libvirt.driver [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 07:38:43 compute-0 nova_compute[187185]: 2025-11-29 07:38:43.348 187189 INFO nova.virt.libvirt.driver [-] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Instance spawned successfully.
Nov 29 07:38:43 compute-0 nova_compute[187185]: 2025-11-29 07:38:43.349 187189 DEBUG nova.virt.libvirt.driver [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 07:38:43 compute-0 nova_compute[187185]: 2025-11-29 07:38:43.365 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:38:43 compute-0 nova_compute[187185]: 2025-11-29 07:38:43.378 187189 DEBUG nova.virt.libvirt.driver [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:38:43 compute-0 nova_compute[187185]: 2025-11-29 07:38:43.379 187189 DEBUG nova.virt.libvirt.driver [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:38:43 compute-0 nova_compute[187185]: 2025-11-29 07:38:43.380 187189 DEBUG nova.virt.libvirt.driver [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:38:43 compute-0 nova_compute[187185]: 2025-11-29 07:38:43.381 187189 DEBUG nova.virt.libvirt.driver [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:38:43 compute-0 nova_compute[187185]: 2025-11-29 07:38:43.382 187189 DEBUG nova.virt.libvirt.driver [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:38:43 compute-0 nova_compute[187185]: 2025-11-29 07:38:43.383 187189 DEBUG nova.virt.libvirt.driver [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:38:43 compute-0 nova_compute[187185]: 2025-11-29 07:38:43.389 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:38:43 compute-0 nova_compute[187185]: 2025-11-29 07:38:43.431 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:38:43 compute-0 nova_compute[187185]: 2025-11-29 07:38:43.466 187189 INFO nova.compute.manager [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Took 17.71 seconds to spawn the instance on the hypervisor.
Nov 29 07:38:43 compute-0 nova_compute[187185]: 2025-11-29 07:38:43.467 187189 DEBUG nova.compute.manager [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:38:43 compute-0 nova_compute[187185]: 2025-11-29 07:38:43.567 187189 INFO nova.compute.manager [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Took 26.62 seconds to build instance.
Nov 29 07:38:43 compute-0 nova_compute[187185]: 2025-11-29 07:38:43.587 187189 DEBUG oslo_concurrency.lockutils [None req-ab2716ae-ca7a-4537-b7ab-706ecbb3a992 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 26.759s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:38:43 compute-0 podman[241770]: 2025-11-29 07:38:43.694316252 +0000 UTC m=+0.626399868 container create 2f8b8cb10efd6f617c050ccc794cd01e10282866102863fa7d0ae1730bb49d85 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-51013e93-c048-46cc-9a9d-a184eb63e1b4, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 07:38:43 compute-0 systemd[1]: Started libpod-conmon-2f8b8cb10efd6f617c050ccc794cd01e10282866102863fa7d0ae1730bb49d85.scope.
Nov 29 07:38:43 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:38:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/29f05bc38d74387f45bcacf18fa6071e137d578080ed116d64f3a98494023b97/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:38:43 compute-0 podman[241770]: 2025-11-29 07:38:43.807012448 +0000 UTC m=+0.739096164 container init 2f8b8cb10efd6f617c050ccc794cd01e10282866102863fa7d0ae1730bb49d85 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-51013e93-c048-46cc-9a9d-a184eb63e1b4, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:38:43 compute-0 podman[241770]: 2025-11-29 07:38:43.815604182 +0000 UTC m=+0.747687758 container start 2f8b8cb10efd6f617c050ccc794cd01e10282866102863fa7d0ae1730bb49d85 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-51013e93-c048-46cc-9a9d-a184eb63e1b4, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:38:43 compute-0 podman[241787]: 2025-11-29 07:38:43.823815825 +0000 UTC m=+0.073765243 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 07:38:43 compute-0 podman[241783]: 2025-11-29 07:38:43.827691685 +0000 UTC m=+0.079411184 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, 
container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 07:38:43 compute-0 podman[241786]: 2025-11-29 07:38:43.834997462 +0000 UTC m=+0.093731920 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, managed_by=edpm_ansible, release=1755695350, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, io.openshift.tags=minimal rhel9)
Nov 29 07:38:43 compute-0 neutron-haproxy-ovnmeta-51013e93-c048-46cc-9a9d-a184eb63e1b4[241803]: [NOTICE]   (241846) : New worker (241852) forked
Nov 29 07:38:43 compute-0 neutron-haproxy-ovnmeta-51013e93-c048-46cc-9a9d-a184eb63e1b4[241803]: [NOTICE]   (241846) : Loading success.
Nov 29 07:38:43 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:43.889 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 948b9a73-91cd-4803-b642-d1a75f163368 in datapath 3085017e-01d1-448e-9eca-033b34f9e960 unbound from our chassis
Nov 29 07:38:43 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:43.891 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3085017e-01d1-448e-9eca-033b34f9e960
Nov 29 07:38:43 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:43.902 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[3070e40e-9649-4bff-922c-e332851b1899]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:43 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:43.903 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3085017e-01 in ovnmeta-3085017e-01d1-448e-9eca-033b34f9e960 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:38:43 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:43.906 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3085017e-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:38:43 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:43.906 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6f97ad18-04ed-4b49-8342-e65d00847946]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:43 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:43.907 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[29acf24b-57cc-4bfe-a7bb-fd95e4e05836]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:43 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:43.918 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[e49d5054-9e01-4e5e-9153-ab0ce7324942]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:43 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:43.936 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[dd09dee4-a4a4-4df2-8320-7c56ecba3a74]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:43 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:43.966 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[9959dbaf-106d-48c3-8014-de7c37149c89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:43 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:43.971 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[19323ab7-1785-43e8-8fab-c849211d43c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:43 compute-0 NetworkManager[55227]: <info>  [1764401923.9724] manager: (tap3085017e-00): new Veth device (/org/freedesktop/NetworkManager/Devices/269)
Nov 29 07:38:44 compute-0 systemd-udevd[241869]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:38:44 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:44.019 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[b900f1b9-154f-40d5-9247-4f9347c139e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:44 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:44.023 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[86f27e5d-96c6-4e7a-b108-734e2ec509a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:44 compute-0 NetworkManager[55227]: <info>  [1764401924.0491] device (tap3085017e-00): carrier: link connected
Nov 29 07:38:44 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:44.058 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[89bfe585-fa41-441d-af32-0124f513c471]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:44 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:44.075 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6752daa5-fbdf-4991-aea0-a1922ab4539f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3085017e-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ba:4a:df'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 162], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 737171, 'reachable_time': 23789, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241888, 'error': None, 'target': 'ovnmeta-3085017e-01d1-448e-9eca-033b34f9e960', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:44 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:44.089 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e0efbced-75d4-4d55-9f7f-12048d1956d9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feba:4adf'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 737171, 'tstamp': 737171}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241889, 'error': None, 'target': 'ovnmeta-3085017e-01d1-448e-9eca-033b34f9e960', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:44 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:44.104 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e385766f-ca00-42fd-8219-43fa0d09883d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3085017e-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ba:4a:df'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 162], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 737171, 'reachable_time': 23789, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 241890, 'error': None, 'target': 'ovnmeta-3085017e-01d1-448e-9eca-033b34f9e960', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:44 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:44.139 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[11ae351e-d2fe-42da-b42a-dc5b66b9bbf9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:44 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:44.164 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[61456f64-3436-47cb-bc12-f14a701f9952]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:44 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:44.167 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3085017e-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:38:44 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:44.167 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:38:44 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:44.168 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3085017e-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:38:44 compute-0 nova_compute[187185]: 2025-11-29 07:38:44.170 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:44 compute-0 kernel: tap3085017e-00: entered promiscuous mode
Nov 29 07:38:44 compute-0 NetworkManager[55227]: <info>  [1764401924.1709] manager: (tap3085017e-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/270)
Nov 29 07:38:44 compute-0 nova_compute[187185]: 2025-11-29 07:38:44.172 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:44 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:44.174 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3085017e-00, col_values=(('external_ids', {'iface-id': '59e75940-874d-45e0-8a95-c21e1cf0e54f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:38:44 compute-0 nova_compute[187185]: 2025-11-29 07:38:44.176 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:44 compute-0 ovn_controller[95281]: 2025-11-29T07:38:44Z|00513|binding|INFO|Releasing lport 59e75940-874d-45e0-8a95-c21e1cf0e54f from this chassis (sb_readonly=0)
Nov 29 07:38:44 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:44.178 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3085017e-01d1-448e-9eca-033b34f9e960.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3085017e-01d1-448e-9eca-033b34f9e960.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:38:44 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:44.179 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[fa852609-f427-4f88-a4fd-b11855f45d21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:38:44 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:44.180 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:38:44 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:38:44 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:38:44 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-3085017e-01d1-448e-9eca-033b34f9e960
Nov 29 07:38:44 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:38:44 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:38:44 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:38:44 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/3085017e-01d1-448e-9eca-033b34f9e960.pid.haproxy
Nov 29 07:38:44 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:38:44 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:38:44 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:38:44 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:38:44 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:38:44 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:38:44 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:38:44 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:38:44 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:38:44 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:38:44 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:38:44 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:38:44 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:38:44 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:38:44 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:38:44 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:38:44 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:38:44 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:38:44 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:38:44 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:38:44 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID 3085017e-01d1-448e-9eca-033b34f9e960
Nov 29 07:38:44 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:38:44 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:38:44.182 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3085017e-01d1-448e-9eca-033b34f9e960', 'env', 'PROCESS_TAG=haproxy-3085017e-01d1-448e-9eca-033b34f9e960', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3085017e-01d1-448e-9eca-033b34f9e960.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:38:44 compute-0 nova_compute[187185]: 2025-11-29 07:38:44.187 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:44 compute-0 podman[241921]: 2025-11-29 07:38:44.541229683 +0000 UTC m=+0.050750891 container create 231ae1015c261439746df1967e65cf55f1082736175fc624e88a33d76e16f997 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3085017e-01d1-448e-9eca-033b34f9e960, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0)
Nov 29 07:38:44 compute-0 systemd[1]: Started libpod-conmon-231ae1015c261439746df1967e65cf55f1082736175fc624e88a33d76e16f997.scope.
Nov 29 07:38:44 compute-0 podman[241921]: 2025-11-29 07:38:44.511765507 +0000 UTC m=+0.021286745 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:38:44 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:38:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c47aacbac2e405d8881d795db883c9665cb9b6376c54744ab1021ab71213e7fd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:38:44 compute-0 podman[241921]: 2025-11-29 07:38:44.644703398 +0000 UTC m=+0.154224616 container init 231ae1015c261439746df1967e65cf55f1082736175fc624e88a33d76e16f997 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3085017e-01d1-448e-9eca-033b34f9e960, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:38:44 compute-0 podman[241921]: 2025-11-29 07:38:44.651615944 +0000 UTC m=+0.161137152 container start 231ae1015c261439746df1967e65cf55f1082736175fc624e88a33d76e16f997 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3085017e-01d1-448e-9eca-033b34f9e960, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3)
Nov 29 07:38:44 compute-0 neutron-haproxy-ovnmeta-3085017e-01d1-448e-9eca-033b34f9e960[241936]: [NOTICE]   (241941) : New worker (241944) forked
Nov 29 07:38:44 compute-0 neutron-haproxy-ovnmeta-3085017e-01d1-448e-9eca-033b34f9e960[241936]: [NOTICE]   (241941) : Loading success.
Nov 29 07:38:46 compute-0 nova_compute[187185]: 2025-11-29 07:38:46.207 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:46 compute-0 nova_compute[187185]: 2025-11-29 07:38:46.248 187189 DEBUG nova.compute.manager [req-1f1b3641-1ef0-4691-a7b0-d1b205253d3f req-2b8f4f46-377a-4b46-bb12-99c9a47eb458 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Received event network-vif-plugged-948b9a73-91cd-4803-b642-d1a75f163368 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:38:46 compute-0 nova_compute[187185]: 2025-11-29 07:38:46.249 187189 DEBUG oslo_concurrency.lockutils [req-1f1b3641-1ef0-4691-a7b0-d1b205253d3f req-2b8f4f46-377a-4b46-bb12-99c9a47eb458 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:38:46 compute-0 nova_compute[187185]: 2025-11-29 07:38:46.249 187189 DEBUG oslo_concurrency.lockutils [req-1f1b3641-1ef0-4691-a7b0-d1b205253d3f req-2b8f4f46-377a-4b46-bb12-99c9a47eb458 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:38:46 compute-0 nova_compute[187185]: 2025-11-29 07:38:46.250 187189 DEBUG oslo_concurrency.lockutils [req-1f1b3641-1ef0-4691-a7b0-d1b205253d3f req-2b8f4f46-377a-4b46-bb12-99c9a47eb458 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:38:46 compute-0 nova_compute[187185]: 2025-11-29 07:38:46.250 187189 DEBUG nova.compute.manager [req-1f1b3641-1ef0-4691-a7b0-d1b205253d3f req-2b8f4f46-377a-4b46-bb12-99c9a47eb458 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] No waiting events found dispatching network-vif-plugged-948b9a73-91cd-4803-b642-d1a75f163368 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:38:46 compute-0 nova_compute[187185]: 2025-11-29 07:38:46.250 187189 WARNING nova.compute.manager [req-1f1b3641-1ef0-4691-a7b0-d1b205253d3f req-2b8f4f46-377a-4b46-bb12-99c9a47eb458 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Received unexpected event network-vif-plugged-948b9a73-91cd-4803-b642-d1a75f163368 for instance with vm_state active and task_state None.
Nov 29 07:38:47 compute-0 nova_compute[187185]: 2025-11-29 07:38:47.361 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.012 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d', 'name': 'tempest-TestGettingAddress-server-1271234496', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000009a', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '0111c22b4b954ea586ca20d91ed3970f', 'user_id': '31ac7b05b012433b89143dc9f259644a', 'hostId': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.016 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff', 'name': 'tempest-TestNetworkBasicOps-server-501093689', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000009b', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'ec8b80be17a14d1caf666636283749d0', 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'hostId': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.017 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.023 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d / tapef353cc9-1e inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.024 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d / tap948b9a73-91 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.024 12 DEBUG ceilometer.compute.pollsters [-] e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.025 12 DEBUG ceilometer.compute.pollsters [-] e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.029 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for f9a714d0-1a3d-4de4-8fc2-fca74c904eff / tapc65a8c36-59 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.029 12 DEBUG ceilometer.compute.pollsters [-] f9a714d0-1a3d-4de4-8fc2-fca74c904eff/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eb3f4158-a07f-4fa5-9d84-62dbbb84bd3d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-0000009a-e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-tapef353cc9-1e', 'timestamp': '2025-11-29T07:38:48.017518', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1271234496', 'name': 'tapef353cc9-1e', 'instance_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bc:8f:e7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapef353cc9-1e'}, 'message_id': '711d4eb8-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.735823604, 'message_signature': '80a7e853a86203c9e067aa267b5036563c3fd6684e33c4e4fe122f25779388ec'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-0000009a-e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-tap948b9a73-91', 'timestamp': '2025-11-29T07:38:48.017518', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1271234496', 'name': 'tap948b9a73-91', 'instance_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ac:e6:05', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap948b9a73-91'}, 'message_id': '711d726c-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.735823604, 'message_signature': '9ed490335a183194834ef4b7e1592436313dcfe78bd5f99c92df277d25bed417'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000009b-f9a714d0-1a3d-4de4-8fc2-fca74c904eff-tapc65a8c36-59', 'timestamp': '2025-11-29T07:38:48.017518', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-501093689', 'name': 'tapc65a8c36-59', 'instance_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff', 'instance_type': 'm1.nano', 'host': 
'2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:95:82', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc65a8c36-59'}, 'message_id': '711e1082-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.744642684, 'message_signature': '577e9b53fe80c790dd0d37cacb1613b09c3201a91a507da730c9893cd2106605'}]}, 'timestamp': '2025-11-29 07:38:48.030503', '_unique_id': '5e093573026c4a1ebb210cdbf1ca24c5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.032 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.035 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.081 12 DEBUG ceilometer.compute.pollsters [-] e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.082 12 DEBUG ceilometer.compute.pollsters [-] e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.109 12 DEBUG ceilometer.compute.pollsters [-] f9a714d0-1a3d-4de4-8fc2-fca74c904eff/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.110 12 DEBUG ceilometer.compute.pollsters [-] f9a714d0-1a3d-4de4-8fc2-fca74c904eff/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '20430a65-ad07-40dd-90b9-6d9081136fd9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-vda', 'timestamp': '2025-11-29T07:38:48.035338', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1271234496', 'name': 'instance-0000009a', 'instance_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7125f93c-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.753590328, 'message_signature': '6c723a45620277486f27a7966aed0ab1ce6e2943a1d3044bbe15647f5cc39571'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 
'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-sda', 'timestamp': '2025-11-29T07:38:48.035338', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1271234496', 'name': 'instance-0000009a', 'instance_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '71260788-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.753590328, 'message_signature': '8c97cda57176991053de22d685ed85146aebb3f19efa7dec2df2142219351c95'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff-vda', 'timestamp': '2025-11-29T07:38:48.035338', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-501093689', 'name': 'instance-0000009b', 'instance_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '712a492e-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.800753956, 'message_signature': '4eb05c7aababcb417864aacd9912f2ffa55037a92778564b3e9dd3d11cfaae6e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff-sda', 'timestamp': '2025-11-29T07:38:48.035338', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-501093689', 'name': 'instance-0000009b', 'instance_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '712a6648-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.800753956, 'message_signature': 'fea897bf359287668d7cc71183d0485f55973a5b05684a2b1c322fece06c811b'}]}, 'timestamp': '2025-11-29 07:38:48.111223', '_unique_id': '92f39d1f3b3e48788dcac2eaaa8f4c57'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:38:48 compute-0 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.112 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.113 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.114 12 DEBUG ceilometer.compute.pollsters [-] e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.114 12 DEBUG ceilometer.compute.pollsters [-] e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.114 12 DEBUG ceilometer.compute.pollsters [-] f9a714d0-1a3d-4de4-8fc2-fca74c904eff/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '23e7a8b7-4db4-4f4e-b9e2-7bbc12a79657', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-0000009a-e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-tapef353cc9-1e', 'timestamp': '2025-11-29T07:38:48.114080', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1271234496', 'name': 'tapef353cc9-1e', 'instance_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bc:8f:e7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapef353cc9-1e'}, 'message_id': '712ae67c-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.735823604, 'message_signature': 'baa2df94f50fcabb352ea558801e4ca052f336298f463514698762dc449b9297'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 
'31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-0000009a-e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-tap948b9a73-91', 'timestamp': '2025-11-29T07:38:48.114080', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1271234496', 'name': 'tap948b9a73-91', 'instance_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ac:e6:05', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap948b9a73-91'}, 'message_id': '712af4a0-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.735823604, 'message_signature': '8ab8d712d2005a5d74bb3dfd783947ad731cddef69db222fbb62709c2444311f'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000009b-f9a714d0-1a3d-4de4-8fc2-fca74c904eff-tapc65a8c36-59', 'timestamp': '2025-11-29T07:38:48.114080', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-501093689', 'name': 'tapc65a8c36-59', 'instance_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff', 'instance_type': 'm1.nano', 'host': 
'2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:95:82', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc65a8c36-59'}, 'message_id': '712b0350-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.744642684, 'message_signature': 'a9e28f7445a6adfd3368309d005fea89b6f128635a23ecd508bcbe35e2b4876f'}]}, 'timestamp': '2025-11-29 07:38:48.115231', '_unique_id': 'f3cb29c3ac6b4161b7d183a9bbfe929a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.115 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.127 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.127 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.127 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1271234496>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-501093689>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1271234496>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-501093689>]
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.127 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.128 12 DEBUG ceilometer.compute.pollsters [-] e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.128 12 DEBUG ceilometer.compute.pollsters [-] e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.128 12 DEBUG ceilometer.compute.pollsters [-] f9a714d0-1a3d-4de4-8fc2-fca74c904eff/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f6592d55-7d0e-4be7-9a05-83d6acee63cb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-0000009a-e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-tapef353cc9-1e', 'timestamp': '2025-11-29T07:38:48.128453', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1271234496', 'name': 'tapef353cc9-1e', 'instance_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bc:8f:e7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapef353cc9-1e'}, 'message_id': '712d1370-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.735823604, 'message_signature': 'f226e8af815c2d50d0c2a6a14b5cc4673a877f02d59de320024940404759d00c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-0000009a-e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-tap948b9a73-91', 'timestamp': '2025-11-29T07:38:48.128453', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1271234496', 'name': 'tap948b9a73-91', 'instance_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ac:e6:05', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap948b9a73-91'}, 'message_id': '712d1bf4-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.735823604, 'message_signature': '3d80e1a7b8fe9c1f67d9db57d8d657d2cc92b27bd28296cb7464e8db1dc83a4b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000009b-f9a714d0-1a3d-4de4-8fc2-fca74c904eff-tapc65a8c36-59', 'timestamp': '2025-11-29T07:38:48.128453', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-501093689', 'name': 'tapc65a8c36-59', 'instance_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff', 'instance_type': 'm1.nano', 'host': 
'2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:95:82', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc65a8c36-59'}, 'message_id': '712d2446-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.744642684, 'message_signature': 'c6dd8c4b9105b6932778f29ff177c6b1f3f75e100522270bf7be520128f7cade'}]}, 'timestamp': '2025-11-29 07:38:48.129132', '_unique_id': 'ed9f477cf6304a2494d544f3262fab5f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.130 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.130 12 DEBUG ceilometer.compute.pollsters [-] e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.130 12 DEBUG ceilometer.compute.pollsters [-] e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.130 12 DEBUG ceilometer.compute.pollsters [-] f9a714d0-1a3d-4de4-8fc2-fca74c904eff/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '85ed58f4-f729-4b92-ab74-1df757b28f35', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-0000009a-e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-tapef353cc9-1e', 'timestamp': '2025-11-29T07:38:48.130357', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1271234496', 'name': 'tapef353cc9-1e', 'instance_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bc:8f:e7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapef353cc9-1e'}, 'message_id': '712d5cae-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.735823604, 'message_signature': '49f6839f0469e51337384439b7dc334a81115e29709e070b894cc6db337824c5'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-0000009a-e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-tap948b9a73-91', 'timestamp': '2025-11-29T07:38:48.130357', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1271234496', 'name': 'tap948b9a73-91', 'instance_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ac:e6:05', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap948b9a73-91'}, 'message_id': '712d649c-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.735823604, 'message_signature': 'fa7f36b0c9d68d0e06fb91a996c512248e1d88f11ba8d7b8d672eb2d3402681b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000009b-f9a714d0-1a3d-4de4-8fc2-fca74c904eff-tapc65a8c36-59', 'timestamp': '2025-11-29T07:38:48.130357', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-501093689', 'name': 'tapc65a8c36-59', 'instance_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 
'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:95:82', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc65a8c36-59'}, 'message_id': '712d6d7a-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.744642684, 'message_signature': '26148b3442a6a98dfeb80935f7cf515e507ee8ea1f0af896a4071bca2007aacc'}]}, 'timestamp': '2025-11-29 07:38:48.131005', '_unique_id': 'c511b35051ac42279c820af8546bf9b8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.139 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.139 12 DEBUG ceilometer.compute.pollsters [-] e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.139 12 DEBUG ceilometer.compute.pollsters [-] e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.139 12 DEBUG ceilometer.compute.pollsters [-] f9a714d0-1a3d-4de4-8fc2-fca74c904eff/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 DEBUG ceilometer.compute.pollsters [-] f9a714d0-1a3d-4de4-8fc2-fca74c904eff/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3d2e64a4-998a-45bc-81aa-72d758a0d04b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-vda', 'timestamp': '2025-11-29T07:38:48.139167', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1271234496', 'name': 'instance-0000009a', 'instance_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '712eb504-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.753590328, 'message_signature': '12cd2c33853c1f18f21e6c5efc988e4646a5fe43660378f46854b6457dc3a8cf'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 
'resource_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-sda', 'timestamp': '2025-11-29T07:38:48.139167', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1271234496', 'name': 'instance-0000009a', 'instance_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '712ebe6e-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.753590328, 'message_signature': 'ebc8d5845fe4681dc8e6b1680fbbb7fda4bd4f5bbf852d06f9c857907b2bf270'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff-vda', 'timestamp': '2025-11-29T07:38:48.139167', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-501093689', 'name': 'instance-0000009b', 'instance_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '712ec80a-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.800753956, 'message_signature': '477088dcc6f6f1b01e6cb585d65692c799328062db1ed3826c47587ca5516b77'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff-sda', 'timestamp': '2025-11-29T07:38:48.139167', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-501093689', 'name': 'instance-0000009b', 'instance_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '712ed70a-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.800753956, 'message_signature': '75bdeb6e400ed6b9efd5c7b5db51b2f77309bb0c22b800f14c7b0e9af7a5372a'}]}, 'timestamp': '2025-11-29 07:38:48.140255', '_unique_id': '7217a7aebe9c4d19a96ecefc564d8ebe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.140 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.141 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.141 12 DEBUG ceilometer.compute.pollsters [-] e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.141 12 DEBUG ceilometer.compute.pollsters [-] e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.142 12 DEBUG ceilometer.compute.pollsters [-] f9a714d0-1a3d-4de4-8fc2-fca74c904eff/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5849bd68-d971-40db-8fa7-d1b128db3858', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-0000009a-e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-tapef353cc9-1e', 'timestamp': '2025-11-29T07:38:48.141588', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1271234496', 'name': 'tapef353cc9-1e', 'instance_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bc:8f:e7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapef353cc9-1e'}, 'message_id': '712f14f4-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.735823604, 'message_signature': '06cdd60aed89a53d0211e9cc7d68f7211f8c011afc7e543a241e40b71439d117'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 
'31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-0000009a-e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-tap948b9a73-91', 'timestamp': '2025-11-29T07:38:48.141588', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1271234496', 'name': 'tap948b9a73-91', 'instance_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ac:e6:05', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap948b9a73-91'}, 'message_id': '712f2156-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.735823604, 'message_signature': 'ea8c7efac9ef32ea0c827fb80f27a623c3ec7e89a112ad0baf43ca70524eb1bd'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000009b-f9a714d0-1a3d-4de4-8fc2-fca74c904eff-tapc65a8c36-59', 'timestamp': '2025-11-29T07:38:48.141588', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-501093689', 'name': 'tapc65a8c36-59', 'instance_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 
'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:95:82', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc65a8c36-59'}, 'message_id': '712f2c28-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.744642684, 'message_signature': '46945168c7be0aa97de0e0a48e99f6579b40c4dc2ebdc936e1f760ca26b201bc'}]}, 'timestamp': '2025-11-29 07:38:48.142494', '_unique_id': 'e487daea96384ee287912fed0f807658'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.143 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1271234496>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-501093689>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1271234496>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-501093689>]
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.144 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.160 12 DEBUG ceilometer.compute.pollsters [-] e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.161 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d: ceilometer.compute.pollsters.NoVolumeException
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.176 12 DEBUG ceilometer.compute.pollsters [-] f9a714d0-1a3d-4de4-8fc2-fca74c904eff/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.176 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance f9a714d0-1a3d-4de4-8fc2-fca74c904eff: ceilometer.compute.pollsters.NoVolumeException
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.176 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.177 12 DEBUG ceilometer.compute.pollsters [-] e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.177 12 DEBUG ceilometer.compute.pollsters [-] e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.177 12 DEBUG ceilometer.compute.pollsters [-] f9a714d0-1a3d-4de4-8fc2-fca74c904eff/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.177 12 DEBUG ceilometer.compute.pollsters [-] f9a714d0-1a3d-4de4-8fc2-fca74c904eff/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4d3d7546-322d-42d6-8981-94bb78b0b918', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-vda', 'timestamp': '2025-11-29T07:38:48.177094', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1271234496', 'name': 'instance-0000009a', 'instance_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7134820e-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.753590328, 'message_signature': '7d352920b4c3c241a085ae4744c58b0fa7bc2971f886569a84efd91a8ccfc828'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 
'resource_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-sda', 'timestamp': '2025-11-29T07:38:48.177094', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1271234496', 'name': 'instance-0000009a', 'instance_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '71348dd0-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.753590328, 'message_signature': 'bbb299393acb1f85ef5d598daf15f9d41c2832305e5bddb682dd2f004c0f871a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff-vda', 'timestamp': '2025-11-29T07:38:48.177094', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-501093689', 'name': 'instance-0000009b', 'instance_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '713498d4-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.800753956, 'message_signature': 'e30bbc3b945a9bda740d8020a5870dc28ca852b43b00a5c72fa387e04c94669a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff-sda', 'timestamp': '2025-11-29T07:38:48.177094', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-501093689', 'name': 'instance-0000009b', 'instance_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7134a27a-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.800753956, 'message_signature': 'a2c5928d626f28225270c3b2620b0e30efab5cddeb1530ec0fb4d6d853337b1c'}]}, 'timestamp': '2025-11-29 07:38:48.178244', '_unique_id': 'd4886c8e6d684597807b9e01e22eb394'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.179 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.180 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.180 12 DEBUG ceilometer.compute.pollsters [-] e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.180 12 DEBUG ceilometer.compute.pollsters [-] e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.180 12 DEBUG ceilometer.compute.pollsters [-] f9a714d0-1a3d-4de4-8fc2-fca74c904eff/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c73311b0-bd26-4dc0-a736-bb1f4f8ca77c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-0000009a-e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-tapef353cc9-1e', 'timestamp': '2025-11-29T07:38:48.180380', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1271234496', 'name': 'tapef353cc9-1e', 'instance_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bc:8f:e7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapef353cc9-1e'}, 'message_id': '71350076-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.735823604, 'message_signature': '9a3736c94d38e5ad620cb7d2bf6c74ca8c061857863a1b71689ce94a5a23eb00'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-0000009a-e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-tap948b9a73-91', 'timestamp': '2025-11-29T07:38:48.180380', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1271234496', 'name': 'tap948b9a73-91', 'instance_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ac:e6:05', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap948b9a73-91'}, 'message_id': '713509e0-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.735823604, 'message_signature': 'f03fd5c735db61ad55707618f4c3d0ce62c5cbfb367a9ddb6977ede31d5c982c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000009b-f9a714d0-1a3d-4de4-8fc2-fca74c904eff-tapc65a8c36-59', 'timestamp': '2025-11-29T07:38:48.180380', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-501093689', 'name': 'tapc65a8c36-59', 'instance_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff', 'instance_type': 'm1.nano', 'host': 
'2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:95:82', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc65a8c36-59'}, 'message_id': '7135148a-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.744642684, 'message_signature': '7fea47757a3e2f1ab75270b82423c672a0a9d8c78ce8e93064e29e4c6ac3992e'}]}, 'timestamp': '2025-11-29 07:38:48.181158', '_unique_id': 'a7c770a03be445faaaa5bb2ea0956bc3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.181 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.182 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.182 12 DEBUG ceilometer.compute.pollsters [-] e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/cpu volume: 4510000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.182 12 DEBUG ceilometer.compute.pollsters [-] f9a714d0-1a3d-4de4-8fc2-fca74c904eff/cpu volume: 6660000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2873fd29-7d14-44a7-9bde-be9e9f5fc135', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4510000000, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d', 'timestamp': '2025-11-29T07:38:48.182626', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1271234496', 'name': 'instance-0000009a', 'instance_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '713557ba-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.878967494, 'message_signature': '1aed2e2b95e470e66b10bdf2806f996a5c491e9163eaa72d44e5817db3013454'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6660000000, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 
'f9a714d0-1a3d-4de4-8fc2-fca74c904eff', 'timestamp': '2025-11-29T07:38:48.182626', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-501093689', 'name': 'instance-0000009b', 'instance_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '71356228-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.894777692, 'message_signature': 'd312281cb240478195c65b37e6115a910ca1594f7f4c9e3497891ba36058ff7c'}]}, 'timestamp': '2025-11-29 07:38:48.183140', '_unique_id': '38d4521d6f5e4a6ba06a28e40ee8c7d0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.183 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.184 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.184 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.184 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1271234496>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-501093689>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1271234496>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-501093689>]
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.184 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.184 12 DEBUG ceilometer.compute.pollsters [-] e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/disk.device.read.latency volume: 174028112 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.185 12 DEBUG ceilometer.compute.pollsters [-] e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/disk.device.read.latency volume: 713470 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.185 12 DEBUG ceilometer.compute.pollsters [-] f9a714d0-1a3d-4de4-8fc2-fca74c904eff/disk.device.read.latency volume: 394264184 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.185 12 DEBUG ceilometer.compute.pollsters [-] f9a714d0-1a3d-4de4-8fc2-fca74c904eff/disk.device.read.latency volume: 651478 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bedc61db-5a62-4c0b-98e9-af721b49413f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 174028112, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-vda', 'timestamp': '2025-11-29T07:38:48.184965', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1271234496', 'name': 'instance-0000009a', 'instance_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7135b304-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.753590328, 'message_signature': '5c32b48526d4c93a2878589a2a423a38425fc0d11a9250e77e8c97eec03aef6b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 713470, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 
'resource_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-sda', 'timestamp': '2025-11-29T07:38:48.184965', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1271234496', 'name': 'instance-0000009a', 'instance_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7135bc14-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.753590328, 'message_signature': 'b3beeca574cc2416e086890ee6d624bbc34364dcace5fa5578dd1eb785885de7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 394264184, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff-vda', 'timestamp': '2025-11-29T07:38:48.184965', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-501093689', 'name': 'instance-0000009b', 'instance_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7135c4de-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.800753956, 'message_signature': 'cbc8ff78da8ea129d26ba54dde65793118e35298b3886fd0653a9d580910f763'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 651478, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff-sda', 'timestamp': '2025-11-29T07:38:48.184965', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-501093689', 'name': 'instance-0000009b', 'instance_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7135d0c8-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.800753956, 'message_signature': '638e0c901ef6db51a364bb16b34cfbe1e61ba4b8859d8d74388e6de25c2e36b0'}]}, 'timestamp': '2025-11-29 07:38:48.185981', '_unique_id': '52281de440c24cee9791f2e812a8a66b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.186 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.187 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.187 12 DEBUG ceilometer.compute.pollsters [-] e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.187 12 DEBUG ceilometer.compute.pollsters [-] e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.187 12 DEBUG ceilometer.compute.pollsters [-] f9a714d0-1a3d-4de4-8fc2-fca74c904eff/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 DEBUG ceilometer.compute.pollsters [-] f9a714d0-1a3d-4de4-8fc2-fca74c904eff/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f116da7e-9b04-4f77-ba53-fe96408a4ba7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-vda', 'timestamp': '2025-11-29T07:38:48.187425', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1271234496', 'name': 'instance-0000009a', 'instance_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '71361326-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.753590328, 'message_signature': 'd0bb74a517b59782837773f70c1c7dfbaf6ab06fc3ed0bab6678234d465e57f2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 
'resource_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-sda', 'timestamp': '2025-11-29T07:38:48.187425', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1271234496', 'name': 'instance-0000009a', 'instance_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '71361c36-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.753590328, 'message_signature': '7e212c4255a0b44f69613234d61bf0687c1750b18688db1e2b5330296f1ae2a0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff-vda', 'timestamp': '2025-11-29T07:38:48.187425', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-501093689', 'name': 'instance-0000009b', 'instance_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '713626d6-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.800753956, 'message_signature': '74351946b63543867712fe5740e413b4ebf244a5c703f7dfc7b211ef687bec53'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff-sda', 'timestamp': '2025-11-29T07:38:48.187425', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-501093689', 'name': 'instance-0000009b', 'instance_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '71362fa0-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.800753956, 'message_signature': '2e97aa252c99e7fbf8f2cd72ff2f098a2eda99ce44dcc2c59618b81efa1bc848'}]}, 'timestamp': '2025-11-29 07:38:48.188397', '_unique_id': 'a712df39377e409c8d94f6e512645d47'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.188 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.189 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.189 12 DEBUG ceilometer.compute.pollsters [-] e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.190 12 DEBUG ceilometer.compute.pollsters [-] e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.190 12 DEBUG ceilometer.compute.pollsters [-] f9a714d0-1a3d-4de4-8fc2-fca74c904eff/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '16c531d0-6be2-4e9d-a232-33f675b3f320', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-0000009a-e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-tapef353cc9-1e', 'timestamp': '2025-11-29T07:38:48.189824', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1271234496', 'name': 'tapef353cc9-1e', 'instance_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bc:8f:e7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapef353cc9-1e'}, 'message_id': '713672da-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.735823604, 'message_signature': '4b18ef04219ebce536133725cde2a6710b1a2ea1b7b53210617ce17dad32af9e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-0000009a-e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-tap948b9a73-91', 'timestamp': '2025-11-29T07:38:48.189824', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1271234496', 'name': 'tap948b9a73-91', 'instance_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ac:e6:05', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap948b9a73-91'}, 'message_id': '71367c58-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.735823604, 'message_signature': '7bd749ae1ae60de8a2877e6bf14fe6b6b4d22c4f450c48baab4f24fd3fd73e76'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000009b-f9a714d0-1a3d-4de4-8fc2-fca74c904eff-tapc65a8c36-59', 'timestamp': '2025-11-29T07:38:48.189824', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-501093689', 'name': 'tapc65a8c36-59', 'instance_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 
'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:95:82', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc65a8c36-59'}, 'message_id': '71368572-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.744642684, 'message_signature': '6a26cf29d6312698a0c068d75ef24c3794151313f33e3aa31a0ef40feb37aa37'}]}, 'timestamp': '2025-11-29 07:38:48.190602', '_unique_id': '90dcec591c354197aa995c62ef9c3ff0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.191 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.192 12 DEBUG ceilometer.compute.pollsters [-] e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.192 12 DEBUG ceilometer.compute.pollsters [-] e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.192 12 DEBUG ceilometer.compute.pollsters [-] f9a714d0-1a3d-4de4-8fc2-fca74c904eff/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd8123df8-fbca-431e-856d-b50f275676d1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-0000009a-e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-tapef353cc9-1e', 'timestamp': '2025-11-29T07:38:48.192043', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1271234496', 'name': 'tapef353cc9-1e', 'instance_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bc:8f:e7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapef353cc9-1e'}, 'message_id': '7136c7b2-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.735823604, 'message_signature': '0b57e5aa146649e5283b77a246698bc9eccf3894bf7919f86ddfa84221df7a76'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-0000009a-e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-tap948b9a73-91', 'timestamp': '2025-11-29T07:38:48.192043', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1271234496', 'name': 'tap948b9a73-91', 'instance_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ac:e6:05', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap948b9a73-91'}, 'message_id': '7136d108-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.735823604, 'message_signature': 'ab4fcdf284a3e77d7b58ac7d81942b3a4e90d4dd7abb240f10dbdf191331dd96'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000009b-f9a714d0-1a3d-4de4-8fc2-fca74c904eff-tapc65a8c36-59', 'timestamp': '2025-11-29T07:38:48.192043', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-501093689', 'name': 'tapc65a8c36-59', 'instance_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff', 'instance_type': 'm1.nano', 'host': 
'2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:95:82', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc65a8c36-59'}, 'message_id': '7136da0e-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.744642684, 'message_signature': '382cabc5a346f6e5c47bf5024f5ce8620bbc6a85f1136ba22d84da4371fbfab3'}]}, 'timestamp': '2025-11-29 07:38:48.192768', '_unique_id': '4ee2fe97251c4aabb5acf5b2ba3615b1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.193 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.194 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.202 12 DEBUG ceilometer.compute.pollsters [-] e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.203 12 DEBUG ceilometer.compute.pollsters [-] e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.211 12 DEBUG ceilometer.compute.pollsters [-] f9a714d0-1a3d-4de4-8fc2-fca74c904eff/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.212 12 DEBUG ceilometer.compute.pollsters [-] f9a714d0-1a3d-4de4-8fc2-fca74c904eff/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2db093cb-c6ba-49c9-9fdc-f9b427046a12', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-vda', 'timestamp': '2025-11-29T07:38:48.194193', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1271234496', 'name': 'instance-0000009a', 'instance_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '713878dc-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.912405352, 'message_signature': '8e1d6807bda4f4feb24b6f474fc23e6bcc6263b466fa4285955c6581523fa315'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 
'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-sda', 'timestamp': '2025-11-29T07:38:48.194193', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1271234496', 'name': 'instance-0000009a', 'instance_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '713886d8-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.912405352, 'message_signature': 'a2203b417889af947e1913fabd364c3810607c5c6b51d7171306c77e1fb3c8a7'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff-vda', 'timestamp': '2025-11-29T07:38:48.194193', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-501093689', 'name': 'instance-0000009b', 'instance_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 
'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7139cdf4-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.921947623, 'message_signature': 'c472b73dda87ba4f8412fa7b339ef8e33ecc2151f56fad6113c28b388ca7f36a'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff-sda', 'timestamp': '2025-11-29T07:38:48.194193', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-501093689', 'name': 'instance-0000009b', 'instance_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7139d966-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.921947623, 'message_signature': 'bc3a6b748c717a289793b5210247abec84a009f670354df60fa6f3ea7199d424'}]}, 'timestamp': '2025-11-29 07:38:48.212427', '_unique_id': '8eb5e9f17ef2426881795b38fa9d0af2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.213 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.214 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.214 12 DEBUG ceilometer.compute.pollsters [-] e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.215 12 DEBUG ceilometer.compute.pollsters [-] e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.215 12 DEBUG ceilometer.compute.pollsters [-] f9a714d0-1a3d-4de4-8fc2-fca74c904eff/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.215 12 DEBUG ceilometer.compute.pollsters [-] f9a714d0-1a3d-4de4-8fc2-fca74c904eff/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '498ddeaf-7dcd-4d06-805e-493dfb0c54fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-vda', 'timestamp': '2025-11-29T07:38:48.214649', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1271234496', 'name': 'instance-0000009a', 'instance_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '713a3de8-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.753590328, 'message_signature': '197e635ad1047b8291c592c93507f89d0c67cd6a6aad96ca481ee1331bdc99aa'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 
'resource_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-sda', 'timestamp': '2025-11-29T07:38:48.214649', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1271234496', 'name': 'instance-0000009a', 'instance_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '713a4b12-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.753590328, 'message_signature': '3b151282fbe2df14ecb3c72e53aff022d76bf9982758d7109fd749bc62f0969f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff-vda', 'timestamp': '2025-11-29T07:38:48.214649', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-501093689', 'name': 'instance-0000009b', 'instance_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '713a57ba-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.800753956, 'message_signature': '2411884b1c8c05d1a6cec56ad911415a1e973a303da13fe42d1acdb9154b8b21'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff-sda', 'timestamp': '2025-11-29T07:38:48.214649', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-501093689', 'name': 'instance-0000009b', 'instance_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '713a64e4-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.800753956, 'message_signature': '3034c11507bcf95fe3a50e54834e356e70b2a0051ab0dbf66b4ca24196110b1f'}]}, 'timestamp': '2025-11-29 07:38:48.216025', '_unique_id': '0983da8e8bb94deaae2c143661e73c56'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.216 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.218 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.218 12 DEBUG ceilometer.compute.pollsters [-] e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.218 12 DEBUG ceilometer.compute.pollsters [-] e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.218 12 DEBUG ceilometer.compute.pollsters [-] f9a714d0-1a3d-4de4-8fc2-fca74c904eff/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b652cf40-c5d8-4e29-aad6-9207d6590ae9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-0000009a-e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-tapef353cc9-1e', 'timestamp': '2025-11-29T07:38:48.218169', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1271234496', 'name': 'tapef353cc9-1e', 'instance_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bc:8f:e7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapef353cc9-1e'}, 'message_id': '713ac722-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.735823604, 'message_signature': '02e421ae1d091cfb8c441e0c4127f6fda3c493652f09ea8c4eb36a36589c97aa'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-0000009a-e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-tap948b9a73-91', 'timestamp': '2025-11-29T07:38:48.218169', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1271234496', 'name': 'tap948b9a73-91', 'instance_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ac:e6:05', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap948b9a73-91'}, 'message_id': '713ad4b0-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.735823604, 'message_signature': 'dba769c9651729f66d3a90bc56f5890a6ead5eb0328fdd3ccc2de23019403468'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000009b-f9a714d0-1a3d-4de4-8fc2-fca74c904eff-tapc65a8c36-59', 'timestamp': '2025-11-29T07:38:48.218169', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-501093689', 'name': 'tapc65a8c36-59', 'instance_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:95:82', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc65a8c36-59'}, 'message_id': '713ae2ca-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.744642684, 'message_signature': '1df5c463b39c0abb52e3edcff78b0120239f81e44ad5b73c03e8a0cfbd7b927b'}]}, 'timestamp': '2025-11-29 07:38:48.219259', '_unique_id': 'f2a86530c9b6423cacbea04b959eb19f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.219 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.221 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.221 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.221 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1271234496>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-501093689>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1271234496>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-501093689>]
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.221 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.221 12 DEBUG ceilometer.compute.pollsters [-] e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.222 12 DEBUG ceilometer.compute.pollsters [-] e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.222 12 DEBUG ceilometer.compute.pollsters [-] f9a714d0-1a3d-4de4-8fc2-fca74c904eff/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.222 12 DEBUG ceilometer.compute.pollsters [-] f9a714d0-1a3d-4de4-8fc2-fca74c904eff/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a05f558c-aa40-475f-8b11-7faedd3f53fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-vda', 'timestamp': '2025-11-29T07:38:48.221773', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1271234496', 'name': 'instance-0000009a', 'instance_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '713b54b2-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.912405352, 'message_signature': '604ee0b8dbfd3a30d98c44b8511dbe50a12201253c85ab79bead2f8768b923e3'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-sda', 'timestamp': '2025-11-29T07:38:48.221773', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1271234496', 'name': 'instance-0000009a', 'instance_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '713b61b4-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.912405352, 'message_signature': 'eab4a0f1702cdf0e0a2d3b0f1f8167784bb9c5cc3166eed1fb28305dcbd90ac3'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff-vda', 'timestamp': '2025-11-29T07:38:48.221773', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-501093689', 'name': 'instance-0000009b', 'instance_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '713b6e5c-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.921947623, 'message_signature': '6939ca29184754a67c5292d3b91688752d01e34096ba1d2ab4a0033427c41d08'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff-sda', 'timestamp': '2025-11-29T07:38:48.221773', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-501093689', 'name': 'instance-0000009b', 'instance_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '713b7bd6-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.921947623, 'message_signature': '1314a0dc7491b3390147373d06168677abe92969eeba0f556e758284aff4928c'}]}, 'timestamp': '2025-11-29 07:38:48.223164', '_unique_id': 'cfdf6238b8064fd495dddcbee257d9bb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.223 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.224 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.224 12 DEBUG ceilometer.compute.pollsters [-] e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.225 12 DEBUG ceilometer.compute.pollsters [-] e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.225 12 DEBUG ceilometer.compute.pollsters [-] f9a714d0-1a3d-4de4-8fc2-fca74c904eff/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.225 12 DEBUG ceilometer.compute.pollsters [-] f9a714d0-1a3d-4de4-8fc2-fca74c904eff/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7330f66e-077e-466b-ab35-9f2c2cb797ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-vda', 'timestamp': '2025-11-29T07:38:48.224944', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1271234496', 'name': 'instance-0000009a', 'instance_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '713bcf1e-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.912405352, 'message_signature': 'efcc195e4088bcfc5e13846fc57bbf5cf8a87bded295381ae6403b4fbd535dcf'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 
'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-sda', 'timestamp': '2025-11-29T07:38:48.224944', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1271234496', 'name': 'instance-0000009a', 'instance_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '713bdbbc-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.912405352, 'message_signature': '42998a35a74fd8b34e3b3969eaeabe4206cb88d51d098555c19c806c8c097a8a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff-vda', 'timestamp': '2025-11-29T07:38:48.224944', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-501093689', 'name': 'instance-0000009b', 'instance_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '713be828-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.921947623, 'message_signature': '90b170b50b238558b68308941486936845f821f3bd2b0aadf410acbd375cf970'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff-sda', 'timestamp': '2025-11-29T07:38:48.224944', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-501093689', 'name': 'instance-0000009b', 'instance_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '713bf5d4-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.921947623, 'message_signature': '4e1759ec01da3edef31eb03c82442bbe290a179675f4bab10ea4149b2275271a'}]}, 'timestamp': '2025-11-29 07:38:48.226285', '_unique_id': 'fad726ff59034f059e103c2e306d3b2c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.226 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.227 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.228 12 DEBUG ceilometer.compute.pollsters [-] e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.228 12 DEBUG ceilometer.compute.pollsters [-] e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.228 12 DEBUG ceilometer.compute.pollsters [-] f9a714d0-1a3d-4de4-8fc2-fca74c904eff/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b07f1796-d992-4b63-a04f-37afe7abb4eb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-0000009a-e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-tapef353cc9-1e', 'timestamp': '2025-11-29T07:38:48.228123', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1271234496', 'name': 'tapef353cc9-1e', 'instance_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bc:8f:e7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapef353cc9-1e'}, 'message_id': '713c4b60-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.735823604, 'message_signature': 'f4589f9308f0e67abfcf6b7e826277a632ea8d7e624d46c725d277582633b12c'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-0000009a-e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-tap948b9a73-91', 'timestamp': '2025-11-29T07:38:48.228123', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1271234496', 'name': 'tap948b9a73-91', 'instance_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ac:e6:05', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap948b9a73-91'}, 'message_id': '713c589e-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.735823604, 'message_signature': '9d4df7bf407a2bf5fb691f40bf25c253af7fbb296194b49194bfd4af4ea24d40'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000009b-f9a714d0-1a3d-4de4-8fc2-fca74c904eff-tapc65a8c36-59', 'timestamp': '2025-11-29T07:38:48.228123', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-501093689', 'name': 'tapc65a8c36-59', 'instance_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff', 'instance_type': 'm1.nano', 'host': 
'2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:95:82', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc65a8c36-59'}, 'message_id': '713c6690-ccf6-11f0-8f64-fa163e220349', 'monotonic_time': 7375.744642684, 'message_signature': '000b2ea26cd6b1e915e9ac079c02b835b326e6fbdf7cc53c7d916917acd9c7ef'}]}, 'timestamp': '2025-11-29 07:38:48.229182', '_unique_id': 'f10ffccb32e64c84b7ec2973d06ac664'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:38:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:38:48.229 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:38:48 compute-0 sshd-session[241674]: Received disconnect from 45.78.219.119 port 56456:11: Bye Bye [preauth]
Nov 29 07:38:48 compute-0 sshd-session[241674]: Disconnected from authenticating user root 45.78.219.119 port 56456 [preauth]
Nov 29 07:38:51 compute-0 nova_compute[187185]: 2025-11-29 07:38:51.210 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:52 compute-0 nova_compute[187185]: 2025-11-29 07:38:52.311 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:38:52 compute-0 nova_compute[187185]: 2025-11-29 07:38:52.363 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:52 compute-0 podman[241954]: 2025-11-29 07:38:52.87917408 +0000 UTC m=+0.131528231 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:38:54 compute-0 ovn_controller[95281]: 2025-11-29T07:38:54Z|00065|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5b:95:82 10.100.0.11
Nov 29 07:38:54 compute-0 ovn_controller[95281]: 2025-11-29T07:38:54Z|00066|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5b:95:82 10.100.0.11
Nov 29 07:38:56 compute-0 nova_compute[187185]: 2025-11-29 07:38:56.213 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:57 compute-0 nova_compute[187185]: 2025-11-29 07:38:57.249 187189 DEBUG nova.compute.manager [req-032c4952-a319-47b4-9685-0ec05a8b9b71 req-281351b6-ef8f-4949-ab77-970e2ea54293 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Received event network-changed-c65a8c36-5997-4e67-9fa0-e361b7c334ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:38:57 compute-0 nova_compute[187185]: 2025-11-29 07:38:57.249 187189 DEBUG nova.compute.manager [req-032c4952-a319-47b4-9685-0ec05a8b9b71 req-281351b6-ef8f-4949-ab77-970e2ea54293 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Refreshing instance network info cache due to event network-changed-c65a8c36-5997-4e67-9fa0-e361b7c334ca. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:38:57 compute-0 nova_compute[187185]: 2025-11-29 07:38:57.250 187189 DEBUG oslo_concurrency.lockutils [req-032c4952-a319-47b4-9685-0ec05a8b9b71 req-281351b6-ef8f-4949-ab77-970e2ea54293 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-f9a714d0-1a3d-4de4-8fc2-fca74c904eff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:38:57 compute-0 nova_compute[187185]: 2025-11-29 07:38:57.251 187189 DEBUG oslo_concurrency.lockutils [req-032c4952-a319-47b4-9685-0ec05a8b9b71 req-281351b6-ef8f-4949-ab77-970e2ea54293 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-f9a714d0-1a3d-4de4-8fc2-fca74c904eff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:38:57 compute-0 nova_compute[187185]: 2025-11-29 07:38:57.251 187189 DEBUG nova.network.neutron [req-032c4952-a319-47b4-9685-0ec05a8b9b71 req-281351b6-ef8f-4949-ab77-970e2ea54293 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Refreshing network info cache for port c65a8c36-5997-4e67-9fa0-e361b7c334ca _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:38:57 compute-0 nova_compute[187185]: 2025-11-29 07:38:57.367 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:38:57 compute-0 ovn_controller[95281]: 2025-11-29T07:38:57Z|00067|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:bc:8f:e7 10.100.0.14
Nov 29 07:38:57 compute-0 ovn_controller[95281]: 2025-11-29T07:38:57Z|00068|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:bc:8f:e7 10.100.0.14
Nov 29 07:38:59 compute-0 podman[242006]: 2025-11-29 07:38:59.821425282 +0000 UTC m=+0.074453783 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 07:39:01 compute-0 nova_compute[187185]: 2025-11-29 07:39:01.215 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:02 compute-0 nova_compute[187185]: 2025-11-29 07:39:02.372 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:03 compute-0 sshd-session[242030]: Invalid user testftp from 20.255.62.58 port 34672
Nov 29 07:39:03 compute-0 podman[242032]: 2025-11-29 07:39:03.210717072 +0000 UTC m=+0.067605409 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:39:03 compute-0 podman[242033]: 2025-11-29 07:39:03.227723544 +0000 UTC m=+0.084405635 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, 
io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:39:03 compute-0 sshd-session[242030]: Received disconnect from 20.255.62.58 port 34672:11: Bye Bye [preauth]
Nov 29 07:39:03 compute-0 sshd-session[242030]: Disconnected from invalid user testftp 20.255.62.58 port 34672 [preauth]
Nov 29 07:39:06 compute-0 nova_compute[187185]: 2025-11-29 07:39:06.217 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:06 compute-0 nova_compute[187185]: 2025-11-29 07:39:06.317 187189 DEBUG nova.network.neutron [req-032c4952-a319-47b4-9685-0ec05a8b9b71 req-281351b6-ef8f-4949-ab77-970e2ea54293 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Updated VIF entry in instance network info cache for port c65a8c36-5997-4e67-9fa0-e361b7c334ca. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:39:06 compute-0 nova_compute[187185]: 2025-11-29 07:39:06.318 187189 DEBUG nova.network.neutron [req-032c4952-a319-47b4-9685-0ec05a8b9b71 req-281351b6-ef8f-4949-ab77-970e2ea54293 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Updating instance_info_cache with network_info: [{"id": "c65a8c36-5997-4e67-9fa0-e361b7c334ca", "address": "fa:16:3e:5b:95:82", "network": {"id": "8ac0e70c-84ba-415c-841b-4a5a525b1a9d", "bridge": "br-int", "label": "tempest-network-smoke--597026508", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc65a8c36-59", "ovs_interfaceid": "c65a8c36-5997-4e67-9fa0-e361b7c334ca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:39:06 compute-0 nova_compute[187185]: 2025-11-29 07:39:06.383 187189 INFO nova.compute.manager [None req-07212627-7057-4460-b472-e64c06e116e5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Get console output
Nov 29 07:39:06 compute-0 nova_compute[187185]: 2025-11-29 07:39:06.388 213758 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 29 07:39:07 compute-0 nova_compute[187185]: 2025-11-29 07:39:07.374 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:10 compute-0 nova_compute[187185]: 2025-11-29 07:39:10.138 187189 DEBUG oslo_concurrency.lockutils [req-032c4952-a319-47b4-9685-0ec05a8b9b71 req-281351b6-ef8f-4949-ab77-970e2ea54293 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-f9a714d0-1a3d-4de4-8fc2-fca74c904eff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:39:11 compute-0 nova_compute[187185]: 2025-11-29 07:39:11.218 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:11 compute-0 nova_compute[187185]: 2025-11-29 07:39:11.224 187189 DEBUG nova.compute.manager [req-b779b68d-f514-42e4-be6e-1121a7f71c11 req-875650f9-beed-4de3-8cfa-c75773d365ae 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Received event network-changed-ef353cc9-1e6a-4c76-9acf-917aecd8f9a6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:39:11 compute-0 nova_compute[187185]: 2025-11-29 07:39:11.225 187189 DEBUG nova.compute.manager [req-b779b68d-f514-42e4-be6e-1121a7f71c11 req-875650f9-beed-4de3-8cfa-c75773d365ae 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Refreshing instance network info cache due to event network-changed-ef353cc9-1e6a-4c76-9acf-917aecd8f9a6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:39:11 compute-0 nova_compute[187185]: 2025-11-29 07:39:11.225 187189 DEBUG oslo_concurrency.lockutils [req-b779b68d-f514-42e4-be6e-1121a7f71c11 req-875650f9-beed-4de3-8cfa-c75773d365ae 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:39:11 compute-0 nova_compute[187185]: 2025-11-29 07:39:11.225 187189 DEBUG oslo_concurrency.lockutils [req-b779b68d-f514-42e4-be6e-1121a7f71c11 req-875650f9-beed-4de3-8cfa-c75773d365ae 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:39:11 compute-0 nova_compute[187185]: 2025-11-29 07:39:11.226 187189 DEBUG nova.network.neutron [req-b779b68d-f514-42e4-be6e-1121a7f71c11 req-875650f9-beed-4de3-8cfa-c75773d365ae 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Refreshing network info cache for port ef353cc9-1e6a-4c76-9acf-917aecd8f9a6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:39:12 compute-0 nova_compute[187185]: 2025-11-29 07:39:12.181 187189 DEBUG nova.compute.manager [req-53736a8e-0148-4f8c-803b-7689e81e054e req-5e17d866-2f71-421e-a219-c936ab652b63 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Received event network-changed-c65a8c36-5997-4e67-9fa0-e361b7c334ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:39:12 compute-0 nova_compute[187185]: 2025-11-29 07:39:12.181 187189 DEBUG nova.compute.manager [req-53736a8e-0148-4f8c-803b-7689e81e054e req-5e17d866-2f71-421e-a219-c936ab652b63 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Refreshing instance network info cache due to event network-changed-c65a8c36-5997-4e67-9fa0-e361b7c334ca. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:39:12 compute-0 nova_compute[187185]: 2025-11-29 07:39:12.182 187189 DEBUG oslo_concurrency.lockutils [req-53736a8e-0148-4f8c-803b-7689e81e054e req-5e17d866-2f71-421e-a219-c936ab652b63 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-f9a714d0-1a3d-4de4-8fc2-fca74c904eff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:39:12 compute-0 nova_compute[187185]: 2025-11-29 07:39:12.182 187189 DEBUG oslo_concurrency.lockutils [req-53736a8e-0148-4f8c-803b-7689e81e054e req-5e17d866-2f71-421e-a219-c936ab652b63 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-f9a714d0-1a3d-4de4-8fc2-fca74c904eff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:39:12 compute-0 nova_compute[187185]: 2025-11-29 07:39:12.183 187189 DEBUG nova.network.neutron [req-53736a8e-0148-4f8c-803b-7689e81e054e req-5e17d866-2f71-421e-a219-c936ab652b63 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Refreshing network info cache for port c65a8c36-5997-4e67-9fa0-e361b7c334ca _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:39:12 compute-0 nova_compute[187185]: 2025-11-29 07:39:12.377 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:13 compute-0 nova_compute[187185]: 2025-11-29 07:39:13.557 187189 DEBUG nova.network.neutron [req-b779b68d-f514-42e4-be6e-1121a7f71c11 req-875650f9-beed-4de3-8cfa-c75773d365ae 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Updated VIF entry in instance network info cache for port ef353cc9-1e6a-4c76-9acf-917aecd8f9a6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:39:13 compute-0 nova_compute[187185]: 2025-11-29 07:39:13.558 187189 DEBUG nova.network.neutron [req-b779b68d-f514-42e4-be6e-1121a7f71c11 req-875650f9-beed-4de3-8cfa-c75773d365ae 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Updating instance_info_cache with network_info: [{"id": "ef353cc9-1e6a-4c76-9acf-917aecd8f9a6", "address": "fa:16:3e:bc:8f:e7", "network": {"id": "51013e93-c048-46cc-9a9d-a184eb63e1b4", "bridge": "br-int", "label": "tempest-network-smoke--890595323", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef353cc9-1e", "ovs_interfaceid": "ef353cc9-1e6a-4c76-9acf-917aecd8f9a6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "948b9a73-91cd-4803-b642-d1a75f163368", "address": "fa:16:3e:ac:e6:05", "network": {"id": "3085017e-01d1-448e-9eca-033b34f9e960", "bridge": "br-int", "label": "tempest-network-smoke--1681462854", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feac:e605", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap948b9a73-91", "ovs_interfaceid": "948b9a73-91cd-4803-b642-d1a75f163368", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:39:13 compute-0 nova_compute[187185]: 2025-11-29 07:39:13.615 187189 DEBUG oslo_concurrency.lockutils [req-b779b68d-f514-42e4-be6e-1121a7f71c11 req-875650f9-beed-4de3-8cfa-c75773d365ae 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:39:14 compute-0 podman[242073]: 2025-11-29 07:39:14.778671982 +0000 UTC m=+0.046113009 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:39:14 compute-0 podman[242075]: 2025-11-29 07:39:14.812346297 +0000 UTC m=+0.072226080 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 07:39:14 compute-0 podman[242074]: 2025-11-29 07:39:14.822855415 +0000 UTC m=+0.087678728 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Nov 29 07:39:15 compute-0 nova_compute[187185]: 2025-11-29 07:39:15.346 187189 DEBUG nova.network.neutron [req-53736a8e-0148-4f8c-803b-7689e81e054e req-5e17d866-2f71-421e-a219-c936ab652b63 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Updated VIF entry in instance network info cache for port c65a8c36-5997-4e67-9fa0-e361b7c334ca. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:39:15 compute-0 nova_compute[187185]: 2025-11-29 07:39:15.347 187189 DEBUG nova.network.neutron [req-53736a8e-0148-4f8c-803b-7689e81e054e req-5e17d866-2f71-421e-a219-c936ab652b63 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Updating instance_info_cache with network_info: [{"id": "c65a8c36-5997-4e67-9fa0-e361b7c334ca", "address": "fa:16:3e:5b:95:82", "network": {"id": "8ac0e70c-84ba-415c-841b-4a5a525b1a9d", "bridge": "br-int", "label": "tempest-network-smoke--597026508", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc65a8c36-59", "ovs_interfaceid": "c65a8c36-5997-4e67-9fa0-e361b7c334ca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:39:16 compute-0 nova_compute[187185]: 2025-11-29 07:39:16.220 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:16 compute-0 nova_compute[187185]: 2025-11-29 07:39:16.881 187189 DEBUG oslo_concurrency.lockutils [req-53736a8e-0148-4f8c-803b-7689e81e054e req-5e17d866-2f71-421e-a219-c936ab652b63 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-f9a714d0-1a3d-4de4-8fc2-fca74c904eff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:39:17 compute-0 nova_compute[187185]: 2025-11-29 07:39:17.380 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:18 compute-0 nova_compute[187185]: 2025-11-29 07:39:18.809 187189 DEBUG nova.compute.manager [req-aba1800e-5c41-4ef8-92a6-8ecd7459fe95 req-f37ab46b-7668-4880-8005-a18ffa4f4654 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Received event network-changed-ef353cc9-1e6a-4c76-9acf-917aecd8f9a6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:39:18 compute-0 nova_compute[187185]: 2025-11-29 07:39:18.809 187189 DEBUG nova.compute.manager [req-aba1800e-5c41-4ef8-92a6-8ecd7459fe95 req-f37ab46b-7668-4880-8005-a18ffa4f4654 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Refreshing instance network info cache due to event network-changed-ef353cc9-1e6a-4c76-9acf-917aecd8f9a6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:39:18 compute-0 nova_compute[187185]: 2025-11-29 07:39:18.810 187189 DEBUG oslo_concurrency.lockutils [req-aba1800e-5c41-4ef8-92a6-8ecd7459fe95 req-f37ab46b-7668-4880-8005-a18ffa4f4654 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:39:18 compute-0 nova_compute[187185]: 2025-11-29 07:39:18.810 187189 DEBUG oslo_concurrency.lockutils [req-aba1800e-5c41-4ef8-92a6-8ecd7459fe95 req-f37ab46b-7668-4880-8005-a18ffa4f4654 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:39:18 compute-0 nova_compute[187185]: 2025-11-29 07:39:18.811 187189 DEBUG nova.network.neutron [req-aba1800e-5c41-4ef8-92a6-8ecd7459fe95 req-f37ab46b-7668-4880-8005-a18ffa4f4654 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Refreshing network info cache for port ef353cc9-1e6a-4c76-9acf-917aecd8f9a6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:39:19 compute-0 nova_compute[187185]: 2025-11-29 07:39:19.729 187189 DEBUG oslo_concurrency.lockutils [None req-ac602948-8282-420a-8c30-0c3c1f0d46c7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:39:19 compute-0 nova_compute[187185]: 2025-11-29 07:39:19.729 187189 DEBUG oslo_concurrency.lockutils [None req-ac602948-8282-420a-8c30-0c3c1f0d46c7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:39:19 compute-0 nova_compute[187185]: 2025-11-29 07:39:19.730 187189 DEBUG oslo_concurrency.lockutils [None req-ac602948-8282-420a-8c30-0c3c1f0d46c7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:39:19 compute-0 nova_compute[187185]: 2025-11-29 07:39:19.731 187189 DEBUG oslo_concurrency.lockutils [None req-ac602948-8282-420a-8c30-0c3c1f0d46c7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:39:19 compute-0 nova_compute[187185]: 2025-11-29 07:39:19.731 187189 DEBUG oslo_concurrency.lockutils [None req-ac602948-8282-420a-8c30-0c3c1f0d46c7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:39:20 compute-0 nova_compute[187185]: 2025-11-29 07:39:20.531 187189 INFO nova.compute.manager [None req-ac602948-8282-420a-8c30-0c3c1f0d46c7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Terminating instance
Nov 29 07:39:20 compute-0 nova_compute[187185]: 2025-11-29 07:39:20.628 187189 DEBUG nova.network.neutron [req-aba1800e-5c41-4ef8-92a6-8ecd7459fe95 req-f37ab46b-7668-4880-8005-a18ffa4f4654 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Updated VIF entry in instance network info cache for port ef353cc9-1e6a-4c76-9acf-917aecd8f9a6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:39:20 compute-0 nova_compute[187185]: 2025-11-29 07:39:20.629 187189 DEBUG nova.network.neutron [req-aba1800e-5c41-4ef8-92a6-8ecd7459fe95 req-f37ab46b-7668-4880-8005-a18ffa4f4654 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Updating instance_info_cache with network_info: [{"id": "ef353cc9-1e6a-4c76-9acf-917aecd8f9a6", "address": "fa:16:3e:bc:8f:e7", "network": {"id": "51013e93-c048-46cc-9a9d-a184eb63e1b4", "bridge": "br-int", "label": "tempest-network-smoke--890595323", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef353cc9-1e", "ovs_interfaceid": "ef353cc9-1e6a-4c76-9acf-917aecd8f9a6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "948b9a73-91cd-4803-b642-d1a75f163368", "address": "fa:16:3e:ac:e6:05", "network": {"id": "3085017e-01d1-448e-9eca-033b34f9e960", "bridge": "br-int", "label": "tempest-network-smoke--1681462854", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feac:e605", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap948b9a73-91", "ovs_interfaceid": "948b9a73-91cd-4803-b642-d1a75f163368", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:39:20 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:20.906 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=58, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=57) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:39:20 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:20.908 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:39:20 compute-0 nova_compute[187185]: 2025-11-29 07:39:20.930 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:21 compute-0 nova_compute[187185]: 2025-11-29 07:39:21.223 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:21 compute-0 nova_compute[187185]: 2025-11-29 07:39:21.477 187189 DEBUG nova.compute.manager [None req-ac602948-8282-420a-8c30-0c3c1f0d46c7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 07:39:21 compute-0 nova_compute[187185]: 2025-11-29 07:39:21.485 187189 DEBUG oslo_concurrency.lockutils [req-aba1800e-5c41-4ef8-92a6-8ecd7459fe95 req-f37ab46b-7668-4880-8005-a18ffa4f4654 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:39:21 compute-0 kernel: tapef353cc9-1e (unregistering): left promiscuous mode
Nov 29 07:39:21 compute-0 NetworkManager[55227]: <info>  [1764401961.5083] device (tapef353cc9-1e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:39:21 compute-0 ovn_controller[95281]: 2025-11-29T07:39:21Z|00514|binding|INFO|Releasing lport ef353cc9-1e6a-4c76-9acf-917aecd8f9a6 from this chassis (sb_readonly=1)
Nov 29 07:39:21 compute-0 nova_compute[187185]: 2025-11-29 07:39:21.514 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:21 compute-0 ovn_controller[95281]: 2025-11-29T07:39:21Z|00515|binding|INFO|Removing iface tapef353cc9-1e ovn-installed in OVS
Nov 29 07:39:21 compute-0 ovn_controller[95281]: 2025-11-29T07:39:21Z|00516|if_status|INFO|Dropped 6 log messages in last 769 seconds (most recently, 769 seconds ago) due to excessive rate
Nov 29 07:39:21 compute-0 ovn_controller[95281]: 2025-11-29T07:39:21Z|00517|if_status|INFO|Not setting lport ef353cc9-1e6a-4c76-9acf-917aecd8f9a6 down as sb is readonly
Nov 29 07:39:21 compute-0 nova_compute[187185]: 2025-11-29 07:39:21.517 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:21 compute-0 kernel: tap948b9a73-91 (unregistering): left promiscuous mode
Nov 29 07:39:21 compute-0 NetworkManager[55227]: <info>  [1764401961.5380] device (tap948b9a73-91): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:39:21 compute-0 nova_compute[187185]: 2025-11-29 07:39:21.539 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:21 compute-0 ovn_controller[95281]: 2025-11-29T07:39:21Z|00518|binding|INFO|Releasing lport 948b9a73-91cd-4803-b642-d1a75f163368 from this chassis (sb_readonly=1)
Nov 29 07:39:21 compute-0 ovn_controller[95281]: 2025-11-29T07:39:21Z|00519|binding|INFO|Removing iface tap948b9a73-91 ovn-installed in OVS
Nov 29 07:39:21 compute-0 nova_compute[187185]: 2025-11-29 07:39:21.547 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:21 compute-0 nova_compute[187185]: 2025-11-29 07:39:21.562 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:21 compute-0 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d0000009a.scope: Deactivated successfully.
Nov 29 07:39:21 compute-0 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d0000009a.scope: Consumed 14.873s CPU time.
Nov 29 07:39:21 compute-0 systemd-machined[153486]: Machine qemu-61-instance-0000009a terminated.
Nov 29 07:39:21 compute-0 nova_compute[187185]: 2025-11-29 07:39:21.709 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:21 compute-0 NetworkManager[55227]: <info>  [1764401961.7185] manager: (tap948b9a73-91): new Tun device (/org/freedesktop/NetworkManager/Devices/271)
Nov 29 07:39:21 compute-0 nova_compute[187185]: 2025-11-29 07:39:21.719 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:21 compute-0 nova_compute[187185]: 2025-11-29 07:39:21.773 187189 INFO nova.virt.libvirt.driver [-] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Instance destroyed successfully.
Nov 29 07:39:21 compute-0 nova_compute[187185]: 2025-11-29 07:39:21.775 187189 DEBUG nova.objects.instance [None req-ac602948-8282-420a-8c30-0c3c1f0d46c7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'resources' on Instance uuid e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:39:21 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:21.910 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '58'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:39:21 compute-0 sshd-session[242072]: error: kex_exchange_identification: read: Connection timed out
Nov 29 07:39:21 compute-0 sshd-session[242072]: banner exchange: Connection from 115.190.187.93 port 56646: Connection timed out
Nov 29 07:39:22 compute-0 nova_compute[187185]: 2025-11-29 07:39:22.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:39:22 compute-0 nova_compute[187185]: 2025-11-29 07:39:22.382 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:23 compute-0 podman[242174]: 2025-11-29 07:39:23.875186635 +0000 UTC m=+0.130508252 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 07:39:23 compute-0 ovn_controller[95281]: 2025-11-29T07:39:23Z|00520|binding|INFO|Setting lport ef353cc9-1e6a-4c76-9acf-917aecd8f9a6 down in Southbound
Nov 29 07:39:23 compute-0 ovn_controller[95281]: 2025-11-29T07:39:23Z|00521|binding|INFO|Setting lport 948b9a73-91cd-4803-b642-d1a75f163368 down in Southbound
Nov 29 07:39:24 compute-0 nova_compute[187185]: 2025-11-29 07:39:24.225 187189 DEBUG nova.virt.libvirt.vif [None req-ac602948-8282-420a-8c30-0c3c1f0d46c7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:38:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1271234496',display_name='tempest-TestGettingAddress-server-1271234496',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1271234496',id=154,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB3is4E+83iEsFAN3k4vmM3DIB3FGnja3FrFYLTxp4hwLSgaHjf7h1x9RWYq0bVKXNWPDcV4Et8a4J2p23tZrThcwleGMa0jsxDB4wzeuHiJCpifdffKRCxdxwXJAAdszQ==',key_name='tempest-TestGettingAddress-369881010',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:38:43Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-mzjvfdsh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:38:43Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ef353cc9-1e6a-4c76-9acf-917aecd8f9a6", "address": "fa:16:3e:bc:8f:e7", "network": {"id": "51013e93-c048-46cc-9a9d-a184eb63e1b4", "bridge": "br-int", "label": "tempest-network-smoke--890595323", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef353cc9-1e", "ovs_interfaceid": "ef353cc9-1e6a-4c76-9acf-917aecd8f9a6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:39:24 compute-0 nova_compute[187185]: 2025-11-29 07:39:24.225 187189 DEBUG nova.network.os_vif_util [None req-ac602948-8282-420a-8c30-0c3c1f0d46c7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "ef353cc9-1e6a-4c76-9acf-917aecd8f9a6", "address": "fa:16:3e:bc:8f:e7", "network": {"id": "51013e93-c048-46cc-9a9d-a184eb63e1b4", "bridge": "br-int", "label": "tempest-network-smoke--890595323", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef353cc9-1e", "ovs_interfaceid": "ef353cc9-1e6a-4c76-9acf-917aecd8f9a6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:39:24 compute-0 nova_compute[187185]: 2025-11-29 07:39:24.226 187189 DEBUG nova.network.os_vif_util [None req-ac602948-8282-420a-8c30-0c3c1f0d46c7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:bc:8f:e7,bridge_name='br-int',has_traffic_filtering=True,id=ef353cc9-1e6a-4c76-9acf-917aecd8f9a6,network=Network(51013e93-c048-46cc-9a9d-a184eb63e1b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef353cc9-1e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:39:24 compute-0 nova_compute[187185]: 2025-11-29 07:39:24.227 187189 DEBUG os_vif [None req-ac602948-8282-420a-8c30-0c3c1f0d46c7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:bc:8f:e7,bridge_name='br-int',has_traffic_filtering=True,id=ef353cc9-1e6a-4c76-9acf-917aecd8f9a6,network=Network(51013e93-c048-46cc-9a9d-a184eb63e1b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef353cc9-1e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:39:24 compute-0 nova_compute[187185]: 2025-11-29 07:39:24.229 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:24 compute-0 nova_compute[187185]: 2025-11-29 07:39:24.230 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapef353cc9-1e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:39:24 compute-0 nova_compute[187185]: 2025-11-29 07:39:24.232 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:24 compute-0 nova_compute[187185]: 2025-11-29 07:39:24.234 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:39:24 compute-0 nova_compute[187185]: 2025-11-29 07:39:24.242 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:24 compute-0 nova_compute[187185]: 2025-11-29 07:39:24.249 187189 INFO os_vif [None req-ac602948-8282-420a-8c30-0c3c1f0d46c7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:bc:8f:e7,bridge_name='br-int',has_traffic_filtering=True,id=ef353cc9-1e6a-4c76-9acf-917aecd8f9a6,network=Network(51013e93-c048-46cc-9a9d-a184eb63e1b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef353cc9-1e')
Nov 29 07:39:24 compute-0 nova_compute[187185]: 2025-11-29 07:39:24.251 187189 DEBUG nova.virt.libvirt.vif [None req-ac602948-8282-420a-8c30-0c3c1f0d46c7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:38:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1271234496',display_name='tempest-TestGettingAddress-server-1271234496',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1271234496',id=154,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB3is4E+83iEsFAN3k4vmM3DIB3FGnja3FrFYLTxp4hwLSgaHjf7h1x9RWYq0bVKXNWPDcV4Et8a4J2p23tZrThcwleGMa0jsxDB4wzeuHiJCpifdffKRCxdxwXJAAdszQ==',key_name='tempest-TestGettingAddress-369881010',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:38:43Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-mzjvfdsh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:38:43Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "948b9a73-91cd-4803-b642-d1a75f163368", "address": "fa:16:3e:ac:e6:05", "network": {"id": "3085017e-01d1-448e-9eca-033b34f9e960", "bridge": "br-int", "label": "tempest-network-smoke--1681462854", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feac:e605", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap948b9a73-91", "ovs_interfaceid": "948b9a73-91cd-4803-b642-d1a75f163368", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:39:24 compute-0 nova_compute[187185]: 2025-11-29 07:39:24.252 187189 DEBUG nova.network.os_vif_util [None req-ac602948-8282-420a-8c30-0c3c1f0d46c7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "948b9a73-91cd-4803-b642-d1a75f163368", "address": "fa:16:3e:ac:e6:05", "network": {"id": "3085017e-01d1-448e-9eca-033b34f9e960", "bridge": "br-int", "label": "tempest-network-smoke--1681462854", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feac:e605", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap948b9a73-91", "ovs_interfaceid": "948b9a73-91cd-4803-b642-d1a75f163368", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:39:24 compute-0 nova_compute[187185]: 2025-11-29 07:39:24.253 187189 DEBUG nova.network.os_vif_util [None req-ac602948-8282-420a-8c30-0c3c1f0d46c7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:e6:05,bridge_name='br-int',has_traffic_filtering=True,id=948b9a73-91cd-4803-b642-d1a75f163368,network=Network(3085017e-01d1-448e-9eca-033b34f9e960),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap948b9a73-91') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:39:24 compute-0 nova_compute[187185]: 2025-11-29 07:39:24.254 187189 DEBUG os_vif [None req-ac602948-8282-420a-8c30-0c3c1f0d46c7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:e6:05,bridge_name='br-int',has_traffic_filtering=True,id=948b9a73-91cd-4803-b642-d1a75f163368,network=Network(3085017e-01d1-448e-9eca-033b34f9e960),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap948b9a73-91') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:39:24 compute-0 nova_compute[187185]: 2025-11-29 07:39:24.257 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:24 compute-0 nova_compute[187185]: 2025-11-29 07:39:24.257 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap948b9a73-91, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:39:24 compute-0 nova_compute[187185]: 2025-11-29 07:39:24.259 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:24 compute-0 nova_compute[187185]: 2025-11-29 07:39:24.261 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:24 compute-0 nova_compute[187185]: 2025-11-29 07:39:24.264 187189 INFO os_vif [None req-ac602948-8282-420a-8c30-0c3c1f0d46c7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:e6:05,bridge_name='br-int',has_traffic_filtering=True,id=948b9a73-91cd-4803-b642-d1a75f163368,network=Network(3085017e-01d1-448e-9eca-033b34f9e960),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap948b9a73-91')
Nov 29 07:39:24 compute-0 nova_compute[187185]: 2025-11-29 07:39:24.265 187189 INFO nova.virt.libvirt.driver [None req-ac602948-8282-420a-8c30-0c3c1f0d46c7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Deleting instance files /var/lib/nova/instances/e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d_del
Nov 29 07:39:24 compute-0 nova_compute[187185]: 2025-11-29 07:39:24.266 187189 INFO nova.virt.libvirt.driver [None req-ac602948-8282-420a-8c30-0c3c1f0d46c7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Deletion of /var/lib/nova/instances/e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d_del complete
Nov 29 07:39:24 compute-0 nova_compute[187185]: 2025-11-29 07:39:24.318 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:39:24 compute-0 nova_compute[187185]: 2025-11-29 07:39:24.318 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:39:24 compute-0 nova_compute[187185]: 2025-11-29 07:39:24.319 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:39:25 compute-0 nova_compute[187185]: 2025-11-29 07:39:25.491 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875
Nov 29 07:39:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:25.495 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:e6:05 2001:db8::f816:3eff:feac:e605'], port_security=['fa:16:3e:ac:e6:05 2001:db8::f816:3eff:feac:e605'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:feac:e605/64', 'neutron:device_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3085017e-01d1-448e-9eca-033b34f9e960', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b28f67ac-b290-4be5-88df-4393a9d30b89', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3544b146-f2db-4cb1-8570-c2c0dfd4e173, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=948b9a73-91cd-4803-b642-d1a75f163368) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:39:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:25.497 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bc:8f:e7 10.100.0.14'], port_security=['fa:16:3e:bc:8f:e7 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-51013e93-c048-46cc-9a9d-a184eb63e1b4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b28f67ac-b290-4be5-88df-4393a9d30b89', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a7c7b7bb-9c39-4f1c-a218-71c5fbf31db4, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=ef353cc9-1e6a-4c76-9acf-917aecd8f9a6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:39:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:25.498 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 948b9a73-91cd-4803-b642-d1a75f163368 in datapath 3085017e-01d1-448e-9eca-033b34f9e960 unbound from our chassis
Nov 29 07:39:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:25.500 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3085017e-01d1-448e-9eca-033b34f9e960, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:39:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:25.502 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[f551cd71-30bc-41f7-88cb-95815a44b68d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:39:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:25.503 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3085017e-01d1-448e-9eca-033b34f9e960 namespace which is not needed anymore
Nov 29 07:39:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:25.529 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:39:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:25.530 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:39:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:25.531 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:39:25 compute-0 neutron-haproxy-ovnmeta-3085017e-01d1-448e-9eca-033b34f9e960[241936]: [NOTICE]   (241941) : haproxy version is 2.8.14-c23fe91
Nov 29 07:39:25 compute-0 neutron-haproxy-ovnmeta-3085017e-01d1-448e-9eca-033b34f9e960[241936]: [NOTICE]   (241941) : path to executable is /usr/sbin/haproxy
Nov 29 07:39:25 compute-0 neutron-haproxy-ovnmeta-3085017e-01d1-448e-9eca-033b34f9e960[241936]: [WARNING]  (241941) : Exiting Master process...
Nov 29 07:39:25 compute-0 neutron-haproxy-ovnmeta-3085017e-01d1-448e-9eca-033b34f9e960[241936]: [ALERT]    (241941) : Current worker (241944) exited with code 143 (Terminated)
Nov 29 07:39:25 compute-0 neutron-haproxy-ovnmeta-3085017e-01d1-448e-9eca-033b34f9e960[241936]: [WARNING]  (241941) : All workers exited. Exiting... (0)
Nov 29 07:39:25 compute-0 systemd[1]: libpod-231ae1015c261439746df1967e65cf55f1082736175fc624e88a33d76e16f997.scope: Deactivated successfully.
Nov 29 07:39:25 compute-0 podman[242220]: 2025-11-29 07:39:25.75812079 +0000 UTC m=+0.052846470 container died 231ae1015c261439746df1967e65cf55f1082736175fc624e88a33d76e16f997 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3085017e-01d1-448e-9eca-033b34f9e960, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 29 07:39:25 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-231ae1015c261439746df1967e65cf55f1082736175fc624e88a33d76e16f997-userdata-shm.mount: Deactivated successfully.
Nov 29 07:39:25 compute-0 systemd[1]: var-lib-containers-storage-overlay-c47aacbac2e405d8881d795db883c9665cb9b6376c54744ab1021ab71213e7fd-merged.mount: Deactivated successfully.
Nov 29 07:39:25 compute-0 podman[242220]: 2025-11-29 07:39:25.806004948 +0000 UTC m=+0.100730628 container cleanup 231ae1015c261439746df1967e65cf55f1082736175fc624e88a33d76e16f997 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3085017e-01d1-448e-9eca-033b34f9e960, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 07:39:25 compute-0 systemd[1]: libpod-conmon-231ae1015c261439746df1967e65cf55f1082736175fc624e88a33d76e16f997.scope: Deactivated successfully.
Nov 29 07:39:25 compute-0 podman[242251]: 2025-11-29 07:39:25.890120024 +0000 UTC m=+0.056550835 container remove 231ae1015c261439746df1967e65cf55f1082736175fc624e88a33d76e16f997 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3085017e-01d1-448e-9eca-033b34f9e960, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 29 07:39:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:25.897 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[3f9d796d-a5b6-49b2-9abb-ac1b65b9bd18]: (4, ('Sat Nov 29 07:39:25 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-3085017e-01d1-448e-9eca-033b34f9e960 (231ae1015c261439746df1967e65cf55f1082736175fc624e88a33d76e16f997)\n231ae1015c261439746df1967e65cf55f1082736175fc624e88a33d76e16f997\nSat Nov 29 07:39:25 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-3085017e-01d1-448e-9eca-033b34f9e960 (231ae1015c261439746df1967e65cf55f1082736175fc624e88a33d76e16f997)\n231ae1015c261439746df1967e65cf55f1082736175fc624e88a33d76e16f997\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:39:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:25.900 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6b736697-5110-4fe4-8297-d0f925f9b5b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:39:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:25.902 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3085017e-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:39:25 compute-0 nova_compute[187185]: 2025-11-29 07:39:25.905 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:25 compute-0 kernel: tap3085017e-00: left promiscuous mode
Nov 29 07:39:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:25.914 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d67392e1-7e1a-4512-9256-13051d1d2df2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:39:25 compute-0 nova_compute[187185]: 2025-11-29 07:39:25.924 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:25.932 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[5813c3d4-70e1-4430-a73e-0036cc18cd0c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:39:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:25.935 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[daeae938-944f-4efa-94ba-be96818f4208]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:39:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:25.966 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[8e190be0-264c-4184-923f-06d9ee787cff]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 737162, 'reachable_time': 39403, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242266, 'error': None, 'target': 'ovnmeta-3085017e-01d1-448e-9eca-033b34f9e960', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:39:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:25.970 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3085017e-01d1-448e-9eca-033b34f9e960 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:39:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:25.970 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[03040b9f-5426-48ed-aa73-266d4cc4c758]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:39:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:25.972 104254 INFO neutron.agent.ovn.metadata.agent [-] Port ef353cc9-1e6a-4c76-9acf-917aecd8f9a6 in datapath 51013e93-c048-46cc-9a9d-a184eb63e1b4 unbound from our chassis
Nov 29 07:39:25 compute-0 systemd[1]: run-netns-ovnmeta\x2d3085017e\x2d01d1\x2d448e\x2d9eca\x2d033b34f9e960.mount: Deactivated successfully.
Nov 29 07:39:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:25.975 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 51013e93-c048-46cc-9a9d-a184eb63e1b4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:39:25 compute-0 nova_compute[187185]: 2025-11-29 07:39:25.975 187189 INFO nova.compute.manager [None req-ac602948-8282-420a-8c30-0c3c1f0d46c7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Took 4.50 seconds to destroy the instance on the hypervisor.
Nov 29 07:39:25 compute-0 nova_compute[187185]: 2025-11-29 07:39:25.976 187189 DEBUG oslo.service.loopingcall [None req-ac602948-8282-420a-8c30-0c3c1f0d46c7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 07:39:25 compute-0 nova_compute[187185]: 2025-11-29 07:39:25.976 187189 DEBUG nova.compute.manager [-] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 07:39:25 compute-0 nova_compute[187185]: 2025-11-29 07:39:25.977 187189 DEBUG nova.network.neutron [-] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 07:39:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:25.977 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[5f5db2dd-7624-4330-9834-099f59eb0870]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:39:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:25.977 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-51013e93-c048-46cc-9a9d-a184eb63e1b4 namespace which is not needed anymore
Nov 29 07:39:26 compute-0 neutron-haproxy-ovnmeta-51013e93-c048-46cc-9a9d-a184eb63e1b4[241803]: [NOTICE]   (241846) : haproxy version is 2.8.14-c23fe91
Nov 29 07:39:26 compute-0 neutron-haproxy-ovnmeta-51013e93-c048-46cc-9a9d-a184eb63e1b4[241803]: [NOTICE]   (241846) : path to executable is /usr/sbin/haproxy
Nov 29 07:39:26 compute-0 neutron-haproxy-ovnmeta-51013e93-c048-46cc-9a9d-a184eb63e1b4[241803]: [WARNING]  (241846) : Exiting Master process...
Nov 29 07:39:26 compute-0 neutron-haproxy-ovnmeta-51013e93-c048-46cc-9a9d-a184eb63e1b4[241803]: [WARNING]  (241846) : Exiting Master process...
Nov 29 07:39:26 compute-0 neutron-haproxy-ovnmeta-51013e93-c048-46cc-9a9d-a184eb63e1b4[241803]: [ALERT]    (241846) : Current worker (241852) exited with code 143 (Terminated)
Nov 29 07:39:26 compute-0 neutron-haproxy-ovnmeta-51013e93-c048-46cc-9a9d-a184eb63e1b4[241803]: [WARNING]  (241846) : All workers exited. Exiting... (0)
Nov 29 07:39:26 compute-0 systemd[1]: libpod-2f8b8cb10efd6f617c050ccc794cd01e10282866102863fa7d0ae1730bb49d85.scope: Deactivated successfully.
Nov 29 07:39:26 compute-0 podman[242288]: 2025-11-29 07:39:26.150590692 +0000 UTC m=+0.055535817 container died 2f8b8cb10efd6f617c050ccc794cd01e10282866102863fa7d0ae1730bb49d85 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-51013e93-c048-46cc-9a9d-a184eb63e1b4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:39:26 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2f8b8cb10efd6f617c050ccc794cd01e10282866102863fa7d0ae1730bb49d85-userdata-shm.mount: Deactivated successfully.
Nov 29 07:39:26 compute-0 systemd[1]: var-lib-containers-storage-overlay-29f05bc38d74387f45bcacf18fa6071e137d578080ed116d64f3a98494023b97-merged.mount: Deactivated successfully.
Nov 29 07:39:26 compute-0 podman[242288]: 2025-11-29 07:39:26.187365755 +0000 UTC m=+0.092310850 container cleanup 2f8b8cb10efd6f617c050ccc794cd01e10282866102863fa7d0ae1730bb49d85 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-51013e93-c048-46cc-9a9d-a184eb63e1b4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 07:39:26 compute-0 systemd[1]: libpod-conmon-2f8b8cb10efd6f617c050ccc794cd01e10282866102863fa7d0ae1730bb49d85.scope: Deactivated successfully.
Nov 29 07:39:26 compute-0 podman[242319]: 2025-11-29 07:39:26.263238797 +0000 UTC m=+0.051929624 container remove 2f8b8cb10efd6f617c050ccc794cd01e10282866102863fa7d0ae1730bb49d85 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-51013e93-c048-46cc-9a9d-a184eb63e1b4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:39:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:26.270 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[5ae55480-ed68-45cf-a357-355611b81019]: (4, ('Sat Nov 29 07:39:26 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-51013e93-c048-46cc-9a9d-a184eb63e1b4 (2f8b8cb10efd6f617c050ccc794cd01e10282866102863fa7d0ae1730bb49d85)\n2f8b8cb10efd6f617c050ccc794cd01e10282866102863fa7d0ae1730bb49d85\nSat Nov 29 07:39:26 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-51013e93-c048-46cc-9a9d-a184eb63e1b4 (2f8b8cb10efd6f617c050ccc794cd01e10282866102863fa7d0ae1730bb49d85)\n2f8b8cb10efd6f617c050ccc794cd01e10282866102863fa7d0ae1730bb49d85\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:39:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:26.272 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[3e164ef8-01a6-43c2-8df7-75953b57e701]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:39:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:26.273 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap51013e93-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:39:26 compute-0 kernel: tap51013e93-c0: left promiscuous mode
Nov 29 07:39:26 compute-0 nova_compute[187185]: 2025-11-29 07:39:26.276 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:26 compute-0 nova_compute[187185]: 2025-11-29 07:39:26.290 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:26.292 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[cc556d4b-71b6-4c61-ad86-c06a54eabae5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:39:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:26.308 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[1cfb86fc-de59-4b83-9ee0-f91e7e04ab8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:39:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:26.310 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[7863649c-28fd-4009-af42-c9f278bdbe8f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:39:26 compute-0 nova_compute[187185]: 2025-11-29 07:39:26.328 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "refresh_cache-f9a714d0-1a3d-4de4-8fc2-fca74c904eff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:39:26 compute-0 nova_compute[187185]: 2025-11-29 07:39:26.328 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquired lock "refresh_cache-f9a714d0-1a3d-4de4-8fc2-fca74c904eff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:39:26 compute-0 nova_compute[187185]: 2025-11-29 07:39:26.328 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 29 07:39:26 compute-0 nova_compute[187185]: 2025-11-29 07:39:26.329 187189 DEBUG nova.objects.instance [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9a714d0-1a3d-4de4-8fc2-fca74c904eff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:39:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:26.333 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6b5b3acd-fe5f-4de4-a718-b0277fe061bd]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 736999, 'reachable_time': 20435, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242335, 'error': None, 'target': 'ovnmeta-51013e93-c048-46cc-9a9d-a184eb63e1b4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:39:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:26.335 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-51013e93-c048-46cc-9a9d-a184eb63e1b4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:39:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:26.336 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[0b8fd1e7-e98e-46a8-92ad-35793f3630c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:39:26 compute-0 systemd[1]: run-netns-ovnmeta\x2d51013e93\x2dc048\x2d46cc\x2d9a9d\x2da184eb63e1b4.mount: Deactivated successfully.
Nov 29 07:39:26 compute-0 nova_compute[187185]: 2025-11-29 07:39:26.827 187189 DEBUG nova.compute.manager [req-eb7554f7-a84c-4c93-aa20-24333dfd052e req-89bbdd00-7113-4848-ba5e-ef3c2a63fa42 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Received event network-vif-unplugged-ef353cc9-1e6a-4c76-9acf-917aecd8f9a6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:39:26 compute-0 nova_compute[187185]: 2025-11-29 07:39:26.828 187189 DEBUG oslo_concurrency.lockutils [req-eb7554f7-a84c-4c93-aa20-24333dfd052e req-89bbdd00-7113-4848-ba5e-ef3c2a63fa42 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:39:26 compute-0 nova_compute[187185]: 2025-11-29 07:39:26.828 187189 DEBUG oslo_concurrency.lockutils [req-eb7554f7-a84c-4c93-aa20-24333dfd052e req-89bbdd00-7113-4848-ba5e-ef3c2a63fa42 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:39:26 compute-0 nova_compute[187185]: 2025-11-29 07:39:26.828 187189 DEBUG oslo_concurrency.lockutils [req-eb7554f7-a84c-4c93-aa20-24333dfd052e req-89bbdd00-7113-4848-ba5e-ef3c2a63fa42 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:39:26 compute-0 nova_compute[187185]: 2025-11-29 07:39:26.828 187189 DEBUG nova.compute.manager [req-eb7554f7-a84c-4c93-aa20-24333dfd052e req-89bbdd00-7113-4848-ba5e-ef3c2a63fa42 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] No waiting events found dispatching network-vif-unplugged-ef353cc9-1e6a-4c76-9acf-917aecd8f9a6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:39:26 compute-0 nova_compute[187185]: 2025-11-29 07:39:26.829 187189 DEBUG nova.compute.manager [req-eb7554f7-a84c-4c93-aa20-24333dfd052e req-89bbdd00-7113-4848-ba5e-ef3c2a63fa42 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Received event network-vif-unplugged-ef353cc9-1e6a-4c76-9acf-917aecd8f9a6 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 07:39:27 compute-0 nova_compute[187185]: 2025-11-29 07:39:27.384 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:27 compute-0 nova_compute[187185]: 2025-11-29 07:39:27.773 187189 DEBUG nova.network.neutron [-] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:39:27 compute-0 nova_compute[187185]: 2025-11-29 07:39:27.794 187189 INFO nova.compute.manager [-] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Took 1.82 seconds to deallocate network for instance.
Nov 29 07:39:27 compute-0 nova_compute[187185]: 2025-11-29 07:39:27.898 187189 DEBUG oslo_concurrency.lockutils [None req-ac602948-8282-420a-8c30-0c3c1f0d46c7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:39:27 compute-0 nova_compute[187185]: 2025-11-29 07:39:27.899 187189 DEBUG oslo_concurrency.lockutils [None req-ac602948-8282-420a-8c30-0c3c1f0d46c7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:39:27 compute-0 nova_compute[187185]: 2025-11-29 07:39:27.906 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Updating instance_info_cache with network_info: [{"id": "c65a8c36-5997-4e67-9fa0-e361b7c334ca", "address": "fa:16:3e:5b:95:82", "network": {"id": "8ac0e70c-84ba-415c-841b-4a5a525b1a9d", "bridge": "br-int", "label": "tempest-network-smoke--597026508", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc65a8c36-59", "ovs_interfaceid": "c65a8c36-5997-4e67-9fa0-e361b7c334ca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:39:27 compute-0 nova_compute[187185]: 2025-11-29 07:39:27.926 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Releasing lock "refresh_cache-f9a714d0-1a3d-4de4-8fc2-fca74c904eff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:39:27 compute-0 nova_compute[187185]: 2025-11-29 07:39:27.926 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 29 07:39:27 compute-0 nova_compute[187185]: 2025-11-29 07:39:27.927 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:39:27 compute-0 nova_compute[187185]: 2025-11-29 07:39:27.928 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:39:27 compute-0 nova_compute[187185]: 2025-11-29 07:39:27.948 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:39:27 compute-0 nova_compute[187185]: 2025-11-29 07:39:27.995 187189 DEBUG nova.compute.provider_tree [None req-ac602948-8282-420a-8c30-0c3c1f0d46c7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:39:28 compute-0 nova_compute[187185]: 2025-11-29 07:39:28.010 187189 DEBUG nova.scheduler.client.report [None req-ac602948-8282-420a-8c30-0c3c1f0d46c7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:39:28 compute-0 nova_compute[187185]: 2025-11-29 07:39:28.039 187189 DEBUG oslo_concurrency.lockutils [None req-ac602948-8282-420a-8c30-0c3c1f0d46c7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:39:28 compute-0 nova_compute[187185]: 2025-11-29 07:39:28.041 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:39:28 compute-0 nova_compute[187185]: 2025-11-29 07:39:28.042 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:39:28 compute-0 nova_compute[187185]: 2025-11-29 07:39:28.042 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:39:28 compute-0 nova_compute[187185]: 2025-11-29 07:39:28.090 187189 INFO nova.scheduler.client.report [None req-ac602948-8282-420a-8c30-0c3c1f0d46c7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Deleted allocations for instance e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d
Nov 29 07:39:28 compute-0 nova_compute[187185]: 2025-11-29 07:39:28.140 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f9a714d0-1a3d-4de4-8fc2-fca74c904eff/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:39:28 compute-0 nova_compute[187185]: 2025-11-29 07:39:28.189 187189 DEBUG oslo_concurrency.lockutils [None req-ac602948-8282-420a-8c30-0c3c1f0d46c7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 8.459s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:39:28 compute-0 nova_compute[187185]: 2025-11-29 07:39:28.209 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f9a714d0-1a3d-4de4-8fc2-fca74c904eff/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:39:28 compute-0 nova_compute[187185]: 2025-11-29 07:39:28.210 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f9a714d0-1a3d-4de4-8fc2-fca74c904eff/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:39:28 compute-0 nova_compute[187185]: 2025-11-29 07:39:28.276 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f9a714d0-1a3d-4de4-8fc2-fca74c904eff/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:39:28 compute-0 nova_compute[187185]: 2025-11-29 07:39:28.467 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:39:28 compute-0 nova_compute[187185]: 2025-11-29 07:39:28.469 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5542MB free_disk=73.22529220581055GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:39:28 compute-0 nova_compute[187185]: 2025-11-29 07:39:28.470 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:39:28 compute-0 nova_compute[187185]: 2025-11-29 07:39:28.470 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:39:28 compute-0 nova_compute[187185]: 2025-11-29 07:39:28.556 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance f9a714d0-1a3d-4de4-8fc2-fca74c904eff actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 07:39:28 compute-0 nova_compute[187185]: 2025-11-29 07:39:28.556 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:39:28 compute-0 nova_compute[187185]: 2025-11-29 07:39:28.557 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:39:28 compute-0 nova_compute[187185]: 2025-11-29 07:39:28.613 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:39:28 compute-0 nova_compute[187185]: 2025-11-29 07:39:28.638 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:39:28 compute-0 nova_compute[187185]: 2025-11-29 07:39:28.671 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:39:28 compute-0 nova_compute[187185]: 2025-11-29 07:39:28.672 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.202s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:39:28 compute-0 nova_compute[187185]: 2025-11-29 07:39:28.948 187189 DEBUG nova.compute.manager [req-efa381b2-08fd-4925-8a82-7dfe9dc1e040 req-d1e0838f-60ba-4a69-be91-e9bc6f6f27d7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Received event network-vif-plugged-ef353cc9-1e6a-4c76-9acf-917aecd8f9a6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:39:28 compute-0 nova_compute[187185]: 2025-11-29 07:39:28.949 187189 DEBUG oslo_concurrency.lockutils [req-efa381b2-08fd-4925-8a82-7dfe9dc1e040 req-d1e0838f-60ba-4a69-be91-e9bc6f6f27d7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:39:28 compute-0 nova_compute[187185]: 2025-11-29 07:39:28.951 187189 DEBUG oslo_concurrency.lockutils [req-efa381b2-08fd-4925-8a82-7dfe9dc1e040 req-d1e0838f-60ba-4a69-be91-e9bc6f6f27d7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:39:28 compute-0 nova_compute[187185]: 2025-11-29 07:39:28.952 187189 DEBUG oslo_concurrency.lockutils [req-efa381b2-08fd-4925-8a82-7dfe9dc1e040 req-d1e0838f-60ba-4a69-be91-e9bc6f6f27d7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:39:28 compute-0 nova_compute[187185]: 2025-11-29 07:39:28.952 187189 DEBUG nova.compute.manager [req-efa381b2-08fd-4925-8a82-7dfe9dc1e040 req-d1e0838f-60ba-4a69-be91-e9bc6f6f27d7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] No waiting events found dispatching network-vif-plugged-ef353cc9-1e6a-4c76-9acf-917aecd8f9a6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:39:28 compute-0 nova_compute[187185]: 2025-11-29 07:39:28.952 187189 WARNING nova.compute.manager [req-efa381b2-08fd-4925-8a82-7dfe9dc1e040 req-d1e0838f-60ba-4a69-be91-e9bc6f6f27d7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Received unexpected event network-vif-plugged-ef353cc9-1e6a-4c76-9acf-917aecd8f9a6 for instance with vm_state deleted and task_state None.
Nov 29 07:39:29 compute-0 nova_compute[187185]: 2025-11-29 07:39:29.061 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:39:29 compute-0 nova_compute[187185]: 2025-11-29 07:39:29.062 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:39:29 compute-0 nova_compute[187185]: 2025-11-29 07:39:29.079 187189 DEBUG nova.compute.manager [req-163cb3a1-c7a8-4f6b-a776-e29ef9ae5086 req-cbd90d80-9c64-4dff-a2ae-94b3ac291045 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Received event network-vif-unplugged-948b9a73-91cd-4803-b642-d1a75f163368 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:39:29 compute-0 nova_compute[187185]: 2025-11-29 07:39:29.079 187189 DEBUG oslo_concurrency.lockutils [req-163cb3a1-c7a8-4f6b-a776-e29ef9ae5086 req-cbd90d80-9c64-4dff-a2ae-94b3ac291045 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:39:29 compute-0 nova_compute[187185]: 2025-11-29 07:39:29.080 187189 DEBUG oslo_concurrency.lockutils [req-163cb3a1-c7a8-4f6b-a776-e29ef9ae5086 req-cbd90d80-9c64-4dff-a2ae-94b3ac291045 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:39:29 compute-0 nova_compute[187185]: 2025-11-29 07:39:29.080 187189 DEBUG oslo_concurrency.lockutils [req-163cb3a1-c7a8-4f6b-a776-e29ef9ae5086 req-cbd90d80-9c64-4dff-a2ae-94b3ac291045 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:39:29 compute-0 nova_compute[187185]: 2025-11-29 07:39:29.081 187189 DEBUG nova.compute.manager [req-163cb3a1-c7a8-4f6b-a776-e29ef9ae5086 req-cbd90d80-9c64-4dff-a2ae-94b3ac291045 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] No waiting events found dispatching network-vif-unplugged-948b9a73-91cd-4803-b642-d1a75f163368 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:39:29 compute-0 nova_compute[187185]: 2025-11-29 07:39:29.081 187189 WARNING nova.compute.manager [req-163cb3a1-c7a8-4f6b-a776-e29ef9ae5086 req-cbd90d80-9c64-4dff-a2ae-94b3ac291045 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Received unexpected event network-vif-unplugged-948b9a73-91cd-4803-b642-d1a75f163368 for instance with vm_state deleted and task_state None.
Nov 29 07:39:29 compute-0 nova_compute[187185]: 2025-11-29 07:39:29.081 187189 DEBUG nova.compute.manager [req-163cb3a1-c7a8-4f6b-a776-e29ef9ae5086 req-cbd90d80-9c64-4dff-a2ae-94b3ac291045 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Received event network-vif-plugged-948b9a73-91cd-4803-b642-d1a75f163368 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:39:29 compute-0 nova_compute[187185]: 2025-11-29 07:39:29.081 187189 DEBUG oslo_concurrency.lockutils [req-163cb3a1-c7a8-4f6b-a776-e29ef9ae5086 req-cbd90d80-9c64-4dff-a2ae-94b3ac291045 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:39:29 compute-0 nova_compute[187185]: 2025-11-29 07:39:29.082 187189 DEBUG oslo_concurrency.lockutils [req-163cb3a1-c7a8-4f6b-a776-e29ef9ae5086 req-cbd90d80-9c64-4dff-a2ae-94b3ac291045 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:39:29 compute-0 nova_compute[187185]: 2025-11-29 07:39:29.082 187189 DEBUG oslo_concurrency.lockutils [req-163cb3a1-c7a8-4f6b-a776-e29ef9ae5086 req-cbd90d80-9c64-4dff-a2ae-94b3ac291045 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:39:29 compute-0 nova_compute[187185]: 2025-11-29 07:39:29.082 187189 DEBUG nova.compute.manager [req-163cb3a1-c7a8-4f6b-a776-e29ef9ae5086 req-cbd90d80-9c64-4dff-a2ae-94b3ac291045 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] No waiting events found dispatching network-vif-plugged-948b9a73-91cd-4803-b642-d1a75f163368 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:39:29 compute-0 nova_compute[187185]: 2025-11-29 07:39:29.083 187189 WARNING nova.compute.manager [req-163cb3a1-c7a8-4f6b-a776-e29ef9ae5086 req-cbd90d80-9c64-4dff-a2ae-94b3ac291045 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Received unexpected event network-vif-plugged-948b9a73-91cd-4803-b642-d1a75f163368 for instance with vm_state deleted and task_state None.
Nov 29 07:39:29 compute-0 nova_compute[187185]: 2025-11-29 07:39:29.083 187189 DEBUG nova.compute.manager [req-163cb3a1-c7a8-4f6b-a776-e29ef9ae5086 req-cbd90d80-9c64-4dff-a2ae-94b3ac291045 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Received event network-vif-deleted-948b9a73-91cd-4803-b642-d1a75f163368 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:39:29 compute-0 nova_compute[187185]: 2025-11-29 07:39:29.083 187189 DEBUG nova.compute.manager [req-163cb3a1-c7a8-4f6b-a776-e29ef9ae5086 req-cbd90d80-9c64-4dff-a2ae-94b3ac291045 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Received event network-vif-deleted-ef353cc9-1e6a-4c76-9acf-917aecd8f9a6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:39:29 compute-0 nova_compute[187185]: 2025-11-29 07:39:29.261 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:29 compute-0 nova_compute[187185]: 2025-11-29 07:39:29.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:39:29 compute-0 nova_compute[187185]: 2025-11-29 07:39:29.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:39:30 compute-0 podman[242343]: 2025-11-29 07:39:30.828059488 +0000 UTC m=+0.079241978 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 07:39:31 compute-0 nova_compute[187185]: 2025-11-29 07:39:31.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:39:32 compute-0 nova_compute[187185]: 2025-11-29 07:39:32.388 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:33 compute-0 nova_compute[187185]: 2025-11-29 07:39:33.794 187189 DEBUG oslo_concurrency.lockutils [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "fab512a0-a8b3-423a-bcf4-58a43bc605e5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:39:33 compute-0 nova_compute[187185]: 2025-11-29 07:39:33.794 187189 DEBUG oslo_concurrency.lockutils [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "fab512a0-a8b3-423a-bcf4-58a43bc605e5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:39:33 compute-0 podman[242368]: 2025-11-29 07:39:33.804326033 +0000 UTC m=+0.068134983 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:39:33 compute-0 podman[242369]: 2025-11-29 07:39:33.828407716 +0000 UTC m=+0.090603510 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 07:39:33 compute-0 nova_compute[187185]: 2025-11-29 07:39:33.944 187189 DEBUG nova.compute.manager [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 07:39:34 compute-0 nova_compute[187185]: 2025-11-29 07:39:34.311 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:39:34 compute-0 nova_compute[187185]: 2025-11-29 07:39:34.314 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:35 compute-0 nova_compute[187185]: 2025-11-29 07:39:35.411 187189 DEBUG oslo_concurrency.lockutils [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:39:35 compute-0 nova_compute[187185]: 2025-11-29 07:39:35.413 187189 DEBUG oslo_concurrency.lockutils [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:39:35 compute-0 nova_compute[187185]: 2025-11-29 07:39:35.423 187189 DEBUG nova.virt.hardware [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 07:39:35 compute-0 nova_compute[187185]: 2025-11-29 07:39:35.423 187189 INFO nova.compute.claims [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Claim successful on node compute-0.ctlplane.example.com
Nov 29 07:39:36 compute-0 nova_compute[187185]: 2025-11-29 07:39:36.771 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764401961.7697332, e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:39:36 compute-0 nova_compute[187185]: 2025-11-29 07:39:36.772 187189 INFO nova.compute.manager [-] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] VM Stopped (Lifecycle Event)
Nov 29 07:39:36 compute-0 nova_compute[187185]: 2025-11-29 07:39:36.885 187189 DEBUG nova.compute.manager [None req-2230a2b9-2f19-4fb8-b307-776c93acaa58 - - - - - -] [instance: e1eca1fc-26b8-46eb-a483-4ee20e2f8f0d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:39:37 compute-0 nova_compute[187185]: 2025-11-29 07:39:37.390 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:39 compute-0 nova_compute[187185]: 2025-11-29 07:39:39.012 187189 DEBUG nova.compute.provider_tree [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:39:39 compute-0 nova_compute[187185]: 2025-11-29 07:39:39.097 187189 DEBUG nova.scheduler.client.report [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:39:39 compute-0 nova_compute[187185]: 2025-11-29 07:39:39.193 187189 DEBUG oslo_concurrency.lockutils [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 3.780s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:39:39 compute-0 nova_compute[187185]: 2025-11-29 07:39:39.194 187189 DEBUG nova.compute.manager [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 07:39:39 compute-0 nova_compute[187185]: 2025-11-29 07:39:39.317 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:42 compute-0 nova_compute[187185]: 2025-11-29 07:39:42.392 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:43 compute-0 nova_compute[187185]: 2025-11-29 07:39:43.315 187189 DEBUG nova.compute.manager [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 07:39:43 compute-0 nova_compute[187185]: 2025-11-29 07:39:43.316 187189 DEBUG nova.network.neutron [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 07:39:43 compute-0 nova_compute[187185]: 2025-11-29 07:39:43.738 187189 DEBUG nova.policy [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 07:39:43 compute-0 nova_compute[187185]: 2025-11-29 07:39:43.822 187189 INFO nova.virt.libvirt.driver [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 07:39:44 compute-0 nova_compute[187185]: 2025-11-29 07:39:44.154 187189 DEBUG nova.compute.manager [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 07:39:44 compute-0 nova_compute[187185]: 2025-11-29 07:39:44.321 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:45 compute-0 nova_compute[187185]: 2025-11-29 07:39:45.307 187189 DEBUG nova.network.neutron [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Successfully created port: 0fd3d73b-a8b8-4477-a245-c3b02228c5ac _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 07:39:45 compute-0 nova_compute[187185]: 2025-11-29 07:39:45.452 187189 DEBUG nova.compute.manager [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 07:39:45 compute-0 nova_compute[187185]: 2025-11-29 07:39:45.454 187189 DEBUG nova.virt.libvirt.driver [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 07:39:45 compute-0 nova_compute[187185]: 2025-11-29 07:39:45.455 187189 INFO nova.virt.libvirt.driver [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Creating image(s)
Nov 29 07:39:45 compute-0 nova_compute[187185]: 2025-11-29 07:39:45.456 187189 DEBUG oslo_concurrency.lockutils [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "/var/lib/nova/instances/fab512a0-a8b3-423a-bcf4-58a43bc605e5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:39:45 compute-0 nova_compute[187185]: 2025-11-29 07:39:45.457 187189 DEBUG oslo_concurrency.lockutils [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "/var/lib/nova/instances/fab512a0-a8b3-423a-bcf4-58a43bc605e5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:39:45 compute-0 nova_compute[187185]: 2025-11-29 07:39:45.457 187189 DEBUG oslo_concurrency.lockutils [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "/var/lib/nova/instances/fab512a0-a8b3-423a-bcf4-58a43bc605e5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:39:45 compute-0 nova_compute[187185]: 2025-11-29 07:39:45.470 187189 DEBUG oslo_concurrency.processutils [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:39:45 compute-0 nova_compute[187185]: 2025-11-29 07:39:45.542 187189 DEBUG oslo_concurrency.processutils [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:39:45 compute-0 nova_compute[187185]: 2025-11-29 07:39:45.543 187189 DEBUG oslo_concurrency.lockutils [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:39:45 compute-0 nova_compute[187185]: 2025-11-29 07:39:45.544 187189 DEBUG oslo_concurrency.lockutils [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:39:45 compute-0 nova_compute[187185]: 2025-11-29 07:39:45.560 187189 DEBUG oslo_concurrency.processutils [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:39:45 compute-0 nova_compute[187185]: 2025-11-29 07:39:45.630 187189 DEBUG oslo_concurrency.processutils [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:39:45 compute-0 nova_compute[187185]: 2025-11-29 07:39:45.632 187189 DEBUG oslo_concurrency.processutils [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/fab512a0-a8b3-423a-bcf4-58a43bc605e5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:39:45 compute-0 nova_compute[187185]: 2025-11-29 07:39:45.703 187189 DEBUG oslo_concurrency.processutils [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/fab512a0-a8b3-423a-bcf4-58a43bc605e5/disk 1073741824" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:39:45 compute-0 nova_compute[187185]: 2025-11-29 07:39:45.704 187189 DEBUG oslo_concurrency.lockutils [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:39:45 compute-0 nova_compute[187185]: 2025-11-29 07:39:45.704 187189 DEBUG oslo_concurrency.processutils [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:39:45 compute-0 nova_compute[187185]: 2025-11-29 07:39:45.781 187189 DEBUG oslo_concurrency.processutils [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:39:45 compute-0 nova_compute[187185]: 2025-11-29 07:39:45.783 187189 DEBUG nova.virt.disk.api [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Checking if we can resize image /var/lib/nova/instances/fab512a0-a8b3-423a-bcf4-58a43bc605e5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:39:45 compute-0 nova_compute[187185]: 2025-11-29 07:39:45.783 187189 DEBUG oslo_concurrency.processutils [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fab512a0-a8b3-423a-bcf4-58a43bc605e5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:39:45 compute-0 podman[242418]: 2025-11-29 07:39:45.826475146 +0000 UTC m=+0.083927642 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 07:39:45 compute-0 podman[242420]: 2025-11-29 07:39:45.838929759 +0000 UTC m=+0.086279308 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 07:39:45 compute-0 podman[242419]: 2025-11-29 07:39:45.838935029 +0000 UTC m=+0.089262253 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, name=ubi9-minimal, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, distribution-scope=public, managed_by=edpm_ansible, release=1755695350, config_id=edpm, io.openshift.tags=minimal rhel9, 
io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 29 07:39:45 compute-0 nova_compute[187185]: 2025-11-29 07:39:45.848 187189 DEBUG oslo_concurrency.processutils [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fab512a0-a8b3-423a-bcf4-58a43bc605e5/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:39:45 compute-0 nova_compute[187185]: 2025-11-29 07:39:45.849 187189 DEBUG nova.virt.disk.api [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Cannot resize image /var/lib/nova/instances/fab512a0-a8b3-423a-bcf4-58a43bc605e5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:39:45 compute-0 nova_compute[187185]: 2025-11-29 07:39:45.849 187189 DEBUG nova.objects.instance [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'migration_context' on Instance uuid fab512a0-a8b3-423a-bcf4-58a43bc605e5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:39:45 compute-0 nova_compute[187185]: 2025-11-29 07:39:45.885 187189 DEBUG nova.virt.libvirt.driver [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 07:39:45 compute-0 nova_compute[187185]: 2025-11-29 07:39:45.886 187189 DEBUG nova.virt.libvirt.driver [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Ensure instance console log exists: /var/lib/nova/instances/fab512a0-a8b3-423a-bcf4-58a43bc605e5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 07:39:45 compute-0 nova_compute[187185]: 2025-11-29 07:39:45.886 187189 DEBUG oslo_concurrency.lockutils [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:39:45 compute-0 nova_compute[187185]: 2025-11-29 07:39:45.886 187189 DEBUG oslo_concurrency.lockutils [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:39:45 compute-0 nova_compute[187185]: 2025-11-29 07:39:45.887 187189 DEBUG oslo_concurrency.lockutils [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:39:46 compute-0 nova_compute[187185]: 2025-11-29 07:39:46.859 187189 DEBUG nova.network.neutron [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Successfully updated port: 0fd3d73b-a8b8-4477-a245-c3b02228c5ac _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 07:39:47 compute-0 nova_compute[187185]: 2025-11-29 07:39:47.001 187189 DEBUG oslo_concurrency.lockutils [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "refresh_cache-fab512a0-a8b3-423a-bcf4-58a43bc605e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:39:47 compute-0 nova_compute[187185]: 2025-11-29 07:39:47.001 187189 DEBUG oslo_concurrency.lockutils [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquired lock "refresh_cache-fab512a0-a8b3-423a-bcf4-58a43bc605e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:39:47 compute-0 nova_compute[187185]: 2025-11-29 07:39:47.001 187189 DEBUG nova.network.neutron [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:39:47 compute-0 nova_compute[187185]: 2025-11-29 07:39:47.022 187189 DEBUG nova.compute.manager [req-ffd0f64c-0a19-4590-93be-74e3a3cdd6d1 req-12b26cf4-4eae-48bd-bd41-af1d090585d5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Received event network-changed-0fd3d73b-a8b8-4477-a245-c3b02228c5ac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:39:47 compute-0 nova_compute[187185]: 2025-11-29 07:39:47.022 187189 DEBUG nova.compute.manager [req-ffd0f64c-0a19-4590-93be-74e3a3cdd6d1 req-12b26cf4-4eae-48bd-bd41-af1d090585d5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Refreshing instance network info cache due to event network-changed-0fd3d73b-a8b8-4477-a245-c3b02228c5ac. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:39:47 compute-0 nova_compute[187185]: 2025-11-29 07:39:47.023 187189 DEBUG oslo_concurrency.lockutils [req-ffd0f64c-0a19-4590-93be-74e3a3cdd6d1 req-12b26cf4-4eae-48bd-bd41-af1d090585d5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-fab512a0-a8b3-423a-bcf4-58a43bc605e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:39:47 compute-0 nova_compute[187185]: 2025-11-29 07:39:47.333 187189 DEBUG nova.network.neutron [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:39:47 compute-0 nova_compute[187185]: 2025-11-29 07:39:47.394 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.107 187189 DEBUG nova.network.neutron [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Updating instance_info_cache with network_info: [{"id": "0fd3d73b-a8b8-4477-a245-c3b02228c5ac", "address": "fa:16:3e:68:77:27", "network": {"id": "8ac0e70c-84ba-415c-841b-4a5a525b1a9d", "bridge": "br-int", "label": "tempest-network-smoke--597026508", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0fd3d73b-a8", "ovs_interfaceid": "0fd3d73b-a8b8-4477-a245-c3b02228c5ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.267 187189 DEBUG oslo_concurrency.lockutils [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Releasing lock "refresh_cache-fab512a0-a8b3-423a-bcf4-58a43bc605e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.267 187189 DEBUG nova.compute.manager [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Instance network_info: |[{"id": "0fd3d73b-a8b8-4477-a245-c3b02228c5ac", "address": "fa:16:3e:68:77:27", "network": {"id": "8ac0e70c-84ba-415c-841b-4a5a525b1a9d", "bridge": "br-int", "label": "tempest-network-smoke--597026508", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0fd3d73b-a8", "ovs_interfaceid": "0fd3d73b-a8b8-4477-a245-c3b02228c5ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.268 187189 DEBUG oslo_concurrency.lockutils [req-ffd0f64c-0a19-4590-93be-74e3a3cdd6d1 req-12b26cf4-4eae-48bd-bd41-af1d090585d5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-fab512a0-a8b3-423a-bcf4-58a43bc605e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.268 187189 DEBUG nova.network.neutron [req-ffd0f64c-0a19-4590-93be-74e3a3cdd6d1 req-12b26cf4-4eae-48bd-bd41-af1d090585d5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Refreshing network info cache for port 0fd3d73b-a8b8-4477-a245-c3b02228c5ac _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.271 187189 DEBUG nova.virt.libvirt.driver [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Start _get_guest_xml network_info=[{"id": "0fd3d73b-a8b8-4477-a245-c3b02228c5ac", "address": "fa:16:3e:68:77:27", "network": {"id": "8ac0e70c-84ba-415c-841b-4a5a525b1a9d", "bridge": "br-int", "label": "tempest-network-smoke--597026508", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0fd3d73b-a8", "ovs_interfaceid": "0fd3d73b-a8b8-4477-a245-c3b02228c5ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.276 187189 WARNING nova.virt.libvirt.driver [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.281 187189 DEBUG nova.virt.libvirt.host [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.282 187189 DEBUG nova.virt.libvirt.host [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.285 187189 DEBUG nova.virt.libvirt.host [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.286 187189 DEBUG nova.virt.libvirt.host [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.287 187189 DEBUG nova.virt.libvirt.driver [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.287 187189 DEBUG nova.virt.hardware [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.288 187189 DEBUG nova.virt.hardware [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.288 187189 DEBUG nova.virt.hardware [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.288 187189 DEBUG nova.virt.hardware [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.288 187189 DEBUG nova.virt.hardware [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.288 187189 DEBUG nova.virt.hardware [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.289 187189 DEBUG nova.virt.hardware [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.289 187189 DEBUG nova.virt.hardware [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.289 187189 DEBUG nova.virt.hardware [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.289 187189 DEBUG nova.virt.hardware [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.290 187189 DEBUG nova.virt.hardware [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.293 187189 DEBUG nova.virt.libvirt.vif [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:39:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1434926764',display_name='tempest-TestNetworkBasicOps-server-1434926764',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1434926764',id=157,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMcfDmYJ0HBYhMemWiLMJKd0cqvXEGpygRmLhfHquTWqJn5OCZyKRBqP+H/wNvOiW/vd5uFxcRFLtr2MKLO4qkP/QBdiSS+HUjvoRBOvapMvCWoIoFcpibPB61ltG2xzgw==',key_name='tempest-TestNetworkBasicOps-1660436421',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-85ojj5os',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:39:44Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=fab512a0-a8b3-423a-bcf4-58a43bc605e5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0fd3d73b-a8b8-4477-a245-c3b02228c5ac", "address": "fa:16:3e:68:77:27", "network": {"id": "8ac0e70c-84ba-415c-841b-4a5a525b1a9d", "bridge": "br-int", "label": "tempest-network-smoke--597026508", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0fd3d73b-a8", "ovs_interfaceid": "0fd3d73b-a8b8-4477-a245-c3b02228c5ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.293 187189 DEBUG nova.network.os_vif_util [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "0fd3d73b-a8b8-4477-a245-c3b02228c5ac", "address": "fa:16:3e:68:77:27", "network": {"id": "8ac0e70c-84ba-415c-841b-4a5a525b1a9d", "bridge": "br-int", "label": "tempest-network-smoke--597026508", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0fd3d73b-a8", "ovs_interfaceid": "0fd3d73b-a8b8-4477-a245-c3b02228c5ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.294 187189 DEBUG nova.network.os_vif_util [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:68:77:27,bridge_name='br-int',has_traffic_filtering=True,id=0fd3d73b-a8b8-4477-a245-c3b02228c5ac,network=Network(8ac0e70c-84ba-415c-841b-4a5a525b1a9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0fd3d73b-a8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.295 187189 DEBUG nova.objects.instance [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'pci_devices' on Instance uuid fab512a0-a8b3-423a-bcf4-58a43bc605e5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.468 187189 DEBUG nova.virt.libvirt.driver [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:39:48 compute-0 nova_compute[187185]:   <uuid>fab512a0-a8b3-423a-bcf4-58a43bc605e5</uuid>
Nov 29 07:39:48 compute-0 nova_compute[187185]:   <name>instance-0000009d</name>
Nov 29 07:39:48 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 07:39:48 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:39:48 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:39:48 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:       <nova:name>tempest-TestNetworkBasicOps-server-1434926764</nova:name>
Nov 29 07:39:48 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:39:48</nova:creationTime>
Nov 29 07:39:48 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 07:39:48 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 07:39:48 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:39:48 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:39:48 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:39:48 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:39:48 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:39:48 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:39:48 compute-0 nova_compute[187185]:         <nova:user uuid="1dd0a7f5aaff402eb032cd5e60540dcb">tempest-TestNetworkBasicOps-1587012976-project-member</nova:user>
Nov 29 07:39:48 compute-0 nova_compute[187185]:         <nova:project uuid="ec8b80be17a14d1caf666636283749d0">tempest-TestNetworkBasicOps-1587012976</nova:project>
Nov 29 07:39:48 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:39:48 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 07:39:48 compute-0 nova_compute[187185]:         <nova:port uuid="0fd3d73b-a8b8-4477-a245-c3b02228c5ac">
Nov 29 07:39:48 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:39:48 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:39:48 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:39:48 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <system>
Nov 29 07:39:48 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:39:48 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:39:48 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:39:48 compute-0 nova_compute[187185]:       <entry name="serial">fab512a0-a8b3-423a-bcf4-58a43bc605e5</entry>
Nov 29 07:39:48 compute-0 nova_compute[187185]:       <entry name="uuid">fab512a0-a8b3-423a-bcf4-58a43bc605e5</entry>
Nov 29 07:39:48 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     </system>
Nov 29 07:39:48 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:39:48 compute-0 nova_compute[187185]:   <os>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:   </os>
Nov 29 07:39:48 compute-0 nova_compute[187185]:   <features>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:   </features>
Nov 29 07:39:48 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:39:48 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:39:48 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:39:48 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/fab512a0-a8b3-423a-bcf4-58a43bc605e5/disk"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:39:48 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/fab512a0-a8b3-423a-bcf4-58a43bc605e5/disk.config"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:39:48 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:68:77:27"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:       <target dev="tap0fd3d73b-a8"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:39:48 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/fab512a0-a8b3-423a-bcf4-58a43bc605e5/console.log" append="off"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <video>
Nov 29 07:39:48 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     </video>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:39:48 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:39:48 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:39:48 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:39:48 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:39:48 compute-0 nova_compute[187185]: </domain>
Nov 29 07:39:48 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.469 187189 DEBUG nova.compute.manager [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Preparing to wait for external event network-vif-plugged-0fd3d73b-a8b8-4477-a245-c3b02228c5ac prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.469 187189 DEBUG oslo_concurrency.lockutils [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "fab512a0-a8b3-423a-bcf4-58a43bc605e5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.469 187189 DEBUG oslo_concurrency.lockutils [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "fab512a0-a8b3-423a-bcf4-58a43bc605e5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.469 187189 DEBUG oslo_concurrency.lockutils [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "fab512a0-a8b3-423a-bcf4-58a43bc605e5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.470 187189 DEBUG nova.virt.libvirt.vif [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:39:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1434926764',display_name='tempest-TestNetworkBasicOps-server-1434926764',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1434926764',id=157,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMcfDmYJ0HBYhMemWiLMJKd0cqvXEGpygRmLhfHquTWqJn5OCZyKRBqP+H/wNvOiW/vd5uFxcRFLtr2MKLO4qkP/QBdiSS+HUjvoRBOvapMvCWoIoFcpibPB61ltG2xzgw==',key_name='tempest-TestNetworkBasicOps-1660436421',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-85ojj5os',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:39:44Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=fab512a0-a8b3-423a-bcf4-58a43bc605e5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0fd3d73b-a8b8-4477-a245-c3b02228c5ac", "address": "fa:16:3e:68:77:27", "network": {"id": "8ac0e70c-84ba-415c-841b-4a5a525b1a9d", "bridge": "br-int", "label": "tempest-network-smoke--597026508", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0fd3d73b-a8", "ovs_interfaceid": "0fd3d73b-a8b8-4477-a245-c3b02228c5ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.470 187189 DEBUG nova.network.os_vif_util [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "0fd3d73b-a8b8-4477-a245-c3b02228c5ac", "address": "fa:16:3e:68:77:27", "network": {"id": "8ac0e70c-84ba-415c-841b-4a5a525b1a9d", "bridge": "br-int", "label": "tempest-network-smoke--597026508", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0fd3d73b-a8", "ovs_interfaceid": "0fd3d73b-a8b8-4477-a245-c3b02228c5ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.471 187189 DEBUG nova.network.os_vif_util [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:68:77:27,bridge_name='br-int',has_traffic_filtering=True,id=0fd3d73b-a8b8-4477-a245-c3b02228c5ac,network=Network(8ac0e70c-84ba-415c-841b-4a5a525b1a9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0fd3d73b-a8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.471 187189 DEBUG os_vif [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:68:77:27,bridge_name='br-int',has_traffic_filtering=True,id=0fd3d73b-a8b8-4477-a245-c3b02228c5ac,network=Network(8ac0e70c-84ba-415c-841b-4a5a525b1a9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0fd3d73b-a8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.472 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.472 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.472 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.477 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.478 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0fd3d73b-a8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.479 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0fd3d73b-a8, col_values=(('external_ids', {'iface-id': '0fd3d73b-a8b8-4477-a245-c3b02228c5ac', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:68:77:27', 'vm-uuid': 'fab512a0-a8b3-423a-bcf4-58a43bc605e5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.481 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:48 compute-0 NetworkManager[55227]: <info>  [1764401988.4830] manager: (tap0fd3d73b-a8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/272)
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.485 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.491 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.492 187189 INFO os_vif [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:68:77:27,bridge_name='br-int',has_traffic_filtering=True,id=0fd3d73b-a8b8-4477-a245-c3b02228c5ac,network=Network(8ac0e70c-84ba-415c-841b-4a5a525b1a9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0fd3d73b-a8')
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.931 187189 DEBUG nova.virt.libvirt.driver [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.932 187189 DEBUG nova.virt.libvirt.driver [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.932 187189 DEBUG nova.virt.libvirt.driver [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No VIF found with MAC fa:16:3e:68:77:27, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:39:48 compute-0 nova_compute[187185]: 2025-11-29 07:39:48.932 187189 INFO nova.virt.libvirt.driver [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Using config drive
Nov 29 07:39:49 compute-0 nova_compute[187185]: 2025-11-29 07:39:49.542 187189 INFO nova.virt.libvirt.driver [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Creating config drive at /var/lib/nova/instances/fab512a0-a8b3-423a-bcf4-58a43bc605e5/disk.config
Nov 29 07:39:49 compute-0 nova_compute[187185]: 2025-11-29 07:39:49.547 187189 DEBUG oslo_concurrency.processutils [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/fab512a0-a8b3-423a-bcf4-58a43bc605e5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvhsgvixt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:39:49 compute-0 nova_compute[187185]: 2025-11-29 07:39:49.679 187189 DEBUG oslo_concurrency.processutils [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/fab512a0-a8b3-423a-bcf4-58a43bc605e5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvhsgvixt" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:39:49 compute-0 kernel: tap0fd3d73b-a8: entered promiscuous mode
Nov 29 07:39:49 compute-0 nova_compute[187185]: 2025-11-29 07:39:49.762 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:49 compute-0 ovn_controller[95281]: 2025-11-29T07:39:49Z|00522|binding|INFO|Claiming lport 0fd3d73b-a8b8-4477-a245-c3b02228c5ac for this chassis.
Nov 29 07:39:49 compute-0 ovn_controller[95281]: 2025-11-29T07:39:49Z|00523|binding|INFO|0fd3d73b-a8b8-4477-a245-c3b02228c5ac: Claiming fa:16:3e:68:77:27 10.100.0.5
Nov 29 07:39:49 compute-0 NetworkManager[55227]: <info>  [1764401989.7637] manager: (tap0fd3d73b-a8): new Tun device (/org/freedesktop/NetworkManager/Devices/273)
Nov 29 07:39:49 compute-0 ovn_controller[95281]: 2025-11-29T07:39:49Z|00524|binding|INFO|Setting lport 0fd3d73b-a8b8-4477-a245-c3b02228c5ac ovn-installed in OVS
Nov 29 07:39:49 compute-0 nova_compute[187185]: 2025-11-29 07:39:49.793 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:49 compute-0 nova_compute[187185]: 2025-11-29 07:39:49.797 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:49 compute-0 systemd-udevd[242502]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:39:49 compute-0 NetworkManager[55227]: <info>  [1764401989.8253] device (tap0fd3d73b-a8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:39:49 compute-0 systemd-machined[153486]: New machine qemu-62-instance-0000009d.
Nov 29 07:39:49 compute-0 NetworkManager[55227]: <info>  [1764401989.8277] device (tap0fd3d73b-a8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:39:49 compute-0 ovn_controller[95281]: 2025-11-29T07:39:49Z|00525|binding|INFO|Setting lport 0fd3d73b-a8b8-4477-a245-c3b02228c5ac up in Southbound
Nov 29 07:39:49 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:49.827 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:68:77:27 10.100.0.5'], port_security=['fa:16:3e:68:77:27 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'fab512a0-a8b3-423a-bcf4-58a43bc605e5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8ac0e70c-84ba-415c-841b-4a5a525b1a9d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ec8b80be17a14d1caf666636283749d0', 'neutron:revision_number': '2', 'neutron:security_group_ids': '36b9f1ed-957e-468e-a894-c294dde65d52', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a06529de-8ea4-4c02-8447-f35b6f567d2c, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=0fd3d73b-a8b8-4477-a245-c3b02228c5ac) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:39:49 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:49.830 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 0fd3d73b-a8b8-4477-a245-c3b02228c5ac in datapath 8ac0e70c-84ba-415c-841b-4a5a525b1a9d bound to our chassis
Nov 29 07:39:49 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:49.832 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8ac0e70c-84ba-415c-841b-4a5a525b1a9d
Nov 29 07:39:49 compute-0 systemd[1]: Started Virtual Machine qemu-62-instance-0000009d.
Nov 29 07:39:49 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:49.856 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[cf453e53-0946-4a05-b6c8-34efa8246a9f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:39:49 compute-0 nova_compute[187185]: 2025-11-29 07:39:49.866 187189 DEBUG nova.network.neutron [req-ffd0f64c-0a19-4590-93be-74e3a3cdd6d1 req-12b26cf4-4eae-48bd-bd41-af1d090585d5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Updated VIF entry in instance network info cache for port 0fd3d73b-a8b8-4477-a245-c3b02228c5ac. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:39:49 compute-0 nova_compute[187185]: 2025-11-29 07:39:49.867 187189 DEBUG nova.network.neutron [req-ffd0f64c-0a19-4590-93be-74e3a3cdd6d1 req-12b26cf4-4eae-48bd-bd41-af1d090585d5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Updating instance_info_cache with network_info: [{"id": "0fd3d73b-a8b8-4477-a245-c3b02228c5ac", "address": "fa:16:3e:68:77:27", "network": {"id": "8ac0e70c-84ba-415c-841b-4a5a525b1a9d", "bridge": "br-int", "label": "tempest-network-smoke--597026508", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0fd3d73b-a8", "ovs_interfaceid": "0fd3d73b-a8b8-4477-a245-c3b02228c5ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:39:49 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:49.904 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[798b1eeb-7aa5-4633-b61e-a55141df3f6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:39:49 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:49.907 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[d3d903d5-9c33-408a-a4ce-37fca9104a1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:39:49 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:49.944 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[9fa3703b-9410-4a0f-a293-90b7314fe4be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:39:49 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:49.966 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[bcf5c6fa-14f5-4f00-a76d-9b446701333c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8ac0e70c-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b6:3f:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 158], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 736852, 'reachable_time': 24370, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242518, 'error': None, 'target': 'ovnmeta-8ac0e70c-84ba-415c-841b-4a5a525b1a9d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:39:49 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:49.988 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[5bd3bb86-a06a-426a-b9db-124c297869ea]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap8ac0e70c-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 736867, 'tstamp': 736867}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 242519, 'error': None, 'target': 'ovnmeta-8ac0e70c-84ba-415c-841b-4a5a525b1a9d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap8ac0e70c-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 736872, 'tstamp': 736872}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 242519, 'error': None, 'target': 'ovnmeta-8ac0e70c-84ba-415c-841b-4a5a525b1a9d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:39:49 compute-0 nova_compute[187185]: 2025-11-29 07:39:49.989 187189 DEBUG oslo_concurrency.lockutils [req-ffd0f64c-0a19-4590-93be-74e3a3cdd6d1 req-12b26cf4-4eae-48bd-bd41-af1d090585d5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-fab512a0-a8b3-423a-bcf4-58a43bc605e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:39:49 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:49.992 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8ac0e70c-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:39:49 compute-0 nova_compute[187185]: 2025-11-29 07:39:49.994 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:49 compute-0 nova_compute[187185]: 2025-11-29 07:39:49.995 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:49 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:49.996 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8ac0e70c-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:39:49 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:49.997 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:39:49 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:49.997 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8ac0e70c-80, col_values=(('external_ids', {'iface-id': '17e71a59-7bb8-4f35-826d-2efc90d0ca9c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:39:49 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:39:49.997 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:39:50 compute-0 nova_compute[187185]: 2025-11-29 07:39:50.448 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764401990.448048, fab512a0-a8b3-423a-bcf4-58a43bc605e5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:39:50 compute-0 nova_compute[187185]: 2025-11-29 07:39:50.449 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] VM Started (Lifecycle Event)
Nov 29 07:39:50 compute-0 nova_compute[187185]: 2025-11-29 07:39:50.497 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:39:50 compute-0 nova_compute[187185]: 2025-11-29 07:39:50.503 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764401990.44849, fab512a0-a8b3-423a-bcf4-58a43bc605e5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:39:50 compute-0 nova_compute[187185]: 2025-11-29 07:39:50.504 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] VM Paused (Lifecycle Event)
Nov 29 07:39:50 compute-0 nova_compute[187185]: 2025-11-29 07:39:50.559 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:39:50 compute-0 nova_compute[187185]: 2025-11-29 07:39:50.565 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:39:50 compute-0 nova_compute[187185]: 2025-11-29 07:39:50.602 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:39:51 compute-0 nova_compute[187185]: 2025-11-29 07:39:51.829 187189 DEBUG nova.compute.manager [req-baa6520f-e29c-4c89-a38b-df108fb54302 req-b93d4e7b-79fa-4d28-8e11-d528dd1b15fe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Received event network-vif-plugged-0fd3d73b-a8b8-4477-a245-c3b02228c5ac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:39:51 compute-0 nova_compute[187185]: 2025-11-29 07:39:51.830 187189 DEBUG oslo_concurrency.lockutils [req-baa6520f-e29c-4c89-a38b-df108fb54302 req-b93d4e7b-79fa-4d28-8e11-d528dd1b15fe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "fab512a0-a8b3-423a-bcf4-58a43bc605e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:39:51 compute-0 nova_compute[187185]: 2025-11-29 07:39:51.830 187189 DEBUG oslo_concurrency.lockutils [req-baa6520f-e29c-4c89-a38b-df108fb54302 req-b93d4e7b-79fa-4d28-8e11-d528dd1b15fe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "fab512a0-a8b3-423a-bcf4-58a43bc605e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:39:51 compute-0 nova_compute[187185]: 2025-11-29 07:39:51.830 187189 DEBUG oslo_concurrency.lockutils [req-baa6520f-e29c-4c89-a38b-df108fb54302 req-b93d4e7b-79fa-4d28-8e11-d528dd1b15fe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "fab512a0-a8b3-423a-bcf4-58a43bc605e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:39:51 compute-0 nova_compute[187185]: 2025-11-29 07:39:51.831 187189 DEBUG nova.compute.manager [req-baa6520f-e29c-4c89-a38b-df108fb54302 req-b93d4e7b-79fa-4d28-8e11-d528dd1b15fe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Processing event network-vif-plugged-0fd3d73b-a8b8-4477-a245-c3b02228c5ac _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 07:39:51 compute-0 nova_compute[187185]: 2025-11-29 07:39:51.831 187189 DEBUG nova.compute.manager [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:39:51 compute-0 nova_compute[187185]: 2025-11-29 07:39:51.835 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764401991.8350499, fab512a0-a8b3-423a-bcf4-58a43bc605e5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:39:51 compute-0 nova_compute[187185]: 2025-11-29 07:39:51.835 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] VM Resumed (Lifecycle Event)
Nov 29 07:39:51 compute-0 nova_compute[187185]: 2025-11-29 07:39:51.837 187189 DEBUG nova.virt.libvirt.driver [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 07:39:51 compute-0 nova_compute[187185]: 2025-11-29 07:39:51.841 187189 INFO nova.virt.libvirt.driver [-] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Instance spawned successfully.
Nov 29 07:39:51 compute-0 nova_compute[187185]: 2025-11-29 07:39:51.841 187189 DEBUG nova.virt.libvirt.driver [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 07:39:51 compute-0 nova_compute[187185]: 2025-11-29 07:39:51.918 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:39:51 compute-0 nova_compute[187185]: 2025-11-29 07:39:51.924 187189 DEBUG nova.virt.libvirt.driver [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:39:51 compute-0 nova_compute[187185]: 2025-11-29 07:39:51.925 187189 DEBUG nova.virt.libvirt.driver [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:39:51 compute-0 nova_compute[187185]: 2025-11-29 07:39:51.925 187189 DEBUG nova.virt.libvirt.driver [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:39:51 compute-0 nova_compute[187185]: 2025-11-29 07:39:51.926 187189 DEBUG nova.virt.libvirt.driver [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:39:51 compute-0 nova_compute[187185]: 2025-11-29 07:39:51.927 187189 DEBUG nova.virt.libvirt.driver [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:39:51 compute-0 nova_compute[187185]: 2025-11-29 07:39:51.927 187189 DEBUG nova.virt.libvirt.driver [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:39:51 compute-0 nova_compute[187185]: 2025-11-29 07:39:51.933 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:39:52 compute-0 nova_compute[187185]: 2025-11-29 07:39:52.069 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:39:52 compute-0 nova_compute[187185]: 2025-11-29 07:39:52.397 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:53 compute-0 nova_compute[187185]: 2025-11-29 07:39:53.483 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:54 compute-0 nova_compute[187185]: 2025-11-29 07:39:54.364 187189 DEBUG nova.compute.manager [req-bf630ab9-54c6-49ac-b571-caaf13c677a3 req-68504275-3ffb-4597-a027-696aebe2e54d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Received event network-vif-plugged-0fd3d73b-a8b8-4477-a245-c3b02228c5ac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:39:54 compute-0 nova_compute[187185]: 2025-11-29 07:39:54.365 187189 DEBUG oslo_concurrency.lockutils [req-bf630ab9-54c6-49ac-b571-caaf13c677a3 req-68504275-3ffb-4597-a027-696aebe2e54d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "fab512a0-a8b3-423a-bcf4-58a43bc605e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:39:54 compute-0 nova_compute[187185]: 2025-11-29 07:39:54.365 187189 DEBUG oslo_concurrency.lockutils [req-bf630ab9-54c6-49ac-b571-caaf13c677a3 req-68504275-3ffb-4597-a027-696aebe2e54d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "fab512a0-a8b3-423a-bcf4-58a43bc605e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:39:54 compute-0 nova_compute[187185]: 2025-11-29 07:39:54.366 187189 DEBUG oslo_concurrency.lockutils [req-bf630ab9-54c6-49ac-b571-caaf13c677a3 req-68504275-3ffb-4597-a027-696aebe2e54d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "fab512a0-a8b3-423a-bcf4-58a43bc605e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:39:54 compute-0 nova_compute[187185]: 2025-11-29 07:39:54.367 187189 DEBUG nova.compute.manager [req-bf630ab9-54c6-49ac-b571-caaf13c677a3 req-68504275-3ffb-4597-a027-696aebe2e54d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] No waiting events found dispatching network-vif-plugged-0fd3d73b-a8b8-4477-a245-c3b02228c5ac pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:39:54 compute-0 nova_compute[187185]: 2025-11-29 07:39:54.367 187189 WARNING nova.compute.manager [req-bf630ab9-54c6-49ac-b571-caaf13c677a3 req-68504275-3ffb-4597-a027-696aebe2e54d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Received unexpected event network-vif-plugged-0fd3d73b-a8b8-4477-a245-c3b02228c5ac for instance with vm_state building and task_state spawning.
Nov 29 07:39:54 compute-0 nova_compute[187185]: 2025-11-29 07:39:54.441 187189 INFO nova.compute.manager [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Took 8.99 seconds to spawn the instance on the hypervisor.
Nov 29 07:39:54 compute-0 nova_compute[187185]: 2025-11-29 07:39:54.442 187189 DEBUG nova.compute.manager [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:39:54 compute-0 ovn_controller[95281]: 2025-11-29T07:39:54Z|00526|binding|INFO|Releasing lport 17e71a59-7bb8-4f35-826d-2efc90d0ca9c from this chassis (sb_readonly=0)
Nov 29 07:39:54 compute-0 nova_compute[187185]: 2025-11-29 07:39:54.503 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:54 compute-0 ovn_controller[95281]: 2025-11-29T07:39:54Z|00527|binding|INFO|Releasing lport 17e71a59-7bb8-4f35-826d-2efc90d0ca9c from this chassis (sb_readonly=0)
Nov 29 07:39:54 compute-0 nova_compute[187185]: 2025-11-29 07:39:54.740 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:54 compute-0 podman[242527]: 2025-11-29 07:39:54.840050485 +0000 UTC m=+0.091253999 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 29 07:39:55 compute-0 nova_compute[187185]: 2025-11-29 07:39:55.022 187189 INFO nova.compute.manager [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Took 20.59 seconds to build instance.
Nov 29 07:39:55 compute-0 nova_compute[187185]: 2025-11-29 07:39:55.185 187189 DEBUG oslo_concurrency.lockutils [None req-643093f3-55c9-43aa-8477-ba13d1234a08 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "fab512a0-a8b3-423a-bcf4-58a43bc605e5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 21.390s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:39:57 compute-0 nova_compute[187185]: 2025-11-29 07:39:57.400 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:39:58 compute-0 nova_compute[187185]: 2025-11-29 07:39:58.487 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:00 compute-0 NetworkManager[55227]: <info>  [1764402000.4789] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/274)
Nov 29 07:40:00 compute-0 nova_compute[187185]: 2025-11-29 07:40:00.477 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:00 compute-0 NetworkManager[55227]: <info>  [1764402000.4802] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/275)
Nov 29 07:40:00 compute-0 nova_compute[187185]: 2025-11-29 07:40:00.680 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:00 compute-0 ovn_controller[95281]: 2025-11-29T07:40:00Z|00528|binding|INFO|Releasing lport 17e71a59-7bb8-4f35-826d-2efc90d0ca9c from this chassis (sb_readonly=0)
Nov 29 07:40:00 compute-0 nova_compute[187185]: 2025-11-29 07:40:00.708 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:01 compute-0 nova_compute[187185]: 2025-11-29 07:40:01.068 187189 DEBUG nova.compute.manager [req-5d1d783c-f4e0-41d1-880f-f67b14871de9 req-7d18c323-2c8d-43a5-bc0b-bbec3e28428f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Received event network-changed-0fd3d73b-a8b8-4477-a245-c3b02228c5ac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:40:01 compute-0 nova_compute[187185]: 2025-11-29 07:40:01.069 187189 DEBUG nova.compute.manager [req-5d1d783c-f4e0-41d1-880f-f67b14871de9 req-7d18c323-2c8d-43a5-bc0b-bbec3e28428f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Refreshing instance network info cache due to event network-changed-0fd3d73b-a8b8-4477-a245-c3b02228c5ac. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:40:01 compute-0 nova_compute[187185]: 2025-11-29 07:40:01.069 187189 DEBUG oslo_concurrency.lockutils [req-5d1d783c-f4e0-41d1-880f-f67b14871de9 req-7d18c323-2c8d-43a5-bc0b-bbec3e28428f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-fab512a0-a8b3-423a-bcf4-58a43bc605e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:40:01 compute-0 nova_compute[187185]: 2025-11-29 07:40:01.069 187189 DEBUG oslo_concurrency.lockutils [req-5d1d783c-f4e0-41d1-880f-f67b14871de9 req-7d18c323-2c8d-43a5-bc0b-bbec3e28428f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-fab512a0-a8b3-423a-bcf4-58a43bc605e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:40:01 compute-0 nova_compute[187185]: 2025-11-29 07:40:01.070 187189 DEBUG nova.network.neutron [req-5d1d783c-f4e0-41d1-880f-f67b14871de9 req-7d18c323-2c8d-43a5-bc0b-bbec3e28428f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Refreshing network info cache for port 0fd3d73b-a8b8-4477-a245-c3b02228c5ac _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:40:01 compute-0 podman[242556]: 2025-11-29 07:40:01.803095547 +0000 UTC m=+0.055794424 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 07:40:02 compute-0 nova_compute[187185]: 2025-11-29 07:40:02.403 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:03 compute-0 sshd-session[242579]: Invalid user bob from 190.181.27.27 port 53184
Nov 29 07:40:03 compute-0 sshd-session[242579]: Received disconnect from 190.181.27.27 port 53184:11: Bye Bye [preauth]
Nov 29 07:40:03 compute-0 sshd-session[242579]: Disconnected from invalid user bob 190.181.27.27 port 53184 [preauth]
Nov 29 07:40:03 compute-0 nova_compute[187185]: 2025-11-29 07:40:03.489 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:04 compute-0 nova_compute[187185]: 2025-11-29 07:40:04.367 187189 DEBUG nova.network.neutron [req-5d1d783c-f4e0-41d1-880f-f67b14871de9 req-7d18c323-2c8d-43a5-bc0b-bbec3e28428f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Updated VIF entry in instance network info cache for port 0fd3d73b-a8b8-4477-a245-c3b02228c5ac. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:40:04 compute-0 nova_compute[187185]: 2025-11-29 07:40:04.368 187189 DEBUG nova.network.neutron [req-5d1d783c-f4e0-41d1-880f-f67b14871de9 req-7d18c323-2c8d-43a5-bc0b-bbec3e28428f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Updating instance_info_cache with network_info: [{"id": "0fd3d73b-a8b8-4477-a245-c3b02228c5ac", "address": "fa:16:3e:68:77:27", "network": {"id": "8ac0e70c-84ba-415c-841b-4a5a525b1a9d", "bridge": "br-int", "label": "tempest-network-smoke--597026508", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0fd3d73b-a8", "ovs_interfaceid": "0fd3d73b-a8b8-4477-a245-c3b02228c5ac", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:40:04 compute-0 nova_compute[187185]: 2025-11-29 07:40:04.443 187189 DEBUG oslo_concurrency.lockutils [req-5d1d783c-f4e0-41d1-880f-f67b14871de9 req-7d18c323-2c8d-43a5-bc0b-bbec3e28428f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-fab512a0-a8b3-423a-bcf4-58a43bc605e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:40:04 compute-0 podman[242586]: 2025-11-29 07:40:04.822792003 +0000 UTC m=+0.080874144 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, tcib_managed=true, io.buildah.version=1.41.3)
Nov 29 07:40:04 compute-0 podman[242587]: 2025-11-29 07:40:04.874427728 +0000 UTC m=+0.115772655 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 07:40:07 compute-0 nova_compute[187185]: 2025-11-29 07:40:07.405 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:07 compute-0 ovn_controller[95281]: 2025-11-29T07:40:07Z|00069|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:68:77:27 10.100.0.5
Nov 29 07:40:07 compute-0 ovn_controller[95281]: 2025-11-29T07:40:07Z|00070|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:68:77:27 10.100.0.5
Nov 29 07:40:08 compute-0 nova_compute[187185]: 2025-11-29 07:40:08.492 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:12 compute-0 nova_compute[187185]: 2025-11-29 07:40:12.408 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:13 compute-0 nova_compute[187185]: 2025-11-29 07:40:13.495 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:13 compute-0 nova_compute[187185]: 2025-11-29 07:40:13.768 187189 INFO nova.compute.manager [None req-a83fe537-83be-40c9-8761-1975c07f9c6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Get console output
Nov 29 07:40:13 compute-0 nova_compute[187185]: 2025-11-29 07:40:13.776 213758 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 29 07:40:14 compute-0 nova_compute[187185]: 2025-11-29 07:40:14.230 187189 DEBUG oslo_concurrency.lockutils [None req-8fc26cef-7153-4c67-867a-a55b189fcf4f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "fab512a0-a8b3-423a-bcf4-58a43bc605e5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:40:14 compute-0 nova_compute[187185]: 2025-11-29 07:40:14.231 187189 DEBUG oslo_concurrency.lockutils [None req-8fc26cef-7153-4c67-867a-a55b189fcf4f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "fab512a0-a8b3-423a-bcf4-58a43bc605e5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:40:14 compute-0 nova_compute[187185]: 2025-11-29 07:40:14.231 187189 DEBUG oslo_concurrency.lockutils [None req-8fc26cef-7153-4c67-867a-a55b189fcf4f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "fab512a0-a8b3-423a-bcf4-58a43bc605e5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:40:14 compute-0 nova_compute[187185]: 2025-11-29 07:40:14.231 187189 DEBUG oslo_concurrency.lockutils [None req-8fc26cef-7153-4c67-867a-a55b189fcf4f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "fab512a0-a8b3-423a-bcf4-58a43bc605e5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:40:14 compute-0 nova_compute[187185]: 2025-11-29 07:40:14.232 187189 DEBUG oslo_concurrency.lockutils [None req-8fc26cef-7153-4c67-867a-a55b189fcf4f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "fab512a0-a8b3-423a-bcf4-58a43bc605e5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:40:14 compute-0 nova_compute[187185]: 2025-11-29 07:40:14.245 187189 INFO nova.compute.manager [None req-8fc26cef-7153-4c67-867a-a55b189fcf4f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Terminating instance
Nov 29 07:40:14 compute-0 nova_compute[187185]: 2025-11-29 07:40:14.259 187189 DEBUG nova.compute.manager [None req-8fc26cef-7153-4c67-867a-a55b189fcf4f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 07:40:14 compute-0 kernel: tap0fd3d73b-a8 (unregistering): left promiscuous mode
Nov 29 07:40:14 compute-0 NetworkManager[55227]: <info>  [1764402014.2894] device (tap0fd3d73b-a8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:40:14 compute-0 nova_compute[187185]: 2025-11-29 07:40:14.295 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:14 compute-0 ovn_controller[95281]: 2025-11-29T07:40:14Z|00529|binding|INFO|Releasing lport 0fd3d73b-a8b8-4477-a245-c3b02228c5ac from this chassis (sb_readonly=0)
Nov 29 07:40:14 compute-0 ovn_controller[95281]: 2025-11-29T07:40:14Z|00530|binding|INFO|Setting lport 0fd3d73b-a8b8-4477-a245-c3b02228c5ac down in Southbound
Nov 29 07:40:14 compute-0 ovn_controller[95281]: 2025-11-29T07:40:14Z|00531|binding|INFO|Removing iface tap0fd3d73b-a8 ovn-installed in OVS
Nov 29 07:40:14 compute-0 nova_compute[187185]: 2025-11-29 07:40:14.300 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:40:14.311 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:68:77:27 10.100.0.5'], port_security=['fa:16:3e:68:77:27 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'fab512a0-a8b3-423a-bcf4-58a43bc605e5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8ac0e70c-84ba-415c-841b-4a5a525b1a9d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ec8b80be17a14d1caf666636283749d0', 'neutron:revision_number': '4', 'neutron:security_group_ids': '36b9f1ed-957e-468e-a894-c294dde65d52', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.180'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a06529de-8ea4-4c02-8447-f35b6f567d2c, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=0fd3d73b-a8b8-4477-a245-c3b02228c5ac) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:40:14 compute-0 nova_compute[187185]: 2025-11-29 07:40:14.312 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:40:14.315 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 0fd3d73b-a8b8-4477-a245-c3b02228c5ac in datapath 8ac0e70c-84ba-415c-841b-4a5a525b1a9d unbound from our chassis
Nov 29 07:40:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:40:14.317 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8ac0e70c-84ba-415c-841b-4a5a525b1a9d
Nov 29 07:40:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:40:14.339 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[bfe98973-bae2-4adf-b27e-0026476fc4d6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:40:14 compute-0 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d0000009d.scope: Deactivated successfully.
Nov 29 07:40:14 compute-0 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d0000009d.scope: Consumed 13.918s CPU time.
Nov 29 07:40:14 compute-0 systemd-machined[153486]: Machine qemu-62-instance-0000009d terminated.
Nov 29 07:40:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:40:14.389 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[196b55b9-0d24-4510-b1a2-261f0847aac7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:40:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:40:14.393 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[3235fd1e-7c78-4194-8c02-698a38df7135]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:40:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:40:14.431 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[be7b62b4-2d11-4adf-84a6-1d2fc7be5772]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:40:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:40:14.453 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[03a8d1ef-2943-47ed-9760-f0f9d0c3f4eb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8ac0e70c-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b6:3f:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 9, 'tx_packets': 7, 'rx_bytes': 658, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 9, 'tx_packets': 7, 'rx_bytes': 658, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 158], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 736852, 'reachable_time': 24370, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242647, 'error': None, 'target': 'ovnmeta-8ac0e70c-84ba-415c-841b-4a5a525b1a9d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:40:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:40:14.474 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d2d0e806-8d9d-4732-9718-d95258904ad1]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap8ac0e70c-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 736867, 'tstamp': 736867}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 242648, 'error': None, 'target': 'ovnmeta-8ac0e70c-84ba-415c-841b-4a5a525b1a9d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap8ac0e70c-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 736872, 'tstamp': 736872}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 242648, 'error': None, 'target': 'ovnmeta-8ac0e70c-84ba-415c-841b-4a5a525b1a9d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:40:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:40:14.477 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8ac0e70c-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:40:14 compute-0 nova_compute[187185]: 2025-11-29 07:40:14.479 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:14 compute-0 nova_compute[187185]: 2025-11-29 07:40:14.486 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:40:14.486 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8ac0e70c-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:40:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:40:14.487 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:40:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:40:14.488 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8ac0e70c-80, col_values=(('external_ids', {'iface-id': '17e71a59-7bb8-4f35-826d-2efc90d0ca9c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:40:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:40:14.489 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:40:14 compute-0 nova_compute[187185]: 2025-11-29 07:40:14.533 187189 INFO nova.virt.libvirt.driver [-] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Instance destroyed successfully.
Nov 29 07:40:14 compute-0 nova_compute[187185]: 2025-11-29 07:40:14.533 187189 DEBUG nova.objects.instance [None req-8fc26cef-7153-4c67-867a-a55b189fcf4f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'resources' on Instance uuid fab512a0-a8b3-423a-bcf4-58a43bc605e5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:40:14 compute-0 nova_compute[187185]: 2025-11-29 07:40:14.558 187189 DEBUG nova.virt.libvirt.vif [None req-8fc26cef-7153-4c67-867a-a55b189fcf4f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:39:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1434926764',display_name='tempest-TestNetworkBasicOps-server-1434926764',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1434926764',id=157,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMcfDmYJ0HBYhMemWiLMJKd0cqvXEGpygRmLhfHquTWqJn5OCZyKRBqP+H/wNvOiW/vd5uFxcRFLtr2MKLO4qkP/QBdiSS+HUjvoRBOvapMvCWoIoFcpibPB61ltG2xzgw==',key_name='tempest-TestNetworkBasicOps-1660436421',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:39:54Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-85ojj5os',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:39:54Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=fab512a0-a8b3-423a-bcf4-58a43bc605e5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0fd3d73b-a8b8-4477-a245-c3b02228c5ac", "address": "fa:16:3e:68:77:27", "network": {"id": "8ac0e70c-84ba-415c-841b-4a5a525b1a9d", "bridge": "br-int", "label": "tempest-network-smoke--597026508", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0fd3d73b-a8", "ovs_interfaceid": "0fd3d73b-a8b8-4477-a245-c3b02228c5ac", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:40:14 compute-0 nova_compute[187185]: 2025-11-29 07:40:14.559 187189 DEBUG nova.network.os_vif_util [None req-8fc26cef-7153-4c67-867a-a55b189fcf4f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "0fd3d73b-a8b8-4477-a245-c3b02228c5ac", "address": "fa:16:3e:68:77:27", "network": {"id": "8ac0e70c-84ba-415c-841b-4a5a525b1a9d", "bridge": "br-int", "label": "tempest-network-smoke--597026508", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0fd3d73b-a8", "ovs_interfaceid": "0fd3d73b-a8b8-4477-a245-c3b02228c5ac", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:40:14 compute-0 nova_compute[187185]: 2025-11-29 07:40:14.560 187189 DEBUG nova.network.os_vif_util [None req-8fc26cef-7153-4c67-867a-a55b189fcf4f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:68:77:27,bridge_name='br-int',has_traffic_filtering=True,id=0fd3d73b-a8b8-4477-a245-c3b02228c5ac,network=Network(8ac0e70c-84ba-415c-841b-4a5a525b1a9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0fd3d73b-a8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:40:14 compute-0 nova_compute[187185]: 2025-11-29 07:40:14.560 187189 DEBUG os_vif [None req-8fc26cef-7153-4c67-867a-a55b189fcf4f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:68:77:27,bridge_name='br-int',has_traffic_filtering=True,id=0fd3d73b-a8b8-4477-a245-c3b02228c5ac,network=Network(8ac0e70c-84ba-415c-841b-4a5a525b1a9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0fd3d73b-a8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:40:14 compute-0 nova_compute[187185]: 2025-11-29 07:40:14.563 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:14 compute-0 nova_compute[187185]: 2025-11-29 07:40:14.563 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0fd3d73b-a8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:40:14 compute-0 nova_compute[187185]: 2025-11-29 07:40:14.565 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:14 compute-0 nova_compute[187185]: 2025-11-29 07:40:14.571 187189 DEBUG nova.compute.manager [req-fe4c30ef-e5d5-4efe-a4c0-6e4f904a0366 req-9833c470-8eb3-4e22-b4cb-8db78e48a745 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Received event network-vif-unplugged-0fd3d73b-a8b8-4477-a245-c3b02228c5ac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:40:14 compute-0 nova_compute[187185]: 2025-11-29 07:40:14.572 187189 DEBUG oslo_concurrency.lockutils [req-fe4c30ef-e5d5-4efe-a4c0-6e4f904a0366 req-9833c470-8eb3-4e22-b4cb-8db78e48a745 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "fab512a0-a8b3-423a-bcf4-58a43bc605e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:40:14 compute-0 nova_compute[187185]: 2025-11-29 07:40:14.572 187189 DEBUG oslo_concurrency.lockutils [req-fe4c30ef-e5d5-4efe-a4c0-6e4f904a0366 req-9833c470-8eb3-4e22-b4cb-8db78e48a745 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "fab512a0-a8b3-423a-bcf4-58a43bc605e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:40:14 compute-0 nova_compute[187185]: 2025-11-29 07:40:14.573 187189 DEBUG oslo_concurrency.lockutils [req-fe4c30ef-e5d5-4efe-a4c0-6e4f904a0366 req-9833c470-8eb3-4e22-b4cb-8db78e48a745 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "fab512a0-a8b3-423a-bcf4-58a43bc605e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:40:14 compute-0 nova_compute[187185]: 2025-11-29 07:40:14.573 187189 DEBUG nova.compute.manager [req-fe4c30ef-e5d5-4efe-a4c0-6e4f904a0366 req-9833c470-8eb3-4e22-b4cb-8db78e48a745 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] No waiting events found dispatching network-vif-unplugged-0fd3d73b-a8b8-4477-a245-c3b02228c5ac pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:40:14 compute-0 nova_compute[187185]: 2025-11-29 07:40:14.574 187189 DEBUG nova.compute.manager [req-fe4c30ef-e5d5-4efe-a4c0-6e4f904a0366 req-9833c470-8eb3-4e22-b4cb-8db78e48a745 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Received event network-vif-unplugged-0fd3d73b-a8b8-4477-a245-c3b02228c5ac for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 07:40:14 compute-0 nova_compute[187185]: 2025-11-29 07:40:14.574 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:14 compute-0 nova_compute[187185]: 2025-11-29 07:40:14.578 187189 INFO os_vif [None req-8fc26cef-7153-4c67-867a-a55b189fcf4f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:68:77:27,bridge_name='br-int',has_traffic_filtering=True,id=0fd3d73b-a8b8-4477-a245-c3b02228c5ac,network=Network(8ac0e70c-84ba-415c-841b-4a5a525b1a9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0fd3d73b-a8')
Nov 29 07:40:14 compute-0 nova_compute[187185]: 2025-11-29 07:40:14.580 187189 INFO nova.virt.libvirt.driver [None req-8fc26cef-7153-4c67-867a-a55b189fcf4f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Deleting instance files /var/lib/nova/instances/fab512a0-a8b3-423a-bcf4-58a43bc605e5_del
Nov 29 07:40:14 compute-0 nova_compute[187185]: 2025-11-29 07:40:14.581 187189 INFO nova.virt.libvirt.driver [None req-8fc26cef-7153-4c67-867a-a55b189fcf4f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Deletion of /var/lib/nova/instances/fab512a0-a8b3-423a-bcf4-58a43bc605e5_del complete
Nov 29 07:40:14 compute-0 nova_compute[187185]: 2025-11-29 07:40:14.704 187189 INFO nova.compute.manager [None req-8fc26cef-7153-4c67-867a-a55b189fcf4f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Took 0.45 seconds to destroy the instance on the hypervisor.
Nov 29 07:40:14 compute-0 nova_compute[187185]: 2025-11-29 07:40:14.705 187189 DEBUG oslo.service.loopingcall [None req-8fc26cef-7153-4c67-867a-a55b189fcf4f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 07:40:14 compute-0 nova_compute[187185]: 2025-11-29 07:40:14.706 187189 DEBUG nova.compute.manager [-] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 07:40:14 compute-0 nova_compute[187185]: 2025-11-29 07:40:14.707 187189 DEBUG nova.network.neutron [-] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 07:40:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:40:14.787 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=59, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=58) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:40:14 compute-0 nova_compute[187185]: 2025-11-29 07:40:14.787 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:40:14.789 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:40:15 compute-0 nova_compute[187185]: 2025-11-29 07:40:15.802 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:16 compute-0 nova_compute[187185]: 2025-11-29 07:40:16.357 187189 DEBUG nova.network.neutron [-] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:40:16 compute-0 nova_compute[187185]: 2025-11-29 07:40:16.560 187189 INFO nova.compute.manager [-] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Took 1.85 seconds to deallocate network for instance.
Nov 29 07:40:16 compute-0 nova_compute[187185]: 2025-11-29 07:40:16.725 187189 DEBUG nova.compute.manager [req-b5d4c106-bd8d-438e-8471-0c5a396586ed req-e4d5cab3-deb8-4ad5-a37b-0d64ee804cbe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Received event network-vif-plugged-0fd3d73b-a8b8-4477-a245-c3b02228c5ac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:40:16 compute-0 nova_compute[187185]: 2025-11-29 07:40:16.726 187189 DEBUG oslo_concurrency.lockutils [req-b5d4c106-bd8d-438e-8471-0c5a396586ed req-e4d5cab3-deb8-4ad5-a37b-0d64ee804cbe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "fab512a0-a8b3-423a-bcf4-58a43bc605e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:40:16 compute-0 nova_compute[187185]: 2025-11-29 07:40:16.727 187189 DEBUG oslo_concurrency.lockutils [req-b5d4c106-bd8d-438e-8471-0c5a396586ed req-e4d5cab3-deb8-4ad5-a37b-0d64ee804cbe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "fab512a0-a8b3-423a-bcf4-58a43bc605e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:40:16 compute-0 nova_compute[187185]: 2025-11-29 07:40:16.727 187189 DEBUG oslo_concurrency.lockutils [req-b5d4c106-bd8d-438e-8471-0c5a396586ed req-e4d5cab3-deb8-4ad5-a37b-0d64ee804cbe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "fab512a0-a8b3-423a-bcf4-58a43bc605e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:40:16 compute-0 nova_compute[187185]: 2025-11-29 07:40:16.728 187189 DEBUG nova.compute.manager [req-b5d4c106-bd8d-438e-8471-0c5a396586ed req-e4d5cab3-deb8-4ad5-a37b-0d64ee804cbe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] No waiting events found dispatching network-vif-plugged-0fd3d73b-a8b8-4477-a245-c3b02228c5ac pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:40:16 compute-0 nova_compute[187185]: 2025-11-29 07:40:16.728 187189 WARNING nova.compute.manager [req-b5d4c106-bd8d-438e-8471-0c5a396586ed req-e4d5cab3-deb8-4ad5-a37b-0d64ee804cbe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Received unexpected event network-vif-plugged-0fd3d73b-a8b8-4477-a245-c3b02228c5ac for instance with vm_state active and task_state deleting.
Nov 29 07:40:16 compute-0 nova_compute[187185]: 2025-11-29 07:40:16.728 187189 DEBUG nova.compute.manager [req-b5d4c106-bd8d-438e-8471-0c5a396586ed req-e4d5cab3-deb8-4ad5-a37b-0d64ee804cbe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Received event network-vif-deleted-0fd3d73b-a8b8-4477-a245-c3b02228c5ac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:40:16 compute-0 podman[242669]: 2025-11-29 07:40:16.828024805 +0000 UTC m=+0.074261157 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 07:40:16 compute-0 podman[242667]: 2025-11-29 07:40:16.836813764 +0000 UTC m=+0.084019794 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 07:40:16 compute-0 podman[242668]: 2025-11-29 07:40:16.849256467 +0000 UTC m=+0.092985488 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-type=git, config_id=edpm, architecture=x86_64, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350)
Nov 29 07:40:16 compute-0 nova_compute[187185]: 2025-11-29 07:40:16.930 187189 DEBUG oslo_concurrency.lockutils [None req-8fc26cef-7153-4c67-867a-a55b189fcf4f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:40:16 compute-0 nova_compute[187185]: 2025-11-29 07:40:16.931 187189 DEBUG oslo_concurrency.lockutils [None req-8fc26cef-7153-4c67-867a-a55b189fcf4f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:40:16 compute-0 nova_compute[187185]: 2025-11-29 07:40:16.982 187189 DEBUG nova.scheduler.client.report [None req-8fc26cef-7153-4c67-867a-a55b189fcf4f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Refreshing inventories for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 29 07:40:17 compute-0 nova_compute[187185]: 2025-11-29 07:40:17.002 187189 DEBUG nova.scheduler.client.report [None req-8fc26cef-7153-4c67-867a-a55b189fcf4f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Updating ProviderTree inventory for provider 4e39a026-df39-4e20-874a-dbb5a40df044 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 29 07:40:17 compute-0 nova_compute[187185]: 2025-11-29 07:40:17.003 187189 DEBUG nova.compute.provider_tree [None req-8fc26cef-7153-4c67-867a-a55b189fcf4f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Updating inventory in ProviderTree for provider 4e39a026-df39-4e20-874a-dbb5a40df044 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 29 07:40:17 compute-0 nova_compute[187185]: 2025-11-29 07:40:17.023 187189 DEBUG nova.scheduler.client.report [None req-8fc26cef-7153-4c67-867a-a55b189fcf4f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Refreshing aggregate associations for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 29 07:40:17 compute-0 nova_compute[187185]: 2025-11-29 07:40:17.048 187189 DEBUG nova.scheduler.client.report [None req-8fc26cef-7153-4c67-867a-a55b189fcf4f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Refreshing trait associations for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044, traits: HW_CPU_X86_SSE,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_TRUSTED_CERTS,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 29 07:40:17 compute-0 nova_compute[187185]: 2025-11-29 07:40:17.107 187189 DEBUG nova.compute.provider_tree [None req-8fc26cef-7153-4c67-867a-a55b189fcf4f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:40:17 compute-0 nova_compute[187185]: 2025-11-29 07:40:17.176 187189 DEBUG nova.scheduler.client.report [None req-8fc26cef-7153-4c67-867a-a55b189fcf4f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:40:17 compute-0 nova_compute[187185]: 2025-11-29 07:40:17.231 187189 DEBUG oslo_concurrency.lockutils [None req-8fc26cef-7153-4c67-867a-a55b189fcf4f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.300s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:40:17 compute-0 nova_compute[187185]: 2025-11-29 07:40:17.270 187189 INFO nova.scheduler.client.report [None req-8fc26cef-7153-4c67-867a-a55b189fcf4f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Deleted allocations for instance fab512a0-a8b3-423a-bcf4-58a43bc605e5
Nov 29 07:40:17 compute-0 nova_compute[187185]: 2025-11-29 07:40:17.410 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:17 compute-0 nova_compute[187185]: 2025-11-29 07:40:17.709 187189 DEBUG oslo_concurrency.lockutils [None req-8fc26cef-7153-4c67-867a-a55b189fcf4f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "fab512a0-a8b3-423a-bcf4-58a43bc605e5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.479s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:40:19 compute-0 nova_compute[187185]: 2025-11-29 07:40:19.565 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:19 compute-0 nova_compute[187185]: 2025-11-29 07:40:19.978 187189 DEBUG oslo_concurrency.lockutils [None req-15b83b91-b2ed-4e0c-ad78-5c3cf4170ee7 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "f9a714d0-1a3d-4de4-8fc2-fca74c904eff" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:40:19 compute-0 nova_compute[187185]: 2025-11-29 07:40:19.978 187189 DEBUG oslo_concurrency.lockutils [None req-15b83b91-b2ed-4e0c-ad78-5c3cf4170ee7 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "f9a714d0-1a3d-4de4-8fc2-fca74c904eff" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:40:19 compute-0 nova_compute[187185]: 2025-11-29 07:40:19.979 187189 DEBUG oslo_concurrency.lockutils [None req-15b83b91-b2ed-4e0c-ad78-5c3cf4170ee7 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "f9a714d0-1a3d-4de4-8fc2-fca74c904eff-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:40:19 compute-0 nova_compute[187185]: 2025-11-29 07:40:19.979 187189 DEBUG oslo_concurrency.lockutils [None req-15b83b91-b2ed-4e0c-ad78-5c3cf4170ee7 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "f9a714d0-1a3d-4de4-8fc2-fca74c904eff-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:40:19 compute-0 nova_compute[187185]: 2025-11-29 07:40:19.979 187189 DEBUG oslo_concurrency.lockutils [None req-15b83b91-b2ed-4e0c-ad78-5c3cf4170ee7 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "f9a714d0-1a3d-4de4-8fc2-fca74c904eff-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:40:20 compute-0 nova_compute[187185]: 2025-11-29 07:40:20.006 187189 INFO nova.compute.manager [None req-15b83b91-b2ed-4e0c-ad78-5c3cf4170ee7 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Terminating instance
Nov 29 07:40:20 compute-0 nova_compute[187185]: 2025-11-29 07:40:20.021 187189 DEBUG nova.compute.manager [None req-15b83b91-b2ed-4e0c-ad78-5c3cf4170ee7 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 07:40:20 compute-0 kernel: tapc65a8c36-59 (unregistering): left promiscuous mode
Nov 29 07:40:20 compute-0 NetworkManager[55227]: <info>  [1764402020.0492] device (tapc65a8c36-59): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:40:20 compute-0 ovn_controller[95281]: 2025-11-29T07:40:20Z|00532|binding|INFO|Releasing lport c65a8c36-5997-4e67-9fa0-e361b7c334ca from this chassis (sb_readonly=0)
Nov 29 07:40:20 compute-0 ovn_controller[95281]: 2025-11-29T07:40:20Z|00533|binding|INFO|Setting lport c65a8c36-5997-4e67-9fa0-e361b7c334ca down in Southbound
Nov 29 07:40:20 compute-0 nova_compute[187185]: 2025-11-29 07:40:20.057 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:20 compute-0 ovn_controller[95281]: 2025-11-29T07:40:20Z|00534|binding|INFO|Removing iface tapc65a8c36-59 ovn-installed in OVS
Nov 29 07:40:20 compute-0 nova_compute[187185]: 2025-11-29 07:40:20.060 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:20 compute-0 ovn_controller[95281]: 2025-11-29T07:40:20Z|00535|binding|INFO|Releasing lport 17e71a59-7bb8-4f35-826d-2efc90d0ca9c from this chassis (sb_readonly=0)
Nov 29 07:40:20 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:40:20.086 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:95:82 10.100.0.11'], port_security=['fa:16:3e:5b:95:82 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'f9a714d0-1a3d-4de4-8fc2-fca74c904eff', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8ac0e70c-84ba-415c-841b-4a5a525b1a9d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ec8b80be17a14d1caf666636283749d0', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ebfe9011-71c9-4c89-a651-d4470dffd8d6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a06529de-8ea4-4c02-8447-f35b6f567d2c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=c65a8c36-5997-4e67-9fa0-e361b7c334ca) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:40:20 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:40:20.087 104254 INFO neutron.agent.ovn.metadata.agent [-] Port c65a8c36-5997-4e67-9fa0-e361b7c334ca in datapath 8ac0e70c-84ba-415c-841b-4a5a525b1a9d unbound from our chassis
Nov 29 07:40:20 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:40:20.089 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8ac0e70c-84ba-415c-841b-4a5a525b1a9d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:40:20 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:40:20.089 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[15ab7b28-77df-4c9d-a779-629ca3b44500]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:40:20 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:40:20.090 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8ac0e70c-84ba-415c-841b-4a5a525b1a9d namespace which is not needed anymore
Nov 29 07:40:20 compute-0 nova_compute[187185]: 2025-11-29 07:40:20.093 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:20 compute-0 nova_compute[187185]: 2025-11-29 07:40:20.095 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:20 compute-0 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d0000009b.scope: Deactivated successfully.
Nov 29 07:40:20 compute-0 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d0000009b.scope: Consumed 17.307s CPU time.
Nov 29 07:40:20 compute-0 systemd-machined[153486]: Machine qemu-60-instance-0000009b terminated.
Nov 29 07:40:20 compute-0 ovn_controller[95281]: 2025-11-29T07:40:20Z|00536|binding|INFO|Releasing lport 17e71a59-7bb8-4f35-826d-2efc90d0ca9c from this chassis (sb_readonly=0)
Nov 29 07:40:20 compute-0 nova_compute[187185]: 2025-11-29 07:40:20.335 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:20 compute-0 neutron-haproxy-ovnmeta-8ac0e70c-84ba-415c-841b-4a5a525b1a9d[241672]: [NOTICE]   (241677) : haproxy version is 2.8.14-c23fe91
Nov 29 07:40:20 compute-0 neutron-haproxy-ovnmeta-8ac0e70c-84ba-415c-841b-4a5a525b1a9d[241672]: [NOTICE]   (241677) : path to executable is /usr/sbin/haproxy
Nov 29 07:40:20 compute-0 neutron-haproxy-ovnmeta-8ac0e70c-84ba-415c-841b-4a5a525b1a9d[241672]: [WARNING]  (241677) : Exiting Master process...
Nov 29 07:40:20 compute-0 neutron-haproxy-ovnmeta-8ac0e70c-84ba-415c-841b-4a5a525b1a9d[241672]: [WARNING]  (241677) : Exiting Master process...
Nov 29 07:40:20 compute-0 neutron-haproxy-ovnmeta-8ac0e70c-84ba-415c-841b-4a5a525b1a9d[241672]: [ALERT]    (241677) : Current worker (241679) exited with code 143 (Terminated)
Nov 29 07:40:20 compute-0 neutron-haproxy-ovnmeta-8ac0e70c-84ba-415c-841b-4a5a525b1a9d[241672]: [WARNING]  (241677) : All workers exited. Exiting... (0)
Nov 29 07:40:20 compute-0 systemd[1]: libpod-79e8a40b1d8a63164f793d3c266a1b12a0665348af437d223acced0c91ad08e0.scope: Deactivated successfully.
Nov 29 07:40:20 compute-0 podman[242758]: 2025-11-29 07:40:20.376461949 +0000 UTC m=+0.148656097 container died 79e8a40b1d8a63164f793d3c266a1b12a0665348af437d223acced0c91ad08e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8ac0e70c-84ba-415c-841b-4a5a525b1a9d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 29 07:40:20 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-79e8a40b1d8a63164f793d3c266a1b12a0665348af437d223acced0c91ad08e0-userdata-shm.mount: Deactivated successfully.
Nov 29 07:40:20 compute-0 systemd[1]: var-lib-containers-storage-overlay-4215897df8d170872bdb6a6c4424230df1f7d017c06f1dfad42a35d7f26d2d9f-merged.mount: Deactivated successfully.
Nov 29 07:40:20 compute-0 podman[242758]: 2025-11-29 07:40:20.4251593 +0000 UTC m=+0.197353428 container cleanup 79e8a40b1d8a63164f793d3c266a1b12a0665348af437d223acced0c91ad08e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8ac0e70c-84ba-415c-841b-4a5a525b1a9d, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 07:40:20 compute-0 systemd[1]: libpod-conmon-79e8a40b1d8a63164f793d3c266a1b12a0665348af437d223acced0c91ad08e0.scope: Deactivated successfully.
Nov 29 07:40:20 compute-0 nova_compute[187185]: 2025-11-29 07:40:20.451 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:20 compute-0 nova_compute[187185]: 2025-11-29 07:40:20.457 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:20 compute-0 nova_compute[187185]: 2025-11-29 07:40:20.497 187189 INFO nova.virt.libvirt.driver [-] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Instance destroyed successfully.
Nov 29 07:40:20 compute-0 nova_compute[187185]: 2025-11-29 07:40:20.498 187189 DEBUG nova.objects.instance [None req-15b83b91-b2ed-4e0c-ad78-5c3cf4170ee7 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'resources' on Instance uuid f9a714d0-1a3d-4de4-8fc2-fca74c904eff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:40:20 compute-0 nova_compute[187185]: 2025-11-29 07:40:20.515 187189 DEBUG nova.virt.libvirt.vif [None req-15b83b91-b2ed-4e0c-ad78-5c3cf4170ee7 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:38:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-501093689',display_name='tempest-TestNetworkBasicOps-server-501093689',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-501093689',id=155,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ5Ztzxn5CQqZYwb2ZkKXPHZ0ufdHUQG1uY8bsPpbcdD+g/M62uTuwYkEj7cRasbqzaR/Kz4+EyR6ADmDGzV1OyeOoChkIqHM+KkJJEQWYFaNk70r6jDVnPuHLA056MeoA==',key_name='tempest-TestNetworkBasicOps-1295521734',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:38:41Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-rdm7tpik',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:38:41Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=f9a714d0-1a3d-4de4-8fc2-fca74c904eff,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c65a8c36-5997-4e67-9fa0-e361b7c334ca", "address": "fa:16:3e:5b:95:82", "network": {"id": "8ac0e70c-84ba-415c-841b-4a5a525b1a9d", "bridge": "br-int", "label": "tempest-network-smoke--597026508", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc65a8c36-59", "ovs_interfaceid": "c65a8c36-5997-4e67-9fa0-e361b7c334ca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:40:20 compute-0 nova_compute[187185]: 2025-11-29 07:40:20.515 187189 DEBUG nova.network.os_vif_util [None req-15b83b91-b2ed-4e0c-ad78-5c3cf4170ee7 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "c65a8c36-5997-4e67-9fa0-e361b7c334ca", "address": "fa:16:3e:5b:95:82", "network": {"id": "8ac0e70c-84ba-415c-841b-4a5a525b1a9d", "bridge": "br-int", "label": "tempest-network-smoke--597026508", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc65a8c36-59", "ovs_interfaceid": "c65a8c36-5997-4e67-9fa0-e361b7c334ca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:40:20 compute-0 nova_compute[187185]: 2025-11-29 07:40:20.516 187189 DEBUG nova.network.os_vif_util [None req-15b83b91-b2ed-4e0c-ad78-5c3cf4170ee7 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5b:95:82,bridge_name='br-int',has_traffic_filtering=True,id=c65a8c36-5997-4e67-9fa0-e361b7c334ca,network=Network(8ac0e70c-84ba-415c-841b-4a5a525b1a9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc65a8c36-59') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:40:20 compute-0 nova_compute[187185]: 2025-11-29 07:40:20.516 187189 DEBUG os_vif [None req-15b83b91-b2ed-4e0c-ad78-5c3cf4170ee7 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5b:95:82,bridge_name='br-int',has_traffic_filtering=True,id=c65a8c36-5997-4e67-9fa0-e361b7c334ca,network=Network(8ac0e70c-84ba-415c-841b-4a5a525b1a9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc65a8c36-59') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:40:20 compute-0 nova_compute[187185]: 2025-11-29 07:40:20.517 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:20 compute-0 nova_compute[187185]: 2025-11-29 07:40:20.518 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc65a8c36-59, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:40:20 compute-0 podman[242786]: 2025-11-29 07:40:20.518653422 +0000 UTC m=+0.061682660 container remove 79e8a40b1d8a63164f793d3c266a1b12a0665348af437d223acced0c91ad08e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8ac0e70c-84ba-415c-841b-4a5a525b1a9d, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 07:40:20 compute-0 nova_compute[187185]: 2025-11-29 07:40:20.520 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:20 compute-0 nova_compute[187185]: 2025-11-29 07:40:20.522 187189 INFO os_vif [None req-15b83b91-b2ed-4e0c-ad78-5c3cf4170ee7 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5b:95:82,bridge_name='br-int',has_traffic_filtering=True,id=c65a8c36-5997-4e67-9fa0-e361b7c334ca,network=Network(8ac0e70c-84ba-415c-841b-4a5a525b1a9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc65a8c36-59')
Nov 29 07:40:20 compute-0 nova_compute[187185]: 2025-11-29 07:40:20.522 187189 INFO nova.virt.libvirt.driver [None req-15b83b91-b2ed-4e0c-ad78-5c3cf4170ee7 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Deleting instance files /var/lib/nova/instances/f9a714d0-1a3d-4de4-8fc2-fca74c904eff_del
Nov 29 07:40:20 compute-0 nova_compute[187185]: 2025-11-29 07:40:20.523 187189 INFO nova.virt.libvirt.driver [None req-15b83b91-b2ed-4e0c-ad78-5c3cf4170ee7 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Deletion of /var/lib/nova/instances/f9a714d0-1a3d-4de4-8fc2-fca74c904eff_del complete
Nov 29 07:40:20 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:40:20.526 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[65845bb2-8520-4e97-8fbc-37866fd7d9d4]: (4, ('Sat Nov 29 07:40:20 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-8ac0e70c-84ba-415c-841b-4a5a525b1a9d (79e8a40b1d8a63164f793d3c266a1b12a0665348af437d223acced0c91ad08e0)\n79e8a40b1d8a63164f793d3c266a1b12a0665348af437d223acced0c91ad08e0\nSat Nov 29 07:40:20 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-8ac0e70c-84ba-415c-841b-4a5a525b1a9d (79e8a40b1d8a63164f793d3c266a1b12a0665348af437d223acced0c91ad08e0)\n79e8a40b1d8a63164f793d3c266a1b12a0665348af437d223acced0c91ad08e0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:40:20 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:40:20.528 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[1f5eac25-f04a-4961-b398-0ed49f566e22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:40:20 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:40:20.529 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8ac0e70c-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:40:20 compute-0 nova_compute[187185]: 2025-11-29 07:40:20.530 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:20 compute-0 kernel: tap8ac0e70c-80: left promiscuous mode
Nov 29 07:40:20 compute-0 nova_compute[187185]: 2025-11-29 07:40:20.542 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:20 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:40:20.546 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[80aafca8-09c2-403d-b6da-cc8e0aa53f92]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:40:20 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:40:20.563 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[1fbfc247-0551-4545-8f6a-c4ca21b434ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:40:20 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:40:20.564 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[5d082889-c595-4d69-8177-8e8b8b282c77]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:40:20 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:40:20.589 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d00f8b81-b11c-45ac-8aa6-b1eff41009fe]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 736845, 'reachable_time': 26146, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242814, 'error': None, 'target': 'ovnmeta-8ac0e70c-84ba-415c-841b-4a5a525b1a9d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:40:20 compute-0 systemd[1]: run-netns-ovnmeta\x2d8ac0e70c\x2d84ba\x2d415c\x2d841b\x2d4a5a525b1a9d.mount: Deactivated successfully.
Nov 29 07:40:20 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:40:20.596 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8ac0e70c-84ba-415c-841b-4a5a525b1a9d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:40:20 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:40:20.597 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[9d0d39f8-168f-4439-ad1a-eedd482dd6fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:40:20 compute-0 nova_compute[187185]: 2025-11-29 07:40:20.640 187189 INFO nova.compute.manager [None req-15b83b91-b2ed-4e0c-ad78-5c3cf4170ee7 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Took 0.62 seconds to destroy the instance on the hypervisor.
Nov 29 07:40:20 compute-0 nova_compute[187185]: 2025-11-29 07:40:20.641 187189 DEBUG oslo.service.loopingcall [None req-15b83b91-b2ed-4e0c-ad78-5c3cf4170ee7 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 07:40:20 compute-0 nova_compute[187185]: 2025-11-29 07:40:20.642 187189 DEBUG nova.compute.manager [-] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 07:40:20 compute-0 nova_compute[187185]: 2025-11-29 07:40:20.642 187189 DEBUG nova.network.neutron [-] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 07:40:21 compute-0 nova_compute[187185]: 2025-11-29 07:40:21.103 187189 DEBUG nova.compute.manager [req-8cb31d79-65f9-4514-8ad4-6c5cf9e49d1a req-008e8829-d7eb-4cb9-a20f-cc0233d49fe0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Received event network-vif-unplugged-c65a8c36-5997-4e67-9fa0-e361b7c334ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:40:21 compute-0 nova_compute[187185]: 2025-11-29 07:40:21.104 187189 DEBUG oslo_concurrency.lockutils [req-8cb31d79-65f9-4514-8ad4-6c5cf9e49d1a req-008e8829-d7eb-4cb9-a20f-cc0233d49fe0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "f9a714d0-1a3d-4de4-8fc2-fca74c904eff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:40:21 compute-0 nova_compute[187185]: 2025-11-29 07:40:21.104 187189 DEBUG oslo_concurrency.lockutils [req-8cb31d79-65f9-4514-8ad4-6c5cf9e49d1a req-008e8829-d7eb-4cb9-a20f-cc0233d49fe0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f9a714d0-1a3d-4de4-8fc2-fca74c904eff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:40:21 compute-0 nova_compute[187185]: 2025-11-29 07:40:21.104 187189 DEBUG oslo_concurrency.lockutils [req-8cb31d79-65f9-4514-8ad4-6c5cf9e49d1a req-008e8829-d7eb-4cb9-a20f-cc0233d49fe0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f9a714d0-1a3d-4de4-8fc2-fca74c904eff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:40:21 compute-0 nova_compute[187185]: 2025-11-29 07:40:21.105 187189 DEBUG nova.compute.manager [req-8cb31d79-65f9-4514-8ad4-6c5cf9e49d1a req-008e8829-d7eb-4cb9-a20f-cc0233d49fe0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] No waiting events found dispatching network-vif-unplugged-c65a8c36-5997-4e67-9fa0-e361b7c334ca pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:40:21 compute-0 nova_compute[187185]: 2025-11-29 07:40:21.105 187189 DEBUG nova.compute.manager [req-8cb31d79-65f9-4514-8ad4-6c5cf9e49d1a req-008e8829-d7eb-4cb9-a20f-cc0233d49fe0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Received event network-vif-unplugged-c65a8c36-5997-4e67-9fa0-e361b7c334ca for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 07:40:21 compute-0 nova_compute[187185]: 2025-11-29 07:40:21.105 187189 DEBUG nova.compute.manager [req-8cb31d79-65f9-4514-8ad4-6c5cf9e49d1a req-008e8829-d7eb-4cb9-a20f-cc0233d49fe0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Received event network-vif-plugged-c65a8c36-5997-4e67-9fa0-e361b7c334ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:40:21 compute-0 nova_compute[187185]: 2025-11-29 07:40:21.106 187189 DEBUG oslo_concurrency.lockutils [req-8cb31d79-65f9-4514-8ad4-6c5cf9e49d1a req-008e8829-d7eb-4cb9-a20f-cc0233d49fe0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "f9a714d0-1a3d-4de4-8fc2-fca74c904eff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:40:21 compute-0 nova_compute[187185]: 2025-11-29 07:40:21.106 187189 DEBUG oslo_concurrency.lockutils [req-8cb31d79-65f9-4514-8ad4-6c5cf9e49d1a req-008e8829-d7eb-4cb9-a20f-cc0233d49fe0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f9a714d0-1a3d-4de4-8fc2-fca74c904eff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:40:21 compute-0 nova_compute[187185]: 2025-11-29 07:40:21.106 187189 DEBUG oslo_concurrency.lockutils [req-8cb31d79-65f9-4514-8ad4-6c5cf9e49d1a req-008e8829-d7eb-4cb9-a20f-cc0233d49fe0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f9a714d0-1a3d-4de4-8fc2-fca74c904eff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:40:21 compute-0 nova_compute[187185]: 2025-11-29 07:40:21.107 187189 DEBUG nova.compute.manager [req-8cb31d79-65f9-4514-8ad4-6c5cf9e49d1a req-008e8829-d7eb-4cb9-a20f-cc0233d49fe0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] No waiting events found dispatching network-vif-plugged-c65a8c36-5997-4e67-9fa0-e361b7c334ca pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:40:21 compute-0 nova_compute[187185]: 2025-11-29 07:40:21.107 187189 WARNING nova.compute.manager [req-8cb31d79-65f9-4514-8ad4-6c5cf9e49d1a req-008e8829-d7eb-4cb9-a20f-cc0233d49fe0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Received unexpected event network-vif-plugged-c65a8c36-5997-4e67-9fa0-e361b7c334ca for instance with vm_state active and task_state deleting.
Nov 29 07:40:22 compute-0 nova_compute[187185]: 2025-11-29 07:40:22.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:40:22 compute-0 nova_compute[187185]: 2025-11-29 07:40:22.413 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:22 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:40:22.792 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '59'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:40:23 compute-0 nova_compute[187185]: 2025-11-29 07:40:23.225 187189 DEBUG nova.network.neutron [-] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:40:23 compute-0 nova_compute[187185]: 2025-11-29 07:40:23.269 187189 INFO nova.compute.manager [-] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Took 2.63 seconds to deallocate network for instance.
Nov 29 07:40:23 compute-0 nova_compute[187185]: 2025-11-29 07:40:23.367 187189 DEBUG nova.compute.manager [req-804a86c4-f651-412e-980d-36cfaf2a79c8 req-e1e61791-912d-4143-9cee-aa3bb195f14e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Received event network-vif-deleted-c65a8c36-5997-4e67-9fa0-e361b7c334ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:40:23 compute-0 nova_compute[187185]: 2025-11-29 07:40:23.600 187189 DEBUG oslo_concurrency.lockutils [None req-15b83b91-b2ed-4e0c-ad78-5c3cf4170ee7 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:40:23 compute-0 nova_compute[187185]: 2025-11-29 07:40:23.601 187189 DEBUG oslo_concurrency.lockutils [None req-15b83b91-b2ed-4e0c-ad78-5c3cf4170ee7 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:40:23 compute-0 nova_compute[187185]: 2025-11-29 07:40:23.680 187189 DEBUG nova.compute.provider_tree [None req-15b83b91-b2ed-4e0c-ad78-5c3cf4170ee7 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:40:23 compute-0 nova_compute[187185]: 2025-11-29 07:40:23.705 187189 DEBUG nova.scheduler.client.report [None req-15b83b91-b2ed-4e0c-ad78-5c3cf4170ee7 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:40:23 compute-0 nova_compute[187185]: 2025-11-29 07:40:23.771 187189 DEBUG oslo_concurrency.lockutils [None req-15b83b91-b2ed-4e0c-ad78-5c3cf4170ee7 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.170s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:40:23 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:40:23.780 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:f6:2b 10.100.0.2 2001:db8::f816:3eff:feac:f62b'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:feac:f62b/64', 'neutron:device_id': 'ovnmeta-600edac6-24aa-414f-b977-07c2890470f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-600edac6-24aa-414f-b977-07c2890470f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=de1096f6-2a15-4f04-9ea7-22d2dff24e74, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=2459b7bb-f6d0-4520-a009-14c9d4a2b794) old=Port_Binding(mac=['fa:16:3e:ac:f6:2b 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-600edac6-24aa-414f-b977-07c2890470f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-600edac6-24aa-414f-b977-07c2890470f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:40:23 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:40:23.782 104254 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 2459b7bb-f6d0-4520-a009-14c9d4a2b794 in datapath 600edac6-24aa-414f-b977-07c2890470f1 updated
Nov 29 07:40:23 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:40:23.784 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 600edac6-24aa-414f-b977-07c2890470f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:40:23 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:40:23.786 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[39b3b7db-9000-43b8-800d-315e9232efd4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:40:23 compute-0 nova_compute[187185]: 2025-11-29 07:40:23.909 187189 INFO nova.scheduler.client.report [None req-15b83b91-b2ed-4e0c-ad78-5c3cf4170ee7 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Deleted allocations for instance f9a714d0-1a3d-4de4-8fc2-fca74c904eff
Nov 29 07:40:24 compute-0 nova_compute[187185]: 2025-11-29 07:40:24.279 187189 DEBUG oslo_concurrency.lockutils [None req-15b83b91-b2ed-4e0c-ad78-5c3cf4170ee7 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "f9a714d0-1a3d-4de4-8fc2-fca74c904eff" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.301s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:40:25 compute-0 nova_compute[187185]: 2025-11-29 07:40:25.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:40:25 compute-0 nova_compute[187185]: 2025-11-29 07:40:25.345 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:40:25 compute-0 nova_compute[187185]: 2025-11-29 07:40:25.346 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:40:25 compute-0 nova_compute[187185]: 2025-11-29 07:40:25.346 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:40:25 compute-0 nova_compute[187185]: 2025-11-29 07:40:25.346 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:40:25 compute-0 nova_compute[187185]: 2025-11-29 07:40:25.521 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:40:25.530 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:40:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:40:25.530 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:40:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:40:25.531 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:40:25 compute-0 podman[242818]: 2025-11-29 07:40:25.544860041 +0000 UTC m=+0.136077932 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 29 07:40:25 compute-0 nova_compute[187185]: 2025-11-29 07:40:25.567 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:40:25 compute-0 nova_compute[187185]: 2025-11-29 07:40:25.568 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5728MB free_disk=73.25391387939453GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:40:25 compute-0 nova_compute[187185]: 2025-11-29 07:40:25.568 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:40:25 compute-0 nova_compute[187185]: 2025-11-29 07:40:25.569 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:40:25 compute-0 nova_compute[187185]: 2025-11-29 07:40:25.626 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:40:25 compute-0 nova_compute[187185]: 2025-11-29 07:40:25.627 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:40:25 compute-0 nova_compute[187185]: 2025-11-29 07:40:25.653 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:40:25 compute-0 nova_compute[187185]: 2025-11-29 07:40:25.669 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:40:25 compute-0 nova_compute[187185]: 2025-11-29 07:40:25.690 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:40:25 compute-0 nova_compute[187185]: 2025-11-29 07:40:25.691 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:40:26 compute-0 sshd-session[242666]: error: kex_exchange_identification: read: Connection timed out
Nov 29 07:40:26 compute-0 sshd-session[242666]: banner exchange: Connection from 115.190.187.93 port 56808: Connection timed out
Nov 29 07:40:26 compute-0 sshd-session[242815]: Received disconnect from 20.255.62.58 port 55120:11: Bye Bye [preauth]
Nov 29 07:40:26 compute-0 sshd-session[242815]: Disconnected from authenticating user root 20.255.62.58 port 55120 [preauth]
Nov 29 07:40:26 compute-0 nova_compute[187185]: 2025-11-29 07:40:26.692 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:40:26 compute-0 nova_compute[187185]: 2025-11-29 07:40:26.693 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:40:26 compute-0 nova_compute[187185]: 2025-11-29 07:40:26.693 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:40:26 compute-0 nova_compute[187185]: 2025-11-29 07:40:26.714 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 07:40:27 compute-0 nova_compute[187185]: 2025-11-29 07:40:27.415 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:28 compute-0 nova_compute[187185]: 2025-11-29 07:40:28.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:40:29 compute-0 nova_compute[187185]: 2025-11-29 07:40:29.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:40:29 compute-0 nova_compute[187185]: 2025-11-29 07:40:29.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:40:29 compute-0 nova_compute[187185]: 2025-11-29 07:40:29.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:40:29 compute-0 nova_compute[187185]: 2025-11-29 07:40:29.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:40:29 compute-0 nova_compute[187185]: 2025-11-29 07:40:29.531 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764402014.5290368, fab512a0-a8b3-423a-bcf4-58a43bc605e5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:40:29 compute-0 nova_compute[187185]: 2025-11-29 07:40:29.531 187189 INFO nova.compute.manager [-] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] VM Stopped (Lifecycle Event)
Nov 29 07:40:29 compute-0 nova_compute[187185]: 2025-11-29 07:40:29.579 187189 DEBUG nova.compute.manager [None req-56798e80-4052-4635-9a06-a6a4649ab4cf - - - - - -] [instance: fab512a0-a8b3-423a-bcf4-58a43bc605e5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:40:30 compute-0 nova_compute[187185]: 2025-11-29 07:40:30.524 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:32 compute-0 nova_compute[187185]: 2025-11-29 07:40:32.417 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:32 compute-0 podman[242844]: 2025-11-29 07:40:32.835528904 +0000 UTC m=+0.086264127 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 07:40:33 compute-0 nova_compute[187185]: 2025-11-29 07:40:33.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:40:35 compute-0 nova_compute[187185]: 2025-11-29 07:40:35.495 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764402020.4939768, f9a714d0-1a3d-4de4-8fc2-fca74c904eff => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:40:35 compute-0 nova_compute[187185]: 2025-11-29 07:40:35.496 187189 INFO nova.compute.manager [-] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] VM Stopped (Lifecycle Event)
Nov 29 07:40:35 compute-0 nova_compute[187185]: 2025-11-29 07:40:35.517 187189 DEBUG nova.compute.manager [None req-7b5bc67d-9245-42ce-a606-7507a7cbc468 - - - - - -] [instance: f9a714d0-1a3d-4de4-8fc2-fca74c904eff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:40:35 compute-0 nova_compute[187185]: 2025-11-29 07:40:35.527 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:35 compute-0 podman[242868]: 2025-11-29 07:40:35.80509072 +0000 UTC m=+0.070009517 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, tcib_managed=true)
Nov 29 07:40:35 compute-0 podman[242869]: 2025-11-29 07:40:35.829473501 +0000 UTC m=+0.091512246 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 29 07:40:36 compute-0 nova_compute[187185]: 2025-11-29 07:40:36.311 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:40:37 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:40:37.179 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:f6:2b 10.100.0.2 2001:db8:0:1:f816:3eff:feac:f62b 2001:db8::f816:3eff:feac:f62b'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:feac:f62b/64 2001:db8::f816:3eff:feac:f62b/64', 'neutron:device_id': 'ovnmeta-600edac6-24aa-414f-b977-07c2890470f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-600edac6-24aa-414f-b977-07c2890470f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=de1096f6-2a15-4f04-9ea7-22d2dff24e74, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=2459b7bb-f6d0-4520-a009-14c9d4a2b794) old=Port_Binding(mac=['fa:16:3e:ac:f6:2b 10.100.0.2 2001:db8::f816:3eff:feac:f62b'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:feac:f62b/64', 'neutron:device_id': 'ovnmeta-600edac6-24aa-414f-b977-07c2890470f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-600edac6-24aa-414f-b977-07c2890470f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 
'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:40:37 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:40:37.181 104254 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 2459b7bb-f6d0-4520-a009-14c9d4a2b794 in datapath 600edac6-24aa-414f-b977-07c2890470f1 updated
Nov 29 07:40:37 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:40:37.183 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 600edac6-24aa-414f-b977-07c2890470f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:40:37 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:40:37.184 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[56f994db-9c0d-4f5a-81f5-fb63b37eb1cb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:40:37 compute-0 nova_compute[187185]: 2025-11-29 07:40:37.422 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:40 compute-0 nova_compute[187185]: 2025-11-29 07:40:40.533 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:42 compute-0 nova_compute[187185]: 2025-11-29 07:40:42.421 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:45 compute-0 nova_compute[187185]: 2025-11-29 07:40:45.539 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:47 compute-0 nova_compute[187185]: 2025-11-29 07:40:47.423 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:47 compute-0 podman[242908]: 2025-11-29 07:40:47.842639776 +0000 UTC m=+0.095548861 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 29 07:40:47 compute-0 podman[242910]: 2025-11-29 07:40:47.850287023 +0000 UTC m=+0.083870150 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 07:40:47 compute-0 podman[242909]: 2025-11-29 07:40:47.878282387 +0000 UTC m=+0.123750521 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, release=1755695350, build-date=2025-08-20T13:12:41, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, architecture=x86_64)
Nov 29 07:40:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:40:48.011 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:40:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:40:48.011 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:40:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:40:48.011 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:40:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:40:48.012 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:40:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:40:48.012 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:40:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:40:48.012 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:40:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:40:48.012 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:40:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:40:48.012 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:40:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:40:48.012 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:40:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:40:48.012 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:40:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:40:48.012 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:40:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:40:48.012 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:40:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:40:48.012 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:40:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:40:48.012 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:40:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:40:48.012 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:40:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:40:48.012 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:40:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:40:48.012 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:40:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:40:48.013 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:40:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:40:48.013 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:40:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:40:48.013 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:40:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:40:48.013 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:40:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:40:48.013 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:40:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:40:48.013 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:40:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:40:48.013 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:40:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:40:48.013 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:40:50 compute-0 nova_compute[187185]: 2025-11-29 07:40:50.543 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:52 compute-0 nova_compute[187185]: 2025-11-29 07:40:52.426 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:40:52.644 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=60, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=59) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:40:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:40:52.645 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:40:52 compute-0 nova_compute[187185]: 2025-11-29 07:40:52.689 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:55 compute-0 nova_compute[187185]: 2025-11-29 07:40:55.311 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:40:55 compute-0 nova_compute[187185]: 2025-11-29 07:40:55.546 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:40:55 compute-0 podman[242972]: 2025-11-29 07:40:55.884988222 +0000 UTC m=+0.143960255 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller)
Nov 29 07:40:57 compute-0 nova_compute[187185]: 2025-11-29 07:40:57.428 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:41:00 compute-0 nova_compute[187185]: 2025-11-29 07:41:00.549 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:41:02 compute-0 nova_compute[187185]: 2025-11-29 07:41:02.429 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:41:02 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:41:02.649 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '60'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:41:03 compute-0 podman[242999]: 2025-11-29 07:41:03.817559993 +0000 UTC m=+0.073587809 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 07:41:05 compute-0 nova_compute[187185]: 2025-11-29 07:41:05.555 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:41:06 compute-0 podman[243023]: 2025-11-29 07:41:06.81225742 +0000 UTC m=+0.065012325 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251125)
Nov 29 07:41:06 compute-0 podman[243022]: 2025-11-29 07:41:06.831053844 +0000 UTC m=+0.093588566 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 07:41:07 compute-0 nova_compute[187185]: 2025-11-29 07:41:07.431 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:41:10 compute-0 nova_compute[187185]: 2025-11-29 07:41:10.558 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:41:12 compute-0 nova_compute[187185]: 2025-11-29 07:41:12.433 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:41:15 compute-0 nova_compute[187185]: 2025-11-29 07:41:15.562 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:41:17 compute-0 nova_compute[187185]: 2025-11-29 07:41:17.435 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:41:17 compute-0 sshd-session[243062]: Invalid user dmdba from 190.181.27.27 port 51252
Nov 29 07:41:17 compute-0 podman[243066]: 2025-11-29 07:41:17.994072338 +0000 UTC m=+0.064436269 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 07:41:17 compute-0 podman[243065]: 2025-11-29 07:41:17.99416039 +0000 UTC m=+0.069104701 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, release=1755695350, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vcs-type=git, architecture=x86_64, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, version=9.6, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 29 07:41:18 compute-0 podman[243064]: 2025-11-29 07:41:18.012181891 +0000 UTC m=+0.094516241 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, 
container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 07:41:18 compute-0 sshd-session[243062]: Received disconnect from 190.181.27.27 port 51252:11: Bye Bye [preauth]
Nov 29 07:41:18 compute-0 sshd-session[243062]: Disconnected from invalid user dmdba 190.181.27.27 port 51252 [preauth]
Nov 29 07:41:20 compute-0 nova_compute[187185]: 2025-11-29 07:41:20.575 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:41:22 compute-0 nova_compute[187185]: 2025-11-29 07:41:22.438 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:41:23 compute-0 nova_compute[187185]: 2025-11-29 07:41:23.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:41:25 compute-0 nova_compute[187185]: 2025-11-29 07:41:25.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:41:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:41:25.532 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:41:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:41:25.533 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:41:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:41:25.534 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:41:25 compute-0 nova_compute[187185]: 2025-11-29 07:41:25.578 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:41:25 compute-0 nova_compute[187185]: 2025-11-29 07:41:25.884 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:41:25 compute-0 nova_compute[187185]: 2025-11-29 07:41:25.884 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:41:25 compute-0 nova_compute[187185]: 2025-11-29 07:41:25.885 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:41:25 compute-0 nova_compute[187185]: 2025-11-29 07:41:25.886 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:41:26 compute-0 nova_compute[187185]: 2025-11-29 07:41:26.508 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:41:26 compute-0 nova_compute[187185]: 2025-11-29 07:41:26.509 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5723MB free_disk=73.25392532348633GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:41:26 compute-0 nova_compute[187185]: 2025-11-29 07:41:26.509 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:41:26 compute-0 nova_compute[187185]: 2025-11-29 07:41:26.510 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:41:26 compute-0 podman[243128]: 2025-11-29 07:41:26.898421359 +0000 UTC m=+0.154538073 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 29 07:41:27 compute-0 nova_compute[187185]: 2025-11-29 07:41:27.087 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:41:27 compute-0 nova_compute[187185]: 2025-11-29 07:41:27.088 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:41:27 compute-0 nova_compute[187185]: 2025-11-29 07:41:27.116 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:41:27 compute-0 nova_compute[187185]: 2025-11-29 07:41:27.137 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:41:27 compute-0 nova_compute[187185]: 2025-11-29 07:41:27.140 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:41:27 compute-0 nova_compute[187185]: 2025-11-29 07:41:27.140 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.630s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:41:27 compute-0 nova_compute[187185]: 2025-11-29 07:41:27.439 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:41:30 compute-0 nova_compute[187185]: 2025-11-29 07:41:30.141 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:41:30 compute-0 nova_compute[187185]: 2025-11-29 07:41:30.142 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:41:30 compute-0 nova_compute[187185]: 2025-11-29 07:41:30.143 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:41:30 compute-0 nova_compute[187185]: 2025-11-29 07:41:30.181 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 07:41:30 compute-0 nova_compute[187185]: 2025-11-29 07:41:30.182 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:41:30 compute-0 nova_compute[187185]: 2025-11-29 07:41:30.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:41:30 compute-0 nova_compute[187185]: 2025-11-29 07:41:30.582 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:41:31 compute-0 nova_compute[187185]: 2025-11-29 07:41:31.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:41:31 compute-0 nova_compute[187185]: 2025-11-29 07:41:31.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:41:31 compute-0 nova_compute[187185]: 2025-11-29 07:41:31.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:41:32 compute-0 nova_compute[187185]: 2025-11-29 07:41:32.441 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:41:33 compute-0 nova_compute[187185]: 2025-11-29 07:41:33.318 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_shelved_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:41:34 compute-0 nova_compute[187185]: 2025-11-29 07:41:34.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:41:34 compute-0 podman[243155]: 2025-11-29 07:41:34.81480201 +0000 UTC m=+0.073994620 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 07:41:35 compute-0 nova_compute[187185]: 2025-11-29 07:41:35.586 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:41:36 compute-0 nova_compute[187185]: 2025-11-29 07:41:36.311 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:41:37 compute-0 nova_compute[187185]: 2025-11-29 07:41:37.444 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:41:37 compute-0 podman[243179]: 2025-11-29 07:41:37.790929751 +0000 UTC m=+0.057861663 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 29 07:41:37 compute-0 podman[243180]: 2025-11-29 07:41:37.807038208 +0000 UTC m=+0.066872278 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 29 07:41:40 compute-0 nova_compute[187185]: 2025-11-29 07:41:40.590 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:41:42 compute-0 nova_compute[187185]: 2025-11-29 07:41:42.444 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:41:45 compute-0 nova_compute[187185]: 2025-11-29 07:41:45.594 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:41:45 compute-0 nova_compute[187185]: 2025-11-29 07:41:45.989 187189 DEBUG oslo_concurrency.lockutils [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "9b64431b-400c-4b7e-b7bf-986103d270c2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:41:45 compute-0 nova_compute[187185]: 2025-11-29 07:41:45.990 187189 DEBUG oslo_concurrency.lockutils [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "9b64431b-400c-4b7e-b7bf-986103d270c2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:41:46 compute-0 nova_compute[187185]: 2025-11-29 07:41:46.010 187189 DEBUG nova.compute.manager [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 07:41:46 compute-0 sshd-session[243219]: Invalid user ahmed from 20.255.62.58 port 55448
Nov 29 07:41:47 compute-0 nova_compute[187185]: 2025-11-29 07:41:47.133 187189 DEBUG oslo_concurrency.lockutils [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:41:47 compute-0 nova_compute[187185]: 2025-11-29 07:41:47.134 187189 DEBUG oslo_concurrency.lockutils [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:41:47 compute-0 nova_compute[187185]: 2025-11-29 07:41:47.142 187189 DEBUG nova.virt.hardware [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 07:41:47 compute-0 nova_compute[187185]: 2025-11-29 07:41:47.142 187189 INFO nova.compute.claims [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Claim successful on node compute-0.ctlplane.example.com
Nov 29 07:41:47 compute-0 sshd-session[243219]: Received disconnect from 20.255.62.58 port 55448:11: Bye Bye [preauth]
Nov 29 07:41:47 compute-0 sshd-session[243219]: Disconnected from invalid user ahmed 20.255.62.58 port 55448 [preauth]
Nov 29 07:41:47 compute-0 nova_compute[187185]: 2025-11-29 07:41:47.447 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:41:47 compute-0 ovn_controller[95281]: 2025-11-29T07:41:47Z|00537|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 29 07:41:47 compute-0 nova_compute[187185]: 2025-11-29 07:41:47.866 187189 DEBUG nova.compute.provider_tree [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:41:47 compute-0 nova_compute[187185]: 2025-11-29 07:41:47.881 187189 DEBUG nova.scheduler.client.report [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:41:47 compute-0 nova_compute[187185]: 2025-11-29 07:41:47.903 187189 DEBUG oslo_concurrency.lockutils [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.769s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:41:47 compute-0 nova_compute[187185]: 2025-11-29 07:41:47.904 187189 DEBUG nova.compute.manager [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 07:41:47 compute-0 nova_compute[187185]: 2025-11-29 07:41:47.994 187189 DEBUG nova.compute.manager [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 07:41:47 compute-0 nova_compute[187185]: 2025-11-29 07:41:47.994 187189 DEBUG nova.network.neutron [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 07:41:48 compute-0 nova_compute[187185]: 2025-11-29 07:41:48.014 187189 INFO nova.virt.libvirt.driver [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 07:41:48 compute-0 nova_compute[187185]: 2025-11-29 07:41:48.038 187189 DEBUG nova.compute.manager [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 07:41:48 compute-0 nova_compute[187185]: 2025-11-29 07:41:48.224 187189 DEBUG nova.compute.manager [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 07:41:48 compute-0 nova_compute[187185]: 2025-11-29 07:41:48.225 187189 DEBUG nova.virt.libvirt.driver [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 07:41:48 compute-0 nova_compute[187185]: 2025-11-29 07:41:48.225 187189 INFO nova.virt.libvirt.driver [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Creating image(s)
Nov 29 07:41:48 compute-0 nova_compute[187185]: 2025-11-29 07:41:48.226 187189 DEBUG oslo_concurrency.lockutils [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "/var/lib/nova/instances/9b64431b-400c-4b7e-b7bf-986103d270c2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:41:48 compute-0 nova_compute[187185]: 2025-11-29 07:41:48.226 187189 DEBUG oslo_concurrency.lockutils [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "/var/lib/nova/instances/9b64431b-400c-4b7e-b7bf-986103d270c2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:41:48 compute-0 nova_compute[187185]: 2025-11-29 07:41:48.227 187189 DEBUG oslo_concurrency.lockutils [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "/var/lib/nova/instances/9b64431b-400c-4b7e-b7bf-986103d270c2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:41:48 compute-0 nova_compute[187185]: 2025-11-29 07:41:48.239 187189 DEBUG oslo_concurrency.processutils [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:41:48 compute-0 nova_compute[187185]: 2025-11-29 07:41:48.319 187189 DEBUG oslo_concurrency.processutils [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:41:48 compute-0 nova_compute[187185]: 2025-11-29 07:41:48.321 187189 DEBUG oslo_concurrency.lockutils [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:41:48 compute-0 nova_compute[187185]: 2025-11-29 07:41:48.322 187189 DEBUG oslo_concurrency.lockutils [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:41:48 compute-0 nova_compute[187185]: 2025-11-29 07:41:48.333 187189 DEBUG oslo_concurrency.processutils [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:41:48 compute-0 nova_compute[187185]: 2025-11-29 07:41:48.400 187189 DEBUG oslo_concurrency.processutils [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:41:48 compute-0 nova_compute[187185]: 2025-11-29 07:41:48.402 187189 DEBUG oslo_concurrency.processutils [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/9b64431b-400c-4b7e-b7bf-986103d270c2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:41:48 compute-0 nova_compute[187185]: 2025-11-29 07:41:48.433 187189 DEBUG nova.policy [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 07:41:48 compute-0 nova_compute[187185]: 2025-11-29 07:41:48.467 187189 DEBUG oslo_concurrency.processutils [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/9b64431b-400c-4b7e-b7bf-986103d270c2/disk 1073741824" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:41:48 compute-0 nova_compute[187185]: 2025-11-29 07:41:48.468 187189 DEBUG oslo_concurrency.lockutils [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:41:48 compute-0 nova_compute[187185]: 2025-11-29 07:41:48.468 187189 DEBUG oslo_concurrency.processutils [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:41:48 compute-0 nova_compute[187185]: 2025-11-29 07:41:48.528 187189 DEBUG oslo_concurrency.processutils [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:41:48 compute-0 nova_compute[187185]: 2025-11-29 07:41:48.530 187189 DEBUG nova.virt.disk.api [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Checking if we can resize image /var/lib/nova/instances/9b64431b-400c-4b7e-b7bf-986103d270c2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:41:48 compute-0 nova_compute[187185]: 2025-11-29 07:41:48.531 187189 DEBUG oslo_concurrency.processutils [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9b64431b-400c-4b7e-b7bf-986103d270c2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:41:48 compute-0 nova_compute[187185]: 2025-11-29 07:41:48.600 187189 DEBUG oslo_concurrency.processutils [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9b64431b-400c-4b7e-b7bf-986103d270c2/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:41:48 compute-0 nova_compute[187185]: 2025-11-29 07:41:48.601 187189 DEBUG nova.virt.disk.api [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Cannot resize image /var/lib/nova/instances/9b64431b-400c-4b7e-b7bf-986103d270c2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:41:48 compute-0 nova_compute[187185]: 2025-11-29 07:41:48.602 187189 DEBUG nova.objects.instance [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'migration_context' on Instance uuid 9b64431b-400c-4b7e-b7bf-986103d270c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:41:48 compute-0 nova_compute[187185]: 2025-11-29 07:41:48.655 187189 DEBUG nova.virt.libvirt.driver [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 07:41:48 compute-0 nova_compute[187185]: 2025-11-29 07:41:48.656 187189 DEBUG nova.virt.libvirt.driver [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Ensure instance console log exists: /var/lib/nova/instances/9b64431b-400c-4b7e-b7bf-986103d270c2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 07:41:48 compute-0 nova_compute[187185]: 2025-11-29 07:41:48.656 187189 DEBUG oslo_concurrency.lockutils [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:41:48 compute-0 nova_compute[187185]: 2025-11-29 07:41:48.657 187189 DEBUG oslo_concurrency.lockutils [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:41:48 compute-0 nova_compute[187185]: 2025-11-29 07:41:48.657 187189 DEBUG oslo_concurrency.lockutils [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:41:48 compute-0 podman[243236]: 2025-11-29 07:41:48.787986848 +0000 UTC m=+0.053965292 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 07:41:48 compute-0 podman[243238]: 2025-11-29 07:41:48.810002683 +0000 UTC m=+0.067105955 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 07:41:48 compute-0 podman[243237]: 2025-11-29 07:41:48.831908374 +0000 UTC m=+0.092759012 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, distribution-scope=public, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, architecture=x86_64, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., container_name=openstack_network_exporter, name=ubi9-minimal)
Nov 29 07:41:50 compute-0 nova_compute[187185]: 2025-11-29 07:41:50.627 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:41:52 compute-0 nova_compute[187185]: 2025-11-29 07:41:52.448 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:41:55 compute-0 nova_compute[187185]: 2025-11-29 07:41:55.631 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:41:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:41:56.349 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=61, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=60) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:41:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:41:56.351 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:41:56 compute-0 nova_compute[187185]: 2025-11-29 07:41:56.355 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:41:57 compute-0 nova_compute[187185]: 2025-11-29 07:41:57.450 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:41:57 compute-0 podman[243295]: 2025-11-29 07:41:57.812958082 +0000 UTC m=+0.080013651 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 29 07:41:58 compute-0 nova_compute[187185]: 2025-11-29 07:41:58.039 187189 DEBUG nova.network.neutron [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Successfully created port: 66645927-47fc-4df8-b8f3-2254e1a841ed _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 07:42:00 compute-0 nova_compute[187185]: 2025-11-29 07:42:00.685 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:42:02 compute-0 nova_compute[187185]: 2025-11-29 07:42:02.451 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:42:02 compute-0 nova_compute[187185]: 2025-11-29 07:42:02.654 187189 DEBUG nova.network.neutron [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Successfully updated port: 66645927-47fc-4df8-b8f3-2254e1a841ed _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 07:42:02 compute-0 nova_compute[187185]: 2025-11-29 07:42:02.728 187189 DEBUG oslo_concurrency.lockutils [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "refresh_cache-9b64431b-400c-4b7e-b7bf-986103d270c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:42:02 compute-0 nova_compute[187185]: 2025-11-29 07:42:02.729 187189 DEBUG oslo_concurrency.lockutils [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquired lock "refresh_cache-9b64431b-400c-4b7e-b7bf-986103d270c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:42:02 compute-0 nova_compute[187185]: 2025-11-29 07:42:02.729 187189 DEBUG nova.network.neutron [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:42:03 compute-0 nova_compute[187185]: 2025-11-29 07:42:03.027 187189 DEBUG nova.compute.manager [req-f09b4dcb-4585-4986-a37f-6fe59242fb4a req-cc7667bd-903f-4087-aa12-7b9679e50bc8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Received event network-changed-66645927-47fc-4df8-b8f3-2254e1a841ed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:42:03 compute-0 nova_compute[187185]: 2025-11-29 07:42:03.028 187189 DEBUG nova.compute.manager [req-f09b4dcb-4585-4986-a37f-6fe59242fb4a req-cc7667bd-903f-4087-aa12-7b9679e50bc8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Refreshing instance network info cache due to event network-changed-66645927-47fc-4df8-b8f3-2254e1a841ed. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:42:03 compute-0 nova_compute[187185]: 2025-11-29 07:42:03.029 187189 DEBUG oslo_concurrency.lockutils [req-f09b4dcb-4585-4986-a37f-6fe59242fb4a req-cc7667bd-903f-4087-aa12-7b9679e50bc8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-9b64431b-400c-4b7e-b7bf-986103d270c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:42:03 compute-0 nova_compute[187185]: 2025-11-29 07:42:03.495 187189 DEBUG nova.network.neutron [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:42:05 compute-0 nova_compute[187185]: 2025-11-29 07:42:05.689 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:42:05 compute-0 podman[243321]: 2025-11-29 07:42:05.790101217 +0000 UTC m=+0.051719349 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 07:42:06 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:42:06.354 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '61'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:42:06 compute-0 nova_compute[187185]: 2025-11-29 07:42:06.879 187189 DEBUG nova.network.neutron [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Updating instance_info_cache with network_info: [{"id": "66645927-47fc-4df8-b8f3-2254e1a841ed", "address": "fa:16:3e:7c:02:c3", "network": {"id": "600edac6-24aa-414f-b977-07c2890470f1", "bridge": "br-int", "label": "tempest-network-smoke--176960828", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe7c:2c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe7c:2c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66645927-47", "ovs_interfaceid": "66645927-47fc-4df8-b8f3-2254e1a841ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.110 187189 DEBUG oslo_concurrency.lockutils [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Releasing lock "refresh_cache-9b64431b-400c-4b7e-b7bf-986103d270c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.110 187189 DEBUG nova.compute.manager [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Instance network_info: |[{"id": "66645927-47fc-4df8-b8f3-2254e1a841ed", "address": "fa:16:3e:7c:02:c3", "network": {"id": "600edac6-24aa-414f-b977-07c2890470f1", "bridge": "br-int", "label": "tempest-network-smoke--176960828", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe7c:2c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe7c:2c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66645927-47", "ovs_interfaceid": "66645927-47fc-4df8-b8f3-2254e1a841ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": 
true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.112 187189 DEBUG oslo_concurrency.lockutils [req-f09b4dcb-4585-4986-a37f-6fe59242fb4a req-cc7667bd-903f-4087-aa12-7b9679e50bc8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-9b64431b-400c-4b7e-b7bf-986103d270c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.112 187189 DEBUG nova.network.neutron [req-f09b4dcb-4585-4986-a37f-6fe59242fb4a req-cc7667bd-903f-4087-aa12-7b9679e50bc8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Refreshing network info cache for port 66645927-47fc-4df8-b8f3-2254e1a841ed _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.120 187189 DEBUG nova.virt.libvirt.driver [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Start _get_guest_xml network_info=[{"id": "66645927-47fc-4df8-b8f3-2254e1a841ed", "address": "fa:16:3e:7c:02:c3", "network": {"id": "600edac6-24aa-414f-b977-07c2890470f1", "bridge": "br-int", "label": "tempest-network-smoke--176960828", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe7c:2c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe7c:2c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66645927-47", "ovs_interfaceid": "66645927-47fc-4df8-b8f3-2254e1a841ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.130 187189 WARNING nova.virt.libvirt.driver [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.146 187189 DEBUG nova.virt.libvirt.host [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.147 187189 DEBUG nova.virt.libvirt.host [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.156 187189 DEBUG nova.virt.libvirt.host [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.157 187189 DEBUG nova.virt.libvirt.host [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.160 187189 DEBUG nova.virt.libvirt.driver [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.160 187189 DEBUG nova.virt.hardware [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.161 187189 DEBUG nova.virt.hardware [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.162 187189 DEBUG nova.virt.hardware [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.162 187189 DEBUG nova.virt.hardware [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.163 187189 DEBUG nova.virt.hardware [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.163 187189 DEBUG nova.virt.hardware [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.164 187189 DEBUG nova.virt.hardware [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.165 187189 DEBUG nova.virt.hardware [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.165 187189 DEBUG nova.virt.hardware [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.166 187189 DEBUG nova.virt.hardware [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.166 187189 DEBUG nova.virt.hardware [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.173 187189 DEBUG nova.virt.libvirt.vif [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:41:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-715278411',display_name='tempest-TestGettingAddress-server-715278411',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-715278411',id=160,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD3d0ZsvCuIHNmZM7lmf14lwcU9LYA+YzS+DsUoU/RUNt3FNYs43WlwoA0reTsUUFEVQa4lagWavf3wAARjW0IrVdX6QLhMZ1dtoKB8yeTuH2S9PjazhePg7oe9bdCIqjQ==',key_name='tempest-TestGettingAddress-1639164498',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-0feovezc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:41:48Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=9b64431b-400c-4b7e-b7bf-986103d270c2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "66645927-47fc-4df8-b8f3-2254e1a841ed", "address": "fa:16:3e:7c:02:c3", "network": {"id": "600edac6-24aa-414f-b977-07c2890470f1", "bridge": "br-int", "label": "tempest-network-smoke--176960828", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe7c:2c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe7c:2c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66645927-47", "ovs_interfaceid": "66645927-47fc-4df8-b8f3-2254e1a841ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.173 187189 DEBUG nova.network.os_vif_util [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "66645927-47fc-4df8-b8f3-2254e1a841ed", "address": "fa:16:3e:7c:02:c3", "network": {"id": "600edac6-24aa-414f-b977-07c2890470f1", "bridge": "br-int", "label": "tempest-network-smoke--176960828", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe7c:2c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe7c:2c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66645927-47", "ovs_interfaceid": "66645927-47fc-4df8-b8f3-2254e1a841ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.175 187189 DEBUG nova.network.os_vif_util [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7c:02:c3,bridge_name='br-int',has_traffic_filtering=True,id=66645927-47fc-4df8-b8f3-2254e1a841ed,network=Network(600edac6-24aa-414f-b977-07c2890470f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66645927-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.177 187189 DEBUG nova.objects.instance [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'pci_devices' on Instance uuid 9b64431b-400c-4b7e-b7bf-986103d270c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.330 187189 DEBUG nova.virt.libvirt.driver [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:42:07 compute-0 nova_compute[187185]:   <uuid>9b64431b-400c-4b7e-b7bf-986103d270c2</uuid>
Nov 29 07:42:07 compute-0 nova_compute[187185]:   <name>instance-000000a0</name>
Nov 29 07:42:07 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 07:42:07 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:42:07 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:42:07 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:       <nova:name>tempest-TestGettingAddress-server-715278411</nova:name>
Nov 29 07:42:07 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:42:07</nova:creationTime>
Nov 29 07:42:07 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 07:42:07 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 07:42:07 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:42:07 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:42:07 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:42:07 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:42:07 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:42:07 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:42:07 compute-0 nova_compute[187185]:         <nova:user uuid="31ac7b05b012433b89143dc9f259644a">tempest-TestGettingAddress-1465017630-project-member</nova:user>
Nov 29 07:42:07 compute-0 nova_compute[187185]:         <nova:project uuid="0111c22b4b954ea586ca20d91ed3970f">tempest-TestGettingAddress-1465017630</nova:project>
Nov 29 07:42:07 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:42:07 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 07:42:07 compute-0 nova_compute[187185]:         <nova:port uuid="66645927-47fc-4df8-b8f3-2254e1a841ed">
Nov 29 07:42:07 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fe7c:2c3" ipVersion="6"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe7c:2c3" ipVersion="6"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:42:07 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:42:07 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:42:07 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <system>
Nov 29 07:42:07 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:42:07 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:42:07 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:42:07 compute-0 nova_compute[187185]:       <entry name="serial">9b64431b-400c-4b7e-b7bf-986103d270c2</entry>
Nov 29 07:42:07 compute-0 nova_compute[187185]:       <entry name="uuid">9b64431b-400c-4b7e-b7bf-986103d270c2</entry>
Nov 29 07:42:07 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     </system>
Nov 29 07:42:07 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:42:07 compute-0 nova_compute[187185]:   <os>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:   </os>
Nov 29 07:42:07 compute-0 nova_compute[187185]:   <features>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:   </features>
Nov 29 07:42:07 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:42:07 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:42:07 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:42:07 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/9b64431b-400c-4b7e-b7bf-986103d270c2/disk"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:42:07 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/9b64431b-400c-4b7e-b7bf-986103d270c2/disk.config"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:42:07 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:7c:02:c3"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:       <target dev="tap66645927-47"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:42:07 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/9b64431b-400c-4b7e-b7bf-986103d270c2/console.log" append="off"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <video>
Nov 29 07:42:07 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     </video>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:42:07 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:42:07 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:42:07 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:42:07 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:42:07 compute-0 nova_compute[187185]: </domain>
Nov 29 07:42:07 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.332 187189 DEBUG nova.compute.manager [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Preparing to wait for external event network-vif-plugged-66645927-47fc-4df8-b8f3-2254e1a841ed prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.333 187189 DEBUG oslo_concurrency.lockutils [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "9b64431b-400c-4b7e-b7bf-986103d270c2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.334 187189 DEBUG oslo_concurrency.lockutils [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "9b64431b-400c-4b7e-b7bf-986103d270c2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.334 187189 DEBUG oslo_concurrency.lockutils [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "9b64431b-400c-4b7e-b7bf-986103d270c2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.335 187189 DEBUG nova.virt.libvirt.vif [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:41:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-715278411',display_name='tempest-TestGettingAddress-server-715278411',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-715278411',id=160,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD3d0ZsvCuIHNmZM7lmf14lwcU9LYA+YzS+DsUoU/RUNt3FNYs43WlwoA0reTsUUFEVQa4lagWavf3wAARjW0IrVdX6QLhMZ1dtoKB8yeTuH2S9PjazhePg7oe9bdCIqjQ==',key_name='tempest-TestGettingAddress-1639164498',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-0feovezc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:41:48Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=9b64431b-400c-4b7e-b7bf-986103d270c2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "66645927-47fc-4df8-b8f3-2254e1a841ed", "address": "fa:16:3e:7c:02:c3", "network": {"id": "600edac6-24aa-414f-b977-07c2890470f1", "bridge": "br-int", "label": "tempest-network-smoke--176960828", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe7c:2c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe7c:2c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66645927-47", "ovs_interfaceid": "66645927-47fc-4df8-b8f3-2254e1a841ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.336 187189 DEBUG nova.network.os_vif_util [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "66645927-47fc-4df8-b8f3-2254e1a841ed", "address": "fa:16:3e:7c:02:c3", "network": {"id": "600edac6-24aa-414f-b977-07c2890470f1", "bridge": "br-int", "label": "tempest-network-smoke--176960828", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe7c:2c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe7c:2c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66645927-47", "ovs_interfaceid": "66645927-47fc-4df8-b8f3-2254e1a841ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.337 187189 DEBUG nova.network.os_vif_util [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7c:02:c3,bridge_name='br-int',has_traffic_filtering=True,id=66645927-47fc-4df8-b8f3-2254e1a841ed,network=Network(600edac6-24aa-414f-b977-07c2890470f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66645927-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.338 187189 DEBUG os_vif [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:02:c3,bridge_name='br-int',has_traffic_filtering=True,id=66645927-47fc-4df8-b8f3-2254e1a841ed,network=Network(600edac6-24aa-414f-b977-07c2890470f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66645927-47') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.339 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.340 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.340 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.345 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.345 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap66645927-47, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.346 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap66645927-47, col_values=(('external_ids', {'iface-id': '66645927-47fc-4df8-b8f3-2254e1a841ed', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7c:02:c3', 'vm-uuid': '9b64431b-400c-4b7e-b7bf-986103d270c2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.348 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:42:07 compute-0 NetworkManager[55227]: <info>  [1764402127.3502] manager: (tap66645927-47): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/276)
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.350 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.358 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.361 187189 INFO os_vif [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:02:c3,bridge_name='br-int',has_traffic_filtering=True,id=66645927-47fc-4df8-b8f3-2254e1a841ed,network=Network(600edac6-24aa-414f-b977-07c2890470f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66645927-47')
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.454 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.765 187189 DEBUG nova.virt.libvirt.driver [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.766 187189 DEBUG nova.virt.libvirt.driver [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.766 187189 DEBUG nova.virt.libvirt.driver [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No VIF found with MAC fa:16:3e:7c:02:c3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:42:07 compute-0 nova_compute[187185]: 2025-11-29 07:42:07.767 187189 INFO nova.virt.libvirt.driver [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Using config drive
Nov 29 07:42:08 compute-0 nova_compute[187185]: 2025-11-29 07:42:08.287 187189 INFO nova.virt.libvirt.driver [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Creating config drive at /var/lib/nova/instances/9b64431b-400c-4b7e-b7bf-986103d270c2/disk.config
Nov 29 07:42:08 compute-0 nova_compute[187185]: 2025-11-29 07:42:08.293 187189 DEBUG oslo_concurrency.processutils [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9b64431b-400c-4b7e-b7bf-986103d270c2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppgvkb8b_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:42:08 compute-0 nova_compute[187185]: 2025-11-29 07:42:08.440 187189 DEBUG oslo_concurrency.processutils [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9b64431b-400c-4b7e-b7bf-986103d270c2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppgvkb8b_" returned: 0 in 0.147s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:42:08 compute-0 kernel: tap66645927-47: entered promiscuous mode
Nov 29 07:42:08 compute-0 ovn_controller[95281]: 2025-11-29T07:42:08Z|00538|binding|INFO|Claiming lport 66645927-47fc-4df8-b8f3-2254e1a841ed for this chassis.
Nov 29 07:42:08 compute-0 NetworkManager[55227]: <info>  [1764402128.5536] manager: (tap66645927-47): new Tun device (/org/freedesktop/NetworkManager/Devices/277)
Nov 29 07:42:08 compute-0 ovn_controller[95281]: 2025-11-29T07:42:08Z|00539|binding|INFO|66645927-47fc-4df8-b8f3-2254e1a841ed: Claiming fa:16:3e:7c:02:c3 10.100.0.13 2001:db8:0:1:f816:3eff:fe7c:2c3 2001:db8::f816:3eff:fe7c:2c3
Nov 29 07:42:08 compute-0 nova_compute[187185]: 2025-11-29 07:42:08.554 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:42:08 compute-0 nova_compute[187185]: 2025-11-29 07:42:08.564 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:42:08 compute-0 nova_compute[187185]: 2025-11-29 07:42:08.569 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:42:08 compute-0 nova_compute[187185]: 2025-11-29 07:42:08.576 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:42:08 compute-0 nova_compute[187185]: 2025-11-29 07:42:08.588 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:42:08 compute-0 systemd-udevd[243381]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:42:08 compute-0 NetworkManager[55227]: <info>  [1764402128.5907] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/278)
Nov 29 07:42:08 compute-0 NetworkManager[55227]: <info>  [1764402128.5917] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/279)
Nov 29 07:42:08 compute-0 NetworkManager[55227]: <info>  [1764402128.6053] device (tap66645927-47): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:42:08 compute-0 NetworkManager[55227]: <info>  [1764402128.6067] device (tap66645927-47): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:42:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:42:08.628 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:02:c3 10.100.0.13 2001:db8:0:1:f816:3eff:fe7c:2c3 2001:db8::f816:3eff:fe7c:2c3'], port_security=['fa:16:3e:7c:02:c3 10.100.0.13 2001:db8:0:1:f816:3eff:fe7c:2c3 2001:db8::f816:3eff:fe7c:2c3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28 2001:db8:0:1:f816:3eff:fe7c:2c3/64 2001:db8::f816:3eff:fe7c:2c3/64', 'neutron:device_id': '9b64431b-400c-4b7e-b7bf-986103d270c2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-600edac6-24aa-414f-b977-07c2890470f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e2af89ad-a80e-4dc1-aa45-ab6ce3534b4f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=de1096f6-2a15-4f04-9ea7-22d2dff24e74, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=66645927-47fc-4df8-b8f3-2254e1a841ed) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:42:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:42:08.629 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 66645927-47fc-4df8-b8f3-2254e1a841ed in datapath 600edac6-24aa-414f-b977-07c2890470f1 bound to our chassis
Nov 29 07:42:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:42:08.630 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 600edac6-24aa-414f-b977-07c2890470f1
Nov 29 07:42:08 compute-0 systemd-machined[153486]: New machine qemu-63-instance-000000a0.
Nov 29 07:42:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:42:08.643 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[69691c73-a4ce-4de3-a532-518f530f52db]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:42:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:42:08.644 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap600edac6-21 in ovnmeta-600edac6-24aa-414f-b977-07c2890470f1 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:42:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:42:08.646 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap600edac6-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:42:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:42:08.646 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[5e34a5e7-1a79-4960-afd6-22c059d1fd5f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:42:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:42:08.647 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[2c99c2a4-adbb-4a35-9562-d0054cec7a3b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:42:08 compute-0 podman[243357]: 2025-11-29 07:42:08.650893027 +0000 UTC m=+0.091939349 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 29 07:42:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:42:08.661 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[7eb49f0d-6d41-47f7-ab90-a59b9fb702e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:42:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:42:08.689 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[14a76d90-209a-4b73-8a08-45bf55fbda4f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:42:08 compute-0 systemd[1]: Started Virtual Machine qemu-63-instance-000000a0.
Nov 29 07:42:08 compute-0 podman[243358]: 2025-11-29 07:42:08.722964461 +0000 UTC m=+0.173728259 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:42:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:42:08.726 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[813e4db6-b54b-45a0-9107-7cdaa0c79ef6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:42:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:42:08.747 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[9f08e416-2a15-4edb-a649-fac1e9b31c20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:42:08 compute-0 NetworkManager[55227]: <info>  [1764402128.7505] manager: (tap600edac6-20): new Veth device (/org/freedesktop/NetworkManager/Devices/280)
Nov 29 07:42:08 compute-0 nova_compute[187185]: 2025-11-29 07:42:08.784 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:42:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:42:08.788 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[d0fc0f70-6a04-4db4-9b51-81c5802d4189]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:42:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:42:08.792 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[87ef4f02-6fe4-47d9-91c4-e6e944ae5ac7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:42:08 compute-0 nova_compute[187185]: 2025-11-29 07:42:08.810 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:42:08 compute-0 ovn_controller[95281]: 2025-11-29T07:42:08Z|00540|binding|INFO|Setting lport 66645927-47fc-4df8-b8f3-2254e1a841ed ovn-installed in OVS
Nov 29 07:42:08 compute-0 ovn_controller[95281]: 2025-11-29T07:42:08Z|00541|binding|INFO|Setting lport 66645927-47fc-4df8-b8f3-2254e1a841ed up in Southbound
Nov 29 07:42:08 compute-0 nova_compute[187185]: 2025-11-29 07:42:08.820 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:42:08 compute-0 NetworkManager[55227]: <info>  [1764402128.8307] device (tap600edac6-20): carrier: link connected
Nov 29 07:42:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:42:08.835 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[621a7f26-d0b9-44f7-af45-c7f30cacc2db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:42:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:42:08.852 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[1e6499c4-38e7-4825-acfe-b53952e9f4e6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap600edac6-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ac:f6:2b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 169], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 757649, 'reachable_time': 16787, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243435, 'error': None, 'target': 'ovnmeta-600edac6-24aa-414f-b977-07c2890470f1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:42:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:42:08.871 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[f440afca-b297-46be-9885-c48c708ca9f4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feac:f62b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 757649, 'tstamp': 757649}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 243436, 'error': None, 'target': 'ovnmeta-600edac6-24aa-414f-b977-07c2890470f1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:42:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:42:08.887 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e723ddaa-2f0e-416b-9e57-9c733ff56a0c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap600edac6-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ac:f6:2b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 169], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 757649, 'reachable_time': 16787, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 243437, 'error': None, 'target': 'ovnmeta-600edac6-24aa-414f-b977-07c2890470f1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:42:08 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:42:08.928 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[ed5a8900-a275-4e36-b3e2-83aa59945ce6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:42:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:42:09.004 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[eb5b6564-c115-4159-bb33-b3f7c1f72fa1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:42:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:42:09.005 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap600edac6-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:42:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:42:09.005 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:42:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:42:09.006 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap600edac6-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:42:09 compute-0 NetworkManager[55227]: <info>  [1764402129.0082] manager: (tap600edac6-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/281)
Nov 29 07:42:09 compute-0 kernel: tap600edac6-20: entered promiscuous mode
Nov 29 07:42:09 compute-0 nova_compute[187185]: 2025-11-29 07:42:09.007 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:42:09 compute-0 nova_compute[187185]: 2025-11-29 07:42:09.057 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:42:09 compute-0 ovn_controller[95281]: 2025-11-29T07:42:09Z|00542|binding|INFO|Releasing lport 2459b7bb-f6d0-4520-a009-14c9d4a2b794 from this chassis (sb_readonly=0)
Nov 29 07:42:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:42:09.056 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap600edac6-20, col_values=(('external_ids', {'iface-id': '2459b7bb-f6d0-4520-a009-14c9d4a2b794'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:42:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:42:09.058 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/600edac6-24aa-414f-b977-07c2890470f1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/600edac6-24aa-414f-b977-07c2890470f1.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:42:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:42:09.059 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e5134c8b-15cd-41b5-9304-47ff92a99740]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:42:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:42:09.060 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:42:09 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:42:09 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:42:09 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-600edac6-24aa-414f-b977-07c2890470f1
Nov 29 07:42:09 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:42:09 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:42:09 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:42:09 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/600edac6-24aa-414f-b977-07c2890470f1.pid.haproxy
Nov 29 07:42:09 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:42:09 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:42:09 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:42:09 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:42:09 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:42:09 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:42:09 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:42:09 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:42:09 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:42:09 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:42:09 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:42:09 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:42:09 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:42:09 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:42:09 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:42:09 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:42:09 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:42:09 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:42:09 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:42:09 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:42:09 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID 600edac6-24aa-414f-b977-07c2890470f1
Nov 29 07:42:09 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:42:09 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:42:09.061 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-600edac6-24aa-414f-b977-07c2890470f1', 'env', 'PROCESS_TAG=haproxy-600edac6-24aa-414f-b977-07c2890470f1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/600edac6-24aa-414f-b977-07c2890470f1.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:42:09 compute-0 nova_compute[187185]: 2025-11-29 07:42:09.135 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:42:09 compute-0 nova_compute[187185]: 2025-11-29 07:42:09.263 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764402129.2632034, 9b64431b-400c-4b7e-b7bf-986103d270c2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:42:09 compute-0 nova_compute[187185]: 2025-11-29 07:42:09.264 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] VM Started (Lifecycle Event)
Nov 29 07:42:09 compute-0 podman[243476]: 2025-11-29 07:42:09.523642321 +0000 UTC m=+0.039817841 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:42:09 compute-0 nova_compute[187185]: 2025-11-29 07:42:09.826 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:42:09 compute-0 nova_compute[187185]: 2025-11-29 07:42:09.833 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764402129.2674227, 9b64431b-400c-4b7e-b7bf-986103d270c2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:42:09 compute-0 nova_compute[187185]: 2025-11-29 07:42:09.834 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] VM Paused (Lifecycle Event)
Nov 29 07:42:10 compute-0 nova_compute[187185]: 2025-11-29 07:42:10.243 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:42:10 compute-0 nova_compute[187185]: 2025-11-29 07:42:10.248 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:42:10 compute-0 nova_compute[187185]: 2025-11-29 07:42:10.321 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:42:10 compute-0 podman[243476]: 2025-11-29 07:42:10.35289968 +0000 UTC m=+0.869075130 container create 0297440574c4204c74d79f3c027983bf281387bfa82dc92bc6c6b95d938a13a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-600edac6-24aa-414f-b977-07c2890470f1, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 29 07:42:10 compute-0 systemd[1]: Started libpod-conmon-0297440574c4204c74d79f3c027983bf281387bfa82dc92bc6c6b95d938a13a7.scope.
Nov 29 07:42:10 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:42:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/470a5e0c2553de13b0b77314c3eb020bd2b761116060a514326aa4743e66ea7c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:42:10 compute-0 nova_compute[187185]: 2025-11-29 07:42:10.935 187189 DEBUG nova.compute.manager [req-2e885388-b09b-47b0-b6ee-dad846b3d95a req-65f147c4-3960-4054-a8f1-ae79df415d50 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Received event network-vif-plugged-66645927-47fc-4df8-b8f3-2254e1a841ed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:42:10 compute-0 nova_compute[187185]: 2025-11-29 07:42:10.937 187189 DEBUG oslo_concurrency.lockutils [req-2e885388-b09b-47b0-b6ee-dad846b3d95a req-65f147c4-3960-4054-a8f1-ae79df415d50 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "9b64431b-400c-4b7e-b7bf-986103d270c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:42:10 compute-0 nova_compute[187185]: 2025-11-29 07:42:10.938 187189 DEBUG oslo_concurrency.lockutils [req-2e885388-b09b-47b0-b6ee-dad846b3d95a req-65f147c4-3960-4054-a8f1-ae79df415d50 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9b64431b-400c-4b7e-b7bf-986103d270c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:42:10 compute-0 nova_compute[187185]: 2025-11-29 07:42:10.938 187189 DEBUG oslo_concurrency.lockutils [req-2e885388-b09b-47b0-b6ee-dad846b3d95a req-65f147c4-3960-4054-a8f1-ae79df415d50 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9b64431b-400c-4b7e-b7bf-986103d270c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:42:10 compute-0 nova_compute[187185]: 2025-11-29 07:42:10.939 187189 DEBUG nova.compute.manager [req-2e885388-b09b-47b0-b6ee-dad846b3d95a req-65f147c4-3960-4054-a8f1-ae79df415d50 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Processing event network-vif-plugged-66645927-47fc-4df8-b8f3-2254e1a841ed _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 07:42:10 compute-0 nova_compute[187185]: 2025-11-29 07:42:10.940 187189 DEBUG nova.compute.manager [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:42:10 compute-0 nova_compute[187185]: 2025-11-29 07:42:10.945 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764402130.9456308, 9b64431b-400c-4b7e-b7bf-986103d270c2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:42:10 compute-0 nova_compute[187185]: 2025-11-29 07:42:10.946 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] VM Resumed (Lifecycle Event)
Nov 29 07:42:10 compute-0 nova_compute[187185]: 2025-11-29 07:42:10.949 187189 DEBUG nova.virt.libvirt.driver [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 07:42:10 compute-0 nova_compute[187185]: 2025-11-29 07:42:10.954 187189 INFO nova.virt.libvirt.driver [-] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Instance spawned successfully.
Nov 29 07:42:10 compute-0 nova_compute[187185]: 2025-11-29 07:42:10.955 187189 DEBUG nova.virt.libvirt.driver [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 07:42:10 compute-0 nova_compute[187185]: 2025-11-29 07:42:10.978 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:42:10 compute-0 nova_compute[187185]: 2025-11-29 07:42:10.980 187189 DEBUG nova.network.neutron [req-f09b4dcb-4585-4986-a37f-6fe59242fb4a req-cc7667bd-903f-4087-aa12-7b9679e50bc8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Updated VIF entry in instance network info cache for port 66645927-47fc-4df8-b8f3-2254e1a841ed. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:42:10 compute-0 nova_compute[187185]: 2025-11-29 07:42:10.981 187189 DEBUG nova.network.neutron [req-f09b4dcb-4585-4986-a37f-6fe59242fb4a req-cc7667bd-903f-4087-aa12-7b9679e50bc8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Updating instance_info_cache with network_info: [{"id": "66645927-47fc-4df8-b8f3-2254e1a841ed", "address": "fa:16:3e:7c:02:c3", "network": {"id": "600edac6-24aa-414f-b977-07c2890470f1", "bridge": "br-int", "label": "tempest-network-smoke--176960828", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe7c:2c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe7c:2c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66645927-47", "ovs_interfaceid": "66645927-47fc-4df8-b8f3-2254e1a841ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:42:10 compute-0 nova_compute[187185]: 2025-11-29 07:42:10.985 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:42:11 compute-0 podman[243476]: 2025-11-29 07:42:11.078229082 +0000 UTC m=+1.594404512 container init 0297440574c4204c74d79f3c027983bf281387bfa82dc92bc6c6b95d938a13a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-600edac6-24aa-414f-b977-07c2890470f1, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 29 07:42:11 compute-0 podman[243476]: 2025-11-29 07:42:11.086149647 +0000 UTC m=+1.602325057 container start 0297440574c4204c74d79f3c027983bf281387bfa82dc92bc6c6b95d938a13a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-600edac6-24aa-414f-b977-07c2890470f1, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 29 07:42:11 compute-0 neutron-haproxy-ovnmeta-600edac6-24aa-414f-b977-07c2890470f1[243491]: [NOTICE]   (243495) : New worker (243497) forked
Nov 29 07:42:11 compute-0 neutron-haproxy-ovnmeta-600edac6-24aa-414f-b977-07c2890470f1[243491]: [NOTICE]   (243495) : Loading success.
Nov 29 07:42:11 compute-0 nova_compute[187185]: 2025-11-29 07:42:11.304 187189 DEBUG nova.virt.libvirt.driver [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:42:11 compute-0 nova_compute[187185]: 2025-11-29 07:42:11.305 187189 DEBUG nova.virt.libvirt.driver [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:42:11 compute-0 nova_compute[187185]: 2025-11-29 07:42:11.306 187189 DEBUG nova.virt.libvirt.driver [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:42:11 compute-0 nova_compute[187185]: 2025-11-29 07:42:11.307 187189 DEBUG nova.virt.libvirt.driver [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:42:11 compute-0 nova_compute[187185]: 2025-11-29 07:42:11.309 187189 DEBUG nova.virt.libvirt.driver [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:42:11 compute-0 nova_compute[187185]: 2025-11-29 07:42:11.310 187189 DEBUG nova.virt.libvirt.driver [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:42:11 compute-0 nova_compute[187185]: 2025-11-29 07:42:11.387 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:42:11 compute-0 nova_compute[187185]: 2025-11-29 07:42:11.392 187189 DEBUG oslo_concurrency.lockutils [req-f09b4dcb-4585-4986-a37f-6fe59242fb4a req-cc7667bd-903f-4087-aa12-7b9679e50bc8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-9b64431b-400c-4b7e-b7bf-986103d270c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:42:12 compute-0 nova_compute[187185]: 2025-11-29 07:42:12.059 187189 INFO nova.compute.manager [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Took 23.83 seconds to spawn the instance on the hypervisor.
Nov 29 07:42:12 compute-0 nova_compute[187185]: 2025-11-29 07:42:12.060 187189 DEBUG nova.compute.manager [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:42:12 compute-0 nova_compute[187185]: 2025-11-29 07:42:12.348 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:42:12 compute-0 nova_compute[187185]: 2025-11-29 07:42:12.457 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:42:13 compute-0 nova_compute[187185]: 2025-11-29 07:42:13.312 187189 DEBUG nova.compute.manager [req-521b057e-f040-471b-9db2-ab214312e730 req-5f332542-3366-4ef0-89f0-7e06de18206e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Received event network-vif-plugged-66645927-47fc-4df8-b8f3-2254e1a841ed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:42:13 compute-0 nova_compute[187185]: 2025-11-29 07:42:13.313 187189 DEBUG oslo_concurrency.lockutils [req-521b057e-f040-471b-9db2-ab214312e730 req-5f332542-3366-4ef0-89f0-7e06de18206e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "9b64431b-400c-4b7e-b7bf-986103d270c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:42:13 compute-0 nova_compute[187185]: 2025-11-29 07:42:13.313 187189 DEBUG oslo_concurrency.lockutils [req-521b057e-f040-471b-9db2-ab214312e730 req-5f332542-3366-4ef0-89f0-7e06de18206e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9b64431b-400c-4b7e-b7bf-986103d270c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:42:13 compute-0 nova_compute[187185]: 2025-11-29 07:42:13.313 187189 DEBUG oslo_concurrency.lockutils [req-521b057e-f040-471b-9db2-ab214312e730 req-5f332542-3366-4ef0-89f0-7e06de18206e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9b64431b-400c-4b7e-b7bf-986103d270c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:42:13 compute-0 nova_compute[187185]: 2025-11-29 07:42:13.314 187189 DEBUG nova.compute.manager [req-521b057e-f040-471b-9db2-ab214312e730 req-5f332542-3366-4ef0-89f0-7e06de18206e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] No waiting events found dispatching network-vif-plugged-66645927-47fc-4df8-b8f3-2254e1a841ed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:42:13 compute-0 nova_compute[187185]: 2025-11-29 07:42:13.314 187189 WARNING nova.compute.manager [req-521b057e-f040-471b-9db2-ab214312e730 req-5f332542-3366-4ef0-89f0-7e06de18206e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Received unexpected event network-vif-plugged-66645927-47fc-4df8-b8f3-2254e1a841ed for instance with vm_state building and task_state spawning.
Nov 29 07:42:14 compute-0 nova_compute[187185]: 2025-11-29 07:42:14.097 187189 INFO nova.compute.manager [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Took 27.02 seconds to build instance.
Nov 29 07:42:14 compute-0 nova_compute[187185]: 2025-11-29 07:42:14.161 187189 DEBUG oslo_concurrency.lockutils [None req-6ccb9985-564a-411e-9ecc-ae554dfbd5e7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "9b64431b-400c-4b7e-b7bf-986103d270c2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 28.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:42:17 compute-0 nova_compute[187185]: 2025-11-29 07:42:17.406 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:42:17 compute-0 nova_compute[187185]: 2025-11-29 07:42:17.460 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:42:19 compute-0 podman[243507]: 2025-11-29 07:42:19.858761363 +0000 UTC m=+0.100716598 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 29 07:42:19 compute-0 podman[243508]: 2025-11-29 07:42:19.865521205 +0000 UTC m=+0.106427850 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_id=edpm, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, maintainer=Red Hat, Inc., vcs-type=git, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, container_name=openstack_network_exporter)
Nov 29 07:42:19 compute-0 podman[243509]: 2025-11-29 07:42:19.908040631 +0000 UTC m=+0.134899698 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 07:42:22 compute-0 nova_compute[187185]: 2025-11-29 07:42:22.410 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:42:22 compute-0 nova_compute[187185]: 2025-11-29 07:42:22.462 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:42:24 compute-0 nova_compute[187185]: 2025-11-29 07:42:24.809 187189 DEBUG nova.compute.manager [req-a358e37f-7814-47fc-8bd3-257be4eb901f req-ea44bab6-7f5b-47c7-934e-c5a8843a8fc9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Received event network-changed-66645927-47fc-4df8-b8f3-2254e1a841ed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:42:24 compute-0 nova_compute[187185]: 2025-11-29 07:42:24.811 187189 DEBUG nova.compute.manager [req-a358e37f-7814-47fc-8bd3-257be4eb901f req-ea44bab6-7f5b-47c7-934e-c5a8843a8fc9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Refreshing instance network info cache due to event network-changed-66645927-47fc-4df8-b8f3-2254e1a841ed. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:42:24 compute-0 nova_compute[187185]: 2025-11-29 07:42:24.812 187189 DEBUG oslo_concurrency.lockutils [req-a358e37f-7814-47fc-8bd3-257be4eb901f req-ea44bab6-7f5b-47c7-934e-c5a8843a8fc9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-9b64431b-400c-4b7e-b7bf-986103d270c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:42:24 compute-0 nova_compute[187185]: 2025-11-29 07:42:24.812 187189 DEBUG oslo_concurrency.lockutils [req-a358e37f-7814-47fc-8bd3-257be4eb901f req-ea44bab6-7f5b-47c7-934e-c5a8843a8fc9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-9b64431b-400c-4b7e-b7bf-986103d270c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:42:24 compute-0 nova_compute[187185]: 2025-11-29 07:42:24.813 187189 DEBUG nova.network.neutron [req-a358e37f-7814-47fc-8bd3-257be4eb901f req-ea44bab6-7f5b-47c7-934e-c5a8843a8fc9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Refreshing network info cache for port 66645927-47fc-4df8-b8f3-2254e1a841ed _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:42:25 compute-0 nova_compute[187185]: 2025-11-29 07:42:25.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:42:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:42:25.535 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:42:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:42:25.541 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:42:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:42:25.543 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:42:26 compute-0 nova_compute[187185]: 2025-11-29 07:42:26.367 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:42:26 compute-0 nova_compute[187185]: 2025-11-29 07:42:26.398 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:42:26 compute-0 nova_compute[187185]: 2025-11-29 07:42:26.399 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:42:26 compute-0 nova_compute[187185]: 2025-11-29 07:42:26.399 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:42:26 compute-0 nova_compute[187185]: 2025-11-29 07:42:26.400 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:42:26 compute-0 nova_compute[187185]: 2025-11-29 07:42:26.480 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9b64431b-400c-4b7e-b7bf-986103d270c2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:42:26 compute-0 nova_compute[187185]: 2025-11-29 07:42:26.580 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9b64431b-400c-4b7e-b7bf-986103d270c2/disk --force-share --output=json" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:42:26 compute-0 nova_compute[187185]: 2025-11-29 07:42:26.582 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9b64431b-400c-4b7e-b7bf-986103d270c2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:42:26 compute-0 nova_compute[187185]: 2025-11-29 07:42:26.668 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9b64431b-400c-4b7e-b7bf-986103d270c2/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:42:26 compute-0 nova_compute[187185]: 2025-11-29 07:42:26.896 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:42:26 compute-0 nova_compute[187185]: 2025-11-29 07:42:26.897 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5541MB free_disk=73.22607421875GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:42:26 compute-0 nova_compute[187185]: 2025-11-29 07:42:26.898 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:42:26 compute-0 nova_compute[187185]: 2025-11-29 07:42:26.898 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:42:27 compute-0 nova_compute[187185]: 2025-11-29 07:42:27.415 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:42:27 compute-0 nova_compute[187185]: 2025-11-29 07:42:27.467 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:42:27 compute-0 nova_compute[187185]: 2025-11-29 07:42:27.641 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance 9b64431b-400c-4b7e-b7bf-986103d270c2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 07:42:27 compute-0 nova_compute[187185]: 2025-11-29 07:42:27.642 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:42:27 compute-0 nova_compute[187185]: 2025-11-29 07:42:27.643 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:42:27 compute-0 nova_compute[187185]: 2025-11-29 07:42:27.691 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:42:27 compute-0 nova_compute[187185]: 2025-11-29 07:42:27.707 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:42:27 compute-0 nova_compute[187185]: 2025-11-29 07:42:27.732 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:42:27 compute-0 nova_compute[187185]: 2025-11-29 07:42:27.733 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.835s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:42:28 compute-0 ovn_controller[95281]: 2025-11-29T07:42:28Z|00071|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7c:02:c3 10.100.0.13
Nov 29 07:42:28 compute-0 ovn_controller[95281]: 2025-11-29T07:42:28Z|00072|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7c:02:c3 10.100.0.13
Nov 29 07:42:28 compute-0 nova_compute[187185]: 2025-11-29 07:42:28.839 187189 DEBUG nova.network.neutron [req-a358e37f-7814-47fc-8bd3-257be4eb901f req-ea44bab6-7f5b-47c7-934e-c5a8843a8fc9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Updated VIF entry in instance network info cache for port 66645927-47fc-4df8-b8f3-2254e1a841ed. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:42:28 compute-0 nova_compute[187185]: 2025-11-29 07:42:28.840 187189 DEBUG nova.network.neutron [req-a358e37f-7814-47fc-8bd3-257be4eb901f req-ea44bab6-7f5b-47c7-934e-c5a8843a8fc9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Updating instance_info_cache with network_info: [{"id": "66645927-47fc-4df8-b8f3-2254e1a841ed", "address": "fa:16:3e:7c:02:c3", "network": {"id": "600edac6-24aa-414f-b977-07c2890470f1", "bridge": "br-int", "label": "tempest-network-smoke--176960828", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe7c:2c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe7c:2c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66645927-47", "ovs_interfaceid": "66645927-47fc-4df8-b8f3-2254e1a841ed", 
"qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:42:28 compute-0 nova_compute[187185]: 2025-11-29 07:42:28.913 187189 DEBUG oslo_concurrency.lockutils [req-a358e37f-7814-47fc-8bd3-257be4eb901f req-ea44bab6-7f5b-47c7-934e-c5a8843a8fc9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-9b64431b-400c-4b7e-b7bf-986103d270c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:42:28 compute-0 podman[243592]: 2025-11-29 07:42:28.927952869 +0000 UTC m=+0.173430199 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:42:30 compute-0 sshd-session[243618]: Invalid user support from 190.181.27.27 port 42378
Nov 29 07:42:31 compute-0 sshd-session[243618]: Received disconnect from 190.181.27.27 port 42378:11: Bye Bye [preauth]
Nov 29 07:42:31 compute-0 sshd-session[243618]: Disconnected from invalid user support 190.181.27.27 port 42378 [preauth]
Nov 29 07:42:31 compute-0 nova_compute[187185]: 2025-11-29 07:42:31.683 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:42:31 compute-0 nova_compute[187185]: 2025-11-29 07:42:31.684 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:42:31 compute-0 nova_compute[187185]: 2025-11-29 07:42:31.684 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:42:31 compute-0 nova_compute[187185]: 2025-11-29 07:42:31.811 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "refresh_cache-9b64431b-400c-4b7e-b7bf-986103d270c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:42:31 compute-0 nova_compute[187185]: 2025-11-29 07:42:31.812 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquired lock "refresh_cache-9b64431b-400c-4b7e-b7bf-986103d270c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:42:31 compute-0 nova_compute[187185]: 2025-11-29 07:42:31.812 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 29 07:42:31 compute-0 nova_compute[187185]: 2025-11-29 07:42:31.812 187189 DEBUG nova.objects.instance [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9b64431b-400c-4b7e-b7bf-986103d270c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:42:32 compute-0 nova_compute[187185]: 2025-11-29 07:42:32.418 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:42:32 compute-0 nova_compute[187185]: 2025-11-29 07:42:32.469 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:42:34 compute-0 nova_compute[187185]: 2025-11-29 07:42:34.940 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Updating instance_info_cache with network_info: [{"id": "66645927-47fc-4df8-b8f3-2254e1a841ed", "address": "fa:16:3e:7c:02:c3", "network": {"id": "600edac6-24aa-414f-b977-07c2890470f1", "bridge": "br-int", "label": "tempest-network-smoke--176960828", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe7c:2c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe7c:2c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66645927-47", "ovs_interfaceid": "66645927-47fc-4df8-b8f3-2254e1a841ed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:42:34 compute-0 nova_compute[187185]: 2025-11-29 07:42:34.956 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Releasing lock "refresh_cache-9b64431b-400c-4b7e-b7bf-986103d270c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:42:34 compute-0 nova_compute[187185]: 2025-11-29 07:42:34.956 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 29 07:42:34 compute-0 nova_compute[187185]: 2025-11-29 07:42:34.957 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:42:34 compute-0 nova_compute[187185]: 2025-11-29 07:42:34.957 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:42:34 compute-0 nova_compute[187185]: 2025-11-29 07:42:34.958 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:42:34 compute-0 nova_compute[187185]: 2025-11-29 07:42:34.958 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:42:34 compute-0 nova_compute[187185]: 2025-11-29 07:42:34.959 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:42:35 compute-0 nova_compute[187185]: 2025-11-29 07:42:35.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:42:36 compute-0 nova_compute[187185]: 2025-11-29 07:42:36.311 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:42:36 compute-0 podman[243620]: 2025-11-29 07:42:36.817924471 +0000 UTC m=+0.072327092 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 07:42:37 compute-0 nova_compute[187185]: 2025-11-29 07:42:37.422 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:42:37 compute-0 nova_compute[187185]: 2025-11-29 07:42:37.473 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:42:39 compute-0 podman[243646]: 2025-11-29 07:42:39.824559638 +0000 UTC m=+0.080339110 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:42:39 compute-0 podman[243647]: 2025-11-29 07:42:39.846041047 +0000 UTC m=+0.088312775 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Nov 29 07:42:40 compute-0 nova_compute[187185]: 2025-11-29 07:42:40.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:42:40 compute-0 nova_compute[187185]: 2025-11-29 07:42:40.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 29 07:42:42 compute-0 nova_compute[187185]: 2025-11-29 07:42:42.424 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:42:42 compute-0 nova_compute[187185]: 2025-11-29 07:42:42.477 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:42:45 compute-0 nova_compute[187185]: 2025-11-29 07:42:45.957 187189 DEBUG nova.compute.manager [req-3f9eb1c6-f1f0-44dd-9f7d-597ba7b937d2 req-03517c37-f42a-4ce5-9c54-6d4a27ed9c53 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Received event network-changed-66645927-47fc-4df8-b8f3-2254e1a841ed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:42:45 compute-0 nova_compute[187185]: 2025-11-29 07:42:45.958 187189 DEBUG nova.compute.manager [req-3f9eb1c6-f1f0-44dd-9f7d-597ba7b937d2 req-03517c37-f42a-4ce5-9c54-6d4a27ed9c53 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Refreshing instance network info cache due to event network-changed-66645927-47fc-4df8-b8f3-2254e1a841ed. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:42:45 compute-0 nova_compute[187185]: 2025-11-29 07:42:45.958 187189 DEBUG oslo_concurrency.lockutils [req-3f9eb1c6-f1f0-44dd-9f7d-597ba7b937d2 req-03517c37-f42a-4ce5-9c54-6d4a27ed9c53 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-9b64431b-400c-4b7e-b7bf-986103d270c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:42:45 compute-0 nova_compute[187185]: 2025-11-29 07:42:45.959 187189 DEBUG oslo_concurrency.lockutils [req-3f9eb1c6-f1f0-44dd-9f7d-597ba7b937d2 req-03517c37-f42a-4ce5-9c54-6d4a27ed9c53 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-9b64431b-400c-4b7e-b7bf-986103d270c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:42:45 compute-0 nova_compute[187185]: 2025-11-29 07:42:45.959 187189 DEBUG nova.network.neutron [req-3f9eb1c6-f1f0-44dd-9f7d-597ba7b937d2 req-03517c37-f42a-4ce5-9c54-6d4a27ed9c53 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Refreshing network info cache for port 66645927-47fc-4df8-b8f3-2254e1a841ed _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:42:47 compute-0 nova_compute[187185]: 2025-11-29 07:42:47.470 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:42:47 compute-0 nova_compute[187185]: 2025-11-29 07:42:47.478 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.016 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '9b64431b-400c-4b7e-b7bf-986103d270c2', 'name': 'tempest-TestGettingAddress-server-715278411', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000a0', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '0111c22b4b954ea586ca20d91ed3970f', 'user_id': '31ac7b05b012433b89143dc9f259644a', 'hostId': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.017 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.017 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.018 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-715278411>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-715278411>]
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.018 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.018 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.019 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-715278411>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-715278411>]
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.019 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.023 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 9b64431b-400c-4b7e-b7bf-986103d270c2 / tap66645927-47 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.024 12 DEBUG ceilometer.compute.pollsters [-] 9b64431b-400c-4b7e-b7bf-986103d270c2/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '11b03e13-3fd9-4ec3-8f6d-3e0e6cc64828', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-000000a0-9b64431b-400c-4b7e-b7bf-986103d270c2-tap66645927-47', 'timestamp': '2025-11-29T07:42:48.019771', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-715278411', 'name': 'tap66645927-47', 'instance_id': '9b64431b-400c-4b7e-b7bf-986103d270c2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7c:02:c3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap66645927-47'}, 'message_id': '002a58f8-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7615.738073261, 'message_signature': '4eae8824d41087ec2667c03228a7ce8d6b7d4826fd03c380b75e95fb198c07a5'}]}, 'timestamp': '2025-11-29 07:42:48.025299', '_unique_id': '6ddb662fe2e0488888d3728d246ff3c4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.027 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.029 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.030 12 DEBUG ceilometer.compute.pollsters [-] 9b64431b-400c-4b7e-b7bf-986103d270c2/network.incoming.packets volume: 167 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e9c83358-8035-48db-b479-2548fe21bce3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 167, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-000000a0-9b64431b-400c-4b7e-b7bf-986103d270c2-tap66645927-47', 'timestamp': '2025-11-29T07:42:48.030074', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-715278411', 'name': 'tap66645927-47', 'instance_id': '9b64431b-400c-4b7e-b7bf-986103d270c2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7c:02:c3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap66645927-47'}, 'message_id': '002b30c0-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7615.738073261, 'message_signature': 'ef4d30c24ab4fb5962493699b8030cdaaaca9f3db0399839620e872074d23a1a'}]}, 'timestamp': '2025-11-29 07:42:48.030703', '_unique_id': '68f5d03b7135428b8a1e7cb82e604a0c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.031 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.033 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.080 12 DEBUG ceilometer.compute.pollsters [-] 9b64431b-400c-4b7e-b7bf-986103d270c2/disk.device.write.requests volume: 323 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.081 12 DEBUG ceilometer.compute.pollsters [-] 9b64431b-400c-4b7e-b7bf-986103d270c2/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b72ce51a-6cec-4366-b8e0-f0aa18580e6b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 323, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '9b64431b-400c-4b7e-b7bf-986103d270c2-vda', 'timestamp': '2025-11-29T07:42:48.034078', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-715278411', 'name': 'instance-000000a0', 'instance_id': '9b64431b-400c-4b7e-b7bf-986103d270c2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0032fb3e-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7615.752324755, 'message_signature': 'f472d564349fa0f8ccda546fb91150b9175ba02968fca70f2d542fee49cf2f09'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '9b64431b-400c-4b7e-b7bf-986103d270c2-sda', 'timestamp': '2025-11-29T07:42:48.034078', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-715278411', 'name': 'instance-000000a0', 'instance_id': '9b64431b-400c-4b7e-b7bf-986103d270c2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0033190c-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7615.752324755, 'message_signature': '3a24778c0433a053bed88acbfbf01fca0f99f5da498d9c33ce594fc4d373a4ad'}]}, 'timestamp': '2025-11-29 07:42:48.082504', '_unique_id': '808ccb0abe1a4828968672790633de7d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.084 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.086 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.086 12 DEBUG ceilometer.compute.pollsters [-] 9b64431b-400c-4b7e-b7bf-986103d270c2/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ba9e634f-594a-431b-8e57-1cc49490a359', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-000000a0-9b64431b-400c-4b7e-b7bf-986103d270c2-tap66645927-47', 'timestamp': '2025-11-29T07:42:48.086787', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-715278411', 'name': 'tap66645927-47', 'instance_id': '9b64431b-400c-4b7e-b7bf-986103d270c2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7c:02:c3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap66645927-47'}, 'message_id': '0033de32-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7615.738073261, 'message_signature': '62cf5bfd8faaf47c274efc4876ce8a577a169fc873830f3976bcb1cadc46d736'}]}, 'timestamp': '2025-11-29 07:42:48.087646', '_unique_id': '5adfc7bf18404deebda7ac808435b43c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.089 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.091 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.122 12 DEBUG ceilometer.compute.pollsters [-] 9b64431b-400c-4b7e-b7bf-986103d270c2/memory.usage volume: 46.73046875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ea537c83-df16-4068-83e8-608913a9cfca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 46.73046875, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '9b64431b-400c-4b7e-b7bf-986103d270c2', 'timestamp': '2025-11-29T07:42:48.091639', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-715278411', 'name': 'instance-000000a0', 'instance_id': '9b64431b-400c-4b7e-b7bf-986103d270c2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '003953bc-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7615.840368022, 'message_signature': 'b68aa840340b3f897f64716e8845fe0b60d9dd1d05da326de19d01dfdf3c6eed'}]}, 'timestamp': '2025-11-29 07:42:48.123418', '_unique_id': '14ce3e60cf1e41aeb3cda8762d4a354c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.125 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.127 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.127 12 DEBUG ceilometer.compute.pollsters [-] 9b64431b-400c-4b7e-b7bf-986103d270c2/disk.device.write.latency volume: 27892683303 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.128 12 DEBUG ceilometer.compute.pollsters [-] 9b64431b-400c-4b7e-b7bf-986103d270c2/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9da8644c-b6bb-4018-853d-b50194aaa490', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 27892683303, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '9b64431b-400c-4b7e-b7bf-986103d270c2-vda', 'timestamp': '2025-11-29T07:42:48.127579', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-715278411', 'name': 'instance-000000a0', 'instance_id': '9b64431b-400c-4b7e-b7bf-986103d270c2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '003a1388-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7615.752324755, 'message_signature': 'f91e996ed3e554713cb17f46b9a5f8d435f409cfa3adcf55183893dc074027bc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 
'resource_id': '9b64431b-400c-4b7e-b7bf-986103d270c2-sda', 'timestamp': '2025-11-29T07:42:48.127579', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-715278411', 'name': 'instance-000000a0', 'instance_id': '9b64431b-400c-4b7e-b7bf-986103d270c2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '003a28c8-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7615.752324755, 'message_signature': '0bba1c7fc78fb31b620a260f6302c17f506a96427785a0f7fbdc63d78f464123'}]}, 'timestamp': '2025-11-29 07:42:48.128763', '_unique_id': '44155bf430a74ca7bc055e14de570878'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.131 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.133 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.133 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-715278411>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-715278411>]
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.133 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.134 12 DEBUG ceilometer.compute.pollsters [-] 9b64431b-400c-4b7e-b7bf-986103d270c2/network.outgoing.bytes volume: 30164 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ccc4c83b-5f4b-46cd-8966-7c67c1a7d7a9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30164, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-000000a0-9b64431b-400c-4b7e-b7bf-986103d270c2-tap66645927-47', 'timestamp': '2025-11-29T07:42:48.134095', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-715278411', 'name': 'tap66645927-47', 'instance_id': '9b64431b-400c-4b7e-b7bf-986103d270c2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7c:02:c3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap66645927-47'}, 'message_id': '003b0fc2-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7615.738073261, 'message_signature': '9bb968d9c2275b7975e9d228ba9365bdd77a52d609f8284461d000bf83a0517b'}]}, 'timestamp': '2025-11-29 07:42:48.134821', '_unique_id': '899d5da3aeb64eb1bbc7cf270e5c0a4c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.136 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.138 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.156 12 DEBUG ceilometer.compute.pollsters [-] 9b64431b-400c-4b7e-b7bf-986103d270c2/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.157 12 DEBUG ceilometer.compute.pollsters [-] 9b64431b-400c-4b7e-b7bf-986103d270c2/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'db679995-bea4-4f52-9101-91cb4d6742b2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '9b64431b-400c-4b7e-b7bf-986103d270c2-vda', 'timestamp': '2025-11-29T07:42:48.138560', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-715278411', 'name': 'instance-000000a0', 'instance_id': '9b64431b-400c-4b7e-b7bf-986103d270c2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '003e7ec8-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7615.85686211, 'message_signature': '4969532f598aa441eb6ea4c3ab54f7840d9118b1274a28cb992c4ccceca661aa'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '9b64431b-400c-4b7e-b7bf-986103d270c2-sda', 'timestamp': '2025-11-29T07:42:48.138560', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-715278411', 'name': 'instance-000000a0', 'instance_id': '9b64431b-400c-4b7e-b7bf-986103d270c2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '003e98cc-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7615.85686211, 'message_signature': '5b32897eb2fd72d4267dc8c73ee7e521e9b35ae6a2bc5fb90d0e6e15d4f927aa'}]}, 'timestamp': '2025-11-29 07:42:48.157998', '_unique_id': 'f47ed508b59948aba633017eb98dd256'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.159 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.161 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.161 12 DEBUG ceilometer.compute.pollsters [-] 9b64431b-400c-4b7e-b7bf-986103d270c2/network.outgoing.packets volume: 189 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f92e2e1b-0965-4b55-be26-39b0dc251637', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 189, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-000000a0-9b64431b-400c-4b7e-b7bf-986103d270c2-tap66645927-47', 'timestamp': '2025-11-29T07:42:48.161694', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-715278411', 'name': 'tap66645927-47', 'instance_id': '9b64431b-400c-4b7e-b7bf-986103d270c2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7c:02:c3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap66645927-47'}, 'message_id': '003f4ac4-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7615.738073261, 'message_signature': 'd8981b6cac3d6075fe84acdfd2379a2e516e1dd342647310743712d3569db486'}]}, 'timestamp': '2025-11-29 07:42:48.162522', '_unique_id': 'beb30a9ec68942c0ab5e39d0fcb21756'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.163 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.166 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.166 12 DEBUG ceilometer.compute.pollsters [-] 9b64431b-400c-4b7e-b7bf-986103d270c2/disk.device.write.bytes volume: 72970240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.167 12 DEBUG ceilometer.compute.pollsters [-] 9b64431b-400c-4b7e-b7bf-986103d270c2/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f86dd442-8ef2-4269-9075-9bcda0ab3c15', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72970240, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '9b64431b-400c-4b7e-b7bf-986103d270c2-vda', 'timestamp': '2025-11-29T07:42:48.166420', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-715278411', 'name': 'instance-000000a0', 'instance_id': '9b64431b-400c-4b7e-b7bf-986103d270c2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '003fff1e-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7615.752324755, 'message_signature': '065d4f3cc3295bf6a0f2cbde15b2caaaf444eb0552bd285f93f7968e9ee684a1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '9b64431b-400c-4b7e-b7bf-986103d270c2-sda', 'timestamp': '2025-11-29T07:42:48.166420', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-715278411', 'name': 'instance-000000a0', 'instance_id': '9b64431b-400c-4b7e-b7bf-986103d270c2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '004019ea-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7615.752324755, 'message_signature': '3ba69805cb02db4841769ae6f77423143f761af3291d54d1e0bb9e52fa0d2f1b'}]}, 'timestamp': '2025-11-29 07:42:48.167788', '_unique_id': '062b8e40ba1d40288a796077eaa08a1d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.170 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.170 12 DEBUG ceilometer.compute.pollsters [-] 9b64431b-400c-4b7e-b7bf-986103d270c2/disk.device.read.latency volume: 1324205224 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.171 12 DEBUG ceilometer.compute.pollsters [-] 9b64431b-400c-4b7e-b7bf-986103d270c2/disk.device.read.latency volume: 33386971 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bdd1b855-2169-43b2-b88a-bc4853bb6724', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1324205224, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '9b64431b-400c-4b7e-b7bf-986103d270c2-vda', 'timestamp': '2025-11-29T07:42:48.170867', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-715278411', 'name': 'instance-000000a0', 'instance_id': '9b64431b-400c-4b7e-b7bf-986103d270c2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0040a946-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7615.752324755, 'message_signature': '6d4de543d30a7f7b2bd043553b37566296d4533d58c1479aebfbd23fa2254d0f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 33386971, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '9b64431b-400c-4b7e-b7bf-986103d270c2-sda', 'timestamp': '2025-11-29T07:42:48.170867', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-715278411', 'name': 'instance-000000a0', 'instance_id': '9b64431b-400c-4b7e-b7bf-986103d270c2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0040bae4-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7615.752324755, 'message_signature': '54472af42a7eec6d0a4f9603e54e9114361553bdf73d0a59804f706fa99d87cd'}]}, 'timestamp': '2025-11-29 07:42:48.171803', '_unique_id': '67abf835179845ca8ba90bd444e09820'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.172 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.174 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.174 12 DEBUG ceilometer.compute.pollsters [-] 9b64431b-400c-4b7e-b7bf-986103d270c2/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4dff3006-4751-4402-9246-2b5c10aa5299', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-000000a0-9b64431b-400c-4b7e-b7bf-986103d270c2-tap66645927-47', 'timestamp': '2025-11-29T07:42:48.174310', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-715278411', 'name': 'tap66645927-47', 'instance_id': '9b64431b-400c-4b7e-b7bf-986103d270c2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7c:02:c3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap66645927-47'}, 'message_id': '00412f88-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7615.738073261, 'message_signature': '87bba5d74576d89c2167da43d65f3b0d8bf690c865746bffb04a98a27ed5b52b'}]}, 'timestamp': '2025-11-29 07:42:48.174816', '_unique_id': '43cb6fd822bc4493b12be3ab502fbddb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.175 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.177 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.177 12 DEBUG ceilometer.compute.pollsters [-] 9b64431b-400c-4b7e-b7bf-986103d270c2/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ea42d194-f97b-48a4-8155-e7733c613a0b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-000000a0-9b64431b-400c-4b7e-b7bf-986103d270c2-tap66645927-47', 'timestamp': '2025-11-29T07:42:48.177143', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-715278411', 'name': 'tap66645927-47', 'instance_id': '9b64431b-400c-4b7e-b7bf-986103d270c2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7c:02:c3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap66645927-47'}, 'message_id': '00419f18-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7615.738073261, 'message_signature': '65af221d64fa6c5277380c3e7df3532ab68eb0cd0728d02ffeae2f991b59d720'}]}, 'timestamp': '2025-11-29 07:42:48.177694', '_unique_id': '269fb8c9a4574a2bb99cc009e6355183'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.178 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.179 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.180 12 DEBUG ceilometer.compute.pollsters [-] 9b64431b-400c-4b7e-b7bf-986103d270c2/cpu volume: 12280000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '771a9be0-2ac2-4667-9f0e-783afcb0120f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12280000000, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '9b64431b-400c-4b7e-b7bf-986103d270c2', 'timestamp': '2025-11-29T07:42:48.180122', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-715278411', 'name': 'instance-000000a0', 'instance_id': '9b64431b-400c-4b7e-b7bf-986103d270c2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '004211a0-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7615.840368022, 'message_signature': '3886ed019a7290dc11d9149126baa586c2e1bf2dcf104e9f99cb4158ca662886'}]}, 'timestamp': '2025-11-29 07:42:48.180623', '_unique_id': '585a17977d974898a090daa2701e0a3f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.181 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.182 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.183 12 DEBUG ceilometer.compute.pollsters [-] 9b64431b-400c-4b7e-b7bf-986103d270c2/network.incoming.bytes volume: 30257 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e17dc7ff-288f-4bd8-913b-a84b13b26811', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30257, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-000000a0-9b64431b-400c-4b7e-b7bf-986103d270c2-tap66645927-47', 'timestamp': '2025-11-29T07:42:48.183040', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-715278411', 'name': 'tap66645927-47', 'instance_id': '9b64431b-400c-4b7e-b7bf-986103d270c2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7c:02:c3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap66645927-47'}, 'message_id': '004283f6-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7615.738073261, 'message_signature': '09c32455b0f02fedee9f97665b0e04ebbf91e7115737916ba7917b680f028d8b'}]}, 'timestamp': '2025-11-29 07:42:48.183530', '_unique_id': '61514e58cce74208b0b50099455f89a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.184 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.185 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.185 12 DEBUG ceilometer.compute.pollsters [-] 9b64431b-400c-4b7e-b7bf-986103d270c2/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.186 12 DEBUG ceilometer.compute.pollsters [-] 9b64431b-400c-4b7e-b7bf-986103d270c2/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '67f63038-8fed-4349-93fa-029a6c5bdc67', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '9b64431b-400c-4b7e-b7bf-986103d270c2-vda', 'timestamp': '2025-11-29T07:42:48.185878', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-715278411', 'name': 'instance-000000a0', 'instance_id': '9b64431b-400c-4b7e-b7bf-986103d270c2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0042f2f0-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7615.85686211, 'message_signature': '5813937cefe6cbe45724bb70d65e87b45cf5a3cdb225977f2652cb12ba286858'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '9b64431b-400c-4b7e-b7bf-986103d270c2-sda', 'timestamp': '2025-11-29T07:42:48.185878', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-715278411', 'name': 'instance-000000a0', 'instance_id': '9b64431b-400c-4b7e-b7bf-986103d270c2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '00430420-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7615.85686211, 'message_signature': 'ae95ebfee39d09478a42aa45b8044fc792635f84ab44873c4446d4383f8b0fa0'}]}, 'timestamp': '2025-11-29 07:42:48.186728', '_unique_id': '326f9f22953b42a39c6ef1dbeba9600f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.187 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.188 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.188 12 DEBUG ceilometer.compute.pollsters [-] 9b64431b-400c-4b7e-b7bf-986103d270c2/disk.device.read.requests volume: 1101 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.188 12 DEBUG ceilometer.compute.pollsters [-] 9b64431b-400c-4b7e-b7bf-986103d270c2/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5f2e2bc5-d63a-48be-a26e-a0b531f219ac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1101, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '9b64431b-400c-4b7e-b7bf-986103d270c2-vda', 'timestamp': '2025-11-29T07:42:48.188181', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-715278411', 'name': 'instance-000000a0', 'instance_id': '9b64431b-400c-4b7e-b7bf-986103d270c2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '004348cc-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7615.752324755, 'message_signature': '999885c71952ced1c80c8091b81f6f747176824c226c835297c4ed627b29ef95'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '9b64431b-400c-4b7e-b7bf-986103d270c2-sda', 'timestamp': '2025-11-29T07:42:48.188181', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-715278411', 'name': 'instance-000000a0', 'instance_id': '9b64431b-400c-4b7e-b7bf-986103d270c2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '004352f4-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7615.752324755, 'message_signature': 'c7524d694d9549558c9b774a64253b3b8dc828ff3ee3a291e993087925e698f8'}]}, 'timestamp': '2025-11-29 07:42:48.188720', '_unique_id': '78d0a61d81e9457abe7f7b04df25a07a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.189 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.190 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.190 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.190 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-715278411>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-715278411>]
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.190 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.190 12 DEBUG ceilometer.compute.pollsters [-] 9b64431b-400c-4b7e-b7bf-986103d270c2/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 DEBUG ceilometer.compute.pollsters [-] 9b64431b-400c-4b7e-b7bf-986103d270c2/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '00be639e-24bb-49f1-ae53-ceb491ee7a31', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '9b64431b-400c-4b7e-b7bf-986103d270c2-vda', 'timestamp': '2025-11-29T07:42:48.190769', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-715278411', 'name': 'instance-000000a0', 'instance_id': '9b64431b-400c-4b7e-b7bf-986103d270c2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0043aeac-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7615.85686211, 'message_signature': 'ec38a42563f52fe91f1b6b6b4ad33a4c7828d6b3887dcf916519742c79ab5432'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '9b64431b-400c-4b7e-b7bf-986103d270c2-sda', 'timestamp': '2025-11-29T07:42:48.190769', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-715278411', 'name': 'instance-000000a0', 'instance_id': '9b64431b-400c-4b7e-b7bf-986103d270c2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0043b988-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7615.85686211, 'message_signature': 'cfffc3e4908cc16417a3496f7c929179014118a9e9bd44b320448b03d8823663'}]}, 'timestamp': '2025-11-29 07:42:48.191346', '_unique_id': '17fc4780cd6c4fb1a38e4afe7c2eb6c5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.192 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.192 12 DEBUG ceilometer.compute.pollsters [-] 9b64431b-400c-4b7e-b7bf-986103d270c2/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dd44389b-5f9d-4dfa-834b-4adb9fc7c2f5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-000000a0-9b64431b-400c-4b7e-b7bf-986103d270c2-tap66645927-47', 'timestamp': '2025-11-29T07:42:48.192767', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-715278411', 'name': 'tap66645927-47', 'instance_id': '9b64431b-400c-4b7e-b7bf-986103d270c2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7c:02:c3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap66645927-47'}, 'message_id': '0043fcf4-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7615.738073261, 'message_signature': '35cc69a4f893a9a9be73a32f109d0dc6e15e6eed050a0278fc23ce94e1ba69b3'}]}, 'timestamp': '2025-11-29 07:42:48.193100', '_unique_id': 'daae423b77be402caae7bec53039a452'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.193 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.194 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.194 12 DEBUG ceilometer.compute.pollsters [-] 9b64431b-400c-4b7e-b7bf-986103d270c2/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '096e54c1-9502-4e26-8816-c11410785658', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-000000a0-9b64431b-400c-4b7e-b7bf-986103d270c2-tap66645927-47', 'timestamp': '2025-11-29T07:42:48.194525', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-715278411', 'name': 'tap66645927-47', 'instance_id': '9b64431b-400c-4b7e-b7bf-986103d270c2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7c:02:c3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap66645927-47'}, 'message_id': '004440ba-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7615.738073261, 'message_signature': '9824819bd22894130a4c0d314188e9911d5567f75560636953205b8f5aafce1b'}]}, 'timestamp': '2025-11-29 07:42:48.194828', '_unique_id': '4d24597995884a4bbfe7507e2139567d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.195 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.196 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.196 12 DEBUG ceilometer.compute.pollsters [-] 9b64431b-400c-4b7e-b7bf-986103d270c2/disk.device.read.bytes volume: 30439936 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.196 12 DEBUG ceilometer.compute.pollsters [-] 9b64431b-400c-4b7e-b7bf-986103d270c2/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '468914ce-bc55-4878-a34f-f4bdac4554f8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30439936, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '9b64431b-400c-4b7e-b7bf-986103d270c2-vda', 'timestamp': '2025-11-29T07:42:48.196267', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-715278411', 'name': 'instance-000000a0', 'instance_id': '9b64431b-400c-4b7e-b7bf-986103d270c2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0044849e-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7615.752324755, 'message_signature': '372885797cea1b5ba9342664556344bf8c8d1481e5abcbc72894a2b64efd0a5f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '9b64431b-400c-4b7e-b7bf-986103d270c2-sda', 'timestamp': '2025-11-29T07:42:48.196267', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-715278411', 'name': 'instance-000000a0', 'instance_id': '9b64431b-400c-4b7e-b7bf-986103d270c2', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '00448eee-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7615.752324755, 'message_signature': 'a7c82ad5529ab926bf1a10bc4a0ef50345b23c2127008d079a6dcc2178d69d85'}]}, 'timestamp': '2025-11-29 07:42:48.196808', '_unique_id': '14916e69019041a09640a97ec8943d7d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:42:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:42:48.197 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:42:50 compute-0 podman[243688]: 2025-11-29 07:42:50.784748829 +0000 UTC m=+0.046202062 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 07:42:50 compute-0 podman[243686]: 2025-11-29 07:42:50.789942786 +0000 UTC m=+0.055279759 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 
Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 07:42:50 compute-0 podman[243687]: 2025-11-29 07:42:50.792006115 +0000 UTC m=+0.054375594 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.buildah.version=1.33.7, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter)
Nov 29 07:42:51 compute-0 nova_compute[187185]: 2025-11-29 07:42:51.307 187189 DEBUG oslo_concurrency.lockutils [None req-d86f221e-d116-4180-b5fd-65bc6ddbb8da 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "9b64431b-400c-4b7e-b7bf-986103d270c2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:42:51 compute-0 nova_compute[187185]: 2025-11-29 07:42:51.308 187189 DEBUG oslo_concurrency.lockutils [None req-d86f221e-d116-4180-b5fd-65bc6ddbb8da 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "9b64431b-400c-4b7e-b7bf-986103d270c2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:42:51 compute-0 nova_compute[187185]: 2025-11-29 07:42:51.309 187189 DEBUG oslo_concurrency.lockutils [None req-d86f221e-d116-4180-b5fd-65bc6ddbb8da 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "9b64431b-400c-4b7e-b7bf-986103d270c2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:42:51 compute-0 nova_compute[187185]: 2025-11-29 07:42:51.309 187189 DEBUG oslo_concurrency.lockutils [None req-d86f221e-d116-4180-b5fd-65bc6ddbb8da 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "9b64431b-400c-4b7e-b7bf-986103d270c2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:42:51 compute-0 nova_compute[187185]: 2025-11-29 07:42:51.309 187189 DEBUG oslo_concurrency.lockutils [None req-d86f221e-d116-4180-b5fd-65bc6ddbb8da 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "9b64431b-400c-4b7e-b7bf-986103d270c2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:42:51 compute-0 nova_compute[187185]: 2025-11-29 07:42:51.506 187189 INFO nova.compute.manager [None req-d86f221e-d116-4180-b5fd-65bc6ddbb8da 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Terminating instance
Nov 29 07:42:51 compute-0 nova_compute[187185]: 2025-11-29 07:42:51.531 187189 DEBUG nova.compute.manager [None req-d86f221e-d116-4180-b5fd-65bc6ddbb8da 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 07:42:51 compute-0 kernel: tap66645927-47 (unregistering): left promiscuous mode
Nov 29 07:42:51 compute-0 NetworkManager[55227]: <info>  [1764402171.5550] device (tap66645927-47): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:42:51 compute-0 nova_compute[187185]: 2025-11-29 07:42:51.568 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:42:51 compute-0 ovn_controller[95281]: 2025-11-29T07:42:51Z|00543|binding|INFO|Releasing lport 66645927-47fc-4df8-b8f3-2254e1a841ed from this chassis (sb_readonly=0)
Nov 29 07:42:51 compute-0 ovn_controller[95281]: 2025-11-29T07:42:51Z|00544|binding|INFO|Setting lport 66645927-47fc-4df8-b8f3-2254e1a841ed down in Southbound
Nov 29 07:42:51 compute-0 ovn_controller[95281]: 2025-11-29T07:42:51Z|00545|binding|INFO|Removing iface tap66645927-47 ovn-installed in OVS
Nov 29 07:42:51 compute-0 nova_compute[187185]: 2025-11-29 07:42:51.573 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:42:51 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:42:51.582 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:02:c3 10.100.0.13 2001:db8:0:1:f816:3eff:fe7c:2c3 2001:db8::f816:3eff:fe7c:2c3'], port_security=['fa:16:3e:7c:02:c3 10.100.0.13 2001:db8:0:1:f816:3eff:fe7c:2c3 2001:db8::f816:3eff:fe7c:2c3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28 2001:db8:0:1:f816:3eff:fe7c:2c3/64 2001:db8::f816:3eff:fe7c:2c3/64', 'neutron:device_id': '9b64431b-400c-4b7e-b7bf-986103d270c2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-600edac6-24aa-414f-b977-07c2890470f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e2af89ad-a80e-4dc1-aa45-ab6ce3534b4f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=de1096f6-2a15-4f04-9ea7-22d2dff24e74, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=66645927-47fc-4df8-b8f3-2254e1a841ed) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:42:51 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:42:51.584 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 66645927-47fc-4df8-b8f3-2254e1a841ed in datapath 600edac6-24aa-414f-b977-07c2890470f1 unbound from our chassis
Nov 29 07:42:51 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:42:51.588 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 600edac6-24aa-414f-b977-07c2890470f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:42:51 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:42:51.590 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[9e099101-3c93-44fa-9021-bce75d1169b7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:42:51 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:42:51.591 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-600edac6-24aa-414f-b977-07c2890470f1 namespace which is not needed anymore
Nov 29 07:42:51 compute-0 nova_compute[187185]: 2025-11-29 07:42:51.591 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:42:51 compute-0 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d000000a0.scope: Deactivated successfully.
Nov 29 07:42:51 compute-0 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d000000a0.scope: Consumed 15.208s CPU time.
Nov 29 07:42:51 compute-0 systemd-machined[153486]: Machine qemu-63-instance-000000a0 terminated.
Nov 29 07:42:51 compute-0 nova_compute[187185]: 2025-11-29 07:42:51.816 187189 INFO nova.virt.libvirt.driver [-] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Instance destroyed successfully.
Nov 29 07:42:51 compute-0 nova_compute[187185]: 2025-11-29 07:42:51.818 187189 DEBUG nova.objects.instance [None req-d86f221e-d116-4180-b5fd-65bc6ddbb8da 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'resources' on Instance uuid 9b64431b-400c-4b7e-b7bf-986103d270c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:42:52 compute-0 nova_compute[187185]: 2025-11-29 07:42:52.022 187189 DEBUG nova.virt.libvirt.vif [None req-d86f221e-d116-4180-b5fd-65bc6ddbb8da 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:41:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-715278411',display_name='tempest-TestGettingAddress-server-715278411',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-715278411',id=160,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD3d0ZsvCuIHNmZM7lmf14lwcU9LYA+YzS+DsUoU/RUNt3FNYs43WlwoA0reTsUUFEVQa4lagWavf3wAARjW0IrVdX6QLhMZ1dtoKB8yeTuH2S9PjazhePg7oe9bdCIqjQ==',key_name='tempest-TestGettingAddress-1639164498',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:42:12Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-0feovezc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:42:13Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=9b64431b-400c-4b7e-b7bf-986103d270c2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "66645927-47fc-4df8-b8f3-2254e1a841ed", "address": "fa:16:3e:7c:02:c3", "network": {"id": "600edac6-24aa-414f-b977-07c2890470f1", "bridge": "br-int", "label": "tempest-network-smoke--176960828", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe7c:2c3", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe7c:2c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66645927-47", "ovs_interfaceid": "66645927-47fc-4df8-b8f3-2254e1a841ed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:42:52 compute-0 nova_compute[187185]: 2025-11-29 07:42:52.023 187189 DEBUG nova.network.os_vif_util [None req-d86f221e-d116-4180-b5fd-65bc6ddbb8da 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "66645927-47fc-4df8-b8f3-2254e1a841ed", "address": "fa:16:3e:7c:02:c3", "network": {"id": "600edac6-24aa-414f-b977-07c2890470f1", "bridge": "br-int", "label": "tempest-network-smoke--176960828", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe7c:2c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe7c:2c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66645927-47", "ovs_interfaceid": "66645927-47fc-4df8-b8f3-2254e1a841ed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:42:52 compute-0 nova_compute[187185]: 2025-11-29 07:42:52.023 187189 DEBUG nova.network.os_vif_util [None req-d86f221e-d116-4180-b5fd-65bc6ddbb8da 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7c:02:c3,bridge_name='br-int',has_traffic_filtering=True,id=66645927-47fc-4df8-b8f3-2254e1a841ed,network=Network(600edac6-24aa-414f-b977-07c2890470f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66645927-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:42:52 compute-0 nova_compute[187185]: 2025-11-29 07:42:52.024 187189 DEBUG os_vif [None req-d86f221e-d116-4180-b5fd-65bc6ddbb8da 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7c:02:c3,bridge_name='br-int',has_traffic_filtering=True,id=66645927-47fc-4df8-b8f3-2254e1a841ed,network=Network(600edac6-24aa-414f-b977-07c2890470f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66645927-47') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:42:52 compute-0 nova_compute[187185]: 2025-11-29 07:42:52.025 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:42:52 compute-0 nova_compute[187185]: 2025-11-29 07:42:52.026 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap66645927-47, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:42:52 compute-0 nova_compute[187185]: 2025-11-29 07:42:52.029 187189 DEBUG nova.compute.manager [req-1d520def-3a39-4862-85ef-784fda284022 req-f742c279-c206-4ef2-893b-fd799225d89d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Received event network-vif-unplugged-66645927-47fc-4df8-b8f3-2254e1a841ed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:42:52 compute-0 nova_compute[187185]: 2025-11-29 07:42:52.029 187189 DEBUG oslo_concurrency.lockutils [req-1d520def-3a39-4862-85ef-784fda284022 req-f742c279-c206-4ef2-893b-fd799225d89d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "9b64431b-400c-4b7e-b7bf-986103d270c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:42:52 compute-0 nova_compute[187185]: 2025-11-29 07:42:52.029 187189 DEBUG oslo_concurrency.lockutils [req-1d520def-3a39-4862-85ef-784fda284022 req-f742c279-c206-4ef2-893b-fd799225d89d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9b64431b-400c-4b7e-b7bf-986103d270c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:42:52 compute-0 nova_compute[187185]: 2025-11-29 07:42:52.030 187189 DEBUG oslo_concurrency.lockutils [req-1d520def-3a39-4862-85ef-784fda284022 req-f742c279-c206-4ef2-893b-fd799225d89d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9b64431b-400c-4b7e-b7bf-986103d270c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:42:52 compute-0 nova_compute[187185]: 2025-11-29 07:42:52.030 187189 DEBUG nova.compute.manager [req-1d520def-3a39-4862-85ef-784fda284022 req-f742c279-c206-4ef2-893b-fd799225d89d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] No waiting events found dispatching network-vif-unplugged-66645927-47fc-4df8-b8f3-2254e1a841ed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:42:52 compute-0 nova_compute[187185]: 2025-11-29 07:42:52.030 187189 DEBUG nova.compute.manager [req-1d520def-3a39-4862-85ef-784fda284022 req-f742c279-c206-4ef2-893b-fd799225d89d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Received event network-vif-unplugged-66645927-47fc-4df8-b8f3-2254e1a841ed for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 07:42:52 compute-0 nova_compute[187185]: 2025-11-29 07:42:52.030 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:42:52 compute-0 nova_compute[187185]: 2025-11-29 07:42:52.031 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:42:52 compute-0 nova_compute[187185]: 2025-11-29 07:42:52.035 187189 INFO os_vif [None req-d86f221e-d116-4180-b5fd-65bc6ddbb8da 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7c:02:c3,bridge_name='br-int',has_traffic_filtering=True,id=66645927-47fc-4df8-b8f3-2254e1a841ed,network=Network(600edac6-24aa-414f-b977-07c2890470f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66645927-47')
Nov 29 07:42:52 compute-0 nova_compute[187185]: 2025-11-29 07:42:52.035 187189 INFO nova.virt.libvirt.driver [None req-d86f221e-d116-4180-b5fd-65bc6ddbb8da 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Deleting instance files /var/lib/nova/instances/9b64431b-400c-4b7e-b7bf-986103d270c2_del
Nov 29 07:42:52 compute-0 nova_compute[187185]: 2025-11-29 07:42:52.036 187189 INFO nova.virt.libvirt.driver [None req-d86f221e-d116-4180-b5fd-65bc6ddbb8da 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Deletion of /var/lib/nova/instances/9b64431b-400c-4b7e-b7bf-986103d270c2_del complete
Nov 29 07:42:52 compute-0 nova_compute[187185]: 2025-11-29 07:42:52.157 187189 INFO nova.compute.manager [None req-d86f221e-d116-4180-b5fd-65bc6ddbb8da 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Took 0.63 seconds to destroy the instance on the hypervisor.
Nov 29 07:42:52 compute-0 nova_compute[187185]: 2025-11-29 07:42:52.160 187189 DEBUG oslo.service.loopingcall [None req-d86f221e-d116-4180-b5fd-65bc6ddbb8da 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 07:42:52 compute-0 nova_compute[187185]: 2025-11-29 07:42:52.162 187189 DEBUG nova.compute.manager [-] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 07:42:52 compute-0 nova_compute[187185]: 2025-11-29 07:42:52.162 187189 DEBUG nova.network.neutron [-] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 07:42:52 compute-0 nova_compute[187185]: 2025-11-29 07:42:52.404 187189 DEBUG nova.network.neutron [req-3f9eb1c6-f1f0-44dd-9f7d-597ba7b937d2 req-03517c37-f42a-4ce5-9c54-6d4a27ed9c53 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Updated VIF entry in instance network info cache for port 66645927-47fc-4df8-b8f3-2254e1a841ed. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:42:52 compute-0 nova_compute[187185]: 2025-11-29 07:42:52.405 187189 DEBUG nova.network.neutron [req-3f9eb1c6-f1f0-44dd-9f7d-597ba7b937d2 req-03517c37-f42a-4ce5-9c54-6d4a27ed9c53 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Updating instance_info_cache with network_info: [{"id": "66645927-47fc-4df8-b8f3-2254e1a841ed", "address": "fa:16:3e:7c:02:c3", "network": {"id": "600edac6-24aa-414f-b977-07c2890470f1", "bridge": "br-int", "label": "tempest-network-smoke--176960828", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe7c:2c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe7c:2c3", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66645927-47", "ovs_interfaceid": "66645927-47fc-4df8-b8f3-2254e1a841ed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:42:52 compute-0 nova_compute[187185]: 2025-11-29 07:42:52.434 187189 DEBUG oslo_concurrency.lockutils [req-3f9eb1c6-f1f0-44dd-9f7d-597ba7b937d2 req-03517c37-f42a-4ce5-9c54-6d4a27ed9c53 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-9b64431b-400c-4b7e-b7bf-986103d270c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:42:52 compute-0 nova_compute[187185]: 2025-11-29 07:42:52.481 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:42:52 compute-0 neutron-haproxy-ovnmeta-600edac6-24aa-414f-b977-07c2890470f1[243491]: [NOTICE]   (243495) : haproxy version is 2.8.14-c23fe91
Nov 29 07:42:52 compute-0 neutron-haproxy-ovnmeta-600edac6-24aa-414f-b977-07c2890470f1[243491]: [NOTICE]   (243495) : path to executable is /usr/sbin/haproxy
Nov 29 07:42:52 compute-0 neutron-haproxy-ovnmeta-600edac6-24aa-414f-b977-07c2890470f1[243491]: [WARNING]  (243495) : Exiting Master process...
Nov 29 07:42:52 compute-0 neutron-haproxy-ovnmeta-600edac6-24aa-414f-b977-07c2890470f1[243491]: [WARNING]  (243495) : Exiting Master process...
Nov 29 07:42:52 compute-0 neutron-haproxy-ovnmeta-600edac6-24aa-414f-b977-07c2890470f1[243491]: [ALERT]    (243495) : Current worker (243497) exited with code 143 (Terminated)
Nov 29 07:42:52 compute-0 neutron-haproxy-ovnmeta-600edac6-24aa-414f-b977-07c2890470f1[243491]: [WARNING]  (243495) : All workers exited. Exiting... (0)
Nov 29 07:42:52 compute-0 systemd[1]: libpod-0297440574c4204c74d79f3c027983bf281387bfa82dc92bc6c6b95d938a13a7.scope: Deactivated successfully.
Nov 29 07:42:52 compute-0 conmon[243491]: conmon 0297440574c4204c74d7 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-0297440574c4204c74d79f3c027983bf281387bfa82dc92bc6c6b95d938a13a7.scope/container/memory.events
Nov 29 07:42:52 compute-0 podman[243769]: 2025-11-29 07:42:52.742557557 +0000 UTC m=+1.032138144 container died 0297440574c4204c74d79f3c027983bf281387bfa82dc92bc6c6b95d938a13a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-600edac6-24aa-414f-b977-07c2890470f1, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:42:52 compute-0 nova_compute[187185]: 2025-11-29 07:42:52.925 187189 DEBUG nova.network.neutron [-] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:42:52 compute-0 nova_compute[187185]: 2025-11-29 07:42:52.948 187189 INFO nova.compute.manager [-] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Took 0.79 seconds to deallocate network for instance.
Nov 29 07:42:52 compute-0 nova_compute[187185]: 2025-11-29 07:42:52.994 187189 DEBUG nova.compute.manager [req-62895dba-687c-4d85-8fc8-135aae053ca1 req-279620ac-1335-4db8-8450-2cc918b8689a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Received event network-vif-deleted-66645927-47fc-4df8-b8f3-2254e1a841ed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:42:53 compute-0 nova_compute[187185]: 2025-11-29 07:42:53.039 187189 DEBUG oslo_concurrency.lockutils [None req-d86f221e-d116-4180-b5fd-65bc6ddbb8da 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:42:53 compute-0 nova_compute[187185]: 2025-11-29 07:42:53.040 187189 DEBUG oslo_concurrency.lockutils [None req-d86f221e-d116-4180-b5fd-65bc6ddbb8da 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:42:53 compute-0 nova_compute[187185]: 2025-11-29 07:42:53.107 187189 DEBUG nova.compute.provider_tree [None req-d86f221e-d116-4180-b5fd-65bc6ddbb8da 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:42:53 compute-0 nova_compute[187185]: 2025-11-29 07:42:53.123 187189 DEBUG nova.scheduler.client.report [None req-d86f221e-d116-4180-b5fd-65bc6ddbb8da 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:42:53 compute-0 nova_compute[187185]: 2025-11-29 07:42:53.146 187189 DEBUG oslo_concurrency.lockutils [None req-d86f221e-d116-4180-b5fd-65bc6ddbb8da 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:42:53 compute-0 nova_compute[187185]: 2025-11-29 07:42:53.175 187189 INFO nova.scheduler.client.report [None req-d86f221e-d116-4180-b5fd-65bc6ddbb8da 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Deleted allocations for instance 9b64431b-400c-4b7e-b7bf-986103d270c2
Nov 29 07:42:53 compute-0 nova_compute[187185]: 2025-11-29 07:42:53.282 187189 DEBUG oslo_concurrency.lockutils [None req-d86f221e-d116-4180-b5fd-65bc6ddbb8da 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "9b64431b-400c-4b7e-b7bf-986103d270c2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.974s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:42:53 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0297440574c4204c74d79f3c027983bf281387bfa82dc92bc6c6b95d938a13a7-userdata-shm.mount: Deactivated successfully.
Nov 29 07:42:53 compute-0 systemd[1]: var-lib-containers-storage-overlay-470a5e0c2553de13b0b77314c3eb020bd2b761116060a514326aa4743e66ea7c-merged.mount: Deactivated successfully.
Nov 29 07:42:54 compute-0 nova_compute[187185]: 2025-11-29 07:42:54.138 187189 DEBUG nova.compute.manager [req-4e3c8754-a199-44c0-a0c7-aa78220c2e2a req-65ff80ae-a077-4ef3-b0d4-009d5e4e7bd0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Received event network-vif-plugged-66645927-47fc-4df8-b8f3-2254e1a841ed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:42:54 compute-0 nova_compute[187185]: 2025-11-29 07:42:54.138 187189 DEBUG oslo_concurrency.lockutils [req-4e3c8754-a199-44c0-a0c7-aa78220c2e2a req-65ff80ae-a077-4ef3-b0d4-009d5e4e7bd0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "9b64431b-400c-4b7e-b7bf-986103d270c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:42:54 compute-0 nova_compute[187185]: 2025-11-29 07:42:54.139 187189 DEBUG oslo_concurrency.lockutils [req-4e3c8754-a199-44c0-a0c7-aa78220c2e2a req-65ff80ae-a077-4ef3-b0d4-009d5e4e7bd0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9b64431b-400c-4b7e-b7bf-986103d270c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:42:54 compute-0 nova_compute[187185]: 2025-11-29 07:42:54.139 187189 DEBUG oslo_concurrency.lockutils [req-4e3c8754-a199-44c0-a0c7-aa78220c2e2a req-65ff80ae-a077-4ef3-b0d4-009d5e4e7bd0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9b64431b-400c-4b7e-b7bf-986103d270c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:42:54 compute-0 nova_compute[187185]: 2025-11-29 07:42:54.139 187189 DEBUG nova.compute.manager [req-4e3c8754-a199-44c0-a0c7-aa78220c2e2a req-65ff80ae-a077-4ef3-b0d4-009d5e4e7bd0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] No waiting events found dispatching network-vif-plugged-66645927-47fc-4df8-b8f3-2254e1a841ed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:42:54 compute-0 nova_compute[187185]: 2025-11-29 07:42:54.140 187189 WARNING nova.compute.manager [req-4e3c8754-a199-44c0-a0c7-aa78220c2e2a req-65ff80ae-a077-4ef3-b0d4-009d5e4e7bd0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Received unexpected event network-vif-plugged-66645927-47fc-4df8-b8f3-2254e1a841ed for instance with vm_state deleted and task_state None.
Nov 29 07:42:54 compute-0 podman[243769]: 2025-11-29 07:42:54.897979381 +0000 UTC m=+3.187559948 container cleanup 0297440574c4204c74d79f3c027983bf281387bfa82dc92bc6c6b95d938a13a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-600edac6-24aa-414f-b977-07c2890470f1, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 07:42:54 compute-0 systemd[1]: libpod-conmon-0297440574c4204c74d79f3c027983bf281387bfa82dc92bc6c6b95d938a13a7.scope: Deactivated successfully.
Nov 29 07:42:55 compute-0 nova_compute[187185]: 2025-11-29 07:42:55.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:42:55 compute-0 podman[243814]: 2025-11-29 07:42:55.509650031 +0000 UTC m=+0.569098473 container remove 0297440574c4204c74d79f3c027983bf281387bfa82dc92bc6c6b95d938a13a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-600edac6-24aa-414f-b977-07c2890470f1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:42:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:42:55.515 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[c1b20b59-bc30-4b68-a4b8-8c5a147c01d5]: (4, ('Sat Nov 29 07:42:51 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-600edac6-24aa-414f-b977-07c2890470f1 (0297440574c4204c74d79f3c027983bf281387bfa82dc92bc6c6b95d938a13a7)\n0297440574c4204c74d79f3c027983bf281387bfa82dc92bc6c6b95d938a13a7\nSat Nov 29 07:42:54 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-600edac6-24aa-414f-b977-07c2890470f1 (0297440574c4204c74d79f3c027983bf281387bfa82dc92bc6c6b95d938a13a7)\n0297440574c4204c74d79f3c027983bf281387bfa82dc92bc6c6b95d938a13a7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:42:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:42:55.517 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e1d0996a-523a-4b93-a662-bfdfb99144d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:42:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:42:55.519 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap600edac6-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:42:55 compute-0 kernel: tap600edac6-20: left promiscuous mode
Nov 29 07:42:55 compute-0 nova_compute[187185]: 2025-11-29 07:42:55.522 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:42:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:42:55.527 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[3720021c-0c6d-4120-a0ab-de149d7ebb8d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:42:55 compute-0 nova_compute[187185]: 2025-11-29 07:42:55.542 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:42:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:42:55.561 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[0ca8e710-250b-4b84-8aec-f1baa3c2ea3a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:42:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:42:55.564 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[9ad9e20d-bf42-4427-976d-143e418051e0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:42:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:42:55.587 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[be0e2b1b-ad79-40e4-85f1-c525090560f9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 757638, 'reachable_time': 31151, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243828, 'error': None, 'target': 'ovnmeta-600edac6-24aa-414f-b977-07c2890470f1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:42:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:42:55.595 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-600edac6-24aa-414f-b977-07c2890470f1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:42:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:42:55.596 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[6ec50fb1-77e0-4744-bee9-fe072b44c861]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:42:55 compute-0 systemd[1]: run-netns-ovnmeta\x2d600edac6\x2d24aa\x2d414f\x2db977\x2d07c2890470f1.mount: Deactivated successfully.
Nov 29 07:42:57 compute-0 nova_compute[187185]: 2025-11-29 07:42:57.028 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:42:57 compute-0 nova_compute[187185]: 2025-11-29 07:42:57.324 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:42:57 compute-0 nova_compute[187185]: 2025-11-29 07:42:57.483 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:42:58 compute-0 sshd-session[243834]: banner exchange: Connection from 65.49.1.152 port 46052: invalid format
Nov 29 07:42:59 compute-0 podman[243835]: 2025-11-29 07:42:59.876060393 +0000 UTC m=+0.126251620 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:43:02 compute-0 nova_compute[187185]: 2025-11-29 07:43:02.032 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:43:02 compute-0 nova_compute[187185]: 2025-11-29 07:43:02.538 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:43:06 compute-0 nova_compute[187185]: 2025-11-29 07:43:06.814 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764402171.8136992, 9b64431b-400c-4b7e-b7bf-986103d270c2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:43:06 compute-0 nova_compute[187185]: 2025-11-29 07:43:06.815 187189 INFO nova.compute.manager [-] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] VM Stopped (Lifecycle Event)
Nov 29 07:43:07 compute-0 nova_compute[187185]: 2025-11-29 07:43:07.034 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:43:07 compute-0 nova_compute[187185]: 2025-11-29 07:43:07.330 187189 DEBUG nova.compute.manager [None req-bd07b1a1-a331-4789-a216-bc5ebc581e96 - - - - - -] [instance: 9b64431b-400c-4b7e-b7bf-986103d270c2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:43:07 compute-0 nova_compute[187185]: 2025-11-29 07:43:07.540 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:43:07 compute-0 podman[243863]: 2025-11-29 07:43:07.796594342 +0000 UTC m=+0.058249973 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 07:43:07 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:43:07.796 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=62, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=61) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:43:07 compute-0 nova_compute[187185]: 2025-11-29 07:43:07.797 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:43:07 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:43:07.799 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:43:08 compute-0 sshd-session[243861]: Received disconnect from 20.255.62.58 port 43018:11: Bye Bye [preauth]
Nov 29 07:43:08 compute-0 sshd-session[243861]: Disconnected from authenticating user root 20.255.62.58 port 43018 [preauth]
Nov 29 07:43:10 compute-0 podman[243886]: 2025-11-29 07:43:10.855088439 +0000 UTC m=+0.113883191 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 07:43:10 compute-0 podman[243887]: 2025-11-29 07:43:10.861925353 +0000 UTC m=+0.109789935 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=edpm)
Nov 29 07:43:11 compute-0 nova_compute[187185]: 2025-11-29 07:43:11.005 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:43:11 compute-0 nova_compute[187185]: 2025-11-29 07:43:11.191 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:43:12 compute-0 nova_compute[187185]: 2025-11-29 07:43:12.037 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:43:12 compute-0 nova_compute[187185]: 2025-11-29 07:43:12.542 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:43:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:43:14.802 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '62'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:43:15 compute-0 podman[201381]: time="2025-11-29T07:43:15Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 29 07:43:15 compute-0 podman[201381]: @ - - [29/Nov/2025:07:43:15 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 22840 "" "Go-http-client/1.1"
Nov 29 07:43:16 compute-0 nova_compute[187185]: 2025-11-29 07:43:16.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:43:16 compute-0 nova_compute[187185]: 2025-11-29 07:43:16.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 29 07:43:16 compute-0 nova_compute[187185]: 2025-11-29 07:43:16.331 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 29 07:43:17 compute-0 nova_compute[187185]: 2025-11-29 07:43:17.040 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:43:17 compute-0 nova_compute[187185]: 2025-11-29 07:43:17.594 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:43:21 compute-0 podman[243929]: 2025-11-29 07:43:21.807645235 +0000 UTC m=+0.060051894 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 07:43:21 compute-0 podman[243928]: 2025-11-29 07:43:21.825728298 +0000 UTC m=+0.075517863 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, release=1755695350, maintainer=Red Hat, Inc., architecture=x86_64, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, managed_by=edpm_ansible)
Nov 29 07:43:21 compute-0 podman[243927]: 2025-11-29 07:43:21.827638662 +0000 UTC m=+0.089734796 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 29 07:43:22 compute-0 nova_compute[187185]: 2025-11-29 07:43:22.043 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:43:22 compute-0 nova_compute[187185]: 2025-11-29 07:43:22.625 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:43:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:43:24.697 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c8:31:c8 10.100.0.2 2001:db8::f816:3eff:fec8:31c8'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fec8:31c8/64', 'neutron:device_id': 'ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e23e9510-a780-4254-b7f0-36040139e7db', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ff62ba9a-db01-45ed-b4a4-c5b2c8f5434e, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=4e85a268-4b8a-4015-a903-2252d696f8f5) old=Port_Binding(mac=['fa:16:3e:c8:31:c8 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e23e9510-a780-4254-b7f0-36040139e7db', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:43:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:43:24.699 104254 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 4e85a268-4b8a-4015-a903-2252d696f8f5 in datapath e23e9510-a780-4254-b7f0-36040139e7db updated
Nov 29 07:43:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:43:24.701 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e23e9510-a780-4254-b7f0-36040139e7db, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:43:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:43:24.702 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[ce616b81-c825-449f-96c3-14b679608336]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:43:25 compute-0 nova_compute[187185]: 2025-11-29 07:43:25.331 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:43:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:43:25.535 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:43:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:43:25.536 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:43:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:43:25.536 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:43:27 compute-0 nova_compute[187185]: 2025-11-29 07:43:27.067 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:43:27 compute-0 nova_compute[187185]: 2025-11-29 07:43:27.628 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:43:28 compute-0 nova_compute[187185]: 2025-11-29 07:43:28.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:43:28 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:43:28.956 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c8:31:c8 10.100.0.2 2001:db8:0:1:f816:3eff:fec8:31c8 2001:db8::f816:3eff:fec8:31c8'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:fec8:31c8/64 2001:db8::f816:3eff:fec8:31c8/64', 'neutron:device_id': 'ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e23e9510-a780-4254-b7f0-36040139e7db', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ff62ba9a-db01-45ed-b4a4-c5b2c8f5434e, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=4e85a268-4b8a-4015-a903-2252d696f8f5) old=Port_Binding(mac=['fa:16:3e:c8:31:c8 10.100.0.2 2001:db8::f816:3eff:fec8:31c8'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fec8:31c8/64', 'neutron:device_id': 'ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e23e9510-a780-4254-b7f0-36040139e7db', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:43:28 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:43:28.958 104254 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 4e85a268-4b8a-4015-a903-2252d696f8f5 in datapath e23e9510-a780-4254-b7f0-36040139e7db updated
Nov 29 07:43:28 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:43:28.959 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e23e9510-a780-4254-b7f0-36040139e7db, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:43:28 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:43:28.960 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[f39e0dff-46e9-4e94-9178-90150665909d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:43:28 compute-0 nova_compute[187185]: 2025-11-29 07:43:28.974 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:43:28 compute-0 nova_compute[187185]: 2025-11-29 07:43:28.974 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:43:28 compute-0 nova_compute[187185]: 2025-11-29 07:43:28.974 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:43:28 compute-0 nova_compute[187185]: 2025-11-29 07:43:28.975 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:43:29 compute-0 nova_compute[187185]: 2025-11-29 07:43:29.154 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:43:29 compute-0 nova_compute[187185]: 2025-11-29 07:43:29.155 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5742MB free_disk=73.25365829467773GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:43:29 compute-0 nova_compute[187185]: 2025-11-29 07:43:29.156 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:43:29 compute-0 nova_compute[187185]: 2025-11-29 07:43:29.156 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:43:29 compute-0 nova_compute[187185]: 2025-11-29 07:43:29.208 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:43:29 compute-0 nova_compute[187185]: 2025-11-29 07:43:29.208 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:43:29 compute-0 nova_compute[187185]: 2025-11-29 07:43:29.229 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:43:29 compute-0 nova_compute[187185]: 2025-11-29 07:43:29.245 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:43:29 compute-0 nova_compute[187185]: 2025-11-29 07:43:29.264 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:43:29 compute-0 nova_compute[187185]: 2025-11-29 07:43:29.265 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:43:30 compute-0 podman[243986]: 2025-11-29 07:43:30.872813959 +0000 UTC m=+0.126575621 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true)
Nov 29 07:43:32 compute-0 nova_compute[187185]: 2025-11-29 07:43:32.069 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:43:32 compute-0 nova_compute[187185]: 2025-11-29 07:43:32.265 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:43:32 compute-0 nova_compute[187185]: 2025-11-29 07:43:32.265 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:43:32 compute-0 nova_compute[187185]: 2025-11-29 07:43:32.265 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:43:32 compute-0 nova_compute[187185]: 2025-11-29 07:43:32.282 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 07:43:32 compute-0 nova_compute[187185]: 2025-11-29 07:43:32.283 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:43:32 compute-0 nova_compute[187185]: 2025-11-29 07:43:32.283 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:43:32 compute-0 nova_compute[187185]: 2025-11-29 07:43:32.283 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:43:32 compute-0 nova_compute[187185]: 2025-11-29 07:43:32.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:43:32 compute-0 nova_compute[187185]: 2025-11-29 07:43:32.670 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:43:33 compute-0 nova_compute[187185]: 2025-11-29 07:43:33.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:43:35 compute-0 sshd-session[244013]: Invalid user support from 78.128.112.74 port 60032
Nov 29 07:43:35 compute-0 sshd-session[244013]: Connection closed by invalid user support 78.128.112.74 port 60032 [preauth]
Nov 29 07:43:35 compute-0 nova_compute[187185]: 2025-11-29 07:43:35.938 187189 DEBUG oslo_concurrency.lockutils [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "461f4281-0d15-4de0-b5c6-d642b24bfab4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:43:35 compute-0 nova_compute[187185]: 2025-11-29 07:43:35.940 187189 DEBUG oslo_concurrency.lockutils [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "461f4281-0d15-4de0-b5c6-d642b24bfab4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:43:35 compute-0 nova_compute[187185]: 2025-11-29 07:43:35.985 187189 DEBUG nova.compute.manager [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 07:43:36 compute-0 nova_compute[187185]: 2025-11-29 07:43:36.122 187189 DEBUG oslo_concurrency.lockutils [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:43:36 compute-0 nova_compute[187185]: 2025-11-29 07:43:36.122 187189 DEBUG oslo_concurrency.lockutils [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:43:36 compute-0 nova_compute[187185]: 2025-11-29 07:43:36.133 187189 DEBUG nova.virt.hardware [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 07:43:36 compute-0 nova_compute[187185]: 2025-11-29 07:43:36.133 187189 INFO nova.compute.claims [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Claim successful on node compute-0.ctlplane.example.com
Nov 29 07:43:36 compute-0 nova_compute[187185]: 2025-11-29 07:43:36.249 187189 DEBUG nova.compute.provider_tree [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:43:36 compute-0 nova_compute[187185]: 2025-11-29 07:43:36.267 187189 DEBUG nova.scheduler.client.report [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:43:36 compute-0 nova_compute[187185]: 2025-11-29 07:43:36.287 187189 DEBUG oslo_concurrency.lockutils [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.165s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:43:36 compute-0 nova_compute[187185]: 2025-11-29 07:43:36.288 187189 DEBUG nova.compute.manager [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 07:43:36 compute-0 nova_compute[187185]: 2025-11-29 07:43:36.355 187189 DEBUG nova.compute.manager [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 07:43:36 compute-0 nova_compute[187185]: 2025-11-29 07:43:36.356 187189 DEBUG nova.network.neutron [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 07:43:36 compute-0 nova_compute[187185]: 2025-11-29 07:43:36.375 187189 INFO nova.virt.libvirt.driver [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 07:43:36 compute-0 nova_compute[187185]: 2025-11-29 07:43:36.397 187189 DEBUG nova.compute.manager [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 07:43:36 compute-0 nova_compute[187185]: 2025-11-29 07:43:36.539 187189 DEBUG nova.compute.manager [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 07:43:36 compute-0 nova_compute[187185]: 2025-11-29 07:43:36.540 187189 DEBUG nova.virt.libvirt.driver [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 07:43:36 compute-0 nova_compute[187185]: 2025-11-29 07:43:36.541 187189 INFO nova.virt.libvirt.driver [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Creating image(s)
Nov 29 07:43:36 compute-0 nova_compute[187185]: 2025-11-29 07:43:36.542 187189 DEBUG oslo_concurrency.lockutils [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "/var/lib/nova/instances/461f4281-0d15-4de0-b5c6-d642b24bfab4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:43:36 compute-0 nova_compute[187185]: 2025-11-29 07:43:36.542 187189 DEBUG oslo_concurrency.lockutils [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "/var/lib/nova/instances/461f4281-0d15-4de0-b5c6-d642b24bfab4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:43:36 compute-0 nova_compute[187185]: 2025-11-29 07:43:36.543 187189 DEBUG oslo_concurrency.lockutils [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "/var/lib/nova/instances/461f4281-0d15-4de0-b5c6-d642b24bfab4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:43:36 compute-0 nova_compute[187185]: 2025-11-29 07:43:36.562 187189 DEBUG oslo_concurrency.processutils [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:43:36 compute-0 nova_compute[187185]: 2025-11-29 07:43:36.594 187189 DEBUG nova.policy [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 07:43:36 compute-0 nova_compute[187185]: 2025-11-29 07:43:36.663 187189 DEBUG oslo_concurrency.processutils [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:43:36 compute-0 nova_compute[187185]: 2025-11-29 07:43:36.664 187189 DEBUG oslo_concurrency.lockutils [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:43:36 compute-0 nova_compute[187185]: 2025-11-29 07:43:36.665 187189 DEBUG oslo_concurrency.lockutils [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:43:36 compute-0 nova_compute[187185]: 2025-11-29 07:43:36.687 187189 DEBUG oslo_concurrency.processutils [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:43:36 compute-0 nova_compute[187185]: 2025-11-29 07:43:36.747 187189 DEBUG oslo_concurrency.processutils [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:43:36 compute-0 nova_compute[187185]: 2025-11-29 07:43:36.749 187189 DEBUG oslo_concurrency.processutils [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/461f4281-0d15-4de0-b5c6-d642b24bfab4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:43:37 compute-0 nova_compute[187185]: 2025-11-29 07:43:37.034 187189 DEBUG oslo_concurrency.processutils [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/461f4281-0d15-4de0-b5c6-d642b24bfab4/disk 1073741824" returned: 0 in 0.285s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:43:37 compute-0 nova_compute[187185]: 2025-11-29 07:43:37.037 187189 DEBUG oslo_concurrency.lockutils [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.371s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:43:37 compute-0 nova_compute[187185]: 2025-11-29 07:43:37.038 187189 DEBUG oslo_concurrency.processutils [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:43:37 compute-0 nova_compute[187185]: 2025-11-29 07:43:37.113 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:43:37 compute-0 nova_compute[187185]: 2025-11-29 07:43:37.134 187189 DEBUG oslo_concurrency.processutils [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:43:37 compute-0 nova_compute[187185]: 2025-11-29 07:43:37.135 187189 DEBUG nova.virt.disk.api [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Checking if we can resize image /var/lib/nova/instances/461f4281-0d15-4de0-b5c6-d642b24bfab4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:43:37 compute-0 nova_compute[187185]: 2025-11-29 07:43:37.136 187189 DEBUG oslo_concurrency.processutils [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/461f4281-0d15-4de0-b5c6-d642b24bfab4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:43:37 compute-0 nova_compute[187185]: 2025-11-29 07:43:37.196 187189 DEBUG oslo_concurrency.processutils [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/461f4281-0d15-4de0-b5c6-d642b24bfab4/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:43:37 compute-0 nova_compute[187185]: 2025-11-29 07:43:37.197 187189 DEBUG nova.virt.disk.api [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Cannot resize image /var/lib/nova/instances/461f4281-0d15-4de0-b5c6-d642b24bfab4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:43:37 compute-0 nova_compute[187185]: 2025-11-29 07:43:37.198 187189 DEBUG nova.objects.instance [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'migration_context' on Instance uuid 461f4281-0d15-4de0-b5c6-d642b24bfab4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:43:37 compute-0 nova_compute[187185]: 2025-11-29 07:43:37.218 187189 DEBUG nova.virt.libvirt.driver [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 07:43:37 compute-0 nova_compute[187185]: 2025-11-29 07:43:37.218 187189 DEBUG nova.virt.libvirt.driver [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Ensure instance console log exists: /var/lib/nova/instances/461f4281-0d15-4de0-b5c6-d642b24bfab4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 07:43:37 compute-0 nova_compute[187185]: 2025-11-29 07:43:37.219 187189 DEBUG oslo_concurrency.lockutils [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:43:37 compute-0 nova_compute[187185]: 2025-11-29 07:43:37.219 187189 DEBUG oslo_concurrency.lockutils [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:43:37 compute-0 nova_compute[187185]: 2025-11-29 07:43:37.220 187189 DEBUG oslo_concurrency.lockutils [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:43:37 compute-0 nova_compute[187185]: 2025-11-29 07:43:37.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:43:37 compute-0 nova_compute[187185]: 2025-11-29 07:43:37.672 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:43:38 compute-0 nova_compute[187185]: 2025-11-29 07:43:38.311 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:43:38 compute-0 podman[244030]: 2025-11-29 07:43:38.811933974 +0000 UTC m=+0.071994113 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 07:43:39 compute-0 nova_compute[187185]: 2025-11-29 07:43:39.101 187189 DEBUG nova.network.neutron [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Successfully created port: 75ff1c6d-b5d1-4b97-953a-7eb71b306a96 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 07:43:39 compute-0 nova_compute[187185]: 2025-11-29 07:43:39.660 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:43:41 compute-0 nova_compute[187185]: 2025-11-29 07:43:41.613 187189 DEBUG nova.network.neutron [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Successfully updated port: 75ff1c6d-b5d1-4b97-953a-7eb71b306a96 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 07:43:41 compute-0 podman[244054]: 2025-11-29 07:43:41.798466711 +0000 UTC m=+0.062082402 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Nov 29 07:43:41 compute-0 podman[244055]: 2025-11-29 07:43:41.81076918 +0000 UTC m=+0.068003540 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:43:42 compute-0 nova_compute[187185]: 2025-11-29 07:43:42.000 187189 DEBUG nova.compute.manager [req-90c9a789-a52c-48ee-b9eb-2cf6e8746984 req-81a9b07e-e489-40e9-9f70-3ad901be0ee6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Received event network-changed-75ff1c6d-b5d1-4b97-953a-7eb71b306a96 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:43:42 compute-0 nova_compute[187185]: 2025-11-29 07:43:42.000 187189 DEBUG nova.compute.manager [req-90c9a789-a52c-48ee-b9eb-2cf6e8746984 req-81a9b07e-e489-40e9-9f70-3ad901be0ee6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Refreshing instance network info cache due to event network-changed-75ff1c6d-b5d1-4b97-953a-7eb71b306a96. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:43:42 compute-0 nova_compute[187185]: 2025-11-29 07:43:42.000 187189 DEBUG oslo_concurrency.lockutils [req-90c9a789-a52c-48ee-b9eb-2cf6e8746984 req-81a9b07e-e489-40e9-9f70-3ad901be0ee6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-461f4281-0d15-4de0-b5c6-d642b24bfab4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:43:42 compute-0 nova_compute[187185]: 2025-11-29 07:43:42.001 187189 DEBUG oslo_concurrency.lockutils [req-90c9a789-a52c-48ee-b9eb-2cf6e8746984 req-81a9b07e-e489-40e9-9f70-3ad901be0ee6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-461f4281-0d15-4de0-b5c6-d642b24bfab4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:43:42 compute-0 nova_compute[187185]: 2025-11-29 07:43:42.001 187189 DEBUG nova.network.neutron [req-90c9a789-a52c-48ee-b9eb-2cf6e8746984 req-81a9b07e-e489-40e9-9f70-3ad901be0ee6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Refreshing network info cache for port 75ff1c6d-b5d1-4b97-953a-7eb71b306a96 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:43:42 compute-0 nova_compute[187185]: 2025-11-29 07:43:42.011 187189 DEBUG oslo_concurrency.lockutils [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "refresh_cache-461f4281-0d15-4de0-b5c6-d642b24bfab4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:43:42 compute-0 nova_compute[187185]: 2025-11-29 07:43:42.116 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:43:42 compute-0 nova_compute[187185]: 2025-11-29 07:43:42.158 187189 DEBUG nova.network.neutron [req-90c9a789-a52c-48ee-b9eb-2cf6e8746984 req-81a9b07e-e489-40e9-9f70-3ad901be0ee6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:43:42 compute-0 nova_compute[187185]: 2025-11-29 07:43:42.443 187189 DEBUG nova.network.neutron [req-90c9a789-a52c-48ee-b9eb-2cf6e8746984 req-81a9b07e-e489-40e9-9f70-3ad901be0ee6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:43:42 compute-0 nova_compute[187185]: 2025-11-29 07:43:42.458 187189 DEBUG oslo_concurrency.lockutils [req-90c9a789-a52c-48ee-b9eb-2cf6e8746984 req-81a9b07e-e489-40e9-9f70-3ad901be0ee6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-461f4281-0d15-4de0-b5c6-d642b24bfab4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:43:42 compute-0 nova_compute[187185]: 2025-11-29 07:43:42.460 187189 DEBUG oslo_concurrency.lockutils [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquired lock "refresh_cache-461f4281-0d15-4de0-b5c6-d642b24bfab4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:43:42 compute-0 nova_compute[187185]: 2025-11-29 07:43:42.460 187189 DEBUG nova.network.neutron [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:43:42 compute-0 nova_compute[187185]: 2025-11-29 07:43:42.622 187189 DEBUG nova.network.neutron [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:43:42 compute-0 nova_compute[187185]: 2025-11-29 07:43:42.674 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:43:43 compute-0 nova_compute[187185]: 2025-11-29 07:43:43.580 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:43:43 compute-0 nova_compute[187185]: 2025-11-29 07:43:43.601 187189 WARNING nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] While synchronizing instance power states, found 1 instances in the database and 0 instances on the hypervisor.
Nov 29 07:43:43 compute-0 nova_compute[187185]: 2025-11-29 07:43:43.601 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Triggering sync for uuid 461f4281-0d15-4de0-b5c6-d642b24bfab4 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Nov 29 07:43:43 compute-0 nova_compute[187185]: 2025-11-29 07:43:43.602 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "461f4281-0d15-4de0-b5c6-d642b24bfab4" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:43:44 compute-0 nova_compute[187185]: 2025-11-29 07:43:44.852 187189 DEBUG nova.network.neutron [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Updating instance_info_cache with network_info: [{"id": "75ff1c6d-b5d1-4b97-953a-7eb71b306a96", "address": "fa:16:3e:a6:b5:c9", "network": {"id": "e23e9510-a780-4254-b7f0-36040139e7db", "bridge": "br-int", "label": "tempest-network-smoke--622290722", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fea6:b5c9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fea6:b5c9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75ff1c6d-b5", "ovs_interfaceid": "75ff1c6d-b5d1-4b97-953a-7eb71b306a96", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:43:44 compute-0 nova_compute[187185]: 2025-11-29 07:43:44.874 187189 DEBUG oslo_concurrency.lockutils [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Releasing lock "refresh_cache-461f4281-0d15-4de0-b5c6-d642b24bfab4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:43:44 compute-0 nova_compute[187185]: 2025-11-29 07:43:44.875 187189 DEBUG nova.compute.manager [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Instance network_info: |[{"id": "75ff1c6d-b5d1-4b97-953a-7eb71b306a96", "address": "fa:16:3e:a6:b5:c9", "network": {"id": "e23e9510-a780-4254-b7f0-36040139e7db", "bridge": "br-int", "label": "tempest-network-smoke--622290722", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fea6:b5c9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fea6:b5c9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75ff1c6d-b5", "ovs_interfaceid": "75ff1c6d-b5d1-4b97-953a-7eb71b306a96", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 07:43:44 compute-0 nova_compute[187185]: 2025-11-29 07:43:44.878 187189 DEBUG nova.virt.libvirt.driver [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Start _get_guest_xml network_info=[{"id": "75ff1c6d-b5d1-4b97-953a-7eb71b306a96", "address": "fa:16:3e:a6:b5:c9", "network": {"id": "e23e9510-a780-4254-b7f0-36040139e7db", "bridge": "br-int", "label": "tempest-network-smoke--622290722", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fea6:b5c9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fea6:b5c9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75ff1c6d-b5", "ovs_interfaceid": "75ff1c6d-b5d1-4b97-953a-7eb71b306a96", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:43:44 compute-0 nova_compute[187185]: 2025-11-29 07:43:44.882 187189 WARNING nova.virt.libvirt.driver [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:43:44 compute-0 nova_compute[187185]: 2025-11-29 07:43:44.886 187189 DEBUG nova.virt.libvirt.host [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:43:44 compute-0 nova_compute[187185]: 2025-11-29 07:43:44.886 187189 DEBUG nova.virt.libvirt.host [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:43:44 compute-0 nova_compute[187185]: 2025-11-29 07:43:44.889 187189 DEBUG nova.virt.libvirt.host [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:43:44 compute-0 nova_compute[187185]: 2025-11-29 07:43:44.890 187189 DEBUG nova.virt.libvirt.host [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:43:44 compute-0 nova_compute[187185]: 2025-11-29 07:43:44.891 187189 DEBUG nova.virt.libvirt.driver [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:43:44 compute-0 nova_compute[187185]: 2025-11-29 07:43:44.892 187189 DEBUG nova.virt.hardware [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:43:44 compute-0 nova_compute[187185]: 2025-11-29 07:43:44.892 187189 DEBUG nova.virt.hardware [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:43:44 compute-0 nova_compute[187185]: 2025-11-29 07:43:44.892 187189 DEBUG nova.virt.hardware [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:43:44 compute-0 nova_compute[187185]: 2025-11-29 07:43:44.893 187189 DEBUG nova.virt.hardware [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:43:44 compute-0 nova_compute[187185]: 2025-11-29 07:43:44.893 187189 DEBUG nova.virt.hardware [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:43:44 compute-0 nova_compute[187185]: 2025-11-29 07:43:44.893 187189 DEBUG nova.virt.hardware [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:43:44 compute-0 nova_compute[187185]: 2025-11-29 07:43:44.893 187189 DEBUG nova.virt.hardware [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:43:44 compute-0 nova_compute[187185]: 2025-11-29 07:43:44.894 187189 DEBUG nova.virt.hardware [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:43:44 compute-0 nova_compute[187185]: 2025-11-29 07:43:44.894 187189 DEBUG nova.virt.hardware [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:43:44 compute-0 nova_compute[187185]: 2025-11-29 07:43:44.894 187189 DEBUG nova.virt.hardware [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:43:44 compute-0 nova_compute[187185]: 2025-11-29 07:43:44.894 187189 DEBUG nova.virt.hardware [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:43:44 compute-0 nova_compute[187185]: 2025-11-29 07:43:44.900 187189 DEBUG nova.virt.libvirt.vif [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:43:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1071640379',display_name='tempest-TestGettingAddress-server-1071640379',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1071640379',id=162,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCUa75YDe0UFuF3zC2EDX2ybM5J+nbJmodk4MsTI0LH8wgrY+SnJ+ndJio6A3wgT3MOCb4Huw6Ay1X+CWth0nVP7co5Y+kbzpcJTZjopF6Z8gsZ88jWOxRb0FFz1vDfcNA==',key_name='tempest-TestGettingAddress-1094641829',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-j09y60t5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:43:36Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=461f4281-0d15-4de0-b5c6-d642b24bfab4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "75ff1c6d-b5d1-4b97-953a-7eb71b306a96", "address": "fa:16:3e:a6:b5:c9", "network": {"id": "e23e9510-a780-4254-b7f0-36040139e7db", "bridge": "br-int", "label": "tempest-network-smoke--622290722", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fea6:b5c9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fea6:b5c9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75ff1c6d-b5", "ovs_interfaceid": "75ff1c6d-b5d1-4b97-953a-7eb71b306a96", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:43:44 compute-0 nova_compute[187185]: 2025-11-29 07:43:44.901 187189 DEBUG nova.network.os_vif_util [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "75ff1c6d-b5d1-4b97-953a-7eb71b306a96", "address": "fa:16:3e:a6:b5:c9", "network": {"id": "e23e9510-a780-4254-b7f0-36040139e7db", "bridge": "br-int", "label": "tempest-network-smoke--622290722", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fea6:b5c9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fea6:b5c9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75ff1c6d-b5", "ovs_interfaceid": "75ff1c6d-b5d1-4b97-953a-7eb71b306a96", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:43:44 compute-0 nova_compute[187185]: 2025-11-29 07:43:44.902 187189 DEBUG nova.network.os_vif_util [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a6:b5:c9,bridge_name='br-int',has_traffic_filtering=True,id=75ff1c6d-b5d1-4b97-953a-7eb71b306a96,network=Network(e23e9510-a780-4254-b7f0-36040139e7db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75ff1c6d-b5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:43:44 compute-0 nova_compute[187185]: 2025-11-29 07:43:44.903 187189 DEBUG nova.objects.instance [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'pci_devices' on Instance uuid 461f4281-0d15-4de0-b5c6-d642b24bfab4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:43:44 compute-0 nova_compute[187185]: 2025-11-29 07:43:44.920 187189 DEBUG nova.virt.libvirt.driver [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:43:44 compute-0 nova_compute[187185]:   <uuid>461f4281-0d15-4de0-b5c6-d642b24bfab4</uuid>
Nov 29 07:43:44 compute-0 nova_compute[187185]:   <name>instance-000000a2</name>
Nov 29 07:43:44 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 07:43:44 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:43:44 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:43:44 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:       <nova:name>tempest-TestGettingAddress-server-1071640379</nova:name>
Nov 29 07:43:44 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:43:44</nova:creationTime>
Nov 29 07:43:44 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 07:43:44 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 07:43:44 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:43:44 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:43:44 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:43:44 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:43:44 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:43:44 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:43:44 compute-0 nova_compute[187185]:         <nova:user uuid="31ac7b05b012433b89143dc9f259644a">tempest-TestGettingAddress-1465017630-project-member</nova:user>
Nov 29 07:43:44 compute-0 nova_compute[187185]:         <nova:project uuid="0111c22b4b954ea586ca20d91ed3970f">tempest-TestGettingAddress-1465017630</nova:project>
Nov 29 07:43:44 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:43:44 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 07:43:44 compute-0 nova_compute[187185]:         <nova:port uuid="75ff1c6d-b5d1-4b97-953a-7eb71b306a96">
Nov 29 07:43:44 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fea6:b5c9" ipVersion="6"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fea6:b5c9" ipVersion="6"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:43:44 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:43:44 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:43:44 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <system>
Nov 29 07:43:44 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:43:44 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:43:44 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:43:44 compute-0 nova_compute[187185]:       <entry name="serial">461f4281-0d15-4de0-b5c6-d642b24bfab4</entry>
Nov 29 07:43:44 compute-0 nova_compute[187185]:       <entry name="uuid">461f4281-0d15-4de0-b5c6-d642b24bfab4</entry>
Nov 29 07:43:44 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     </system>
Nov 29 07:43:44 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:43:44 compute-0 nova_compute[187185]:   <os>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:   </os>
Nov 29 07:43:44 compute-0 nova_compute[187185]:   <features>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:   </features>
Nov 29 07:43:44 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:43:44 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:43:44 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:43:44 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/461f4281-0d15-4de0-b5c6-d642b24bfab4/disk"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:43:44 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/461f4281-0d15-4de0-b5c6-d642b24bfab4/disk.config"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:43:44 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:a6:b5:c9"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:       <target dev="tap75ff1c6d-b5"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:43:44 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/461f4281-0d15-4de0-b5c6-d642b24bfab4/console.log" append="off"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <video>
Nov 29 07:43:44 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     </video>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:43:44 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:43:44 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:43:44 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:43:44 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:43:44 compute-0 nova_compute[187185]: </domain>
Nov 29 07:43:44 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:43:44 compute-0 nova_compute[187185]: 2025-11-29 07:43:44.922 187189 DEBUG nova.compute.manager [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Preparing to wait for external event network-vif-plugged-75ff1c6d-b5d1-4b97-953a-7eb71b306a96 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 07:43:44 compute-0 nova_compute[187185]: 2025-11-29 07:43:44.923 187189 DEBUG oslo_concurrency.lockutils [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "461f4281-0d15-4de0-b5c6-d642b24bfab4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:43:44 compute-0 nova_compute[187185]: 2025-11-29 07:43:44.923 187189 DEBUG oslo_concurrency.lockutils [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "461f4281-0d15-4de0-b5c6-d642b24bfab4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:43:44 compute-0 nova_compute[187185]: 2025-11-29 07:43:44.924 187189 DEBUG oslo_concurrency.lockutils [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "461f4281-0d15-4de0-b5c6-d642b24bfab4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:43:44 compute-0 nova_compute[187185]: 2025-11-29 07:43:44.925 187189 DEBUG nova.virt.libvirt.vif [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:43:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1071640379',display_name='tempest-TestGettingAddress-server-1071640379',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1071640379',id=162,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCUa75YDe0UFuF3zC2EDX2ybM5J+nbJmodk4MsTI0LH8wgrY+SnJ+ndJio6A3wgT3MOCb4Huw6Ay1X+CWth0nVP7co5Y+kbzpcJTZjopF6Z8gsZ88jWOxRb0FFz1vDfcNA==',key_name='tempest-TestGettingAddress-1094641829',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-j09y60t5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:43:36Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=461f4281-0d15-4de0-b5c6-d642b24bfab4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "75ff1c6d-b5d1-4b97-953a-7eb71b306a96", "address": "fa:16:3e:a6:b5:c9", "network": {"id": "e23e9510-a780-4254-b7f0-36040139e7db", "bridge": "br-int", "label": "tempest-network-smoke--622290722", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fea6:b5c9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fea6:b5c9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75ff1c6d-b5", "ovs_interfaceid": "75ff1c6d-b5d1-4b97-953a-7eb71b306a96", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:43:44 compute-0 nova_compute[187185]: 2025-11-29 07:43:44.925 187189 DEBUG nova.network.os_vif_util [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "75ff1c6d-b5d1-4b97-953a-7eb71b306a96", "address": "fa:16:3e:a6:b5:c9", "network": {"id": "e23e9510-a780-4254-b7f0-36040139e7db", "bridge": "br-int", "label": "tempest-network-smoke--622290722", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fea6:b5c9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fea6:b5c9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75ff1c6d-b5", "ovs_interfaceid": "75ff1c6d-b5d1-4b97-953a-7eb71b306a96", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:43:44 compute-0 nova_compute[187185]: 2025-11-29 07:43:44.926 187189 DEBUG nova.network.os_vif_util [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a6:b5:c9,bridge_name='br-int',has_traffic_filtering=True,id=75ff1c6d-b5d1-4b97-953a-7eb71b306a96,network=Network(e23e9510-a780-4254-b7f0-36040139e7db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75ff1c6d-b5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:43:44 compute-0 nova_compute[187185]: 2025-11-29 07:43:44.926 187189 DEBUG os_vif [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a6:b5:c9,bridge_name='br-int',has_traffic_filtering=True,id=75ff1c6d-b5d1-4b97-953a-7eb71b306a96,network=Network(e23e9510-a780-4254-b7f0-36040139e7db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75ff1c6d-b5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:43:44 compute-0 nova_compute[187185]: 2025-11-29 07:43:44.927 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:43:44 compute-0 nova_compute[187185]: 2025-11-29 07:43:44.927 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:43:44 compute-0 nova_compute[187185]: 2025-11-29 07:43:44.928 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:43:44 compute-0 nova_compute[187185]: 2025-11-29 07:43:44.932 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:43:44 compute-0 nova_compute[187185]: 2025-11-29 07:43:44.932 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap75ff1c6d-b5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:43:44 compute-0 nova_compute[187185]: 2025-11-29 07:43:44.932 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap75ff1c6d-b5, col_values=(('external_ids', {'iface-id': '75ff1c6d-b5d1-4b97-953a-7eb71b306a96', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a6:b5:c9', 'vm-uuid': '461f4281-0d15-4de0-b5c6-d642b24bfab4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:43:44 compute-0 nova_compute[187185]: 2025-11-29 07:43:44.934 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:43:44 compute-0 NetworkManager[55227]: <info>  [1764402224.9360] manager: (tap75ff1c6d-b5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/282)
Nov 29 07:43:44 compute-0 nova_compute[187185]: 2025-11-29 07:43:44.937 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:43:44 compute-0 nova_compute[187185]: 2025-11-29 07:43:44.945 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:43:44 compute-0 nova_compute[187185]: 2025-11-29 07:43:44.946 187189 INFO os_vif [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a6:b5:c9,bridge_name='br-int',has_traffic_filtering=True,id=75ff1c6d-b5d1-4b97-953a-7eb71b306a96,network=Network(e23e9510-a780-4254-b7f0-36040139e7db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75ff1c6d-b5')
Nov 29 07:43:45 compute-0 nova_compute[187185]: 2025-11-29 07:43:45.005 187189 DEBUG nova.virt.libvirt.driver [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:43:45 compute-0 nova_compute[187185]: 2025-11-29 07:43:45.006 187189 DEBUG nova.virt.libvirt.driver [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:43:45 compute-0 nova_compute[187185]: 2025-11-29 07:43:45.006 187189 DEBUG nova.virt.libvirt.driver [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No VIF found with MAC fa:16:3e:a6:b5:c9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:43:45 compute-0 nova_compute[187185]: 2025-11-29 07:43:45.007 187189 INFO nova.virt.libvirt.driver [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Using config drive
Nov 29 07:43:45 compute-0 nova_compute[187185]: 2025-11-29 07:43:45.556 187189 INFO nova.virt.libvirt.driver [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Creating config drive at /var/lib/nova/instances/461f4281-0d15-4de0-b5c6-d642b24bfab4/disk.config
Nov 29 07:43:45 compute-0 nova_compute[187185]: 2025-11-29 07:43:45.561 187189 DEBUG oslo_concurrency.processutils [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/461f4281-0d15-4de0-b5c6-d642b24bfab4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7pzs37k0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:43:45 compute-0 nova_compute[187185]: 2025-11-29 07:43:45.690 187189 DEBUG oslo_concurrency.processutils [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/461f4281-0d15-4de0-b5c6-d642b24bfab4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7pzs37k0" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:43:45 compute-0 kernel: tap75ff1c6d-b5: entered promiscuous mode
Nov 29 07:43:45 compute-0 ovn_controller[95281]: 2025-11-29T07:43:45Z|00546|binding|INFO|Claiming lport 75ff1c6d-b5d1-4b97-953a-7eb71b306a96 for this chassis.
Nov 29 07:43:45 compute-0 NetworkManager[55227]: <info>  [1764402225.7767] manager: (tap75ff1c6d-b5): new Tun device (/org/freedesktop/NetworkManager/Devices/283)
Nov 29 07:43:45 compute-0 ovn_controller[95281]: 2025-11-29T07:43:45Z|00547|binding|INFO|75ff1c6d-b5d1-4b97-953a-7eb71b306a96: Claiming fa:16:3e:a6:b5:c9 10.100.0.13 2001:db8:0:1:f816:3eff:fea6:b5c9 2001:db8::f816:3eff:fea6:b5c9
Nov 29 07:43:45 compute-0 nova_compute[187185]: 2025-11-29 07:43:45.777 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:43:45 compute-0 nova_compute[187185]: 2025-11-29 07:43:45.781 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:43:45 compute-0 nova_compute[187185]: 2025-11-29 07:43:45.787 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:43:45 compute-0 nova_compute[187185]: 2025-11-29 07:43:45.792 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:43:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:43:45.806 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a6:b5:c9 10.100.0.13 2001:db8:0:1:f816:3eff:fea6:b5c9 2001:db8::f816:3eff:fea6:b5c9'], port_security=['fa:16:3e:a6:b5:c9 10.100.0.13 2001:db8:0:1:f816:3eff:fea6:b5c9 2001:db8::f816:3eff:fea6:b5c9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28 2001:db8:0:1:f816:3eff:fea6:b5c9/64 2001:db8::f816:3eff:fea6:b5c9/64', 'neutron:device_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e23e9510-a780-4254-b7f0-36040139e7db', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fcfda89f-6716-48ad-9493-dabb00233aaf', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ff62ba9a-db01-45ed-b4a4-c5b2c8f5434e, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=75ff1c6d-b5d1-4b97-953a-7eb71b306a96) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:43:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:43:45.807 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 75ff1c6d-b5d1-4b97-953a-7eb71b306a96 in datapath e23e9510-a780-4254-b7f0-36040139e7db bound to our chassis
Nov 29 07:43:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:43:45.809 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e23e9510-a780-4254-b7f0-36040139e7db
Nov 29 07:43:45 compute-0 systemd-machined[153486]: New machine qemu-64-instance-000000a2.
Nov 29 07:43:45 compute-0 systemd-udevd[244114]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:43:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:43:45.828 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[c180b0d1-6af7-41d1-82d2-2dffa42c6dbc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:43:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:43:45.830 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape23e9510-a1 in ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:43:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:43:45.837 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape23e9510-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:43:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:43:45.837 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[efd61dd8-96b9-418d-8fc1-20d106ee7758]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:43:45 compute-0 NetworkManager[55227]: <info>  [1764402225.8396] device (tap75ff1c6d-b5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:43:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:43:45.839 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[315320d4-3689-461a-9799-e5e7a61304d3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:43:45 compute-0 NetworkManager[55227]: <info>  [1764402225.8409] device (tap75ff1c6d-b5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:43:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:43:45.855 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[3ed73de2-ff81-4d2d-95ed-8441ea11dd36]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:43:45 compute-0 ovn_controller[95281]: 2025-11-29T07:43:45Z|00548|binding|INFO|Setting lport 75ff1c6d-b5d1-4b97-953a-7eb71b306a96 ovn-installed in OVS
Nov 29 07:43:45 compute-0 ovn_controller[95281]: 2025-11-29T07:43:45Z|00549|binding|INFO|Setting lport 75ff1c6d-b5d1-4b97-953a-7eb71b306a96 up in Southbound
Nov 29 07:43:45 compute-0 nova_compute[187185]: 2025-11-29 07:43:45.861 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:43:45 compute-0 systemd[1]: Started Virtual Machine qemu-64-instance-000000a2.
Nov 29 07:43:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:43:45.888 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[36e561c2-0411-45b8-b3f7-653d4785bbdf]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:43:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:43:45.928 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[3ee72c84-d424-4325-abbd-2d0673e1fc6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:43:45 compute-0 systemd-udevd[244116]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:43:45 compute-0 NetworkManager[55227]: <info>  [1764402225.9387] manager: (tape23e9510-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/284)
Nov 29 07:43:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:43:45.937 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[48ffecb3-9d57-4c5e-a213-9d261edbae02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:43:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:43:45.990 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[032b135f-eacd-41c2-92cb-42c014f7e076]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:43:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:43:45.994 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[2c34df93-2b53-40f7-b928-c098b5dd58e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:43:46 compute-0 NetworkManager[55227]: <info>  [1764402226.0243] device (tape23e9510-a0): carrier: link connected
Nov 29 07:43:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:43:46.029 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[c4353595-b692-47ab-a9aa-3e31dfca7458]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:43:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:43:46.054 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[edb241db-4958-446c-bd72-7397f0288040]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape23e9510-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c8:31:c8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 172], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 767368, 'reachable_time': 16329, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244146, 'error': None, 'target': 'ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:43:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:43:46.080 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[2a9555cc-55fd-4913-8ec1-c0b14dcf9573]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec8:31c8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 767368, 'tstamp': 767368}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 244147, 'error': None, 'target': 'ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:43:46 compute-0 nova_compute[187185]: 2025-11-29 07:43:46.099 187189 DEBUG nova.compute.manager [req-21c20881-6cf4-4e77-bf79-557aa8230701 req-5b2404d5-6939-4d74-b485-1ec930dfed52 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Received event network-vif-plugged-75ff1c6d-b5d1-4b97-953a-7eb71b306a96 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:43:46 compute-0 nova_compute[187185]: 2025-11-29 07:43:46.100 187189 DEBUG oslo_concurrency.lockutils [req-21c20881-6cf4-4e77-bf79-557aa8230701 req-5b2404d5-6939-4d74-b485-1ec930dfed52 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "461f4281-0d15-4de0-b5c6-d642b24bfab4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:43:46 compute-0 nova_compute[187185]: 2025-11-29 07:43:46.100 187189 DEBUG oslo_concurrency.lockutils [req-21c20881-6cf4-4e77-bf79-557aa8230701 req-5b2404d5-6939-4d74-b485-1ec930dfed52 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "461f4281-0d15-4de0-b5c6-d642b24bfab4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:43:46 compute-0 nova_compute[187185]: 2025-11-29 07:43:46.101 187189 DEBUG oslo_concurrency.lockutils [req-21c20881-6cf4-4e77-bf79-557aa8230701 req-5b2404d5-6939-4d74-b485-1ec930dfed52 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "461f4281-0d15-4de0-b5c6-d642b24bfab4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:43:46 compute-0 nova_compute[187185]: 2025-11-29 07:43:46.101 187189 DEBUG nova.compute.manager [req-21c20881-6cf4-4e77-bf79-557aa8230701 req-5b2404d5-6939-4d74-b485-1ec930dfed52 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Processing event network-vif-plugged-75ff1c6d-b5d1-4b97-953a-7eb71b306a96 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 07:43:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:43:46.109 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[674c3223-68e1-4109-9c0f-d59a4e570bce]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape23e9510-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c8:31:c8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 172], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 767368, 'reachable_time': 16329, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 244148, 'error': None, 'target': 'ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:43:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:43:46.165 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[285fc6fc-2e75-4b52-9c44-55a79cb7966a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:43:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:43:46.252 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[db1eb7ff-37d5-4e50-b48b-055c52384479]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:43:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:43:46.254 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape23e9510-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:43:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:43:46.254 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:43:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:43:46.254 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape23e9510-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:43:46 compute-0 nova_compute[187185]: 2025-11-29 07:43:46.257 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:43:46 compute-0 NetworkManager[55227]: <info>  [1764402226.2585] manager: (tape23e9510-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/285)
Nov 29 07:43:46 compute-0 kernel: tape23e9510-a0: entered promiscuous mode
Nov 29 07:43:46 compute-0 nova_compute[187185]: 2025-11-29 07:43:46.260 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:43:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:43:46.261 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape23e9510-a0, col_values=(('external_ids', {'iface-id': '4e85a268-4b8a-4015-a903-2252d696f8f5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:43:46 compute-0 nova_compute[187185]: 2025-11-29 07:43:46.262 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:43:46 compute-0 ovn_controller[95281]: 2025-11-29T07:43:46Z|00550|binding|INFO|Releasing lport 4e85a268-4b8a-4015-a903-2252d696f8f5 from this chassis (sb_readonly=0)
Nov 29 07:43:46 compute-0 nova_compute[187185]: 2025-11-29 07:43:46.285 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:43:46 compute-0 nova_compute[187185]: 2025-11-29 07:43:46.287 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:43:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:43:46.288 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e23e9510-a780-4254-b7f0-36040139e7db.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e23e9510-a780-4254-b7f0-36040139e7db.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:43:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:43:46.290 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[0f0c2ab9-ccdd-4b48-a9c5-7bd2201f7cea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:43:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:43:46.291 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:43:46 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:43:46 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:43:46 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-e23e9510-a780-4254-b7f0-36040139e7db
Nov 29 07:43:46 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:43:46 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:43:46 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:43:46 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/e23e9510-a780-4254-b7f0-36040139e7db.pid.haproxy
Nov 29 07:43:46 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:43:46 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:43:46 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:43:46 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:43:46 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:43:46 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:43:46 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:43:46 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:43:46 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:43:46 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:43:46 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:43:46 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:43:46 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:43:46 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:43:46 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:43:46 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:43:46 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:43:46 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:43:46 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:43:46 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:43:46 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID e23e9510-a780-4254-b7f0-36040139e7db
Nov 29 07:43:46 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:43:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:43:46.292 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db', 'env', 'PROCESS_TAG=haproxy-e23e9510-a780-4254-b7f0-36040139e7db', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e23e9510-a780-4254-b7f0-36040139e7db.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:43:46 compute-0 podman[244180]: 2025-11-29 07:43:46.707165876 +0000 UTC m=+0.030927118 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:43:47 compute-0 nova_compute[187185]: 2025-11-29 07:43:47.005 187189 DEBUG nova.compute.manager [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:43:47 compute-0 nova_compute[187185]: 2025-11-29 07:43:47.007 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764402227.0043936, 461f4281-0d15-4de0-b5c6-d642b24bfab4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:43:47 compute-0 nova_compute[187185]: 2025-11-29 07:43:47.007 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] VM Started (Lifecycle Event)
Nov 29 07:43:47 compute-0 nova_compute[187185]: 2025-11-29 07:43:47.014 187189 DEBUG nova.virt.libvirt.driver [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 07:43:47 compute-0 nova_compute[187185]: 2025-11-29 07:43:47.020 187189 INFO nova.virt.libvirt.driver [-] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Instance spawned successfully.
Nov 29 07:43:47 compute-0 nova_compute[187185]: 2025-11-29 07:43:47.021 187189 DEBUG nova.virt.libvirt.driver [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 07:43:47 compute-0 nova_compute[187185]: 2025-11-29 07:43:47.031 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:43:47 compute-0 nova_compute[187185]: 2025-11-29 07:43:47.038 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:43:47 compute-0 nova_compute[187185]: 2025-11-29 07:43:47.041 187189 DEBUG nova.virt.libvirt.driver [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:43:47 compute-0 nova_compute[187185]: 2025-11-29 07:43:47.041 187189 DEBUG nova.virt.libvirt.driver [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:43:47 compute-0 nova_compute[187185]: 2025-11-29 07:43:47.041 187189 DEBUG nova.virt.libvirt.driver [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:43:47 compute-0 nova_compute[187185]: 2025-11-29 07:43:47.042 187189 DEBUG nova.virt.libvirt.driver [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:43:47 compute-0 nova_compute[187185]: 2025-11-29 07:43:47.042 187189 DEBUG nova.virt.libvirt.driver [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:43:47 compute-0 nova_compute[187185]: 2025-11-29 07:43:47.042 187189 DEBUG nova.virt.libvirt.driver [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:43:47 compute-0 nova_compute[187185]: 2025-11-29 07:43:47.063 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:43:47 compute-0 nova_compute[187185]: 2025-11-29 07:43:47.063 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764402227.0047603, 461f4281-0d15-4de0-b5c6-d642b24bfab4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:43:47 compute-0 nova_compute[187185]: 2025-11-29 07:43:47.063 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] VM Paused (Lifecycle Event)
Nov 29 07:43:47 compute-0 nova_compute[187185]: 2025-11-29 07:43:47.083 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:43:47 compute-0 nova_compute[187185]: 2025-11-29 07:43:47.087 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764402227.012933, 461f4281-0d15-4de0-b5c6-d642b24bfab4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:43:47 compute-0 nova_compute[187185]: 2025-11-29 07:43:47.087 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] VM Resumed (Lifecycle Event)
Nov 29 07:43:47 compute-0 nova_compute[187185]: 2025-11-29 07:43:47.116 187189 INFO nova.compute.manager [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Took 10.58 seconds to spawn the instance on the hypervisor.
Nov 29 07:43:47 compute-0 nova_compute[187185]: 2025-11-29 07:43:47.116 187189 DEBUG nova.compute.manager [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:43:47 compute-0 nova_compute[187185]: 2025-11-29 07:43:47.118 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:43:47 compute-0 nova_compute[187185]: 2025-11-29 07:43:47.127 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:43:47 compute-0 nova_compute[187185]: 2025-11-29 07:43:47.179 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:43:47 compute-0 nova_compute[187185]: 2025-11-29 07:43:47.219 187189 INFO nova.compute.manager [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Took 11.14 seconds to build instance.
Nov 29 07:43:47 compute-0 nova_compute[187185]: 2025-11-29 07:43:47.238 187189 DEBUG oslo_concurrency.lockutils [None req-cdeb0636-af8f-42d6-bac1-19ad9452aff7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "461f4281-0d15-4de0-b5c6-d642b24bfab4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.298s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:43:47 compute-0 nova_compute[187185]: 2025-11-29 07:43:47.239 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "461f4281-0d15-4de0-b5c6-d642b24bfab4" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 3.637s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:43:47 compute-0 nova_compute[187185]: 2025-11-29 07:43:47.239 187189 INFO nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:43:47 compute-0 nova_compute[187185]: 2025-11-29 07:43:47.239 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "461f4281-0d15-4de0-b5c6-d642b24bfab4" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:43:47 compute-0 podman[244180]: 2025-11-29 07:43:47.480601093 +0000 UTC m=+0.804362315 container create 01643e218f209285b358c24b06417199280ba1b17cc4f95eeb6ea52b5e70d52e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 29 07:43:47 compute-0 systemd[1]: Started libpod-conmon-01643e218f209285b358c24b06417199280ba1b17cc4f95eeb6ea52b5e70d52e.scope.
Nov 29 07:43:47 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:43:47 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dc3b75dbdef3fc1e81d149d8a0c71592057419e00aa111a2fdd90691704e9fa1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:43:47 compute-0 nova_compute[187185]: 2025-11-29 07:43:47.686 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:43:47 compute-0 podman[244180]: 2025-11-29 07:43:47.991619427 +0000 UTC m=+1.315380679 container init 01643e218f209285b358c24b06417199280ba1b17cc4f95eeb6ea52b5e70d52e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 29 07:43:47 compute-0 podman[244180]: 2025-11-29 07:43:47.999237463 +0000 UTC m=+1.322998695 container start 01643e218f209285b358c24b06417199280ba1b17cc4f95eeb6ea52b5e70d52e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:43:48 compute-0 neutron-haproxy-ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db[244202]: [NOTICE]   (244206) : New worker (244208) forked
Nov 29 07:43:48 compute-0 neutron-haproxy-ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db[244202]: [NOTICE]   (244206) : Loading success.
Nov 29 07:43:48 compute-0 nova_compute[187185]: 2025-11-29 07:43:48.817 187189 DEBUG nova.compute.manager [req-532d3f86-9563-45d0-801a-c64d4182a8b0 req-654a7b33-9e6a-4b9e-9fc3-91b5d0c93e57 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Received event network-vif-plugged-75ff1c6d-b5d1-4b97-953a-7eb71b306a96 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:43:48 compute-0 nova_compute[187185]: 2025-11-29 07:43:48.818 187189 DEBUG oslo_concurrency.lockutils [req-532d3f86-9563-45d0-801a-c64d4182a8b0 req-654a7b33-9e6a-4b9e-9fc3-91b5d0c93e57 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "461f4281-0d15-4de0-b5c6-d642b24bfab4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:43:48 compute-0 nova_compute[187185]: 2025-11-29 07:43:48.818 187189 DEBUG oslo_concurrency.lockutils [req-532d3f86-9563-45d0-801a-c64d4182a8b0 req-654a7b33-9e6a-4b9e-9fc3-91b5d0c93e57 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "461f4281-0d15-4de0-b5c6-d642b24bfab4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:43:48 compute-0 nova_compute[187185]: 2025-11-29 07:43:48.818 187189 DEBUG oslo_concurrency.lockutils [req-532d3f86-9563-45d0-801a-c64d4182a8b0 req-654a7b33-9e6a-4b9e-9fc3-91b5d0c93e57 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "461f4281-0d15-4de0-b5c6-d642b24bfab4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:43:48 compute-0 nova_compute[187185]: 2025-11-29 07:43:48.819 187189 DEBUG nova.compute.manager [req-532d3f86-9563-45d0-801a-c64d4182a8b0 req-654a7b33-9e6a-4b9e-9fc3-91b5d0c93e57 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] No waiting events found dispatching network-vif-plugged-75ff1c6d-b5d1-4b97-953a-7eb71b306a96 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:43:48 compute-0 nova_compute[187185]: 2025-11-29 07:43:48.819 187189 WARNING nova.compute.manager [req-532d3f86-9563-45d0-801a-c64d4182a8b0 req-654a7b33-9e6a-4b9e-9fc3-91b5d0c93e57 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Received unexpected event network-vif-plugged-75ff1c6d-b5d1-4b97-953a-7eb71b306a96 for instance with vm_state active and task_state None.
Nov 29 07:43:49 compute-0 sshd-session[244217]: Invalid user postgres from 190.181.27.27 port 50442
Nov 29 07:43:49 compute-0 sshd-session[244217]: Received disconnect from 190.181.27.27 port 50442:11: Bye Bye [preauth]
Nov 29 07:43:49 compute-0 sshd-session[244217]: Disconnected from invalid user postgres 190.181.27.27 port 50442 [preauth]
Nov 29 07:43:49 compute-0 nova_compute[187185]: 2025-11-29 07:43:49.937 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:43:52 compute-0 nova_compute[187185]: 2025-11-29 07:43:52.692 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:43:52 compute-0 NetworkManager[55227]: <info>  [1764402232.7873] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/286)
Nov 29 07:43:52 compute-0 NetworkManager[55227]: <info>  [1764402232.7882] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/287)
Nov 29 07:43:52 compute-0 nova_compute[187185]: 2025-11-29 07:43:52.788 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:43:52 compute-0 podman[244219]: 2025-11-29 07:43:52.827060254 +0000 UTC m=+0.081775601 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 07:43:52 compute-0 podman[244220]: 2025-11-29 07:43:52.837672825 +0000 UTC m=+0.089624443 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., version=9.6)
Nov 29 07:43:52 compute-0 podman[244221]: 2025-11-29 07:43:52.838663643 +0000 UTC m=+0.085374893 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 07:43:52 compute-0 nova_compute[187185]: 2025-11-29 07:43:52.939 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:43:52 compute-0 ovn_controller[95281]: 2025-11-29T07:43:52Z|00551|binding|INFO|Releasing lport 4e85a268-4b8a-4015-a903-2252d696f8f5 from this chassis (sb_readonly=0)
Nov 29 07:43:52 compute-0 nova_compute[187185]: 2025-11-29 07:43:52.968 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:43:53 compute-0 nova_compute[187185]: 2025-11-29 07:43:53.144 187189 DEBUG nova.compute.manager [req-323d8081-5e28-4d94-8879-77b6af60e99e req-d68aba47-a143-4e8b-8d48-94e40de4e47b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Received event network-changed-75ff1c6d-b5d1-4b97-953a-7eb71b306a96 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:43:53 compute-0 nova_compute[187185]: 2025-11-29 07:43:53.144 187189 DEBUG nova.compute.manager [req-323d8081-5e28-4d94-8879-77b6af60e99e req-d68aba47-a143-4e8b-8d48-94e40de4e47b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Refreshing instance network info cache due to event network-changed-75ff1c6d-b5d1-4b97-953a-7eb71b306a96. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:43:53 compute-0 nova_compute[187185]: 2025-11-29 07:43:53.145 187189 DEBUG oslo_concurrency.lockutils [req-323d8081-5e28-4d94-8879-77b6af60e99e req-d68aba47-a143-4e8b-8d48-94e40de4e47b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-461f4281-0d15-4de0-b5c6-d642b24bfab4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:43:53 compute-0 nova_compute[187185]: 2025-11-29 07:43:53.145 187189 DEBUG oslo_concurrency.lockutils [req-323d8081-5e28-4d94-8879-77b6af60e99e req-d68aba47-a143-4e8b-8d48-94e40de4e47b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-461f4281-0d15-4de0-b5c6-d642b24bfab4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:43:53 compute-0 nova_compute[187185]: 2025-11-29 07:43:53.145 187189 DEBUG nova.network.neutron [req-323d8081-5e28-4d94-8879-77b6af60e99e req-d68aba47-a143-4e8b-8d48-94e40de4e47b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Refreshing network info cache for port 75ff1c6d-b5d1-4b97-953a-7eb71b306a96 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:43:54 compute-0 nova_compute[187185]: 2025-11-29 07:43:54.941 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:43:55 compute-0 nova_compute[187185]: 2025-11-29 07:43:55.107 187189 DEBUG nova.network.neutron [req-323d8081-5e28-4d94-8879-77b6af60e99e req-d68aba47-a143-4e8b-8d48-94e40de4e47b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Updated VIF entry in instance network info cache for port 75ff1c6d-b5d1-4b97-953a-7eb71b306a96. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:43:55 compute-0 nova_compute[187185]: 2025-11-29 07:43:55.109 187189 DEBUG nova.network.neutron [req-323d8081-5e28-4d94-8879-77b6af60e99e req-d68aba47-a143-4e8b-8d48-94e40de4e47b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Updating instance_info_cache with network_info: [{"id": "75ff1c6d-b5d1-4b97-953a-7eb71b306a96", "address": "fa:16:3e:a6:b5:c9", "network": {"id": "e23e9510-a780-4254-b7f0-36040139e7db", "bridge": "br-int", "label": "tempest-network-smoke--622290722", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fea6:b5c9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fea6:b5c9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75ff1c6d-b5", "ovs_interfaceid": "75ff1c6d-b5d1-4b97-953a-7eb71b306a96", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:43:55 compute-0 nova_compute[187185]: 2025-11-29 07:43:55.140 187189 DEBUG oslo_concurrency.lockutils [req-323d8081-5e28-4d94-8879-77b6af60e99e req-d68aba47-a143-4e8b-8d48-94e40de4e47b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-461f4281-0d15-4de0-b5c6-d642b24bfab4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:43:57 compute-0 nova_compute[187185]: 2025-11-29 07:43:57.695 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:43:59 compute-0 nova_compute[187185]: 2025-11-29 07:43:59.945 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:44:01 compute-0 ovn_controller[95281]: 2025-11-29T07:44:01Z|00073|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a6:b5:c9 10.100.0.13
Nov 29 07:44:01 compute-0 ovn_controller[95281]: 2025-11-29T07:44:01Z|00074|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a6:b5:c9 10.100.0.13
Nov 29 07:44:01 compute-0 podman[244292]: 2025-11-29 07:44:01.842307642 +0000 UTC m=+0.102415356 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Nov 29 07:44:02 compute-0 nova_compute[187185]: 2025-11-29 07:44:02.697 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:44:04 compute-0 nova_compute[187185]: 2025-11-29 07:44:04.949 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:44:07 compute-0 nova_compute[187185]: 2025-11-29 07:44:07.699 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:44:09 compute-0 podman[244318]: 2025-11-29 07:44:09.811369656 +0000 UTC m=+0.066236379 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 07:44:09 compute-0 nova_compute[187185]: 2025-11-29 07:44:09.952 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:44:12 compute-0 nova_compute[187185]: 2025-11-29 07:44:12.745 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:44:12 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:44:12.854 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=63, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=62) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:44:12 compute-0 nova_compute[187185]: 2025-11-29 07:44:12.855 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:44:12 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:44:12.858 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:44:12 compute-0 podman[244344]: 2025-11-29 07:44:12.892469755 +0000 UTC m=+0.111868964 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 29 07:44:12 compute-0 podman[244345]: 2025-11-29 07:44:12.922795305 +0000 UTC m=+0.134082104 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Nov 29 07:44:14 compute-0 nova_compute[187185]: 2025-11-29 07:44:14.956 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:44:17 compute-0 nova_compute[187185]: 2025-11-29 07:44:17.748 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:44:19 compute-0 nova_compute[187185]: 2025-11-29 07:44:19.973 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:44:20 compute-0 ovn_controller[95281]: 2025-11-29T07:44:20Z|00552|binding|INFO|Releasing lport 4e85a268-4b8a-4015-a903-2252d696f8f5 from this chassis (sb_readonly=0)
Nov 29 07:44:20 compute-0 nova_compute[187185]: 2025-11-29 07:44:20.820 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:44:21 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:44:21.862 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '63'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:44:22 compute-0 nova_compute[187185]: 2025-11-29 07:44:22.802 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:44:23 compute-0 podman[244385]: 2025-11-29 07:44:23.8161553 +0000 UTC m=+0.073541547 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:44:23 compute-0 podman[244387]: 2025-11-29 07:44:23.834198121 +0000 UTC m=+0.090945220 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 07:44:23 compute-0 podman[244386]: 2025-11-29 07:44:23.853686084 +0000 UTC m=+0.106182963 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-type=git, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, config_id=edpm, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64)
Nov 29 07:44:24 compute-0 nova_compute[187185]: 2025-11-29 07:44:24.976 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:44:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:44:25.744 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:44:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:44:25.745 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:44:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:44:25.746 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:44:27 compute-0 nova_compute[187185]: 2025-11-29 07:44:27.338 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:44:27 compute-0 nova_compute[187185]: 2025-11-29 07:44:27.804 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:44:28 compute-0 nova_compute[187185]: 2025-11-29 07:44:28.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:44:29 compute-0 nova_compute[187185]: 2025-11-29 07:44:29.979 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:44:31 compute-0 sshd-session[244446]: Invalid user support from 20.255.62.58 port 52788
Nov 29 07:44:31 compute-0 sshd-session[244446]: Received disconnect from 20.255.62.58 port 52788:11: Bye Bye [preauth]
Nov 29 07:44:31 compute-0 sshd-session[244446]: Disconnected from invalid user support 20.255.62.58 port 52788 [preauth]
Nov 29 07:44:32 compute-0 nova_compute[187185]: 2025-11-29 07:44:32.806 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:44:32 compute-0 podman[244451]: 2025-11-29 07:44:32.87735925 +0000 UTC m=+0.127228609 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 29 07:44:34 compute-0 nova_compute[187185]: 2025-11-29 07:44:34.545 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:44:34 compute-0 nova_compute[187185]: 2025-11-29 07:44:34.546 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:44:34 compute-0 nova_compute[187185]: 2025-11-29 07:44:34.546 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:44:34 compute-0 nova_compute[187185]: 2025-11-29 07:44:34.546 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:44:34 compute-0 nova_compute[187185]: 2025-11-29 07:44:34.643 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/461f4281-0d15-4de0-b5c6-d642b24bfab4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:44:34 compute-0 nova_compute[187185]: 2025-11-29 07:44:34.714 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/461f4281-0d15-4de0-b5c6-d642b24bfab4/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:44:34 compute-0 nova_compute[187185]: 2025-11-29 07:44:34.715 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/461f4281-0d15-4de0-b5c6-d642b24bfab4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:44:34 compute-0 nova_compute[187185]: 2025-11-29 07:44:34.777 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/461f4281-0d15-4de0-b5c6-d642b24bfab4/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:44:34 compute-0 nova_compute[187185]: 2025-11-29 07:44:34.959 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:44:34 compute-0 nova_compute[187185]: 2025-11-29 07:44:34.961 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5576MB free_disk=73.22501373291016GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:44:34 compute-0 nova_compute[187185]: 2025-11-29 07:44:34.961 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:44:34 compute-0 nova_compute[187185]: 2025-11-29 07:44:34.962 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:44:34 compute-0 nova_compute[187185]: 2025-11-29 07:44:34.981 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:44:35 compute-0 nova_compute[187185]: 2025-11-29 07:44:35.162 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance 461f4281-0d15-4de0-b5c6-d642b24bfab4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 07:44:35 compute-0 nova_compute[187185]: 2025-11-29 07:44:35.163 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:44:35 compute-0 nova_compute[187185]: 2025-11-29 07:44:35.164 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:44:35 compute-0 nova_compute[187185]: 2025-11-29 07:44:35.220 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:44:35 compute-0 nova_compute[187185]: 2025-11-29 07:44:35.234 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:44:35 compute-0 nova_compute[187185]: 2025-11-29 07:44:35.263 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:44:35 compute-0 nova_compute[187185]: 2025-11-29 07:44:35.264 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.302s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:44:37 compute-0 nova_compute[187185]: 2025-11-29 07:44:37.808 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:44:38 compute-0 nova_compute[187185]: 2025-11-29 07:44:38.264 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:44:38 compute-0 nova_compute[187185]: 2025-11-29 07:44:38.265 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:44:38 compute-0 nova_compute[187185]: 2025-11-29 07:44:38.266 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:44:38 compute-0 sshd-session[244448]: Invalid user dd from 45.78.219.119 port 55586
Nov 29 07:44:38 compute-0 nova_compute[187185]: 2025-11-29 07:44:38.586 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "refresh_cache-461f4281-0d15-4de0-b5c6-d642b24bfab4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:44:38 compute-0 nova_compute[187185]: 2025-11-29 07:44:38.587 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquired lock "refresh_cache-461f4281-0d15-4de0-b5c6-d642b24bfab4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:44:38 compute-0 nova_compute[187185]: 2025-11-29 07:44:38.588 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 29 07:44:38 compute-0 nova_compute[187185]: 2025-11-29 07:44:38.588 187189 DEBUG nova.objects.instance [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 461f4281-0d15-4de0-b5c6-d642b24bfab4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:44:39 compute-0 sshd-session[244448]: Received disconnect from 45.78.219.119 port 55586:11: Bye Bye [preauth]
Nov 29 07:44:39 compute-0 sshd-session[244448]: Disconnected from invalid user dd 45.78.219.119 port 55586 [preauth]
Nov 29 07:44:39 compute-0 nova_compute[187185]: 2025-11-29 07:44:39.712 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:44:39 compute-0 nova_compute[187185]: 2025-11-29 07:44:39.983 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:44:40 compute-0 podman[244485]: 2025-11-29 07:44:40.825576493 +0000 UTC m=+0.074855954 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 07:44:42 compute-0 nova_compute[187185]: 2025-11-29 07:44:42.194 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Updating instance_info_cache with network_info: [{"id": "75ff1c6d-b5d1-4b97-953a-7eb71b306a96", "address": "fa:16:3e:a6:b5:c9", "network": {"id": "e23e9510-a780-4254-b7f0-36040139e7db", "bridge": "br-int", "label": "tempest-network-smoke--622290722", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fea6:b5c9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fea6:b5c9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75ff1c6d-b5", "ovs_interfaceid": "75ff1c6d-b5d1-4b97-953a-7eb71b306a96", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:44:42 compute-0 nova_compute[187185]: 2025-11-29 07:44:42.429 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Releasing lock "refresh_cache-461f4281-0d15-4de0-b5c6-d642b24bfab4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:44:42 compute-0 nova_compute[187185]: 2025-11-29 07:44:42.430 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 29 07:44:42 compute-0 nova_compute[187185]: 2025-11-29 07:44:42.430 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:44:42 compute-0 nova_compute[187185]: 2025-11-29 07:44:42.431 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:44:42 compute-0 nova_compute[187185]: 2025-11-29 07:44:42.431 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:44:42 compute-0 nova_compute[187185]: 2025-11-29 07:44:42.431 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:44:42 compute-0 nova_compute[187185]: 2025-11-29 07:44:42.431 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:44:42 compute-0 nova_compute[187185]: 2025-11-29 07:44:42.431 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:44:42 compute-0 nova_compute[187185]: 2025-11-29 07:44:42.477 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:44:42 compute-0 nova_compute[187185]: 2025-11-29 07:44:42.812 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:44:43 compute-0 podman[244511]: 2025-11-29 07:44:43.815026792 +0000 UTC m=+0.067578967 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 07:44:43 compute-0 podman[244510]: 2025-11-29 07:44:43.814987301 +0000 UTC m=+0.070756008 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, config_id=multipathd, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Nov 29 07:44:44 compute-0 nova_compute[187185]: 2025-11-29 07:44:44.987 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:44:46 compute-0 nova_compute[187185]: 2025-11-29 07:44:46.744 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:44:47 compute-0 nova_compute[187185]: 2025-11-29 07:44:47.815 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.016 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '461f4281-0d15-4de0-b5c6-d642b24bfab4', 'name': 'tempest-TestGettingAddress-server-1071640379', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000a2', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '0111c22b4b954ea586ca20d91ed3970f', 'user_id': '31ac7b05b012433b89143dc9f259644a', 'hostId': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.017 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.020 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 461f4281-0d15-4de0-b5c6-d642b24bfab4 / tap75ff1c6d-b5 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.020 12 DEBUG ceilometer.compute.pollsters [-] 461f4281-0d15-4de0-b5c6-d642b24bfab4/network.outgoing.bytes volume: 30230 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e6a66424-b426-4b77-b162-19169137e6b0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30230, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-000000a2-461f4281-0d15-4de0-b5c6-d642b24bfab4-tap75ff1c6d-b5', 'timestamp': '2025-11-29T07:44:48.017891', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1071640379', 'name': 'tap75ff1c6d-b5', 'instance_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a6:b5:c9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap75ff1c6d-b5'}, 'message_id': '47b057c2-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7735.73613606, 'message_signature': '015d1ec754a0fa1e03d9fedebcb1df811bdb7916b25ae9b30e18c119e3c12b3f'}]}, 'timestamp': '2025-11-29 07:44:48.021605', '_unique_id': 'daa442c71cdb4a3b969b578c86cfaaf5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.024 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.025 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.058 12 DEBUG ceilometer.compute.pollsters [-] 461f4281-0d15-4de0-b5c6-d642b24bfab4/disk.device.read.requests volume: 1089 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.059 12 DEBUG ceilometer.compute.pollsters [-] 461f4281-0d15-4de0-b5c6-d642b24bfab4/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5b5f401f-1d34-44c5-9023-89834c9d627d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1089, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4-vda', 'timestamp': '2025-11-29T07:44:48.025401', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1071640379', 'name': 'instance-000000a2', 'instance_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '47b63228-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7735.743661833, 'message_signature': '0a36e31fe91773aab2c34decc600f70f9a09ff703c7a764f027cff09797c98ab'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': 
None, 'resource_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4-sda', 'timestamp': '2025-11-29T07:44:48.025401', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1071640379', 'name': 'instance-000000a2', 'instance_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '47b644c0-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7735.743661833, 'message_signature': '9ff4f04fdf3883d982180248006a8e6fee6b9899dc83ea113d473c2dbf40f658'}]}, 'timestamp': '2025-11-29 07:44:48.060295', '_unique_id': '2b2217b29d584f97bff331907031f6ab'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.061 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.062 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.063 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.063 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1071640379>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1071640379>]
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.063 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.063 12 DEBUG ceilometer.compute.pollsters [-] 461f4281-0d15-4de0-b5c6-d642b24bfab4/disk.device.read.latency volume: 375348323 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.063 12 DEBUG ceilometer.compute.pollsters [-] 461f4281-0d15-4de0-b5c6-d642b24bfab4/disk.device.read.latency volume: 48677082 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '09282f98-55f9-42da-8ab5-c77712673d1f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 375348323, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4-vda', 'timestamp': '2025-11-29T07:44:48.063570', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1071640379', 'name': 'instance-000000a2', 'instance_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '47b6d084-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7735.743661833, 'message_signature': '0219ab4869be5fe20d16fbca7b83a2b2a0b7cc2c2201c0f3a04b2da6b79d826a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 48677082, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 
'resource_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4-sda', 'timestamp': '2025-11-29T07:44:48.063570', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1071640379', 'name': 'instance-000000a2', 'instance_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '47b6ddc2-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7735.743661833, 'message_signature': 'a2da7ee1ffb54f323fd703659c74b7bd58ae8a9f8840b666f1507f1133a9f307'}]}, 'timestamp': '2025-11-29 07:44:48.064178', '_unique_id': '5ef77d85825a409f85e127d7efb3e777'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.065 12 DEBUG ceilometer.compute.pollsters [-] 461f4281-0d15-4de0-b5c6-d642b24bfab4/disk.device.read.bytes volume: 30374400 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.066 12 DEBUG ceilometer.compute.pollsters [-] 461f4281-0d15-4de0-b5c6-d642b24bfab4/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b4811ee9-49d7-4cf4-9eca-fd3e1a3d5c86', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30374400, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4-vda', 'timestamp': '2025-11-29T07:44:48.065934', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1071640379', 'name': 'instance-000000a2', 'instance_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '47b72e6c-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7735.743661833, 'message_signature': 'e1f3c16491b1138372a0823d1d721cfcdc6dcf1171ef1c138f71ee3360fb3c9e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 
'resource_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4-sda', 'timestamp': '2025-11-29T07:44:48.065934', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1071640379', 'name': 'instance-000000a2', 'instance_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '47b73a24-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7735.743661833, 'message_signature': '464584dde60f5e8e7a9f82b2dd576b3bb653fcc62e4f2fd3a3dc0cad7c93e798'}]}, 'timestamp': '2025-11-29 07:44:48.066516', '_unique_id': '1e0d74d64e1a4fd2b482d7d56916933f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.067 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.086 12 DEBUG ceilometer.compute.pollsters [-] 461f4281-0d15-4de0-b5c6-d642b24bfab4/cpu volume: 12000000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '850c4f6c-e633-41eb-825f-8fa99535701b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12000000000, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4', 'timestamp': '2025-11-29T07:44:48.067901', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1071640379', 'name': 'instance-000000a2', 'instance_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '47ba673a-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7735.804752156, 'message_signature': '69c392c031e2b883b7f1bf44c4ade20dfa5891b34cbed8a64078d6b5e92f335d'}]}, 'timestamp': '2025-11-29 07:44:48.087471', '_unique_id': '92b9b805f89e48e5bc07cf9d9ca02816'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.088 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.089 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.089 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.089 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1071640379>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1071640379>]
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 DEBUG ceilometer.compute.pollsters [-] 461f4281-0d15-4de0-b5c6-d642b24bfab4/network.incoming.bytes volume: 31589 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a11484c4-312c-4bcd-857f-d39110fe36e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31589, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-000000a2-461f4281-0d15-4de0-b5c6-d642b24bfab4-tap75ff1c6d-b5', 'timestamp': '2025-11-29T07:44:48.090104', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1071640379', 'name': 'tap75ff1c6d-b5', 'instance_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a6:b5:c9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap75ff1c6d-b5'}, 'message_id': '47badcf6-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7735.73613606, 'message_signature': '9f4442e931d3e567ef3697cfe5f5235d393fcb5b57b65f35a94070b1a4143347'}]}, 'timestamp': '2025-11-29 07:44:48.090351', '_unique_id': '379dc1dbc715453fa1e5f1193e039790'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.090 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.091 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.091 12 DEBUG ceilometer.compute.pollsters [-] 461f4281-0d15-4de0-b5c6-d642b24bfab4/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2b01261b-35f9-420e-a124-89d9d146671c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-000000a2-461f4281-0d15-4de0-b5c6-d642b24bfab4-tap75ff1c6d-b5', 'timestamp': '2025-11-29T07:44:48.091617', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1071640379', 'name': 'tap75ff1c6d-b5', 'instance_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a6:b5:c9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap75ff1c6d-b5'}, 'message_id': '47bb1766-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7735.73613606, 'message_signature': '55cb46bb1436c2a06b44bbd78d728173cf8ab62390079c17ebc0d5a8c322b46b'}]}, 'timestamp': '2025-11-29 07:44:48.091871', '_unique_id': '1013bc609e2743fd8ff4d9a22f082878'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.092 12 DEBUG ceilometer.compute.pollsters [-] 461f4281-0d15-4de0-b5c6-d642b24bfab4/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '515b7acd-fd9e-45aa-8143-36fbcf22b141', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-000000a2-461f4281-0d15-4de0-b5c6-d642b24bfab4-tap75ff1c6d-b5', 'timestamp': '2025-11-29T07:44:48.092943', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1071640379', 'name': 'tap75ff1c6d-b5', 'instance_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a6:b5:c9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap75ff1c6d-b5'}, 'message_id': '47bb4c18-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7735.73613606, 'message_signature': '31bbec88532d56b91c6f01ac8ab061494d74d84a087b3c290fd1fddf3d948d6d'}]}, 'timestamp': '2025-11-29 07:44:48.093203', '_unique_id': 'd98467d175c040e59c74349d719a7eb8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.093 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.094 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.094 12 DEBUG ceilometer.compute.pollsters [-] 461f4281-0d15-4de0-b5c6-d642b24bfab4/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ff7b56b3-d3d2-4def-a87d-438dee80925b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-000000a2-461f4281-0d15-4de0-b5c6-d642b24bfab4-tap75ff1c6d-b5', 'timestamp': '2025-11-29T07:44:48.094541', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1071640379', 'name': 'tap75ff1c6d-b5', 'instance_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a6:b5:c9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap75ff1c6d-b5'}, 'message_id': '47bb89f8-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7735.73613606, 'message_signature': 'c378199302aab9e1554dd225699871b4e79df12b510bf6edcfb575fae14214cc'}]}, 'timestamp': '2025-11-29 07:44:48.094775', '_unique_id': 'aafa4002208a4102bc2e073f90c27151'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.095 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.096 12 DEBUG ceilometer.compute.pollsters [-] 461f4281-0d15-4de0-b5c6-d642b24bfab4/disk.device.write.requests volume: 317 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.096 12 DEBUG ceilometer.compute.pollsters [-] 461f4281-0d15-4de0-b5c6-d642b24bfab4/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '624ce185-0e4e-4999-82b2-95d92601fe3c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 317, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4-vda', 'timestamp': '2025-11-29T07:44:48.096057', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1071640379', 'name': 'instance-000000a2', 'instance_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '47bbc4ea-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7735.743661833, 'message_signature': 'bc06e07fb709dfbbf2e04465521fd2dbca080a11e8dbf5fa7826294bc9c9b229'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4-sda', 'timestamp': '2025-11-29T07:44:48.096057', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1071640379', 'name': 'instance-000000a2', 'instance_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '47bbcc7e-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7735.743661833, 'message_signature': 'ada926bd467cedc0ca17bd1ce684e9f46da6188ccd1f7132d4696d1ce5cdf26e'}]}, 'timestamp': '2025-11-29 07:44:48.096480', '_unique_id': 'e8a437ae1dd846df9e6d17e024828478'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.097 12 DEBUG ceilometer.compute.pollsters [-] 461f4281-0d15-4de0-b5c6-d642b24bfab4/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dc0ec577-6697-454b-91a1-dceef71957ca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-000000a2-461f4281-0d15-4de0-b5c6-d642b24bfab4-tap75ff1c6d-b5', 'timestamp': '2025-11-29T07:44:48.097688', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1071640379', 'name': 'tap75ff1c6d-b5', 'instance_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a6:b5:c9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap75ff1c6d-b5'}, 'message_id': '47bc048c-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7735.73613606, 'message_signature': 'c1d6cffa35a75cc3158a4d51825bcf513e1cc0b43615aae7bf6c3131aea92d2a'}]}, 'timestamp': '2025-11-29 07:44:48.097945', '_unique_id': 'f1d498791f76469b8bb999b1ec6346c6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.098 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 DEBUG ceilometer.compute.pollsters [-] 461f4281-0d15-4de0-b5c6-d642b24bfab4/network.incoming.packets volume: 183 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '630c6f37-2bf0-4140-8615-f7f96f84d0bd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 183, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-000000a2-461f4281-0d15-4de0-b5c6-d642b24bfab4-tap75ff1c6d-b5', 'timestamp': '2025-11-29T07:44:48.099000', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1071640379', 'name': 'tap75ff1c6d-b5', 'instance_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a6:b5:c9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap75ff1c6d-b5'}, 'message_id': '47bc38e4-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7735.73613606, 'message_signature': '5a76b842bf8e545e08e961231710ab013bc325bd10cca8d149b4868d5a5daef5'}]}, 'timestamp': '2025-11-29 07:44:48.099290', '_unique_id': '8d72485a16294575a8c0c6b3512487c6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.099 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.100 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.100 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.100 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1071640379>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1071640379>]
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.100 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.100 12 DEBUG ceilometer.compute.pollsters [-] 461f4281-0d15-4de0-b5c6-d642b24bfab4/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7f73fe4c-7747-47bc-bb7d-5e30eea97a09', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-000000a2-461f4281-0d15-4de0-b5c6-d642b24bfab4-tap75ff1c6d-b5', 'timestamp': '2025-11-29T07:44:48.100882', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1071640379', 'name': 'tap75ff1c6d-b5', 'instance_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a6:b5:c9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap75ff1c6d-b5'}, 'message_id': '47bc8448-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7735.73613606, 'message_signature': '7275ef1ab3ee7d1605537fe414889ec95e9033fc1fee1f5683379395cbf1990f'}]}, 'timestamp': '2025-11-29 07:44:48.101200', '_unique_id': '2b974fe56eb94755a3bd2ae4566db035'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.101 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.102 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.113 12 DEBUG ceilometer.compute.pollsters [-] 461f4281-0d15-4de0-b5c6-d642b24bfab4/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.114 12 DEBUG ceilometer.compute.pollsters [-] 461f4281-0d15-4de0-b5c6-d642b24bfab4/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c69302da-542e-495d-a9d7-f9489443e2e5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4-vda', 'timestamp': '2025-11-29T07:44:48.102324', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1071640379', 'name': 'instance-000000a2', 'instance_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '47be7ef6-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7735.820563684, 'message_signature': '9c03ddf35f6ab4ee4a3dafa53cae95b29404d5600a32406eba7c67a9ccbf9466'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4-sda', 'timestamp': '2025-11-29T07:44:48.102324', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1071640379', 'name': 'instance-000000a2', 'instance_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '47be8e28-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7735.820563684, 'message_signature': '1366db819f1c66d94b2df82a6397f32ce4cedfab151b6924306bdafaa2077c18'}]}, 'timestamp': '2025-11-29 07:44:48.114570', '_unique_id': 'c2e78cc5253a470ea6d1b34f16e346c5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.116 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 DEBUG ceilometer.compute.pollsters [-] 461f4281-0d15-4de0-b5c6-d642b24bfab4/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6feb7dfa-2a9e-4370-9495-36e89c088c72', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-000000a2-461f4281-0d15-4de0-b5c6-d642b24bfab4-tap75ff1c6d-b5', 'timestamp': '2025-11-29T07:44:48.117055', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1071640379', 'name': 'tap75ff1c6d-b5', 'instance_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a6:b5:c9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap75ff1c6d-b5'}, 'message_id': '47befa34-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7735.73613606, 'message_signature': 'd6fb4dc2ec3adc5b63790b349d0755802f93c104b563166a67ae629abb94637b'}]}, 'timestamp': '2025-11-29 07:44:48.117314', '_unique_id': '3abc69f1089e427a971d466a0f787c3c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.117 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.118 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.118 12 DEBUG ceilometer.compute.pollsters [-] 461f4281-0d15-4de0-b5c6-d642b24bfab4/disk.device.write.latency volume: 24261570130 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.118 12 DEBUG ceilometer.compute.pollsters [-] 461f4281-0d15-4de0-b5c6-d642b24bfab4/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '94bbaa5d-455f-4b69-97a2-fcfa31b14e66', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24261570130, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4-vda', 'timestamp': '2025-11-29T07:44:48.118459', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1071640379', 'name': 'instance-000000a2', 'instance_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '47bf3008-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7735.743661833, 'message_signature': 'bb271d95408f97114deaf53f763dc9f17c8b2b85e73239e75672265a93619f1e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4-sda', 'timestamp': '2025-11-29T07:44:48.118459', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1071640379', 'name': 'instance-000000a2', 'instance_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '47bf37ba-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7735.743661833, 'message_signature': '0c8eba4c3e7e596b564c5c62d00954b27d93481f753f249f6633b09bdc0183e3'}]}, 'timestamp': '2025-11-29 07:44:48.118896', '_unique_id': 'bfe4217b1206420fa050c62f82e9dde0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.119 12 DEBUG ceilometer.compute.pollsters [-] 461f4281-0d15-4de0-b5c6-d642b24bfab4/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 DEBUG ceilometer.compute.pollsters [-] 461f4281-0d15-4de0-b5c6-d642b24bfab4/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ba86d718-e93a-4ae5-b916-b830add33e80', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4-vda', 'timestamp': '2025-11-29T07:44:48.119959', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1071640379', 'name': 'instance-000000a2', 'instance_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '47bf6a6e-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7735.820563684, 'message_signature': '9f0d628cd6fa2c13468ab6a58963bac83e1f1c9af74cf86f78d9759285db9af5'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4-sda', 'timestamp': '2025-11-29T07:44:48.119959', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1071640379', 'name': 'instance-000000a2', 'instance_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '47bf7220-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7735.820563684, 'message_signature': '14e7d8cad6e93b3b6ed05b338db216eab78f03212518d49b18343b1ef8e069c4'}]}, 'timestamp': '2025-11-29 07:44:48.120364', '_unique_id': 'fa0fe0339de3475f99991ed464ad0373'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.120 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.121 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.121 12 DEBUG ceilometer.compute.pollsters [-] 461f4281-0d15-4de0-b5c6-d642b24bfab4/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.121 12 DEBUG ceilometer.compute.pollsters [-] 461f4281-0d15-4de0-b5c6-d642b24bfab4/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '185f9075-f982-47c6-9825-b3e77d779849', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4-vda', 'timestamp': '2025-11-29T07:44:48.121466', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1071640379', 'name': 'instance-000000a2', 'instance_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '47bfa556-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7735.820563684, 'message_signature': '447c0f386e4bd56926651145c678a615919ea2b61af560885515f12ccb60442e'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4-sda', 'timestamp': '2025-11-29T07:44:48.121466', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1071640379', 'name': 'instance-000000a2', 'instance_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '47bfad12-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7735.820563684, 'message_signature': 'ee93b0e01a699d443a30f6dba75e3b690a5d08d324ce43dfef141bf24c80ccd4'}]}, 'timestamp': '2025-11-29 07:44:48.121897', '_unique_id': '066a540e7d2c4a23b0cf457809870c79'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.122 12 DEBUG ceilometer.compute.pollsters [-] 461f4281-0d15-4de0-b5c6-d642b24bfab4/network.outgoing.packets volume: 190 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '05400c82-e67e-45b0-8e76-831d95b25094', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 190, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-000000a2-461f4281-0d15-4de0-b5c6-d642b24bfab4-tap75ff1c6d-b5', 'timestamp': '2025-11-29T07:44:48.122967', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1071640379', 'name': 'tap75ff1c6d-b5', 'instance_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a6:b5:c9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap75ff1c6d-b5'}, 'message_id': '47bfdfee-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7735.73613606, 'message_signature': 'b219b5c3c87883b93eb344e30d27f1cce8afb41826cabbe3b030d5befba2e95c'}]}, 'timestamp': '2025-11-29 07:44:48.123188', '_unique_id': 'c67fab6f29b3431f909d342972687e98'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.123 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 DEBUG ceilometer.compute.pollsters [-] 461f4281-0d15-4de0-b5c6-d642b24bfab4/memory.usage volume: 46.84765625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e51dac31-da3d-4bea-9959-8bf76385be29', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 46.84765625, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4', 'timestamp': '2025-11-29T07:44:48.124213', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1071640379', 'name': 'instance-000000a2', 'instance_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '47c01090-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7735.804752156, 'message_signature': 'd18f3c38caf66a530ea155b0919f7387866da4ccfda5027fe6b89bc078533ce1'}]}, 'timestamp': '2025-11-29 07:44:48.124425', '_unique_id': 'cbd7bef8665540c29fa590d5ec562003'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.124 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.125 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.125 12 DEBUG ceilometer.compute.pollsters [-] 461f4281-0d15-4de0-b5c6-d642b24bfab4/disk.device.write.bytes volume: 72990720 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.125 12 DEBUG ceilometer.compute.pollsters [-] 461f4281-0d15-4de0-b5c6-d642b24bfab4/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7443f1ea-d0b1-46ee-a6ec-a804627d9581', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72990720, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4-vda', 'timestamp': '2025-11-29T07:44:48.125485', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1071640379', 'name': 'instance-000000a2', 'instance_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '47c04222-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7735.743661833, 'message_signature': 'a641bf90ab9868670fe310b821ebc23cde5532aa24bea579131c1e2999034586'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4-sda', 'timestamp': '2025-11-29T07:44:48.125485', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1071640379', 'name': 'instance-000000a2', 'instance_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4', 'instance_type': 'm1.nano', 'host': 'c2038ca1e08569276bf7870242fc5417718cd59e729cc9e449a8de0e', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '47c049ac-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7735.743661833, 'message_signature': '0e40c101de02ea16e0d6f4c139d6d738ad686982dca4bace70bdf997ec9f3e05'}]}, 'timestamp': '2025-11-29 07:44:48.125908', '_unique_id': '3d3c8112c1f34b9299c3d91bbf53fe7c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.126 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.127 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:44:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:44:48.127 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1071640379>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1071640379>]
Nov 29 07:44:48 compute-0 nova_compute[187185]: 2025-11-29 07:44:48.643 187189 DEBUG nova.compute.manager [req-b971f9d5-bfb9-43c0-bb2c-2f78b5358c4f req-b213ca42-17f8-408e-95dc-720466f8f078 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Received event network-changed-75ff1c6d-b5d1-4b97-953a-7eb71b306a96 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:44:48 compute-0 nova_compute[187185]: 2025-11-29 07:44:48.643 187189 DEBUG nova.compute.manager [req-b971f9d5-bfb9-43c0-bb2c-2f78b5358c4f req-b213ca42-17f8-408e-95dc-720466f8f078 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Refreshing instance network info cache due to event network-changed-75ff1c6d-b5d1-4b97-953a-7eb71b306a96. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:44:48 compute-0 nova_compute[187185]: 2025-11-29 07:44:48.644 187189 DEBUG oslo_concurrency.lockutils [req-b971f9d5-bfb9-43c0-bb2c-2f78b5358c4f req-b213ca42-17f8-408e-95dc-720466f8f078 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-461f4281-0d15-4de0-b5c6-d642b24bfab4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:44:48 compute-0 nova_compute[187185]: 2025-11-29 07:44:48.644 187189 DEBUG oslo_concurrency.lockutils [req-b971f9d5-bfb9-43c0-bb2c-2f78b5358c4f req-b213ca42-17f8-408e-95dc-720466f8f078 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-461f4281-0d15-4de0-b5c6-d642b24bfab4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:44:48 compute-0 nova_compute[187185]: 2025-11-29 07:44:48.644 187189 DEBUG nova.network.neutron [req-b971f9d5-bfb9-43c0-bb2c-2f78b5358c4f req-b213ca42-17f8-408e-95dc-720466f8f078 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Refreshing network info cache for port 75ff1c6d-b5d1-4b97-953a-7eb71b306a96 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:44:48 compute-0 nova_compute[187185]: 2025-11-29 07:44:48.756 187189 DEBUG oslo_concurrency.lockutils [None req-039ce53b-9a0d-4c52-8ee7-f724d4f7b820 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "461f4281-0d15-4de0-b5c6-d642b24bfab4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:44:48 compute-0 nova_compute[187185]: 2025-11-29 07:44:48.757 187189 DEBUG oslo_concurrency.lockutils [None req-039ce53b-9a0d-4c52-8ee7-f724d4f7b820 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "461f4281-0d15-4de0-b5c6-d642b24bfab4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:44:48 compute-0 nova_compute[187185]: 2025-11-29 07:44:48.757 187189 DEBUG oslo_concurrency.lockutils [None req-039ce53b-9a0d-4c52-8ee7-f724d4f7b820 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "461f4281-0d15-4de0-b5c6-d642b24bfab4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:44:48 compute-0 nova_compute[187185]: 2025-11-29 07:44:48.758 187189 DEBUG oslo_concurrency.lockutils [None req-039ce53b-9a0d-4c52-8ee7-f724d4f7b820 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "461f4281-0d15-4de0-b5c6-d642b24bfab4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:44:48 compute-0 nova_compute[187185]: 2025-11-29 07:44:48.758 187189 DEBUG oslo_concurrency.lockutils [None req-039ce53b-9a0d-4c52-8ee7-f724d4f7b820 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "461f4281-0d15-4de0-b5c6-d642b24bfab4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:44:48 compute-0 nova_compute[187185]: 2025-11-29 07:44:48.773 187189 INFO nova.compute.manager [None req-039ce53b-9a0d-4c52-8ee7-f724d4f7b820 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Terminating instance
Nov 29 07:44:48 compute-0 nova_compute[187185]: 2025-11-29 07:44:48.786 187189 DEBUG nova.compute.manager [None req-039ce53b-9a0d-4c52-8ee7-f724d4f7b820 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 07:44:48 compute-0 kernel: tap75ff1c6d-b5 (unregistering): left promiscuous mode
Nov 29 07:44:48 compute-0 NetworkManager[55227]: <info>  [1764402288.8207] device (tap75ff1c6d-b5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:44:48 compute-0 nova_compute[187185]: 2025-11-29 07:44:48.823 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:44:48 compute-0 ovn_controller[95281]: 2025-11-29T07:44:48Z|00553|binding|INFO|Releasing lport 75ff1c6d-b5d1-4b97-953a-7eb71b306a96 from this chassis (sb_readonly=0)
Nov 29 07:44:48 compute-0 ovn_controller[95281]: 2025-11-29T07:44:48Z|00554|binding|INFO|Setting lport 75ff1c6d-b5d1-4b97-953a-7eb71b306a96 down in Southbound
Nov 29 07:44:48 compute-0 ovn_controller[95281]: 2025-11-29T07:44:48Z|00555|binding|INFO|Removing iface tap75ff1c6d-b5 ovn-installed in OVS
Nov 29 07:44:48 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:44:48.843 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a6:b5:c9 10.100.0.13 2001:db8:0:1:f816:3eff:fea6:b5c9 2001:db8::f816:3eff:fea6:b5c9'], port_security=['fa:16:3e:a6:b5:c9 10.100.0.13 2001:db8:0:1:f816:3eff:fea6:b5c9 2001:db8::f816:3eff:fea6:b5c9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28 2001:db8:0:1:f816:3eff:fea6:b5c9/64 2001:db8::f816:3eff:fea6:b5c9/64', 'neutron:device_id': '461f4281-0d15-4de0-b5c6-d642b24bfab4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e23e9510-a780-4254-b7f0-36040139e7db', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fcfda89f-6716-48ad-9493-dabb00233aaf', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ff62ba9a-db01-45ed-b4a4-c5b2c8f5434e, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=75ff1c6d-b5d1-4b97-953a-7eb71b306a96) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:44:48 compute-0 nova_compute[187185]: 2025-11-29 07:44:48.845 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:44:48 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:44:48.847 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 75ff1c6d-b5d1-4b97-953a-7eb71b306a96 in datapath e23e9510-a780-4254-b7f0-36040139e7db unbound from our chassis
Nov 29 07:44:48 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:44:48.849 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e23e9510-a780-4254-b7f0-36040139e7db, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:44:48 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:44:48.853 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[eb5b5396-ebaa-43d8-a916-667e4134aad1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:44:48 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:44:48.854 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db namespace which is not needed anymore
Nov 29 07:44:48 compute-0 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d000000a2.scope: Deactivated successfully.
Nov 29 07:44:48 compute-0 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d000000a2.scope: Consumed 16.041s CPU time.
Nov 29 07:44:48 compute-0 systemd-machined[153486]: Machine qemu-64-instance-000000a2 terminated.
Nov 29 07:44:49 compute-0 nova_compute[187185]: 2025-11-29 07:44:49.018 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:44:49 compute-0 nova_compute[187185]: 2025-11-29 07:44:49.026 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:44:49 compute-0 nova_compute[187185]: 2025-11-29 07:44:49.070 187189 INFO nova.virt.libvirt.driver [-] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Instance destroyed successfully.
Nov 29 07:44:49 compute-0 nova_compute[187185]: 2025-11-29 07:44:49.071 187189 DEBUG nova.objects.instance [None req-039ce53b-9a0d-4c52-8ee7-f724d4f7b820 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'resources' on Instance uuid 461f4281-0d15-4de0-b5c6-d642b24bfab4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:44:49 compute-0 nova_compute[187185]: 2025-11-29 07:44:49.085 187189 DEBUG nova.virt.libvirt.vif [None req-039ce53b-9a0d-4c52-8ee7-f724d4f7b820 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:43:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1071640379',display_name='tempest-TestGettingAddress-server-1071640379',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1071640379',id=162,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCUa75YDe0UFuF3zC2EDX2ybM5J+nbJmodk4MsTI0LH8wgrY+SnJ+ndJio6A3wgT3MOCb4Huw6Ay1X+CWth0nVP7co5Y+kbzpcJTZjopF6Z8gsZ88jWOxRb0FFz1vDfcNA==',key_name='tempest-TestGettingAddress-1094641829',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:43:47Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-j09y60t5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:43:47Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=461f4281-0d15-4de0-b5c6-d642b24bfab4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "75ff1c6d-b5d1-4b97-953a-7eb71b306a96", "address": "fa:16:3e:a6:b5:c9", "network": {"id": "e23e9510-a780-4254-b7f0-36040139e7db", "bridge": "br-int", "label": "tempest-network-smoke--622290722", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fea6:b5c9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fea6:b5c9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75ff1c6d-b5", "ovs_interfaceid": "75ff1c6d-b5d1-4b97-953a-7eb71b306a96", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:44:49 compute-0 nova_compute[187185]: 2025-11-29 07:44:49.085 187189 DEBUG nova.network.os_vif_util [None req-039ce53b-9a0d-4c52-8ee7-f724d4f7b820 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "75ff1c6d-b5d1-4b97-953a-7eb71b306a96", "address": "fa:16:3e:a6:b5:c9", "network": {"id": "e23e9510-a780-4254-b7f0-36040139e7db", "bridge": "br-int", "label": "tempest-network-smoke--622290722", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fea6:b5c9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fea6:b5c9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75ff1c6d-b5", "ovs_interfaceid": "75ff1c6d-b5d1-4b97-953a-7eb71b306a96", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:44:49 compute-0 nova_compute[187185]: 2025-11-29 07:44:49.086 187189 DEBUG nova.network.os_vif_util [None req-039ce53b-9a0d-4c52-8ee7-f724d4f7b820 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a6:b5:c9,bridge_name='br-int',has_traffic_filtering=True,id=75ff1c6d-b5d1-4b97-953a-7eb71b306a96,network=Network(e23e9510-a780-4254-b7f0-36040139e7db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75ff1c6d-b5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:44:49 compute-0 nova_compute[187185]: 2025-11-29 07:44:49.086 187189 DEBUG os_vif [None req-039ce53b-9a0d-4c52-8ee7-f724d4f7b820 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a6:b5:c9,bridge_name='br-int',has_traffic_filtering=True,id=75ff1c6d-b5d1-4b97-953a-7eb71b306a96,network=Network(e23e9510-a780-4254-b7f0-36040139e7db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75ff1c6d-b5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:44:49 compute-0 nova_compute[187185]: 2025-11-29 07:44:49.088 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:44:49 compute-0 nova_compute[187185]: 2025-11-29 07:44:49.088 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap75ff1c6d-b5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:44:49 compute-0 nova_compute[187185]: 2025-11-29 07:44:49.090 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:44:49 compute-0 nova_compute[187185]: 2025-11-29 07:44:49.092 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:44:49 compute-0 nova_compute[187185]: 2025-11-29 07:44:49.095 187189 INFO os_vif [None req-039ce53b-9a0d-4c52-8ee7-f724d4f7b820 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a6:b5:c9,bridge_name='br-int',has_traffic_filtering=True,id=75ff1c6d-b5d1-4b97-953a-7eb71b306a96,network=Network(e23e9510-a780-4254-b7f0-36040139e7db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75ff1c6d-b5')
Nov 29 07:44:49 compute-0 nova_compute[187185]: 2025-11-29 07:44:49.096 187189 INFO nova.virt.libvirt.driver [None req-039ce53b-9a0d-4c52-8ee7-f724d4f7b820 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Deleting instance files /var/lib/nova/instances/461f4281-0d15-4de0-b5c6-d642b24bfab4_del
Nov 29 07:44:49 compute-0 nova_compute[187185]: 2025-11-29 07:44:49.096 187189 INFO nova.virt.libvirt.driver [None req-039ce53b-9a0d-4c52-8ee7-f724d4f7b820 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Deletion of /var/lib/nova/instances/461f4281-0d15-4de0-b5c6-d642b24bfab4_del complete
Nov 29 07:44:49 compute-0 nova_compute[187185]: 2025-11-29 07:44:49.171 187189 INFO nova.compute.manager [None req-039ce53b-9a0d-4c52-8ee7-f724d4f7b820 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Took 0.38 seconds to destroy the instance on the hypervisor.
Nov 29 07:44:49 compute-0 nova_compute[187185]: 2025-11-29 07:44:49.172 187189 DEBUG oslo.service.loopingcall [None req-039ce53b-9a0d-4c52-8ee7-f724d4f7b820 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 07:44:49 compute-0 nova_compute[187185]: 2025-11-29 07:44:49.172 187189 DEBUG nova.compute.manager [-] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 07:44:49 compute-0 nova_compute[187185]: 2025-11-29 07:44:49.172 187189 DEBUG nova.network.neutron [-] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 07:44:49 compute-0 neutron-haproxy-ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db[244202]: [NOTICE]   (244206) : haproxy version is 2.8.14-c23fe91
Nov 29 07:44:49 compute-0 neutron-haproxy-ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db[244202]: [NOTICE]   (244206) : path to executable is /usr/sbin/haproxy
Nov 29 07:44:49 compute-0 neutron-haproxy-ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db[244202]: [WARNING]  (244206) : Exiting Master process...
Nov 29 07:44:49 compute-0 neutron-haproxy-ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db[244202]: [WARNING]  (244206) : Exiting Master process...
Nov 29 07:44:49 compute-0 neutron-haproxy-ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db[244202]: [ALERT]    (244206) : Current worker (244208) exited with code 143 (Terminated)
Nov 29 07:44:49 compute-0 neutron-haproxy-ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db[244202]: [WARNING]  (244206) : All workers exited. Exiting... (0)
Nov 29 07:44:49 compute-0 systemd[1]: libpod-01643e218f209285b358c24b06417199280ba1b17cc4f95eeb6ea52b5e70d52e.scope: Deactivated successfully.
Nov 29 07:44:49 compute-0 podman[244580]: 2025-11-29 07:44:49.370667636 +0000 UTC m=+0.409981239 container died 01643e218f209285b358c24b06417199280ba1b17cc4f95eeb6ea52b5e70d52e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:44:49 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-01643e218f209285b358c24b06417199280ba1b17cc4f95eeb6ea52b5e70d52e-userdata-shm.mount: Deactivated successfully.
Nov 29 07:44:49 compute-0 systemd[1]: var-lib-containers-storage-overlay-dc3b75dbdef3fc1e81d149d8a0c71592057419e00aa111a2fdd90691704e9fa1-merged.mount: Deactivated successfully.
Nov 29 07:44:49 compute-0 podman[244580]: 2025-11-29 07:44:49.924712791 +0000 UTC m=+0.964026364 container cleanup 01643e218f209285b358c24b06417199280ba1b17cc4f95eeb6ea52b5e70d52e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 29 07:44:49 compute-0 systemd[1]: libpod-conmon-01643e218f209285b358c24b06417199280ba1b17cc4f95eeb6ea52b5e70d52e.scope: Deactivated successfully.
Nov 29 07:44:50 compute-0 podman[244627]: 2025-11-29 07:44:50.131628267 +0000 UTC m=+0.174507488 container remove 01643e218f209285b358c24b06417199280ba1b17cc4f95eeb6ea52b5e70d52e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 07:44:50 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:44:50.140 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d0284297-3141-444a-b182-c9d3dba7ed42]: (4, ('Sat Nov 29 07:44:48 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db (01643e218f209285b358c24b06417199280ba1b17cc4f95eeb6ea52b5e70d52e)\n01643e218f209285b358c24b06417199280ba1b17cc4f95eeb6ea52b5e70d52e\nSat Nov 29 07:44:49 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db (01643e218f209285b358c24b06417199280ba1b17cc4f95eeb6ea52b5e70d52e)\n01643e218f209285b358c24b06417199280ba1b17cc4f95eeb6ea52b5e70d52e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:44:50 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:44:50.143 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[5f910ab2-aeb4-4d15-be84-731dc0a5c60f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:44:50 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:44:50.145 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape23e9510-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:44:50 compute-0 nova_compute[187185]: 2025-11-29 07:44:50.148 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:44:50 compute-0 kernel: tape23e9510-a0: left promiscuous mode
Nov 29 07:44:50 compute-0 nova_compute[187185]: 2025-11-29 07:44:50.173 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:44:50 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:44:50.176 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[00b74ebe-c0a6-4edd-a7d0-8b8146910cb3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:44:50 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:44:50.202 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[dcf77f02-4ecb-45dd-a977-950752ad2908]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:44:50 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:44:50.204 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[4c135ab8-6b0c-44f3-8ecb-b96e97fa70f9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:44:50 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:44:50.223 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[89a30081-a3d9-4f32-8726-562ba7ce7ed4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 767358, 'reachable_time': 31069, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244643, 'error': None, 'target': 'ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:44:50 compute-0 systemd[1]: run-netns-ovnmeta\x2de23e9510\x2da780\x2d4254\x2db7f0\x2d36040139e7db.mount: Deactivated successfully.
Nov 29 07:44:50 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:44:50.230 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:44:50 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:44:50.231 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[fffc756d-2ffe-47c7-94cd-3b735c94e62c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:44:50 compute-0 nova_compute[187185]: 2025-11-29 07:44:50.481 187189 DEBUG nova.compute.manager [req-d15ee8dd-a7d1-465f-8d8d-47074f88a83a req-2d86eefb-5032-4c61-ad52-8a805864f783 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Received event network-vif-unplugged-75ff1c6d-b5d1-4b97-953a-7eb71b306a96 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:44:50 compute-0 nova_compute[187185]: 2025-11-29 07:44:50.481 187189 DEBUG oslo_concurrency.lockutils [req-d15ee8dd-a7d1-465f-8d8d-47074f88a83a req-2d86eefb-5032-4c61-ad52-8a805864f783 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "461f4281-0d15-4de0-b5c6-d642b24bfab4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:44:50 compute-0 nova_compute[187185]: 2025-11-29 07:44:50.482 187189 DEBUG oslo_concurrency.lockutils [req-d15ee8dd-a7d1-465f-8d8d-47074f88a83a req-2d86eefb-5032-4c61-ad52-8a805864f783 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "461f4281-0d15-4de0-b5c6-d642b24bfab4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:44:50 compute-0 nova_compute[187185]: 2025-11-29 07:44:50.482 187189 DEBUG oslo_concurrency.lockutils [req-d15ee8dd-a7d1-465f-8d8d-47074f88a83a req-2d86eefb-5032-4c61-ad52-8a805864f783 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "461f4281-0d15-4de0-b5c6-d642b24bfab4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:44:50 compute-0 nova_compute[187185]: 2025-11-29 07:44:50.483 187189 DEBUG nova.compute.manager [req-d15ee8dd-a7d1-465f-8d8d-47074f88a83a req-2d86eefb-5032-4c61-ad52-8a805864f783 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] No waiting events found dispatching network-vif-unplugged-75ff1c6d-b5d1-4b97-953a-7eb71b306a96 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:44:50 compute-0 nova_compute[187185]: 2025-11-29 07:44:50.483 187189 DEBUG nova.compute.manager [req-d15ee8dd-a7d1-465f-8d8d-47074f88a83a req-2d86eefb-5032-4c61-ad52-8a805864f783 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Received event network-vif-unplugged-75ff1c6d-b5d1-4b97-953a-7eb71b306a96 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 07:44:50 compute-0 nova_compute[187185]: 2025-11-29 07:44:50.484 187189 DEBUG nova.compute.manager [req-d15ee8dd-a7d1-465f-8d8d-47074f88a83a req-2d86eefb-5032-4c61-ad52-8a805864f783 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Received event network-vif-plugged-75ff1c6d-b5d1-4b97-953a-7eb71b306a96 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:44:50 compute-0 nova_compute[187185]: 2025-11-29 07:44:50.484 187189 DEBUG oslo_concurrency.lockutils [req-d15ee8dd-a7d1-465f-8d8d-47074f88a83a req-2d86eefb-5032-4c61-ad52-8a805864f783 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "461f4281-0d15-4de0-b5c6-d642b24bfab4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:44:50 compute-0 nova_compute[187185]: 2025-11-29 07:44:50.484 187189 DEBUG oslo_concurrency.lockutils [req-d15ee8dd-a7d1-465f-8d8d-47074f88a83a req-2d86eefb-5032-4c61-ad52-8a805864f783 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "461f4281-0d15-4de0-b5c6-d642b24bfab4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:44:50 compute-0 nova_compute[187185]: 2025-11-29 07:44:50.485 187189 DEBUG oslo_concurrency.lockutils [req-d15ee8dd-a7d1-465f-8d8d-47074f88a83a req-2d86eefb-5032-4c61-ad52-8a805864f783 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "461f4281-0d15-4de0-b5c6-d642b24bfab4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:44:50 compute-0 nova_compute[187185]: 2025-11-29 07:44:50.485 187189 DEBUG nova.compute.manager [req-d15ee8dd-a7d1-465f-8d8d-47074f88a83a req-2d86eefb-5032-4c61-ad52-8a805864f783 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] No waiting events found dispatching network-vif-plugged-75ff1c6d-b5d1-4b97-953a-7eb71b306a96 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:44:50 compute-0 nova_compute[187185]: 2025-11-29 07:44:50.485 187189 WARNING nova.compute.manager [req-d15ee8dd-a7d1-465f-8d8d-47074f88a83a req-2d86eefb-5032-4c61-ad52-8a805864f783 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Received unexpected event network-vif-plugged-75ff1c6d-b5d1-4b97-953a-7eb71b306a96 for instance with vm_state active and task_state deleting.
Nov 29 07:44:50 compute-0 nova_compute[187185]: 2025-11-29 07:44:50.488 187189 DEBUG nova.network.neutron [-] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:44:50 compute-0 nova_compute[187185]: 2025-11-29 07:44:50.508 187189 INFO nova.compute.manager [-] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Took 1.34 seconds to deallocate network for instance.
Nov 29 07:44:51 compute-0 nova_compute[187185]: 2025-11-29 07:44:51.015 187189 DEBUG nova.compute.manager [req-1e107268-6273-4df2-8f77-d11032248a4f req-ceabecef-7d5c-4718-833c-0e43e2b069d9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Received event network-vif-deleted-75ff1c6d-b5d1-4b97-953a-7eb71b306a96 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:44:51 compute-0 nova_compute[187185]: 2025-11-29 07:44:51.022 187189 DEBUG oslo_concurrency.lockutils [None req-039ce53b-9a0d-4c52-8ee7-f724d4f7b820 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:44:51 compute-0 nova_compute[187185]: 2025-11-29 07:44:51.022 187189 DEBUG oslo_concurrency.lockutils [None req-039ce53b-9a0d-4c52-8ee7-f724d4f7b820 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:44:51 compute-0 nova_compute[187185]: 2025-11-29 07:44:51.162 187189 DEBUG nova.compute.provider_tree [None req-039ce53b-9a0d-4c52-8ee7-f724d4f7b820 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:44:51 compute-0 nova_compute[187185]: 2025-11-29 07:44:51.180 187189 DEBUG nova.scheduler.client.report [None req-039ce53b-9a0d-4c52-8ee7-f724d4f7b820 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:44:51 compute-0 nova_compute[187185]: 2025-11-29 07:44:51.210 187189 DEBUG oslo_concurrency.lockutils [None req-039ce53b-9a0d-4c52-8ee7-f724d4f7b820 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.188s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:44:51 compute-0 nova_compute[187185]: 2025-11-29 07:44:51.238 187189 INFO nova.scheduler.client.report [None req-039ce53b-9a0d-4c52-8ee7-f724d4f7b820 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Deleted allocations for instance 461f4281-0d15-4de0-b5c6-d642b24bfab4
Nov 29 07:44:51 compute-0 nova_compute[187185]: 2025-11-29 07:44:51.321 187189 DEBUG oslo_concurrency.lockutils [None req-039ce53b-9a0d-4c52-8ee7-f724d4f7b820 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "461f4281-0d15-4de0-b5c6-d642b24bfab4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.564s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:44:51 compute-0 nova_compute[187185]: 2025-11-29 07:44:51.667 187189 DEBUG nova.network.neutron [req-b971f9d5-bfb9-43c0-bb2c-2f78b5358c4f req-b213ca42-17f8-408e-95dc-720466f8f078 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Updated VIF entry in instance network info cache for port 75ff1c6d-b5d1-4b97-953a-7eb71b306a96. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:44:51 compute-0 nova_compute[187185]: 2025-11-29 07:44:51.667 187189 DEBUG nova.network.neutron [req-b971f9d5-bfb9-43c0-bb2c-2f78b5358c4f req-b213ca42-17f8-408e-95dc-720466f8f078 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Updating instance_info_cache with network_info: [{"id": "75ff1c6d-b5d1-4b97-953a-7eb71b306a96", "address": "fa:16:3e:a6:b5:c9", "network": {"id": "e23e9510-a780-4254-b7f0-36040139e7db", "bridge": "br-int", "label": "tempest-network-smoke--622290722", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fea6:b5c9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fea6:b5c9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75ff1c6d-b5", "ovs_interfaceid": "75ff1c6d-b5d1-4b97-953a-7eb71b306a96", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:44:51 compute-0 nova_compute[187185]: 2025-11-29 07:44:51.687 187189 DEBUG oslo_concurrency.lockutils [req-b971f9d5-bfb9-43c0-bb2c-2f78b5358c4f req-b213ca42-17f8-408e-95dc-720466f8f078 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-461f4281-0d15-4de0-b5c6-d642b24bfab4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:44:52 compute-0 nova_compute[187185]: 2025-11-29 07:44:52.818 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:44:54 compute-0 nova_compute[187185]: 2025-11-29 07:44:54.091 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:44:54 compute-0 podman[244644]: 2025-11-29 07:44:54.819793949 +0000 UTC m=+0.070746808 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 29 07:44:54 compute-0 podman[244646]: 2025-11-29 07:44:54.820323384 +0000 UTC m=+0.070005017 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 07:44:54 compute-0 podman[244645]: 2025-11-29 07:44:54.826433767 +0000 UTC m=+0.080559786 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., distribution-scope=public, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a 
stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 29 07:44:57 compute-0 nova_compute[187185]: 2025-11-29 07:44:57.310 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:44:57 compute-0 nova_compute[187185]: 2025-11-29 07:44:57.853 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:44:59 compute-0 nova_compute[187185]: 2025-11-29 07:44:59.094 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:44:59 compute-0 nova_compute[187185]: 2025-11-29 07:44:59.507 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:44:59 compute-0 nova_compute[187185]: 2025-11-29 07:44:59.711 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:45:02 compute-0 nova_compute[187185]: 2025-11-29 07:45:02.855 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:45:03 compute-0 podman[244705]: 2025-11-29 07:45:03.858291247 +0000 UTC m=+0.120544751 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 29 07:45:04 compute-0 nova_compute[187185]: 2025-11-29 07:45:04.068 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764402289.067511, 461f4281-0d15-4de0-b5c6-d642b24bfab4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:45:04 compute-0 nova_compute[187185]: 2025-11-29 07:45:04.069 187189 INFO nova.compute.manager [-] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] VM Stopped (Lifecycle Event)
Nov 29 07:45:04 compute-0 nova_compute[187185]: 2025-11-29 07:45:04.097 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:45:04 compute-0 nova_compute[187185]: 2025-11-29 07:45:04.398 187189 DEBUG nova.compute.manager [None req-7b8eabef-3754-4321-8a11-9dd0ad3ea840 - - - - - -] [instance: 461f4281-0d15-4de0-b5c6-d642b24bfab4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:45:06 compute-0 sshd-session[244703]: Received disconnect from 115.190.136.184 port 34416:11: Bye Bye [preauth]
Nov 29 07:45:06 compute-0 sshd-session[244703]: Disconnected from authenticating user root 115.190.136.184 port 34416 [preauth]
Nov 29 07:45:06 compute-0 sshd-session[244732]: Received disconnect from 190.181.27.27 port 52130:11: Bye Bye [preauth]
Nov 29 07:45:06 compute-0 sshd-session[244732]: Disconnected from authenticating user root 190.181.27.27 port 52130 [preauth]
Nov 29 07:45:07 compute-0 nova_compute[187185]: 2025-11-29 07:45:07.857 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:45:09 compute-0 nova_compute[187185]: 2025-11-29 07:45:09.099 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:45:11 compute-0 podman[244734]: 2025-11-29 07:45:11.801010593 +0000 UTC m=+0.068005620 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 07:45:12 compute-0 nova_compute[187185]: 2025-11-29 07:45:12.858 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:45:14 compute-0 nova_compute[187185]: 2025-11-29 07:45:14.102 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:45:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:45:14.443 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=64, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=63) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:45:14 compute-0 nova_compute[187185]: 2025-11-29 07:45:14.443 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:45:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:45:14.445 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:45:14 compute-0 podman[244759]: 2025-11-29 07:45:14.810470581 +0000 UTC m=+0.073804455 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Nov 29 07:45:14 compute-0 podman[244758]: 2025-11-29 07:45:14.812173119 +0000 UTC m=+0.071923801 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:45:17 compute-0 nova_compute[187185]: 2025-11-29 07:45:17.860 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:45:19 compute-0 nova_compute[187185]: 2025-11-29 07:45:19.104 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:45:19 compute-0 sshd-session[244729]: Connection closed by 167.94.138.198 port 37782 [preauth]
Nov 29 07:45:20 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:45:20.448 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '64'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:45:22 compute-0 nova_compute[187185]: 2025-11-29 07:45:22.901 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:45:24 compute-0 nova_compute[187185]: 2025-11-29 07:45:24.107 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:45:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:45:25.744 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:45:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:45:25.744 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:45:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:45:25.744 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:45:25 compute-0 podman[244799]: 2025-11-29 07:45:25.816513562 +0000 UTC m=+0.070265774 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 07:45:25 compute-0 podman[244797]: 2025-11-29 07:45:25.827684389 +0000 UTC m=+0.088435730 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 07:45:25 compute-0 podman[244798]: 2025-11-29 07:45:25.839938777 +0000 UTC m=+0.090252382 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., config_id=edpm, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, architecture=x86_64, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, io.openshift.tags=minimal rhel9, release=1755695350)
Nov 29 07:45:27 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:45:27.103 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e1:bd:9f 10.100.0.2 2001:db8::f816:3eff:fee1:bd9f'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fee1:bd9f/64', 'neutron:device_id': 'ovnmeta-7b412a37-c227-42ad-9fca-23287613486a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b412a37-c227-42ad-9fca-23287613486a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=045a9acc-370f-460b-b7b5-7c57bd647b8b, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=27074c74-d81e-4dc1-9e05-b59b6b9a0624) old=Port_Binding(mac=['fa:16:3e:e1:bd:9f 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-7b412a37-c227-42ad-9fca-23287613486a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b412a37-c227-42ad-9fca-23287613486a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:45:27 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:45:27.106 104254 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 27074c74-d81e-4dc1-9e05-b59b6b9a0624 in datapath 7b412a37-c227-42ad-9fca-23287613486a updated
Nov 29 07:45:27 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:45:27.110 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7b412a37-c227-42ad-9fca-23287613486a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:45:27 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:45:27.111 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d34bde91-b85a-4243-9996-c9a242a92e99]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:45:27 compute-0 nova_compute[187185]: 2025-11-29 07:45:27.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:45:27 compute-0 nova_compute[187185]: 2025-11-29 07:45:27.935 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:45:29 compute-0 nova_compute[187185]: 2025-11-29 07:45:29.109 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:45:30 compute-0 nova_compute[187185]: 2025-11-29 07:45:30.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:45:32 compute-0 nova_compute[187185]: 2025-11-29 07:45:32.937 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:45:34 compute-0 nova_compute[187185]: 2025-11-29 07:45:34.112 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:45:34 compute-0 nova_compute[187185]: 2025-11-29 07:45:34.252 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:45:34 compute-0 nova_compute[187185]: 2025-11-29 07:45:34.253 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:45:34 compute-0 nova_compute[187185]: 2025-11-29 07:45:34.253 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:45:34 compute-0 nova_compute[187185]: 2025-11-29 07:45:34.253 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:45:34 compute-0 podman[244862]: 2025-11-29 07:45:34.441278275 +0000 UTC m=+0.130107811 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Nov 29 07:45:34 compute-0 nova_compute[187185]: 2025-11-29 07:45:34.500 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:45:34 compute-0 nova_compute[187185]: 2025-11-29 07:45:34.502 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5741MB free_disk=73.25366973876953GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:45:34 compute-0 nova_compute[187185]: 2025-11-29 07:45:34.502 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:45:34 compute-0 nova_compute[187185]: 2025-11-29 07:45:34.502 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:45:34 compute-0 nova_compute[187185]: 2025-11-29 07:45:34.790 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:45:34 compute-0 nova_compute[187185]: 2025-11-29 07:45:34.791 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:45:34 compute-0 nova_compute[187185]: 2025-11-29 07:45:34.809 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Refreshing inventories for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 29 07:45:34 compute-0 nova_compute[187185]: 2025-11-29 07:45:34.836 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Updating ProviderTree inventory for provider 4e39a026-df39-4e20-874a-dbb5a40df044 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 29 07:45:34 compute-0 nova_compute[187185]: 2025-11-29 07:45:34.836 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Updating inventory in ProviderTree for provider 4e39a026-df39-4e20-874a-dbb5a40df044 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 29 07:45:34 compute-0 nova_compute[187185]: 2025-11-29 07:45:34.858 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Refreshing aggregate associations for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 29 07:45:34 compute-0 nova_compute[187185]: 2025-11-29 07:45:34.881 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Refreshing trait associations for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044, traits: HW_CPU_X86_SSE,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_TRUSTED_CERTS,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 29 07:45:35 compute-0 nova_compute[187185]: 2025-11-29 07:45:35.197 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:45:35 compute-0 nova_compute[187185]: 2025-11-29 07:45:35.593 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:45:35 compute-0 nova_compute[187185]: 2025-11-29 07:45:35.681 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:45:35 compute-0 nova_compute[187185]: 2025-11-29 07:45:35.682 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:45:36 compute-0 nova_compute[187185]: 2025-11-29 07:45:36.682 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:45:36 compute-0 nova_compute[187185]: 2025-11-29 07:45:36.682 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:45:36 compute-0 nova_compute[187185]: 2025-11-29 07:45:36.683 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:45:37 compute-0 nova_compute[187185]: 2025-11-29 07:45:37.394 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 07:45:37 compute-0 nova_compute[187185]: 2025-11-29 07:45:37.394 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:45:37 compute-0 nova_compute[187185]: 2025-11-29 07:45:37.395 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:45:37 compute-0 nova_compute[187185]: 2025-11-29 07:45:37.395 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:45:37 compute-0 nova_compute[187185]: 2025-11-29 07:45:37.395 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:45:37 compute-0 nova_compute[187185]: 2025-11-29 07:45:37.396 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:45:37 compute-0 nova_compute[187185]: 2025-11-29 07:45:37.939 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:45:38 compute-0 nova_compute[187185]: 2025-11-29 07:45:38.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:45:39 compute-0 nova_compute[187185]: 2025-11-29 07:45:39.114 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:45:40 compute-0 nova_compute[187185]: 2025-11-29 07:45:40.311 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:45:42 compute-0 podman[244889]: 2025-11-29 07:45:42.794164797 +0000 UTC m=+0.064515331 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 07:45:42 compute-0 nova_compute[187185]: 2025-11-29 07:45:42.941 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:45:44 compute-0 nova_compute[187185]: 2025-11-29 07:45:44.116 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:45:45 compute-0 podman[244913]: 2025-11-29 07:45:45.802896523 +0000 UTC m=+0.068236936 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 29 07:45:45 compute-0 podman[244914]: 2025-11-29 07:45:45.818375662 +0000 UTC m=+0.072580959 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 29 07:45:47 compute-0 nova_compute[187185]: 2025-11-29 07:45:47.944 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:45:48 compute-0 sshd-session[244888]: error: kex_exchange_identification: read: Connection timed out
Nov 29 07:45:48 compute-0 sshd-session[244888]: banner exchange: Connection from 115.190.136.184 port 25622: Connection timed out
Nov 29 07:45:49 compute-0 nova_compute[187185]: 2025-11-29 07:45:49.118 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:45:52 compute-0 nova_compute[187185]: 2025-11-29 07:45:52.947 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:45:54 compute-0 nova_compute[187185]: 2025-11-29 07:45:54.120 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:45:54 compute-0 sshd-session[244953]: Received disconnect from 20.255.62.58 port 44436:11: Bye Bye [preauth]
Nov 29 07:45:54 compute-0 sshd-session[244953]: Disconnected from authenticating user root 20.255.62.58 port 44436 [preauth]
Nov 29 07:45:55 compute-0 ovn_controller[95281]: 2025-11-29T07:45:55Z|00556|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Nov 29 07:45:56 compute-0 podman[244956]: 2025-11-29 07:45:56.806019224 +0000 UTC m=+0.066946956 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, name=ubi9-minimal, io.openshift.tags=minimal rhel9, vcs-type=git, vendor=Red Hat, Inc., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64)
Nov 29 07:45:56 compute-0 podman[244955]: 2025-11-29 07:45:56.814989599 +0000 UTC m=+0.082126208 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, maintainer=OpenStack 
Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 29 07:45:56 compute-0 podman[244957]: 2025-11-29 07:45:56.851247431 +0000 UTC m=+0.102741815 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 07:45:57 compute-0 nova_compute[187185]: 2025-11-29 07:45:57.950 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:45:59 compute-0 nova_compute[187185]: 2025-11-29 07:45:59.141 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:46:02 compute-0 nova_compute[187185]: 2025-11-29 07:46:02.987 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:46:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:46:03.067 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=65, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=64) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:46:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:46:03.068 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:46:03 compute-0 nova_compute[187185]: 2025-11-29 07:46:03.069 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:46:04 compute-0 nova_compute[187185]: 2025-11-29 07:46:04.146 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:46:04 compute-0 podman[245017]: 2025-11-29 07:46:04.855854863 +0000 UTC m=+0.118691099 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 07:46:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:46:05.071 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '65'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:46:07 compute-0 nova_compute[187185]: 2025-11-29 07:46:07.987 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:46:09 compute-0 nova_compute[187185]: 2025-11-29 07:46:09.149 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:46:12 compute-0 nova_compute[187185]: 2025-11-29 07:46:12.989 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:46:13 compute-0 podman[245045]: 2025-11-29 07:46:13.798080972 +0000 UTC m=+0.062292183 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 07:46:14 compute-0 nova_compute[187185]: 2025-11-29 07:46:14.152 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:46:16 compute-0 podman[245072]: 2025-11-29 07:46:16.813862306 +0000 UTC m=+0.067068549 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:46:16 compute-0 podman[245071]: 2025-11-29 07:46:16.829912253 +0000 UTC m=+0.080964485 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=multipathd, tcib_managed=true, container_name=multipathd)
Nov 29 07:46:17 compute-0 nova_compute[187185]: 2025-11-29 07:46:17.991 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:46:19 compute-0 nova_compute[187185]: 2025-11-29 07:46:19.154 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:46:22 compute-0 nova_compute[187185]: 2025-11-29 07:46:22.994 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:46:24 compute-0 nova_compute[187185]: 2025-11-29 07:46:24.157 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:46:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:46:25.744 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:46:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:46:25.745 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:46:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:46:25.745 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:46:27 compute-0 sshd-session[245109]: Invalid user system from 190.181.27.27 port 33406
Nov 29 07:46:27 compute-0 podman[245112]: 2025-11-29 07:46:27.805096183 +0000 UTC m=+0.066740971 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.expose-services=, version=9.6, build-date=2025-08-20T13:12:41, architecture=x86_64, managed_by=edpm_ansible, name=ubi9-minimal, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 29 07:46:27 compute-0 podman[245111]: 2025-11-29 07:46:27.825975057 +0000 UTC m=+0.091772533 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 29 07:46:27 compute-0 podman[245113]: 2025-11-29 07:46:27.827411348 +0000 UTC m=+0.077158387 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 07:46:27 compute-0 sshd-session[245109]: Received disconnect from 190.181.27.27 port 33406:11: Bye Bye [preauth]
Nov 29 07:46:27 compute-0 sshd-session[245109]: Disconnected from invalid user system 190.181.27.27 port 33406 [preauth]
Nov 29 07:46:27 compute-0 nova_compute[187185]: 2025-11-29 07:46:27.996 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:46:28 compute-0 nova_compute[187185]: 2025-11-29 07:46:28.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:46:29 compute-0 nova_compute[187185]: 2025-11-29 07:46:29.160 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:46:30 compute-0 nova_compute[187185]: 2025-11-29 07:46:30.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:46:30 compute-0 nova_compute[187185]: 2025-11-29 07:46:30.566 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:46:30 compute-0 nova_compute[187185]: 2025-11-29 07:46:30.566 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:46:30 compute-0 nova_compute[187185]: 2025-11-29 07:46:30.567 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:46:30 compute-0 nova_compute[187185]: 2025-11-29 07:46:30.567 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:46:30 compute-0 nova_compute[187185]: 2025-11-29 07:46:30.759 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:46:30 compute-0 nova_compute[187185]: 2025-11-29 07:46:30.761 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5726MB free_disk=73.25359344482422GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:46:30 compute-0 nova_compute[187185]: 2025-11-29 07:46:30.761 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:46:30 compute-0 nova_compute[187185]: 2025-11-29 07:46:30.762 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:46:30 compute-0 nova_compute[187185]: 2025-11-29 07:46:30.860 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:46:30 compute-0 nova_compute[187185]: 2025-11-29 07:46:30.861 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:46:30 compute-0 nova_compute[187185]: 2025-11-29 07:46:30.889 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:46:30 compute-0 nova_compute[187185]: 2025-11-29 07:46:30.913 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:46:30 compute-0 nova_compute[187185]: 2025-11-29 07:46:30.915 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:46:30 compute-0 nova_compute[187185]: 2025-11-29 07:46:30.915 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:46:31 compute-0 nova_compute[187185]: 2025-11-29 07:46:31.914 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:46:32 compute-0 nova_compute[187185]: 2025-11-29 07:46:32.999 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:46:34 compute-0 nova_compute[187185]: 2025-11-29 07:46:34.202 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:46:35 compute-0 nova_compute[187185]: 2025-11-29 07:46:35.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:46:35 compute-0 nova_compute[187185]: 2025-11-29 07:46:35.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:46:35 compute-0 nova_compute[187185]: 2025-11-29 07:46:35.318 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:46:35 compute-0 nova_compute[187185]: 2025-11-29 07:46:35.341 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 07:46:35 compute-0 nova_compute[187185]: 2025-11-29 07:46:35.341 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:46:35 compute-0 podman[245173]: 2025-11-29 07:46:35.865868134 +0000 UTC m=+0.134335015 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:46:36 compute-0 nova_compute[187185]: 2025-11-29 07:46:36.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:46:36 compute-0 nova_compute[187185]: 2025-11-29 07:46:36.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:46:38 compute-0 nova_compute[187185]: 2025-11-29 07:46:38.001 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:46:38 compute-0 nova_compute[187185]: 2025-11-29 07:46:38.318 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:46:39 compute-0 nova_compute[187185]: 2025-11-29 07:46:39.204 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:46:39 compute-0 nova_compute[187185]: 2025-11-29 07:46:39.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:46:40 compute-0 nova_compute[187185]: 2025-11-29 07:46:40.311 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:46:43 compute-0 nova_compute[187185]: 2025-11-29 07:46:43.034 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:46:44 compute-0 nova_compute[187185]: 2025-11-29 07:46:44.249 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:46:44 compute-0 podman[245199]: 2025-11-29 07:46:44.800057064 +0000 UTC m=+0.063002764 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 07:46:45 compute-0 nova_compute[187185]: 2025-11-29 07:46:45.245 187189 DEBUG oslo_concurrency.lockutils [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "45154ba2-ac58-4c10-a3ff-b0290dae3c8d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:46:45 compute-0 nova_compute[187185]: 2025-11-29 07:46:45.246 187189 DEBUG oslo_concurrency.lockutils [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "45154ba2-ac58-4c10-a3ff-b0290dae3c8d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:46:45 compute-0 nova_compute[187185]: 2025-11-29 07:46:45.281 187189 DEBUG nova.compute.manager [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 07:46:46 compute-0 nova_compute[187185]: 2025-11-29 07:46:46.232 187189 DEBUG oslo_concurrency.lockutils [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:46:46 compute-0 nova_compute[187185]: 2025-11-29 07:46:46.233 187189 DEBUG oslo_concurrency.lockutils [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:46:46 compute-0 nova_compute[187185]: 2025-11-29 07:46:46.240 187189 DEBUG nova.virt.hardware [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 07:46:46 compute-0 nova_compute[187185]: 2025-11-29 07:46:46.241 187189 INFO nova.compute.claims [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Claim successful on node compute-0.ctlplane.example.com
Nov 29 07:46:46 compute-0 nova_compute[187185]: 2025-11-29 07:46:46.374 187189 DEBUG nova.compute.provider_tree [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:46:46 compute-0 nova_compute[187185]: 2025-11-29 07:46:46.389 187189 DEBUG nova.scheduler.client.report [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:46:46 compute-0 nova_compute[187185]: 2025-11-29 07:46:46.410 187189 DEBUG oslo_concurrency.lockutils [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:46:46 compute-0 nova_compute[187185]: 2025-11-29 07:46:46.411 187189 DEBUG nova.compute.manager [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 07:46:46 compute-0 nova_compute[187185]: 2025-11-29 07:46:46.465 187189 DEBUG nova.compute.manager [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 07:46:46 compute-0 nova_compute[187185]: 2025-11-29 07:46:46.465 187189 DEBUG nova.network.neutron [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 07:46:46 compute-0 nova_compute[187185]: 2025-11-29 07:46:46.487 187189 INFO nova.virt.libvirt.driver [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 07:46:46 compute-0 nova_compute[187185]: 2025-11-29 07:46:46.520 187189 DEBUG nova.compute.manager [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 07:46:46 compute-0 nova_compute[187185]: 2025-11-29 07:46:46.661 187189 DEBUG nova.compute.manager [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 07:46:46 compute-0 nova_compute[187185]: 2025-11-29 07:46:46.663 187189 DEBUG nova.virt.libvirt.driver [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 07:46:46 compute-0 nova_compute[187185]: 2025-11-29 07:46:46.663 187189 INFO nova.virt.libvirt.driver [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Creating image(s)
Nov 29 07:46:46 compute-0 nova_compute[187185]: 2025-11-29 07:46:46.664 187189 DEBUG oslo_concurrency.lockutils [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "/var/lib/nova/instances/45154ba2-ac58-4c10-a3ff-b0290dae3c8d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:46:46 compute-0 nova_compute[187185]: 2025-11-29 07:46:46.664 187189 DEBUG oslo_concurrency.lockutils [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "/var/lib/nova/instances/45154ba2-ac58-4c10-a3ff-b0290dae3c8d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:46:46 compute-0 nova_compute[187185]: 2025-11-29 07:46:46.665 187189 DEBUG oslo_concurrency.lockutils [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "/var/lib/nova/instances/45154ba2-ac58-4c10-a3ff-b0290dae3c8d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:46:46 compute-0 nova_compute[187185]: 2025-11-29 07:46:46.679 187189 DEBUG oslo_concurrency.processutils [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:46:46 compute-0 nova_compute[187185]: 2025-11-29 07:46:46.762 187189 DEBUG oslo_concurrency.processutils [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:46:46 compute-0 nova_compute[187185]: 2025-11-29 07:46:46.763 187189 DEBUG oslo_concurrency.lockutils [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:46:46 compute-0 nova_compute[187185]: 2025-11-29 07:46:46.764 187189 DEBUG oslo_concurrency.lockutils [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:46:46 compute-0 nova_compute[187185]: 2025-11-29 07:46:46.775 187189 DEBUG oslo_concurrency.processutils [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:46:46 compute-0 nova_compute[187185]: 2025-11-29 07:46:46.852 187189 DEBUG oslo_concurrency.processutils [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:46:46 compute-0 nova_compute[187185]: 2025-11-29 07:46:46.854 187189 DEBUG oslo_concurrency.processutils [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/45154ba2-ac58-4c10-a3ff-b0290dae3c8d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:46:46 compute-0 nova_compute[187185]: 2025-11-29 07:46:46.966 187189 DEBUG oslo_concurrency.processutils [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/45154ba2-ac58-4c10-a3ff-b0290dae3c8d/disk 1073741824" returned: 0 in 0.112s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:46:46 compute-0 nova_compute[187185]: 2025-11-29 07:46:46.967 187189 DEBUG oslo_concurrency.lockutils [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.203s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:46:46 compute-0 nova_compute[187185]: 2025-11-29 07:46:46.968 187189 DEBUG oslo_concurrency.processutils [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:46:47 compute-0 nova_compute[187185]: 2025-11-29 07:46:47.003 187189 DEBUG nova.policy [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 07:46:47 compute-0 nova_compute[187185]: 2025-11-29 07:46:47.029 187189 DEBUG oslo_concurrency.processutils [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:46:47 compute-0 nova_compute[187185]: 2025-11-29 07:46:47.030 187189 DEBUG nova.virt.disk.api [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Checking if we can resize image /var/lib/nova/instances/45154ba2-ac58-4c10-a3ff-b0290dae3c8d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:46:47 compute-0 nova_compute[187185]: 2025-11-29 07:46:47.031 187189 DEBUG oslo_concurrency.processutils [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/45154ba2-ac58-4c10-a3ff-b0290dae3c8d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:46:47 compute-0 nova_compute[187185]: 2025-11-29 07:46:47.097 187189 DEBUG oslo_concurrency.processutils [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/45154ba2-ac58-4c10-a3ff-b0290dae3c8d/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:46:47 compute-0 nova_compute[187185]: 2025-11-29 07:46:47.099 187189 DEBUG nova.virt.disk.api [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Cannot resize image /var/lib/nova/instances/45154ba2-ac58-4c10-a3ff-b0290dae3c8d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:46:47 compute-0 nova_compute[187185]: 2025-11-29 07:46:47.099 187189 DEBUG nova.objects.instance [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'migration_context' on Instance uuid 45154ba2-ac58-4c10-a3ff-b0290dae3c8d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:46:47 compute-0 nova_compute[187185]: 2025-11-29 07:46:47.118 187189 DEBUG nova.virt.libvirt.driver [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 07:46:47 compute-0 nova_compute[187185]: 2025-11-29 07:46:47.119 187189 DEBUG nova.virt.libvirt.driver [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Ensure instance console log exists: /var/lib/nova/instances/45154ba2-ac58-4c10-a3ff-b0290dae3c8d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 07:46:47 compute-0 nova_compute[187185]: 2025-11-29 07:46:47.119 187189 DEBUG oslo_concurrency.lockutils [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:46:47 compute-0 nova_compute[187185]: 2025-11-29 07:46:47.120 187189 DEBUG oslo_concurrency.lockutils [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:46:47 compute-0 nova_compute[187185]: 2025-11-29 07:46:47.120 187189 DEBUG oslo_concurrency.lockutils [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:46:47 compute-0 podman[245238]: 2025-11-29 07:46:47.803368233 +0000 UTC m=+0.064267150 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd, container_name=multipathd)
Nov 29 07:46:47 compute-0 podman[245239]: 2025-11-29 07:46:47.842657071 +0000 UTC m=+0.095194570 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS)
Nov 29 07:46:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:46:48.020 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:46:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:46:48.023 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:46:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:46:48.023 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:46:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:46:48.023 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:46:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:46:48.023 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:46:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:46:48.023 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:46:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:46:48.024 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:46:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:46:48.024 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:46:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:46:48.024 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:46:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:46:48.024 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:46:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:46:48.024 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:46:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:46:48.024 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:46:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:46:48.024 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:46:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:46:48.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:46:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:46:48.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:46:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:46:48.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:46:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:46:48.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:46:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:46:48.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:46:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:46:48.026 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:46:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:46:48.026 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:46:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:46:48.027 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:46:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:46:48.027 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:46:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:46:48.027 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:46:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:46:48.027 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:46:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:46:48.028 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:46:48 compute-0 nova_compute[187185]: 2025-11-29 07:46:48.036 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:46:48 compute-0 nova_compute[187185]: 2025-11-29 07:46:48.555 187189 DEBUG nova.network.neutron [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Successfully created port: 231b4077-66e5-463d-9600-7e94d305692d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 07:46:49 compute-0 nova_compute[187185]: 2025-11-29 07:46:49.253 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:46:50 compute-0 nova_compute[187185]: 2025-11-29 07:46:50.633 187189 DEBUG nova.network.neutron [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Successfully updated port: 231b4077-66e5-463d-9600-7e94d305692d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 07:46:51 compute-0 nova_compute[187185]: 2025-11-29 07:46:51.568 187189 DEBUG oslo_concurrency.lockutils [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "refresh_cache-45154ba2-ac58-4c10-a3ff-b0290dae3c8d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:46:51 compute-0 nova_compute[187185]: 2025-11-29 07:46:51.568 187189 DEBUG oslo_concurrency.lockutils [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquired lock "refresh_cache-45154ba2-ac58-4c10-a3ff-b0290dae3c8d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:46:51 compute-0 nova_compute[187185]: 2025-11-29 07:46:51.569 187189 DEBUG nova.network.neutron [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:46:51 compute-0 nova_compute[187185]: 2025-11-29 07:46:51.723 187189 DEBUG nova.compute.manager [req-f64e82e3-b500-4b9d-bf7f-845f9fb36439 req-242fd20f-04ae-4ff2-aa4c-921d719167d2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Received event network-changed-231b4077-66e5-463d-9600-7e94d305692d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:46:51 compute-0 nova_compute[187185]: 2025-11-29 07:46:51.724 187189 DEBUG nova.compute.manager [req-f64e82e3-b500-4b9d-bf7f-845f9fb36439 req-242fd20f-04ae-4ff2-aa4c-921d719167d2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Refreshing instance network info cache due to event network-changed-231b4077-66e5-463d-9600-7e94d305692d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:46:51 compute-0 nova_compute[187185]: 2025-11-29 07:46:51.724 187189 DEBUG oslo_concurrency.lockutils [req-f64e82e3-b500-4b9d-bf7f-845f9fb36439 req-242fd20f-04ae-4ff2-aa4c-921d719167d2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-45154ba2-ac58-4c10-a3ff-b0290dae3c8d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:46:52 compute-0 nova_compute[187185]: 2025-11-29 07:46:52.028 187189 DEBUG nova.network.neutron [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:46:53 compute-0 nova_compute[187185]: 2025-11-29 07:46:53.038 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:46:53 compute-0 ovn_controller[95281]: 2025-11-29T07:46:53Z|00557|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Nov 29 07:46:54 compute-0 nova_compute[187185]: 2025-11-29 07:46:54.255 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:46:54 compute-0 nova_compute[187185]: 2025-11-29 07:46:54.990 187189 DEBUG nova.network.neutron [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Updating instance_info_cache with network_info: [{"id": "231b4077-66e5-463d-9600-7e94d305692d", "address": "fa:16:3e:82:46:2d", "network": {"id": "7b412a37-c227-42ad-9fca-23287613486a", "bridge": "br-int", "label": "tempest-network-smoke--2027641750", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe82:462d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap231b4077-66", "ovs_interfaceid": "231b4077-66e5-463d-9600-7e94d305692d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:46:55 compute-0 nova_compute[187185]: 2025-11-29 07:46:55.318 187189 DEBUG oslo_concurrency.lockutils [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Releasing lock "refresh_cache-45154ba2-ac58-4c10-a3ff-b0290dae3c8d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:46:55 compute-0 nova_compute[187185]: 2025-11-29 07:46:55.319 187189 DEBUG nova.compute.manager [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Instance network_info: |[{"id": "231b4077-66e5-463d-9600-7e94d305692d", "address": "fa:16:3e:82:46:2d", "network": {"id": "7b412a37-c227-42ad-9fca-23287613486a", "bridge": "br-int", "label": "tempest-network-smoke--2027641750", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe82:462d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap231b4077-66", "ovs_interfaceid": "231b4077-66e5-463d-9600-7e94d305692d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 07:46:55 compute-0 nova_compute[187185]: 2025-11-29 07:46:55.320 187189 DEBUG oslo_concurrency.lockutils [req-f64e82e3-b500-4b9d-bf7f-845f9fb36439 req-242fd20f-04ae-4ff2-aa4c-921d719167d2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-45154ba2-ac58-4c10-a3ff-b0290dae3c8d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:46:55 compute-0 nova_compute[187185]: 2025-11-29 07:46:55.320 187189 DEBUG nova.network.neutron [req-f64e82e3-b500-4b9d-bf7f-845f9fb36439 req-242fd20f-04ae-4ff2-aa4c-921d719167d2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Refreshing network info cache for port 231b4077-66e5-463d-9600-7e94d305692d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:46:55 compute-0 nova_compute[187185]: 2025-11-29 07:46:55.326 187189 DEBUG nova.virt.libvirt.driver [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Start _get_guest_xml network_info=[{"id": "231b4077-66e5-463d-9600-7e94d305692d", "address": "fa:16:3e:82:46:2d", "network": {"id": "7b412a37-c227-42ad-9fca-23287613486a", "bridge": "br-int", "label": "tempest-network-smoke--2027641750", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe82:462d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap231b4077-66", "ovs_interfaceid": "231b4077-66e5-463d-9600-7e94d305692d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:46:55 compute-0 nova_compute[187185]: 2025-11-29 07:46:55.331 187189 WARNING nova.virt.libvirt.driver [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:46:55 compute-0 nova_compute[187185]: 2025-11-29 07:46:55.340 187189 DEBUG nova.virt.libvirt.host [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:46:55 compute-0 nova_compute[187185]: 2025-11-29 07:46:55.342 187189 DEBUG nova.virt.libvirt.host [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:46:55 compute-0 nova_compute[187185]: 2025-11-29 07:46:55.352 187189 DEBUG nova.virt.libvirt.host [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:46:55 compute-0 nova_compute[187185]: 2025-11-29 07:46:55.353 187189 DEBUG nova.virt.libvirt.host [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:46:55 compute-0 nova_compute[187185]: 2025-11-29 07:46:55.355 187189 DEBUG nova.virt.libvirt.driver [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:46:55 compute-0 nova_compute[187185]: 2025-11-29 07:46:55.355 187189 DEBUG nova.virt.hardware [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:46:55 compute-0 nova_compute[187185]: 2025-11-29 07:46:55.356 187189 DEBUG nova.virt.hardware [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:46:55 compute-0 nova_compute[187185]: 2025-11-29 07:46:55.356 187189 DEBUG nova.virt.hardware [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:46:55 compute-0 nova_compute[187185]: 2025-11-29 07:46:55.356 187189 DEBUG nova.virt.hardware [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:46:55 compute-0 nova_compute[187185]: 2025-11-29 07:46:55.357 187189 DEBUG nova.virt.hardware [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:46:55 compute-0 nova_compute[187185]: 2025-11-29 07:46:55.357 187189 DEBUG nova.virt.hardware [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:46:55 compute-0 nova_compute[187185]: 2025-11-29 07:46:55.357 187189 DEBUG nova.virt.hardware [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:46:55 compute-0 nova_compute[187185]: 2025-11-29 07:46:55.357 187189 DEBUG nova.virt.hardware [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:46:55 compute-0 nova_compute[187185]: 2025-11-29 07:46:55.358 187189 DEBUG nova.virt.hardware [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:46:55 compute-0 nova_compute[187185]: 2025-11-29 07:46:55.358 187189 DEBUG nova.virt.hardware [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:46:55 compute-0 nova_compute[187185]: 2025-11-29 07:46:55.358 187189 DEBUG nova.virt.hardware [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:46:55 compute-0 nova_compute[187185]: 2025-11-29 07:46:55.363 187189 DEBUG nova.virt.libvirt.vif [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:46:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1930125207',display_name='tempest-TestGettingAddress-server-1930125207',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1930125207',id=167,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMRqMqB2m2OgDWFFhqrQomXOlqtsH3DkZX/q3f9H1IQ0ObpMW22Tv9hUlgFTK1dOU/3/nBcrYC6MrtFKRuaytFioBJb/QmB1UkysPgqPE38bvZ1GGYFs/tP1vPRobYVgdQ==',key_name='tempest-TestGettingAddress-277961909',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-nh1laork',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:46:46Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=45154ba2-ac58-4c10-a3ff-b0290dae3c8d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "231b4077-66e5-463d-9600-7e94d305692d", "address": "fa:16:3e:82:46:2d", "network": {"id": "7b412a37-c227-42ad-9fca-23287613486a", "bridge": "br-int", "label": "tempest-network-smoke--2027641750", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe82:462d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap231b4077-66", "ovs_interfaceid": "231b4077-66e5-463d-9600-7e94d305692d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:46:55 compute-0 nova_compute[187185]: 2025-11-29 07:46:55.363 187189 DEBUG nova.network.os_vif_util [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "231b4077-66e5-463d-9600-7e94d305692d", "address": "fa:16:3e:82:46:2d", "network": {"id": "7b412a37-c227-42ad-9fca-23287613486a", "bridge": "br-int", "label": "tempest-network-smoke--2027641750", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe82:462d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap231b4077-66", "ovs_interfaceid": "231b4077-66e5-463d-9600-7e94d305692d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:46:55 compute-0 nova_compute[187185]: 2025-11-29 07:46:55.364 187189 DEBUG nova.network.os_vif_util [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:46:2d,bridge_name='br-int',has_traffic_filtering=True,id=231b4077-66e5-463d-9600-7e94d305692d,network=Network(7b412a37-c227-42ad-9fca-23287613486a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap231b4077-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:46:55 compute-0 nova_compute[187185]: 2025-11-29 07:46:55.365 187189 DEBUG nova.objects.instance [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'pci_devices' on Instance uuid 45154ba2-ac58-4c10-a3ff-b0290dae3c8d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:46:55 compute-0 nova_compute[187185]: 2025-11-29 07:46:55.779 187189 DEBUG nova.virt.libvirt.driver [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:46:55 compute-0 nova_compute[187185]:   <uuid>45154ba2-ac58-4c10-a3ff-b0290dae3c8d</uuid>
Nov 29 07:46:55 compute-0 nova_compute[187185]:   <name>instance-000000a7</name>
Nov 29 07:46:55 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 07:46:55 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:46:55 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:46:55 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:       <nova:name>tempest-TestGettingAddress-server-1930125207</nova:name>
Nov 29 07:46:55 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:46:55</nova:creationTime>
Nov 29 07:46:55 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 07:46:55 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 07:46:55 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:46:55 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:46:55 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:46:55 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:46:55 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:46:55 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:46:55 compute-0 nova_compute[187185]:         <nova:user uuid="31ac7b05b012433b89143dc9f259644a">tempest-TestGettingAddress-1465017630-project-member</nova:user>
Nov 29 07:46:55 compute-0 nova_compute[187185]:         <nova:project uuid="0111c22b4b954ea586ca20d91ed3970f">tempest-TestGettingAddress-1465017630</nova:project>
Nov 29 07:46:55 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:46:55 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 07:46:55 compute-0 nova_compute[187185]:         <nova:port uuid="231b4077-66e5-463d-9600-7e94d305692d">
Nov 29 07:46:55 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fe82:462d" ipVersion="6"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:46:55 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:46:55 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:46:55 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <system>
Nov 29 07:46:55 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:46:55 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:46:55 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:46:55 compute-0 nova_compute[187185]:       <entry name="serial">45154ba2-ac58-4c10-a3ff-b0290dae3c8d</entry>
Nov 29 07:46:55 compute-0 nova_compute[187185]:       <entry name="uuid">45154ba2-ac58-4c10-a3ff-b0290dae3c8d</entry>
Nov 29 07:46:55 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     </system>
Nov 29 07:46:55 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:46:55 compute-0 nova_compute[187185]:   <os>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:   </os>
Nov 29 07:46:55 compute-0 nova_compute[187185]:   <features>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:   </features>
Nov 29 07:46:55 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:46:55 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:46:55 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:46:55 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/45154ba2-ac58-4c10-a3ff-b0290dae3c8d/disk"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:46:55 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/45154ba2-ac58-4c10-a3ff-b0290dae3c8d/disk.config"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:46:55 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:82:46:2d"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:       <target dev="tap231b4077-66"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:46:55 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/45154ba2-ac58-4c10-a3ff-b0290dae3c8d/console.log" append="off"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <video>
Nov 29 07:46:55 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     </video>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:46:55 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:46:55 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:46:55 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:46:55 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:46:55 compute-0 nova_compute[187185]: </domain>
Nov 29 07:46:55 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:46:55 compute-0 nova_compute[187185]: 2025-11-29 07:46:55.781 187189 DEBUG nova.compute.manager [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Preparing to wait for external event network-vif-plugged-231b4077-66e5-463d-9600-7e94d305692d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 07:46:55 compute-0 nova_compute[187185]: 2025-11-29 07:46:55.782 187189 DEBUG oslo_concurrency.lockutils [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "45154ba2-ac58-4c10-a3ff-b0290dae3c8d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:46:55 compute-0 nova_compute[187185]: 2025-11-29 07:46:55.782 187189 DEBUG oslo_concurrency.lockutils [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "45154ba2-ac58-4c10-a3ff-b0290dae3c8d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:46:55 compute-0 nova_compute[187185]: 2025-11-29 07:46:55.782 187189 DEBUG oslo_concurrency.lockutils [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "45154ba2-ac58-4c10-a3ff-b0290dae3c8d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:46:55 compute-0 nova_compute[187185]: 2025-11-29 07:46:55.783 187189 DEBUG nova.virt.libvirt.vif [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:46:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1930125207',display_name='tempest-TestGettingAddress-server-1930125207',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1930125207',id=167,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMRqMqB2m2OgDWFFhqrQomXOlqtsH3DkZX/q3f9H1IQ0ObpMW22Tv9hUlgFTK1dOU/3/nBcrYC6MrtFKRuaytFioBJb/QmB1UkysPgqPE38bvZ1GGYFs/tP1vPRobYVgdQ==',key_name='tempest-TestGettingAddress-277961909',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-nh1laork',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:46:46Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=45154ba2-ac58-4c10-a3ff-b0290dae3c8d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "231b4077-66e5-463d-9600-7e94d305692d", "address": "fa:16:3e:82:46:2d", "network": {"id": "7b412a37-c227-42ad-9fca-23287613486a", "bridge": "br-int", "label": "tempest-network-smoke--2027641750", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe82:462d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap231b4077-66", "ovs_interfaceid": "231b4077-66e5-463d-9600-7e94d305692d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:46:55 compute-0 nova_compute[187185]: 2025-11-29 07:46:55.783 187189 DEBUG nova.network.os_vif_util [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "231b4077-66e5-463d-9600-7e94d305692d", "address": "fa:16:3e:82:46:2d", "network": {"id": "7b412a37-c227-42ad-9fca-23287613486a", "bridge": "br-int", "label": "tempest-network-smoke--2027641750", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe82:462d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap231b4077-66", "ovs_interfaceid": "231b4077-66e5-463d-9600-7e94d305692d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:46:55 compute-0 nova_compute[187185]: 2025-11-29 07:46:55.784 187189 DEBUG nova.network.os_vif_util [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:46:2d,bridge_name='br-int',has_traffic_filtering=True,id=231b4077-66e5-463d-9600-7e94d305692d,network=Network(7b412a37-c227-42ad-9fca-23287613486a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap231b4077-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:46:55 compute-0 nova_compute[187185]: 2025-11-29 07:46:55.784 187189 DEBUG os_vif [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:46:2d,bridge_name='br-int',has_traffic_filtering=True,id=231b4077-66e5-463d-9600-7e94d305692d,network=Network(7b412a37-c227-42ad-9fca-23287613486a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap231b4077-66') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:46:55 compute-0 nova_compute[187185]: 2025-11-29 07:46:55.785 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:46:55 compute-0 nova_compute[187185]: 2025-11-29 07:46:55.786 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:46:55 compute-0 nova_compute[187185]: 2025-11-29 07:46:55.786 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:46:55 compute-0 nova_compute[187185]: 2025-11-29 07:46:55.790 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:46:55 compute-0 nova_compute[187185]: 2025-11-29 07:46:55.791 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap231b4077-66, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:46:55 compute-0 nova_compute[187185]: 2025-11-29 07:46:55.792 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap231b4077-66, col_values=(('external_ids', {'iface-id': '231b4077-66e5-463d-9600-7e94d305692d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:82:46:2d', 'vm-uuid': '45154ba2-ac58-4c10-a3ff-b0290dae3c8d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:46:55 compute-0 nova_compute[187185]: 2025-11-29 07:46:55.794 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:46:55 compute-0 nova_compute[187185]: 2025-11-29 07:46:55.797 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:46:55 compute-0 NetworkManager[55227]: <info>  [1764402415.7974] manager: (tap231b4077-66): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/288)
Nov 29 07:46:55 compute-0 nova_compute[187185]: 2025-11-29 07:46:55.804 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:46:55 compute-0 nova_compute[187185]: 2025-11-29 07:46:55.807 187189 INFO os_vif [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:46:2d,bridge_name='br-int',has_traffic_filtering=True,id=231b4077-66e5-463d-9600-7e94d305692d,network=Network(7b412a37-c227-42ad-9fca-23287613486a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap231b4077-66')
Nov 29 07:46:56 compute-0 nova_compute[187185]: 2025-11-29 07:46:56.855 187189 DEBUG nova.virt.libvirt.driver [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:46:56 compute-0 nova_compute[187185]: 2025-11-29 07:46:56.856 187189 DEBUG nova.virt.libvirt.driver [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:46:56 compute-0 nova_compute[187185]: 2025-11-29 07:46:56.856 187189 DEBUG nova.virt.libvirt.driver [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No VIF found with MAC fa:16:3e:82:46:2d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:46:56 compute-0 nova_compute[187185]: 2025-11-29 07:46:56.857 187189 INFO nova.virt.libvirt.driver [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Using config drive
Nov 29 07:46:57 compute-0 nova_compute[187185]: 2025-11-29 07:46:57.211 187189 INFO nova.virt.libvirt.driver [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Creating config drive at /var/lib/nova/instances/45154ba2-ac58-4c10-a3ff-b0290dae3c8d/disk.config
Nov 29 07:46:57 compute-0 nova_compute[187185]: 2025-11-29 07:46:57.220 187189 DEBUG oslo_concurrency.processutils [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/45154ba2-ac58-4c10-a3ff-b0290dae3c8d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6dcaoqdj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:46:57 compute-0 nova_compute[187185]: 2025-11-29 07:46:57.310 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:46:57 compute-0 nova_compute[187185]: 2025-11-29 07:46:57.356 187189 DEBUG oslo_concurrency.processutils [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/45154ba2-ac58-4c10-a3ff-b0290dae3c8d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6dcaoqdj" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:46:57 compute-0 kernel: tap231b4077-66: entered promiscuous mode
Nov 29 07:46:57 compute-0 NetworkManager[55227]: <info>  [1764402417.4411] manager: (tap231b4077-66): new Tun device (/org/freedesktop/NetworkManager/Devices/289)
Nov 29 07:46:57 compute-0 ovn_controller[95281]: 2025-11-29T07:46:57Z|00558|binding|INFO|Claiming lport 231b4077-66e5-463d-9600-7e94d305692d for this chassis.
Nov 29 07:46:57 compute-0 ovn_controller[95281]: 2025-11-29T07:46:57Z|00559|binding|INFO|231b4077-66e5-463d-9600-7e94d305692d: Claiming fa:16:3e:82:46:2d 10.100.0.9 2001:db8::f816:3eff:fe82:462d
Nov 29 07:46:57 compute-0 nova_compute[187185]: 2025-11-29 07:46:57.443 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:46:57 compute-0 nova_compute[187185]: 2025-11-29 07:46:57.446 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:46:57 compute-0 nova_compute[187185]: 2025-11-29 07:46:57.450 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:46:57 compute-0 nova_compute[187185]: 2025-11-29 07:46:57.458 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:46:57 compute-0 systemd-udevd[245300]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:46:57 compute-0 systemd-machined[153486]: New machine qemu-65-instance-000000a7.
Nov 29 07:46:57 compute-0 NetworkManager[55227]: <info>  [1764402417.4825] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/290)
Nov 29 07:46:57 compute-0 NetworkManager[55227]: <info>  [1764402417.4837] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/291)
Nov 29 07:46:57 compute-0 nova_compute[187185]: 2025-11-29 07:46:57.484 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:46:57 compute-0 NetworkManager[55227]: <info>  [1764402417.4940] device (tap231b4077-66): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:46:57 compute-0 NetworkManager[55227]: <info>  [1764402417.4954] device (tap231b4077-66): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:46:57 compute-0 systemd[1]: Started Virtual Machine qemu-65-instance-000000a7.
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:46:57.568 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:46:2d 10.100.0.9 2001:db8::f816:3eff:fe82:462d'], port_security=['fa:16:3e:82:46:2d 10.100.0.9 2001:db8::f816:3eff:fe82:462d'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28 2001:db8::f816:3eff:fe82:462d/64', 'neutron:device_id': '45154ba2-ac58-4c10-a3ff-b0290dae3c8d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b412a37-c227-42ad-9fca-23287613486a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0e7f9e0d-709d-40ea-bb38-80f3b9bd57d2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=045a9acc-370f-460b-b7b5-7c57bd647b8b, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=231b4077-66e5-463d-9600-7e94d305692d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:46:57.569 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 231b4077-66e5-463d-9600-7e94d305692d in datapath 7b412a37-c227-42ad-9fca-23287613486a bound to our chassis
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:46:57.571 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7b412a37-c227-42ad-9fca-23287613486a
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:46:57.581 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[bb05f387-9263-4534-ae91-17de7e5d3bc2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:46:57.583 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7b412a37-c1 in ovnmeta-7b412a37-c227-42ad-9fca-23287613486a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:46:57.585 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7b412a37-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:46:57.585 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[deb2be59-e416-49c1-b922-cec7c008d1d4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:46:57.586 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[b71b10a8-10b6-4bb5-9781-56ba0aad3344]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:46:57.600 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[1a12bc5b-4e2b-4e72-80b9-e867f45c15c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:46:57.626 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[4556143a-7aca-4048-b3f1-145d4a3d3e07]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:46:57 compute-0 nova_compute[187185]: 2025-11-29 07:46:57.653 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:46:57.665 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[3cb9fd20-b50e-4bd2-9510-2b4276b6f14a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:46:57.671 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[9ca5201e-5930-46f2-b1df-d6d87c754dcb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:46:57 compute-0 NetworkManager[55227]: <info>  [1764402417.6728] manager: (tap7b412a37-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/292)
Nov 29 07:46:57 compute-0 systemd-udevd[245302]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:46:57 compute-0 nova_compute[187185]: 2025-11-29 07:46:57.686 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:46:57 compute-0 ovn_controller[95281]: 2025-11-29T07:46:57Z|00560|binding|INFO|Setting lport 231b4077-66e5-463d-9600-7e94d305692d ovn-installed in OVS
Nov 29 07:46:57 compute-0 ovn_controller[95281]: 2025-11-29T07:46:57Z|00561|binding|INFO|Setting lport 231b4077-66e5-463d-9600-7e94d305692d up in Southbound
Nov 29 07:46:57 compute-0 nova_compute[187185]: 2025-11-29 07:46:57.699 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:46:57.717 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[f822c61e-3f4c-49b5-91cd-ad045598eca4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:46:57.725 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[89ef04a9-32a3-4fe4-be2a-5fbb0b44ea91]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:46:57 compute-0 NetworkManager[55227]: <info>  [1764402417.7530] device (tap7b412a37-c0): carrier: link connected
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:46:57.763 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[b14bd050-738a-4a0e-abeb-65aa8dafd5ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:46:57.785 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[02355997-d564-48a9-908c-5312f7113e15]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7b412a37-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e1:bd:9f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 175], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 786541, 'reachable_time': 37759, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245333, 'error': None, 'target': 'ovnmeta-7b412a37-c227-42ad-9fca-23287613486a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:46:57.808 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[c0e60216-3e1a-438a-9801-58a4508074f2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee1:bd9f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 786541, 'tstamp': 786541}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 245334, 'error': None, 'target': 'ovnmeta-7b412a37-c227-42ad-9fca-23287613486a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:46:57.831 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[a0b32c57-6de6-48f6-8985-1c091bd06864]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7b412a37-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e1:bd:9f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 175], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 786541, 'reachable_time': 37759, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 245335, 'error': None, 'target': 'ovnmeta-7b412a37-c227-42ad-9fca-23287613486a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:46:57.874 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d6ec49f3-29c2-42cf-8f42-4aa79d850d4b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:46:57.963 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[b668e3c9-6057-4770-9939-f944d17f4e05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:46:57.965 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b412a37-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:46:57.965 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:46:57.966 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7b412a37-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:46:57 compute-0 nova_compute[187185]: 2025-11-29 07:46:57.968 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:46:57 compute-0 NetworkManager[55227]: <info>  [1764402417.9693] manager: (tap7b412a37-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/293)
Nov 29 07:46:57 compute-0 kernel: tap7b412a37-c0: entered promiscuous mode
Nov 29 07:46:57 compute-0 nova_compute[187185]: 2025-11-29 07:46:57.972 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:46:57.973 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7b412a37-c0, col_values=(('external_ids', {'iface-id': '27074c74-d81e-4dc1-9e05-b59b6b9a0624'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:46:57 compute-0 nova_compute[187185]: 2025-11-29 07:46:57.975 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:46:57 compute-0 ovn_controller[95281]: 2025-11-29T07:46:57Z|00562|binding|INFO|Releasing lport 27074c74-d81e-4dc1-9e05-b59b6b9a0624 from this chassis (sb_readonly=0)
Nov 29 07:46:57 compute-0 nova_compute[187185]: 2025-11-29 07:46:57.990 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:46:57.993 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7b412a37-c227-42ad-9fca-23287613486a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7b412a37-c227-42ad-9fca-23287613486a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:46:57.994 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[85e11061-41d7-4854-9991-1acda6c4be62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:46:57.995 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-7b412a37-c227-42ad-9fca-23287613486a
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/7b412a37-c227-42ad-9fca-23287613486a.pid.haproxy
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID 7b412a37-c227-42ad-9fca-23287613486a
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:46:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:46:57.997 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7b412a37-c227-42ad-9fca-23287613486a', 'env', 'PROCESS_TAG=haproxy-7b412a37-c227-42ad-9fca-23287613486a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7b412a37-c227-42ad-9fca-23287613486a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:46:58 compute-0 nova_compute[187185]: 2025-11-29 07:46:58.014 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764402418.0138836, 45154ba2-ac58-4c10-a3ff-b0290dae3c8d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:46:58 compute-0 nova_compute[187185]: 2025-11-29 07:46:58.015 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] VM Started (Lifecycle Event)
Nov 29 07:46:58 compute-0 nova_compute[187185]: 2025-11-29 07:46:58.040 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:46:58 compute-0 nova_compute[187185]: 2025-11-29 07:46:58.264 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:46:58 compute-0 nova_compute[187185]: 2025-11-29 07:46:58.270 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764402418.014739, 45154ba2-ac58-4c10-a3ff-b0290dae3c8d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:46:58 compute-0 nova_compute[187185]: 2025-11-29 07:46:58.271 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] VM Paused (Lifecycle Event)
Nov 29 07:46:58 compute-0 nova_compute[187185]: 2025-11-29 07:46:58.296 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:46:58 compute-0 nova_compute[187185]: 2025-11-29 07:46:58.301 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:46:58 compute-0 nova_compute[187185]: 2025-11-29 07:46:58.324 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:46:58 compute-0 podman[245373]: 2025-11-29 07:46:58.339683612 +0000 UTC m=+0.026194317 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:46:58 compute-0 podman[245373]: 2025-11-29 07:46:58.897308182 +0000 UTC m=+0.583818867 container create 7755127ec8441879f78be828a3e70052a54493a5920decd1a02b59d23bed270f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b412a37-c227-42ad-9fca-23287613486a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Nov 29 07:46:58 compute-0 podman[245387]: 2025-11-29 07:46:58.911757154 +0000 UTC m=+0.162456875 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_id=edpm, maintainer=Red Hat, Inc., name=ubi9-minimal, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, architecture=x86_64, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 29 07:46:58 compute-0 podman[245386]: 2025-11-29 07:46:58.928433718 +0000 UTC m=+0.178795999 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_metadata_agent, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:46:58 compute-0 podman[245388]: 2025-11-29 07:46:58.928569682 +0000 UTC m=+0.172178611 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 07:46:58 compute-0 systemd[1]: Started libpod-conmon-7755127ec8441879f78be828a3e70052a54493a5920decd1a02b59d23bed270f.scope.
Nov 29 07:46:58 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:46:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d62ada2ba9ef00e188ab1aa85bf580c5695086f7c38b085fc61e9706a27c9ba5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:46:59 compute-0 podman[245373]: 2025-11-29 07:46:59.176321934 +0000 UTC m=+0.862832669 container init 7755127ec8441879f78be828a3e70052a54493a5920decd1a02b59d23bed270f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b412a37-c227-42ad-9fca-23287613486a, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true)
Nov 29 07:46:59 compute-0 podman[245373]: 2025-11-29 07:46:59.186049681 +0000 UTC m=+0.872560376 container start 7755127ec8441879f78be828a3e70052a54493a5920decd1a02b59d23bed270f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b412a37-c227-42ad-9fca-23287613486a, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Nov 29 07:46:59 compute-0 neutron-haproxy-ovnmeta-7b412a37-c227-42ad-9fca-23287613486a[245449]: [NOTICE]   (245453) : New worker (245455) forked
Nov 29 07:46:59 compute-0 neutron-haproxy-ovnmeta-7b412a37-c227-42ad-9fca-23287613486a[245449]: [NOTICE]   (245453) : Loading success.
Nov 29 07:46:59 compute-0 nova_compute[187185]: 2025-11-29 07:46:59.580 187189 DEBUG nova.compute.manager [req-e109fb0b-2f9f-4f63-94a0-9418ea1b1cf7 req-a3789add-e579-49c6-b4a9-0497d3833783 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Received event network-vif-plugged-231b4077-66e5-463d-9600-7e94d305692d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:46:59 compute-0 nova_compute[187185]: 2025-11-29 07:46:59.581 187189 DEBUG oslo_concurrency.lockutils [req-e109fb0b-2f9f-4f63-94a0-9418ea1b1cf7 req-a3789add-e579-49c6-b4a9-0497d3833783 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "45154ba2-ac58-4c10-a3ff-b0290dae3c8d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:46:59 compute-0 nova_compute[187185]: 2025-11-29 07:46:59.582 187189 DEBUG oslo_concurrency.lockutils [req-e109fb0b-2f9f-4f63-94a0-9418ea1b1cf7 req-a3789add-e579-49c6-b4a9-0497d3833783 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "45154ba2-ac58-4c10-a3ff-b0290dae3c8d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:46:59 compute-0 nova_compute[187185]: 2025-11-29 07:46:59.582 187189 DEBUG oslo_concurrency.lockutils [req-e109fb0b-2f9f-4f63-94a0-9418ea1b1cf7 req-a3789add-e579-49c6-b4a9-0497d3833783 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "45154ba2-ac58-4c10-a3ff-b0290dae3c8d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:46:59 compute-0 nova_compute[187185]: 2025-11-29 07:46:59.583 187189 DEBUG nova.compute.manager [req-e109fb0b-2f9f-4f63-94a0-9418ea1b1cf7 req-a3789add-e579-49c6-b4a9-0497d3833783 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Processing event network-vif-plugged-231b4077-66e5-463d-9600-7e94d305692d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 07:46:59 compute-0 nova_compute[187185]: 2025-11-29 07:46:59.584 187189 DEBUG nova.compute.manager [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:46:59 compute-0 nova_compute[187185]: 2025-11-29 07:46:59.591 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764402419.5914593, 45154ba2-ac58-4c10-a3ff-b0290dae3c8d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:46:59 compute-0 nova_compute[187185]: 2025-11-29 07:46:59.592 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] VM Resumed (Lifecycle Event)
Nov 29 07:46:59 compute-0 nova_compute[187185]: 2025-11-29 07:46:59.596 187189 DEBUG nova.virt.libvirt.driver [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 07:46:59 compute-0 nova_compute[187185]: 2025-11-29 07:46:59.601 187189 INFO nova.virt.libvirt.driver [-] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Instance spawned successfully.
Nov 29 07:46:59 compute-0 nova_compute[187185]: 2025-11-29 07:46:59.601 187189 DEBUG nova.virt.libvirt.driver [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 07:46:59 compute-0 nova_compute[187185]: 2025-11-29 07:46:59.748 187189 DEBUG nova.network.neutron [req-f64e82e3-b500-4b9d-bf7f-845f9fb36439 req-242fd20f-04ae-4ff2-aa4c-921d719167d2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Updated VIF entry in instance network info cache for port 231b4077-66e5-463d-9600-7e94d305692d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:46:59 compute-0 nova_compute[187185]: 2025-11-29 07:46:59.750 187189 DEBUG nova.network.neutron [req-f64e82e3-b500-4b9d-bf7f-845f9fb36439 req-242fd20f-04ae-4ff2-aa4c-921d719167d2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Updating instance_info_cache with network_info: [{"id": "231b4077-66e5-463d-9600-7e94d305692d", "address": "fa:16:3e:82:46:2d", "network": {"id": "7b412a37-c227-42ad-9fca-23287613486a", "bridge": "br-int", "label": "tempest-network-smoke--2027641750", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe82:462d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap231b4077-66", "ovs_interfaceid": "231b4077-66e5-463d-9600-7e94d305692d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:46:59 compute-0 nova_compute[187185]: 2025-11-29 07:46:59.814 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:46:59 compute-0 nova_compute[187185]: 2025-11-29 07:46:59.817 187189 DEBUG oslo_concurrency.lockutils [req-f64e82e3-b500-4b9d-bf7f-845f9fb36439 req-242fd20f-04ae-4ff2-aa4c-921d719167d2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-45154ba2-ac58-4c10-a3ff-b0290dae3c8d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:46:59 compute-0 nova_compute[187185]: 2025-11-29 07:46:59.821 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:46:59 compute-0 nova_compute[187185]: 2025-11-29 07:46:59.825 187189 DEBUG nova.virt.libvirt.driver [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:46:59 compute-0 nova_compute[187185]: 2025-11-29 07:46:59.825 187189 DEBUG nova.virt.libvirt.driver [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:46:59 compute-0 nova_compute[187185]: 2025-11-29 07:46:59.826 187189 DEBUG nova.virt.libvirt.driver [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:46:59 compute-0 nova_compute[187185]: 2025-11-29 07:46:59.826 187189 DEBUG nova.virt.libvirt.driver [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:46:59 compute-0 nova_compute[187185]: 2025-11-29 07:46:59.827 187189 DEBUG nova.virt.libvirt.driver [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:46:59 compute-0 nova_compute[187185]: 2025-11-29 07:46:59.827 187189 DEBUG nova.virt.libvirt.driver [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:47:00 compute-0 nova_compute[187185]: 2025-11-29 07:47:00.022 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:47:00 compute-0 nova_compute[187185]: 2025-11-29 07:47:00.795 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:47:01 compute-0 nova_compute[187185]: 2025-11-29 07:47:01.271 187189 INFO nova.compute.manager [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Took 14.61 seconds to spawn the instance on the hypervisor.
Nov 29 07:47:01 compute-0 nova_compute[187185]: 2025-11-29 07:47:01.272 187189 DEBUG nova.compute.manager [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:47:01 compute-0 sshd-session[245279]: error: kex_exchange_identification: read: Connection timed out
Nov 29 07:47:01 compute-0 sshd-session[245279]: banner exchange: Connection from 115.190.136.184 port 36868: Connection timed out
Nov 29 07:47:02 compute-0 nova_compute[187185]: 2025-11-29 07:47:02.700 187189 DEBUG nova.compute.manager [req-4411a639-bcbd-42af-81ea-2beb39f48257 req-88b9eddf-a133-4190-b0e5-02f374849171 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Received event network-vif-plugged-231b4077-66e5-463d-9600-7e94d305692d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:47:02 compute-0 nova_compute[187185]: 2025-11-29 07:47:02.700 187189 DEBUG oslo_concurrency.lockutils [req-4411a639-bcbd-42af-81ea-2beb39f48257 req-88b9eddf-a133-4190-b0e5-02f374849171 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "45154ba2-ac58-4c10-a3ff-b0290dae3c8d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:47:02 compute-0 nova_compute[187185]: 2025-11-29 07:47:02.701 187189 DEBUG oslo_concurrency.lockutils [req-4411a639-bcbd-42af-81ea-2beb39f48257 req-88b9eddf-a133-4190-b0e5-02f374849171 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "45154ba2-ac58-4c10-a3ff-b0290dae3c8d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:47:02 compute-0 nova_compute[187185]: 2025-11-29 07:47:02.701 187189 DEBUG oslo_concurrency.lockutils [req-4411a639-bcbd-42af-81ea-2beb39f48257 req-88b9eddf-a133-4190-b0e5-02f374849171 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "45154ba2-ac58-4c10-a3ff-b0290dae3c8d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:47:02 compute-0 nova_compute[187185]: 2025-11-29 07:47:02.701 187189 DEBUG nova.compute.manager [req-4411a639-bcbd-42af-81ea-2beb39f48257 req-88b9eddf-a133-4190-b0e5-02f374849171 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] No waiting events found dispatching network-vif-plugged-231b4077-66e5-463d-9600-7e94d305692d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:47:02 compute-0 nova_compute[187185]: 2025-11-29 07:47:02.702 187189 WARNING nova.compute.manager [req-4411a639-bcbd-42af-81ea-2beb39f48257 req-88b9eddf-a133-4190-b0e5-02f374849171 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Received unexpected event network-vif-plugged-231b4077-66e5-463d-9600-7e94d305692d for instance with vm_state active and task_state None.
Nov 29 07:47:03 compute-0 nova_compute[187185]: 2025-11-29 07:47:03.046 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:47:03 compute-0 nova_compute[187185]: 2025-11-29 07:47:03.730 187189 INFO nova.compute.manager [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Took 18.34 seconds to build instance.
Nov 29 07:47:03 compute-0 nova_compute[187185]: 2025-11-29 07:47:03.834 187189 DEBUG oslo_concurrency.lockutils [None req-ba728f05-5bbd-4ddd-96a7-d13861a0291b 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "45154ba2-ac58-4c10-a3ff-b0290dae3c8d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.588s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:47:05 compute-0 nova_compute[187185]: 2025-11-29 07:47:05.798 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:47:06 compute-0 podman[245464]: 2025-11-29 07:47:06.893918756 +0000 UTC m=+0.143605627 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.license=GPLv2)
Nov 29 07:47:08 compute-0 nova_compute[187185]: 2025-11-29 07:47:08.047 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:47:10 compute-0 nova_compute[187185]: 2025-11-29 07:47:10.803 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:47:13 compute-0 nova_compute[187185]: 2025-11-29 07:47:13.049 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:47:13 compute-0 nova_compute[187185]: 2025-11-29 07:47:13.240 187189 DEBUG nova.compute.manager [req-7228e3ff-3070-4a55-805b-abf7fdf9c4ae req-31ef14f6-94c2-4517-92e9-f663ef6ee6ad 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Received event network-changed-231b4077-66e5-463d-9600-7e94d305692d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:47:13 compute-0 nova_compute[187185]: 2025-11-29 07:47:13.241 187189 DEBUG nova.compute.manager [req-7228e3ff-3070-4a55-805b-abf7fdf9c4ae req-31ef14f6-94c2-4517-92e9-f663ef6ee6ad 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Refreshing instance network info cache due to event network-changed-231b4077-66e5-463d-9600-7e94d305692d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:47:13 compute-0 nova_compute[187185]: 2025-11-29 07:47:13.242 187189 DEBUG oslo_concurrency.lockutils [req-7228e3ff-3070-4a55-805b-abf7fdf9c4ae req-31ef14f6-94c2-4517-92e9-f663ef6ee6ad 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-45154ba2-ac58-4c10-a3ff-b0290dae3c8d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:47:13 compute-0 nova_compute[187185]: 2025-11-29 07:47:13.242 187189 DEBUG oslo_concurrency.lockutils [req-7228e3ff-3070-4a55-805b-abf7fdf9c4ae req-31ef14f6-94c2-4517-92e9-f663ef6ee6ad 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-45154ba2-ac58-4c10-a3ff-b0290dae3c8d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:47:13 compute-0 nova_compute[187185]: 2025-11-29 07:47:13.243 187189 DEBUG nova.network.neutron [req-7228e3ff-3070-4a55-805b-abf7fdf9c4ae req-31ef14f6-94c2-4517-92e9-f663ef6ee6ad 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Refreshing network info cache for port 231b4077-66e5-463d-9600-7e94d305692d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:47:14 compute-0 nova_compute[187185]: 2025-11-29 07:47:14.005 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:47:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:47:14.006 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=66, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=65) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:47:14 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:47:14.009 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:47:14 compute-0 ovn_controller[95281]: 2025-11-29T07:47:14Z|00075|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:82:46:2d 10.100.0.9
Nov 29 07:47:14 compute-0 ovn_controller[95281]: 2025-11-29T07:47:14Z|00076|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:82:46:2d 10.100.0.9
Nov 29 07:47:15 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:47:15.014 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '66'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:47:15 compute-0 nova_compute[187185]: 2025-11-29 07:47:15.047 187189 DEBUG nova.network.neutron [req-7228e3ff-3070-4a55-805b-abf7fdf9c4ae req-31ef14f6-94c2-4517-92e9-f663ef6ee6ad 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Updated VIF entry in instance network info cache for port 231b4077-66e5-463d-9600-7e94d305692d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:47:15 compute-0 nova_compute[187185]: 2025-11-29 07:47:15.048 187189 DEBUG nova.network.neutron [req-7228e3ff-3070-4a55-805b-abf7fdf9c4ae req-31ef14f6-94c2-4517-92e9-f663ef6ee6ad 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Updating instance_info_cache with network_info: [{"id": "231b4077-66e5-463d-9600-7e94d305692d", "address": "fa:16:3e:82:46:2d", "network": {"id": "7b412a37-c227-42ad-9fca-23287613486a", "bridge": "br-int", "label": "tempest-network-smoke--2027641750", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe82:462d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap231b4077-66", "ovs_interfaceid": "231b4077-66e5-463d-9600-7e94d305692d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:47:15 compute-0 nova_compute[187185]: 2025-11-29 07:47:15.077 187189 DEBUG oslo_concurrency.lockutils [req-7228e3ff-3070-4a55-805b-abf7fdf9c4ae req-31ef14f6-94c2-4517-92e9-f663ef6ee6ad 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-45154ba2-ac58-4c10-a3ff-b0290dae3c8d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:47:15 compute-0 sshd-session[245510]: Received disconnect from 20.255.62.58 port 53676:11: Bye Bye [preauth]
Nov 29 07:47:15 compute-0 sshd-session[245510]: Disconnected from authenticating user root 20.255.62.58 port 53676 [preauth]
Nov 29 07:47:15 compute-0 podman[245512]: 2025-11-29 07:47:15.791921736 +0000 UTC m=+0.058068323 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 07:47:15 compute-0 nova_compute[187185]: 2025-11-29 07:47:15.806 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:47:18 compute-0 nova_compute[187185]: 2025-11-29 07:47:18.051 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:47:18 compute-0 podman[245538]: 2025-11-29 07:47:18.812151026 +0000 UTC m=+0.080971235 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:47:18 compute-0 podman[245539]: 2025-11-29 07:47:18.817692004 +0000 UTC m=+0.081869351 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Nov 29 07:47:20 compute-0 nova_compute[187185]: 2025-11-29 07:47:20.809 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:47:22 compute-0 nova_compute[187185]: 2025-11-29 07:47:22.347 187189 DEBUG nova.compute.manager [req-ed9fd159-953c-433a-8078-53c4849e3cfd req-9d7fdc7d-f2bd-4b56-aef7-66f89becdd91 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Received event network-changed-231b4077-66e5-463d-9600-7e94d305692d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:47:22 compute-0 nova_compute[187185]: 2025-11-29 07:47:22.348 187189 DEBUG nova.compute.manager [req-ed9fd159-953c-433a-8078-53c4849e3cfd req-9d7fdc7d-f2bd-4b56-aef7-66f89becdd91 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Refreshing instance network info cache due to event network-changed-231b4077-66e5-463d-9600-7e94d305692d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:47:22 compute-0 nova_compute[187185]: 2025-11-29 07:47:22.349 187189 DEBUG oslo_concurrency.lockutils [req-ed9fd159-953c-433a-8078-53c4849e3cfd req-9d7fdc7d-f2bd-4b56-aef7-66f89becdd91 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-45154ba2-ac58-4c10-a3ff-b0290dae3c8d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:47:22 compute-0 nova_compute[187185]: 2025-11-29 07:47:22.349 187189 DEBUG oslo_concurrency.lockutils [req-ed9fd159-953c-433a-8078-53c4849e3cfd req-9d7fdc7d-f2bd-4b56-aef7-66f89becdd91 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-45154ba2-ac58-4c10-a3ff-b0290dae3c8d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:47:22 compute-0 nova_compute[187185]: 2025-11-29 07:47:22.350 187189 DEBUG nova.network.neutron [req-ed9fd159-953c-433a-8078-53c4849e3cfd req-9d7fdc7d-f2bd-4b56-aef7-66f89becdd91 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Refreshing network info cache for port 231b4077-66e5-463d-9600-7e94d305692d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:47:22 compute-0 nova_compute[187185]: 2025-11-29 07:47:22.635 187189 DEBUG oslo_concurrency.lockutils [None req-25404b47-266f-4ef2-9dc5-0387c4eac0f7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "45154ba2-ac58-4c10-a3ff-b0290dae3c8d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:47:22 compute-0 nova_compute[187185]: 2025-11-29 07:47:22.636 187189 DEBUG oslo_concurrency.lockutils [None req-25404b47-266f-4ef2-9dc5-0387c4eac0f7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "45154ba2-ac58-4c10-a3ff-b0290dae3c8d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:47:22 compute-0 nova_compute[187185]: 2025-11-29 07:47:22.637 187189 DEBUG oslo_concurrency.lockutils [None req-25404b47-266f-4ef2-9dc5-0387c4eac0f7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "45154ba2-ac58-4c10-a3ff-b0290dae3c8d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:47:22 compute-0 nova_compute[187185]: 2025-11-29 07:47:22.637 187189 DEBUG oslo_concurrency.lockutils [None req-25404b47-266f-4ef2-9dc5-0387c4eac0f7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "45154ba2-ac58-4c10-a3ff-b0290dae3c8d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:47:22 compute-0 nova_compute[187185]: 2025-11-29 07:47:22.637 187189 DEBUG oslo_concurrency.lockutils [None req-25404b47-266f-4ef2-9dc5-0387c4eac0f7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "45154ba2-ac58-4c10-a3ff-b0290dae3c8d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:47:22 compute-0 nova_compute[187185]: 2025-11-29 07:47:22.657 187189 INFO nova.compute.manager [None req-25404b47-266f-4ef2-9dc5-0387c4eac0f7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Terminating instance
Nov 29 07:47:22 compute-0 nova_compute[187185]: 2025-11-29 07:47:22.671 187189 DEBUG nova.compute.manager [None req-25404b47-266f-4ef2-9dc5-0387c4eac0f7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 07:47:22 compute-0 kernel: tap231b4077-66 (unregistering): left promiscuous mode
Nov 29 07:47:22 compute-0 NetworkManager[55227]: <info>  [1764402442.7002] device (tap231b4077-66): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:47:22 compute-0 ovn_controller[95281]: 2025-11-29T07:47:22Z|00563|binding|INFO|Releasing lport 231b4077-66e5-463d-9600-7e94d305692d from this chassis (sb_readonly=0)
Nov 29 07:47:22 compute-0 nova_compute[187185]: 2025-11-29 07:47:22.722 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:47:22 compute-0 ovn_controller[95281]: 2025-11-29T07:47:22Z|00564|binding|INFO|Setting lport 231b4077-66e5-463d-9600-7e94d305692d down in Southbound
Nov 29 07:47:22 compute-0 ovn_controller[95281]: 2025-11-29T07:47:22Z|00565|binding|INFO|Removing iface tap231b4077-66 ovn-installed in OVS
Nov 29 07:47:22 compute-0 nova_compute[187185]: 2025-11-29 07:47:22.729 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:47:22 compute-0 nova_compute[187185]: 2025-11-29 07:47:22.753 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:47:22 compute-0 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d000000a7.scope: Deactivated successfully.
Nov 29 07:47:22 compute-0 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d000000a7.scope: Consumed 14.038s CPU time.
Nov 29 07:47:22 compute-0 systemd-machined[153486]: Machine qemu-65-instance-000000a7 terminated.
Nov 29 07:47:22 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:47:22.838 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:46:2d 10.100.0.9 2001:db8::f816:3eff:fe82:462d'], port_security=['fa:16:3e:82:46:2d 10.100.0.9 2001:db8::f816:3eff:fe82:462d'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28 2001:db8::f816:3eff:fe82:462d/64', 'neutron:device_id': '45154ba2-ac58-4c10-a3ff-b0290dae3c8d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b412a37-c227-42ad-9fca-23287613486a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0e7f9e0d-709d-40ea-bb38-80f3b9bd57d2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=045a9acc-370f-460b-b7b5-7c57bd647b8b, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=231b4077-66e5-463d-9600-7e94d305692d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:47:22 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:47:22.841 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 231b4077-66e5-463d-9600-7e94d305692d in datapath 7b412a37-c227-42ad-9fca-23287613486a unbound from our chassis
Nov 29 07:47:22 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:47:22.844 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7b412a37-c227-42ad-9fca-23287613486a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:47:22 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:47:22.846 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[c618c8c0-15cb-4de9-a372-7d4f2a6c42ad]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:47:22 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:47:22.846 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7b412a37-c227-42ad-9fca-23287613486a namespace which is not needed anymore
Nov 29 07:47:22 compute-0 NetworkManager[55227]: <info>  [1764402442.9048] manager: (tap231b4077-66): new Tun device (/org/freedesktop/NetworkManager/Devices/294)
Nov 29 07:47:22 compute-0 nova_compute[187185]: 2025-11-29 07:47:22.906 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:47:22 compute-0 nova_compute[187185]: 2025-11-29 07:47:22.914 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:47:22 compute-0 nova_compute[187185]: 2025-11-29 07:47:22.954 187189 INFO nova.virt.libvirt.driver [-] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Instance destroyed successfully.
Nov 29 07:47:22 compute-0 nova_compute[187185]: 2025-11-29 07:47:22.955 187189 DEBUG nova.objects.instance [None req-25404b47-266f-4ef2-9dc5-0387c4eac0f7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'resources' on Instance uuid 45154ba2-ac58-4c10-a3ff-b0290dae3c8d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:47:22 compute-0 nova_compute[187185]: 2025-11-29 07:47:22.992 187189 DEBUG nova.virt.libvirt.vif [None req-25404b47-266f-4ef2-9dc5-0387c4eac0f7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:46:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1930125207',display_name='tempest-TestGettingAddress-server-1930125207',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1930125207',id=167,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMRqMqB2m2OgDWFFhqrQomXOlqtsH3DkZX/q3f9H1IQ0ObpMW22Tv9hUlgFTK1dOU/3/nBcrYC6MrtFKRuaytFioBJb/QmB1UkysPgqPE38bvZ1GGYFs/tP1vPRobYVgdQ==',key_name='tempest-TestGettingAddress-277961909',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:47:01Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-nh1laork',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:47:01Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=45154ba2-ac58-4c10-a3ff-b0290dae3c8d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "231b4077-66e5-463d-9600-7e94d305692d", "address": "fa:16:3e:82:46:2d", "network": {"id": "7b412a37-c227-42ad-9fca-23287613486a", "bridge": "br-int", "label": "tempest-network-smoke--2027641750", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe82:462d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap231b4077-66", "ovs_interfaceid": "231b4077-66e5-463d-9600-7e94d305692d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:47:22 compute-0 nova_compute[187185]: 2025-11-29 07:47:22.993 187189 DEBUG nova.network.os_vif_util [None req-25404b47-266f-4ef2-9dc5-0387c4eac0f7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "231b4077-66e5-463d-9600-7e94d305692d", "address": "fa:16:3e:82:46:2d", "network": {"id": "7b412a37-c227-42ad-9fca-23287613486a", "bridge": "br-int", "label": "tempest-network-smoke--2027641750", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe82:462d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap231b4077-66", "ovs_interfaceid": "231b4077-66e5-463d-9600-7e94d305692d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:47:22 compute-0 nova_compute[187185]: 2025-11-29 07:47:22.994 187189 DEBUG nova.network.os_vif_util [None req-25404b47-266f-4ef2-9dc5-0387c4eac0f7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:82:46:2d,bridge_name='br-int',has_traffic_filtering=True,id=231b4077-66e5-463d-9600-7e94d305692d,network=Network(7b412a37-c227-42ad-9fca-23287613486a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap231b4077-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:47:22 compute-0 nova_compute[187185]: 2025-11-29 07:47:22.994 187189 DEBUG os_vif [None req-25404b47-266f-4ef2-9dc5-0387c4eac0f7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:82:46:2d,bridge_name='br-int',has_traffic_filtering=True,id=231b4077-66e5-463d-9600-7e94d305692d,network=Network(7b412a37-c227-42ad-9fca-23287613486a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap231b4077-66') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:47:22 compute-0 nova_compute[187185]: 2025-11-29 07:47:22.996 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:47:22 compute-0 nova_compute[187185]: 2025-11-29 07:47:22.997 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap231b4077-66, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:47:23 compute-0 nova_compute[187185]: 2025-11-29 07:47:23.024 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:47:23 compute-0 nova_compute[187185]: 2025-11-29 07:47:23.027 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:47:23 compute-0 nova_compute[187185]: 2025-11-29 07:47:23.033 187189 INFO os_vif [None req-25404b47-266f-4ef2-9dc5-0387c4eac0f7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:82:46:2d,bridge_name='br-int',has_traffic_filtering=True,id=231b4077-66e5-463d-9600-7e94d305692d,network=Network(7b412a37-c227-42ad-9fca-23287613486a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap231b4077-66')
Nov 29 07:47:23 compute-0 nova_compute[187185]: 2025-11-29 07:47:23.034 187189 INFO nova.virt.libvirt.driver [None req-25404b47-266f-4ef2-9dc5-0387c4eac0f7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Deleting instance files /var/lib/nova/instances/45154ba2-ac58-4c10-a3ff-b0290dae3c8d_del
Nov 29 07:47:23 compute-0 nova_compute[187185]: 2025-11-29 07:47:23.036 187189 INFO nova.virt.libvirt.driver [None req-25404b47-266f-4ef2-9dc5-0387c4eac0f7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Deletion of /var/lib/nova/instances/45154ba2-ac58-4c10-a3ff-b0290dae3c8d_del complete
Nov 29 07:47:23 compute-0 nova_compute[187185]: 2025-11-29 07:47:23.054 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:47:23 compute-0 nova_compute[187185]: 2025-11-29 07:47:23.259 187189 INFO nova.compute.manager [None req-25404b47-266f-4ef2-9dc5-0387c4eac0f7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Took 0.59 seconds to destroy the instance on the hypervisor.
Nov 29 07:47:23 compute-0 nova_compute[187185]: 2025-11-29 07:47:23.260 187189 DEBUG oslo.service.loopingcall [None req-25404b47-266f-4ef2-9dc5-0387c4eac0f7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 07:47:23 compute-0 nova_compute[187185]: 2025-11-29 07:47:23.261 187189 DEBUG nova.compute.manager [-] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 07:47:23 compute-0 nova_compute[187185]: 2025-11-29 07:47:23.261 187189 DEBUG nova.network.neutron [-] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 07:47:23 compute-0 neutron-haproxy-ovnmeta-7b412a37-c227-42ad-9fca-23287613486a[245449]: [NOTICE]   (245453) : haproxy version is 2.8.14-c23fe91
Nov 29 07:47:23 compute-0 neutron-haproxy-ovnmeta-7b412a37-c227-42ad-9fca-23287613486a[245449]: [NOTICE]   (245453) : path to executable is /usr/sbin/haproxy
Nov 29 07:47:23 compute-0 neutron-haproxy-ovnmeta-7b412a37-c227-42ad-9fca-23287613486a[245449]: [WARNING]  (245453) : Exiting Master process...
Nov 29 07:47:23 compute-0 neutron-haproxy-ovnmeta-7b412a37-c227-42ad-9fca-23287613486a[245449]: [WARNING]  (245453) : Exiting Master process...
Nov 29 07:47:23 compute-0 neutron-haproxy-ovnmeta-7b412a37-c227-42ad-9fca-23287613486a[245449]: [ALERT]    (245453) : Current worker (245455) exited with code 143 (Terminated)
Nov 29 07:47:23 compute-0 neutron-haproxy-ovnmeta-7b412a37-c227-42ad-9fca-23287613486a[245449]: [WARNING]  (245453) : All workers exited. Exiting... (0)
Nov 29 07:47:23 compute-0 systemd[1]: libpod-7755127ec8441879f78be828a3e70052a54493a5920decd1a02b59d23bed270f.scope: Deactivated successfully.
Nov 29 07:47:23 compute-0 podman[245615]: 2025-11-29 07:47:23.302611542 +0000 UTC m=+0.302361026 container died 7755127ec8441879f78be828a3e70052a54493a5920decd1a02b59d23bed270f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b412a37-c227-42ad-9fca-23287613486a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 07:47:23 compute-0 nova_compute[187185]: 2025-11-29 07:47:23.700 187189 DEBUG nova.compute.manager [req-bb4ed1b6-8e44-49fa-9adf-a6421f91b969 req-f74fefd7-c2b7-4c2e-adf4-721015ccbc4e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Received event network-vif-unplugged-231b4077-66e5-463d-9600-7e94d305692d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:47:23 compute-0 nova_compute[187185]: 2025-11-29 07:47:23.700 187189 DEBUG oslo_concurrency.lockutils [req-bb4ed1b6-8e44-49fa-9adf-a6421f91b969 req-f74fefd7-c2b7-4c2e-adf4-721015ccbc4e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "45154ba2-ac58-4c10-a3ff-b0290dae3c8d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:47:23 compute-0 nova_compute[187185]: 2025-11-29 07:47:23.701 187189 DEBUG oslo_concurrency.lockutils [req-bb4ed1b6-8e44-49fa-9adf-a6421f91b969 req-f74fefd7-c2b7-4c2e-adf4-721015ccbc4e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "45154ba2-ac58-4c10-a3ff-b0290dae3c8d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:47:23 compute-0 nova_compute[187185]: 2025-11-29 07:47:23.702 187189 DEBUG oslo_concurrency.lockutils [req-bb4ed1b6-8e44-49fa-9adf-a6421f91b969 req-f74fefd7-c2b7-4c2e-adf4-721015ccbc4e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "45154ba2-ac58-4c10-a3ff-b0290dae3c8d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:47:23 compute-0 nova_compute[187185]: 2025-11-29 07:47:23.704 187189 DEBUG nova.compute.manager [req-bb4ed1b6-8e44-49fa-9adf-a6421f91b969 req-f74fefd7-c2b7-4c2e-adf4-721015ccbc4e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] No waiting events found dispatching network-vif-unplugged-231b4077-66e5-463d-9600-7e94d305692d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:47:23 compute-0 nova_compute[187185]: 2025-11-29 07:47:23.704 187189 DEBUG nova.compute.manager [req-bb4ed1b6-8e44-49fa-9adf-a6421f91b969 req-f74fefd7-c2b7-4c2e-adf4-721015ccbc4e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Received event network-vif-unplugged-231b4077-66e5-463d-9600-7e94d305692d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 07:47:23 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7755127ec8441879f78be828a3e70052a54493a5920decd1a02b59d23bed270f-userdata-shm.mount: Deactivated successfully.
Nov 29 07:47:23 compute-0 systemd[1]: var-lib-containers-storage-overlay-d62ada2ba9ef00e188ab1aa85bf580c5695086f7c38b085fc61e9706a27c9ba5-merged.mount: Deactivated successfully.
Nov 29 07:47:23 compute-0 podman[245615]: 2025-11-29 07:47:23.863088984 +0000 UTC m=+0.862838368 container cleanup 7755127ec8441879f78be828a3e70052a54493a5920decd1a02b59d23bed270f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b412a37-c227-42ad-9fca-23287613486a, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 07:47:23 compute-0 systemd[1]: libpod-conmon-7755127ec8441879f78be828a3e70052a54493a5920decd1a02b59d23bed270f.scope: Deactivated successfully.
Nov 29 07:47:24 compute-0 podman[245641]: 2025-11-29 07:47:24.066455632 +0000 UTC m=+0.178667436 container remove 7755127ec8441879f78be828a3e70052a54493a5920decd1a02b59d23bed270f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b412a37-c227-42ad-9fca-23287613486a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 29 07:47:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:47:24.071 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[0b83071c-daf0-43fa-be8d-3010e95445df]: (4, ('Sat Nov 29 07:47:22 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7b412a37-c227-42ad-9fca-23287613486a (7755127ec8441879f78be828a3e70052a54493a5920decd1a02b59d23bed270f)\n7755127ec8441879f78be828a3e70052a54493a5920decd1a02b59d23bed270f\nSat Nov 29 07:47:23 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7b412a37-c227-42ad-9fca-23287613486a (7755127ec8441879f78be828a3e70052a54493a5920decd1a02b59d23bed270f)\n7755127ec8441879f78be828a3e70052a54493a5920decd1a02b59d23bed270f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:47:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:47:24.073 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[45a85296-2dd7-4148-8752-3f588dc7bc15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:47:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:47:24.075 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b412a37-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:47:24 compute-0 nova_compute[187185]: 2025-11-29 07:47:24.124 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:47:24 compute-0 kernel: tap7b412a37-c0: left promiscuous mode
Nov 29 07:47:24 compute-0 nova_compute[187185]: 2025-11-29 07:47:24.138 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:47:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:47:24.141 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[0ac0fec2-7c8f-473a-9871-b8ecb29c4f18]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:47:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:47:24.162 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[a9c6971c-952d-4e5d-b243-d5c3126a2d8d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:47:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:47:24.164 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[c84cd79e-6d92-4de8-a661-0a1fd35074ce]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:47:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:47:24.190 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[aac3b56c-2af7-4bad-92bd-84f8839899d1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 786532, 'reachable_time': 32211, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245657, 'error': None, 'target': 'ovnmeta-7b412a37-c227-42ad-9fca-23287613486a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:47:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:47:24.195 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7b412a37-c227-42ad-9fca-23287613486a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:47:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:47:24.196 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[4f4537b5-651b-4aed-8a87-512ccb8b993f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:47:24 compute-0 systemd[1]: run-netns-ovnmeta\x2d7b412a37\x2dc227\x2d42ad\x2d9fca\x2d23287613486a.mount: Deactivated successfully.
Nov 29 07:47:24 compute-0 nova_compute[187185]: 2025-11-29 07:47:24.634 187189 DEBUG nova.network.neutron [-] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:47:24 compute-0 nova_compute[187185]: 2025-11-29 07:47:24.652 187189 INFO nova.compute.manager [-] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Took 1.39 seconds to deallocate network for instance.
Nov 29 07:47:24 compute-0 nova_compute[187185]: 2025-11-29 07:47:24.734 187189 DEBUG oslo_concurrency.lockutils [None req-25404b47-266f-4ef2-9dc5-0387c4eac0f7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:47:24 compute-0 nova_compute[187185]: 2025-11-29 07:47:24.735 187189 DEBUG oslo_concurrency.lockutils [None req-25404b47-266f-4ef2-9dc5-0387c4eac0f7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:47:24 compute-0 nova_compute[187185]: 2025-11-29 07:47:24.806 187189 DEBUG nova.compute.provider_tree [None req-25404b47-266f-4ef2-9dc5-0387c4eac0f7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:47:24 compute-0 nova_compute[187185]: 2025-11-29 07:47:24.826 187189 DEBUG nova.scheduler.client.report [None req-25404b47-266f-4ef2-9dc5-0387c4eac0f7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:47:24 compute-0 nova_compute[187185]: 2025-11-29 07:47:24.847 187189 DEBUG oslo_concurrency.lockutils [None req-25404b47-266f-4ef2-9dc5-0387c4eac0f7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:47:24 compute-0 nova_compute[187185]: 2025-11-29 07:47:24.878 187189 INFO nova.scheduler.client.report [None req-25404b47-266f-4ef2-9dc5-0387c4eac0f7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Deleted allocations for instance 45154ba2-ac58-4c10-a3ff-b0290dae3c8d
Nov 29 07:47:24 compute-0 nova_compute[187185]: 2025-11-29 07:47:24.891 187189 DEBUG nova.network.neutron [req-ed9fd159-953c-433a-8078-53c4849e3cfd req-9d7fdc7d-f2bd-4b56-aef7-66f89becdd91 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Updated VIF entry in instance network info cache for port 231b4077-66e5-463d-9600-7e94d305692d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:47:24 compute-0 nova_compute[187185]: 2025-11-29 07:47:24.893 187189 DEBUG nova.network.neutron [req-ed9fd159-953c-433a-8078-53c4849e3cfd req-9d7fdc7d-f2bd-4b56-aef7-66f89becdd91 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Updating instance_info_cache with network_info: [{"id": "231b4077-66e5-463d-9600-7e94d305692d", "address": "fa:16:3e:82:46:2d", "network": {"id": "7b412a37-c227-42ad-9fca-23287613486a", "bridge": "br-int", "label": "tempest-network-smoke--2027641750", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe82:462d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap231b4077-66", "ovs_interfaceid": "231b4077-66e5-463d-9600-7e94d305692d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:47:24 compute-0 nova_compute[187185]: 2025-11-29 07:47:24.927 187189 DEBUG oslo_concurrency.lockutils [req-ed9fd159-953c-433a-8078-53c4849e3cfd req-9d7fdc7d-f2bd-4b56-aef7-66f89becdd91 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-45154ba2-ac58-4c10-a3ff-b0290dae3c8d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:47:24 compute-0 nova_compute[187185]: 2025-11-29 07:47:24.959 187189 DEBUG oslo_concurrency.lockutils [None req-25404b47-266f-4ef2-9dc5-0387c4eac0f7 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "45154ba2-ac58-4c10-a3ff-b0290dae3c8d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.323s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:47:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:47:25.746 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:47:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:47:25.748 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:47:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:47:25.749 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:47:25 compute-0 nova_compute[187185]: 2025-11-29 07:47:25.943 187189 DEBUG nova.compute.manager [req-1cc651d9-1f80-4b4c-8344-36a9bbf6c862 req-15198be3-b52b-46a1-892e-c042cf4dcafe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Received event network-vif-plugged-231b4077-66e5-463d-9600-7e94d305692d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:47:25 compute-0 nova_compute[187185]: 2025-11-29 07:47:25.943 187189 DEBUG oslo_concurrency.lockutils [req-1cc651d9-1f80-4b4c-8344-36a9bbf6c862 req-15198be3-b52b-46a1-892e-c042cf4dcafe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "45154ba2-ac58-4c10-a3ff-b0290dae3c8d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:47:25 compute-0 nova_compute[187185]: 2025-11-29 07:47:25.944 187189 DEBUG oslo_concurrency.lockutils [req-1cc651d9-1f80-4b4c-8344-36a9bbf6c862 req-15198be3-b52b-46a1-892e-c042cf4dcafe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "45154ba2-ac58-4c10-a3ff-b0290dae3c8d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:47:25 compute-0 nova_compute[187185]: 2025-11-29 07:47:25.944 187189 DEBUG oslo_concurrency.lockutils [req-1cc651d9-1f80-4b4c-8344-36a9bbf6c862 req-15198be3-b52b-46a1-892e-c042cf4dcafe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "45154ba2-ac58-4c10-a3ff-b0290dae3c8d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:47:25 compute-0 nova_compute[187185]: 2025-11-29 07:47:25.945 187189 DEBUG nova.compute.manager [req-1cc651d9-1f80-4b4c-8344-36a9bbf6c862 req-15198be3-b52b-46a1-892e-c042cf4dcafe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] No waiting events found dispatching network-vif-plugged-231b4077-66e5-463d-9600-7e94d305692d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:47:25 compute-0 nova_compute[187185]: 2025-11-29 07:47:25.945 187189 WARNING nova.compute.manager [req-1cc651d9-1f80-4b4c-8344-36a9bbf6c862 req-15198be3-b52b-46a1-892e-c042cf4dcafe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Received unexpected event network-vif-plugged-231b4077-66e5-463d-9600-7e94d305692d for instance with vm_state deleted and task_state None.
Nov 29 07:47:25 compute-0 nova_compute[187185]: 2025-11-29 07:47:25.946 187189 DEBUG nova.compute.manager [req-1cc651d9-1f80-4b4c-8344-36a9bbf6c862 req-15198be3-b52b-46a1-892e-c042cf4dcafe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Received event network-vif-deleted-231b4077-66e5-463d-9600-7e94d305692d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:47:25 compute-0 nova_compute[187185]: 2025-11-29 07:47:25.946 187189 INFO nova.compute.manager [req-1cc651d9-1f80-4b4c-8344-36a9bbf6c862 req-15198be3-b52b-46a1-892e-c042cf4dcafe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Neutron deleted interface 231b4077-66e5-463d-9600-7e94d305692d; detaching it from the instance and deleting it from the info cache
Nov 29 07:47:25 compute-0 nova_compute[187185]: 2025-11-29 07:47:25.947 187189 DEBUG nova.network.neutron [req-1cc651d9-1f80-4b4c-8344-36a9bbf6c862 req-15198be3-b52b-46a1-892e-c042cf4dcafe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106
Nov 29 07:47:25 compute-0 nova_compute[187185]: 2025-11-29 07:47:25.951 187189 DEBUG nova.compute.manager [req-1cc651d9-1f80-4b4c-8344-36a9bbf6c862 req-15198be3-b52b-46a1-892e-c042cf4dcafe 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Detach interface failed, port_id=231b4077-66e5-463d-9600-7e94d305692d, reason: Instance 45154ba2-ac58-4c10-a3ff-b0290dae3c8d could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Nov 29 07:47:27 compute-0 sshd-session[245537]: error: kex_exchange_identification: read: Connection timed out
Nov 29 07:47:27 compute-0 sshd-session[245537]: banner exchange: Connection from 115.190.187.93 port 58656: Connection timed out
Nov 29 07:47:28 compute-0 nova_compute[187185]: 2025-11-29 07:47:28.026 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:47:28 compute-0 nova_compute[187185]: 2025-11-29 07:47:28.057 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:47:29 compute-0 sshd-session[245658]: Invalid user odoo from 115.190.136.184 port 18232
Nov 29 07:47:29 compute-0 podman[245660]: 2025-11-29 07:47:29.292730218 +0000 UTC m=+0.071111985 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:47:29 compute-0 nova_compute[187185]: 2025-11-29 07:47:29.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:47:29 compute-0 podman[245661]: 2025-11-29 07:47:29.322104634 +0000 UTC m=+0.091869576 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, release=1755695350, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., name=ubi9-minimal, config_id=edpm)
Nov 29 07:47:29 compute-0 podman[245662]: 2025-11-29 07:47:29.347772885 +0000 UTC m=+0.112982017 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 07:47:29 compute-0 sshd-session[245658]: Received disconnect from 115.190.136.184 port 18232:11: Bye Bye [preauth]
Nov 29 07:47:29 compute-0 sshd-session[245658]: Disconnected from invalid user odoo 115.190.136.184 port 18232 [preauth]
Nov 29 07:47:29 compute-0 sshd-session[245638]: Connection closed by 45.78.219.119 port 52472 [preauth]
Nov 29 07:47:32 compute-0 nova_compute[187185]: 2025-11-29 07:47:32.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:47:32 compute-0 nova_compute[187185]: 2025-11-29 07:47:32.318 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:47:32 compute-0 nova_compute[187185]: 2025-11-29 07:47:32.340 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:47:32 compute-0 nova_compute[187185]: 2025-11-29 07:47:32.340 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:47:32 compute-0 nova_compute[187185]: 2025-11-29 07:47:32.340 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:47:32 compute-0 nova_compute[187185]: 2025-11-29 07:47:32.341 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:47:32 compute-0 nova_compute[187185]: 2025-11-29 07:47:32.519 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:47:32 compute-0 nova_compute[187185]: 2025-11-29 07:47:32.521 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5719MB free_disk=73.25361633300781GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:47:32 compute-0 nova_compute[187185]: 2025-11-29 07:47:32.534 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:47:32 compute-0 nova_compute[187185]: 2025-11-29 07:47:32.534 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:47:32 compute-0 nova_compute[187185]: 2025-11-29 07:47:32.842 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:47:32 compute-0 nova_compute[187185]: 2025-11-29 07:47:32.843 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:47:32 compute-0 nova_compute[187185]: 2025-11-29 07:47:32.865 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:47:32 compute-0 nova_compute[187185]: 2025-11-29 07:47:32.888 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:47:32 compute-0 nova_compute[187185]: 2025-11-29 07:47:32.911 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:47:32 compute-0 nova_compute[187185]: 2025-11-29 07:47:32.912 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.377s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:47:33 compute-0 nova_compute[187185]: 2025-11-29 07:47:33.030 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:47:33 compute-0 nova_compute[187185]: 2025-11-29 07:47:33.059 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:47:35 compute-0 nova_compute[187185]: 2025-11-29 07:47:35.889 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:47:35 compute-0 nova_compute[187185]: 2025-11-29 07:47:35.915 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:47:36 compute-0 nova_compute[187185]: 2025-11-29 07:47:36.064 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:47:36 compute-0 nova_compute[187185]: 2025-11-29 07:47:36.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:47:36 compute-0 nova_compute[187185]: 2025-11-29 07:47:36.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:47:36 compute-0 nova_compute[187185]: 2025-11-29 07:47:36.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:47:36 compute-0 nova_compute[187185]: 2025-11-29 07:47:36.345 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 07:47:36 compute-0 nova_compute[187185]: 2025-11-29 07:47:36.345 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:47:36 compute-0 nova_compute[187185]: 2025-11-29 07:47:36.346 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:47:37 compute-0 podman[245724]: 2025-11-29 07:47:37.843212806 +0000 UTC m=+0.105436202 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:47:37 compute-0 nova_compute[187185]: 2025-11-29 07:47:37.951 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764402442.9504383, 45154ba2-ac58-4c10-a3ff-b0290dae3c8d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:47:37 compute-0 nova_compute[187185]: 2025-11-29 07:47:37.952 187189 INFO nova.compute.manager [-] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] VM Stopped (Lifecycle Event)
Nov 29 07:47:38 compute-0 nova_compute[187185]: 2025-11-29 07:47:38.033 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:47:38 compute-0 nova_compute[187185]: 2025-11-29 07:47:38.061 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:47:38 compute-0 nova_compute[187185]: 2025-11-29 07:47:38.069 187189 DEBUG nova.compute.manager [None req-02d4fc05-7120-4688-909b-50862ee569e9 - - - - - -] [instance: 45154ba2-ac58-4c10-a3ff-b0290dae3c8d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:47:39 compute-0 nova_compute[187185]: 2025-11-29 07:47:39.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:47:41 compute-0 nova_compute[187185]: 2025-11-29 07:47:41.310 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:47:41 compute-0 nova_compute[187185]: 2025-11-29 07:47:41.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:47:42 compute-0 nova_compute[187185]: 2025-11-29 07:47:42.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:47:42 compute-0 nova_compute[187185]: 2025-11-29 07:47:42.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 29 07:47:43 compute-0 nova_compute[187185]: 2025-11-29 07:47:43.036 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:47:43 compute-0 nova_compute[187185]: 2025-11-29 07:47:43.065 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:47:46 compute-0 podman[245752]: 2025-11-29 07:47:46.831933726 +0000 UTC m=+0.085374550 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 07:47:48 compute-0 nova_compute[187185]: 2025-11-29 07:47:48.040 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:47:48 compute-0 nova_compute[187185]: 2025-11-29 07:47:48.066 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:47:49 compute-0 sshd-session[245777]: Invalid user saas from 190.181.27.27 port 33978
Nov 29 07:47:49 compute-0 podman[245780]: 2025-11-29 07:47:49.123090547 +0000 UTC m=+0.064216309 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute)
Nov 29 07:47:49 compute-0 podman[245779]: 2025-11-29 07:47:49.123921521 +0000 UTC m=+0.067930165 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true)
Nov 29 07:47:49 compute-0 sshd-session[245777]: Received disconnect from 190.181.27.27 port 33978:11: Bye Bye [preauth]
Nov 29 07:47:49 compute-0 sshd-session[245777]: Disconnected from invalid user saas 190.181.27.27 port 33978 [preauth]
Nov 29 07:47:53 compute-0 nova_compute[187185]: 2025-11-29 07:47:53.045 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:47:53 compute-0 nova_compute[187185]: 2025-11-29 07:47:53.069 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:47:56 compute-0 sshd-session[245818]: Invalid user kiosk from 115.190.136.184 port 16822
Nov 29 07:47:56 compute-0 sshd-session[245818]: Received disconnect from 115.190.136.184 port 16822:11: Bye Bye [preauth]
Nov 29 07:47:56 compute-0 sshd-session[245818]: Disconnected from invalid user kiosk 115.190.136.184 port 16822 [preauth]
Nov 29 07:47:58 compute-0 nova_compute[187185]: 2025-11-29 07:47:58.050 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:47:58 compute-0 nova_compute[187185]: 2025-11-29 07:47:58.071 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:47:59 compute-0 podman[245820]: 2025-11-29 07:47:59.799240056 +0000 UTC m=+0.064956951 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:47:59 compute-0 podman[245822]: 2025-11-29 07:47:59.8021932 +0000 UTC m=+0.058994891 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 07:47:59 compute-0 podman[245821]: 2025-11-29 07:47:59.815771117 +0000 UTC m=+0.071430925 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, vcs-type=git, architecture=x86_64)
Nov 29 07:48:03 compute-0 nova_compute[187185]: 2025-11-29 07:48:03.053 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:03 compute-0 nova_compute[187185]: 2025-11-29 07:48:03.073 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:08 compute-0 nova_compute[187185]: 2025-11-29 07:48:08.058 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:08 compute-0 nova_compute[187185]: 2025-11-29 07:48:08.076 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:08 compute-0 nova_compute[187185]: 2025-11-29 07:48:08.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:48:08 compute-0 podman[245878]: 2025-11-29 07:48:08.86961552 +0000 UTC m=+0.131527924 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller)
Nov 29 07:48:13 compute-0 nova_compute[187185]: 2025-11-29 07:48:13.078 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:48:13 compute-0 nova_compute[187185]: 2025-11-29 07:48:13.080 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:48:13 compute-0 nova_compute[187185]: 2025-11-29 07:48:13.080 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 29 07:48:13 compute-0 nova_compute[187185]: 2025-11-29 07:48:13.081 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 29 07:48:13 compute-0 nova_compute[187185]: 2025-11-29 07:48:13.111 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:13 compute-0 nova_compute[187185]: 2025-11-29 07:48:13.113 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 29 07:48:17 compute-0 nova_compute[187185]: 2025-11-29 07:48:17.544 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:48:17 compute-0 nova_compute[187185]: 2025-11-29 07:48:17.544 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 29 07:48:17 compute-0 podman[245905]: 2025-11-29 07:48:17.807400232 +0000 UTC m=+0.068228783 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 07:48:18 compute-0 nova_compute[187185]: 2025-11-29 07:48:18.064 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 29 07:48:18 compute-0 nova_compute[187185]: 2025-11-29 07:48:18.113 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:18 compute-0 sshd[128727]: Timeout before authentication for connection from 115.190.136.184 to 38.102.83.110, pid = 245069
Nov 29 07:48:19 compute-0 podman[245931]: 2025-11-29 07:48:19.799972153 +0000 UTC m=+0.064503657 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=multipathd)
Nov 29 07:48:19 compute-0 podman[245932]: 2025-11-29 07:48:19.802968688 +0000 UTC m=+0.063230161 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 29 07:48:20 compute-0 nova_compute[187185]: 2025-11-29 07:48:20.383 187189 DEBUG oslo_concurrency.lockutils [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Acquiring lock "f18e8e64-a0ef-4c29-85d0-955b86872379" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:48:20 compute-0 nova_compute[187185]: 2025-11-29 07:48:20.384 187189 DEBUG oslo_concurrency.lockutils [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "f18e8e64-a0ef-4c29-85d0-955b86872379" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:48:20 compute-0 nova_compute[187185]: 2025-11-29 07:48:20.400 187189 DEBUG nova.compute.manager [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 07:48:20 compute-0 nova_compute[187185]: 2025-11-29 07:48:20.540 187189 DEBUG oslo_concurrency.lockutils [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:48:20 compute-0 nova_compute[187185]: 2025-11-29 07:48:20.540 187189 DEBUG oslo_concurrency.lockutils [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:48:20 compute-0 nova_compute[187185]: 2025-11-29 07:48:20.553 187189 DEBUG nova.virt.hardware [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 07:48:20 compute-0 nova_compute[187185]: 2025-11-29 07:48:20.554 187189 INFO nova.compute.claims [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Claim successful on node compute-0.ctlplane.example.com
Nov 29 07:48:20 compute-0 nova_compute[187185]: 2025-11-29 07:48:20.784 187189 DEBUG nova.compute.provider_tree [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:48:20 compute-0 nova_compute[187185]: 2025-11-29 07:48:20.812 187189 DEBUG nova.scheduler.client.report [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:48:20 compute-0 nova_compute[187185]: 2025-11-29 07:48:20.836 187189 DEBUG oslo_concurrency.lockutils [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.295s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:48:20 compute-0 nova_compute[187185]: 2025-11-29 07:48:20.837 187189 DEBUG nova.compute.manager [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 07:48:20 compute-0 ovn_controller[95281]: 2025-11-29T07:48:20Z|00566|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 29 07:48:20 compute-0 nova_compute[187185]: 2025-11-29 07:48:20.919 187189 DEBUG nova.compute.manager [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 07:48:20 compute-0 nova_compute[187185]: 2025-11-29 07:48:20.919 187189 DEBUG nova.network.neutron [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 07:48:20 compute-0 nova_compute[187185]: 2025-11-29 07:48:20.949 187189 INFO nova.virt.libvirt.driver [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 07:48:20 compute-0 nova_compute[187185]: 2025-11-29 07:48:20.968 187189 DEBUG nova.compute.manager [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 07:48:21 compute-0 nova_compute[187185]: 2025-11-29 07:48:21.096 187189 DEBUG nova.compute.manager [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 07:48:21 compute-0 nova_compute[187185]: 2025-11-29 07:48:21.098 187189 DEBUG nova.virt.libvirt.driver [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 07:48:21 compute-0 nova_compute[187185]: 2025-11-29 07:48:21.098 187189 INFO nova.virt.libvirt.driver [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Creating image(s)
Nov 29 07:48:21 compute-0 nova_compute[187185]: 2025-11-29 07:48:21.099 187189 DEBUG oslo_concurrency.lockutils [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Acquiring lock "/var/lib/nova/instances/f18e8e64-a0ef-4c29-85d0-955b86872379/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:48:21 compute-0 nova_compute[187185]: 2025-11-29 07:48:21.099 187189 DEBUG oslo_concurrency.lockutils [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "/var/lib/nova/instances/f18e8e64-a0ef-4c29-85d0-955b86872379/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:48:21 compute-0 nova_compute[187185]: 2025-11-29 07:48:21.100 187189 DEBUG oslo_concurrency.lockutils [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "/var/lib/nova/instances/f18e8e64-a0ef-4c29-85d0-955b86872379/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:48:21 compute-0 nova_compute[187185]: 2025-11-29 07:48:21.114 187189 DEBUG oslo_concurrency.processutils [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:48:21 compute-0 nova_compute[187185]: 2025-11-29 07:48:21.219 187189 DEBUG oslo_concurrency.processutils [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.105s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:48:21 compute-0 nova_compute[187185]: 2025-11-29 07:48:21.221 187189 DEBUG oslo_concurrency.lockutils [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:48:21 compute-0 nova_compute[187185]: 2025-11-29 07:48:21.222 187189 DEBUG oslo_concurrency.lockutils [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:48:21 compute-0 nova_compute[187185]: 2025-11-29 07:48:21.250 187189 DEBUG oslo_concurrency.processutils [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:48:21 compute-0 nova_compute[187185]: 2025-11-29 07:48:21.313 187189 DEBUG oslo_concurrency.processutils [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:48:21 compute-0 nova_compute[187185]: 2025-11-29 07:48:21.319 187189 DEBUG oslo_concurrency.processutils [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/f18e8e64-a0ef-4c29-85d0-955b86872379/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:48:21 compute-0 nova_compute[187185]: 2025-11-29 07:48:21.583 187189 DEBUG oslo_concurrency.processutils [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/f18e8e64-a0ef-4c29-85d0-955b86872379/disk 1073741824" returned: 0 in 0.264s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:48:21 compute-0 nova_compute[187185]: 2025-11-29 07:48:21.585 187189 DEBUG oslo_concurrency.lockutils [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.362s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:48:21 compute-0 nova_compute[187185]: 2025-11-29 07:48:21.585 187189 DEBUG oslo_concurrency.processutils [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:48:21 compute-0 nova_compute[187185]: 2025-11-29 07:48:21.686 187189 DEBUG oslo_concurrency.processutils [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:48:21 compute-0 nova_compute[187185]: 2025-11-29 07:48:21.688 187189 DEBUG nova.virt.disk.api [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Checking if we can resize image /var/lib/nova/instances/f18e8e64-a0ef-4c29-85d0-955b86872379/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:48:21 compute-0 nova_compute[187185]: 2025-11-29 07:48:21.688 187189 DEBUG oslo_concurrency.processutils [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f18e8e64-a0ef-4c29-85d0-955b86872379/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:48:21 compute-0 sshd[128727]: drop connection #1 from [115.190.136.184]:55966 on [38.102.83.110]:22 penalty: exceeded LoginGraceTime
Nov 29 07:48:21 compute-0 nova_compute[187185]: 2025-11-29 07:48:21.773 187189 DEBUG oslo_concurrency.processutils [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f18e8e64-a0ef-4c29-85d0-955b86872379/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:48:21 compute-0 nova_compute[187185]: 2025-11-29 07:48:21.774 187189 DEBUG nova.virt.disk.api [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Cannot resize image /var/lib/nova/instances/f18e8e64-a0ef-4c29-85d0-955b86872379/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:48:21 compute-0 nova_compute[187185]: 2025-11-29 07:48:21.775 187189 DEBUG nova.objects.instance [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lazy-loading 'migration_context' on Instance uuid f18e8e64-a0ef-4c29-85d0-955b86872379 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:48:21 compute-0 nova_compute[187185]: 2025-11-29 07:48:21.787 187189 DEBUG nova.virt.libvirt.driver [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 07:48:21 compute-0 nova_compute[187185]: 2025-11-29 07:48:21.788 187189 DEBUG nova.virt.libvirt.driver [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Ensure instance console log exists: /var/lib/nova/instances/f18e8e64-a0ef-4c29-85d0-955b86872379/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 07:48:21 compute-0 nova_compute[187185]: 2025-11-29 07:48:21.788 187189 DEBUG oslo_concurrency.lockutils [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:48:21 compute-0 nova_compute[187185]: 2025-11-29 07:48:21.789 187189 DEBUG oslo_concurrency.lockutils [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:48:21 compute-0 nova_compute[187185]: 2025-11-29 07:48:21.789 187189 DEBUG oslo_concurrency.lockutils [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:48:22 compute-0 sshd-session[245904]: Connection closed by 115.190.187.93 port 48892 [preauth]
Nov 29 07:48:22 compute-0 nova_compute[187185]: 2025-11-29 07:48:22.936 187189 DEBUG nova.network.neutron [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Successfully created port: 462e1ede-b054-4653-9f4d-b136bdab7915 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 07:48:23 compute-0 nova_compute[187185]: 2025-11-29 07:48:23.114 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:48:23 compute-0 nova_compute[187185]: 2025-11-29 07:48:23.116 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:48:23 compute-0 nova_compute[187185]: 2025-11-29 07:48:23.117 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Nov 29 07:48:23 compute-0 nova_compute[187185]: 2025-11-29 07:48:23.117 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 29 07:48:23 compute-0 nova_compute[187185]: 2025-11-29 07:48:23.170 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:23 compute-0 nova_compute[187185]: 2025-11-29 07:48:23.172 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 29 07:48:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:24.512 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=67, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=66) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:48:24 compute-0 nova_compute[187185]: 2025-11-29 07:48:24.512 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:24.514 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:48:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:24.515 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '67'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:48:24 compute-0 nova_compute[187185]: 2025-11-29 07:48:24.526 187189 DEBUG nova.network.neutron [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Successfully updated port: 462e1ede-b054-4653-9f4d-b136bdab7915 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 07:48:24 compute-0 nova_compute[187185]: 2025-11-29 07:48:24.555 187189 DEBUG oslo_concurrency.lockutils [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Acquiring lock "refresh_cache-f18e8e64-a0ef-4c29-85d0-955b86872379" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:48:24 compute-0 nova_compute[187185]: 2025-11-29 07:48:24.556 187189 DEBUG oslo_concurrency.lockutils [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Acquired lock "refresh_cache-f18e8e64-a0ef-4c29-85d0-955b86872379" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:48:24 compute-0 nova_compute[187185]: 2025-11-29 07:48:24.556 187189 DEBUG nova.network.neutron [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:48:24 compute-0 nova_compute[187185]: 2025-11-29 07:48:24.675 187189 DEBUG nova.compute.manager [req-d133c22b-8afc-43e9-990f-271651d5d037 req-e1e93d86-934a-4293-a8f1-a0c262e15843 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Received event network-changed-462e1ede-b054-4653-9f4d-b136bdab7915 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:48:24 compute-0 nova_compute[187185]: 2025-11-29 07:48:24.676 187189 DEBUG nova.compute.manager [req-d133c22b-8afc-43e9-990f-271651d5d037 req-e1e93d86-934a-4293-a8f1-a0c262e15843 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Refreshing instance network info cache due to event network-changed-462e1ede-b054-4653-9f4d-b136bdab7915. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:48:24 compute-0 nova_compute[187185]: 2025-11-29 07:48:24.676 187189 DEBUG oslo_concurrency.lockutils [req-d133c22b-8afc-43e9-990f-271651d5d037 req-e1e93d86-934a-4293-a8f1-a0c262e15843 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-f18e8e64-a0ef-4c29-85d0-955b86872379" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:48:24 compute-0 nova_compute[187185]: 2025-11-29 07:48:24.820 187189 DEBUG nova.network.neutron [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:48:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:25.748 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:48:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:25.749 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:48:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:25.749 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:48:28 compute-0 nova_compute[187185]: 2025-11-29 07:48:28.172 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:28 compute-0 nova_compute[187185]: 2025-11-29 07:48:28.175 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:48:28 compute-0 nova_compute[187185]: 2025-11-29 07:48:28.775 187189 DEBUG nova.network.neutron [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Updating instance_info_cache with network_info: [{"id": "462e1ede-b054-4653-9f4d-b136bdab7915", "address": "fa:16:3e:c0:77:28", "network": {"id": "7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d", "bridge": "br-int", "label": "tempest-TestServerMultinode-1724511624-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff00c41537a48ac87c3d3460e6ff7e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap462e1ede-b0", "ovs_interfaceid": "462e1ede-b054-4653-9f4d-b136bdab7915", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:48:28 compute-0 nova_compute[187185]: 2025-11-29 07:48:28.976 187189 DEBUG oslo_concurrency.lockutils [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Releasing lock "refresh_cache-f18e8e64-a0ef-4c29-85d0-955b86872379" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:48:28 compute-0 nova_compute[187185]: 2025-11-29 07:48:28.977 187189 DEBUG nova.compute.manager [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Instance network_info: |[{"id": "462e1ede-b054-4653-9f4d-b136bdab7915", "address": "fa:16:3e:c0:77:28", "network": {"id": "7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d", "bridge": "br-int", "label": "tempest-TestServerMultinode-1724511624-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff00c41537a48ac87c3d3460e6ff7e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap462e1ede-b0", "ovs_interfaceid": "462e1ede-b054-4653-9f4d-b136bdab7915", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 07:48:28 compute-0 nova_compute[187185]: 2025-11-29 07:48:28.978 187189 DEBUG oslo_concurrency.lockutils [req-d133c22b-8afc-43e9-990f-271651d5d037 req-e1e93d86-934a-4293-a8f1-a0c262e15843 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-f18e8e64-a0ef-4c29-85d0-955b86872379" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:48:28 compute-0 nova_compute[187185]: 2025-11-29 07:48:28.979 187189 DEBUG nova.network.neutron [req-d133c22b-8afc-43e9-990f-271651d5d037 req-e1e93d86-934a-4293-a8f1-a0c262e15843 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Refreshing network info cache for port 462e1ede-b054-4653-9f4d-b136bdab7915 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:48:28 compute-0 nova_compute[187185]: 2025-11-29 07:48:28.984 187189 DEBUG nova.virt.libvirt.driver [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Start _get_guest_xml network_info=[{"id": "462e1ede-b054-4653-9f4d-b136bdab7915", "address": "fa:16:3e:c0:77:28", "network": {"id": "7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d", "bridge": "br-int", "label": "tempest-TestServerMultinode-1724511624-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff00c41537a48ac87c3d3460e6ff7e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap462e1ede-b0", "ovs_interfaceid": "462e1ede-b054-4653-9f4d-b136bdab7915", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:48:28 compute-0 nova_compute[187185]: 2025-11-29 07:48:28.992 187189 WARNING nova.virt.libvirt.driver [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:48:28 compute-0 nova_compute[187185]: 2025-11-29 07:48:28.999 187189 DEBUG nova.virt.libvirt.host [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:48:29 compute-0 nova_compute[187185]: 2025-11-29 07:48:29.000 187189 DEBUG nova.virt.libvirt.host [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:48:29 compute-0 nova_compute[187185]: 2025-11-29 07:48:29.010 187189 DEBUG nova.virt.libvirt.host [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:48:29 compute-0 nova_compute[187185]: 2025-11-29 07:48:29.011 187189 DEBUG nova.virt.libvirt.host [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:48:29 compute-0 nova_compute[187185]: 2025-11-29 07:48:29.013 187189 DEBUG nova.virt.libvirt.driver [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:48:29 compute-0 nova_compute[187185]: 2025-11-29 07:48:29.013 187189 DEBUG nova.virt.hardware [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:48:29 compute-0 nova_compute[187185]: 2025-11-29 07:48:29.014 187189 DEBUG nova.virt.hardware [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:48:29 compute-0 nova_compute[187185]: 2025-11-29 07:48:29.014 187189 DEBUG nova.virt.hardware [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:48:29 compute-0 nova_compute[187185]: 2025-11-29 07:48:29.015 187189 DEBUG nova.virt.hardware [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:48:29 compute-0 nova_compute[187185]: 2025-11-29 07:48:29.015 187189 DEBUG nova.virt.hardware [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:48:29 compute-0 nova_compute[187185]: 2025-11-29 07:48:29.015 187189 DEBUG nova.virt.hardware [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:48:29 compute-0 nova_compute[187185]: 2025-11-29 07:48:29.016 187189 DEBUG nova.virt.hardware [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:48:29 compute-0 nova_compute[187185]: 2025-11-29 07:48:29.016 187189 DEBUG nova.virt.hardware [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:48:29 compute-0 nova_compute[187185]: 2025-11-29 07:48:29.017 187189 DEBUG nova.virt.hardware [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:48:29 compute-0 nova_compute[187185]: 2025-11-29 07:48:29.017 187189 DEBUG nova.virt.hardware [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:48:29 compute-0 nova_compute[187185]: 2025-11-29 07:48:29.017 187189 DEBUG nova.virt.hardware [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:48:29 compute-0 nova_compute[187185]: 2025-11-29 07:48:29.022 187189 DEBUG nova.virt.libvirt.vif [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:48:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1299020398',display_name='tempest-TestServerMultinode-server-1299020398',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testservermultinode-server-1299020398',id=170,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='220340bd80db4bf5af391eb2e4247a6c',ramdisk_id='',reservation_id='r-t0h2676a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-521650901',owner_user_name='tempest-TestServerMultinode-521650901-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:48:21Z,user_data=None,user_id='b79809b822b248ae8be15d0233f5896e',uuid=f18e8e64-a0ef-4c29-85d0-955b86872379,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "462e1ede-b054-4653-9f4d-b136bdab7915", "address": "fa:16:3e:c0:77:28", "network": {"id": "7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d", "bridge": "br-int", "label": "tempest-TestServerMultinode-1724511624-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff00c41537a48ac87c3d3460e6ff7e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap462e1ede-b0", "ovs_interfaceid": "462e1ede-b054-4653-9f4d-b136bdab7915", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:48:29 compute-0 nova_compute[187185]: 2025-11-29 07:48:29.023 187189 DEBUG nova.network.os_vif_util [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Converting VIF {"id": "462e1ede-b054-4653-9f4d-b136bdab7915", "address": "fa:16:3e:c0:77:28", "network": {"id": "7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d", "bridge": "br-int", "label": "tempest-TestServerMultinode-1724511624-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff00c41537a48ac87c3d3460e6ff7e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap462e1ede-b0", "ovs_interfaceid": "462e1ede-b054-4653-9f4d-b136bdab7915", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:48:29 compute-0 nova_compute[187185]: 2025-11-29 07:48:29.024 187189 DEBUG nova.network.os_vif_util [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c0:77:28,bridge_name='br-int',has_traffic_filtering=True,id=462e1ede-b054-4653-9f4d-b136bdab7915,network=Network(7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap462e1ede-b0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:48:29 compute-0 nova_compute[187185]: 2025-11-29 07:48:29.025 187189 DEBUG nova.objects.instance [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lazy-loading 'pci_devices' on Instance uuid f18e8e64-a0ef-4c29-85d0-955b86872379 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:48:29 compute-0 nova_compute[187185]: 2025-11-29 07:48:29.171 187189 DEBUG nova.virt.libvirt.driver [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:48:29 compute-0 nova_compute[187185]:   <uuid>f18e8e64-a0ef-4c29-85d0-955b86872379</uuid>
Nov 29 07:48:29 compute-0 nova_compute[187185]:   <name>instance-000000aa</name>
Nov 29 07:48:29 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 07:48:29 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:48:29 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:48:29 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:       <nova:name>tempest-TestServerMultinode-server-1299020398</nova:name>
Nov 29 07:48:29 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:48:28</nova:creationTime>
Nov 29 07:48:29 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 07:48:29 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 07:48:29 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:48:29 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:48:29 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:48:29 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:48:29 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:48:29 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:48:29 compute-0 nova_compute[187185]:         <nova:user uuid="b79809b822b248ae8be15d0233f5896e">tempest-TestServerMultinode-521650901-project-admin</nova:user>
Nov 29 07:48:29 compute-0 nova_compute[187185]:         <nova:project uuid="220340bd80db4bf5af391eb2e4247a6c">tempest-TestServerMultinode-521650901</nova:project>
Nov 29 07:48:29 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:48:29 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 07:48:29 compute-0 nova_compute[187185]:         <nova:port uuid="462e1ede-b054-4653-9f4d-b136bdab7915">
Nov 29 07:48:29 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:48:29 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:48:29 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:48:29 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <system>
Nov 29 07:48:29 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:48:29 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:48:29 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:48:29 compute-0 nova_compute[187185]:       <entry name="serial">f18e8e64-a0ef-4c29-85d0-955b86872379</entry>
Nov 29 07:48:29 compute-0 nova_compute[187185]:       <entry name="uuid">f18e8e64-a0ef-4c29-85d0-955b86872379</entry>
Nov 29 07:48:29 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     </system>
Nov 29 07:48:29 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:48:29 compute-0 nova_compute[187185]:   <os>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:   </os>
Nov 29 07:48:29 compute-0 nova_compute[187185]:   <features>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:   </features>
Nov 29 07:48:29 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:48:29 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:48:29 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:48:29 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/f18e8e64-a0ef-4c29-85d0-955b86872379/disk"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:48:29 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/f18e8e64-a0ef-4c29-85d0-955b86872379/disk.config"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:48:29 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:c0:77:28"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:       <target dev="tap462e1ede-b0"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:48:29 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/f18e8e64-a0ef-4c29-85d0-955b86872379/console.log" append="off"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <video>
Nov 29 07:48:29 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     </video>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:48:29 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:48:29 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:48:29 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:48:29 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:48:29 compute-0 nova_compute[187185]: </domain>
Nov 29 07:48:29 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:48:29 compute-0 nova_compute[187185]: 2025-11-29 07:48:29.172 187189 DEBUG nova.compute.manager [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Preparing to wait for external event network-vif-plugged-462e1ede-b054-4653-9f4d-b136bdab7915 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 07:48:29 compute-0 nova_compute[187185]: 2025-11-29 07:48:29.173 187189 DEBUG oslo_concurrency.lockutils [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Acquiring lock "f18e8e64-a0ef-4c29-85d0-955b86872379-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:48:29 compute-0 nova_compute[187185]: 2025-11-29 07:48:29.173 187189 DEBUG oslo_concurrency.lockutils [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "f18e8e64-a0ef-4c29-85d0-955b86872379-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:48:29 compute-0 nova_compute[187185]: 2025-11-29 07:48:29.174 187189 DEBUG oslo_concurrency.lockutils [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "f18e8e64-a0ef-4c29-85d0-955b86872379-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:48:29 compute-0 nova_compute[187185]: 2025-11-29 07:48:29.174 187189 DEBUG nova.virt.libvirt.vif [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:48:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1299020398',display_name='tempest-TestServerMultinode-server-1299020398',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testservermultinode-server-1299020398',id=170,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='220340bd80db4bf5af391eb2e4247a6c',ramdisk_id='',reservation_id='r-t0h2676a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-521650901',owner_user_name='tempest-TestServerMultinode-521650901-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:48:21Z,user_data=None,user_id='b79809b822b248ae8be15d0233f5896e',uuid=f18e8e64-a0ef-4c29-85d0-955b86872379,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "462e1ede-b054-4653-9f4d-b136bdab7915", "address": "fa:16:3e:c0:77:28", "network": {"id": "7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d", "bridge": "br-int", "label": "tempest-TestServerMultinode-1724511624-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff00c41537a48ac87c3d3460e6ff7e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap462e1ede-b0", "ovs_interfaceid": "462e1ede-b054-4653-9f4d-b136bdab7915", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:48:29 compute-0 nova_compute[187185]: 2025-11-29 07:48:29.175 187189 DEBUG nova.network.os_vif_util [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Converting VIF {"id": "462e1ede-b054-4653-9f4d-b136bdab7915", "address": "fa:16:3e:c0:77:28", "network": {"id": "7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d", "bridge": "br-int", "label": "tempest-TestServerMultinode-1724511624-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff00c41537a48ac87c3d3460e6ff7e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap462e1ede-b0", "ovs_interfaceid": "462e1ede-b054-4653-9f4d-b136bdab7915", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:48:29 compute-0 nova_compute[187185]: 2025-11-29 07:48:29.175 187189 DEBUG nova.network.os_vif_util [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c0:77:28,bridge_name='br-int',has_traffic_filtering=True,id=462e1ede-b054-4653-9f4d-b136bdab7915,network=Network(7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap462e1ede-b0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:48:29 compute-0 nova_compute[187185]: 2025-11-29 07:48:29.176 187189 DEBUG os_vif [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c0:77:28,bridge_name='br-int',has_traffic_filtering=True,id=462e1ede-b054-4653-9f4d-b136bdab7915,network=Network(7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap462e1ede-b0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:48:29 compute-0 nova_compute[187185]: 2025-11-29 07:48:29.176 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:29 compute-0 nova_compute[187185]: 2025-11-29 07:48:29.177 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:48:29 compute-0 nova_compute[187185]: 2025-11-29 07:48:29.177 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:48:29 compute-0 nova_compute[187185]: 2025-11-29 07:48:29.180 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:29 compute-0 nova_compute[187185]: 2025-11-29 07:48:29.180 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap462e1ede-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:48:29 compute-0 nova_compute[187185]: 2025-11-29 07:48:29.181 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap462e1ede-b0, col_values=(('external_ids', {'iface-id': '462e1ede-b054-4653-9f4d-b136bdab7915', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c0:77:28', 'vm-uuid': 'f18e8e64-a0ef-4c29-85d0-955b86872379'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:48:29 compute-0 nova_compute[187185]: 2025-11-29 07:48:29.182 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:29 compute-0 NetworkManager[55227]: <info>  [1764402509.1844] manager: (tap462e1ede-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/295)
Nov 29 07:48:29 compute-0 nova_compute[187185]: 2025-11-29 07:48:29.185 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:48:29 compute-0 nova_compute[187185]: 2025-11-29 07:48:29.192 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:29 compute-0 nova_compute[187185]: 2025-11-29 07:48:29.193 187189 INFO os_vif [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c0:77:28,bridge_name='br-int',has_traffic_filtering=True,id=462e1ede-b054-4653-9f4d-b136bdab7915,network=Network(7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap462e1ede-b0')
Nov 29 07:48:29 compute-0 nova_compute[187185]: 2025-11-29 07:48:29.536 187189 DEBUG nova.virt.libvirt.driver [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:48:29 compute-0 nova_compute[187185]: 2025-11-29 07:48:29.536 187189 DEBUG nova.virt.libvirt.driver [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:48:29 compute-0 nova_compute[187185]: 2025-11-29 07:48:29.537 187189 DEBUG nova.virt.libvirt.driver [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] No VIF found with MAC fa:16:3e:c0:77:28, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:48:29 compute-0 nova_compute[187185]: 2025-11-29 07:48:29.537 187189 INFO nova.virt.libvirt.driver [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Using config drive
Nov 29 07:48:30 compute-0 nova_compute[187185]: 2025-11-29 07:48:30.467 187189 INFO nova.virt.libvirt.driver [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Creating config drive at /var/lib/nova/instances/f18e8e64-a0ef-4c29-85d0-955b86872379/disk.config
Nov 29 07:48:30 compute-0 nova_compute[187185]: 2025-11-29 07:48:30.473 187189 DEBUG oslo_concurrency.processutils [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f18e8e64-a0ef-4c29-85d0-955b86872379/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzamnxzk4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:48:30 compute-0 nova_compute[187185]: 2025-11-29 07:48:30.604 187189 DEBUG oslo_concurrency.processutils [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f18e8e64-a0ef-4c29-85d0-955b86872379/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzamnxzk4" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:48:30 compute-0 kernel: tap462e1ede-b0: entered promiscuous mode
Nov 29 07:48:30 compute-0 NetworkManager[55227]: <info>  [1764402510.7211] manager: (tap462e1ede-b0): new Tun device (/org/freedesktop/NetworkManager/Devices/296)
Nov 29 07:48:30 compute-0 nova_compute[187185]: 2025-11-29 07:48:30.721 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:30 compute-0 nova_compute[187185]: 2025-11-29 07:48:30.724 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:30 compute-0 ovn_controller[95281]: 2025-11-29T07:48:30Z|00567|binding|INFO|Claiming lport 462e1ede-b054-4653-9f4d-b136bdab7915 for this chassis.
Nov 29 07:48:30 compute-0 ovn_controller[95281]: 2025-11-29T07:48:30Z|00568|binding|INFO|462e1ede-b054-4653-9f4d-b136bdab7915: Claiming fa:16:3e:c0:77:28 10.100.0.5
Nov 29 07:48:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:30.753 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c0:77:28 10.100.0.5'], port_security=['fa:16:3e:c0:77:28 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '220340bd80db4bf5af391eb2e4247a6c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7d07af2a-16f6-4fe3-b2a4-ed6b96a38a93', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cbb62e23-e8c7-432f-b445-db50c529fe8e, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=462e1ede-b054-4653-9f4d-b136bdab7915) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:48:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:30.755 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 462e1ede-b054-4653-9f4d-b136bdab7915 in datapath 7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d bound to our chassis
Nov 29 07:48:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:30.758 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d
Nov 29 07:48:30 compute-0 systemd-udevd[246054]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:48:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:30.775 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[df7ff8c8-9b25-486e-bab1-ae770706c0b1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:48:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:30.777 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7fbe5e7f-51 in ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:48:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:30.780 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7fbe5e7f-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:48:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:30.780 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[54bba956-20eb-4f36-b4b6-b7c406ba3dc1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:48:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:30.782 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[94eab528-ecc2-4cac-83d6-b1d6c07d0f4f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:48:30 compute-0 nova_compute[187185]: 2025-11-29 07:48:30.787 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:30 compute-0 podman[246000]: 2025-11-29 07:48:30.790294773 +0000 UTC m=+0.083529838 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 07:48:30 compute-0 ovn_controller[95281]: 2025-11-29T07:48:30Z|00569|binding|INFO|Setting lport 462e1ede-b054-4653-9f4d-b136bdab7915 ovn-installed in OVS
Nov 29 07:48:30 compute-0 ovn_controller[95281]: 2025-11-29T07:48:30Z|00570|binding|INFO|Setting lport 462e1ede-b054-4653-9f4d-b136bdab7915 up in Southbound
Nov 29 07:48:30 compute-0 nova_compute[187185]: 2025-11-29 07:48:30.796 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:30 compute-0 NetworkManager[55227]: <info>  [1764402510.7994] device (tap462e1ede-b0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:48:30 compute-0 NetworkManager[55227]: <info>  [1764402510.8014] device (tap462e1ede-b0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:48:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:30.801 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[7210c3ba-1dd9-43a2-bfa7-1b24fcaef697]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:48:30 compute-0 systemd-machined[153486]: New machine qemu-66-instance-000000aa.
Nov 29 07:48:30 compute-0 podman[245999]: 2025-11-29 07:48:30.813483653 +0000 UTC m=+0.109343713 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, version=9.6, managed_by=edpm_ansible, vcs-type=git, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc.)
Nov 29 07:48:30 compute-0 systemd[1]: Started Virtual Machine qemu-66-instance-000000aa.
Nov 29 07:48:30 compute-0 podman[245998]: 2025-11-29 07:48:30.834672876 +0000 UTC m=+0.130694521 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 07:48:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:30.832 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6e00f33b-24d2-4daa-b131-fb114654a546]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:48:30 compute-0 nova_compute[187185]: 2025-11-29 07:48:30.838 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:48:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:30.866 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[cf17e8fd-23ff-4f89-9e2d-46c3816236b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:48:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:30.871 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d2507267-f0ea-49ed-a8df-3a64de9db156]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:48:30 compute-0 NetworkManager[55227]: <info>  [1764402510.8739] manager: (tap7fbe5e7f-50): new Veth device (/org/freedesktop/NetworkManager/Devices/297)
Nov 29 07:48:30 compute-0 systemd-udevd[246073]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:48:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:30.906 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[66e04555-e3f7-46ad-bdfe-9a2efab10b35]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:48:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:30.910 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[a64cf5c3-b66e-4189-934a-e89ada004a5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:48:30 compute-0 NetworkManager[55227]: <info>  [1764402510.9396] device (tap7fbe5e7f-50): carrier: link connected
Nov 29 07:48:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:30.946 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[7dc56744-1832-4023-9737-b894e1ed4ed8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:48:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:30.968 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[a9fea93f-addc-4bf5-8697-404a0a0d8224]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7fbe5e7f-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:76:bf:c0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 178], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 795860, 'reachable_time': 29605, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246103, 'error': None, 'target': 'ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:48:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:30.988 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[20672755-b4fe-41c5-b046-5ee99456240c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe76:bfc0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 795860, 'tstamp': 795860}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 246104, 'error': None, 'target': 'ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:48:31 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:31.010 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[87abf388-be33-42da-8184-c9df2c384b84]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7fbe5e7f-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:76:bf:c0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 178], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 795860, 'reachable_time': 29605, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 246105, 'error': None, 'target': 'ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:48:31 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:31.052 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[5d2958ba-1d69-439f-a16c-caf9b7413d88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:48:31 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:31.128 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[5bb44628-9bf0-435b-82da-6ea8a0a46e34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:48:31 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:31.130 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7fbe5e7f-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:48:31 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:31.131 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:48:31 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:31.131 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7fbe5e7f-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:48:31 compute-0 nova_compute[187185]: 2025-11-29 07:48:31.134 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:31 compute-0 NetworkManager[55227]: <info>  [1764402511.1352] manager: (tap7fbe5e7f-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/298)
Nov 29 07:48:31 compute-0 kernel: tap7fbe5e7f-50: entered promiscuous mode
Nov 29 07:48:31 compute-0 nova_compute[187185]: 2025-11-29 07:48:31.137 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:31 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:31.138 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7fbe5e7f-50, col_values=(('external_ids', {'iface-id': 'e08502a1-bdde-4e8d-89e4-c05bd265f847'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:48:31 compute-0 nova_compute[187185]: 2025-11-29 07:48:31.139 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:31 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:31.142 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:48:31 compute-0 ovn_controller[95281]: 2025-11-29T07:48:31Z|00571|binding|INFO|Releasing lport e08502a1-bdde-4e8d-89e4-c05bd265f847 from this chassis (sb_readonly=0)
Nov 29 07:48:31 compute-0 nova_compute[187185]: 2025-11-29 07:48:31.143 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:31 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:31.143 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[0a2e8ff1-a7b8-4984-814a-855e02d24e1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:48:31 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:31.145 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:48:31 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:48:31 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:48:31 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d
Nov 29 07:48:31 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:48:31 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:48:31 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:48:31 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d.pid.haproxy
Nov 29 07:48:31 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:48:31 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:48:31 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:48:31 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:48:31 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:48:31 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:48:31 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:48:31 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:48:31 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:48:31 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:48:31 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:48:31 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:48:31 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:48:31 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:48:31 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:48:31 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:48:31 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:48:31 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:48:31 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:48:31 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:48:31 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID 7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d
Nov 29 07:48:31 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:48:31 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:31.146 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d', 'env', 'PROCESS_TAG=haproxy-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:48:31 compute-0 nova_compute[187185]: 2025-11-29 07:48:31.153 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:31 compute-0 nova_compute[187185]: 2025-11-29 07:48:31.204 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764402511.2034817, f18e8e64-a0ef-4c29-85d0-955b86872379 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:48:31 compute-0 nova_compute[187185]: 2025-11-29 07:48:31.205 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] VM Started (Lifecycle Event)
Nov 29 07:48:31 compute-0 nova_compute[187185]: 2025-11-29 07:48:31.235 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:48:31 compute-0 nova_compute[187185]: 2025-11-29 07:48:31.242 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764402511.2036343, f18e8e64-a0ef-4c29-85d0-955b86872379 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:48:31 compute-0 nova_compute[187185]: 2025-11-29 07:48:31.242 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] VM Paused (Lifecycle Event)
Nov 29 07:48:31 compute-0 nova_compute[187185]: 2025-11-29 07:48:31.342 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:48:31 compute-0 nova_compute[187185]: 2025-11-29 07:48:31.347 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:48:31 compute-0 nova_compute[187185]: 2025-11-29 07:48:31.475 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:48:31 compute-0 podman[246144]: 2025-11-29 07:48:31.571968631 +0000 UTC m=+0.054578405 container create 83057635f63208d865cee3182e0a568e7345ab5d2fc595b6210304beb14674f8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 29 07:48:31 compute-0 systemd[1]: Started libpod-conmon-83057635f63208d865cee3182e0a568e7345ab5d2fc595b6210304beb14674f8.scope.
Nov 29 07:48:31 compute-0 podman[246144]: 2025-11-29 07:48:31.542115061 +0000 UTC m=+0.024724855 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:48:31 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:48:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3791371261b83bd234d0aee3a9d289b8ab68200bd8c9b66619176c6a0ac9c8d2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:48:31 compute-0 podman[246144]: 2025-11-29 07:48:31.660263504 +0000 UTC m=+0.142873278 container init 83057635f63208d865cee3182e0a568e7345ab5d2fc595b6210304beb14674f8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 29 07:48:31 compute-0 podman[246144]: 2025-11-29 07:48:31.66714775 +0000 UTC m=+0.149757524 container start 83057635f63208d865cee3182e0a568e7345ab5d2fc595b6210304beb14674f8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:48:31 compute-0 neutron-haproxy-ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d[246159]: [NOTICE]   (246163) : New worker (246165) forked
Nov 29 07:48:31 compute-0 neutron-haproxy-ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d[246159]: [NOTICE]   (246163) : Loading success.
Nov 29 07:48:31 compute-0 nova_compute[187185]: 2025-11-29 07:48:31.994 187189 DEBUG nova.network.neutron [req-d133c22b-8afc-43e9-990f-271651d5d037 req-e1e93d86-934a-4293-a8f1-a0c262e15843 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Updated VIF entry in instance network info cache for port 462e1ede-b054-4653-9f4d-b136bdab7915. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:48:31 compute-0 nova_compute[187185]: 2025-11-29 07:48:31.995 187189 DEBUG nova.network.neutron [req-d133c22b-8afc-43e9-990f-271651d5d037 req-e1e93d86-934a-4293-a8f1-a0c262e15843 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Updating instance_info_cache with network_info: [{"id": "462e1ede-b054-4653-9f4d-b136bdab7915", "address": "fa:16:3e:c0:77:28", "network": {"id": "7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d", "bridge": "br-int", "label": "tempest-TestServerMultinode-1724511624-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff00c41537a48ac87c3d3460e6ff7e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap462e1ede-b0", "ovs_interfaceid": "462e1ede-b054-4653-9f4d-b136bdab7915", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:48:32 compute-0 nova_compute[187185]: 2025-11-29 07:48:32.029 187189 DEBUG oslo_concurrency.lockutils [req-d133c22b-8afc-43e9-990f-271651d5d037 req-e1e93d86-934a-4293-a8f1-a0c262e15843 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-f18e8e64-a0ef-4c29-85d0-955b86872379" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:48:32 compute-0 nova_compute[187185]: 2025-11-29 07:48:32.291 187189 DEBUG nova.compute.manager [req-d1452b54-5306-4268-a32a-cff1cca91298 req-e735f267-fd3d-4887-9af7-55a25866e5d5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Received event network-vif-plugged-462e1ede-b054-4653-9f4d-b136bdab7915 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:48:32 compute-0 nova_compute[187185]: 2025-11-29 07:48:32.291 187189 DEBUG oslo_concurrency.lockutils [req-d1452b54-5306-4268-a32a-cff1cca91298 req-e735f267-fd3d-4887-9af7-55a25866e5d5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "f18e8e64-a0ef-4c29-85d0-955b86872379-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:48:32 compute-0 nova_compute[187185]: 2025-11-29 07:48:32.292 187189 DEBUG oslo_concurrency.lockutils [req-d1452b54-5306-4268-a32a-cff1cca91298 req-e735f267-fd3d-4887-9af7-55a25866e5d5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f18e8e64-a0ef-4c29-85d0-955b86872379-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:48:32 compute-0 nova_compute[187185]: 2025-11-29 07:48:32.292 187189 DEBUG oslo_concurrency.lockutils [req-d1452b54-5306-4268-a32a-cff1cca91298 req-e735f267-fd3d-4887-9af7-55a25866e5d5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f18e8e64-a0ef-4c29-85d0-955b86872379-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:48:32 compute-0 nova_compute[187185]: 2025-11-29 07:48:32.292 187189 DEBUG nova.compute.manager [req-d1452b54-5306-4268-a32a-cff1cca91298 req-e735f267-fd3d-4887-9af7-55a25866e5d5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Processing event network-vif-plugged-462e1ede-b054-4653-9f4d-b136bdab7915 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 07:48:32 compute-0 nova_compute[187185]: 2025-11-29 07:48:32.293 187189 DEBUG nova.compute.manager [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:48:32 compute-0 nova_compute[187185]: 2025-11-29 07:48:32.297 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764402512.2973597, f18e8e64-a0ef-4c29-85d0-955b86872379 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:48:32 compute-0 nova_compute[187185]: 2025-11-29 07:48:32.297 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] VM Resumed (Lifecycle Event)
Nov 29 07:48:32 compute-0 nova_compute[187185]: 2025-11-29 07:48:32.299 187189 DEBUG nova.virt.libvirt.driver [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 07:48:32 compute-0 nova_compute[187185]: 2025-11-29 07:48:32.302 187189 INFO nova.virt.libvirt.driver [-] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Instance spawned successfully.
Nov 29 07:48:32 compute-0 nova_compute[187185]: 2025-11-29 07:48:32.302 187189 DEBUG nova.virt.libvirt.driver [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 07:48:32 compute-0 nova_compute[187185]: 2025-11-29 07:48:32.337 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:48:32 compute-0 nova_compute[187185]: 2025-11-29 07:48:32.342 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:48:32 compute-0 nova_compute[187185]: 2025-11-29 07:48:32.347 187189 DEBUG nova.virt.libvirt.driver [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:48:32 compute-0 nova_compute[187185]: 2025-11-29 07:48:32.347 187189 DEBUG nova.virt.libvirt.driver [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:48:32 compute-0 nova_compute[187185]: 2025-11-29 07:48:32.347 187189 DEBUG nova.virt.libvirt.driver [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:48:32 compute-0 nova_compute[187185]: 2025-11-29 07:48:32.348 187189 DEBUG nova.virt.libvirt.driver [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:48:32 compute-0 nova_compute[187185]: 2025-11-29 07:48:32.348 187189 DEBUG nova.virt.libvirt.driver [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:48:32 compute-0 nova_compute[187185]: 2025-11-29 07:48:32.348 187189 DEBUG nova.virt.libvirt.driver [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:48:32 compute-0 nova_compute[187185]: 2025-11-29 07:48:32.382 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:48:32 compute-0 nova_compute[187185]: 2025-11-29 07:48:32.538 187189 INFO nova.compute.manager [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Took 11.44 seconds to spawn the instance on the hypervisor.
Nov 29 07:48:32 compute-0 nova_compute[187185]: 2025-11-29 07:48:32.539 187189 DEBUG nova.compute.manager [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:48:32 compute-0 nova_compute[187185]: 2025-11-29 07:48:32.736 187189 INFO nova.compute.manager [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Took 12.24 seconds to build instance.
Nov 29 07:48:32 compute-0 nova_compute[187185]: 2025-11-29 07:48:32.757 187189 DEBUG oslo_concurrency.lockutils [None req-c9f25917-d4ac-4a1c-bb07-fb5fe8ba92ae b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "f18e8e64-a0ef-4c29-85d0-955b86872379" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.374s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:48:33 compute-0 nova_compute[187185]: 2025-11-29 07:48:33.230 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:33 compute-0 nova_compute[187185]: 2025-11-29 07:48:33.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:48:33 compute-0 nova_compute[187185]: 2025-11-29 07:48:33.355 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:48:33 compute-0 nova_compute[187185]: 2025-11-29 07:48:33.356 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:48:33 compute-0 nova_compute[187185]: 2025-11-29 07:48:33.357 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:48:33 compute-0 nova_compute[187185]: 2025-11-29 07:48:33.357 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:48:33 compute-0 nova_compute[187185]: 2025-11-29 07:48:33.439 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f18e8e64-a0ef-4c29-85d0-955b86872379/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:48:33 compute-0 nova_compute[187185]: 2025-11-29 07:48:33.540 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f18e8e64-a0ef-4c29-85d0-955b86872379/disk --force-share --output=json" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:48:33 compute-0 nova_compute[187185]: 2025-11-29 07:48:33.541 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f18e8e64-a0ef-4c29-85d0-955b86872379/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:48:33 compute-0 nova_compute[187185]: 2025-11-29 07:48:33.600 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f18e8e64-a0ef-4c29-85d0-955b86872379/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:48:33 compute-0 nova_compute[187185]: 2025-11-29 07:48:33.792 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:48:33 compute-0 nova_compute[187185]: 2025-11-29 07:48:33.794 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5625MB free_disk=73.25278854370117GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:48:33 compute-0 nova_compute[187185]: 2025-11-29 07:48:33.794 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:48:33 compute-0 nova_compute[187185]: 2025-11-29 07:48:33.795 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:48:33 compute-0 nova_compute[187185]: 2025-11-29 07:48:33.883 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance f18e8e64-a0ef-4c29-85d0-955b86872379 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 07:48:33 compute-0 nova_compute[187185]: 2025-11-29 07:48:33.883 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:48:33 compute-0 nova_compute[187185]: 2025-11-29 07:48:33.884 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:48:33 compute-0 nova_compute[187185]: 2025-11-29 07:48:33.922 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:48:33 compute-0 nova_compute[187185]: 2025-11-29 07:48:33.940 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:48:33 compute-0 nova_compute[187185]: 2025-11-29 07:48:33.971 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:48:33 compute-0 nova_compute[187185]: 2025-11-29 07:48:33.972 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:48:34 compute-0 sshd-session[246174]: Invalid user administrator from 20.255.62.58 port 51310
Nov 29 07:48:34 compute-0 nova_compute[187185]: 2025-11-29 07:48:34.184 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:34 compute-0 sshd-session[246174]: Received disconnect from 20.255.62.58 port 51310:11: Bye Bye [preauth]
Nov 29 07:48:34 compute-0 sshd-session[246174]: Disconnected from invalid user administrator 20.255.62.58 port 51310 [preauth]
Nov 29 07:48:34 compute-0 nova_compute[187185]: 2025-11-29 07:48:34.445 187189 DEBUG nova.compute.manager [req-45aaafdc-7f59-4b77-aa6d-ccf0b2641ecf req-9f07ddbe-e07b-4caa-85d1-bbec5da7e122 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Received event network-vif-plugged-462e1ede-b054-4653-9f4d-b136bdab7915 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:48:34 compute-0 nova_compute[187185]: 2025-11-29 07:48:34.446 187189 DEBUG oslo_concurrency.lockutils [req-45aaafdc-7f59-4b77-aa6d-ccf0b2641ecf req-9f07ddbe-e07b-4caa-85d1-bbec5da7e122 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "f18e8e64-a0ef-4c29-85d0-955b86872379-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:48:34 compute-0 nova_compute[187185]: 2025-11-29 07:48:34.447 187189 DEBUG oslo_concurrency.lockutils [req-45aaafdc-7f59-4b77-aa6d-ccf0b2641ecf req-9f07ddbe-e07b-4caa-85d1-bbec5da7e122 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f18e8e64-a0ef-4c29-85d0-955b86872379-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:48:34 compute-0 nova_compute[187185]: 2025-11-29 07:48:34.447 187189 DEBUG oslo_concurrency.lockutils [req-45aaafdc-7f59-4b77-aa6d-ccf0b2641ecf req-9f07ddbe-e07b-4caa-85d1-bbec5da7e122 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f18e8e64-a0ef-4c29-85d0-955b86872379-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:48:34 compute-0 nova_compute[187185]: 2025-11-29 07:48:34.448 187189 DEBUG nova.compute.manager [req-45aaafdc-7f59-4b77-aa6d-ccf0b2641ecf req-9f07ddbe-e07b-4caa-85d1-bbec5da7e122 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] No waiting events found dispatching network-vif-plugged-462e1ede-b054-4653-9f4d-b136bdab7915 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:48:34 compute-0 nova_compute[187185]: 2025-11-29 07:48:34.448 187189 WARNING nova.compute.manager [req-45aaafdc-7f59-4b77-aa6d-ccf0b2641ecf req-9f07ddbe-e07b-4caa-85d1-bbec5da7e122 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Received unexpected event network-vif-plugged-462e1ede-b054-4653-9f4d-b136bdab7915 for instance with vm_state active and task_state None.
Nov 29 07:48:34 compute-0 nova_compute[187185]: 2025-11-29 07:48:34.972 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:48:35 compute-0 nova_compute[187185]: 2025-11-29 07:48:35.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:48:38 compute-0 nova_compute[187185]: 2025-11-29 07:48:38.232 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:38 compute-0 nova_compute[187185]: 2025-11-29 07:48:38.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:48:38 compute-0 nova_compute[187185]: 2025-11-29 07:48:38.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:48:38 compute-0 nova_compute[187185]: 2025-11-29 07:48:38.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:48:38 compute-0 nova_compute[187185]: 2025-11-29 07:48:38.682 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "refresh_cache-f18e8e64-a0ef-4c29-85d0-955b86872379" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:48:38 compute-0 nova_compute[187185]: 2025-11-29 07:48:38.683 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquired lock "refresh_cache-f18e8e64-a0ef-4c29-85d0-955b86872379" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:48:38 compute-0 nova_compute[187185]: 2025-11-29 07:48:38.683 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 29 07:48:38 compute-0 nova_compute[187185]: 2025-11-29 07:48:38.683 187189 DEBUG nova.objects.instance [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f18e8e64-a0ef-4c29-85d0-955b86872379 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:48:39 compute-0 nova_compute[187185]: 2025-11-29 07:48:39.188 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:39 compute-0 podman[246183]: 2025-11-29 07:48:39.875025766 +0000 UTC m=+0.132581003 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 29 07:48:39 compute-0 nova_compute[187185]: 2025-11-29 07:48:39.885 187189 DEBUG oslo_concurrency.lockutils [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "9bd8796c-97a5-491a-b6b4-713222c15142" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:48:39 compute-0 nova_compute[187185]: 2025-11-29 07:48:39.886 187189 DEBUG oslo_concurrency.lockutils [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "9bd8796c-97a5-491a-b6b4-713222c15142" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:48:39 compute-0 nova_compute[187185]: 2025-11-29 07:48:39.901 187189 DEBUG nova.compute.manager [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 07:48:39 compute-0 nova_compute[187185]: 2025-11-29 07:48:39.998 187189 DEBUG oslo_concurrency.lockutils [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:48:39 compute-0 nova_compute[187185]: 2025-11-29 07:48:39.998 187189 DEBUG oslo_concurrency.lockutils [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:48:40 compute-0 nova_compute[187185]: 2025-11-29 07:48:40.006 187189 DEBUG nova.virt.hardware [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 07:48:40 compute-0 nova_compute[187185]: 2025-11-29 07:48:40.007 187189 INFO nova.compute.claims [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Claim successful on node compute-0.ctlplane.example.com
Nov 29 07:48:40 compute-0 nova_compute[187185]: 2025-11-29 07:48:40.213 187189 DEBUG nova.compute.provider_tree [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:48:40 compute-0 nova_compute[187185]: 2025-11-29 07:48:40.232 187189 DEBUG nova.scheduler.client.report [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:48:40 compute-0 nova_compute[187185]: 2025-11-29 07:48:40.258 187189 DEBUG oslo_concurrency.lockutils [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.259s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:48:40 compute-0 nova_compute[187185]: 2025-11-29 07:48:40.259 187189 DEBUG nova.compute.manager [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 07:48:40 compute-0 nova_compute[187185]: 2025-11-29 07:48:40.324 187189 DEBUG nova.compute.manager [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 07:48:40 compute-0 nova_compute[187185]: 2025-11-29 07:48:40.325 187189 DEBUG nova.network.neutron [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 07:48:40 compute-0 nova_compute[187185]: 2025-11-29 07:48:40.346 187189 INFO nova.virt.libvirt.driver [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 07:48:40 compute-0 nova_compute[187185]: 2025-11-29 07:48:40.364 187189 DEBUG nova.compute.manager [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 07:48:40 compute-0 nova_compute[187185]: 2025-11-29 07:48:40.475 187189 DEBUG nova.compute.manager [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 07:48:40 compute-0 nova_compute[187185]: 2025-11-29 07:48:40.477 187189 DEBUG nova.virt.libvirt.driver [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 07:48:40 compute-0 nova_compute[187185]: 2025-11-29 07:48:40.478 187189 INFO nova.virt.libvirt.driver [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Creating image(s)
Nov 29 07:48:40 compute-0 nova_compute[187185]: 2025-11-29 07:48:40.479 187189 DEBUG oslo_concurrency.lockutils [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "/var/lib/nova/instances/9bd8796c-97a5-491a-b6b4-713222c15142/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:48:40 compute-0 nova_compute[187185]: 2025-11-29 07:48:40.479 187189 DEBUG oslo_concurrency.lockutils [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "/var/lib/nova/instances/9bd8796c-97a5-491a-b6b4-713222c15142/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:48:40 compute-0 nova_compute[187185]: 2025-11-29 07:48:40.480 187189 DEBUG oslo_concurrency.lockutils [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "/var/lib/nova/instances/9bd8796c-97a5-491a-b6b4-713222c15142/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:48:40 compute-0 nova_compute[187185]: 2025-11-29 07:48:40.497 187189 DEBUG oslo_concurrency.processutils [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:48:40 compute-0 nova_compute[187185]: 2025-11-29 07:48:40.579 187189 DEBUG oslo_concurrency.processutils [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:48:40 compute-0 nova_compute[187185]: 2025-11-29 07:48:40.581 187189 DEBUG oslo_concurrency.lockutils [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:48:40 compute-0 nova_compute[187185]: 2025-11-29 07:48:40.583 187189 DEBUG oslo_concurrency.lockutils [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:48:40 compute-0 nova_compute[187185]: 2025-11-29 07:48:40.607 187189 DEBUG oslo_concurrency.processutils [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:48:40 compute-0 nova_compute[187185]: 2025-11-29 07:48:40.681 187189 DEBUG oslo_concurrency.processutils [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:48:40 compute-0 nova_compute[187185]: 2025-11-29 07:48:40.683 187189 DEBUG oslo_concurrency.processutils [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/9bd8796c-97a5-491a-b6b4-713222c15142/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:48:40 compute-0 nova_compute[187185]: 2025-11-29 07:48:40.723 187189 DEBUG oslo_concurrency.processutils [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/9bd8796c-97a5-491a-b6b4-713222c15142/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:48:40 compute-0 nova_compute[187185]: 2025-11-29 07:48:40.724 187189 DEBUG oslo_concurrency.lockutils [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:48:40 compute-0 nova_compute[187185]: 2025-11-29 07:48:40.725 187189 DEBUG oslo_concurrency.processutils [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:48:40 compute-0 nova_compute[187185]: 2025-11-29 07:48:40.782 187189 DEBUG oslo_concurrency.processutils [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:48:40 compute-0 nova_compute[187185]: 2025-11-29 07:48:40.784 187189 DEBUG nova.virt.disk.api [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Checking if we can resize image /var/lib/nova/instances/9bd8796c-97a5-491a-b6b4-713222c15142/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:48:40 compute-0 nova_compute[187185]: 2025-11-29 07:48:40.784 187189 DEBUG oslo_concurrency.processutils [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9bd8796c-97a5-491a-b6b4-713222c15142/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:48:40 compute-0 nova_compute[187185]: 2025-11-29 07:48:40.822 187189 DEBUG nova.policy [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 07:48:40 compute-0 nova_compute[187185]: 2025-11-29 07:48:40.843 187189 DEBUG oslo_concurrency.processutils [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9bd8796c-97a5-491a-b6b4-713222c15142/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:48:40 compute-0 nova_compute[187185]: 2025-11-29 07:48:40.844 187189 DEBUG nova.virt.disk.api [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Cannot resize image /var/lib/nova/instances/9bd8796c-97a5-491a-b6b4-713222c15142/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:48:40 compute-0 nova_compute[187185]: 2025-11-29 07:48:40.845 187189 DEBUG nova.objects.instance [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'migration_context' on Instance uuid 9bd8796c-97a5-491a-b6b4-713222c15142 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:48:40 compute-0 nova_compute[187185]: 2025-11-29 07:48:40.862 187189 DEBUG nova.virt.libvirt.driver [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 07:48:40 compute-0 nova_compute[187185]: 2025-11-29 07:48:40.863 187189 DEBUG nova.virt.libvirt.driver [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Ensure instance console log exists: /var/lib/nova/instances/9bd8796c-97a5-491a-b6b4-713222c15142/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 07:48:40 compute-0 nova_compute[187185]: 2025-11-29 07:48:40.864 187189 DEBUG oslo_concurrency.lockutils [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:48:40 compute-0 nova_compute[187185]: 2025-11-29 07:48:40.865 187189 DEBUG oslo_concurrency.lockutils [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:48:40 compute-0 nova_compute[187185]: 2025-11-29 07:48:40.865 187189 DEBUG oslo_concurrency.lockutils [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:48:41 compute-0 nova_compute[187185]: 2025-11-29 07:48:41.105 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Updating instance_info_cache with network_info: [{"id": "462e1ede-b054-4653-9f4d-b136bdab7915", "address": "fa:16:3e:c0:77:28", "network": {"id": "7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d", "bridge": "br-int", "label": "tempest-TestServerMultinode-1724511624-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff00c41537a48ac87c3d3460e6ff7e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap462e1ede-b0", "ovs_interfaceid": "462e1ede-b054-4653-9f4d-b136bdab7915", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:48:41 compute-0 nova_compute[187185]: 2025-11-29 07:48:41.143 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Releasing lock "refresh_cache-f18e8e64-a0ef-4c29-85d0-955b86872379" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:48:41 compute-0 nova_compute[187185]: 2025-11-29 07:48:41.144 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 29 07:48:41 compute-0 nova_compute[187185]: 2025-11-29 07:48:41.145 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:48:41 compute-0 nova_compute[187185]: 2025-11-29 07:48:41.145 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:48:41 compute-0 nova_compute[187185]: 2025-11-29 07:48:41.146 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:48:41 compute-0 nova_compute[187185]: 2025-11-29 07:48:41.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:48:41 compute-0 nova_compute[187185]: 2025-11-29 07:48:41.318 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:48:41 compute-0 nova_compute[187185]: 2025-11-29 07:48:41.837 187189 DEBUG nova.network.neutron [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Successfully created port: 720d6d0b-554f-4c56-b9e4-1de1309b83f0 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 07:48:42 compute-0 nova_compute[187185]: 2025-11-29 07:48:42.864 187189 DEBUG nova.network.neutron [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Successfully updated port: 720d6d0b-554f-4c56-b9e4-1de1309b83f0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 07:48:42 compute-0 nova_compute[187185]: 2025-11-29 07:48:42.881 187189 DEBUG oslo_concurrency.lockutils [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "refresh_cache-9bd8796c-97a5-491a-b6b4-713222c15142" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:48:42 compute-0 nova_compute[187185]: 2025-11-29 07:48:42.882 187189 DEBUG oslo_concurrency.lockutils [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquired lock "refresh_cache-9bd8796c-97a5-491a-b6b4-713222c15142" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:48:42 compute-0 nova_compute[187185]: 2025-11-29 07:48:42.883 187189 DEBUG nova.network.neutron [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:48:42 compute-0 nova_compute[187185]: 2025-11-29 07:48:42.968 187189 DEBUG nova.compute.manager [req-c92a782d-ea48-48cf-b9d4-0281368daa22 req-81f978ee-4de3-4207-a53f-5cb4033b8b24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Received event network-changed-720d6d0b-554f-4c56-b9e4-1de1309b83f0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:48:42 compute-0 nova_compute[187185]: 2025-11-29 07:48:42.969 187189 DEBUG nova.compute.manager [req-c92a782d-ea48-48cf-b9d4-0281368daa22 req-81f978ee-4de3-4207-a53f-5cb4033b8b24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Refreshing instance network info cache due to event network-changed-720d6d0b-554f-4c56-b9e4-1de1309b83f0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:48:42 compute-0 nova_compute[187185]: 2025-11-29 07:48:42.969 187189 DEBUG oslo_concurrency.lockutils [req-c92a782d-ea48-48cf-b9d4-0281368daa22 req-81f978ee-4de3-4207-a53f-5cb4033b8b24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-9bd8796c-97a5-491a-b6b4-713222c15142" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:48:43 compute-0 nova_compute[187185]: 2025-11-29 07:48:43.031 187189 DEBUG nova.network.neutron [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:48:43 compute-0 nova_compute[187185]: 2025-11-29 07:48:43.233 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.168 187189 DEBUG nova.network.neutron [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Updating instance_info_cache with network_info: [{"id": "720d6d0b-554f-4c56-b9e4-1de1309b83f0", "address": "fa:16:3e:b8:79:f5", "network": {"id": "cfd1ce3c-e516-46ef-8712-573fe4de8313", "bridge": "br-int", "label": "tempest-network-smoke--423607195", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap720d6d0b-55", "ovs_interfaceid": "720d6d0b-554f-4c56-b9e4-1de1309b83f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.191 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.325 187189 DEBUG oslo_concurrency.lockutils [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Releasing lock "refresh_cache-9bd8796c-97a5-491a-b6b4-713222c15142" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.326 187189 DEBUG nova.compute.manager [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Instance network_info: |[{"id": "720d6d0b-554f-4c56-b9e4-1de1309b83f0", "address": "fa:16:3e:b8:79:f5", "network": {"id": "cfd1ce3c-e516-46ef-8712-573fe4de8313", "bridge": "br-int", "label": "tempest-network-smoke--423607195", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap720d6d0b-55", "ovs_interfaceid": "720d6d0b-554f-4c56-b9e4-1de1309b83f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.326 187189 DEBUG oslo_concurrency.lockutils [req-c92a782d-ea48-48cf-b9d4-0281368daa22 req-81f978ee-4de3-4207-a53f-5cb4033b8b24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-9bd8796c-97a5-491a-b6b4-713222c15142" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.327 187189 DEBUG nova.network.neutron [req-c92a782d-ea48-48cf-b9d4-0281368daa22 req-81f978ee-4de3-4207-a53f-5cb4033b8b24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Refreshing network info cache for port 720d6d0b-554f-4c56-b9e4-1de1309b83f0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.330 187189 DEBUG nova.virt.libvirt.driver [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Start _get_guest_xml network_info=[{"id": "720d6d0b-554f-4c56-b9e4-1de1309b83f0", "address": "fa:16:3e:b8:79:f5", "network": {"id": "cfd1ce3c-e516-46ef-8712-573fe4de8313", "bridge": "br-int", "label": "tempest-network-smoke--423607195", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap720d6d0b-55", "ovs_interfaceid": "720d6d0b-554f-4c56-b9e4-1de1309b83f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.336 187189 WARNING nova.virt.libvirt.driver [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.340 187189 DEBUG nova.virt.libvirt.host [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.341 187189 DEBUG nova.virt.libvirt.host [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.343 187189 DEBUG nova.virt.libvirt.host [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.344 187189 DEBUG nova.virt.libvirt.host [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.345 187189 DEBUG nova.virt.libvirt.driver [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.346 187189 DEBUG nova.virt.hardware [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.346 187189 DEBUG nova.virt.hardware [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.347 187189 DEBUG nova.virt.hardware [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.347 187189 DEBUG nova.virt.hardware [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.347 187189 DEBUG nova.virt.hardware [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.347 187189 DEBUG nova.virt.hardware [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.348 187189 DEBUG nova.virt.hardware [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.348 187189 DEBUG nova.virt.hardware [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.348 187189 DEBUG nova.virt.hardware [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.348 187189 DEBUG nova.virt.hardware [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.349 187189 DEBUG nova.virt.hardware [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.353 187189 DEBUG nova.virt.libvirt.vif [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:48:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1530170034',display_name='tempest-TestNetworkBasicOps-server-1530170034',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1530170034',id=173,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCsZ1YtekqB3TYskLMIUT4bWP/hYgw9UzbfG3Gvozen43xm6R9wLUZYbDGo0BSiqoLWRVB1FDlITni1cXBB1DxXKDkaeEvAx/GpXX+X+BSa0WJ777uIPLhKFYfzCgERKfg==',key_name='tempest-TestNetworkBasicOps-905637856',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-dq7pvgw5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:48:40Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=9bd8796c-97a5-491a-b6b4-713222c15142,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "720d6d0b-554f-4c56-b9e4-1de1309b83f0", "address": "fa:16:3e:b8:79:f5", "network": {"id": "cfd1ce3c-e516-46ef-8712-573fe4de8313", "bridge": "br-int", "label": "tempest-network-smoke--423607195", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap720d6d0b-55", "ovs_interfaceid": "720d6d0b-554f-4c56-b9e4-1de1309b83f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.354 187189 DEBUG nova.network.os_vif_util [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "720d6d0b-554f-4c56-b9e4-1de1309b83f0", "address": "fa:16:3e:b8:79:f5", "network": {"id": "cfd1ce3c-e516-46ef-8712-573fe4de8313", "bridge": "br-int", "label": "tempest-network-smoke--423607195", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap720d6d0b-55", "ovs_interfaceid": "720d6d0b-554f-4c56-b9e4-1de1309b83f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.354 187189 DEBUG nova.network.os_vif_util [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b8:79:f5,bridge_name='br-int',has_traffic_filtering=True,id=720d6d0b-554f-4c56-b9e4-1de1309b83f0,network=Network(cfd1ce3c-e516-46ef-8712-573fe4de8313),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap720d6d0b-55') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.355 187189 DEBUG nova.objects.instance [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9bd8796c-97a5-491a-b6b4-713222c15142 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.375 187189 DEBUG nova.virt.libvirt.driver [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:48:44 compute-0 nova_compute[187185]:   <uuid>9bd8796c-97a5-491a-b6b4-713222c15142</uuid>
Nov 29 07:48:44 compute-0 nova_compute[187185]:   <name>instance-000000ad</name>
Nov 29 07:48:44 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 07:48:44 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:48:44 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:48:44 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:       <nova:name>tempest-TestNetworkBasicOps-server-1530170034</nova:name>
Nov 29 07:48:44 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:48:44</nova:creationTime>
Nov 29 07:48:44 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 07:48:44 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 07:48:44 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:48:44 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:48:44 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:48:44 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:48:44 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:48:44 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:48:44 compute-0 nova_compute[187185]:         <nova:user uuid="1dd0a7f5aaff402eb032cd5e60540dcb">tempest-TestNetworkBasicOps-1587012976-project-member</nova:user>
Nov 29 07:48:44 compute-0 nova_compute[187185]:         <nova:project uuid="ec8b80be17a14d1caf666636283749d0">tempest-TestNetworkBasicOps-1587012976</nova:project>
Nov 29 07:48:44 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:48:44 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 07:48:44 compute-0 nova_compute[187185]:         <nova:port uuid="720d6d0b-554f-4c56-b9e4-1de1309b83f0">
Nov 29 07:48:44 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:48:44 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:48:44 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:48:44 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <system>
Nov 29 07:48:44 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:48:44 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:48:44 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:48:44 compute-0 nova_compute[187185]:       <entry name="serial">9bd8796c-97a5-491a-b6b4-713222c15142</entry>
Nov 29 07:48:44 compute-0 nova_compute[187185]:       <entry name="uuid">9bd8796c-97a5-491a-b6b4-713222c15142</entry>
Nov 29 07:48:44 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     </system>
Nov 29 07:48:44 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:48:44 compute-0 nova_compute[187185]:   <os>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:   </os>
Nov 29 07:48:44 compute-0 nova_compute[187185]:   <features>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:   </features>
Nov 29 07:48:44 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:48:44 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:48:44 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:48:44 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/9bd8796c-97a5-491a-b6b4-713222c15142/disk"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:48:44 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/9bd8796c-97a5-491a-b6b4-713222c15142/disk.config"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:48:44 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:b8:79:f5"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:       <target dev="tap720d6d0b-55"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:48:44 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/9bd8796c-97a5-491a-b6b4-713222c15142/console.log" append="off"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <video>
Nov 29 07:48:44 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     </video>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:48:44 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:48:44 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:48:44 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:48:44 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:48:44 compute-0 nova_compute[187185]: </domain>
Nov 29 07:48:44 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.377 187189 DEBUG nova.compute.manager [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Preparing to wait for external event network-vif-plugged-720d6d0b-554f-4c56-b9e4-1de1309b83f0 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.377 187189 DEBUG oslo_concurrency.lockutils [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "9bd8796c-97a5-491a-b6b4-713222c15142-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.377 187189 DEBUG oslo_concurrency.lockutils [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "9bd8796c-97a5-491a-b6b4-713222c15142-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.378 187189 DEBUG oslo_concurrency.lockutils [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "9bd8796c-97a5-491a-b6b4-713222c15142-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.378 187189 DEBUG nova.virt.libvirt.vif [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:48:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1530170034',display_name='tempest-TestNetworkBasicOps-server-1530170034',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1530170034',id=173,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCsZ1YtekqB3TYskLMIUT4bWP/hYgw9UzbfG3Gvozen43xm6R9wLUZYbDGo0BSiqoLWRVB1FDlITni1cXBB1DxXKDkaeEvAx/GpXX+X+BSa0WJ777uIPLhKFYfzCgERKfg==',key_name='tempest-TestNetworkBasicOps-905637856',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-dq7pvgw5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:48:40Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=9bd8796c-97a5-491a-b6b4-713222c15142,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "720d6d0b-554f-4c56-b9e4-1de1309b83f0", "address": "fa:16:3e:b8:79:f5", "network": {"id": "cfd1ce3c-e516-46ef-8712-573fe4de8313", "bridge": "br-int", "label": "tempest-network-smoke--423607195", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap720d6d0b-55", "ovs_interfaceid": "720d6d0b-554f-4c56-b9e4-1de1309b83f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.379 187189 DEBUG nova.network.os_vif_util [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "720d6d0b-554f-4c56-b9e4-1de1309b83f0", "address": "fa:16:3e:b8:79:f5", "network": {"id": "cfd1ce3c-e516-46ef-8712-573fe4de8313", "bridge": "br-int", "label": "tempest-network-smoke--423607195", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap720d6d0b-55", "ovs_interfaceid": "720d6d0b-554f-4c56-b9e4-1de1309b83f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.380 187189 DEBUG nova.network.os_vif_util [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b8:79:f5,bridge_name='br-int',has_traffic_filtering=True,id=720d6d0b-554f-4c56-b9e4-1de1309b83f0,network=Network(cfd1ce3c-e516-46ef-8712-573fe4de8313),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap720d6d0b-55') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.380 187189 DEBUG os_vif [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:79:f5,bridge_name='br-int',has_traffic_filtering=True,id=720d6d0b-554f-4c56-b9e4-1de1309b83f0,network=Network(cfd1ce3c-e516-46ef-8712-573fe4de8313),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap720d6d0b-55') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.381 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.382 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.382 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.387 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.387 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap720d6d0b-55, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.388 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap720d6d0b-55, col_values=(('external_ids', {'iface-id': '720d6d0b-554f-4c56-b9e4-1de1309b83f0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b8:79:f5', 'vm-uuid': '9bd8796c-97a5-491a-b6b4-713222c15142'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.390 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:44 compute-0 NetworkManager[55227]: <info>  [1764402524.3909] manager: (tap720d6d0b-55): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/299)
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.393 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.401 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.403 187189 INFO os_vif [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:79:f5,bridge_name='br-int',has_traffic_filtering=True,id=720d6d0b-554f-4c56-b9e4-1de1309b83f0,network=Network(cfd1ce3c-e516-46ef-8712-573fe4de8313),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap720d6d0b-55')
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.997 187189 DEBUG nova.virt.libvirt.driver [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.998 187189 DEBUG nova.virt.libvirt.driver [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.999 187189 DEBUG nova.virt.libvirt.driver [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No VIF found with MAC fa:16:3e:b8:79:f5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:48:44 compute-0 nova_compute[187185]: 2025-11-29 07:48:44.999 187189 INFO nova.virt.libvirt.driver [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Using config drive
Nov 29 07:48:45 compute-0 nova_compute[187185]: 2025-11-29 07:48:45.971 187189 INFO nova.virt.libvirt.driver [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Creating config drive at /var/lib/nova/instances/9bd8796c-97a5-491a-b6b4-713222c15142/disk.config
Nov 29 07:48:45 compute-0 nova_compute[187185]: 2025-11-29 07:48:45.976 187189 DEBUG oslo_concurrency.processutils [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9bd8796c-97a5-491a-b6b4-713222c15142/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp985j5czr execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:48:46 compute-0 nova_compute[187185]: 2025-11-29 07:48:46.003 187189 DEBUG nova.network.neutron [req-c92a782d-ea48-48cf-b9d4-0281368daa22 req-81f978ee-4de3-4207-a53f-5cb4033b8b24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Updated VIF entry in instance network info cache for port 720d6d0b-554f-4c56-b9e4-1de1309b83f0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:48:46 compute-0 nova_compute[187185]: 2025-11-29 07:48:46.004 187189 DEBUG nova.network.neutron [req-c92a782d-ea48-48cf-b9d4-0281368daa22 req-81f978ee-4de3-4207-a53f-5cb4033b8b24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Updating instance_info_cache with network_info: [{"id": "720d6d0b-554f-4c56-b9e4-1de1309b83f0", "address": "fa:16:3e:b8:79:f5", "network": {"id": "cfd1ce3c-e516-46ef-8712-573fe4de8313", "bridge": "br-int", "label": "tempest-network-smoke--423607195", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap720d6d0b-55", "ovs_interfaceid": "720d6d0b-554f-4c56-b9e4-1de1309b83f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:48:46 compute-0 nova_compute[187185]: 2025-11-29 07:48:46.110 187189 DEBUG oslo_concurrency.processutils [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9bd8796c-97a5-491a-b6b4-713222c15142/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp985j5czr" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:48:46 compute-0 kernel: tap720d6d0b-55: entered promiscuous mode
Nov 29 07:48:46 compute-0 NetworkManager[55227]: <info>  [1764402526.1752] manager: (tap720d6d0b-55): new Tun device (/org/freedesktop/NetworkManager/Devices/300)
Nov 29 07:48:46 compute-0 ovn_controller[95281]: 2025-11-29T07:48:46Z|00572|binding|INFO|Claiming lport 720d6d0b-554f-4c56-b9e4-1de1309b83f0 for this chassis.
Nov 29 07:48:46 compute-0 ovn_controller[95281]: 2025-11-29T07:48:46Z|00573|binding|INFO|720d6d0b-554f-4c56-b9e4-1de1309b83f0: Claiming fa:16:3e:b8:79:f5 10.100.0.9
Nov 29 07:48:46 compute-0 nova_compute[187185]: 2025-11-29 07:48:46.179 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:46 compute-0 nova_compute[187185]: 2025-11-29 07:48:46.182 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:46 compute-0 nova_compute[187185]: 2025-11-29 07:48:46.184 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:46 compute-0 nova_compute[187185]: 2025-11-29 07:48:46.196 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:46 compute-0 NetworkManager[55227]: <info>  [1764402526.1975] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/301)
Nov 29 07:48:46 compute-0 NetworkManager[55227]: <info>  [1764402526.1982] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/302)
Nov 29 07:48:46 compute-0 systemd-udevd[246257]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:48:46 compute-0 systemd-machined[153486]: New machine qemu-67-instance-000000ad.
Nov 29 07:48:46 compute-0 NetworkManager[55227]: <info>  [1764402526.2306] device (tap720d6d0b-55): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:48:46 compute-0 NetworkManager[55227]: <info>  [1764402526.2319] device (tap720d6d0b-55): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:48:46 compute-0 systemd[1]: Started Virtual Machine qemu-67-instance-000000ad.
Nov 29 07:48:46 compute-0 nova_compute[187185]: 2025-11-29 07:48:46.369 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:46 compute-0 nova_compute[187185]: 2025-11-29 07:48:46.380 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:46 compute-0 ovn_controller[95281]: 2025-11-29T07:48:46Z|00574|binding|INFO|Releasing lport e08502a1-bdde-4e8d-89e4-c05bd265f847 from this chassis (sb_readonly=0)
Nov 29 07:48:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:46.585 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b8:79:f5 10.100.0.9'], port_security=['fa:16:3e:b8:79:f5 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '9bd8796c-97a5-491a-b6b4-713222c15142', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cfd1ce3c-e516-46ef-8712-573fe4de8313', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ec8b80be17a14d1caf666636283749d0', 'neutron:revision_number': '2', 'neutron:security_group_ids': '87a5a543-5e79-469e-89a8-5c2f146e65d2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8074c60c-bc9e-40bb-8493-fc40fe113e9f, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=720d6d0b-554f-4c56-b9e4-1de1309b83f0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:48:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:46.588 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 720d6d0b-554f-4c56-b9e4-1de1309b83f0 in datapath cfd1ce3c-e516-46ef-8712-573fe4de8313 bound to our chassis
Nov 29 07:48:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:46.591 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cfd1ce3c-e516-46ef-8712-573fe4de8313
Nov 29 07:48:46 compute-0 nova_compute[187185]: 2025-11-29 07:48:46.598 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764402526.5976634, 9bd8796c-97a5-491a-b6b4-713222c15142 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:48:46 compute-0 nova_compute[187185]: 2025-11-29 07:48:46.598 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] VM Started (Lifecycle Event)
Nov 29 07:48:46 compute-0 ovn_controller[95281]: 2025-11-29T07:48:46Z|00575|binding|INFO|Setting lport 720d6d0b-554f-4c56-b9e4-1de1309b83f0 ovn-installed in OVS
Nov 29 07:48:46 compute-0 ovn_controller[95281]: 2025-11-29T07:48:46Z|00576|binding|INFO|Setting lport 720d6d0b-554f-4c56-b9e4-1de1309b83f0 up in Southbound
Nov 29 07:48:46 compute-0 nova_compute[187185]: 2025-11-29 07:48:46.606 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:46.607 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e43e6f46-07fa-4733-9357-1ae33e683b52]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:48:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:46.608 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapcfd1ce3c-e1 in ovnmeta-cfd1ce3c-e516-46ef-8712-573fe4de8313 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:48:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:46.611 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapcfd1ce3c-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:48:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:46.612 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[97cc00c2-6bca-4e86-9bee-a6edfa3d41cc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:48:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:46.613 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[36431c53-881e-4550-ac13-23e1fb561fc5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:48:46 compute-0 nova_compute[187185]: 2025-11-29 07:48:46.620 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:46 compute-0 nova_compute[187185]: 2025-11-29 07:48:46.629 187189 DEBUG oslo_concurrency.lockutils [req-c92a782d-ea48-48cf-b9d4-0281368daa22 req-81f978ee-4de3-4207-a53f-5cb4033b8b24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-9bd8796c-97a5-491a-b6b4-713222c15142" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:48:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:46.629 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[af9ff0cf-530a-4cf3-a6e8-7ca5b5e25da1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:48:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:46.644 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e15ceb2d-a658-4c1d-b9ae-8b0f720c67a0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:48:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:46.676 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[30268140-65c1-4b28-8e8a-acc2c5198374]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:48:46 compute-0 NetworkManager[55227]: <info>  [1764402526.6826] manager: (tapcfd1ce3c-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/303)
Nov 29 07:48:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:46.682 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[b4c8161b-510c-483b-9e1b-108465a3e732]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:48:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:46.711 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[f3bb6b92-5550-4dca-8999-eabc399ee0e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:48:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:46.714 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[e320ebf0-f7a6-4f70-bd88-a60b93238acd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:48:46 compute-0 NetworkManager[55227]: <info>  [1764402526.7402] device (tapcfd1ce3c-e0): carrier: link connected
Nov 29 07:48:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:46.748 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[f460654b-ff05-484c-ab26-460a23e97495]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:48:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:46.773 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[7a225597-10f7-4f2c-b896-e79b738625af]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcfd1ce3c-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:58:05:06'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 180], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 797440, 'reachable_time': 21368, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246297, 'error': None, 'target': 'ovnmeta-cfd1ce3c-e516-46ef-8712-573fe4de8313', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:48:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:46.791 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d42b24fb-d73b-4d1d-adba-733e0731d2ce]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe58:506'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 797440, 'tstamp': 797440}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 246298, 'error': None, 'target': 'ovnmeta-cfd1ce3c-e516-46ef-8712-573fe4de8313', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:48:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:46.811 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[fe9cc3fe-a830-4636-a8d4-2b63f6fb88fc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcfd1ce3c-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:58:05:06'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 180], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 797440, 'reachable_time': 21368, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 246299, 'error': None, 'target': 'ovnmeta-cfd1ce3c-e516-46ef-8712-573fe4de8313', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:48:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:46.859 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[923975b0-6710-4941-bc3f-8f1b2d0b973c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:48:46 compute-0 nova_compute[187185]: 2025-11-29 07:48:46.876 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:48:46 compute-0 nova_compute[187185]: 2025-11-29 07:48:46.881 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764402526.597827, 9bd8796c-97a5-491a-b6b4-713222c15142 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:48:46 compute-0 nova_compute[187185]: 2025-11-29 07:48:46.882 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] VM Paused (Lifecycle Event)
Nov 29 07:48:46 compute-0 nova_compute[187185]: 2025-11-29 07:48:46.901 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:48:46 compute-0 nova_compute[187185]: 2025-11-29 07:48:46.905 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:48:46 compute-0 nova_compute[187185]: 2025-11-29 07:48:46.929 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:48:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:46.940 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[5b659e33-676a-4ab0-9608-165f260c7f83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:48:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:46.942 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcfd1ce3c-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:48:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:46.942 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:48:46 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:46.943 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcfd1ce3c-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:48:46 compute-0 kernel: tapcfd1ce3c-e0: entered promiscuous mode
Nov 29 07:48:46 compute-0 NetworkManager[55227]: <info>  [1764402526.9949] manager: (tapcfd1ce3c-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/304)
Nov 29 07:48:46 compute-0 nova_compute[187185]: 2025-11-29 07:48:46.996 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:47.002 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcfd1ce3c-e0, col_values=(('external_ids', {'iface-id': '7f4b3b3b-6ee7-4970-9e8f-3e592045a366'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:48:47 compute-0 nova_compute[187185]: 2025-11-29 07:48:47.004 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:47 compute-0 ovn_controller[95281]: 2025-11-29T07:48:47Z|00577|binding|INFO|Releasing lport 7f4b3b3b-6ee7-4970-9e8f-3e592045a366 from this chassis (sb_readonly=0)
Nov 29 07:48:47 compute-0 nova_compute[187185]: 2025-11-29 07:48:47.022 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:47.023 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cfd1ce3c-e516-46ef-8712-573fe4de8313.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cfd1ce3c-e516-46ef-8712-573fe4de8313.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:48:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:47.024 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[646d8fdc-6afe-4f86-8649-482036607a99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:48:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:47.025 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:48:47 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:48:47 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:48:47 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-cfd1ce3c-e516-46ef-8712-573fe4de8313
Nov 29 07:48:47 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:48:47 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:48:47 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:48:47 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/cfd1ce3c-e516-46ef-8712-573fe4de8313.pid.haproxy
Nov 29 07:48:47 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:48:47 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:48:47 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:48:47 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:48:47 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:48:47 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:48:47 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:48:47 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:48:47 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:48:47 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:48:47 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:48:47 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:48:47 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:48:47 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:48:47 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:48:47 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:48:47 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:48:47 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:48:47 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:48:47 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:48:47 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID cfd1ce3c-e516-46ef-8712-573fe4de8313
Nov 29 07:48:47 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:48:47 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:47.026 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-cfd1ce3c-e516-46ef-8712-573fe4de8313', 'env', 'PROCESS_TAG=haproxy-cfd1ce3c-e516-46ef-8712-573fe4de8313', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/cfd1ce3c-e516-46ef-8712-573fe4de8313.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:48:47 compute-0 nova_compute[187185]: 2025-11-29 07:48:47.179 187189 DEBUG nova.compute.manager [req-bea640d9-73a6-4544-bac1-1f5eb31c4b88 req-22a72022-1d42-4a56-b23e-fa11152c4bb6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Received event network-vif-plugged-720d6d0b-554f-4c56-b9e4-1de1309b83f0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:48:47 compute-0 nova_compute[187185]: 2025-11-29 07:48:47.180 187189 DEBUG oslo_concurrency.lockutils [req-bea640d9-73a6-4544-bac1-1f5eb31c4b88 req-22a72022-1d42-4a56-b23e-fa11152c4bb6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "9bd8796c-97a5-491a-b6b4-713222c15142-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:48:47 compute-0 nova_compute[187185]: 2025-11-29 07:48:47.180 187189 DEBUG oslo_concurrency.lockutils [req-bea640d9-73a6-4544-bac1-1f5eb31c4b88 req-22a72022-1d42-4a56-b23e-fa11152c4bb6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9bd8796c-97a5-491a-b6b4-713222c15142-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:48:47 compute-0 nova_compute[187185]: 2025-11-29 07:48:47.180 187189 DEBUG oslo_concurrency.lockutils [req-bea640d9-73a6-4544-bac1-1f5eb31c4b88 req-22a72022-1d42-4a56-b23e-fa11152c4bb6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9bd8796c-97a5-491a-b6b4-713222c15142-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:48:47 compute-0 nova_compute[187185]: 2025-11-29 07:48:47.180 187189 DEBUG nova.compute.manager [req-bea640d9-73a6-4544-bac1-1f5eb31c4b88 req-22a72022-1d42-4a56-b23e-fa11152c4bb6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Processing event network-vif-plugged-720d6d0b-554f-4c56-b9e4-1de1309b83f0 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 07:48:47 compute-0 nova_compute[187185]: 2025-11-29 07:48:47.181 187189 DEBUG nova.compute.manager [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:48:47 compute-0 nova_compute[187185]: 2025-11-29 07:48:47.192 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764402527.1924498, 9bd8796c-97a5-491a-b6b4-713222c15142 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:48:47 compute-0 nova_compute[187185]: 2025-11-29 07:48:47.193 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] VM Resumed (Lifecycle Event)
Nov 29 07:48:47 compute-0 nova_compute[187185]: 2025-11-29 07:48:47.195 187189 DEBUG nova.virt.libvirt.driver [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 07:48:47 compute-0 nova_compute[187185]: 2025-11-29 07:48:47.199 187189 INFO nova.virt.libvirt.driver [-] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Instance spawned successfully.
Nov 29 07:48:47 compute-0 nova_compute[187185]: 2025-11-29 07:48:47.200 187189 DEBUG nova.virt.libvirt.driver [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 07:48:47 compute-0 nova_compute[187185]: 2025-11-29 07:48:47.250 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:48:47 compute-0 nova_compute[187185]: 2025-11-29 07:48:47.256 187189 DEBUG nova.virt.libvirt.driver [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:48:47 compute-0 nova_compute[187185]: 2025-11-29 07:48:47.256 187189 DEBUG nova.virt.libvirt.driver [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:48:47 compute-0 nova_compute[187185]: 2025-11-29 07:48:47.257 187189 DEBUG nova.virt.libvirt.driver [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:48:47 compute-0 nova_compute[187185]: 2025-11-29 07:48:47.257 187189 DEBUG nova.virt.libvirt.driver [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:48:47 compute-0 nova_compute[187185]: 2025-11-29 07:48:47.257 187189 DEBUG nova.virt.libvirt.driver [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:48:47 compute-0 nova_compute[187185]: 2025-11-29 07:48:47.258 187189 DEBUG nova.virt.libvirt.driver [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:48:47 compute-0 nova_compute[187185]: 2025-11-29 07:48:47.262 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:48:47 compute-0 nova_compute[187185]: 2025-11-29 07:48:47.483 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:48:47 compute-0 podman[246331]: 2025-11-29 07:48:47.401155399 +0000 UTC m=+0.028579335 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:48:47 compute-0 podman[246331]: 2025-11-29 07:48:47.592234427 +0000 UTC m=+0.219658343 container create 71f872cde640e6b41c5669a66e6e4de4ce2c060ccbb4afb1cbafee44c1100364 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cfd1ce3c-e516-46ef-8712-573fe4de8313, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Nov 29 07:48:47 compute-0 systemd[1]: Started libpod-conmon-71f872cde640e6b41c5669a66e6e4de4ce2c060ccbb4afb1cbafee44c1100364.scope.
Nov 29 07:48:47 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:48:47 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43638b97acd5e63f213daee811f1dced2719c42441c70e3046234e2c17d43dbd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:48:47 compute-0 podman[246331]: 2025-11-29 07:48:47.984312886 +0000 UTC m=+0.611736852 container init 71f872cde640e6b41c5669a66e6e4de4ce2c060ccbb4afb1cbafee44c1100364 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cfd1ce3c-e516-46ef-8712-573fe4de8313, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 29 07:48:47 compute-0 podman[246331]: 2025-11-29 07:48:47.99286708 +0000 UTC m=+0.620290996 container start 71f872cde640e6b41c5669a66e6e4de4ce2c060ccbb4afb1cbafee44c1100364 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cfd1ce3c-e516-46ef-8712-573fe4de8313, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 07:48:48 compute-0 neutron-haproxy-ovnmeta-cfd1ce3c-e516-46ef-8712-573fe4de8313[246345]: [NOTICE]   (246349) : New worker (246351) forked
Nov 29 07:48:48 compute-0 neutron-haproxy-ovnmeta-cfd1ce3c-e516-46ef-8712-573fe4de8313[246345]: [NOTICE]   (246349) : Loading success.
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.019 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f18e8e64-a0ef-4c29-85d0-955b86872379', 'name': 'tempest-TestServerMultinode-server-1299020398', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000aa', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '220340bd80db4bf5af391eb2e4247a6c', 'user_id': 'b79809b822b248ae8be15d0233f5896e', 'hostId': '27916006d701479b1bc480dc2e437f25bd7c8990572854afa8cf554c', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.021 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '9bd8796c-97a5-491a-b6b4-713222c15142', 'name': 'tempest-TestNetworkBasicOps-server-1530170034', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000ad', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'ec8b80be17a14d1caf666636283749d0', 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'hostId': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.021 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.021 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.021 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestServerMultinode-server-1299020398>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1530170034>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestServerMultinode-server-1299020398>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1530170034>]
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.022 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.026 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for f18e8e64-a0ef-4c29-85d0-955b86872379 / tap462e1ede-b0 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.026 12 DEBUG ceilometer.compute.pollsters [-] f18e8e64-a0ef-4c29-85d0-955b86872379/network.outgoing.packets volume: 3 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.029 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 9bd8796c-97a5-491a-b6b4-713222c15142 / tap720d6d0b-55 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.029 12 DEBUG ceilometer.compute.pollsters [-] 9bd8796c-97a5-491a-b6b4-713222c15142/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b1a61c04-f1e2-4ec0-adc5-2dfdd379043c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 3, 'user_id': 'b79809b822b248ae8be15d0233f5896e', 'user_name': None, 'project_id': '220340bd80db4bf5af391eb2e4247a6c', 'project_name': None, 'resource_id': 'instance-000000aa-f18e8e64-a0ef-4c29-85d0-955b86872379-tap462e1ede-b0', 'timestamp': '2025-11-29T07:48:48.022377', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-1299020398', 'name': 'tap462e1ede-b0', 'instance_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379', 'instance_type': 'm1.nano', 'host': '27916006d701479b1bc480dc2e437f25bd7c8990572854afa8cf554c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:77:28', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap462e1ede-b0'}, 'message_id': 'd6be5054-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.740580451, 'message_signature': 'a14a31d93c3dab8ab8892bb6d95c8e34d759e3dd3d5171247bf964b8ba369817'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-000000ad-9bd8796c-97a5-491a-b6b4-713222c15142-tap720d6d0b-55', 'timestamp': '2025-11-29T07:48:48.022377', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1530170034', 'name': 'tap720d6d0b-55', 'instance_id': '9bd8796c-97a5-491a-b6b4-713222c15142', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b8:79:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap720d6d0b-55'}, 'message_id': 'd6beb40e-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.745418638, 'message_signature': '867e749e3fc76368c83ec847e4681513452915758258d1890b35a537df879029'}]}, 'timestamp': '2025-11-29 07:48:48.029731', '_unique_id': 'b8e7350e01614352b93417c442921f99'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.031 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestServerMultinode-server-1299020398>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1530170034>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestServerMultinode-server-1299020398>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1530170034>]
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.032 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.061 12 DEBUG ceilometer.compute.pollsters [-] f18e8e64-a0ef-4c29-85d0-955b86872379/disk.device.read.bytes volume: 28470272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.061 12 DEBUG ceilometer.compute.pollsters [-] f18e8e64-a0ef-4c29-85d0-955b86872379/disk.device.read.bytes volume: 221502 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.092 12 DEBUG ceilometer.compute.pollsters [-] 9bd8796c-97a5-491a-b6b4-713222c15142/disk.device.read.bytes volume: 13117952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.093 12 DEBUG ceilometer.compute.pollsters [-] 9bd8796c-97a5-491a-b6b4-713222c15142/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '95a8b869-7996-4549-af73-c92f981ea701', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 28470272, 'user_id': 'b79809b822b248ae8be15d0233f5896e', 'user_name': None, 'project_id': '220340bd80db4bf5af391eb2e4247a6c', 'project_name': None, 'resource_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379-vda', 'timestamp': '2025-11-29T07:48:48.032150', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-1299020398', 'name': 'instance-000000aa', 'instance_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379', 'instance_type': 'm1.nano', 'host': '27916006d701479b1bc480dc2e437f25bd7c8990572854afa8cf554c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd6c39c12-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.750352269, 'message_signature': '46444e63e5e97afd6ff16504843ac26681d9981a0d03169b3dbaaa8906807971'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 221502, 'user_id': 'b79809b822b248ae8be15d0233f5896e', 'user_name': None, 'project_id': '220340bd80db4bf5af391eb2e4247a6c', 'project_name': None, 'resource_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379-sda', 'timestamp': '2025-11-29T07:48:48.032150', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-1299020398', 'name': 'instance-000000aa', 'instance_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379', 'instance_type': 'm1.nano', 'host': '27916006d701479b1bc480dc2e437f25bd7c8990572854afa8cf554c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd6c3ac84-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.750352269, 'message_signature': '6f908b320bf56bc41d5a23d29c1865443ab69905a0166ea202a2b4d02a2f9ce8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 13117952, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '9bd8796c-97a5-491a-b6b4-713222c15142-vda', 'timestamp': '2025-11-29T07:48:48.032150', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1530170034', 'name': 'instance-000000ad', 'instance_id': '9bd8796c-97a5-491a-b6b4-713222c15142', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd6c86b34-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.780567639, 'message_signature': '616f24b4c43fe6ddd7254b95db9f2b6705f7eb17295bd52fd294940277516e98'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '9bd8796c-97a5-491a-b6b4-713222c15142-sda', 'timestamp': '2025-11-29T07:48:48.032150', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1530170034', 'name': 'instance-000000ad', 'instance_id': '9bd8796c-97a5-491a-b6b4-713222c15142', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd6c87700-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.780567639, 'message_signature': 'be8cca6ee4826c6ae1c9f664108aa1b58b63b9562344745ffb51e45aa1efd23c'}]}, 'timestamp': '2025-11-29 07:48:48.093683', '_unique_id': '506e49b37ef84fbe943d546611b07966'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.094 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.096 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.096 12 DEBUG ceilometer.compute.pollsters [-] f18e8e64-a0ef-4c29-85d0-955b86872379/network.outgoing.bytes volume: 266 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.096 12 DEBUG ceilometer.compute.pollsters [-] 9bd8796c-97a5-491a-b6b4-713222c15142/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '439d8649-6aec-4dfd-b2c2-867db775a533', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 266, 'user_id': 'b79809b822b248ae8be15d0233f5896e', 'user_name': None, 'project_id': '220340bd80db4bf5af391eb2e4247a6c', 'project_name': None, 'resource_id': 'instance-000000aa-f18e8e64-a0ef-4c29-85d0-955b86872379-tap462e1ede-b0', 'timestamp': '2025-11-29T07:48:48.096091', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-1299020398', 'name': 'tap462e1ede-b0', 'instance_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379', 'instance_type': 'm1.nano', 'host': '27916006d701479b1bc480dc2e437f25bd7c8990572854afa8cf554c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:77:28', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap462e1ede-b0'}, 'message_id': 'd6c8dfba-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.740580451, 'message_signature': 'bbb7db7cc03b72257a03e13f3f56841ed5bafc3d8fc4d3b284216810595b0120'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-000000ad-9bd8796c-97a5-491a-b6b4-713222c15142-tap720d6d0b-55', 'timestamp': '2025-11-29T07:48:48.096091', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1530170034', 'name': 'tap720d6d0b-55', 'instance_id': '9bd8796c-97a5-491a-b6b4-713222c15142', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b8:79:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap720d6d0b-55'}, 'message_id': 'd6c8e91a-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.745418638, 'message_signature': '64484c33a474b5e591558e17fa0af7b1f0f58ae57ad133227c7ee43c5450d38f'}]}, 'timestamp': '2025-11-29 07:48:48.096680', '_unique_id': 'fc980aaea30d4ac49dae00baba5ce62c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.097 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.098 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.098 12 DEBUG ceilometer.compute.pollsters [-] f18e8e64-a0ef-4c29-85d0-955b86872379/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.098 12 DEBUG ceilometer.compute.pollsters [-] 9bd8796c-97a5-491a-b6b4-713222c15142/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5bc3b6e2-d447-424a-b45b-2963abeb866a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b79809b822b248ae8be15d0233f5896e', 'user_name': None, 'project_id': '220340bd80db4bf5af391eb2e4247a6c', 'project_name': None, 'resource_id': 'instance-000000aa-f18e8e64-a0ef-4c29-85d0-955b86872379-tap462e1ede-b0', 'timestamp': '2025-11-29T07:48:48.098456', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-1299020398', 'name': 'tap462e1ede-b0', 'instance_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379', 'instance_type': 'm1.nano', 'host': '27916006d701479b1bc480dc2e437f25bd7c8990572854afa8cf554c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:77:28', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap462e1ede-b0'}, 'message_id': 'd6c93abe-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.740580451, 'message_signature': '7b79327dc1cac6372087345df220531bf39744259b2e1e4750fbec57cb1e87b8'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-000000ad-9bd8796c-97a5-491a-b6b4-713222c15142-tap720d6d0b-55', 'timestamp': '2025-11-29T07:48:48.098456', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1530170034', 'name': 'tap720d6d0b-55', 'instance_id': '9bd8796c-97a5-491a-b6b4-713222c15142', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b8:79:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap720d6d0b-55'}, 'message_id': 'd6c942c0-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.745418638, 'message_signature': 'bfbf0690d4cdb0ee209cb39199b768cd66cf278335fa21120becc2e0b0b95fde'}]}, 'timestamp': '2025-11-29 07:48:48.098903', '_unique_id': 'd8783ca89bf34ba8adfb8a021d7eda91'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.099 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.100 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.100 12 DEBUG ceilometer.compute.pollsters [-] f18e8e64-a0ef-4c29-85d0-955b86872379/network.incoming.packets volume: 7 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.100 12 DEBUG ceilometer.compute.pollsters [-] 9bd8796c-97a5-491a-b6b4-713222c15142/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cc2b7675-5441-4de0-95ee-06fce2384a92', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 7, 'user_id': 'b79809b822b248ae8be15d0233f5896e', 'user_name': None, 'project_id': '220340bd80db4bf5af391eb2e4247a6c', 'project_name': None, 'resource_id': 'instance-000000aa-f18e8e64-a0ef-4c29-85d0-955b86872379-tap462e1ede-b0', 'timestamp': '2025-11-29T07:48:48.100493', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-1299020398', 'name': 'tap462e1ede-b0', 'instance_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379', 'instance_type': 'm1.nano', 'host': '27916006d701479b1bc480dc2e437f25bd7c8990572854afa8cf554c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:77:28', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap462e1ede-b0'}, 'message_id': 'd6c98bf4-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.740580451, 'message_signature': '41a33abd2c12cd2d047bf350ca5efbc43eca84d64a62879a733c1b1baa2819ab'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 
'1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-000000ad-9bd8796c-97a5-491a-b6b4-713222c15142-tap720d6d0b-55', 'timestamp': '2025-11-29T07:48:48.100493', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1530170034', 'name': 'tap720d6d0b-55', 'instance_id': '9bd8796c-97a5-491a-b6b4-713222c15142', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b8:79:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap720d6d0b-55'}, 'message_id': 'd6c99586-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.745418638, 'message_signature': '7a03c848379b209eca1095f06480ad23ae706d9100421c62b8682b1493cb814d'}]}, 'timestamp': '2025-11-29 07:48:48.101006', '_unique_id': '817799849fbe47dcbde141cedddc3007'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.101 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.102 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.102 12 DEBUG ceilometer.compute.pollsters [-] f18e8e64-a0ef-4c29-85d0-955b86872379/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.102 12 DEBUG ceilometer.compute.pollsters [-] 9bd8796c-97a5-491a-b6b4-713222c15142/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cd623b18-48b9-4fe8-9d35-264af5034639', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b79809b822b248ae8be15d0233f5896e', 'user_name': None, 'project_id': '220340bd80db4bf5af391eb2e4247a6c', 'project_name': None, 'resource_id': 'instance-000000aa-f18e8e64-a0ef-4c29-85d0-955b86872379-tap462e1ede-b0', 'timestamp': '2025-11-29T07:48:48.102340', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-1299020398', 'name': 'tap462e1ede-b0', 'instance_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379', 'instance_type': 'm1.nano', 'host': '27916006d701479b1bc480dc2e437f25bd7c8990572854afa8cf554c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:77:28', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap462e1ede-b0'}, 'message_id': 'd6c9d2a8-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.740580451, 'message_signature': '05ac428a5012e684e2c9b47e81ce0ebcb94694d8ec1d66ad95313e36c0a80a55'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-000000ad-9bd8796c-97a5-491a-b6b4-713222c15142-tap720d6d0b-55', 'timestamp': '2025-11-29T07:48:48.102340', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1530170034', 'name': 'tap720d6d0b-55', 'instance_id': '9bd8796c-97a5-491a-b6b4-713222c15142', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b8:79:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap720d6d0b-55'}, 'message_id': 'd6c9db04-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.745418638, 'message_signature': 'e1b44980de2f5ff8a1e1dfde7dee54260f60ecd756717cb264f6ebb49c25a3bf'}]}, 'timestamp': '2025-11-29 07:48:48.102782', '_unique_id': '56198e5eeaac40309a4a7d77d5741785'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.103 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.104 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.104 12 DEBUG ceilometer.compute.pollsters [-] f18e8e64-a0ef-4c29-85d0-955b86872379/network.incoming.bytes volume: 622 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.104 12 DEBUG ceilometer.compute.pollsters [-] 9bd8796c-97a5-491a-b6b4-713222c15142/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c3cd176f-812e-432d-a216-3dbdeb40ae77', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 622, 'user_id': 'b79809b822b248ae8be15d0233f5896e', 'user_name': None, 'project_id': '220340bd80db4bf5af391eb2e4247a6c', 'project_name': None, 'resource_id': 'instance-000000aa-f18e8e64-a0ef-4c29-85d0-955b86872379-tap462e1ede-b0', 'timestamp': '2025-11-29T07:48:48.104261', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-1299020398', 'name': 'tap462e1ede-b0', 'instance_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379', 'instance_type': 'm1.nano', 'host': '27916006d701479b1bc480dc2e437f25bd7c8990572854afa8cf554c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:77:28', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap462e1ede-b0'}, 'message_id': 'd6ca1e66-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.740580451, 'message_signature': '49998c949120eb0de7a44c53f9af8aa3aa477ea19d92e8ae75bc7a1533d3e205'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 
'1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-000000ad-9bd8796c-97a5-491a-b6b4-713222c15142-tap720d6d0b-55', 'timestamp': '2025-11-29T07:48:48.104261', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1530170034', 'name': 'tap720d6d0b-55', 'instance_id': '9bd8796c-97a5-491a-b6b4-713222c15142', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b8:79:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap720d6d0b-55'}, 'message_id': 'd6ca28de-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.745418638, 'message_signature': '71c66201479ac41c662d597efc7e98c94db55f703441d4524e6c2bd32aa1c45f'}]}, 'timestamp': '2025-11-29 07:48:48.104797', '_unique_id': '24c606cbffac4e33a7ccbb9d9f4ed02c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.105 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.106 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.106 12 DEBUG ceilometer.compute.pollsters [-] f18e8e64-a0ef-4c29-85d0-955b86872379/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.106 12 DEBUG ceilometer.compute.pollsters [-] 9bd8796c-97a5-491a-b6b4-713222c15142/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3c9b814f-7615-4e17-ab3b-625075643c14', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b79809b822b248ae8be15d0233f5896e', 'user_name': None, 'project_id': '220340bd80db4bf5af391eb2e4247a6c', 'project_name': None, 'resource_id': 'instance-000000aa-f18e8e64-a0ef-4c29-85d0-955b86872379-tap462e1ede-b0', 'timestamp': '2025-11-29T07:48:48.106065', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-1299020398', 'name': 'tap462e1ede-b0', 'instance_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379', 'instance_type': 'm1.nano', 'host': '27916006d701479b1bc480dc2e437f25bd7c8990572854afa8cf554c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:77:28', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap462e1ede-b0'}, 'message_id': 'd6ca63e4-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.740580451, 'message_signature': 'b49c23b0d44bb07d6d8f67b4cdfb150948de045101c319850142ef6b2dbf729d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-000000ad-9bd8796c-97a5-491a-b6b4-713222c15142-tap720d6d0b-55', 'timestamp': '2025-11-29T07:48:48.106065', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1530170034', 'name': 'tap720d6d0b-55', 'instance_id': '9bd8796c-97a5-491a-b6b4-713222c15142', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b8:79:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap720d6d0b-55'}, 'message_id': 'd6ca6c04-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.745418638, 'message_signature': '8b2dcb869038f3aace509685cf7190bee9724d0b0afcd8ce033be47e12721154'}]}, 'timestamp': '2025-11-29 07:48:48.106495', '_unique_id': '6838a9b5bf2f46b2837507abadd6783d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.107 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.124 12 DEBUG ceilometer.compute.pollsters [-] f18e8e64-a0ef-4c29-85d0-955b86872379/cpu volume: 11640000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.139 12 DEBUG ceilometer.compute.pollsters [-] 9bd8796c-97a5-491a-b6b4-713222c15142/cpu volume: 850000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b44f9740-2e06-47fd-b8a2-e76d933c8aca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11640000000, 'user_id': 'b79809b822b248ae8be15d0233f5896e', 'user_name': None, 'project_id': '220340bd80db4bf5af391eb2e4247a6c', 'project_name': None, 'resource_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379', 'timestamp': '2025-11-29T07:48:48.107974', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-1299020398', 'name': 'instance-000000aa', 'instance_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379', 'instance_type': 'm1.nano', 'host': '27916006d701479b1bc480dc2e437f25bd7c8990572854afa8cf554c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'd6cd4be0-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.84280388, 'message_signature': 'a16b16edcfe7c8c1dd806861363f9ab2488a99bb62f8e9f39309467da85b2fb2'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 850000000, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 
'9bd8796c-97a5-491a-b6b4-713222c15142', 'timestamp': '2025-11-29T07:48:48.107974', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1530170034', 'name': 'instance-000000ad', 'instance_id': '9bd8796c-97a5-491a-b6b4-713222c15142', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'd6cf9422-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.85790607, 'message_signature': '3eff0acebdda4e2ec1ef9dd60b7750e31a4214ccc46494e9c2921cb0e835e261'}]}, 'timestamp': '2025-11-29 07:48:48.140374', '_unique_id': 'fba0fc69c9dd40fb800a4756d085365d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.141 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.142 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.142 12 DEBUG ceilometer.compute.pollsters [-] f18e8e64-a0ef-4c29-85d0-955b86872379/disk.device.read.latency volume: 289080356 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.142 12 DEBUG ceilometer.compute.pollsters [-] f18e8e64-a0ef-4c29-85d0-955b86872379/disk.device.read.latency volume: 21202862 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.142 12 DEBUG ceilometer.compute.pollsters [-] 9bd8796c-97a5-491a-b6b4-713222c15142/disk.device.read.latency volume: 157497866 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.142 12 DEBUG ceilometer.compute.pollsters [-] 9bd8796c-97a5-491a-b6b4-713222c15142/disk.device.read.latency volume: 697039 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4499f47c-1da4-42f8-a9fe-fa97cf4d7fd4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 289080356, 'user_id': 'b79809b822b248ae8be15d0233f5896e', 'user_name': None, 'project_id': '220340bd80db4bf5af391eb2e4247a6c', 'project_name': None, 'resource_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379-vda', 'timestamp': '2025-11-29T07:48:48.142267', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-1299020398', 'name': 'instance-000000aa', 'instance_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379', 'instance_type': 'm1.nano', 'host': '27916006d701479b1bc480dc2e437f25bd7c8990572854afa8cf554c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd6cfea1c-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.750352269, 'message_signature': '4985ffc51a0fd91b23d0b2c2cf994c9fee778e609fd6942858372d9ad3f2f7c3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21202862, 'user_id': 'b79809b822b248ae8be15d0233f5896e', 'user_name': None, 'project_id': '220340bd80db4bf5af391eb2e4247a6c', 'project_name': None, 
'resource_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379-sda', 'timestamp': '2025-11-29T07:48:48.142267', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-1299020398', 'name': 'instance-000000aa', 'instance_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379', 'instance_type': 'm1.nano', 'host': '27916006d701479b1bc480dc2e437f25bd7c8990572854afa8cf554c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd6cff214-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.750352269, 'message_signature': '191fe0496c320cdb09fd0312c708ff21ecad1f232b1aea98ded76d01b04f151a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 157497866, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '9bd8796c-97a5-491a-b6b4-713222c15142-vda', 'timestamp': '2025-11-29T07:48:48.142267', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1530170034', 'name': 'instance-000000ad', 'instance_id': '9bd8796c-97a5-491a-b6b4-713222c15142', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd6cff96c-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.780567639, 'message_signature': 'ec3793e3f67852cbea9d479b7ea8e1d8ba1664c67f6709330d921a443e11ef93'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 697039, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '9bd8796c-97a5-491a-b6b4-713222c15142-sda', 'timestamp': '2025-11-29T07:48:48.142267', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1530170034', 'name': 'instance-000000ad', 'instance_id': '9bd8796c-97a5-491a-b6b4-713222c15142', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd6d00290-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.780567639, 'message_signature': '3776e4b20d76c3f4282c25af451e1d3dbc32807769231eb00e5ac8e401010985'}]}, 'timestamp': '2025-11-29 07:48:48.143110', '_unique_id': '8205d4341b864fcf9e79a4aa6173c59b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.143 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.144 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.144 12 DEBUG ceilometer.compute.pollsters [-] f18e8e64-a0ef-4c29-85d0-955b86872379/disk.device.write.requests volume: 229 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.144 12 DEBUG ceilometer.compute.pollsters [-] f18e8e64-a0ef-4c29-85d0-955b86872379/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.144 12 DEBUG ceilometer.compute.pollsters [-] 9bd8796c-97a5-491a-b6b4-713222c15142/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.144 12 DEBUG ceilometer.compute.pollsters [-] 9bd8796c-97a5-491a-b6b4-713222c15142/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '529ed4e4-ba7f-4f9e-8bcc-5b619912778c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 229, 'user_id': 'b79809b822b248ae8be15d0233f5896e', 'user_name': None, 'project_id': '220340bd80db4bf5af391eb2e4247a6c', 'project_name': None, 'resource_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379-vda', 'timestamp': '2025-11-29T07:48:48.144268', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-1299020398', 'name': 'instance-000000aa', 'instance_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379', 'instance_type': 'm1.nano', 'host': '27916006d701479b1bc480dc2e437f25bd7c8990572854afa8cf554c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd6d0380a-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.750352269, 'message_signature': '0a3eb367c112337d46f5cc145703c455911d749d94e6602f614498ae753e4837'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'b79809b822b248ae8be15d0233f5896e', 'user_name': None, 'project_id': '220340bd80db4bf5af391eb2e4247a6c', 'project_name': 
None, 'resource_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379-sda', 'timestamp': '2025-11-29T07:48:48.144268', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-1299020398', 'name': 'instance-000000aa', 'instance_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379', 'instance_type': 'm1.nano', 'host': '27916006d701479b1bc480dc2e437f25bd7c8990572854afa8cf554c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd6d04016-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.750352269, 'message_signature': '7e697a23f02ead5fb837f8ba4043dc4c77c5c1214dd23e498a6c66123ac920b3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '9bd8796c-97a5-491a-b6b4-713222c15142-vda', 'timestamp': '2025-11-29T07:48:48.144268', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1530170034', 'name': 'instance-000000ad', 'instance_id': '9bd8796c-97a5-491a-b6b4-713222c15142', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd6d047a0-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.780567639, 'message_signature': '4f6d1220bb524d75541eba02383339bc0948ef839dfe90d58ce0158fc2150a39'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '9bd8796c-97a5-491a-b6b4-713222c15142-sda', 'timestamp': '2025-11-29T07:48:48.144268', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1530170034', 'name': 'instance-000000ad', 'instance_id': '9bd8796c-97a5-491a-b6b4-713222c15142', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd6d050d8-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.780567639, 'message_signature': '4e97b39ccb4da5891c368e173b69088c57fb87a7bd5aecffc22baa07d25d85df'}]}, 'timestamp': '2025-11-29 07:48:48.145125', '_unique_id': 'd7492c78cdcb4681b8857a659ccf82c4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.145 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.146 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.146 12 DEBUG ceilometer.compute.pollsters [-] f18e8e64-a0ef-4c29-85d0-955b86872379/disk.device.write.bytes volume: 25649152 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.146 12 DEBUG ceilometer.compute.pollsters [-] f18e8e64-a0ef-4c29-85d0-955b86872379/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.146 12 DEBUG ceilometer.compute.pollsters [-] 9bd8796c-97a5-491a-b6b4-713222c15142/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 DEBUG ceilometer.compute.pollsters [-] 9bd8796c-97a5-491a-b6b4-713222c15142/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '27c083a1-3996-4cc6-8119-d19dd8b878ac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 25649152, 'user_id': 'b79809b822b248ae8be15d0233f5896e', 'user_name': None, 'project_id': '220340bd80db4bf5af391eb2e4247a6c', 'project_name': None, 'resource_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379-vda', 'timestamp': '2025-11-29T07:48:48.146351', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-1299020398', 'name': 'instance-000000aa', 'instance_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379', 'instance_type': 'm1.nano', 'host': '27916006d701479b1bc480dc2e437f25bd7c8990572854afa8cf554c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd6d08986-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.750352269, 'message_signature': '92effdccf458a270f5e4871f7aee298ac622bd33390042627b07cd7a0aa38f1f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b79809b822b248ae8be15d0233f5896e', 'user_name': None, 'project_id': '220340bd80db4bf5af391eb2e4247a6c', 'project_name': None, 
'resource_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379-sda', 'timestamp': '2025-11-29T07:48:48.146351', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-1299020398', 'name': 'instance-000000aa', 'instance_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379', 'instance_type': 'm1.nano', 'host': '27916006d701479b1bc480dc2e437f25bd7c8990572854afa8cf554c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd6d0914c-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.750352269, 'message_signature': 'e90eb3672cc4961fcddbb8a921c3efa36e1d30214d557c6a8aaf8d283b0f32eb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '9bd8796c-97a5-491a-b6b4-713222c15142-vda', 'timestamp': '2025-11-29T07:48:48.146351', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1530170034', 'name': 'instance-000000ad', 'instance_id': '9bd8796c-97a5-491a-b6b4-713222c15142', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd6d09c64-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.780567639, 'message_signature': '2773c62933a31e9e513a6e399bf79268b434b943994aa58f17423ea876c2c65f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '9bd8796c-97a5-491a-b6b4-713222c15142-sda', 'timestamp': '2025-11-29T07:48:48.146351', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1530170034', 'name': 'instance-000000ad', 'instance_id': '9bd8796c-97a5-491a-b6b4-713222c15142', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd6d0a740-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.780567639, 'message_signature': 'cf0aaf12d6b51d920acd792a27f7106d8173bba9451cec6bb1b62eec1751b037'}]}, 'timestamp': '2025-11-29 07:48:48.147327', '_unique_id': '982b7e00e0794294b1060b15c3495859'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.147 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.148 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.148 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.148 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestServerMultinode-server-1299020398>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1530170034>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestServerMultinode-server-1299020398>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1530170034>]
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.148 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 DEBUG ceilometer.compute.pollsters [-] f18e8e64-a0ef-4c29-85d0-955b86872379/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 DEBUG ceilometer.compute.pollsters [-] 9bd8796c-97a5-491a-b6b4-713222c15142/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '645be0f0-1954-4140-8597-3bca5ff59537', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b79809b822b248ae8be15d0233f5896e', 'user_name': None, 'project_id': '220340bd80db4bf5af391eb2e4247a6c', 'project_name': None, 'resource_id': 'instance-000000aa-f18e8e64-a0ef-4c29-85d0-955b86872379-tap462e1ede-b0', 'timestamp': '2025-11-29T07:48:48.149010', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-1299020398', 'name': 'tap462e1ede-b0', 'instance_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379', 'instance_type': 'm1.nano', 'host': '27916006d701479b1bc480dc2e437f25bd7c8990572854afa8cf554c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:77:28', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap462e1ede-b0'}, 'message_id': 'd6d0f1b4-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.740580451, 'message_signature': '88a400a6098bdc8c401ccf823294932609c7bb19b689304d7b08f39f28efb6b0'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-000000ad-9bd8796c-97a5-491a-b6b4-713222c15142-tap720d6d0b-55', 'timestamp': '2025-11-29T07:48:48.149010', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1530170034', 'name': 'tap720d6d0b-55', 'instance_id': '9bd8796c-97a5-491a-b6b4-713222c15142', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b8:79:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap720d6d0b-55'}, 'message_id': 'd6d0fae2-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.745418638, 'message_signature': 'e713bdf68ad185aebcc145ffb7a6ce6faa2db6d28bbca2a4c3b2483a7f78ab3c'}]}, 'timestamp': '2025-11-29 07:48:48.149478', '_unique_id': '2319c582a58d4d6ab9c851b835efaf20'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.149 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.150 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.150 12 DEBUG ceilometer.compute.pollsters [-] f18e8e64-a0ef-4c29-85d0-955b86872379/disk.device.read.requests volume: 1020 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.151 12 DEBUG ceilometer.compute.pollsters [-] f18e8e64-a0ef-4c29-85d0-955b86872379/disk.device.read.requests volume: 95 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.151 12 DEBUG ceilometer.compute.pollsters [-] 9bd8796c-97a5-491a-b6b4-713222c15142/disk.device.read.requests volume: 423 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.151 12 DEBUG ceilometer.compute.pollsters [-] 9bd8796c-97a5-491a-b6b4-713222c15142/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '84bf35f5-ef20-42c3-9ead-a40a1a3e2715', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1020, 'user_id': 'b79809b822b248ae8be15d0233f5896e', 'user_name': None, 'project_id': '220340bd80db4bf5af391eb2e4247a6c', 'project_name': None, 'resource_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379-vda', 'timestamp': '2025-11-29T07:48:48.150756', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-1299020398', 'name': 'instance-000000aa', 'instance_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379', 'instance_type': 'm1.nano', 'host': '27916006d701479b1bc480dc2e437f25bd7c8990572854afa8cf554c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd6d13728-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.750352269, 'message_signature': '923a1569418be88ef5cdabe6dc3538b8d8a5105e50cb934d15d9ba0206ff92fb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 95, 'user_id': 'b79809b822b248ae8be15d0233f5896e', 'user_name': None, 'project_id': '220340bd80db4bf5af391eb2e4247a6c', 'project_name': 
None, 'resource_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379-sda', 'timestamp': '2025-11-29T07:48:48.150756', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-1299020398', 'name': 'instance-000000aa', 'instance_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379', 'instance_type': 'm1.nano', 'host': '27916006d701479b1bc480dc2e437f25bd7c8990572854afa8cf554c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd6d13fac-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.750352269, 'message_signature': '5fc58616e8224785a6fc433975d1dbb8ad266093522173c2740cf885c1a27036'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 423, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '9bd8796c-97a5-491a-b6b4-713222c15142-vda', 'timestamp': '2025-11-29T07:48:48.150756', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1530170034', 'name': 'instance-000000ad', 'instance_id': '9bd8796c-97a5-491a-b6b4-713222c15142', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd6d1477c-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.780567639, 'message_signature': 'bc8ab9a2e2c3b350f82f68546ca91799b1278eb40abbf4c28b7961ea831e0c25'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '9bd8796c-97a5-491a-b6b4-713222c15142-sda', 'timestamp': '2025-11-29T07:48:48.150756', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1530170034', 'name': 'instance-000000ad', 'instance_id': '9bd8796c-97a5-491a-b6b4-713222c15142', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd6d14f2e-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.780567639, 'message_signature': '2eb5f0f4cdd116c99283ff4f006117a706e56f3a970e93b9a84328b16eda6e83'}]}, 'timestamp': '2025-11-29 07:48:48.151622', '_unique_id': 'ef10c27238e44aae866ab7385ccde03d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.152 12 DEBUG ceilometer.compute.pollsters [-] f18e8e64-a0ef-4c29-85d0-955b86872379/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.153 12 DEBUG ceilometer.compute.pollsters [-] 9bd8796c-97a5-491a-b6b4-713222c15142/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e48f4301-0b4e-44d9-8e2f-e5fe388b0300', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b79809b822b248ae8be15d0233f5896e', 'user_name': None, 'project_id': '220340bd80db4bf5af391eb2e4247a6c', 'project_name': None, 'resource_id': 'instance-000000aa-f18e8e64-a0ef-4c29-85d0-955b86872379-tap462e1ede-b0', 'timestamp': '2025-11-29T07:48:48.152912', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-1299020398', 'name': 'tap462e1ede-b0', 'instance_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379', 'instance_type': 'm1.nano', 'host': '27916006d701479b1bc480dc2e437f25bd7c8990572854afa8cf554c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:77:28', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap462e1ede-b0'}, 'message_id': 'd6d18b2e-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.740580451, 'message_signature': '6a862a5c3c65a391a8246657db6a982e3c6cf57871016b71a65f611212b4d1cc'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-000000ad-9bd8796c-97a5-491a-b6b4-713222c15142-tap720d6d0b-55', 'timestamp': '2025-11-29T07:48:48.152912', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1530170034', 'name': 'tap720d6d0b-55', 'instance_id': '9bd8796c-97a5-491a-b6b4-713222c15142', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b8:79:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap720d6d0b-55'}, 'message_id': 'd6d19650-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.745418638, 'message_signature': '2d6d2632db8fb045de9be8d9b16597bfa2bf47add2e5d5eb929ae61491c76fd5'}]}, 'timestamp': '2025-11-29 07:48:48.153490', '_unique_id': '13e768ce65e24af28e169445d7ea9f4b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.154 12 DEBUG ceilometer.compute.pollsters [-] f18e8e64-a0ef-4c29-85d0-955b86872379/memory.usage volume: 40.421875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 DEBUG ceilometer.compute.pollsters [-] 9bd8796c-97a5-491a-b6b4-713222c15142/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 9bd8796c-97a5-491a-b6b4-713222c15142: ceilometer.compute.pollsters.NoVolumeException
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '63054362-525c-4c2d-92ea-97930cfe302a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.421875, 'user_id': 'b79809b822b248ae8be15d0233f5896e', 'user_name': None, 'project_id': '220340bd80db4bf5af391eb2e4247a6c', 'project_name': None, 'resource_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379', 'timestamp': '2025-11-29T07:48:48.154923', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-1299020398', 'name': 'instance-000000aa', 'instance_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379', 'instance_type': 'm1.nano', 'host': '27916006d701479b1bc480dc2e437f25bd7c8990572854afa8cf554c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'd6d1d962-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.84280388, 'message_signature': '7d3066e893e0ff9fe0a56bef13173b858544518fac07b74afc1dcd8dc265c8d7'}]}, 'timestamp': '2025-11-29 07:48:48.155417', '_unique_id': '073083dd221d44f69585706eceb22e88'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.155 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.156 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.167 12 DEBUG ceilometer.compute.pollsters [-] f18e8e64-a0ef-4c29-85d0-955b86872379/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.167 12 DEBUG ceilometer.compute.pollsters [-] f18e8e64-a0ef-4c29-85d0-955b86872379/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.177 12 DEBUG ceilometer.compute.pollsters [-] 9bd8796c-97a5-491a-b6b4-713222c15142/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.177 12 DEBUG ceilometer.compute.pollsters [-] 9bd8796c-97a5-491a-b6b4-713222c15142/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '94591ac2-17b7-49ee-99af-ad8f926fd098', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'b79809b822b248ae8be15d0233f5896e', 'user_name': None, 'project_id': '220340bd80db4bf5af391eb2e4247a6c', 'project_name': None, 'resource_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379-vda', 'timestamp': '2025-11-29T07:48:48.156810', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-1299020398', 'name': 'instance-000000aa', 'instance_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379', 'instance_type': 'm1.nano', 'host': '27916006d701479b1bc480dc2e437f25bd7c8990572854afa8cf554c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd6d3bd9a-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.875081509, 'message_signature': 'ecae691915a6a4af14993672fa3958577df535d18bb7ba9a2c0b4b84af50b26e'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'b79809b822b248ae8be15d0233f5896e', 'user_name': None, 'project_id': '220340bd80db4bf5af391eb2e4247a6c', 'project_name': None, 'resource_id': 
'f18e8e64-a0ef-4c29-85d0-955b86872379-sda', 'timestamp': '2025-11-29T07:48:48.156810', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-1299020398', 'name': 'instance-000000aa', 'instance_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379', 'instance_type': 'm1.nano', 'host': '27916006d701479b1bc480dc2e437f25bd7c8990572854afa8cf554c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd6d3c826-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.875081509, 'message_signature': 'aefbfa5ad8cbad1f6ff236e5bb039b935aaaaea9d0e212eb8e12e1cca8f6c47d'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '9bd8796c-97a5-491a-b6b4-713222c15142-vda', 'timestamp': '2025-11-29T07:48:48.156810', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1530170034', 'name': 'instance-000000ad', 'instance_id': '9bd8796c-97a5-491a-b6b4-713222c15142', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd6d54bb0-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.886093012, 'message_signature': 'cfa5247e5f388f86386bf1252d32671376d5b7615172a0f61f51405f621291dd'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '9bd8796c-97a5-491a-b6b4-713222c15142-sda', 'timestamp': '2025-11-29T07:48:48.156810', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1530170034', 'name': 'instance-000000ad', 'instance_id': '9bd8796c-97a5-491a-b6b4-713222c15142', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd6d55556-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.886093012, 'message_signature': '81c85170a824db3a4d3210b971063e645a31a9b81fffaaa860940a0bbe520c3f'}]}, 'timestamp': '2025-11-29 07:48:48.178013', '_unique_id': '6060831829fd4888a16eb26e83d93d0e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.179 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.180 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.180 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestServerMultinode-server-1299020398>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1530170034>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestServerMultinode-server-1299020398>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1530170034>]
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.180 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.180 12 DEBUG ceilometer.compute.pollsters [-] f18e8e64-a0ef-4c29-85d0-955b86872379/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.180 12 DEBUG ceilometer.compute.pollsters [-] 9bd8796c-97a5-491a-b6b4-713222c15142/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '20fdbe78-8b52-461a-b7b9-f195f15ae832', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b79809b822b248ae8be15d0233f5896e', 'user_name': None, 'project_id': '220340bd80db4bf5af391eb2e4247a6c', 'project_name': None, 'resource_id': 'instance-000000aa-f18e8e64-a0ef-4c29-85d0-955b86872379-tap462e1ede-b0', 'timestamp': '2025-11-29T07:48:48.180343', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-1299020398', 'name': 'tap462e1ede-b0', 'instance_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379', 'instance_type': 'm1.nano', 'host': '27916006d701479b1bc480dc2e437f25bd7c8990572854afa8cf554c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:77:28', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap462e1ede-b0'}, 'message_id': 'd6d5b992-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.740580451, 'message_signature': '9560bf5b1ac259fda8b8bf42970a58dabb90ea4a3f31322700a9c4de29b6781e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-000000ad-9bd8796c-97a5-491a-b6b4-713222c15142-tap720d6d0b-55', 'timestamp': '2025-11-29T07:48:48.180343', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1530170034', 'name': 'tap720d6d0b-55', 'instance_id': '9bd8796c-97a5-491a-b6b4-713222c15142', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b8:79:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap720d6d0b-55'}, 'message_id': 'd6d5c4dc-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.745418638, 'message_signature': '2ecf01a0e0c1153d21a7a80bbc53ea455d21f52087cc10e91ec7d1615fb0220c'}]}, 'timestamp': '2025-11-29 07:48:48.180913', '_unique_id': 'f56a76db337f43e6acea144cc722d0f3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.181 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.182 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.182 12 DEBUG ceilometer.compute.pollsters [-] f18e8e64-a0ef-4c29-85d0-955b86872379/disk.device.write.latency volume: 26559188792 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.182 12 DEBUG ceilometer.compute.pollsters [-] f18e8e64-a0ef-4c29-85d0-955b86872379/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.182 12 DEBUG ceilometer.compute.pollsters [-] 9bd8796c-97a5-491a-b6b4-713222c15142/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.182 12 DEBUG ceilometer.compute.pollsters [-] 9bd8796c-97a5-491a-b6b4-713222c15142/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b526af26-173d-4012-a1d3-53b7007b78fc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 26559188792, 'user_id': 'b79809b822b248ae8be15d0233f5896e', 'user_name': None, 'project_id': '220340bd80db4bf5af391eb2e4247a6c', 'project_name': None, 'resource_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379-vda', 'timestamp': '2025-11-29T07:48:48.182106', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-1299020398', 'name': 'instance-000000aa', 'instance_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379', 'instance_type': 'm1.nano', 'host': '27916006d701479b1bc480dc2e437f25bd7c8990572854afa8cf554c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd6d5fe0c-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.750352269, 'message_signature': '71bc412c614217edb1a8333b2244c54898bd8770e7fbcdb55a5bb4d1df7eb89c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'b79809b822b248ae8be15d0233f5896e', 'user_name': None, 'project_id': '220340bd80db4bf5af391eb2e4247a6c', 'project_name': None, 
'resource_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379-sda', 'timestamp': '2025-11-29T07:48:48.182106', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-1299020398', 'name': 'instance-000000aa', 'instance_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379', 'instance_type': 'm1.nano', 'host': '27916006d701479b1bc480dc2e437f25bd7c8990572854afa8cf554c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd6d6056e-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.750352269, 'message_signature': '4e78e4efdfa427e6549bd8e724a6499d8ea55ab8ade8d06ab7c71cae91914b0a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '9bd8796c-97a5-491a-b6b4-713222c15142-vda', 'timestamp': '2025-11-29T07:48:48.182106', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1530170034', 'name': 'instance-000000ad', 'instance_id': '9bd8796c-97a5-491a-b6b4-713222c15142', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd6d60dca-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.780567639, 'message_signature': '715ef875e171d96a276d20e3bc6ced2ef82a70ed905e9e0ce578079169cc1299'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '9bd8796c-97a5-491a-b6b4-713222c15142-sda', 'timestamp': '2025-11-29T07:48:48.182106', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1530170034', 'name': 'instance-000000ad', 'instance_id': '9bd8796c-97a5-491a-b6b4-713222c15142', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd6d6172a-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.780567639, 'message_signature': '4234af371272dee25a4013386866470069acfbc32aa385ace33b622091e88231'}]}, 'timestamp': '2025-11-29 07:48:48.182957', '_unique_id': 'a55ac9345a494f72a0ef5c876f2d2797'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.183 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.184 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.184 12 DEBUG ceilometer.compute.pollsters [-] f18e8e64-a0ef-4c29-85d0-955b86872379/disk.device.usage volume: 28442624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.184 12 DEBUG ceilometer.compute.pollsters [-] f18e8e64-a0ef-4c29-85d0-955b86872379/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.184 12 DEBUG ceilometer.compute.pollsters [-] 9bd8796c-97a5-491a-b6b4-713222c15142/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.184 12 DEBUG ceilometer.compute.pollsters [-] 9bd8796c-97a5-491a-b6b4-713222c15142/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a5bdd879-cbdb-4a7f-ad56-926a0ff67496', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 28442624, 'user_id': 'b79809b822b248ae8be15d0233f5896e', 'user_name': None, 'project_id': '220340bd80db4bf5af391eb2e4247a6c', 'project_name': None, 'resource_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379-vda', 'timestamp': '2025-11-29T07:48:48.184213', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-1299020398', 'name': 'instance-000000aa', 'instance_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379', 'instance_type': 'm1.nano', 'host': '27916006d701479b1bc480dc2e437f25bd7c8990572854afa8cf554c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd6d65032-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.875081509, 'message_signature': '566526f887d4dc517a10740b52949b0eb429daa061c67b03f02b11c68ea4b3d1'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'b79809b822b248ae8be15d0233f5896e', 'user_name': None, 'project_id': '220340bd80db4bf5af391eb2e4247a6c', 'project_name': None, 'resource_id': 
'f18e8e64-a0ef-4c29-85d0-955b86872379-sda', 'timestamp': '2025-11-29T07:48:48.184213', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-1299020398', 'name': 'instance-000000aa', 'instance_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379', 'instance_type': 'm1.nano', 'host': '27916006d701479b1bc480dc2e437f25bd7c8990572854afa8cf554c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd6d657b2-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.875081509, 'message_signature': '5d843ec55288b6801ff3259521856f39725e64f2d1eadc3664d6c2b78437a631'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '9bd8796c-97a5-491a-b6b4-713222c15142-vda', 'timestamp': '2025-11-29T07:48:48.184213', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1530170034', 'name': 'instance-000000ad', 'instance_id': '9bd8796c-97a5-491a-b6b4-713222c15142', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 
'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd6d65fe6-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.886093012, 'message_signature': '1b2ee6efc39d3f414115f50e381ebad75c7b67ef26b1f3384050e76d6d632c5e'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '9bd8796c-97a5-491a-b6b4-713222c15142-sda', 'timestamp': '2025-11-29T07:48:48.184213', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1530170034', 'name': 'instance-000000ad', 'instance_id': '9bd8796c-97a5-491a-b6b4-713222c15142', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd6d6693c-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.886093012, 'message_signature': '3acfeeba2beebffe0ec65cdc69f514b2d23acaa1afdbe24a59125fdc2283dc78'}]}, 'timestamp': '2025-11-29 07:48:48.185090', '_unique_id': '9f306ca934fd4ec991854913ba187799'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.185 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.186 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.186 12 DEBUG ceilometer.compute.pollsters [-] f18e8e64-a0ef-4c29-85d0-955b86872379/disk.device.allocation volume: 29106176 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.186 12 DEBUG ceilometer.compute.pollsters [-] f18e8e64-a0ef-4c29-85d0-955b86872379/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 DEBUG ceilometer.compute.pollsters [-] 9bd8796c-97a5-491a-b6b4-713222c15142/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 DEBUG ceilometer.compute.pollsters [-] 9bd8796c-97a5-491a-b6b4-713222c15142/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '34d04a79-f5cb-4bea-a8ac-0160cbeea0ca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29106176, 'user_id': 'b79809b822b248ae8be15d0233f5896e', 'user_name': None, 'project_id': '220340bd80db4bf5af391eb2e4247a6c', 'project_name': None, 'resource_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379-vda', 'timestamp': '2025-11-29T07:48:48.186609', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-1299020398', 'name': 'instance-000000aa', 'instance_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379', 'instance_type': 'm1.nano', 'host': '27916006d701479b1bc480dc2e437f25bd7c8990572854afa8cf554c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd6d6ae92-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.875081509, 'message_signature': '9a6b99d05ea985317f65eb1e237ba313dc50e005996c27e5b27ab7f36173ea57'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'b79809b822b248ae8be15d0233f5896e', 'user_name': None, 'project_id': '220340bd80db4bf5af391eb2e4247a6c', 'project_name': None, 'resource_id': 
'f18e8e64-a0ef-4c29-85d0-955b86872379-sda', 'timestamp': '2025-11-29T07:48:48.186609', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-1299020398', 'name': 'instance-000000aa', 'instance_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379', 'instance_type': 'm1.nano', 'host': '27916006d701479b1bc480dc2e437f25bd7c8990572854afa8cf554c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd6d6b734-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.875081509, 'message_signature': 'abc863fe78b8a028de67824220eb3f6729bb18196ce0400cd90027dcb5fe7522'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '9bd8796c-97a5-491a-b6b4-713222c15142-vda', 'timestamp': '2025-11-29T07:48:48.186609', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1530170034', 'name': 'instance-000000ad', 'instance_id': '9bd8796c-97a5-491a-b6b4-713222c15142', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd6d6bed2-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.886093012, 'message_signature': 'f4ffceed881ff38155d75d1911ee21c11973a11578ffdbe40d06e78c351246e6'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '9bd8796c-97a5-491a-b6b4-713222c15142-sda', 'timestamp': '2025-11-29T07:48:48.186609', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1530170034', 'name': 'instance-000000ad', 'instance_id': '9bd8796c-97a5-491a-b6b4-713222c15142', 'instance_type': 'm1.nano', 'host': '2d02b3fea42d390710bdac8343d48aee40d786586a3e285958353a48', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd6d6c6c0-ccf7-11f0-8f64-fa163e220349', 'monotonic_time': 7975.886093012, 'message_signature': 'd7f9108faa423c525510aef16a45e34c1b9bad47ff978018e6db1181be06947f'}]}, 'timestamp': '2025-11-29 07:48:48.187471', '_unique_id': '35f8d6ee27124dbf95cde139fbc643eb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:48:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:48:48.187 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:48:48 compute-0 nova_compute[187185]: 2025-11-29 07:48:48.236 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:48 compute-0 nova_compute[187185]: 2025-11-29 07:48:48.262 187189 INFO nova.compute.manager [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Took 7.79 seconds to spawn the instance on the hypervisor.
Nov 29 07:48:48 compute-0 nova_compute[187185]: 2025-11-29 07:48:48.263 187189 DEBUG nova.compute.manager [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:48:48 compute-0 ovn_controller[95281]: 2025-11-29T07:48:48Z|00077|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c0:77:28 10.100.0.5
Nov 29 07:48:48 compute-0 ovn_controller[95281]: 2025-11-29T07:48:48Z|00078|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c0:77:28 10.100.0.5
Nov 29 07:48:48 compute-0 podman[246360]: 2025-11-29 07:48:48.806376294 +0000 UTC m=+0.069834969 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 07:48:49 compute-0 nova_compute[187185]: 2025-11-29 07:48:49.348 187189 DEBUG nova.compute.manager [req-fcd824b9-ea70-442c-bba7-5ae3a58b861b req-6a1f40d4-530f-4d1a-9172-791376cabb6f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Received event network-vif-plugged-720d6d0b-554f-4c56-b9e4-1de1309b83f0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:48:49 compute-0 nova_compute[187185]: 2025-11-29 07:48:49.349 187189 DEBUG oslo_concurrency.lockutils [req-fcd824b9-ea70-442c-bba7-5ae3a58b861b req-6a1f40d4-530f-4d1a-9172-791376cabb6f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "9bd8796c-97a5-491a-b6b4-713222c15142-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:48:49 compute-0 nova_compute[187185]: 2025-11-29 07:48:49.349 187189 DEBUG oslo_concurrency.lockutils [req-fcd824b9-ea70-442c-bba7-5ae3a58b861b req-6a1f40d4-530f-4d1a-9172-791376cabb6f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9bd8796c-97a5-491a-b6b4-713222c15142-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:48:49 compute-0 nova_compute[187185]: 2025-11-29 07:48:49.350 187189 DEBUG oslo_concurrency.lockutils [req-fcd824b9-ea70-442c-bba7-5ae3a58b861b req-6a1f40d4-530f-4d1a-9172-791376cabb6f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9bd8796c-97a5-491a-b6b4-713222c15142-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:48:49 compute-0 nova_compute[187185]: 2025-11-29 07:48:49.350 187189 DEBUG nova.compute.manager [req-fcd824b9-ea70-442c-bba7-5ae3a58b861b req-6a1f40d4-530f-4d1a-9172-791376cabb6f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] No waiting events found dispatching network-vif-plugged-720d6d0b-554f-4c56-b9e4-1de1309b83f0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:48:49 compute-0 nova_compute[187185]: 2025-11-29 07:48:49.350 187189 WARNING nova.compute.manager [req-fcd824b9-ea70-442c-bba7-5ae3a58b861b req-6a1f40d4-530f-4d1a-9172-791376cabb6f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Received unexpected event network-vif-plugged-720d6d0b-554f-4c56-b9e4-1de1309b83f0 for instance with vm_state active and task_state None.
Nov 29 07:48:49 compute-0 nova_compute[187185]: 2025-11-29 07:48:49.365 187189 INFO nova.compute.manager [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Took 9.41 seconds to build instance.
Nov 29 07:48:49 compute-0 nova_compute[187185]: 2025-11-29 07:48:49.391 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:49 compute-0 nova_compute[187185]: 2025-11-29 07:48:49.403 187189 DEBUG oslo_concurrency.lockutils [None req-e32d4123-321d-42ed-9ff7-e1756396f805 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "9bd8796c-97a5-491a-b6b4-713222c15142" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.517s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:48:50 compute-0 podman[246385]: 2025-11-29 07:48:50.804098101 +0000 UTC m=+0.065506585 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3)
Nov 29 07:48:50 compute-0 podman[246386]: 2025-11-29 07:48:50.812217462 +0000 UTC m=+0.071973049 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, 
config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:48:50 compute-0 sshd[128727]: drop connection #0 from [115.190.136.184]:59232 on [38.102.83.110]:22 penalty: exceeded LoginGraceTime
Nov 29 07:48:52 compute-0 nova_compute[187185]: 2025-11-29 07:48:52.089 187189 DEBUG oslo_concurrency.lockutils [None req-11d6737f-9254-41fb-8272-28171365a350 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Acquiring lock "f18e8e64-a0ef-4c29-85d0-955b86872379" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:48:52 compute-0 nova_compute[187185]: 2025-11-29 07:48:52.090 187189 DEBUG oslo_concurrency.lockutils [None req-11d6737f-9254-41fb-8272-28171365a350 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "f18e8e64-a0ef-4c29-85d0-955b86872379" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:48:52 compute-0 nova_compute[187185]: 2025-11-29 07:48:52.091 187189 DEBUG oslo_concurrency.lockutils [None req-11d6737f-9254-41fb-8272-28171365a350 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Acquiring lock "f18e8e64-a0ef-4c29-85d0-955b86872379-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:48:52 compute-0 nova_compute[187185]: 2025-11-29 07:48:52.091 187189 DEBUG oslo_concurrency.lockutils [None req-11d6737f-9254-41fb-8272-28171365a350 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "f18e8e64-a0ef-4c29-85d0-955b86872379-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:48:52 compute-0 nova_compute[187185]: 2025-11-29 07:48:52.092 187189 DEBUG oslo_concurrency.lockutils [None req-11d6737f-9254-41fb-8272-28171365a350 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "f18e8e64-a0ef-4c29-85d0-955b86872379-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:48:52 compute-0 nova_compute[187185]: 2025-11-29 07:48:52.109 187189 INFO nova.compute.manager [None req-11d6737f-9254-41fb-8272-28171365a350 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Terminating instance
Nov 29 07:48:52 compute-0 nova_compute[187185]: 2025-11-29 07:48:52.123 187189 DEBUG nova.compute.manager [None req-11d6737f-9254-41fb-8272-28171365a350 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 07:48:52 compute-0 kernel: tap462e1ede-b0 (unregistering): left promiscuous mode
Nov 29 07:48:52 compute-0 NetworkManager[55227]: <info>  [1764402532.1665] device (tap462e1ede-b0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:48:52 compute-0 ovn_controller[95281]: 2025-11-29T07:48:52Z|00578|binding|INFO|Releasing lport 462e1ede-b054-4653-9f4d-b136bdab7915 from this chassis (sb_readonly=0)
Nov 29 07:48:52 compute-0 ovn_controller[95281]: 2025-11-29T07:48:52Z|00579|binding|INFO|Setting lport 462e1ede-b054-4653-9f4d-b136bdab7915 down in Southbound
Nov 29 07:48:52 compute-0 ovn_controller[95281]: 2025-11-29T07:48:52Z|00580|binding|INFO|Removing iface tap462e1ede-b0 ovn-installed in OVS
Nov 29 07:48:52 compute-0 nova_compute[187185]: 2025-11-29 07:48:52.193 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:52 compute-0 nova_compute[187185]: 2025-11-29 07:48:52.198 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:52.206 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c0:77:28 10.100.0.5'], port_security=['fa:16:3e:c0:77:28 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'f18e8e64-a0ef-4c29-85d0-955b86872379', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '220340bd80db4bf5af391eb2e4247a6c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7d07af2a-16f6-4fe3-b2a4-ed6b96a38a93', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cbb62e23-e8c7-432f-b445-db50c529fe8e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=462e1ede-b054-4653-9f4d-b136bdab7915) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:48:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:52.209 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 462e1ede-b054-4653-9f4d-b136bdab7915 in datapath 7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d unbound from our chassis
Nov 29 07:48:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:52.213 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:48:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:52.218 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[4f3abcbc-f80e-493f-b158-4b601a91079f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:48:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:52.222 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d namespace which is not needed anymore
Nov 29 07:48:52 compute-0 nova_compute[187185]: 2025-11-29 07:48:52.221 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:52 compute-0 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d000000aa.scope: Deactivated successfully.
Nov 29 07:48:52 compute-0 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d000000aa.scope: Consumed 13.581s CPU time.
Nov 29 07:48:52 compute-0 systemd-machined[153486]: Machine qemu-66-instance-000000aa terminated.
Nov 29 07:48:52 compute-0 nova_compute[187185]: 2025-11-29 07:48:52.358 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:52 compute-0 nova_compute[187185]: 2025-11-29 07:48:52.365 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:52 compute-0 nova_compute[187185]: 2025-11-29 07:48:52.405 187189 INFO nova.virt.libvirt.driver [-] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Instance destroyed successfully.
Nov 29 07:48:52 compute-0 nova_compute[187185]: 2025-11-29 07:48:52.406 187189 DEBUG nova.objects.instance [None req-11d6737f-9254-41fb-8272-28171365a350 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lazy-loading 'resources' on Instance uuid f18e8e64-a0ef-4c29-85d0-955b86872379 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:48:52 compute-0 nova_compute[187185]: 2025-11-29 07:48:52.423 187189 DEBUG nova.virt.libvirt.vif [None req-11d6737f-9254-41fb-8272-28171365a350 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:48:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1299020398',display_name='tempest-TestServerMultinode-server-1299020398',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testservermultinode-server-1299020398',id=170,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:48:32Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='220340bd80db4bf5af391eb2e4247a6c',ramdisk_id='',reservation_id='r-t0h2676a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_mi
n_disk='1',image_min_ram='0',owner_project_name='tempest-TestServerMultinode-521650901',owner_user_name='tempest-TestServerMultinode-521650901-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:48:32Z,user_data=None,user_id='b79809b822b248ae8be15d0233f5896e',uuid=f18e8e64-a0ef-4c29-85d0-955b86872379,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "462e1ede-b054-4653-9f4d-b136bdab7915", "address": "fa:16:3e:c0:77:28", "network": {"id": "7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d", "bridge": "br-int", "label": "tempest-TestServerMultinode-1724511624-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff00c41537a48ac87c3d3460e6ff7e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap462e1ede-b0", "ovs_interfaceid": "462e1ede-b054-4653-9f4d-b136bdab7915", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:48:52 compute-0 nova_compute[187185]: 2025-11-29 07:48:52.423 187189 DEBUG nova.network.os_vif_util [None req-11d6737f-9254-41fb-8272-28171365a350 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Converting VIF {"id": "462e1ede-b054-4653-9f4d-b136bdab7915", "address": "fa:16:3e:c0:77:28", "network": {"id": "7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d", "bridge": "br-int", "label": "tempest-TestServerMultinode-1724511624-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff00c41537a48ac87c3d3460e6ff7e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap462e1ede-b0", "ovs_interfaceid": "462e1ede-b054-4653-9f4d-b136bdab7915", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:48:52 compute-0 nova_compute[187185]: 2025-11-29 07:48:52.424 187189 DEBUG nova.network.os_vif_util [None req-11d6737f-9254-41fb-8272-28171365a350 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c0:77:28,bridge_name='br-int',has_traffic_filtering=True,id=462e1ede-b054-4653-9f4d-b136bdab7915,network=Network(7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap462e1ede-b0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:48:52 compute-0 nova_compute[187185]: 2025-11-29 07:48:52.425 187189 DEBUG os_vif [None req-11d6737f-9254-41fb-8272-28171365a350 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c0:77:28,bridge_name='br-int',has_traffic_filtering=True,id=462e1ede-b054-4653-9f4d-b136bdab7915,network=Network(7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap462e1ede-b0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:48:52 compute-0 nova_compute[187185]: 2025-11-29 07:48:52.427 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:52 compute-0 nova_compute[187185]: 2025-11-29 07:48:52.428 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap462e1ede-b0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:48:52 compute-0 nova_compute[187185]: 2025-11-29 07:48:52.429 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:52 compute-0 nova_compute[187185]: 2025-11-29 07:48:52.431 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:52 compute-0 nova_compute[187185]: 2025-11-29 07:48:52.434 187189 INFO os_vif [None req-11d6737f-9254-41fb-8272-28171365a350 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c0:77:28,bridge_name='br-int',has_traffic_filtering=True,id=462e1ede-b054-4653-9f4d-b136bdab7915,network=Network(7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap462e1ede-b0')
Nov 29 07:48:52 compute-0 nova_compute[187185]: 2025-11-29 07:48:52.435 187189 INFO nova.virt.libvirt.driver [None req-11d6737f-9254-41fb-8272-28171365a350 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Deleting instance files /var/lib/nova/instances/f18e8e64-a0ef-4c29-85d0-955b86872379_del
Nov 29 07:48:52 compute-0 nova_compute[187185]: 2025-11-29 07:48:52.436 187189 INFO nova.virt.libvirt.driver [None req-11d6737f-9254-41fb-8272-28171365a350 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Deletion of /var/lib/nova/instances/f18e8e64-a0ef-4c29-85d0-955b86872379_del complete
Nov 29 07:48:52 compute-0 nova_compute[187185]: 2025-11-29 07:48:52.536 187189 INFO nova.compute.manager [None req-11d6737f-9254-41fb-8272-28171365a350 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Took 0.41 seconds to destroy the instance on the hypervisor.
Nov 29 07:48:52 compute-0 nova_compute[187185]: 2025-11-29 07:48:52.537 187189 DEBUG oslo.service.loopingcall [None req-11d6737f-9254-41fb-8272-28171365a350 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 07:48:52 compute-0 nova_compute[187185]: 2025-11-29 07:48:52.538 187189 DEBUG nova.compute.manager [-] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 07:48:52 compute-0 nova_compute[187185]: 2025-11-29 07:48:52.538 187189 DEBUG nova.network.neutron [-] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 07:48:52 compute-0 neutron-haproxy-ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d[246159]: [NOTICE]   (246163) : haproxy version is 2.8.14-c23fe91
Nov 29 07:48:52 compute-0 neutron-haproxy-ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d[246159]: [NOTICE]   (246163) : path to executable is /usr/sbin/haproxy
Nov 29 07:48:52 compute-0 neutron-haproxy-ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d[246159]: [WARNING]  (246163) : Exiting Master process...
Nov 29 07:48:52 compute-0 neutron-haproxy-ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d[246159]: [WARNING]  (246163) : Exiting Master process...
Nov 29 07:48:52 compute-0 neutron-haproxy-ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d[246159]: [ALERT]    (246163) : Current worker (246165) exited with code 143 (Terminated)
Nov 29 07:48:52 compute-0 neutron-haproxy-ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d[246159]: [WARNING]  (246163) : All workers exited. Exiting... (0)
Nov 29 07:48:52 compute-0 systemd[1]: libpod-83057635f63208d865cee3182e0a568e7345ab5d2fc595b6210304beb14674f8.scope: Deactivated successfully.
Nov 29 07:48:52 compute-0 podman[246446]: 2025-11-29 07:48:52.984992593 +0000 UTC m=+0.611378792 container died 83057635f63208d865cee3182e0a568e7345ab5d2fc595b6210304beb14674f8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:48:53 compute-0 nova_compute[187185]: 2025-11-29 07:48:53.037 187189 DEBUG nova.network.neutron [-] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:48:53 compute-0 nova_compute[187185]: 2025-11-29 07:48:53.058 187189 INFO nova.compute.manager [-] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Took 0.52 seconds to deallocate network for instance.
Nov 29 07:48:53 compute-0 nova_compute[187185]: 2025-11-29 07:48:53.129 187189 DEBUG nova.compute.manager [req-79ae760c-f9a6-4c7c-9488-4b1b278664e0 req-2d28f4a0-6c2b-4cb0-8295-39e74428b28e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Received event network-vif-unplugged-462e1ede-b054-4653-9f4d-b136bdab7915 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:48:53 compute-0 nova_compute[187185]: 2025-11-29 07:48:53.130 187189 DEBUG oslo_concurrency.lockutils [req-79ae760c-f9a6-4c7c-9488-4b1b278664e0 req-2d28f4a0-6c2b-4cb0-8295-39e74428b28e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "f18e8e64-a0ef-4c29-85d0-955b86872379-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:48:53 compute-0 nova_compute[187185]: 2025-11-29 07:48:53.131 187189 DEBUG oslo_concurrency.lockutils [req-79ae760c-f9a6-4c7c-9488-4b1b278664e0 req-2d28f4a0-6c2b-4cb0-8295-39e74428b28e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f18e8e64-a0ef-4c29-85d0-955b86872379-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:48:53 compute-0 nova_compute[187185]: 2025-11-29 07:48:53.131 187189 DEBUG oslo_concurrency.lockutils [req-79ae760c-f9a6-4c7c-9488-4b1b278664e0 req-2d28f4a0-6c2b-4cb0-8295-39e74428b28e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f18e8e64-a0ef-4c29-85d0-955b86872379-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:48:53 compute-0 nova_compute[187185]: 2025-11-29 07:48:53.132 187189 DEBUG nova.compute.manager [req-79ae760c-f9a6-4c7c-9488-4b1b278664e0 req-2d28f4a0-6c2b-4cb0-8295-39e74428b28e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] No waiting events found dispatching network-vif-unplugged-462e1ede-b054-4653-9f4d-b136bdab7915 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:48:53 compute-0 nova_compute[187185]: 2025-11-29 07:48:53.132 187189 WARNING nova.compute.manager [req-79ae760c-f9a6-4c7c-9488-4b1b278664e0 req-2d28f4a0-6c2b-4cb0-8295-39e74428b28e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Received unexpected event network-vif-unplugged-462e1ede-b054-4653-9f4d-b136bdab7915 for instance with vm_state deleted and task_state None.
Nov 29 07:48:53 compute-0 nova_compute[187185]: 2025-11-29 07:48:53.132 187189 DEBUG nova.compute.manager [req-79ae760c-f9a6-4c7c-9488-4b1b278664e0 req-2d28f4a0-6c2b-4cb0-8295-39e74428b28e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Received event network-vif-plugged-462e1ede-b054-4653-9f4d-b136bdab7915 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:48:53 compute-0 nova_compute[187185]: 2025-11-29 07:48:53.132 187189 DEBUG oslo_concurrency.lockutils [req-79ae760c-f9a6-4c7c-9488-4b1b278664e0 req-2d28f4a0-6c2b-4cb0-8295-39e74428b28e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "f18e8e64-a0ef-4c29-85d0-955b86872379-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:48:53 compute-0 nova_compute[187185]: 2025-11-29 07:48:53.133 187189 DEBUG oslo_concurrency.lockutils [req-79ae760c-f9a6-4c7c-9488-4b1b278664e0 req-2d28f4a0-6c2b-4cb0-8295-39e74428b28e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f18e8e64-a0ef-4c29-85d0-955b86872379-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:48:53 compute-0 nova_compute[187185]: 2025-11-29 07:48:53.133 187189 DEBUG oslo_concurrency.lockutils [req-79ae760c-f9a6-4c7c-9488-4b1b278664e0 req-2d28f4a0-6c2b-4cb0-8295-39e74428b28e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f18e8e64-a0ef-4c29-85d0-955b86872379-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:48:53 compute-0 nova_compute[187185]: 2025-11-29 07:48:53.133 187189 DEBUG nova.compute.manager [req-79ae760c-f9a6-4c7c-9488-4b1b278664e0 req-2d28f4a0-6c2b-4cb0-8295-39e74428b28e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] No waiting events found dispatching network-vif-plugged-462e1ede-b054-4653-9f4d-b136bdab7915 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:48:53 compute-0 nova_compute[187185]: 2025-11-29 07:48:53.133 187189 WARNING nova.compute.manager [req-79ae760c-f9a6-4c7c-9488-4b1b278664e0 req-2d28f4a0-6c2b-4cb0-8295-39e74428b28e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Received unexpected event network-vif-plugged-462e1ede-b054-4653-9f4d-b136bdab7915 for instance with vm_state deleted and task_state None.
Nov 29 07:48:53 compute-0 nova_compute[187185]: 2025-11-29 07:48:53.134 187189 DEBUG nova.compute.manager [req-79ae760c-f9a6-4c7c-9488-4b1b278664e0 req-2d28f4a0-6c2b-4cb0-8295-39e74428b28e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Received event network-vif-deleted-462e1ede-b054-4653-9f4d-b136bdab7915 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:48:53 compute-0 nova_compute[187185]: 2025-11-29 07:48:53.136 187189 DEBUG oslo_concurrency.lockutils [None req-11d6737f-9254-41fb-8272-28171365a350 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:48:53 compute-0 nova_compute[187185]: 2025-11-29 07:48:53.136 187189 DEBUG oslo_concurrency.lockutils [None req-11d6737f-9254-41fb-8272-28171365a350 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:48:53 compute-0 nova_compute[187185]: 2025-11-29 07:48:53.213 187189 DEBUG nova.compute.provider_tree [None req-11d6737f-9254-41fb-8272-28171365a350 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:48:53 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-83057635f63208d865cee3182e0a568e7345ab5d2fc595b6210304beb14674f8-userdata-shm.mount: Deactivated successfully.
Nov 29 07:48:53 compute-0 systemd[1]: var-lib-containers-storage-overlay-3791371261b83bd234d0aee3a9d289b8ab68200bd8c9b66619176c6a0ac9c8d2-merged.mount: Deactivated successfully.
Nov 29 07:48:53 compute-0 nova_compute[187185]: 2025-11-29 07:48:53.232 187189 DEBUG nova.scheduler.client.report [None req-11d6737f-9254-41fb-8272-28171365a350 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:48:53 compute-0 nova_compute[187185]: 2025-11-29 07:48:53.238 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:53 compute-0 nova_compute[187185]: 2025-11-29 07:48:53.289 187189 DEBUG oslo_concurrency.lockutils [None req-11d6737f-9254-41fb-8272-28171365a350 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.152s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:48:53 compute-0 nova_compute[187185]: 2025-11-29 07:48:53.319 187189 INFO nova.scheduler.client.report [None req-11d6737f-9254-41fb-8272-28171365a350 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Deleted allocations for instance f18e8e64-a0ef-4c29-85d0-955b86872379
Nov 29 07:48:53 compute-0 podman[246446]: 2025-11-29 07:48:53.342136188 +0000 UTC m=+0.968522357 container cleanup 83057635f63208d865cee3182e0a568e7345ab5d2fc595b6210304beb14674f8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 29 07:48:53 compute-0 systemd[1]: libpod-conmon-83057635f63208d865cee3182e0a568e7345ab5d2fc595b6210304beb14674f8.scope: Deactivated successfully.
Nov 29 07:48:53 compute-0 nova_compute[187185]: 2025-11-29 07:48:53.395 187189 DEBUG oslo_concurrency.lockutils [None req-11d6737f-9254-41fb-8272-28171365a350 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "f18e8e64-a0ef-4c29-85d0-955b86872379" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.305s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:48:53 compute-0 podman[246489]: 2025-11-29 07:48:53.484084858 +0000 UTC m=+0.113446090 container remove 83057635f63208d865cee3182e0a568e7345ab5d2fc595b6210304beb14674f8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:48:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:53.492 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[4a41664a-22f9-45c1-8789-4bb15151c0a2]: (4, ('Sat Nov 29 07:48:52 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d (83057635f63208d865cee3182e0a568e7345ab5d2fc595b6210304beb14674f8)\n83057635f63208d865cee3182e0a568e7345ab5d2fc595b6210304beb14674f8\nSat Nov 29 07:48:53 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d (83057635f63208d865cee3182e0a568e7345ab5d2fc595b6210304beb14674f8)\n83057635f63208d865cee3182e0a568e7345ab5d2fc595b6210304beb14674f8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:48:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:53.494 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[dcb1743a-d8c9-483e-a1f5-4ff35a3c67fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:48:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:53.495 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7fbe5e7f-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:48:53 compute-0 nova_compute[187185]: 2025-11-29 07:48:53.497 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:53 compute-0 kernel: tap7fbe5e7f-50: left promiscuous mode
Nov 29 07:48:53 compute-0 nova_compute[187185]: 2025-11-29 07:48:53.500 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:53.503 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[ac1f0843-497f-4534-8a0e-7818cba803d1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:48:53 compute-0 nova_compute[187185]: 2025-11-29 07:48:53.514 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:53.528 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[5bee7cf0-d9bb-4e70-9e6b-295d7440e03a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:48:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:53.530 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[dfd07c0f-8f3f-4ffb-be10-2fc686dde9cd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:48:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:53.551 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[dfcda702-04d8-401a-9432-e66c2d51c537]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 795852, 'reachable_time': 19392, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246506, 'error': None, 'target': 'ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:48:53 compute-0 systemd[1]: run-netns-ovnmeta\x2d7fbe5e7f\x2d5bf0\x2d42e8\x2d9d22\x2dc7ee6968433d.mount: Deactivated successfully.
Nov 29 07:48:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:53.557 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:48:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:48:53.557 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[7ceb8cf2-f263-4867-9017-3e1ae29cb501]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:48:55 compute-0 nova_compute[187185]: 2025-11-29 07:48:55.249 187189 DEBUG nova.compute.manager [req-96463452-1d52-4703-bc4e-dbca8655cf15 req-4d2fcaff-b383-4043-9404-e01f5fc2ae0f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Received event network-changed-720d6d0b-554f-4c56-b9e4-1de1309b83f0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:48:55 compute-0 nova_compute[187185]: 2025-11-29 07:48:55.249 187189 DEBUG nova.compute.manager [req-96463452-1d52-4703-bc4e-dbca8655cf15 req-4d2fcaff-b383-4043-9404-e01f5fc2ae0f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Refreshing instance network info cache due to event network-changed-720d6d0b-554f-4c56-b9e4-1de1309b83f0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:48:55 compute-0 nova_compute[187185]: 2025-11-29 07:48:55.249 187189 DEBUG oslo_concurrency.lockutils [req-96463452-1d52-4703-bc4e-dbca8655cf15 req-4d2fcaff-b383-4043-9404-e01f5fc2ae0f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-9bd8796c-97a5-491a-b6b4-713222c15142" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:48:55 compute-0 nova_compute[187185]: 2025-11-29 07:48:55.249 187189 DEBUG oslo_concurrency.lockutils [req-96463452-1d52-4703-bc4e-dbca8655cf15 req-4d2fcaff-b383-4043-9404-e01f5fc2ae0f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-9bd8796c-97a5-491a-b6b4-713222c15142" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:48:55 compute-0 nova_compute[187185]: 2025-11-29 07:48:55.249 187189 DEBUG nova.network.neutron [req-96463452-1d52-4703-bc4e-dbca8655cf15 req-4d2fcaff-b383-4043-9404-e01f5fc2ae0f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Refreshing network info cache for port 720d6d0b-554f-4c56-b9e4-1de1309b83f0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:48:57 compute-0 nova_compute[187185]: 2025-11-29 07:48:57.310 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:48:57 compute-0 nova_compute[187185]: 2025-11-29 07:48:57.430 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:48:57 compute-0 nova_compute[187185]: 2025-11-29 07:48:57.807 187189 DEBUG nova.network.neutron [req-96463452-1d52-4703-bc4e-dbca8655cf15 req-4d2fcaff-b383-4043-9404-e01f5fc2ae0f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Updated VIF entry in instance network info cache for port 720d6d0b-554f-4c56-b9e4-1de1309b83f0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:48:57 compute-0 nova_compute[187185]: 2025-11-29 07:48:57.808 187189 DEBUG nova.network.neutron [req-96463452-1d52-4703-bc4e-dbca8655cf15 req-4d2fcaff-b383-4043-9404-e01f5fc2ae0f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Updating instance_info_cache with network_info: [{"id": "720d6d0b-554f-4c56-b9e4-1de1309b83f0", "address": "fa:16:3e:b8:79:f5", "network": {"id": "cfd1ce3c-e516-46ef-8712-573fe4de8313", "bridge": "br-int", "label": "tempest-network-smoke--423607195", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap720d6d0b-55", "ovs_interfaceid": "720d6d0b-554f-4c56-b9e4-1de1309b83f0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:48:58 compute-0 nova_compute[187185]: 2025-11-29 07:48:58.241 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:49:00 compute-0 ovn_controller[95281]: 2025-11-29T07:49:00Z|00079|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b8:79:f5 10.100.0.9
Nov 29 07:49:00 compute-0 ovn_controller[95281]: 2025-11-29T07:49:00Z|00080|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b8:79:f5 10.100.0.9
Nov 29 07:49:01 compute-0 nova_compute[187185]: 2025-11-29 07:49:01.717 187189 DEBUG oslo_concurrency.lockutils [req-96463452-1d52-4703-bc4e-dbca8655cf15 req-4d2fcaff-b383-4043-9404-e01f5fc2ae0f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-9bd8796c-97a5-491a-b6b4-713222c15142" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:49:01 compute-0 podman[246522]: 2025-11-29 07:49:01.82181272 +0000 UTC m=+0.070658042 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 07:49:01 compute-0 podman[246521]: 2025-11-29 07:49:01.833110562 +0000 UTC m=+0.083202219 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, version=9.6, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, io.buildah.version=1.33.7)
Nov 29 07:49:01 compute-0 podman[246520]: 2025-11-29 07:49:01.843255391 +0000 UTC m=+0.097510907 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 29 07:49:02 compute-0 nova_compute[187185]: 2025-11-29 07:49:02.433 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:49:03 compute-0 nova_compute[187185]: 2025-11-29 07:49:03.244 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:49:04 compute-0 ovn_controller[95281]: 2025-11-29T07:49:04Z|00581|binding|INFO|Releasing lport 7f4b3b3b-6ee7-4970-9e8f-3e592045a366 from this chassis (sb_readonly=0)
Nov 29 07:49:04 compute-0 nova_compute[187185]: 2025-11-29 07:49:04.630 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:49:07 compute-0 nova_compute[187185]: 2025-11-29 07:49:07.403 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764402532.4009483, f18e8e64-a0ef-4c29-85d0-955b86872379 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:49:07 compute-0 nova_compute[187185]: 2025-11-29 07:49:07.404 187189 INFO nova.compute.manager [-] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] VM Stopped (Lifecycle Event)
Nov 29 07:49:07 compute-0 nova_compute[187185]: 2025-11-29 07:49:07.440 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:49:07 compute-0 nova_compute[187185]: 2025-11-29 07:49:07.449 187189 DEBUG nova.compute.manager [None req-bda28502-85f8-4758-b779-ad681cfe9260 - - - - - -] [instance: f18e8e64-a0ef-4c29-85d0-955b86872379] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:49:08 compute-0 nova_compute[187185]: 2025-11-29 07:49:08.246 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:49:10 compute-0 sshd-session[246584]: Invalid user dangulo from 190.181.27.27 port 54332
Nov 29 07:49:10 compute-0 sshd-session[246584]: Received disconnect from 190.181.27.27 port 54332:11: Bye Bye [preauth]
Nov 29 07:49:10 compute-0 sshd-session[246584]: Disconnected from invalid user dangulo 190.181.27.27 port 54332 [preauth]
Nov 29 07:49:10 compute-0 podman[246586]: 2025-11-29 07:49:10.947065088 +0000 UTC m=+0.194879728 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Nov 29 07:49:12 compute-0 nova_compute[187185]: 2025-11-29 07:49:12.443 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:49:13 compute-0 nova_compute[187185]: 2025-11-29 07:49:13.250 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:49:14 compute-0 ovn_controller[95281]: 2025-11-29T07:49:14Z|00582|binding|INFO|Releasing lport 7f4b3b3b-6ee7-4970-9e8f-3e592045a366 from this chassis (sb_readonly=0)
Nov 29 07:49:14 compute-0 nova_compute[187185]: 2025-11-29 07:49:14.166 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:49:17 compute-0 nova_compute[187185]: 2025-11-29 07:49:17.179 187189 DEBUG nova.compute.manager [req-fdabac2f-addd-482e-9967-312995a7f788 req-41571c6f-2466-4ce3-8fa6-e3b903a72368 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Received event network-changed-720d6d0b-554f-4c56-b9e4-1de1309b83f0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:49:17 compute-0 nova_compute[187185]: 2025-11-29 07:49:17.179 187189 DEBUG nova.compute.manager [req-fdabac2f-addd-482e-9967-312995a7f788 req-41571c6f-2466-4ce3-8fa6-e3b903a72368 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Refreshing instance network info cache due to event network-changed-720d6d0b-554f-4c56-b9e4-1de1309b83f0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:49:17 compute-0 nova_compute[187185]: 2025-11-29 07:49:17.179 187189 DEBUG oslo_concurrency.lockutils [req-fdabac2f-addd-482e-9967-312995a7f788 req-41571c6f-2466-4ce3-8fa6-e3b903a72368 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-9bd8796c-97a5-491a-b6b4-713222c15142" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:49:17 compute-0 nova_compute[187185]: 2025-11-29 07:49:17.180 187189 DEBUG oslo_concurrency.lockutils [req-fdabac2f-addd-482e-9967-312995a7f788 req-41571c6f-2466-4ce3-8fa6-e3b903a72368 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-9bd8796c-97a5-491a-b6b4-713222c15142" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:49:17 compute-0 nova_compute[187185]: 2025-11-29 07:49:17.180 187189 DEBUG nova.network.neutron [req-fdabac2f-addd-482e-9967-312995a7f788 req-41571c6f-2466-4ce3-8fa6-e3b903a72368 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Refreshing network info cache for port 720d6d0b-554f-4c56-b9e4-1de1309b83f0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:49:17 compute-0 nova_compute[187185]: 2025-11-29 07:49:17.379 187189 DEBUG oslo_concurrency.lockutils [None req-8226c857-ac84-48e5-87fe-ddf705ebf951 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "9bd8796c-97a5-491a-b6b4-713222c15142" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:49:17 compute-0 nova_compute[187185]: 2025-11-29 07:49:17.380 187189 DEBUG oslo_concurrency.lockutils [None req-8226c857-ac84-48e5-87fe-ddf705ebf951 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "9bd8796c-97a5-491a-b6b4-713222c15142" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:49:17 compute-0 nova_compute[187185]: 2025-11-29 07:49:17.380 187189 DEBUG oslo_concurrency.lockutils [None req-8226c857-ac84-48e5-87fe-ddf705ebf951 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "9bd8796c-97a5-491a-b6b4-713222c15142-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:49:17 compute-0 nova_compute[187185]: 2025-11-29 07:49:17.380 187189 DEBUG oslo_concurrency.lockutils [None req-8226c857-ac84-48e5-87fe-ddf705ebf951 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "9bd8796c-97a5-491a-b6b4-713222c15142-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:49:17 compute-0 nova_compute[187185]: 2025-11-29 07:49:17.381 187189 DEBUG oslo_concurrency.lockutils [None req-8226c857-ac84-48e5-87fe-ddf705ebf951 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "9bd8796c-97a5-491a-b6b4-713222c15142-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:49:17 compute-0 nova_compute[187185]: 2025-11-29 07:49:17.395 187189 INFO nova.compute.manager [None req-8226c857-ac84-48e5-87fe-ddf705ebf951 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Terminating instance
Nov 29 07:49:17 compute-0 nova_compute[187185]: 2025-11-29 07:49:17.408 187189 DEBUG nova.compute.manager [None req-8226c857-ac84-48e5-87fe-ddf705ebf951 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 07:49:17 compute-0 kernel: tap720d6d0b-55 (unregistering): left promiscuous mode
Nov 29 07:49:17 compute-0 NetworkManager[55227]: <info>  [1764402557.4327] device (tap720d6d0b-55): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:49:17 compute-0 ovn_controller[95281]: 2025-11-29T07:49:17Z|00583|binding|INFO|Releasing lport 720d6d0b-554f-4c56-b9e4-1de1309b83f0 from this chassis (sb_readonly=0)
Nov 29 07:49:17 compute-0 ovn_controller[95281]: 2025-11-29T07:49:17Z|00584|binding|INFO|Setting lport 720d6d0b-554f-4c56-b9e4-1de1309b83f0 down in Southbound
Nov 29 07:49:17 compute-0 ovn_controller[95281]: 2025-11-29T07:49:17Z|00585|binding|INFO|Removing iface tap720d6d0b-55 ovn-installed in OVS
Nov 29 07:49:17 compute-0 nova_compute[187185]: 2025-11-29 07:49:17.441 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:49:17 compute-0 nova_compute[187185]: 2025-11-29 07:49:17.446 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:49:17 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:49:17.449 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b8:79:f5 10.100.0.9'], port_security=['fa:16:3e:b8:79:f5 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '9bd8796c-97a5-491a-b6b4-713222c15142', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cfd1ce3c-e516-46ef-8712-573fe4de8313', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ec8b80be17a14d1caf666636283749d0', 'neutron:revision_number': '4', 'neutron:security_group_ids': '87a5a543-5e79-469e-89a8-5c2f146e65d2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8074c60c-bc9e-40bb-8493-fc40fe113e9f, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=720d6d0b-554f-4c56-b9e4-1de1309b83f0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:49:17 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:49:17.451 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 720d6d0b-554f-4c56-b9e4-1de1309b83f0 in datapath cfd1ce3c-e516-46ef-8712-573fe4de8313 unbound from our chassis
Nov 29 07:49:17 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:49:17.452 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cfd1ce3c-e516-46ef-8712-573fe4de8313, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:49:17 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:49:17.454 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[b09618f9-ec18-467e-bcba-ef09c1c9e787]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:49:17 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:49:17.455 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-cfd1ce3c-e516-46ef-8712-573fe4de8313 namespace which is not needed anymore
Nov 29 07:49:17 compute-0 nova_compute[187185]: 2025-11-29 07:49:17.465 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:49:17 compute-0 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d000000ad.scope: Deactivated successfully.
Nov 29 07:49:17 compute-0 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d000000ad.scope: Consumed 13.947s CPU time.
Nov 29 07:49:17 compute-0 systemd-machined[153486]: Machine qemu-67-instance-000000ad terminated.
Nov 29 07:49:17 compute-0 neutron-haproxy-ovnmeta-cfd1ce3c-e516-46ef-8712-573fe4de8313[246345]: [NOTICE]   (246349) : haproxy version is 2.8.14-c23fe91
Nov 29 07:49:17 compute-0 neutron-haproxy-ovnmeta-cfd1ce3c-e516-46ef-8712-573fe4de8313[246345]: [NOTICE]   (246349) : path to executable is /usr/sbin/haproxy
Nov 29 07:49:17 compute-0 neutron-haproxy-ovnmeta-cfd1ce3c-e516-46ef-8712-573fe4de8313[246345]: [WARNING]  (246349) : Exiting Master process...
Nov 29 07:49:17 compute-0 neutron-haproxy-ovnmeta-cfd1ce3c-e516-46ef-8712-573fe4de8313[246345]: [ALERT]    (246349) : Current worker (246351) exited with code 143 (Terminated)
Nov 29 07:49:17 compute-0 neutron-haproxy-ovnmeta-cfd1ce3c-e516-46ef-8712-573fe4de8313[246345]: [WARNING]  (246349) : All workers exited. Exiting... (0)
Nov 29 07:49:17 compute-0 systemd[1]: libpod-71f872cde640e6b41c5669a66e6e4de4ce2c060ccbb4afb1cbafee44c1100364.scope: Deactivated successfully.
Nov 29 07:49:17 compute-0 podman[246639]: 2025-11-29 07:49:17.645572476 +0000 UTC m=+0.064506457 container died 71f872cde640e6b41c5669a66e6e4de4ce2c060ccbb4afb1cbafee44c1100364 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cfd1ce3c-e516-46ef-8712-573fe4de8313, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 07:49:17 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-71f872cde640e6b41c5669a66e6e4de4ce2c060ccbb4afb1cbafee44c1100364-userdata-shm.mount: Deactivated successfully.
Nov 29 07:49:17 compute-0 systemd[1]: var-lib-containers-storage-overlay-43638b97acd5e63f213daee811f1dced2719c42441c70e3046234e2c17d43dbd-merged.mount: Deactivated successfully.
Nov 29 07:49:17 compute-0 podman[246639]: 2025-11-29 07:49:17.689292561 +0000 UTC m=+0.108226532 container cleanup 71f872cde640e6b41c5669a66e6e4de4ce2c060ccbb4afb1cbafee44c1100364 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cfd1ce3c-e516-46ef-8712-573fe4de8313, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 07:49:17 compute-0 nova_compute[187185]: 2025-11-29 07:49:17.694 187189 INFO nova.virt.libvirt.driver [-] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Instance destroyed successfully.
Nov 29 07:49:17 compute-0 nova_compute[187185]: 2025-11-29 07:49:17.695 187189 DEBUG nova.objects.instance [None req-8226c857-ac84-48e5-87fe-ddf705ebf951 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'resources' on Instance uuid 9bd8796c-97a5-491a-b6b4-713222c15142 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:49:17 compute-0 systemd[1]: libpod-conmon-71f872cde640e6b41c5669a66e6e4de4ce2c060ccbb4afb1cbafee44c1100364.scope: Deactivated successfully.
Nov 29 07:49:17 compute-0 nova_compute[187185]: 2025-11-29 07:49:17.711 187189 DEBUG nova.virt.libvirt.vif [None req-8226c857-ac84-48e5-87fe-ddf705ebf951 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:48:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1530170034',display_name='tempest-TestNetworkBasicOps-server-1530170034',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1530170034',id=173,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCsZ1YtekqB3TYskLMIUT4bWP/hYgw9UzbfG3Gvozen43xm6R9wLUZYbDGo0BSiqoLWRVB1FDlITni1cXBB1DxXKDkaeEvAx/GpXX+X+BSa0WJ777uIPLhKFYfzCgERKfg==',key_name='tempest-TestNetworkBasicOps-905637856',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:48:48Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-dq7pvgw5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:48:49Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=9bd8796c-97a5-491a-b6b4-713222c15142,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "720d6d0b-554f-4c56-b9e4-1de1309b83f0", "address": "fa:16:3e:b8:79:f5", "network": {"id": "cfd1ce3c-e516-46ef-8712-573fe4de8313", "bridge": "br-int", "label": "tempest-network-smoke--423607195", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap720d6d0b-55", "ovs_interfaceid": "720d6d0b-554f-4c56-b9e4-1de1309b83f0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:49:17 compute-0 nova_compute[187185]: 2025-11-29 07:49:17.712 187189 DEBUG nova.network.os_vif_util [None req-8226c857-ac84-48e5-87fe-ddf705ebf951 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "720d6d0b-554f-4c56-b9e4-1de1309b83f0", "address": "fa:16:3e:b8:79:f5", "network": {"id": "cfd1ce3c-e516-46ef-8712-573fe4de8313", "bridge": "br-int", "label": "tempest-network-smoke--423607195", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap720d6d0b-55", "ovs_interfaceid": "720d6d0b-554f-4c56-b9e4-1de1309b83f0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:49:17 compute-0 nova_compute[187185]: 2025-11-29 07:49:17.713 187189 DEBUG nova.network.os_vif_util [None req-8226c857-ac84-48e5-87fe-ddf705ebf951 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b8:79:f5,bridge_name='br-int',has_traffic_filtering=True,id=720d6d0b-554f-4c56-b9e4-1de1309b83f0,network=Network(cfd1ce3c-e516-46ef-8712-573fe4de8313),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap720d6d0b-55') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:49:17 compute-0 nova_compute[187185]: 2025-11-29 07:49:17.713 187189 DEBUG os_vif [None req-8226c857-ac84-48e5-87fe-ddf705ebf951 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b8:79:f5,bridge_name='br-int',has_traffic_filtering=True,id=720d6d0b-554f-4c56-b9e4-1de1309b83f0,network=Network(cfd1ce3c-e516-46ef-8712-573fe4de8313),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap720d6d0b-55') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:49:17 compute-0 nova_compute[187185]: 2025-11-29 07:49:17.715 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:49:17 compute-0 nova_compute[187185]: 2025-11-29 07:49:17.715 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap720d6d0b-55, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:49:17 compute-0 nova_compute[187185]: 2025-11-29 07:49:17.717 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:49:17 compute-0 nova_compute[187185]: 2025-11-29 07:49:17.718 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:49:17 compute-0 nova_compute[187185]: 2025-11-29 07:49:17.721 187189 INFO os_vif [None req-8226c857-ac84-48e5-87fe-ddf705ebf951 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b8:79:f5,bridge_name='br-int',has_traffic_filtering=True,id=720d6d0b-554f-4c56-b9e4-1de1309b83f0,network=Network(cfd1ce3c-e516-46ef-8712-573fe4de8313),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap720d6d0b-55')
Nov 29 07:49:17 compute-0 nova_compute[187185]: 2025-11-29 07:49:17.722 187189 INFO nova.virt.libvirt.driver [None req-8226c857-ac84-48e5-87fe-ddf705ebf951 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Deleting instance files /var/lib/nova/instances/9bd8796c-97a5-491a-b6b4-713222c15142_del
Nov 29 07:49:17 compute-0 nova_compute[187185]: 2025-11-29 07:49:17.722 187189 INFO nova.virt.libvirt.driver [None req-8226c857-ac84-48e5-87fe-ddf705ebf951 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Deletion of /var/lib/nova/instances/9bd8796c-97a5-491a-b6b4-713222c15142_del complete
Nov 29 07:49:17 compute-0 podman[246683]: 2025-11-29 07:49:17.765235402 +0000 UTC m=+0.052457894 container remove 71f872cde640e6b41c5669a66e6e4de4ce2c060ccbb4afb1cbafee44c1100364 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cfd1ce3c-e516-46ef-8712-573fe4de8313, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:49:17 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:49:17.773 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[dbb5a696-b8f8-4169-b444-75c6f6c05cd3]: (4, ('Sat Nov 29 07:49:17 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-cfd1ce3c-e516-46ef-8712-573fe4de8313 (71f872cde640e6b41c5669a66e6e4de4ce2c060ccbb4afb1cbafee44c1100364)\n71f872cde640e6b41c5669a66e6e4de4ce2c060ccbb4afb1cbafee44c1100364\nSat Nov 29 07:49:17 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-cfd1ce3c-e516-46ef-8712-573fe4de8313 (71f872cde640e6b41c5669a66e6e4de4ce2c060ccbb4afb1cbafee44c1100364)\n71f872cde640e6b41c5669a66e6e4de4ce2c060ccbb4afb1cbafee44c1100364\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:49:17 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:49:17.775 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e4f504fd-5d61-4e3c-b6db-7817ad49f8df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:49:17 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:49:17.777 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcfd1ce3c-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:49:17 compute-0 nova_compute[187185]: 2025-11-29 07:49:17.779 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:49:17 compute-0 kernel: tapcfd1ce3c-e0: left promiscuous mode
Nov 29 07:49:17 compute-0 nova_compute[187185]: 2025-11-29 07:49:17.782 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:49:17 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:49:17.785 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[32ba163b-b514-4b67-b27d-ee7f0a858da2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:49:17 compute-0 nova_compute[187185]: 2025-11-29 07:49:17.799 187189 INFO nova.compute.manager [None req-8226c857-ac84-48e5-87fe-ddf705ebf951 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Took 0.39 seconds to destroy the instance on the hypervisor.
Nov 29 07:49:17 compute-0 nova_compute[187185]: 2025-11-29 07:49:17.800 187189 DEBUG oslo.service.loopingcall [None req-8226c857-ac84-48e5-87fe-ddf705ebf951 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 07:49:17 compute-0 nova_compute[187185]: 2025-11-29 07:49:17.800 187189 DEBUG nova.compute.manager [-] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 07:49:17 compute-0 nova_compute[187185]: 2025-11-29 07:49:17.800 187189 DEBUG nova.network.neutron [-] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 07:49:17 compute-0 nova_compute[187185]: 2025-11-29 07:49:17.805 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:49:17 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:49:17.814 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[f64c77f8-46a3-49d2-9c69-0ead8f569b29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:49:17 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:49:17.816 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[9dac6361-2230-43ba-a25f-ddb3c332005d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:49:17 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:49:17.834 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[b4cfaf3f-eea5-4291-ad19-ccc0976a9aee]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 797433, 'reachable_time': 44422, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246698, 'error': None, 'target': 'ovnmeta-cfd1ce3c-e516-46ef-8712-573fe4de8313', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:49:17 compute-0 systemd[1]: run-netns-ovnmeta\x2dcfd1ce3c\x2de516\x2d46ef\x2d8712\x2d573fe4de8313.mount: Deactivated successfully.
Nov 29 07:49:17 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:49:17.840 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-cfd1ce3c-e516-46ef-8712-573fe4de8313 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:49:17 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:49:17.841 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[d275b071-7ff1-4f58-84a9-adaea01612f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:49:18 compute-0 nova_compute[187185]: 2025-11-29 07:49:18.253 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:49:18 compute-0 nova_compute[187185]: 2025-11-29 07:49:18.456 187189 DEBUG nova.network.neutron [-] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:49:18 compute-0 nova_compute[187185]: 2025-11-29 07:49:18.480 187189 INFO nova.compute.manager [-] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Took 0.68 seconds to deallocate network for instance.
Nov 29 07:49:18 compute-0 nova_compute[187185]: 2025-11-29 07:49:18.540 187189 DEBUG nova.compute.manager [req-d42792cc-766e-4d9c-9a13-f013ce0c1750 req-3abe06b7-7e59-42d0-ac29-5ba88b53f582 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Received event network-vif-deleted-720d6d0b-554f-4c56-b9e4-1de1309b83f0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:49:18 compute-0 nova_compute[187185]: 2025-11-29 07:49:18.578 187189 DEBUG oslo_concurrency.lockutils [None req-8226c857-ac84-48e5-87fe-ddf705ebf951 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:49:18 compute-0 nova_compute[187185]: 2025-11-29 07:49:18.579 187189 DEBUG oslo_concurrency.lockutils [None req-8226c857-ac84-48e5-87fe-ddf705ebf951 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:49:18 compute-0 sshd[128727]: drop connection #1 from [115.190.136.184]:43196 on [38.102.83.110]:22 penalty: exceeded LoginGraceTime
Nov 29 07:49:18 compute-0 nova_compute[187185]: 2025-11-29 07:49:18.643 187189 DEBUG nova.compute.provider_tree [None req-8226c857-ac84-48e5-87fe-ddf705ebf951 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:49:18 compute-0 nova_compute[187185]: 2025-11-29 07:49:18.662 187189 DEBUG nova.scheduler.client.report [None req-8226c857-ac84-48e5-87fe-ddf705ebf951 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:49:18 compute-0 nova_compute[187185]: 2025-11-29 07:49:18.686 187189 DEBUG oslo_concurrency.lockutils [None req-8226c857-ac84-48e5-87fe-ddf705ebf951 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:49:18 compute-0 nova_compute[187185]: 2025-11-29 07:49:18.718 187189 INFO nova.scheduler.client.report [None req-8226c857-ac84-48e5-87fe-ddf705ebf951 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Deleted allocations for instance 9bd8796c-97a5-491a-b6b4-713222c15142
Nov 29 07:49:18 compute-0 nova_compute[187185]: 2025-11-29 07:49:18.797 187189 DEBUG nova.compute.manager [req-38fafc16-bc25-41eb-a278-242d32037570 req-db7d4305-6d2d-4894-b31f-fac52647cda7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Received event network-vif-unplugged-720d6d0b-554f-4c56-b9e4-1de1309b83f0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:49:18 compute-0 nova_compute[187185]: 2025-11-29 07:49:18.798 187189 DEBUG oslo_concurrency.lockutils [req-38fafc16-bc25-41eb-a278-242d32037570 req-db7d4305-6d2d-4894-b31f-fac52647cda7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "9bd8796c-97a5-491a-b6b4-713222c15142-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:49:18 compute-0 nova_compute[187185]: 2025-11-29 07:49:18.798 187189 DEBUG oslo_concurrency.lockutils [req-38fafc16-bc25-41eb-a278-242d32037570 req-db7d4305-6d2d-4894-b31f-fac52647cda7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9bd8796c-97a5-491a-b6b4-713222c15142-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:49:18 compute-0 nova_compute[187185]: 2025-11-29 07:49:18.798 187189 DEBUG oslo_concurrency.lockutils [req-38fafc16-bc25-41eb-a278-242d32037570 req-db7d4305-6d2d-4894-b31f-fac52647cda7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9bd8796c-97a5-491a-b6b4-713222c15142-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:49:18 compute-0 nova_compute[187185]: 2025-11-29 07:49:18.799 187189 DEBUG nova.compute.manager [req-38fafc16-bc25-41eb-a278-242d32037570 req-db7d4305-6d2d-4894-b31f-fac52647cda7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] No waiting events found dispatching network-vif-unplugged-720d6d0b-554f-4c56-b9e4-1de1309b83f0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:49:18 compute-0 nova_compute[187185]: 2025-11-29 07:49:18.799 187189 WARNING nova.compute.manager [req-38fafc16-bc25-41eb-a278-242d32037570 req-db7d4305-6d2d-4894-b31f-fac52647cda7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Received unexpected event network-vif-unplugged-720d6d0b-554f-4c56-b9e4-1de1309b83f0 for instance with vm_state deleted and task_state None.
Nov 29 07:49:18 compute-0 nova_compute[187185]: 2025-11-29 07:49:18.799 187189 DEBUG nova.compute.manager [req-38fafc16-bc25-41eb-a278-242d32037570 req-db7d4305-6d2d-4894-b31f-fac52647cda7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Received event network-vif-plugged-720d6d0b-554f-4c56-b9e4-1de1309b83f0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:49:18 compute-0 nova_compute[187185]: 2025-11-29 07:49:18.799 187189 DEBUG oslo_concurrency.lockutils [req-38fafc16-bc25-41eb-a278-242d32037570 req-db7d4305-6d2d-4894-b31f-fac52647cda7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "9bd8796c-97a5-491a-b6b4-713222c15142-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:49:18 compute-0 nova_compute[187185]: 2025-11-29 07:49:18.800 187189 DEBUG oslo_concurrency.lockutils [req-38fafc16-bc25-41eb-a278-242d32037570 req-db7d4305-6d2d-4894-b31f-fac52647cda7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9bd8796c-97a5-491a-b6b4-713222c15142-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:49:18 compute-0 nova_compute[187185]: 2025-11-29 07:49:18.800 187189 DEBUG oslo_concurrency.lockutils [req-38fafc16-bc25-41eb-a278-242d32037570 req-db7d4305-6d2d-4894-b31f-fac52647cda7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9bd8796c-97a5-491a-b6b4-713222c15142-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:49:18 compute-0 nova_compute[187185]: 2025-11-29 07:49:18.800 187189 DEBUG nova.compute.manager [req-38fafc16-bc25-41eb-a278-242d32037570 req-db7d4305-6d2d-4894-b31f-fac52647cda7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] No waiting events found dispatching network-vif-plugged-720d6d0b-554f-4c56-b9e4-1de1309b83f0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:49:18 compute-0 nova_compute[187185]: 2025-11-29 07:49:18.800 187189 WARNING nova.compute.manager [req-38fafc16-bc25-41eb-a278-242d32037570 req-db7d4305-6d2d-4894-b31f-fac52647cda7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Received unexpected event network-vif-plugged-720d6d0b-554f-4c56-b9e4-1de1309b83f0 for instance with vm_state deleted and task_state None.
Nov 29 07:49:18 compute-0 nova_compute[187185]: 2025-11-29 07:49:18.812 187189 DEBUG oslo_concurrency.lockutils [None req-8226c857-ac84-48e5-87fe-ddf705ebf951 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "9bd8796c-97a5-491a-b6b4-713222c15142" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.432s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:49:19 compute-0 nova_compute[187185]: 2025-11-29 07:49:19.056 187189 DEBUG nova.network.neutron [req-fdabac2f-addd-482e-9967-312995a7f788 req-41571c6f-2466-4ce3-8fa6-e3b903a72368 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Updated VIF entry in instance network info cache for port 720d6d0b-554f-4c56-b9e4-1de1309b83f0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:49:19 compute-0 nova_compute[187185]: 2025-11-29 07:49:19.057 187189 DEBUG nova.network.neutron [req-fdabac2f-addd-482e-9967-312995a7f788 req-41571c6f-2466-4ce3-8fa6-e3b903a72368 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Updating instance_info_cache with network_info: [{"id": "720d6d0b-554f-4c56-b9e4-1de1309b83f0", "address": "fa:16:3e:b8:79:f5", "network": {"id": "cfd1ce3c-e516-46ef-8712-573fe4de8313", "bridge": "br-int", "label": "tempest-network-smoke--423607195", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap720d6d0b-55", "ovs_interfaceid": "720d6d0b-554f-4c56-b9e4-1de1309b83f0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:49:19 compute-0 nova_compute[187185]: 2025-11-29 07:49:19.080 187189 DEBUG oslo_concurrency.lockutils [req-fdabac2f-addd-482e-9967-312995a7f788 req-41571c6f-2466-4ce3-8fa6-e3b903a72368 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-9bd8796c-97a5-491a-b6b4-713222c15142" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:49:19 compute-0 podman[246699]: 2025-11-29 07:49:19.788594709 +0000 UTC m=+0.056498199 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 07:49:21 compute-0 podman[246724]: 2025-11-29 07:49:21.795133879 +0000 UTC m=+0.064125787 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 29 07:49:21 compute-0 podman[246725]: 2025-11-29 07:49:21.819634986 +0000 UTC m=+0.084378933 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=edpm, io.buildah.version=1.41.3)
Nov 29 07:49:22 compute-0 sshd-session[246630]: Received disconnect from 115.190.187.93 port 59580:11: Bye Bye [preauth]
Nov 29 07:49:22 compute-0 sshd-session[246630]: Disconnected from authenticating user root 115.190.187.93 port 59580 [preauth]
Nov 29 07:49:22 compute-0 nova_compute[187185]: 2025-11-29 07:49:22.718 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:49:23 compute-0 nova_compute[187185]: 2025-11-29 07:49:23.257 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:49:23 compute-0 nova_compute[187185]: 2025-11-29 07:49:23.648 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:49:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:49:25.749 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:49:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:49:25.750 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:49:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:49:25.751 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:49:25 compute-0 nova_compute[187185]: 2025-11-29 07:49:25.873 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:49:27 compute-0 nova_compute[187185]: 2025-11-29 07:49:27.721 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:49:28 compute-0 nova_compute[187185]: 2025-11-29 07:49:28.259 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:49:30 compute-0 nova_compute[187185]: 2025-11-29 07:49:30.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:49:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:49:30.676 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=68, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=67) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:49:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:49:30.677 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:49:30 compute-0 nova_compute[187185]: 2025-11-29 07:49:30.678 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:49:32 compute-0 nova_compute[187185]: 2025-11-29 07:49:32.692 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764402557.6912115, 9bd8796c-97a5-491a-b6b4-713222c15142 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:49:32 compute-0 nova_compute[187185]: 2025-11-29 07:49:32.692 187189 INFO nova.compute.manager [-] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] VM Stopped (Lifecycle Event)
Nov 29 07:49:32 compute-0 nova_compute[187185]: 2025-11-29 07:49:32.723 187189 DEBUG nova.compute.manager [None req-cc25bfb9-542b-4df1-835e-6de993f4834d - - - - - -] [instance: 9bd8796c-97a5-491a-b6b4-713222c15142] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:49:32 compute-0 nova_compute[187185]: 2025-11-29 07:49:32.723 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:49:32 compute-0 podman[246767]: 2025-11-29 07:49:32.845079256 +0000 UTC m=+0.086941126 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 07:49:32 compute-0 podman[246765]: 2025-11-29 07:49:32.860413732 +0000 UTC m=+0.111618108 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Nov 29 07:49:32 compute-0 podman[246766]: 2025-11-29 07:49:32.861766571 +0000 UTC m=+0.108150040 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, release=1755695350, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm)
Nov 29 07:49:33 compute-0 nova_compute[187185]: 2025-11-29 07:49:33.262 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:49:33 compute-0 nova_compute[187185]: 2025-11-29 07:49:33.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:49:33 compute-0 nova_compute[187185]: 2025-11-29 07:49:33.346 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:49:33 compute-0 nova_compute[187185]: 2025-11-29 07:49:33.347 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:49:33 compute-0 nova_compute[187185]: 2025-11-29 07:49:33.347 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:49:33 compute-0 nova_compute[187185]: 2025-11-29 07:49:33.347 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:49:33 compute-0 nova_compute[187185]: 2025-11-29 07:49:33.575 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:49:33 compute-0 nova_compute[187185]: 2025-11-29 07:49:33.576 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5695MB free_disk=73.25358581542969GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:49:33 compute-0 nova_compute[187185]: 2025-11-29 07:49:33.576 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:49:33 compute-0 nova_compute[187185]: 2025-11-29 07:49:33.576 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:49:33 compute-0 nova_compute[187185]: 2025-11-29 07:49:33.655 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:49:33 compute-0 nova_compute[187185]: 2025-11-29 07:49:33.656 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:49:33 compute-0 nova_compute[187185]: 2025-11-29 07:49:33.680 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:49:33 compute-0 nova_compute[187185]: 2025-11-29 07:49:33.698 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:49:33 compute-0 nova_compute[187185]: 2025-11-29 07:49:33.718 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:49:33 compute-0 nova_compute[187185]: 2025-11-29 07:49:33.718 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:49:35 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:49:35.680 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '68'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:49:35 compute-0 nova_compute[187185]: 2025-11-29 07:49:35.718 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:49:36 compute-0 nova_compute[187185]: 2025-11-29 07:49:36.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:49:37 compute-0 nova_compute[187185]: 2025-11-29 07:49:37.725 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:49:38 compute-0 nova_compute[187185]: 2025-11-29 07:49:38.263 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:49:39 compute-0 nova_compute[187185]: 2025-11-29 07:49:39.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:49:39 compute-0 nova_compute[187185]: 2025-11-29 07:49:39.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:49:40 compute-0 nova_compute[187185]: 2025-11-29 07:49:40.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:49:40 compute-0 nova_compute[187185]: 2025-11-29 07:49:40.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:49:40 compute-0 nova_compute[187185]: 2025-11-29 07:49:40.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:49:40 compute-0 nova_compute[187185]: 2025-11-29 07:49:40.339 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 07:49:41 compute-0 podman[246829]: 2025-11-29 07:49:41.861329531 +0000 UTC m=+0.117516627 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 07:49:42 compute-0 nova_compute[187185]: 2025-11-29 07:49:42.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:49:42 compute-0 nova_compute[187185]: 2025-11-29 07:49:42.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:49:42 compute-0 nova_compute[187185]: 2025-11-29 07:49:42.318 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:49:42 compute-0 nova_compute[187185]: 2025-11-29 07:49:42.728 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:49:43 compute-0 nova_compute[187185]: 2025-11-29 07:49:43.298 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:49:46 compute-0 sshd[128727]: drop connection #0 from [115.190.136.184]:14036 on [38.102.83.110]:22 penalty: exceeded LoginGraceTime
Nov 29 07:49:47 compute-0 nova_compute[187185]: 2025-11-29 07:49:47.786 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:49:48 compute-0 nova_compute[187185]: 2025-11-29 07:49:48.300 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:49:50 compute-0 podman[246855]: 2025-11-29 07:49:50.80040414 +0000 UTC m=+0.059803983 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 07:49:52 compute-0 nova_compute[187185]: 2025-11-29 07:49:52.788 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:49:52 compute-0 podman[246881]: 2025-11-29 07:49:52.806195977 +0000 UTC m=+0.068310805 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd)
Nov 29 07:49:52 compute-0 podman[246882]: 2025-11-29 07:49:52.83334737 +0000 UTC m=+0.090186498 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 29 07:49:53 compute-0 nova_compute[187185]: 2025-11-29 07:49:53.304 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:49:53 compute-0 sshd-session[246879]: Invalid user saas from 20.255.62.58 port 51440
Nov 29 07:49:54 compute-0 sshd-session[246879]: Received disconnect from 20.255.62.58 port 51440:11: Bye Bye [preauth]
Nov 29 07:49:54 compute-0 sshd-session[246879]: Disconnected from invalid user saas 20.255.62.58 port 51440 [preauth]
Nov 29 07:49:57 compute-0 nova_compute[187185]: 2025-11-29 07:49:57.790 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:49:58 compute-0 nova_compute[187185]: 2025-11-29 07:49:58.307 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:50:02 compute-0 nova_compute[187185]: 2025-11-29 07:50:02.792 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:50:03 compute-0 nova_compute[187185]: 2025-11-29 07:50:03.310 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:50:03 compute-0 podman[246920]: 2025-11-29 07:50:03.790358242 +0000 UTC m=+0.057469748 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 29 07:50:03 compute-0 podman[246921]: 2025-11-29 07:50:03.823350211 +0000 UTC m=+0.086146744 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., config_id=edpm, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 29 07:50:03 compute-0 podman[246922]: 2025-11-29 07:50:03.823828444 +0000 UTC m=+0.081353717 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 07:50:07 compute-0 nova_compute[187185]: 2025-11-29 07:50:07.795 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:50:08 compute-0 nova_compute[187185]: 2025-11-29 07:50:08.313 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:50:12 compute-0 nova_compute[187185]: 2025-11-29 07:50:12.800 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:50:12 compute-0 podman[246987]: 2025-11-29 07:50:12.855983763 +0000 UTC m=+0.120114369 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 29 07:50:13 compute-0 nova_compute[187185]: 2025-11-29 07:50:13.316 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:50:13 compute-0 sshd-session[246985]: Invalid user autrede from 115.190.136.184 port 29070
Nov 29 07:50:14 compute-0 sshd-session[246985]: Received disconnect from 115.190.136.184 port 29070:11: Bye Bye [preauth]
Nov 29 07:50:14 compute-0 sshd-session[246985]: Disconnected from invalid user autrede 115.190.136.184 port 29070 [preauth]
Nov 29 07:50:17 compute-0 nova_compute[187185]: 2025-11-29 07:50:17.804 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:50:18 compute-0 nova_compute[187185]: 2025-11-29 07:50:18.319 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:50:21 compute-0 podman[247013]: 2025-11-29 07:50:21.816921674 +0000 UTC m=+0.073118443 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 07:50:22 compute-0 nova_compute[187185]: 2025-11-29 07:50:22.809 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:50:23 compute-0 nova_compute[187185]: 2025-11-29 07:50:23.321 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:50:23 compute-0 podman[247040]: 2025-11-29 07:50:23.819396617 +0000 UTC m=+0.075956293 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd)
Nov 29 07:50:23 compute-0 podman[247041]: 2025-11-29 07:50:23.828157786 +0000 UTC m=+0.078716521 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 29 07:50:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:50:25.750 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:50:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:50:25.751 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:50:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:50:25.751 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:50:27 compute-0 sshd-session[247031]: Connection closed by 115.190.187.93 port 52098 [preauth]
Nov 29 07:50:27 compute-0 nova_compute[187185]: 2025-11-29 07:50:27.811 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:50:28 compute-0 sshd-session[247081]: Invalid user exx from 45.78.219.119 port 40552
Nov 29 07:50:28 compute-0 nova_compute[187185]: 2025-11-29 07:50:28.323 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:50:29 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:50:29.029 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=69, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=68) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:50:29 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:50:29.030 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:50:29 compute-0 nova_compute[187185]: 2025-11-29 07:50:29.075 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:50:29 compute-0 sshd-session[247081]: Received disconnect from 45.78.219.119 port 40552:11: Bye Bye [preauth]
Nov 29 07:50:29 compute-0 sshd-session[247081]: Disconnected from invalid user exx 45.78.219.119 port 40552 [preauth]
Nov 29 07:50:30 compute-0 nova_compute[187185]: 2025-11-29 07:50:30.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:50:31 compute-0 ovn_controller[95281]: 2025-11-29T07:50:31Z|00586|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Nov 29 07:50:32 compute-0 sshd-session[247083]: Received disconnect from 190.181.27.27 port 49264:11: Bye Bye [preauth]
Nov 29 07:50:32 compute-0 sshd-session[247083]: Disconnected from authenticating user root 190.181.27.27 port 49264 [preauth]
Nov 29 07:50:32 compute-0 nova_compute[187185]: 2025-11-29 07:50:32.813 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:50:33 compute-0 nova_compute[187185]: 2025-11-29 07:50:33.326 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:50:34 compute-0 nova_compute[187185]: 2025-11-29 07:50:34.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:50:34 compute-0 nova_compute[187185]: 2025-11-29 07:50:34.345 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:50:34 compute-0 nova_compute[187185]: 2025-11-29 07:50:34.346 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:50:34 compute-0 nova_compute[187185]: 2025-11-29 07:50:34.346 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:50:34 compute-0 nova_compute[187185]: 2025-11-29 07:50:34.346 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:50:34 compute-0 nova_compute[187185]: 2025-11-29 07:50:34.591 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:50:34 compute-0 nova_compute[187185]: 2025-11-29 07:50:34.592 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5745MB free_disk=73.2535514831543GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:50:34 compute-0 nova_compute[187185]: 2025-11-29 07:50:34.593 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:50:34 compute-0 nova_compute[187185]: 2025-11-29 07:50:34.593 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:50:34 compute-0 nova_compute[187185]: 2025-11-29 07:50:34.742 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:50:34 compute-0 nova_compute[187185]: 2025-11-29 07:50:34.742 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:50:34 compute-0 podman[247085]: 2025-11-29 07:50:34.787601206 +0000 UTC m=+0.052295049 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:50:34 compute-0 nova_compute[187185]: 2025-11-29 07:50:34.798 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:50:34 compute-0 podman[247087]: 2025-11-29 07:50:34.810749965 +0000 UTC m=+0.066930986 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 07:50:34 compute-0 podman[247086]: 2025-11-29 07:50:34.810269902 +0000 UTC m=+0.066380131 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, name=ubi9-minimal, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, config_id=edpm, io.openshift.expose-services=, version=9.6, com.redhat.component=ubi9-minimal-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 29 07:50:34 compute-0 nova_compute[187185]: 2025-11-29 07:50:34.818 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:50:34 compute-0 nova_compute[187185]: 2025-11-29 07:50:34.820 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:50:34 compute-0 nova_compute[187185]: 2025-11-29 07:50:34.820 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.227s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:50:35 compute-0 nova_compute[187185]: 2025-11-29 07:50:35.820 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:50:36 compute-0 nova_compute[187185]: 2025-11-29 07:50:36.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:50:37 compute-0 nova_compute[187185]: 2025-11-29 07:50:37.816 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:50:38 compute-0 nova_compute[187185]: 2025-11-29 07:50:38.327 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:50:39 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:50:39.034 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '69'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:50:39 compute-0 nova_compute[187185]: 2025-11-29 07:50:39.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:50:39 compute-0 nova_compute[187185]: 2025-11-29 07:50:39.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:50:40 compute-0 sshd-session[247149]: Invalid user elemental from 115.190.136.184 port 31402
Nov 29 07:50:40 compute-0 sshd-session[247149]: Received disconnect from 115.190.136.184 port 31402:11: Bye Bye [preauth]
Nov 29 07:50:40 compute-0 sshd-session[247149]: Disconnected from invalid user elemental 115.190.136.184 port 31402 [preauth]
Nov 29 07:50:42 compute-0 nova_compute[187185]: 2025-11-29 07:50:42.311 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:50:42 compute-0 nova_compute[187185]: 2025-11-29 07:50:42.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:50:42 compute-0 nova_compute[187185]: 2025-11-29 07:50:42.315 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:50:42 compute-0 nova_compute[187185]: 2025-11-29 07:50:42.315 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:50:42 compute-0 nova_compute[187185]: 2025-11-29 07:50:42.328 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 07:50:42 compute-0 nova_compute[187185]: 2025-11-29 07:50:42.862 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:50:43 compute-0 nova_compute[187185]: 2025-11-29 07:50:43.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:50:43 compute-0 nova_compute[187185]: 2025-11-29 07:50:43.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:50:43 compute-0 nova_compute[187185]: 2025-11-29 07:50:43.330 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:50:43 compute-0 podman[247151]: 2025-11-29 07:50:43.864118589 +0000 UTC m=+0.128355085 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller)
Nov 29 07:50:47 compute-0 nova_compute[187185]: 2025-11-29 07:50:47.864 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:50:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:50:48.017 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:50:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:50:48.018 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:50:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:50:48.018 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:50:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:50:48.018 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:50:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:50:48.018 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:50:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:50:48.018 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:50:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:50:48.019 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:50:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:50:48.019 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:50:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:50:48.019 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:50:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:50:48.019 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:50:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:50:48.019 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:50:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:50:48.019 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:50:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:50:48.019 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:50:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:50:48.019 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:50:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:50:48.019 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:50:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:50:48.019 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:50:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:50:48.019 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:50:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:50:48.019 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:50:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:50:48.019 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:50:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:50:48.020 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:50:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:50:48.020 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:50:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:50:48.020 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:50:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:50:48.020 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:50:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:50:48.020 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:50:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:50:48.020 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:50:48 compute-0 nova_compute[187185]: 2025-11-29 07:50:48.332 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:50:52 compute-0 podman[247177]: 2025-11-29 07:50:52.828896268 +0000 UTC m=+0.085075832 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 07:50:52 compute-0 nova_compute[187185]: 2025-11-29 07:50:52.866 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:50:53 compute-0 nova_compute[187185]: 2025-11-29 07:50:53.334 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:50:54 compute-0 podman[247203]: 2025-11-29 07:50:54.836709904 +0000 UTC m=+0.086611946 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Nov 29 07:50:54 compute-0 podman[247202]: 2025-11-29 07:50:54.841110579 +0000 UTC m=+0.096418475 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:50:57 compute-0 nova_compute[187185]: 2025-11-29 07:50:57.868 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:50:58 compute-0 nova_compute[187185]: 2025-11-29 07:50:58.336 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:51:02 compute-0 nova_compute[187185]: 2025-11-29 07:51:02.311 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:51:02 compute-0 nova_compute[187185]: 2025-11-29 07:51:02.870 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:51:03 compute-0 nova_compute[187185]: 2025-11-29 07:51:03.338 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:51:05 compute-0 podman[247240]: 2025-11-29 07:51:05.816968966 +0000 UTC m=+0.080925254 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git)
Nov 29 07:51:05 compute-0 podman[247239]: 2025-11-29 07:51:05.825089437 +0000 UTC m=+0.084925398 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 07:51:05 compute-0 podman[247241]: 2025-11-29 07:51:05.840970309 +0000 UTC m=+0.084212618 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 07:51:07 compute-0 sshd-session[247303]: Invalid user tibero from 115.190.136.184 port 26130
Nov 29 07:51:07 compute-0 sshd-session[247303]: Received disconnect from 115.190.136.184 port 26130:11: Bye Bye [preauth]
Nov 29 07:51:07 compute-0 sshd-session[247303]: Disconnected from invalid user tibero 115.190.136.184 port 26130 [preauth]
Nov 29 07:51:07 compute-0 nova_compute[187185]: 2025-11-29 07:51:07.873 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:51:08 compute-0 nova_compute[187185]: 2025-11-29 07:51:08.164 187189 DEBUG oslo_concurrency.lockutils [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "1885cb6b-9e7f-433f-86bf-9e88d6199d90" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:51:08 compute-0 nova_compute[187185]: 2025-11-29 07:51:08.164 187189 DEBUG oslo_concurrency.lockutils [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "1885cb6b-9e7f-433f-86bf-9e88d6199d90" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:51:08 compute-0 nova_compute[187185]: 2025-11-29 07:51:08.272 187189 DEBUG nova.compute.manager [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 07:51:08 compute-0 nova_compute[187185]: 2025-11-29 07:51:08.340 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:51:08 compute-0 nova_compute[187185]: 2025-11-29 07:51:08.580 187189 DEBUG oslo_concurrency.lockutils [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:51:08 compute-0 nova_compute[187185]: 2025-11-29 07:51:08.581 187189 DEBUG oslo_concurrency.lockutils [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:51:08 compute-0 nova_compute[187185]: 2025-11-29 07:51:08.588 187189 DEBUG nova.virt.hardware [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 07:51:08 compute-0 nova_compute[187185]: 2025-11-29 07:51:08.588 187189 INFO nova.compute.claims [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Claim successful on node compute-0.ctlplane.example.com
Nov 29 07:51:08 compute-0 nova_compute[187185]: 2025-11-29 07:51:08.842 187189 DEBUG nova.scheduler.client.report [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Refreshing inventories for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 29 07:51:08 compute-0 nova_compute[187185]: 2025-11-29 07:51:08.956 187189 DEBUG nova.scheduler.client.report [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Updating ProviderTree inventory for provider 4e39a026-df39-4e20-874a-dbb5a40df044 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 29 07:51:08 compute-0 nova_compute[187185]: 2025-11-29 07:51:08.957 187189 DEBUG nova.compute.provider_tree [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Updating inventory in ProviderTree for provider 4e39a026-df39-4e20-874a-dbb5a40df044 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 29 07:51:08 compute-0 nova_compute[187185]: 2025-11-29 07:51:08.972 187189 DEBUG nova.scheduler.client.report [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Refreshing aggregate associations for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 29 07:51:08 compute-0 nova_compute[187185]: 2025-11-29 07:51:08.993 187189 DEBUG nova.scheduler.client.report [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Refreshing trait associations for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044, traits: HW_CPU_X86_SSE,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_TRUSTED_CERTS,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 29 07:51:09 compute-0 nova_compute[187185]: 2025-11-29 07:51:09.037 187189 DEBUG nova.compute.provider_tree [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:51:09 compute-0 nova_compute[187185]: 2025-11-29 07:51:09.067 187189 DEBUG nova.scheduler.client.report [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:51:09 compute-0 nova_compute[187185]: 2025-11-29 07:51:09.196 187189 DEBUG oslo_concurrency.lockutils [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.615s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:51:09 compute-0 nova_compute[187185]: 2025-11-29 07:51:09.198 187189 DEBUG nova.compute.manager [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 07:51:09 compute-0 nova_compute[187185]: 2025-11-29 07:51:09.372 187189 DEBUG nova.compute.manager [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 07:51:09 compute-0 nova_compute[187185]: 2025-11-29 07:51:09.373 187189 DEBUG nova.network.neutron [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 07:51:09 compute-0 nova_compute[187185]: 2025-11-29 07:51:09.482 187189 INFO nova.virt.libvirt.driver [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 07:51:09 compute-0 nova_compute[187185]: 2025-11-29 07:51:09.564 187189 DEBUG nova.compute.manager [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 07:51:09 compute-0 nova_compute[187185]: 2025-11-29 07:51:09.835 187189 DEBUG nova.policy [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 07:51:09 compute-0 nova_compute[187185]: 2025-11-29 07:51:09.903 187189 DEBUG nova.compute.manager [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 07:51:09 compute-0 nova_compute[187185]: 2025-11-29 07:51:09.905 187189 DEBUG nova.virt.libvirt.driver [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 07:51:09 compute-0 nova_compute[187185]: 2025-11-29 07:51:09.906 187189 INFO nova.virt.libvirt.driver [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Creating image(s)
Nov 29 07:51:09 compute-0 nova_compute[187185]: 2025-11-29 07:51:09.907 187189 DEBUG oslo_concurrency.lockutils [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "/var/lib/nova/instances/1885cb6b-9e7f-433f-86bf-9e88d6199d90/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:51:09 compute-0 nova_compute[187185]: 2025-11-29 07:51:09.908 187189 DEBUG oslo_concurrency.lockutils [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "/var/lib/nova/instances/1885cb6b-9e7f-433f-86bf-9e88d6199d90/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:51:09 compute-0 nova_compute[187185]: 2025-11-29 07:51:09.909 187189 DEBUG oslo_concurrency.lockutils [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "/var/lib/nova/instances/1885cb6b-9e7f-433f-86bf-9e88d6199d90/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:51:09 compute-0 nova_compute[187185]: 2025-11-29 07:51:09.938 187189 DEBUG oslo_concurrency.processutils [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:51:10 compute-0 nova_compute[187185]: 2025-11-29 07:51:10.024 187189 DEBUG oslo_concurrency.processutils [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:51:10 compute-0 nova_compute[187185]: 2025-11-29 07:51:10.026 187189 DEBUG oslo_concurrency.lockutils [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:51:10 compute-0 nova_compute[187185]: 2025-11-29 07:51:10.027 187189 DEBUG oslo_concurrency.lockutils [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:51:10 compute-0 nova_compute[187185]: 2025-11-29 07:51:10.054 187189 DEBUG oslo_concurrency.processutils [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:51:10 compute-0 nova_compute[187185]: 2025-11-29 07:51:10.138 187189 DEBUG oslo_concurrency.processutils [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:51:10 compute-0 nova_compute[187185]: 2025-11-29 07:51:10.140 187189 DEBUG oslo_concurrency.processutils [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/1885cb6b-9e7f-433f-86bf-9e88d6199d90/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:51:10 compute-0 nova_compute[187185]: 2025-11-29 07:51:10.663 187189 DEBUG oslo_concurrency.processutils [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/1885cb6b-9e7f-433f-86bf-9e88d6199d90/disk 1073741824" returned: 0 in 0.523s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:51:10 compute-0 nova_compute[187185]: 2025-11-29 07:51:10.665 187189 DEBUG oslo_concurrency.lockutils [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.638s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:51:10 compute-0 nova_compute[187185]: 2025-11-29 07:51:10.666 187189 DEBUG oslo_concurrency.processutils [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:51:10 compute-0 nova_compute[187185]: 2025-11-29 07:51:10.764 187189 DEBUG oslo_concurrency.processutils [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:51:10 compute-0 nova_compute[187185]: 2025-11-29 07:51:10.765 187189 DEBUG nova.virt.disk.api [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Checking if we can resize image /var/lib/nova/instances/1885cb6b-9e7f-433f-86bf-9e88d6199d90/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:51:10 compute-0 nova_compute[187185]: 2025-11-29 07:51:10.766 187189 DEBUG oslo_concurrency.processutils [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1885cb6b-9e7f-433f-86bf-9e88d6199d90/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:51:10 compute-0 nova_compute[187185]: 2025-11-29 07:51:10.833 187189 DEBUG oslo_concurrency.processutils [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1885cb6b-9e7f-433f-86bf-9e88d6199d90/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:51:10 compute-0 nova_compute[187185]: 2025-11-29 07:51:10.836 187189 DEBUG nova.virt.disk.api [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Cannot resize image /var/lib/nova/instances/1885cb6b-9e7f-433f-86bf-9e88d6199d90/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:51:10 compute-0 nova_compute[187185]: 2025-11-29 07:51:10.837 187189 DEBUG nova.objects.instance [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lazy-loading 'migration_context' on Instance uuid 1885cb6b-9e7f-433f-86bf-9e88d6199d90 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:51:10 compute-0 nova_compute[187185]: 2025-11-29 07:51:10.931 187189 DEBUG nova.virt.libvirt.driver [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 07:51:10 compute-0 nova_compute[187185]: 2025-11-29 07:51:10.932 187189 DEBUG nova.virt.libvirt.driver [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Ensure instance console log exists: /var/lib/nova/instances/1885cb6b-9e7f-433f-86bf-9e88d6199d90/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 07:51:10 compute-0 nova_compute[187185]: 2025-11-29 07:51:10.933 187189 DEBUG oslo_concurrency.lockutils [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:51:10 compute-0 nova_compute[187185]: 2025-11-29 07:51:10.933 187189 DEBUG oslo_concurrency.lockutils [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:51:10 compute-0 nova_compute[187185]: 2025-11-29 07:51:10.934 187189 DEBUG oslo_concurrency.lockutils [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:51:11 compute-0 nova_compute[187185]: 2025-11-29 07:51:11.845 187189 DEBUG nova.network.neutron [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Successfully created port: 092b4f60-4cd8-4ca5-9e92-0131a96a6acf _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 07:51:12 compute-0 nova_compute[187185]: 2025-11-29 07:51:12.874 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:51:13 compute-0 nova_compute[187185]: 2025-11-29 07:51:13.308 187189 DEBUG nova.network.neutron [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Successfully updated port: 092b4f60-4cd8-4ca5-9e92-0131a96a6acf _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 07:51:13 compute-0 nova_compute[187185]: 2025-11-29 07:51:13.342 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:51:13 compute-0 nova_compute[187185]: 2025-11-29 07:51:13.370 187189 DEBUG oslo_concurrency.lockutils [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "refresh_cache-1885cb6b-9e7f-433f-86bf-9e88d6199d90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:51:13 compute-0 nova_compute[187185]: 2025-11-29 07:51:13.371 187189 DEBUG oslo_concurrency.lockutils [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquired lock "refresh_cache-1885cb6b-9e7f-433f-86bf-9e88d6199d90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:51:13 compute-0 nova_compute[187185]: 2025-11-29 07:51:13.371 187189 DEBUG nova.network.neutron [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:51:13 compute-0 nova_compute[187185]: 2025-11-29 07:51:13.458 187189 DEBUG nova.compute.manager [req-d10ce810-3235-499a-9274-aba7cb56c6c6 req-c7d2c22c-ad94-4e35-a781-ba5905dbc627 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Received event network-changed-092b4f60-4cd8-4ca5-9e92-0131a96a6acf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:51:13 compute-0 nova_compute[187185]: 2025-11-29 07:51:13.459 187189 DEBUG nova.compute.manager [req-d10ce810-3235-499a-9274-aba7cb56c6c6 req-c7d2c22c-ad94-4e35-a781-ba5905dbc627 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Refreshing instance network info cache due to event network-changed-092b4f60-4cd8-4ca5-9e92-0131a96a6acf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:51:13 compute-0 nova_compute[187185]: 2025-11-29 07:51:13.459 187189 DEBUG oslo_concurrency.lockutils [req-d10ce810-3235-499a-9274-aba7cb56c6c6 req-c7d2c22c-ad94-4e35-a781-ba5905dbc627 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-1885cb6b-9e7f-433f-86bf-9e88d6199d90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:51:13 compute-0 nova_compute[187185]: 2025-11-29 07:51:13.808 187189 DEBUG nova.network.neutron [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.470 187189 DEBUG nova.network.neutron [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Updating instance_info_cache with network_info: [{"id": "092b4f60-4cd8-4ca5-9e92-0131a96a6acf", "address": "fa:16:3e:fd:a5:f5", "network": {"id": "dca15c0d-501b-43a9-ac14-6bd62da0f9ec", "bridge": "br-int", "label": "tempest-network-smoke--1940175519", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap092b4f60-4c", "ovs_interfaceid": "092b4f60-4cd8-4ca5-9e92-0131a96a6acf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.574 187189 DEBUG oslo_concurrency.lockutils [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Releasing lock "refresh_cache-1885cb6b-9e7f-433f-86bf-9e88d6199d90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.575 187189 DEBUG nova.compute.manager [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Instance network_info: |[{"id": "092b4f60-4cd8-4ca5-9e92-0131a96a6acf", "address": "fa:16:3e:fd:a5:f5", "network": {"id": "dca15c0d-501b-43a9-ac14-6bd62da0f9ec", "bridge": "br-int", "label": "tempest-network-smoke--1940175519", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap092b4f60-4c", "ovs_interfaceid": "092b4f60-4cd8-4ca5-9e92-0131a96a6acf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.576 187189 DEBUG oslo_concurrency.lockutils [req-d10ce810-3235-499a-9274-aba7cb56c6c6 req-c7d2c22c-ad94-4e35-a781-ba5905dbc627 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-1885cb6b-9e7f-433f-86bf-9e88d6199d90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.577 187189 DEBUG nova.network.neutron [req-d10ce810-3235-499a-9274-aba7cb56c6c6 req-c7d2c22c-ad94-4e35-a781-ba5905dbc627 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Refreshing network info cache for port 092b4f60-4cd8-4ca5-9e92-0131a96a6acf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.580 187189 DEBUG nova.virt.libvirt.driver [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Start _get_guest_xml network_info=[{"id": "092b4f60-4cd8-4ca5-9e92-0131a96a6acf", "address": "fa:16:3e:fd:a5:f5", "network": {"id": "dca15c0d-501b-43a9-ac14-6bd62da0f9ec", "bridge": "br-int", "label": "tempest-network-smoke--1940175519", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap092b4f60-4c", "ovs_interfaceid": "092b4f60-4cd8-4ca5-9e92-0131a96a6acf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.588 187189 WARNING nova.virt.libvirt.driver [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.593 187189 DEBUG nova.virt.libvirt.host [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.594 187189 DEBUG nova.virt.libvirt.host [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.596 187189 DEBUG nova.virt.libvirt.host [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.597 187189 DEBUG nova.virt.libvirt.host [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.598 187189 DEBUG nova.virt.libvirt.driver [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.598 187189 DEBUG nova.virt.hardware [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.599 187189 DEBUG nova.virt.hardware [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.599 187189 DEBUG nova.virt.hardware [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.599 187189 DEBUG nova.virt.hardware [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.600 187189 DEBUG nova.virt.hardware [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.600 187189 DEBUG nova.virt.hardware [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.600 187189 DEBUG nova.virt.hardware [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.600 187189 DEBUG nova.virt.hardware [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.600 187189 DEBUG nova.virt.hardware [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.601 187189 DEBUG nova.virt.hardware [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.601 187189 DEBUG nova.virt.hardware [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.605 187189 DEBUG nova.virt.libvirt.vif [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:51:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1437223324',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1437223324',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2022058758-ac',id=176,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHeXwv7HCMDE973nSxHTOU9Ex4gPKNp8jzgRfha1raP+nxdvcDhoV7VW+zo1781kKiel2wqiyR9rjNc2n+cKdfsB5frQVtod1rlGQWI90bAcU+zMc1WHDM0LSIWch78VpQ==',key_name='tempest-TestSecurityGroupsBasicOps-1050361794',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e8e45e91223b45a79dd698a82af4a2a5',ramdisk_id='',reservation_id='r-64j8zebl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2022058758',owner_user_name='tempest-TestSecurityGroupsBasicOps-2022058758-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:51:09Z,user_data=None,user_id='dec30fbde18e4b2382ea2c59847d067f',uuid=1885cb6b-9e7f-433f-86bf-9e88d6199d90,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "092b4f60-4cd8-4ca5-9e92-0131a96a6acf", "address": "fa:16:3e:fd:a5:f5", "network": {"id": "dca15c0d-501b-43a9-ac14-6bd62da0f9ec", "bridge": "br-int", "label": "tempest-network-smoke--1940175519", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap092b4f60-4c", "ovs_interfaceid": "092b4f60-4cd8-4ca5-9e92-0131a96a6acf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.605 187189 DEBUG nova.network.os_vif_util [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converting VIF {"id": "092b4f60-4cd8-4ca5-9e92-0131a96a6acf", "address": "fa:16:3e:fd:a5:f5", "network": {"id": "dca15c0d-501b-43a9-ac14-6bd62da0f9ec", "bridge": "br-int", "label": "tempest-network-smoke--1940175519", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap092b4f60-4c", "ovs_interfaceid": "092b4f60-4cd8-4ca5-9e92-0131a96a6acf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.606 187189 DEBUG nova.network.os_vif_util [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fd:a5:f5,bridge_name='br-int',has_traffic_filtering=True,id=092b4f60-4cd8-4ca5-9e92-0131a96a6acf,network=Network(dca15c0d-501b-43a9-ac14-6bd62da0f9ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap092b4f60-4c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.607 187189 DEBUG nova.objects.instance [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1885cb6b-9e7f-433f-86bf-9e88d6199d90 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.681 187189 DEBUG nova.virt.libvirt.driver [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:51:14 compute-0 nova_compute[187185]:   <uuid>1885cb6b-9e7f-433f-86bf-9e88d6199d90</uuid>
Nov 29 07:51:14 compute-0 nova_compute[187185]:   <name>instance-000000b0</name>
Nov 29 07:51:14 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 07:51:14 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:51:14 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:51:14 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:       <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1437223324</nova:name>
Nov 29 07:51:14 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:51:14</nova:creationTime>
Nov 29 07:51:14 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 07:51:14 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 07:51:14 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:51:14 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:51:14 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:51:14 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:51:14 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:51:14 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:51:14 compute-0 nova_compute[187185]:         <nova:user uuid="dec30fbde18e4b2382ea2c59847d067f">tempest-TestSecurityGroupsBasicOps-2022058758-project-member</nova:user>
Nov 29 07:51:14 compute-0 nova_compute[187185]:         <nova:project uuid="e8e45e91223b45a79dd698a82af4a2a5">tempest-TestSecurityGroupsBasicOps-2022058758</nova:project>
Nov 29 07:51:14 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:51:14 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 07:51:14 compute-0 nova_compute[187185]:         <nova:port uuid="092b4f60-4cd8-4ca5-9e92-0131a96a6acf">
Nov 29 07:51:14 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:51:14 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:51:14 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:51:14 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <system>
Nov 29 07:51:14 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:51:14 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:51:14 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:51:14 compute-0 nova_compute[187185]:       <entry name="serial">1885cb6b-9e7f-433f-86bf-9e88d6199d90</entry>
Nov 29 07:51:14 compute-0 nova_compute[187185]:       <entry name="uuid">1885cb6b-9e7f-433f-86bf-9e88d6199d90</entry>
Nov 29 07:51:14 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     </system>
Nov 29 07:51:14 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:51:14 compute-0 nova_compute[187185]:   <os>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:   </os>
Nov 29 07:51:14 compute-0 nova_compute[187185]:   <features>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:   </features>
Nov 29 07:51:14 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:51:14 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:51:14 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:51:14 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/1885cb6b-9e7f-433f-86bf-9e88d6199d90/disk"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:51:14 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/1885cb6b-9e7f-433f-86bf-9e88d6199d90/disk.config"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:51:14 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:fd:a5:f5"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:       <target dev="tap092b4f60-4c"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:51:14 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/1885cb6b-9e7f-433f-86bf-9e88d6199d90/console.log" append="off"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <video>
Nov 29 07:51:14 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     </video>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:51:14 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:51:14 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:51:14 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:51:14 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:51:14 compute-0 nova_compute[187185]: </domain>
Nov 29 07:51:14 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.682 187189 DEBUG nova.compute.manager [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Preparing to wait for external event network-vif-plugged-092b4f60-4cd8-4ca5-9e92-0131a96a6acf prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.683 187189 DEBUG oslo_concurrency.lockutils [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "1885cb6b-9e7f-433f-86bf-9e88d6199d90-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.683 187189 DEBUG oslo_concurrency.lockutils [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "1885cb6b-9e7f-433f-86bf-9e88d6199d90-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.683 187189 DEBUG oslo_concurrency.lockutils [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "1885cb6b-9e7f-433f-86bf-9e88d6199d90-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.684 187189 DEBUG nova.virt.libvirt.vif [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:51:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1437223324',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1437223324',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2022058758-ac',id=176,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHeXwv7HCMDE973nSxHTOU9Ex4gPKNp8jzgRfha1raP+nxdvcDhoV7VW+zo1781kKiel2wqiyR9rjNc2n+cKdfsB5frQVtod1rlGQWI90bAcU+zMc1WHDM0LSIWch78VpQ==',key_name='tempest-TestSecurityGroupsBasicOps-1050361794',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e8e45e91223b45a79dd698a82af4a2a5',ramdisk_id='',reservation_id='r-64j8zebl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2022058758',owner_user_name='tempest-TestSecurityGroupsBasicOps-2022058758-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:51:09Z,user_data=None,user_id='dec30fbde18e4b2382ea2c59847d067f',uuid=1885cb6b-9e7f-433f-86bf-9e88d6199d90,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "092b4f60-4cd8-4ca5-9e92-0131a96a6acf", "address": "fa:16:3e:fd:a5:f5", "network": {"id": "dca15c0d-501b-43a9-ac14-6bd62da0f9ec", "bridge": "br-int", "label": "tempest-network-smoke--1940175519", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap092b4f60-4c", "ovs_interfaceid": "092b4f60-4cd8-4ca5-9e92-0131a96a6acf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.686 187189 DEBUG nova.network.os_vif_util [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converting VIF {"id": "092b4f60-4cd8-4ca5-9e92-0131a96a6acf", "address": "fa:16:3e:fd:a5:f5", "network": {"id": "dca15c0d-501b-43a9-ac14-6bd62da0f9ec", "bridge": "br-int", "label": "tempest-network-smoke--1940175519", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap092b4f60-4c", "ovs_interfaceid": "092b4f60-4cd8-4ca5-9e92-0131a96a6acf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.687 187189 DEBUG nova.network.os_vif_util [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fd:a5:f5,bridge_name='br-int',has_traffic_filtering=True,id=092b4f60-4cd8-4ca5-9e92-0131a96a6acf,network=Network(dca15c0d-501b-43a9-ac14-6bd62da0f9ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap092b4f60-4c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.687 187189 DEBUG os_vif [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:a5:f5,bridge_name='br-int',has_traffic_filtering=True,id=092b4f60-4cd8-4ca5-9e92-0131a96a6acf,network=Network(dca15c0d-501b-43a9-ac14-6bd62da0f9ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap092b4f60-4c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.688 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.689 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.689 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.693 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.693 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap092b4f60-4c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.694 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap092b4f60-4c, col_values=(('external_ids', {'iface-id': '092b4f60-4cd8-4ca5-9e92-0131a96a6acf', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fd:a5:f5', 'vm-uuid': '1885cb6b-9e7f-433f-86bf-9e88d6199d90'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.695 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:51:14 compute-0 NetworkManager[55227]: <info>  [1764402674.6979] manager: (tap092b4f60-4c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/305)
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.698 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.705 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.706 187189 INFO os_vif [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:a5:f5,bridge_name='br-int',has_traffic_filtering=True,id=092b4f60-4cd8-4ca5-9e92-0131a96a6acf,network=Network(dca15c0d-501b-43a9-ac14-6bd62da0f9ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap092b4f60-4c')
Nov 29 07:51:14 compute-0 podman[247324]: 2025-11-29 07:51:14.870938963 +0000 UTC m=+0.142165677 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.954 187189 DEBUG nova.virt.libvirt.driver [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.954 187189 DEBUG nova.virt.libvirt.driver [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.954 187189 DEBUG nova.virt.libvirt.driver [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] No VIF found with MAC fa:16:3e:fd:a5:f5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:51:14 compute-0 nova_compute[187185]: 2025-11-29 07:51:14.955 187189 INFO nova.virt.libvirt.driver [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Using config drive
Nov 29 07:51:15 compute-0 sshd-session[247320]: Invalid user postgres from 20.255.62.58 port 44574
Nov 29 07:51:15 compute-0 nova_compute[187185]: 2025-11-29 07:51:15.673 187189 INFO nova.virt.libvirt.driver [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Creating config drive at /var/lib/nova/instances/1885cb6b-9e7f-433f-86bf-9e88d6199d90/disk.config
Nov 29 07:51:15 compute-0 nova_compute[187185]: 2025-11-29 07:51:15.681 187189 DEBUG oslo_concurrency.processutils [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1885cb6b-9e7f-433f-86bf-9e88d6199d90/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5zjbsdsx execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:51:15 compute-0 sshd-session[247320]: Received disconnect from 20.255.62.58 port 44574:11: Bye Bye [preauth]
Nov 29 07:51:15 compute-0 sshd-session[247320]: Disconnected from invalid user postgres 20.255.62.58 port 44574 [preauth]
Nov 29 07:51:15 compute-0 nova_compute[187185]: 2025-11-29 07:51:15.814 187189 DEBUG oslo_concurrency.processutils [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1885cb6b-9e7f-433f-86bf-9e88d6199d90/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5zjbsdsx" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:51:15 compute-0 kernel: tap092b4f60-4c: entered promiscuous mode
Nov 29 07:51:15 compute-0 NetworkManager[55227]: <info>  [1764402675.8909] manager: (tap092b4f60-4c): new Tun device (/org/freedesktop/NetworkManager/Devices/306)
Nov 29 07:51:15 compute-0 systemd-udevd[247365]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:51:15 compute-0 ovn_controller[95281]: 2025-11-29T07:51:15Z|00587|binding|INFO|Claiming lport 092b4f60-4cd8-4ca5-9e92-0131a96a6acf for this chassis.
Nov 29 07:51:15 compute-0 nova_compute[187185]: 2025-11-29 07:51:15.982 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:51:15 compute-0 ovn_controller[95281]: 2025-11-29T07:51:15Z|00588|binding|INFO|092b4f60-4cd8-4ca5-9e92-0131a96a6acf: Claiming fa:16:3e:fd:a5:f5 10.100.0.4
Nov 29 07:51:15 compute-0 nova_compute[187185]: 2025-11-29 07:51:15.993 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:51:16 compute-0 NetworkManager[55227]: <info>  [1764402676.0022] device (tap092b4f60-4c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:51:16 compute-0 NetworkManager[55227]: <info>  [1764402676.0032] device (tap092b4f60-4c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:51:16 compute-0 systemd-machined[153486]: New machine qemu-68-instance-000000b0.
Nov 29 07:51:16 compute-0 ovn_controller[95281]: 2025-11-29T07:51:16Z|00589|binding|INFO|Setting lport 092b4f60-4cd8-4ca5-9e92-0131a96a6acf ovn-installed in OVS
Nov 29 07:51:16 compute-0 nova_compute[187185]: 2025-11-29 07:51:16.074 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:51:16 compute-0 systemd[1]: Started Virtual Machine qemu-68-instance-000000b0.
Nov 29 07:51:16 compute-0 ovn_controller[95281]: 2025-11-29T07:51:16Z|00590|binding|INFO|Setting lport 092b4f60-4cd8-4ca5-9e92-0131a96a6acf up in Southbound
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:51:16.103 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fd:a5:f5 10.100.0.4'], port_security=['fa:16:3e:fd:a5:f5 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dca15c0d-501b-43a9-ac14-6bd62da0f9ec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'afb0a87a-e6cb-4bf6-93dc-3e1b8fd7af88 bf778d2c-6f77-4017-b7b4-2a7103c8ac47', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e0b105a9-0bbb-468a-9eb3-7a5d5e9f7fdf, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=092b4f60-4cd8-4ca5-9e92-0131a96a6acf) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:51:16.104 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 092b4f60-4cd8-4ca5-9e92-0131a96a6acf in datapath dca15c0d-501b-43a9-ac14-6bd62da0f9ec bound to our chassis
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:51:16.106 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dca15c0d-501b-43a9-ac14-6bd62da0f9ec
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:51:16.128 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[9036dc37-08eb-41ef-bf5c-ea00ddf4a569]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:51:16.129 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdca15c0d-51 in ovnmeta-dca15c0d-501b-43a9-ac14-6bd62da0f9ec namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:51:16.131 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdca15c0d-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:51:16.131 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[5352a004-b9d9-486c-8cf1-1482c29270d6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:51:16.132 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[3eb891bd-0a6a-4c6b-bae7-5cb66a6c2f30]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:51:16.154 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[0f435e0b-0627-490a-b98e-17b3bdb553c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:51:16.175 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[f72d0211-1081-4b32-8955-1512db8cf486]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:51:16.215 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[15929ce6-8ba2-42b9-92c2-4e8379472ea7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:51:16 compute-0 NetworkManager[55227]: <info>  [1764402676.2243] manager: (tapdca15c0d-50): new Veth device (/org/freedesktop/NetworkManager/Devices/307)
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:51:16.226 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[866155d2-8ecd-4c24-af4e-5d85ceefc13d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:51:16.272 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[bbfe16c0-4933-4b01-b666-db7193c8c296]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:51:16.277 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[b6f5f129-d71b-40d9-9791-2d904a2206ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:51:16 compute-0 NetworkManager[55227]: <info>  [1764402676.3055] device (tapdca15c0d-50): carrier: link connected
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:51:16.313 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[6ea5ad69-ef46-4534-9584-165715e16cfb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:51:16.332 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[2470133f-d970-406e-a397-8a1d91797c17]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdca15c0d-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1b:7e:1d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 184], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 812396, 'reachable_time': 39135, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247402, 'error': None, 'target': 'ovnmeta-dca15c0d-501b-43a9-ac14-6bd62da0f9ec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:51:16.358 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6197f197-5dff-4b84-806f-d313a3ecb12b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1b:7e1d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 812396, 'tstamp': 812396}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 247408, 'error': None, 'target': 'ovnmeta-dca15c0d-501b-43a9-ac14-6bd62da0f9ec', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:51:16.383 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[198eb10e-411d-4c4a-a50a-8b39722188c3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdca15c0d-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1b:7e:1d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 184], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 812396, 'reachable_time': 39135, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 247409, 'error': None, 'target': 'ovnmeta-dca15c0d-501b-43a9-ac14-6bd62da0f9ec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:51:16.421 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d195c0e7-4684-4d5c-8b27-f91ead2d28d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:51:16 compute-0 nova_compute[187185]: 2025-11-29 07:51:16.434 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764402676.4331198, 1885cb6b-9e7f-433f-86bf-9e88d6199d90 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:51:16 compute-0 nova_compute[187185]: 2025-11-29 07:51:16.435 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] VM Started (Lifecycle Event)
Nov 29 07:51:16 compute-0 nova_compute[187185]: 2025-11-29 07:51:16.471 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:51:16 compute-0 nova_compute[187185]: 2025-11-29 07:51:16.476 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764402676.4347858, 1885cb6b-9e7f-433f-86bf-9e88d6199d90 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:51:16 compute-0 nova_compute[187185]: 2025-11-29 07:51:16.477 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] VM Paused (Lifecycle Event)
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:51:16.490 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[53637833-96cd-4a9d-8742-f6ec206e956c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:51:16.492 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdca15c0d-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:51:16.492 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:51:16.492 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdca15c0d-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:51:16 compute-0 nova_compute[187185]: 2025-11-29 07:51:16.494 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:51:16 compute-0 kernel: tapdca15c0d-50: entered promiscuous mode
Nov 29 07:51:16 compute-0 nova_compute[187185]: 2025-11-29 07:51:16.496 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:51:16 compute-0 NetworkManager[55227]: <info>  [1764402676.4971] manager: (tapdca15c0d-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/308)
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:51:16.497 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdca15c0d-50, col_values=(('external_ids', {'iface-id': 'abbbcf20-7d52-44a5-8f71-8c43d1bae146'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:51:16 compute-0 nova_compute[187185]: 2025-11-29 07:51:16.497 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:51:16 compute-0 ovn_controller[95281]: 2025-11-29T07:51:16Z|00591|binding|INFO|Releasing lport abbbcf20-7d52-44a5-8f71-8c43d1bae146 from this chassis (sb_readonly=0)
Nov 29 07:51:16 compute-0 nova_compute[187185]: 2025-11-29 07:51:16.499 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:51:16.499 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/dca15c0d-501b-43a9-ac14-6bd62da0f9ec.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/dca15c0d-501b-43a9-ac14-6bd62da0f9ec.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:51:16.500 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[0b20bded-92ef-4501-b9c0-ccaca8aacef6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:51:16.501 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-dca15c0d-501b-43a9-ac14-6bd62da0f9ec
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/dca15c0d-501b-43a9-ac14-6bd62da0f9ec.pid.haproxy
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID dca15c0d-501b-43a9-ac14-6bd62da0f9ec
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:51:16 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:51:16.502 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-dca15c0d-501b-43a9-ac14-6bd62da0f9ec', 'env', 'PROCESS_TAG=haproxy-dca15c0d-501b-43a9-ac14-6bd62da0f9ec', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/dca15c0d-501b-43a9-ac14-6bd62da0f9ec.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:51:16 compute-0 nova_compute[187185]: 2025-11-29 07:51:16.513 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:51:16 compute-0 nova_compute[187185]: 2025-11-29 07:51:16.517 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:51:16 compute-0 nova_compute[187185]: 2025-11-29 07:51:16.522 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:51:16 compute-0 nova_compute[187185]: 2025-11-29 07:51:16.567 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:51:16 compute-0 podman[247442]: 2025-11-29 07:51:16.932023765 +0000 UTC m=+0.065721072 container create 79dcfd79ef5c4e7444e90aaff518d167a77bdb6e8b096275e06deea95f16b41e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dca15c0d-501b-43a9-ac14-6bd62da0f9ec, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 29 07:51:16 compute-0 systemd[1]: Started libpod-conmon-79dcfd79ef5c4e7444e90aaff518d167a77bdb6e8b096275e06deea95f16b41e.scope.
Nov 29 07:51:16 compute-0 podman[247442]: 2025-11-29 07:51:16.902474714 +0000 UTC m=+0.036172041 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:51:17 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:51:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/76252cf9ea9b3cfc173252094cab89ff3f1acc6899763e686c0489e8559b1abf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:51:17 compute-0 podman[247442]: 2025-11-29 07:51:17.031706772 +0000 UTC m=+0.165404139 container init 79dcfd79ef5c4e7444e90aaff518d167a77bdb6e8b096275e06deea95f16b41e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dca15c0d-501b-43a9-ac14-6bd62da0f9ec, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:51:17 compute-0 podman[247442]: 2025-11-29 07:51:17.040178843 +0000 UTC m=+0.173876160 container start 79dcfd79ef5c4e7444e90aaff518d167a77bdb6e8b096275e06deea95f16b41e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dca15c0d-501b-43a9-ac14-6bd62da0f9ec, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 29 07:51:17 compute-0 neutron-haproxy-ovnmeta-dca15c0d-501b-43a9-ac14-6bd62da0f9ec[247457]: [NOTICE]   (247461) : New worker (247463) forked
Nov 29 07:51:17 compute-0 neutron-haproxy-ovnmeta-dca15c0d-501b-43a9-ac14-6bd62da0f9ec[247457]: [NOTICE]   (247461) : Loading success.
Nov 29 07:51:17 compute-0 nova_compute[187185]: 2025-11-29 07:51:17.857 187189 DEBUG nova.network.neutron [req-d10ce810-3235-499a-9274-aba7cb56c6c6 req-c7d2c22c-ad94-4e35-a781-ba5905dbc627 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Updated VIF entry in instance network info cache for port 092b4f60-4cd8-4ca5-9e92-0131a96a6acf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:51:17 compute-0 nova_compute[187185]: 2025-11-29 07:51:17.858 187189 DEBUG nova.network.neutron [req-d10ce810-3235-499a-9274-aba7cb56c6c6 req-c7d2c22c-ad94-4e35-a781-ba5905dbc627 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Updating instance_info_cache with network_info: [{"id": "092b4f60-4cd8-4ca5-9e92-0131a96a6acf", "address": "fa:16:3e:fd:a5:f5", "network": {"id": "dca15c0d-501b-43a9-ac14-6bd62da0f9ec", "bridge": "br-int", "label": "tempest-network-smoke--1940175519", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap092b4f60-4c", "ovs_interfaceid": "092b4f60-4cd8-4ca5-9e92-0131a96a6acf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:51:17 compute-0 nova_compute[187185]: 2025-11-29 07:51:17.978 187189 DEBUG oslo_concurrency.lockutils [req-d10ce810-3235-499a-9274-aba7cb56c6c6 req-c7d2c22c-ad94-4e35-a781-ba5905dbc627 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-1885cb6b-9e7f-433f-86bf-9e88d6199d90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:51:18 compute-0 nova_compute[187185]: 2025-11-29 07:51:18.029 187189 DEBUG nova.compute.manager [req-30016dcc-4bb7-451f-8052-a554cabbb835 req-66add2b4-f1c3-4f7b-b1c6-f6a7fc3c9430 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Received event network-vif-plugged-092b4f60-4cd8-4ca5-9e92-0131a96a6acf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:51:18 compute-0 nova_compute[187185]: 2025-11-29 07:51:18.030 187189 DEBUG oslo_concurrency.lockutils [req-30016dcc-4bb7-451f-8052-a554cabbb835 req-66add2b4-f1c3-4f7b-b1c6-f6a7fc3c9430 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "1885cb6b-9e7f-433f-86bf-9e88d6199d90-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:51:18 compute-0 nova_compute[187185]: 2025-11-29 07:51:18.030 187189 DEBUG oslo_concurrency.lockutils [req-30016dcc-4bb7-451f-8052-a554cabbb835 req-66add2b4-f1c3-4f7b-b1c6-f6a7fc3c9430 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1885cb6b-9e7f-433f-86bf-9e88d6199d90-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:51:18 compute-0 nova_compute[187185]: 2025-11-29 07:51:18.031 187189 DEBUG oslo_concurrency.lockutils [req-30016dcc-4bb7-451f-8052-a554cabbb835 req-66add2b4-f1c3-4f7b-b1c6-f6a7fc3c9430 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1885cb6b-9e7f-433f-86bf-9e88d6199d90-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:51:18 compute-0 nova_compute[187185]: 2025-11-29 07:51:18.031 187189 DEBUG nova.compute.manager [req-30016dcc-4bb7-451f-8052-a554cabbb835 req-66add2b4-f1c3-4f7b-b1c6-f6a7fc3c9430 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Processing event network-vif-plugged-092b4f60-4cd8-4ca5-9e92-0131a96a6acf _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 07:51:18 compute-0 nova_compute[187185]: 2025-11-29 07:51:18.031 187189 DEBUG nova.compute.manager [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:51:18 compute-0 nova_compute[187185]: 2025-11-29 07:51:18.036 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764402678.0364654, 1885cb6b-9e7f-433f-86bf-9e88d6199d90 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:51:18 compute-0 nova_compute[187185]: 2025-11-29 07:51:18.037 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] VM Resumed (Lifecycle Event)
Nov 29 07:51:18 compute-0 nova_compute[187185]: 2025-11-29 07:51:18.039 187189 DEBUG nova.virt.libvirt.driver [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 07:51:18 compute-0 nova_compute[187185]: 2025-11-29 07:51:18.042 187189 INFO nova.virt.libvirt.driver [-] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Instance spawned successfully.
Nov 29 07:51:18 compute-0 nova_compute[187185]: 2025-11-29 07:51:18.042 187189 DEBUG nova.virt.libvirt.driver [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 07:51:18 compute-0 nova_compute[187185]: 2025-11-29 07:51:18.058 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:51:18 compute-0 nova_compute[187185]: 2025-11-29 07:51:18.063 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:51:18 compute-0 nova_compute[187185]: 2025-11-29 07:51:18.072 187189 DEBUG nova.virt.libvirt.driver [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:51:18 compute-0 nova_compute[187185]: 2025-11-29 07:51:18.072 187189 DEBUG nova.virt.libvirt.driver [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:51:18 compute-0 nova_compute[187185]: 2025-11-29 07:51:18.073 187189 DEBUG nova.virt.libvirt.driver [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:51:18 compute-0 nova_compute[187185]: 2025-11-29 07:51:18.073 187189 DEBUG nova.virt.libvirt.driver [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:51:18 compute-0 nova_compute[187185]: 2025-11-29 07:51:18.074 187189 DEBUG nova.virt.libvirt.driver [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:51:18 compute-0 nova_compute[187185]: 2025-11-29 07:51:18.074 187189 DEBUG nova.virt.libvirt.driver [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:51:18 compute-0 nova_compute[187185]: 2025-11-29 07:51:18.080 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:51:18 compute-0 nova_compute[187185]: 2025-11-29 07:51:18.144 187189 INFO nova.compute.manager [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Took 8.24 seconds to spawn the instance on the hypervisor.
Nov 29 07:51:18 compute-0 nova_compute[187185]: 2025-11-29 07:51:18.145 187189 DEBUG nova.compute.manager [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:51:18 compute-0 nova_compute[187185]: 2025-11-29 07:51:18.260 187189 INFO nova.compute.manager [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Took 9.78 seconds to build instance.
Nov 29 07:51:18 compute-0 nova_compute[187185]: 2025-11-29 07:51:18.288 187189 DEBUG oslo_concurrency.lockutils [None req-a23658a3-6116-4c28-b280-312c6f57ed41 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "1885cb6b-9e7f-433f-86bf-9e88d6199d90" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:51:18 compute-0 nova_compute[187185]: 2025-11-29 07:51:18.344 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:51:19 compute-0 nova_compute[187185]: 2025-11-29 07:51:19.697 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:51:20 compute-0 nova_compute[187185]: 2025-11-29 07:51:20.941 187189 DEBUG nova.compute.manager [req-9ca1d1f6-2d90-47b8-8a1c-dc855829797b req-4499fba1-61ac-458d-b1ea-fe5ff9f2b18d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Received event network-vif-plugged-092b4f60-4cd8-4ca5-9e92-0131a96a6acf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:51:20 compute-0 nova_compute[187185]: 2025-11-29 07:51:20.941 187189 DEBUG oslo_concurrency.lockutils [req-9ca1d1f6-2d90-47b8-8a1c-dc855829797b req-4499fba1-61ac-458d-b1ea-fe5ff9f2b18d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "1885cb6b-9e7f-433f-86bf-9e88d6199d90-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:51:20 compute-0 nova_compute[187185]: 2025-11-29 07:51:20.942 187189 DEBUG oslo_concurrency.lockutils [req-9ca1d1f6-2d90-47b8-8a1c-dc855829797b req-4499fba1-61ac-458d-b1ea-fe5ff9f2b18d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1885cb6b-9e7f-433f-86bf-9e88d6199d90-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:51:20 compute-0 nova_compute[187185]: 2025-11-29 07:51:20.942 187189 DEBUG oslo_concurrency.lockutils [req-9ca1d1f6-2d90-47b8-8a1c-dc855829797b req-4499fba1-61ac-458d-b1ea-fe5ff9f2b18d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1885cb6b-9e7f-433f-86bf-9e88d6199d90-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:51:20 compute-0 nova_compute[187185]: 2025-11-29 07:51:20.942 187189 DEBUG nova.compute.manager [req-9ca1d1f6-2d90-47b8-8a1c-dc855829797b req-4499fba1-61ac-458d-b1ea-fe5ff9f2b18d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] No waiting events found dispatching network-vif-plugged-092b4f60-4cd8-4ca5-9e92-0131a96a6acf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:51:20 compute-0 nova_compute[187185]: 2025-11-29 07:51:20.943 187189 WARNING nova.compute.manager [req-9ca1d1f6-2d90-47b8-8a1c-dc855829797b req-4499fba1-61ac-458d-b1ea-fe5ff9f2b18d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Received unexpected event network-vif-plugged-092b4f60-4cd8-4ca5-9e92-0131a96a6acf for instance with vm_state active and task_state None.
Nov 29 07:51:23 compute-0 nova_compute[187185]: 2025-11-29 07:51:23.347 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:51:23 compute-0 podman[247472]: 2025-11-29 07:51:23.815443117 +0000 UTC m=+0.066945657 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 07:51:24 compute-0 NetworkManager[55227]: <info>  [1764402684.1365] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/309)
Nov 29 07:51:24 compute-0 NetworkManager[55227]: <info>  [1764402684.1374] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/310)
Nov 29 07:51:24 compute-0 nova_compute[187185]: 2025-11-29 07:51:24.144 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:51:24 compute-0 nova_compute[187185]: 2025-11-29 07:51:24.341 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:51:24 compute-0 ovn_controller[95281]: 2025-11-29T07:51:24Z|00592|binding|INFO|Releasing lport abbbcf20-7d52-44a5-8f71-8c43d1bae146 from this chassis (sb_readonly=0)
Nov 29 07:51:24 compute-0 nova_compute[187185]: 2025-11-29 07:51:24.371 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:51:24 compute-0 nova_compute[187185]: 2025-11-29 07:51:24.700 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:51:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:51:25.751 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:51:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:51:25.754 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:51:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:51:25.754 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:51:25 compute-0 podman[247499]: 2025-11-29 07:51:25.821966514 +0000 UTC m=+0.089831307 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=multipathd)
Nov 29 07:51:25 compute-0 podman[247500]: 2025-11-29 07:51:25.861983653 +0000 UTC m=+0.115200189 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:51:26 compute-0 nova_compute[187185]: 2025-11-29 07:51:26.301 187189 DEBUG nova.compute.manager [req-8f1447e2-7f28-476a-befe-c217928cf765 req-378b00fa-6d30-48c6-856a-4627ae96f001 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Received event network-changed-092b4f60-4cd8-4ca5-9e92-0131a96a6acf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:51:26 compute-0 nova_compute[187185]: 2025-11-29 07:51:26.302 187189 DEBUG nova.compute.manager [req-8f1447e2-7f28-476a-befe-c217928cf765 req-378b00fa-6d30-48c6-856a-4627ae96f001 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Refreshing instance network info cache due to event network-changed-092b4f60-4cd8-4ca5-9e92-0131a96a6acf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:51:26 compute-0 nova_compute[187185]: 2025-11-29 07:51:26.302 187189 DEBUG oslo_concurrency.lockutils [req-8f1447e2-7f28-476a-befe-c217928cf765 req-378b00fa-6d30-48c6-856a-4627ae96f001 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-1885cb6b-9e7f-433f-86bf-9e88d6199d90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:51:26 compute-0 nova_compute[187185]: 2025-11-29 07:51:26.302 187189 DEBUG oslo_concurrency.lockutils [req-8f1447e2-7f28-476a-befe-c217928cf765 req-378b00fa-6d30-48c6-856a-4627ae96f001 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-1885cb6b-9e7f-433f-86bf-9e88d6199d90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:51:26 compute-0 nova_compute[187185]: 2025-11-29 07:51:26.302 187189 DEBUG nova.network.neutron [req-8f1447e2-7f28-476a-befe-c217928cf765 req-378b00fa-6d30-48c6-856a-4627ae96f001 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Refreshing network info cache for port 092b4f60-4cd8-4ca5-9e92-0131a96a6acf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:51:27 compute-0 nova_compute[187185]: 2025-11-29 07:51:27.987 187189 DEBUG nova.network.neutron [req-8f1447e2-7f28-476a-befe-c217928cf765 req-378b00fa-6d30-48c6-856a-4627ae96f001 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Updated VIF entry in instance network info cache for port 092b4f60-4cd8-4ca5-9e92-0131a96a6acf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:51:27 compute-0 nova_compute[187185]: 2025-11-29 07:51:27.988 187189 DEBUG nova.network.neutron [req-8f1447e2-7f28-476a-befe-c217928cf765 req-378b00fa-6d30-48c6-856a-4627ae96f001 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Updating instance_info_cache with network_info: [{"id": "092b4f60-4cd8-4ca5-9e92-0131a96a6acf", "address": "fa:16:3e:fd:a5:f5", "network": {"id": "dca15c0d-501b-43a9-ac14-6bd62da0f9ec", "bridge": "br-int", "label": "tempest-network-smoke--1940175519", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap092b4f60-4c", "ovs_interfaceid": "092b4f60-4cd8-4ca5-9e92-0131a96a6acf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:51:28 compute-0 nova_compute[187185]: 2025-11-29 07:51:28.009 187189 DEBUG oslo_concurrency.lockutils [req-8f1447e2-7f28-476a-befe-c217928cf765 req-378b00fa-6d30-48c6-856a-4627ae96f001 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-1885cb6b-9e7f-433f-86bf-9e88d6199d90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:51:28 compute-0 nova_compute[187185]: 2025-11-29 07:51:28.349 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:51:28 compute-0 ovn_controller[95281]: 2025-11-29T07:51:28Z|00593|binding|INFO|Releasing lport abbbcf20-7d52-44a5-8f71-8c43d1bae146 from this chassis (sb_readonly=0)
Nov 29 07:51:29 compute-0 nova_compute[187185]: 2025-11-29 07:51:29.021 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:51:29 compute-0 nova_compute[187185]: 2025-11-29 07:51:29.702 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:51:30 compute-0 ovn_controller[95281]: 2025-11-29T07:51:30Z|00081|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:fd:a5:f5 10.100.0.4
Nov 29 07:51:30 compute-0 ovn_controller[95281]: 2025-11-29T07:51:30Z|00082|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:fd:a5:f5 10.100.0.4
Nov 29 07:51:32 compute-0 nova_compute[187185]: 2025-11-29 07:51:32.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:51:32 compute-0 ovn_controller[95281]: 2025-11-29T07:51:32Z|00594|binding|INFO|Releasing lport abbbcf20-7d52-44a5-8f71-8c43d1bae146 from this chassis (sb_readonly=0)
Nov 29 07:51:32 compute-0 nova_compute[187185]: 2025-11-29 07:51:32.946 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:51:33 compute-0 nova_compute[187185]: 2025-11-29 07:51:33.352 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:51:34 compute-0 nova_compute[187185]: 2025-11-29 07:51:34.705 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:51:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:51:34.806 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=70, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=69) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:51:34 compute-0 nova_compute[187185]: 2025-11-29 07:51:34.806 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:51:34 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:51:34.807 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:51:35 compute-0 sshd-session[247559]: Received disconnect from 115.190.136.184 port 61406:11: Bye Bye [preauth]
Nov 29 07:51:35 compute-0 sshd-session[247559]: Disconnected from authenticating user root 115.190.136.184 port 61406 [preauth]
Nov 29 07:51:35 compute-0 nova_compute[187185]: 2025-11-29 07:51:35.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:51:35 compute-0 nova_compute[187185]: 2025-11-29 07:51:35.371 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:51:35 compute-0 nova_compute[187185]: 2025-11-29 07:51:35.372 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:51:35 compute-0 nova_compute[187185]: 2025-11-29 07:51:35.372 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:51:35 compute-0 nova_compute[187185]: 2025-11-29 07:51:35.372 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:51:35 compute-0 nova_compute[187185]: 2025-11-29 07:51:35.465 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1885cb6b-9e7f-433f-86bf-9e88d6199d90/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:51:35 compute-0 nova_compute[187185]: 2025-11-29 07:51:35.566 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1885cb6b-9e7f-433f-86bf-9e88d6199d90/disk --force-share --output=json" returned: 0 in 0.102s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:51:35 compute-0 nova_compute[187185]: 2025-11-29 07:51:35.568 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1885cb6b-9e7f-433f-86bf-9e88d6199d90/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:51:35 compute-0 nova_compute[187185]: 2025-11-29 07:51:35.646 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1885cb6b-9e7f-433f-86bf-9e88d6199d90/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:51:35 compute-0 nova_compute[187185]: 2025-11-29 07:51:35.828 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:51:35 compute-0 nova_compute[187185]: 2025-11-29 07:51:35.831 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5549MB free_disk=73.22471237182617GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:51:35 compute-0 nova_compute[187185]: 2025-11-29 07:51:35.832 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:51:35 compute-0 nova_compute[187185]: 2025-11-29 07:51:35.833 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:51:35 compute-0 nova_compute[187185]: 2025-11-29 07:51:35.910 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance 1885cb6b-9e7f-433f-86bf-9e88d6199d90 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 07:51:35 compute-0 nova_compute[187185]: 2025-11-29 07:51:35.911 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:51:35 compute-0 nova_compute[187185]: 2025-11-29 07:51:35.911 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:51:35 compute-0 nova_compute[187185]: 2025-11-29 07:51:35.961 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:51:35 compute-0 nova_compute[187185]: 2025-11-29 07:51:35.978 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:51:35 compute-0 nova_compute[187185]: 2025-11-29 07:51:35.998 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:51:35 compute-0 nova_compute[187185]: 2025-11-29 07:51:35.998 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:51:36 compute-0 podman[247570]: 2025-11-29 07:51:36.811017189 +0000 UTC m=+0.064478666 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 07:51:36 compute-0 podman[247568]: 2025-11-29 07:51:36.811244386 +0000 UTC m=+0.077049424 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 07:51:36 compute-0 podman[247569]: 2025-11-29 07:51:36.815932159 +0000 UTC m=+0.069867749 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., config_id=edpm, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, distribution-scope=public, managed_by=edpm_ansible, vcs-type=git, name=ubi9-minimal, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, release=1755695350, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 29 07:51:37 compute-0 nova_compute[187185]: 2025-11-29 07:51:36.999 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:51:37 compute-0 nova_compute[187185]: 2025-11-29 07:51:37.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:51:37 compute-0 sshd-session[247538]: error: kex_exchange_identification: read: Connection timed out
Nov 29 07:51:37 compute-0 sshd-session[247538]: banner exchange: Connection from 115.190.187.93 port 34414: Connection timed out
Nov 29 07:51:38 compute-0 nova_compute[187185]: 2025-11-29 07:51:38.354 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:51:39 compute-0 nova_compute[187185]: 2025-11-29 07:51:39.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:51:39 compute-0 nova_compute[187185]: 2025-11-29 07:51:39.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:51:39 compute-0 nova_compute[187185]: 2025-11-29 07:51:39.707 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:51:43 compute-0 nova_compute[187185]: 2025-11-29 07:51:43.311 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:51:43 compute-0 nova_compute[187185]: 2025-11-29 07:51:43.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:51:43 compute-0 nova_compute[187185]: 2025-11-29 07:51:43.315 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:51:43 compute-0 nova_compute[187185]: 2025-11-29 07:51:43.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:51:43 compute-0 nova_compute[187185]: 2025-11-29 07:51:43.357 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:51:44 compute-0 nova_compute[187185]: 2025-11-29 07:51:44.710 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:51:44 compute-0 nova_compute[187185]: 2025-11-29 07:51:44.806 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "refresh_cache-1885cb6b-9e7f-433f-86bf-9e88d6199d90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:51:44 compute-0 nova_compute[187185]: 2025-11-29 07:51:44.807 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquired lock "refresh_cache-1885cb6b-9e7f-433f-86bf-9e88d6199d90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:51:44 compute-0 nova_compute[187185]: 2025-11-29 07:51:44.807 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 29 07:51:44 compute-0 nova_compute[187185]: 2025-11-29 07:51:44.808 187189 DEBUG nova.objects.instance [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 1885cb6b-9e7f-433f-86bf-9e88d6199d90 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:51:44 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:51:44.810 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '70'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:51:45 compute-0 podman[247630]: 2025-11-29 07:51:45.891960626 +0000 UTC m=+0.126379828 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0)
Nov 29 07:51:48 compute-0 nova_compute[187185]: 2025-11-29 07:51:48.178 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Updating instance_info_cache with network_info: [{"id": "092b4f60-4cd8-4ca5-9e92-0131a96a6acf", "address": "fa:16:3e:fd:a5:f5", "network": {"id": "dca15c0d-501b-43a9-ac14-6bd62da0f9ec", "bridge": "br-int", "label": "tempest-network-smoke--1940175519", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap092b4f60-4c", "ovs_interfaceid": "092b4f60-4cd8-4ca5-9e92-0131a96a6acf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:51:48 compute-0 nova_compute[187185]: 2025-11-29 07:51:48.360 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:51:48 compute-0 nova_compute[187185]: 2025-11-29 07:51:48.365 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Releasing lock "refresh_cache-1885cb6b-9e7f-433f-86bf-9e88d6199d90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:51:48 compute-0 nova_compute[187185]: 2025-11-29 07:51:48.366 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 29 07:51:48 compute-0 nova_compute[187185]: 2025-11-29 07:51:48.366 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:51:48 compute-0 nova_compute[187185]: 2025-11-29 07:51:48.367 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:51:49 compute-0 nova_compute[187185]: 2025-11-29 07:51:49.714 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:51:50 compute-0 sshd-session[247656]: Invalid user minecraft from 190.181.27.27 port 40948
Nov 29 07:51:50 compute-0 sshd-session[247656]: Received disconnect from 190.181.27.27 port 40948:11: Bye Bye [preauth]
Nov 29 07:51:50 compute-0 sshd-session[247656]: Disconnected from invalid user minecraft 190.181.27.27 port 40948 [preauth]
Nov 29 07:51:53 compute-0 nova_compute[187185]: 2025-11-29 07:51:53.361 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:51:54 compute-0 nova_compute[187185]: 2025-11-29 07:51:54.720 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:51:54 compute-0 podman[247658]: 2025-11-29 07:51:54.812025233 +0000 UTC m=+0.067861032 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 07:51:56 compute-0 podman[247684]: 2025-11-29 07:51:56.805327504 +0000 UTC m=+0.066802332 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 29 07:51:56 compute-0 podman[247683]: 2025-11-29 07:51:56.820717252 +0000 UTC m=+0.081274314 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:51:58 compute-0 nova_compute[187185]: 2025-11-29 07:51:58.202 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:51:58 compute-0 nova_compute[187185]: 2025-11-29 07:51:58.363 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:51:59 compute-0 nova_compute[187185]: 2025-11-29 07:51:59.723 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:52:03 compute-0 nova_compute[187185]: 2025-11-29 07:52:03.365 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:52:04 compute-0 nova_compute[187185]: 2025-11-29 07:52:04.727 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:52:07 compute-0 podman[247721]: 2025-11-29 07:52:07.799790331 +0000 UTC m=+0.060717889 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:52:07 compute-0 podman[247722]: 2025-11-29 07:52:07.806547754 +0000 UTC m=+0.062173561 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.openshift.expose-services=, managed_by=edpm_ansible, release=1755695350, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-type=git, vendor=Red Hat, Inc.)
Nov 29 07:52:07 compute-0 podman[247723]: 2025-11-29 07:52:07.819016268 +0000 UTC m=+0.065920497 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 07:52:08 compute-0 nova_compute[187185]: 2025-11-29 07:52:08.367 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:52:09 compute-0 nova_compute[187185]: 2025-11-29 07:52:09.732 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:52:09 compute-0 nova_compute[187185]: 2025-11-29 07:52:09.966 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:52:13 compute-0 nova_compute[187185]: 2025-11-29 07:52:13.370 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:52:14 compute-0 nova_compute[187185]: 2025-11-29 07:52:14.736 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:52:16 compute-0 podman[247786]: 2025-11-29 07:52:16.881120119 +0000 UTC m=+0.140027087 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:52:18 compute-0 nova_compute[187185]: 2025-11-29 07:52:18.375 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:52:19 compute-0 nova_compute[187185]: 2025-11-29 07:52:19.740 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:52:23 compute-0 nova_compute[187185]: 2025-11-29 07:52:23.377 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:52:24 compute-0 nova_compute[187185]: 2025-11-29 07:52:24.744 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:52:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:52:25.752 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:52:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:52:25.753 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:52:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:52:25.754 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:52:25 compute-0 podman[247812]: 2025-11-29 07:52:25.794997098 +0000 UTC m=+0.059185095 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 07:52:28 compute-0 podman[247840]: 2025-11-29 07:52:28.185001761 +0000 UTC m=+0.067170463 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 29 07:52:28 compute-0 podman[247841]: 2025-11-29 07:52:28.185985789 +0000 UTC m=+0.064588509 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:52:28 compute-0 nova_compute[187185]: 2025-11-29 07:52:28.379 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:52:29 compute-0 nova_compute[187185]: 2025-11-29 07:52:29.756 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:52:31 compute-0 sshd-session[247839]: Connection closed by 115.190.187.93 port 53394 [preauth]
Nov 29 07:52:33 compute-0 nova_compute[187185]: 2025-11-29 07:52:33.326 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:52:33 compute-0 nova_compute[187185]: 2025-11-29 07:52:33.381 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:52:34 compute-0 nova_compute[187185]: 2025-11-29 07:52:34.759 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:52:36 compute-0 nova_compute[187185]: 2025-11-29 07:52:36.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:52:36 compute-0 nova_compute[187185]: 2025-11-29 07:52:36.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:52:36 compute-0 nova_compute[187185]: 2025-11-29 07:52:36.354 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:52:36 compute-0 nova_compute[187185]: 2025-11-29 07:52:36.355 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:52:36 compute-0 nova_compute[187185]: 2025-11-29 07:52:36.355 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:52:36 compute-0 nova_compute[187185]: 2025-11-29 07:52:36.355 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:52:36 compute-0 nova_compute[187185]: 2025-11-29 07:52:36.448 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1885cb6b-9e7f-433f-86bf-9e88d6199d90/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:52:36 compute-0 nova_compute[187185]: 2025-11-29 07:52:36.525 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1885cb6b-9e7f-433f-86bf-9e88d6199d90/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:52:36 compute-0 nova_compute[187185]: 2025-11-29 07:52:36.526 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1885cb6b-9e7f-433f-86bf-9e88d6199d90/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:52:36 compute-0 nova_compute[187185]: 2025-11-29 07:52:36.589 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1885cb6b-9e7f-433f-86bf-9e88d6199d90/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:52:36 compute-0 nova_compute[187185]: 2025-11-29 07:52:36.768 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:52:36 compute-0 nova_compute[187185]: 2025-11-29 07:52:36.770 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5576MB free_disk=73.21733856201172GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:52:36 compute-0 nova_compute[187185]: 2025-11-29 07:52:36.770 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:52:36 compute-0 nova_compute[187185]: 2025-11-29 07:52:36.771 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:52:36 compute-0 nova_compute[187185]: 2025-11-29 07:52:36.854 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance 1885cb6b-9e7f-433f-86bf-9e88d6199d90 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 07:52:36 compute-0 nova_compute[187185]: 2025-11-29 07:52:36.855 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:52:36 compute-0 nova_compute[187185]: 2025-11-29 07:52:36.855 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:52:36 compute-0 nova_compute[187185]: 2025-11-29 07:52:36.896 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:52:36 compute-0 nova_compute[187185]: 2025-11-29 07:52:36.921 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:52:36 compute-0 nova_compute[187185]: 2025-11-29 07:52:36.923 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:52:36 compute-0 nova_compute[187185]: 2025-11-29 07:52:36.923 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.152s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:52:37 compute-0 nova_compute[187185]: 2025-11-29 07:52:37.924 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:52:38 compute-0 nova_compute[187185]: 2025-11-29 07:52:38.427 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:52:38 compute-0 podman[247887]: 2025-11-29 07:52:38.802171022 +0000 UTC m=+0.058032213 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 07:52:38 compute-0 podman[247888]: 2025-11-29 07:52:38.814849923 +0000 UTC m=+0.065822365 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container)
Nov 29 07:52:38 compute-0 podman[247889]: 2025-11-29 07:52:38.817238881 +0000 UTC m=+0.062363176 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 07:52:39 compute-0 nova_compute[187185]: 2025-11-29 07:52:39.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:52:39 compute-0 nova_compute[187185]: 2025-11-29 07:52:39.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:52:39 compute-0 sshd-session[247885]: Invalid user tempuser from 20.255.62.58 port 50530
Nov 29 07:52:39 compute-0 nova_compute[187185]: 2025-11-29 07:52:39.762 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:52:39 compute-0 sshd-session[247885]: Received disconnect from 20.255.62.58 port 50530:11: Bye Bye [preauth]
Nov 29 07:52:39 compute-0 sshd-session[247885]: Disconnected from invalid user tempuser 20.255.62.58 port 50530 [preauth]
Nov 29 07:52:43 compute-0 nova_compute[187185]: 2025-11-29 07:52:43.431 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:52:44 compute-0 nova_compute[187185]: 2025-11-29 07:52:44.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:52:44 compute-0 nova_compute[187185]: 2025-11-29 07:52:44.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:52:44 compute-0 nova_compute[187185]: 2025-11-29 07:52:44.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:52:44 compute-0 nova_compute[187185]: 2025-11-29 07:52:44.765 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:52:44 compute-0 nova_compute[187185]: 2025-11-29 07:52:44.833 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "refresh_cache-1885cb6b-9e7f-433f-86bf-9e88d6199d90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:52:44 compute-0 nova_compute[187185]: 2025-11-29 07:52:44.834 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquired lock "refresh_cache-1885cb6b-9e7f-433f-86bf-9e88d6199d90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:52:44 compute-0 nova_compute[187185]: 2025-11-29 07:52:44.834 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 29 07:52:44 compute-0 nova_compute[187185]: 2025-11-29 07:52:44.835 187189 DEBUG nova.objects.instance [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 1885cb6b-9e7f-433f-86bf-9e88d6199d90 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:52:47 compute-0 podman[247947]: 2025-11-29 07:52:47.830940931 +0000 UTC m=+0.093979026 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:52:48 compute-0 nova_compute[187185]: 2025-11-29 07:52:48.017 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Updating instance_info_cache with network_info: [{"id": "092b4f60-4cd8-4ca5-9e92-0131a96a6acf", "address": "fa:16:3e:fd:a5:f5", "network": {"id": "dca15c0d-501b-43a9-ac14-6bd62da0f9ec", "bridge": "br-int", "label": "tempest-network-smoke--1940175519", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap092b4f60-4c", "ovs_interfaceid": "092b4f60-4cd8-4ca5-9e92-0131a96a6acf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.026 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90', 'name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1437223324', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000b0', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'hostId': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.028 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.045 12 DEBUG ceilometer.compute.pollsters [-] 1885cb6b-9e7f-433f-86bf-9e88d6199d90/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.048 12 DEBUG ceilometer.compute.pollsters [-] 1885cb6b-9e7f-433f-86bf-9e88d6199d90/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '92546069-c857-410d-915d-90b88c1df5d4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90-vda', 'timestamp': '2025-11-29T07:52:48.028523', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1437223324', 'name': 'instance-000000b0', 'instance_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '65ce8e44-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8215.746807577, 'message_signature': 'e39cc1822e59e0d60685a28234c379203b96f10ef951577d8eadd60d29684176'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90-sda', 'timestamp': '2025-11-29T07:52:48.028523', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1437223324', 'name': 'instance-000000b0', 'instance_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '65ceb202-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8215.746807577, 'message_signature': '0dd4df0920af155c0b195a41747d1997b50cbf6203272b9a415639a1aa1159df'}]}, 'timestamp': '2025-11-29 07:52:48.048777', '_unique_id': 'eeaf29e31f8441bbb6521c40e4567420'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.051 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.052 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 07:52:48 compute-0 nova_compute[187185]: 2025-11-29 07:52:48.085 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Releasing lock "refresh_cache-1885cb6b-9e7f-433f-86bf-9e88d6199d90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:52:48 compute-0 nova_compute[187185]: 2025-11-29 07:52:48.086 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 29 07:52:48 compute-0 nova_compute[187185]: 2025-11-29 07:52:48.087 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:52:48 compute-0 nova_compute[187185]: 2025-11-29 07:52:48.087 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:52:48 compute-0 nova_compute[187185]: 2025-11-29 07:52:48.087 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.087 12 DEBUG ceilometer.compute.pollsters [-] 1885cb6b-9e7f-433f-86bf-9e88d6199d90/disk.device.write.bytes volume: 73076736 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:52:48 compute-0 nova_compute[187185]: 2025-11-29 07:52:48.088 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.088 12 DEBUG ceilometer.compute.pollsters [-] 1885cb6b-9e7f-433f-86bf-9e88d6199d90/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1d2d1578-3287-4f6b-b70f-33b28a1c7645', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73076736, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90-vda', 'timestamp': '2025-11-29T07:52:48.052265', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1437223324', 'name': 'instance-000000b0', 'instance_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '65d4bf76-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8215.770526782, 'message_signature': '77963c00d53fec7e0c55ed8dfa6e5ef6aa259ac160fba4ce08a37cac3808ae03'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 
'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90-sda', 'timestamp': '2025-11-29T07:52:48.052265', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1437223324', 'name': 'instance-000000b0', 'instance_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '65d4d06a-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8215.770526782, 'message_signature': '1ee416f2b6384f9faac52a19d29bfb935df9fc7baacd1b8886df60421072b233'}]}, 'timestamp': '2025-11-29 07:52:48.088879', '_unique_id': '45619b1a335444b6adcaa28571a223bc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.090 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.091 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.091 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.092 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1437223324>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1437223324>]
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.092 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.092 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.092 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1437223324>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1437223324>]
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.092 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.093 12 DEBUG ceilometer.compute.pollsters [-] 1885cb6b-9e7f-433f-86bf-9e88d6199d90/disk.device.write.latency volume: 2956230219 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.093 12 DEBUG ceilometer.compute.pollsters [-] 1885cb6b-9e7f-433f-86bf-9e88d6199d90/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a4fda31a-402d-46c4-91ba-aabfd1e4e83e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2956230219, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90-vda', 'timestamp': '2025-11-29T07:52:48.093007', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1437223324', 'name': 'instance-000000b0', 'instance_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '65d58096-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8215.770526782, 'message_signature': '080a3770e8bb67d6898130ea8d850c2e70f265c853f5338a62905a4978076222'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 
'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90-sda', 'timestamp': '2025-11-29T07:52:48.093007', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1437223324', 'name': 'instance-000000b0', 'instance_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '65d58f0a-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8215.770526782, 'message_signature': 'f589c789db3450bc42d8e1f9f514ed08c9a5255f0da80d6b702968203d429259'}]}, 'timestamp': '2025-11-29 07:52:48.093720', '_unique_id': 'd3af977ca7bb4844a324ddeaf59a7271'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.094 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.095 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.100 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 1885cb6b-9e7f-433f-86bf-9e88d6199d90 / tap092b4f60-4c inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.100 12 DEBUG ceilometer.compute.pollsters [-] 1885cb6b-9e7f-433f-86bf-9e88d6199d90/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '507834f2-a328-4bd3-b7c5-ea86ead8d294', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b0-1885cb6b-9e7f-433f-86bf-9e88d6199d90-tap092b4f60-4c', 'timestamp': '2025-11-29T07:52:48.095599', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1437223324', 'name': 'tap092b4f60-4c', 'instance_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fd:a5:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap092b4f60-4c'}, 'message_id': '65d6a98a-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8215.813859746, 'message_signature': '6bff01744b063520d7e8089bde5edd719ff4d765eb97da5a2e45b25d0ba95ae4'}]}, 'timestamp': '2025-11-29 07:52:48.101093', '_unique_id': '4a3f03c631854e1f9a86974a89784ab7'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.102 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.103 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.103 12 DEBUG ceilometer.compute.pollsters [-] 1885cb6b-9e7f-433f-86bf-9e88d6199d90/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ad061d28-a51e-4315-94e7-b359411cad4e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b0-1885cb6b-9e7f-433f-86bf-9e88d6199d90-tap092b4f60-4c', 'timestamp': '2025-11-29T07:52:48.103311', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1437223324', 'name': 'tap092b4f60-4c', 'instance_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fd:a5:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap092b4f60-4c'}, 'message_id': '65d712ee-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8215.813859746, 'message_signature': '3c4bc4f70f87c18fb96a9ac81c30dcff423d127e5b015eb279e7c664b7ea6924'}]}, 'timestamp': '2025-11-29 07:52:48.103674', '_unique_id': 'e10f17e0a86e4c4ebd98b4629db07ba0'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.104 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.106 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.106 12 DEBUG ceilometer.compute.pollsters [-] 1885cb6b-9e7f-433f-86bf-9e88d6199d90/network.outgoing.bytes volume: 5816 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9cd85d2b-76b0-4ceb-ac53-470ff4aa8b8f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 5816, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b0-1885cb6b-9e7f-433f-86bf-9e88d6199d90-tap092b4f60-4c', 'timestamp': '2025-11-29T07:52:48.106179', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1437223324', 'name': 'tap092b4f60-4c', 'instance_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fd:a5:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap092b4f60-4c'}, 'message_id': '65d782b0-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8215.813859746, 'message_signature': '2a2f1d6ffe5e3db4e1185dc92be147bc8f1c3e31f0116e0409085db029d9c4c7'}]}, 'timestamp': '2025-11-29 07:52:48.106551', '_unique_id': '827e923a31044c988d80035adfc0762a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.107 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.108 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.109 12 DEBUG ceilometer.compute.pollsters [-] 1885cb6b-9e7f-433f-86bf-9e88d6199d90/disk.device.read.latency volume: 181321800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.109 12 DEBUG ceilometer.compute.pollsters [-] 1885cb6b-9e7f-433f-86bf-9e88d6199d90/disk.device.read.latency volume: 23400765 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7a128051-6cdb-4974-afee-1a2beb434cf4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 181321800, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90-vda', 'timestamp': '2025-11-29T07:52:48.109114', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1437223324', 'name': 'instance-000000b0', 'instance_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '65d7f86c-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8215.770526782, 'message_signature': 'bb92ffc96986a099c6c6d50682b3adccf5d32b6758c3f725761cda895c79e992'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 23400765, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90-sda', 'timestamp': '2025-11-29T07:52:48.109114', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1437223324', 'name': 'instance-000000b0', 'instance_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '65d8067c-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8215.770526782, 'message_signature': '4ff5b6296394f9e185ac40eb5efc924c459c434787a35928bd8f4668ec51c1a3'}]}, 'timestamp': '2025-11-29 07:52:48.109959', '_unique_id': 'c722b5ef0be9457183e3e2dcc2ce90e7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.110 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.112 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.112 12 DEBUG ceilometer.compute.pollsters [-] 1885cb6b-9e7f-433f-86bf-9e88d6199d90/disk.device.write.requests volume: 296 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.113 12 DEBUG ceilometer.compute.pollsters [-] 1885cb6b-9e7f-433f-86bf-9e88d6199d90/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '89c1844e-f1c5-47a3-9f90-9b9495727270', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 296, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90-vda', 'timestamp': '2025-11-29T07:52:48.112527', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1437223324', 'name': 'instance-000000b0', 'instance_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '65d88066-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8215.770526782, 'message_signature': 'cd155bfe41210707f12df8d9b4b7d2ecefa43227e22fff828e3ff0272aa3ccf9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90-sda', 'timestamp': '2025-11-29T07:52:48.112527', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1437223324', 'name': 'instance-000000b0', 'instance_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '65d89128-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8215.770526782, 'message_signature': 'ab37a1f8fae44be933ec9819f2f505cec4a0b1d695102590a7e2b542da13323b'}]}, 'timestamp': '2025-11-29 07:52:48.113438', '_unique_id': '13d029567d114955950647f32cf4e1dc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.114 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.116 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.137 12 DEBUG ceilometer.compute.pollsters [-] 1885cb6b-9e7f-433f-86bf-9e88d6199d90/memory.usage volume: 42.86328125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b3a41417-2977-473d-a6ea-d45ab57746b8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.86328125, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90', 'timestamp': '2025-11-29T07:52:48.116520', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1437223324', 'name': 'instance-000000b0', 'instance_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '65dc5e02-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8215.855762028, 'message_signature': '74e74e4e51704b6b1f27526ff9b954f80309e223ca4859fa779c4b3031690176'}]}, 'timestamp': '2025-11-29 07:52:48.138401', '_unique_id': '43db0e2e0b3548abbd1ad63821832dcd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.139 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.140 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.141 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.141 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1437223324>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1437223324>]
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.141 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.141 12 DEBUG ceilometer.compute.pollsters [-] 1885cb6b-9e7f-433f-86bf-9e88d6199d90/network.incoming.packets volume: 41 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1efc85cb-19d4-49e7-b7f5-d0252707daf2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 41, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b0-1885cb6b-9e7f-433f-86bf-9e88d6199d90-tap092b4f60-4c', 'timestamp': '2025-11-29T07:52:48.141662', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1437223324', 'name': 'tap092b4f60-4c', 'instance_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fd:a5:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap092b4f60-4c'}, 'message_id': '65dcefc0-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8215.813859746, 'message_signature': '6f5066ca386668ef8de4153614cc41ffc7258ff526095e6ed2d6ffb798c914b9'}]}, 'timestamp': '2025-11-29 07:52:48.142072', '_unique_id': '8d62fcc1eb744953b29bcfce72320e19'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.142 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.143 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.143 12 DEBUG ceilometer.compute.pollsters [-] 1885cb6b-9e7f-433f-86bf-9e88d6199d90/disk.device.read.requests volume: 1060 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 DEBUG ceilometer.compute.pollsters [-] 1885cb6b-9e7f-433f-86bf-9e88d6199d90/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '091c05a7-a14b-434f-aa01-eec80d0f90c9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1060, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90-vda', 'timestamp': '2025-11-29T07:52:48.143684', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1437223324', 'name': 'instance-000000b0', 'instance_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '65dd3d4a-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8215.770526782, 'message_signature': '420017656395024b73ee73879431d78e6be721b1126e1856342bc489893a7d02'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 
'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90-sda', 'timestamp': '2025-11-29T07:52:48.143684', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1437223324', 'name': 'instance-000000b0', 'instance_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '65dd490c-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8215.770526782, 'message_signature': '22578a8833c2a4b02a37516e086f0727edcbf399e64042b1d50adc4e129600d8'}]}, 'timestamp': '2025-11-29 07:52:48.144344', '_unique_id': 'e02328a5cb0a4c43ac856297c288feb7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.144 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.146 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.146 12 DEBUG ceilometer.compute.pollsters [-] 1885cb6b-9e7f-433f-86bf-9e88d6199d90/disk.device.read.bytes volume: 29551104 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.146 12 DEBUG ceilometer.compute.pollsters [-] 1885cb6b-9e7f-433f-86bf-9e88d6199d90/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '777a7f30-dbbd-41c7-b338-2845138d7eea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29551104, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90-vda', 'timestamp': '2025-11-29T07:52:48.146235', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1437223324', 'name': 'instance-000000b0', 'instance_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '65dda0fa-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8215.770526782, 'message_signature': '2b9059458392eaaf1c23da498128131deb3015dcc9f6c348e3f09caddcd5b0aa'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 
'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90-sda', 'timestamp': '2025-11-29T07:52:48.146235', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1437223324', 'name': 'instance-000000b0', 'instance_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '65ddae06-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8215.770526782, 'message_signature': '1201597c117e0762561817bfaed79c6d2321a81990b7a15cd9cab2dcd46fa816'}]}, 'timestamp': '2025-11-29 07:52:48.146953', '_unique_id': '4b4acd5fea5e491796493bfb90c27fbe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.147 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.148 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.149 12 DEBUG ceilometer.compute.pollsters [-] 1885cb6b-9e7f-433f-86bf-9e88d6199d90/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.149 12 DEBUG ceilometer.compute.pollsters [-] 1885cb6b-9e7f-433f-86bf-9e88d6199d90/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ea81e61a-8a90-4d98-aae8-b85c0dfaa191', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90-vda', 'timestamp': '2025-11-29T07:52:48.148987', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1437223324', 'name': 'instance-000000b0', 'instance_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '65de0d92-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8215.746807577, 'message_signature': '54facf819d0f30a4d49c204e145bdb94109ca27df0c373604c2e3332bcddbffa'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90-sda', 'timestamp': '2025-11-29T07:52:48.148987', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1437223324', 'name': 'instance-000000b0', 'instance_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '65de1b2a-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8215.746807577, 'message_signature': '04c47df5b7c261388a28e89864c576253f4dc42748a3637ee8a928c64c2cbf5b'}]}, 'timestamp': '2025-11-29 07:52:48.149732', '_unique_id': 'e2f907c6d2dd4dee882067e3a8c10d04'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.150 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.151 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.151 12 DEBUG ceilometer.compute.pollsters [-] 1885cb6b-9e7f-433f-86bf-9e88d6199d90/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.152 12 DEBUG ceilometer.compute.pollsters [-] 1885cb6b-9e7f-433f-86bf-9e88d6199d90/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f5f0f732-67d6-4623-ba1a-d2b8da7ba882', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90-vda', 'timestamp': '2025-11-29T07:52:48.151803', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1437223324', 'name': 'instance-000000b0', 'instance_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '65de7ba6-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8215.746807577, 'message_signature': '7aa9d4e7c34cf9dee49d71415433a70c7050debef45ab83814b05b6a6fdc3e59'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90-sda', 'timestamp': '2025-11-29T07:52:48.151803', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1437223324', 'name': 'instance-000000b0', 'instance_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '65de8768-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8215.746807577, 'message_signature': 'a57a5195bb60c82b59dd0f2d6021712e477b6b78861149fd2c529cf3b78ea98d'}]}, 'timestamp': '2025-11-29 07:52:48.152492', '_unique_id': 'f8f991a64e464a84ac576b6509030d4d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.154 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.154 12 DEBUG ceilometer.compute.pollsters [-] 1885cb6b-9e7f-433f-86bf-9e88d6199d90/network.outgoing.packets volume: 44 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6ae4e13d-15d1-4982-963d-57c04fde1a61', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 44, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b0-1885cb6b-9e7f-433f-86bf-9e88d6199d90-tap092b4f60-4c', 'timestamp': '2025-11-29T07:52:48.154273', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1437223324', 'name': 'tap092b4f60-4c', 'instance_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fd:a5:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap092b4f60-4c'}, 'message_id': '65dedb64-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8215.813859746, 'message_signature': '3db96e8158e70e67b6906378716e06a520c13959fe68c03cd2e7749c2321ab29'}]}, 'timestamp': '2025-11-29 07:52:48.154646', '_unique_id': '29a7ba628d2247dc9d6552a978b4f618'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.155 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.156 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.156 12 DEBUG ceilometer.compute.pollsters [-] 1885cb6b-9e7f-433f-86bf-9e88d6199d90/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ca46295f-87ff-46bf-b69e-1147ee693166', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b0-1885cb6b-9e7f-433f-86bf-9e88d6199d90-tap092b4f60-4c', 'timestamp': '2025-11-29T07:52:48.156529', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1437223324', 'name': 'tap092b4f60-4c', 'instance_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fd:a5:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap092b4f60-4c'}, 'message_id': '65df332a-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8215.813859746, 'message_signature': '52800f198aa9de79f66e96f66adf0121adb6d710a6ff9eda323cc798c2ffce36'}]}, 'timestamp': '2025-11-29 07:52:48.156928', '_unique_id': '745900af461c4226b9a84c83cf350840'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.157 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.158 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.158 12 DEBUG ceilometer.compute.pollsters [-] 1885cb6b-9e7f-433f-86bf-9e88d6199d90/cpu volume: 12770000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '940bb852-6a3c-4d5e-a80b-41011f715d6b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12770000000, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90', 'timestamp': '2025-11-29T07:52:48.158901', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1437223324', 'name': 'instance-000000b0', 'instance_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '65df8f78-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8215.855762028, 'message_signature': 'a03d15ba3c5b081c5a33edd4c1440eede92fe7eab3761e12ed8672804dab00ab'}]}, 'timestamp': '2025-11-29 07:52:48.159258', '_unique_id': 'ccc91006317540edb29b87bc76b66de9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.159 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.161 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.161 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.161 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1437223324>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1437223324>]
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.161 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.161 12 DEBUG ceilometer.compute.pollsters [-] 1885cb6b-9e7f-433f-86bf-9e88d6199d90/network.incoming.bytes volume: 7194 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4bbd46f1-e964-4f81-9891-0a79c5a179fd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7194, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b0-1885cb6b-9e7f-433f-86bf-9e88d6199d90-tap092b4f60-4c', 'timestamp': '2025-11-29T07:52:48.161696', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1437223324', 'name': 'tap092b4f60-4c', 'instance_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fd:a5:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap092b4f60-4c'}, 'message_id': '65dffcec-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8215.813859746, 'message_signature': 'f68559445fd7cfaebdfb363603808e8afd5222136726f176a47d2c5a8ca734c9'}]}, 'timestamp': '2025-11-29 07:52:48.162070', '_unique_id': '1db6a044f0c3404b9a6e78b8efdba632'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.162 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.163 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.163 12 DEBUG ceilometer.compute.pollsters [-] 1885cb6b-9e7f-433f-86bf-9e88d6199d90/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6b08abc6-beb3-421f-bac2-b093516915eb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b0-1885cb6b-9e7f-433f-86bf-9e88d6199d90-tap092b4f60-4c', 'timestamp': '2025-11-29T07:52:48.163739', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1437223324', 'name': 'tap092b4f60-4c', 'instance_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fd:a5:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap092b4f60-4c'}, 'message_id': '65e04c2e-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8215.813859746, 'message_signature': '4f70f0ed7be56bfcdfb3d56d1fc6d21bc8c15a320c9554680fa04f3aefabc25c'}]}, 'timestamp': '2025-11-29 07:52:48.164085', '_unique_id': 'd6872816092a4c418df6049a63fab273'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.164 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.166 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.166 12 DEBUG ceilometer.compute.pollsters [-] 1885cb6b-9e7f-433f-86bf-9e88d6199d90/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '57141b31-470d-4d4c-9c11-5e3d560ae5a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b0-1885cb6b-9e7f-433f-86bf-9e88d6199d90-tap092b4f60-4c', 'timestamp': '2025-11-29T07:52:48.166255', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1437223324', 'name': 'tap092b4f60-4c', 'instance_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fd:a5:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap092b4f60-4c'}, 'message_id': '65e0aeda-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8215.813859746, 'message_signature': '7e76ece5162903b5253287dfa40ae1581634a7802beed6ec15aaa23cbbfda814'}]}, 'timestamp': '2025-11-29 07:52:48.166625', '_unique_id': '48a946404c634659915d65505e1a8282'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.167 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.168 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.168 12 DEBUG ceilometer.compute.pollsters [-] 1885cb6b-9e7f-433f-86bf-9e88d6199d90/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '49f62dde-b791-41d4-91f6-7f3f30634da6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b0-1885cb6b-9e7f-433f-86bf-9e88d6199d90-tap092b4f60-4c', 'timestamp': '2025-11-29T07:52:48.168412', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1437223324', 'name': 'tap092b4f60-4c', 'instance_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fd:a5:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap092b4f60-4c'}, 'message_id': '65e100a6-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8215.813859746, 'message_signature': '5de054a0bac0d8a9aa619b1f5c9d77f06790cbd5262f7b717be584e2ce55c458'}]}, 'timestamp': '2025-11-29 07:52:48.168677', '_unique_id': '67d01e4f285142fb892284d55eb566d5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:52:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:52:48.169 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:52:48 compute-0 nova_compute[187185]: 2025-11-29 07:52:48.434 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:52:49 compute-0 nova_compute[187185]: 2025-11-29 07:52:49.128 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:52:49 compute-0 nova_compute[187185]: 2025-11-29 07:52:49.769 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:52:51 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:52:51.770 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=71, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=70) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:52:51 compute-0 nova_compute[187185]: 2025-11-29 07:52:51.772 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:52:51 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:52:51.773 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:52:53 compute-0 nova_compute[187185]: 2025-11-29 07:52:53.437 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:52:54 compute-0 nova_compute[187185]: 2025-11-29 07:52:54.774 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:52:56 compute-0 podman[247976]: 2025-11-29 07:52:56.805391257 +0000 UTC m=+0.064891838 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 07:52:58 compute-0 nova_compute[187185]: 2025-11-29 07:52:58.439 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:52:58 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:52:58.775 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '71'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:52:58 compute-0 podman[248001]: 2025-11-29 07:52:58.823240466 +0000 UTC m=+0.074996455 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 07:52:58 compute-0 podman[248000]: 2025-11-29 07:52:58.823932856 +0000 UTC m=+0.077048604 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS)
Nov 29 07:52:59 compute-0 nova_compute[187185]: 2025-11-29 07:52:59.856 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:01 compute-0 ovn_controller[95281]: 2025-11-29T07:53:01Z|00595|binding|INFO|Releasing lport abbbcf20-7d52-44a5-8f71-8c43d1bae146 from this chassis (sb_readonly=0)
Nov 29 07:53:01 compute-0 nova_compute[187185]: 2025-11-29 07:53:01.871 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:03 compute-0 nova_compute[187185]: 2025-11-29 07:53:03.310 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:53:03 compute-0 nova_compute[187185]: 2025-11-29 07:53:03.473 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:04 compute-0 nova_compute[187185]: 2025-11-29 07:53:04.861 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:08 compute-0 nova_compute[187185]: 2025-11-29 07:53:08.476 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:09 compute-0 sshd-session[248036]: Invalid user administrator from 190.181.27.27 port 49844
Nov 29 07:53:09 compute-0 podman[248038]: 2025-11-29 07:53:09.79899798 +0000 UTC m=+0.077530187 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 29 07:53:09 compute-0 podman[248040]: 2025-11-29 07:53:09.803739815 +0000 UTC m=+0.072936027 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 07:53:09 compute-0 podman[248039]: 2025-11-29 07:53:09.817192788 +0000 UTC m=+0.082324164 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., release=1755695350, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, vcs-type=git, distribution-scope=public, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 29 07:53:09 compute-0 sshd-session[248036]: Received disconnect from 190.181.27.27 port 49844:11: Bye Bye [preauth]
Nov 29 07:53:09 compute-0 sshd-session[248036]: Disconnected from invalid user administrator 190.181.27.27 port 49844 [preauth]
Nov 29 07:53:09 compute-0 nova_compute[187185]: 2025-11-29 07:53:09.863 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:10 compute-0 nova_compute[187185]: 2025-11-29 07:53:10.047 187189 DEBUG nova.compute.manager [req-c20c795d-ca12-4ba7-afe6-434bdf2efda4 req-cfaf5644-53a7-4445-b0e4-10a935bbf02c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Received event network-changed-092b4f60-4cd8-4ca5-9e92-0131a96a6acf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:53:10 compute-0 nova_compute[187185]: 2025-11-29 07:53:10.048 187189 DEBUG nova.compute.manager [req-c20c795d-ca12-4ba7-afe6-434bdf2efda4 req-cfaf5644-53a7-4445-b0e4-10a935bbf02c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Refreshing instance network info cache due to event network-changed-092b4f60-4cd8-4ca5-9e92-0131a96a6acf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:53:10 compute-0 nova_compute[187185]: 2025-11-29 07:53:10.048 187189 DEBUG oslo_concurrency.lockutils [req-c20c795d-ca12-4ba7-afe6-434bdf2efda4 req-cfaf5644-53a7-4445-b0e4-10a935bbf02c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-1885cb6b-9e7f-433f-86bf-9e88d6199d90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:53:10 compute-0 nova_compute[187185]: 2025-11-29 07:53:10.049 187189 DEBUG oslo_concurrency.lockutils [req-c20c795d-ca12-4ba7-afe6-434bdf2efda4 req-cfaf5644-53a7-4445-b0e4-10a935bbf02c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-1885cb6b-9e7f-433f-86bf-9e88d6199d90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:53:10 compute-0 nova_compute[187185]: 2025-11-29 07:53:10.049 187189 DEBUG nova.network.neutron [req-c20c795d-ca12-4ba7-afe6-434bdf2efda4 req-cfaf5644-53a7-4445-b0e4-10a935bbf02c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Refreshing network info cache for port 092b4f60-4cd8-4ca5-9e92-0131a96a6acf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:53:10 compute-0 nova_compute[187185]: 2025-11-29 07:53:10.294 187189 DEBUG oslo_concurrency.lockutils [None req-59842d20-ac41-4883-be05-6ed79044db5b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "1885cb6b-9e7f-433f-86bf-9e88d6199d90" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:53:10 compute-0 nova_compute[187185]: 2025-11-29 07:53:10.295 187189 DEBUG oslo_concurrency.lockutils [None req-59842d20-ac41-4883-be05-6ed79044db5b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "1885cb6b-9e7f-433f-86bf-9e88d6199d90" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:53:10 compute-0 nova_compute[187185]: 2025-11-29 07:53:10.295 187189 DEBUG oslo_concurrency.lockutils [None req-59842d20-ac41-4883-be05-6ed79044db5b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "1885cb6b-9e7f-433f-86bf-9e88d6199d90-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:53:10 compute-0 nova_compute[187185]: 2025-11-29 07:53:10.295 187189 DEBUG oslo_concurrency.lockutils [None req-59842d20-ac41-4883-be05-6ed79044db5b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "1885cb6b-9e7f-433f-86bf-9e88d6199d90-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:53:10 compute-0 nova_compute[187185]: 2025-11-29 07:53:10.295 187189 DEBUG oslo_concurrency.lockutils [None req-59842d20-ac41-4883-be05-6ed79044db5b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "1885cb6b-9e7f-433f-86bf-9e88d6199d90-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:53:10 compute-0 nova_compute[187185]: 2025-11-29 07:53:10.311 187189 INFO nova.compute.manager [None req-59842d20-ac41-4883-be05-6ed79044db5b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Terminating instance
Nov 29 07:53:10 compute-0 nova_compute[187185]: 2025-11-29 07:53:10.324 187189 DEBUG nova.compute.manager [None req-59842d20-ac41-4883-be05-6ed79044db5b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 07:53:10 compute-0 kernel: tap092b4f60-4c (unregistering): left promiscuous mode
Nov 29 07:53:10 compute-0 NetworkManager[55227]: <info>  [1764402790.3575] device (tap092b4f60-4c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:53:10 compute-0 ovn_controller[95281]: 2025-11-29T07:53:10Z|00596|binding|INFO|Releasing lport 092b4f60-4cd8-4ca5-9e92-0131a96a6acf from this chassis (sb_readonly=0)
Nov 29 07:53:10 compute-0 ovn_controller[95281]: 2025-11-29T07:53:10Z|00597|binding|INFO|Setting lport 092b4f60-4cd8-4ca5-9e92-0131a96a6acf down in Southbound
Nov 29 07:53:10 compute-0 ovn_controller[95281]: 2025-11-29T07:53:10Z|00598|binding|INFO|Removing iface tap092b4f60-4c ovn-installed in OVS
Nov 29 07:53:10 compute-0 nova_compute[187185]: 2025-11-29 07:53:10.369 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:53:10.377 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fd:a5:f5 10.100.0.4'], port_security=['fa:16:3e:fd:a5:f5 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '1885cb6b-9e7f-433f-86bf-9e88d6199d90', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dca15c0d-501b-43a9-ac14-6bd62da0f9ec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'afb0a87a-e6cb-4bf6-93dc-3e1b8fd7af88 bf778d2c-6f77-4017-b7b4-2a7103c8ac47', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e0b105a9-0bbb-468a-9eb3-7a5d5e9f7fdf, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=092b4f60-4cd8-4ca5-9e92-0131a96a6acf) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:53:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:53:10.379 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 092b4f60-4cd8-4ca5-9e92-0131a96a6acf in datapath dca15c0d-501b-43a9-ac14-6bd62da0f9ec unbound from our chassis
Nov 29 07:53:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:53:10.381 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dca15c0d-501b-43a9-ac14-6bd62da0f9ec, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:53:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:53:10.383 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[b5667c21-a3ba-4c46-b280-a0a23ee56c72]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:53:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:53:10.383 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-dca15c0d-501b-43a9-ac14-6bd62da0f9ec namespace which is not needed anymore
Nov 29 07:53:10 compute-0 nova_compute[187185]: 2025-11-29 07:53:10.395 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:10 compute-0 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d000000b0.scope: Deactivated successfully.
Nov 29 07:53:10 compute-0 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d000000b0.scope: Consumed 18.307s CPU time.
Nov 29 07:53:10 compute-0 systemd-machined[153486]: Machine qemu-68-instance-000000b0 terminated.
Nov 29 07:53:10 compute-0 neutron-haproxy-ovnmeta-dca15c0d-501b-43a9-ac14-6bd62da0f9ec[247457]: [NOTICE]   (247461) : haproxy version is 2.8.14-c23fe91
Nov 29 07:53:10 compute-0 neutron-haproxy-ovnmeta-dca15c0d-501b-43a9-ac14-6bd62da0f9ec[247457]: [NOTICE]   (247461) : path to executable is /usr/sbin/haproxy
Nov 29 07:53:10 compute-0 neutron-haproxy-ovnmeta-dca15c0d-501b-43a9-ac14-6bd62da0f9ec[247457]: [WARNING]  (247461) : Exiting Master process...
Nov 29 07:53:10 compute-0 neutron-haproxy-ovnmeta-dca15c0d-501b-43a9-ac14-6bd62da0f9ec[247457]: [ALERT]    (247461) : Current worker (247463) exited with code 143 (Terminated)
Nov 29 07:53:10 compute-0 neutron-haproxy-ovnmeta-dca15c0d-501b-43a9-ac14-6bd62da0f9ec[247457]: [WARNING]  (247461) : All workers exited. Exiting... (0)
Nov 29 07:53:10 compute-0 systemd[1]: libpod-79dcfd79ef5c4e7444e90aaff518d167a77bdb6e8b096275e06deea95f16b41e.scope: Deactivated successfully.
Nov 29 07:53:10 compute-0 podman[248127]: 2025-11-29 07:53:10.544384285 +0000 UTC m=+0.057468047 container died 79dcfd79ef5c4e7444e90aaff518d167a77bdb6e8b096275e06deea95f16b41e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dca15c0d-501b-43a9-ac14-6bd62da0f9ec, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 07:53:10 compute-0 nova_compute[187185]: 2025-11-29 07:53:10.547 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:10 compute-0 nova_compute[187185]: 2025-11-29 07:53:10.551 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:10 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-79dcfd79ef5c4e7444e90aaff518d167a77bdb6e8b096275e06deea95f16b41e-userdata-shm.mount: Deactivated successfully.
Nov 29 07:53:10 compute-0 systemd[1]: var-lib-containers-storage-overlay-76252cf9ea9b3cfc173252094cab89ff3f1acc6899763e686c0489e8559b1abf-merged.mount: Deactivated successfully.
Nov 29 07:53:10 compute-0 podman[248127]: 2025-11-29 07:53:10.589071007 +0000 UTC m=+0.102154759 container cleanup 79dcfd79ef5c4e7444e90aaff518d167a77bdb6e8b096275e06deea95f16b41e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dca15c0d-501b-43a9-ac14-6bd62da0f9ec, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 07:53:10 compute-0 nova_compute[187185]: 2025-11-29 07:53:10.600 187189 INFO nova.virt.libvirt.driver [-] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Instance destroyed successfully.
Nov 29 07:53:10 compute-0 nova_compute[187185]: 2025-11-29 07:53:10.600 187189 DEBUG nova.objects.instance [None req-59842d20-ac41-4883-be05-6ed79044db5b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lazy-loading 'resources' on Instance uuid 1885cb6b-9e7f-433f-86bf-9e88d6199d90 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:53:10 compute-0 systemd[1]: libpod-conmon-79dcfd79ef5c4e7444e90aaff518d167a77bdb6e8b096275e06deea95f16b41e.scope: Deactivated successfully.
Nov 29 07:53:10 compute-0 nova_compute[187185]: 2025-11-29 07:53:10.615 187189 DEBUG nova.virt.libvirt.vif [None req-59842d20-ac41-4883-be05-6ed79044db5b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:51:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1437223324',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1437223324',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2022058758-ac',id=176,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHeXwv7HCMDE973nSxHTOU9Ex4gPKNp8jzgRfha1raP+nxdvcDhoV7VW+zo1781kKiel2wqiyR9rjNc2n+cKdfsB5frQVtod1rlGQWI90bAcU+zMc1WHDM0LSIWch78VpQ==',key_name='tempest-TestSecurityGroupsBasicOps-1050361794',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:51:18Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e8e45e91223b45a79dd698a82af4a2a5',ramdisk_id='',reservation_id='r-64j8zebl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-2022058758',owner_user_name='tempest-TestSecurityGroupsBasicOps-2022058758-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:51:18Z,user_data=None,user_id='dec30fbde18e4b2382ea2c59847d067f',uuid=1885cb6b-9e7f-433f-86bf-9e88d6199d90,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "092b4f60-4cd8-4ca5-9e92-0131a96a6acf", "address": "fa:16:3e:fd:a5:f5", "network": {"id": "dca15c0d-501b-43a9-ac14-6bd62da0f9ec", "bridge": "br-int", "label": "tempest-network-smoke--1940175519", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap092b4f60-4c", "ovs_interfaceid": "092b4f60-4cd8-4ca5-9e92-0131a96a6acf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:53:10 compute-0 nova_compute[187185]: 2025-11-29 07:53:10.616 187189 DEBUG nova.network.os_vif_util [None req-59842d20-ac41-4883-be05-6ed79044db5b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converting VIF {"id": "092b4f60-4cd8-4ca5-9e92-0131a96a6acf", "address": "fa:16:3e:fd:a5:f5", "network": {"id": "dca15c0d-501b-43a9-ac14-6bd62da0f9ec", "bridge": "br-int", "label": "tempest-network-smoke--1940175519", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.204", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap092b4f60-4c", "ovs_interfaceid": "092b4f60-4cd8-4ca5-9e92-0131a96a6acf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:53:10 compute-0 nova_compute[187185]: 2025-11-29 07:53:10.617 187189 DEBUG nova.network.os_vif_util [None req-59842d20-ac41-4883-be05-6ed79044db5b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:fd:a5:f5,bridge_name='br-int',has_traffic_filtering=True,id=092b4f60-4cd8-4ca5-9e92-0131a96a6acf,network=Network(dca15c0d-501b-43a9-ac14-6bd62da0f9ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap092b4f60-4c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:53:10 compute-0 nova_compute[187185]: 2025-11-29 07:53:10.618 187189 DEBUG os_vif [None req-59842d20-ac41-4883-be05-6ed79044db5b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:fd:a5:f5,bridge_name='br-int',has_traffic_filtering=True,id=092b4f60-4cd8-4ca5-9e92-0131a96a6acf,network=Network(dca15c0d-501b-43a9-ac14-6bd62da0f9ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap092b4f60-4c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:53:10 compute-0 nova_compute[187185]: 2025-11-29 07:53:10.620 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:10 compute-0 nova_compute[187185]: 2025-11-29 07:53:10.620 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap092b4f60-4c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:53:10 compute-0 nova_compute[187185]: 2025-11-29 07:53:10.625 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:53:10 compute-0 nova_compute[187185]: 2025-11-29 07:53:10.628 187189 INFO os_vif [None req-59842d20-ac41-4883-be05-6ed79044db5b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:fd:a5:f5,bridge_name='br-int',has_traffic_filtering=True,id=092b4f60-4cd8-4ca5-9e92-0131a96a6acf,network=Network(dca15c0d-501b-43a9-ac14-6bd62da0f9ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap092b4f60-4c')
Nov 29 07:53:10 compute-0 nova_compute[187185]: 2025-11-29 07:53:10.629 187189 INFO nova.virt.libvirt.driver [None req-59842d20-ac41-4883-be05-6ed79044db5b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Deleting instance files /var/lib/nova/instances/1885cb6b-9e7f-433f-86bf-9e88d6199d90_del
Nov 29 07:53:10 compute-0 nova_compute[187185]: 2025-11-29 07:53:10.630 187189 INFO nova.virt.libvirt.driver [None req-59842d20-ac41-4883-be05-6ed79044db5b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Deletion of /var/lib/nova/instances/1885cb6b-9e7f-433f-86bf-9e88d6199d90_del complete
Nov 29 07:53:10 compute-0 podman[248171]: 2025-11-29 07:53:10.6689348 +0000 UTC m=+0.053774482 container remove 79dcfd79ef5c4e7444e90aaff518d167a77bdb6e8b096275e06deea95f16b41e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dca15c0d-501b-43a9-ac14-6bd62da0f9ec, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 07:53:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:53:10.675 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[209d6c91-c543-4a97-b253-eae63eb75573]: (4, ('Sat Nov 29 07:53:10 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-dca15c0d-501b-43a9-ac14-6bd62da0f9ec (79dcfd79ef5c4e7444e90aaff518d167a77bdb6e8b096275e06deea95f16b41e)\n79dcfd79ef5c4e7444e90aaff518d167a77bdb6e8b096275e06deea95f16b41e\nSat Nov 29 07:53:10 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-dca15c0d-501b-43a9-ac14-6bd62da0f9ec (79dcfd79ef5c4e7444e90aaff518d167a77bdb6e8b096275e06deea95f16b41e)\n79dcfd79ef5c4e7444e90aaff518d167a77bdb6e8b096275e06deea95f16b41e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:53:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:53:10.677 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[dc2cdd14-7bc3-4134-91ba-f2058e136310]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:53:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:53:10.678 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdca15c0d-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:53:10 compute-0 nova_compute[187185]: 2025-11-29 07:53:10.681 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:10 compute-0 kernel: tapdca15c0d-50: left promiscuous mode
Nov 29 07:53:10 compute-0 nova_compute[187185]: 2025-11-29 07:53:10.694 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:10 compute-0 nova_compute[187185]: 2025-11-29 07:53:10.694 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:53:10.698 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[9999ade5-e838-41ed-941c-f9fc8fd851f0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:53:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:53:10.714 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[aee03e36-d9d9-4e0a-9dc0-f47ba09d4de8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:53:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:53:10.716 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[4023270f-c582-4310-948f-be3c62c773f1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:53:10 compute-0 nova_compute[187185]: 2025-11-29 07:53:10.717 187189 INFO nova.compute.manager [None req-59842d20-ac41-4883-be05-6ed79044db5b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Took 0.39 seconds to destroy the instance on the hypervisor.
Nov 29 07:53:10 compute-0 nova_compute[187185]: 2025-11-29 07:53:10.718 187189 DEBUG oslo.service.loopingcall [None req-59842d20-ac41-4883-be05-6ed79044db5b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 07:53:10 compute-0 nova_compute[187185]: 2025-11-29 07:53:10.719 187189 DEBUG nova.compute.manager [-] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 07:53:10 compute-0 nova_compute[187185]: 2025-11-29 07:53:10.719 187189 DEBUG nova.network.neutron [-] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 07:53:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:53:10.733 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[a344a76a-d367-489c-9230-b3d4f5519ac6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 812387, 'reachable_time': 15058, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248189, 'error': None, 'target': 'ovnmeta-dca15c0d-501b-43a9-ac14-6bd62da0f9ec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:53:10 compute-0 systemd[1]: run-netns-ovnmeta\x2ddca15c0d\x2d501b\x2d43a9\x2dac14\x2d6bd62da0f9ec.mount: Deactivated successfully.
Nov 29 07:53:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:53:10.739 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-dca15c0d-501b-43a9-ac14-6bd62da0f9ec deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:53:10 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:53:10.740 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[a6d17b9b-24b3-4832-b0c5-edc8f5d77527]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:53:11 compute-0 nova_compute[187185]: 2025-11-29 07:53:11.167 187189 DEBUG nova.compute.manager [req-e5af70d1-3fdc-47f6-93c4-bf2606a3b867 req-1270df1e-bb56-4c73-9897-2de48d6d5e57 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Received event network-vif-unplugged-092b4f60-4cd8-4ca5-9e92-0131a96a6acf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:53:11 compute-0 nova_compute[187185]: 2025-11-29 07:53:11.168 187189 DEBUG oslo_concurrency.lockutils [req-e5af70d1-3fdc-47f6-93c4-bf2606a3b867 req-1270df1e-bb56-4c73-9897-2de48d6d5e57 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "1885cb6b-9e7f-433f-86bf-9e88d6199d90-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:53:11 compute-0 nova_compute[187185]: 2025-11-29 07:53:11.168 187189 DEBUG oslo_concurrency.lockutils [req-e5af70d1-3fdc-47f6-93c4-bf2606a3b867 req-1270df1e-bb56-4c73-9897-2de48d6d5e57 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1885cb6b-9e7f-433f-86bf-9e88d6199d90-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:53:11 compute-0 nova_compute[187185]: 2025-11-29 07:53:11.168 187189 DEBUG oslo_concurrency.lockutils [req-e5af70d1-3fdc-47f6-93c4-bf2606a3b867 req-1270df1e-bb56-4c73-9897-2de48d6d5e57 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1885cb6b-9e7f-433f-86bf-9e88d6199d90-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:53:11 compute-0 nova_compute[187185]: 2025-11-29 07:53:11.168 187189 DEBUG nova.compute.manager [req-e5af70d1-3fdc-47f6-93c4-bf2606a3b867 req-1270df1e-bb56-4c73-9897-2de48d6d5e57 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] No waiting events found dispatching network-vif-unplugged-092b4f60-4cd8-4ca5-9e92-0131a96a6acf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:53:11 compute-0 nova_compute[187185]: 2025-11-29 07:53:11.169 187189 DEBUG nova.compute.manager [req-e5af70d1-3fdc-47f6-93c4-bf2606a3b867 req-1270df1e-bb56-4c73-9897-2de48d6d5e57 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Received event network-vif-unplugged-092b4f60-4cd8-4ca5-9e92-0131a96a6acf for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 07:53:11 compute-0 nova_compute[187185]: 2025-11-29 07:53:11.352 187189 DEBUG nova.network.neutron [-] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:53:11 compute-0 nova_compute[187185]: 2025-11-29 07:53:11.372 187189 INFO nova.compute.manager [-] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Took 0.65 seconds to deallocate network for instance.
Nov 29 07:53:11 compute-0 nova_compute[187185]: 2025-11-29 07:53:11.440 187189 DEBUG nova.compute.manager [req-db797d9c-8167-471b-a2ad-93cd9e9f94d4 req-89e63d0f-d776-4c49-83ea-0d74b7514ae0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Received event network-vif-deleted-092b4f60-4cd8-4ca5-9e92-0131a96a6acf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:53:11 compute-0 nova_compute[187185]: 2025-11-29 07:53:11.467 187189 DEBUG oslo_concurrency.lockutils [None req-59842d20-ac41-4883-be05-6ed79044db5b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:53:11 compute-0 nova_compute[187185]: 2025-11-29 07:53:11.468 187189 DEBUG oslo_concurrency.lockutils [None req-59842d20-ac41-4883-be05-6ed79044db5b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:53:11 compute-0 nova_compute[187185]: 2025-11-29 07:53:11.561 187189 DEBUG nova.compute.provider_tree [None req-59842d20-ac41-4883-be05-6ed79044db5b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:53:11 compute-0 nova_compute[187185]: 2025-11-29 07:53:11.579 187189 DEBUG nova.scheduler.client.report [None req-59842d20-ac41-4883-be05-6ed79044db5b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:53:11 compute-0 nova_compute[187185]: 2025-11-29 07:53:11.603 187189 DEBUG oslo_concurrency.lockutils [None req-59842d20-ac41-4883-be05-6ed79044db5b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:53:11 compute-0 nova_compute[187185]: 2025-11-29 07:53:11.626 187189 INFO nova.scheduler.client.report [None req-59842d20-ac41-4883-be05-6ed79044db5b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Deleted allocations for instance 1885cb6b-9e7f-433f-86bf-9e88d6199d90
Nov 29 07:53:11 compute-0 nova_compute[187185]: 2025-11-29 07:53:11.712 187189 DEBUG oslo_concurrency.lockutils [None req-59842d20-ac41-4883-be05-6ed79044db5b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "1885cb6b-9e7f-433f-86bf-9e88d6199d90" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.417s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:53:11 compute-0 nova_compute[187185]: 2025-11-29 07:53:11.914 187189 DEBUG nova.network.neutron [req-c20c795d-ca12-4ba7-afe6-434bdf2efda4 req-cfaf5644-53a7-4445-b0e4-10a935bbf02c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Updated VIF entry in instance network info cache for port 092b4f60-4cd8-4ca5-9e92-0131a96a6acf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:53:11 compute-0 nova_compute[187185]: 2025-11-29 07:53:11.915 187189 DEBUG nova.network.neutron [req-c20c795d-ca12-4ba7-afe6-434bdf2efda4 req-cfaf5644-53a7-4445-b0e4-10a935bbf02c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Updating instance_info_cache with network_info: [{"id": "092b4f60-4cd8-4ca5-9e92-0131a96a6acf", "address": "fa:16:3e:fd:a5:f5", "network": {"id": "dca15c0d-501b-43a9-ac14-6bd62da0f9ec", "bridge": "br-int", "label": "tempest-network-smoke--1940175519", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap092b4f60-4c", "ovs_interfaceid": "092b4f60-4cd8-4ca5-9e92-0131a96a6acf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:53:11 compute-0 nova_compute[187185]: 2025-11-29 07:53:11.930 187189 DEBUG oslo_concurrency.lockutils [req-c20c795d-ca12-4ba7-afe6-434bdf2efda4 req-cfaf5644-53a7-4445-b0e4-10a935bbf02c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-1885cb6b-9e7f-433f-86bf-9e88d6199d90" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:53:13 compute-0 nova_compute[187185]: 2025-11-29 07:53:13.260 187189 DEBUG nova.compute.manager [req-1e92d3f5-5f46-47be-bca2-a51a9cf254b6 req-15c2a915-ad57-429d-89c8-1840d39bea4d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Received event network-vif-plugged-092b4f60-4cd8-4ca5-9e92-0131a96a6acf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:53:13 compute-0 nova_compute[187185]: 2025-11-29 07:53:13.261 187189 DEBUG oslo_concurrency.lockutils [req-1e92d3f5-5f46-47be-bca2-a51a9cf254b6 req-15c2a915-ad57-429d-89c8-1840d39bea4d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "1885cb6b-9e7f-433f-86bf-9e88d6199d90-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:53:13 compute-0 nova_compute[187185]: 2025-11-29 07:53:13.261 187189 DEBUG oslo_concurrency.lockutils [req-1e92d3f5-5f46-47be-bca2-a51a9cf254b6 req-15c2a915-ad57-429d-89c8-1840d39bea4d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1885cb6b-9e7f-433f-86bf-9e88d6199d90-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:53:13 compute-0 nova_compute[187185]: 2025-11-29 07:53:13.261 187189 DEBUG oslo_concurrency.lockutils [req-1e92d3f5-5f46-47be-bca2-a51a9cf254b6 req-15c2a915-ad57-429d-89c8-1840d39bea4d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1885cb6b-9e7f-433f-86bf-9e88d6199d90-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:53:13 compute-0 nova_compute[187185]: 2025-11-29 07:53:13.261 187189 DEBUG nova.compute.manager [req-1e92d3f5-5f46-47be-bca2-a51a9cf254b6 req-15c2a915-ad57-429d-89c8-1840d39bea4d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] No waiting events found dispatching network-vif-plugged-092b4f60-4cd8-4ca5-9e92-0131a96a6acf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:53:13 compute-0 nova_compute[187185]: 2025-11-29 07:53:13.262 187189 WARNING nova.compute.manager [req-1e92d3f5-5f46-47be-bca2-a51a9cf254b6 req-15c2a915-ad57-429d-89c8-1840d39bea4d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Received unexpected event network-vif-plugged-092b4f60-4cd8-4ca5-9e92-0131a96a6acf for instance with vm_state deleted and task_state None.
Nov 29 07:53:13 compute-0 nova_compute[187185]: 2025-11-29 07:53:13.478 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:15 compute-0 nova_compute[187185]: 2025-11-29 07:53:15.623 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:18 compute-0 nova_compute[187185]: 2025-11-29 07:53:18.479 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:18 compute-0 podman[248192]: 2025-11-29 07:53:18.860018202 +0000 UTC m=+0.112113882 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 07:53:19 compute-0 nova_compute[187185]: 2025-11-29 07:53:19.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:53:20 compute-0 sshd-session[248190]: Invalid user temp from 45.78.219.119 port 58706
Nov 29 07:53:20 compute-0 sshd-session[248190]: Received disconnect from 45.78.219.119 port 58706:11: Bye Bye [preauth]
Nov 29 07:53:20 compute-0 sshd-session[248190]: Disconnected from invalid user temp 45.78.219.119 port 58706 [preauth]
Nov 29 07:53:20 compute-0 nova_compute[187185]: 2025-11-29 07:53:20.659 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:23 compute-0 nova_compute[187185]: 2025-11-29 07:53:23.414 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:23 compute-0 nova_compute[187185]: 2025-11-29 07:53:23.428 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:53:23 compute-0 nova_compute[187185]: 2025-11-29 07:53:23.429 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 29 07:53:23 compute-0 nova_compute[187185]: 2025-11-29 07:53:23.482 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 29 07:53:23 compute-0 nova_compute[187185]: 2025-11-29 07:53:23.588 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:23 compute-0 nova_compute[187185]: 2025-11-29 07:53:23.598 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:25 compute-0 nova_compute[187185]: 2025-11-29 07:53:25.599 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764402790.5978904, 1885cb6b-9e7f-433f-86bf-9e88d6199d90 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:53:25 compute-0 nova_compute[187185]: 2025-11-29 07:53:25.599 187189 INFO nova.compute.manager [-] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] VM Stopped (Lifecycle Event)
Nov 29 07:53:25 compute-0 nova_compute[187185]: 2025-11-29 07:53:25.628 187189 DEBUG nova.compute.manager [None req-5413cfb2-0dba-46ec-a12c-648b1717cd51 - - - - - -] [instance: 1885cb6b-9e7f-433f-86bf-9e88d6199d90] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:53:25 compute-0 nova_compute[187185]: 2025-11-29 07:53:25.662 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:53:25.753 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:53:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:53:25.754 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:53:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:53:25.754 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:53:27 compute-0 podman[248223]: 2025-11-29 07:53:27.811208344 +0000 UTC m=+0.071156417 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 07:53:28 compute-0 nova_compute[187185]: 2025-11-29 07:53:28.592 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:29 compute-0 podman[248246]: 2025-11-29 07:53:29.834272162 +0000 UTC m=+0.093148381 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd)
Nov 29 07:53:29 compute-0 podman[248247]: 2025-11-29 07:53:29.8689892 +0000 UTC m=+0.117793782 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
managed_by=edpm_ansible, config_id=edpm, container_name=ceilometer_agent_compute)
Nov 29 07:53:30 compute-0 nova_compute[187185]: 2025-11-29 07:53:30.664 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:33 compute-0 nova_compute[187185]: 2025-11-29 07:53:33.595 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:34 compute-0 nova_compute[187185]: 2025-11-29 07:53:34.370 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:53:35 compute-0 nova_compute[187185]: 2025-11-29 07:53:35.667 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:37 compute-0 nova_compute[187185]: 2025-11-29 07:53:37.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:53:37 compute-0 nova_compute[187185]: 2025-11-29 07:53:37.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:53:38 compute-0 nova_compute[187185]: 2025-11-29 07:53:38.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:53:38 compute-0 nova_compute[187185]: 2025-11-29 07:53:38.449 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:53:38 compute-0 nova_compute[187185]: 2025-11-29 07:53:38.449 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:53:38 compute-0 nova_compute[187185]: 2025-11-29 07:53:38.450 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:53:38 compute-0 nova_compute[187185]: 2025-11-29 07:53:38.450 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:53:38 compute-0 nova_compute[187185]: 2025-11-29 07:53:38.655 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:38 compute-0 nova_compute[187185]: 2025-11-29 07:53:38.713 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:53:38 compute-0 nova_compute[187185]: 2025-11-29 07:53:38.715 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5720MB free_disk=73.24602508544922GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:53:38 compute-0 nova_compute[187185]: 2025-11-29 07:53:38.715 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:53:38 compute-0 nova_compute[187185]: 2025-11-29 07:53:38.716 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:53:39 compute-0 nova_compute[187185]: 2025-11-29 07:53:39.165 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:53:39 compute-0 nova_compute[187185]: 2025-11-29 07:53:39.166 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:53:39 compute-0 nova_compute[187185]: 2025-11-29 07:53:39.189 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:53:39 compute-0 nova_compute[187185]: 2025-11-29 07:53:39.379 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:53:39 compute-0 nova_compute[187185]: 2025-11-29 07:53:39.428 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:53:39 compute-0 nova_compute[187185]: 2025-11-29 07:53:39.429 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.714s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:53:40 compute-0 nova_compute[187185]: 2025-11-29 07:53:40.669 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:40 compute-0 podman[248289]: 2025-11-29 07:53:40.848137952 +0000 UTC m=+0.093633235 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 07:53:40 compute-0 podman[248287]: 2025-11-29 07:53:40.849674346 +0000 UTC m=+0.108257982 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 07:53:40 compute-0 podman[248288]: 2025-11-29 07:53:40.874236455 +0000 UTC m=+0.125141972 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, release=1755695350, version=9.6, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible)
Nov 29 07:53:41 compute-0 nova_compute[187185]: 2025-11-29 07:53:41.430 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:53:41 compute-0 nova_compute[187185]: 2025-11-29 07:53:41.431 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:53:43 compute-0 nova_compute[187185]: 2025-11-29 07:53:43.657 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:45 compute-0 nova_compute[187185]: 2025-11-29 07:53:45.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:53:45 compute-0 nova_compute[187185]: 2025-11-29 07:53:45.318 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:53:45 compute-0 nova_compute[187185]: 2025-11-29 07:53:45.318 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:53:45 compute-0 nova_compute[187185]: 2025-11-29 07:53:45.352 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 07:53:45 compute-0 nova_compute[187185]: 2025-11-29 07:53:45.353 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:53:45 compute-0 nova_compute[187185]: 2025-11-29 07:53:45.671 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:45 compute-0 nova_compute[187185]: 2025-11-29 07:53:45.725 187189 DEBUG oslo_concurrency.lockutils [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "43cb7661-81a7-4e91-96aa-5d72329a58b7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:53:45 compute-0 nova_compute[187185]: 2025-11-29 07:53:45.726 187189 DEBUG oslo_concurrency.lockutils [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "43cb7661-81a7-4e91-96aa-5d72329a58b7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:53:45 compute-0 nova_compute[187185]: 2025-11-29 07:53:45.746 187189 DEBUG nova.compute.manager [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 07:53:45 compute-0 nova_compute[187185]: 2025-11-29 07:53:45.875 187189 DEBUG oslo_concurrency.lockutils [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:53:45 compute-0 nova_compute[187185]: 2025-11-29 07:53:45.876 187189 DEBUG oslo_concurrency.lockutils [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:53:45 compute-0 nova_compute[187185]: 2025-11-29 07:53:45.884 187189 DEBUG nova.virt.hardware [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 07:53:45 compute-0 nova_compute[187185]: 2025-11-29 07:53:45.884 187189 INFO nova.compute.claims [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Claim successful on node compute-0.ctlplane.example.com
Nov 29 07:53:46 compute-0 nova_compute[187185]: 2025-11-29 07:53:46.029 187189 DEBUG nova.compute.provider_tree [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:53:46 compute-0 nova_compute[187185]: 2025-11-29 07:53:46.074 187189 DEBUG nova.scheduler.client.report [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:53:46 compute-0 nova_compute[187185]: 2025-11-29 07:53:46.132 187189 DEBUG oslo_concurrency.lockutils [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.256s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:53:46 compute-0 nova_compute[187185]: 2025-11-29 07:53:46.134 187189 DEBUG nova.compute.manager [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 07:53:46 compute-0 nova_compute[187185]: 2025-11-29 07:53:46.237 187189 DEBUG nova.compute.manager [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 07:53:46 compute-0 nova_compute[187185]: 2025-11-29 07:53:46.238 187189 DEBUG nova.network.neutron [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 07:53:46 compute-0 nova_compute[187185]: 2025-11-29 07:53:46.287 187189 INFO nova.virt.libvirt.driver [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 07:53:46 compute-0 nova_compute[187185]: 2025-11-29 07:53:46.342 187189 DEBUG nova.compute.manager [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 07:53:46 compute-0 nova_compute[187185]: 2025-11-29 07:53:46.527 187189 DEBUG nova.compute.manager [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 07:53:46 compute-0 nova_compute[187185]: 2025-11-29 07:53:46.528 187189 DEBUG nova.virt.libvirt.driver [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 07:53:46 compute-0 nova_compute[187185]: 2025-11-29 07:53:46.529 187189 INFO nova.virt.libvirt.driver [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Creating image(s)
Nov 29 07:53:46 compute-0 nova_compute[187185]: 2025-11-29 07:53:46.530 187189 DEBUG oslo_concurrency.lockutils [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "/var/lib/nova/instances/43cb7661-81a7-4e91-96aa-5d72329a58b7/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:53:46 compute-0 nova_compute[187185]: 2025-11-29 07:53:46.530 187189 DEBUG oslo_concurrency.lockutils [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "/var/lib/nova/instances/43cb7661-81a7-4e91-96aa-5d72329a58b7/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:53:46 compute-0 nova_compute[187185]: 2025-11-29 07:53:46.531 187189 DEBUG oslo_concurrency.lockutils [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "/var/lib/nova/instances/43cb7661-81a7-4e91-96aa-5d72329a58b7/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:53:46 compute-0 nova_compute[187185]: 2025-11-29 07:53:46.548 187189 DEBUG oslo_concurrency.processutils [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:53:46 compute-0 nova_compute[187185]: 2025-11-29 07:53:46.575 187189 DEBUG nova.policy [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 07:53:46 compute-0 nova_compute[187185]: 2025-11-29 07:53:46.617 187189 DEBUG oslo_concurrency.processutils [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:53:46 compute-0 nova_compute[187185]: 2025-11-29 07:53:46.618 187189 DEBUG oslo_concurrency.lockutils [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:53:46 compute-0 nova_compute[187185]: 2025-11-29 07:53:46.619 187189 DEBUG oslo_concurrency.lockutils [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:53:46 compute-0 nova_compute[187185]: 2025-11-29 07:53:46.635 187189 DEBUG oslo_concurrency.processutils [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:53:46 compute-0 nova_compute[187185]: 2025-11-29 07:53:46.699 187189 DEBUG oslo_concurrency.processutils [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:53:46 compute-0 nova_compute[187185]: 2025-11-29 07:53:46.700 187189 DEBUG oslo_concurrency.processutils [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/43cb7661-81a7-4e91-96aa-5d72329a58b7/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:53:47 compute-0 nova_compute[187185]: 2025-11-29 07:53:47.153 187189 DEBUG oslo_concurrency.processutils [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/43cb7661-81a7-4e91-96aa-5d72329a58b7/disk 1073741824" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:53:47 compute-0 nova_compute[187185]: 2025-11-29 07:53:47.155 187189 DEBUG oslo_concurrency.lockutils [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.536s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:53:47 compute-0 nova_compute[187185]: 2025-11-29 07:53:47.156 187189 DEBUG oslo_concurrency.processutils [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:53:47 compute-0 nova_compute[187185]: 2025-11-29 07:53:47.223 187189 DEBUG oslo_concurrency.processutils [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:53:47 compute-0 nova_compute[187185]: 2025-11-29 07:53:47.225 187189 DEBUG nova.virt.disk.api [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Checking if we can resize image /var/lib/nova/instances/43cb7661-81a7-4e91-96aa-5d72329a58b7/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:53:47 compute-0 nova_compute[187185]: 2025-11-29 07:53:47.226 187189 DEBUG oslo_concurrency.processutils [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/43cb7661-81a7-4e91-96aa-5d72329a58b7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:53:47 compute-0 nova_compute[187185]: 2025-11-29 07:53:47.299 187189 DEBUG nova.network.neutron [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Successfully created port: 27f71340-6ac0-4431-b058-f02eca4fb423 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 07:53:47 compute-0 nova_compute[187185]: 2025-11-29 07:53:47.303 187189 DEBUG oslo_concurrency.processutils [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/43cb7661-81a7-4e91-96aa-5d72329a58b7/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:53:47 compute-0 nova_compute[187185]: 2025-11-29 07:53:47.304 187189 DEBUG nova.virt.disk.api [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Cannot resize image /var/lib/nova/instances/43cb7661-81a7-4e91-96aa-5d72329a58b7/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:53:47 compute-0 nova_compute[187185]: 2025-11-29 07:53:47.305 187189 DEBUG nova.objects.instance [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lazy-loading 'migration_context' on Instance uuid 43cb7661-81a7-4e91-96aa-5d72329a58b7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:53:47 compute-0 nova_compute[187185]: 2025-11-29 07:53:47.321 187189 DEBUG nova.virt.libvirt.driver [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 07:53:47 compute-0 nova_compute[187185]: 2025-11-29 07:53:47.321 187189 DEBUG nova.virt.libvirt.driver [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Ensure instance console log exists: /var/lib/nova/instances/43cb7661-81a7-4e91-96aa-5d72329a58b7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 07:53:47 compute-0 nova_compute[187185]: 2025-11-29 07:53:47.322 187189 DEBUG oslo_concurrency.lockutils [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:53:47 compute-0 nova_compute[187185]: 2025-11-29 07:53:47.322 187189 DEBUG oslo_concurrency.lockutils [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:53:47 compute-0 nova_compute[187185]: 2025-11-29 07:53:47.323 187189 DEBUG oslo_concurrency.lockutils [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:53:47 compute-0 nova_compute[187185]: 2025-11-29 07:53:47.346 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:53:48 compute-0 nova_compute[187185]: 2025-11-29 07:53:48.102 187189 DEBUG nova.network.neutron [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Successfully updated port: 27f71340-6ac0-4431-b058-f02eca4fb423 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 07:53:48 compute-0 nova_compute[187185]: 2025-11-29 07:53:48.126 187189 DEBUG oslo_concurrency.lockutils [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "refresh_cache-43cb7661-81a7-4e91-96aa-5d72329a58b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:53:48 compute-0 nova_compute[187185]: 2025-11-29 07:53:48.127 187189 DEBUG oslo_concurrency.lockutils [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquired lock "refresh_cache-43cb7661-81a7-4e91-96aa-5d72329a58b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:53:48 compute-0 nova_compute[187185]: 2025-11-29 07:53:48.127 187189 DEBUG nova.network.neutron [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:53:48 compute-0 nova_compute[187185]: 2025-11-29 07:53:48.226 187189 DEBUG nova.compute.manager [req-040c1996-bf88-47e8-86b2-d236926e8da5 req-c5a7c5b3-9899-4927-a6c5-659878ae5b58 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Received event network-changed-27f71340-6ac0-4431-b058-f02eca4fb423 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:53:48 compute-0 nova_compute[187185]: 2025-11-29 07:53:48.227 187189 DEBUG nova.compute.manager [req-040c1996-bf88-47e8-86b2-d236926e8da5 req-c5a7c5b3-9899-4927-a6c5-659878ae5b58 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Refreshing instance network info cache due to event network-changed-27f71340-6ac0-4431-b058-f02eca4fb423. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:53:48 compute-0 nova_compute[187185]: 2025-11-29 07:53:48.227 187189 DEBUG oslo_concurrency.lockutils [req-040c1996-bf88-47e8-86b2-d236926e8da5 req-c5a7c5b3-9899-4927-a6c5-659878ae5b58 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-43cb7661-81a7-4e91-96aa-5d72329a58b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:53:48 compute-0 nova_compute[187185]: 2025-11-29 07:53:48.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:53:48 compute-0 nova_compute[187185]: 2025-11-29 07:53:48.318 187189 DEBUG nova.network.neutron [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:53:48 compute-0 nova_compute[187185]: 2025-11-29 07:53:48.714 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:49 compute-0 podman[248365]: 2025-11-29 07:53:49.885979143 +0000 UTC m=+0.145849282 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 07:53:49 compute-0 nova_compute[187185]: 2025-11-29 07:53:49.886 187189 DEBUG nova.network.neutron [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Updating instance_info_cache with network_info: [{"id": "27f71340-6ac0-4431-b058-f02eca4fb423", "address": "fa:16:3e:0b:ca:be", "network": {"id": "8429e89c-8540-4db3-b6b2-48775311a13d", "bridge": "br-int", "label": "tempest-network-smoke--2013284958", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27f71340-6a", "ovs_interfaceid": "27f71340-6ac0-4431-b058-f02eca4fb423", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:53:49 compute-0 nova_compute[187185]: 2025-11-29 07:53:49.911 187189 DEBUG oslo_concurrency.lockutils [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Releasing lock "refresh_cache-43cb7661-81a7-4e91-96aa-5d72329a58b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:53:49 compute-0 nova_compute[187185]: 2025-11-29 07:53:49.911 187189 DEBUG nova.compute.manager [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Instance network_info: |[{"id": "27f71340-6ac0-4431-b058-f02eca4fb423", "address": "fa:16:3e:0b:ca:be", "network": {"id": "8429e89c-8540-4db3-b6b2-48775311a13d", "bridge": "br-int", "label": "tempest-network-smoke--2013284958", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27f71340-6a", "ovs_interfaceid": "27f71340-6ac0-4431-b058-f02eca4fb423", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 07:53:49 compute-0 nova_compute[187185]: 2025-11-29 07:53:49.912 187189 DEBUG oslo_concurrency.lockutils [req-040c1996-bf88-47e8-86b2-d236926e8da5 req-c5a7c5b3-9899-4927-a6c5-659878ae5b58 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-43cb7661-81a7-4e91-96aa-5d72329a58b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:53:49 compute-0 nova_compute[187185]: 2025-11-29 07:53:49.912 187189 DEBUG nova.network.neutron [req-040c1996-bf88-47e8-86b2-d236926e8da5 req-c5a7c5b3-9899-4927-a6c5-659878ae5b58 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Refreshing network info cache for port 27f71340-6ac0-4431-b058-f02eca4fb423 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:53:49 compute-0 nova_compute[187185]: 2025-11-29 07:53:49.922 187189 DEBUG nova.virt.libvirt.driver [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Start _get_guest_xml network_info=[{"id": "27f71340-6ac0-4431-b058-f02eca4fb423", "address": "fa:16:3e:0b:ca:be", "network": {"id": "8429e89c-8540-4db3-b6b2-48775311a13d", "bridge": "br-int", "label": "tempest-network-smoke--2013284958", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27f71340-6a", "ovs_interfaceid": "27f71340-6ac0-4431-b058-f02eca4fb423", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:53:49 compute-0 nova_compute[187185]: 2025-11-29 07:53:49.932 187189 WARNING nova.virt.libvirt.driver [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:53:49 compute-0 nova_compute[187185]: 2025-11-29 07:53:49.940 187189 DEBUG nova.virt.libvirt.host [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:53:49 compute-0 nova_compute[187185]: 2025-11-29 07:53:49.941 187189 DEBUG nova.virt.libvirt.host [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:53:49 compute-0 nova_compute[187185]: 2025-11-29 07:53:49.950 187189 DEBUG nova.virt.libvirt.host [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:53:49 compute-0 nova_compute[187185]: 2025-11-29 07:53:49.952 187189 DEBUG nova.virt.libvirt.host [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:53:49 compute-0 nova_compute[187185]: 2025-11-29 07:53:49.953 187189 DEBUG nova.virt.libvirt.driver [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:53:49 compute-0 nova_compute[187185]: 2025-11-29 07:53:49.954 187189 DEBUG nova.virt.hardware [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:53:49 compute-0 nova_compute[187185]: 2025-11-29 07:53:49.954 187189 DEBUG nova.virt.hardware [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:53:49 compute-0 nova_compute[187185]: 2025-11-29 07:53:49.954 187189 DEBUG nova.virt.hardware [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:53:49 compute-0 nova_compute[187185]: 2025-11-29 07:53:49.955 187189 DEBUG nova.virt.hardware [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:53:49 compute-0 nova_compute[187185]: 2025-11-29 07:53:49.955 187189 DEBUG nova.virt.hardware [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:53:49 compute-0 nova_compute[187185]: 2025-11-29 07:53:49.955 187189 DEBUG nova.virt.hardware [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:53:49 compute-0 nova_compute[187185]: 2025-11-29 07:53:49.956 187189 DEBUG nova.virt.hardware [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:53:49 compute-0 nova_compute[187185]: 2025-11-29 07:53:49.956 187189 DEBUG nova.virt.hardware [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:53:49 compute-0 nova_compute[187185]: 2025-11-29 07:53:49.956 187189 DEBUG nova.virt.hardware [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:53:49 compute-0 nova_compute[187185]: 2025-11-29 07:53:49.957 187189 DEBUG nova.virt.hardware [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:53:49 compute-0 nova_compute[187185]: 2025-11-29 07:53:49.957 187189 DEBUG nova.virt.hardware [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:53:49 compute-0 nova_compute[187185]: 2025-11-29 07:53:49.962 187189 DEBUG nova.virt.libvirt.vif [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:53:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-914474573',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-914474573',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2022058758-ac',id=178,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDa1HqYHK6Fbyj4+WZlz/MxWl+SfeIBiBlR8m/oY3Vy4Q3n28dGa98Jt6Jmq1CjInsdStO6SA1dYTN5Q75hPAjlWGa44sox6aoYIWGLELZYzttGtittaInjJUuQncR/BLQ==',key_name='tempest-TestSecurityGroupsBasicOps-2142300595',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e8e45e91223b45a79dd698a82af4a2a5',ramdisk_id='',reservation_id='r-hij45f5r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2022058758',owner_user_name='tempest-TestSecurityGroupsBasicOps-2022058758-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:53:46Z,user_data=None,user_id='dec30fbde18e4b2382ea2c59847d067f',uuid=43cb7661-81a7-4e91-96aa-5d72329a58b7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "27f71340-6ac0-4431-b058-f02eca4fb423", "address": "fa:16:3e:0b:ca:be", "network": {"id": "8429e89c-8540-4db3-b6b2-48775311a13d", "bridge": "br-int", "label": "tempest-network-smoke--2013284958", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27f71340-6a", "ovs_interfaceid": "27f71340-6ac0-4431-b058-f02eca4fb423", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:53:49 compute-0 nova_compute[187185]: 2025-11-29 07:53:49.962 187189 DEBUG nova.network.os_vif_util [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converting VIF {"id": "27f71340-6ac0-4431-b058-f02eca4fb423", "address": "fa:16:3e:0b:ca:be", "network": {"id": "8429e89c-8540-4db3-b6b2-48775311a13d", "bridge": "br-int", "label": "tempest-network-smoke--2013284958", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27f71340-6a", "ovs_interfaceid": "27f71340-6ac0-4431-b058-f02eca4fb423", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:53:49 compute-0 nova_compute[187185]: 2025-11-29 07:53:49.963 187189 DEBUG nova.network.os_vif_util [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0b:ca:be,bridge_name='br-int',has_traffic_filtering=True,id=27f71340-6ac0-4431-b058-f02eca4fb423,network=Network(8429e89c-8540-4db3-b6b2-48775311a13d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27f71340-6a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:53:49 compute-0 nova_compute[187185]: 2025-11-29 07:53:49.965 187189 DEBUG nova.objects.instance [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lazy-loading 'pci_devices' on Instance uuid 43cb7661-81a7-4e91-96aa-5d72329a58b7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:53:50 compute-0 nova_compute[187185]: 2025-11-29 07:53:50.228 187189 DEBUG nova.virt.libvirt.driver [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:53:50 compute-0 nova_compute[187185]:   <uuid>43cb7661-81a7-4e91-96aa-5d72329a58b7</uuid>
Nov 29 07:53:50 compute-0 nova_compute[187185]:   <name>instance-000000b2</name>
Nov 29 07:53:50 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 07:53:50 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:53:50 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:53:50 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:       <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-914474573</nova:name>
Nov 29 07:53:50 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:53:49</nova:creationTime>
Nov 29 07:53:50 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 07:53:50 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 07:53:50 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:53:50 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:53:50 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:53:50 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:53:50 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:53:50 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:53:50 compute-0 nova_compute[187185]:         <nova:user uuid="dec30fbde18e4b2382ea2c59847d067f">tempest-TestSecurityGroupsBasicOps-2022058758-project-member</nova:user>
Nov 29 07:53:50 compute-0 nova_compute[187185]:         <nova:project uuid="e8e45e91223b45a79dd698a82af4a2a5">tempest-TestSecurityGroupsBasicOps-2022058758</nova:project>
Nov 29 07:53:50 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:53:50 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 07:53:50 compute-0 nova_compute[187185]:         <nova:port uuid="27f71340-6ac0-4431-b058-f02eca4fb423">
Nov 29 07:53:50 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:53:50 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:53:50 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:53:50 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <system>
Nov 29 07:53:50 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:53:50 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:53:50 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:53:50 compute-0 nova_compute[187185]:       <entry name="serial">43cb7661-81a7-4e91-96aa-5d72329a58b7</entry>
Nov 29 07:53:50 compute-0 nova_compute[187185]:       <entry name="uuid">43cb7661-81a7-4e91-96aa-5d72329a58b7</entry>
Nov 29 07:53:50 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     </system>
Nov 29 07:53:50 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:53:50 compute-0 nova_compute[187185]:   <os>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:   </os>
Nov 29 07:53:50 compute-0 nova_compute[187185]:   <features>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:   </features>
Nov 29 07:53:50 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:53:50 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:53:50 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:53:50 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/43cb7661-81a7-4e91-96aa-5d72329a58b7/disk"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:53:50 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/43cb7661-81a7-4e91-96aa-5d72329a58b7/disk.config"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:53:50 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:0b:ca:be"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:       <target dev="tap27f71340-6a"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:53:50 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/43cb7661-81a7-4e91-96aa-5d72329a58b7/console.log" append="off"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <video>
Nov 29 07:53:50 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     </video>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:53:50 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:53:50 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:53:50 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:53:50 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:53:50 compute-0 nova_compute[187185]: </domain>
Nov 29 07:53:50 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:53:50 compute-0 nova_compute[187185]: 2025-11-29 07:53:50.230 187189 DEBUG nova.compute.manager [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Preparing to wait for external event network-vif-plugged-27f71340-6ac0-4431-b058-f02eca4fb423 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 07:53:50 compute-0 nova_compute[187185]: 2025-11-29 07:53:50.231 187189 DEBUG oslo_concurrency.lockutils [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "43cb7661-81a7-4e91-96aa-5d72329a58b7-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:53:50 compute-0 nova_compute[187185]: 2025-11-29 07:53:50.231 187189 DEBUG oslo_concurrency.lockutils [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "43cb7661-81a7-4e91-96aa-5d72329a58b7-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:53:50 compute-0 nova_compute[187185]: 2025-11-29 07:53:50.232 187189 DEBUG oslo_concurrency.lockutils [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "43cb7661-81a7-4e91-96aa-5d72329a58b7-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:53:50 compute-0 nova_compute[187185]: 2025-11-29 07:53:50.232 187189 DEBUG nova.virt.libvirt.vif [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:53:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-914474573',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-914474573',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2022058758-ac',id=178,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDa1HqYHK6Fbyj4+WZlz/MxWl+SfeIBiBlR8m/oY3Vy4Q3n28dGa98Jt6Jmq1CjInsdStO6SA1dYTN5Q75hPAjlWGa44sox6aoYIWGLELZYzttGtittaInjJUuQncR/BLQ==',key_name='tempest-TestSecurityGroupsBasicOps-2142300595',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e8e45e91223b45a79dd698a82af4a2a5',ramdisk_id='',reservation_id='r-hij45f5r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2022058758',owner_user_name='tempest-TestSecurityGroupsBasicOps-2022058758-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:53:46Z,user_data=None,user_id='dec30fbde18e4b2382ea2c59847d067f',uuid=43cb7661-81a7-4e91-96aa-5d72329a58b7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "27f71340-6ac0-4431-b058-f02eca4fb423", "address": "fa:16:3e:0b:ca:be", "network": {"id": "8429e89c-8540-4db3-b6b2-48775311a13d", "bridge": "br-int", "label": "tempest-network-smoke--2013284958", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27f71340-6a", "ovs_interfaceid": "27f71340-6ac0-4431-b058-f02eca4fb423", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:53:50 compute-0 nova_compute[187185]: 2025-11-29 07:53:50.233 187189 DEBUG nova.network.os_vif_util [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converting VIF {"id": "27f71340-6ac0-4431-b058-f02eca4fb423", "address": "fa:16:3e:0b:ca:be", "network": {"id": "8429e89c-8540-4db3-b6b2-48775311a13d", "bridge": "br-int", "label": "tempest-network-smoke--2013284958", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27f71340-6a", "ovs_interfaceid": "27f71340-6ac0-4431-b058-f02eca4fb423", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:53:50 compute-0 nova_compute[187185]: 2025-11-29 07:53:50.234 187189 DEBUG nova.network.os_vif_util [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0b:ca:be,bridge_name='br-int',has_traffic_filtering=True,id=27f71340-6ac0-4431-b058-f02eca4fb423,network=Network(8429e89c-8540-4db3-b6b2-48775311a13d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27f71340-6a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:53:50 compute-0 nova_compute[187185]: 2025-11-29 07:53:50.234 187189 DEBUG os_vif [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0b:ca:be,bridge_name='br-int',has_traffic_filtering=True,id=27f71340-6ac0-4431-b058-f02eca4fb423,network=Network(8429e89c-8540-4db3-b6b2-48775311a13d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27f71340-6a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:53:50 compute-0 nova_compute[187185]: 2025-11-29 07:53:50.235 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:50 compute-0 nova_compute[187185]: 2025-11-29 07:53:50.235 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:53:50 compute-0 nova_compute[187185]: 2025-11-29 07:53:50.236 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:53:50 compute-0 nova_compute[187185]: 2025-11-29 07:53:50.239 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:50 compute-0 nova_compute[187185]: 2025-11-29 07:53:50.239 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap27f71340-6a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:53:50 compute-0 nova_compute[187185]: 2025-11-29 07:53:50.241 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap27f71340-6a, col_values=(('external_ids', {'iface-id': '27f71340-6ac0-4431-b058-f02eca4fb423', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0b:ca:be', 'vm-uuid': '43cb7661-81a7-4e91-96aa-5d72329a58b7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:53:50 compute-0 nova_compute[187185]: 2025-11-29 07:53:50.243 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:50 compute-0 nova_compute[187185]: 2025-11-29 07:53:50.244 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:53:50 compute-0 NetworkManager[55227]: <info>  [1764402830.2483] manager: (tap27f71340-6a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/311)
Nov 29 07:53:50 compute-0 nova_compute[187185]: 2025-11-29 07:53:50.251 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:50 compute-0 nova_compute[187185]: 2025-11-29 07:53:50.252 187189 INFO os_vif [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0b:ca:be,bridge_name='br-int',has_traffic_filtering=True,id=27f71340-6ac0-4431-b058-f02eca4fb423,network=Network(8429e89c-8540-4db3-b6b2-48775311a13d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27f71340-6a')
Nov 29 07:53:50 compute-0 nova_compute[187185]: 2025-11-29 07:53:50.576 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:53:50 compute-0 nova_compute[187185]: 2025-11-29 07:53:50.598 187189 DEBUG nova.virt.libvirt.driver [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:53:50 compute-0 nova_compute[187185]: 2025-11-29 07:53:50.599 187189 DEBUG nova.virt.libvirt.driver [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:53:50 compute-0 nova_compute[187185]: 2025-11-29 07:53:50.599 187189 DEBUG nova.virt.libvirt.driver [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] No VIF found with MAC fa:16:3e:0b:ca:be, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:53:50 compute-0 nova_compute[187185]: 2025-11-29 07:53:50.599 187189 INFO nova.virt.libvirt.driver [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Using config drive
Nov 29 07:53:50 compute-0 nova_compute[187185]: 2025-11-29 07:53:50.741 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Triggering sync for uuid 43cb7661-81a7-4e91-96aa-5d72329a58b7 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Nov 29 07:53:50 compute-0 nova_compute[187185]: 2025-11-29 07:53:50.742 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "43cb7661-81a7-4e91-96aa-5d72329a58b7" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:53:51 compute-0 nova_compute[187185]: 2025-11-29 07:53:51.860 187189 INFO nova.virt.libvirt.driver [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Creating config drive at /var/lib/nova/instances/43cb7661-81a7-4e91-96aa-5d72329a58b7/disk.config
Nov 29 07:53:51 compute-0 nova_compute[187185]: 2025-11-29 07:53:51.866 187189 DEBUG oslo_concurrency.processutils [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/43cb7661-81a7-4e91-96aa-5d72329a58b7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8w2s6dop execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:53:52 compute-0 nova_compute[187185]: 2025-11-29 07:53:52.003 187189 DEBUG oslo_concurrency.processutils [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/43cb7661-81a7-4e91-96aa-5d72329a58b7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8w2s6dop" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:53:52 compute-0 kernel: tap27f71340-6a: entered promiscuous mode
Nov 29 07:53:52 compute-0 NetworkManager[55227]: <info>  [1764402832.0922] manager: (tap27f71340-6a): new Tun device (/org/freedesktop/NetworkManager/Devices/312)
Nov 29 07:53:52 compute-0 ovn_controller[95281]: 2025-11-29T07:53:52Z|00599|binding|INFO|Claiming lport 27f71340-6ac0-4431-b058-f02eca4fb423 for this chassis.
Nov 29 07:53:52 compute-0 nova_compute[187185]: 2025-11-29 07:53:52.093 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:52 compute-0 ovn_controller[95281]: 2025-11-29T07:53:52Z|00600|binding|INFO|27f71340-6ac0-4431-b058-f02eca4fb423: Claiming fa:16:3e:0b:ca:be 10.100.0.11
Nov 29 07:53:52 compute-0 nova_compute[187185]: 2025-11-29 07:53:52.098 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:53:52.111 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0b:ca:be 10.100.0.11'], port_security=['fa:16:3e:0b:ca:be 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8429e89c-8540-4db3-b6b2-48775311a13d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '302ff4eb-5b37-47a5-8263-6df9580417a7 c9c03ecd-65fb-4137-bd9c-bfe8eac1c96d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ab1419e5-3fc4-47d1-a2be-d34ec9f548ab, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=27f71340-6ac0-4431-b058-f02eca4fb423) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:53:52.114 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 27f71340-6ac0-4431-b058-f02eca4fb423 in datapath 8429e89c-8540-4db3-b6b2-48775311a13d bound to our chassis
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:53:52.116 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8429e89c-8540-4db3-b6b2-48775311a13d
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:53:52.129 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[fd646ae6-ab97-4831-bdd2-dde27ef8715f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:53:52.130 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8429e89c-81 in ovnmeta-8429e89c-8540-4db3-b6b2-48775311a13d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:53:52 compute-0 systemd-udevd[248412]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:53:52.133 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8429e89c-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:53:52.133 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[9e7c5268-adce-4d06-a4b9-198cb6235f05]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:53:52.134 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[624069a3-58ff-4fdc-8d04-37f6c54dfaf6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:53:52 compute-0 NetworkManager[55227]: <info>  [1764402832.1490] device (tap27f71340-6a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:53:52 compute-0 NetworkManager[55227]: <info>  [1764402832.1503] device (tap27f71340-6a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:53:52 compute-0 systemd-machined[153486]: New machine qemu-69-instance-000000b2.
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:53:52.153 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[55d534c8-cdb9-41e4-83d3-9a15d9239db4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:53:52 compute-0 nova_compute[187185]: 2025-11-29 07:53:52.165 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:52 compute-0 nova_compute[187185]: 2025-11-29 07:53:52.169 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:52 compute-0 ovn_controller[95281]: 2025-11-29T07:53:52Z|00601|binding|INFO|Setting lport 27f71340-6ac0-4431-b058-f02eca4fb423 ovn-installed in OVS
Nov 29 07:53:52 compute-0 ovn_controller[95281]: 2025-11-29T07:53:52Z|00602|binding|INFO|Setting lport 27f71340-6ac0-4431-b058-f02eca4fb423 up in Southbound
Nov 29 07:53:52 compute-0 nova_compute[187185]: 2025-11-29 07:53:52.174 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:53:52.173 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[2fe3178a-2c6a-4672-b289-2ce60064cc38]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:53:52 compute-0 systemd[1]: Started Virtual Machine qemu-69-instance-000000b2.
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:53:52.208 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[12cbcc81-37ac-4c57-8da9-8be2b11668c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:53:52 compute-0 NetworkManager[55227]: <info>  [1764402832.2159] manager: (tap8429e89c-80): new Veth device (/org/freedesktop/NetworkManager/Devices/313)
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:53:52.215 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[c16f4cf1-321f-44f7-8b09-3828ac31dea2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:53:52.267 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[8c757cde-1af8-4297-9347-b22142310afd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:53:52.274 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[7b8a05e3-27f6-4369-8a1e-6018829ec955]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:53:52 compute-0 NetworkManager[55227]: <info>  [1764402832.3050] device (tap8429e89c-80): carrier: link connected
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:53:52.309 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[c380652a-429a-4e41-82ff-42bfba2c9073]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:53:52.328 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[2b397996-0179-44c7-9014-cefa6d5abfd5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8429e89c-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:64:be:84'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 187], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 827996, 'reachable_time': 20443, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248445, 'error': None, 'target': 'ovnmeta-8429e89c-8540-4db3-b6b2-48775311a13d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:53:52.343 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[0bf0ca3c-a627-4095-96d4-557d8bf7d865]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe64:be84'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 827996, 'tstamp': 827996}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 248448, 'error': None, 'target': 'ovnmeta-8429e89c-8540-4db3-b6b2-48775311a13d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:53:52.363 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[c08f8ad3-d908-4baf-aaf4-d359bd88d870]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8429e89c-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:64:be:84'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 187], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 827996, 'reachable_time': 20443, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 248453, 'error': None, 'target': 'ovnmeta-8429e89c-8540-4db3-b6b2-48775311a13d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:53:52.407 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[4cc5d112-c293-4e6e-93b9-4af517cfd7e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:53:52 compute-0 nova_compute[187185]: 2025-11-29 07:53:52.439 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764402832.4384286, 43cb7661-81a7-4e91-96aa-5d72329a58b7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:53:52 compute-0 nova_compute[187185]: 2025-11-29 07:53:52.439 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] VM Started (Lifecycle Event)
Nov 29 07:53:52 compute-0 nova_compute[187185]: 2025-11-29 07:53:52.473 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:53:52 compute-0 nova_compute[187185]: 2025-11-29 07:53:52.478 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764402832.442439, 43cb7661-81a7-4e91-96aa-5d72329a58b7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:53:52 compute-0 nova_compute[187185]: 2025-11-29 07:53:52.479 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] VM Paused (Lifecycle Event)
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:53:52.489 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[ede237be-5e3b-44ad-befa-1b1b12d79bb2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:53:52.491 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8429e89c-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:53:52.491 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:53:52.492 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8429e89c-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:53:52 compute-0 nova_compute[187185]: 2025-11-29 07:53:52.494 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:52 compute-0 kernel: tap8429e89c-80: entered promiscuous mode
Nov 29 07:53:52 compute-0 NetworkManager[55227]: <info>  [1764402832.4951] manager: (tap8429e89c-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/314)
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:53:52.498 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8429e89c-80, col_values=(('external_ids', {'iface-id': '09e63821-cfdc-4962-ad09-7970b232d886'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:53:52 compute-0 nova_compute[187185]: 2025-11-29 07:53:52.498 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:53:52 compute-0 nova_compute[187185]: 2025-11-29 07:53:52.498 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:52 compute-0 ovn_controller[95281]: 2025-11-29T07:53:52Z|00603|binding|INFO|Releasing lport 09e63821-cfdc-4962-ad09-7970b232d886 from this chassis (sb_readonly=0)
Nov 29 07:53:52 compute-0 nova_compute[187185]: 2025-11-29 07:53:52.501 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:53:52.502 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8429e89c-8540-4db3-b6b2-48775311a13d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8429e89c-8540-4db3-b6b2-48775311a13d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:53:52 compute-0 nova_compute[187185]: 2025-11-29 07:53:52.503 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:53:52.503 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[3fecbd06-d0ff-4205-b2ce-4570a1a4a92e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:53:52.504 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-8429e89c-8540-4db3-b6b2-48775311a13d
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/8429e89c-8540-4db3-b6b2-48775311a13d.pid.haproxy
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID 8429e89c-8540-4db3-b6b2-48775311a13d
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:53:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:53:52.505 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8429e89c-8540-4db3-b6b2-48775311a13d', 'env', 'PROCESS_TAG=haproxy-8429e89c-8540-4db3-b6b2-48775311a13d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8429e89c-8540-4db3-b6b2-48775311a13d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:53:52 compute-0 nova_compute[187185]: 2025-11-29 07:53:52.513 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:52 compute-0 nova_compute[187185]: 2025-11-29 07:53:52.535 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:53:52 compute-0 podman[248485]: 2025-11-29 07:53:52.933186909 +0000 UTC m=+0.059648058 container create 21f9cefb932825ef1ffd89978c3e42c3aa2915bd11dff3d8f5308f6db84d9975 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8429e89c-8540-4db3-b6b2-48775311a13d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:53:52 compute-0 systemd[1]: Started libpod-conmon-21f9cefb932825ef1ffd89978c3e42c3aa2915bd11dff3d8f5308f6db84d9975.scope.
Nov 29 07:53:52 compute-0 podman[248485]: 2025-11-29 07:53:52.89948322 +0000 UTC m=+0.025944399 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:53:53 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:53:53 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/326761c22494d9d9deefcf31147889af1a48e495a396cd410e351f04fb70704a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:53:53 compute-0 podman[248485]: 2025-11-29 07:53:53.037633072 +0000 UTC m=+0.164094251 container init 21f9cefb932825ef1ffd89978c3e42c3aa2915bd11dff3d8f5308f6db84d9975 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8429e89c-8540-4db3-b6b2-48775311a13d, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 29 07:53:53 compute-0 podman[248485]: 2025-11-29 07:53:53.044877408 +0000 UTC m=+0.171338567 container start 21f9cefb932825ef1ffd89978c3e42c3aa2915bd11dff3d8f5308f6db84d9975 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8429e89c-8540-4db3-b6b2-48775311a13d, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 29 07:53:53 compute-0 neutron-haproxy-ovnmeta-8429e89c-8540-4db3-b6b2-48775311a13d[248501]: [NOTICE]   (248505) : New worker (248507) forked
Nov 29 07:53:53 compute-0 neutron-haproxy-ovnmeta-8429e89c-8540-4db3-b6b2-48775311a13d[248501]: [NOTICE]   (248505) : Loading success.
Nov 29 07:53:53 compute-0 nova_compute[187185]: 2025-11-29 07:53:53.152 187189 DEBUG nova.compute.manager [req-2ceb3d5f-0f46-4e18-bed8-6df946d99f7b req-0922ab6a-21fe-48b2-bfe1-bdde363145af 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Received event network-vif-plugged-27f71340-6ac0-4431-b058-f02eca4fb423 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:53:53 compute-0 nova_compute[187185]: 2025-11-29 07:53:53.153 187189 DEBUG oslo_concurrency.lockutils [req-2ceb3d5f-0f46-4e18-bed8-6df946d99f7b req-0922ab6a-21fe-48b2-bfe1-bdde363145af 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "43cb7661-81a7-4e91-96aa-5d72329a58b7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:53:53 compute-0 nova_compute[187185]: 2025-11-29 07:53:53.153 187189 DEBUG oslo_concurrency.lockutils [req-2ceb3d5f-0f46-4e18-bed8-6df946d99f7b req-0922ab6a-21fe-48b2-bfe1-bdde363145af 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "43cb7661-81a7-4e91-96aa-5d72329a58b7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:53:53 compute-0 nova_compute[187185]: 2025-11-29 07:53:53.153 187189 DEBUG oslo_concurrency.lockutils [req-2ceb3d5f-0f46-4e18-bed8-6df946d99f7b req-0922ab6a-21fe-48b2-bfe1-bdde363145af 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "43cb7661-81a7-4e91-96aa-5d72329a58b7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:53:53 compute-0 nova_compute[187185]: 2025-11-29 07:53:53.153 187189 DEBUG nova.compute.manager [req-2ceb3d5f-0f46-4e18-bed8-6df946d99f7b req-0922ab6a-21fe-48b2-bfe1-bdde363145af 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Processing event network-vif-plugged-27f71340-6ac0-4431-b058-f02eca4fb423 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 07:53:53 compute-0 nova_compute[187185]: 2025-11-29 07:53:53.154 187189 DEBUG nova.compute.manager [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:53:53 compute-0 nova_compute[187185]: 2025-11-29 07:53:53.180 187189 DEBUG nova.virt.libvirt.driver [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 07:53:53 compute-0 nova_compute[187185]: 2025-11-29 07:53:53.181 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764402833.180407, 43cb7661-81a7-4e91-96aa-5d72329a58b7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:53:53 compute-0 nova_compute[187185]: 2025-11-29 07:53:53.181 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] VM Resumed (Lifecycle Event)
Nov 29 07:53:53 compute-0 nova_compute[187185]: 2025-11-29 07:53:53.186 187189 INFO nova.virt.libvirt.driver [-] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Instance spawned successfully.
Nov 29 07:53:53 compute-0 nova_compute[187185]: 2025-11-29 07:53:53.187 187189 DEBUG nova.virt.libvirt.driver [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 07:53:53 compute-0 nova_compute[187185]: 2025-11-29 07:53:53.202 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:53:53 compute-0 nova_compute[187185]: 2025-11-29 07:53:53.207 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:53:53 compute-0 nova_compute[187185]: 2025-11-29 07:53:53.211 187189 DEBUG nova.virt.libvirt.driver [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:53:53 compute-0 nova_compute[187185]: 2025-11-29 07:53:53.211 187189 DEBUG nova.virt.libvirt.driver [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:53:53 compute-0 nova_compute[187185]: 2025-11-29 07:53:53.212 187189 DEBUG nova.virt.libvirt.driver [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:53:53 compute-0 nova_compute[187185]: 2025-11-29 07:53:53.212 187189 DEBUG nova.virt.libvirt.driver [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:53:53 compute-0 nova_compute[187185]: 2025-11-29 07:53:53.213 187189 DEBUG nova.virt.libvirt.driver [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:53:53 compute-0 nova_compute[187185]: 2025-11-29 07:53:53.213 187189 DEBUG nova.virt.libvirt.driver [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:53:53 compute-0 nova_compute[187185]: 2025-11-29 07:53:53.250 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:53:53 compute-0 nova_compute[187185]: 2025-11-29 07:53:53.294 187189 INFO nova.compute.manager [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Took 6.77 seconds to spawn the instance on the hypervisor.
Nov 29 07:53:53 compute-0 nova_compute[187185]: 2025-11-29 07:53:53.295 187189 DEBUG nova.compute.manager [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:53:53 compute-0 nova_compute[187185]: 2025-11-29 07:53:53.376 187189 INFO nova.compute.manager [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Took 7.55 seconds to build instance.
Nov 29 07:53:53 compute-0 nova_compute[187185]: 2025-11-29 07:53:53.402 187189 DEBUG oslo_concurrency.lockutils [None req-d60d21c2-f704-42bb-be3a-dda45603cdda dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "43cb7661-81a7-4e91-96aa-5d72329a58b7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.676s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:53:53 compute-0 nova_compute[187185]: 2025-11-29 07:53:53.403 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "43cb7661-81a7-4e91-96aa-5d72329a58b7" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 2.661s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:53:53 compute-0 nova_compute[187185]: 2025-11-29 07:53:53.403 187189 INFO nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:53:53 compute-0 nova_compute[187185]: 2025-11-29 07:53:53.403 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "43cb7661-81a7-4e91-96aa-5d72329a58b7" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:53:53 compute-0 nova_compute[187185]: 2025-11-29 07:53:53.716 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:54 compute-0 nova_compute[187185]: 2025-11-29 07:53:54.047 187189 DEBUG nova.network.neutron [req-040c1996-bf88-47e8-86b2-d236926e8da5 req-c5a7c5b3-9899-4927-a6c5-659878ae5b58 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Updated VIF entry in instance network info cache for port 27f71340-6ac0-4431-b058-f02eca4fb423. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:53:54 compute-0 nova_compute[187185]: 2025-11-29 07:53:54.048 187189 DEBUG nova.network.neutron [req-040c1996-bf88-47e8-86b2-d236926e8da5 req-c5a7c5b3-9899-4927-a6c5-659878ae5b58 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Updating instance_info_cache with network_info: [{"id": "27f71340-6ac0-4431-b058-f02eca4fb423", "address": "fa:16:3e:0b:ca:be", "network": {"id": "8429e89c-8540-4db3-b6b2-48775311a13d", "bridge": "br-int", "label": "tempest-network-smoke--2013284958", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27f71340-6a", "ovs_interfaceid": "27f71340-6ac0-4431-b058-f02eca4fb423", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:53:54 compute-0 nova_compute[187185]: 2025-11-29 07:53:54.077 187189 DEBUG oslo_concurrency.lockutils [req-040c1996-bf88-47e8-86b2-d236926e8da5 req-c5a7c5b3-9899-4927-a6c5-659878ae5b58 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-43cb7661-81a7-4e91-96aa-5d72329a58b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:53:55 compute-0 nova_compute[187185]: 2025-11-29 07:53:55.244 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:55 compute-0 nova_compute[187185]: 2025-11-29 07:53:55.297 187189 DEBUG nova.compute.manager [req-bb510414-1396-4ad8-bb67-d5493132baee req-91990aad-e301-4d44-ba24-4f8485b0a325 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Received event network-vif-plugged-27f71340-6ac0-4431-b058-f02eca4fb423 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:53:55 compute-0 nova_compute[187185]: 2025-11-29 07:53:55.297 187189 DEBUG oslo_concurrency.lockutils [req-bb510414-1396-4ad8-bb67-d5493132baee req-91990aad-e301-4d44-ba24-4f8485b0a325 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "43cb7661-81a7-4e91-96aa-5d72329a58b7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:53:55 compute-0 nova_compute[187185]: 2025-11-29 07:53:55.297 187189 DEBUG oslo_concurrency.lockutils [req-bb510414-1396-4ad8-bb67-d5493132baee req-91990aad-e301-4d44-ba24-4f8485b0a325 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "43cb7661-81a7-4e91-96aa-5d72329a58b7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:53:55 compute-0 nova_compute[187185]: 2025-11-29 07:53:55.298 187189 DEBUG oslo_concurrency.lockutils [req-bb510414-1396-4ad8-bb67-d5493132baee req-91990aad-e301-4d44-ba24-4f8485b0a325 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "43cb7661-81a7-4e91-96aa-5d72329a58b7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:53:55 compute-0 nova_compute[187185]: 2025-11-29 07:53:55.298 187189 DEBUG nova.compute.manager [req-bb510414-1396-4ad8-bb67-d5493132baee req-91990aad-e301-4d44-ba24-4f8485b0a325 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] No waiting events found dispatching network-vif-plugged-27f71340-6ac0-4431-b058-f02eca4fb423 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:53:55 compute-0 nova_compute[187185]: 2025-11-29 07:53:55.298 187189 WARNING nova.compute.manager [req-bb510414-1396-4ad8-bb67-d5493132baee req-91990aad-e301-4d44-ba24-4f8485b0a325 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Received unexpected event network-vif-plugged-27f71340-6ac0-4431-b058-f02eca4fb423 for instance with vm_state active and task_state None.
Nov 29 07:53:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:53:57.040 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=72, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=71) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:53:57 compute-0 nova_compute[187185]: 2025-11-29 07:53:57.040 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:57 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:53:57.041 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:53:57 compute-0 nova_compute[187185]: 2025-11-29 07:53:57.222 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:57 compute-0 NetworkManager[55227]: <info>  [1764402837.2231] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/315)
Nov 29 07:53:57 compute-0 NetworkManager[55227]: <info>  [1764402837.2244] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/316)
Nov 29 07:53:57 compute-0 nova_compute[187185]: 2025-11-29 07:53:57.326 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:57 compute-0 ovn_controller[95281]: 2025-11-29T07:53:57Z|00604|binding|INFO|Releasing lport 09e63821-cfdc-4962-ad09-7970b232d886 from this chassis (sb_readonly=0)
Nov 29 07:53:57 compute-0 nova_compute[187185]: 2025-11-29 07:53:57.343 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:58 compute-0 nova_compute[187185]: 2025-11-29 07:53:58.021 187189 DEBUG nova.compute.manager [req-522a73a7-e38f-4843-a259-eaf43358c8cc req-dfe425a7-7bb4-4aaa-9d39-1809eba02946 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Received event network-changed-27f71340-6ac0-4431-b058-f02eca4fb423 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:53:58 compute-0 nova_compute[187185]: 2025-11-29 07:53:58.022 187189 DEBUG nova.compute.manager [req-522a73a7-e38f-4843-a259-eaf43358c8cc req-dfe425a7-7bb4-4aaa-9d39-1809eba02946 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Refreshing instance network info cache due to event network-changed-27f71340-6ac0-4431-b058-f02eca4fb423. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:53:58 compute-0 nova_compute[187185]: 2025-11-29 07:53:58.022 187189 DEBUG oslo_concurrency.lockutils [req-522a73a7-e38f-4843-a259-eaf43358c8cc req-dfe425a7-7bb4-4aaa-9d39-1809eba02946 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-43cb7661-81a7-4e91-96aa-5d72329a58b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:53:58 compute-0 nova_compute[187185]: 2025-11-29 07:53:58.023 187189 DEBUG oslo_concurrency.lockutils [req-522a73a7-e38f-4843-a259-eaf43358c8cc req-dfe425a7-7bb4-4aaa-9d39-1809eba02946 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-43cb7661-81a7-4e91-96aa-5d72329a58b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:53:58 compute-0 nova_compute[187185]: 2025-11-29 07:53:58.023 187189 DEBUG nova.network.neutron [req-522a73a7-e38f-4843-a259-eaf43358c8cc req-dfe425a7-7bb4-4aaa-9d39-1809eba02946 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Refreshing network info cache for port 27f71340-6ac0-4431-b058-f02eca4fb423 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:53:58 compute-0 nova_compute[187185]: 2025-11-29 07:53:58.718 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:53:58 compute-0 podman[248517]: 2025-11-29 07:53:58.796572189 +0000 UTC m=+0.056515879 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 07:53:59 compute-0 nova_compute[187185]: 2025-11-29 07:53:59.325 187189 DEBUG nova.network.neutron [req-522a73a7-e38f-4843-a259-eaf43358c8cc req-dfe425a7-7bb4-4aaa-9d39-1809eba02946 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Updated VIF entry in instance network info cache for port 27f71340-6ac0-4431-b058-f02eca4fb423. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:53:59 compute-0 nova_compute[187185]: 2025-11-29 07:53:59.326 187189 DEBUG nova.network.neutron [req-522a73a7-e38f-4843-a259-eaf43358c8cc req-dfe425a7-7bb4-4aaa-9d39-1809eba02946 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Updating instance_info_cache with network_info: [{"id": "27f71340-6ac0-4431-b058-f02eca4fb423", "address": "fa:16:3e:0b:ca:be", "network": {"id": "8429e89c-8540-4db3-b6b2-48775311a13d", "bridge": "br-int", "label": "tempest-network-smoke--2013284958", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27f71340-6a", "ovs_interfaceid": "27f71340-6ac0-4431-b058-f02eca4fb423", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:53:59 compute-0 nova_compute[187185]: 2025-11-29 07:53:59.350 187189 DEBUG oslo_concurrency.lockutils [req-522a73a7-e38f-4843-a259-eaf43358c8cc req-dfe425a7-7bb4-4aaa-9d39-1809eba02946 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-43cb7661-81a7-4e91-96aa-5d72329a58b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:54:00 compute-0 nova_compute[187185]: 2025-11-29 07:54:00.247 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:54:00 compute-0 podman[248541]: 2025-11-29 07:54:00.808002998 +0000 UTC m=+0.074310916 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 07:54:00 compute-0 podman[248542]: 2025-11-29 07:54:00.85443739 +0000 UTC m=+0.111360071 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute)
Nov 29 07:54:01 compute-0 sshd-session[248539]: Invalid user dangulo from 20.255.62.58 port 54272
Nov 29 07:54:01 compute-0 sshd-session[248539]: Received disconnect from 20.255.62.58 port 54272:11: Bye Bye [preauth]
Nov 29 07:54:01 compute-0 sshd-session[248539]: Disconnected from invalid user dangulo 20.255.62.58 port 54272 [preauth]
Nov 29 07:54:03 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:54:03.045 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '72'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:54:03 compute-0 nova_compute[187185]: 2025-11-29 07:54:03.721 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:54:05 compute-0 nova_compute[187185]: 2025-11-29 07:54:05.252 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:54:05 compute-0 ovn_controller[95281]: 2025-11-29T07:54:05Z|00083|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0b:ca:be 10.100.0.11
Nov 29 07:54:05 compute-0 ovn_controller[95281]: 2025-11-29T07:54:05Z|00084|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0b:ca:be 10.100.0.11
Nov 29 07:54:08 compute-0 nova_compute[187185]: 2025-11-29 07:54:08.724 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:54:10 compute-0 nova_compute[187185]: 2025-11-29 07:54:10.255 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:54:11 compute-0 podman[248601]: 2025-11-29 07:54:11.817666638 +0000 UTC m=+0.072933077 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 07:54:11 compute-0 podman[248599]: 2025-11-29 07:54:11.826734566 +0000 UTC m=+0.095355695 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 29 07:54:11 compute-0 podman[248600]: 2025-11-29 07:54:11.83778415 +0000 UTC m=+0.092043771 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, name=ubi9-minimal, architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, managed_by=edpm_ansible, release=1755695350, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.openshift.expose-services=)
Nov 29 07:54:13 compute-0 nova_compute[187185]: 2025-11-29 07:54:13.728 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:54:15 compute-0 nova_compute[187185]: 2025-11-29 07:54:15.258 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:54:18 compute-0 nova_compute[187185]: 2025-11-29 07:54:18.731 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:54:20 compute-0 nova_compute[187185]: 2025-11-29 07:54:20.259 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:54:20 compute-0 podman[248660]: 2025-11-29 07:54:20.845211863 +0000 UTC m=+0.112276607 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 07:54:23 compute-0 nova_compute[187185]: 2025-11-29 07:54:23.735 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:54:25 compute-0 nova_compute[187185]: 2025-11-29 07:54:25.299 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:54:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:54:25.754 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:54:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:54:25.755 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:54:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:54:25.756 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:54:27 compute-0 ovn_controller[95281]: 2025-11-29T07:54:27Z|00605|memory_trim|INFO|Detected inactivity (last active 30008 ms ago): trimming memory
Nov 29 07:54:28 compute-0 sshd-session[248687]: Invalid user kingbase from 190.181.27.27 port 36736
Nov 29 07:54:28 compute-0 sshd-session[248687]: Received disconnect from 190.181.27.27 port 36736:11: Bye Bye [preauth]
Nov 29 07:54:28 compute-0 sshd-session[248687]: Disconnected from invalid user kingbase 190.181.27.27 port 36736 [preauth]
Nov 29 07:54:28 compute-0 nova_compute[187185]: 2025-11-29 07:54:28.741 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:54:29 compute-0 podman[248689]: 2025-11-29 07:54:29.798150076 +0000 UTC m=+0.062053510 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 07:54:30 compute-0 nova_compute[187185]: 2025-11-29 07:54:30.303 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:54:31 compute-0 podman[248715]: 2025-11-29 07:54:31.802704439 +0000 UTC m=+0.065310762 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 29 07:54:31 compute-0 podman[248716]: 2025-11-29 07:54:31.815428872 +0000 UTC m=+0.071469768 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:54:33 compute-0 nova_compute[187185]: 2025-11-29 07:54:33.743 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:54:35 compute-0 nova_compute[187185]: 2025-11-29 07:54:35.307 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:54:35 compute-0 nova_compute[187185]: 2025-11-29 07:54:35.482 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:54:36 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:54:36.680 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=73, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=72) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:54:36 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:54:36.681 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:54:36 compute-0 nova_compute[187185]: 2025-11-29 07:54:36.681 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:54:36 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:54:36.682 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '73'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:54:37 compute-0 nova_compute[187185]: 2025-11-29 07:54:37.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:54:38 compute-0 nova_compute[187185]: 2025-11-29 07:54:38.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:54:38 compute-0 nova_compute[187185]: 2025-11-29 07:54:38.747 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:54:39 compute-0 nova_compute[187185]: 2025-11-29 07:54:39.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:54:39 compute-0 nova_compute[187185]: 2025-11-29 07:54:39.345 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:54:39 compute-0 nova_compute[187185]: 2025-11-29 07:54:39.346 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:54:39 compute-0 nova_compute[187185]: 2025-11-29 07:54:39.346 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:54:39 compute-0 nova_compute[187185]: 2025-11-29 07:54:39.347 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:54:39 compute-0 nova_compute[187185]: 2025-11-29 07:54:39.435 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/43cb7661-81a7-4e91-96aa-5d72329a58b7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:54:39 compute-0 nova_compute[187185]: 2025-11-29 07:54:39.513 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/43cb7661-81a7-4e91-96aa-5d72329a58b7/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:54:39 compute-0 nova_compute[187185]: 2025-11-29 07:54:39.514 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/43cb7661-81a7-4e91-96aa-5d72329a58b7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:54:39 compute-0 nova_compute[187185]: 2025-11-29 07:54:39.582 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/43cb7661-81a7-4e91-96aa-5d72329a58b7/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:54:39 compute-0 nova_compute[187185]: 2025-11-29 07:54:39.743 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:54:39 compute-0 nova_compute[187185]: 2025-11-29 07:54:39.745 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5554MB free_disk=73.21699523925781GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:54:39 compute-0 nova_compute[187185]: 2025-11-29 07:54:39.746 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:54:39 compute-0 nova_compute[187185]: 2025-11-29 07:54:39.746 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:54:39 compute-0 nova_compute[187185]: 2025-11-29 07:54:39.816 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance 43cb7661-81a7-4e91-96aa-5d72329a58b7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 07:54:39 compute-0 nova_compute[187185]: 2025-11-29 07:54:39.818 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:54:39 compute-0 nova_compute[187185]: 2025-11-29 07:54:39.818 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:54:39 compute-0 nova_compute[187185]: 2025-11-29 07:54:39.868 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:54:39 compute-0 nova_compute[187185]: 2025-11-29 07:54:39.891 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:54:39 compute-0 nova_compute[187185]: 2025-11-29 07:54:39.921 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:54:39 compute-0 nova_compute[187185]: 2025-11-29 07:54:39.922 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.175s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:54:40 compute-0 nova_compute[187185]: 2025-11-29 07:54:40.309 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:54:41 compute-0 nova_compute[187185]: 2025-11-29 07:54:41.923 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:54:41 compute-0 nova_compute[187185]: 2025-11-29 07:54:41.923 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:54:42 compute-0 podman[248764]: 2025-11-29 07:54:42.802098368 +0000 UTC m=+0.061946446 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, release=1755695350, architecture=x86_64, build-date=2025-08-20T13:12:41, version=9.6, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 29 07:54:42 compute-0 podman[248765]: 2025-11-29 07:54:42.812145665 +0000 UTC m=+0.068503234 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 07:54:42 compute-0 podman[248763]: 2025-11-29 07:54:42.839976448 +0000 UTC m=+0.094487524 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 29 07:54:43 compute-0 nova_compute[187185]: 2025-11-29 07:54:43.750 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:54:45 compute-0 nova_compute[187185]: 2025-11-29 07:54:45.313 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:54:46 compute-0 nova_compute[187185]: 2025-11-29 07:54:46.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:54:46 compute-0 nova_compute[187185]: 2025-11-29 07:54:46.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:54:46 compute-0 nova_compute[187185]: 2025-11-29 07:54:46.318 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:54:46 compute-0 nova_compute[187185]: 2025-11-29 07:54:46.960 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "refresh_cache-43cb7661-81a7-4e91-96aa-5d72329a58b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:54:46 compute-0 nova_compute[187185]: 2025-11-29 07:54:46.960 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquired lock "refresh_cache-43cb7661-81a7-4e91-96aa-5d72329a58b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:54:46 compute-0 nova_compute[187185]: 2025-11-29 07:54:46.961 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 29 07:54:46 compute-0 nova_compute[187185]: 2025-11-29 07:54:46.961 187189 DEBUG nova.objects.instance [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 43cb7661-81a7-4e91-96aa-5d72329a58b7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.026 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '43cb7661-81a7-4e91-96aa-5d72329a58b7', 'name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-914474573', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000b2', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'hostId': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.027 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.027 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.028 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-914474573>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-914474573>]
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.028 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.033 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 43cb7661-81a7-4e91-96aa-5d72329a58b7 / tap27f71340-6a inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.033 12 DEBUG ceilometer.compute.pollsters [-] 43cb7661-81a7-4e91-96aa-5d72329a58b7/network.incoming.packets volume: 131 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7df33af8-bfa7-4bc0-bd9a-cdad24f09742', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 131, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b2-43cb7661-81a7-4e91-96aa-5d72329a58b7-tap27f71340-6a', 'timestamp': '2025-11-29T07:54:48.028825', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-914474573', 'name': 'tap27f71340-6a', 'instance_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0b:ca:be', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap27f71340-6a'}, 'message_id': 'ad530ca4-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8335.747115157, 'message_signature': '80e1fd6dd247a375a1e012576c927f749fc20aa7da7edd6df80af4e6dadc9265'}]}, 'timestamp': '2025-11-29 07:54:48.034694', '_unique_id': '55b45c3a94cc4ab980cbceea423d1469'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.037 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.038 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.038 12 DEBUG ceilometer.compute.pollsters [-] 43cb7661-81a7-4e91-96aa-5d72329a58b7/network.outgoing.packets volume: 148 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f3a70674-79d3-47ec-a68c-49dc31c388fd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 148, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b2-43cb7661-81a7-4e91-96aa-5d72329a58b7-tap27f71340-6a', 'timestamp': '2025-11-29T07:54:48.038790', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-914474573', 'name': 'tap27f71340-6a', 'instance_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0b:ca:be', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap27f71340-6a'}, 'message_id': 'ad53cc5c-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8335.747115157, 'message_signature': '66c69b5ea8726d6c9d0429e3a2b5c86f3ccba91f36f0f5448fdb50382a9580dc'}]}, 'timestamp': '2025-11-29 07:54:48.039402', '_unique_id': '85e659fec56b4c60a34517379b4e7861'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.040 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.042 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.043 12 DEBUG ceilometer.compute.pollsters [-] 43cb7661-81a7-4e91-96aa-5d72329a58b7/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7079ff94-eb02-4c3e-82f0-a86e80c2b9d3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b2-43cb7661-81a7-4e91-96aa-5d72329a58b7-tap27f71340-6a', 'timestamp': '2025-11-29T07:54:48.043145', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-914474573', 'name': 'tap27f71340-6a', 'instance_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0b:ca:be', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap27f71340-6a'}, 'message_id': 'ad547832-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8335.747115157, 'message_signature': '4836fe7ac1a97fa43657d2c1c453be419d4f68d2e2d282ddeac0879e002344cd'}]}, 'timestamp': '2025-11-29 07:54:48.043811', '_unique_id': 'e9ae7d40aaa7464d980b0956dc95d5d1'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.045 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.047 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.068 12 DEBUG ceilometer.compute.pollsters [-] 43cb7661-81a7-4e91-96aa-5d72329a58b7/memory.usage volume: 42.640625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '066b9042-839a-4db3-98f5-253574d802be', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.640625, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7', 'timestamp': '2025-11-29T07:54:48.047355', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-914474573', 'name': 'instance-000000b2', 'instance_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'ad5852e0-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8335.78617483, 'message_signature': 'dca626cda64a1cc73f27a825fa2c46944313c1628e8bf80bfaeec3158f272e1e'}]}, 'timestamp': '2025-11-29 07:54:48.069132', '_unique_id': '474774425e27437f87875d8b5bbb08cc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.070 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.072 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.073 12 DEBUG ceilometer.compute.pollsters [-] 43cb7661-81a7-4e91-96aa-5d72329a58b7/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7cf92c38-5bbc-46ed-9715-175fe496c81a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b2-43cb7661-81a7-4e91-96aa-5d72329a58b7-tap27f71340-6a', 'timestamp': '2025-11-29T07:54:48.073012', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-914474573', 'name': 'tap27f71340-6a', 'instance_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0b:ca:be', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap27f71340-6a'}, 'message_id': 'ad5905aa-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8335.747115157, 'message_signature': '9039b3d65a35e16aa4eaf1a846979be3f4fe00198ebef05e8151e4e579609e88'}]}, 'timestamp': '2025-11-29 07:54:48.073801', '_unique_id': 'a0b555498dec4f71bb138761ac495c39'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.074 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.076 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.106 12 DEBUG ceilometer.compute.pollsters [-] 43cb7661-81a7-4e91-96aa-5d72329a58b7/disk.device.write.requests volume: 312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.107 12 DEBUG ceilometer.compute.pollsters [-] 43cb7661-81a7-4e91-96aa-5d72329a58b7/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '23cc250f-786d-4f96-9368-5ee964d61778', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 312, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7-vda', 'timestamp': '2025-11-29T07:54:48.076919', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-914474573', 'name': 'instance-000000b2', 'instance_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ad5e2602-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8335.795187887, 'message_signature': '8079a2e6ed66508545461ed2f34c9c66e7621bd9631a2ce902762c831b726028'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7-sda', 'timestamp': '2025-11-29T07:54:48.076919', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-914474573', 'name': 'instance-000000b2', 'instance_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ad5e3dae-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8335.795187887, 'message_signature': '050fb9817c17e1f632003c446f1697d3877f2a98b52c55f2b45d4c8af8776801'}]}, 'timestamp': '2025-11-29 07:54:48.107737', '_unique_id': '0aa7bd466bbf4f48a1277edf33b705ba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.109 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.111 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.111 12 DEBUG ceilometer.compute.pollsters [-] 43cb7661-81a7-4e91-96aa-5d72329a58b7/disk.device.read.requests volume: 1109 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.112 12 DEBUG ceilometer.compute.pollsters [-] 43cb7661-81a7-4e91-96aa-5d72329a58b7/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1222163e-96ef-48c5-950e-76fdfd9b0b86', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1109, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7-vda', 'timestamp': '2025-11-29T07:54:48.111695', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-914474573', 'name': 'instance-000000b2', 'instance_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ad5eecc2-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8335.795187887, 'message_signature': '0cb566b85e7b2ca3838f36e7d2fb8dd9ae2f6361f2df943b6bfa80010e6e5c21'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7-sda', 'timestamp': '2025-11-29T07:54:48.111695', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-914474573', 'name': 'instance-000000b2', 'instance_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ad5efff0-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8335.795187887, 'message_signature': 'b8de569d619b0d90b53c9c04c6196add22660d83702b7f44f58b2573d47f6a98'}]}, 'timestamp': '2025-11-29 07:54:48.112738', '_unique_id': '2c56cb5bf6f24c72975fc683ebfc668d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.114 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.115 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.115 12 DEBUG ceilometer.compute.pollsters [-] 43cb7661-81a7-4e91-96aa-5d72329a58b7/network.outgoing.bytes volume: 21036 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c524337f-af93-4042-b0cf-4b8ada57289d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 21036, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b2-43cb7661-81a7-4e91-96aa-5d72329a58b7-tap27f71340-6a', 'timestamp': '2025-11-29T07:54:48.115748', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-914474573', 'name': 'tap27f71340-6a', 'instance_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0b:ca:be', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap27f71340-6a'}, 'message_id': 'ad5f888a-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8335.747115157, 'message_signature': 'f20479187913bc36099549ae353eee5132f139a59595507c0d325ed8ebdb6e8f'}]}, 'timestamp': '2025-11-29 07:54:48.116202', '_unique_id': 'c36a2352bdfa4af69e84bd02481632fd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.117 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.118 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.119 12 DEBUG ceilometer.compute.pollsters [-] 43cb7661-81a7-4e91-96aa-5d72329a58b7/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9d512dfd-adc0-4b35-826c-842f405c333c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b2-43cb7661-81a7-4e91-96aa-5d72329a58b7-tap27f71340-6a', 'timestamp': '2025-11-29T07:54:48.119031', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-914474573', 'name': 'tap27f71340-6a', 'instance_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0b:ca:be', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap27f71340-6a'}, 'message_id': 'ad600792-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8335.747115157, 'message_signature': '80ba428b030040a779f2793c934c069084fa052f21f85125c8689aa84d65f0c1'}]}, 'timestamp': '2025-11-29 07:54:48.119464', '_unique_id': '168af32379da4ab289545717cc76e14d'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.120 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.121 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.133 12 DEBUG ceilometer.compute.pollsters [-] 43cb7661-81a7-4e91-96aa-5d72329a58b7/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.134 12 DEBUG ceilometer.compute.pollsters [-] 43cb7661-81a7-4e91-96aa-5d72329a58b7/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '82f50236-abcb-4930-b416-58b0ada59047', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7-vda', 'timestamp': '2025-11-29T07:54:48.121482', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-914474573', 'name': 'instance-000000b2', 'instance_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ad62597a-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8335.839739977, 'message_signature': '6066dd062a49817dce556b3628f8d8e5af34dea0af5eb3e3f52305a8232fd092'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 
'project_name': None, 'resource_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7-sda', 'timestamp': '2025-11-29T07:54:48.121482', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-914474573', 'name': 'instance-000000b2', 'instance_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ad626c1c-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8335.839739977, 'message_signature': 'c7b9fac20f9fbd1e6b428942b7077958b5ef5291e457d10e8bca5e84e8501532'}]}, 'timestamp': '2025-11-29 07:54:48.135127', '_unique_id': '41621ba72b93441dadee7985bf8ae584'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.136 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.137 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.138 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.138 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-914474573>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-914474573>]
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.138 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.138 12 DEBUG ceilometer.compute.pollsters [-] 43cb7661-81a7-4e91-96aa-5d72329a58b7/disk.device.write.bytes volume: 72990720 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.138 12 DEBUG ceilometer.compute.pollsters [-] 43cb7661-81a7-4e91-96aa-5d72329a58b7/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4971b8eb-8528-4693-926c-354094c1ad7f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72990720, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7-vda', 'timestamp': '2025-11-29T07:54:48.138548', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-914474573', 'name': 'instance-000000b2', 'instance_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ad6300be-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8335.795187887, 'message_signature': '28cbda3837dd7a0a95a164baa0f5fb3d2454407b4757eb1cb6d2f3838a90a9cd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 
'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7-sda', 'timestamp': '2025-11-29T07:54:48.138548', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-914474573', 'name': 'instance-000000b2', 'instance_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ad630f14-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8335.795187887, 'message_signature': 'f4553e74d0038852d531e359d19e3650710b5ea7c060f1ba1affea4929110c9c'}]}, 'timestamp': '2025-11-29 07:54:48.139277', '_unique_id': '7d7ca3becd7641b696e6caedf5640188'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.140 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.141 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.141 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.141 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-914474573>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-914474573>]
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.141 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.141 12 DEBUG ceilometer.compute.pollsters [-] 43cb7661-81a7-4e91-96aa-5d72329a58b7/disk.device.write.latency volume: 3299312727 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 DEBUG ceilometer.compute.pollsters [-] 43cb7661-81a7-4e91-96aa-5d72329a58b7/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b90d7bcc-48c7-4d95-9713-6e66ed0c3313', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3299312727, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7-vda', 'timestamp': '2025-11-29T07:54:48.141796', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-914474573', 'name': 'instance-000000b2', 'instance_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ad637daa-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8335.795187887, 'message_signature': '5d19e4c5ff7d3df0ff6e2f68856801218465a4de1637570a85a06a6702c5c74c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 
'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7-sda', 'timestamp': '2025-11-29T07:54:48.141796', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-914474573', 'name': 'instance-000000b2', 'instance_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ad6386f6-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8335.795187887, 'message_signature': '0ace137ae359665961bb5352afa6a73f6b84f54757a7a30a39319a34b2cc66a1'}]}, 'timestamp': '2025-11-29 07:54:48.142304', '_unique_id': '4787b294e09f47eb9e082e2efecedb23'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.142 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.143 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.143 12 DEBUG ceilometer.compute.pollsters [-] 43cb7661-81a7-4e91-96aa-5d72329a58b7/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 DEBUG ceilometer.compute.pollsters [-] 43cb7661-81a7-4e91-96aa-5d72329a58b7/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3304799e-d6fd-4704-bcf1-f5bdf92bbb5d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7-vda', 'timestamp': '2025-11-29T07:54:48.143784', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-914474573', 'name': 'instance-000000b2', 'instance_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ad63cb52-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8335.839739977, 'message_signature': 'cb05ed59f2fc3b737e34ab4201154275213e51b3c289efa0731e8f5211de5df3'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7-sda', 'timestamp': '2025-11-29T07:54:48.143784', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-914474573', 'name': 'instance-000000b2', 'instance_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ad63d4bc-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8335.839739977, 'message_signature': 'c54fddc80ef70d7c105c29f168bd9099b37d3b61cfd96fb553c534b49e79283e'}]}, 'timestamp': '2025-11-29 07:54:48.144294', '_unique_id': '1e0149b66c494f6b81fffd00fed361af'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.144 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.145 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.145 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.146 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-914474573>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-914474573>]
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.146 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.146 12 DEBUG ceilometer.compute.pollsters [-] 43cb7661-81a7-4e91-96aa-5d72329a58b7/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bc6764c4-c8c0-49b4-8a4b-826ecdb6879f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b2-43cb7661-81a7-4e91-96aa-5d72329a58b7-tap27f71340-6a', 'timestamp': '2025-11-29T07:54:48.146397', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-914474573', 'name': 'tap27f71340-6a', 'instance_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0b:ca:be', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap27f71340-6a'}, 'message_id': 'ad6432f4-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8335.747115157, 'message_signature': '19f47dd26ef17cd5ee06a22a9f31f6d3161bbdb3ba42dcedbdced37a3072c802'}]}, 'timestamp': '2025-11-29 07:54:48.146759', '_unique_id': '808a92757cfa4e798e7cefb6b31c2e73'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.147 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.148 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.148 12 DEBUG ceilometer.compute.pollsters [-] 43cb7661-81a7-4e91-96aa-5d72329a58b7/disk.device.read.bytes volume: 30591488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 DEBUG ceilometer.compute.pollsters [-] 43cb7661-81a7-4e91-96aa-5d72329a58b7/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b8f42b92-b6f2-4028-a3cc-1e4468259b37', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30591488, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7-vda', 'timestamp': '2025-11-29T07:54:48.148681', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-914474573', 'name': 'instance-000000b2', 'instance_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ad648cb8-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8335.795187887, 'message_signature': '5aa827ad2843f7f8f8012aaa1d2ee1213f346473ad67fd75eea831b47050ea73'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7-sda', 'timestamp': '2025-11-29T07:54:48.148681', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-914474573', 'name': 'instance-000000b2', 'instance_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ad6498ca-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8335.795187887, 'message_signature': 'dde238a61d8d2022fe6d413cb9d497ecd73a6d37ff67537d2809ec28c4c6e198'}]}, 'timestamp': '2025-11-29 07:54:48.149317', '_unique_id': '7965734f20134260b8925e409d6a82c4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.149 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.150 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 DEBUG ceilometer.compute.pollsters [-] 43cb7661-81a7-4e91-96aa-5d72329a58b7/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '01a4bfdf-c97b-469c-b855-dba08ea7aa92', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b2-43cb7661-81a7-4e91-96aa-5d72329a58b7-tap27f71340-6a', 'timestamp': '2025-11-29T07:54:48.151030', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-914474573', 'name': 'tap27f71340-6a', 'instance_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0b:ca:be', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap27f71340-6a'}, 'message_id': 'ad64e64a-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8335.747115157, 'message_signature': '9ee38c23a6bb90bb90cf979d28179b012b04cfb8484cbc96b38575c13e86e38a'}]}, 'timestamp': '2025-11-29 07:54:48.151319', '_unique_id': '40d5f3d50b004c5a95cb3cb3bf1def1f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.151 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.152 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.152 12 DEBUG ceilometer.compute.pollsters [-] 43cb7661-81a7-4e91-96aa-5d72329a58b7/network.incoming.bytes volume: 24692 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ae529762-fb7e-442c-8090-ac057956d4aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 24692, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b2-43cb7661-81a7-4e91-96aa-5d72329a58b7-tap27f71340-6a', 'timestamp': '2025-11-29T07:54:48.152931', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-914474573', 'name': 'tap27f71340-6a', 'instance_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0b:ca:be', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap27f71340-6a'}, 'message_id': 'ad653226-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8335.747115157, 'message_signature': '4e608c91b660873ea75ef0935d8ecc81c28e586ccb9cb1dc380c90145ad65326'}]}, 'timestamp': '2025-11-29 07:54:48.153292', '_unique_id': '301ac053bf2e4ff1a09ebe32b80bc247'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.153 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.154 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.155 12 DEBUG ceilometer.compute.pollsters [-] 43cb7661-81a7-4e91-96aa-5d72329a58b7/disk.device.allocation volume: 30613504 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.155 12 DEBUG ceilometer.compute.pollsters [-] 43cb7661-81a7-4e91-96aa-5d72329a58b7/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9d11f2aa-e338-4ca0-b943-4f97cd80698f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30613504, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7-vda', 'timestamp': '2025-11-29T07:54:48.155014', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-914474573', 'name': 'instance-000000b2', 'instance_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ad6581c2-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8335.839739977, 'message_signature': '2aea94a1b1827474f39c95f2712410a6ce1448cb8595aaa6827fcbdbb1a2ab42'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7-sda', 'timestamp': '2025-11-29T07:54:48.155014', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-914474573', 'name': 'instance-000000b2', 'instance_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ad658b2c-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8335.839739977, 'message_signature': 'de7b1e85f703b4e488c83cb03d0df2a111d4353c2d334f04f98e6c75417ca862'}]}, 'timestamp': '2025-11-29 07:54:48.155519', '_unique_id': '1cdcfc2bcbbc47889758284682f2ed4c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.156 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 DEBUG ceilometer.compute.pollsters [-] 43cb7661-81a7-4e91-96aa-5d72329a58b7/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9f133c3a-a2be-463a-9291-bfc4dda7b903', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b2-43cb7661-81a7-4e91-96aa-5d72329a58b7-tap27f71340-6a', 'timestamp': '2025-11-29T07:54:48.157022', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-914474573', 'name': 'tap27f71340-6a', 'instance_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0b:ca:be', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap27f71340-6a'}, 'message_id': 'ad65cfec-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8335.747115157, 'message_signature': '9c2c3e09585b1a6d16d024ca597021189c4831465854357c82ef1458b61d207c'}]}, 'timestamp': '2025-11-29 07:54:48.157310', '_unique_id': '89ea4bf7f3874bc4932c49f0d924a47a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.157 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.158 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.158 12 DEBUG ceilometer.compute.pollsters [-] 43cb7661-81a7-4e91-96aa-5d72329a58b7/disk.device.read.latency volume: 462362168 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 DEBUG ceilometer.compute.pollsters [-] 43cb7661-81a7-4e91-96aa-5d72329a58b7/disk.device.read.latency volume: 24255625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6eee6978-336c-4780-a679-868514024e0d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 462362168, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7-vda', 'timestamp': '2025-11-29T07:54:48.158782', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-914474573', 'name': 'instance-000000b2', 'instance_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ad661524-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8335.795187887, 'message_signature': '64be61389f4f01e7438418026873fbf6f9e94ac627f0f6a53f59803c34fe1627'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24255625, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 
'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7-sda', 'timestamp': '2025-11-29T07:54:48.158782', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-914474573', 'name': 'instance-000000b2', 'instance_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ad661ea2-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8335.795187887, 'message_signature': 'f6f6d06b6257c4bb02295055eb23e2bdad4b3bff77223656b3f52c88b8586ce4'}]}, 'timestamp': '2025-11-29 07:54:48.159295', '_unique_id': 'd3ba567f251a4efeb2bab68173d75055'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.159 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.160 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.160 12 DEBUG ceilometer.compute.pollsters [-] 43cb7661-81a7-4e91-96aa-5d72329a58b7/cpu volume: 12720000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd6fe3e3e-3641-4f88-8408-3abd99072d74', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12720000000, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7', 'timestamp': '2025-11-29T07:54:48.160784', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-914474573', 'name': 'instance-000000b2', 'instance_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'ad666556-ccf8-11f0-8f64-fa163e220349', 'monotonic_time': 8335.78617483, 'message_signature': '2f7100c2b95aa32a05bed79323ec6c08003f214d01ece25ae9df6f18b21b5bb5'}]}, 'timestamp': '2025-11-29 07:54:48.161141', '_unique_id': 'a3d39122028b447daf282afcafb694fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:54:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:54:48.161 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:54:48 compute-0 nova_compute[187185]: 2025-11-29 07:54:48.824 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:54:50 compute-0 nova_compute[187185]: 2025-11-29 07:54:50.317 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:54:50 compute-0 nova_compute[187185]: 2025-11-29 07:54:50.993 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Updating instance_info_cache with network_info: [{"id": "27f71340-6ac0-4431-b058-f02eca4fb423", "address": "fa:16:3e:0b:ca:be", "network": {"id": "8429e89c-8540-4db3-b6b2-48775311a13d", "bridge": "br-int", "label": "tempest-network-smoke--2013284958", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27f71340-6a", "ovs_interfaceid": "27f71340-6ac0-4431-b058-f02eca4fb423", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:54:51 compute-0 nova_compute[187185]: 2025-11-29 07:54:51.116 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Releasing lock "refresh_cache-43cb7661-81a7-4e91-96aa-5d72329a58b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:54:51 compute-0 nova_compute[187185]: 2025-11-29 07:54:51.117 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 29 07:54:51 compute-0 nova_compute[187185]: 2025-11-29 07:54:51.117 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:54:51 compute-0 nova_compute[187185]: 2025-11-29 07:54:51.117 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:54:51 compute-0 podman[248826]: 2025-11-29 07:54:51.86912726 +0000 UTC m=+0.136971245 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller)
Nov 29 07:54:52 compute-0 nova_compute[187185]: 2025-11-29 07:54:52.111 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:54:53 compute-0 nova_compute[187185]: 2025-11-29 07:54:53.827 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:54:55 compute-0 nova_compute[187185]: 2025-11-29 07:54:55.321 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:54:58 compute-0 nova_compute[187185]: 2025-11-29 07:54:58.831 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:55:00 compute-0 ovn_controller[95281]: 2025-11-29T07:55:00Z|00606|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Nov 29 07:55:00 compute-0 nova_compute[187185]: 2025-11-29 07:55:00.326 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:55:00 compute-0 podman[248852]: 2025-11-29 07:55:00.806080553 +0000 UTC m=+0.067608738 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 07:55:02 compute-0 podman[248878]: 2025-11-29 07:55:02.815620989 +0000 UTC m=+0.068558654 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:55:02 compute-0 podman[248877]: 2025-11-29 07:55:02.832572332 +0000 UTC m=+0.085615430 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:55:03 compute-0 nova_compute[187185]: 2025-11-29 07:55:03.311 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:55:03 compute-0 nova_compute[187185]: 2025-11-29 07:55:03.833 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:55:04 compute-0 nova_compute[187185]: 2025-11-29 07:55:04.330 187189 DEBUG nova.compute.manager [req-0b7c0ca6-2acb-4ba1-8fe8-47c36ede0276 req-614809e9-ce52-441e-8230-67fae37ded6a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Received event network-changed-27f71340-6ac0-4431-b058-f02eca4fb423 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:55:04 compute-0 nova_compute[187185]: 2025-11-29 07:55:04.330 187189 DEBUG nova.compute.manager [req-0b7c0ca6-2acb-4ba1-8fe8-47c36ede0276 req-614809e9-ce52-441e-8230-67fae37ded6a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Refreshing instance network info cache due to event network-changed-27f71340-6ac0-4431-b058-f02eca4fb423. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:55:04 compute-0 nova_compute[187185]: 2025-11-29 07:55:04.331 187189 DEBUG oslo_concurrency.lockutils [req-0b7c0ca6-2acb-4ba1-8fe8-47c36ede0276 req-614809e9-ce52-441e-8230-67fae37ded6a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-43cb7661-81a7-4e91-96aa-5d72329a58b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:55:04 compute-0 nova_compute[187185]: 2025-11-29 07:55:04.331 187189 DEBUG oslo_concurrency.lockutils [req-0b7c0ca6-2acb-4ba1-8fe8-47c36ede0276 req-614809e9-ce52-441e-8230-67fae37ded6a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-43cb7661-81a7-4e91-96aa-5d72329a58b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:55:04 compute-0 nova_compute[187185]: 2025-11-29 07:55:04.331 187189 DEBUG nova.network.neutron [req-0b7c0ca6-2acb-4ba1-8fe8-47c36ede0276 req-614809e9-ce52-441e-8230-67fae37ded6a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Refreshing network info cache for port 27f71340-6ac0-4431-b058-f02eca4fb423 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:55:04 compute-0 nova_compute[187185]: 2025-11-29 07:55:04.461 187189 DEBUG oslo_concurrency.lockutils [None req-d458ec4d-0dfd-4736-a2f6-d249c5da0ab5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "43cb7661-81a7-4e91-96aa-5d72329a58b7" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:55:04 compute-0 nova_compute[187185]: 2025-11-29 07:55:04.462 187189 DEBUG oslo_concurrency.lockutils [None req-d458ec4d-0dfd-4736-a2f6-d249c5da0ab5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "43cb7661-81a7-4e91-96aa-5d72329a58b7" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:55:04 compute-0 nova_compute[187185]: 2025-11-29 07:55:04.463 187189 DEBUG oslo_concurrency.lockutils [None req-d458ec4d-0dfd-4736-a2f6-d249c5da0ab5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "43cb7661-81a7-4e91-96aa-5d72329a58b7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:55:04 compute-0 nova_compute[187185]: 2025-11-29 07:55:04.463 187189 DEBUG oslo_concurrency.lockutils [None req-d458ec4d-0dfd-4736-a2f6-d249c5da0ab5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "43cb7661-81a7-4e91-96aa-5d72329a58b7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:55:04 compute-0 nova_compute[187185]: 2025-11-29 07:55:04.463 187189 DEBUG oslo_concurrency.lockutils [None req-d458ec4d-0dfd-4736-a2f6-d249c5da0ab5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "43cb7661-81a7-4e91-96aa-5d72329a58b7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:55:04 compute-0 nova_compute[187185]: 2025-11-29 07:55:04.486 187189 INFO nova.compute.manager [None req-d458ec4d-0dfd-4736-a2f6-d249c5da0ab5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Terminating instance
Nov 29 07:55:04 compute-0 nova_compute[187185]: 2025-11-29 07:55:04.505 187189 DEBUG nova.compute.manager [None req-d458ec4d-0dfd-4736-a2f6-d249c5da0ab5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 07:55:04 compute-0 kernel: tap27f71340-6a (unregistering): left promiscuous mode
Nov 29 07:55:04 compute-0 NetworkManager[55227]: <info>  [1764402904.5353] device (tap27f71340-6a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:55:04 compute-0 nova_compute[187185]: 2025-11-29 07:55:04.550 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:55:04 compute-0 ovn_controller[95281]: 2025-11-29T07:55:04Z|00607|binding|INFO|Releasing lport 27f71340-6ac0-4431-b058-f02eca4fb423 from this chassis (sb_readonly=0)
Nov 29 07:55:04 compute-0 ovn_controller[95281]: 2025-11-29T07:55:04Z|00608|binding|INFO|Setting lport 27f71340-6ac0-4431-b058-f02eca4fb423 down in Southbound
Nov 29 07:55:04 compute-0 ovn_controller[95281]: 2025-11-29T07:55:04Z|00609|binding|INFO|Removing iface tap27f71340-6a ovn-installed in OVS
Nov 29 07:55:04 compute-0 nova_compute[187185]: 2025-11-29 07:55:04.553 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:55:04 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:55:04.559 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0b:ca:be 10.100.0.11'], port_security=['fa:16:3e:0b:ca:be 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '43cb7661-81a7-4e91-96aa-5d72329a58b7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8429e89c-8540-4db3-b6b2-48775311a13d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'neutron:revision_number': '4', 'neutron:security_group_ids': '302ff4eb-5b37-47a5-8263-6df9580417a7 c9c03ecd-65fb-4137-bd9c-bfe8eac1c96d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ab1419e5-3fc4-47d1-a2be-d34ec9f548ab, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=27f71340-6ac0-4431-b058-f02eca4fb423) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:55:04 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:55:04.561 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 27f71340-6ac0-4431-b058-f02eca4fb423 in datapath 8429e89c-8540-4db3-b6b2-48775311a13d unbound from our chassis
Nov 29 07:55:04 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:55:04.563 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8429e89c-8540-4db3-b6b2-48775311a13d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:55:04 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:55:04.565 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[5fe6b753-2f2d-4ec5-829d-997f4746c5e7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:55:04 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:55:04.566 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8429e89c-8540-4db3-b6b2-48775311a13d namespace which is not needed anymore
Nov 29 07:55:04 compute-0 nova_compute[187185]: 2025-11-29 07:55:04.585 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:55:04 compute-0 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d000000b2.scope: Deactivated successfully.
Nov 29 07:55:04 compute-0 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d000000b2.scope: Consumed 16.478s CPU time.
Nov 29 07:55:04 compute-0 systemd-machined[153486]: Machine qemu-69-instance-000000b2 terminated.
Nov 29 07:55:04 compute-0 neutron-haproxy-ovnmeta-8429e89c-8540-4db3-b6b2-48775311a13d[248501]: [NOTICE]   (248505) : haproxy version is 2.8.14-c23fe91
Nov 29 07:55:04 compute-0 neutron-haproxy-ovnmeta-8429e89c-8540-4db3-b6b2-48775311a13d[248501]: [NOTICE]   (248505) : path to executable is /usr/sbin/haproxy
Nov 29 07:55:04 compute-0 neutron-haproxy-ovnmeta-8429e89c-8540-4db3-b6b2-48775311a13d[248501]: [WARNING]  (248505) : Exiting Master process...
Nov 29 07:55:04 compute-0 neutron-haproxy-ovnmeta-8429e89c-8540-4db3-b6b2-48775311a13d[248501]: [ALERT]    (248505) : Current worker (248507) exited with code 143 (Terminated)
Nov 29 07:55:04 compute-0 neutron-haproxy-ovnmeta-8429e89c-8540-4db3-b6b2-48775311a13d[248501]: [WARNING]  (248505) : All workers exited. Exiting... (0)
Nov 29 07:55:04 compute-0 systemd[1]: libpod-21f9cefb932825ef1ffd89978c3e42c3aa2915bd11dff3d8f5308f6db84d9975.scope: Deactivated successfully.
Nov 29 07:55:04 compute-0 podman[248942]: 2025-11-29 07:55:04.809608053 +0000 UTC m=+0.082334338 container died 21f9cefb932825ef1ffd89978c3e42c3aa2915bd11dff3d8f5308f6db84d9975 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8429e89c-8540-4db3-b6b2-48775311a13d, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 29 07:55:04 compute-0 nova_compute[187185]: 2025-11-29 07:55:04.811 187189 INFO nova.virt.libvirt.driver [-] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Instance destroyed successfully.
Nov 29 07:55:04 compute-0 nova_compute[187185]: 2025-11-29 07:55:04.812 187189 DEBUG nova.objects.instance [None req-d458ec4d-0dfd-4736-a2f6-d249c5da0ab5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lazy-loading 'resources' on Instance uuid 43cb7661-81a7-4e91-96aa-5d72329a58b7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:55:04 compute-0 nova_compute[187185]: 2025-11-29 07:55:04.845 187189 DEBUG nova.virt.libvirt.vif [None req-d458ec4d-0dfd-4736-a2f6-d249c5da0ab5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:53:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-914474573',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-914474573',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2022058758-ac',id=178,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDa1HqYHK6Fbyj4+WZlz/MxWl+SfeIBiBlR8m/oY3Vy4Q3n28dGa98Jt6Jmq1CjInsdStO6SA1dYTN5Q75hPAjlWGa44sox6aoYIWGLELZYzttGtittaInjJUuQncR/BLQ==',key_name='tempest-TestSecurityGroupsBasicOps-2142300595',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:53:53Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e8e45e91223b45a79dd698a82af4a2a5',ramdisk_id='',reservation_id='r-hij45f5r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-2022058758',owner_user_name='tempest-TestSecurityGroupsBasicOps-2022058758-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:53:53Z,user_data=None,user_id='dec30fbde18e4b2382ea2c59847d067f',uuid=43cb7661-81a7-4e91-96aa-5d72329a58b7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "27f71340-6ac0-4431-b058-f02eca4fb423", "address": "fa:16:3e:0b:ca:be", "network": {"id": "8429e89c-8540-4db3-b6b2-48775311a13d", "bridge": "br-int", "label": "tempest-network-smoke--2013284958", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27f71340-6a", "ovs_interfaceid": "27f71340-6ac0-4431-b058-f02eca4fb423", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:55:04 compute-0 nova_compute[187185]: 2025-11-29 07:55:04.848 187189 DEBUG nova.network.os_vif_util [None req-d458ec4d-0dfd-4736-a2f6-d249c5da0ab5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converting VIF {"id": "27f71340-6ac0-4431-b058-f02eca4fb423", "address": "fa:16:3e:0b:ca:be", "network": {"id": "8429e89c-8540-4db3-b6b2-48775311a13d", "bridge": "br-int", "label": "tempest-network-smoke--2013284958", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27f71340-6a", "ovs_interfaceid": "27f71340-6ac0-4431-b058-f02eca4fb423", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:55:04 compute-0 nova_compute[187185]: 2025-11-29 07:55:04.849 187189 DEBUG nova.network.os_vif_util [None req-d458ec4d-0dfd-4736-a2f6-d249c5da0ab5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0b:ca:be,bridge_name='br-int',has_traffic_filtering=True,id=27f71340-6ac0-4431-b058-f02eca4fb423,network=Network(8429e89c-8540-4db3-b6b2-48775311a13d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27f71340-6a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:55:04 compute-0 nova_compute[187185]: 2025-11-29 07:55:04.850 187189 DEBUG os_vif [None req-d458ec4d-0dfd-4736-a2f6-d249c5da0ab5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0b:ca:be,bridge_name='br-int',has_traffic_filtering=True,id=27f71340-6ac0-4431-b058-f02eca4fb423,network=Network(8429e89c-8540-4db3-b6b2-48775311a13d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27f71340-6a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:55:04 compute-0 nova_compute[187185]: 2025-11-29 07:55:04.854 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:55:04 compute-0 systemd[1]: var-lib-containers-storage-overlay-326761c22494d9d9deefcf31147889af1a48e495a396cd410e351f04fb70704a-merged.mount: Deactivated successfully.
Nov 29 07:55:04 compute-0 nova_compute[187185]: 2025-11-29 07:55:04.855 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap27f71340-6a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:55:04 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-21f9cefb932825ef1ffd89978c3e42c3aa2915bd11dff3d8f5308f6db84d9975-userdata-shm.mount: Deactivated successfully.
Nov 29 07:55:04 compute-0 nova_compute[187185]: 2025-11-29 07:55:04.857 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:55:04 compute-0 nova_compute[187185]: 2025-11-29 07:55:04.859 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:55:04 compute-0 podman[248942]: 2025-11-29 07:55:04.86563637 +0000 UTC m=+0.138362625 container cleanup 21f9cefb932825ef1ffd89978c3e42c3aa2915bd11dff3d8f5308f6db84d9975 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8429e89c-8540-4db3-b6b2-48775311a13d, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:55:04 compute-0 nova_compute[187185]: 2025-11-29 07:55:04.865 187189 INFO os_vif [None req-d458ec4d-0dfd-4736-a2f6-d249c5da0ab5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0b:ca:be,bridge_name='br-int',has_traffic_filtering=True,id=27f71340-6ac0-4431-b058-f02eca4fb423,network=Network(8429e89c-8540-4db3-b6b2-48775311a13d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27f71340-6a')
Nov 29 07:55:04 compute-0 nova_compute[187185]: 2025-11-29 07:55:04.866 187189 INFO nova.virt.libvirt.driver [None req-d458ec4d-0dfd-4736-a2f6-d249c5da0ab5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Deleting instance files /var/lib/nova/instances/43cb7661-81a7-4e91-96aa-5d72329a58b7_del
Nov 29 07:55:04 compute-0 nova_compute[187185]: 2025-11-29 07:55:04.867 187189 INFO nova.virt.libvirt.driver [None req-d458ec4d-0dfd-4736-a2f6-d249c5da0ab5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Deletion of /var/lib/nova/instances/43cb7661-81a7-4e91-96aa-5d72329a58b7_del complete
Nov 29 07:55:04 compute-0 systemd[1]: libpod-conmon-21f9cefb932825ef1ffd89978c3e42c3aa2915bd11dff3d8f5308f6db84d9975.scope: Deactivated successfully.
Nov 29 07:55:04 compute-0 podman[248987]: 2025-11-29 07:55:04.979177126 +0000 UTC m=+0.072821986 container remove 21f9cefb932825ef1ffd89978c3e42c3aa2915bd11dff3d8f5308f6db84d9975 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8429e89c-8540-4db3-b6b2-48775311a13d, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 07:55:04 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:55:04.991 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[bbbe0c4d-f084-443c-9d64-86c67621a7e5]: (4, ('Sat Nov 29 07:55:04 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-8429e89c-8540-4db3-b6b2-48775311a13d (21f9cefb932825ef1ffd89978c3e42c3aa2915bd11dff3d8f5308f6db84d9975)\n21f9cefb932825ef1ffd89978c3e42c3aa2915bd11dff3d8f5308f6db84d9975\nSat Nov 29 07:55:04 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-8429e89c-8540-4db3-b6b2-48775311a13d (21f9cefb932825ef1ffd89978c3e42c3aa2915bd11dff3d8f5308f6db84d9975)\n21f9cefb932825ef1ffd89978c3e42c3aa2915bd11dff3d8f5308f6db84d9975\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:55:04 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:55:04.994 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[8bfe83b9-97bd-413a-8c83-2910c876b5ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:55:04 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:55:04.996 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8429e89c-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:55:04 compute-0 nova_compute[187185]: 2025-11-29 07:55:04.998 187189 INFO nova.compute.manager [None req-d458ec4d-0dfd-4736-a2f6-d249c5da0ab5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Took 0.49 seconds to destroy the instance on the hypervisor.
Nov 29 07:55:05 compute-0 nova_compute[187185]: 2025-11-29 07:55:04.999 187189 DEBUG oslo.service.loopingcall [None req-d458ec4d-0dfd-4736-a2f6-d249c5da0ab5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 07:55:05 compute-0 kernel: tap8429e89c-80: left promiscuous mode
Nov 29 07:55:05 compute-0 nova_compute[187185]: 2025-11-29 07:55:05.000 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:55:05 compute-0 nova_compute[187185]: 2025-11-29 07:55:05.004 187189 DEBUG nova.compute.manager [-] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 07:55:05 compute-0 nova_compute[187185]: 2025-11-29 07:55:05.004 187189 DEBUG nova.network.neutron [-] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 07:55:05 compute-0 nova_compute[187185]: 2025-11-29 07:55:05.023 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:55:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:55:05.029 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[ba5a3574-13e7-48b8-b8f2-04fe0554793d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:55:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:55:05.049 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[5b607b20-1cf6-4d48-a212-b462aba30b19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:55:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:55:05.052 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[29294c80-f3a4-4244-b6b1-c924526aa47c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:55:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:55:05.086 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[a86a067c-90d2-4042-ae99-b20d2e1f746b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 827986, 'reachable_time': 28958, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249001, 'error': None, 'target': 'ovnmeta-8429e89c-8540-4db3-b6b2-48775311a13d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:55:05 compute-0 systemd[1]: run-netns-ovnmeta\x2d8429e89c\x2d8540\x2d4db3\x2db6b2\x2d48775311a13d.mount: Deactivated successfully.
Nov 29 07:55:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:55:05.091 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8429e89c-8540-4db3-b6b2-48775311a13d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:55:05 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:55:05.091 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[3910cd7d-e10c-40ed-868f-f3a2aeb4d064]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:55:05 compute-0 nova_compute[187185]: 2025-11-29 07:55:05.224 187189 DEBUG nova.compute.manager [req-2320979f-5487-4cdf-8eb9-17b14a111341 req-0e5beea7-4108-422f-82e4-59d92361ad42 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Received event network-vif-unplugged-27f71340-6ac0-4431-b058-f02eca4fb423 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:55:05 compute-0 nova_compute[187185]: 2025-11-29 07:55:05.224 187189 DEBUG oslo_concurrency.lockutils [req-2320979f-5487-4cdf-8eb9-17b14a111341 req-0e5beea7-4108-422f-82e4-59d92361ad42 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "43cb7661-81a7-4e91-96aa-5d72329a58b7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:55:05 compute-0 nova_compute[187185]: 2025-11-29 07:55:05.225 187189 DEBUG oslo_concurrency.lockutils [req-2320979f-5487-4cdf-8eb9-17b14a111341 req-0e5beea7-4108-422f-82e4-59d92361ad42 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "43cb7661-81a7-4e91-96aa-5d72329a58b7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:55:05 compute-0 nova_compute[187185]: 2025-11-29 07:55:05.225 187189 DEBUG oslo_concurrency.lockutils [req-2320979f-5487-4cdf-8eb9-17b14a111341 req-0e5beea7-4108-422f-82e4-59d92361ad42 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "43cb7661-81a7-4e91-96aa-5d72329a58b7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:55:05 compute-0 nova_compute[187185]: 2025-11-29 07:55:05.225 187189 DEBUG nova.compute.manager [req-2320979f-5487-4cdf-8eb9-17b14a111341 req-0e5beea7-4108-422f-82e4-59d92361ad42 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] No waiting events found dispatching network-vif-unplugged-27f71340-6ac0-4431-b058-f02eca4fb423 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:55:05 compute-0 nova_compute[187185]: 2025-11-29 07:55:05.226 187189 DEBUG nova.compute.manager [req-2320979f-5487-4cdf-8eb9-17b14a111341 req-0e5beea7-4108-422f-82e4-59d92361ad42 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Received event network-vif-unplugged-27f71340-6ac0-4431-b058-f02eca4fb423 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 07:55:08 compute-0 nova_compute[187185]: 2025-11-29 07:55:08.836 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:55:09 compute-0 nova_compute[187185]: 2025-11-29 07:55:09.859 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:55:11 compute-0 nova_compute[187185]: 2025-11-29 07:55:11.388 187189 DEBUG nova.compute.manager [req-1a49b358-75f6-4947-a896-66ce48e19960 req-ad9b81f5-400e-4910-8877-3d1f179f50d5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Received event network-vif-plugged-27f71340-6ac0-4431-b058-f02eca4fb423 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:55:11 compute-0 nova_compute[187185]: 2025-11-29 07:55:11.389 187189 DEBUG oslo_concurrency.lockutils [req-1a49b358-75f6-4947-a896-66ce48e19960 req-ad9b81f5-400e-4910-8877-3d1f179f50d5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "43cb7661-81a7-4e91-96aa-5d72329a58b7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:55:11 compute-0 nova_compute[187185]: 2025-11-29 07:55:11.389 187189 DEBUG oslo_concurrency.lockutils [req-1a49b358-75f6-4947-a896-66ce48e19960 req-ad9b81f5-400e-4910-8877-3d1f179f50d5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "43cb7661-81a7-4e91-96aa-5d72329a58b7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:55:11 compute-0 nova_compute[187185]: 2025-11-29 07:55:11.390 187189 DEBUG oslo_concurrency.lockutils [req-1a49b358-75f6-4947-a896-66ce48e19960 req-ad9b81f5-400e-4910-8877-3d1f179f50d5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "43cb7661-81a7-4e91-96aa-5d72329a58b7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:55:11 compute-0 nova_compute[187185]: 2025-11-29 07:55:11.390 187189 DEBUG nova.compute.manager [req-1a49b358-75f6-4947-a896-66ce48e19960 req-ad9b81f5-400e-4910-8877-3d1f179f50d5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] No waiting events found dispatching network-vif-plugged-27f71340-6ac0-4431-b058-f02eca4fb423 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:55:11 compute-0 nova_compute[187185]: 2025-11-29 07:55:11.391 187189 WARNING nova.compute.manager [req-1a49b358-75f6-4947-a896-66ce48e19960 req-ad9b81f5-400e-4910-8877-3d1f179f50d5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Received unexpected event network-vif-plugged-27f71340-6ac0-4431-b058-f02eca4fb423 for instance with vm_state active and task_state deleting.
Nov 29 07:55:11 compute-0 nova_compute[187185]: 2025-11-29 07:55:11.583 187189 DEBUG nova.network.neutron [req-0b7c0ca6-2acb-4ba1-8fe8-47c36ede0276 req-614809e9-ce52-441e-8230-67fae37ded6a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Updated VIF entry in instance network info cache for port 27f71340-6ac0-4431-b058-f02eca4fb423. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:55:11 compute-0 nova_compute[187185]: 2025-11-29 07:55:11.584 187189 DEBUG nova.network.neutron [req-0b7c0ca6-2acb-4ba1-8fe8-47c36ede0276 req-614809e9-ce52-441e-8230-67fae37ded6a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Updating instance_info_cache with network_info: [{"id": "27f71340-6ac0-4431-b058-f02eca4fb423", "address": "fa:16:3e:0b:ca:be", "network": {"id": "8429e89c-8540-4db3-b6b2-48775311a13d", "bridge": "br-int", "label": "tempest-network-smoke--2013284958", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27f71340-6a", "ovs_interfaceid": "27f71340-6ac0-4431-b058-f02eca4fb423", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:55:11 compute-0 nova_compute[187185]: 2025-11-29 07:55:11.767 187189 DEBUG oslo_concurrency.lockutils [req-0b7c0ca6-2acb-4ba1-8fe8-47c36ede0276 req-614809e9-ce52-441e-8230-67fae37ded6a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-43cb7661-81a7-4e91-96aa-5d72329a58b7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:55:11 compute-0 nova_compute[187185]: 2025-11-29 07:55:11.808 187189 DEBUG nova.network.neutron [-] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:55:12 compute-0 nova_compute[187185]: 2025-11-29 07:55:12.013 187189 INFO nova.compute.manager [-] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Took 7.01 seconds to deallocate network for instance.
Nov 29 07:55:13 compute-0 nova_compute[187185]: 2025-11-29 07:55:13.231 187189 DEBUG oslo_concurrency.lockutils [None req-d458ec4d-0dfd-4736-a2f6-d249c5da0ab5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:55:13 compute-0 nova_compute[187185]: 2025-11-29 07:55:13.232 187189 DEBUG oslo_concurrency.lockutils [None req-d458ec4d-0dfd-4736-a2f6-d249c5da0ab5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:55:13 compute-0 nova_compute[187185]: 2025-11-29 07:55:13.446 187189 DEBUG nova.compute.provider_tree [None req-d458ec4d-0dfd-4736-a2f6-d249c5da0ab5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:55:13 compute-0 podman[249002]: 2025-11-29 07:55:13.812512326 +0000 UTC m=+0.072418155 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 07:55:13 compute-0 nova_compute[187185]: 2025-11-29 07:55:13.837 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:55:13 compute-0 podman[249004]: 2025-11-29 07:55:13.838941879 +0000 UTC m=+0.091627732 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 07:55:13 compute-0 podman[249003]: 2025-11-29 07:55:13.846546406 +0000 UTC m=+0.100463715 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., version=9.6, vendor=Red Hat, Inc., vcs-type=git, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container)
Nov 29 07:55:14 compute-0 nova_compute[187185]: 2025-11-29 07:55:14.863 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:55:18 compute-0 nova_compute[187185]: 2025-11-29 07:55:18.841 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:55:19 compute-0 nova_compute[187185]: 2025-11-29 07:55:19.439 187189 DEBUG nova.scheduler.client.report [None req-d458ec4d-0dfd-4736-a2f6-d249c5da0ab5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:55:19 compute-0 nova_compute[187185]: 2025-11-29 07:55:19.807 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764402904.8052783, 43cb7661-81a7-4e91-96aa-5d72329a58b7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:55:19 compute-0 nova_compute[187185]: 2025-11-29 07:55:19.810 187189 INFO nova.compute.manager [-] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] VM Stopped (Lifecycle Event)
Nov 29 07:55:19 compute-0 nova_compute[187185]: 2025-11-29 07:55:19.866 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:55:20 compute-0 nova_compute[187185]: 2025-11-29 07:55:20.583 187189 DEBUG nova.compute.manager [req-91d42812-1e90-4964-806c-bc6ec9c78c4b req-95ded238-ceea-4842-90ea-ce475f9462ac 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Received event network-vif-deleted-27f71340-6ac0-4431-b058-f02eca4fb423 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:55:20 compute-0 sshd-session[249065]: Invalid user bitnami from 20.255.62.58 port 48824
Nov 29 07:55:20 compute-0 sshd-session[249065]: Received disconnect from 20.255.62.58 port 48824:11: Bye Bye [preauth]
Nov 29 07:55:20 compute-0 sshd-session[249065]: Disconnected from invalid user bitnami 20.255.62.58 port 48824 [preauth]
Nov 29 07:55:22 compute-0 podman[249067]: 2025-11-29 07:55:22.879103705 +0000 UTC m=+0.140218578 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:55:23 compute-0 nova_compute[187185]: 2025-11-29 07:55:23.937 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:55:24 compute-0 nova_compute[187185]: 2025-11-29 07:55:24.868 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:55:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:55:25.755 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:55:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:55:25.755 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:55:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:55:25.755 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:55:28 compute-0 nova_compute[187185]: 2025-11-29 07:55:28.941 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:55:29 compute-0 nova_compute[187185]: 2025-11-29 07:55:29.872 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:55:31 compute-0 podman[249094]: 2025-11-29 07:55:31.860622058 +0000 UTC m=+0.110530691 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 07:55:33 compute-0 podman[249118]: 2025-11-29 07:55:33.845864173 +0000 UTC m=+0.095580616 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 07:55:33 compute-0 podman[249119]: 2025-11-29 07:55:33.866222113 +0000 UTC m=+0.106547178 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 29 07:55:33 compute-0 nova_compute[187185]: 2025-11-29 07:55:33.942 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:55:33 compute-0 nova_compute[187185]: 2025-11-29 07:55:33.983 187189 DEBUG nova.compute.manager [None req-d5ea15c3-5319-42fa-bff4-32e4a2882171 - - - - - -] [instance: 43cb7661-81a7-4e91-96aa-5d72329a58b7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:55:34 compute-0 nova_compute[187185]: 2025-11-29 07:55:34.274 187189 DEBUG oslo_concurrency.lockutils [None req-d458ec4d-0dfd-4736-a2f6-d249c5da0ab5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 21.042s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:55:34 compute-0 nova_compute[187185]: 2025-11-29 07:55:34.840 187189 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 3.09 sec
Nov 29 07:55:34 compute-0 nova_compute[187185]: 2025-11-29 07:55:34.875 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:55:37 compute-0 nova_compute[187185]: 2025-11-29 07:55:37.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:55:37 compute-0 nova_compute[187185]: 2025-11-29 07:55:37.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:55:38 compute-0 nova_compute[187185]: 2025-11-29 07:55:38.945 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:55:39 compute-0 nova_compute[187185]: 2025-11-29 07:55:39.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:55:39 compute-0 nova_compute[187185]: 2025-11-29 07:55:39.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:55:39 compute-0 nova_compute[187185]: 2025-11-29 07:55:39.879 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:55:40 compute-0 nova_compute[187185]: 2025-11-29 07:55:40.058 187189 INFO nova.scheduler.client.report [None req-d458ec4d-0dfd-4736-a2f6-d249c5da0ab5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Deleted allocations for instance 43cb7661-81a7-4e91-96aa-5d72329a58b7
Nov 29 07:55:40 compute-0 nova_compute[187185]: 2025-11-29 07:55:40.063 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:55:40 compute-0 nova_compute[187185]: 2025-11-29 07:55:40.064 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:55:40 compute-0 nova_compute[187185]: 2025-11-29 07:55:40.064 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:55:40 compute-0 nova_compute[187185]: 2025-11-29 07:55:40.065 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:55:40 compute-0 nova_compute[187185]: 2025-11-29 07:55:40.228 187189 DEBUG oslo_concurrency.lockutils [None req-d458ec4d-0dfd-4736-a2f6-d249c5da0ab5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "43cb7661-81a7-4e91-96aa-5d72329a58b7" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 35.766s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:55:40 compute-0 nova_compute[187185]: 2025-11-29 07:55:40.291 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:55:40 compute-0 nova_compute[187185]: 2025-11-29 07:55:40.292 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5735MB free_disk=73.24601745605469GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:55:40 compute-0 nova_compute[187185]: 2025-11-29 07:55:40.292 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:55:40 compute-0 nova_compute[187185]: 2025-11-29 07:55:40.293 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:55:40 compute-0 nova_compute[187185]: 2025-11-29 07:55:40.440 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:55:40 compute-0 nova_compute[187185]: 2025-11-29 07:55:40.440 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:55:40 compute-0 nova_compute[187185]: 2025-11-29 07:55:40.468 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:55:40 compute-0 nova_compute[187185]: 2025-11-29 07:55:40.486 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:55:40 compute-0 nova_compute[187185]: 2025-11-29 07:55:40.516 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:55:40 compute-0 nova_compute[187185]: 2025-11-29 07:55:40.516 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.224s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:55:41 compute-0 ovn_controller[95281]: 2025-11-29T07:55:41Z|00610|memory_trim|INFO|Detected inactivity (last active 30016 ms ago): trimming memory
Nov 29 07:55:42 compute-0 nova_compute[187185]: 2025-11-29 07:55:42.516 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:55:42 compute-0 nova_compute[187185]: 2025-11-29 07:55:42.517 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:55:43 compute-0 nova_compute[187185]: 2025-11-29 07:55:43.947 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:55:44 compute-0 podman[249161]: 2025-11-29 07:55:44.800534636 +0000 UTC m=+0.064599271 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:55:44 compute-0 podman[249162]: 2025-11-29 07:55:44.809059619 +0000 UTC m=+0.068008579 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_id=edpm, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350)
Nov 29 07:55:44 compute-0 podman[249163]: 2025-11-29 07:55:44.833074563 +0000 UTC m=+0.086996069 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 07:55:44 compute-0 nova_compute[187185]: 2025-11-29 07:55:44.881 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:55:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:55:45.048 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=74, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=73) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:55:45 compute-0 nova_compute[187185]: 2025-11-29 07:55:45.048 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:55:45 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:55:45.050 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:55:46 compute-0 nova_compute[187185]: 2025-11-29 07:55:46.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:55:48 compute-0 nova_compute[187185]: 2025-11-29 07:55:48.312 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:55:48 compute-0 nova_compute[187185]: 2025-11-29 07:55:48.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:55:48 compute-0 nova_compute[187185]: 2025-11-29 07:55:48.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:55:48 compute-0 nova_compute[187185]: 2025-11-29 07:55:48.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:55:48 compute-0 nova_compute[187185]: 2025-11-29 07:55:48.345 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 07:55:48 compute-0 sshd-session[249226]: Invalid user user from 190.181.27.27 port 41684
Nov 29 07:55:48 compute-0 sshd-session[249226]: Received disconnect from 190.181.27.27 port 41684:11: Bye Bye [preauth]
Nov 29 07:55:48 compute-0 sshd-session[249226]: Disconnected from invalid user user 190.181.27.27 port 41684 [preauth]
Nov 29 07:55:48 compute-0 nova_compute[187185]: 2025-11-29 07:55:48.949 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:55:49 compute-0 nova_compute[187185]: 2025-11-29 07:55:49.912 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:55:51 compute-0 nova_compute[187185]: 2025-11-29 07:55:51.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:55:53 compute-0 podman[249228]: 2025-11-29 07:55:53.884117448 +0000 UTC m=+0.132079255 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:55:53 compute-0 nova_compute[187185]: 2025-11-29 07:55:53.899 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:55:54 compute-0 nova_compute[187185]: 2025-11-29 07:55:54.050 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:55:54 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:55:54.052 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '74'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:55:54 compute-0 nova_compute[187185]: 2025-11-29 07:55:54.059 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:55:54 compute-0 nova_compute[187185]: 2025-11-29 07:55:54.915 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:55:59 compute-0 nova_compute[187185]: 2025-11-29 07:55:59.100 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:55:59 compute-0 nova_compute[187185]: 2025-11-29 07:55:59.918 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:56:02 compute-0 podman[249257]: 2025-11-29 07:56:02.829017368 +0000 UTC m=+0.079809196 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 07:56:04 compute-0 nova_compute[187185]: 2025-11-29 07:56:04.102 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:56:04 compute-0 podman[249282]: 2025-11-29 07:56:04.8261492 +0000 UTC m=+0.082682537 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3)
Nov 29 07:56:04 compute-0 podman[249283]: 2025-11-29 07:56:04.831520143 +0000 UTC m=+0.088436661 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 29 07:56:04 compute-0 nova_compute[187185]: 2025-11-29 07:56:04.921 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:56:09 compute-0 nova_compute[187185]: 2025-11-29 07:56:09.105 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:56:09 compute-0 nova_compute[187185]: 2025-11-29 07:56:09.924 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:56:14 compute-0 nova_compute[187185]: 2025-11-29 07:56:14.109 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:56:14 compute-0 nova_compute[187185]: 2025-11-29 07:56:14.927 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:56:15 compute-0 podman[249320]: 2025-11-29 07:56:15.822335856 +0000 UTC m=+0.078935091 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, managed_by=edpm_ansible, config_id=edpm, io.buildah.version=1.33.7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 29 07:56:15 compute-0 podman[249321]: 2025-11-29 07:56:15.833796972 +0000 UTC m=+0.081630577 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 07:56:15 compute-0 podman[249319]: 2025-11-29 07:56:15.838608709 +0000 UTC m=+0.100309550 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 07:56:19 compute-0 nova_compute[187185]: 2025-11-29 07:56:19.112 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:56:19 compute-0 nova_compute[187185]: 2025-11-29 07:56:19.930 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:56:21 compute-0 nova_compute[187185]: 2025-11-29 07:56:21.384 187189 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 6.53 sec
Nov 29 07:56:24 compute-0 nova_compute[187185]: 2025-11-29 07:56:24.114 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:56:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:56:24.572 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=75, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=74) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:56:24 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:56:24.573 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:56:24 compute-0 nova_compute[187185]: 2025-11-29 07:56:24.573 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:56:24 compute-0 podman[249380]: 2025-11-29 07:56:24.865914958 +0000 UTC m=+0.123343876 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 07:56:24 compute-0 nova_compute[187185]: 2025-11-29 07:56:24.932 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:56:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:56:25.755 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:56:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:56:25.756 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:56:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:56:25.756 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:56:26 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:56:26.575 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '75'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:56:29 compute-0 nova_compute[187185]: 2025-11-29 07:56:29.163 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:56:29 compute-0 nova_compute[187185]: 2025-11-29 07:56:29.934 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:56:33 compute-0 podman[249407]: 2025-11-29 07:56:33.808635716 +0000 UTC m=+0.067182415 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 07:56:34 compute-0 nova_compute[187185]: 2025-11-29 07:56:34.165 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:56:34 compute-0 nova_compute[187185]: 2025-11-29 07:56:34.937 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:56:35 compute-0 podman[249433]: 2025-11-29 07:56:35.821109566 +0000 UTC m=+0.072478017 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 29 07:56:35 compute-0 podman[249432]: 2025-11-29 07:56:35.834770395 +0000 UTC m=+0.084432107 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd)
Nov 29 07:56:38 compute-0 nova_compute[187185]: 2025-11-29 07:56:38.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:56:39 compute-0 nova_compute[187185]: 2025-11-29 07:56:39.168 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:56:39 compute-0 nova_compute[187185]: 2025-11-29 07:56:39.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:56:39 compute-0 nova_compute[187185]: 2025-11-29 07:56:39.940 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:56:40 compute-0 nova_compute[187185]: 2025-11-29 07:56:40.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:56:40 compute-0 nova_compute[187185]: 2025-11-29 07:56:40.350 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:56:40 compute-0 nova_compute[187185]: 2025-11-29 07:56:40.350 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:56:40 compute-0 nova_compute[187185]: 2025-11-29 07:56:40.351 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:56:40 compute-0 nova_compute[187185]: 2025-11-29 07:56:40.351 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:56:40 compute-0 nova_compute[187185]: 2025-11-29 07:56:40.507 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:56:40 compute-0 nova_compute[187185]: 2025-11-29 07:56:40.508 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5741MB free_disk=73.24602127075195GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:56:40 compute-0 nova_compute[187185]: 2025-11-29 07:56:40.509 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:56:40 compute-0 nova_compute[187185]: 2025-11-29 07:56:40.509 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:56:40 compute-0 nova_compute[187185]: 2025-11-29 07:56:40.590 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:56:40 compute-0 nova_compute[187185]: 2025-11-29 07:56:40.591 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:56:40 compute-0 nova_compute[187185]: 2025-11-29 07:56:40.671 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Refreshing inventories for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 29 07:56:40 compute-0 nova_compute[187185]: 2025-11-29 07:56:40.780 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Updating ProviderTree inventory for provider 4e39a026-df39-4e20-874a-dbb5a40df044 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 29 07:56:40 compute-0 nova_compute[187185]: 2025-11-29 07:56:40.780 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Updating inventory in ProviderTree for provider 4e39a026-df39-4e20-874a-dbb5a40df044 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 29 07:56:40 compute-0 nova_compute[187185]: 2025-11-29 07:56:40.811 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Refreshing aggregate associations for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 29 07:56:40 compute-0 nova_compute[187185]: 2025-11-29 07:56:40.861 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Refreshing trait associations for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044, traits: HW_CPU_X86_SSE,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_TRUSTED_CERTS,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 29 07:56:40 compute-0 nova_compute[187185]: 2025-11-29 07:56:40.907 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:56:40 compute-0 nova_compute[187185]: 2025-11-29 07:56:40.923 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:56:40 compute-0 nova_compute[187185]: 2025-11-29 07:56:40.924 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:56:40 compute-0 nova_compute[187185]: 2025-11-29 07:56:40.924 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.416s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:56:41 compute-0 nova_compute[187185]: 2025-11-29 07:56:41.926 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:56:43 compute-0 nova_compute[187185]: 2025-11-29 07:56:43.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:56:43 compute-0 nova_compute[187185]: 2025-11-29 07:56:43.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:56:44 compute-0 sshd-session[249473]: Received disconnect from 20.255.62.58 port 36986:11: Bye Bye [preauth]
Nov 29 07:56:44 compute-0 sshd-session[249473]: Disconnected from authenticating user root 20.255.62.58 port 36986 [preauth]
Nov 29 07:56:44 compute-0 nova_compute[187185]: 2025-11-29 07:56:44.170 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:56:44 compute-0 nova_compute[187185]: 2025-11-29 07:56:44.944 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:56:46 compute-0 podman[249475]: 2025-11-29 07:56:46.828740447 +0000 UTC m=+0.068071040 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 07:56:46 compute-0 podman[249476]: 2025-11-29 07:56:46.850296172 +0000 UTC m=+0.078122767 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., release=1755695350, container_name=openstack_network_exporter, vendor=Red Hat, Inc., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_id=edpm, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public)
Nov 29 07:56:46 compute-0 podman[249477]: 2025-11-29 07:56:46.863890809 +0000 UTC m=+0.090694955 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 07:56:47 compute-0 nova_compute[187185]: 2025-11-29 07:56:47.318 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:56:48.021 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:56:48.022 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:56:48.022 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:56:48.022 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:56:48.022 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:56:48.022 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:56:48.022 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:56:48.022 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:56:48.022 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:56:48.022 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:56:48.022 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:56:48.022 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:56:48.022 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:56:48.022 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:56:48.022 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:56:48.023 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:56:48.023 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:56:48.023 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:56:48.023 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:56:48.023 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:56:48.023 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:56:48.023 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:56:48.023 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:56:48.023 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:56:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:56:48.023 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 07:56:49 compute-0 nova_compute[187185]: 2025-11-29 07:56:49.172 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:56:49 compute-0 nova_compute[187185]: 2025-11-29 07:56:49.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:56:49 compute-0 nova_compute[187185]: 2025-11-29 07:56:49.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:56:49 compute-0 nova_compute[187185]: 2025-11-29 07:56:49.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:56:49 compute-0 nova_compute[187185]: 2025-11-29 07:56:49.493 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 07:56:49 compute-0 nova_compute[187185]: 2025-11-29 07:56:49.947 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:56:50 compute-0 nova_compute[187185]: 2025-11-29 07:56:50.488 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:56:52 compute-0 nova_compute[187185]: 2025-11-29 07:56:52.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:56:54 compute-0 nova_compute[187185]: 2025-11-29 07:56:54.174 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:56:54 compute-0 nova_compute[187185]: 2025-11-29 07:56:54.950 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:56:55 compute-0 podman[249538]: 2025-11-29 07:56:55.829021347 +0000 UTC m=+0.086658630 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 07:56:59 compute-0 nova_compute[187185]: 2025-11-29 07:56:59.176 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:56:59 compute-0 nova_compute[187185]: 2025-11-29 07:56:59.953 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:57:04 compute-0 nova_compute[187185]: 2025-11-29 07:57:04.178 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:57:04 compute-0 podman[249564]: 2025-11-29 07:57:04.783097278 +0000 UTC m=+0.048985866 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 07:57:04 compute-0 nova_compute[187185]: 2025-11-29 07:57:04.955 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:57:05 compute-0 ovn_controller[95281]: 2025-11-29T07:57:05Z|00611|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Nov 29 07:57:05 compute-0 nova_compute[187185]: 2025-11-29 07:57:05.311 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:57:06 compute-0 podman[249590]: 2025-11-29 07:57:06.79773108 +0000 UTC m=+0.059132917 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 29 07:57:06 compute-0 podman[249589]: 2025-11-29 07:57:06.816731981 +0000 UTC m=+0.082785660 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd)
Nov 29 07:57:09 compute-0 nova_compute[187185]: 2025-11-29 07:57:09.180 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:57:09 compute-0 nova_compute[187185]: 2025-11-29 07:57:09.958 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:57:12 compute-0 sshd-session[249627]: Invalid user username from 190.181.27.27 port 40352
Nov 29 07:57:12 compute-0 sshd-session[249627]: Received disconnect from 190.181.27.27 port 40352:11: Bye Bye [preauth]
Nov 29 07:57:12 compute-0 sshd-session[249627]: Disconnected from invalid user username 190.181.27.27 port 40352 [preauth]
Nov 29 07:57:14 compute-0 nova_compute[187185]: 2025-11-29 07:57:14.182 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:57:14 compute-0 nova_compute[187185]: 2025-11-29 07:57:14.961 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:57:17 compute-0 podman[249629]: 2025-11-29 07:57:17.791481236 +0000 UTC m=+0.052132337 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent)
Nov 29 07:57:17 compute-0 podman[249630]: 2025-11-29 07:57:17.809287323 +0000 UTC m=+0.064616523 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., managed_by=edpm_ansible, version=9.6, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, name=ubi9-minimal, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-type=git, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 29 07:57:17 compute-0 podman[249631]: 2025-11-29 07:57:17.826277697 +0000 UTC m=+0.076727828 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 07:57:19 compute-0 nova_compute[187185]: 2025-11-29 07:57:19.184 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:57:19 compute-0 nova_compute[187185]: 2025-11-29 07:57:19.964 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:57:22 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:57:22.594 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=76, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=75) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:57:22 compute-0 nova_compute[187185]: 2025-11-29 07:57:22.595 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:57:22 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:57:22.596 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:57:24 compute-0 nova_compute[187185]: 2025-11-29 07:57:24.230 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:57:24 compute-0 nova_compute[187185]: 2025-11-29 07:57:24.967 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:57:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:57:25.599 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '76'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:57:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:57:25.757 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:57:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:57:25.758 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:57:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:57:25.758 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:57:26 compute-0 podman[249690]: 2025-11-29 07:57:26.820889744 +0000 UTC m=+0.089284876 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:57:29 compute-0 nova_compute[187185]: 2025-11-29 07:57:29.232 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:57:29 compute-0 nova_compute[187185]: 2025-11-29 07:57:29.969 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:57:34 compute-0 nova_compute[187185]: 2025-11-29 07:57:34.235 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:57:34 compute-0 nova_compute[187185]: 2025-11-29 07:57:34.973 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:57:35 compute-0 podman[249716]: 2025-11-29 07:57:35.794818782 +0000 UTC m=+0.057953213 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 07:57:37 compute-0 podman[249740]: 2025-11-29 07:57:37.80199911 +0000 UTC m=+0.061349788 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 29 07:57:37 compute-0 podman[249739]: 2025-11-29 07:57:37.813769296 +0000 UTC m=+0.079249159 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 29 07:57:39 compute-0 nova_compute[187185]: 2025-11-29 07:57:39.237 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:57:39 compute-0 nova_compute[187185]: 2025-11-29 07:57:39.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:57:39 compute-0 nova_compute[187185]: 2025-11-29 07:57:39.976 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:57:40 compute-0 nova_compute[187185]: 2025-11-29 07:57:40.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:57:41 compute-0 nova_compute[187185]: 2025-11-29 07:57:41.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:57:41 compute-0 nova_compute[187185]: 2025-11-29 07:57:41.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:57:41 compute-0 nova_compute[187185]: 2025-11-29 07:57:41.343 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:57:41 compute-0 nova_compute[187185]: 2025-11-29 07:57:41.344 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:57:41 compute-0 nova_compute[187185]: 2025-11-29 07:57:41.344 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:57:41 compute-0 nova_compute[187185]: 2025-11-29 07:57:41.344 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:57:41 compute-0 nova_compute[187185]: 2025-11-29 07:57:41.545 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:57:41 compute-0 nova_compute[187185]: 2025-11-29 07:57:41.546 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5736MB free_disk=73.24602127075195GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:57:41 compute-0 nova_compute[187185]: 2025-11-29 07:57:41.546 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:57:41 compute-0 nova_compute[187185]: 2025-11-29 07:57:41.546 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:57:41 compute-0 nova_compute[187185]: 2025-11-29 07:57:41.615 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:57:41 compute-0 nova_compute[187185]: 2025-11-29 07:57:41.616 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:57:41 compute-0 nova_compute[187185]: 2025-11-29 07:57:41.637 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:57:41 compute-0 nova_compute[187185]: 2025-11-29 07:57:41.654 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:57:41 compute-0 nova_compute[187185]: 2025-11-29 07:57:41.655 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:57:41 compute-0 nova_compute[187185]: 2025-11-29 07:57:41.656 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:57:44 compute-0 nova_compute[187185]: 2025-11-29 07:57:44.239 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:57:44 compute-0 nova_compute[187185]: 2025-11-29 07:57:44.978 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:57:45 compute-0 nova_compute[187185]: 2025-11-29 07:57:45.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:57:45 compute-0 nova_compute[187185]: 2025-11-29 07:57:45.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:57:45 compute-0 nova_compute[187185]: 2025-11-29 07:57:45.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:57:45 compute-0 nova_compute[187185]: 2025-11-29 07:57:45.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 29 07:57:48 compute-0 nova_compute[187185]: 2025-11-29 07:57:48.333 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:57:48 compute-0 podman[249780]: 2025-11-29 07:57:48.78815591 +0000 UTC m=+0.054883114 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:57:48 compute-0 podman[249782]: 2025-11-29 07:57:48.795778908 +0000 UTC m=+0.053875406 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 07:57:48 compute-0 podman[249781]: 2025-11-29 07:57:48.79761152 +0000 UTC m=+0.060629989 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, version=9.6, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.openshift.expose-services=, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_id=edpm, name=ubi9-minimal, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, description=The Universal Base Image Minimal 
is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 29 07:57:49 compute-0 nova_compute[187185]: 2025-11-29 07:57:49.240 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:57:49 compute-0 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 07:57:49 compute-0 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 07:57:49 compute-0 nova_compute[187185]: 2025-11-29 07:57:49.981 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:57:51 compute-0 nova_compute[187185]: 2025-11-29 07:57:51.311 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:57:51 compute-0 nova_compute[187185]: 2025-11-29 07:57:51.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:57:51 compute-0 nova_compute[187185]: 2025-11-29 07:57:51.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:57:51 compute-0 nova_compute[187185]: 2025-11-29 07:57:51.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:57:51 compute-0 nova_compute[187185]: 2025-11-29 07:57:51.334 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 07:57:52 compute-0 nova_compute[187185]: 2025-11-29 07:57:52.669 187189 DEBUG oslo_concurrency.lockutils [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "da8e5049-4048-486b-a0d4-9d9c53478c31" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:57:52 compute-0 nova_compute[187185]: 2025-11-29 07:57:52.670 187189 DEBUG oslo_concurrency.lockutils [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "da8e5049-4048-486b-a0d4-9d9c53478c31" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:57:52 compute-0 nova_compute[187185]: 2025-11-29 07:57:52.687 187189 DEBUG nova.compute.manager [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 07:57:52 compute-0 nova_compute[187185]: 2025-11-29 07:57:52.790 187189 DEBUG oslo_concurrency.lockutils [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:57:52 compute-0 nova_compute[187185]: 2025-11-29 07:57:52.791 187189 DEBUG oslo_concurrency.lockutils [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:57:52 compute-0 nova_compute[187185]: 2025-11-29 07:57:52.798 187189 DEBUG nova.virt.hardware [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 07:57:52 compute-0 nova_compute[187185]: 2025-11-29 07:57:52.799 187189 INFO nova.compute.claims [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Claim successful on node compute-0.ctlplane.example.com
Nov 29 07:57:52 compute-0 nova_compute[187185]: 2025-11-29 07:57:52.933 187189 DEBUG nova.compute.provider_tree [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:57:52 compute-0 nova_compute[187185]: 2025-11-29 07:57:52.951 187189 DEBUG nova.scheduler.client.report [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:57:52 compute-0 nova_compute[187185]: 2025-11-29 07:57:52.979 187189 DEBUG oslo_concurrency.lockutils [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.188s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:57:52 compute-0 nova_compute[187185]: 2025-11-29 07:57:52.980 187189 DEBUG nova.compute.manager [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 07:57:53 compute-0 nova_compute[187185]: 2025-11-29 07:57:53.042 187189 DEBUG nova.compute.manager [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 07:57:53 compute-0 nova_compute[187185]: 2025-11-29 07:57:53.043 187189 DEBUG nova.network.neutron [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 07:57:53 compute-0 nova_compute[187185]: 2025-11-29 07:57:53.064 187189 INFO nova.virt.libvirt.driver [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 07:57:53 compute-0 nova_compute[187185]: 2025-11-29 07:57:53.084 187189 DEBUG nova.compute.manager [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 07:57:53 compute-0 nova_compute[187185]: 2025-11-29 07:57:53.205 187189 DEBUG nova.compute.manager [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 07:57:53 compute-0 nova_compute[187185]: 2025-11-29 07:57:53.207 187189 DEBUG nova.virt.libvirt.driver [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 07:57:53 compute-0 nova_compute[187185]: 2025-11-29 07:57:53.207 187189 INFO nova.virt.libvirt.driver [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Creating image(s)
Nov 29 07:57:53 compute-0 nova_compute[187185]: 2025-11-29 07:57:53.208 187189 DEBUG oslo_concurrency.lockutils [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "/var/lib/nova/instances/da8e5049-4048-486b-a0d4-9d9c53478c31/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:57:53 compute-0 nova_compute[187185]: 2025-11-29 07:57:53.208 187189 DEBUG oslo_concurrency.lockutils [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "/var/lib/nova/instances/da8e5049-4048-486b-a0d4-9d9c53478c31/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:57:53 compute-0 nova_compute[187185]: 2025-11-29 07:57:53.209 187189 DEBUG oslo_concurrency.lockutils [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "/var/lib/nova/instances/da8e5049-4048-486b-a0d4-9d9c53478c31/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:57:53 compute-0 nova_compute[187185]: 2025-11-29 07:57:53.231 187189 DEBUG oslo_concurrency.processutils [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:57:53 compute-0 nova_compute[187185]: 2025-11-29 07:57:53.303 187189 DEBUG oslo_concurrency.processutils [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:57:53 compute-0 nova_compute[187185]: 2025-11-29 07:57:53.306 187189 DEBUG oslo_concurrency.lockutils [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:57:53 compute-0 nova_compute[187185]: 2025-11-29 07:57:53.308 187189 DEBUG oslo_concurrency.lockutils [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:57:53 compute-0 nova_compute[187185]: 2025-11-29 07:57:53.335 187189 DEBUG oslo_concurrency.processutils [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:57:53 compute-0 nova_compute[187185]: 2025-11-29 07:57:53.406 187189 DEBUG oslo_concurrency.processutils [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:57:53 compute-0 nova_compute[187185]: 2025-11-29 07:57:53.409 187189 DEBUG oslo_concurrency.processutils [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/da8e5049-4048-486b-a0d4-9d9c53478c31/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:57:53 compute-0 nova_compute[187185]: 2025-11-29 07:57:53.464 187189 DEBUG oslo_concurrency.processutils [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/da8e5049-4048-486b-a0d4-9d9c53478c31/disk 1073741824" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:57:53 compute-0 nova_compute[187185]: 2025-11-29 07:57:53.465 187189 DEBUG oslo_concurrency.lockutils [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:57:53 compute-0 nova_compute[187185]: 2025-11-29 07:57:53.466 187189 DEBUG oslo_concurrency.processutils [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:57:53 compute-0 nova_compute[187185]: 2025-11-29 07:57:53.546 187189 DEBUG oslo_concurrency.processutils [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:57:53 compute-0 nova_compute[187185]: 2025-11-29 07:57:53.547 187189 DEBUG nova.virt.disk.api [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Checking if we can resize image /var/lib/nova/instances/da8e5049-4048-486b-a0d4-9d9c53478c31/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 07:57:53 compute-0 nova_compute[187185]: 2025-11-29 07:57:53.548 187189 DEBUG oslo_concurrency.processutils [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/da8e5049-4048-486b-a0d4-9d9c53478c31/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:57:53 compute-0 nova_compute[187185]: 2025-11-29 07:57:53.614 187189 DEBUG oslo_concurrency.processutils [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/da8e5049-4048-486b-a0d4-9d9c53478c31/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:57:53 compute-0 nova_compute[187185]: 2025-11-29 07:57:53.615 187189 DEBUG nova.virt.disk.api [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Cannot resize image /var/lib/nova/instances/da8e5049-4048-486b-a0d4-9d9c53478c31/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 07:57:53 compute-0 nova_compute[187185]: 2025-11-29 07:57:53.616 187189 DEBUG nova.objects.instance [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lazy-loading 'migration_context' on Instance uuid da8e5049-4048-486b-a0d4-9d9c53478c31 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:57:53 compute-0 nova_compute[187185]: 2025-11-29 07:57:53.637 187189 DEBUG nova.virt.libvirt.driver [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 07:57:53 compute-0 nova_compute[187185]: 2025-11-29 07:57:53.638 187189 DEBUG nova.virt.libvirt.driver [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Ensure instance console log exists: /var/lib/nova/instances/da8e5049-4048-486b-a0d4-9d9c53478c31/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 07:57:53 compute-0 nova_compute[187185]: 2025-11-29 07:57:53.639 187189 DEBUG oslo_concurrency.lockutils [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:57:53 compute-0 nova_compute[187185]: 2025-11-29 07:57:53.640 187189 DEBUG oslo_concurrency.lockutils [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:57:53 compute-0 nova_compute[187185]: 2025-11-29 07:57:53.640 187189 DEBUG oslo_concurrency.lockutils [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:57:54 compute-0 nova_compute[187185]: 2025-11-29 07:57:54.190 187189 DEBUG nova.policy [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 07:57:54 compute-0 nova_compute[187185]: 2025-11-29 07:57:54.241 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:57:54 compute-0 nova_compute[187185]: 2025-11-29 07:57:54.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:57:54 compute-0 nova_compute[187185]: 2025-11-29 07:57:54.984 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:57:57 compute-0 podman[249859]: 2025-11-29 07:57:57.846513194 +0000 UTC m=+0.109799089 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 07:57:58 compute-0 nova_compute[187185]: 2025-11-29 07:57:58.000 187189 DEBUG nova.network.neutron [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Successfully created port: 95c265c1-1b57-4a91-af2e-9d38073be611 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 07:57:58 compute-0 nova_compute[187185]: 2025-11-29 07:57:58.736 187189 DEBUG nova.network.neutron [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Successfully updated port: 95c265c1-1b57-4a91-af2e-9d38073be611 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 07:57:58 compute-0 nova_compute[187185]: 2025-11-29 07:57:58.767 187189 DEBUG oslo_concurrency.lockutils [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "refresh_cache-da8e5049-4048-486b-a0d4-9d9c53478c31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:57:58 compute-0 nova_compute[187185]: 2025-11-29 07:57:58.767 187189 DEBUG oslo_concurrency.lockutils [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquired lock "refresh_cache-da8e5049-4048-486b-a0d4-9d9c53478c31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:57:58 compute-0 nova_compute[187185]: 2025-11-29 07:57:58.768 187189 DEBUG nova.network.neutron [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 07:57:58 compute-0 nova_compute[187185]: 2025-11-29 07:57:58.813 187189 DEBUG nova.compute.manager [req-1c4471d3-36d6-48f9-bab5-cbc87f347555 req-2016ffe2-bc02-4723-97a5-8cc142cdef7a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Received event network-changed-95c265c1-1b57-4a91-af2e-9d38073be611 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:57:58 compute-0 nova_compute[187185]: 2025-11-29 07:57:58.814 187189 DEBUG nova.compute.manager [req-1c4471d3-36d6-48f9-bab5-cbc87f347555 req-2016ffe2-bc02-4723-97a5-8cc142cdef7a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Refreshing instance network info cache due to event network-changed-95c265c1-1b57-4a91-af2e-9d38073be611. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:57:58 compute-0 nova_compute[187185]: 2025-11-29 07:57:58.814 187189 DEBUG oslo_concurrency.lockutils [req-1c4471d3-36d6-48f9-bab5-cbc87f347555 req-2016ffe2-bc02-4723-97a5-8cc142cdef7a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-da8e5049-4048-486b-a0d4-9d9c53478c31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:57:58 compute-0 nova_compute[187185]: 2025-11-29 07:57:58.890 187189 DEBUG nova.network.neutron [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 07:57:59 compute-0 nova_compute[187185]: 2025-11-29 07:57:59.244 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.024 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.152 187189 DEBUG nova.network.neutron [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Updating instance_info_cache with network_info: [{"id": "95c265c1-1b57-4a91-af2e-9d38073be611", "address": "fa:16:3e:84:0e:ef", "network": {"id": "e25d5113-a42d-44ca-8e65-a777d9e11f48", "bridge": "br-int", "label": "tempest-network-smoke--1591284109", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95c265c1-1b", "ovs_interfaceid": "95c265c1-1b57-4a91-af2e-9d38073be611", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.197 187189 DEBUG oslo_concurrency.lockutils [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Releasing lock "refresh_cache-da8e5049-4048-486b-a0d4-9d9c53478c31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.197 187189 DEBUG nova.compute.manager [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Instance network_info: |[{"id": "95c265c1-1b57-4a91-af2e-9d38073be611", "address": "fa:16:3e:84:0e:ef", "network": {"id": "e25d5113-a42d-44ca-8e65-a777d9e11f48", "bridge": "br-int", "label": "tempest-network-smoke--1591284109", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95c265c1-1b", "ovs_interfaceid": "95c265c1-1b57-4a91-af2e-9d38073be611", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.198 187189 DEBUG oslo_concurrency.lockutils [req-1c4471d3-36d6-48f9-bab5-cbc87f347555 req-2016ffe2-bc02-4723-97a5-8cc142cdef7a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-da8e5049-4048-486b-a0d4-9d9c53478c31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.199 187189 DEBUG nova.network.neutron [req-1c4471d3-36d6-48f9-bab5-cbc87f347555 req-2016ffe2-bc02-4723-97a5-8cc142cdef7a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Refreshing network info cache for port 95c265c1-1b57-4a91-af2e-9d38073be611 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.204 187189 DEBUG nova.virt.libvirt.driver [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Start _get_guest_xml network_info=[{"id": "95c265c1-1b57-4a91-af2e-9d38073be611", "address": "fa:16:3e:84:0e:ef", "network": {"id": "e25d5113-a42d-44ca-8e65-a777d9e11f48", "bridge": "br-int", "label": "tempest-network-smoke--1591284109", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95c265c1-1b", "ovs_interfaceid": "95c265c1-1b57-4a91-af2e-9d38073be611", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.211 187189 WARNING nova.virt.libvirt.driver [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.223 187189 DEBUG nova.virt.libvirt.host [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.224 187189 DEBUG nova.virt.libvirt.host [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.231 187189 DEBUG nova.virt.libvirt.host [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.232 187189 DEBUG nova.virt.libvirt.host [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.234 187189 DEBUG nova.virt.libvirt.driver [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.235 187189 DEBUG nova.virt.hardware [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.236 187189 DEBUG nova.virt.hardware [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.237 187189 DEBUG nova.virt.hardware [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.237 187189 DEBUG nova.virt.hardware [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.238 187189 DEBUG nova.virt.hardware [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.238 187189 DEBUG nova.virt.hardware [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.238 187189 DEBUG nova.virt.hardware [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.239 187189 DEBUG nova.virt.hardware [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.239 187189 DEBUG nova.virt.hardware [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.240 187189 DEBUG nova.virt.hardware [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.240 187189 DEBUG nova.virt.hardware [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.248 187189 DEBUG nova.virt.libvirt.vif [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:57:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1552512911',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1552512911',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2022058758-ac',id=181,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCbO5lg6MOx4cmqxiU7BbZWHBZr0hw1OFyx1wi8ylE3DD5XkolVPeqN7tTosQEkEUch/54VxCcm6kKSgh3IzPEqFnEAKzoXbjFEzrMqt3IWtOqFGx5kqMbL3gOxuWmKn/A==',key_name='tempest-TestSecurityGroupsBasicOps-519869731',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e8e45e91223b45a79dd698a82af4a2a5',ramdisk_id='',reservation_id='r-rzp6eqal',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2022058758',owner_user_name='tempest-TestSecurityGroupsBasicOps-2022058758-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:57:53Z,user_data=None,user_id='dec30fbde18e4b2382ea2c59847d067f',uuid=da8e5049-4048-486b-a0d4-9d9c53478c31,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "95c265c1-1b57-4a91-af2e-9d38073be611", "address": "fa:16:3e:84:0e:ef", "network": {"id": "e25d5113-a42d-44ca-8e65-a777d9e11f48", "bridge": "br-int", "label": "tempest-network-smoke--1591284109", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95c265c1-1b", "ovs_interfaceid": "95c265c1-1b57-4a91-af2e-9d38073be611", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.249 187189 DEBUG nova.network.os_vif_util [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converting VIF {"id": "95c265c1-1b57-4a91-af2e-9d38073be611", "address": "fa:16:3e:84:0e:ef", "network": {"id": "e25d5113-a42d-44ca-8e65-a777d9e11f48", "bridge": "br-int", "label": "tempest-network-smoke--1591284109", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95c265c1-1b", "ovs_interfaceid": "95c265c1-1b57-4a91-af2e-9d38073be611", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.251 187189 DEBUG nova.network.os_vif_util [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:84:0e:ef,bridge_name='br-int',has_traffic_filtering=True,id=95c265c1-1b57-4a91-af2e-9d38073be611,network=Network(e25d5113-a42d-44ca-8e65-a777d9e11f48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap95c265c1-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.252 187189 DEBUG nova.objects.instance [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lazy-loading 'pci_devices' on Instance uuid da8e5049-4048-486b-a0d4-9d9c53478c31 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.284 187189 DEBUG nova.virt.libvirt.driver [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] End _get_guest_xml xml=<domain type="kvm">
Nov 29 07:58:00 compute-0 nova_compute[187185]:   <uuid>da8e5049-4048-486b-a0d4-9d9c53478c31</uuid>
Nov 29 07:58:00 compute-0 nova_compute[187185]:   <name>instance-000000b5</name>
Nov 29 07:58:00 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 07:58:00 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 07:58:00 compute-0 nova_compute[187185]:   <metadata>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 07:58:00 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:       <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1552512911</nova:name>
Nov 29 07:58:00 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 07:58:00</nova:creationTime>
Nov 29 07:58:00 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 07:58:00 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 07:58:00 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 07:58:00 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 07:58:00 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 07:58:00 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 07:58:00 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 07:58:00 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 07:58:00 compute-0 nova_compute[187185]:         <nova:user uuid="dec30fbde18e4b2382ea2c59847d067f">tempest-TestSecurityGroupsBasicOps-2022058758-project-member</nova:user>
Nov 29 07:58:00 compute-0 nova_compute[187185]:         <nova:project uuid="e8e45e91223b45a79dd698a82af4a2a5">tempest-TestSecurityGroupsBasicOps-2022058758</nova:project>
Nov 29 07:58:00 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 07:58:00 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 07:58:00 compute-0 nova_compute[187185]:         <nova:port uuid="95c265c1-1b57-4a91-af2e-9d38073be611">
Nov 29 07:58:00 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 07:58:00 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 07:58:00 compute-0 nova_compute[187185]:   </metadata>
Nov 29 07:58:00 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <system>
Nov 29 07:58:00 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 07:58:00 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 07:58:00 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 07:58:00 compute-0 nova_compute[187185]:       <entry name="serial">da8e5049-4048-486b-a0d4-9d9c53478c31</entry>
Nov 29 07:58:00 compute-0 nova_compute[187185]:       <entry name="uuid">da8e5049-4048-486b-a0d4-9d9c53478c31</entry>
Nov 29 07:58:00 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     </system>
Nov 29 07:58:00 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 07:58:00 compute-0 nova_compute[187185]:   <os>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:   </os>
Nov 29 07:58:00 compute-0 nova_compute[187185]:   <features>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <apic/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:   </features>
Nov 29 07:58:00 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:   </clock>
Nov 29 07:58:00 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:   </cpu>
Nov 29 07:58:00 compute-0 nova_compute[187185]:   <devices>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 07:58:00 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/da8e5049-4048-486b-a0d4-9d9c53478c31/disk"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 07:58:00 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/da8e5049-4048-486b-a0d4-9d9c53478c31/disk.config"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     </disk>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 07:58:00 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:84:0e:ef"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:       <target dev="tap95c265c1-1b"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     </interface>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 07:58:00 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/da8e5049-4048-486b-a0d4-9d9c53478c31/console.log" append="off"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     </serial>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <video>
Nov 29 07:58:00 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     </video>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 07:58:00 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     </rng>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 07:58:00 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 07:58:00 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 07:58:00 compute-0 nova_compute[187185]:   </devices>
Nov 29 07:58:00 compute-0 nova_compute[187185]: </domain>
Nov 29 07:58:00 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.286 187189 DEBUG nova.compute.manager [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Preparing to wait for external event network-vif-plugged-95c265c1-1b57-4a91-af2e-9d38073be611 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.286 187189 DEBUG oslo_concurrency.lockutils [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "da8e5049-4048-486b-a0d4-9d9c53478c31-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.286 187189 DEBUG oslo_concurrency.lockutils [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "da8e5049-4048-486b-a0d4-9d9c53478c31-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.287 187189 DEBUG oslo_concurrency.lockutils [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "da8e5049-4048-486b-a0d4-9d9c53478c31-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.288 187189 DEBUG nova.virt.libvirt.vif [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:57:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1552512911',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1552512911',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2022058758-ac',id=181,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCbO5lg6MOx4cmqxiU7BbZWHBZr0hw1OFyx1wi8ylE3DD5XkolVPeqN7tTosQEkEUch/54VxCcm6kKSgh3IzPEqFnEAKzoXbjFEzrMqt3IWtOqFGx5kqMbL3gOxuWmKn/A==',key_name='tempest-TestSecurityGroupsBasicOps-519869731',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e8e45e91223b45a79dd698a82af4a2a5',ramdisk_id='',reservation_id='r-rzp6eqal',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2022058758',owner_user_name='tempest-TestSecurityGroupsBasicOps-2022058758-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:57:53Z,user_data=None,user_id='dec30fbde18e4b2382ea2c59847d067f',uuid=da8e5049-4048-486b-a0d4-9d9c53478c31,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "95c265c1-1b57-4a91-af2e-9d38073be611", "address": "fa:16:3e:84:0e:ef", "network": {"id": "e25d5113-a42d-44ca-8e65-a777d9e11f48", "bridge": "br-int", "label": "tempest-network-smoke--1591284109", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95c265c1-1b", "ovs_interfaceid": "95c265c1-1b57-4a91-af2e-9d38073be611", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.288 187189 DEBUG nova.network.os_vif_util [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converting VIF {"id": "95c265c1-1b57-4a91-af2e-9d38073be611", "address": "fa:16:3e:84:0e:ef", "network": {"id": "e25d5113-a42d-44ca-8e65-a777d9e11f48", "bridge": "br-int", "label": "tempest-network-smoke--1591284109", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95c265c1-1b", "ovs_interfaceid": "95c265c1-1b57-4a91-af2e-9d38073be611", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.289 187189 DEBUG nova.network.os_vif_util [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:84:0e:ef,bridge_name='br-int',has_traffic_filtering=True,id=95c265c1-1b57-4a91-af2e-9d38073be611,network=Network(e25d5113-a42d-44ca-8e65-a777d9e11f48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap95c265c1-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.289 187189 DEBUG os_vif [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:84:0e:ef,bridge_name='br-int',has_traffic_filtering=True,id=95c265c1-1b57-4a91-af2e-9d38073be611,network=Network(e25d5113-a42d-44ca-8e65-a777d9e11f48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap95c265c1-1b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.290 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.291 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.291 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.297 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.298 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap95c265c1-1b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.300 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap95c265c1-1b, col_values=(('external_ids', {'iface-id': '95c265c1-1b57-4a91-af2e-9d38073be611', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:84:0e:ef', 'vm-uuid': 'da8e5049-4048-486b-a0d4-9d9c53478c31'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:58:00 compute-0 NetworkManager[55227]: <info>  [1764403080.3057] manager: (tap95c265c1-1b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/317)
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.307 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.312 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.314 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.315 187189 INFO os_vif [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:84:0e:ef,bridge_name='br-int',has_traffic_filtering=True,id=95c265c1-1b57-4a91-af2e-9d38073be611,network=Network(e25d5113-a42d-44ca-8e65-a777d9e11f48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap95c265c1-1b')
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.367 187189 DEBUG nova.virt.libvirt.driver [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.367 187189 DEBUG nova.virt.libvirt.driver [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.368 187189 DEBUG nova.virt.libvirt.driver [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] No VIF found with MAC fa:16:3e:84:0e:ef, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 07:58:00 compute-0 nova_compute[187185]: 2025-11-29 07:58:00.368 187189 INFO nova.virt.libvirt.driver [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Using config drive
Nov 29 07:58:01 compute-0 nova_compute[187185]: 2025-11-29 07:58:01.443 187189 INFO nova.virt.libvirt.driver [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Creating config drive at /var/lib/nova/instances/da8e5049-4048-486b-a0d4-9d9c53478c31/disk.config
Nov 29 07:58:01 compute-0 nova_compute[187185]: 2025-11-29 07:58:01.450 187189 DEBUG oslo_concurrency.processutils [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/da8e5049-4048-486b-a0d4-9d9c53478c31/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpznrgzup2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:58:01 compute-0 nova_compute[187185]: 2025-11-29 07:58:01.581 187189 DEBUG oslo_concurrency.processutils [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/da8e5049-4048-486b-a0d4-9d9c53478c31/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpznrgzup2" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:58:01 compute-0 kernel: tap95c265c1-1b: entered promiscuous mode
Nov 29 07:58:01 compute-0 NetworkManager[55227]: <info>  [1764403081.6517] manager: (tap95c265c1-1b): new Tun device (/org/freedesktop/NetworkManager/Devices/318)
Nov 29 07:58:01 compute-0 ovn_controller[95281]: 2025-11-29T07:58:01Z|00612|binding|INFO|Claiming lport 95c265c1-1b57-4a91-af2e-9d38073be611 for this chassis.
Nov 29 07:58:01 compute-0 ovn_controller[95281]: 2025-11-29T07:58:01Z|00613|binding|INFO|95c265c1-1b57-4a91-af2e-9d38073be611: Claiming fa:16:3e:84:0e:ef 10.100.0.10
Nov 29 07:58:01 compute-0 nova_compute[187185]: 2025-11-29 07:58:01.652 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:01 compute-0 nova_compute[187185]: 2025-11-29 07:58:01.659 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:01.667 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:84:0e:ef 10.100.0.10'], port_security=['fa:16:3e:84:0e:ef 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e25d5113-a42d-44ca-8e65-a777d9e11f48', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '09815a98-4b0c-4b42-8c70-30ee20d821a5 155897d4-49e9-4196-9d87-858cab256c02', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e3099386-1dbe-4af7-95d0-de6761c24471, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=95c265c1-1b57-4a91-af2e-9d38073be611) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:01.669 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 95c265c1-1b57-4a91-af2e-9d38073be611 in datapath e25d5113-a42d-44ca-8e65-a777d9e11f48 bound to our chassis
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:01.670 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e25d5113-a42d-44ca-8e65-a777d9e11f48
Nov 29 07:58:01 compute-0 systemd-udevd[249903]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:01.685 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[f09faada-9966-44cc-aa23-54b5a33b2f90]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:01.687 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape25d5113-a1 in ovnmeta-e25d5113-a42d-44ca-8e65-a777d9e11f48 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:01.689 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape25d5113-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:01.689 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[1d113c9c-fb4b-4e3a-8966-6d9556ad5073]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:01.690 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[c771d215-64b2-4ef9-a02d-42c25bce38da]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:58:01 compute-0 NetworkManager[55227]: <info>  [1764403081.7063] device (tap95c265c1-1b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 07:58:01 compute-0 NetworkManager[55227]: <info>  [1764403081.7086] device (tap95c265c1-1b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:01.707 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[6c87d89b-c1fd-4bb0-830a-93addaf10fdf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:58:01 compute-0 nova_compute[187185]: 2025-11-29 07:58:01.718 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:01 compute-0 ovn_controller[95281]: 2025-11-29T07:58:01Z|00614|binding|INFO|Setting lport 95c265c1-1b57-4a91-af2e-9d38073be611 ovn-installed in OVS
Nov 29 07:58:01 compute-0 systemd-machined[153486]: New machine qemu-70-instance-000000b5.
Nov 29 07:58:01 compute-0 ovn_controller[95281]: 2025-11-29T07:58:01Z|00615|binding|INFO|Setting lport 95c265c1-1b57-4a91-af2e-9d38073be611 up in Southbound
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:01.723 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[21088cc9-3e99-44ed-8658-6974b3843744]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:58:01 compute-0 nova_compute[187185]: 2025-11-29 07:58:01.726 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:01 compute-0 systemd[1]: Started Virtual Machine qemu-70-instance-000000b5.
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:01.761 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[f91bc7fa-5456-460c-88d5-6b155bed588b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:58:01 compute-0 NetworkManager[55227]: <info>  [1764403081.7682] manager: (tape25d5113-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/319)
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:01.767 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d692acda-e4d1-4de2-86be-d971e9b3289d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:01.801 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[26791762-b374-472d-95e1-916e8d9475b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:01.805 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[c1949c88-0040-4807-9b17-f31152a1fe50]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:58:01 compute-0 NetworkManager[55227]: <info>  [1764403081.8301] device (tape25d5113-a0): carrier: link connected
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:01.838 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[005743ba-8dc0-4da2-969d-12a8b5320961]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:01.857 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[44f2ed8c-968e-4bc1-86bd-29cb1ac6592e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape25d5113-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9f:60:0a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 190], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 852949, 'reachable_time': 24294, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249938, 'error': None, 'target': 'ovnmeta-e25d5113-a42d-44ca-8e65-a777d9e11f48', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:01.873 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[323ed20f-4678-4dbf-810c-7909d8ece47e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9f:600a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 852949, 'tstamp': 852949}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 249939, 'error': None, 'target': 'ovnmeta-e25d5113-a42d-44ca-8e65-a777d9e11f48', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:01.890 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[0d914e45-66ac-4ed5-b462-eda284b96cfb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape25d5113-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9f:60:0a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 190], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 852949, 'reachable_time': 24294, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 249940, 'error': None, 'target': 'ovnmeta-e25d5113-a42d-44ca-8e65-a777d9e11f48', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:01.920 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[9b072948-66d0-40a0-bd36-e6cad88bb1a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:01.979 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[feac0355-e468-4e7b-ac96-e3dc78e05fcb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:01.981 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape25d5113-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:01.981 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:01.982 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape25d5113-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:58:01 compute-0 kernel: tape25d5113-a0: entered promiscuous mode
Nov 29 07:58:01 compute-0 NetworkManager[55227]: <info>  [1764403081.9850] manager: (tape25d5113-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/320)
Nov 29 07:58:01 compute-0 nova_compute[187185]: 2025-11-29 07:58:01.984 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:01 compute-0 nova_compute[187185]: 2025-11-29 07:58:01.987 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:01.988 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape25d5113-a0, col_values=(('external_ids', {'iface-id': '75a0c6b7-2dfb-46f1-937c-112eb9b0d504'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:58:01 compute-0 nova_compute[187185]: 2025-11-29 07:58:01.989 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:01 compute-0 ovn_controller[95281]: 2025-11-29T07:58:01Z|00616|binding|INFO|Releasing lport 75a0c6b7-2dfb-46f1-937c-112eb9b0d504 from this chassis (sb_readonly=0)
Nov 29 07:58:01 compute-0 nova_compute[187185]: 2025-11-29 07:58:01.990 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:01.991 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e25d5113-a42d-44ca-8e65-a777d9e11f48.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e25d5113-a42d-44ca-8e65-a777d9e11f48.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:01.992 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[795f5e4e-08fa-4998-ac14-528f0d6b7e4c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:01.993 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]: global
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-e25d5113-a42d-44ca-8e65-a777d9e11f48
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/e25d5113-a42d-44ca-8e65-a777d9e11f48.pid.haproxy
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]: 
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID e25d5113-a42d-44ca-8e65-a777d9e11f48
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 07:58:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:01.994 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e25d5113-a42d-44ca-8e65-a777d9e11f48', 'env', 'PROCESS_TAG=haproxy-e25d5113-a42d-44ca-8e65-a777d9e11f48', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e25d5113-a42d-44ca-8e65-a777d9e11f48.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 07:58:02 compute-0 nova_compute[187185]: 2025-11-29 07:58:02.003 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:02 compute-0 nova_compute[187185]: 2025-11-29 07:58:02.069 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764403082.068435, da8e5049-4048-486b-a0d4-9d9c53478c31 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:58:02 compute-0 nova_compute[187185]: 2025-11-29 07:58:02.070 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] VM Started (Lifecycle Event)
Nov 29 07:58:02 compute-0 nova_compute[187185]: 2025-11-29 07:58:02.087 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:58:02 compute-0 nova_compute[187185]: 2025-11-29 07:58:02.092 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764403082.068702, da8e5049-4048-486b-a0d4-9d9c53478c31 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:58:02 compute-0 nova_compute[187185]: 2025-11-29 07:58:02.093 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] VM Paused (Lifecycle Event)
Nov 29 07:58:02 compute-0 nova_compute[187185]: 2025-11-29 07:58:02.109 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:58:02 compute-0 nova_compute[187185]: 2025-11-29 07:58:02.112 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:58:02 compute-0 nova_compute[187185]: 2025-11-29 07:58:02.137 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:58:02 compute-0 nova_compute[187185]: 2025-11-29 07:58:02.400 187189 DEBUG nova.compute.manager [req-56b209b7-6fbd-4b9a-8f3b-687f6d513031 req-569306e0-a8c5-4988-99c8-82569edd9240 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Received event network-vif-plugged-95c265c1-1b57-4a91-af2e-9d38073be611 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:58:02 compute-0 nova_compute[187185]: 2025-11-29 07:58:02.401 187189 DEBUG oslo_concurrency.lockutils [req-56b209b7-6fbd-4b9a-8f3b-687f6d513031 req-569306e0-a8c5-4988-99c8-82569edd9240 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "da8e5049-4048-486b-a0d4-9d9c53478c31-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:58:02 compute-0 nova_compute[187185]: 2025-11-29 07:58:02.402 187189 DEBUG oslo_concurrency.lockutils [req-56b209b7-6fbd-4b9a-8f3b-687f6d513031 req-569306e0-a8c5-4988-99c8-82569edd9240 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "da8e5049-4048-486b-a0d4-9d9c53478c31-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:58:02 compute-0 nova_compute[187185]: 2025-11-29 07:58:02.402 187189 DEBUG oslo_concurrency.lockutils [req-56b209b7-6fbd-4b9a-8f3b-687f6d513031 req-569306e0-a8c5-4988-99c8-82569edd9240 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "da8e5049-4048-486b-a0d4-9d9c53478c31-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:58:02 compute-0 nova_compute[187185]: 2025-11-29 07:58:02.402 187189 DEBUG nova.compute.manager [req-56b209b7-6fbd-4b9a-8f3b-687f6d513031 req-569306e0-a8c5-4988-99c8-82569edd9240 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Processing event network-vif-plugged-95c265c1-1b57-4a91-af2e-9d38073be611 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 07:58:02 compute-0 nova_compute[187185]: 2025-11-29 07:58:02.403 187189 DEBUG nova.compute.manager [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 07:58:02 compute-0 nova_compute[187185]: 2025-11-29 07:58:02.408 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764403082.4079032, da8e5049-4048-486b-a0d4-9d9c53478c31 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:58:02 compute-0 nova_compute[187185]: 2025-11-29 07:58:02.409 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] VM Resumed (Lifecycle Event)
Nov 29 07:58:02 compute-0 nova_compute[187185]: 2025-11-29 07:58:02.411 187189 DEBUG nova.virt.libvirt.driver [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 07:58:02 compute-0 nova_compute[187185]: 2025-11-29 07:58:02.416 187189 INFO nova.virt.libvirt.driver [-] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Instance spawned successfully.
Nov 29 07:58:02 compute-0 nova_compute[187185]: 2025-11-29 07:58:02.417 187189 DEBUG nova.virt.libvirt.driver [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 07:58:02 compute-0 nova_compute[187185]: 2025-11-29 07:58:02.435 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:58:02 compute-0 nova_compute[187185]: 2025-11-29 07:58:02.441 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 07:58:02 compute-0 nova_compute[187185]: 2025-11-29 07:58:02.445 187189 DEBUG nova.virt.libvirt.driver [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:58:02 compute-0 nova_compute[187185]: 2025-11-29 07:58:02.446 187189 DEBUG nova.virt.libvirt.driver [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:58:02 compute-0 nova_compute[187185]: 2025-11-29 07:58:02.446 187189 DEBUG nova.virt.libvirt.driver [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:58:02 compute-0 nova_compute[187185]: 2025-11-29 07:58:02.447 187189 DEBUG nova.virt.libvirt.driver [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:58:02 compute-0 nova_compute[187185]: 2025-11-29 07:58:02.447 187189 DEBUG nova.virt.libvirt.driver [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:58:02 compute-0 nova_compute[187185]: 2025-11-29 07:58:02.447 187189 DEBUG nova.virt.libvirt.driver [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 07:58:02 compute-0 podman[249979]: 2025-11-29 07:58:02.368362037 +0000 UTC m=+0.036453021 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 07:58:02 compute-0 nova_compute[187185]: 2025-11-29 07:58:02.488 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 07:58:02 compute-0 nova_compute[187185]: 2025-11-29 07:58:02.549 187189 INFO nova.compute.manager [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Took 9.34 seconds to spawn the instance on the hypervisor.
Nov 29 07:58:02 compute-0 nova_compute[187185]: 2025-11-29 07:58:02.550 187189 DEBUG nova.compute.manager [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:58:02 compute-0 podman[249979]: 2025-11-29 07:58:02.572570047 +0000 UTC m=+0.240660981 container create acaa5b8a76597b3228cbd92c11c10ecf9de4afb13c0013d108721329cc8457fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e25d5113-a42d-44ca-8e65-a777d9e11f48, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:58:02 compute-0 systemd[1]: Started libpod-conmon-acaa5b8a76597b3228cbd92c11c10ecf9de4afb13c0013d108721329cc8457fc.scope.
Nov 29 07:58:02 compute-0 systemd[1]: Started libcrun container.
Nov 29 07:58:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a4fd49dcbc60113e256e9813ae8cc73eb0e2f41aa746d73d0d9d77da4a25fff2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 07:58:02 compute-0 podman[249979]: 2025-11-29 07:58:02.700480063 +0000 UTC m=+0.368571087 container init acaa5b8a76597b3228cbd92c11c10ecf9de4afb13c0013d108721329cc8457fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e25d5113-a42d-44ca-8e65-a777d9e11f48, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:58:02 compute-0 podman[249979]: 2025-11-29 07:58:02.707026319 +0000 UTC m=+0.375117293 container start acaa5b8a76597b3228cbd92c11c10ecf9de4afb13c0013d108721329cc8457fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e25d5113-a42d-44ca-8e65-a777d9e11f48, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 29 07:58:02 compute-0 neutron-haproxy-ovnmeta-e25d5113-a42d-44ca-8e65-a777d9e11f48[249994]: [NOTICE]   (249998) : New worker (250000) forked
Nov 29 07:58:02 compute-0 neutron-haproxy-ovnmeta-e25d5113-a42d-44ca-8e65-a777d9e11f48[249994]: [NOTICE]   (249998) : Loading success.
Nov 29 07:58:02 compute-0 nova_compute[187185]: 2025-11-29 07:58:02.796 187189 INFO nova.compute.manager [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Took 10.04 seconds to build instance.
Nov 29 07:58:02 compute-0 nova_compute[187185]: 2025-11-29 07:58:02.822 187189 DEBUG oslo_concurrency.lockutils [None req-1f3e69dd-1768-4a84-89ef-9b70f6a83a7b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "da8e5049-4048-486b-a0d4-9d9c53478c31" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.152s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:58:03 compute-0 nova_compute[187185]: 2025-11-29 07:58:03.471 187189 DEBUG nova.network.neutron [req-1c4471d3-36d6-48f9-bab5-cbc87f347555 req-2016ffe2-bc02-4723-97a5-8cc142cdef7a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Updated VIF entry in instance network info cache for port 95c265c1-1b57-4a91-af2e-9d38073be611. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:58:03 compute-0 nova_compute[187185]: 2025-11-29 07:58:03.473 187189 DEBUG nova.network.neutron [req-1c4471d3-36d6-48f9-bab5-cbc87f347555 req-2016ffe2-bc02-4723-97a5-8cc142cdef7a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Updating instance_info_cache with network_info: [{"id": "95c265c1-1b57-4a91-af2e-9d38073be611", "address": "fa:16:3e:84:0e:ef", "network": {"id": "e25d5113-a42d-44ca-8e65-a777d9e11f48", "bridge": "br-int", "label": "tempest-network-smoke--1591284109", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95c265c1-1b", "ovs_interfaceid": "95c265c1-1b57-4a91-af2e-9d38073be611", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:58:03 compute-0 nova_compute[187185]: 2025-11-29 07:58:03.488 187189 DEBUG oslo_concurrency.lockutils [req-1c4471d3-36d6-48f9-bab5-cbc87f347555 req-2016ffe2-bc02-4723-97a5-8cc142cdef7a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-da8e5049-4048-486b-a0d4-9d9c53478c31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:58:04 compute-0 nova_compute[187185]: 2025-11-29 07:58:04.247 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:04 compute-0 nova_compute[187185]: 2025-11-29 07:58:04.499 187189 DEBUG nova.compute.manager [req-53f5457e-4ed9-4770-8bc1-5c99c32ce7a3 req-db17791a-c5b8-4fcd-8418-9c7db08c52b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Received event network-vif-plugged-95c265c1-1b57-4a91-af2e-9d38073be611 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:58:04 compute-0 nova_compute[187185]: 2025-11-29 07:58:04.501 187189 DEBUG oslo_concurrency.lockutils [req-53f5457e-4ed9-4770-8bc1-5c99c32ce7a3 req-db17791a-c5b8-4fcd-8418-9c7db08c52b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "da8e5049-4048-486b-a0d4-9d9c53478c31-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:58:04 compute-0 nova_compute[187185]: 2025-11-29 07:58:04.501 187189 DEBUG oslo_concurrency.lockutils [req-53f5457e-4ed9-4770-8bc1-5c99c32ce7a3 req-db17791a-c5b8-4fcd-8418-9c7db08c52b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "da8e5049-4048-486b-a0d4-9d9c53478c31-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:58:04 compute-0 nova_compute[187185]: 2025-11-29 07:58:04.501 187189 DEBUG oslo_concurrency.lockutils [req-53f5457e-4ed9-4770-8bc1-5c99c32ce7a3 req-db17791a-c5b8-4fcd-8418-9c7db08c52b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "da8e5049-4048-486b-a0d4-9d9c53478c31-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:58:04 compute-0 nova_compute[187185]: 2025-11-29 07:58:04.502 187189 DEBUG nova.compute.manager [req-53f5457e-4ed9-4770-8bc1-5c99c32ce7a3 req-db17791a-c5b8-4fcd-8418-9c7db08c52b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] No waiting events found dispatching network-vif-plugged-95c265c1-1b57-4a91-af2e-9d38073be611 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:58:04 compute-0 nova_compute[187185]: 2025-11-29 07:58:04.502 187189 WARNING nova.compute.manager [req-53f5457e-4ed9-4770-8bc1-5c99c32ce7a3 req-db17791a-c5b8-4fcd-8418-9c7db08c52b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Received unexpected event network-vif-plugged-95c265c1-1b57-4a91-af2e-9d38073be611 for instance with vm_state active and task_state None.
Nov 29 07:58:05 compute-0 nova_compute[187185]: 2025-11-29 07:58:05.305 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:06 compute-0 nova_compute[187185]: 2025-11-29 07:58:06.527 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:06 compute-0 NetworkManager[55227]: <info>  [1764403086.5293] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/321)
Nov 29 07:58:06 compute-0 NetworkManager[55227]: <info>  [1764403086.5314] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/322)
Nov 29 07:58:06 compute-0 ovn_controller[95281]: 2025-11-29T07:58:06Z|00617|binding|INFO|Releasing lport 75a0c6b7-2dfb-46f1-937c-112eb9b0d504 from this chassis (sb_readonly=0)
Nov 29 07:58:06 compute-0 ovn_controller[95281]: 2025-11-29T07:58:06Z|00618|binding|INFO|Releasing lport 75a0c6b7-2dfb-46f1-937c-112eb9b0d504 from this chassis (sb_readonly=0)
Nov 29 07:58:06 compute-0 nova_compute[187185]: 2025-11-29 07:58:06.696 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:06 compute-0 podman[250010]: 2025-11-29 07:58:06.798912807 +0000 UTC m=+0.058347714 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 07:58:06 compute-0 nova_compute[187185]: 2025-11-29 07:58:06.913 187189 DEBUG nova.compute.manager [req-eb860f60-48a2-4e68-b070-7ff3ef38f603 req-dec7487f-5493-4bfe-82e4-e7a62716d6e6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Received event network-changed-95c265c1-1b57-4a91-af2e-9d38073be611 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:58:06 compute-0 nova_compute[187185]: 2025-11-29 07:58:06.914 187189 DEBUG nova.compute.manager [req-eb860f60-48a2-4e68-b070-7ff3ef38f603 req-dec7487f-5493-4bfe-82e4-e7a62716d6e6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Refreshing instance network info cache due to event network-changed-95c265c1-1b57-4a91-af2e-9d38073be611. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:58:06 compute-0 nova_compute[187185]: 2025-11-29 07:58:06.914 187189 DEBUG oslo_concurrency.lockutils [req-eb860f60-48a2-4e68-b070-7ff3ef38f603 req-dec7487f-5493-4bfe-82e4-e7a62716d6e6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-da8e5049-4048-486b-a0d4-9d9c53478c31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:58:06 compute-0 nova_compute[187185]: 2025-11-29 07:58:06.914 187189 DEBUG oslo_concurrency.lockutils [req-eb860f60-48a2-4e68-b070-7ff3ef38f603 req-dec7487f-5493-4bfe-82e4-e7a62716d6e6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-da8e5049-4048-486b-a0d4-9d9c53478c31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:58:06 compute-0 nova_compute[187185]: 2025-11-29 07:58:06.914 187189 DEBUG nova.network.neutron [req-eb860f60-48a2-4e68-b070-7ff3ef38f603 req-dec7487f-5493-4bfe-82e4-e7a62716d6e6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Refreshing network info cache for port 95c265c1-1b57-4a91-af2e-9d38073be611 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:58:08 compute-0 nova_compute[187185]: 2025-11-29 07:58:08.557 187189 DEBUG nova.network.neutron [req-eb860f60-48a2-4e68-b070-7ff3ef38f603 req-dec7487f-5493-4bfe-82e4-e7a62716d6e6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Updated VIF entry in instance network info cache for port 95c265c1-1b57-4a91-af2e-9d38073be611. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:58:08 compute-0 nova_compute[187185]: 2025-11-29 07:58:08.558 187189 DEBUG nova.network.neutron [req-eb860f60-48a2-4e68-b070-7ff3ef38f603 req-dec7487f-5493-4bfe-82e4-e7a62716d6e6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Updating instance_info_cache with network_info: [{"id": "95c265c1-1b57-4a91-af2e-9d38073be611", "address": "fa:16:3e:84:0e:ef", "network": {"id": "e25d5113-a42d-44ca-8e65-a777d9e11f48", "bridge": "br-int", "label": "tempest-network-smoke--1591284109", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95c265c1-1b", "ovs_interfaceid": "95c265c1-1b57-4a91-af2e-9d38073be611", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:58:08 compute-0 nova_compute[187185]: 2025-11-29 07:58:08.578 187189 DEBUG oslo_concurrency.lockutils [req-eb860f60-48a2-4e68-b070-7ff3ef38f603 req-dec7487f-5493-4bfe-82e4-e7a62716d6e6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-da8e5049-4048-486b-a0d4-9d9c53478c31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:58:08 compute-0 podman[250036]: 2025-11-29 07:58:08.80902243 +0000 UTC m=+0.066346611 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 29 07:58:08 compute-0 podman[250037]: 2025-11-29 07:58:08.819755446 +0000 UTC m=+0.074340429 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm)
Nov 29 07:58:09 compute-0 sshd-session[250034]: Invalid user production from 20.255.62.58 port 42078
Nov 29 07:58:09 compute-0 nova_compute[187185]: 2025-11-29 07:58:09.284 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:09 compute-0 sshd-session[250034]: Received disconnect from 20.255.62.58 port 42078:11: Bye Bye [preauth]
Nov 29 07:58:09 compute-0 sshd-session[250034]: Disconnected from invalid user production 20.255.62.58 port 42078 [preauth]
Nov 29 07:58:10 compute-0 nova_compute[187185]: 2025-11-29 07:58:10.352 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:14 compute-0 nova_compute[187185]: 2025-11-29 07:58:14.359 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:15 compute-0 nova_compute[187185]: 2025-11-29 07:58:15.356 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:15 compute-0 ovn_controller[95281]: 2025-11-29T07:58:15Z|00085|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:84:0e:ef 10.100.0.10
Nov 29 07:58:15 compute-0 ovn_controller[95281]: 2025-11-29T07:58:15Z|00086|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:84:0e:ef 10.100.0.10
Nov 29 07:58:19 compute-0 nova_compute[187185]: 2025-11-29 07:58:19.361 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:19 compute-0 podman[250100]: 2025-11-29 07:58:19.810154446 +0000 UTC m=+0.055191844 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 07:58:19 compute-0 podman[250098]: 2025-11-29 07:58:19.818146094 +0000 UTC m=+0.067782033 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:58:19 compute-0 podman[250099]: 2025-11-29 07:58:19.81941404 +0000 UTC m=+0.067851895 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-type=git, architecture=x86_64, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, container_name=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal)
Nov 29 07:58:20 compute-0 nova_compute[187185]: 2025-11-29 07:58:20.358 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:24 compute-0 nova_compute[187185]: 2025-11-29 07:58:24.365 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:25 compute-0 nova_compute[187185]: 2025-11-29 07:58:25.361 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:25.758 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:58:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:25.760 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:58:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:25.761 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:58:26 compute-0 nova_compute[187185]: 2025-11-29 07:58:26.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:58:26 compute-0 nova_compute[187185]: 2025-11-29 07:58:26.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 29 07:58:26 compute-0 nova_compute[187185]: 2025-11-29 07:58:26.335 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 29 07:58:28 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:28.780 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=77, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=76) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:58:28 compute-0 nova_compute[187185]: 2025-11-29 07:58:28.780 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:28 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:28.781 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:58:28 compute-0 podman[250160]: 2025-11-29 07:58:28.826015788 +0000 UTC m=+0.085666993 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Nov 29 07:58:29 compute-0 nova_compute[187185]: 2025-11-29 07:58:29.367 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:30 compute-0 nova_compute[187185]: 2025-11-29 07:58:30.365 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:33 compute-0 nova_compute[187185]: 2025-11-29 07:58:33.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:58:34 compute-0 nova_compute[187185]: 2025-11-29 07:58:34.369 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:35 compute-0 nova_compute[187185]: 2025-11-29 07:58:35.367 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:35 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:35.784 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '77'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:58:36 compute-0 sshd-session[250186]: Received disconnect from 190.181.27.27 port 53874:11: Bye Bye [preauth]
Nov 29 07:58:36 compute-0 sshd-session[250186]: Disconnected from authenticating user root 190.181.27.27 port 53874 [preauth]
Nov 29 07:58:37 compute-0 podman[250188]: 2025-11-29 07:58:37.806399957 +0000 UTC m=+0.057926482 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 07:58:39 compute-0 nova_compute[187185]: 2025-11-29 07:58:39.332 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:58:39 compute-0 nova_compute[187185]: 2025-11-29 07:58:39.373 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:39 compute-0 podman[250213]: 2025-11-29 07:58:39.815060089 +0000 UTC m=+0.071449128 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Nov 29 07:58:39 compute-0 podman[250212]: 2025-11-29 07:58:39.838863007 +0000 UTC m=+0.088036830 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, org.label-schema.schema-version=1.0, config_id=multipathd)
Nov 29 07:58:40 compute-0 nova_compute[187185]: 2025-11-29 07:58:40.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:58:40 compute-0 nova_compute[187185]: 2025-11-29 07:58:40.371 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:43 compute-0 nova_compute[187185]: 2025-11-29 07:58:43.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:58:43 compute-0 nova_compute[187185]: 2025-11-29 07:58:43.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:58:43 compute-0 nova_compute[187185]: 2025-11-29 07:58:43.346 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:58:43 compute-0 nova_compute[187185]: 2025-11-29 07:58:43.346 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:58:43 compute-0 nova_compute[187185]: 2025-11-29 07:58:43.346 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:58:43 compute-0 nova_compute[187185]: 2025-11-29 07:58:43.346 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:58:43 compute-0 nova_compute[187185]: 2025-11-29 07:58:43.432 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/da8e5049-4048-486b-a0d4-9d9c53478c31/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:58:43 compute-0 nova_compute[187185]: 2025-11-29 07:58:43.493 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/da8e5049-4048-486b-a0d4-9d9c53478c31/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:58:43 compute-0 nova_compute[187185]: 2025-11-29 07:58:43.494 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/da8e5049-4048-486b-a0d4-9d9c53478c31/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 07:58:43 compute-0 nova_compute[187185]: 2025-11-29 07:58:43.554 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/da8e5049-4048-486b-a0d4-9d9c53478c31/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 07:58:43 compute-0 nova_compute[187185]: 2025-11-29 07:58:43.738 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:58:43 compute-0 nova_compute[187185]: 2025-11-29 07:58:43.740 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5565MB free_disk=73.22222518920898GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:58:43 compute-0 nova_compute[187185]: 2025-11-29 07:58:43.740 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:58:43 compute-0 nova_compute[187185]: 2025-11-29 07:58:43.740 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:58:43 compute-0 nova_compute[187185]: 2025-11-29 07:58:43.815 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance da8e5049-4048-486b-a0d4-9d9c53478c31 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 07:58:43 compute-0 nova_compute[187185]: 2025-11-29 07:58:43.815 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:58:43 compute-0 nova_compute[187185]: 2025-11-29 07:58:43.815 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:58:43 compute-0 nova_compute[187185]: 2025-11-29 07:58:43.855 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:58:43 compute-0 nova_compute[187185]: 2025-11-29 07:58:43.875 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:58:43 compute-0 nova_compute[187185]: 2025-11-29 07:58:43.897 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:58:43 compute-0 nova_compute[187185]: 2025-11-29 07:58:43.898 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:58:44 compute-0 nova_compute[187185]: 2025-11-29 07:58:44.376 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:45 compute-0 nova_compute[187185]: 2025-11-29 07:58:45.373 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:46 compute-0 nova_compute[187185]: 2025-11-29 07:58:46.898 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:58:46 compute-0 nova_compute[187185]: 2025-11-29 07:58:46.899 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.026 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'da8e5049-4048-486b-a0d4-9d9c53478c31', 'name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1552512911', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000b5', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'hostId': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.027 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.068 12 DEBUG ceilometer.compute.pollsters [-] da8e5049-4048-486b-a0d4-9d9c53478c31/disk.device.write.requests volume: 327 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.069 12 DEBUG ceilometer.compute.pollsters [-] da8e5049-4048-486b-a0d4-9d9c53478c31/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c55a437f-4db3-404d-9bee-fb636e044906', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 327, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31-vda', 'timestamp': '2025-11-29T07:58:48.028069', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1552512911', 'name': 'instance-000000b5', 'instance_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3c657e86-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8575.746321151, 'message_signature': '036cfb873bf0d0131f3a63296a66c085d19a615443875b54990caa526fd1364c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 
'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31-sda', 'timestamp': '2025-11-29T07:58:48.028069', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1552512911', 'name': 'instance-000000b5', 'instance_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3c659d1c-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8575.746321151, 'message_signature': '8628ed7a859d68151162c21be7329d0deb14ca96246aa6de660b7978c1db2a36'}]}, 'timestamp': '2025-11-29 07:58:48.070421', '_unique_id': '88cb118ceed2415e8b248f208e4b948b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.073 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.075 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.076 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.076 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1552512911>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1552512911>]
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.077 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.077 12 DEBUG ceilometer.compute.pollsters [-] da8e5049-4048-486b-a0d4-9d9c53478c31/disk.device.write.bytes volume: 73003008 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.078 12 DEBUG ceilometer.compute.pollsters [-] da8e5049-4048-486b-a0d4-9d9c53478c31/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9d308d22-3275-4ced-92b2-5ae6095abbd0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73003008, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31-vda', 'timestamp': '2025-11-29T07:58:48.077357', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1552512911', 'name': 'instance-000000b5', 'instance_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3c66c854-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8575.746321151, 'message_signature': 'e44ea3d02251a475f45cb3cbd38bebcfc324b26eba0372e255a2a547abfafd94'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 
'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31-sda', 'timestamp': '2025-11-29T07:58:48.077357', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1552512911', 'name': 'instance-000000b5', 'instance_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3c66e3ac-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8575.746321151, 'message_signature': '0106c5e26bcc5509c0aea0b2012f67af9471d86b5ca51ce60bef323fd7724b84'}]}, 'timestamp': '2025-11-29 07:58:48.078670', '_unique_id': 'f8c8e76771804f4c907dfeeca1827ea8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.080 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.082 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.088 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for da8e5049-4048-486b-a0d4-9d9c53478c31 / tap95c265c1-1b inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.090 12 DEBUG ceilometer.compute.pollsters [-] da8e5049-4048-486b-a0d4-9d9c53478c31/network.incoming.packets volume: 172 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3de91267-f4fa-4965-a19f-a85a81a33684', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 172, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b5-da8e5049-4048-486b-a0d4-9d9c53478c31-tap95c265c1-1b', 'timestamp': '2025-11-29T07:58:48.082951', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1552512911', 'name': 'tap95c265c1-1b', 'instance_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:84:0e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap95c265c1-1b'}, 'message_id': '3c68b95c-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8575.801306408, 'message_signature': '89a079caddd7c28ff9731b4dcc4204803bc7305651fa0b40a9d24d014bf0b36d'}]}, 'timestamp': '2025-11-29 07:58:48.090750', '_unique_id': '5250bbc4c40147e39dfb29d4fd00461e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.092 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.094 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.112 12 DEBUG ceilometer.compute.pollsters [-] da8e5049-4048-486b-a0d4-9d9c53478c31/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.113 12 DEBUG ceilometer.compute.pollsters [-] da8e5049-4048-486b-a0d4-9d9c53478c31/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '03a3cfa3-c88f-43d0-837b-588081c1d4f9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31-vda', 'timestamp': '2025-11-29T07:58:48.094864', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1552512911', 'name': 'instance-000000b5', 'instance_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3c6c27cc-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8575.813195227, 'message_signature': '71c6424dec6591011f3e8bf57024d20c825cc6b6b987aae0582b3ad98059fec3'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31-sda', 'timestamp': '2025-11-29T07:58:48.094864', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1552512911', 'name': 'instance-000000b5', 'instance_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3c6c3712-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8575.813195227, 'message_signature': '84c650a6177bfd99e8aa6841448ff92f0024b599f9b791578f981f75ae3a0b1d'}]}, 'timestamp': '2025-11-29 07:58:48.113458', '_unique_id': '74afe0cb61014727a376234aed42182f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.114 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.115 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.115 12 DEBUG ceilometer.compute.pollsters [-] da8e5049-4048-486b-a0d4-9d9c53478c31/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8f3bb3d1-2555-428c-97eb-7303671892cd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b5-da8e5049-4048-486b-a0d4-9d9c53478c31-tap95c265c1-1b', 'timestamp': '2025-11-29T07:58:48.115811', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1552512911', 'name': 'tap95c265c1-1b', 'instance_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:84:0e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap95c265c1-1b'}, 'message_id': '3c6c9eaa-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8575.801306408, 'message_signature': 'f3d31c83ac099366d3519949a018763af076e9bd271095e686035213ee497737'}]}, 'timestamp': '2025-11-29 07:58:48.116103', '_unique_id': 'f7357e3ab5ae471680aaa6371827d5d3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.116 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.117 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.117 12 DEBUG ceilometer.compute.pollsters [-] da8e5049-4048-486b-a0d4-9d9c53478c31/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '213e53ec-1fe8-4b18-b586-9ac95cd051bf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b5-da8e5049-4048-486b-a0d4-9d9c53478c31-tap95c265c1-1b', 'timestamp': '2025-11-29T07:58:48.117259', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1552512911', 'name': 'tap95c265c1-1b', 'instance_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:84:0e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap95c265c1-1b'}, 'message_id': '3c6cd708-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8575.801306408, 'message_signature': 'b290679ab490490616d53d70447d74f5c950b576a56b1af6ca292d76d1a5569c'}]}, 'timestamp': '2025-11-29 07:58:48.117538', '_unique_id': '005ed9474af64b9e8e3c2512130913ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 DEBUG ceilometer.compute.pollsters [-] da8e5049-4048-486b-a0d4-9d9c53478c31/disk.device.read.bytes volume: 30366208 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.118 12 DEBUG ceilometer.compute.pollsters [-] da8e5049-4048-486b-a0d4-9d9c53478c31/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4bd551a3-a0a3-4186-ac94-29cccbb2b43d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30366208, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31-vda', 'timestamp': '2025-11-29T07:58:48.118707', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1552512911', 'name': 'instance-000000b5', 'instance_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3c6d0e62-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8575.746321151, 'message_signature': '5690955a615696b640fd71a4880ba1f5b6d4f32ee95cec63be429cad8769393a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31-sda', 'timestamp': '2025-11-29T07:58:48.118707', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1552512911', 'name': 'instance-000000b5', 'instance_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3c6d16dc-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8575.746321151, 'message_signature': 'dc08a1cd3d25b6759a4a56f4149d11145913cb8573bc4022402375c7c698edf6'}]}, 'timestamp': '2025-11-29 07:58:48.119157', '_unique_id': 'fd77d948eb604ec0acacb1ceb80fc10a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.119 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.120 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.120 12 DEBUG ceilometer.compute.pollsters [-] da8e5049-4048-486b-a0d4-9d9c53478c31/disk.device.write.latency volume: 12351075308 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.120 12 DEBUG ceilometer.compute.pollsters [-] da8e5049-4048-486b-a0d4-9d9c53478c31/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b3234703-f62c-4b34-b1a0-c3ca31aaecb0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12351075308, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31-vda', 'timestamp': '2025-11-29T07:58:48.120352', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1552512911', 'name': 'instance-000000b5', 'instance_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3c6d4de6-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8575.746321151, 'message_signature': '80c83c91f9b6c3d8a436b7f4f9ca769019c4b79d6c6f2822e43251718d763d96'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31-sda', 'timestamp': '2025-11-29T07:58:48.120352', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1552512911', 'name': 'instance-000000b5', 'instance_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3c6d55ca-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8575.746321151, 'message_signature': '70babf455155fbd7fc3724c7272ccddf426357b7a03492047b910afa0d43a9bc'}]}, 'timestamp': '2025-11-29 07:58:48.120772', '_unique_id': '73990ffd352948e19c04c86feae8f822'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.121 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.122 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.122 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.122 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1552512911>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1552512911>]
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.122 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.122 12 DEBUG ceilometer.compute.pollsters [-] da8e5049-4048-486b-a0d4-9d9c53478c31/network.outgoing.packets volume: 190 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6d670357-28ad-4bae-8701-42f1beff7edf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 190, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b5-da8e5049-4048-486b-a0d4-9d9c53478c31-tap95c265c1-1b', 'timestamp': '2025-11-29T07:58:48.122792', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1552512911', 'name': 'tap95c265c1-1b', 'instance_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:84:0e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap95c265c1-1b'}, 'message_id': '3c6dae3a-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8575.801306408, 'message_signature': '9fd6de9245ac69187086943d1ce353786c4463bda287c77ed15ab921083b5c26'}]}, 'timestamp': '2025-11-29 07:58:48.123052', '_unique_id': '30eba943fd5b4c15bd3ac169cca51c2e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.123 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.124 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.124 12 DEBUG ceilometer.compute.pollsters [-] da8e5049-4048-486b-a0d4-9d9c53478c31/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a5ded3b6-f7be-4da3-b0d9-a7f8534b0a25', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b5-da8e5049-4048-486b-a0d4-9d9c53478c31-tap95c265c1-1b', 'timestamp': '2025-11-29T07:58:48.124246', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1552512911', 'name': 'tap95c265c1-1b', 'instance_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:84:0e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap95c265c1-1b'}, 'message_id': '3c6de74c-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8575.801306408, 'message_signature': '781d21ae2529f531fd7737809790917d8da0275c6f7c4abd7948ac4d501df880'}]}, 'timestamp': '2025-11-29 07:58:48.124575', '_unique_id': '5fd0738b3fbe46c180c22dfbf0337c9c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 DEBUG ceilometer.compute.pollsters [-] da8e5049-4048-486b-a0d4-9d9c53478c31/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.125 12 DEBUG ceilometer.compute.pollsters [-] da8e5049-4048-486b-a0d4-9d9c53478c31/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '61611555-60d6-4cd2-828f-417ba875c60c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31-vda', 'timestamp': '2025-11-29T07:58:48.125711', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1552512911', 'name': 'instance-000000b5', 'instance_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3c6e1f1e-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8575.813195227, 'message_signature': '26ca9e31b4d32cdffa666dc21d886efb7b89ba755bc11f6613814a01d6b6f7ee'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31-sda', 'timestamp': '2025-11-29T07:58:48.125711', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1552512911', 'name': 'instance-000000b5', 'instance_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3c6e27a2-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8575.813195227, 'message_signature': 'e6fca82bf2dc23cd1a88490d84871e1ce35d662f30829c073dc24bf60370bac9'}]}, 'timestamp': '2025-11-29 07:58:48.126137', '_unique_id': 'd10f984ad909435e86d4ce43a4e41bc2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.126 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.127 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.127 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.127 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1552512911>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1552512911>]
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.127 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.127 12 DEBUG ceilometer.compute.pollsters [-] da8e5049-4048-486b-a0d4-9d9c53478c31/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2e29aeef-693a-4653-a6cf-a6d505272680', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b5-da8e5049-4048-486b-a0d4-9d9c53478c31-tap95c265c1-1b', 'timestamp': '2025-11-29T07:58:48.127698', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1552512911', 'name': 'tap95c265c1-1b', 'instance_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:84:0e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap95c265c1-1b'}, 'message_id': '3c6e6d5c-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8575.801306408, 'message_signature': '0f85a8ecc655bfb2ec4cc2555eb5866c9220b265a4159c4fcf956a53f84ab8bd'}]}, 'timestamp': '2025-11-29 07:58:48.127965', '_unique_id': '8897179a6c444f7fa691249805f9da5f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.128 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.129 12 DEBUG ceilometer.compute.pollsters [-] da8e5049-4048-486b-a0d4-9d9c53478c31/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.129 12 DEBUG ceilometer.compute.pollsters [-] da8e5049-4048-486b-a0d4-9d9c53478c31/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '28551548-fa01-40ac-a2ee-45c48b2309a6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31-vda', 'timestamp': '2025-11-29T07:58:48.129039', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1552512911', 'name': 'instance-000000b5', 'instance_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3c6ea11e-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8575.813195227, 'message_signature': 'd04f7726920782b883ee312d13ffc6f90f78fec4639267927aa63c0efcf3e041'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31-sda', 'timestamp': '2025-11-29T07:58:48.129039', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1552512911', 'name': 'instance-000000b5', 'instance_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3c6ea89e-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8575.813195227, 'message_signature': '328f6ffab4f30f33f415008065ad6f86160a3cc9561ffc37dfe5aeb93ad6d380'}]}, 'timestamp': '2025-11-29 07:58:48.129485', '_unique_id': '33fe49c976d74f78a1b5b070f1421d84'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.130 12 DEBUG ceilometer.compute.pollsters [-] da8e5049-4048-486b-a0d4-9d9c53478c31/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd00eaea8-890c-41e8-a45e-8821bbc3e1d2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b5-da8e5049-4048-486b-a0d4-9d9c53478c31-tap95c265c1-1b', 'timestamp': '2025-11-29T07:58:48.130749', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1552512911', 'name': 'tap95c265c1-1b', 'instance_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:84:0e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap95c265c1-1b'}, 'message_id': '3c6ee50c-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8575.801306408, 'message_signature': '84f47aa240a83577f9365debb20aaf22318f8e1967058dbcc2380bd5726d74d3'}]}, 'timestamp': '2025-11-29 07:58:48.131006', '_unique_id': 'ee3325afd7f64cb8b46c330355459fc0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.131 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.132 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.132 12 DEBUG ceilometer.compute.pollsters [-] da8e5049-4048-486b-a0d4-9d9c53478c31/network.incoming.bytes volume: 31784 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c38252a5-6630-48c3-95b9-04ba2b5cdf59', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31784, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b5-da8e5049-4048-486b-a0d4-9d9c53478c31-tap95c265c1-1b', 'timestamp': '2025-11-29T07:58:48.132295', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1552512911', 'name': 'tap95c265c1-1b', 'instance_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:84:0e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap95c265c1-1b'}, 'message_id': '3c6f2314-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8575.801306408, 'message_signature': 'edef74cf7da36a3da9b16be49cb0627f0d3e432c120e0ec78840b06a3e29daa7'}]}, 'timestamp': '2025-11-29 07:58:48.132621', '_unique_id': '53971ed08aca491dad1775defeb81b93'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.133 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.134 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.134 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.134 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1552512911>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1552512911>]
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.134 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.134 12 DEBUG ceilometer.compute.pollsters [-] da8e5049-4048-486b-a0d4-9d9c53478c31/disk.device.read.latency volume: 236358599 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.134 12 DEBUG ceilometer.compute.pollsters [-] da8e5049-4048-486b-a0d4-9d9c53478c31/disk.device.read.latency volume: 22719899 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e176bfbf-dd39-406c-9cdb-f7a4f5cbfece', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 236358599, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31-vda', 'timestamp': '2025-11-29T07:58:48.134637', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1552512911', 'name': 'instance-000000b5', 'instance_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3c6f7c56-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8575.746321151, 'message_signature': '6b7397763c272cd7ce4ed020f164b7f79d5b023b87ebc87e5c65c38c972cdf67'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22719899, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 
'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31-sda', 'timestamp': '2025-11-29T07:58:48.134637', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1552512911', 'name': 'instance-000000b5', 'instance_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3c6f8b1a-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8575.746321151, 'message_signature': '584075a9b810181829d93f1c8ef9de479f93e6ca0e4fb74952c5a223e8a4e74f'}]}, 'timestamp': '2025-11-29 07:58:48.135282', '_unique_id': 'c00631c8793340a2b087de4256e66822'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.136 12 DEBUG ceilometer.compute.pollsters [-] da8e5049-4048-486b-a0d4-9d9c53478c31/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e10ef154-e3b4-4ec5-a7c4-6ec8f5349b88', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b5-da8e5049-4048-486b-a0d4-9d9c53478c31-tap95c265c1-1b', 'timestamp': '2025-11-29T07:58:48.136897', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1552512911', 'name': 'tap95c265c1-1b', 'instance_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:84:0e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap95c265c1-1b'}, 'message_id': '3c6fd48a-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8575.801306408, 'message_signature': '7105c0f46cad03f0431facf9356fe632ce5dbeb0c2052d5e692ab6cc694c69e2'}]}, 'timestamp': '2025-11-29 07:58:48.137135', '_unique_id': 'c97eae3b9c4f4f778b192f89ae15fa55'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.137 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.138 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.153 12 DEBUG ceilometer.compute.pollsters [-] da8e5049-4048-486b-a0d4-9d9c53478c31/memory.usage volume: 42.6796875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b4151656-f12b-4b42-96e9-35264bb8c38f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.6796875, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31', 'timestamp': '2025-11-29T07:58:48.138412', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1552512911', 'name': 'instance-000000b5', 'instance_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '3c726b28-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8575.871643393, 'message_signature': '09b30e5f05c2f514a3740aea0a642d0a2d808ab1819f1b12c56d922e95a7490a'}]}, 'timestamp': '2025-11-29 07:58:48.154208', '_unique_id': 'cec605ce384e427089054ea85215ba5f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.155 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.156 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.156 12 DEBUG ceilometer.compute.pollsters [-] da8e5049-4048-486b-a0d4-9d9c53478c31/network.outgoing.bytes volume: 27140 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd3244bba-e24f-49b9-ae85-1a2f7a931dd4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 27140, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b5-da8e5049-4048-486b-a0d4-9d9c53478c31-tap95c265c1-1b', 'timestamp': '2025-11-29T07:58:48.156313', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1552512911', 'name': 'tap95c265c1-1b', 'instance_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:84:0e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap95c265c1-1b'}, 'message_id': '3c72cb5e-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8575.801306408, 'message_signature': 'e39ec195d6fbf578226376c0f4906ebcf62e9dc962dcc43bf56251c28b430ac8'}]}, 'timestamp': '2025-11-29 07:58:48.156565', '_unique_id': 'b332b0c262f74a93aebe1a8451c1b1df'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.157 12 DEBUG ceilometer.compute.pollsters [-] da8e5049-4048-486b-a0d4-9d9c53478c31/cpu volume: 12380000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3bb755c2-e14b-4787-a906-a09a6a21c0be', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12380000000, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31', 'timestamp': '2025-11-29T07:58:48.157742', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1552512911', 'name': 'instance-000000b5', 'instance_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '3c730394-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8575.871643393, 'message_signature': '1b1c720ca5ea54df6109794f48e18a56ff762ff3a1a5e773922e34f143fdb05c'}]}, 'timestamp': '2025-11-29 07:58:48.158020', '_unique_id': '6df63cea71c4458eb5d3874bb4fe290d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.158 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.159 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.159 12 DEBUG ceilometer.compute.pollsters [-] da8e5049-4048-486b-a0d4-9d9c53478c31/disk.device.read.requests volume: 1093 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.159 12 DEBUG ceilometer.compute.pollsters [-] da8e5049-4048-486b-a0d4-9d9c53478c31/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '07aaab68-8594-4be8-a5c7-4d3b3a715285', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1093, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31-vda', 'timestamp': '2025-11-29T07:58:48.159141', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1552512911', 'name': 'instance-000000b5', 'instance_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3c73397c-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8575.746321151, 'message_signature': 'a28bfd135506e255b22259a23039beb096e9375f892f033c8af04f45b89a1d20'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31-sda', 'timestamp': '2025-11-29T07:58:48.159141', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1552512911', 'name': 'instance-000000b5', 'instance_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3c734174-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8575.746321151, 'message_signature': '105a7bf497c5afbbdb39c47662741a8efdc830dea515dd6db336c5f80b07da8a'}]}, 'timestamp': '2025-11-29 07:58:48.159590', '_unique_id': 'f8bafa4393eb45aa8805054f64eef5b8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 07:58:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 07:58:48.160 12 ERROR oslo_messaging.notify.messaging 
Nov 29 07:58:49 compute-0 nova_compute[187185]: 2025-11-29 07:58:49.379 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:50 compute-0 nova_compute[187185]: 2025-11-29 07:58:50.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:58:50 compute-0 nova_compute[187185]: 2025-11-29 07:58:50.377 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:50 compute-0 podman[250259]: 2025-11-29 07:58:50.79392083 +0000 UTC m=+0.059230519 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 07:58:50 compute-0 podman[250260]: 2025-11-29 07:58:50.815932518 +0000 UTC m=+0.075585376 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vcs-type=git, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, distribution-scope=public, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 29 07:58:50 compute-0 podman[250261]: 2025-11-29 07:58:50.833760576 +0000 UTC m=+0.088103632 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 07:58:53 compute-0 nova_compute[187185]: 2025-11-29 07:58:53.311 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:58:53 compute-0 nova_compute[187185]: 2025-11-29 07:58:53.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:58:53 compute-0 nova_compute[187185]: 2025-11-29 07:58:53.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:58:53 compute-0 nova_compute[187185]: 2025-11-29 07:58:53.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:58:53 compute-0 nova_compute[187185]: 2025-11-29 07:58:53.491 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "refresh_cache-da8e5049-4048-486b-a0d4-9d9c53478c31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:58:53 compute-0 nova_compute[187185]: 2025-11-29 07:58:53.492 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquired lock "refresh_cache-da8e5049-4048-486b-a0d4-9d9c53478c31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:58:53 compute-0 nova_compute[187185]: 2025-11-29 07:58:53.493 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 29 07:58:53 compute-0 nova_compute[187185]: 2025-11-29 07:58:53.493 187189 DEBUG nova.objects.instance [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lazy-loading 'info_cache' on Instance uuid da8e5049-4048-486b-a0d4-9d9c53478c31 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:58:54 compute-0 nova_compute[187185]: 2025-11-29 07:58:54.423 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:55 compute-0 nova_compute[187185]: 2025-11-29 07:58:55.380 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:55 compute-0 nova_compute[187185]: 2025-11-29 07:58:55.463 187189 DEBUG nova.compute.manager [req-a850d0e4-a40e-458b-bc25-dd1f2e852696 req-0702d064-a25c-4096-8aa5-99ffe7ff910a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Received event network-changed-95c265c1-1b57-4a91-af2e-9d38073be611 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:58:55 compute-0 nova_compute[187185]: 2025-11-29 07:58:55.464 187189 DEBUG nova.compute.manager [req-a850d0e4-a40e-458b-bc25-dd1f2e852696 req-0702d064-a25c-4096-8aa5-99ffe7ff910a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Refreshing instance network info cache due to event network-changed-95c265c1-1b57-4a91-af2e-9d38073be611. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 07:58:55 compute-0 nova_compute[187185]: 2025-11-29 07:58:55.464 187189 DEBUG oslo_concurrency.lockutils [req-a850d0e4-a40e-458b-bc25-dd1f2e852696 req-0702d064-a25c-4096-8aa5-99ffe7ff910a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-da8e5049-4048-486b-a0d4-9d9c53478c31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 07:58:55 compute-0 nova_compute[187185]: 2025-11-29 07:58:55.879 187189 DEBUG oslo_concurrency.lockutils [None req-c46cde67-72a9-4418-93d6-df5cbd12858b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "da8e5049-4048-486b-a0d4-9d9c53478c31" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:58:55 compute-0 nova_compute[187185]: 2025-11-29 07:58:55.880 187189 DEBUG oslo_concurrency.lockutils [None req-c46cde67-72a9-4418-93d6-df5cbd12858b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "da8e5049-4048-486b-a0d4-9d9c53478c31" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:58:55 compute-0 nova_compute[187185]: 2025-11-29 07:58:55.881 187189 DEBUG oslo_concurrency.lockutils [None req-c46cde67-72a9-4418-93d6-df5cbd12858b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "da8e5049-4048-486b-a0d4-9d9c53478c31-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:58:55 compute-0 nova_compute[187185]: 2025-11-29 07:58:55.881 187189 DEBUG oslo_concurrency.lockutils [None req-c46cde67-72a9-4418-93d6-df5cbd12858b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "da8e5049-4048-486b-a0d4-9d9c53478c31-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:58:55 compute-0 nova_compute[187185]: 2025-11-29 07:58:55.881 187189 DEBUG oslo_concurrency.lockutils [None req-c46cde67-72a9-4418-93d6-df5cbd12858b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "da8e5049-4048-486b-a0d4-9d9c53478c31-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:58:55 compute-0 nova_compute[187185]: 2025-11-29 07:58:55.905 187189 INFO nova.compute.manager [None req-c46cde67-72a9-4418-93d6-df5cbd12858b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Terminating instance
Nov 29 07:58:55 compute-0 nova_compute[187185]: 2025-11-29 07:58:55.918 187189 DEBUG nova.compute.manager [None req-c46cde67-72a9-4418-93d6-df5cbd12858b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 07:58:55 compute-0 kernel: tap95c265c1-1b (unregistering): left promiscuous mode
Nov 29 07:58:55 compute-0 NetworkManager[55227]: <info>  [1764403135.9463] device (tap95c265c1-1b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 07:58:55 compute-0 nova_compute[187185]: 2025-11-29 07:58:55.956 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:55 compute-0 ovn_controller[95281]: 2025-11-29T07:58:55Z|00619|binding|INFO|Releasing lport 95c265c1-1b57-4a91-af2e-9d38073be611 from this chassis (sb_readonly=0)
Nov 29 07:58:55 compute-0 ovn_controller[95281]: 2025-11-29T07:58:55Z|00620|binding|INFO|Setting lport 95c265c1-1b57-4a91-af2e-9d38073be611 down in Southbound
Nov 29 07:58:55 compute-0 ovn_controller[95281]: 2025-11-29T07:58:55Z|00621|binding|INFO|Removing iface tap95c265c1-1b ovn-installed in OVS
Nov 29 07:58:55 compute-0 nova_compute[187185]: 2025-11-29 07:58:55.959 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:55 compute-0 nova_compute[187185]: 2025-11-29 07:58:55.978 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:55.980 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:84:0e:ef 10.100.0.10'], port_security=['fa:16:3e:84:0e:ef 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'da8e5049-4048-486b-a0d4-9d9c53478c31', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e25d5113-a42d-44ca-8e65-a777d9e11f48', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'neutron:revision_number': '4', 'neutron:security_group_ids': '09815a98-4b0c-4b42-8c70-30ee20d821a5 155897d4-49e9-4196-9d87-858cab256c02', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e3099386-1dbe-4af7-95d0-de6761c24471, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=95c265c1-1b57-4a91-af2e-9d38073be611) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:58:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:55.982 104254 INFO neutron.agent.ovn.metadata.agent [-] Port 95c265c1-1b57-4a91-af2e-9d38073be611 in datapath e25d5113-a42d-44ca-8e65-a777d9e11f48 unbound from our chassis
Nov 29 07:58:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:55.987 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e25d5113-a42d-44ca-8e65-a777d9e11f48, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 07:58:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:55.988 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d9fa0772-d762-4c0c-a521-df9039a5fe8a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:58:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:55.989 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e25d5113-a42d-44ca-8e65-a777d9e11f48 namespace which is not needed anymore
Nov 29 07:58:56 compute-0 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d000000b5.scope: Deactivated successfully.
Nov 29 07:58:56 compute-0 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d000000b5.scope: Consumed 15.335s CPU time.
Nov 29 07:58:56 compute-0 systemd-machined[153486]: Machine qemu-70-instance-000000b5 terminated.
Nov 29 07:58:56 compute-0 neutron-haproxy-ovnmeta-e25d5113-a42d-44ca-8e65-a777d9e11f48[249994]: [NOTICE]   (249998) : haproxy version is 2.8.14-c23fe91
Nov 29 07:58:56 compute-0 neutron-haproxy-ovnmeta-e25d5113-a42d-44ca-8e65-a777d9e11f48[249994]: [NOTICE]   (249998) : path to executable is /usr/sbin/haproxy
Nov 29 07:58:56 compute-0 neutron-haproxy-ovnmeta-e25d5113-a42d-44ca-8e65-a777d9e11f48[249994]: [WARNING]  (249998) : Exiting Master process...
Nov 29 07:58:56 compute-0 neutron-haproxy-ovnmeta-e25d5113-a42d-44ca-8e65-a777d9e11f48[249994]: [WARNING]  (249998) : Exiting Master process...
Nov 29 07:58:56 compute-0 neutron-haproxy-ovnmeta-e25d5113-a42d-44ca-8e65-a777d9e11f48[249994]: [ALERT]    (249998) : Current worker (250000) exited with code 143 (Terminated)
Nov 29 07:58:56 compute-0 neutron-haproxy-ovnmeta-e25d5113-a42d-44ca-8e65-a777d9e11f48[249994]: [WARNING]  (249998) : All workers exited. Exiting... (0)
Nov 29 07:58:56 compute-0 systemd[1]: libpod-acaa5b8a76597b3228cbd92c11c10ecf9de4afb13c0013d108721329cc8457fc.scope: Deactivated successfully.
Nov 29 07:58:56 compute-0 podman[250349]: 2025-11-29 07:58:56.160015025 +0000 UTC m=+0.056602324 container died acaa5b8a76597b3228cbd92c11c10ecf9de4afb13c0013d108721329cc8457fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e25d5113-a42d-44ca-8e65-a777d9e11f48, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 07:58:56 compute-0 nova_compute[187185]: 2025-11-29 07:58:56.162 187189 DEBUG nova.network.neutron [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Updating instance_info_cache with network_info: [{"id": "95c265c1-1b57-4a91-af2e-9d38073be611", "address": "fa:16:3e:84:0e:ef", "network": {"id": "e25d5113-a42d-44ca-8e65-a777d9e11f48", "bridge": "br-int", "label": "tempest-network-smoke--1591284109", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95c265c1-1b", "ovs_interfaceid": "95c265c1-1b57-4a91-af2e-9d38073be611", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:58:56 compute-0 nova_compute[187185]: 2025-11-29 07:58:56.194 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Releasing lock "refresh_cache-da8e5049-4048-486b-a0d4-9d9c53478c31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:58:56 compute-0 nova_compute[187185]: 2025-11-29 07:58:56.195 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 29 07:58:56 compute-0 nova_compute[187185]: 2025-11-29 07:58:56.196 187189 DEBUG oslo_concurrency.lockutils [req-a850d0e4-a40e-458b-bc25-dd1f2e852696 req-0702d064-a25c-4096-8aa5-99ffe7ff910a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-da8e5049-4048-486b-a0d4-9d9c53478c31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 07:58:56 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-acaa5b8a76597b3228cbd92c11c10ecf9de4afb13c0013d108721329cc8457fc-userdata-shm.mount: Deactivated successfully.
Nov 29 07:58:56 compute-0 nova_compute[187185]: 2025-11-29 07:58:56.200 187189 DEBUG nova.network.neutron [req-a850d0e4-a40e-458b-bc25-dd1f2e852696 req-0702d064-a25c-4096-8aa5-99ffe7ff910a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Refreshing network info cache for port 95c265c1-1b57-4a91-af2e-9d38073be611 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 07:58:56 compute-0 nova_compute[187185]: 2025-11-29 07:58:56.204 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:58:56 compute-0 systemd[1]: var-lib-containers-storage-overlay-a4fd49dcbc60113e256e9813ae8cc73eb0e2f41aa746d73d0d9d77da4a25fff2-merged.mount: Deactivated successfully.
Nov 29 07:58:56 compute-0 nova_compute[187185]: 2025-11-29 07:58:56.207 187189 INFO nova.virt.libvirt.driver [-] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Instance destroyed successfully.
Nov 29 07:58:56 compute-0 nova_compute[187185]: 2025-11-29 07:58:56.207 187189 DEBUG nova.objects.instance [None req-c46cde67-72a9-4418-93d6-df5cbd12858b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lazy-loading 'resources' on Instance uuid da8e5049-4048-486b-a0d4-9d9c53478c31 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 07:58:56 compute-0 podman[250349]: 2025-11-29 07:58:56.210811373 +0000 UTC m=+0.107398672 container cleanup acaa5b8a76597b3228cbd92c11c10ecf9de4afb13c0013d108721329cc8457fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e25d5113-a42d-44ca-8e65-a777d9e11f48, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 07:58:56 compute-0 systemd[1]: libpod-conmon-acaa5b8a76597b3228cbd92c11c10ecf9de4afb13c0013d108721329cc8457fc.scope: Deactivated successfully.
Nov 29 07:58:56 compute-0 nova_compute[187185]: 2025-11-29 07:58:56.224 187189 DEBUG nova.virt.libvirt.vif [None req-c46cde67-72a9-4418-93d6-df5cbd12858b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:57:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1552512911',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-1552512911',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2022058758-ac',id=181,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCbO5lg6MOx4cmqxiU7BbZWHBZr0hw1OFyx1wi8ylE3DD5XkolVPeqN7tTosQEkEUch/54VxCcm6kKSgh3IzPEqFnEAKzoXbjFEzrMqt3IWtOqFGx5kqMbL3gOxuWmKn/A==',key_name='tempest-TestSecurityGroupsBasicOps-519869731',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:58:02Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e8e45e91223b45a79dd698a82af4a2a5',ramdisk_id='',reservation_id='r-rzp6eqal',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-2022058758',owner_user_name='tempest-TestSecurityGroupsBasicOps-2022058758-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:58:02Z,user_data=None,user_id='dec30fbde18e4b2382ea2c59847d067f',uuid=da8e5049-4048-486b-a0d4-9d9c53478c31,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "95c265c1-1b57-4a91-af2e-9d38073be611", "address": "fa:16:3e:84:0e:ef", "network": {"id": "e25d5113-a42d-44ca-8e65-a777d9e11f48", "bridge": "br-int", "label": "tempest-network-smoke--1591284109", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95c265c1-1b", "ovs_interfaceid": "95c265c1-1b57-4a91-af2e-9d38073be611", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 07:58:56 compute-0 nova_compute[187185]: 2025-11-29 07:58:56.225 187189 DEBUG nova.network.os_vif_util [None req-c46cde67-72a9-4418-93d6-df5cbd12858b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converting VIF {"id": "95c265c1-1b57-4a91-af2e-9d38073be611", "address": "fa:16:3e:84:0e:ef", "network": {"id": "e25d5113-a42d-44ca-8e65-a777d9e11f48", "bridge": "br-int", "label": "tempest-network-smoke--1591284109", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95c265c1-1b", "ovs_interfaceid": "95c265c1-1b57-4a91-af2e-9d38073be611", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 07:58:56 compute-0 nova_compute[187185]: 2025-11-29 07:58:56.225 187189 DEBUG nova.network.os_vif_util [None req-c46cde67-72a9-4418-93d6-df5cbd12858b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:84:0e:ef,bridge_name='br-int',has_traffic_filtering=True,id=95c265c1-1b57-4a91-af2e-9d38073be611,network=Network(e25d5113-a42d-44ca-8e65-a777d9e11f48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap95c265c1-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 07:58:56 compute-0 nova_compute[187185]: 2025-11-29 07:58:56.226 187189 DEBUG os_vif [None req-c46cde67-72a9-4418-93d6-df5cbd12858b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:84:0e:ef,bridge_name='br-int',has_traffic_filtering=True,id=95c265c1-1b57-4a91-af2e-9d38073be611,network=Network(e25d5113-a42d-44ca-8e65-a777d9e11f48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap95c265c1-1b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 07:58:56 compute-0 nova_compute[187185]: 2025-11-29 07:58:56.232 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:56 compute-0 nova_compute[187185]: 2025-11-29 07:58:56.233 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap95c265c1-1b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:58:56 compute-0 nova_compute[187185]: 2025-11-29 07:58:56.238 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 07:58:56 compute-0 nova_compute[187185]: 2025-11-29 07:58:56.243 187189 INFO os_vif [None req-c46cde67-72a9-4418-93d6-df5cbd12858b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:84:0e:ef,bridge_name='br-int',has_traffic_filtering=True,id=95c265c1-1b57-4a91-af2e-9d38073be611,network=Network(e25d5113-a42d-44ca-8e65-a777d9e11f48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap95c265c1-1b')
Nov 29 07:58:56 compute-0 nova_compute[187185]: 2025-11-29 07:58:56.244 187189 INFO nova.virt.libvirt.driver [None req-c46cde67-72a9-4418-93d6-df5cbd12858b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Deleting instance files /var/lib/nova/instances/da8e5049-4048-486b-a0d4-9d9c53478c31_del
Nov 29 07:58:56 compute-0 nova_compute[187185]: 2025-11-29 07:58:56.245 187189 INFO nova.virt.libvirt.driver [None req-c46cde67-72a9-4418-93d6-df5cbd12858b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Deletion of /var/lib/nova/instances/da8e5049-4048-486b-a0d4-9d9c53478c31_del complete
Nov 29 07:58:56 compute-0 podman[250397]: 2025-11-29 07:58:56.284648808 +0000 UTC m=+0.048029060 container remove acaa5b8a76597b3228cbd92c11c10ecf9de4afb13c0013d108721329cc8457fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e25d5113-a42d-44ca-8e65-a777d9e11f48, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 07:58:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:56.290 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[b709f751-45db-4b5f-9470-7e4202749a80]: (4, ('Sat Nov 29 07:58:56 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-e25d5113-a42d-44ca-8e65-a777d9e11f48 (acaa5b8a76597b3228cbd92c11c10ecf9de4afb13c0013d108721329cc8457fc)\nacaa5b8a76597b3228cbd92c11c10ecf9de4afb13c0013d108721329cc8457fc\nSat Nov 29 07:58:56 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-e25d5113-a42d-44ca-8e65-a777d9e11f48 (acaa5b8a76597b3228cbd92c11c10ecf9de4afb13c0013d108721329cc8457fc)\nacaa5b8a76597b3228cbd92c11c10ecf9de4afb13c0013d108721329cc8457fc\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:58:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:56.292 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[05e983c2-c56f-4fca-84d0-5113c8fbfb6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:58:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:56.293 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape25d5113-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:58:56 compute-0 nova_compute[187185]: 2025-11-29 07:58:56.295 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:56 compute-0 kernel: tape25d5113-a0: left promiscuous mode
Nov 29 07:58:56 compute-0 nova_compute[187185]: 2025-11-29 07:58:56.298 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:56.301 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[997d4b0b-b7ca-4aac-be87-72f465bdbdd7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:58:56 compute-0 nova_compute[187185]: 2025-11-29 07:58:56.309 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:56.321 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[d7992bdd-bb2e-4a99-939a-1e48dd019362]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:58:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:56.323 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[e7124c78-a966-4a8f-a8e8-49c4251ea1c4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:58:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:56.340 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[58d129e4-00ec-4fdc-be84-3cd32a77d4f5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 852941, 'reachable_time': 33951, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250412, 'error': None, 'target': 'ovnmeta-e25d5113-a42d-44ca-8e65-a777d9e11f48', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:58:56 compute-0 systemd[1]: run-netns-ovnmeta\x2de25d5113\x2da42d\x2d44ca\x2d8e65\x2da777d9e11f48.mount: Deactivated successfully.
Nov 29 07:58:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:56.344 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e25d5113-a42d-44ca-8e65-a777d9e11f48 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 07:58:56 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:58:56.345 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[6917014b-9054-4e58-b47a-c1bef8ad402d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 07:58:56 compute-0 nova_compute[187185]: 2025-11-29 07:58:56.803 187189 INFO nova.compute.manager [None req-c46cde67-72a9-4418-93d6-df5cbd12858b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Took 0.88 seconds to destroy the instance on the hypervisor.
Nov 29 07:58:56 compute-0 nova_compute[187185]: 2025-11-29 07:58:56.804 187189 DEBUG oslo.service.loopingcall [None req-c46cde67-72a9-4418-93d6-df5cbd12858b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 07:58:56 compute-0 nova_compute[187185]: 2025-11-29 07:58:56.805 187189 DEBUG nova.compute.manager [-] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 07:58:56 compute-0 nova_compute[187185]: 2025-11-29 07:58:56.805 187189 DEBUG nova.network.neutron [-] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 07:58:57 compute-0 sshd-session[250322]: Connection closed by 45.78.219.119 port 41630 [preauth]
Nov 29 07:58:59 compute-0 nova_compute[187185]: 2025-11-29 07:58:59.427 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:58:59 compute-0 podman[250413]: 2025-11-29 07:58:59.844174402 +0000 UTC m=+0.100782523 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 29 07:59:00 compute-0 nova_compute[187185]: 2025-11-29 07:59:00.154 187189 DEBUG nova.compute.manager [req-690246ab-2c87-46b9-900b-44b07435dfbb req-0e4cd46f-5779-4d35-beb8-ff760bf59ef3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Received event network-vif-unplugged-95c265c1-1b57-4a91-af2e-9d38073be611 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:59:00 compute-0 nova_compute[187185]: 2025-11-29 07:59:00.155 187189 DEBUG oslo_concurrency.lockutils [req-690246ab-2c87-46b9-900b-44b07435dfbb req-0e4cd46f-5779-4d35-beb8-ff760bf59ef3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "da8e5049-4048-486b-a0d4-9d9c53478c31-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:59:00 compute-0 nova_compute[187185]: 2025-11-29 07:59:00.155 187189 DEBUG oslo_concurrency.lockutils [req-690246ab-2c87-46b9-900b-44b07435dfbb req-0e4cd46f-5779-4d35-beb8-ff760bf59ef3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "da8e5049-4048-486b-a0d4-9d9c53478c31-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:59:00 compute-0 nova_compute[187185]: 2025-11-29 07:59:00.156 187189 DEBUG oslo_concurrency.lockutils [req-690246ab-2c87-46b9-900b-44b07435dfbb req-0e4cd46f-5779-4d35-beb8-ff760bf59ef3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "da8e5049-4048-486b-a0d4-9d9c53478c31-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:59:00 compute-0 nova_compute[187185]: 2025-11-29 07:59:00.156 187189 DEBUG nova.compute.manager [req-690246ab-2c87-46b9-900b-44b07435dfbb req-0e4cd46f-5779-4d35-beb8-ff760bf59ef3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] No waiting events found dispatching network-vif-unplugged-95c265c1-1b57-4a91-af2e-9d38073be611 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:59:00 compute-0 nova_compute[187185]: 2025-11-29 07:59:00.156 187189 DEBUG nova.compute.manager [req-690246ab-2c87-46b9-900b-44b07435dfbb req-0e4cd46f-5779-4d35-beb8-ff760bf59ef3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Received event network-vif-unplugged-95c265c1-1b57-4a91-af2e-9d38073be611 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 07:59:00 compute-0 nova_compute[187185]: 2025-11-29 07:59:00.157 187189 DEBUG nova.compute.manager [req-690246ab-2c87-46b9-900b-44b07435dfbb req-0e4cd46f-5779-4d35-beb8-ff760bf59ef3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Received event network-vif-plugged-95c265c1-1b57-4a91-af2e-9d38073be611 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:59:00 compute-0 nova_compute[187185]: 2025-11-29 07:59:00.157 187189 DEBUG oslo_concurrency.lockutils [req-690246ab-2c87-46b9-900b-44b07435dfbb req-0e4cd46f-5779-4d35-beb8-ff760bf59ef3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "da8e5049-4048-486b-a0d4-9d9c53478c31-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:59:00 compute-0 nova_compute[187185]: 2025-11-29 07:59:00.157 187189 DEBUG oslo_concurrency.lockutils [req-690246ab-2c87-46b9-900b-44b07435dfbb req-0e4cd46f-5779-4d35-beb8-ff760bf59ef3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "da8e5049-4048-486b-a0d4-9d9c53478c31-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:59:00 compute-0 nova_compute[187185]: 2025-11-29 07:59:00.158 187189 DEBUG oslo_concurrency.lockutils [req-690246ab-2c87-46b9-900b-44b07435dfbb req-0e4cd46f-5779-4d35-beb8-ff760bf59ef3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "da8e5049-4048-486b-a0d4-9d9c53478c31-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:59:00 compute-0 nova_compute[187185]: 2025-11-29 07:59:00.158 187189 DEBUG nova.compute.manager [req-690246ab-2c87-46b9-900b-44b07435dfbb req-0e4cd46f-5779-4d35-beb8-ff760bf59ef3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] No waiting events found dispatching network-vif-plugged-95c265c1-1b57-4a91-af2e-9d38073be611 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 07:59:00 compute-0 nova_compute[187185]: 2025-11-29 07:59:00.158 187189 WARNING nova.compute.manager [req-690246ab-2c87-46b9-900b-44b07435dfbb req-0e4cd46f-5779-4d35-beb8-ff760bf59ef3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Received unexpected event network-vif-plugged-95c265c1-1b57-4a91-af2e-9d38073be611 for instance with vm_state active and task_state deleting.
Nov 29 07:59:01 compute-0 nova_compute[187185]: 2025-11-29 07:59:01.249 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:59:01 compute-0 nova_compute[187185]: 2025-11-29 07:59:01.472 187189 DEBUG nova.network.neutron [req-a850d0e4-a40e-458b-bc25-dd1f2e852696 req-0702d064-a25c-4096-8aa5-99ffe7ff910a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Updated VIF entry in instance network info cache for port 95c265c1-1b57-4a91-af2e-9d38073be611. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 07:59:01 compute-0 nova_compute[187185]: 2025-11-29 07:59:01.472 187189 DEBUG nova.network.neutron [req-a850d0e4-a40e-458b-bc25-dd1f2e852696 req-0702d064-a25c-4096-8aa5-99ffe7ff910a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Updating instance_info_cache with network_info: [{"id": "95c265c1-1b57-4a91-af2e-9d38073be611", "address": "fa:16:3e:84:0e:ef", "network": {"id": "e25d5113-a42d-44ca-8e65-a777d9e11f48", "bridge": "br-int", "label": "tempest-network-smoke--1591284109", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95c265c1-1b", "ovs_interfaceid": "95c265c1-1b57-4a91-af2e-9d38073be611", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:59:04 compute-0 nova_compute[187185]: 2025-11-29 07:59:04.429 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:59:06 compute-0 nova_compute[187185]: 2025-11-29 07:59:06.067 187189 DEBUG oslo_concurrency.lockutils [req-a850d0e4-a40e-458b-bc25-dd1f2e852696 req-0702d064-a25c-4096-8aa5-99ffe7ff910a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-da8e5049-4048-486b-a0d4-9d9c53478c31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 07:59:06 compute-0 nova_compute[187185]: 2025-11-29 07:59:06.260 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:59:07 compute-0 nova_compute[187185]: 2025-11-29 07:59:07.622 187189 DEBUG nova.network.neutron [-] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:59:07 compute-0 nova_compute[187185]: 2025-11-29 07:59:07.819 187189 DEBUG nova.compute.manager [req-95f67841-fc09-4648-b2ad-e39060018b00 req-32e9543e-9fe1-460d-86dc-60349ff7c6f7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Received event network-vif-deleted-95c265c1-1b57-4a91-af2e-9d38073be611 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 07:59:07 compute-0 nova_compute[187185]: 2025-11-29 07:59:07.820 187189 INFO nova.compute.manager [req-95f67841-fc09-4648-b2ad-e39060018b00 req-32e9543e-9fe1-460d-86dc-60349ff7c6f7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Neutron deleted interface 95c265c1-1b57-4a91-af2e-9d38073be611; detaching it from the instance and deleting it from the info cache
Nov 29 07:59:07 compute-0 nova_compute[187185]: 2025-11-29 07:59:07.820 187189 DEBUG nova.network.neutron [req-95f67841-fc09-4648-b2ad-e39060018b00 req-32e9543e-9fe1-460d-86dc-60349ff7c6f7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 07:59:08 compute-0 nova_compute[187185]: 2025-11-29 07:59:08.199 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:59:08 compute-0 podman[250439]: 2025-11-29 07:59:08.822286478 +0000 UTC m=+0.079667552 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 07:59:09 compute-0 nova_compute[187185]: 2025-11-29 07:59:09.431 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:59:09 compute-0 nova_compute[187185]: 2025-11-29 07:59:09.624 187189 INFO nova.compute.manager [-] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Took 12.82 seconds to deallocate network for instance.
Nov 29 07:59:09 compute-0 nova_compute[187185]: 2025-11-29 07:59:09.634 187189 DEBUG nova.compute.manager [req-95f67841-fc09-4648-b2ad-e39060018b00 req-32e9543e-9fe1-460d-86dc-60349ff7c6f7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Detach interface failed, port_id=95c265c1-1b57-4a91-af2e-9d38073be611, reason: Instance da8e5049-4048-486b-a0d4-9d9c53478c31 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Nov 29 07:59:10 compute-0 nova_compute[187185]: 2025-11-29 07:59:10.055 187189 DEBUG oslo_concurrency.lockutils [None req-c46cde67-72a9-4418-93d6-df5cbd12858b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:59:10 compute-0 nova_compute[187185]: 2025-11-29 07:59:10.056 187189 DEBUG oslo_concurrency.lockutils [None req-c46cde67-72a9-4418-93d6-df5cbd12858b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:59:10 compute-0 nova_compute[187185]: 2025-11-29 07:59:10.134 187189 DEBUG nova.compute.provider_tree [None req-c46cde67-72a9-4418-93d6-df5cbd12858b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:59:10 compute-0 nova_compute[187185]: 2025-11-29 07:59:10.154 187189 DEBUG nova.scheduler.client.report [None req-c46cde67-72a9-4418-93d6-df5cbd12858b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:59:10 compute-0 nova_compute[187185]: 2025-11-29 07:59:10.195 187189 DEBUG oslo_concurrency.lockutils [None req-c46cde67-72a9-4418-93d6-df5cbd12858b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:59:10 compute-0 nova_compute[187185]: 2025-11-29 07:59:10.238 187189 INFO nova.scheduler.client.report [None req-c46cde67-72a9-4418-93d6-df5cbd12858b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Deleted allocations for instance da8e5049-4048-486b-a0d4-9d9c53478c31
Nov 29 07:59:10 compute-0 nova_compute[187185]: 2025-11-29 07:59:10.481 187189 DEBUG oslo_concurrency.lockutils [None req-c46cde67-72a9-4418-93d6-df5cbd12858b dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "da8e5049-4048-486b-a0d4-9d9c53478c31" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 14.601s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:59:10 compute-0 podman[250465]: 2025-11-29 07:59:10.811901916 +0000 UTC m=+0.068267537 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2)
Nov 29 07:59:10 compute-0 podman[250464]: 2025-11-29 07:59:10.821485639 +0000 UTC m=+0.085665712 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, org.label-schema.vendor=CentOS)
Nov 29 07:59:11 compute-0 nova_compute[187185]: 2025-11-29 07:59:11.206 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764403136.2054331, da8e5049-4048-486b-a0d4-9d9c53478c31 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 07:59:11 compute-0 nova_compute[187185]: 2025-11-29 07:59:11.207 187189 INFO nova.compute.manager [-] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] VM Stopped (Lifecycle Event)
Nov 29 07:59:11 compute-0 nova_compute[187185]: 2025-11-29 07:59:11.247 187189 DEBUG nova.compute.manager [None req-0eac908e-fcb6-45ed-b085-4d9189da8b6f - - - - - -] [instance: da8e5049-4048-486b-a0d4-9d9c53478c31] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 07:59:11 compute-0 nova_compute[187185]: 2025-11-29 07:59:11.263 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:59:14 compute-0 nova_compute[187185]: 2025-11-29 07:59:14.432 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:59:16 compute-0 nova_compute[187185]: 2025-11-29 07:59:16.267 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:59:18 compute-0 nova_compute[187185]: 2025-11-29 07:59:18.482 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:59:18 compute-0 nova_compute[187185]: 2025-11-29 07:59:18.628 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:59:19 compute-0 nova_compute[187185]: 2025-11-29 07:59:19.465 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:59:21 compute-0 nova_compute[187185]: 2025-11-29 07:59:21.271 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:59:21 compute-0 podman[250505]: 2025-11-29 07:59:21.792028545 +0000 UTC m=+0.050971124 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 07:59:21 compute-0 podman[250504]: 2025-11-29 07:59:21.803650236 +0000 UTC m=+0.064066727 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, distribution-scope=public, vcs-type=git, config_id=edpm, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350)
Nov 29 07:59:21 compute-0 podman[250503]: 2025-11-29 07:59:21.808519115 +0000 UTC m=+0.075452062 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 07:59:24 compute-0 nova_compute[187185]: 2025-11-29 07:59:24.515 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:59:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:59:25.759 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:59:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:59:25.760 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:59:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:59:25.760 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:59:26 compute-0 nova_compute[187185]: 2025-11-29 07:59:26.275 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:59:29 compute-0 nova_compute[187185]: 2025-11-29 07:59:29.518 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:59:30 compute-0 podman[250567]: 2025-11-29 07:59:30.838227252 +0000 UTC m=+0.097079018 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 07:59:31 compute-0 nova_compute[187185]: 2025-11-29 07:59:31.306 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:59:32 compute-0 sshd-session[250594]: Invalid user kingbase from 20.255.62.58 port 37826
Nov 29 07:59:32 compute-0 sshd-session[250594]: Received disconnect from 20.255.62.58 port 37826:11: Bye Bye [preauth]
Nov 29 07:59:32 compute-0 sshd-session[250594]: Disconnected from invalid user kingbase 20.255.62.58 port 37826 [preauth]
Nov 29 07:59:34 compute-0 nova_compute[187185]: 2025-11-29 07:59:34.521 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:59:36 compute-0 nova_compute[187185]: 2025-11-29 07:59:36.308 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:59:37 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:59:37.852 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=78, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=77) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 07:59:37 compute-0 nova_compute[187185]: 2025-11-29 07:59:37.853 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:59:37 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:59:37.853 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 07:59:38 compute-0 ovn_metadata_agent[104249]: 2025-11-29 07:59:38.855 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '78'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 07:59:39 compute-0 sshd-session[250566]: error: kex_exchange_identification: read: Connection timed out
Nov 29 07:59:39 compute-0 sshd-session[250566]: banner exchange: Connection from 115.190.187.93 port 37752: Connection timed out
Nov 29 07:59:39 compute-0 nova_compute[187185]: 2025-11-29 07:59:39.523 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:59:39 compute-0 podman[250596]: 2025-11-29 07:59:39.834336911 +0000 UTC m=+0.087052583 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 07:59:40 compute-0 nova_compute[187185]: 2025-11-29 07:59:40.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:59:41 compute-0 nova_compute[187185]: 2025-11-29 07:59:41.312 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:59:41 compute-0 nova_compute[187185]: 2025-11-29 07:59:41.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:59:41 compute-0 podman[250620]: 2025-11-29 07:59:41.794210562 +0000 UTC m=+0.055607036 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd)
Nov 29 07:59:41 compute-0 podman[250621]: 2025-11-29 07:59:41.80572888 +0000 UTC m=+0.065742104 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 07:59:43 compute-0 nova_compute[187185]: 2025-11-29 07:59:43.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:59:44 compute-0 nova_compute[187185]: 2025-11-29 07:59:44.563 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:59:45 compute-0 nova_compute[187185]: 2025-11-29 07:59:45.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:59:45 compute-0 nova_compute[187185]: 2025-11-29 07:59:45.347 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:59:45 compute-0 nova_compute[187185]: 2025-11-29 07:59:45.348 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:59:45 compute-0 nova_compute[187185]: 2025-11-29 07:59:45.348 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:59:45 compute-0 nova_compute[187185]: 2025-11-29 07:59:45.349 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 07:59:45 compute-0 nova_compute[187185]: 2025-11-29 07:59:45.538 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 07:59:45 compute-0 nova_compute[187185]: 2025-11-29 07:59:45.540 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5723MB free_disk=73.25090026855469GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 07:59:45 compute-0 nova_compute[187185]: 2025-11-29 07:59:45.540 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 07:59:45 compute-0 nova_compute[187185]: 2025-11-29 07:59:45.540 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 07:59:45 compute-0 nova_compute[187185]: 2025-11-29 07:59:45.610 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 07:59:45 compute-0 nova_compute[187185]: 2025-11-29 07:59:45.611 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 07:59:45 compute-0 nova_compute[187185]: 2025-11-29 07:59:45.631 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 07:59:45 compute-0 nova_compute[187185]: 2025-11-29 07:59:45.645 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 07:59:45 compute-0 nova_compute[187185]: 2025-11-29 07:59:45.668 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 07:59:45 compute-0 nova_compute[187185]: 2025-11-29 07:59:45.669 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 07:59:46 compute-0 nova_compute[187185]: 2025-11-29 07:59:46.316 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:59:48 compute-0 nova_compute[187185]: 2025-11-29 07:59:48.669 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:59:48 compute-0 nova_compute[187185]: 2025-11-29 07:59:48.669 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 07:59:49 compute-0 nova_compute[187185]: 2025-11-29 07:59:49.566 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:59:51 compute-0 nova_compute[187185]: 2025-11-29 07:59:51.319 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:59:52 compute-0 nova_compute[187185]: 2025-11-29 07:59:52.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:59:52 compute-0 podman[250659]: 2025-11-29 07:59:52.842631726 +0000 UTC m=+0.102709239 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, config_id=edpm, maintainer=Red Hat, Inc., vcs-type=git, io.buildah.version=1.33.7, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, name=ubi9-minimal, distribution-scope=public, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6)
Nov 29 07:59:52 compute-0 podman[250660]: 2025-11-29 07:59:52.843590133 +0000 UTC m=+0.097445948 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 07:59:52 compute-0 podman[250658]: 2025-11-29 07:59:52.856215423 +0000 UTC m=+0.121312409 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 29 07:59:54 compute-0 nova_compute[187185]: 2025-11-29 07:59:54.569 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:59:55 compute-0 nova_compute[187185]: 2025-11-29 07:59:55.311 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:59:55 compute-0 nova_compute[187185]: 2025-11-29 07:59:55.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:59:55 compute-0 nova_compute[187185]: 2025-11-29 07:59:55.315 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 07:59:55 compute-0 nova_compute[187185]: 2025-11-29 07:59:55.315 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 07:59:55 compute-0 nova_compute[187185]: 2025-11-29 07:59:55.333 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 07:59:56 compute-0 nova_compute[187185]: 2025-11-29 07:59:56.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 07:59:56 compute-0 nova_compute[187185]: 2025-11-29 07:59:56.322 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 07:59:59 compute-0 nova_compute[187185]: 2025-11-29 07:59:59.571 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:01 compute-0 nova_compute[187185]: 2025-11-29 08:00:01.325 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:01 compute-0 podman[250716]: 2025-11-29 08:00:01.877917312 +0000 UTC m=+0.142437971 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 29 08:00:04 compute-0 sshd-session[250742]: Received disconnect from 190.181.27.27 port 33194:11: Bye Bye [preauth]
Nov 29 08:00:04 compute-0 sshd-session[250742]: Disconnected from authenticating user root 190.181.27.27 port 33194 [preauth]
Nov 29 08:00:04 compute-0 nova_compute[187185]: 2025-11-29 08:00:04.574 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:06 compute-0 nova_compute[187185]: 2025-11-29 08:00:06.328 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:09 compute-0 ovn_controller[95281]: 2025-11-29T08:00:09Z|00622|memory_trim|INFO|Detected inactivity (last active 30008 ms ago): trimming memory
Nov 29 08:00:09 compute-0 nova_compute[187185]: 2025-11-29 08:00:09.577 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:10 compute-0 podman[250744]: 2025-11-29 08:00:10.795664698 +0000 UTC m=+0.060444143 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 08:00:11 compute-0 nova_compute[187185]: 2025-11-29 08:00:11.331 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:12 compute-0 podman[250769]: 2025-11-29 08:00:12.806297286 +0000 UTC m=+0.066814635 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 29 08:00:12 compute-0 podman[250770]: 2025-11-29 08:00:12.80923786 +0000 UTC m=+0.065777106 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, 
tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 08:00:14 compute-0 nova_compute[187185]: 2025-11-29 08:00:14.580 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:16 compute-0 nova_compute[187185]: 2025-11-29 08:00:16.334 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:19 compute-0 nova_compute[187185]: 2025-11-29 08:00:19.582 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:20 compute-0 nova_compute[187185]: 2025-11-29 08:00:20.017 187189 DEBUG oslo_concurrency.lockutils [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 08:00:20 compute-0 nova_compute[187185]: 2025-11-29 08:00:20.018 187189 DEBUG oslo_concurrency.lockutils [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 08:00:20 compute-0 nova_compute[187185]: 2025-11-29 08:00:20.050 187189 DEBUG nova.compute.manager [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 08:00:20 compute-0 nova_compute[187185]: 2025-11-29 08:00:20.166 187189 DEBUG oslo_concurrency.lockutils [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 08:00:20 compute-0 nova_compute[187185]: 2025-11-29 08:00:20.167 187189 DEBUG oslo_concurrency.lockutils [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 08:00:20 compute-0 nova_compute[187185]: 2025-11-29 08:00:20.175 187189 DEBUG nova.virt.hardware [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 08:00:20 compute-0 nova_compute[187185]: 2025-11-29 08:00:20.176 187189 INFO nova.compute.claims [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Claim successful on node compute-0.ctlplane.example.com
Nov 29 08:00:20 compute-0 nova_compute[187185]: 2025-11-29 08:00:20.324 187189 DEBUG nova.compute.provider_tree [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 08:00:20 compute-0 nova_compute[187185]: 2025-11-29 08:00:20.343 187189 DEBUG nova.scheduler.client.report [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 08:00:20 compute-0 nova_compute[187185]: 2025-11-29 08:00:20.376 187189 DEBUG oslo_concurrency.lockutils [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.209s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 08:00:20 compute-0 nova_compute[187185]: 2025-11-29 08:00:20.378 187189 DEBUG nova.compute.manager [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 08:00:20 compute-0 nova_compute[187185]: 2025-11-29 08:00:20.441 187189 DEBUG nova.compute.manager [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 08:00:20 compute-0 nova_compute[187185]: 2025-11-29 08:00:20.442 187189 DEBUG nova.network.neutron [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 08:00:20 compute-0 nova_compute[187185]: 2025-11-29 08:00:20.463 187189 INFO nova.virt.libvirt.driver [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 08:00:20 compute-0 nova_compute[187185]: 2025-11-29 08:00:20.479 187189 DEBUG nova.compute.manager [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 08:00:20 compute-0 nova_compute[187185]: 2025-11-29 08:00:20.622 187189 DEBUG nova.compute.manager [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 08:00:20 compute-0 nova_compute[187185]: 2025-11-29 08:00:20.624 187189 DEBUG nova.virt.libvirt.driver [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 08:00:20 compute-0 nova_compute[187185]: 2025-11-29 08:00:20.624 187189 INFO nova.virt.libvirt.driver [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Creating image(s)
Nov 29 08:00:20 compute-0 nova_compute[187185]: 2025-11-29 08:00:20.625 187189 DEBUG oslo_concurrency.lockutils [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "/var/lib/nova/instances/681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 08:00:20 compute-0 nova_compute[187185]: 2025-11-29 08:00:20.626 187189 DEBUG oslo_concurrency.lockutils [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "/var/lib/nova/instances/681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 08:00:20 compute-0 nova_compute[187185]: 2025-11-29 08:00:20.627 187189 DEBUG oslo_concurrency.lockutils [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "/var/lib/nova/instances/681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 08:00:20 compute-0 nova_compute[187185]: 2025-11-29 08:00:20.651 187189 DEBUG oslo_concurrency.processutils [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 08:00:20 compute-0 nova_compute[187185]: 2025-11-29 08:00:20.718 187189 DEBUG oslo_concurrency.processutils [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 08:00:20 compute-0 nova_compute[187185]: 2025-11-29 08:00:20.720 187189 DEBUG oslo_concurrency.lockutils [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 08:00:20 compute-0 nova_compute[187185]: 2025-11-29 08:00:20.721 187189 DEBUG oslo_concurrency.lockutils [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 08:00:20 compute-0 nova_compute[187185]: 2025-11-29 08:00:20.747 187189 DEBUG oslo_concurrency.processutils [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 08:00:20 compute-0 nova_compute[187185]: 2025-11-29 08:00:20.812 187189 DEBUG oslo_concurrency.processutils [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 08:00:20 compute-0 nova_compute[187185]: 2025-11-29 08:00:20.813 187189 DEBUG oslo_concurrency.processutils [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 08:00:21 compute-0 nova_compute[187185]: 2025-11-29 08:00:21.143 187189 DEBUG oslo_concurrency.processutils [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/disk 1073741824" returned: 0 in 0.330s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 08:00:21 compute-0 nova_compute[187185]: 2025-11-29 08:00:21.145 187189 DEBUG oslo_concurrency.lockutils [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.423s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 08:00:21 compute-0 nova_compute[187185]: 2025-11-29 08:00:21.145 187189 DEBUG oslo_concurrency.processutils [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 08:00:21 compute-0 nova_compute[187185]: 2025-11-29 08:00:21.210 187189 DEBUG oslo_concurrency.processutils [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 08:00:21 compute-0 nova_compute[187185]: 2025-11-29 08:00:21.212 187189 DEBUG nova.virt.disk.api [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Checking if we can resize image /var/lib/nova/instances/681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 08:00:21 compute-0 nova_compute[187185]: 2025-11-29 08:00:21.212 187189 DEBUG oslo_concurrency.processutils [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 08:00:21 compute-0 nova_compute[187185]: 2025-11-29 08:00:21.235 187189 DEBUG nova.policy [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 08:00:21 compute-0 nova_compute[187185]: 2025-11-29 08:00:21.279 187189 DEBUG oslo_concurrency.processutils [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 08:00:21 compute-0 nova_compute[187185]: 2025-11-29 08:00:21.280 187189 DEBUG nova.virt.disk.api [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Cannot resize image /var/lib/nova/instances/681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 08:00:21 compute-0 nova_compute[187185]: 2025-11-29 08:00:21.281 187189 DEBUG nova.objects.instance [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lazy-loading 'migration_context' on Instance uuid 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 08:00:21 compute-0 nova_compute[187185]: 2025-11-29 08:00:21.312 187189 DEBUG nova.virt.libvirt.driver [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 08:00:21 compute-0 nova_compute[187185]: 2025-11-29 08:00:21.313 187189 DEBUG nova.virt.libvirt.driver [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Ensure instance console log exists: /var/lib/nova/instances/681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 08:00:21 compute-0 nova_compute[187185]: 2025-11-29 08:00:21.314 187189 DEBUG oslo_concurrency.lockutils [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 08:00:21 compute-0 nova_compute[187185]: 2025-11-29 08:00:21.314 187189 DEBUG oslo_concurrency.lockutils [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 08:00:21 compute-0 nova_compute[187185]: 2025-11-29 08:00:21.315 187189 DEBUG oslo_concurrency.lockutils [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 08:00:21 compute-0 nova_compute[187185]: 2025-11-29 08:00:21.338 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:23 compute-0 nova_compute[187185]: 2025-11-29 08:00:23.553 187189 DEBUG nova.network.neutron [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Successfully created port: c7bce92b-6ded-4a7b-b9cc-ea46b70d977f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 08:00:23 compute-0 podman[250822]: 2025-11-29 08:00:23.795492213 +0000 UTC m=+0.062634966 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 08:00:23 compute-0 podman[250824]: 2025-11-29 08:00:23.823828631 +0000 UTC m=+0.069008078 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 08:00:23 compute-0 podman[250823]: 2025-11-29 08:00:23.823693577 +0000 UTC m=+0.085414816 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, release=1755695350, version=9.6, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 29 08:00:24 compute-0 nova_compute[187185]: 2025-11-29 08:00:24.587 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:25.760 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 08:00:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:25.761 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 08:00:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:25.761 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 08:00:26 compute-0 nova_compute[187185]: 2025-11-29 08:00:26.397 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:26 compute-0 nova_compute[187185]: 2025-11-29 08:00:26.456 187189 DEBUG nova.network.neutron [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Successfully updated port: c7bce92b-6ded-4a7b-b9cc-ea46b70d977f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 08:00:26 compute-0 nova_compute[187185]: 2025-11-29 08:00:26.462 187189 DEBUG nova.compute.manager [req-4e29015e-04e3-4dcb-a5cb-54fc10447351 req-7326d33d-7436-4be0-bbf9-968badddec5e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Received event network-changed-c7bce92b-6ded-4a7b-b9cc-ea46b70d977f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 08:00:26 compute-0 nova_compute[187185]: 2025-11-29 08:00:26.462 187189 DEBUG nova.compute.manager [req-4e29015e-04e3-4dcb-a5cb-54fc10447351 req-7326d33d-7436-4be0-bbf9-968badddec5e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Refreshing instance network info cache due to event network-changed-c7bce92b-6ded-4a7b-b9cc-ea46b70d977f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 08:00:26 compute-0 nova_compute[187185]: 2025-11-29 08:00:26.462 187189 DEBUG oslo_concurrency.lockutils [req-4e29015e-04e3-4dcb-a5cb-54fc10447351 req-7326d33d-7436-4be0-bbf9-968badddec5e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 08:00:26 compute-0 nova_compute[187185]: 2025-11-29 08:00:26.463 187189 DEBUG oslo_concurrency.lockutils [req-4e29015e-04e3-4dcb-a5cb-54fc10447351 req-7326d33d-7436-4be0-bbf9-968badddec5e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 08:00:26 compute-0 nova_compute[187185]: 2025-11-29 08:00:26.463 187189 DEBUG nova.network.neutron [req-4e29015e-04e3-4dcb-a5cb-54fc10447351 req-7326d33d-7436-4be0-bbf9-968badddec5e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Refreshing network info cache for port c7bce92b-6ded-4a7b-b9cc-ea46b70d977f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 08:00:26 compute-0 nova_compute[187185]: 2025-11-29 08:00:26.525 187189 DEBUG oslo_concurrency.lockutils [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "refresh_cache-681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 08:00:26 compute-0 nova_compute[187185]: 2025-11-29 08:00:26.642 187189 DEBUG nova.network.neutron [req-4e29015e-04e3-4dcb-a5cb-54fc10447351 req-7326d33d-7436-4be0-bbf9-968badddec5e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 08:00:27 compute-0 nova_compute[187185]: 2025-11-29 08:00:27.409 187189 DEBUG nova.network.neutron [req-4e29015e-04e3-4dcb-a5cb-54fc10447351 req-7326d33d-7436-4be0-bbf9-968badddec5e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 08:00:27 compute-0 nova_compute[187185]: 2025-11-29 08:00:27.428 187189 DEBUG oslo_concurrency.lockutils [req-4e29015e-04e3-4dcb-a5cb-54fc10447351 req-7326d33d-7436-4be0-bbf9-968badddec5e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 08:00:27 compute-0 nova_compute[187185]: 2025-11-29 08:00:27.429 187189 DEBUG oslo_concurrency.lockutils [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquired lock "refresh_cache-681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 08:00:27 compute-0 nova_compute[187185]: 2025-11-29 08:00:27.430 187189 DEBUG nova.network.neutron [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 08:00:27 compute-0 nova_compute[187185]: 2025-11-29 08:00:27.584 187189 DEBUG nova.network.neutron [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 08:00:28 compute-0 nova_compute[187185]: 2025-11-29 08:00:28.793 187189 DEBUG nova.network.neutron [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Updating instance_info_cache with network_info: [{"id": "c7bce92b-6ded-4a7b-b9cc-ea46b70d977f", "address": "fa:16:3e:77:82:d3", "network": {"id": "ff3fc050-ba7a-4fdb-b763-76384fb9149e", "bridge": "br-int", "label": "tempest-network-smoke--1131964355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7bce92b-6d", "ovs_interfaceid": "c7bce92b-6ded-4a7b-b9cc-ea46b70d977f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.096 187189 DEBUG oslo_concurrency.lockutils [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Releasing lock "refresh_cache-681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.097 187189 DEBUG nova.compute.manager [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Instance network_info: |[{"id": "c7bce92b-6ded-4a7b-b9cc-ea46b70d977f", "address": "fa:16:3e:77:82:d3", "network": {"id": "ff3fc050-ba7a-4fdb-b763-76384fb9149e", "bridge": "br-int", "label": "tempest-network-smoke--1131964355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7bce92b-6d", "ovs_interfaceid": "c7bce92b-6ded-4a7b-b9cc-ea46b70d977f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.100 187189 DEBUG nova.virt.libvirt.driver [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Start _get_guest_xml network_info=[{"id": "c7bce92b-6ded-4a7b-b9cc-ea46b70d977f", "address": "fa:16:3e:77:82:d3", "network": {"id": "ff3fc050-ba7a-4fdb-b763-76384fb9149e", "bridge": "br-int", "label": "tempest-network-smoke--1131964355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7bce92b-6d", "ovs_interfaceid": "c7bce92b-6ded-4a7b-b9cc-ea46b70d977f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_options': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.106 187189 WARNING nova.virt.libvirt.driver [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.112 187189 DEBUG nova.virt.libvirt.host [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.113 187189 DEBUG nova.virt.libvirt.host [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.116 187189 DEBUG nova.virt.libvirt.host [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.117 187189 DEBUG nova.virt.libvirt.host [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.118 187189 DEBUG nova.virt.libvirt.driver [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.119 187189 DEBUG nova.virt.hardware [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.119 187189 DEBUG nova.virt.hardware [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.119 187189 DEBUG nova.virt.hardware [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.120 187189 DEBUG nova.virt.hardware [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.120 187189 DEBUG nova.virt.hardware [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.120 187189 DEBUG nova.virt.hardware [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.121 187189 DEBUG nova.virt.hardware [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.121 187189 DEBUG nova.virt.hardware [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.121 187189 DEBUG nova.virt.hardware [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.121 187189 DEBUG nova.virt.hardware [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.122 187189 DEBUG nova.virt.hardware [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.126 187189 DEBUG nova.virt.libvirt.vif [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T08:00:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-902695925',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-902695925',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2022058758-ge',id=184,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA6QStI4WZvM8rjCdYdMBIewiD6P/FbGD7wLorCZFoLA7wHCNE+3A9eASDdB+YjnY2gBekoYo19AK1G+4bESS2fKDisJFlhhgBaK7LaZRlIAVhbJNU4lhjciEXZNpjSjZw==',key_name='tempest-TestSecurityGroupsBasicOps-1392443600',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e8e45e91223b45a79dd698a82af4a2a5',ramdisk_id='',reservation_id='r-bxmyiq49',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2022058758',owner_user_name='tempest-TestSecurityGroupsBasicOps-2022058758-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T08:00:20Z,user_data=None,user_id='dec30fbde18e4b2382ea2c59847d067f',uuid=681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c7bce92b-6ded-4a7b-b9cc-ea46b70d977f", "address": "fa:16:3e:77:82:d3", "network": {"id": "ff3fc050-ba7a-4fdb-b763-76384fb9149e", "bridge": "br-int", "label": "tempest-network-smoke--1131964355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7bce92b-6d", "ovs_interfaceid": "c7bce92b-6ded-4a7b-b9cc-ea46b70d977f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.127 187189 DEBUG nova.network.os_vif_util [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converting VIF {"id": "c7bce92b-6ded-4a7b-b9cc-ea46b70d977f", "address": "fa:16:3e:77:82:d3", "network": {"id": "ff3fc050-ba7a-4fdb-b763-76384fb9149e", "bridge": "br-int", "label": "tempest-network-smoke--1131964355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7bce92b-6d", "ovs_interfaceid": "c7bce92b-6ded-4a7b-b9cc-ea46b70d977f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.127 187189 DEBUG nova.network.os_vif_util [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:77:82:d3,bridge_name='br-int',has_traffic_filtering=True,id=c7bce92b-6ded-4a7b-b9cc-ea46b70d977f,network=Network(ff3fc050-ba7a-4fdb-b763-76384fb9149e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc7bce92b-6d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.128 187189 DEBUG nova.objects.instance [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lazy-loading 'pci_devices' on Instance uuid 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.233 187189 DEBUG nova.virt.libvirt.driver [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] End _get_guest_xml xml=<domain type="kvm">
Nov 29 08:00:29 compute-0 nova_compute[187185]:   <uuid>681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4</uuid>
Nov 29 08:00:29 compute-0 nova_compute[187185]:   <name>instance-000000b8</name>
Nov 29 08:00:29 compute-0 nova_compute[187185]:   <memory>131072</memory>
Nov 29 08:00:29 compute-0 nova_compute[187185]:   <vcpu>1</vcpu>
Nov 29 08:00:29 compute-0 nova_compute[187185]:   <metadata>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 08:00:29 compute-0 nova_compute[187185]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:       <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-902695925</nova:name>
Nov 29 08:00:29 compute-0 nova_compute[187185]:       <nova:creationTime>2025-11-29 08:00:29</nova:creationTime>
Nov 29 08:00:29 compute-0 nova_compute[187185]:       <nova:flavor name="m1.nano">
Nov 29 08:00:29 compute-0 nova_compute[187185]:         <nova:memory>128</nova:memory>
Nov 29 08:00:29 compute-0 nova_compute[187185]:         <nova:disk>1</nova:disk>
Nov 29 08:00:29 compute-0 nova_compute[187185]:         <nova:swap>0</nova:swap>
Nov 29 08:00:29 compute-0 nova_compute[187185]:         <nova:ephemeral>0</nova:ephemeral>
Nov 29 08:00:29 compute-0 nova_compute[187185]:         <nova:vcpus>1</nova:vcpus>
Nov 29 08:00:29 compute-0 nova_compute[187185]:       </nova:flavor>
Nov 29 08:00:29 compute-0 nova_compute[187185]:       <nova:owner>
Nov 29 08:00:29 compute-0 nova_compute[187185]:         <nova:user uuid="dec30fbde18e4b2382ea2c59847d067f">tempest-TestSecurityGroupsBasicOps-2022058758-project-member</nova:user>
Nov 29 08:00:29 compute-0 nova_compute[187185]:         <nova:project uuid="e8e45e91223b45a79dd698a82af4a2a5">tempest-TestSecurityGroupsBasicOps-2022058758</nova:project>
Nov 29 08:00:29 compute-0 nova_compute[187185]:       </nova:owner>
Nov 29 08:00:29 compute-0 nova_compute[187185]:       <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:       <nova:ports>
Nov 29 08:00:29 compute-0 nova_compute[187185]:         <nova:port uuid="c7bce92b-6ded-4a7b-b9cc-ea46b70d977f">
Nov 29 08:00:29 compute-0 nova_compute[187185]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:         </nova:port>
Nov 29 08:00:29 compute-0 nova_compute[187185]:       </nova:ports>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     </nova:instance>
Nov 29 08:00:29 compute-0 nova_compute[187185]:   </metadata>
Nov 29 08:00:29 compute-0 nova_compute[187185]:   <sysinfo type="smbios">
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <system>
Nov 29 08:00:29 compute-0 nova_compute[187185]:       <entry name="manufacturer">RDO</entry>
Nov 29 08:00:29 compute-0 nova_compute[187185]:       <entry name="product">OpenStack Compute</entry>
Nov 29 08:00:29 compute-0 nova_compute[187185]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 08:00:29 compute-0 nova_compute[187185]:       <entry name="serial">681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4</entry>
Nov 29 08:00:29 compute-0 nova_compute[187185]:       <entry name="uuid">681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4</entry>
Nov 29 08:00:29 compute-0 nova_compute[187185]:       <entry name="family">Virtual Machine</entry>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     </system>
Nov 29 08:00:29 compute-0 nova_compute[187185]:   </sysinfo>
Nov 29 08:00:29 compute-0 nova_compute[187185]:   <os>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <boot dev="hd"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <smbios mode="sysinfo"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:   </os>
Nov 29 08:00:29 compute-0 nova_compute[187185]:   <features>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <acpi/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <apic/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <vmcoreinfo/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:   </features>
Nov 29 08:00:29 compute-0 nova_compute[187185]:   <clock offset="utc">
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <timer name="pit" tickpolicy="delay"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <timer name="hpet" present="no"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:   </clock>
Nov 29 08:00:29 compute-0 nova_compute[187185]:   <cpu mode="custom" match="exact">
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <model>Nehalem</model>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <topology sockets="1" cores="1" threads="1"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:   </cpu>
Nov 29 08:00:29 compute-0 nova_compute[187185]:   <devices>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <disk type="file" device="disk">
Nov 29 08:00:29 compute-0 nova_compute[187185]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/disk"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:       <target dev="vda" bus="virtio"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     </disk>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <disk type="file" device="cdrom">
Nov 29 08:00:29 compute-0 nova_compute[187185]:       <driver name="qemu" type="raw" cache="none"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:       <source file="/var/lib/nova/instances/681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/disk.config"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:       <target dev="sda" bus="sata"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     </disk>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <interface type="ethernet">
Nov 29 08:00:29 compute-0 nova_compute[187185]:       <mac address="fa:16:3e:77:82:d3"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:       <driver name="vhost" rx_queue_size="512"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:       <mtu size="1442"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:       <target dev="tapc7bce92b-6d"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     </interface>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <serial type="pty">
Nov 29 08:00:29 compute-0 nova_compute[187185]:       <log file="/var/lib/nova/instances/681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/console.log" append="off"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     </serial>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <video>
Nov 29 08:00:29 compute-0 nova_compute[187185]:       <model type="virtio"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     </video>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <input type="tablet" bus="usb"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <rng model="virtio">
Nov 29 08:00:29 compute-0 nova_compute[187185]:       <backend model="random">/dev/urandom</backend>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     </rng>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <controller type="pci" model="pcie-root-port"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <controller type="usb" index="0"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     <memballoon model="virtio">
Nov 29 08:00:29 compute-0 nova_compute[187185]:       <stats period="10"/>
Nov 29 08:00:29 compute-0 nova_compute[187185]:     </memballoon>
Nov 29 08:00:29 compute-0 nova_compute[187185]:   </devices>
Nov 29 08:00:29 compute-0 nova_compute[187185]: </domain>
Nov 29 08:00:29 compute-0 nova_compute[187185]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.234 187189 DEBUG nova.compute.manager [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Preparing to wait for external event network-vif-plugged-c7bce92b-6ded-4a7b-b9cc-ea46b70d977f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.235 187189 DEBUG oslo_concurrency.lockutils [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.235 187189 DEBUG oslo_concurrency.lockutils [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.235 187189 DEBUG oslo_concurrency.lockutils [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.236 187189 DEBUG nova.virt.libvirt.vif [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T08:00:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-902695925',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-902695925',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2022058758-ge',id=184,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA6QStI4WZvM8rjCdYdMBIewiD6P/FbGD7wLorCZFoLA7wHCNE+3A9eASDdB+YjnY2gBekoYo19AK1G+4bESS2fKDisJFlhhgBaK7LaZRlIAVhbJNU4lhjciEXZNpjSjZw==',key_name='tempest-TestSecurityGroupsBasicOps-1392443600',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e8e45e91223b45a79dd698a82af4a2a5',ramdisk_id='',reservation_id='r-bxmyiq49',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2022058758',owner_user_name='tempest-TestSecurityGroupsBasicOps-2022058758-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T08:00:20Z,user_data=None,user_id='dec30fbde18e4b2382ea2c59847d067f',uuid=681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c7bce92b-6ded-4a7b-b9cc-ea46b70d977f", "address": "fa:16:3e:77:82:d3", "network": {"id": "ff3fc050-ba7a-4fdb-b763-76384fb9149e", "bridge": "br-int", "label": "tempest-network-smoke--1131964355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7bce92b-6d", "ovs_interfaceid": "c7bce92b-6ded-4a7b-b9cc-ea46b70d977f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.236 187189 DEBUG nova.network.os_vif_util [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converting VIF {"id": "c7bce92b-6ded-4a7b-b9cc-ea46b70d977f", "address": "fa:16:3e:77:82:d3", "network": {"id": "ff3fc050-ba7a-4fdb-b763-76384fb9149e", "bridge": "br-int", "label": "tempest-network-smoke--1131964355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7bce92b-6d", "ovs_interfaceid": "c7bce92b-6ded-4a7b-b9cc-ea46b70d977f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.237 187189 DEBUG nova.network.os_vif_util [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:77:82:d3,bridge_name='br-int',has_traffic_filtering=True,id=c7bce92b-6ded-4a7b-b9cc-ea46b70d977f,network=Network(ff3fc050-ba7a-4fdb-b763-76384fb9149e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc7bce92b-6d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.238 187189 DEBUG os_vif [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:77:82:d3,bridge_name='br-int',has_traffic_filtering=True,id=c7bce92b-6ded-4a7b-b9cc-ea46b70d977f,network=Network(ff3fc050-ba7a-4fdb-b763-76384fb9149e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc7bce92b-6d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.238 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.239 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.239 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.242 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.242 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc7bce92b-6d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.243 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc7bce92b-6d, col_values=(('external_ids', {'iface-id': 'c7bce92b-6ded-4a7b-b9cc-ea46b70d977f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:77:82:d3', 'vm-uuid': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.244 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:29 compute-0 NetworkManager[55227]: <info>  [1764403229.2470] manager: (tapc7bce92b-6d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/323)
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.247 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.253 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.255 187189 INFO os_vif [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:77:82:d3,bridge_name='br-int',has_traffic_filtering=True,id=c7bce92b-6ded-4a7b-b9cc-ea46b70d977f,network=Network(ff3fc050-ba7a-4fdb-b763-76384fb9149e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc7bce92b-6d')
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.587 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.627 187189 DEBUG nova.virt.libvirt.driver [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.628 187189 DEBUG nova.virt.libvirt.driver [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.628 187189 DEBUG nova.virt.libvirt.driver [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] No VIF found with MAC fa:16:3e:77:82:d3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 08:00:29 compute-0 nova_compute[187185]: 2025-11-29 08:00:29.629 187189 INFO nova.virt.libvirt.driver [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Using config drive
Nov 29 08:00:30 compute-0 nova_compute[187185]: 2025-11-29 08:00:30.327 187189 INFO nova.virt.libvirt.driver [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Creating config drive at /var/lib/nova/instances/681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/disk.config
Nov 29 08:00:30 compute-0 nova_compute[187185]: 2025-11-29 08:00:30.335 187189 DEBUG oslo_concurrency.processutils [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1hk4ivig execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 08:00:30 compute-0 nova_compute[187185]: 2025-11-29 08:00:30.468 187189 DEBUG oslo_concurrency.processutils [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1hk4ivig" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 08:00:30 compute-0 kernel: tapc7bce92b-6d: entered promiscuous mode
Nov 29 08:00:30 compute-0 nova_compute[187185]: 2025-11-29 08:00:30.558 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:30 compute-0 ovn_controller[95281]: 2025-11-29T08:00:30Z|00623|binding|INFO|Claiming lport c7bce92b-6ded-4a7b-b9cc-ea46b70d977f for this chassis.
Nov 29 08:00:30 compute-0 ovn_controller[95281]: 2025-11-29T08:00:30Z|00624|binding|INFO|c7bce92b-6ded-4a7b-b9cc-ea46b70d977f: Claiming fa:16:3e:77:82:d3 10.100.0.9
Nov 29 08:00:30 compute-0 NetworkManager[55227]: <info>  [1764403230.5602] manager: (tapc7bce92b-6d): new Tun device (/org/freedesktop/NetworkManager/Devices/324)
Nov 29 08:00:30 compute-0 nova_compute[187185]: 2025-11-29 08:00:30.564 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:30 compute-0 nova_compute[187185]: 2025-11-29 08:00:30.570 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:30 compute-0 nova_compute[187185]: 2025-11-29 08:00:30.574 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:30 compute-0 nova_compute[187185]: 2025-11-29 08:00:30.577 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:30 compute-0 NetworkManager[55227]: <info>  [1764403230.5790] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/325)
Nov 29 08:00:30 compute-0 NetworkManager[55227]: <info>  [1764403230.5803] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/326)
Nov 29 08:00:30 compute-0 systemd-udevd[250907]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 08:00:30 compute-0 systemd-machined[153486]: New machine qemu-71-instance-000000b8.
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:30.613 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:77:82:d3 10.100.0.9'], port_security=['fa:16:3e:77:82:d3 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ff3fc050-ba7a-4fdb-b763-76384fb9149e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6649faa1-db80-42dc-8e4b-bb2b1a7f56ec', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d94b7d30-29ce-4bb4-a204-dc237d12f274, chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=c7bce92b-6ded-4a7b-b9cc-ea46b70d977f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 08:00:30 compute-0 NetworkManager[55227]: <info>  [1764403230.6154] device (tapc7bce92b-6d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 08:00:30 compute-0 NetworkManager[55227]: <info>  [1764403230.6165] device (tapc7bce92b-6d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:30.616 104254 INFO neutron.agent.ovn.metadata.agent [-] Port c7bce92b-6ded-4a7b-b9cc-ea46b70d977f in datapath ff3fc050-ba7a-4fdb-b763-76384fb9149e bound to our chassis
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:30.619 104254 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ff3fc050-ba7a-4fdb-b763-76384fb9149e
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:30.632 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[253994e6-fe26-4c55-a33d-7f7dbf41f94a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:30.633 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapff3fc050-b1 in ovnmeta-ff3fc050-ba7a-4fdb-b763-76384fb9149e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:30.637 214223 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapff3fc050-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:30.638 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[de815353-5aec-4d25-8907-992c5b7d5057]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:30.639 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[3e3c69d6-c36f-498a-b125-2cfdbb7f59e7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:30.659 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[1fb45533-9a0f-4893-ab31-004610491635]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 08:00:30 compute-0 systemd[1]: Started Virtual Machine qemu-71-instance-000000b8.
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:30.690 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[ffbb7ccd-323d-42f4-a14a-c28909afa3d3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 08:00:30 compute-0 nova_compute[187185]: 2025-11-29 08:00:30.728 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:30.729 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[0a45151a-ae3a-4d4f-8e5a-43ce8dcac89e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 08:00:30 compute-0 NetworkManager[55227]: <info>  [1764403230.7410] manager: (tapff3fc050-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/327)
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:30.739 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[1bf6ff9c-2b53-4724-a6a1-d8e3a41d3bdf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 08:00:30 compute-0 nova_compute[187185]: 2025-11-29 08:00:30.742 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:30 compute-0 ovn_controller[95281]: 2025-11-29T08:00:30Z|00625|binding|INFO|Setting lport c7bce92b-6ded-4a7b-b9cc-ea46b70d977f ovn-installed in OVS
Nov 29 08:00:30 compute-0 ovn_controller[95281]: 2025-11-29T08:00:30Z|00626|binding|INFO|Setting lport c7bce92b-6ded-4a7b-b9cc-ea46b70d977f up in Southbound
Nov 29 08:00:30 compute-0 nova_compute[187185]: 2025-11-29 08:00:30.753 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:30.775 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[c7dbb4ff-2784-4e01-80d3-d216a5725b5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:30.779 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[1900ff90-73df-4c40-bf59-4e9ef92e5e68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 08:00:30 compute-0 NetworkManager[55227]: <info>  [1764403230.8065] device (tapff3fc050-b0): carrier: link connected
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:30.814 214273 DEBUG oslo.privsep.daemon [-] privsep: reply[3b54d320-18a9-4081-b433-7f9bef2c674b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:30.839 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[7df7b940-b04b-4356-b46b-b5ca50402792]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapff3fc050-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fd:47:d1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 193], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 867846, 'reachable_time': 21686, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250943, 'error': None, 'target': 'ovnmeta-ff3fc050-ba7a-4fdb-b763-76384fb9149e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:30.856 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[b64c0311-3d4a-4a6f-a6a8-0c0493af3198]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefd:47d1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 867846, 'tstamp': 867846}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 250947, 'error': None, 'target': 'ovnmeta-ff3fc050-ba7a-4fdb-b763-76384fb9149e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:30.872 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[755c4edf-55d5-47fd-8e42-5089ade15c09]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapff3fc050-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fd:47:d1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 193], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 867846, 'reachable_time': 21686, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 250949, 'error': None, 'target': 'ovnmeta-ff3fc050-ba7a-4fdb-b763-76384fb9149e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:30.903 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[6ee688ac-841b-4436-8515-94abdf770490]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 08:00:30 compute-0 nova_compute[187185]: 2025-11-29 08:00:30.935 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764403230.9348538, 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 08:00:30 compute-0 nova_compute[187185]: 2025-11-29 08:00:30.936 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] VM Started (Lifecycle Event)
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:30.975 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[38b4d29b-cfd9-4f8d-ad2c-b8a5f39a17b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:30.977 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapff3fc050-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:30.978 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:30.978 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapff3fc050-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 08:00:30 compute-0 nova_compute[187185]: 2025-11-29 08:00:30.980 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:30 compute-0 NetworkManager[55227]: <info>  [1764403230.9812] manager: (tapff3fc050-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/328)
Nov 29 08:00:30 compute-0 kernel: tapff3fc050-b0: entered promiscuous mode
Nov 29 08:00:30 compute-0 nova_compute[187185]: 2025-11-29 08:00:30.982 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:30.983 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapff3fc050-b0, col_values=(('external_ids', {'iface-id': '965f3bec-4819-4a7d-a97c-c5af8f6aa242'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 08:00:30 compute-0 nova_compute[187185]: 2025-11-29 08:00:30.984 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:30 compute-0 ovn_controller[95281]: 2025-11-29T08:00:30Z|00627|binding|INFO|Releasing lport 965f3bec-4819-4a7d-a97c-c5af8f6aa242 from this chassis (sb_readonly=1)
Nov 29 08:00:30 compute-0 nova_compute[187185]: 2025-11-29 08:00:30.986 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:30.986 104254 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ff3fc050-ba7a-4fdb-b763-76384fb9149e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ff3fc050-ba7a-4fdb-b763-76384fb9149e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:30.988 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[c63d0257-7314-4e90-bddf-03f6d128467a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:30.990 104254 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]: global
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]:     log         /dev/log local0 debug
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]:     log-tag     haproxy-metadata-proxy-ff3fc050-ba7a-4fdb-b763-76384fb9149e
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]:     user        root
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]:     group       root
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]:     maxconn     1024
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]:     pidfile     /var/lib/neutron/external/pids/ff3fc050-ba7a-4fdb-b763-76384fb9149e.pid.haproxy
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]:     daemon
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]: 
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]: defaults
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]:     log global
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]:     mode http
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]:     option httplog
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]:     option dontlognull
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]:     option http-server-close
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]:     option forwardfor
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]:     retries                 3
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]:     timeout http-request    30s
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]:     timeout connect         30s
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]:     timeout client          32s
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]:     timeout server          32s
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]:     timeout http-keep-alive 30s
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]: 
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]: 
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]: listen listener
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]:     bind 169.254.169.254:80
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]:     server metadata /var/lib/neutron/metadata_proxy
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]:     http-request add-header X-OVN-Network-ID ff3fc050-ba7a-4fdb-b763-76384fb9149e
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 08:00:30 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:30.991 104254 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ff3fc050-ba7a-4fdb-b763-76384fb9149e', 'env', 'PROCESS_TAG=haproxy-ff3fc050-ba7a-4fdb-b763-76384fb9149e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ff3fc050-ba7a-4fdb-b763-76384fb9149e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 08:00:30 compute-0 nova_compute[187185]: 2025-11-29 08:00:30.997 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:31 compute-0 podman[250980]: 2025-11-29 08:00:31.392039532 +0000 UTC m=+0.052241500 container create 56799115b4281c4aee6326d7b8b25b454bdc6a7080ba9e1c5a5c770ffb28649e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ff3fc050-ba7a-4fdb-b763-76384fb9149e, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 08:00:31 compute-0 nova_compute[187185]: 2025-11-29 08:00:31.415 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 08:00:31 compute-0 nova_compute[187185]: 2025-11-29 08:00:31.421 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764403230.9389825, 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 08:00:31 compute-0 nova_compute[187185]: 2025-11-29 08:00:31.422 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] VM Paused (Lifecycle Event)
Nov 29 08:00:31 compute-0 systemd[1]: Started libpod-conmon-56799115b4281c4aee6326d7b8b25b454bdc6a7080ba9e1c5a5c770ffb28649e.scope.
Nov 29 08:00:31 compute-0 podman[250980]: 2025-11-29 08:00:31.360887314 +0000 UTC m=+0.021089322 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 08:00:31 compute-0 systemd[1]: Started libcrun container.
Nov 29 08:00:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b583c2562be636b452186c3c6683e1bc416443c6a565754d84033fd6b0d3adad/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 08:00:31 compute-0 podman[250980]: 2025-11-29 08:00:31.504922389 +0000 UTC m=+0.165124377 container init 56799115b4281c4aee6326d7b8b25b454bdc6a7080ba9e1c5a5c770ffb28649e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ff3fc050-ba7a-4fdb-b763-76384fb9149e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 08:00:31 compute-0 podman[250980]: 2025-11-29 08:00:31.512421093 +0000 UTC m=+0.172623051 container start 56799115b4281c4aee6326d7b8b25b454bdc6a7080ba9e1c5a5c770ffb28649e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ff3fc050-ba7a-4fdb-b763-76384fb9149e, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 29 08:00:31 compute-0 nova_compute[187185]: 2025-11-29 08:00:31.526 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 08:00:31 compute-0 nova_compute[187185]: 2025-11-29 08:00:31.530 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 08:00:31 compute-0 neutron-haproxy-ovnmeta-ff3fc050-ba7a-4fdb-b763-76384fb9149e[250995]: [NOTICE]   (250999) : New worker (251001) forked
Nov 29 08:00:31 compute-0 neutron-haproxy-ovnmeta-ff3fc050-ba7a-4fdb-b763-76384fb9149e[250995]: [NOTICE]   (250999) : Loading success.
Nov 29 08:00:31 compute-0 nova_compute[187185]: 2025-11-29 08:00:31.749 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 08:00:32 compute-0 nova_compute[187185]: 2025-11-29 08:00:32.370 187189 DEBUG nova.compute.manager [req-2eea96ad-e23f-4a31-b33f-7148e84263b6 req-7a7ec4bc-3df2-481c-8a2a-cb948e6d5bbf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Received event network-vif-plugged-c7bce92b-6ded-4a7b-b9cc-ea46b70d977f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 08:00:32 compute-0 nova_compute[187185]: 2025-11-29 08:00:32.371 187189 DEBUG oslo_concurrency.lockutils [req-2eea96ad-e23f-4a31-b33f-7148e84263b6 req-7a7ec4bc-3df2-481c-8a2a-cb948e6d5bbf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 08:00:32 compute-0 nova_compute[187185]: 2025-11-29 08:00:32.372 187189 DEBUG oslo_concurrency.lockutils [req-2eea96ad-e23f-4a31-b33f-7148e84263b6 req-7a7ec4bc-3df2-481c-8a2a-cb948e6d5bbf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 08:00:32 compute-0 nova_compute[187185]: 2025-11-29 08:00:32.372 187189 DEBUG oslo_concurrency.lockutils [req-2eea96ad-e23f-4a31-b33f-7148e84263b6 req-7a7ec4bc-3df2-481c-8a2a-cb948e6d5bbf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 08:00:32 compute-0 nova_compute[187185]: 2025-11-29 08:00:32.373 187189 DEBUG nova.compute.manager [req-2eea96ad-e23f-4a31-b33f-7148e84263b6 req-7a7ec4bc-3df2-481c-8a2a-cb948e6d5bbf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Processing event network-vif-plugged-c7bce92b-6ded-4a7b-b9cc-ea46b70d977f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 08:00:32 compute-0 nova_compute[187185]: 2025-11-29 08:00:32.374 187189 DEBUG nova.compute.manager [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 08:00:32 compute-0 nova_compute[187185]: 2025-11-29 08:00:32.378 187189 DEBUG nova.virt.driver [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] Emitting event <LifecycleEvent: 1764403232.3785, 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 08:00:32 compute-0 nova_compute[187185]: 2025-11-29 08:00:32.379 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] VM Resumed (Lifecycle Event)
Nov 29 08:00:32 compute-0 nova_compute[187185]: 2025-11-29 08:00:32.383 187189 DEBUG nova.virt.libvirt.driver [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 08:00:32 compute-0 nova_compute[187185]: 2025-11-29 08:00:32.389 187189 INFO nova.virt.libvirt.driver [-] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Instance spawned successfully.
Nov 29 08:00:32 compute-0 nova_compute[187185]: 2025-11-29 08:00:32.390 187189 DEBUG nova.virt.libvirt.driver [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 08:00:32 compute-0 nova_compute[187185]: 2025-11-29 08:00:32.399 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 08:00:32 compute-0 nova_compute[187185]: 2025-11-29 08:00:32.408 187189 DEBUG nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 08:00:32 compute-0 nova_compute[187185]: 2025-11-29 08:00:32.414 187189 DEBUG nova.virt.libvirt.driver [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 08:00:32 compute-0 nova_compute[187185]: 2025-11-29 08:00:32.415 187189 DEBUG nova.virt.libvirt.driver [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 08:00:32 compute-0 nova_compute[187185]: 2025-11-29 08:00:32.415 187189 DEBUG nova.virt.libvirt.driver [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 08:00:32 compute-0 nova_compute[187185]: 2025-11-29 08:00:32.416 187189 DEBUG nova.virt.libvirt.driver [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 08:00:32 compute-0 nova_compute[187185]: 2025-11-29 08:00:32.416 187189 DEBUG nova.virt.libvirt.driver [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 08:00:32 compute-0 nova_compute[187185]: 2025-11-29 08:00:32.416 187189 DEBUG nova.virt.libvirt.driver [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 08:00:32 compute-0 nova_compute[187185]: 2025-11-29 08:00:32.427 187189 INFO nova.compute.manager [None req-c39ad38b-2ee7-44b1-a4cf-daadfe7ac159 - - - - - -] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 08:00:32 compute-0 podman[251010]: 2025-11-29 08:00:32.827288199 +0000 UTC m=+0.085979672 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 08:00:33 compute-0 nova_compute[187185]: 2025-11-29 08:00:33.071 187189 INFO nova.compute.manager [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Took 12.45 seconds to spawn the instance on the hypervisor.
Nov 29 08:00:33 compute-0 nova_compute[187185]: 2025-11-29 08:00:33.072 187189 DEBUG nova.compute.manager [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 08:00:33 compute-0 nova_compute[187185]: 2025-11-29 08:00:33.168 187189 INFO nova.compute.manager [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Took 13.05 seconds to build instance.
Nov 29 08:00:33 compute-0 nova_compute[187185]: 2025-11-29 08:00:33.450 187189 DEBUG oslo_concurrency.lockutils [None req-061d4d5b-1d86-4a9d-896a-068afcd06dad dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.432s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 08:00:34 compute-0 nova_compute[187185]: 2025-11-29 08:00:34.246 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:34 compute-0 nova_compute[187185]: 2025-11-29 08:00:34.471 187189 DEBUG nova.compute.manager [req-dde02bea-2405-4971-af68-78fa52aa4bc3 req-5bb8a8cf-b093-4540-a58e-e4c8f15afed2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Received event network-vif-plugged-c7bce92b-6ded-4a7b-b9cc-ea46b70d977f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 08:00:34 compute-0 nova_compute[187185]: 2025-11-29 08:00:34.472 187189 DEBUG oslo_concurrency.lockutils [req-dde02bea-2405-4971-af68-78fa52aa4bc3 req-5bb8a8cf-b093-4540-a58e-e4c8f15afed2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 08:00:34 compute-0 nova_compute[187185]: 2025-11-29 08:00:34.472 187189 DEBUG oslo_concurrency.lockutils [req-dde02bea-2405-4971-af68-78fa52aa4bc3 req-5bb8a8cf-b093-4540-a58e-e4c8f15afed2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 08:00:34 compute-0 nova_compute[187185]: 2025-11-29 08:00:34.472 187189 DEBUG oslo_concurrency.lockutils [req-dde02bea-2405-4971-af68-78fa52aa4bc3 req-5bb8a8cf-b093-4540-a58e-e4c8f15afed2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 08:00:34 compute-0 nova_compute[187185]: 2025-11-29 08:00:34.472 187189 DEBUG nova.compute.manager [req-dde02bea-2405-4971-af68-78fa52aa4bc3 req-5bb8a8cf-b093-4540-a58e-e4c8f15afed2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] No waiting events found dispatching network-vif-plugged-c7bce92b-6ded-4a7b-b9cc-ea46b70d977f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 08:00:34 compute-0 nova_compute[187185]: 2025-11-29 08:00:34.473 187189 WARNING nova.compute.manager [req-dde02bea-2405-4971-af68-78fa52aa4bc3 req-5bb8a8cf-b093-4540-a58e-e4c8f15afed2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Received unexpected event network-vif-plugged-c7bce92b-6ded-4a7b-b9cc-ea46b70d977f for instance with vm_state active and task_state None.
Nov 29 08:00:34 compute-0 nova_compute[187185]: 2025-11-29 08:00:34.592 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:38 compute-0 nova_compute[187185]: 2025-11-29 08:00:38.675 187189 DEBUG nova.compute.manager [req-834bf477-4abc-41a3-aa10-f8d4d8c93832 req-b791078b-69ad-4823-873b-f9f15487ac14 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Received event network-changed-c7bce92b-6ded-4a7b-b9cc-ea46b70d977f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 08:00:38 compute-0 nova_compute[187185]: 2025-11-29 08:00:38.675 187189 DEBUG nova.compute.manager [req-834bf477-4abc-41a3-aa10-f8d4d8c93832 req-b791078b-69ad-4823-873b-f9f15487ac14 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Refreshing instance network info cache due to event network-changed-c7bce92b-6ded-4a7b-b9cc-ea46b70d977f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 08:00:38 compute-0 nova_compute[187185]: 2025-11-29 08:00:38.676 187189 DEBUG oslo_concurrency.lockutils [req-834bf477-4abc-41a3-aa10-f8d4d8c93832 req-b791078b-69ad-4823-873b-f9f15487ac14 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 08:00:38 compute-0 nova_compute[187185]: 2025-11-29 08:00:38.676 187189 DEBUG oslo_concurrency.lockutils [req-834bf477-4abc-41a3-aa10-f8d4d8c93832 req-b791078b-69ad-4823-873b-f9f15487ac14 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 08:00:38 compute-0 nova_compute[187185]: 2025-11-29 08:00:38.676 187189 DEBUG nova.network.neutron [req-834bf477-4abc-41a3-aa10-f8d4d8c93832 req-b791078b-69ad-4823-873b-f9f15487ac14 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Refreshing network info cache for port c7bce92b-6ded-4a7b-b9cc-ea46b70d977f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 08:00:39 compute-0 nova_compute[187185]: 2025-11-29 08:00:39.248 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:39 compute-0 nova_compute[187185]: 2025-11-29 08:00:39.595 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:40 compute-0 nova_compute[187185]: 2025-11-29 08:00:40.204 187189 DEBUG nova.network.neutron [req-834bf477-4abc-41a3-aa10-f8d4d8c93832 req-b791078b-69ad-4823-873b-f9f15487ac14 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Updated VIF entry in instance network info cache for port c7bce92b-6ded-4a7b-b9cc-ea46b70d977f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 08:00:40 compute-0 nova_compute[187185]: 2025-11-29 08:00:40.205 187189 DEBUG nova.network.neutron [req-834bf477-4abc-41a3-aa10-f8d4d8c93832 req-b791078b-69ad-4823-873b-f9f15487ac14 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Updating instance_info_cache with network_info: [{"id": "c7bce92b-6ded-4a7b-b9cc-ea46b70d977f", "address": "fa:16:3e:77:82:d3", "network": {"id": "ff3fc050-ba7a-4fdb-b763-76384fb9149e", "bridge": "br-int", "label": "tempest-network-smoke--1131964355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7bce92b-6d", "ovs_interfaceid": "c7bce92b-6ded-4a7b-b9cc-ea46b70d977f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 08:00:40 compute-0 nova_compute[187185]: 2025-11-29 08:00:40.229 187189 DEBUG oslo_concurrency.lockutils [req-834bf477-4abc-41a3-aa10-f8d4d8c93832 req-b791078b-69ad-4823-873b-f9f15487ac14 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 08:00:40 compute-0 sshd-session[250887]: error: kex_exchange_identification: read: Connection timed out
Nov 29 08:00:40 compute-0 sshd-session[250887]: banner exchange: Connection from 115.190.187.93 port 42264: Connection timed out
Nov 29 08:00:41 compute-0 nova_compute[187185]: 2025-11-29 08:00:41.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:00:41 compute-0 nova_compute[187185]: 2025-11-29 08:00:41.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:00:41 compute-0 podman[251037]: 2025-11-29 08:00:41.798695465 +0000 UTC m=+0.060951508 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 08:00:43 compute-0 podman[251064]: 2025-11-29 08:00:43.828468368 +0000 UTC m=+0.075819092 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm)
Nov 29 08:00:43 compute-0 podman[251063]: 2025-11-29 08:00:43.836865207 +0000 UTC m=+0.081887715 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 08:00:44 compute-0 nova_compute[187185]: 2025-11-29 08:00:44.251 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:44 compute-0 nova_compute[187185]: 2025-11-29 08:00:44.595 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:45 compute-0 nova_compute[187185]: 2025-11-29 08:00:45.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:00:45 compute-0 ovn_controller[95281]: 2025-11-29T08:00:45Z|00087|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:77:82:d3 10.100.0.9
Nov 29 08:00:45 compute-0 ovn_controller[95281]: 2025-11-29T08:00:45Z|00088|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:77:82:d3 10.100.0.9
Nov 29 08:00:47 compute-0 nova_compute[187185]: 2025-11-29 08:00:47.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:00:47 compute-0 nova_compute[187185]: 2025-11-29 08:00:47.385 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 08:00:47 compute-0 nova_compute[187185]: 2025-11-29 08:00:47.385 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 08:00:47 compute-0 nova_compute[187185]: 2025-11-29 08:00:47.386 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 08:00:47 compute-0 nova_compute[187185]: 2025-11-29 08:00:47.386 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 08:00:47 compute-0 nova_compute[187185]: 2025-11-29 08:00:47.519 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 08:00:47 compute-0 nova_compute[187185]: 2025-11-29 08:00:47.618 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/disk --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 08:00:47 compute-0 nova_compute[187185]: 2025-11-29 08:00:47.620 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 08:00:47 compute-0 nova_compute[187185]: 2025-11-29 08:00:47.692 187189 DEBUG oslo_concurrency.processutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 08:00:47 compute-0 nova_compute[187185]: 2025-11-29 08:00:47.877 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 08:00:47 compute-0 nova_compute[187185]: 2025-11-29 08:00:47.879 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5546MB free_disk=73.22204971313477GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 08:00:47 compute-0 nova_compute[187185]: 2025-11-29 08:00:47.879 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 08:00:47 compute-0 nova_compute[187185]: 2025-11-29 08:00:47.880 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.025 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4', 'name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-902695925', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000b8', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'hostId': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.026 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.030 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4 / tapc7bce92b-6d inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.030 12 DEBUG ceilometer.compute.pollsters [-] 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f966eae8-943f-4f6d-b369-56aa06a4d180', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b8-681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4-tapc7bce92b-6d', 'timestamp': '2025-11-29T08:00:48.026114', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-902695925', 'name': 'tapc7bce92b-6d', 'instance_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:77:82:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc7bce92b-6d'}, 'message_id': '83e61d7e-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8695.744327322, 'message_signature': 'bca803e588781fc7b409432e337bf22a02c887710324b7240fd777951ca9690e'}]}, 'timestamp': '2025-11-29 08:00:48.030687', '_unique_id': 'a8f243ca4d3e433f8e9bbec47a882e23'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.032 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.033 12 DEBUG ceilometer.compute.pollsters [-] 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4cb6f751-3c12-4ba1-a814-7497c622643d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b8-681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4-tapc7bce92b-6d', 'timestamp': '2025-11-29T08:00:48.033051', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-902695925', 'name': 'tapc7bce92b-6d', 'instance_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:77:82:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc7bce92b-6d'}, 'message_id': '83e6898a-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8695.744327322, 'message_signature': 'eb762b0c1191983413460600002510202a1debb8071a11b543f68c59e2c9fefa'}]}, 'timestamp': '2025-11-29 08:00:48.033382', '_unique_id': 'b51cf38fd64343b4a7abb5215810c7f5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.034 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.047 12 DEBUG ceilometer.compute.pollsters [-] 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.048 12 DEBUG ceilometer.compute.pollsters [-] 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9d0f5c97-c56d-428c-907d-28e2dbf6c334', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4-vda', 'timestamp': '2025-11-29T08:00:48.034891', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-902695925', 'name': 'instance-000000b8', 'instance_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '83e8d7e4-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8695.753119453, 'message_signature': '81cde6ce198e71a6b766885f219eab50cd7fb827d1088771b943a4717bd57d0c'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4-sda', 'timestamp': '2025-11-29T08:00:48.034891', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-902695925', 'name': 'instance-000000b8', 'instance_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '83e8e464-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8695.753119453, 'message_signature': '71d5aa626237e5697f987ce595a155bd60a20273230973c9439e4751f7347068'}]}, 'timestamp': '2025-11-29 08:00:48.048766', '_unique_id': 'fab7efc5ed0749df9b76dcd3bbbb0a9e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.049 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.050 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.050 12 DEBUG ceilometer.compute.pollsters [-] 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5880963c-2e4f-4fc5-9c47-75082165192c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b8-681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4-tapc7bce92b-6d', 'timestamp': '2025-11-29T08:00:48.050638', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-902695925', 'name': 'tapc7bce92b-6d', 'instance_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:77:82:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc7bce92b-6d'}, 'message_id': '83e93748-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8695.744327322, 'message_signature': '63c72f1c5dbb1d68fc15dbbe8e8da0b4b920e4c2f874d2b66736b2ac9c5cd4ae'}]}, 'timestamp': '2025-11-29 08:00:48.050903', '_unique_id': 'd67f160bb42044449b11a5b51146d405'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.051 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.052 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.080 12 DEBUG ceilometer.compute.pollsters [-] 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/disk.device.write.requests volume: 304 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.080 12 DEBUG ceilometer.compute.pollsters [-] 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd7216634-25e2-4c7a-8ec2-12866ef81513', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 304, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4-vda', 'timestamp': '2025-11-29T08:00:48.052291', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-902695925', 'name': 'instance-000000b8', 'instance_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '83edc3ee-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8695.770534529, 'message_signature': '531c7c463dd6104d4dff65584f922e9fcfb13c4c83c7c30318d1b9b3e7b42424'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4-sda', 'timestamp': '2025-11-29T08:00:48.052291', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-902695925', 'name': 'instance-000000b8', 'instance_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '83edd4b0-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8695.770534529, 'message_signature': '6f573aeda398e3ed4c5545d9478e559cda83d79f74fd0dd014ba088cf8aff6fb'}]}, 'timestamp': '2025-11-29 08:00:48.081176', '_unique_id': '6adc16cbc2244e9fa45d27ef128402ef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.082 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.083 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.083 12 DEBUG ceilometer.compute.pollsters [-] 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1b0703c3-e625-45f9-bb7c-a862e3851614', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b8-681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4-tapc7bce92b-6d', 'timestamp': '2025-11-29T08:00:48.083413', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-902695925', 'name': 'tapc7bce92b-6d', 'instance_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:77:82:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc7bce92b-6d'}, 'message_id': '83ee3946-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8695.744327322, 'message_signature': 'b7617aabf1b03616dabedf4af285040c1e430730d46ec171cb9baab2c691d90a'}]}, 'timestamp': '2025-11-29 08:00:48.083781', '_unique_id': 'cfa4bf548677447483ad1710d220c892'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.084 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.085 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.086 12 DEBUG ceilometer.compute.pollsters [-] 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/disk.device.read.requests volume: 1092 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.086 12 DEBUG ceilometer.compute.pollsters [-] 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c59f3c07-1b13-4be5-a8c3-cfc88bad289d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1092, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4-vda', 'timestamp': '2025-11-29T08:00:48.086017', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-902695925', 'name': 'instance-000000b8', 'instance_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '83eea1b0-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8695.770534529, 'message_signature': '287fcfd38dd45058d909716d1a52d2485b6bd2adf7b7a01e20a0486e6f9c4df2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4-sda', 'timestamp': '2025-11-29T08:00:48.086017', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-902695925', 'name': 'instance-000000b8', 'instance_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '83eeb2a4-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8695.770534529, 'message_signature': '52c3fa428787f73ef9203e332d4e03912fce8d24645254839c8b5eb39a546afc'}]}, 'timestamp': '2025-11-29 08:00:48.086956', '_unique_id': '4153fdb6ef8e4fb6a2c9845d914471a6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.087 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.089 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.089 12 DEBUG ceilometer.compute.pollsters [-] 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/network.outgoing.packets volume: 10 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9c4ddd80-0443-4dc9-b29f-da46a9aad2bf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 10, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b8-681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4-tapc7bce92b-6d', 'timestamp': '2025-11-29T08:00:48.089333', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-902695925', 'name': 'tapc7bce92b-6d', 'instance_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:77:82:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc7bce92b-6d'}, 'message_id': '83ef23ba-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8695.744327322, 'message_signature': 'e11fd4d7f26194cf450a90fab0c855bcccf03896a019e10173086a15ccbbd5db'}]}, 'timestamp': '2025-11-29 08:00:48.089830', '_unique_id': '83be1d8d5d794b70b2462076ec4a8413'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.090 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.092 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.092 12 DEBUG ceilometer.compute.pollsters [-] 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/network.outgoing.bytes volume: 1284 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '14778ede-17a4-45d7-be22-794eff9f91d2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1284, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b8-681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4-tapc7bce92b-6d', 'timestamp': '2025-11-29T08:00:48.092372', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-902695925', 'name': 'tapc7bce92b-6d', 'instance_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:77:82:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc7bce92b-6d'}, 'message_id': '83ef96c4-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8695.744327322, 'message_signature': '85ecfe43640a5bb4b27edb374fbbbbf59532d2c8d5b1778c04db83a68eba7b07'}]}, 'timestamp': '2025-11-29 08:00:48.092695', '_unique_id': '704968585f774f01a5a96e0b3fe4193f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.093 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.094 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 08:00:48 compute-0 nova_compute[187185]: 2025-11-29 08:00:48.101 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Instance 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 08:00:48 compute-0 nova_compute[187185]: 2025-11-29 08:00:48.101 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 08:00:48 compute-0 nova_compute[187185]: 2025-11-29 08:00:48.101 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.123 12 DEBUG ceilometer.compute.pollsters [-] 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/memory.usage volume: 40.37890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '271c536f-2879-4d4e-8a4d-aa1ab688e50f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.37890625, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4', 'timestamp': '2025-11-29T08:00:48.094495', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-902695925', 'name': 'instance-000000b8', 'instance_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '83f4728e-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8695.84179417, 'message_signature': 'b9f97e1f96c16ec005805eb6161dda70138c4098c0d9df0dd26a4476599530ce'}]}, 'timestamp': '2025-11-29 08:00:48.124662', '_unique_id': 'f2115caa0e744dfa96facc0f094a258b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.126 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.127 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.127 12 DEBUG ceilometer.compute.pollsters [-] 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/disk.device.read.latency volume: 233748560 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.127 12 DEBUG ceilometer.compute.pollsters [-] 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/disk.device.read.latency volume: 23852136 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '908f1a97-e0fc-4ac9-8963-10bb84d55355', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 233748560, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4-vda', 'timestamp': '2025-11-29T08:00:48.127191', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-902695925', 'name': 'instance-000000b8', 'instance_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '83f4e746-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8695.770534529, 'message_signature': '5a970873781cc3e6304ba8b1f0c74630fcf9050a4309f41ae33bf92cf9a4dd4b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 23852136, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4-sda', 'timestamp': '2025-11-29T08:00:48.127191', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-902695925', 'name': 'instance-000000b8', 'instance_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '83f4f27c-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8695.770534529, 'message_signature': '6b9d5fffd8d82c3f0bc606ea53faea0203d382ee77ebb83f8908dd2fcbefd78c'}]}, 'timestamp': '2025-11-29 08:00:48.127807', '_unique_id': '6456ea7ffb55453abab760eb644073a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.128 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.129 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.129 12 DEBUG ceilometer.compute.pollsters [-] 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/disk.device.write.latency volume: 5823060965 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.129 12 DEBUG ceilometer.compute.pollsters [-] 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '43580b6d-bf25-48b1-8983-cc55ae2f9ed3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 5823060965, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4-vda', 'timestamp': '2025-11-29T08:00:48.129336', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-902695925', 'name': 'instance-000000b8', 'instance_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '83f53a0c-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8695.770534529, 'message_signature': '37e6da5edee9d642daee9ff839b8ac6cbe221aa75c9e85479c74cbd7460ef2bb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4-sda', 'timestamp': '2025-11-29T08:00:48.129336', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-902695925', 'name': 'instance-000000b8', 'instance_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '83f54434-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8695.770534529, 'message_signature': 'cea2035cae13897c49306a3f083ca91086025d92a680546adcaabe503920f88a'}]}, 'timestamp': '2025-11-29 08:00:48.129893', '_unique_id': '43b2682ab2c34b308fb50bb2ef524db3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.130 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.131 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.131 12 DEBUG ceilometer.compute.pollsters [-] 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aecd8786-af15-4064-a3cb-3cdbe0c349cd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b8-681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4-tapc7bce92b-6d', 'timestamp': '2025-11-29T08:00:48.131226', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-902695925', 'name': 'tapc7bce92b-6d', 'instance_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:77:82:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc7bce92b-6d'}, 'message_id': '83f58430-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8695.744327322, 'message_signature': '149113c2be10ea74563e10f9c8d6c8bb4201692e14bfb78ddaf8b083a74d2bf4'}]}, 'timestamp': '2025-11-29 08:00:48.131521', '_unique_id': '7d60ff6118194ce4a7ee50d176a1fba0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.132 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.133 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-902695925>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-902695925>]
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.133 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.133 12 DEBUG ceilometer.compute.pollsters [-] 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/cpu volume: 11990000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f8ac4b87-f449-4973-aabc-ccbc1ffd3cf6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11990000000, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4', 'timestamp': '2025-11-29T08:00:48.133377', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-902695925', 'name': 'instance-000000b8', 'instance_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '83f5d7dc-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8695.84179417, 'message_signature': '8cb7ea278b8f006b61ce3d4801130d4d0763e01864a41a18d051fcf938af4fde'}]}, 'timestamp': '2025-11-29 08:00:48.133658', '_unique_id': 'f1e490e1432846c39bfff19e38258ca5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.134 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.135 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.135 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-902695925>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-902695925>]
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.135 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.135 12 DEBUG ceilometer.compute.pollsters [-] 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/disk.device.write.bytes volume: 72765440 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.135 12 DEBUG ceilometer.compute.pollsters [-] 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9d74419a-2bb9-4cca-bc67-9f3689a2a5d6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72765440, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4-vda', 'timestamp': '2025-11-29T08:00:48.135339', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-902695925', 'name': 'instance-000000b8', 'instance_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '83f6248a-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8695.770534529, 'message_signature': '059ed7b18f73af9d6a4a6d66e64bb2f193e9e62e0c550b2ff171eeddf1a2dfad'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4-sda', 'timestamp': '2025-11-29T08:00:48.135339', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-902695925', 'name': 'instance-000000b8', 'instance_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '83f62e94-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8695.770534529, 'message_signature': 'f76e067fa0342ac2b8c778323c903ec63bc5d338683976b93bb655dbba71918c'}]}, 'timestamp': '2025-11-29 08:00:48.135929', '_unique_id': 'e4eb2ce0257041dc823dc3c014550678'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.136 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.137 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.137 12 DEBUG ceilometer.compute.pollsters [-] 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.137 12 DEBUG ceilometer.compute.pollsters [-] 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ee37b148-82ea-4833-aefe-6469fcab5ba6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4-vda', 'timestamp': '2025-11-29T08:00:48.137285', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-902695925', 'name': 'instance-000000b8', 'instance_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '83f67048-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8695.753119453, 'message_signature': '9b68282fc388433bc568be7dae0e821df9c7c942de1b484f552e23a2947a07a6'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4-sda', 'timestamp': '2025-11-29T08:00:48.137285', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-902695925', 'name': 'instance-000000b8', 'instance_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '83f67a7a-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8695.753119453, 'message_signature': 'bc24f29c481baa10986da9c1c538e2ad8c9b7cd1df88b80bba70ae1c85ddfb31'}]}, 'timestamp': '2025-11-29 08:00:48.137811', '_unique_id': '80c6480c6fc34ded9cfbc779db93833a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.138 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.139 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.139 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.139 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-902695925>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-902695925>]
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.139 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.139 12 DEBUG ceilometer.compute.pollsters [-] 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/disk.device.read.bytes volume: 30312960 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.139 12 DEBUG ceilometer.compute.pollsters [-] 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '73199e48-20ae-40f6-a034-74af72f17456', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30312960, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4-vda', 'timestamp': '2025-11-29T08:00:48.139519', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-902695925', 'name': 'instance-000000b8', 'instance_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '83f6c7d2-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8695.770534529, 'message_signature': '0fa98e4d4e5d84a2311b7e24a736ac7e351a50cad655f73707caa55e58ab2033'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4-sda', 'timestamp': '2025-11-29T08:00:48.139519', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-902695925', 'name': 'instance-000000b8', 'instance_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '83f6d42a-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8695.770534529, 'message_signature': '87b454fd056b5c87305d6f3b7795775bc79d294e59b5dabeb870d8c49974b868'}]}, 'timestamp': '2025-11-29 08:00:48.140107', '_unique_id': 'de06fb1366e842f4ba0a1608f3440294'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.140 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.141 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.141 12 DEBUG ceilometer.compute.pollsters [-] 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/network.incoming.packets volume: 14 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '74b92747-3438-4d54-8f4a-22ae645ccc77', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 14, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b8-681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4-tapc7bce92b-6d', 'timestamp': '2025-11-29T08:00:48.141461', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-902695925', 'name': 'tapc7bce92b-6d', 'instance_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:77:82:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc7bce92b-6d'}, 'message_id': '83f713ea-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8695.744327322, 'message_signature': '3ca92692fd6150398f6b2b1a814f20f86e9cf89d9221d6d7c7b43b1b11694561'}]}, 'timestamp': '2025-11-29 08:00:48.141755', '_unique_id': '300a90e1e0824c4687a35e25346364f8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.142 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 DEBUG ceilometer.compute.pollsters [-] 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/network.incoming.bytes volume: 1842 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7d7b2a4e-225e-4b60-aab2-f8277be101fd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1842, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b8-681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4-tapc7bce92b-6d', 'timestamp': '2025-11-29T08:00:48.143086', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-902695925', 'name': 'tapc7bce92b-6d', 'instance_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:77:82:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc7bce92b-6d'}, 'message_id': '83f752ec-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8695.744327322, 'message_signature': '27e7dc9102c28e2577e1eed735a8c026d46d77ea32e059260e2b1f90dcf51e23'}]}, 'timestamp': '2025-11-29 08:00:48.143373', '_unique_id': '06044653e617458398c775439384880a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.143 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.144 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.144 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.144 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-902695925>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-902695925>]
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.145 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.145 12 DEBUG ceilometer.compute.pollsters [-] 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.145 12 DEBUG ceilometer.compute.pollsters [-] 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9422111e-8f77-4dcf-b6d5-a5f64797074d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4-vda', 'timestamp': '2025-11-29T08:00:48.145083', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-902695925', 'name': 'instance-000000b8', 'instance_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '83f7a0bc-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8695.753119453, 'message_signature': '27df9f0ac0de9de1fae1d2ac360a1a688fb21bbcad3796ffa26951b00527af32'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4-sda', 'timestamp': '2025-11-29T08:00:48.145083', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-902695925', 'name': 'instance-000000b8', 'instance_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '83f7aab2-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8695.753119453, 'message_signature': '2d29fc41c72dab95f15ebcd323c04eb3efd4e68883dd8dbeb6dc0cda839c9c7d'}]}, 'timestamp': '2025-11-29 08:00:48.145603', '_unique_id': '95e8c382441e4146a08a198256b1650c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.146 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 DEBUG ceilometer.compute.pollsters [-] 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4d2e26a5-7a9d-464f-8d3d-e3dc4b62de98', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b8-681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4-tapc7bce92b-6d', 'timestamp': '2025-11-29T08:00:48.146974', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-902695925', 'name': 'tapc7bce92b-6d', 'instance_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4', 'instance_type': 'm1.nano', 'host': '8a7ab400072f7114d2f7fe9bb96085b3a456f889fdb6f90b5cfa6607', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:77:82:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc7bce92b-6d'}, 'message_id': '83f7ead6-ccf9-11f0-8f64-fa163e220349', 'monotonic_time': 8695.744327322, 'message_signature': 'c0b771933567a3387018f2b83e36fab7162e336f264c88d0ba9a2ab3de7fd98a'}]}, 'timestamp': '2025-11-29 08:00:48.147257', '_unique_id': 'a03066bf44484474b2d70d4f05c9f27e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 08:00:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:00:48.147 12 ERROR oslo_messaging.notify.messaging 
Nov 29 08:00:48 compute-0 nova_compute[187185]: 2025-11-29 08:00:48.287 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 08:00:48 compute-0 nova_compute[187185]: 2025-11-29 08:00:48.575 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 08:00:48 compute-0 nova_compute[187185]: 2025-11-29 08:00:48.650 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 08:00:48 compute-0 nova_compute[187185]: 2025-11-29 08:00:48.650 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.771s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 08:00:49 compute-0 nova_compute[187185]: 2025-11-29 08:00:49.301 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:49 compute-0 nova_compute[187185]: 2025-11-29 08:00:49.597 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:50 compute-0 nova_compute[187185]: 2025-11-29 08:00:50.651 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:00:50 compute-0 nova_compute[187185]: 2025-11-29 08:00:50.653 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 08:00:51 compute-0 nova_compute[187185]: 2025-11-29 08:00:51.744 187189 DEBUG oslo_concurrency.lockutils [None req-57f721f8-2696-4e85-b0c6-15947d9d3f4d dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 08:00:51 compute-0 nova_compute[187185]: 2025-11-29 08:00:51.745 187189 DEBUG oslo_concurrency.lockutils [None req-57f721f8-2696-4e85-b0c6-15947d9d3f4d dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 08:00:51 compute-0 nova_compute[187185]: 2025-11-29 08:00:51.745 187189 DEBUG oslo_concurrency.lockutils [None req-57f721f8-2696-4e85-b0c6-15947d9d3f4d dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 08:00:51 compute-0 nova_compute[187185]: 2025-11-29 08:00:51.745 187189 DEBUG oslo_concurrency.lockutils [None req-57f721f8-2696-4e85-b0c6-15947d9d3f4d dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 08:00:51 compute-0 nova_compute[187185]: 2025-11-29 08:00:51.746 187189 DEBUG oslo_concurrency.lockutils [None req-57f721f8-2696-4e85-b0c6-15947d9d3f4d dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 08:00:51 compute-0 nova_compute[187185]: 2025-11-29 08:00:51.765 187189 INFO nova.compute.manager [None req-57f721f8-2696-4e85-b0c6-15947d9d3f4d dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Terminating instance
Nov 29 08:00:51 compute-0 nova_compute[187185]: 2025-11-29 08:00:51.779 187189 DEBUG nova.compute.manager [None req-57f721f8-2696-4e85-b0c6-15947d9d3f4d dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 08:00:51 compute-0 kernel: tapc7bce92b-6d (unregistering): left promiscuous mode
Nov 29 08:00:51 compute-0 NetworkManager[55227]: <info>  [1764403251.8226] device (tapc7bce92b-6d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 08:00:51 compute-0 nova_compute[187185]: 2025-11-29 08:00:51.830 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:51 compute-0 ovn_controller[95281]: 2025-11-29T08:00:51Z|00628|binding|INFO|Releasing lport c7bce92b-6ded-4a7b-b9cc-ea46b70d977f from this chassis (sb_readonly=0)
Nov 29 08:00:51 compute-0 ovn_controller[95281]: 2025-11-29T08:00:51Z|00629|binding|INFO|Setting lport c7bce92b-6ded-4a7b-b9cc-ea46b70d977f down in Southbound
Nov 29 08:00:51 compute-0 ovn_controller[95281]: 2025-11-29T08:00:51Z|00630|binding|INFO|Removing iface tapc7bce92b-6d ovn-installed in OVS
Nov 29 08:00:51 compute-0 nova_compute[187185]: 2025-11-29 08:00:51.833 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:51 compute-0 nova_compute[187185]: 2025-11-29 08:00:51.854 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:51 compute-0 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d000000b8.scope: Deactivated successfully.
Nov 29 08:00:51 compute-0 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d000000b8.scope: Consumed 13.463s CPU time.
Nov 29 08:00:51 compute-0 systemd-machined[153486]: Machine qemu-71-instance-000000b8 terminated.
Nov 29 08:00:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:52.034 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:77:82:d3 10.100.0.9'], port_security=['fa:16:3e:77:82:d3 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ff3fc050-ba7a-4fdb-b763-76384fb9149e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'neutron:revision_number': '5', 'neutron:security_group_ids': '83072d80-40fc-4286-be5e-979aa5482546', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d94b7d30-29ce-4bb4-a204-dc237d12f274, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>], logical_port=c7bce92b-6ded-4a7b-b9cc-ea46b70d977f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3578b796a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 08:00:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:52.040 104254 INFO neutron.agent.ovn.metadata.agent [-] Port c7bce92b-6ded-4a7b-b9cc-ea46b70d977f in datapath ff3fc050-ba7a-4fdb-b763-76384fb9149e unbound from our chassis
Nov 29 08:00:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:52.044 104254 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ff3fc050-ba7a-4fdb-b763-76384fb9149e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 08:00:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:52.047 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[5637d852-0526-4aa5-b255-f45a977b0b77]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 08:00:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:52.047 104254 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ff3fc050-ba7a-4fdb-b763-76384fb9149e namespace which is not needed anymore
Nov 29 08:00:52 compute-0 nova_compute[187185]: 2025-11-29 08:00:52.068 187189 INFO nova.virt.libvirt.driver [-] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Instance destroyed successfully.
Nov 29 08:00:52 compute-0 nova_compute[187185]: 2025-11-29 08:00:52.069 187189 DEBUG nova.objects.instance [None req-57f721f8-2696-4e85-b0c6-15947d9d3f4d dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lazy-loading 'resources' on Instance uuid 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 08:00:52 compute-0 nova_compute[187185]: 2025-11-29 08:00:52.088 187189 DEBUG nova.virt.libvirt.vif [None req-57f721f8-2696-4e85-b0c6-15947d9d3f4d dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T08:00:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-902695925',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-902695925',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2022058758-ge',id=184,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA6QStI4WZvM8rjCdYdMBIewiD6P/FbGD7wLorCZFoLA7wHCNE+3A9eASDdB+YjnY2gBekoYo19AK1G+4bESS2fKDisJFlhhgBaK7LaZRlIAVhbJNU4lhjciEXZNpjSjZw==',key_name='tempest-TestSecurityGroupsBasicOps-1392443600',keypairs=<?>,launch_index=0,launched_at=2025-11-29T08:00:33Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e8e45e91223b45a79dd698a82af4a2a5',ramdisk_id='',reservation_id='r-bxmyiq49',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-2022058758',owner_user_name='tempest-TestSecurityGroupsBasicOps-2022058758-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T08:00:33Z,user_data=None,user_id='dec30fbde18e4b2382ea2c59847d067f',uuid=681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c7bce92b-6ded-4a7b-b9cc-ea46b70d977f", "address": "fa:16:3e:77:82:d3", "network": {"id": "ff3fc050-ba7a-4fdb-b763-76384fb9149e", "bridge": "br-int", "label": "tempest-network-smoke--1131964355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7bce92b-6d", "ovs_interfaceid": "c7bce92b-6ded-4a7b-b9cc-ea46b70d977f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 08:00:52 compute-0 nova_compute[187185]: 2025-11-29 08:00:52.089 187189 DEBUG nova.network.os_vif_util [None req-57f721f8-2696-4e85-b0c6-15947d9d3f4d dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converting VIF {"id": "c7bce92b-6ded-4a7b-b9cc-ea46b70d977f", "address": "fa:16:3e:77:82:d3", "network": {"id": "ff3fc050-ba7a-4fdb-b763-76384fb9149e", "bridge": "br-int", "label": "tempest-network-smoke--1131964355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc7bce92b-6d", "ovs_interfaceid": "c7bce92b-6ded-4a7b-b9cc-ea46b70d977f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 08:00:52 compute-0 nova_compute[187185]: 2025-11-29 08:00:52.090 187189 DEBUG nova.network.os_vif_util [None req-57f721f8-2696-4e85-b0c6-15947d9d3f4d dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:77:82:d3,bridge_name='br-int',has_traffic_filtering=True,id=c7bce92b-6ded-4a7b-b9cc-ea46b70d977f,network=Network(ff3fc050-ba7a-4fdb-b763-76384fb9149e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc7bce92b-6d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 08:00:52 compute-0 nova_compute[187185]: 2025-11-29 08:00:52.091 187189 DEBUG os_vif [None req-57f721f8-2696-4e85-b0c6-15947d9d3f4d dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:77:82:d3,bridge_name='br-int',has_traffic_filtering=True,id=c7bce92b-6ded-4a7b-b9cc-ea46b70d977f,network=Network(ff3fc050-ba7a-4fdb-b763-76384fb9149e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc7bce92b-6d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 08:00:52 compute-0 nova_compute[187185]: 2025-11-29 08:00:52.093 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:52 compute-0 nova_compute[187185]: 2025-11-29 08:00:52.094 187189 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc7bce92b-6d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 08:00:52 compute-0 nova_compute[187185]: 2025-11-29 08:00:52.272 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:52 compute-0 nova_compute[187185]: 2025-11-29 08:00:52.277 187189 INFO os_vif [None req-57f721f8-2696-4e85-b0c6-15947d9d3f4d dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:77:82:d3,bridge_name='br-int',has_traffic_filtering=True,id=c7bce92b-6ded-4a7b-b9cc-ea46b70d977f,network=Network(ff3fc050-ba7a-4fdb-b763-76384fb9149e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc7bce92b-6d')
Nov 29 08:00:52 compute-0 nova_compute[187185]: 2025-11-29 08:00:52.277 187189 INFO nova.virt.libvirt.driver [None req-57f721f8-2696-4e85-b0c6-15947d9d3f4d dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Deleting instance files /var/lib/nova/instances/681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4_del
Nov 29 08:00:52 compute-0 nova_compute[187185]: 2025-11-29 08:00:52.278 187189 INFO nova.virt.libvirt.driver [None req-57f721f8-2696-4e85-b0c6-15947d9d3f4d dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Deletion of /var/lib/nova/instances/681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4_del complete
Nov 29 08:00:52 compute-0 nova_compute[187185]: 2025-11-29 08:00:52.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:00:52 compute-0 neutron-haproxy-ovnmeta-ff3fc050-ba7a-4fdb-b763-76384fb9149e[250995]: [NOTICE]   (250999) : haproxy version is 2.8.14-c23fe91
Nov 29 08:00:52 compute-0 neutron-haproxy-ovnmeta-ff3fc050-ba7a-4fdb-b763-76384fb9149e[250995]: [NOTICE]   (250999) : path to executable is /usr/sbin/haproxy
Nov 29 08:00:52 compute-0 neutron-haproxy-ovnmeta-ff3fc050-ba7a-4fdb-b763-76384fb9149e[250995]: [WARNING]  (250999) : Exiting Master process...
Nov 29 08:00:52 compute-0 neutron-haproxy-ovnmeta-ff3fc050-ba7a-4fdb-b763-76384fb9149e[250995]: [ALERT]    (250999) : Current worker (251001) exited with code 143 (Terminated)
Nov 29 08:00:52 compute-0 neutron-haproxy-ovnmeta-ff3fc050-ba7a-4fdb-b763-76384fb9149e[250995]: [WARNING]  (250999) : All workers exited. Exiting... (0)
Nov 29 08:00:52 compute-0 systemd[1]: libpod-56799115b4281c4aee6326d7b8b25b454bdc6a7080ba9e1c5a5c770ffb28649e.scope: Deactivated successfully.
Nov 29 08:00:52 compute-0 podman[251167]: 2025-11-29 08:00:52.372854404 +0000 UTC m=+0.045949521 container died 56799115b4281c4aee6326d7b8b25b454bdc6a7080ba9e1c5a5c770ffb28649e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ff3fc050-ba7a-4fdb-b763-76384fb9149e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 08:00:52 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-56799115b4281c4aee6326d7b8b25b454bdc6a7080ba9e1c5a5c770ffb28649e-userdata-shm.mount: Deactivated successfully.
Nov 29 08:00:52 compute-0 systemd[1]: var-lib-containers-storage-overlay-b583c2562be636b452186c3c6683e1bc416443c6a565754d84033fd6b0d3adad-merged.mount: Deactivated successfully.
Nov 29 08:00:52 compute-0 podman[251167]: 2025-11-29 08:00:52.414284205 +0000 UTC m=+0.087379322 container cleanup 56799115b4281c4aee6326d7b8b25b454bdc6a7080ba9e1c5a5c770ffb28649e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ff3fc050-ba7a-4fdb-b763-76384fb9149e, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 29 08:00:52 compute-0 systemd[1]: libpod-conmon-56799115b4281c4aee6326d7b8b25b454bdc6a7080ba9e1c5a5c770ffb28649e.scope: Deactivated successfully.
Nov 29 08:00:52 compute-0 podman[251199]: 2025-11-29 08:00:52.489439957 +0000 UTC m=+0.047444693 container remove 56799115b4281c4aee6326d7b8b25b454bdc6a7080ba9e1c5a5c770ffb28649e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ff3fc050-ba7a-4fdb-b763-76384fb9149e, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 29 08:00:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:52.494 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[04c48d22-e586-4010-b369-9ddef798bca0]: (4, ('Sat Nov 29 08:00:52 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ff3fc050-ba7a-4fdb-b763-76384fb9149e (56799115b4281c4aee6326d7b8b25b454bdc6a7080ba9e1c5a5c770ffb28649e)\n56799115b4281c4aee6326d7b8b25b454bdc6a7080ba9e1c5a5c770ffb28649e\nSat Nov 29 08:00:52 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ff3fc050-ba7a-4fdb-b763-76384fb9149e (56799115b4281c4aee6326d7b8b25b454bdc6a7080ba9e1c5a5c770ffb28649e)\n56799115b4281c4aee6326d7b8b25b454bdc6a7080ba9e1c5a5c770ffb28649e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 08:00:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:52.496 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[b28f88f2-e40d-4b81-80c5-404d9db6a250]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 08:00:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:52.497 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapff3fc050-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 08:00:52 compute-0 nova_compute[187185]: 2025-11-29 08:00:52.499 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:52 compute-0 kernel: tapff3fc050-b0: left promiscuous mode
Nov 29 08:00:52 compute-0 nova_compute[187185]: 2025-11-29 08:00:52.502 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:52.504 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[5c46f31b-a69b-45e4-9679-c43eadf6eb09]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 08:00:52 compute-0 nova_compute[187185]: 2025-11-29 08:00:52.519 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:52.529 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[ff8e89e0-cacd-41ad-8791-fe3ee78c4ab7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 08:00:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:52.531 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[c529308d-a91f-45d8-8605-25220676a8ee]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 08:00:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:52.548 214223 DEBUG oslo.privsep.daemon [-] privsep: reply[b311a8a6-e085-4019-8b00-1f106b09e783]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 867838, 'reachable_time': 42432, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 251214, 'error': None, 'target': 'ovnmeta-ff3fc050-ba7a-4fdb-b763-76384fb9149e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 08:00:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:52.553 104366 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ff3fc050-ba7a-4fdb-b763-76384fb9149e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 08:00:52 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:52.554 104366 DEBUG oslo.privsep.daemon [-] privsep: reply[cf0b613a-8908-4193-8105-1d8ccc3128da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 08:00:52 compute-0 systemd[1]: run-netns-ovnmeta\x2dff3fc050\x2dba7a\x2d4fdb\x2db763\x2d76384fb9149e.mount: Deactivated successfully.
Nov 29 08:00:52 compute-0 nova_compute[187185]: 2025-11-29 08:00:52.571 187189 DEBUG nova.compute.manager [req-004d9499-21c9-4144-a7bd-da27e29c9e4c req-3da2c2a2-d567-4216-9617-5d04d0134120 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Received event network-vif-unplugged-c7bce92b-6ded-4a7b-b9cc-ea46b70d977f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 08:00:52 compute-0 nova_compute[187185]: 2025-11-29 08:00:52.573 187189 DEBUG oslo_concurrency.lockutils [req-004d9499-21c9-4144-a7bd-da27e29c9e4c req-3da2c2a2-d567-4216-9617-5d04d0134120 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 08:00:52 compute-0 nova_compute[187185]: 2025-11-29 08:00:52.574 187189 DEBUG oslo_concurrency.lockutils [req-004d9499-21c9-4144-a7bd-da27e29c9e4c req-3da2c2a2-d567-4216-9617-5d04d0134120 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 08:00:52 compute-0 nova_compute[187185]: 2025-11-29 08:00:52.574 187189 DEBUG oslo_concurrency.lockutils [req-004d9499-21c9-4144-a7bd-da27e29c9e4c req-3da2c2a2-d567-4216-9617-5d04d0134120 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 08:00:52 compute-0 nova_compute[187185]: 2025-11-29 08:00:52.575 187189 DEBUG nova.compute.manager [req-004d9499-21c9-4144-a7bd-da27e29c9e4c req-3da2c2a2-d567-4216-9617-5d04d0134120 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] No waiting events found dispatching network-vif-unplugged-c7bce92b-6ded-4a7b-b9cc-ea46b70d977f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 08:00:52 compute-0 nova_compute[187185]: 2025-11-29 08:00:52.575 187189 DEBUG nova.compute.manager [req-004d9499-21c9-4144-a7bd-da27e29c9e4c req-3da2c2a2-d567-4216-9617-5d04d0134120 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Received event network-vif-unplugged-c7bce92b-6ded-4a7b-b9cc-ea46b70d977f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 08:00:52 compute-0 nova_compute[187185]: 2025-11-29 08:00:52.682 187189 INFO nova.compute.manager [None req-57f721f8-2696-4e85-b0c6-15947d9d3f4d dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Took 0.90 seconds to destroy the instance on the hypervisor.
Nov 29 08:00:52 compute-0 nova_compute[187185]: 2025-11-29 08:00:52.683 187189 DEBUG oslo.service.loopingcall [None req-57f721f8-2696-4e85-b0c6-15947d9d3f4d dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 08:00:52 compute-0 nova_compute[187185]: 2025-11-29 08:00:52.684 187189 DEBUG nova.compute.manager [-] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 08:00:52 compute-0 nova_compute[187185]: 2025-11-29 08:00:52.684 187189 DEBUG nova.network.neutron [-] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 08:00:52 compute-0 sshd-session[251124]: Invalid user user from 20.255.62.58 port 57798
Nov 29 08:00:53 compute-0 sshd-session[251124]: Received disconnect from 20.255.62.58 port 57798:11: Bye Bye [preauth]
Nov 29 08:00:53 compute-0 sshd-session[251124]: Disconnected from invalid user user 20.255.62.58 port 57798 [preauth]
Nov 29 08:00:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:53.190 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=79, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=78) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 08:00:53 compute-0 nova_compute[187185]: 2025-11-29 08:00:53.190 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:53.191 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 08:00:53 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:00:53.193 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '79'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 08:00:54 compute-0 nova_compute[187185]: 2025-11-29 08:00:54.624 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:54 compute-0 nova_compute[187185]: 2025-11-29 08:00:54.653 187189 DEBUG nova.compute.manager [req-b7d2cee1-6ccf-4d87-97ec-f1e1b57bc683 req-0e7745bc-9a13-4966-8092-2e3ff78b4190 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Received event network-vif-plugged-c7bce92b-6ded-4a7b-b9cc-ea46b70d977f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 08:00:54 compute-0 nova_compute[187185]: 2025-11-29 08:00:54.654 187189 DEBUG oslo_concurrency.lockutils [req-b7d2cee1-6ccf-4d87-97ec-f1e1b57bc683 req-0e7745bc-9a13-4966-8092-2e3ff78b4190 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 08:00:54 compute-0 nova_compute[187185]: 2025-11-29 08:00:54.654 187189 DEBUG oslo_concurrency.lockutils [req-b7d2cee1-6ccf-4d87-97ec-f1e1b57bc683 req-0e7745bc-9a13-4966-8092-2e3ff78b4190 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 08:00:54 compute-0 nova_compute[187185]: 2025-11-29 08:00:54.655 187189 DEBUG oslo_concurrency.lockutils [req-b7d2cee1-6ccf-4d87-97ec-f1e1b57bc683 req-0e7745bc-9a13-4966-8092-2e3ff78b4190 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 08:00:54 compute-0 nova_compute[187185]: 2025-11-29 08:00:54.655 187189 DEBUG nova.compute.manager [req-b7d2cee1-6ccf-4d87-97ec-f1e1b57bc683 req-0e7745bc-9a13-4966-8092-2e3ff78b4190 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] No waiting events found dispatching network-vif-plugged-c7bce92b-6ded-4a7b-b9cc-ea46b70d977f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 08:00:54 compute-0 nova_compute[187185]: 2025-11-29 08:00:54.655 187189 WARNING nova.compute.manager [req-b7d2cee1-6ccf-4d87-97ec-f1e1b57bc683 req-0e7745bc-9a13-4966-8092-2e3ff78b4190 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Received unexpected event network-vif-plugged-c7bce92b-6ded-4a7b-b9cc-ea46b70d977f for instance with vm_state active and task_state deleting.
Nov 29 08:00:54 compute-0 podman[251215]: 2025-11-29 08:00:54.817705498 +0000 UTC m=+0.067615208 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Nov 29 08:00:54 compute-0 podman[251217]: 2025-11-29 08:00:54.825642114 +0000 UTC m=+0.060565387 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 08:00:54 compute-0 podman[251216]: 2025-11-29 08:00:54.826322104 +0000 UTC m=+0.070241793 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, version=9.6, 
container_name=openstack_network_exporter, maintainer=Red Hat, Inc., config_id=edpm, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.openshift.expose-services=)
Nov 29 08:00:56 compute-0 nova_compute[187185]: 2025-11-29 08:00:56.232 187189 DEBUG nova.network.neutron [-] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 08:00:56 compute-0 nova_compute[187185]: 2025-11-29 08:00:56.250 187189 INFO nova.compute.manager [-] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Took 3.57 seconds to deallocate network for instance.
Nov 29 08:00:56 compute-0 nova_compute[187185]: 2025-11-29 08:00:56.307 187189 DEBUG nova.compute.manager [req-53920fcd-2e6b-4d76-97da-f29cb4d88116 req-9a906600-4f47-4615-a9a4-bb29216f5a65 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Received event network-vif-deleted-c7bce92b-6ded-4a7b-b9cc-ea46b70d977f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 08:00:56 compute-0 nova_compute[187185]: 2025-11-29 08:00:56.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:00:56 compute-0 nova_compute[187185]: 2025-11-29 08:00:56.362 187189 DEBUG oslo_concurrency.lockutils [None req-57f721f8-2696-4e85-b0c6-15947d9d3f4d dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 08:00:56 compute-0 nova_compute[187185]: 2025-11-29 08:00:56.363 187189 DEBUG oslo_concurrency.lockutils [None req-57f721f8-2696-4e85-b0c6-15947d9d3f4d dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 08:00:56 compute-0 nova_compute[187185]: 2025-11-29 08:00:56.426 187189 DEBUG nova.compute.provider_tree [None req-57f721f8-2696-4e85-b0c6-15947d9d3f4d dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 08:00:56 compute-0 nova_compute[187185]: 2025-11-29 08:00:56.444 187189 DEBUG nova.scheduler.client.report [None req-57f721f8-2696-4e85-b0c6-15947d9d3f4d dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 08:00:56 compute-0 nova_compute[187185]: 2025-11-29 08:00:56.464 187189 DEBUG oslo_concurrency.lockutils [None req-57f721f8-2696-4e85-b0c6-15947d9d3f4d dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 08:00:56 compute-0 nova_compute[187185]: 2025-11-29 08:00:56.488 187189 INFO nova.scheduler.client.report [None req-57f721f8-2696-4e85-b0c6-15947d9d3f4d dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Deleted allocations for instance 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4
Nov 29 08:00:56 compute-0 nova_compute[187185]: 2025-11-29 08:00:56.688 187189 DEBUG oslo_concurrency.lockutils [None req-57f721f8-2696-4e85-b0c6-15947d9d3f4d dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.943s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 08:00:57 compute-0 nova_compute[187185]: 2025-11-29 08:00:57.271 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:00:57 compute-0 nova_compute[187185]: 2025-11-29 08:00:57.310 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:00:57 compute-0 nova_compute[187185]: 2025-11-29 08:00:57.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:00:57 compute-0 nova_compute[187185]: 2025-11-29 08:00:57.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 08:00:57 compute-0 nova_compute[187185]: 2025-11-29 08:00:57.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 08:00:57 compute-0 nova_compute[187185]: 2025-11-29 08:00:57.331 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 08:00:59 compute-0 nova_compute[187185]: 2025-11-29 08:00:59.627 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:01:01 compute-0 CROND[251277]: (root) CMD (run-parts /etc/cron.hourly)
Nov 29 08:01:01 compute-0 run-parts[251280]: (/etc/cron.hourly) starting 0anacron
Nov 29 08:01:01 compute-0 run-parts[251286]: (/etc/cron.hourly) finished 0anacron
Nov 29 08:01:01 compute-0 CROND[251276]: (root) CMDEND (run-parts /etc/cron.hourly)
Nov 29 08:01:02 compute-0 nova_compute[187185]: 2025-11-29 08:01:02.314 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:01:03 compute-0 podman[251287]: 2025-11-29 08:01:03.837202034 +0000 UTC m=+0.101279567 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 08:01:04 compute-0 nova_compute[187185]: 2025-11-29 08:01:04.629 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:01:05 compute-0 nova_compute[187185]: 2025-11-29 08:01:05.326 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:01:07 compute-0 nova_compute[187185]: 2025-11-29 08:01:07.067 187189 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764403252.0655649, 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 08:01:07 compute-0 nova_compute[187185]: 2025-11-29 08:01:07.068 187189 INFO nova.compute.manager [-] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] VM Stopped (Lifecycle Event)
Nov 29 08:01:07 compute-0 nova_compute[187185]: 2025-11-29 08:01:07.088 187189 DEBUG nova.compute.manager [None req-b93d4e90-2ae6-44f7-9c2d-0427cf5ae842 - - - - - -] [instance: 681dbbe0-cbd0-47a7-8c10-3040e8ddf5c4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 08:01:07 compute-0 nova_compute[187185]: 2025-11-29 08:01:07.316 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:01:08 compute-0 nova_compute[187185]: 2025-11-29 08:01:08.593 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:01:08 compute-0 nova_compute[187185]: 2025-11-29 08:01:08.739 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:01:09 compute-0 nova_compute[187185]: 2025-11-29 08:01:09.632 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:01:12 compute-0 nova_compute[187185]: 2025-11-29 08:01:12.320 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:01:12 compute-0 podman[251317]: 2025-11-29 08:01:12.797629666 +0000 UTC m=+0.058586110 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 08:01:14 compute-0 nova_compute[187185]: 2025-11-29 08:01:14.683 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:01:14 compute-0 podman[251341]: 2025-11-29 08:01:14.789533781 +0000 UTC m=+0.058185440 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 08:01:14 compute-0 podman[251342]: 2025-11-29 08:01:14.808754949 +0000 UTC m=+0.071404917 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 08:01:17 compute-0 nova_compute[187185]: 2025-11-29 08:01:17.323 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:01:19 compute-0 nova_compute[187185]: 2025-11-29 08:01:19.685 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:01:22 compute-0 nova_compute[187185]: 2025-11-29 08:01:22.326 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:01:24 compute-0 nova_compute[187185]: 2025-11-29 08:01:24.687 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:01:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:01:25.762 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 08:01:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:01:25.762 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 08:01:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:01:25.763 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 08:01:25 compute-0 podman[251382]: 2025-11-29 08:01:25.803191454 +0000 UTC m=+0.063724956 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 29 08:01:25 compute-0 podman[251383]: 2025-11-29 08:01:25.816748241 +0000 UTC m=+0.071669723 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, container_name=openstack_network_exporter, io.buildah.version=1.33.7, architecture=x86_64, build-date=2025-08-20T13:12:41, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 29 08:01:25 compute-0 podman[251384]: 2025-11-29 08:01:25.832800498 +0000 UTC m=+0.084034335 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 08:01:26 compute-0 sshd-session[251380]: Invalid user bitnami from 190.181.27.27 port 53212
Nov 29 08:01:26 compute-0 sshd-session[251380]: Received disconnect from 190.181.27.27 port 53212:11: Bye Bye [preauth]
Nov 29 08:01:26 compute-0 sshd-session[251380]: Disconnected from invalid user bitnami 190.181.27.27 port 53212 [preauth]
Nov 29 08:01:27 compute-0 nova_compute[187185]: 2025-11-29 08:01:27.329 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:01:29 compute-0 nova_compute[187185]: 2025-11-29 08:01:29.690 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:01:32 compute-0 nova_compute[187185]: 2025-11-29 08:01:32.332 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:01:34 compute-0 nova_compute[187185]: 2025-11-29 08:01:34.692 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:01:34 compute-0 sshd-session[251444]: Connection closed by 115.190.187.93 port 38818 [preauth]
Nov 29 08:01:34 compute-0 podman[251445]: 2025-11-29 08:01:34.849938869 +0000 UTC m=+0.101285567 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2)
Nov 29 08:01:37 compute-0 nova_compute[187185]: 2025-11-29 08:01:37.334 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:01:39 compute-0 nova_compute[187185]: 2025-11-29 08:01:39.695 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:01:41 compute-0 nova_compute[187185]: 2025-11-29 08:01:41.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:01:42 compute-0 nova_compute[187185]: 2025-11-29 08:01:42.337 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:01:43 compute-0 nova_compute[187185]: 2025-11-29 08:01:43.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:01:43 compute-0 podman[251475]: 2025-11-29 08:01:43.811799222 +0000 UTC m=+0.067132313 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 08:01:44 compute-0 nova_compute[187185]: 2025-11-29 08:01:44.698 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:01:45 compute-0 podman[251500]: 2025-11-29 08:01:45.804574161 +0000 UTC m=+0.065278632 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 29 08:01:45 compute-0 podman[251499]: 2025-11-29 08:01:45.825894118 +0000 UTC m=+0.079010553 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 29 08:01:47 compute-0 nova_compute[187185]: 2025-11-29 08:01:47.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:01:47 compute-0 nova_compute[187185]: 2025-11-29 08:01:47.339 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:01:48 compute-0 nova_compute[187185]: 2025-11-29 08:01:48.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:01:48 compute-0 nova_compute[187185]: 2025-11-29 08:01:48.357 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 08:01:48 compute-0 nova_compute[187185]: 2025-11-29 08:01:48.357 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 08:01:48 compute-0 nova_compute[187185]: 2025-11-29 08:01:48.357 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 08:01:48 compute-0 nova_compute[187185]: 2025-11-29 08:01:48.358 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 08:01:48 compute-0 nova_compute[187185]: 2025-11-29 08:01:48.539 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 08:01:48 compute-0 nova_compute[187185]: 2025-11-29 08:01:48.543 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5730MB free_disk=73.25086212158203GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 08:01:48 compute-0 nova_compute[187185]: 2025-11-29 08:01:48.544 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 08:01:48 compute-0 nova_compute[187185]: 2025-11-29 08:01:48.544 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 08:01:48 compute-0 nova_compute[187185]: 2025-11-29 08:01:48.769 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 08:01:48 compute-0 nova_compute[187185]: 2025-11-29 08:01:48.770 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 08:01:48 compute-0 nova_compute[187185]: 2025-11-29 08:01:48.790 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Refreshing inventories for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 29 08:01:48 compute-0 nova_compute[187185]: 2025-11-29 08:01:48.911 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Updating ProviderTree inventory for provider 4e39a026-df39-4e20-874a-dbb5a40df044 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 29 08:01:48 compute-0 nova_compute[187185]: 2025-11-29 08:01:48.911 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Updating inventory in ProviderTree for provider 4e39a026-df39-4e20-874a-dbb5a40df044 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 29 08:01:48 compute-0 nova_compute[187185]: 2025-11-29 08:01:48.946 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Refreshing aggregate associations for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 29 08:01:48 compute-0 nova_compute[187185]: 2025-11-29 08:01:48.977 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Refreshing trait associations for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044, traits: HW_CPU_X86_SSE,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_TRUSTED_CERTS,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 29 08:01:49 compute-0 nova_compute[187185]: 2025-11-29 08:01:49.022 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 08:01:49 compute-0 nova_compute[187185]: 2025-11-29 08:01:49.269 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 08:01:49 compute-0 nova_compute[187185]: 2025-11-29 08:01:49.376 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 08:01:49 compute-0 nova_compute[187185]: 2025-11-29 08:01:49.377 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.833s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 08:01:49 compute-0 sshd-session[251524]: Invalid user client from 45.78.219.119 port 37914
Nov 29 08:01:49 compute-0 nova_compute[187185]: 2025-11-29 08:01:49.734 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:01:49 compute-0 sshd-session[251524]: Received disconnect from 45.78.219.119 port 37914:11: Bye Bye [preauth]
Nov 29 08:01:49 compute-0 sshd-session[251524]: Disconnected from invalid user client 45.78.219.119 port 37914 [preauth]
Nov 29 08:01:50 compute-0 nova_compute[187185]: 2025-11-29 08:01:50.378 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:01:50 compute-0 nova_compute[187185]: 2025-11-29 08:01:50.379 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 08:01:52 compute-0 nova_compute[187185]: 2025-11-29 08:01:52.342 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:01:53 compute-0 nova_compute[187185]: 2025-11-29 08:01:53.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:01:54 compute-0 nova_compute[187185]: 2025-11-29 08:01:54.737 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:01:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:01:55.260 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=80, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=79) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 08:01:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:01:55.261 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 08:01:55 compute-0 nova_compute[187185]: 2025-11-29 08:01:55.261 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:01:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:01:55.262 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '80'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 08:01:56 compute-0 podman[251541]: 2025-11-29 08:01:56.795901537 +0000 UTC m=+0.063226853 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 08:01:56 compute-0 podman[251542]: 2025-11-29 08:01:56.812564482 +0000 UTC m=+0.073173397 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, config_id=edpm, container_name=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, release=1755695350)
Nov 29 08:01:56 compute-0 podman[251543]: 2025-11-29 08:01:56.829847854 +0000 UTC m=+0.085236260 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 08:01:57 compute-0 nova_compute[187185]: 2025-11-29 08:01:57.311 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:01:57 compute-0 nova_compute[187185]: 2025-11-29 08:01:57.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:01:57 compute-0 nova_compute[187185]: 2025-11-29 08:01:57.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 08:01:57 compute-0 nova_compute[187185]: 2025-11-29 08:01:57.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 08:01:57 compute-0 nova_compute[187185]: 2025-11-29 08:01:57.330 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 08:01:57 compute-0 nova_compute[187185]: 2025-11-29 08:01:57.345 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:01:58 compute-0 nova_compute[187185]: 2025-11-29 08:01:58.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:01:59 compute-0 nova_compute[187185]: 2025-11-29 08:01:59.739 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:02:02 compute-0 nova_compute[187185]: 2025-11-29 08:02:02.348 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:02:04 compute-0 nova_compute[187185]: 2025-11-29 08:02:04.741 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:02:05 compute-0 podman[251600]: 2025-11-29 08:02:05.834968902 +0000 UTC m=+0.101455993 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 29 08:02:07 compute-0 nova_compute[187185]: 2025-11-29 08:02:07.350 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:02:09 compute-0 nova_compute[187185]: 2025-11-29 08:02:09.744 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:02:12 compute-0 nova_compute[187185]: 2025-11-29 08:02:12.353 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:02:13 compute-0 sshd-session[251627]: Invalid user admin from 20.255.62.58 port 39040
Nov 29 08:02:13 compute-0 sshd-session[251627]: Received disconnect from 20.255.62.58 port 39040:11: Bye Bye [preauth]
Nov 29 08:02:13 compute-0 sshd-session[251627]: Disconnected from invalid user admin 20.255.62.58 port 39040 [preauth]
Nov 29 08:02:14 compute-0 nova_compute[187185]: 2025-11-29 08:02:14.796 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:02:14 compute-0 podman[251629]: 2025-11-29 08:02:14.83312299 +0000 UTC m=+0.100236078 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 08:02:16 compute-0 podman[251653]: 2025-11-29 08:02:16.800612887 +0000 UTC m=+0.068796401 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 29 08:02:16 compute-0 podman[251654]: 2025-11-29 08:02:16.808208884 +0000 UTC m=+0.070120179 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes 
Operator team, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Nov 29 08:02:17 compute-0 nova_compute[187185]: 2025-11-29 08:02:17.356 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:02:19 compute-0 nova_compute[187185]: 2025-11-29 08:02:19.797 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:02:20 compute-0 ovn_controller[95281]: 2025-11-29T08:02:20Z|00631|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Nov 29 08:02:22 compute-0 nova_compute[187185]: 2025-11-29 08:02:22.358 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:02:24 compute-0 nova_compute[187185]: 2025-11-29 08:02:24.799 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:02:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:02:25.763 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 08:02:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:02:25.763 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 08:02:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:02:25.763 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 08:02:27 compute-0 nova_compute[187185]: 2025-11-29 08:02:27.361 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:02:27 compute-0 podman[251692]: 2025-11-29 08:02:27.789807034 +0000 UTC m=+0.056334997 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 08:02:27 compute-0 podman[251694]: 2025-11-29 08:02:27.796081323 +0000 UTC m=+0.052865598 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 08:02:27 compute-0 podman[251693]: 2025-11-29 08:02:27.830813563 +0000 UTC m=+0.090655405 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, vendor=Red Hat, Inc., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, version=9.6, vcs-type=git, architecture=x86_64, io.openshift.tags=minimal rhel9, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, config_id=edpm, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 29 08:02:29 compute-0 nova_compute[187185]: 2025-11-29 08:02:29.801 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:02:32 compute-0 nova_compute[187185]: 2025-11-29 08:02:32.364 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:02:34 compute-0 nova_compute[187185]: 2025-11-29 08:02:34.803 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:02:36 compute-0 sshd-session[251754]: Connection closed by 115.190.187.93 port 49010 [preauth]
Nov 29 08:02:36 compute-0 podman[251756]: 2025-11-29 08:02:36.83948216 +0000 UTC m=+0.098883500 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 29 08:02:37 compute-0 nova_compute[187185]: 2025-11-29 08:02:37.366 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:02:39 compute-0 nova_compute[187185]: 2025-11-29 08:02:39.807 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:02:41 compute-0 nova_compute[187185]: 2025-11-29 08:02:41.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:02:42 compute-0 nova_compute[187185]: 2025-11-29 08:02:42.370 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:02:44 compute-0 nova_compute[187185]: 2025-11-29 08:02:44.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:02:44 compute-0 nova_compute[187185]: 2025-11-29 08:02:44.808 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:02:45 compute-0 nova_compute[187185]: 2025-11-29 08:02:45.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:02:45 compute-0 nova_compute[187185]: 2025-11-29 08:02:45.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 29 08:02:45 compute-0 podman[251783]: 2025-11-29 08:02:45.831089981 +0000 UTC m=+0.094711880 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 08:02:47 compute-0 nova_compute[187185]: 2025-11-29 08:02:47.332 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:02:47 compute-0 nova_compute[187185]: 2025-11-29 08:02:47.374 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:02:47 compute-0 podman[251809]: 2025-11-29 08:02:47.805421645 +0000 UTC m=+0.059785335 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 08:02:47 compute-0 podman[251810]: 2025-11-29 08:02:47.805406494 +0000 UTC m=+0.059111156 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 29 08:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:02:48.024 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:02:48.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:02:48.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:02:48.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:02:48.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:02:48.026 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:02:48.026 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:02:48.026 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:02:48.026 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:02:48.026 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:02:48.026 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:02:48.026 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:02:48.026 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:02:48.026 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:02:48.026 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:02:48.027 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:02:48.027 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:02:48.027 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:02:48.027 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:02:48.027 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:02:48.027 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:02:48.027 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:02:48.027 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:02:48.027 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:02:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:02:48.027 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:02:48 compute-0 nova_compute[187185]: 2025-11-29 08:02:48.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:02:48 compute-0 nova_compute[187185]: 2025-11-29 08:02:48.342 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 08:02:48 compute-0 nova_compute[187185]: 2025-11-29 08:02:48.343 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 08:02:48 compute-0 nova_compute[187185]: 2025-11-29 08:02:48.343 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 08:02:48 compute-0 nova_compute[187185]: 2025-11-29 08:02:48.343 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 08:02:48 compute-0 nova_compute[187185]: 2025-11-29 08:02:48.508 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 08:02:48 compute-0 nova_compute[187185]: 2025-11-29 08:02:48.509 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5743MB free_disk=73.25086212158203GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 08:02:48 compute-0 nova_compute[187185]: 2025-11-29 08:02:48.509 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 08:02:48 compute-0 nova_compute[187185]: 2025-11-29 08:02:48.510 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 08:02:48 compute-0 nova_compute[187185]: 2025-11-29 08:02:48.571 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 08:02:48 compute-0 nova_compute[187185]: 2025-11-29 08:02:48.571 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 08:02:48 compute-0 nova_compute[187185]: 2025-11-29 08:02:48.591 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 08:02:48 compute-0 nova_compute[187185]: 2025-11-29 08:02:48.606 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 08:02:48 compute-0 nova_compute[187185]: 2025-11-29 08:02:48.608 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 08:02:48 compute-0 nova_compute[187185]: 2025-11-29 08:02:48.608 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 08:02:49 compute-0 nova_compute[187185]: 2025-11-29 08:02:49.811 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:02:50 compute-0 sshd-session[251850]: Received disconnect from 190.181.27.27 port 39594:11: Bye Bye [preauth]
Nov 29 08:02:50 compute-0 sshd-session[251850]: Disconnected from authenticating user root 190.181.27.27 port 39594 [preauth]
Nov 29 08:02:51 compute-0 nova_compute[187185]: 2025-11-29 08:02:51.609 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:02:51 compute-0 nova_compute[187185]: 2025-11-29 08:02:51.610 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 08:02:52 compute-0 nova_compute[187185]: 2025-11-29 08:02:52.376 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:02:54 compute-0 nova_compute[187185]: 2025-11-29 08:02:54.813 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:02:55 compute-0 nova_compute[187185]: 2025-11-29 08:02:55.318 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:02:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:02:55.773 104254 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=81, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=80) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 08:02:55 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:02:55.774 104254 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 08:02:55 compute-0 nova_compute[187185]: 2025-11-29 08:02:55.818 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:02:57 compute-0 nova_compute[187185]: 2025-11-29 08:02:57.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:02:57 compute-0 nova_compute[187185]: 2025-11-29 08:02:57.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 08:02:57 compute-0 nova_compute[187185]: 2025-11-29 08:02:57.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 08:02:57 compute-0 nova_compute[187185]: 2025-11-29 08:02:57.380 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:02:57 compute-0 nova_compute[187185]: 2025-11-29 08:02:57.455 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 08:02:58 compute-0 nova_compute[187185]: 2025-11-29 08:02:58.450 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:02:58 compute-0 podman[251853]: 2025-11-29 08:02:58.792276635 +0000 UTC m=+0.052899178 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent)
Nov 29 08:02:58 compute-0 podman[251855]: 2025-11-29 08:02:58.796056183 +0000 UTC m=+0.047867385 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 08:02:58 compute-0 podman[251854]: 2025-11-29 08:02:58.805638846 +0000 UTC m=+0.063319535 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.buildah.version=1.33.7, vcs-type=git, version=9.6)
Nov 29 08:02:59 compute-0 nova_compute[187185]: 2025-11-29 08:02:59.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:02:59 compute-0 nova_compute[187185]: 2025-11-29 08:02:59.815 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:03:01 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:03:01.778 104254 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7525db09-7529-4df7-96c0-bba03a4d5548, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '81'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 08:03:02 compute-0 nova_compute[187185]: 2025-11-29 08:03:02.383 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:03:04 compute-0 nova_compute[187185]: 2025-11-29 08:03:04.817 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:03:06 compute-0 nova_compute[187185]: 2025-11-29 08:03:06.311 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:03:07 compute-0 nova_compute[187185]: 2025-11-29 08:03:07.385 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:03:07 compute-0 podman[251910]: 2025-11-29 08:03:07.816142615 +0000 UTC m=+0.085111897 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Nov 29 08:03:09 compute-0 nova_compute[187185]: 2025-11-29 08:03:09.820 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:03:12 compute-0 nova_compute[187185]: 2025-11-29 08:03:12.389 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:03:14 compute-0 nova_compute[187185]: 2025-11-29 08:03:14.822 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:03:16 compute-0 podman[251938]: 2025-11-29 08:03:16.788434346 +0000 UTC m=+0.058159137 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 08:03:17 compute-0 nova_compute[187185]: 2025-11-29 08:03:17.391 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:03:18 compute-0 podman[251962]: 2025-11-29 08:03:18.800241218 +0000 UTC m=+0.062095821 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 29 08:03:18 compute-0 podman[251963]: 2025-11-29 08:03:18.800852895 +0000 UTC m=+0.060041642 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 29 08:03:19 compute-0 nova_compute[187185]: 2025-11-29 08:03:19.823 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:03:22 compute-0 nova_compute[187185]: 2025-11-29 08:03:22.394 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:03:24 compute-0 nova_compute[187185]: 2025-11-29 08:03:24.826 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:03:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:03:25.764 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 08:03:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:03:25.765 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 08:03:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:03:25.765 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 08:03:26 compute-0 nova_compute[187185]: 2025-11-29 08:03:26.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:03:26 compute-0 nova_compute[187185]: 2025-11-29 08:03:26.317 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 08:03:26 compute-0 nova_compute[187185]: 2025-11-29 08:03:26.318 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 08:03:26 compute-0 nova_compute[187185]: 2025-11-29 08:03:26.319 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 08:03:26 compute-0 nova_compute[187185]: 2025-11-29 08:03:26.319 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 08:03:26 compute-0 nova_compute[187185]: 2025-11-29 08:03:26.320 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 08:03:26 compute-0 nova_compute[187185]: 2025-11-29 08:03:26.320 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 08:03:26 compute-0 nova_compute[187185]: 2025-11-29 08:03:26.359 187189 DEBUG nova.virt.libvirt.imagecache [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314
Nov 29 08:03:26 compute-0 nova_compute[187185]: 2025-11-29 08:03:26.360 187189 WARNING nova.virt.libvirt.imagecache [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28
Nov 29 08:03:26 compute-0 nova_compute[187185]: 2025-11-29 08:03:26.360 187189 WARNING nova.virt.libvirt.imagecache [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123
Nov 29 08:03:26 compute-0 nova_compute[187185]: 2025-11-29 08:03:26.360 187189 WARNING nova.virt.libvirt.imagecache [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/f54dd85e52fe479e36220a2e2d112289f5828e52
Nov 29 08:03:26 compute-0 nova_compute[187185]: 2025-11-29 08:03:26.360 187189 INFO nova.virt.libvirt.imagecache [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Removable base files: /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 /var/lib/nova/instances/_base/f54dd85e52fe479e36220a2e2d112289f5828e52
Nov 29 08:03:26 compute-0 nova_compute[187185]: 2025-11-29 08:03:26.361 187189 INFO nova.virt.libvirt.imagecache [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28
Nov 29 08:03:26 compute-0 nova_compute[187185]: 2025-11-29 08:03:26.361 187189 INFO nova.virt.libvirt.imagecache [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123
Nov 29 08:03:26 compute-0 nova_compute[187185]: 2025-11-29 08:03:26.361 187189 INFO nova.virt.libvirt.imagecache [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/f54dd85e52fe479e36220a2e2d112289f5828e52
Nov 29 08:03:26 compute-0 nova_compute[187185]: 2025-11-29 08:03:26.361 187189 DEBUG nova.virt.libvirt.imagecache [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350
Nov 29 08:03:26 compute-0 nova_compute[187185]: 2025-11-29 08:03:26.361 187189 DEBUG nova.virt.libvirt.imagecache [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299
Nov 29 08:03:26 compute-0 nova_compute[187185]: 2025-11-29 08:03:26.361 187189 DEBUG nova.virt.libvirt.imagecache [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284
Nov 29 08:03:27 compute-0 nova_compute[187185]: 2025-11-29 08:03:27.397 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:03:29 compute-0 podman[251996]: 2025-11-29 08:03:29.792977287 +0000 UTC m=+0.056579272 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 29 08:03:29 compute-0 podman[251998]: 2025-11-29 08:03:29.809972972 +0000 UTC m=+0.064054436 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 08:03:29 compute-0 podman[251997]: 2025-11-29 08:03:29.82745163 +0000 UTC m=+0.088236255 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_id=edpm, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, architecture=x86_64, maintainer=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git)
Nov 29 08:03:29 compute-0 nova_compute[187185]: 2025-11-29 08:03:29.828 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:03:32 compute-0 nova_compute[187185]: 2025-11-29 08:03:32.400 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:03:34 compute-0 nova_compute[187185]: 2025-11-29 08:03:34.830 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:03:36 compute-0 sshd-session[252055]: Connection closed by 115.190.187.93 port 36676 [preauth]
Nov 29 08:03:37 compute-0 sshd-session[252057]: Invalid user mailuser from 20.255.62.58 port 37290
Nov 29 08:03:37 compute-0 nova_compute[187185]: 2025-11-29 08:03:37.403 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:03:37 compute-0 sshd-session[252057]: Received disconnect from 20.255.62.58 port 37290:11: Bye Bye [preauth]
Nov 29 08:03:37 compute-0 sshd-session[252057]: Disconnected from invalid user mailuser 20.255.62.58 port 37290 [preauth]
Nov 29 08:03:38 compute-0 podman[252059]: 2025-11-29 08:03:38.825745329 +0000 UTC m=+0.085715823 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 29 08:03:39 compute-0 nova_compute[187185]: 2025-11-29 08:03:39.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:03:39 compute-0 nova_compute[187185]: 2025-11-29 08:03:39.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 29 08:03:39 compute-0 nova_compute[187185]: 2025-11-29 08:03:39.332 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 29 08:03:39 compute-0 nova_compute[187185]: 2025-11-29 08:03:39.833 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:03:42 compute-0 nova_compute[187185]: 2025-11-29 08:03:42.332 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:03:42 compute-0 nova_compute[187185]: 2025-11-29 08:03:42.406 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:03:44 compute-0 nova_compute[187185]: 2025-11-29 08:03:44.834 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:03:45 compute-0 nova_compute[187185]: 2025-11-29 08:03:45.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:03:47 compute-0 nova_compute[187185]: 2025-11-29 08:03:47.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:03:47 compute-0 nova_compute[187185]: 2025-11-29 08:03:47.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:03:47 compute-0 nova_compute[187185]: 2025-11-29 08:03:47.408 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:03:47 compute-0 podman[252085]: 2025-11-29 08:03:47.784750732 +0000 UTC m=+0.051900619 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 08:03:49 compute-0 nova_compute[187185]: 2025-11-29 08:03:49.350 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:03:49 compute-0 nova_compute[187185]: 2025-11-29 08:03:49.479 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 08:03:49 compute-0 nova_compute[187185]: 2025-11-29 08:03:49.479 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 08:03:49 compute-0 nova_compute[187185]: 2025-11-29 08:03:49.479 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 08:03:49 compute-0 nova_compute[187185]: 2025-11-29 08:03:49.479 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 08:03:49 compute-0 nova_compute[187185]: 2025-11-29 08:03:49.681 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 08:03:49 compute-0 nova_compute[187185]: 2025-11-29 08:03:49.683 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5734MB free_disk=73.25088500976562GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 08:03:49 compute-0 nova_compute[187185]: 2025-11-29 08:03:49.683 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 08:03:49 compute-0 nova_compute[187185]: 2025-11-29 08:03:49.683 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 08:03:49 compute-0 nova_compute[187185]: 2025-11-29 08:03:49.772 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 08:03:49 compute-0 nova_compute[187185]: 2025-11-29 08:03:49.772 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 08:03:49 compute-0 podman[252109]: 2025-11-29 08:03:49.801496583 +0000 UTC m=+0.058823958 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 08:03:49 compute-0 nova_compute[187185]: 2025-11-29 08:03:49.803 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 08:03:49 compute-0 podman[252110]: 2025-11-29 08:03:49.811715164 +0000 UTC m=+0.063125150 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Nov 29 08:03:49 compute-0 nova_compute[187185]: 2025-11-29 08:03:49.836 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:03:49 compute-0 nova_compute[187185]: 2025-11-29 08:03:49.905 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 08:03:49 compute-0 nova_compute[187185]: 2025-11-29 08:03:49.907 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 08:03:49 compute-0 nova_compute[187185]: 2025-11-29 08:03:49.907 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.224s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 08:03:52 compute-0 nova_compute[187185]: 2025-11-29 08:03:52.411 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:03:52 compute-0 nova_compute[187185]: 2025-11-29 08:03:52.874 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:03:52 compute-0 nova_compute[187185]: 2025-11-29 08:03:52.875 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 08:03:54 compute-0 nova_compute[187185]: 2025-11-29 08:03:54.839 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:03:56 compute-0 nova_compute[187185]: 2025-11-29 08:03:56.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:03:57 compute-0 nova_compute[187185]: 2025-11-29 08:03:57.436 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:03:58 compute-0 nova_compute[187185]: 2025-11-29 08:03:58.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:03:58 compute-0 nova_compute[187185]: 2025-11-29 08:03:58.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 08:03:58 compute-0 nova_compute[187185]: 2025-11-29 08:03:58.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 08:03:58 compute-0 nova_compute[187185]: 2025-11-29 08:03:58.704 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 08:03:59 compute-0 nova_compute[187185]: 2025-11-29 08:03:59.882 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:04:00 compute-0 nova_compute[187185]: 2025-11-29 08:04:00.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:04:00 compute-0 nova_compute[187185]: 2025-11-29 08:04:00.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:04:00 compute-0 podman[252148]: 2025-11-29 08:04:00.824894926 +0000 UTC m=+0.066487626 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 08:04:00 compute-0 podman[252146]: 2025-11-29 08:04:00.825605746 +0000 UTC m=+0.070382696 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 29 08:04:00 compute-0 podman[252147]: 2025-11-29 08:04:00.842682102 +0000 UTC m=+0.090513259 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_id=edpm, maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, architecture=x86_64, container_name=openstack_network_exporter, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, distribution-scope=public, name=ubi9-minimal, vendor=Red Hat, Inc.)
Nov 29 08:04:02 compute-0 nova_compute[187185]: 2025-11-29 08:04:02.439 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:04:04 compute-0 nova_compute[187185]: 2025-11-29 08:04:04.884 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:04:07 compute-0 nova_compute[187185]: 2025-11-29 08:04:07.442 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:04:09 compute-0 podman[252207]: 2025-11-29 08:04:09.843906507 +0000 UTC m=+0.114627437 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 08:04:09 compute-0 nova_compute[187185]: 2025-11-29 08:04:09.886 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:04:10 compute-0 nova_compute[187185]: 2025-11-29 08:04:10.577 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:04:12 compute-0 nova_compute[187185]: 2025-11-29 08:04:12.444 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:04:14 compute-0 sshd-session[252234]: Received disconnect from 190.181.27.27 port 37458:11: Bye Bye [preauth]
Nov 29 08:04:14 compute-0 sshd-session[252234]: Disconnected from authenticating user root 190.181.27.27 port 37458 [preauth]
Nov 29 08:04:14 compute-0 nova_compute[187185]: 2025-11-29 08:04:14.888 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:04:17 compute-0 nova_compute[187185]: 2025-11-29 08:04:17.447 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:04:18 compute-0 podman[252236]: 2025-11-29 08:04:18.815227249 +0000 UTC m=+0.082873853 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 08:04:19 compute-0 nova_compute[187185]: 2025-11-29 08:04:19.890 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:04:20 compute-0 podman[252259]: 2025-11-29 08:04:20.816693915 +0000 UTC m=+0.079052524 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 08:04:20 compute-0 podman[252258]: 2025-11-29 08:04:20.818342782 +0000 UTC m=+0.082979016 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, container_name=multipathd)
Nov 29 08:04:22 compute-0 nova_compute[187185]: 2025-11-29 08:04:22.451 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:04:24 compute-0 nova_compute[187185]: 2025-11-29 08:04:24.894 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:04:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:04:25.765 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 08:04:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:04:25.766 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 08:04:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:04:25.766 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 08:04:27 compute-0 nova_compute[187185]: 2025-11-29 08:04:27.453 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:04:29 compute-0 nova_compute[187185]: 2025-11-29 08:04:29.895 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:04:31 compute-0 podman[252300]: 2025-11-29 08:04:31.794576061 +0000 UTC m=+0.059083655 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Nov 29 08:04:31 compute-0 podman[252301]: 2025-11-29 08:04:31.806230193 +0000 UTC m=+0.066552658 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, architecture=x86_64, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.tags=minimal rhel9, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.buildah.version=1.33.7, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc.)
Nov 29 08:04:31 compute-0 podman[252302]: 2025-11-29 08:04:31.806412838 +0000 UTC m=+0.062495492 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 08:04:32 compute-0 nova_compute[187185]: 2025-11-29 08:04:32.455 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:04:34 compute-0 nova_compute[187185]: 2025-11-29 08:04:34.899 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:04:37 compute-0 nova_compute[187185]: 2025-11-29 08:04:37.459 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:04:39 compute-0 nova_compute[187185]: 2025-11-29 08:04:39.901 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:04:40 compute-0 podman[252364]: 2025-11-29 08:04:40.864976307 +0000 UTC m=+0.115701059 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 08:04:41 compute-0 sshd-session[252299]: error: kex_exchange_identification: read: Connection timed out
Nov 29 08:04:41 compute-0 sshd-session[252299]: banner exchange: Connection from 115.190.187.93 port 38366: Connection timed out
Nov 29 08:04:42 compute-0 nova_compute[187185]: 2025-11-29 08:04:42.353 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:04:42 compute-0 sshd-session[252362]: Invalid user userb from 45.78.219.119 port 41272
Nov 29 08:04:42 compute-0 nova_compute[187185]: 2025-11-29 08:04:42.461 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:04:42 compute-0 sshd-session[252362]: Received disconnect from 45.78.219.119 port 41272:11: Bye Bye [preauth]
Nov 29 08:04:42 compute-0 sshd-session[252362]: Disconnected from invalid user userb 45.78.219.119 port 41272 [preauth]
Nov 29 08:04:44 compute-0 nova_compute[187185]: 2025-11-29 08:04:44.904 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:04:45 compute-0 nova_compute[187185]: 2025-11-29 08:04:45.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:04:47 compute-0 nova_compute[187185]: 2025-11-29 08:04:47.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:04:47 compute-0 nova_compute[187185]: 2025-11-29 08:04:47.463 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:04:48.023 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:04:48.024 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:04:48.024 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:04:48.024 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:04:48.024 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:04:48.024 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:04:48.024 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:04:48.024 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:04:48.024 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:04:48.024 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:04:48.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:04:48.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:04:48.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:04:48.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:04:48.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:04:48.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:04:48.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:04:48.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:04:48.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:04:48.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:04:48.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:04:48.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:04:48.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:04:48.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:04:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:04:48.026 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:04:49 compute-0 podman[252390]: 2025-11-29 08:04:49.790002121 +0000 UTC m=+0.056236694 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 08:04:49 compute-0 nova_compute[187185]: 2025-11-29 08:04:49.906 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:04:51 compute-0 nova_compute[187185]: 2025-11-29 08:04:51.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:04:51 compute-0 nova_compute[187185]: 2025-11-29 08:04:51.597 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 08:04:51 compute-0 nova_compute[187185]: 2025-11-29 08:04:51.597 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 08:04:51 compute-0 nova_compute[187185]: 2025-11-29 08:04:51.598 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 08:04:51 compute-0 nova_compute[187185]: 2025-11-29 08:04:51.598 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 08:04:51 compute-0 nova_compute[187185]: 2025-11-29 08:04:51.778 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 08:04:51 compute-0 nova_compute[187185]: 2025-11-29 08:04:51.779 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5735MB free_disk=73.25090408325195GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 08:04:51 compute-0 nova_compute[187185]: 2025-11-29 08:04:51.779 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 08:04:51 compute-0 nova_compute[187185]: 2025-11-29 08:04:51.779 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 08:04:51 compute-0 podman[252416]: 2025-11-29 08:04:51.800766421 +0000 UTC m=+0.062511113 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 08:04:51 compute-0 podman[252415]: 2025-11-29 08:04:51.800775441 +0000 UTC m=+0.065348694 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 29 08:04:52 compute-0 nova_compute[187185]: 2025-11-29 08:04:52.022 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 08:04:52 compute-0 nova_compute[187185]: 2025-11-29 08:04:52.022 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 08:04:52 compute-0 nova_compute[187185]: 2025-11-29 08:04:52.044 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 08:04:52 compute-0 nova_compute[187185]: 2025-11-29 08:04:52.061 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 08:04:52 compute-0 nova_compute[187185]: 2025-11-29 08:04:52.063 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 08:04:52 compute-0 nova_compute[187185]: 2025-11-29 08:04:52.063 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.284s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 08:04:52 compute-0 nova_compute[187185]: 2025-11-29 08:04:52.466 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:04:54 compute-0 nova_compute[187185]: 2025-11-29 08:04:54.908 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:04:55 compute-0 nova_compute[187185]: 2025-11-29 08:04:55.063 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:04:55 compute-0 nova_compute[187185]: 2025-11-29 08:04:55.064 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 08:04:56 compute-0 nova_compute[187185]: 2025-11-29 08:04:56.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:04:57 compute-0 nova_compute[187185]: 2025-11-29 08:04:57.468 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:04:59 compute-0 nova_compute[187185]: 2025-11-29 08:04:59.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:04:59 compute-0 nova_compute[187185]: 2025-11-29 08:04:59.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 08:04:59 compute-0 nova_compute[187185]: 2025-11-29 08:04:59.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 08:04:59 compute-0 nova_compute[187185]: 2025-11-29 08:04:59.910 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:04:59 compute-0 nova_compute[187185]: 2025-11-29 08:04:59.983 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 08:05:02 compute-0 nova_compute[187185]: 2025-11-29 08:05:02.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:05:02 compute-0 nova_compute[187185]: 2025-11-29 08:05:02.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:05:02 compute-0 nova_compute[187185]: 2025-11-29 08:05:02.471 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:05:02 compute-0 podman[252455]: 2025-11-29 08:05:02.812053238 +0000 UTC m=+0.070136080 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 08:05:02 compute-0 podman[252456]: 2025-11-29 08:05:02.828032323 +0000 UTC m=+0.088924525 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., name=ubi9-minimal, managed_by=edpm_ansible, distribution-scope=public, io.openshift.expose-services=, version=9.6, container_name=openstack_network_exporter)
Nov 29 08:05:02 compute-0 podman[252457]: 2025-11-29 08:05:02.83074089 +0000 UTC m=+0.084311164 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 08:05:04 compute-0 sshd-session[252517]: Invalid user ali from 20.255.62.58 port 48308
Nov 29 08:05:04 compute-0 sshd-session[252517]: Received disconnect from 20.255.62.58 port 48308:11: Bye Bye [preauth]
Nov 29 08:05:04 compute-0 sshd-session[252517]: Disconnected from invalid user ali 20.255.62.58 port 48308 [preauth]
Nov 29 08:05:04 compute-0 nova_compute[187185]: 2025-11-29 08:05:04.910 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:05:07 compute-0 nova_compute[187185]: 2025-11-29 08:05:07.473 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:05:09 compute-0 nova_compute[187185]: 2025-11-29 08:05:09.916 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:05:11 compute-0 nova_compute[187185]: 2025-11-29 08:05:11.310 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:05:11 compute-0 podman[252519]: 2025-11-29 08:05:11.854799165 +0000 UTC m=+0.112737894 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 29 08:05:12 compute-0 nova_compute[187185]: 2025-11-29 08:05:12.475 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:05:14 compute-0 nova_compute[187185]: 2025-11-29 08:05:14.917 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:05:17 compute-0 nova_compute[187185]: 2025-11-29 08:05:17.477 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:05:19 compute-0 nova_compute[187185]: 2025-11-29 08:05:19.953 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:05:20 compute-0 podman[252546]: 2025-11-29 08:05:20.787060715 +0000 UTC m=+0.053774584 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 08:05:22 compute-0 nova_compute[187185]: 2025-11-29 08:05:22.480 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:05:22 compute-0 podman[252571]: 2025-11-29 08:05:22.795542411 +0000 UTC m=+0.062089481 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 29 08:05:22 compute-0 podman[252570]: 2025-11-29 08:05:22.804828005 +0000 UTC m=+0.072857797 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 08:05:24 compute-0 nova_compute[187185]: 2025-11-29 08:05:24.956 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:05:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:05:25.767 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 08:05:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:05:25.768 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 08:05:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:05:25.768 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 08:05:27 compute-0 nova_compute[187185]: 2025-11-29 08:05:27.482 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:05:30 compute-0 nova_compute[187185]: 2025-11-29 08:05:30.011 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:05:32 compute-0 nova_compute[187185]: 2025-11-29 08:05:32.515 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:05:33 compute-0 podman[252611]: 2025-11-29 08:05:33.797173761 +0000 UTC m=+0.061180645 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 08:05:33 compute-0 podman[252613]: 2025-11-29 08:05:33.806060125 +0000 UTC m=+0.059655582 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 08:05:33 compute-0 podman[252612]: 2025-11-29 08:05:33.823422499 +0000 UTC m=+0.079810215 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, vendor=Red Hat, Inc., config_id=edpm, distribution-scope=public, release=1755695350, maintainer=Red Hat, Inc.)
Nov 29 08:05:35 compute-0 nova_compute[187185]: 2025-11-29 08:05:35.014 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:05:37 compute-0 nova_compute[187185]: 2025-11-29 08:05:37.518 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:05:40 compute-0 nova_compute[187185]: 2025-11-29 08:05:40.014 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:05:40 compute-0 sshd-session[252677]: Received disconnect from 190.181.27.27 port 49990:11: Bye Bye [preauth]
Nov 29 08:05:40 compute-0 sshd-session[252677]: Disconnected from authenticating user root 190.181.27.27 port 49990 [preauth]
Nov 29 08:05:42 compute-0 nova_compute[187185]: 2025-11-29 08:05:42.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:05:42 compute-0 nova_compute[187185]: 2025-11-29 08:05:42.520 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:05:42 compute-0 podman[252679]: 2025-11-29 08:05:42.833680764 +0000 UTC m=+0.101329160 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 29 08:05:45 compute-0 nova_compute[187185]: 2025-11-29 08:05:45.043 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:05:46 compute-0 nova_compute[187185]: 2025-11-29 08:05:46.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:05:47 compute-0 nova_compute[187185]: 2025-11-29 08:05:47.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:05:47 compute-0 nova_compute[187185]: 2025-11-29 08:05:47.522 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:05:50 compute-0 nova_compute[187185]: 2025-11-29 08:05:50.046 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:05:51 compute-0 nova_compute[187185]: 2025-11-29 08:05:51.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:05:51 compute-0 nova_compute[187185]: 2025-11-29 08:05:51.357 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 08:05:51 compute-0 nova_compute[187185]: 2025-11-29 08:05:51.357 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 08:05:51 compute-0 nova_compute[187185]: 2025-11-29 08:05:51.358 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 08:05:51 compute-0 nova_compute[187185]: 2025-11-29 08:05:51.358 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 08:05:51 compute-0 nova_compute[187185]: 2025-11-29 08:05:51.597 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 08:05:51 compute-0 nova_compute[187185]: 2025-11-29 08:05:51.599 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5730MB free_disk=73.25107955932617GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 08:05:51 compute-0 nova_compute[187185]: 2025-11-29 08:05:51.599 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 08:05:51 compute-0 nova_compute[187185]: 2025-11-29 08:05:51.599 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 08:05:51 compute-0 nova_compute[187185]: 2025-11-29 08:05:51.662 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 08:05:51 compute-0 nova_compute[187185]: 2025-11-29 08:05:51.663 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 08:05:51 compute-0 podman[252706]: 2025-11-29 08:05:51.799278292 +0000 UTC m=+0.068198385 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 08:05:51 compute-0 nova_compute[187185]: 2025-11-29 08:05:51.820 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 08:05:51 compute-0 nova_compute[187185]: 2025-11-29 08:05:51.853 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 08:05:51 compute-0 nova_compute[187185]: 2025-11-29 08:05:51.855 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 08:05:51 compute-0 nova_compute[187185]: 2025-11-29 08:05:51.855 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.256s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 08:05:52 compute-0 nova_compute[187185]: 2025-11-29 08:05:52.524 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:05:53 compute-0 podman[252731]: 2025-11-29 08:05:53.832109993 +0000 UTC m=+0.095416311 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=multipathd)
Nov 29 08:05:53 compute-0 podman[252732]: 2025-11-29 08:05:53.840568214 +0000 UTC m=+0.087815924 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 08:05:55 compute-0 nova_compute[187185]: 2025-11-29 08:05:55.048 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:05:55 compute-0 nova_compute[187185]: 2025-11-29 08:05:55.855 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:05:55 compute-0 nova_compute[187185]: 2025-11-29 08:05:55.855 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 08:05:56 compute-0 nova_compute[187185]: 2025-11-29 08:05:56.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:05:57 compute-0 nova_compute[187185]: 2025-11-29 08:05:57.561 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:05:59 compute-0 nova_compute[187185]: 2025-11-29 08:05:59.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:05:59 compute-0 nova_compute[187185]: 2025-11-29 08:05:59.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 08:05:59 compute-0 nova_compute[187185]: 2025-11-29 08:05:59.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 08:05:59 compute-0 nova_compute[187185]: 2025-11-29 08:05:59.655 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 08:06:00 compute-0 nova_compute[187185]: 2025-11-29 08:06:00.052 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:06:02 compute-0 nova_compute[187185]: 2025-11-29 08:06:02.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:06:02 compute-0 nova_compute[187185]: 2025-11-29 08:06:02.564 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:06:03 compute-0 nova_compute[187185]: 2025-11-29 08:06:03.311 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:06:04 compute-0 podman[252769]: 2025-11-29 08:06:04.79082338 +0000 UTC m=+0.057747326 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack 
Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible)
Nov 29 08:06:04 compute-0 podman[252770]: 2025-11-29 08:06:04.818669984 +0000 UTC m=+0.078471117 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, 
io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=9.6, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., release=1755695350, distribution-scope=public, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 29 08:06:04 compute-0 podman[252771]: 2025-11-29 08:06:04.822561365 +0000 UTC m=+0.080862615 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 08:06:05 compute-0 nova_compute[187185]: 2025-11-29 08:06:05.054 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:06:07 compute-0 nova_compute[187185]: 2025-11-29 08:06:07.566 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:06:10 compute-0 nova_compute[187185]: 2025-11-29 08:06:10.056 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:06:12 compute-0 nova_compute[187185]: 2025-11-29 08:06:12.569 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:06:13 compute-0 podman[252828]: 2025-11-29 08:06:13.840773004 +0000 UTC m=+0.110384797 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 29 08:06:15 compute-0 nova_compute[187185]: 2025-11-29 08:06:15.058 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:06:17 compute-0 nova_compute[187185]: 2025-11-29 08:06:17.572 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:06:20 compute-0 nova_compute[187185]: 2025-11-29 08:06:20.060 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:06:22 compute-0 nova_compute[187185]: 2025-11-29 08:06:22.574 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:06:22 compute-0 podman[252856]: 2025-11-29 08:06:22.816130573 +0000 UTC m=+0.083158111 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 08:06:24 compute-0 podman[252880]: 2025-11-29 08:06:24.806448481 +0000 UTC m=+0.071765266 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=multipathd)
Nov 29 08:06:24 compute-0 podman[252881]: 2025-11-29 08:06:24.807402059 +0000 UTC m=+0.065277242 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 08:06:25 compute-0 nova_compute[187185]: 2025-11-29 08:06:25.062 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:06:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:06:25.767 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 08:06:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:06:25.768 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 08:06:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:06:25.768 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 08:06:26 compute-0 sshd-session[252917]: Received disconnect from 20.255.62.58 port 41136:11: Bye Bye [preauth]
Nov 29 08:06:26 compute-0 sshd-session[252917]: Disconnected from authenticating user root 20.255.62.58 port 41136 [preauth]
Nov 29 08:06:27 compute-0 nova_compute[187185]: 2025-11-29 08:06:27.577 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:06:30 compute-0 nova_compute[187185]: 2025-11-29 08:06:30.064 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:06:32 compute-0 nova_compute[187185]: 2025-11-29 08:06:32.579 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:06:35 compute-0 nova_compute[187185]: 2025-11-29 08:06:35.067 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:06:35 compute-0 podman[252919]: 2025-11-29 08:06:35.796174543 +0000 UTC m=+0.055503403 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 08:06:35 compute-0 podman[252921]: 2025-11-29 08:06:35.807622639 +0000 UTC m=+0.057071528 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 08:06:35 compute-0 podman[252920]: 2025-11-29 08:06:35.829783671 +0000 UTC m=+0.082510803 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, managed_by=edpm_ansible, release=1755695350, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, vcs-type=git, maintainer=Red Hat, Inc.)
Nov 29 08:06:37 compute-0 nova_compute[187185]: 2025-11-29 08:06:37.581 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:06:40 compute-0 nova_compute[187185]: 2025-11-29 08:06:40.069 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:06:42 compute-0 nova_compute[187185]: 2025-11-29 08:06:42.585 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:06:44 compute-0 nova_compute[187185]: 2025-11-29 08:06:44.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:06:44 compute-0 podman[252981]: 2025-11-29 08:06:44.834053032 +0000 UTC m=+0.094834264 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 08:06:45 compute-0 nova_compute[187185]: 2025-11-29 08:06:45.071 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:06:47 compute-0 nova_compute[187185]: 2025-11-29 08:06:47.587 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:06:48.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:06:48.026 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:06:48.026 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:06:48.026 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:06:48.026 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:06:48.026 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:06:48.026 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:06:48.026 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:06:48.026 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:06:48.027 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:06:48.027 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:06:48.027 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:06:48.027 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:06:48.027 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:06:48.027 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:06:48.027 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:06:48.027 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:06:48.027 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:06:48.027 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:06:48.027 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:06:48.027 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:06:48.028 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:06:48.028 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:06:48.028 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:06:48 compute-0 ceilometer_agent_compute[197930]: 2025-11-29 08:06:48.028 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 08:06:48 compute-0 nova_compute[187185]: 2025-11-29 08:06:48.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:06:49 compute-0 nova_compute[187185]: 2025-11-29 08:06:49.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:06:50 compute-0 nova_compute[187185]: 2025-11-29 08:06:50.073 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:06:52 compute-0 nova_compute[187185]: 2025-11-29 08:06:52.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:06:52 compute-0 nova_compute[187185]: 2025-11-29 08:06:52.372 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 08:06:52 compute-0 nova_compute[187185]: 2025-11-29 08:06:52.373 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 08:06:52 compute-0 nova_compute[187185]: 2025-11-29 08:06:52.373 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 08:06:52 compute-0 nova_compute[187185]: 2025-11-29 08:06:52.374 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 08:06:52 compute-0 nova_compute[187185]: 2025-11-29 08:06:52.550 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 08:06:52 compute-0 nova_compute[187185]: 2025-11-29 08:06:52.551 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5740MB free_disk=73.25107955932617GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 08:06:52 compute-0 nova_compute[187185]: 2025-11-29 08:06:52.551 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 08:06:52 compute-0 nova_compute[187185]: 2025-11-29 08:06:52.552 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 08:06:52 compute-0 nova_compute[187185]: 2025-11-29 08:06:52.589 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:06:52 compute-0 nova_compute[187185]: 2025-11-29 08:06:52.802 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 08:06:52 compute-0 nova_compute[187185]: 2025-11-29 08:06:52.803 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 08:06:52 compute-0 nova_compute[187185]: 2025-11-29 08:06:52.821 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Refreshing inventories for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 29 08:06:52 compute-0 nova_compute[187185]: 2025-11-29 08:06:52.930 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Updating ProviderTree inventory for provider 4e39a026-df39-4e20-874a-dbb5a40df044 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 29 08:06:52 compute-0 nova_compute[187185]: 2025-11-29 08:06:52.931 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Updating inventory in ProviderTree for provider 4e39a026-df39-4e20-874a-dbb5a40df044 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 29 08:06:52 compute-0 nova_compute[187185]: 2025-11-29 08:06:52.965 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Refreshing aggregate associations for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 29 08:06:52 compute-0 nova_compute[187185]: 2025-11-29 08:06:52.997 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Refreshing trait associations for resource provider 4e39a026-df39-4e20-874a-dbb5a40df044, traits: HW_CPU_X86_SSE,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_TRUSTED_CERTS,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 29 08:06:53 compute-0 nova_compute[187185]: 2025-11-29 08:06:53.026 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 08:06:53 compute-0 nova_compute[187185]: 2025-11-29 08:06:53.048 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 08:06:53 compute-0 nova_compute[187185]: 2025-11-29 08:06:53.049 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 08:06:53 compute-0 nova_compute[187185]: 2025-11-29 08:06:53.049 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.498s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 08:06:53 compute-0 podman[253005]: 2025-11-29 08:06:53.808174676 +0000 UTC m=+0.075427410 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 08:06:55 compute-0 nova_compute[187185]: 2025-11-29 08:06:55.076 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:06:55 compute-0 podman[253030]: 2025-11-29 08:06:55.799017779 +0000 UTC m=+0.064332664 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 29 08:06:55 compute-0 podman[253031]: 2025-11-29 08:06:55.810752994 +0000 UTC m=+0.070408448 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 29 08:06:56 compute-0 nova_compute[187185]: 2025-11-29 08:06:56.050 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:06:56 compute-0 nova_compute[187185]: 2025-11-29 08:06:56.051 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 08:06:56 compute-0 nova_compute[187185]: 2025-11-29 08:06:56.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:06:57 compute-0 nova_compute[187185]: 2025-11-29 08:06:57.591 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:06:57 compute-0 sshd-session[253069]: Invalid user production from 190.181.27.27 port 45978
Nov 29 08:06:58 compute-0 sshd-session[253069]: Received disconnect from 190.181.27.27 port 45978:11: Bye Bye [preauth]
Nov 29 08:06:58 compute-0 sshd-session[253069]: Disconnected from invalid user production 190.181.27.27 port 45978 [preauth]
Nov 29 08:07:00 compute-0 nova_compute[187185]: 2025-11-29 08:07:00.078 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:07:00 compute-0 nova_compute[187185]: 2025-11-29 08:07:00.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:07:00 compute-0 nova_compute[187185]: 2025-11-29 08:07:00.316 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 08:07:00 compute-0 nova_compute[187185]: 2025-11-29 08:07:00.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 08:07:00 compute-0 nova_compute[187185]: 2025-11-29 08:07:00.348 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 08:07:02 compute-0 nova_compute[187185]: 2025-11-29 08:07:02.593 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:07:04 compute-0 nova_compute[187185]: 2025-11-29 08:07:04.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:07:05 compute-0 nova_compute[187185]: 2025-11-29 08:07:05.081 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:07:05 compute-0 nova_compute[187185]: 2025-11-29 08:07:05.311 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:07:06 compute-0 sshd-session[253071]: Connection closed by 220.250.59.155 port 53610
Nov 29 08:07:06 compute-0 podman[253072]: 2025-11-29 08:07:06.795871474 +0000 UTC m=+0.060822454 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 08:07:06 compute-0 podman[253074]: 2025-11-29 08:07:06.801928217 +0000 UTC m=+0.059594070 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 08:07:06 compute-0 podman[253073]: 2025-11-29 08:07:06.837947153 +0000 UTC m=+0.088992997 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, release=1755695350, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., config_id=edpm, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git)
Nov 29 08:07:07 compute-0 nova_compute[187185]: 2025-11-29 08:07:07.596 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:07:10 compute-0 nova_compute[187185]: 2025-11-29 08:07:10.083 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:07:11 compute-0 nova_compute[187185]: 2025-11-29 08:07:11.311 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:07:12 compute-0 nova_compute[187185]: 2025-11-29 08:07:12.598 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:07:15 compute-0 nova_compute[187185]: 2025-11-29 08:07:15.085 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:07:15 compute-0 podman[253132]: 2025-11-29 08:07:15.874973039 +0000 UTC m=+0.132426515 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 08:07:17 compute-0 nova_compute[187185]: 2025-11-29 08:07:17.600 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:07:20 compute-0 nova_compute[187185]: 2025-11-29 08:07:20.088 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:07:22 compute-0 nova_compute[187185]: 2025-11-29 08:07:22.604 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:07:24 compute-0 podman[253159]: 2025-11-29 08:07:24.832381115 +0000 UTC m=+0.087924487 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 08:07:25 compute-0 nova_compute[187185]: 2025-11-29 08:07:25.132 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:07:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:07:25.769 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 08:07:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:07:25.769 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 08:07:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:07:25.769 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 08:07:26 compute-0 podman[253183]: 2025-11-29 08:07:26.808929381 +0000 UTC m=+0.069681247 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd)
Nov 29 08:07:26 compute-0 podman[253184]: 2025-11-29 08:07:26.819467502 +0000 UTC m=+0.073485386 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Nov 29 08:07:27 compute-0 nova_compute[187185]: 2025-11-29 08:07:27.608 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:07:30 compute-0 nova_compute[187185]: 2025-11-29 08:07:30.133 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:07:32 compute-0 nova_compute[187185]: 2025-11-29 08:07:32.611 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:07:35 compute-0 nova_compute[187185]: 2025-11-29 08:07:35.134 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:07:37 compute-0 nova_compute[187185]: 2025-11-29 08:07:37.613 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:07:37 compute-0 podman[253223]: 2025-11-29 08:07:37.793988021 +0000 UTC m=+0.058770716 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 29 08:07:37 compute-0 podman[253225]: 2025-11-29 08:07:37.81394977 +0000 UTC m=+0.065198659 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 08:07:37 compute-0 podman[253224]: 2025-11-29 08:07:37.843981396 +0000 UTC m=+0.088898865 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, maintainer=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, managed_by=edpm_ansible, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9)
Nov 29 08:07:40 compute-0 nova_compute[187185]: 2025-11-29 08:07:40.136 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:07:42 compute-0 nova_compute[187185]: 2025-11-29 08:07:42.615 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:07:44 compute-0 nova_compute[187185]: 2025-11-29 08:07:44.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:07:45 compute-0 nova_compute[187185]: 2025-11-29 08:07:45.138 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:07:46 compute-0 podman[253288]: 2025-11-29 08:07:46.829476611 +0000 UTC m=+0.097281363 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 08:07:47 compute-0 sshd-session[253286]: Invalid user ftpadmin from 20.255.62.58 port 57684
Nov 29 08:07:47 compute-0 nova_compute[187185]: 2025-11-29 08:07:47.618 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:07:47 compute-0 sshd-session[253286]: Received disconnect from 20.255.62.58 port 57684:11: Bye Bye [preauth]
Nov 29 08:07:47 compute-0 sshd-session[253286]: Disconnected from invalid user ftpadmin 20.255.62.58 port 57684 [preauth]
Nov 29 08:07:49 compute-0 nova_compute[187185]: 2025-11-29 08:07:49.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:07:50 compute-0 nova_compute[187185]: 2025-11-29 08:07:50.142 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:07:50 compute-0 nova_compute[187185]: 2025-11-29 08:07:50.315 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:07:52 compute-0 nova_compute[187185]: 2025-11-29 08:07:52.620 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:07:53 compute-0 nova_compute[187185]: 2025-11-29 08:07:53.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:07:55 compute-0 nova_compute[187185]: 2025-11-29 08:07:55.143 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:07:55 compute-0 podman[253314]: 2025-11-29 08:07:55.791608403 +0000 UTC m=+0.057875740 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 08:07:57 compute-0 nova_compute[187185]: 2025-11-29 08:07:57.206 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 08:07:57 compute-0 nova_compute[187185]: 2025-11-29 08:07:57.206 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 08:07:57 compute-0 nova_compute[187185]: 2025-11-29 08:07:57.206 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 08:07:57 compute-0 nova_compute[187185]: 2025-11-29 08:07:57.206 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 08:07:57 compute-0 nova_compute[187185]: 2025-11-29 08:07:57.398 187189 WARNING nova.virt.libvirt.driver [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 08:07:57 compute-0 nova_compute[187185]: 2025-11-29 08:07:57.400 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5752MB free_disk=73.25105667114258GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 08:07:57 compute-0 nova_compute[187185]: 2025-11-29 08:07:57.400 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 08:07:57 compute-0 nova_compute[187185]: 2025-11-29 08:07:57.400 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 08:07:57 compute-0 nova_compute[187185]: 2025-11-29 08:07:57.472 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 08:07:57 compute-0 nova_compute[187185]: 2025-11-29 08:07:57.473 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 08:07:57 compute-0 nova_compute[187185]: 2025-11-29 08:07:57.496 187189 DEBUG nova.compute.provider_tree [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed in ProviderTree for provider: 4e39a026-df39-4e20-874a-dbb5a40df044 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 08:07:57 compute-0 nova_compute[187185]: 2025-11-29 08:07:57.514 187189 DEBUG nova.scheduler.client.report [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Inventory has not changed for provider 4e39a026-df39-4e20-874a-dbb5a40df044 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 08:07:57 compute-0 nova_compute[187185]: 2025-11-29 08:07:57.516 187189 DEBUG nova.compute.resource_tracker [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 08:07:57 compute-0 nova_compute[187185]: 2025-11-29 08:07:57.516 187189 DEBUG oslo_concurrency.lockutils [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 08:07:57 compute-0 nova_compute[187185]: 2025-11-29 08:07:57.517 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:07:57 compute-0 nova_compute[187185]: 2025-11-29 08:07:57.517 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 29 08:07:57 compute-0 nova_compute[187185]: 2025-11-29 08:07:57.622 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:07:57 compute-0 podman[253338]: 2025-11-29 08:07:57.786993006 +0000 UTC m=+0.057866360 container health_status 03d57e1b4d9b19beb2fc4f8a08ee2e559e9e75eadd2e23db946dcf02e9a4d205 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd)
Nov 29 08:07:57 compute-0 podman[253339]: 2025-11-29 08:07:57.795365105 +0000 UTC m=+0.060599309 container health_status 39c9a04598137e9f2755fc174a1c5238b958900b1e552548fdf29109e8112e7d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator 
team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 08:08:00 compute-0 nova_compute[187185]: 2025-11-29 08:08:00.144 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:08:01 compute-0 nova_compute[187185]: 2025-11-29 08:08:01.538 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:08:01 compute-0 nova_compute[187185]: 2025-11-29 08:08:01.538 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:08:01 compute-0 nova_compute[187185]: 2025-11-29 08:08:01.538 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 08:08:02 compute-0 nova_compute[187185]: 2025-11-29 08:08:02.317 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:08:02 compute-0 nova_compute[187185]: 2025-11-29 08:08:02.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 08:08:02 compute-0 nova_compute[187185]: 2025-11-29 08:08:02.317 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 08:08:02 compute-0 nova_compute[187185]: 2025-11-29 08:08:02.625 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:08:03 compute-0 nova_compute[187185]: 2025-11-29 08:08:03.662 187189 DEBUG nova.compute.manager [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 08:08:05 compute-0 nova_compute[187185]: 2025-11-29 08:08:05.153 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:08:06 compute-0 nova_compute[187185]: 2025-11-29 08:08:06.316 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:08:07 compute-0 nova_compute[187185]: 2025-11-29 08:08:07.311 187189 DEBUG oslo_service.periodic_task [None req-4dc94c05-0ccf-4500-9120-8ee849125014 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 08:08:07 compute-0 nova_compute[187185]: 2025-11-29 08:08:07.627 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:08:08 compute-0 podman[253380]: 2025-11-29 08:08:08.80693116 +0000 UTC m=+0.061280847 container health_status cce257436caf47c34d357b17e160c51e6106e71d295324590b3d622130ffeedd (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 08:08:08 compute-0 podman[253379]: 2025-11-29 08:08:08.807199227 +0000 UTC m=+0.065901678 container health_status 65a71db07c238beb8fb0b28b07e1dd9524ed8cfb653a9c6d1140703ba340fbdf (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, distribution-scope=public, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, name=ubi9-minimal, vcs-type=git, version=9.6, vendor=Red Hat, Inc.)
Nov 29 08:08:08 compute-0 podman[253378]: 2025-11-29 08:08:08.822135533 +0000 UTC m=+0.084782427 container health_status 0571023c93bbd4542fb1587ca97890f4e3279a9c14c3f428cd4fef397fdeca1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 29 08:08:10 compute-0 nova_compute[187185]: 2025-11-29 08:08:10.148 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:08:12 compute-0 sshd-session[253437]: Accepted publickey for zuul from 192.168.122.10 port 35274 ssh2: ECDSA SHA256:Ey+6YDlYfkE2dRe/gjWhHvvrJHee4xZo2Q1JojEuVBA
Nov 29 08:08:12 compute-0 systemd-logind[788]: New session 47 of user zuul.
Nov 29 08:08:12 compute-0 systemd[1]: Started Session 47 of User zuul.
Nov 29 08:08:12 compute-0 sshd-session[253437]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 29 08:08:12 compute-0 sudo[253441]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Nov 29 08:08:12 compute-0 sudo[253441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 29 08:08:12 compute-0 nova_compute[187185]: 2025-11-29 08:08:12.629 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:08:15 compute-0 nova_compute[187185]: 2025-11-29 08:08:15.151 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:08:17 compute-0 ovs-vsctl[253615]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Nov 29 08:08:17 compute-0 nova_compute[187185]: 2025-11-29 08:08:17.631 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:08:17 compute-0 podman[253648]: 2025-11-29 08:08:17.86976723 +0000 UTC m=+0.125617841 container health_status 8e0006c25f483be352fc39ea474a343de845ec5919bcd4308808eda28fe0f152 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 08:08:18 compute-0 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 253465 (sos)
Nov 29 08:08:18 compute-0 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Nov 29 08:08:18 compute-0 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Nov 29 08:08:18 compute-0 virtqemud[186729]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Nov 29 08:08:18 compute-0 virtqemud[186729]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Nov 29 08:08:18 compute-0 virtqemud[186729]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Nov 29 08:08:19 compute-0 crontab[254050]: (root) LIST (root)
Nov 29 08:08:20 compute-0 nova_compute[187185]: 2025-11-29 08:08:20.153 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:08:22 compute-0 systemd[1]: Starting Hostname Service...
Nov 29 08:08:22 compute-0 systemd[1]: Started Hostname Service.
Nov 29 08:08:22 compute-0 nova_compute[187185]: 2025-11-29 08:08:22.639 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:08:23 compute-0 sshd-session[253475]: Received disconnect from 190.181.27.27 port 45006:11: Bye Bye [preauth]
Nov 29 08:08:23 compute-0 sshd-session[253475]: Disconnected from 190.181.27.27 port 45006 [preauth]
Nov 29 08:08:25 compute-0 nova_compute[187185]: 2025-11-29 08:08:25.155 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 08:08:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:08:25.770 104254 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 08:08:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:08:25.771 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 08:08:25 compute-0 ovn_metadata_agent[104249]: 2025-11-29 08:08:25.771 104254 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 08:08:27 compute-0 podman[254523]: 2025-11-29 08:08:27.386012996 +0000 UTC m=+0.072068486 container health_status 78c19f564b1ccfb839df6db097736315729cfb356b10571a5e166e7ad0c4637c (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 08:08:27 compute-0 nova_compute[187185]: 2025-11-29 08:08:27.643 187189 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
